Jan 23 03:15:07 np0005593234 kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 23 03:15:07 np0005593234 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 23 03:15:07 np0005593234 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 03:15:07 np0005593234 kernel: BIOS-provided physical RAM map:
Jan 23 03:15:07 np0005593234 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 23 03:15:07 np0005593234 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 23 03:15:07 np0005593234 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 23 03:15:07 np0005593234 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 23 03:15:07 np0005593234 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 23 03:15:07 np0005593234 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 23 03:15:07 np0005593234 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 23 03:15:07 np0005593234 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 23 03:15:07 np0005593234 kernel: NX (Execute Disable) protection: active
Jan 23 03:15:07 np0005593234 kernel: APIC: Static calls initialized
Jan 23 03:15:07 np0005593234 kernel: SMBIOS 2.8 present.
Jan 23 03:15:07 np0005593234 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 23 03:15:07 np0005593234 kernel: Hypervisor detected: KVM
Jan 23 03:15:07 np0005593234 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 23 03:15:07 np0005593234 kernel: kvm-clock: using sched offset of 3269554116 cycles
Jan 23 03:15:07 np0005593234 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 23 03:15:07 np0005593234 kernel: tsc: Detected 2800.000 MHz processor
Jan 23 03:15:07 np0005593234 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 23 03:15:07 np0005593234 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 23 03:15:07 np0005593234 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 23 03:15:07 np0005593234 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 23 03:15:07 np0005593234 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 23 03:15:07 np0005593234 kernel: Using GB pages for direct mapping
Jan 23 03:15:07 np0005593234 kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 23 03:15:07 np0005593234 kernel: ACPI: Early table checksum verification disabled
Jan 23 03:15:07 np0005593234 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 23 03:15:07 np0005593234 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 03:15:07 np0005593234 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 03:15:07 np0005593234 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 03:15:07 np0005593234 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 23 03:15:07 np0005593234 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 03:15:07 np0005593234 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 03:15:07 np0005593234 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 23 03:15:07 np0005593234 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 23 03:15:07 np0005593234 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 23 03:15:07 np0005593234 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 23 03:15:07 np0005593234 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 23 03:15:07 np0005593234 kernel: No NUMA configuration found
Jan 23 03:15:07 np0005593234 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 23 03:15:07 np0005593234 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 23 03:15:07 np0005593234 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 23 03:15:07 np0005593234 kernel: Zone ranges:
Jan 23 03:15:07 np0005593234 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 23 03:15:07 np0005593234 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 23 03:15:07 np0005593234 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 23 03:15:07 np0005593234 kernel:  Device   empty
Jan 23 03:15:07 np0005593234 kernel: Movable zone start for each node
Jan 23 03:15:07 np0005593234 kernel: Early memory node ranges
Jan 23 03:15:07 np0005593234 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 23 03:15:07 np0005593234 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 23 03:15:07 np0005593234 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 23 03:15:07 np0005593234 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 23 03:15:07 np0005593234 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 23 03:15:07 np0005593234 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 23 03:15:07 np0005593234 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 23 03:15:07 np0005593234 kernel: ACPI: PM-Timer IO Port: 0x608
Jan 23 03:15:07 np0005593234 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 23 03:15:07 np0005593234 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 23 03:15:07 np0005593234 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 23 03:15:07 np0005593234 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 23 03:15:07 np0005593234 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 23 03:15:07 np0005593234 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 23 03:15:07 np0005593234 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 23 03:15:07 np0005593234 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 23 03:15:07 np0005593234 kernel: TSC deadline timer available
Jan 23 03:15:07 np0005593234 kernel: CPU topo: Max. logical packages:   8
Jan 23 03:15:07 np0005593234 kernel: CPU topo: Max. logical dies:       8
Jan 23 03:15:07 np0005593234 kernel: CPU topo: Max. dies per package:   1
Jan 23 03:15:07 np0005593234 kernel: CPU topo: Max. threads per core:   1
Jan 23 03:15:07 np0005593234 kernel: CPU topo: Num. cores per package:     1
Jan 23 03:15:07 np0005593234 kernel: CPU topo: Num. threads per package:   1
Jan 23 03:15:07 np0005593234 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 23 03:15:07 np0005593234 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 23 03:15:07 np0005593234 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 23 03:15:07 np0005593234 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 23 03:15:07 np0005593234 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 23 03:15:07 np0005593234 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 23 03:15:07 np0005593234 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 23 03:15:07 np0005593234 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 23 03:15:07 np0005593234 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 23 03:15:07 np0005593234 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 23 03:15:07 np0005593234 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 23 03:15:07 np0005593234 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 23 03:15:07 np0005593234 kernel: Booting paravirtualized kernel on KVM
Jan 23 03:15:07 np0005593234 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 23 03:15:07 np0005593234 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 23 03:15:07 np0005593234 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 23 03:15:07 np0005593234 kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 23 03:15:07 np0005593234 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 03:15:07 np0005593234 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 23 03:15:07 np0005593234 kernel: random: crng init done
Jan 23 03:15:07 np0005593234 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 23 03:15:07 np0005593234 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 23 03:15:07 np0005593234 kernel: Fallback order for Node 0: 0 
Jan 23 03:15:07 np0005593234 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 23 03:15:07 np0005593234 kernel: Policy zone: Normal
Jan 23 03:15:07 np0005593234 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 23 03:15:07 np0005593234 kernel: software IO TLB: area num 8.
Jan 23 03:15:07 np0005593234 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 23 03:15:07 np0005593234 kernel: ftrace: allocating 49417 entries in 194 pages
Jan 23 03:15:07 np0005593234 kernel: ftrace: allocated 194 pages with 3 groups
Jan 23 03:15:07 np0005593234 kernel: Dynamic Preempt: voluntary
Jan 23 03:15:07 np0005593234 kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 23 03:15:07 np0005593234 kernel: rcu: 	RCU event tracing is enabled.
Jan 23 03:15:07 np0005593234 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 23 03:15:07 np0005593234 kernel: 	Trampoline variant of Tasks RCU enabled.
Jan 23 03:15:07 np0005593234 kernel: 	Rude variant of Tasks RCU enabled.
Jan 23 03:15:07 np0005593234 kernel: 	Tracing variant of Tasks RCU enabled.
Jan 23 03:15:07 np0005593234 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 23 03:15:07 np0005593234 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 23 03:15:07 np0005593234 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 03:15:07 np0005593234 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 03:15:07 np0005593234 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 03:15:07 np0005593234 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 23 03:15:07 np0005593234 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 23 03:15:07 np0005593234 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 23 03:15:07 np0005593234 kernel: Console: colour VGA+ 80x25
Jan 23 03:15:07 np0005593234 kernel: printk: console [ttyS0] enabled
Jan 23 03:15:07 np0005593234 kernel: ACPI: Core revision 20230331
Jan 23 03:15:07 np0005593234 kernel: APIC: Switch to symmetric I/O mode setup
Jan 23 03:15:07 np0005593234 kernel: x2apic enabled
Jan 23 03:15:07 np0005593234 kernel: APIC: Switched APIC routing to: physical x2apic
Jan 23 03:15:07 np0005593234 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 23 03:15:07 np0005593234 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 23 03:15:07 np0005593234 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 23 03:15:07 np0005593234 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 23 03:15:07 np0005593234 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 23 03:15:07 np0005593234 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 23 03:15:07 np0005593234 kernel: Spectre V2 : Mitigation: Retpolines
Jan 23 03:15:07 np0005593234 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 23 03:15:07 np0005593234 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 23 03:15:07 np0005593234 kernel: RETBleed: Mitigation: untrained return thunk
Jan 23 03:15:07 np0005593234 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 23 03:15:07 np0005593234 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 23 03:15:07 np0005593234 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 23 03:15:07 np0005593234 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 23 03:15:07 np0005593234 kernel: x86/bugs: return thunk changed
Jan 23 03:15:07 np0005593234 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 23 03:15:07 np0005593234 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 23 03:15:07 np0005593234 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 23 03:15:07 np0005593234 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 23 03:15:07 np0005593234 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 23 03:15:07 np0005593234 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 23 03:15:07 np0005593234 kernel: Freeing SMP alternatives memory: 40K
Jan 23 03:15:07 np0005593234 kernel: pid_max: default: 32768 minimum: 301
Jan 23 03:15:07 np0005593234 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 23 03:15:07 np0005593234 kernel: landlock: Up and running.
Jan 23 03:15:07 np0005593234 kernel: Yama: becoming mindful.
Jan 23 03:15:07 np0005593234 kernel: SELinux:  Initializing.
Jan 23 03:15:07 np0005593234 kernel: LSM support for eBPF active
Jan 23 03:15:07 np0005593234 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 23 03:15:07 np0005593234 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 23 03:15:07 np0005593234 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 23 03:15:07 np0005593234 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 23 03:15:07 np0005593234 kernel: ... version:                0
Jan 23 03:15:07 np0005593234 kernel: ... bit width:              48
Jan 23 03:15:07 np0005593234 kernel: ... generic registers:      6
Jan 23 03:15:07 np0005593234 kernel: ... value mask:             0000ffffffffffff
Jan 23 03:15:07 np0005593234 kernel: ... max period:             00007fffffffffff
Jan 23 03:15:07 np0005593234 kernel: ... fixed-purpose events:   0
Jan 23 03:15:07 np0005593234 kernel: ... event mask:             000000000000003f
Jan 23 03:15:07 np0005593234 kernel: signal: max sigframe size: 1776
Jan 23 03:15:07 np0005593234 kernel: rcu: Hierarchical SRCU implementation.
Jan 23 03:15:07 np0005593234 kernel: rcu: 	Max phase no-delay instances is 400.
Jan 23 03:15:07 np0005593234 kernel: smp: Bringing up secondary CPUs ...
Jan 23 03:15:07 np0005593234 kernel: smpboot: x86: Booting SMP configuration:
Jan 23 03:15:07 np0005593234 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 23 03:15:07 np0005593234 kernel: smp: Brought up 1 node, 8 CPUs
Jan 23 03:15:07 np0005593234 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 23 03:15:07 np0005593234 kernel: node 0 deferred pages initialised in 16ms
Jan 23 03:15:07 np0005593234 kernel: Memory: 7763576K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618356K reserved, 0K cma-reserved)
Jan 23 03:15:07 np0005593234 kernel: devtmpfs: initialized
Jan 23 03:15:07 np0005593234 kernel: x86/mm: Memory block size: 128MB
Jan 23 03:15:07 np0005593234 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 23 03:15:07 np0005593234 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 23 03:15:07 np0005593234 kernel: pinctrl core: initialized pinctrl subsystem
Jan 23 03:15:07 np0005593234 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 23 03:15:07 np0005593234 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 23 03:15:07 np0005593234 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 23 03:15:07 np0005593234 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 23 03:15:07 np0005593234 kernel: audit: initializing netlink subsys (disabled)
Jan 23 03:15:07 np0005593234 kernel: audit: type=2000 audit(1769156104.502:1): state=initialized audit_enabled=0 res=1
Jan 23 03:15:07 np0005593234 kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 23 03:15:07 np0005593234 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 23 03:15:07 np0005593234 kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 23 03:15:07 np0005593234 kernel: cpuidle: using governor menu
Jan 23 03:15:07 np0005593234 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 23 03:15:07 np0005593234 kernel: PCI: Using configuration type 1 for base access
Jan 23 03:15:07 np0005593234 kernel: PCI: Using configuration type 1 for extended access
Jan 23 03:15:07 np0005593234 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 23 03:15:07 np0005593234 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 23 03:15:07 np0005593234 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 23 03:15:07 np0005593234 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 23 03:15:07 np0005593234 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 23 03:15:07 np0005593234 kernel: Demotion targets for Node 0: null
Jan 23 03:15:07 np0005593234 kernel: cryptd: max_cpu_qlen set to 1000
Jan 23 03:15:07 np0005593234 kernel: ACPI: Added _OSI(Module Device)
Jan 23 03:15:07 np0005593234 kernel: ACPI: Added _OSI(Processor Device)
Jan 23 03:15:07 np0005593234 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 23 03:15:07 np0005593234 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 23 03:15:07 np0005593234 kernel: ACPI: Interpreter enabled
Jan 23 03:15:07 np0005593234 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 23 03:15:07 np0005593234 kernel: ACPI: Using IOAPIC for interrupt routing
Jan 23 03:15:07 np0005593234 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 23 03:15:07 np0005593234 kernel: PCI: Using E820 reservations for host bridge windows
Jan 23 03:15:07 np0005593234 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 23 03:15:07 np0005593234 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 23 03:15:07 np0005593234 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [3] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [4] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [5] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [6] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [7] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [8] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [9] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [10] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [11] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [12] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [13] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [14] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [15] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [16] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [17] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [18] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [19] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [20] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [21] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [22] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [23] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [24] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [25] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [26] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [27] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [28] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [29] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [30] registered
Jan 23 03:15:07 np0005593234 kernel: acpiphp: Slot [31] registered
Jan 23 03:15:07 np0005593234 kernel: PCI host bridge to bus 0000:00
Jan 23 03:15:07 np0005593234 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 23 03:15:07 np0005593234 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 23 03:15:07 np0005593234 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 23 03:15:07 np0005593234 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 23 03:15:07 np0005593234 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 23 03:15:07 np0005593234 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 23 03:15:07 np0005593234 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 23 03:15:07 np0005593234 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 23 03:15:07 np0005593234 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 23 03:15:07 np0005593234 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 23 03:15:07 np0005593234 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 23 03:15:07 np0005593234 kernel: iommu: Default domain type: Translated
Jan 23 03:15:07 np0005593234 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 23 03:15:07 np0005593234 kernel: SCSI subsystem initialized
Jan 23 03:15:07 np0005593234 kernel: ACPI: bus type USB registered
Jan 23 03:15:07 np0005593234 kernel: usbcore: registered new interface driver usbfs
Jan 23 03:15:07 np0005593234 kernel: usbcore: registered new interface driver hub
Jan 23 03:15:07 np0005593234 kernel: usbcore: registered new device driver usb
Jan 23 03:15:07 np0005593234 kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 23 03:15:07 np0005593234 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 23 03:15:07 np0005593234 kernel: PTP clock support registered
Jan 23 03:15:07 np0005593234 kernel: EDAC MC: Ver: 3.0.0
Jan 23 03:15:07 np0005593234 kernel: NetLabel: Initializing
Jan 23 03:15:07 np0005593234 kernel: NetLabel:  domain hash size = 128
Jan 23 03:15:07 np0005593234 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 23 03:15:07 np0005593234 kernel: NetLabel:  unlabeled traffic allowed by default
Jan 23 03:15:07 np0005593234 kernel: PCI: Using ACPI for IRQ routing
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 23 03:15:07 np0005593234 kernel: vgaarb: loaded
Jan 23 03:15:07 np0005593234 kernel: clocksource: Switched to clocksource kvm-clock
Jan 23 03:15:07 np0005593234 kernel: VFS: Disk quotas dquot_6.6.0
Jan 23 03:15:07 np0005593234 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 23 03:15:07 np0005593234 kernel: pnp: PnP ACPI init
Jan 23 03:15:07 np0005593234 kernel: pnp: PnP ACPI: found 5 devices
Jan 23 03:15:07 np0005593234 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 23 03:15:07 np0005593234 kernel: NET: Registered PF_INET protocol family
Jan 23 03:15:07 np0005593234 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 23 03:15:07 np0005593234 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 23 03:15:07 np0005593234 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 23 03:15:07 np0005593234 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 23 03:15:07 np0005593234 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 23 03:15:07 np0005593234 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 23 03:15:07 np0005593234 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 23 03:15:07 np0005593234 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 23 03:15:07 np0005593234 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 23 03:15:07 np0005593234 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 23 03:15:07 np0005593234 kernel: NET: Registered PF_XDP protocol family
Jan 23 03:15:07 np0005593234 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 23 03:15:07 np0005593234 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 23 03:15:07 np0005593234 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 23 03:15:07 np0005593234 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 23 03:15:07 np0005593234 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 23 03:15:07 np0005593234 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 23 03:15:07 np0005593234 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 83758 usecs
Jan 23 03:15:07 np0005593234 kernel: PCI: CLS 0 bytes, default 64
Jan 23 03:15:07 np0005593234 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 23 03:15:07 np0005593234 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 23 03:15:07 np0005593234 kernel: Trying to unpack rootfs image as initramfs...
Jan 23 03:15:07 np0005593234 kernel: ACPI: bus type thunderbolt registered
Jan 23 03:15:07 np0005593234 kernel: Initialise system trusted keyrings
Jan 23 03:15:07 np0005593234 kernel: Key type blacklist registered
Jan 23 03:15:07 np0005593234 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 23 03:15:07 np0005593234 kernel: zbud: loaded
Jan 23 03:15:07 np0005593234 kernel: integrity: Platform Keyring initialized
Jan 23 03:15:07 np0005593234 kernel: integrity: Machine keyring initialized
Jan 23 03:15:07 np0005593234 kernel: Freeing initrd memory: 87956K
Jan 23 03:15:07 np0005593234 kernel: NET: Registered PF_ALG protocol family
Jan 23 03:15:07 np0005593234 kernel: xor: automatically using best checksumming function   avx       
Jan 23 03:15:07 np0005593234 kernel: Key type asymmetric registered
Jan 23 03:15:07 np0005593234 kernel: Asymmetric key parser 'x509' registered
Jan 23 03:15:07 np0005593234 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 23 03:15:07 np0005593234 kernel: io scheduler mq-deadline registered
Jan 23 03:15:07 np0005593234 kernel: io scheduler kyber registered
Jan 23 03:15:07 np0005593234 kernel: io scheduler bfq registered
Jan 23 03:15:07 np0005593234 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 23 03:15:07 np0005593234 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 23 03:15:07 np0005593234 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 23 03:15:07 np0005593234 kernel: ACPI: button: Power Button [PWRF]
Jan 23 03:15:07 np0005593234 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 23 03:15:07 np0005593234 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 23 03:15:07 np0005593234 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 23 03:15:07 np0005593234 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 23 03:15:07 np0005593234 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 23 03:15:07 np0005593234 kernel: Non-volatile memory driver v1.3
Jan 23 03:15:07 np0005593234 kernel: rdac: device handler registered
Jan 23 03:15:07 np0005593234 kernel: hp_sw: device handler registered
Jan 23 03:15:07 np0005593234 kernel: emc: device handler registered
Jan 23 03:15:07 np0005593234 kernel: alua: device handler registered
Jan 23 03:15:07 np0005593234 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 23 03:15:07 np0005593234 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 23 03:15:07 np0005593234 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 23 03:15:07 np0005593234 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 23 03:15:07 np0005593234 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 23 03:15:07 np0005593234 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 23 03:15:07 np0005593234 kernel: usb usb1: Product: UHCI Host Controller
Jan 23 03:15:07 np0005593234 kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 23 03:15:07 np0005593234 kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 23 03:15:07 np0005593234 kernel: hub 1-0:1.0: USB hub found
Jan 23 03:15:07 np0005593234 kernel: hub 1-0:1.0: 2 ports detected
Jan 23 03:15:07 np0005593234 kernel: usbcore: registered new interface driver usbserial_generic
Jan 23 03:15:07 np0005593234 kernel: usbserial: USB Serial support registered for generic
Jan 23 03:15:07 np0005593234 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 23 03:15:07 np0005593234 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 23 03:15:07 np0005593234 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 23 03:15:07 np0005593234 kernel: mousedev: PS/2 mouse device common for all mice
Jan 23 03:15:07 np0005593234 kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 23 03:15:07 np0005593234 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 23 03:15:07 np0005593234 kernel: rtc_cmos 00:04: registered as rtc0
Jan 23 03:15:07 np0005593234 kernel: rtc_cmos 00:04: setting system clock to 2026-01-23T08:15:06 UTC (1769156106)
Jan 23 03:15:07 np0005593234 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 23 03:15:07 np0005593234 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 23 03:15:07 np0005593234 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 23 03:15:07 np0005593234 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 23 03:15:07 np0005593234 kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 23 03:15:07 np0005593234 kernel: usbcore: registered new interface driver usbhid
Jan 23 03:15:07 np0005593234 kernel: usbhid: USB HID core driver
Jan 23 03:15:07 np0005593234 kernel: drop_monitor: Initializing network drop monitor service
Jan 23 03:15:07 np0005593234 kernel: Initializing XFRM netlink socket
Jan 23 03:15:07 np0005593234 kernel: NET: Registered PF_INET6 protocol family
Jan 23 03:15:07 np0005593234 kernel: Segment Routing with IPv6
Jan 23 03:15:07 np0005593234 kernel: NET: Registered PF_PACKET protocol family
Jan 23 03:15:07 np0005593234 kernel: mpls_gso: MPLS GSO support
Jan 23 03:15:07 np0005593234 kernel: IPI shorthand broadcast: enabled
Jan 23 03:15:07 np0005593234 kernel: AVX2 version of gcm_enc/dec engaged.
Jan 23 03:15:07 np0005593234 kernel: AES CTR mode by8 optimization enabled
Jan 23 03:15:07 np0005593234 kernel: sched_clock: Marking stable (1568002022, 179666616)->(1923778563, -176109925)
Jan 23 03:15:07 np0005593234 kernel: registered taskstats version 1
Jan 23 03:15:07 np0005593234 kernel: Loading compiled-in X.509 certificates
Jan 23 03:15:07 np0005593234 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 23 03:15:07 np0005593234 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 23 03:15:07 np0005593234 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 23 03:15:07 np0005593234 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 23 03:15:07 np0005593234 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 23 03:15:07 np0005593234 kernel: Demotion targets for Node 0: null
Jan 23 03:15:07 np0005593234 kernel: page_owner is disabled
Jan 23 03:15:07 np0005593234 kernel: Key type .fscrypt registered
Jan 23 03:15:07 np0005593234 kernel: Key type fscrypt-provisioning registered
Jan 23 03:15:07 np0005593234 kernel: Key type big_key registered
Jan 23 03:15:07 np0005593234 kernel: Key type encrypted registered
Jan 23 03:15:07 np0005593234 kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 23 03:15:07 np0005593234 kernel: Loading compiled-in module X.509 certificates
Jan 23 03:15:07 np0005593234 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 23 03:15:07 np0005593234 kernel: ima: Allocated hash algorithm: sha256
Jan 23 03:15:07 np0005593234 kernel: ima: No architecture policies found
Jan 23 03:15:07 np0005593234 kernel: evm: Initialising EVM extended attributes:
Jan 23 03:15:07 np0005593234 kernel: evm: security.selinux
Jan 23 03:15:07 np0005593234 kernel: evm: security.SMACK64 (disabled)
Jan 23 03:15:07 np0005593234 kernel: evm: security.SMACK64EXEC (disabled)
Jan 23 03:15:07 np0005593234 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 23 03:15:07 np0005593234 kernel: evm: security.SMACK64MMAP (disabled)
Jan 23 03:15:07 np0005593234 kernel: evm: security.apparmor (disabled)
Jan 23 03:15:07 np0005593234 kernel: evm: security.ima
Jan 23 03:15:07 np0005593234 kernel: evm: security.capability
Jan 23 03:15:07 np0005593234 kernel: evm: HMAC attrs: 0x1
Jan 23 03:15:07 np0005593234 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 23 03:15:07 np0005593234 kernel: Running certificate verification RSA selftest
Jan 23 03:15:07 np0005593234 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 23 03:15:07 np0005593234 kernel: Running certificate verification ECDSA selftest
Jan 23 03:15:07 np0005593234 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 23 03:15:07 np0005593234 kernel: clk: Disabling unused clocks
Jan 23 03:15:07 np0005593234 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 23 03:15:07 np0005593234 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 23 03:15:07 np0005593234 kernel: usb 1-1: Product: QEMU USB Tablet
Jan 23 03:15:07 np0005593234 kernel: usb 1-1: Manufacturer: QEMU
Jan 23 03:15:07 np0005593234 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 23 03:15:07 np0005593234 kernel: Freeing unused decrypted memory: 2028K
Jan 23 03:15:07 np0005593234 kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 23 03:15:07 np0005593234 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 23 03:15:07 np0005593234 kernel: Write protecting the kernel read-only data: 30720k
Jan 23 03:15:07 np0005593234 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 23 03:15:07 np0005593234 kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 23 03:15:07 np0005593234 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 23 03:15:07 np0005593234 kernel: Run /init as init process
Jan 23 03:15:07 np0005593234 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 23 03:15:07 np0005593234 systemd: Detected virtualization kvm.
Jan 23 03:15:07 np0005593234 systemd: Detected architecture x86-64.
Jan 23 03:15:07 np0005593234 systemd: Running in initrd.
Jan 23 03:15:07 np0005593234 systemd: No hostname configured, using default hostname.
Jan 23 03:15:07 np0005593234 systemd: Hostname set to <localhost>.
Jan 23 03:15:07 np0005593234 systemd: Initializing machine ID from VM UUID.
Jan 23 03:15:07 np0005593234 systemd: Queued start job for default target Initrd Default Target.
Jan 23 03:15:07 np0005593234 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 23 03:15:07 np0005593234 systemd: Reached target Local Encrypted Volumes.
Jan 23 03:15:07 np0005593234 systemd: Reached target Initrd /usr File System.
Jan 23 03:15:07 np0005593234 systemd: Reached target Local File Systems.
Jan 23 03:15:07 np0005593234 systemd: Reached target Path Units.
Jan 23 03:15:07 np0005593234 systemd: Reached target Slice Units.
Jan 23 03:15:07 np0005593234 systemd: Reached target Swaps.
Jan 23 03:15:07 np0005593234 systemd: Reached target Timer Units.
Jan 23 03:15:07 np0005593234 systemd: Listening on D-Bus System Message Bus Socket.
Jan 23 03:15:07 np0005593234 systemd: Listening on Journal Socket (/dev/log).
Jan 23 03:15:07 np0005593234 systemd: Listening on Journal Socket.
Jan 23 03:15:07 np0005593234 systemd: Listening on udev Control Socket.
Jan 23 03:15:07 np0005593234 systemd: Listening on udev Kernel Socket.
Jan 23 03:15:07 np0005593234 systemd: Reached target Socket Units.
Jan 23 03:15:07 np0005593234 systemd: Starting Create List of Static Device Nodes...
Jan 23 03:15:07 np0005593234 systemd: Starting Journal Service...
Jan 23 03:15:07 np0005593234 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 23 03:15:07 np0005593234 systemd: Starting Apply Kernel Variables...
Jan 23 03:15:07 np0005593234 systemd: Starting Create System Users...
Jan 23 03:15:07 np0005593234 systemd: Starting Setup Virtual Console...
Jan 23 03:15:07 np0005593234 systemd: Finished Create List of Static Device Nodes.
Jan 23 03:15:07 np0005593234 systemd: Finished Apply Kernel Variables.
Jan 23 03:15:07 np0005593234 systemd: Finished Create System Users.
Jan 23 03:15:07 np0005593234 systemd-journald[309]: Journal started
Jan 23 03:15:07 np0005593234 systemd-journald[309]: Runtime Journal (/run/log/journal/3e200bf7763442a081842372f58672f7) is 8.0M, max 153.6M, 145.6M free.
Jan 23 03:15:07 np0005593234 systemd-sysusers[312]: Creating group 'users' with GID 100.
Jan 23 03:15:07 np0005593234 systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Jan 23 03:15:07 np0005593234 systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 23 03:15:07 np0005593234 systemd: Started Journal Service.
Jan 23 03:15:07 np0005593234 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 23 03:15:07 np0005593234 systemd[1]: Starting Create Volatile Files and Directories...
Jan 23 03:15:07 np0005593234 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 23 03:15:07 np0005593234 systemd[1]: Finished Create Volatile Files and Directories.
Jan 23 03:15:07 np0005593234 systemd[1]: Finished Setup Virtual Console.
Jan 23 03:15:07 np0005593234 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 23 03:15:07 np0005593234 systemd[1]: Starting dracut cmdline hook...
Jan 23 03:15:07 np0005593234 dracut-cmdline[328]: dracut-9 dracut-057-102.git20250818.el9
Jan 23 03:15:07 np0005593234 dracut-cmdline[328]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 03:15:07 np0005593234 systemd[1]: Finished dracut cmdline hook.
Jan 23 03:15:07 np0005593234 systemd[1]: Starting dracut pre-udev hook...
Jan 23 03:15:07 np0005593234 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 23 03:15:07 np0005593234 kernel: device-mapper: uevent: version 1.0.3
Jan 23 03:15:07 np0005593234 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 23 03:15:07 np0005593234 kernel: RPC: Registered named UNIX socket transport module.
Jan 23 03:15:07 np0005593234 kernel: RPC: Registered udp transport module.
Jan 23 03:15:07 np0005593234 kernel: RPC: Registered tcp transport module.
Jan 23 03:15:07 np0005593234 kernel: RPC: Registered tcp-with-tls transport module.
Jan 23 03:15:07 np0005593234 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 23 03:15:07 np0005593234 rpc.statd[443]: Version 2.5.4 starting
Jan 23 03:15:07 np0005593234 rpc.statd[443]: Initializing NSM state
Jan 23 03:15:07 np0005593234 rpc.idmapd[448]: Setting log level to 0
Jan 23 03:15:07 np0005593234 systemd[1]: Finished dracut pre-udev hook.
Jan 23 03:15:07 np0005593234 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 23 03:15:07 np0005593234 systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Jan 23 03:15:07 np0005593234 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 23 03:15:07 np0005593234 systemd[1]: Starting dracut pre-trigger hook...
Jan 23 03:15:07 np0005593234 systemd[1]: Finished dracut pre-trigger hook.
Jan 23 03:15:08 np0005593234 systemd[1]: Starting Coldplug All udev Devices...
Jan 23 03:15:08 np0005593234 systemd[1]: Created slice Slice /system/modprobe.
Jan 23 03:15:08 np0005593234 systemd[1]: Starting Load Kernel Module configfs...
Jan 23 03:15:08 np0005593234 systemd[1]: Finished Coldplug All udev Devices.
Jan 23 03:15:08 np0005593234 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 03:15:08 np0005593234 systemd[1]: Finished Load Kernel Module configfs.
Jan 23 03:15:08 np0005593234 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 23 03:15:08 np0005593234 systemd[1]: Reached target Network.
Jan 23 03:15:08 np0005593234 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 23 03:15:08 np0005593234 systemd[1]: Starting dracut initqueue hook...
Jan 23 03:15:08 np0005593234 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 23 03:15:08 np0005593234 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 23 03:15:08 np0005593234 systemd-udevd[479]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 03:15:08 np0005593234 kernel: vda: vda1
Jan 23 03:15:08 np0005593234 kernel: scsi host0: ata_piix
Jan 23 03:15:08 np0005593234 kernel: scsi host1: ata_piix
Jan 23 03:15:08 np0005593234 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 23 03:15:08 np0005593234 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 23 03:15:08 np0005593234 systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 23 03:15:08 np0005593234 systemd[1]: Reached target Initrd Root Device.
Jan 23 03:15:08 np0005593234 systemd[1]: Mounting Kernel Configuration File System...
Jan 23 03:15:08 np0005593234 systemd[1]: Mounted Kernel Configuration File System.
Jan 23 03:15:08 np0005593234 systemd[1]: Reached target System Initialization.
Jan 23 03:15:08 np0005593234 systemd[1]: Reached target Basic System.
Jan 23 03:15:08 np0005593234 kernel: ata1: found unknown device (class 0)
Jan 23 03:15:08 np0005593234 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 23 03:15:08 np0005593234 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 23 03:15:08 np0005593234 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 23 03:15:08 np0005593234 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 23 03:15:08 np0005593234 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 23 03:15:08 np0005593234 systemd[1]: Finished dracut initqueue hook.
Jan 23 03:15:08 np0005593234 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 23 03:15:08 np0005593234 systemd[1]: Reached target Remote Encrypted Volumes.
Jan 23 03:15:08 np0005593234 systemd[1]: Reached target Remote File Systems.
Jan 23 03:15:08 np0005593234 systemd[1]: Starting dracut pre-mount hook...
Jan 23 03:15:08 np0005593234 systemd[1]: Finished dracut pre-mount hook.
Jan 23 03:15:08 np0005593234 systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 23 03:15:08 np0005593234 systemd-fsck[554]: /usr/sbin/fsck.xfs: XFS file system.
Jan 23 03:15:08 np0005593234 systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 23 03:15:08 np0005593234 systemd[1]: Mounting /sysroot...
Jan 23 03:15:09 np0005593234 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 23 03:15:09 np0005593234 kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 23 03:15:09 np0005593234 kernel: XFS (vda1): Ending clean mount
Jan 23 03:15:09 np0005593234 systemd[1]: Mounted /sysroot.
Jan 23 03:15:09 np0005593234 systemd[1]: Reached target Initrd Root File System.
Jan 23 03:15:09 np0005593234 systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 23 03:15:09 np0005593234 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 23 03:15:09 np0005593234 systemd[1]: Reached target Initrd File Systems.
Jan 23 03:15:09 np0005593234 systemd[1]: Reached target Initrd Default Target.
Jan 23 03:15:09 np0005593234 systemd[1]: Starting dracut mount hook...
Jan 23 03:15:09 np0005593234 systemd[1]: Finished dracut mount hook.
Jan 23 03:15:09 np0005593234 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 23 03:15:09 np0005593234 rpc.idmapd[448]: exiting on signal 15
Jan 23 03:15:09 np0005593234 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 23 03:15:09 np0005593234 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Network.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Timer Units.
Jan 23 03:15:09 np0005593234 systemd[1]: dbus.socket: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 23 03:15:09 np0005593234 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Initrd Default Target.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Basic System.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Initrd Root Device.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Initrd /usr File System.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Path Units.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Remote File Systems.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Slice Units.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Socket Units.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target System Initialization.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Local File Systems.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Swaps.
Jan 23 03:15:09 np0005593234 systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped dracut mount hook.
Jan 23 03:15:09 np0005593234 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped dracut pre-mount hook.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped target Local Encrypted Volumes.
Jan 23 03:15:09 np0005593234 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 23 03:15:09 np0005593234 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped dracut initqueue hook.
Jan 23 03:15:09 np0005593234 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped Apply Kernel Variables.
Jan 23 03:15:09 np0005593234 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped Create Volatile Files and Directories.
Jan 23 03:15:09 np0005593234 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped Coldplug All udev Devices.
Jan 23 03:15:09 np0005593234 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped dracut pre-trigger hook.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 23 03:15:09 np0005593234 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped Setup Virtual Console.
Jan 23 03:15:09 np0005593234 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 23 03:15:09 np0005593234 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 23 03:15:09 np0005593234 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Closed udev Control Socket.
Jan 23 03:15:09 np0005593234 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Closed udev Kernel Socket.
Jan 23 03:15:09 np0005593234 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped dracut pre-udev hook.
Jan 23 03:15:09 np0005593234 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped dracut cmdline hook.
Jan 23 03:15:09 np0005593234 systemd[1]: Starting Cleanup udev Database...
Jan 23 03:15:09 np0005593234 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 23 03:15:09 np0005593234 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped Create List of Static Device Nodes.
Jan 23 03:15:09 np0005593234 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Stopped Create System Users.
Jan 23 03:15:09 np0005593234 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 23 03:15:09 np0005593234 systemd[1]: Finished Cleanup udev Database.
Jan 23 03:15:09 np0005593234 systemd[1]: Reached target Switch Root.
Jan 23 03:15:09 np0005593234 systemd[1]: Starting Switch Root...
Jan 23 03:15:09 np0005593234 systemd[1]: Switching root.
Jan 23 03:15:09 np0005593234 systemd-journald[309]: Received SIGTERM from PID 1 (systemd).
Jan 23 03:15:09 np0005593234 systemd-journald[309]: Journal stopped
Jan 23 03:15:10 np0005593234 kernel: audit: type=1404 audit(1769156109.719:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 23 03:15:10 np0005593234 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 03:15:10 np0005593234 kernel: SELinux:  policy capability open_perms=1
Jan 23 03:15:10 np0005593234 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 03:15:10 np0005593234 kernel: SELinux:  policy capability always_check_network=0
Jan 23 03:15:10 np0005593234 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 03:15:10 np0005593234 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 03:15:10 np0005593234 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 03:15:10 np0005593234 kernel: audit: type=1403 audit(1769156109.842:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 23 03:15:10 np0005593234 systemd: Successfully loaded SELinux policy in 127.426ms.
Jan 23 03:15:10 np0005593234 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 26.764ms.
Jan 23 03:15:10 np0005593234 systemd: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 23 03:15:10 np0005593234 systemd: Detected virtualization kvm.
Jan 23 03:15:10 np0005593234 systemd: Detected architecture x86-64.
Jan 23 03:15:10 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:15:10 np0005593234 systemd: initrd-switch-root.service: Deactivated successfully.
Jan 23 03:15:10 np0005593234 systemd: Stopped Switch Root.
Jan 23 03:15:10 np0005593234 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 23 03:15:10 np0005593234 systemd: Created slice Slice /system/getty.
Jan 23 03:15:10 np0005593234 systemd: Created slice Slice /system/serial-getty.
Jan 23 03:15:10 np0005593234 systemd: Created slice Slice /system/sshd-keygen.
Jan 23 03:15:10 np0005593234 systemd: Created slice User and Session Slice.
Jan 23 03:15:10 np0005593234 systemd: Started Dispatch Password Requests to Console Directory Watch.
Jan 23 03:15:10 np0005593234 systemd: Started Forward Password Requests to Wall Directory Watch.
Jan 23 03:15:10 np0005593234 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 23 03:15:10 np0005593234 systemd: Reached target Local Encrypted Volumes.
Jan 23 03:15:10 np0005593234 systemd: Stopped target Switch Root.
Jan 23 03:15:10 np0005593234 systemd: Stopped target Initrd File Systems.
Jan 23 03:15:10 np0005593234 systemd: Stopped target Initrd Root File System.
Jan 23 03:15:10 np0005593234 systemd: Reached target Local Integrity Protected Volumes.
Jan 23 03:15:10 np0005593234 systemd: Reached target Path Units.
Jan 23 03:15:10 np0005593234 systemd: Reached target rpc_pipefs.target.
Jan 23 03:15:10 np0005593234 systemd: Reached target Slice Units.
Jan 23 03:15:10 np0005593234 systemd: Reached target Swaps.
Jan 23 03:15:10 np0005593234 systemd: Reached target Local Verity Protected Volumes.
Jan 23 03:15:10 np0005593234 systemd: Listening on RPCbind Server Activation Socket.
Jan 23 03:15:10 np0005593234 systemd: Reached target RPC Port Mapper.
Jan 23 03:15:10 np0005593234 systemd: Listening on Process Core Dump Socket.
Jan 23 03:15:10 np0005593234 systemd: Listening on initctl Compatibility Named Pipe.
Jan 23 03:15:10 np0005593234 systemd: Listening on udev Control Socket.
Jan 23 03:15:10 np0005593234 systemd: Listening on udev Kernel Socket.
Jan 23 03:15:10 np0005593234 systemd: Mounting Huge Pages File System...
Jan 23 03:15:10 np0005593234 systemd: Mounting POSIX Message Queue File System...
Jan 23 03:15:10 np0005593234 systemd: Mounting Kernel Debug File System...
Jan 23 03:15:10 np0005593234 systemd: Mounting Kernel Trace File System...
Jan 23 03:15:10 np0005593234 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 23 03:15:10 np0005593234 systemd: Starting Create List of Static Device Nodes...
Jan 23 03:15:10 np0005593234 systemd: Starting Load Kernel Module configfs...
Jan 23 03:15:10 np0005593234 systemd: Starting Load Kernel Module drm...
Jan 23 03:15:10 np0005593234 systemd: Starting Load Kernel Module efi_pstore...
Jan 23 03:15:10 np0005593234 systemd: Starting Load Kernel Module fuse...
Jan 23 03:15:10 np0005593234 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 23 03:15:10 np0005593234 systemd: systemd-fsck-root.service: Deactivated successfully.
Jan 23 03:15:10 np0005593234 systemd: Stopped File System Check on Root Device.
Jan 23 03:15:10 np0005593234 systemd: Stopped Journal Service.
Jan 23 03:15:10 np0005593234 systemd: Starting Journal Service...
Jan 23 03:15:10 np0005593234 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 23 03:15:10 np0005593234 systemd: Starting Generate network units from Kernel command line...
Jan 23 03:15:10 np0005593234 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 03:15:10 np0005593234 systemd: Starting Remount Root and Kernel File Systems...
Jan 23 03:15:10 np0005593234 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 23 03:15:10 np0005593234 kernel: fuse: init (API version 7.37)
Jan 23 03:15:10 np0005593234 systemd: Starting Apply Kernel Variables...
Jan 23 03:15:10 np0005593234 systemd: Starting Coldplug All udev Devices...
Jan 23 03:15:10 np0005593234 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 23 03:15:10 np0005593234 systemd: Mounted Huge Pages File System.
Jan 23 03:15:10 np0005593234 systemd: Mounted POSIX Message Queue File System.
Jan 23 03:15:10 np0005593234 systemd: Mounted Kernel Debug File System.
Jan 23 03:15:10 np0005593234 systemd: Mounted Kernel Trace File System.
Jan 23 03:15:10 np0005593234 systemd-journald[677]: Journal started
Jan 23 03:15:10 np0005593234 systemd-journald[677]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 23 03:15:10 np0005593234 systemd[1]: Queued start job for default target Multi-User System.
Jan 23 03:15:10 np0005593234 systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 23 03:15:10 np0005593234 systemd: Started Journal Service.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Create List of Static Device Nodes.
Jan 23 03:15:10 np0005593234 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Load Kernel Module configfs.
Jan 23 03:15:10 np0005593234 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 23 03:15:10 np0005593234 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Load Kernel Module fuse.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Generate network units from Kernel command line.
Jan 23 03:15:10 np0005593234 kernel: ACPI: bus type drm_connector registered
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 23 03:15:10 np0005593234 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Load Kernel Module drm.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Apply Kernel Variables.
Jan 23 03:15:10 np0005593234 systemd[1]: Mounting FUSE Control File System...
Jan 23 03:15:10 np0005593234 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 23 03:15:10 np0005593234 systemd[1]: Starting Rebuild Hardware Database...
Jan 23 03:15:10 np0005593234 systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 23 03:15:10 np0005593234 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 23 03:15:10 np0005593234 systemd[1]: Starting Load/Save OS Random Seed...
Jan 23 03:15:10 np0005593234 systemd[1]: Starting Create System Users...
Jan 23 03:15:10 np0005593234 systemd[1]: Mounted FUSE Control File System.
Jan 23 03:15:10 np0005593234 systemd-journald[677]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 23 03:15:10 np0005593234 systemd-journald[677]: Received client request to flush runtime journal.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Load/Save OS Random Seed.
Jan 23 03:15:10 np0005593234 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Create System Users.
Jan 23 03:15:10 np0005593234 systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Coldplug All udev Devices.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 23 03:15:10 np0005593234 systemd[1]: Reached target Preparation for Local File Systems.
Jan 23 03:15:10 np0005593234 systemd[1]: Reached target Local File Systems.
Jan 23 03:15:10 np0005593234 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 23 03:15:10 np0005593234 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 23 03:15:10 np0005593234 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 23 03:15:10 np0005593234 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 23 03:15:10 np0005593234 systemd[1]: Starting Automatic Boot Loader Update...
Jan 23 03:15:10 np0005593234 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 23 03:15:10 np0005593234 systemd[1]: Starting Create Volatile Files and Directories...
Jan 23 03:15:10 np0005593234 bootctl[695]: Couldn't find EFI system partition, skipping.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Automatic Boot Loader Update.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Create Volatile Files and Directories.
Jan 23 03:15:10 np0005593234 systemd[1]: Starting Security Auditing Service...
Jan 23 03:15:10 np0005593234 systemd[1]: Starting RPC Bind...
Jan 23 03:15:10 np0005593234 systemd[1]: Starting Rebuild Journal Catalog...
Jan 23 03:15:10 np0005593234 auditd[702]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 23 03:15:10 np0005593234 auditd[702]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Rebuild Journal Catalog.
Jan 23 03:15:10 np0005593234 augenrules[707]: /sbin/augenrules: No change
Jan 23 03:15:10 np0005593234 systemd[1]: Started RPC Bind.
Jan 23 03:15:10 np0005593234 augenrules[722]: No rules
Jan 23 03:15:10 np0005593234 augenrules[722]: enabled 1
Jan 23 03:15:10 np0005593234 augenrules[722]: failure 1
Jan 23 03:15:10 np0005593234 augenrules[722]: pid 702
Jan 23 03:15:10 np0005593234 augenrules[722]: rate_limit 0
Jan 23 03:15:10 np0005593234 augenrules[722]: backlog_limit 8192
Jan 23 03:15:10 np0005593234 augenrules[722]: lost 0
Jan 23 03:15:10 np0005593234 augenrules[722]: backlog 3
Jan 23 03:15:10 np0005593234 augenrules[722]: backlog_wait_time 60000
Jan 23 03:15:10 np0005593234 augenrules[722]: backlog_wait_time_actual 0
Jan 23 03:15:10 np0005593234 systemd[1]: Started Security Auditing Service.
Jan 23 03:15:10 np0005593234 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Rebuild Hardware Database.
Jan 23 03:15:10 np0005593234 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 23 03:15:10 np0005593234 systemd[1]: Starting Update is Completed...
Jan 23 03:15:10 np0005593234 systemd[1]: Finished Update is Completed.
Jan 23 03:15:10 np0005593234 systemd-udevd[730]: Using default interface naming scheme 'rhel-9.0'.
Jan 23 03:15:11 np0005593234 systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 23 03:15:11 np0005593234 systemd[1]: Reached target System Initialization.
Jan 23 03:15:11 np0005593234 systemd[1]: Started dnf makecache --timer.
Jan 23 03:15:11 np0005593234 systemd[1]: Started Daily rotation of log files.
Jan 23 03:15:11 np0005593234 systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 23 03:15:11 np0005593234 systemd[1]: Reached target Timer Units.
Jan 23 03:15:11 np0005593234 systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 23 03:15:11 np0005593234 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 23 03:15:11 np0005593234 systemd[1]: Reached target Socket Units.
Jan 23 03:15:11 np0005593234 systemd[1]: Starting D-Bus System Message Bus...
Jan 23 03:15:11 np0005593234 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 03:15:11 np0005593234 systemd[1]: Starting Load Kernel Module configfs...
Jan 23 03:15:11 np0005593234 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 03:15:11 np0005593234 systemd[1]: Finished Load Kernel Module configfs.
Jan 23 03:15:11 np0005593234 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 23 03:15:11 np0005593234 systemd-udevd[733]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 03:15:11 np0005593234 systemd[1]: Started D-Bus System Message Bus.
Jan 23 03:15:11 np0005593234 systemd[1]: Reached target Basic System.
Jan 23 03:15:11 np0005593234 dbus-broker-lau[753]: Ready
Jan 23 03:15:11 np0005593234 systemd[1]: Starting NTP client/server...
Jan 23 03:15:11 np0005593234 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 23 03:15:11 np0005593234 systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 23 03:15:11 np0005593234 systemd[1]: Starting IPv4 firewall with iptables...
Jan 23 03:15:11 np0005593234 systemd[1]: Started irqbalance daemon.
Jan 23 03:15:11 np0005593234 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 23 03:15:11 np0005593234 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 03:15:11 np0005593234 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 03:15:11 np0005593234 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 03:15:11 np0005593234 systemd[1]: Reached target sshd-keygen.target.
Jan 23 03:15:11 np0005593234 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 23 03:15:11 np0005593234 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 23 03:15:11 np0005593234 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 23 03:15:11 np0005593234 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 23 03:15:11 np0005593234 systemd[1]: Reached target User and Group Name Lookups.
Jan 23 03:15:11 np0005593234 chronyd[787]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 23 03:15:11 np0005593234 chronyd[787]: Loaded 0 symmetric keys
Jan 23 03:15:11 np0005593234 chronyd[787]: Using right/UTC timezone to obtain leap second data
Jan 23 03:15:11 np0005593234 chronyd[787]: Loaded seccomp filter (level 2)
Jan 23 03:15:11 np0005593234 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 23 03:15:11 np0005593234 systemd[1]: Starting User Login Management...
Jan 23 03:15:11 np0005593234 systemd[1]: Started NTP client/server.
Jan 23 03:15:11 np0005593234 systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 23 03:15:11 np0005593234 systemd-logind[794]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 03:15:11 np0005593234 systemd-logind[794]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 23 03:15:11 np0005593234 systemd-logind[794]: New seat seat0.
Jan 23 03:15:11 np0005593234 systemd[1]: Started User Login Management.
Jan 23 03:15:11 np0005593234 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 23 03:15:11 np0005593234 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 23 03:15:11 np0005593234 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 23 03:15:11 np0005593234 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 23 03:15:11 np0005593234 kernel: kvm_amd: TSC scaling supported
Jan 23 03:15:11 np0005593234 kernel: kvm_amd: Nested Virtualization enabled
Jan 23 03:15:11 np0005593234 kernel: kvm_amd: Nested Paging enabled
Jan 23 03:15:11 np0005593234 kernel: kvm_amd: LBR virtualization supported
Jan 23 03:15:11 np0005593234 kernel: Console: switching to colour dummy device 80x25
Jan 23 03:15:11 np0005593234 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 23 03:15:11 np0005593234 kernel: [drm] features: -context_init
Jan 23 03:15:11 np0005593234 kernel: [drm] number of scanouts: 1
Jan 23 03:15:11 np0005593234 kernel: [drm] number of cap sets: 0
Jan 23 03:15:11 np0005593234 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 23 03:15:11 np0005593234 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 23 03:15:11 np0005593234 kernel: Console: switching to colour frame buffer device 128x48
Jan 23 03:15:11 np0005593234 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 23 03:15:11 np0005593234 iptables.init[777]: iptables: Applying firewall rules: [  OK  ]
Jan 23 03:15:11 np0005593234 systemd[1]: Finished IPv4 firewall with iptables.
Jan 23 03:15:11 np0005593234 cloud-init[840]: Cloud-init v. 24.4-8.el9 running 'init-local' at Fri, 23 Jan 2026 08:15:11 +0000. Up 6.62 seconds.
Jan 23 03:15:11 np0005593234 systemd[1]: run-cloud\x2dinit-tmp-tmpmu2p4n8l.mount: Deactivated successfully.
Jan 23 03:15:11 np0005593234 systemd[1]: Starting Hostname Service...
Jan 23 03:15:11 np0005593234 systemd[1]: Started Hostname Service.
Jan 23 03:15:11 np0005593234 systemd-hostnamed[854]: Hostname set to <np0005593234.novalocal> (static)
Jan 23 03:15:12 np0005593234 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 23 03:15:12 np0005593234 systemd[1]: Reached target Preparation for Network.
Jan 23 03:15:12 np0005593234 systemd[1]: Starting Network Manager...
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1073] NetworkManager (version 1.54.3-2.el9) is starting... (boot:ecc0581c-6611-4866-b3e1-0bc978951940)
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1078] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1162] manager[0x56265fd05000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1200] hostname: hostname: using hostnamed
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1201] hostname: static hostname changed from (none) to "np0005593234.novalocal"
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1206] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1301] manager[0x56265fd05000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1301] manager[0x56265fd05000]: rfkill: WWAN hardware radio set enabled
Jan 23 03:15:12 np0005593234 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1359] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1360] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1361] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1362] manager: Networking is enabled by state file
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1364] settings: Loaded settings plugin: keyfile (internal)
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1382] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1409] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1423] dhcp: init: Using DHCP client 'internal'
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1428] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1445] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1453] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1463] device (lo): Activation: starting connection 'lo' (4dda678f-e4eb-42eb-8b71-b7827298a97a)
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1474] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1477] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1510] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1515] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1518] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1519] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1521] device (eth0): carrier: link connected
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1525] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1531] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1539] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1544] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1545] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1550] manager: NetworkManager state is now CONNECTING
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1551] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1558] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1561] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:15:12 np0005593234 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1597] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Jan 23 03:15:12 np0005593234 systemd[1]: Started Network Manager.
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1607] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 03:15:12 np0005593234 systemd[1]: Reached target Network.
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1629] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:15:12 np0005593234 systemd[1]: Starting Network Manager Wait Online...
Jan 23 03:15:12 np0005593234 systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1719] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1721] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1725] device (lo): Activation: successful, device activated.
Jan 23 03:15:12 np0005593234 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1735] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1737] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1741] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1744] device (eth0): Activation: successful, device activated.
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1753] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 03:15:12 np0005593234 NetworkManager[858]: <info>  [1769156112.1756] manager: startup complete
Jan 23 03:15:12 np0005593234 systemd[1]: Started GSSAPI Proxy Daemon.
Jan 23 03:15:12 np0005593234 systemd[1]: Finished Network Manager Wait Online.
Jan 23 03:15:12 np0005593234 systemd[1]: Starting Cloud-init: Network Stage...
Jan 23 03:15:12 np0005593234 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 23 03:15:12 np0005593234 systemd[1]: Reached target NFS client services.
Jan 23 03:15:12 np0005593234 systemd[1]: Reached target Preparation for Remote File Systems.
Jan 23 03:15:12 np0005593234 systemd[1]: Reached target Remote File Systems.
Jan 23 03:15:12 np0005593234 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 03:15:12 np0005593234 cloud-init[922]: Cloud-init v. 24.4-8.el9 running 'init' at Fri, 23 Jan 2026 08:15:12 +0000. Up 7.53 seconds.
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: |  eth0  | True |         38.102.83.50         | 255.255.255.0 | global | fa:16:3e:37:d7:d3 |
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: |  eth0  | True | fe80::f816:3eff:fe37:d7d3/64 |       .       |  link  | fa:16:3e:37:d7:d3 |
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 23 03:15:12 np0005593234 cloud-init[922]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 03:15:13 np0005593234 cloud-init[922]: Generating public/private rsa key pair.
Jan 23 03:15:13 np0005593234 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 23 03:15:13 np0005593234 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 23 03:15:13 np0005593234 cloud-init[922]: The key fingerprint is:
Jan 23 03:15:13 np0005593234 cloud-init[922]: SHA256:o3BGXdJbVXb/5wzvb0HpYHpAXmqNkMHgkuQL/zJeooo root@np0005593234.novalocal
Jan 23 03:15:13 np0005593234 cloud-init[922]: The key's randomart image is:
Jan 23 03:15:13 np0005593234 cloud-init[922]: +---[RSA 3072]----+
Jan 23 03:15:13 np0005593234 cloud-init[922]: |     . .+o+  ...+|
Jan 23 03:15:13 np0005593234 cloud-init[922]: |    o o. *.... .o|
Jan 23 03:15:13 np0005593234 cloud-init[922]: |   . +... +o=   o|
Jan 23 03:15:13 np0005593234 cloud-init[922]: |    o.o   .* + o.|
Jan 23 03:15:13 np0005593234 cloud-init[922]: |    .oo S . + = o|
Jan 23 03:15:13 np0005593234 cloud-init[922]: |     +.. . . . B.|
Jan 23 03:15:13 np0005593234 cloud-init[922]: |     +.o    .   =|
Jan 23 03:15:13 np0005593234 cloud-init[922]: |.   o =        ..|
Jan 23 03:15:13 np0005593234 cloud-init[922]: |E... .         .+|
Jan 23 03:15:13 np0005593234 cloud-init[922]: +----[SHA256]-----+
Jan 23 03:15:13 np0005593234 cloud-init[922]: Generating public/private ecdsa key pair.
Jan 23 03:15:13 np0005593234 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 23 03:15:13 np0005593234 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 23 03:15:13 np0005593234 cloud-init[922]: The key fingerprint is:
Jan 23 03:15:13 np0005593234 cloud-init[922]: SHA256:tSTeqACxmKaR/nxUCTAXqk+Vv/v4DBEgDKVtjXsRxPM root@np0005593234.novalocal
Jan 23 03:15:13 np0005593234 cloud-init[922]: The key's randomart image is:
Jan 23 03:15:13 np0005593234 cloud-init[922]: +---[ECDSA 256]---+
Jan 23 03:15:13 np0005593234 cloud-init[922]: | .==+B.          |
Jan 23 03:15:13 np0005593234 cloud-init[922]: | +o+Bo= .        |
Jan 23 03:15:13 np0005593234 cloud-init[922]: |=oo= =o+. o      |
Jan 23 03:15:13 np0005593234 cloud-init[922]: |+.o.o +E.* .     |
Jan 23 03:15:13 np0005593234 cloud-init[922]: |.o o.o oS o      |
Jan 23 03:15:13 np0005593234 cloud-init[922]: |  = o. .o        |
Jan 23 03:15:13 np0005593234 cloud-init[922]: |   + ..o         |
Jan 23 03:15:13 np0005593234 cloud-init[922]: |    .   =        |
Jan 23 03:15:13 np0005593234 cloud-init[922]: |       oo+       |
Jan 23 03:15:13 np0005593234 cloud-init[922]: +----[SHA256]-----+
Jan 23 03:15:13 np0005593234 cloud-init[922]: Generating public/private ed25519 key pair.
Jan 23 03:15:13 np0005593234 cloud-init[922]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 23 03:15:13 np0005593234 cloud-init[922]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 23 03:15:13 np0005593234 cloud-init[922]: The key fingerprint is:
Jan 23 03:15:13 np0005593234 cloud-init[922]: SHA256:bI7k3kEQLOOBSaEDvx5ayFi8tBEYMKUCXNS8QzWsYAM root@np0005593234.novalocal
Jan 23 03:15:13 np0005593234 cloud-init[922]: The key's randomart image is:
Jan 23 03:15:13 np0005593234 cloud-init[922]: +--[ED25519 256]--+
Jan 23 03:15:13 np0005593234 cloud-init[922]: |OEO*o.+o         |
Jan 23 03:15:13 np0005593234 cloud-init[922]: |+Bo=++.o.        |
Jan 23 03:15:13 np0005593234 cloud-init[922]: |= B.+++          |
Jan 23 03:15:13 np0005593234 cloud-init[922]: |+= =.+ o         |
Jan 23 03:15:13 np0005593234 cloud-init[922]: |o.*   o S        |
Jan 23 03:15:13 np0005593234 cloud-init[922]: | + . o =         |
Jan 23 03:15:13 np0005593234 cloud-init[922]: |. .   o o        |
Jan 23 03:15:13 np0005593234 cloud-init[922]: |     . . .       |
Jan 23 03:15:13 np0005593234 cloud-init[922]: |      . .        |
Jan 23 03:15:13 np0005593234 cloud-init[922]: +----[SHA256]-----+
Jan 23 03:15:13 np0005593234 systemd[1]: Finished Cloud-init: Network Stage.
Jan 23 03:15:13 np0005593234 systemd[1]: Reached target Cloud-config availability.
Jan 23 03:15:13 np0005593234 systemd[1]: Reached target Network is Online.
Jan 23 03:15:13 np0005593234 systemd[1]: Starting Cloud-init: Config Stage...
Jan 23 03:15:13 np0005593234 systemd[1]: Starting Crash recovery kernel arming...
Jan 23 03:15:13 np0005593234 systemd[1]: Starting Notify NFS peers of a restart...
Jan 23 03:15:13 np0005593234 systemd[1]: Starting System Logging Service...
Jan 23 03:15:13 np0005593234 sm-notify[1005]: Version 2.5.4 starting
Jan 23 03:15:13 np0005593234 systemd[1]: Starting OpenSSH server daemon...
Jan 23 03:15:13 np0005593234 systemd[1]: Starting Permit User Sessions...
Jan 23 03:15:13 np0005593234 systemd[1]: Started Notify NFS peers of a restart.
Jan 23 03:15:13 np0005593234 systemd[1]: Started OpenSSH server daemon.
Jan 23 03:15:14 np0005593234 systemd[1]: Finished Permit User Sessions.
Jan 23 03:15:14 np0005593234 systemd[1]: Started Command Scheduler.
Jan 23 03:15:14 np0005593234 systemd[1]: Started Getty on tty1.
Jan 23 03:15:14 np0005593234 rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Jan 23 03:15:14 np0005593234 rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 23 03:15:14 np0005593234 systemd[1]: Started Serial Getty on ttyS0.
Jan 23 03:15:14 np0005593234 systemd[1]: Reached target Login Prompts.
Jan 23 03:15:14 np0005593234 systemd[1]: Started System Logging Service.
Jan 23 03:15:14 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 03:15:14 np0005593234 kdumpctl[1018]: kdump: No kdump initial ramdisk found.
Jan 23 03:15:14 np0005593234 kdumpctl[1018]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 23 03:15:14 np0005593234 systemd[1]: Reached target Multi-User System.
Jan 23 03:15:14 np0005593234 systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 23 03:15:14 np0005593234 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 23 03:15:14 np0005593234 systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 23 03:15:14 np0005593234 cloud-init[1140]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Fri, 23 Jan 2026 08:15:14 +0000. Up 9.26 seconds.
Jan 23 03:15:14 np0005593234 systemd[1]: Finished Cloud-init: Config Stage.
Jan 23 03:15:14 np0005593234 systemd[1]: Starting Cloud-init: Final Stage...
Jan 23 03:15:14 np0005593234 dracut[1284]: dracut-057-102.git20250818.el9
Jan 23 03:15:14 np0005593234 cloud-init[1302]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Fri, 23 Jan 2026 08:15:14 +0000. Up 9.66 seconds.
Jan 23 03:15:14 np0005593234 cloud-init[1309]: #############################################################
Jan 23 03:15:14 np0005593234 cloud-init[1314]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 23 03:15:14 np0005593234 dracut[1286]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 23 03:15:14 np0005593234 cloud-init[1322]: 256 SHA256:tSTeqACxmKaR/nxUCTAXqk+Vv/v4DBEgDKVtjXsRxPM root@np0005593234.novalocal (ECDSA)
Jan 23 03:15:14 np0005593234 cloud-init[1327]: 256 SHA256:bI7k3kEQLOOBSaEDvx5ayFi8tBEYMKUCXNS8QzWsYAM root@np0005593234.novalocal (ED25519)
Jan 23 03:15:14 np0005593234 cloud-init[1333]: 3072 SHA256:o3BGXdJbVXb/5wzvb0HpYHpAXmqNkMHgkuQL/zJeooo root@np0005593234.novalocal (RSA)
Jan 23 03:15:14 np0005593234 cloud-init[1337]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 23 03:15:14 np0005593234 cloud-init[1340]: #############################################################
Jan 23 03:15:14 np0005593234 cloud-init[1302]: Cloud-init v. 24.4-8.el9 finished at Fri, 23 Jan 2026 08:15:14 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 9.85 seconds
Jan 23 03:15:14 np0005593234 systemd[1]: Finished Cloud-init: Final Stage.
Jan 23 03:15:14 np0005593234 systemd[1]: Reached target Cloud-init target.
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: memstrack is not available
Jan 23 03:15:15 np0005593234 dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 23 03:15:15 np0005593234 dracut[1286]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 23 03:15:16 np0005593234 dracut[1286]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 23 03:15:16 np0005593234 dracut[1286]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 23 03:15:16 np0005593234 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 23 03:15:16 np0005593234 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 23 03:15:16 np0005593234 dracut[1286]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 23 03:15:16 np0005593234 dracut[1286]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 23 03:15:16 np0005593234 dracut[1286]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 23 03:15:16 np0005593234 dracut[1286]: memstrack is not available
Jan 23 03:15:16 np0005593234 dracut[1286]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 23 03:15:16 np0005593234 dracut[1286]: *** Including module: systemd ***
Jan 23 03:15:16 np0005593234 dracut[1286]: *** Including module: fips ***
Jan 23 03:15:16 np0005593234 dracut[1286]: *** Including module: systemd-initrd ***
Jan 23 03:15:16 np0005593234 dracut[1286]: *** Including module: i18n ***
Jan 23 03:15:16 np0005593234 dracut[1286]: *** Including module: drm ***
Jan 23 03:15:17 np0005593234 chronyd[787]: Selected source 23.159.16.194 (2.centos.pool.ntp.org)
Jan 23 03:15:17 np0005593234 chronyd[787]: System clock TAI offset set to 37 seconds
Jan 23 03:15:17 np0005593234 dracut[1286]: *** Including module: prefixdevname ***
Jan 23 03:15:17 np0005593234 dracut[1286]: *** Including module: kernel-modules ***
Jan 23 03:15:17 np0005593234 kernel: block vda: the capability attribute has been deprecated.
Jan 23 03:15:17 np0005593234 dracut[1286]: *** Including module: kernel-modules-extra ***
Jan 23 03:15:17 np0005593234 dracut[1286]: *** Including module: qemu ***
Jan 23 03:15:18 np0005593234 dracut[1286]: *** Including module: fstab-sys ***
Jan 23 03:15:18 np0005593234 dracut[1286]: *** Including module: rootfs-block ***
Jan 23 03:15:18 np0005593234 dracut[1286]: *** Including module: terminfo ***
Jan 23 03:15:18 np0005593234 dracut[1286]: *** Including module: udev-rules ***
Jan 23 03:15:18 np0005593234 chronyd[787]: Selected source 54.39.23.64 (2.centos.pool.ntp.org)
Jan 23 03:15:18 np0005593234 dracut[1286]: Skipping udev rule: 91-permissions.rules
Jan 23 03:15:18 np0005593234 dracut[1286]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 23 03:15:18 np0005593234 dracut[1286]: *** Including module: virtiofs ***
Jan 23 03:15:18 np0005593234 dracut[1286]: *** Including module: dracut-systemd ***
Jan 23 03:15:19 np0005593234 dracut[1286]: *** Including module: usrmount ***
Jan 23 03:15:19 np0005593234 dracut[1286]: *** Including module: base ***
Jan 23 03:15:19 np0005593234 dracut[1286]: *** Including module: fs-lib ***
Jan 23 03:15:19 np0005593234 dracut[1286]: *** Including module: kdumpbase ***
Jan 23 03:15:19 np0005593234 dracut[1286]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 23 03:15:19 np0005593234 dracut[1286]:  microcode_ctl module: mangling fw_dir
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: configuration "intel" is ignored
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 23 03:15:19 np0005593234 dracut[1286]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 23 03:15:20 np0005593234 dracut[1286]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 23 03:15:20 np0005593234 dracut[1286]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 23 03:15:20 np0005593234 dracut[1286]: *** Including module: openssl ***
Jan 23 03:15:20 np0005593234 dracut[1286]: *** Including module: shutdown ***
Jan 23 03:15:20 np0005593234 dracut[1286]: *** Including module: squash ***
Jan 23 03:15:20 np0005593234 dracut[1286]: *** Including modules done ***
Jan 23 03:15:20 np0005593234 dracut[1286]: *** Installing kernel module dependencies ***
Jan 23 03:15:20 np0005593234 dracut[1286]: *** Installing kernel module dependencies done ***
Jan 23 03:15:20 np0005593234 dracut[1286]: *** Resolving executable dependencies ***
Jan 23 03:15:20 np0005593234 irqbalance[781]: Cannot change IRQ 35 affinity: Operation not permitted
Jan 23 03:15:20 np0005593234 irqbalance[781]: IRQ 35 affinity is now unmanaged
Jan 23 03:15:20 np0005593234 irqbalance[781]: Cannot change IRQ 33 affinity: Operation not permitted
Jan 23 03:15:20 np0005593234 irqbalance[781]: IRQ 33 affinity is now unmanaged
Jan 23 03:15:20 np0005593234 irqbalance[781]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 23 03:15:20 np0005593234 irqbalance[781]: IRQ 31 affinity is now unmanaged
Jan 23 03:15:20 np0005593234 irqbalance[781]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 23 03:15:20 np0005593234 irqbalance[781]: IRQ 28 affinity is now unmanaged
Jan 23 03:15:20 np0005593234 irqbalance[781]: Cannot change IRQ 34 affinity: Operation not permitted
Jan 23 03:15:20 np0005593234 irqbalance[781]: IRQ 34 affinity is now unmanaged
Jan 23 03:15:20 np0005593234 irqbalance[781]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 23 03:15:20 np0005593234 irqbalance[781]: IRQ 32 affinity is now unmanaged
Jan 23 03:15:20 np0005593234 irqbalance[781]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 23 03:15:20 np0005593234 irqbalance[781]: IRQ 30 affinity is now unmanaged
Jan 23 03:15:20 np0005593234 irqbalance[781]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 23 03:15:20 np0005593234 irqbalance[781]: IRQ 29 affinity is now unmanaged
Jan 23 03:15:22 np0005593234 dracut[1286]: *** Resolving executable dependencies done ***
Jan 23 03:15:22 np0005593234 dracut[1286]: *** Generating early-microcode cpio image ***
Jan 23 03:15:22 np0005593234 dracut[1286]: *** Store current command line parameters ***
Jan 23 03:15:22 np0005593234 dracut[1286]: Stored kernel commandline:
Jan 23 03:15:22 np0005593234 dracut[1286]: No dracut internal kernel commandline stored in the initramfs
Jan 23 03:15:22 np0005593234 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 03:15:22 np0005593234 dracut[1286]: *** Install squash loader ***
Jan 23 03:15:23 np0005593234 dracut[1286]: *** Squashing the files inside the initramfs ***
Jan 23 03:15:24 np0005593234 dracut[1286]: *** Squashing the files inside the initramfs done ***
Jan 23 03:15:24 np0005593234 dracut[1286]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 23 03:15:24 np0005593234 dracut[1286]: *** Hardlinking files ***
Jan 23 03:15:24 np0005593234 dracut[1286]: *** Hardlinking files done ***
Jan 23 03:15:24 np0005593234 dracut[1286]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 23 03:15:25 np0005593234 kdumpctl[1018]: kdump: kexec: loaded kdump kernel
Jan 23 03:15:25 np0005593234 kdumpctl[1018]: kdump: Starting kdump: [OK]
Jan 23 03:15:25 np0005593234 systemd[1]: Finished Crash recovery kernel arming.
Jan 23 03:15:25 np0005593234 systemd[1]: Startup finished in 2.031s (kernel) + 2.711s (initrd) + 15.424s (userspace) = 20.167s.
Jan 23 03:15:42 np0005593234 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 03:16:11 np0005593234 systemd[1]: Created slice User Slice of UID 1000.
Jan 23 03:16:11 np0005593234 systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 23 03:16:11 np0005593234 systemd-logind[794]: New session 1 of user zuul.
Jan 23 03:16:11 np0005593234 systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 23 03:16:11 np0005593234 systemd[1]: Starting User Manager for UID 1000...
Jan 23 03:16:11 np0005593234 systemd[4310]: Queued start job for default target Main User Target.
Jan 23 03:16:11 np0005593234 systemd[4310]: Created slice User Application Slice.
Jan 23 03:16:11 np0005593234 systemd[4310]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 03:16:11 np0005593234 systemd[4310]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 03:16:11 np0005593234 systemd[4310]: Reached target Paths.
Jan 23 03:16:11 np0005593234 systemd[4310]: Reached target Timers.
Jan 23 03:16:11 np0005593234 systemd[4310]: Starting D-Bus User Message Bus Socket...
Jan 23 03:16:11 np0005593234 systemd[4310]: Starting Create User's Volatile Files and Directories...
Jan 23 03:16:12 np0005593234 systemd[4310]: Finished Create User's Volatile Files and Directories.
Jan 23 03:16:12 np0005593234 systemd[4310]: Listening on D-Bus User Message Bus Socket.
Jan 23 03:16:12 np0005593234 systemd[4310]: Reached target Sockets.
Jan 23 03:16:12 np0005593234 systemd[4310]: Reached target Basic System.
Jan 23 03:16:12 np0005593234 systemd[4310]: Reached target Main User Target.
Jan 23 03:16:12 np0005593234 systemd[4310]: Startup finished in 130ms.
Jan 23 03:16:12 np0005593234 systemd[1]: Started User Manager for UID 1000.
Jan 23 03:16:12 np0005593234 systemd[1]: Started Session 1 of User zuul.
Jan 23 03:16:12 np0005593234 python3[4392]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:16:19 np0005593234 python3[4420]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:16:30 np0005593234 python3[4478]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:16:31 np0005593234 python3[4518]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 23 03:16:34 np0005593234 python3[4544]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC9EiUM/Wr1dKB5RqtCvSqVc226WpmY1hGtuMMJuG9WgOJFIcFQvrVt53c29e+5OTV1q1e4Rmvj7g1SyULPxSS7/DtzxyTWY79kkBpxNFmAUeUA4U0adsVRTquvEONpBa4UE3bqCTtaRQa8XED98xqS4bCBXmbcLROlQ4Qc91Uj3wxKY4/fplPdXYZdZXz3cxwEsyC6dRkYcfiUSowlrmecr3FZO6SJfG9H4YFxzwAu1R4led86PwzjZJyHfDeIHcdaDUVFcX2hGQv9iIqgYP58aTb2gRp2PxSQJfGAevolpgA3xrQKo2uBDBuRTC/hE81toPd5IIPQ3lX2JDXxauMMbmmxSjYCltaP2/bcvZ697yZh1vEmyz62itMHt6GV69XsjsX5jHWhY2RtQ6ZpsNSqrOHSUj4jlPcZEFk+4UshKJJZNaM1psuS+KAGeodosF43EuKDbWMGeqCe/kwZaBXj/Xxob+rLcVQBMVOBq+EHuNNKxSIqaNZiMz0RBf11CUk= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:34 np0005593234 python3[4568]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:35 np0005593234 python3[4667]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:16:35 np0005593234 python3[4738]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769156194.8699286-254-181285763462791/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=87e5a27e13f74e79b84f2ddd13a58bce_id_rsa follow=False checksum=40e82d1acb27268baed51ce64c7c4dfd80f45a5d backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:36 np0005593234 python3[4861]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:16:36 np0005593234 python3[4932]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769156196.1172624-308-92202610850656/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=87e5a27e13f74e79b84f2ddd13a58bce_id_rsa.pub follow=False checksum=1c043d58f1d2a49f415267f4f9437247d6d980d7 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:38 np0005593234 python3[4980]: ansible-ping Invoked with data=pong
Jan 23 03:16:39 np0005593234 python3[5004]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:16:41 np0005593234 python3[5062]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 23 03:16:42 np0005593234 python3[5094]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:42 np0005593234 python3[5118]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:42 np0005593234 python3[5142]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:43 np0005593234 python3[5166]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:43 np0005593234 python3[5190]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:43 np0005593234 python3[5214]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:45 np0005593234 python3[5240]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:46 np0005593234 python3[5318]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:16:46 np0005593234 python3[5391]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769156205.721043-35-90947939046821/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:47 np0005593234 python3[5439]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:47 np0005593234 python3[5463]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:47 np0005593234 python3[5487]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:47 np0005593234 python3[5511]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:48 np0005593234 python3[5535]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:48 np0005593234 python3[5559]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:48 np0005593234 python3[5583]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:48 np0005593234 python3[5607]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:49 np0005593234 python3[5631]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:49 np0005593234 python3[5655]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:49 np0005593234 python3[5679]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:50 np0005593234 python3[5703]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:50 np0005593234 python3[5727]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:50 np0005593234 python3[5751]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:51 np0005593234 python3[5775]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:51 np0005593234 python3[5799]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:51 np0005593234 python3[5823]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:52 np0005593234 python3[5847]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:52 np0005593234 python3[5871]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:52 np0005593234 python3[5895]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:52 np0005593234 python3[5919]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:53 np0005593234 python3[5943]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:53 np0005593234 python3[5967]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:53 np0005593234 python3[5991]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:53 np0005593234 python3[6015]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:54 np0005593234 python3[6039]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:16:56 np0005593234 python3[6065]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 03:16:56 np0005593234 systemd[1]: Starting Time & Date Service...
Jan 23 03:16:56 np0005593234 systemd[1]: Started Time & Date Service.
Jan 23 03:16:56 np0005593234 systemd-timedated[6067]: Changed time zone to 'UTC' (UTC).
Jan 23 03:16:58 np0005593234 python3[6098]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:58 np0005593234 python3[6174]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:16:59 np0005593234 python3[6245]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769156218.467752-255-34384502288647/source _original_basename=tmpt_g6euo1 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:16:59 np0005593234 python3[6345]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:16:59 np0005593234 python3[6416]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769156219.2801986-304-1992643512360/source _original_basename=tmpj0it3bzi follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:17:00 np0005593234 python3[6518]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:17:01 np0005593234 python3[6591]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769156220.4782577-384-221536886738379/source _original_basename=tmpnt_ebx1e follow=False checksum=9dc2039529c0f35ddba9b5f747501467f5135778 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:17:01 np0005593234 python3[6639]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:17:01 np0005593234 python3[6665]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:17:02 np0005593234 python3[6745]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:17:02 np0005593234 python3[6818]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769156222.211341-455-33273876619304/source _original_basename=tmp5p_1znbv follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:17:03 np0005593234 python3[6869]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-51e8-11f9-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:17:04 np0005593234 python3[6897]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-51e8-11f9-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 23 03:17:05 np0005593234 python3[6925]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:17:26 np0005593234 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 03:17:38 np0005593234 python3[6953]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:18:22 np0005593234 systemd[4310]: Starting Mark boot as successful...
Jan 23 03:18:22 np0005593234 systemd[4310]: Finished Mark boot as successful.
Jan 23 03:18:38 np0005593234 systemd-logind[794]: Session 1 logged out. Waiting for processes to exit.
Jan 23 03:19:09 np0005593234 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 23 03:19:09 np0005593234 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 23 03:19:09 np0005593234 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 23 03:19:09 np0005593234 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 23 03:19:09 np0005593234 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 23 03:19:09 np0005593234 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 23 03:19:09 np0005593234 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 23 03:19:09 np0005593234 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 23 03:19:09 np0005593234 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 23 03:19:09 np0005593234 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 23 03:19:09 np0005593234 NetworkManager[858]: <info>  [1769156349.5826] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 03:19:09 np0005593234 systemd-udevd[6956]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 03:19:09 np0005593234 NetworkManager[858]: <info>  [1769156349.5984] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:19:09 np0005593234 NetworkManager[858]: <info>  [1769156349.6010] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 23 03:19:09 np0005593234 NetworkManager[858]: <info>  [1769156349.6016] device (eth1): carrier: link connected
Jan 23 03:19:09 np0005593234 NetworkManager[858]: <info>  [1769156349.6019] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 23 03:19:09 np0005593234 NetworkManager[858]: <info>  [1769156349.6025] policy: auto-activating connection 'Wired connection 1' (1265e500-f61d-3e4a-baaf-c93c8bb5c40a)
Jan 23 03:19:09 np0005593234 NetworkManager[858]: <info>  [1769156349.6029] device (eth1): Activation: starting connection 'Wired connection 1' (1265e500-f61d-3e4a-baaf-c93c8bb5c40a)
Jan 23 03:19:09 np0005593234 NetworkManager[858]: <info>  [1769156349.6031] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:19:09 np0005593234 NetworkManager[858]: <info>  [1769156349.6035] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:19:09 np0005593234 NetworkManager[858]: <info>  [1769156349.6039] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:19:09 np0005593234 NetworkManager[858]: <info>  [1769156349.6044] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:19:10 np0005593234 systemd-logind[794]: New session 3 of user zuul.
Jan 23 03:19:10 np0005593234 systemd[1]: Started Session 3 of User zuul.
Jan 23 03:19:10 np0005593234 python3[6988]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-ea81-0856-0000000001f6-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:19:20 np0005593234 python3[7068]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:19:20 np0005593234 python3[7141]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769156360.0913858-206-104121910095112/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=b16cf6806b012f44881e641077d509c0980e9144 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:19:21 np0005593234 python3[7191]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:19:21 np0005593234 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 23 03:19:21 np0005593234 systemd[1]: Stopped Network Manager Wait Online.
Jan 23 03:19:21 np0005593234 systemd[1]: Stopping Network Manager Wait Online...
Jan 23 03:19:21 np0005593234 systemd[1]: Stopping Network Manager...
Jan 23 03:19:21 np0005593234 NetworkManager[858]: <info>  [1769156361.2041] caught SIGTERM, shutting down normally.
Jan 23 03:19:21 np0005593234 NetworkManager[858]: <info>  [1769156361.2050] dhcp4 (eth0): canceled DHCP transaction
Jan 23 03:19:21 np0005593234 NetworkManager[858]: <info>  [1769156361.2050] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:19:21 np0005593234 NetworkManager[858]: <info>  [1769156361.2050] dhcp4 (eth0): state changed no lease
Jan 23 03:19:21 np0005593234 NetworkManager[858]: <info>  [1769156361.2051] manager: NetworkManager state is now CONNECTING
Jan 23 03:19:21 np0005593234 NetworkManager[858]: <info>  [1769156361.2136] dhcp4 (eth1): canceled DHCP transaction
Jan 23 03:19:21 np0005593234 NetworkManager[858]: <info>  [1769156361.2136] dhcp4 (eth1): state changed no lease
Jan 23 03:19:21 np0005593234 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 03:19:21 np0005593234 NetworkManager[858]: <info>  [1769156361.2229] exiting (success)
Jan 23 03:19:21 np0005593234 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 03:19:21 np0005593234 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 23 03:19:21 np0005593234 systemd[1]: Stopped Network Manager.
Jan 23 03:19:21 np0005593234 systemd[1]: NetworkManager.service: Consumed 1.412s CPU time, 10.0M memory peak.
Jan 23 03:19:21 np0005593234 systemd[1]: Starting Network Manager...
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.2771] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:ecc0581c-6611-4866-b3e1-0bc978951940)
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.2772] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.2817] manager[0x55b8aa87c000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 03:19:21 np0005593234 systemd[1]: Starting Hostname Service...
Jan 23 03:19:21 np0005593234 systemd[1]: Started Hostname Service.
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3558] hostname: hostname: using hostnamed
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3558] hostname: static hostname changed from (none) to "np0005593234.novalocal"
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3563] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3567] manager[0x55b8aa87c000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3567] manager[0x55b8aa87c000]: rfkill: WWAN hardware radio set enabled
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3595] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3595] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3596] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3596] manager: Networking is enabled by state file
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3598] settings: Loaded settings plugin: keyfile (internal)
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3601] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3627] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3635] dhcp: init: Using DHCP client 'internal'
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3637] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3641] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3646] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3654] device (lo): Activation: starting connection 'lo' (4dda678f-e4eb-42eb-8b71-b7827298a97a)
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3660] device (eth0): carrier: link connected
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3664] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3668] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3668] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3674] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3680] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3685] device (eth1): carrier: link connected
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3689] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3694] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (1265e500-f61d-3e4a-baaf-c93c8bb5c40a) (indicated)
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3694] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3699] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3705] device (eth1): Activation: starting connection 'Wired connection 1' (1265e500-f61d-3e4a-baaf-c93c8bb5c40a)
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3711] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 03:19:21 np0005593234 systemd[1]: Started Network Manager.
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3715] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3717] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3719] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3721] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3724] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3728] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3731] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3733] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3744] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3755] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3763] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3766] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3782] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3792] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3798] device (lo): Activation: successful, device activated.
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3803] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.3810] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 03:19:21 np0005593234 systemd[1]: Starting Network Manager Wait Online...
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.4441] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.4463] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.4464] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.4468] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.4471] device (eth0): Activation: successful, device activated.
Jan 23 03:19:21 np0005593234 NetworkManager[7203]: <info>  [1769156361.4476] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 03:19:21 np0005593234 python3[7275]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-ea81-0856-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:19:31 np0005593234 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 03:19:51 np0005593234 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 03:20:06 np0005593234 NetworkManager[7203]: <info>  [1769156406.9746] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 03:20:06 np0005593234 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 03:20:06 np0005593234 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 03:20:06 np0005593234 NetworkManager[7203]: <info>  [1769156406.9978] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 03:20:06 np0005593234 NetworkManager[7203]: <info>  [1769156406.9981] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 03:20:06 np0005593234 NetworkManager[7203]: <info>  [1769156406.9988] device (eth1): Activation: successful, device activated.
Jan 23 03:20:06 np0005593234 NetworkManager[7203]: <info>  [1769156406.9996] manager: startup complete
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <info>  [1769156406.9999] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <warn>  [1769156407.0005] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <info>  [1769156407.0012] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 23 03:20:07 np0005593234 systemd[1]: Finished Network Manager Wait Online.
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <info>  [1769156407.0166] dhcp4 (eth1): canceled DHCP transaction
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <info>  [1769156407.0167] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <info>  [1769156407.0167] dhcp4 (eth1): state changed no lease
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <info>  [1769156407.0181] policy: auto-activating connection 'ci-private-network' (8808941e-5666-5885-aa66-75a36520f7d1)
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <info>  [1769156407.0184] device (eth1): Activation: starting connection 'ci-private-network' (8808941e-5666-5885-aa66-75a36520f7d1)
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <info>  [1769156407.0185] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <info>  [1769156407.0187] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <info>  [1769156407.0193] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <info>  [1769156407.0201] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <info>  [1769156407.0621] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <info>  [1769156407.0623] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:20:07 np0005593234 NetworkManager[7203]: <info>  [1769156407.0627] device (eth1): Activation: successful, device activated.
Jan 23 03:20:17 np0005593234 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 03:20:21 np0005593234 systemd[1]: session-3.scope: Deactivated successfully.
Jan 23 03:20:21 np0005593234 systemd[1]: session-3.scope: Consumed 1.354s CPU time.
Jan 23 03:20:21 np0005593234 systemd-logind[794]: Session 3 logged out. Waiting for processes to exit.
Jan 23 03:20:21 np0005593234 systemd-logind[794]: Removed session 3.
Jan 23 03:20:33 np0005593234 systemd-logind[794]: New session 4 of user zuul.
Jan 23 03:20:33 np0005593234 systemd[1]: Started Session 4 of User zuul.
Jan 23 03:20:34 np0005593234 python3[7386]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:20:34 np0005593234 python3[7459]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769156433.9332118-373-64932963123265/source _original_basename=tmpz_5jatnd follow=False checksum=2312614d64ff3d0f4e5be15d02ba4fd1b13cd8df backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:20:36 np0005593234 systemd[1]: session-4.scope: Deactivated successfully.
Jan 23 03:20:36 np0005593234 systemd-logind[794]: Session 4 logged out. Waiting for processes to exit.
Jan 23 03:20:36 np0005593234 systemd-logind[794]: Removed session 4.
Jan 23 03:21:22 np0005593234 systemd[4310]: Created slice User Background Tasks Slice.
Jan 23 03:21:22 np0005593234 systemd[4310]: Starting Cleanup of User's Temporary Files and Directories...
Jan 23 03:21:22 np0005593234 systemd[4310]: Finished Cleanup of User's Temporary Files and Directories.
Jan 23 03:21:45 np0005593234 chronyd[787]: Selected source 23.159.16.194 (2.centos.pool.ntp.org)
Jan 23 03:26:26 np0005593234 systemd-logind[794]: New session 5 of user zuul.
Jan 23 03:26:26 np0005593234 systemd[1]: Started Session 5 of User zuul.
Jan 23 03:26:26 np0005593234 python3[7530]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-81c6-2885-000000000ca6-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:26:27 np0005593234 python3[7559]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:26:27 np0005593234 python3[7585]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:26:27 np0005593234 python3[7611]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:26:28 np0005593234 python3[7637]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:26:28 np0005593234 python3[7663]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:26:29 np0005593234 python3[7741]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:26:29 np0005593234 python3[7814]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769156789.151696-369-138798110658348/source _original_basename=tmpv2d6x9wg follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:26:31 np0005593234 python3[7864]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 03:26:31 np0005593234 systemd[1]: Reloading.
Jan 23 03:26:31 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:26:33 np0005593234 python3[7920]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 23 03:26:34 np0005593234 python3[7946]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:26:34 np0005593234 python3[7974]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:26:35 np0005593234 python3[8002]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:26:35 np0005593234 python3[8030]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:26:36 np0005593234 python3[8057]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-81c6-2885-000000000cad-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:26:37 np0005593234 python3[8087]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 23 03:26:40 np0005593234 systemd[1]: session-5.scope: Deactivated successfully.
Jan 23 03:26:40 np0005593234 systemd[1]: session-5.scope: Consumed 3.711s CPU time.
Jan 23 03:26:40 np0005593234 systemd-logind[794]: Session 5 logged out. Waiting for processes to exit.
Jan 23 03:26:40 np0005593234 systemd-logind[794]: Removed session 5.
Jan 23 03:26:41 np0005593234 systemd-logind[794]: New session 6 of user zuul.
Jan 23 03:26:41 np0005593234 systemd[1]: Started Session 6 of User zuul.
Jan 23 03:26:42 np0005593234 python3[8121]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 23 03:26:48 np0005593234 setsebool[8163]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 23 03:26:48 np0005593234 setsebool[8163]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 23 03:27:00 np0005593234 kernel: SELinux:  Converting 385 SID table entries...
Jan 23 03:27:00 np0005593234 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 03:27:00 np0005593234 kernel: SELinux:  policy capability open_perms=1
Jan 23 03:27:00 np0005593234 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 03:27:00 np0005593234 kernel: SELinux:  policy capability always_check_network=0
Jan 23 03:27:00 np0005593234 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 03:27:00 np0005593234 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 03:27:00 np0005593234 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 03:27:11 np0005593234 kernel: SELinux:  Converting 388 SID table entries...
Jan 23 03:27:12 np0005593234 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 03:27:12 np0005593234 kernel: SELinux:  policy capability open_perms=1
Jan 23 03:27:12 np0005593234 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 03:27:12 np0005593234 kernel: SELinux:  policy capability always_check_network=0
Jan 23 03:27:12 np0005593234 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 03:27:12 np0005593234 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 03:27:12 np0005593234 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 03:27:30 np0005593234 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 23 03:27:30 np0005593234 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 03:27:30 np0005593234 systemd[1]: Starting man-db-cache-update.service...
Jan 23 03:27:30 np0005593234 systemd[1]: Reloading.
Jan 23 03:27:30 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:27:30 np0005593234 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 03:28:13 np0005593234 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 03:28:13 np0005593234 systemd[1]: Finished man-db-cache-update.service.
Jan 23 03:28:13 np0005593234 systemd[1]: man-db-cache-update.service: Consumed 53.041s CPU time.
Jan 23 03:28:13 np0005593234 systemd[1]: run-r3cad523bf1264a28b5382f1694a12619.service: Deactivated successfully.
Jan 23 03:28:19 np0005593234 python3[29516]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-57be-5ddd-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:28:20 np0005593234 kernel: evm: overlay not supported
Jan 23 03:28:20 np0005593234 systemd[4310]: Starting D-Bus User Message Bus...
Jan 23 03:28:20 np0005593234 dbus-broker-launch[29576]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 23 03:28:20 np0005593234 dbus-broker-launch[29576]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 23 03:28:20 np0005593234 systemd[4310]: Started D-Bus User Message Bus.
Jan 23 03:28:20 np0005593234 dbus-broker-lau[29576]: Ready
Jan 23 03:28:20 np0005593234 systemd[4310]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 23 03:28:20 np0005593234 systemd[4310]: Created slice Slice /user.
Jan 23 03:28:20 np0005593234 systemd[4310]: podman-29556.scope: unit configures an IP firewall, but not running as root.
Jan 23 03:28:20 np0005593234 systemd[4310]: (This warning is only shown for the first unit using IP firewalling.)
Jan 23 03:28:20 np0005593234 systemd[4310]: Started podman-29556.scope.
Jan 23 03:28:20 np0005593234 systemd[4310]: Started podman-pause-d8bb6e70.scope.
Jan 23 03:28:21 np0005593234 python3[29604]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.129.56.147:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.129.56.147:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:28:21 np0005593234 python3[29604]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Jan 23 03:28:21 np0005593234 systemd[1]: session-6.scope: Deactivated successfully.
Jan 23 03:28:21 np0005593234 systemd[1]: session-6.scope: Consumed 44.667s CPU time.
Jan 23 03:28:21 np0005593234 systemd-logind[794]: Session 6 logged out. Waiting for processes to exit.
Jan 23 03:28:21 np0005593234 systemd-logind[794]: Removed session 6.
Jan 23 03:28:49 np0005593234 systemd-logind[794]: New session 7 of user zuul.
Jan 23 03:28:49 np0005593234 systemd[1]: Started Session 7 of User zuul.
Jan 23 03:28:49 np0005593234 python3[29643]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBO5lLQe2ste4Gmi1Ir356q/C15WL/7fzZRoS9rMZVgSYU6jYrJKHH43bSyQAo3PQspZm2qMkx0r+2fgxF65A8l0= zuul@np0005593231.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:28:50 np0005593234 python3[29669]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBO5lLQe2ste4Gmi1Ir356q/C15WL/7fzZRoS9rMZVgSYU6jYrJKHH43bSyQAo3PQspZm2qMkx0r+2fgxF65A8l0= zuul@np0005593231.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:28:51 np0005593234 python3[29695]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005593234.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 23 03:28:52 np0005593234 python3[29729]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBO5lLQe2ste4Gmi1Ir356q/C15WL/7fzZRoS9rMZVgSYU6jYrJKHH43bSyQAo3PQspZm2qMkx0r+2fgxF65A8l0= zuul@np0005593231.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 03:28:52 np0005593234 python3[29807]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:28:53 np0005593234 python3[29880]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769156932.3176847-170-171599610491860/source _original_basename=tmp0qx5eke_ follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:28:53 np0005593234 python3[29930]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Jan 23 03:28:53 np0005593234 systemd[1]: Starting Hostname Service...
Jan 23 03:28:53 np0005593234 systemd[1]: Started Hostname Service.
Jan 23 03:28:54 np0005593234 systemd-hostnamed[29934]: Changed pretty hostname to 'compute-2'
Jan 23 03:28:54 np0005593234 systemd-hostnamed[29934]: Hostname set to <compute-2> (static)
Jan 23 03:28:54 np0005593234 NetworkManager[7203]: <info>  [1769156934.0289] hostname: static hostname changed from "np0005593234.novalocal" to "compute-2"
Jan 23 03:28:54 np0005593234 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 03:28:54 np0005593234 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 03:28:54 np0005593234 systemd-logind[794]: Session 7 logged out. Waiting for processes to exit.
Jan 23 03:28:54 np0005593234 systemd[1]: session-7.scope: Deactivated successfully.
Jan 23 03:28:54 np0005593234 systemd[1]: session-7.scope: Consumed 2.048s CPU time.
Jan 23 03:28:54 np0005593234 systemd-logind[794]: Removed session 7.
Jan 23 03:29:04 np0005593234 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 03:29:24 np0005593234 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 03:30:22 np0005593234 systemd[1]: Starting Cleanup of Temporary Directories...
Jan 23 03:30:22 np0005593234 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 23 03:30:22 np0005593234 systemd[1]: Finished Cleanup of Temporary Directories.
Jan 23 03:30:22 np0005593234 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 23 03:33:53 np0005593234 systemd-logind[794]: New session 8 of user zuul.
Jan 23 03:33:53 np0005593234 systemd[1]: Started Session 8 of User zuul.
Jan 23 03:33:54 np0005593234 python3[30041]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:33:56 np0005593234 python3[30157]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:33:57 np0005593234 python3[30230]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157236.1673222-34018-267897590446211/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:33:57 np0005593234 python3[30256]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:33:57 np0005593234 python3[30329]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157236.1673222-34018-267897590446211/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:33:58 np0005593234 python3[30355]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:33:58 np0005593234 python3[30428]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157236.1673222-34018-267897590446211/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:33:58 np0005593234 python3[30454]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:33:59 np0005593234 python3[30527]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157236.1673222-34018-267897590446211/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:33:59 np0005593234 python3[30553]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:33:59 np0005593234 python3[30626]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157236.1673222-34018-267897590446211/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:33:59 np0005593234 python3[30652]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:34:00 np0005593234 python3[30725]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157236.1673222-34018-267897590446211/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:34:00 np0005593234 python3[30751]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:34:00 np0005593234 python3[30824]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769157236.1673222-34018-267897590446211/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:34:15 np0005593234 python3[30872]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:39:14 np0005593234 systemd[1]: session-8.scope: Deactivated successfully.
Jan 23 03:39:14 np0005593234 systemd[1]: session-8.scope: Consumed 5.034s CPU time.
Jan 23 03:39:14 np0005593234 systemd-logind[794]: Session 8 logged out. Waiting for processes to exit.
Jan 23 03:39:14 np0005593234 systemd-logind[794]: Removed session 8.
Jan 23 03:49:32 np0005593234 systemd-logind[794]: New session 9 of user zuul.
Jan 23 03:49:32 np0005593234 systemd[1]: Started Session 9 of User zuul.
Jan 23 03:49:34 np0005593234 python3.9[31043]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:49:35 np0005593234 python3.9[31224]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:49:43 np0005593234 systemd[1]: session-9.scope: Deactivated successfully.
Jan 23 03:49:43 np0005593234 systemd[1]: session-9.scope: Consumed 7.736s CPU time.
Jan 23 03:49:43 np0005593234 systemd-logind[794]: Session 9 logged out. Waiting for processes to exit.
Jan 23 03:49:43 np0005593234 systemd-logind[794]: Removed session 9.
Jan 23 03:50:00 np0005593234 systemd-logind[794]: New session 10 of user zuul.
Jan 23 03:50:00 np0005593234 systemd[1]: Started Session 10 of User zuul.
Jan 23 03:50:00 np0005593234 python3.9[31435]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 23 03:50:02 np0005593234 python3.9[31609]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:50:03 np0005593234 python3.9[31761]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:50:05 np0005593234 python3.9[31914]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:50:06 np0005593234 python3.9[32066]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:50:06 np0005593234 python3.9[32218]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:50:07 np0005593234 python3.9[32341]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158206.3624496-180-22151981205476/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:50:08 np0005593234 python3.9[32493]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:50:09 np0005593234 python3.9[32649]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:50:09 np0005593234 python3.9[32801]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:50:10 np0005593234 python3.9[32951]: ansible-ansible.builtin.service_facts Invoked
Jan 23 03:50:16 np0005593234 python3.9[33204]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:50:17 np0005593234 python3.9[33354]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:50:18 np0005593234 python3.9[33508]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:50:19 np0005593234 python3.9[33666]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:50:20 np0005593234 python3.9[33750]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:51:03 np0005593234 systemd[1]: Reloading.
Jan 23 03:51:03 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:51:03 np0005593234 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 23 03:51:03 np0005593234 systemd[1]: Reloading.
Jan 23 03:51:03 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:51:03 np0005593234 systemd[1]: Starting dnf makecache...
Jan 23 03:51:03 np0005593234 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 23 03:51:03 np0005593234 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 23 03:51:03 np0005593234 systemd[1]: Reloading.
Jan 23 03:51:03 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:51:03 np0005593234 dnf[34001]: Failed determining last makecache time.
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-openstack-barbican-42b4c41831408a8e323 156 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 195 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-openstack-cinder-1c00d6490d88e436f26ef 148 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-python-stevedore-c4acc5639fd2329372142 174 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-python-cloudkitty-tests-tempest-2c80f8 183 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-os-refresh-config-9bfc52b5049be2d8de61 179 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 160 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-python-designate-tests-tempest-347fdbc 186 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-openstack-glance-1fd12c29b339f30fe823e 149 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 03:51:04 np0005593234 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 152 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-openstack-manila-3c01b7181572c95dac462 172 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-python-whitebox-neutron-tests-tempest- 187 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-openstack-octavia-ba397f07a7331190208c 180 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-openstack-watcher-c014f81a8647287f6dcc 196 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-ansible-config_template-5ccaa22121a7ff 197 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 182 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-openstack-swift-dc98a8463506ac520c469a 187 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-python-tempestconf-8515371b7cceebd4282 186 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: delorean-openstack-heat-ui-013accbfd179753bc3f0 168 kB/s | 3.0 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: CentOS Stream 9 - BaseOS                         29 kB/s | 6.7 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: CentOS Stream 9 - AppStream                      68 kB/s | 6.8 kB     00:00
Jan 23 03:51:04 np0005593234 dnf[34001]: CentOS Stream 9 - CRB                            62 kB/s | 6.6 kB     00:00
Jan 23 03:51:05 np0005593234 dnf[34001]: CentOS Stream 9 - Extras packages                77 kB/s | 7.3 kB     00:00
Jan 23 03:51:05 np0005593234 dnf[34001]: dlrn-antelope-testing                           110 kB/s | 3.0 kB     00:00
Jan 23 03:51:05 np0005593234 dnf[34001]: dlrn-antelope-build-deps                         95 kB/s | 3.0 kB     00:00
Jan 23 03:51:05 np0005593234 dnf[34001]: centos9-rabbitmq                                 54 kB/s | 3.0 kB     00:00
Jan 23 03:51:05 np0005593234 dnf[34001]: centos9-storage                                 126 kB/s | 3.0 kB     00:00
Jan 23 03:51:05 np0005593234 dnf[34001]: centos9-opstools                                141 kB/s | 3.0 kB     00:00
Jan 23 03:51:05 np0005593234 dnf[34001]: NFV SIG OpenvSwitch                             149 kB/s | 3.0 kB     00:00
Jan 23 03:51:05 np0005593234 dnf[34001]: repo-setup-centos-appstream                     203 kB/s | 4.4 kB     00:00
Jan 23 03:51:05 np0005593234 dnf[34001]: repo-setup-centos-baseos                        183 kB/s | 3.9 kB     00:00
Jan 23 03:51:05 np0005593234 dnf[34001]: repo-setup-centos-highavailability              180 kB/s | 3.9 kB     00:00
Jan 23 03:51:05 np0005593234 dnf[34001]: repo-setup-centos-powertools                    167 kB/s | 4.3 kB     00:00
Jan 23 03:51:05 np0005593234 dnf[34001]: Extra Packages for Enterprise Linux 9 - x86_64  126 kB/s |  17 kB     00:00
Jan 23 03:51:06 np0005593234 dnf[34001]: Metadata cache created.
Jan 23 03:51:06 np0005593234 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 23 03:51:06 np0005593234 systemd[1]: Finished dnf makecache.
Jan 23 03:51:06 np0005593234 systemd[1]: dnf-makecache.service: Consumed 1.725s CPU time.
Jan 23 03:52:04 np0005593234 kernel: SELinux:  Converting 2724 SID table entries...
Jan 23 03:52:04 np0005593234 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 03:52:04 np0005593234 kernel: SELinux:  policy capability open_perms=1
Jan 23 03:52:04 np0005593234 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 03:52:04 np0005593234 kernel: SELinux:  policy capability always_check_network=0
Jan 23 03:52:04 np0005593234 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 03:52:04 np0005593234 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 03:52:04 np0005593234 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 03:52:04 np0005593234 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 23 03:52:05 np0005593234 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 03:52:05 np0005593234 systemd[1]: Starting man-db-cache-update.service...
Jan 23 03:52:05 np0005593234 systemd[1]: Reloading.
Jan 23 03:52:05 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:52:05 np0005593234 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 03:52:06 np0005593234 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 03:52:06 np0005593234 systemd[1]: Finished man-db-cache-update.service.
Jan 23 03:52:06 np0005593234 systemd[1]: man-db-cache-update.service: Consumed 1.203s CPU time.
Jan 23 03:52:06 np0005593234 systemd[1]: run-rb3cb631477134a86b8cd64b70bf82221.service: Deactivated successfully.
Jan 23 03:52:14 np0005593234 python3.9[35314]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:52:16 np0005593234 python3.9[35595]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 23 03:52:17 np0005593234 python3.9[35747]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 23 03:52:19 np0005593234 python3.9[35900]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:52:21 np0005593234 python3.9[36052]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 23 03:52:22 np0005593234 python3.9[36204]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:52:28 np0005593234 python3.9[36356]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:52:29 np0005593234 python3.9[36480]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158343.2644074-670-178750651122569/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=193e99f8e1220a4ec0ffff2d0cee79b79a562ce2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:52:30 np0005593234 python3.9[36632]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:52:35 np0005593234 python3.9[36784]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:52:36 np0005593234 python3.9[36937]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:52:37 np0005593234 python3.9[37089]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 23 03:52:37 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 03:52:38 np0005593234 python3.9[37243]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 03:52:39 np0005593234 python3.9[37401]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 03:52:40 np0005593234 python3.9[37561]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 23 03:52:41 np0005593234 python3.9[37714]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 03:52:42 np0005593234 python3.9[37872]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 23 03:52:43 np0005593234 python3.9[38024]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:52:46 np0005593234 python3.9[38177]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:52:47 np0005593234 python3.9[38329]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:52:47 np0005593234 python3.9[38452]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158366.662569-1026-46453009455099/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:52:48 np0005593234 python3.9[38604]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:52:48 np0005593234 systemd[1]: Starting Load Kernel Modules...
Jan 23 03:52:48 np0005593234 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 23 03:52:48 np0005593234 kernel: Bridge firewalling registered
Jan 23 03:52:48 np0005593234 systemd-modules-load[38608]: Inserted module 'br_netfilter'
Jan 23 03:52:48 np0005593234 systemd[1]: Finished Load Kernel Modules.
Jan 23 03:52:49 np0005593234 python3.9[38763]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:52:50 np0005593234 python3.9[38886]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158369.2748446-1095-206596266162940/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:52:51 np0005593234 python3.9[39038]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:52:54 np0005593234 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 03:52:54 np0005593234 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 03:52:55 np0005593234 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 03:52:55 np0005593234 systemd[1]: Starting man-db-cache-update.service...
Jan 23 03:52:55 np0005593234 systemd[1]: Reloading.
Jan 23 03:52:55 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:52:55 np0005593234 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 03:52:58 np0005593234 python3.9[42341]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:52:59 np0005593234 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 03:52:59 np0005593234 systemd[1]: Finished man-db-cache-update.service.
Jan 23 03:52:59 np0005593234 systemd[1]: man-db-cache-update.service: Consumed 4.702s CPU time.
Jan 23 03:52:59 np0005593234 systemd[1]: run-r6ed434cd0eeb435e876bd088944c6c88.service: Deactivated successfully.
Jan 23 03:52:59 np0005593234 python3.9[42905]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 23 03:53:00 np0005593234 python3.9[43058]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:53:01 np0005593234 python3.9[43210]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:53:01 np0005593234 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 03:53:01 np0005593234 systemd[1]: Starting Authorization Manager...
Jan 23 03:53:01 np0005593234 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 03:53:01 np0005593234 polkitd[43427]: Started polkitd version 0.117
Jan 23 03:53:01 np0005593234 systemd[1]: Started Authorization Manager.
Jan 23 03:53:02 np0005593234 python3.9[43597]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:53:02 np0005593234 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 23 03:53:02 np0005593234 systemd[1]: tuned.service: Deactivated successfully.
Jan 23 03:53:02 np0005593234 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 23 03:53:02 np0005593234 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 03:53:02 np0005593234 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 03:53:03 np0005593234 python3.9[43758]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 23 03:53:08 np0005593234 python3.9[43910]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:53:08 np0005593234 systemd[1]: Reloading.
Jan 23 03:53:08 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:53:09 np0005593234 python3.9[44099]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:53:09 np0005593234 systemd[1]: Reloading.
Jan 23 03:53:09 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:53:10 np0005593234 python3.9[44287]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:53:11 np0005593234 python3.9[44440]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:53:11 np0005593234 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 23 03:53:12 np0005593234 python3.9[44593]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:53:14 np0005593234 python3.9[44755]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:53:14 np0005593234 python3.9[44908]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:53:15 np0005593234 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 03:53:15 np0005593234 systemd[1]: Stopped Apply Kernel Variables.
Jan 23 03:53:15 np0005593234 systemd[1]: Stopping Apply Kernel Variables...
Jan 23 03:53:15 np0005593234 systemd[1]: Starting Apply Kernel Variables...
Jan 23 03:53:15 np0005593234 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 23 03:53:15 np0005593234 systemd[1]: Finished Apply Kernel Variables.
Jan 23 03:53:15 np0005593234 systemd[1]: session-10.scope: Deactivated successfully.
Jan 23 03:53:15 np0005593234 systemd[1]: session-10.scope: Consumed 2min 10.857s CPU time.
Jan 23 03:53:15 np0005593234 systemd-logind[794]: Session 10 logged out. Waiting for processes to exit.
Jan 23 03:53:15 np0005593234 systemd-logind[794]: Removed session 10.
Jan 23 03:53:21 np0005593234 systemd-logind[794]: New session 11 of user zuul.
Jan 23 03:53:21 np0005593234 systemd[1]: Started Session 11 of User zuul.
Jan 23 03:53:22 np0005593234 python3.9[45091]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:53:24 np0005593234 python3.9[45247]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 23 03:53:24 np0005593234 python3.9[45400]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 03:53:26 np0005593234 python3.9[45558]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 03:53:27 np0005593234 python3.9[45718]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:53:28 np0005593234 python3.9[45802]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 03:53:31 np0005593234 python3.9[45965]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:53:43 np0005593234 kernel: SELinux:  Converting 2736 SID table entries...
Jan 23 03:53:43 np0005593234 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 03:53:43 np0005593234 kernel: SELinux:  policy capability open_perms=1
Jan 23 03:53:43 np0005593234 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 03:53:43 np0005593234 kernel: SELinux:  policy capability always_check_network=0
Jan 23 03:53:43 np0005593234 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 03:53:43 np0005593234 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 03:53:43 np0005593234 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 03:53:43 np0005593234 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 23 03:53:43 np0005593234 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 23 03:53:44 np0005593234 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 03:53:44 np0005593234 systemd[1]: Starting man-db-cache-update.service...
Jan 23 03:53:44 np0005593234 systemd[1]: Reloading.
Jan 23 03:53:44 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:53:44 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:53:45 np0005593234 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 03:53:46 np0005593234 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 03:53:46 np0005593234 systemd[1]: Finished man-db-cache-update.service.
Jan 23 03:53:46 np0005593234 systemd[1]: run-r1e4872809adf4a79a8a9f743a0232e74.service: Deactivated successfully.
Jan 23 03:53:48 np0005593234 python3.9[47062]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 03:53:49 np0005593234 systemd[1]: Reloading.
Jan 23 03:53:49 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:53:49 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:53:49 np0005593234 systemd[1]: Starting Open vSwitch Database Unit...
Jan 23 03:53:49 np0005593234 chown[47104]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 23 03:53:49 np0005593234 ovs-ctl[47109]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 23 03:53:49 np0005593234 ovs-ctl[47109]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 23 03:53:49 np0005593234 ovs-ctl[47109]: Starting ovsdb-server [  OK  ]
Jan 23 03:53:49 np0005593234 ovs-vsctl[47158]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 23 03:53:49 np0005593234 ovs-vsctl[47174]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"3ec410d4-99bb-47ec-9f70-86f8400b2621\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 23 03:53:49 np0005593234 ovs-ctl[47109]: Configuring Open vSwitch system IDs [  OK  ]
Jan 23 03:53:49 np0005593234 ovs-vsctl[47184]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 23 03:53:49 np0005593234 ovs-ctl[47109]: Enabling remote OVSDB managers [  OK  ]
Jan 23 03:53:49 np0005593234 systemd[1]: Started Open vSwitch Database Unit.
Jan 23 03:53:49 np0005593234 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 23 03:53:49 np0005593234 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 23 03:53:49 np0005593234 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 23 03:53:49 np0005593234 kernel: openvswitch: Open vSwitch switching datapath
Jan 23 03:53:49 np0005593234 ovs-ctl[47229]: Inserting openvswitch module [  OK  ]
Jan 23 03:53:49 np0005593234 ovs-ctl[47198]: Starting ovs-vswitchd [  OK  ]
Jan 23 03:53:49 np0005593234 ovs-ctl[47198]: Enabling remote OVSDB managers [  OK  ]
Jan 23 03:53:49 np0005593234 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 23 03:53:49 np0005593234 ovs-vsctl[47247]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Jan 23 03:53:50 np0005593234 systemd[1]: Starting Open vSwitch...
Jan 23 03:53:50 np0005593234 systemd[1]: Finished Open vSwitch.
Jan 23 03:53:50 np0005593234 python3.9[47398]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:53:51 np0005593234 python3.9[47550]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 23 03:53:53 np0005593234 kernel: SELinux:  Converting 2750 SID table entries...
Jan 23 03:53:53 np0005593234 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 03:53:53 np0005593234 kernel: SELinux:  policy capability open_perms=1
Jan 23 03:53:53 np0005593234 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 03:53:53 np0005593234 kernel: SELinux:  policy capability always_check_network=0
Jan 23 03:53:53 np0005593234 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 03:53:53 np0005593234 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 03:53:53 np0005593234 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 03:53:54 np0005593234 python3.9[47705]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:53:55 np0005593234 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 23 03:53:55 np0005593234 python3.9[47863]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:53:57 np0005593234 python3.9[48016]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:53:59 np0005593234 python3.9[48303]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 23 03:54:00 np0005593234 python3.9[48453]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:54:01 np0005593234 python3.9[48607]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:54:05 np0005593234 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 03:54:05 np0005593234 systemd[1]: Starting man-db-cache-update.service...
Jan 23 03:54:05 np0005593234 systemd[1]: Reloading.
Jan 23 03:54:05 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:54:05 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:54:05 np0005593234 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 03:54:05 np0005593234 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 03:54:05 np0005593234 systemd[1]: Finished man-db-cache-update.service.
Jan 23 03:54:05 np0005593234 systemd[1]: run-r7906dea60c0446f8839d2acffe6181c0.service: Deactivated successfully.
Jan 23 03:54:06 np0005593234 python3.9[48924]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:54:06 np0005593234 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 23 03:54:06 np0005593234 systemd[1]: Stopped Network Manager Wait Online.
Jan 23 03:54:06 np0005593234 systemd[1]: Stopping Network Manager Wait Online...
Jan 23 03:54:06 np0005593234 systemd[1]: Stopping Network Manager...
Jan 23 03:54:06 np0005593234 NetworkManager[7203]: <info>  [1769158446.3950] caught SIGTERM, shutting down normally.
Jan 23 03:54:06 np0005593234 NetworkManager[7203]: <info>  [1769158446.3961] dhcp4 (eth0): canceled DHCP transaction
Jan 23 03:54:06 np0005593234 NetworkManager[7203]: <info>  [1769158446.3961] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:54:06 np0005593234 NetworkManager[7203]: <info>  [1769158446.3961] dhcp4 (eth0): state changed no lease
Jan 23 03:54:06 np0005593234 NetworkManager[7203]: <info>  [1769158446.3963] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 03:54:06 np0005593234 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 03:54:06 np0005593234 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 03:54:06 np0005593234 NetworkManager[7203]: <info>  [1769158446.9051] exiting (success)
Jan 23 03:54:06 np0005593234 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 23 03:54:06 np0005593234 systemd[1]: Stopped Network Manager.
Jan 23 03:54:06 np0005593234 systemd[1]: NetworkManager.service: Consumed 11.269s CPU time, 4.1M memory peak, read 0B from disk, written 31.0K to disk.
Jan 23 03:54:06 np0005593234 systemd[1]: Starting Network Manager...
Jan 23 03:54:06 np0005593234 NetworkManager[48942]: <info>  [1769158446.9700] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:ecc0581c-6611-4866-b3e1-0bc978951940)
Jan 23 03:54:06 np0005593234 NetworkManager[48942]: <info>  [1769158446.9701] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 03:54:06 np0005593234 NetworkManager[48942]: <info>  [1769158446.9762] manager[0x55dd4722a000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 03:54:07 np0005593234 systemd[1]: Starting Hostname Service...
Jan 23 03:54:07 np0005593234 systemd[1]: Started Hostname Service.
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0851] hostname: hostname: using hostnamed
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0851] hostname: static hostname changed from (none) to "compute-2"
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0858] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0862] manager[0x55dd4722a000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0862] manager[0x55dd4722a000]: rfkill: WWAN hardware radio set enabled
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0884] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0893] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0894] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0894] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0894] manager: Networking is enabled by state file
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0897] settings: Loaded settings plugin: keyfile (internal)
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0900] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0923] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0934] dhcp: init: Using DHCP client 'internal'
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0937] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0943] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0948] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0959] device (lo): Activation: starting connection 'lo' (4dda678f-e4eb-42eb-8b71-b7827298a97a)
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0968] device (eth0): carrier: link connected
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0971] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0975] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0975] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0980] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0987] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0993] device (eth1): carrier: link connected
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.0997] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1002] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (8808941e-5666-5885-aa66-75a36520f7d1) (indicated)
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1003] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1008] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1014] device (eth1): Activation: starting connection 'ci-private-network' (8808941e-5666-5885-aa66-75a36520f7d1)
Jan 23 03:54:07 np0005593234 systemd[1]: Started Network Manager.
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1030] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1040] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1042] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1043] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1046] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1048] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1050] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1052] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1054] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1060] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1063] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1085] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1101] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1112] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1114] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1119] device (lo): Activation: successful, device activated.
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1125] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.1132] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 03:54:07 np0005593234 systemd[1]: Starting Network Manager Wait Online...
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.5791] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.5809] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.5819] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.5828] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.5831] device (eth1): Activation: successful, device activated.
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.5873] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.5875] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.5879] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.5882] device (eth0): Activation: successful, device activated.
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.5887] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 03:54:07 np0005593234 NetworkManager[48942]: <info>  [1769158447.6301] manager: startup complete
Jan 23 03:54:07 np0005593234 systemd[1]: Finished Network Manager Wait Online.
Jan 23 03:54:07 np0005593234 python3.9[49133]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:54:13 np0005593234 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 03:54:13 np0005593234 systemd[1]: Starting man-db-cache-update.service...
Jan 23 03:54:13 np0005593234 systemd[1]: Reloading.
Jan 23 03:54:13 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:54:13 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:54:13 np0005593234 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 03:54:15 np0005593234 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 03:54:15 np0005593234 systemd[1]: Finished man-db-cache-update.service.
Jan 23 03:54:15 np0005593234 systemd[1]: run-r29709630966b4f2dbc8664fb09556286.service: Deactivated successfully.
Jan 23 03:54:16 np0005593234 python3.9[49611]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:54:17 np0005593234 python3.9[49763]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:17 np0005593234 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 03:54:18 np0005593234 python3.9[49917]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:18 np0005593234 python3.9[50069]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:19 np0005593234 python3.9[50221]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:20 np0005593234 python3.9[50373]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:21 np0005593234 python3.9[50525]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:54:21 np0005593234 python3.9[50648]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158460.7278094-649-196311165690384/.source _original_basename=.kikqmkr5 follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:22 np0005593234 python3.9[50800]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:23 np0005593234 python3.9[50952]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 23 03:54:24 np0005593234 python3.9[51104]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:26 np0005593234 python3.9[51531]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 23 03:54:28 np0005593234 ansible-async_wrapper.py[51706]: Invoked with j971549089058 300 /home/zuul/.ansible/tmp/ansible-tmp-1769158467.162613-847-3571855413891/AnsiballZ_edpm_os_net_config.py _
Jan 23 03:54:28 np0005593234 ansible-async_wrapper.py[51709]: Starting module and watcher
Jan 23 03:54:28 np0005593234 ansible-async_wrapper.py[51709]: Start watching 51710 (300)
Jan 23 03:54:28 np0005593234 ansible-async_wrapper.py[51710]: Start module (51710)
Jan 23 03:54:28 np0005593234 ansible-async_wrapper.py[51706]: Return async_wrapper task started.
Jan 23 03:54:28 np0005593234 python3.9[51711]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 23 03:54:29 np0005593234 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 23 03:54:29 np0005593234 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 23 03:54:29 np0005593234 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 23 03:54:29 np0005593234 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 23 03:54:29 np0005593234 kernel: cfg80211: failed to load regulatory.db
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.1741] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.1758] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2276] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2277] audit: op="connection-add" uuid="ed053e5e-5387-4b3c-b1c0-e016270bfa97" name="br-ex-br" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2292] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2293] audit: op="connection-add" uuid="5e58f561-c83f-4d52-bb34-bc6f29da1548" name="br-ex-port" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2305] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2306] audit: op="connection-add" uuid="3d281dfb-006c-46d2-b90d-24d5f6d5da98" name="eth1-port" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2318] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2319] audit: op="connection-add" uuid="a0479fd4-174b-4a17-8272-344234e29854" name="vlan20-port" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2331] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2332] audit: op="connection-add" uuid="4435d74c-827c-433a-858c-7925f1e7dcae" name="vlan21-port" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2342] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2343] audit: op="connection-add" uuid="4e06e010-5155-4c7b-87d0-ca072e9a3ac9" name="vlan22-port" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2354] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2355] audit: op="connection-add" uuid="d49d21ff-147c-441e-9ea3-4b5a145c621a" name="vlan23-port" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2371] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="connection.timestamp,connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2386] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.2387] audit: op="connection-add" uuid="e955e3e1-ef5c-4383-a675-b54f00db06e1" name="br-ex-if" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4195] audit: op="connection-update" uuid="8808941e-5666-5885-aa66-75a36520f7d1" name="ci-private-network" args="connection.slave-type,connection.port-type,connection.timestamp,connection.controller,connection.master,ovs-interface.type,ovs-external-ids.data,ipv4.dns,ipv4.routing-rules,ipv4.routes,ipv4.method,ipv4.addresses,ipv4.never-default,ipv6.dns,ipv6.routes,ipv6.method,ipv6.addr-gen-mode,ipv6.addresses,ipv6.routing-rules" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4221] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4223] audit: op="connection-add" uuid="57a4fb20-86bd-45df-be1b-8202d05068f8" name="vlan20-if" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4242] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4244] audit: op="connection-add" uuid="a1d71b44-6d6b-4477-9666-c8d170733f13" name="vlan21-if" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4264] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4266] audit: op="connection-add" uuid="e53e81ca-2d2d-441a-92ca-2cceee0f2083" name="vlan22-if" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4293] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4296] audit: op="connection-add" uuid="4e1196ab-0ca5-4822-9a42-288997020bb6" name="vlan23-if" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4310] audit: op="connection-delete" uuid="1265e500-f61d-3e4a-baaf-c93c8bb5c40a" name="Wired connection 1" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4324] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <warn>  [1769158470.4328] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4338] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4343] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (ed053e5e-5387-4b3c-b1c0-e016270bfa97)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4344] audit: op="connection-activate" uuid="ed053e5e-5387-4b3c-b1c0-e016270bfa97" name="br-ex-br" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4346] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <warn>  [1769158470.4347] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4353] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4358] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (5e58f561-c83f-4d52-bb34-bc6f29da1548)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4360] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <warn>  [1769158470.4362] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4368] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4374] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (3d281dfb-006c-46d2-b90d-24d5f6d5da98)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4377] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <warn>  [1769158470.4378] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4386] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4392] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (a0479fd4-174b-4a17-8272-344234e29854)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4395] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <warn>  [1769158470.4397] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4404] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4411] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (4435d74c-827c-433a-858c-7925f1e7dcae)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4414] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <warn>  [1769158470.4415] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4423] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4429] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (4e06e010-5155-4c7b-87d0-ca072e9a3ac9)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4432] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <warn>  [1769158470.4433] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4441] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4447] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (d49d21ff-147c-441e-9ea3-4b5a145c621a)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4448] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4452] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4454] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4467] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <warn>  [1769158470.4469] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4473] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4481] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (e955e3e1-ef5c-4383-a675-b54f00db06e1)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4482] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4487] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4489] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4491] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4493] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4510] device (eth1): disconnecting for new activation request.
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4511] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4516] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4520] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4521] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4524] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <warn>  [1769158470.4526] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4530] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4534] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (57a4fb20-86bd-45df-be1b-8202d05068f8)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4536] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4541] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4543] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4544] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4548] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <warn>  [1769158470.4549] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4553] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4558] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (a1d71b44-6d6b-4477-9666-c8d170733f13)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4559] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4562] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4565] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4566] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4570] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <warn>  [1769158470.4571] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4574] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4580] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (e53e81ca-2d2d-441a-92ca-2cceee0f2083)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4581] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4584] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4586] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4588] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4591] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <warn>  [1769158470.4592] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4596] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4601] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (4e1196ab-0ca5-4822-9a42-288997020bb6)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4602] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4606] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4608] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4610] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4611] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4630] audit: op="device-reapply" interface="eth0" ifindex=2 args="connection.autoconnect-priority,802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,ipv6.method,ipv6.addr-gen-mode" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4633] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4638] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4640] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4648] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4652] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4657] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4661] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4663] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 kernel: ovs-system: entered promiscuous mode
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4669] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4674] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4678] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4681] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 kernel: Timeout policy base is empty
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4687] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4691] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4695] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4697] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 systemd-udevd[51718]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4703] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4708] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4712] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4713] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4719] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4723] dhcp4 (eth0): canceled DHCP transaction
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4723] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4723] dhcp4 (eth0): state changed no lease
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4725] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 23 03:54:30 np0005593234 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4736] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4740] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51712 uid=0 result="fail" reason="Device is not activated"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.4746] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 23 03:54:30 np0005593234 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 03:54:30 np0005593234 kernel: br-ex: entered promiscuous mode
Jan 23 03:54:30 np0005593234 kernel: vlan21: entered promiscuous mode
Jan 23 03:54:30 np0005593234 systemd-udevd[51716]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 03:54:30 np0005593234 kernel: vlan20: entered promiscuous mode
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6441] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6459] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6467] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6473] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6474] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6489] device (eth1): Activation: starting connection 'ci-private-network' (8808941e-5666-5885-aa66-75a36520f7d1)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6492] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6493] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6495] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6496] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6497] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6498] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6499] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6502] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6516] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6519] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6525] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6528] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6532] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6535] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6539] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6542] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6546] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6549] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6551] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6554] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6558] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6560] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 03:54:30 np0005593234 kernel: vlan22: entered promiscuous mode
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6563] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6566] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6569] device (eth1): state change: config -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6570] device (eth1): released from controller device eth1
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6576] device (eth1): disconnecting for new activation request.
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6577] audit: op="connection-activate" uuid="8808941e-5666-5885-aa66-75a36520f7d1" name="ci-private-network" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6579] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6600] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6614] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6621] device (eth1): Activation: starting connection 'ci-private-network' (8808941e-5666-5885-aa66-75a36520f7d1)
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6632] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6637] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6646] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6649] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6652] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6653] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51712 uid=0 result="success"
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6654] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6660] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6663] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 03:54:30 np0005593234 kernel: vlan23: entered promiscuous mode
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6671] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6673] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6682] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6685] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6688] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6693] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6696] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.6699] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.7789] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.7790] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.7793] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.7807] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.7811] device (eth1): Activation: successful, device activated.
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.7840] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.7848] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.7861] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.7863] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.7867] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.7902] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.7904] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 03:54:30 np0005593234 NetworkManager[48942]: <info>  [1769158470.7910] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 03:54:31 np0005593234 NetworkManager[48942]: <info>  [1769158471.1286] dhcp4 (eth0): state changed new lease, address=38.102.83.50
Jan 23 03:54:31 np0005593234 python3.9[52074]: ansible-ansible.legacy.async_status Invoked with jid=j971549089058.51706 mode=status _async_dir=/root/.ansible_async
Jan 23 03:54:31 np0005593234 NetworkManager[48942]: <info>  [1769158471.9566] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51712 uid=0 result="success"
Jan 23 03:54:32 np0005593234 NetworkManager[48942]: <info>  [1769158472.1138] checkpoint[0x55dd47200950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 23 03:54:32 np0005593234 NetworkManager[48942]: <info>  [1769158472.1140] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51712 uid=0 result="success"
Jan 23 03:54:32 np0005593234 NetworkManager[48942]: <info>  [1769158472.3842] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51712 uid=0 result="success"
Jan 23 03:54:32 np0005593234 NetworkManager[48942]: <info>  [1769158472.3853] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51712 uid=0 result="success"
Jan 23 03:54:32 np0005593234 NetworkManager[48942]: <info>  [1769158472.6130] audit: op="networking-control" arg="global-dns-configuration" pid=51712 uid=0 result="success"
Jan 23 03:54:32 np0005593234 NetworkManager[48942]: <info>  [1769158472.6213] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 23 03:54:32 np0005593234 NetworkManager[48942]: <info>  [1769158472.6573] audit: op="networking-control" arg="global-dns-configuration" pid=51712 uid=0 result="success"
Jan 23 03:54:32 np0005593234 NetworkManager[48942]: <info>  [1769158472.6604] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51712 uid=0 result="success"
Jan 23 03:54:32 np0005593234 NetworkManager[48942]: <info>  [1769158472.8026] checkpoint[0x55dd47200a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 23 03:54:32 np0005593234 NetworkManager[48942]: <info>  [1769158472.8030] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51712 uid=0 result="success"
Jan 23 03:54:32 np0005593234 ansible-async_wrapper.py[51710]: Module complete (51710)
Jan 23 03:54:33 np0005593234 ansible-async_wrapper.py[51709]: Done in kid B.
Jan 23 03:54:35 np0005593234 python3.9[52180]: ansible-ansible.legacy.async_status Invoked with jid=j971549089058.51706 mode=status _async_dir=/root/.ansible_async
Jan 23 03:54:35 np0005593234 python3.9[52280]: ansible-ansible.legacy.async_status Invoked with jid=j971549089058.51706 mode=cleanup _async_dir=/root/.ansible_async
Jan 23 03:54:37 np0005593234 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 03:54:37 np0005593234 python3.9[52432]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:54:37 np0005593234 python3.9[52558]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158476.700358-928-105639002491047/.source.returncode _original_basename=.s_1bslxt follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:38 np0005593234 python3.9[52710]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:54:39 np0005593234 python3.9[52834]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158478.074304-976-176541794698652/.source.cfg _original_basename=.gis39r2r follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:54:40 np0005593234 python3.9[52986]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:54:40 np0005593234 systemd[1]: Reloading Network Manager...
Jan 23 03:54:40 np0005593234 NetworkManager[48942]: <info>  [1769158480.1170] audit: op="reload" arg="0" pid=52990 uid=0 result="success"
Jan 23 03:54:40 np0005593234 NetworkManager[48942]: <info>  [1769158480.1176] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 23 03:54:40 np0005593234 systemd[1]: Reloaded Network Manager.
Jan 23 03:54:41 np0005593234 systemd[1]: session-11.scope: Deactivated successfully.
Jan 23 03:54:41 np0005593234 systemd[1]: session-11.scope: Consumed 47.835s CPU time.
Jan 23 03:54:41 np0005593234 systemd-logind[794]: Session 11 logged out. Waiting for processes to exit.
Jan 23 03:54:41 np0005593234 systemd-logind[794]: Removed session 11.
Jan 23 03:54:46 np0005593234 systemd-logind[794]: New session 12 of user zuul.
Jan 23 03:54:46 np0005593234 systemd[1]: Started Session 12 of User zuul.
Jan 23 03:54:47 np0005593234 python3.9[53174]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:54:48 np0005593234 python3.9[53328]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:54:49 np0005593234 python3.9[53522]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:54:50 np0005593234 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 03:54:50 np0005593234 systemd[1]: session-12.scope: Deactivated successfully.
Jan 23 03:54:50 np0005593234 systemd[1]: session-12.scope: Consumed 2.095s CPU time.
Jan 23 03:54:50 np0005593234 systemd-logind[794]: Session 12 logged out. Waiting for processes to exit.
Jan 23 03:54:50 np0005593234 systemd-logind[794]: Removed session 12.
Jan 23 03:54:56 np0005593234 systemd-logind[794]: New session 13 of user zuul.
Jan 23 03:54:56 np0005593234 systemd[1]: Started Session 13 of User zuul.
Jan 23 03:54:57 np0005593234 python3.9[53705]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:54:58 np0005593234 python3.9[53860]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:54:59 np0005593234 python3.9[54016]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:55:00 np0005593234 python3.9[54100]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:55:02 np0005593234 python3.9[54254]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:55:04 np0005593234 python3.9[54449]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:55:05 np0005593234 python3.9[54601]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:55:06 np0005593234 systemd[1]: var-lib-containers-storage-overlay-compat1577345440-merged.mount: Deactivated successfully.
Jan 23 03:55:08 np0005593234 podman[54602]: 2026-01-23 08:55:08.285374369 +0000 UTC m=+2.498364623 system refresh
Jan 23 03:55:08 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 03:55:09 np0005593234 python3.9[54764]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:55:09 np0005593234 python3.9[54887]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158508.5527644-199-177828845925528/.source.json follow=False _original_basename=podman_network_config.j2 checksum=b8672e88ec7252e1b63ce4cc5de8b94bf66cdad4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:55:10 np0005593234 python3.9[55039]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:55:11 np0005593234 python3.9[55162]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158510.186298-245-276923266170119/.source.conf follow=False _original_basename=registries.conf.j2 checksum=7d6103ee1a01cd01d921f72f1af62704e0a47ff2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:55:12 np0005593234 python3.9[55314]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:55:12 np0005593234 python3.9[55466]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:55:13 np0005593234 python3.9[55618]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:55:14 np0005593234 python3.9[55770]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:55:15 np0005593234 python3.9[55922]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:55:17 np0005593234 python3.9[56075]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:55:18 np0005593234 python3.9[56229]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:55:19 np0005593234 python3.9[56381]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:55:20 np0005593234 python3.9[56533]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:55:21 np0005593234 python3.9[56686]: ansible-service_facts Invoked
Jan 23 03:55:21 np0005593234 network[56703]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 03:55:21 np0005593234 network[56704]: 'network-scripts' will be removed from distribution in near future.
Jan 23 03:55:21 np0005593234 network[56705]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 03:55:27 np0005593234 python3.9[57157]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 03:55:30 np0005593234 python3.9[57310]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 23 03:55:31 np0005593234 python3.9[57462]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:55:32 np0005593234 python3.9[57587]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158531.3632088-677-70961534383506/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:55:33 np0005593234 python3.9[57741]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:55:33 np0005593234 python3.9[57866]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158532.8181813-723-209237763760897/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:55:35 np0005593234 python3.9[58020]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:55:37 np0005593234 python3.9[58174]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:55:40 np0005593234 python3.9[58258]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:55:41 np0005593234 python3.9[58412]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:55:42 np0005593234 python3.9[58496]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:55:42 np0005593234 chronyd[787]: chronyd exiting
Jan 23 03:55:42 np0005593234 systemd[1]: Stopping NTP client/server...
Jan 23 03:55:42 np0005593234 systemd[1]: chronyd.service: Deactivated successfully.
Jan 23 03:55:42 np0005593234 systemd[1]: Stopped NTP client/server.
Jan 23 03:55:42 np0005593234 systemd[1]: Starting NTP client/server...
Jan 23 03:55:42 np0005593234 chronyd[58504]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 23 03:55:42 np0005593234 chronyd[58504]: Frequency -31.441 +/- 0.116 ppm read from /var/lib/chrony/drift
Jan 23 03:55:42 np0005593234 chronyd[58504]: Loaded seccomp filter (level 2)
Jan 23 03:55:42 np0005593234 systemd[1]: Started NTP client/server.
Jan 23 03:55:43 np0005593234 systemd[1]: session-13.scope: Deactivated successfully.
Jan 23 03:55:43 np0005593234 systemd[1]: session-13.scope: Consumed 23.865s CPU time.
Jan 23 03:55:43 np0005593234 systemd-logind[794]: Session 13 logged out. Waiting for processes to exit.
Jan 23 03:55:43 np0005593234 systemd-logind[794]: Removed session 13.
Jan 23 03:55:49 np0005593234 systemd-logind[794]: New session 14 of user zuul.
Jan 23 03:55:49 np0005593234 systemd[1]: Started Session 14 of User zuul.
Jan 23 03:55:49 np0005593234 python3.9[58687]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:55:50 np0005593234 python3.9[58839]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:55:51 np0005593234 python3.9[58962]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158550.273181-64-17721371340034/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:55:51 np0005593234 systemd[1]: session-14.scope: Deactivated successfully.
Jan 23 03:55:51 np0005593234 systemd[1]: session-14.scope: Consumed 1.495s CPU time.
Jan 23 03:55:51 np0005593234 systemd-logind[794]: Session 14 logged out. Waiting for processes to exit.
Jan 23 03:55:51 np0005593234 systemd-logind[794]: Removed session 14.
Jan 23 03:55:57 np0005593234 systemd-logind[794]: New session 15 of user zuul.
Jan 23 03:55:57 np0005593234 systemd[1]: Started Session 15 of User zuul.
Jan 23 03:55:58 np0005593234 python3.9[59140]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:55:59 np0005593234 python3.9[59296]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:00 np0005593234 python3.9[59471]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:01 np0005593234 python3.9[59594]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769158559.598853-86-248039317684989/.source.json _original_basename=.54w232rh follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:02 np0005593234 python3.9[59746]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:02 np0005593234 python3.9[59869]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158561.8895888-154-248382020105589/.source _original_basename=.p0hcmxkd follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:03 np0005593234 python3.9[60021]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:56:04 np0005593234 python3.9[60173]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:04 np0005593234 python3.9[60296]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158563.8831234-226-155847144625479/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:56:05 np0005593234 python3.9[60448]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:06 np0005593234 python3.9[60571]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769158564.955592-226-255567118914649/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 03:56:06 np0005593234 python3.9[60723]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:07 np0005593234 python3.9[60875]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:07 np0005593234 python3.9[60998]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158566.9519398-338-187806886675953/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:08 np0005593234 python3.9[61150]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:09 np0005593234 python3.9[61273]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158568.145334-383-55730530269722/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:10 np0005593234 python3.9[61425]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:56:10 np0005593234 systemd[1]: Reloading.
Jan 23 03:56:10 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:56:10 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:56:10 np0005593234 systemd[1]: Reloading.
Jan 23 03:56:10 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:56:10 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:56:10 np0005593234 systemd[1]: Starting EDPM Container Shutdown...
Jan 23 03:56:10 np0005593234 systemd[1]: Finished EDPM Container Shutdown.
Jan 23 03:56:11 np0005593234 python3.9[61653]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:12 np0005593234 python3.9[61776]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158571.2478669-451-255387430369332/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:13 np0005593234 python3.9[61928]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:13 np0005593234 python3.9[62051]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158572.6574726-497-50637282690185/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:14 np0005593234 python3.9[62203]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:56:14 np0005593234 systemd[1]: Reloading.
Jan 23 03:56:14 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:56:14 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:56:14 np0005593234 systemd[1]: Reloading.
Jan 23 03:56:14 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:56:14 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:56:14 np0005593234 systemd[1]: Starting Create netns directory...
Jan 23 03:56:14 np0005593234 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 03:56:14 np0005593234 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 03:56:14 np0005593234 systemd[1]: Finished Create netns directory.
Jan 23 03:56:15 np0005593234 python3.9[62429]: ansible-ansible.builtin.service_facts Invoked
Jan 23 03:56:15 np0005593234 network[62446]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 03:56:15 np0005593234 network[62447]: 'network-scripts' will be removed from distribution in near future.
Jan 23 03:56:15 np0005593234 network[62448]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 03:56:20 np0005593234 python3.9[62710]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:56:20 np0005593234 systemd[1]: Reloading.
Jan 23 03:56:20 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:56:20 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:56:20 np0005593234 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 23 03:56:20 np0005593234 iptables.init[62750]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 23 03:56:20 np0005593234 iptables.init[62750]: iptables: Flushing firewall rules: [  OK  ]
Jan 23 03:56:20 np0005593234 systemd[1]: iptables.service: Deactivated successfully.
Jan 23 03:56:20 np0005593234 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 23 03:56:21 np0005593234 python3.9[62946]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:56:22 np0005593234 python3.9[63100]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:56:22 np0005593234 systemd[1]: Reloading.
Jan 23 03:56:22 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:56:22 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:56:23 np0005593234 systemd[1]: Starting Netfilter Tables...
Jan 23 03:56:23 np0005593234 systemd[1]: Finished Netfilter Tables.
Jan 23 03:56:24 np0005593234 python3.9[63293]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:56:25 np0005593234 python3.9[63446]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:25 np0005593234 python3.9[63571]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158584.7823842-704-88812775542840/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:26 np0005593234 python3.9[63724]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:56:26 np0005593234 systemd[1]: Reloading OpenSSH server daemon...
Jan 23 03:56:26 np0005593234 systemd[1]: Reloaded OpenSSH server daemon.
Jan 23 03:56:27 np0005593234 python3.9[63880]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:28 np0005593234 python3.9[64032]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:28 np0005593234 python3.9[64155]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158587.7878585-797-200719095115719/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:29 np0005593234 python3.9[64307]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 03:56:29 np0005593234 systemd[1]: Starting Time & Date Service...
Jan 23 03:56:29 np0005593234 systemd[1]: Started Time & Date Service.
Jan 23 03:56:31 np0005593234 python3.9[64463]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:32 np0005593234 python3.9[64615]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:33 np0005593234 python3.9[64738]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158592.093152-902-75405802668345/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:33 np0005593234 python3.9[64890]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:34 np0005593234 python3.9[65013]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158593.2997456-947-14571533524233/.source.yaml _original_basename=.l66gs6g9 follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:35 np0005593234 python3.9[65165]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:35 np0005593234 python3.9[65288]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158594.6083488-993-88561885278570/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:36 np0005593234 python3.9[65440]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:56:37 np0005593234 python3.9[65593]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:56:38 np0005593234 python3[65746]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 03:56:38 np0005593234 python3.9[65898]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:39 np0005593234 python3.9[66021]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158598.3546271-1109-180986533620013/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:40 np0005593234 python3.9[66173]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:40 np0005593234 python3.9[66296]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158599.8416839-1154-85879459572989/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:41 np0005593234 python3.9[66448]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:42 np0005593234 python3.9[66571]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158601.0951526-1199-188533650787370/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:42 np0005593234 python3.9[66723]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:43 np0005593234 python3.9[66846]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158602.4260936-1244-41691975174114/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:44 np0005593234 python3.9[66998]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 03:56:44 np0005593234 python3.9[67121]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769158603.7137778-1289-133524942147039/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:45 np0005593234 python3.9[67273]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:46 np0005593234 python3.9[67425]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:56:47 np0005593234 python3.9[67584]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:48 np0005593234 python3.9[67737]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:48 np0005593234 python3.9[67889]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:56:49 np0005593234 python3.9[68041]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 03:56:50 np0005593234 python3.9[68194]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 03:56:51 np0005593234 systemd[1]: session-15.scope: Deactivated successfully.
Jan 23 03:56:51 np0005593234 systemd[1]: session-15.scope: Consumed 33.874s CPU time.
Jan 23 03:56:51 np0005593234 systemd-logind[794]: Session 15 logged out. Waiting for processes to exit.
Jan 23 03:56:51 np0005593234 systemd-logind[794]: Removed session 15.
Jan 23 03:56:56 np0005593234 systemd-logind[794]: New session 16 of user zuul.
Jan 23 03:56:56 np0005593234 systemd[1]: Started Session 16 of User zuul.
Jan 23 03:56:57 np0005593234 python3.9[68375]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 23 03:56:58 np0005593234 python3.9[68527]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:56:59 np0005593234 python3.9[68679]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:56:59 np0005593234 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 03:57:00 np0005593234 python3.9[68833]: ansible-ansible.builtin.blockinfile Invoked with block=compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD7jdzOPltwN8PSb4q9DCiO5zY7TIK6sENpltjjN4gdZgxOTsj/dxnfxJlO2lYI1dFyyFnDdZj88a4x1KI5Bnnvl5KRvvZiianfivZWKq9Ngf9fzf7+5CsDFBiu6a7GAfXMf9FocVpqlXf7fsXmb5Iv2xUpNnye4EFIuW965X3SNrRpujRnDe+i0lIwrOsus4R86qn38MWOLfPBAWFYdBaVfTUYjC0eT/I81Y/T2RKqf7XK/bsuHobZ+/a7lymuPsS9L0DFg25ZoIlvkPUVfZxTO5FCyw8GMR+AgbnMQyHwx2JAmewwH3M2l+zVdDQjsE1ZRFlJCmwle9LBa1oFhuLfxLqsykQploeB5Ch/VppbnRQ/GamwWLU5HEKMH2wZ6IymURW7nSStlEhNWvK+Bb9rIy65M6AFOEW94xId4nc+IraS6rc2cuM3Rp97S/6olqjlFDZisdUwdAlhIKuJjA7SsYZ6HyCEbRN3mvMnWbkqpyY605kewQ6kdmucNeWgRtk=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIE++PPNOKtggGl2mGWEm1DV2WpblvGA/F2TEEVeMrsU2#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP3uOoytpWGDF46u3wwDFxwF05HMnZd51GvbceZrDgZRmc5sxbF+OawPD9kGTcjnaUTzvqWgbFNvcmpuaNTnpzc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsLbdPIA8nc52wSKcOItc1xJ6faU3FwhWecUgXZZC+Q1wLSrdN9vgOExBhQSwwodluzJ5/GT9VbCuujyBvk7RMEim1+fw7T58Th56PR8y2lL6F6F3ni4S21QxInTLml+/id8wwEZAkFjbCF/AjCRDyH7a6H4wIZtd5ZuzWJuuBENNdtu/qD1QQYkNegqllogNpkdpAFZgvee26yw2sbCX8kpbJoJsowaQUckoRtT2jj7985CLxErKZ8YO8ZozjfuCDCKbcJT0KFimievJZmKXvGaWG5H+P509XDsfN62aQr22US8FbYjdK1lfrJoetkc/MK4h7QuCs6MH2qYiqXIkJYKMSReM+sH3X7V7pSWSUkr0DHREVvBGcC2lRSx45lUCTEtcTY7XmxGORvCORMYla0l1H3mEIkfYLS4sXYtRSHkyFnyQgbNP5MnrmXlK0vrAA81r5U+dOhIL/H2e7S4xcLItH7weUOHIAmCj266mm9+xJyyd7NZ+eUgS0Md5p4Bc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUSudroiFEdRPXgUCqRHbNRLelYP5RQGMMCn6zD8pfH#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDJLsx8RxJz6M7PIyGcFdzR+Ldl788501Y8ZWLJ8hnDzMCaRkGjzE+kzO/uN75IEtV3aVEl1jNQlk7wON+lORGQ=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq2Yxebv3BUxXHPuf6nN00teEMYUUVEWMZOqcwNO1dyibdbyxre6VweeeiBR/lerW1mIcmB67juCuLffEgDo8uPtZx9HrD1psd+ji78YeJuvbKIEcTwdtGF0I8PeogHunx+4KBxFsHeF6JHN9+H7lTHiSSIDFzk9BwDkAKEWsYHe8z+5SPDU//XiYNv0drE59KiQF586rnjPR3VZk6WaR+hp2PiHbUUSOvnyB4kI4bCXSCU/Oxv7HDvgeCJapABjisMZg4aiteZ7EaD1yVndkQiS6OxfOGP1srgtNkRL4Idc/XCFXH754lbRd8GzUF0n8N0HbWTcFDuTU+bvhuIH+3EDNxsDQkSCdJTw2EPb/mqZVdXSFxLXUBcXnYkBWZirpgC3g6okg2RQU2bxigFs7lFwJT6QE+wz0DK7Z3ib0XQxjRlY6PIwn1D2soMwKVarxpeM2FfsGrHMHaHioRTVbKpzBMA1oUICSUCvzyhd0I43cO2rUEK/8EMYSsTVRulKs=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAII4nVnNUbCVQAtKJF7UUtMQxNhMw9eVlRVofBpQ70iUi#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPqfkBgoQjr/gZBK1F9K576GMtkxSY6lVgROItGrW+R9EA2lvnOt71IGO0M0lGVvCkTtLktdNpSsYnBu2cJn+4c=#012 create=True mode=0644 path=/tmp/ansible.ngu7f0e2 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:57:01 np0005593234 python3.9[68985]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.ngu7f0e2' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:57:02 np0005593234 python3.9[69139]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.ngu7f0e2 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:57:03 np0005593234 systemd[1]: session-16.scope: Deactivated successfully.
Jan 23 03:57:03 np0005593234 systemd[1]: session-16.scope: Consumed 3.439s CPU time.
Jan 23 03:57:03 np0005593234 systemd-logind[794]: Session 16 logged out. Waiting for processes to exit.
Jan 23 03:57:03 np0005593234 systemd-logind[794]: Removed session 16.
Jan 23 03:57:09 np0005593234 systemd-logind[794]: New session 17 of user zuul.
Jan 23 03:57:09 np0005593234 systemd[1]: Started Session 17 of User zuul.
Jan 23 03:57:10 np0005593234 python3.9[69317]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:57:11 np0005593234 python3.9[69473]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 03:57:13 np0005593234 python3.9[69627]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 03:57:14 np0005593234 python3.9[69780]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:57:15 np0005593234 python3.9[69933]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:57:16 np0005593234 python3.9[70087]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:57:17 np0005593234 python3.9[70242]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:57:17 np0005593234 systemd[1]: session-17.scope: Deactivated successfully.
Jan 23 03:57:17 np0005593234 systemd[1]: session-17.scope: Consumed 4.535s CPU time.
Jan 23 03:57:17 np0005593234 systemd-logind[794]: Session 17 logged out. Waiting for processes to exit.
Jan 23 03:57:17 np0005593234 systemd-logind[794]: Removed session 17.
Jan 23 03:57:23 np0005593234 systemd-logind[794]: New session 18 of user zuul.
Jan 23 03:57:23 np0005593234 systemd[1]: Started Session 18 of User zuul.
Jan 23 03:57:24 np0005593234 python3.9[70421]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:57:26 np0005593234 python3.9[70578]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 03:57:26 np0005593234 python3.9[70662]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 03:57:29 np0005593234 python3.9[70815]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:57:30 np0005593234 python3.9[70967]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 03:57:31 np0005593234 python3.9[71117]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:57:31 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 03:57:32 np0005593234 python3.9[71268]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 03:57:33 np0005593234 systemd[1]: session-18.scope: Deactivated successfully.
Jan 23 03:57:33 np0005593234 systemd[1]: session-18.scope: Consumed 5.722s CPU time.
Jan 23 03:57:33 np0005593234 systemd-logind[794]: Session 18 logged out. Waiting for processes to exit.
Jan 23 03:57:33 np0005593234 systemd-logind[794]: Removed session 18.
Jan 23 03:57:40 np0005593234 systemd-logind[794]: New session 19 of user zuul.
Jan 23 03:57:40 np0005593234 systemd[1]: Started Session 19 of User zuul.
Jan 23 03:57:46 np0005593234 python3[72034]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 03:57:48 np0005593234 python3[72129]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 23 03:57:50 np0005593234 python3[72156]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 23 03:57:50 np0005593234 python3[72182]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G#012losetup /dev/loop3 /var/lib/ceph-osd-0.img#012lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:57:50 np0005593234 kernel: loop: module loaded
Jan 23 03:57:50 np0005593234 kernel: loop3: detected capacity change from 0 to 14680064
Jan 23 03:57:51 np0005593234 python3[72217]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3#012vgcreate ceph_vg0 /dev/loop3#012lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0#012lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 03:57:51 np0005593234 lvm[72220]: PV /dev/loop3 not used.
Jan 23 03:57:51 np0005593234 lvm[72229]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 03:57:51 np0005593234 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Jan 23 03:57:51 np0005593234 lvm[72231]:  1 logical volume(s) in volume group "ceph_vg0" now active
Jan 23 03:57:51 np0005593234 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Jan 23 03:57:51 np0005593234 chronyd[58504]: Selected source 167.160.187.179 (pool.ntp.org)
Jan 23 03:57:51 np0005593234 python3[72309]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 03:57:52 np0005593234 python3[72382]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769158671.5257545-36871-263587142293385/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 03:57:52 np0005593234 python3[72432]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 03:57:53 np0005593234 systemd[1]: Reloading.
Jan 23 03:57:53 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 03:57:53 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 03:57:53 np0005593234 systemd[1]: Starting Ceph OSD losetup...
Jan 23 03:57:53 np0005593234 bash[72472]: /dev/loop3: [64513]:4328453 (/var/lib/ceph-osd-0.img)
Jan 23 03:57:53 np0005593234 systemd[1]: Finished Ceph OSD losetup.
Jan 23 03:57:53 np0005593234 lvm[72474]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 03:57:53 np0005593234 lvm[72474]: VG ceph_vg0 finished
Jan 23 03:57:56 np0005593234 python3[72498]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:00:14 np0005593234 systemd[1]: Created slice User Slice of UID 42477.
Jan 23 04:00:14 np0005593234 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 23 04:00:14 np0005593234 systemd-logind[794]: New session 20 of user ceph-admin.
Jan 23 04:00:14 np0005593234 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 23 04:00:14 np0005593234 systemd[1]: Starting User Manager for UID 42477...
Jan 23 04:00:14 np0005593234 systemd[72548]: Queued start job for default target Main User Target.
Jan 23 04:00:14 np0005593234 systemd[72548]: Created slice User Application Slice.
Jan 23 04:00:14 np0005593234 systemd[72548]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:00:14 np0005593234 systemd[72548]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:00:14 np0005593234 systemd[72548]: Reached target Paths.
Jan 23 04:00:14 np0005593234 systemd[72548]: Reached target Timers.
Jan 23 04:00:14 np0005593234 systemd[72548]: Starting D-Bus User Message Bus Socket...
Jan 23 04:00:14 np0005593234 systemd[72548]: Starting Create User's Volatile Files and Directories...
Jan 23 04:00:14 np0005593234 systemd[72548]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:00:14 np0005593234 systemd[72548]: Finished Create User's Volatile Files and Directories.
Jan 23 04:00:14 np0005593234 systemd[72548]: Reached target Sockets.
Jan 23 04:00:14 np0005593234 systemd[72548]: Reached target Basic System.
Jan 23 04:00:14 np0005593234 systemd[72548]: Reached target Main User Target.
Jan 23 04:00:14 np0005593234 systemd[72548]: Startup finished in 113ms.
Jan 23 04:00:14 np0005593234 systemd[1]: Started User Manager for UID 42477.
Jan 23 04:00:14 np0005593234 systemd[1]: Started Session 20 of User ceph-admin.
Jan 23 04:00:14 np0005593234 systemd-logind[794]: New session 22 of user ceph-admin.
Jan 23 04:00:14 np0005593234 systemd[1]: Started Session 22 of User ceph-admin.
Jan 23 04:00:14 np0005593234 systemd-logind[794]: New session 23 of user ceph-admin.
Jan 23 04:00:14 np0005593234 systemd[1]: Started Session 23 of User ceph-admin.
Jan 23 04:00:15 np0005593234 systemd-logind[794]: New session 24 of user ceph-admin.
Jan 23 04:00:15 np0005593234 systemd[1]: Started Session 24 of User ceph-admin.
Jan 23 04:00:15 np0005593234 systemd-logind[794]: New session 25 of user ceph-admin.
Jan 23 04:00:15 np0005593234 systemd[1]: Started Session 25 of User ceph-admin.
Jan 23 04:00:16 np0005593234 systemd-logind[794]: New session 26 of user ceph-admin.
Jan 23 04:00:16 np0005593234 systemd[1]: Started Session 26 of User ceph-admin.
Jan 23 04:00:16 np0005593234 systemd-logind[794]: New session 27 of user ceph-admin.
Jan 23 04:00:16 np0005593234 systemd[1]: Started Session 27 of User ceph-admin.
Jan 23 04:00:16 np0005593234 systemd-logind[794]: New session 28 of user ceph-admin.
Jan 23 04:00:16 np0005593234 systemd[1]: Started Session 28 of User ceph-admin.
Jan 23 04:00:17 np0005593234 systemd-logind[794]: New session 29 of user ceph-admin.
Jan 23 04:00:17 np0005593234 systemd[1]: Started Session 29 of User ceph-admin.
Jan 23 04:00:17 np0005593234 systemd-logind[794]: New session 30 of user ceph-admin.
Jan 23 04:00:17 np0005593234 systemd[1]: Started Session 30 of User ceph-admin.
Jan 23 04:00:18 np0005593234 systemd-logind[794]: New session 31 of user ceph-admin.
Jan 23 04:00:18 np0005593234 systemd[1]: Started Session 31 of User ceph-admin.
Jan 23 04:00:18 np0005593234 systemd-logind[794]: New session 32 of user ceph-admin.
Jan 23 04:00:18 np0005593234 systemd[1]: Started Session 32 of User ceph-admin.
Jan 23 04:00:18 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:01:20 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:01:21 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:01:21 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:01:21 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:01:21 np0005593234 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73587 (sysctl)
Jan 23 04:01:21 np0005593234 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 23 04:01:21 np0005593234 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 23 04:01:22 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:01:23 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:01:26 np0005593234 systemd[1]: var-lib-containers-storage-overlay-compat1554499375-lower\x2dmapped.mount: Deactivated successfully.
Jan 23 04:01:43 np0005593234 podman[73863]: 2026-01-23 09:01:43.848591788 +0000 UTC m=+20.456126802 container create a2e1ad934bd7f87549cd7a2fdc60e1aee1d64612860cc8dca5c010231d53a973 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_mendel, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:01:43 np0005593234 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck3025172256-merged.mount: Deactivated successfully.
Jan 23 04:01:43 np0005593234 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 23 04:01:43 np0005593234 systemd[1]: Started libpod-conmon-a2e1ad934bd7f87549cd7a2fdc60e1aee1d64612860cc8dca5c010231d53a973.scope.
Jan 23 04:01:43 np0005593234 podman[73863]: 2026-01-23 09:01:43.829280928 +0000 UTC m=+20.436815942 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:43 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:01:43 np0005593234 podman[73863]: 2026-01-23 09:01:43.952349161 +0000 UTC m=+20.559884195 container init a2e1ad934bd7f87549cd7a2fdc60e1aee1d64612860cc8dca5c010231d53a973 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_mendel, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:01:43 np0005593234 podman[73863]: 2026-01-23 09:01:43.959469272 +0000 UTC m=+20.567004286 container start a2e1ad934bd7f87549cd7a2fdc60e1aee1d64612860cc8dca5c010231d53a973 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_mendel, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:01:43 np0005593234 podman[73863]: 2026-01-23 09:01:43.963121636 +0000 UTC m=+20.570656670 container attach a2e1ad934bd7f87549cd7a2fdc60e1aee1d64612860cc8dca5c010231d53a973 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_mendel, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:01:43 np0005593234 dreamy_mendel[73981]: 167 167
Jan 23 04:01:43 np0005593234 systemd[1]: libpod-a2e1ad934bd7f87549cd7a2fdc60e1aee1d64612860cc8dca5c010231d53a973.scope: Deactivated successfully.
Jan 23 04:01:44 np0005593234 podman[73987]: 2026-01-23 09:01:44.002820979 +0000 UTC m=+0.022832260 container died a2e1ad934bd7f87549cd7a2fdc60e1aee1d64612860cc8dca5c010231d53a973 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_mendel, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:01:44 np0005593234 systemd[1]: var-lib-containers-storage-overlay-be70b1dba0e2eb10961b1f03474494c55b1c92d6c4966e78d09a7bcbf7bdeb34-merged.mount: Deactivated successfully.
Jan 23 04:01:44 np0005593234 podman[73987]: 2026-01-23 09:01:44.031702666 +0000 UTC m=+0.051713947 container remove a2e1ad934bd7f87549cd7a2fdc60e1aee1d64612860cc8dca5c010231d53a973 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=dreamy_mendel, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:01:44 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:01:44 np0005593234 systemd[1]: libpod-conmon-a2e1ad934bd7f87549cd7a2fdc60e1aee1d64612860cc8dca5c010231d53a973.scope: Deactivated successfully.
Jan 23 04:01:44 np0005593234 podman[74009]: 2026-01-23 09:01:44.172671656 +0000 UTC m=+0.035588787 container create b2413b8825118c9ef41c044fbde37700caae7b9a30a2c50f53c5c3bfe900e66e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_pare, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.license=GPLv2)
Jan 23 04:01:44 np0005593234 systemd[1]: Started libpod-conmon-b2413b8825118c9ef41c044fbde37700caae7b9a30a2c50f53c5c3bfe900e66e.scope.
Jan 23 04:01:44 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:01:44 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92a3760df399e29d05ec8b2c778b1d1d737ccd94d1d32553f1195a5914e142a1/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:44 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/92a3760df399e29d05ec8b2c778b1d1d737ccd94d1d32553f1195a5914e142a1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:44 np0005593234 podman[74009]: 2026-01-23 09:01:44.236956163 +0000 UTC m=+0.099873294 container init b2413b8825118c9ef41c044fbde37700caae7b9a30a2c50f53c5c3bfe900e66e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_pare, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True)
Jan 23 04:01:44 np0005593234 podman[74009]: 2026-01-23 09:01:44.242928778 +0000 UTC m=+0.105845899 container start b2413b8825118c9ef41c044fbde37700caae7b9a30a2c50f53c5c3bfe900e66e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_pare, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef)
Jan 23 04:01:44 np0005593234 podman[74009]: 2026-01-23 09:01:44.24687929 +0000 UTC m=+0.109796451 container attach b2413b8825118c9ef41c044fbde37700caae7b9a30a2c50f53c5c3bfe900e66e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_pare, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:01:44 np0005593234 podman[74009]: 2026-01-23 09:01:44.156038059 +0000 UTC m=+0.018955210 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:45 np0005593234 laughing_pare[74026]: [
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:    {
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:        "available": false,
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:        "ceph_device": false,
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:        "lsm_data": {},
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:        "lvs": [],
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:        "path": "/dev/sr0",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:        "rejected_reasons": [
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "Has a FileSystem",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "Insufficient space (<5GB)"
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:        ],
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:        "sys_api": {
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "actuators": null,
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "device_nodes": "sr0",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "devname": "sr0",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "human_readable_size": "482.00 KB",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "id_bus": "ata",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "model": "QEMU DVD-ROM",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "nr_requests": "2",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "parent": "/dev/sr0",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "partitions": {},
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "path": "/dev/sr0",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "removable": "1",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "rev": "2.5+",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "ro": "0",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "rotational": "1",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "sas_address": "",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "sas_device_handle": "",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "scheduler_mode": "mq-deadline",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "sectors": 0,
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "sectorsize": "2048",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "size": 493568.0,
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "support_discard": "2048",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "type": "disk",
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:            "vendor": "QEMU"
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:        }
Jan 23 04:01:45 np0005593234 laughing_pare[74026]:    }
Jan 23 04:01:45 np0005593234 laughing_pare[74026]: ]
Jan 23 04:01:45 np0005593234 systemd[1]: libpod-b2413b8825118c9ef41c044fbde37700caae7b9a30a2c50f53c5c3bfe900e66e.scope: Deactivated successfully.
Jan 23 04:01:45 np0005593234 systemd[1]: libpod-b2413b8825118c9ef41c044fbde37700caae7b9a30a2c50f53c5c3bfe900e66e.scope: Consumed 1.115s CPU time.
Jan 23 04:01:45 np0005593234 podman[74889]: 2026-01-23 09:01:45.403827231 +0000 UTC m=+0.026106883 container died b2413b8825118c9ef41c044fbde37700caae7b9a30a2c50f53c5c3bfe900e66e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_pare, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Jan 23 04:01:46 np0005593234 systemd[1]: var-lib-containers-storage-overlay-92a3760df399e29d05ec8b2c778b1d1d737ccd94d1d32553f1195a5914e142a1-merged.mount: Deactivated successfully.
Jan 23 04:01:46 np0005593234 podman[74889]: 2026-01-23 09:01:46.751913288 +0000 UTC m=+1.374192910 container remove b2413b8825118c9ef41c044fbde37700caae7b9a30a2c50f53c5c3bfe900e66e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=laughing_pare, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Jan 23 04:01:46 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:01:46 np0005593234 systemd[1]: libpod-conmon-b2413b8825118c9ef41c044fbde37700caae7b9a30a2c50f53c5c3bfe900e66e.scope: Deactivated successfully.
Jan 23 04:01:52 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:01:52 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:01:52 np0005593234 podman[76733]: 2026-01-23 09:01:52.432659108 +0000 UTC m=+0.048794607 container create 023356ba92118abef1aa8fddd76a58677765439bfac7956170aaac970223dbc2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_hawking, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:01:52 np0005593234 systemd[1]: Started libpod-conmon-023356ba92118abef1aa8fddd76a58677765439bfac7956170aaac970223dbc2.scope.
Jan 23 04:01:52 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:01:52 np0005593234 podman[76733]: 2026-01-23 09:01:52.495052866 +0000 UTC m=+0.111188385 container init 023356ba92118abef1aa8fddd76a58677765439bfac7956170aaac970223dbc2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_hawking, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0)
Jan 23 04:01:52 np0005593234 podman[76733]: 2026-01-23 09:01:52.501528258 +0000 UTC m=+0.117663757 container start 023356ba92118abef1aa8fddd76a58677765439bfac7956170aaac970223dbc2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_hawking, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:01:52 np0005593234 podman[76733]: 2026-01-23 09:01:52.407464565 +0000 UTC m=+0.023600084 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:52 np0005593234 podman[76733]: 2026-01-23 09:01:52.50518497 +0000 UTC m=+0.121320489 container attach 023356ba92118abef1aa8fddd76a58677765439bfac7956170aaac970223dbc2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_hawking, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:01:52 np0005593234 intelligent_hawking[76749]: 167 167
Jan 23 04:01:52 np0005593234 systemd[1]: libpod-023356ba92118abef1aa8fddd76a58677765439bfac7956170aaac970223dbc2.scope: Deactivated successfully.
Jan 23 04:01:52 np0005593234 podman[76733]: 2026-01-23 09:01:52.507696969 +0000 UTC m=+0.123832478 container died 023356ba92118abef1aa8fddd76a58677765439bfac7956170aaac970223dbc2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_hawking, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Jan 23 04:01:52 np0005593234 podman[76733]: 2026-01-23 09:01:52.560927253 +0000 UTC m=+0.177062752 container remove 023356ba92118abef1aa8fddd76a58677765439bfac7956170aaac970223dbc2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=intelligent_hawking, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Jan 23 04:01:52 np0005593234 systemd[1]: libpod-conmon-023356ba92118abef1aa8fddd76a58677765439bfac7956170aaac970223dbc2.scope: Deactivated successfully.
Jan 23 04:01:52 np0005593234 podman[76770]: 2026-01-23 09:01:52.622612888 +0000 UTC m=+0.036378001 container create 2f0ac8797e0f1bb1e384ac8b20fc8c7259879ce51b54b5ea1b4b9bc6f5780547 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_moser, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:01:52 np0005593234 systemd[1]: Started libpod-conmon-2f0ac8797e0f1bb1e384ac8b20fc8c7259879ce51b54b5ea1b4b9bc6f5780547.scope.
Jan 23 04:01:52 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:01:52 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/248ac7e62d59f25b9099f0546d80dafd2f1c9d69c72aee58e261ebfe5e5fcd2e/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:52 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/248ac7e62d59f25b9099f0546d80dafd2f1c9d69c72aee58e261ebfe5e5fcd2e/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:52 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/248ac7e62d59f25b9099f0546d80dafd2f1c9d69c72aee58e261ebfe5e5fcd2e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:52 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/248ac7e62d59f25b9099f0546d80dafd2f1c9d69c72aee58e261ebfe5e5fcd2e/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:52 np0005593234 podman[76770]: 2026-01-23 09:01:52.688746653 +0000 UTC m=+0.102511766 container init 2f0ac8797e0f1bb1e384ac8b20fc8c7259879ce51b54b5ea1b4b9bc6f5780547 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_moser, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:01:52 np0005593234 podman[76770]: 2026-01-23 09:01:52.698218787 +0000 UTC m=+0.111983910 container start 2f0ac8797e0f1bb1e384ac8b20fc8c7259879ce51b54b5ea1b4b9bc6f5780547 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_moser, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:01:52 np0005593234 podman[76770]: 2026-01-23 09:01:52.606192308 +0000 UTC m=+0.019957441 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:52 np0005593234 podman[76770]: 2026-01-23 09:01:52.70217605 +0000 UTC m=+0.115941163 container attach 2f0ac8797e0f1bb1e384ac8b20fc8c7259879ce51b54b5ea1b4b9bc6f5780547 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_moser, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef)
Jan 23 04:01:52 np0005593234 systemd[1]: libpod-2f0ac8797e0f1bb1e384ac8b20fc8c7259879ce51b54b5ea1b4b9bc6f5780547.scope: Deactivated successfully.
Jan 23 04:01:52 np0005593234 podman[76770]: 2026-01-23 09:01:52.769852663 +0000 UTC m=+0.183617776 container died 2f0ac8797e0f1bb1e384ac8b20fc8c7259879ce51b54b5ea1b4b9bc6f5780547 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_moser, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS)
Jan 23 04:01:52 np0005593234 podman[76770]: 2026-01-23 09:01:52.799927197 +0000 UTC m=+0.213692340 container remove 2f0ac8797e0f1bb1e384ac8b20fc8c7259879ce51b54b5ea1b4b9bc6f5780547 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=funny_moser, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:01:52 np0005593234 systemd[1]: libpod-conmon-2f0ac8797e0f1bb1e384ac8b20fc8c7259879ce51b54b5ea1b4b9bc6f5780547.scope: Deactivated successfully.
Jan 23 04:01:52 np0005593234 systemd[1]: Reloading.
Jan 23 04:01:52 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:01:52 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:01:53 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:01:53 np0005593234 systemd[1]: Reloading.
Jan 23 04:01:53 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:01:53 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:01:53 np0005593234 systemd[1]: Reached target All Ceph clusters and services.
Jan 23 04:01:53 np0005593234 systemd[1]: Reloading.
Jan 23 04:01:53 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:01:53 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:01:53 np0005593234 systemd[1]: Reached target Ceph cluster e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:01:53 np0005593234 systemd[1]: Reloading.
Jan 23 04:01:53 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:01:53 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:01:53 np0005593234 systemd[1]: Reloading.
Jan 23 04:01:53 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:01:53 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:01:54 np0005593234 systemd[1]: Created slice Slice /system/ceph-e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:01:54 np0005593234 systemd[1]: Reached target System Time Set.
Jan 23 04:01:54 np0005593234 systemd[1]: Reached target System Time Synchronized.
Jan 23 04:01:54 np0005593234 systemd[1]: Starting Ceph mon.compute-2 for e1533653-0a5a-584c-b34b-8689f0d32e77...
Jan 23 04:01:54 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:01:54 np0005593234 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 04:01:54 np0005593234 podman[77065]: 2026-01-23 09:01:54.367491953 +0000 UTC m=+0.055688571 container create 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Jan 23 04:01:54 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15ab8c3255e324f6a6c75c303f831536bde4622bfa64b8addda26ce9afd33734/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:54 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15ab8c3255e324f6a6c75c303f831536bde4622bfa64b8addda26ce9afd33734/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:54 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15ab8c3255e324f6a6c75c303f831536bde4622bfa64b8addda26ce9afd33734/merged/var/lib/ceph/mon/ceph-compute-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:01:54 np0005593234 podman[77065]: 2026-01-23 09:01:54.34581848 +0000 UTC m=+0.034015148 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:01:54 np0005593234 podman[77065]: 2026-01-23 09:01:54.445896678 +0000 UTC m=+0.134093316 container init 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:01:54 np0005593234 podman[77065]: 2026-01-23 09:01:54.453172185 +0000 UTC m=+0.141368803 container start 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3)
Jan 23 04:01:54 np0005593234 bash[77065]: 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f
Jan 23 04:01:54 np0005593234 systemd[1]: Started Ceph mon.compute-2 for e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mon, pid 2
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: pidfile_write: ignore empty --pid-file
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: load: jerasure load: lrc 
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: RocksDB version: 7.9.2
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Git sha 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: DB SUMMARY
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: DB Session ID:  HUKC432V5FL221EKKD8A
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: CURRENT file:  CURRENT
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-2/store.db dir, Total Num: 0, files: 
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-2/store.db: 000004.log size: 511 ; 
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                         Options.error_if_exists: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                       Options.create_if_missing: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                                     Options.env: 0x55f10dd7bc40
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                                      Options.fs: PosixFileSystem
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                                Options.info_log: 0x55f10fc1efc0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                              Options.statistics: (nil)
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                               Options.use_fsync: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                              Options.db_log_dir: 
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                                 Options.wal_dir: 
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                    Options.write_buffer_manager: 0x55f10fc2eb40
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                  Options.unordered_write: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                               Options.row_cache: None
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                              Options.wal_filter: None
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.two_write_queues: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.wal_compression: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.atomic_flush: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.max_background_jobs: 2
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.max_background_compactions: -1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.max_subcompactions: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.max_total_wal_size: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                          Options.max_open_files: -1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:       Options.compaction_readahead_size: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Compression algorithms supported:
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: 	kZSTD supported: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: 	kXpressCompression supported: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: 	kBZip2Compression supported: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: 	kLZ4Compression supported: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: 	kZlibCompression supported: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: 	kSnappyCompression supported: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:           Options.merge_operator: 
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:        Options.compaction_filter: None
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55f10fc1ec00)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55f10fc171f0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 536870912
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:        Options.write_buffer_size: 33554432
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:  Options.max_write_buffer_number: 2
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:          Options.compression: NoCompression
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.num_levels: 7
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-2/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 8e19a509-cda7-49b3-9222-61516e1c69d3
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158914506416, "job": 1, "event": "recovery_started", "wal_files": [4]}
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158914509122, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1648, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158914509232, "job": 1, "event": "recovery_finished"}
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55f10fc40e00
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: DB pointer 0x55f10fcca000
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.0 total, 0.0 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      1/0    1.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012 Sum      1/0    1.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.0 total, 0.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.13 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f10fc171f0#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.7e-05 secs_since: 0#012Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2 does not exist in monmap, will attempt to join an existing cluster
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: using public_addr v2:192.168.122.102:0/0 -> [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0]
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: starting mon.compute-2 rank -1 at public addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] at bind addrs [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-2 fsid e1533653-0a5a-584c-b34b-8689f0d32e77
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(???) e0 preinit fsid e1533653-0a5a-584c-b34b-8689f0d32e77
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).mds e1 new map
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).mds e1 print_map#012e1#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: -1#012 #012No filesystems configured
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e9 e9: 2 total, 1 up, 2 in
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e10 e10: 2 total, 2 up, 2 in
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e11 e11: 2 total, 2 up, 2 in
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e12 e12: 2 total, 2 up, 2 in
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e12 crush map has features 3314933000852226048, adjusting msgr requires
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).osd e12 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: Updating compute-2:/var/lib/ceph/e1533653-0a5a-584c-b34b-8689f0d32e77/config/ceph.conf
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: Updating compute-2:/var/lib/ceph/e1533653-0a5a-584c-b34b-8689f0d32e77/config/ceph.client.admin.keyring
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: Deploying daemon mon.compute-2 on compute-2
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: Cluster is now healthy
Jan 23 04:01:54 np0005593234 ceph-mon[77084]: mon.compute-2@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Jan 23 04:01:56 np0005593234 ceph-mon[77084]: mon.compute-2@-1(probing) e2  my rank is now 1 (was -1)
Jan 23 04:01:56 np0005593234 ceph-mon[77084]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 23 04:01:56 np0005593234 ceph-mon[77084]: paxos.1).electionLogic(0) init, first boot, initializing epoch at 1 
Jan 23 04:01:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:01:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 23 04:01:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 23 04:01:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(electing) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(electing) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e2 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e2 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: mgrc update_daemon_metadata mon.compute-2 metadata {addrs=[v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,ceph_version_when_created=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,created_at=2026-01-23T09:01:52.740477Z,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-2,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,os=Linux}
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: Deploying daemon mon.compute-1 on compute-1
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: mon.compute-0 calling monitor election
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: mon.compute-2 calling monitor election
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.nrjyzu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 04:01:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.nrjyzu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 23 04:02:00 np0005593234 podman[77265]: 2026-01-23 09:02:00.470240962 +0000 UTC m=+0.032844641 container create 114a5ee318800dc7fb36565e629ee1bf2c9a5296a23eefa85be52663ec334200 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_maxwell, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Jan 23 04:02:00 np0005593234 systemd[1]: Started libpod-conmon-114a5ee318800dc7fb36565e629ee1bf2c9a5296a23eefa85be52663ec334200.scope.
Jan 23 04:02:00 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:00 np0005593234 podman[77265]: 2026-01-23 09:02:00.552228729 +0000 UTC m=+0.114832438 container init 114a5ee318800dc7fb36565e629ee1bf2c9a5296a23eefa85be52663ec334200 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_maxwell, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_REF=reef, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:02:00 np0005593234 podman[77265]: 2026-01-23 09:02:00.455723962 +0000 UTC m=+0.018327661 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:00 np0005593234 podman[77265]: 2026-01-23 09:02:00.558658439 +0000 UTC m=+0.121262118 container start 114a5ee318800dc7fb36565e629ee1bf2c9a5296a23eefa85be52663ec334200 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_maxwell, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:02:00 np0005593234 podman[77265]: 2026-01-23 09:02:00.56189997 +0000 UTC m=+0.124503649 container attach 114a5ee318800dc7fb36565e629ee1bf2c9a5296a23eefa85be52663ec334200 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_maxwell, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:02:00 np0005593234 systemd[1]: libpod-114a5ee318800dc7fb36565e629ee1bf2c9a5296a23eefa85be52663ec334200.scope: Deactivated successfully.
Jan 23 04:02:00 np0005593234 musing_maxwell[77281]: 167 167
Jan 23 04:02:00 np0005593234 conmon[77281]: conmon 114a5ee318800dc7fb36 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-114a5ee318800dc7fb36565e629ee1bf2c9a5296a23eefa85be52663ec334200.scope/container/memory.events
Jan 23 04:02:00 np0005593234 podman[77265]: 2026-01-23 09:02:00.56513389 +0000 UTC m=+0.127737569 container died 114a5ee318800dc7fb36565e629ee1bf2c9a5296a23eefa85be52663ec334200 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_maxwell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Jan 23 04:02:00 np0005593234 systemd[1]: var-lib-containers-storage-overlay-84810e91144778c31c9c0b4f5857d6a26272b5940443e41ce6349197d7f57c21-merged.mount: Deactivated successfully.
Jan 23 04:02:00 np0005593234 podman[77265]: 2026-01-23 09:02:00.597588069 +0000 UTC m=+0.160191748 container remove 114a5ee318800dc7fb36565e629ee1bf2c9a5296a23eefa85be52663ec334200 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=musing_maxwell, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0)
Jan 23 04:02:00 np0005593234 systemd[1]: libpod-conmon-114a5ee318800dc7fb36565e629ee1bf2c9a5296a23eefa85be52663ec334200.scope: Deactivated successfully.
Jan 23 04:02:00 np0005593234 systemd[1]: Reloading.
Jan 23 04:02:00 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:02:00 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:02:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e2  adding peer [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] to list of hints
Jan 23 04:02:00 np0005593234 ceph-mon[77084]: log_channel(cluster) log [INF] : mon.compute-2 calling monitor election
Jan 23 04:02:00 np0005593234 ceph-mon[77084]: paxos.1).electionLogic(10) init, last seen epoch 10
Jan 23 04:02:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:02:00 np0005593234 systemd[1]: Reloading.
Jan 23 04:02:00 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:02:00 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:02:01 np0005593234 systemd[1]: Starting Ceph mgr.compute-2.nrjyzu for e1533653-0a5a-584c-b34b-8689f0d32e77...
Jan 23 04:02:01 np0005593234 podman[77429]: 2026-01-23 09:02:01.354512271 +0000 UTC m=+0.037353000 container create f6d49bbf16ca28bdccb1d86737e99c938b315375470edae32266d040ff94af31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:02:01 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeb31b05fd79d298656a23512b5156ee6ca4c7b1505deb63f8539b5f64c55c0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:01 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeb31b05fd79d298656a23512b5156ee6ca4c7b1505deb63f8539b5f64c55c0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:01 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeb31b05fd79d298656a23512b5156ee6ca4c7b1505deb63f8539b5f64c55c0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:01 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7eeb31b05fd79d298656a23512b5156ee6ca4c7b1505deb63f8539b5f64c55c0/merged/var/lib/ceph/mgr/ceph-compute-2.nrjyzu supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:01 np0005593234 podman[77429]: 2026-01-23 09:02:01.401422769 +0000 UTC m=+0.084263518 container init f6d49bbf16ca28bdccb1d86737e99c938b315375470edae32266d040ff94af31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu, ceph=True, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:02:01 np0005593234 podman[77429]: 2026-01-23 09:02:01.40790777 +0000 UTC m=+0.090748519 container start f6d49bbf16ca28bdccb1d86737e99c938b315375470edae32266d040ff94af31 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:02:01 np0005593234 bash[77429]: f6d49bbf16ca28bdccb1d86737e99c938b315375470edae32266d040ff94af31
Jan 23 04:02:01 np0005593234 podman[77429]: 2026-01-23 09:02:01.337748981 +0000 UTC m=+0.020589730 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:01 np0005593234 systemd[1]: Started Ceph mgr.compute-2.nrjyzu for e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:02:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 23 04:02:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 23 04:02:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 23 04:02:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 23 04:02:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 23 04:02:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 23 04:02:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(electing) e3 handle_auth_request failed to assign global_id
Jan 23 04:02:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:02:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 23 04:02:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 04:02:05 np0005593234 ceph-mgr[77448]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:02:05 np0005593234 ceph-mgr[77448]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mgr, pid 2
Jan 23 04:02:05 np0005593234 ceph-mgr[77448]: pidfile_write: ignore empty --pid-file
Jan 23 04:02:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_auth_request failed to assign global_id
Jan 23 04:02:05 np0005593234 ceph-mon[77084]: mon.compute-0 calling monitor election
Jan 23 04:02:05 np0005593234 ceph-mon[77084]: mon.compute-2 calling monitor election
Jan 23 04:02:05 np0005593234 ceph-mon[77084]: mon.compute-1 calling monitor election
Jan 23 04:02:05 np0005593234 ceph-mon[77084]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 23 04:02:05 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 04:02:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:05 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'alerts'
Jan 23 04:02:06 np0005593234 ceph-mgr[77448]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:02:06 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'balancer'
Jan 23 04:02:06 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:06.343+0000 7fd1ba9ce140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 04:02:06 np0005593234 ceph-mgr[77448]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:02:06 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'cephadm'
Jan 23 04:02:06 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:06.627+0000 7fd1ba9ce140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 04:02:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.wsgywz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 04:02:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.wsgywz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 23 04:02:08 np0005593234 ceph-mon[77084]: Deploying daemon mgr.compute-1.wsgywz on compute-1
Jan 23 04:02:08 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/936567403' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 23 04:02:08 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'crash'
Jan 23 04:02:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e13 e13: 2 total, 2 up, 2 in
Jan 23 04:02:09 np0005593234 ceph-mgr[77448]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:02:09 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'dashboard'
Jan 23 04:02:09 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:09.010+0000 7fd1ba9ce140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 04:02:09 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/936567403' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "vms", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 23 04:02:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e13 _set_new_cache_sizes cache_size:1019934905 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:09 np0005593234 podman[77624]: 2026-01-23 09:02:09.741548763 +0000 UTC m=+0.020078815 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e14 e14: 2 total, 2 up, 2 in
Jan 23 04:02:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 04:02:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 23 04:02:10 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'devicehealth'
Jan 23 04:02:10 np0005593234 ceph-mgr[77448]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:02:10 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 04:02:10 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:10.920+0000 7fd1ba9ce140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 04:02:11 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 04:02:11 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 04:02:11 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]:  from numpy import show_config as show_numpy_config
Jan 23 04:02:11 np0005593234 ceph-mgr[77448]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:02:11 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:11.487+0000 7fd1ba9ce140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 04:02:11 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'influx'
Jan 23 04:02:11 np0005593234 ceph-mgr[77448]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:02:11 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'insights'
Jan 23 04:02:11 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:11.731+0000 7fd1ba9ce140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 04:02:12 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'iostat'
Jan 23 04:02:12 np0005593234 ceph-mgr[77448]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:02:12 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:12.252+0000 7fd1ba9ce140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 04:02:12 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'k8sevents'
Jan 23 04:02:12 np0005593234 podman[77624]: 2026-01-23 09:02:12.968842719 +0000 UTC m=+3.247372771 container create 0758b64d5fe06a12175b48f3f82edfeec2a2689f5cc14710ef022b629758c87e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_wiles, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Jan 23 04:02:13 np0005593234 systemd[1]: Started libpod-conmon-0758b64d5fe06a12175b48f3f82edfeec2a2689f5cc14710ef022b629758c87e.scope.
Jan 23 04:02:13 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:13 np0005593234 podman[77624]: 2026-01-23 09:02:13.101931199 +0000 UTC m=+3.380461271 container init 0758b64d5fe06a12175b48f3f82edfeec2a2689f5cc14710ef022b629758c87e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_wiles, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3)
Jan 23 04:02:13 np0005593234 podman[77624]: 2026-01-23 09:02:13.109172755 +0000 UTC m=+3.387702797 container start 0758b64d5fe06a12175b48f3f82edfeec2a2689f5cc14710ef022b629758c87e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_wiles, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Jan 23 04:02:13 np0005593234 podman[77624]: 2026-01-23 09:02:13.113295162 +0000 UTC m=+3.391825234 container attach 0758b64d5fe06a12175b48f3f82edfeec2a2689f5cc14710ef022b629758c87e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_wiles, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:02:13 np0005593234 pensive_wiles[77641]: 167 167
Jan 23 04:02:13 np0005593234 systemd[1]: libpod-0758b64d5fe06a12175b48f3f82edfeec2a2689f5cc14710ef022b629758c87e.scope: Deactivated successfully.
Jan 23 04:02:13 np0005593234 podman[77624]: 2026-01-23 09:02:13.114746508 +0000 UTC m=+3.393276550 container died 0758b64d5fe06a12175b48f3f82edfeec2a2689f5cc14710ef022b629758c87e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_wiles, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:02:13 np0005593234 systemd[1]: var-lib-containers-storage-overlay-8921a12fddf5d6cbe85ca3c365f927b6b0af31b6e9b45921d4689b651c502b4e-merged.mount: Deactivated successfully.
Jan 23 04:02:13 np0005593234 podman[77624]: 2026-01-23 09:02:13.147405071 +0000 UTC m=+3.425935113 container remove 0758b64d5fe06a12175b48f3f82edfeec2a2689f5cc14710ef022b629758c87e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=pensive_wiles, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:02:13 np0005593234 systemd[1]: libpod-conmon-0758b64d5fe06a12175b48f3f82edfeec2a2689f5cc14710ef022b629758c87e.scope: Deactivated successfully.
Jan 23 04:02:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e15 e15: 2 total, 2 up, 2 in
Jan 23 04:02:13 np0005593234 ceph-mon[77084]: Deploying daemon crash.compute-2 on compute-2
Jan 23 04:02:13 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/2880218519' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 23 04:02:13 np0005593234 ceph-mon[77084]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:02:13 np0005593234 systemd[1]: Reloading.
Jan 23 04:02:13 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:02:13 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:02:13 np0005593234 systemd[1]: Reloading.
Jan 23 04:02:13 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:02:13 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:02:13 np0005593234 systemd[1]: Starting Ceph crash.compute-2 for e1533653-0a5a-584c-b34b-8689f0d32e77...
Jan 23 04:02:13 np0005593234 podman[77785]: 2026-01-23 09:02:13.952307243 +0000 UTC m=+0.041556242 container create 5d87f145630e65c59ad06267901d8bc9c5a7b69470c8282e1b9faa75c35cd6c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2, CEPH_REF=reef, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:02:13 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78ea63103c31dd12e4c524e0179bf489763d18735558a5af4518a4b216989a6c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:13 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78ea63103c31dd12e4c524e0179bf489763d18735558a5af4518a4b216989a6c/merged/etc/ceph/ceph.client.crash.compute-2.keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:13 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78ea63103c31dd12e4c524e0179bf489763d18735558a5af4518a4b216989a6c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:13 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/78ea63103c31dd12e4c524e0179bf489763d18735558a5af4518a4b216989a6c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:14 np0005593234 podman[77785]: 2026-01-23 09:02:14.021984925 +0000 UTC m=+0.111233944 container init 5d87f145630e65c59ad06267901d8bc9c5a7b69470c8282e1b9faa75c35cd6c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:02:14 np0005593234 podman[77785]: 2026-01-23 09:02:14.027847338 +0000 UTC m=+0.117096337 container start 5d87f145630e65c59ad06267901d8bc9c5a7b69470c8282e1b9faa75c35cd6c3 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:02:14 np0005593234 podman[77785]: 2026-01-23 09:02:13.933470608 +0000 UTC m=+0.022719617 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:14 np0005593234 bash[77785]: 5d87f145630e65c59ad06267901d8bc9c5a7b69470c8282e1b9faa75c35cd6c3
Jan 23 04:02:14 np0005593234 systemd[1]: Started Ceph crash.compute-2 for e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:02:14 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'localpool'
Jan 23 04:02:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: INFO:ceph-crash:pinging cluster to exercise our key
Jan 23 04:02:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: 2026-01-23T09:02:14.432+0000 7fdc119f6640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 23 04:02:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: 2026-01-23T09:02:14.432+0000 7fdc119f6640 -1 AuthRegistry(0x7fdc0c0675b0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 23 04:02:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: 2026-01-23T09:02:14.433+0000 7fdc119f6640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 23 04:02:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: 2026-01-23T09:02:14.433+0000 7fdc119f6640 -1 AuthRegistry(0x7fdc119f5000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 23 04:02:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: 2026-01-23T09:02:14.434+0000 7fdc0affd640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 23 04:02:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: 2026-01-23T09:02:14.436+0000 7fdc0b7fe640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 23 04:02:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: 2026-01-23T09:02:14.436+0000 7fdc0a7fc640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 23 04:02:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: 2026-01-23T09:02:14.436+0000 7fdc119f6640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Jan 23 04:02:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: [errno 13] RADOS permission denied (error connecting to the cluster)
Jan 23 04:02:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Jan 23 04:02:14 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 04:02:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e15 _set_new_cache_sizes cache_size:1020053282 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:15 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'mirroring'
Jan 23 04:02:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e16 e16: 2 total, 2 up, 2 in
Jan 23 04:02:15 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:15 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/2880218519' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "volumes", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 23 04:02:15 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'nfs'
Jan 23 04:02:16 np0005593234 podman[77959]: 2026-01-23 09:02:16.246163746 +0000 UTC m=+0.055264046 container create 51cf98661442e303d5a8a7deaa461ea0ab59bb7d73dae1cb00396a986b11e3bc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_tesla, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 04:02:16 np0005593234 ceph-mgr[77448]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:02:16 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'orchestrator'
Jan 23 04:02:16 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:16.265+0000 7fd1ba9ce140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 04:02:16 np0005593234 systemd[72548]: Starting Mark boot as successful...
Jan 23 04:02:16 np0005593234 systemd[72548]: Finished Mark boot as successful.
Jan 23 04:02:16 np0005593234 systemd[1]: Started libpod-conmon-51cf98661442e303d5a8a7deaa461ea0ab59bb7d73dae1cb00396a986b11e3bc.scope.
Jan 23 04:02:16 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:16 np0005593234 podman[77959]: 2026-01-23 09:02:16.226823366 +0000 UTC m=+0.035923696 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:16 np0005593234 podman[77959]: 2026-01-23 09:02:16.32908441 +0000 UTC m=+0.138184760 container init 51cf98661442e303d5a8a7deaa461ea0ab59bb7d73dae1cb00396a986b11e3bc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_tesla, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 23 04:02:16 np0005593234 podman[77959]: 2026-01-23 09:02:16.336687016 +0000 UTC m=+0.145787316 container start 51cf98661442e303d5a8a7deaa461ea0ab59bb7d73dae1cb00396a986b11e3bc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_tesla, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:02:16 np0005593234 podman[77959]: 2026-01-23 09:02:16.343527288 +0000 UTC m=+0.152627638 container attach 51cf98661442e303d5a8a7deaa461ea0ab59bb7d73dae1cb00396a986b11e3bc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_tesla, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:02:16 np0005593234 condescending_tesla[77977]: 167 167
Jan 23 04:02:16 np0005593234 systemd[1]: libpod-51cf98661442e303d5a8a7deaa461ea0ab59bb7d73dae1cb00396a986b11e3bc.scope: Deactivated successfully.
Jan 23 04:02:16 np0005593234 podman[77959]: 2026-01-23 09:02:16.34487336 +0000 UTC m=+0.153973660 container died 51cf98661442e303d5a8a7deaa461ea0ab59bb7d73dae1cb00396a986b11e3bc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_tesla, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True)
Jan 23 04:02:16 np0005593234 systemd[1]: var-lib-containers-storage-overlay-85f3a1a8e7ad1a520bc8e4d4260135b992f8dc1373007eddcabff758ac4eb920-merged.mount: Deactivated successfully.
Jan 23 04:02:16 np0005593234 podman[77959]: 2026-01-23 09:02:16.38258802 +0000 UTC m=+0.191688320 container remove 51cf98661442e303d5a8a7deaa461ea0ab59bb7d73dae1cb00396a986b11e3bc (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=condescending_tesla, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:02:16 np0005593234 systemd[1]: libpod-conmon-51cf98661442e303d5a8a7deaa461ea0ab59bb7d73dae1cb00396a986b11e3bc.scope: Deactivated successfully.
Jan 23 04:02:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e17 e17: 2 total, 2 up, 2 in
Jan 23 04:02:16 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/136452763' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 23 04:02:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:02:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:02:16 np0005593234 podman[78001]: 2026-01-23 09:02:16.55879087 +0000 UTC m=+0.061656875 container create cd31a4eaf64c84133fa4ac0e87c89d2d6d614ace7f64acebcafaf0579574a955 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_williams, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:02:16 np0005593234 systemd[1]: Started libpod-conmon-cd31a4eaf64c84133fa4ac0e87c89d2d6d614ace7f64acebcafaf0579574a955.scope.
Jan 23 04:02:16 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:16 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a13f5c0f5014abe5c2c9ae79edc72cb9a2882706cc026bdd8ec3ee22cb52a17f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:16 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a13f5c0f5014abe5c2c9ae79edc72cb9a2882706cc026bdd8ec3ee22cb52a17f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:16 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a13f5c0f5014abe5c2c9ae79edc72cb9a2882706cc026bdd8ec3ee22cb52a17f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:16 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a13f5c0f5014abe5c2c9ae79edc72cb9a2882706cc026bdd8ec3ee22cb52a17f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:16 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a13f5c0f5014abe5c2c9ae79edc72cb9a2882706cc026bdd8ec3ee22cb52a17f/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:16 np0005593234 podman[78001]: 2026-01-23 09:02:16.54173818 +0000 UTC m=+0.044604215 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:16 np0005593234 podman[78001]: 2026-01-23 09:02:16.654061777 +0000 UTC m=+0.156927802 container init cd31a4eaf64c84133fa4ac0e87c89d2d6d614ace7f64acebcafaf0579574a955 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_williams, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 23 04:02:16 np0005593234 podman[78001]: 2026-01-23 09:02:16.661004352 +0000 UTC m=+0.163870357 container start cd31a4eaf64c84133fa4ac0e87c89d2d6d614ace7f64acebcafaf0579574a955 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_williams, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:02:16 np0005593234 podman[78001]: 2026-01-23 09:02:16.664988376 +0000 UTC m=+0.167854401 container attach cd31a4eaf64c84133fa4ac0e87c89d2d6d614ace7f64acebcafaf0579574a955 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_williams, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3)
Jan 23 04:02:16 np0005593234 ceph-mgr[77448]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:02:16 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 04:02:16 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:16.990+0000 7fd1ba9ce140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 04:02:17 np0005593234 ceph-mgr[77448]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:02:17 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:17.273+0000 7fd1ba9ce140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 04:02:17 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'osd_support'
Jan 23 04:02:17 np0005593234 cool_williams[78018]: --> passed data devices: 0 physical, 1 LVM
Jan 23 04:02:17 np0005593234 cool_williams[78018]: --> relative data size: 1.0
Jan 23 04:02:17 np0005593234 cool_williams[78018]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:02:17 np0005593234 cool_williams[78018]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 1694e9fb-559e-40c4-a465-98d21c9c2b03
Jan 23 04:02:17 np0005593234 ceph-mgr[77448]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:02:17 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 04:02:17 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:17.529+0000 7fd1ba9ce140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 04:02:17 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/136452763' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "backups", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 23 04:02:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e18 e18: 2 total, 2 up, 2 in
Jan 23 04:02:17 np0005593234 ceph-mgr[77448]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:02:17 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'progress'
Jan 23 04:02:17 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:17.819+0000 7fd1ba9ce140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 04:02:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd new", "uuid": "1694e9fb-559e-40c4-a465-98d21c9c2b03"} v 0) v1
Jan 23 04:02:17 np0005593234 ceph-mon[77084]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/4125312751' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "1694e9fb-559e-40c4-a465-98d21c9c2b03"}]: dispatch
Jan 23 04:02:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e19 e19: 3 total, 2 up, 3 in
Jan 23 04:02:18 np0005593234 cool_williams[78018]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 04:02:18 np0005593234 lvm[78066]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 04:02:18 np0005593234 lvm[78066]: VG ceph_vg0 finished
Jan 23 04:02:18 np0005593234 cool_williams[78018]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Jan 23 04:02:18 np0005593234 cool_williams[78018]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Jan 23 04:02:18 np0005593234 cool_williams[78018]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:02:18 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:18.073+0000 7fd1ba9ce140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:02:18 np0005593234 ceph-mgr[77448]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 04:02:18 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'prometheus'
Jan 23 04:02:18 np0005593234 cool_williams[78018]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 23 04:02:18 np0005593234 cool_williams[78018]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Jan 23 04:02:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon getmap"} v 0) v1
Jan 23 04:02:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3977154631' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Jan 23 04:02:18 np0005593234 cool_williams[78018]: stderr: got monmap epoch 3
Jan 23 04:02:18 np0005593234 cool_williams[78018]: --> Creating keyring file for osd.2
Jan 23 04:02:18 np0005593234 cool_williams[78018]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Jan 23 04:02:18 np0005593234 cool_williams[78018]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Jan 23 04:02:18 np0005593234 cool_williams[78018]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid 1694e9fb-559e-40c4-a465-98d21c9c2b03 --setuser ceph --setgroup ceph
Jan 23 04:02:18 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.102:0/4125312751' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "1694e9fb-559e-40c4-a465-98d21c9c2b03"}]: dispatch
Jan 23 04:02:18 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "1694e9fb-559e-40c4-a465-98d21c9c2b03"}]: dispatch
Jan 23 04:02:18 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "1694e9fb-559e-40c4-a465-98d21c9c2b03"}]': finished
Jan 23 04:02:18 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:18 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/3224534207' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 23 04:02:18 np0005593234 ceph-mon[77084]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:02:19 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:19.167+0000 7fd1ba9ce140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:02:19 np0005593234 ceph-mgr[77448]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 04:02:19 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'rbd_support'
Jan 23 04:02:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e20 e20: 3 total, 2 up, 3 in
Jan 23 04:02:19 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:19.480+0000 7fd1ba9ce140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:02:19 np0005593234 ceph-mgr[77448]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 04:02:19 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'restful'
Jan 23 04:02:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e20 _set_new_cache_sizes cache_size:1020054714 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e21 e21: 3 total, 2 up, 3 in
Jan 23 04:02:20 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/3224534207' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "images", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 23 04:02:20 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'rgw'
Jan 23 04:02:20 np0005593234 cool_williams[78018]: stderr: 2026-01-23T09:02:18.606+0000 7f7ffc39f740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 23 04:02:20 np0005593234 cool_williams[78018]: stderr: 2026-01-23T09:02:18.607+0000 7f7ffc39f740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 23 04:02:20 np0005593234 cool_williams[78018]: stderr: 2026-01-23T09:02:18.607+0000 7f7ffc39f740 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Jan 23 04:02:20 np0005593234 cool_williams[78018]: stderr: 2026-01-23T09:02:18.607+0000 7f7ffc39f740 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Jan 23 04:02:20 np0005593234 cool_williams[78018]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Jan 23 04:02:20 np0005593234 cool_williams[78018]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 04:02:20 np0005593234 cool_williams[78018]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Jan 23 04:02:20 np0005593234 cool_williams[78018]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 23 04:02:20 np0005593234 cool_williams[78018]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Jan 23 04:02:20 np0005593234 cool_williams[78018]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:02:20 np0005593234 cool_williams[78018]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 04:02:20 np0005593234 cool_williams[78018]: --> ceph-volume lvm activate successful for osd ID: 2
Jan 23 04:02:20 np0005593234 cool_williams[78018]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Jan 23 04:02:20 np0005593234 systemd[1]: libpod-cd31a4eaf64c84133fa4ac0e87c89d2d6d614ace7f64acebcafaf0579574a955.scope: Deactivated successfully.
Jan 23 04:02:20 np0005593234 podman[78001]: 2026-01-23 09:02:20.950903015 +0000 UTC m=+4.453769020 container died cd31a4eaf64c84133fa4ac0e87c89d2d6d614ace7f64acebcafaf0579574a955 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_williams, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:02:20 np0005593234 systemd[1]: libpod-cd31a4eaf64c84133fa4ac0e87c89d2d6d614ace7f64acebcafaf0579574a955.scope: Consumed 2.425s CPU time.
Jan 23 04:02:20 np0005593234 systemd[1]: var-lib-containers-storage-overlay-a13f5c0f5014abe5c2c9ae79edc72cb9a2882706cc026bdd8ec3ee22cb52a17f-merged.mount: Deactivated successfully.
Jan 23 04:02:21 np0005593234 podman[78001]: 2026-01-23 09:02:21.01164376 +0000 UTC m=+4.514509765 container remove cd31a4eaf64c84133fa4ac0e87c89d2d6d614ace7f64acebcafaf0579574a955 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=cool_williams, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 23 04:02:21 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:21.022+0000 7fd1ba9ce140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:02:21 np0005593234 ceph-mgr[77448]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 04:02:21 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'rook'
Jan 23 04:02:21 np0005593234 systemd[1]: libpod-conmon-cd31a4eaf64c84133fa4ac0e87c89d2d6d614ace7f64acebcafaf0579574a955.scope: Deactivated successfully.
Jan 23 04:02:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e22 e22: 3 total, 2 up, 3 in
Jan 23 04:02:21 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/2806909960' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 23 04:02:21 np0005593234 podman[79117]: 2026-01-23 09:02:21.61382908 +0000 UTC m=+0.064960417 container create 065a8515b1ab16244f4806d3c63bb985dce2810342948dd278a8ec167f37635a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_saha, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_REF=reef)
Jan 23 04:02:21 np0005593234 podman[79117]: 2026-01-23 09:02:21.575119669 +0000 UTC m=+0.026251026 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:21 np0005593234 systemd[1]: Started libpod-conmon-065a8515b1ab16244f4806d3c63bb985dce2810342948dd278a8ec167f37635a.scope.
Jan 23 04:02:21 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:21 np0005593234 podman[79117]: 2026-01-23 09:02:21.784147146 +0000 UTC m=+0.235278513 container init 065a8515b1ab16244f4806d3c63bb985dce2810342948dd278a8ec167f37635a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_saha, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:02:21 np0005593234 podman[79117]: 2026-01-23 09:02:21.792531346 +0000 UTC m=+0.243662693 container start 065a8515b1ab16244f4806d3c63bb985dce2810342948dd278a8ec167f37635a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_saha, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:02:21 np0005593234 podman[79117]: 2026-01-23 09:02:21.795885221 +0000 UTC m=+0.247016558 container attach 065a8515b1ab16244f4806d3c63bb985dce2810342948dd278a8ec167f37635a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_saha, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:02:21 np0005593234 vigorous_saha[79133]: 167 167
Jan 23 04:02:21 np0005593234 systemd[1]: libpod-065a8515b1ab16244f4806d3c63bb985dce2810342948dd278a8ec167f37635a.scope: Deactivated successfully.
Jan 23 04:02:21 np0005593234 podman[79117]: 2026-01-23 09:02:21.798951776 +0000 UTC m=+0.250083113 container died 065a8515b1ab16244f4806d3c63bb985dce2810342948dd278a8ec167f37635a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_saha, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default)
Jan 23 04:02:21 np0005593234 systemd[1]: var-lib-containers-storage-overlay-bd42fe022b4e3e2148df630a4fd9fb18453d9cbab4b0b7bdfd3d1ea50fe94230-merged.mount: Deactivated successfully.
Jan 23 04:02:21 np0005593234 podman[79117]: 2026-01-23 09:02:21.840140554 +0000 UTC m=+0.291271881 container remove 065a8515b1ab16244f4806d3c63bb985dce2810342948dd278a8ec167f37635a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=vigorous_saha, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:02:21 np0005593234 systemd[1]: libpod-conmon-065a8515b1ab16244f4806d3c63bb985dce2810342948dd278a8ec167f37635a.scope: Deactivated successfully.
Jan 23 04:02:22 np0005593234 podman[79157]: 2026-01-23 09:02:22.062846167 +0000 UTC m=+0.095863497 container create 1f7324f1ab6b25cd1e36b5dcc0e07d4fe8459f58d0f5ff3b1827a7ed0ad3f920 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_maxwell, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:02:22 np0005593234 podman[79157]: 2026-01-23 09:02:21.99596799 +0000 UTC m=+0.028985320 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:22 np0005593234 systemd[1]: Started libpod-conmon-1f7324f1ab6b25cd1e36b5dcc0e07d4fe8459f58d0f5ff3b1827a7ed0ad3f920.scope.
Jan 23 04:02:22 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:22 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/2806909960' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 23 04:02:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:22 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e1f026102d211ba66786638f6e8465cdbbf09bfc44f4f8a0d79351d19e911c5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:22 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e1f026102d211ba66786638f6e8465cdbbf09bfc44f4f8a0d79351d19e911c5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:22 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e1f026102d211ba66786638f6e8465cdbbf09bfc44f4f8a0d79351d19e911c5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:22 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9e1f026102d211ba66786638f6e8465cdbbf09bfc44f4f8a0d79351d19e911c5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e23 e23: 3 total, 2 up, 3 in
Jan 23 04:02:22 np0005593234 podman[79157]: 2026-01-23 09:02:22.163645385 +0000 UTC m=+0.196662725 container init 1f7324f1ab6b25cd1e36b5dcc0e07d4fe8459f58d0f5ff3b1827a7ed0ad3f920 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_maxwell, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:02:22 np0005593234 podman[79157]: 2026-01-23 09:02:22.171426167 +0000 UTC m=+0.204443467 container start 1f7324f1ab6b25cd1e36b5dcc0e07d4fe8459f58d0f5ff3b1827a7ed0ad3f920 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_maxwell, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3)
Jan 23 04:02:22 np0005593234 podman[79157]: 2026-01-23 09:02:22.174923264 +0000 UTC m=+0.207940594 container attach 1f7324f1ab6b25cd1e36b5dcc0e07d4fe8459f58d0f5ff3b1827a7ed0ad3f920 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_maxwell, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]: {
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:    "2": [
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:        {
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:            "devices": [
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:                "/dev/loop3"
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:            ],
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:            "lv_name": "ceph_lv0",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:            "lv_path": "/dev/ceph_vg0/ceph_lv0",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:            "lv_size": "7511998464",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:            "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=gjpadf-yXLV-C91T-Rusq-xLSD-s18Y-eoCNx5,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=e1533653-0a5a-584c-b34b-8689f0d32e77,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=1694e9fb-559e-40c4-a465-98d21c9c2b03,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:            "lv_uuid": "gjpadf-yXLV-C91T-Rusq-xLSD-s18Y-eoCNx5",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:            "name": "ceph_lv0",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:            "path": "/dev/ceph_vg0/ceph_lv0",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:            "tags": {
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:                "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:                "ceph.block_uuid": "gjpadf-yXLV-C91T-Rusq-xLSD-s18Y-eoCNx5",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:                "ceph.cephx_lockbox_secret": "",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:                "ceph.cluster_fsid": "e1533653-0a5a-584c-b34b-8689f0d32e77",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:                "ceph.cluster_name": "ceph",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:                "ceph.crush_device_class": "",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:                "ceph.encrypted": "0",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:                "ceph.osd_fsid": "1694e9fb-559e-40c4-a465-98d21c9c2b03",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:                "ceph.osd_id": "2",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:                "ceph.osdspec_affinity": "default_drive_group",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:                "ceph.type": "block",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:                "ceph.vdo": "0"
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:            },
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:            "type": "block",
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:            "vg_name": "ceph_vg0"
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:        }
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]:    ]
Jan 23 04:02:22 np0005593234 quirky_maxwell[79173]: }
Jan 23 04:02:22 np0005593234 systemd[1]: libpod-1f7324f1ab6b25cd1e36b5dcc0e07d4fe8459f58d0f5ff3b1827a7ed0ad3f920.scope: Deactivated successfully.
Jan 23 04:02:22 np0005593234 podman[79157]: 2026-01-23 09:02:22.993778629 +0000 UTC m=+1.026795949 container died 1f7324f1ab6b25cd1e36b5dcc0e07d4fe8459f58d0f5ff3b1827a7ed0ad3f920 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_maxwell, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:02:23 np0005593234 systemd[1]: var-lib-containers-storage-overlay-9e1f026102d211ba66786638f6e8465cdbbf09bfc44f4f8a0d79351d19e911c5-merged.mount: Deactivated successfully.
Jan 23 04:02:23 np0005593234 podman[79157]: 2026-01-23 09:02:23.063872475 +0000 UTC m=+1.096889785 container remove 1f7324f1ab6b25cd1e36b5dcc0e07d4fe8459f58d0f5ff3b1827a7ed0ad3f920 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=quirky_maxwell, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:02:23 np0005593234 systemd[1]: libpod-conmon-1f7324f1ab6b25cd1e36b5dcc0e07d4fe8459f58d0f5ff3b1827a7ed0ad3f920.scope: Deactivated successfully.
Jan 23 04:02:23 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/3443060591' entity='client.admin' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]: dispatch
Jan 23 04:02:23 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Jan 23 04:02:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e24 e24: 3 total, 2 up, 3 in
Jan 23 04:02:23 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:23.352+0000 7fd1ba9ce140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:02:23 np0005593234 ceph-mgr[77448]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 04:02:23 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'selftest'
Jan 23 04:02:23 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:23.624+0000 7fd1ba9ce140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:02:23 np0005593234 ceph-mgr[77448]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 04:02:23 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'snap_schedule'
Jan 23 04:02:23 np0005593234 podman[79336]: 2026-01-23 09:02:23.693423614 +0000 UTC m=+0.109413767 container create d2d806f7b822c41a92b1c06e7d32cec17e6e4ebb68b5bb8ecddb873c8718b7a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_stonebraker, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:02:23 np0005593234 podman[79336]: 2026-01-23 09:02:23.605457834 +0000 UTC m=+0.021448027 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:23 np0005593234 systemd[1]: Started libpod-conmon-d2d806f7b822c41a92b1c06e7d32cec17e6e4ebb68b5bb8ecddb873c8718b7a8.scope.
Jan 23 04:02:23 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:23 np0005593234 podman[79336]: 2026-01-23 09:02:23.801529649 +0000 UTC m=+0.217519842 container init d2d806f7b822c41a92b1c06e7d32cec17e6e4ebb68b5bb8ecddb873c8718b7a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_stonebraker, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Jan 23 04:02:23 np0005593234 podman[79336]: 2026-01-23 09:02:23.835983849 +0000 UTC m=+0.251974012 container start d2d806f7b822c41a92b1c06e7d32cec17e6e4ebb68b5bb8ecddb873c8718b7a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_stonebraker, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:02:23 np0005593234 gallant_stonebraker[79352]: 167 167
Jan 23 04:02:23 np0005593234 systemd[1]: libpod-d2d806f7b822c41a92b1c06e7d32cec17e6e4ebb68b5bb8ecddb873c8718b7a8.scope: Deactivated successfully.
Jan 23 04:02:23 np0005593234 podman[79336]: 2026-01-23 09:02:23.854396971 +0000 UTC m=+0.270387134 container attach d2d806f7b822c41a92b1c06e7d32cec17e6e4ebb68b5bb8ecddb873c8718b7a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_stonebraker, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS)
Jan 23 04:02:23 np0005593234 podman[79336]: 2026-01-23 09:02:23.855004399 +0000 UTC m=+0.270994562 container died d2d806f7b822c41a92b1c06e7d32cec17e6e4ebb68b5bb8ecddb873c8718b7a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_stonebraker, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:02:23 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:23.899+0000 7fd1ba9ce140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:02:23 np0005593234 ceph-mgr[77448]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 04:02:23 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'stats'
Jan 23 04:02:23 np0005593234 systemd[1]: var-lib-containers-storage-overlay-88c6531d2f449aaf9d8a58321f0007b1c9a4d839e79261dad2a3a2c08eda9573-merged.mount: Deactivated successfully.
Jan 23 04:02:23 np0005593234 podman[79336]: 2026-01-23 09:02:23.96394656 +0000 UTC m=+0.379936723 container remove d2d806f7b822c41a92b1c06e7d32cec17e6e4ebb68b5bb8ecddb873c8718b7a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=gallant_stonebraker, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 04:02:23 np0005593234 systemd[1]: libpod-conmon-d2d806f7b822c41a92b1c06e7d32cec17e6e4ebb68b5bb8ecddb873c8718b7a8.scope: Deactivated successfully.
Jan 23 04:02:24 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'status'
Jan 23 04:02:24 np0005593234 podman[79385]: 2026-01-23 09:02:24.237361457 +0000 UTC m=+0.042485490 container create cb7951f9491be62554d18a21aeff599099f8503d31ba220ce90083578d305d28 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507)
Jan 23 04:02:24 np0005593234 systemd[1]: Started libpod-conmon-cb7951f9491be62554d18a21aeff599099f8503d31ba220ce90083578d305d28.scope.
Jan 23 04:02:24 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:24 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68e4e8cc6ad88c7eb0b52f92cb63bd43382936f825bbefcc063345290cff1abe/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:24 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68e4e8cc6ad88c7eb0b52f92cb63bd43382936f825bbefcc063345290cff1abe/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:24 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68e4e8cc6ad88c7eb0b52f92cb63bd43382936f825bbefcc063345290cff1abe/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:24 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68e4e8cc6ad88c7eb0b52f92cb63bd43382936f825bbefcc063345290cff1abe/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:24 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68e4e8cc6ad88c7eb0b52f92cb63bd43382936f825bbefcc063345290cff1abe/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:24 np0005593234 podman[79385]: 2026-01-23 09:02:24.305519072 +0000 UTC m=+0.110643125 container init cb7951f9491be62554d18a21aeff599099f8503d31ba220ce90083578d305d28 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate-test, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:02:24 np0005593234 podman[79385]: 2026-01-23 09:02:24.219020577 +0000 UTC m=+0.024144640 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:24 np0005593234 podman[79385]: 2026-01-23 09:02:24.316605496 +0000 UTC m=+0.121729529 container start cb7951f9491be62554d18a21aeff599099f8503d31ba220ce90083578d305d28 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate-test, org.label-schema.build-date=20250507, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:02:24 np0005593234 podman[79385]: 2026-01-23 09:02:24.320018392 +0000 UTC m=+0.125142425 container attach cb7951f9491be62554d18a21aeff599099f8503d31ba220ce90083578d305d28 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate-test, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, ceph=True)
Jan 23 04:02:24 np0005593234 ceph-mon[77084]: Deploying daemon osd.2 on compute-2
Jan 23 04:02:24 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/3443060591' entity='client.admin' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.data", "erasure_code_profile": "replicated_rule", "autoscale_mode": "on"}]': finished
Jan 23 04:02:24 np0005593234 ceph-mon[77084]: Health check update: 5 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:02:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e25 e25: 3 total, 2 up, 3 in
Jan 23 04:02:24 np0005593234 ceph-mgr[77448]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:02:24 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'telegraf'
Jan 23 04:02:24 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:24.512+0000 7fd1ba9ce140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 04:02:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e25 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:24 np0005593234 ceph-mgr[77448]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:02:24 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'telemetry'
Jan 23 04:02:24 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:24.779+0000 7fd1ba9ce140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 04:02:24 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate-test[79401]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Jan 23 04:02:25 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate-test[79401]:                            [--no-systemd] [--no-tmpfs]
Jan 23 04:02:25 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate-test[79401]: ceph-volume activate: error: unrecognized arguments: --bad-option
Jan 23 04:02:25 np0005593234 systemd[1]: libpod-cb7951f9491be62554d18a21aeff599099f8503d31ba220ce90083578d305d28.scope: Deactivated successfully.
Jan 23 04:02:25 np0005593234 podman[79385]: 2026-01-23 09:02:25.025352583 +0000 UTC m=+0.830476626 container died cb7951f9491be62554d18a21aeff599099f8503d31ba220ce90083578d305d28 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate-test, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS)
Jan 23 04:02:25 np0005593234 systemd[1]: var-lib-containers-storage-overlay-68e4e8cc6ad88c7eb0b52f92cb63bd43382936f825bbefcc063345290cff1abe-merged.mount: Deactivated successfully.
Jan 23 04:02:25 np0005593234 podman[79385]: 2026-01-23 09:02:25.092796016 +0000 UTC m=+0.897920049 container remove cb7951f9491be62554d18a21aeff599099f8503d31ba220ce90083578d305d28 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate-test, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 23 04:02:25 np0005593234 systemd[1]: libpod-conmon-cb7951f9491be62554d18a21aeff599099f8503d31ba220ce90083578d305d28.scope: Deactivated successfully.
Jan 23 04:02:25 np0005593234 systemd[1]: Reloading.
Jan 23 04:02:25 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:02:25 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:02:25 np0005593234 ceph-mgr[77448]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:02:25 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 04:02:25 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:25.474+0000 7fd1ba9ce140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 04:02:25 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/927391621' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]: dispatch
Jan 23 04:02:25 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/927391621' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "vms", "app": "rbd"}]': finished
Jan 23 04:02:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e26 e26: 3 total, 2 up, 3 in
Jan 23 04:02:25 np0005593234 systemd[1]: Reloading.
Jan 23 04:02:25 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:02:25 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:02:25 np0005593234 systemd[1]: Starting Ceph osd.2 for e1533653-0a5a-584c-b34b-8689f0d32e77...
Jan 23 04:02:26 np0005593234 podman[79562]: 2026-01-23 09:02:26.104175956 +0000 UTC m=+0.039940831 container create 4e392d338fbc79006107be156f5cd2915d0060ca242fcc12d1be88e376440a69 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:02:26 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:26 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a5df293291a27a6cab57e888844c8bc579bf08066243cf445e852985da09ee0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:26 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a5df293291a27a6cab57e888844c8bc579bf08066243cf445e852985da09ee0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:26 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a5df293291a27a6cab57e888844c8bc579bf08066243cf445e852985da09ee0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:26 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a5df293291a27a6cab57e888844c8bc579bf08066243cf445e852985da09ee0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:26 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3a5df293291a27a6cab57e888844c8bc579bf08066243cf445e852985da09ee0/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:26 np0005593234 podman[79562]: 2026-01-23 09:02:26.088295044 +0000 UTC m=+0.024059939 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:26 np0005593234 podman[79562]: 2026-01-23 09:02:26.189323709 +0000 UTC m=+0.125088604 container init 4e392d338fbc79006107be156f5cd2915d0060ca242fcc12d1be88e376440a69 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 04:02:26 np0005593234 podman[79562]: 2026-01-23 09:02:26.194405617 +0000 UTC m=+0.130170492 container start 4e392d338fbc79006107be156f5cd2915d0060ca242fcc12d1be88e376440a69 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:02:26 np0005593234 podman[79562]: 2026-01-23 09:02:26.197601176 +0000 UTC m=+0.133366051 container attach 4e392d338fbc79006107be156f5cd2915d0060ca242fcc12d1be88e376440a69 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:02:26 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:26.216+0000 7fd1ba9ce140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:02:26 np0005593234 ceph-mgr[77448]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 04:02:26 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'volumes'
Jan 23 04:02:26 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/174650588' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]: dispatch
Jan 23 04:02:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e27 e27: 3 total, 2 up, 3 in
Jan 23 04:02:26 np0005593234 ceph-mgr[77448]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:02:26 np0005593234 ceph-mgr[77448]: mgr[py] Loading python module 'zabbix'
Jan 23 04:02:26 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:26.958+0000 7fd1ba9ce140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 04:02:27 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate[79577]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 04:02:27 np0005593234 bash[79562]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 04:02:27 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate[79577]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Jan 23 04:02:27 np0005593234 bash[79562]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Jan 23 04:02:27 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate[79577]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Jan 23 04:02:27 np0005593234 bash[79562]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Jan 23 04:02:27 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate[79577]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:02:27 np0005593234 bash[79562]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 04:02:27 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate[79577]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 23 04:02:27 np0005593234 bash[79562]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Jan 23 04:02:27 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate[79577]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 04:02:27 np0005593234 bash[79562]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Jan 23 04:02:27 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate[79577]: --> ceph-volume raw activate successful for osd ID: 2
Jan 23 04:02:27 np0005593234 bash[79562]: --> ceph-volume raw activate successful for osd ID: 2
Jan 23 04:02:27 np0005593234 systemd[1]: libpod-4e392d338fbc79006107be156f5cd2915d0060ca242fcc12d1be88e376440a69.scope: Deactivated successfully.
Jan 23 04:02:27 np0005593234 conmon[79577]: conmon 4e392d338fbc79006107 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-4e392d338fbc79006107be156f5cd2915d0060ca242fcc12d1be88e376440a69.scope/container/memory.events
Jan 23 04:02:27 np0005593234 podman[79562]: 2026-01-23 09:02:27.114155612 +0000 UTC m=+1.049920487 container died 4e392d338fbc79006107be156f5cd2915d0060ca242fcc12d1be88e376440a69 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507)
Jan 23 04:02:27 np0005593234 systemd[1]: var-lib-containers-storage-overlay-3a5df293291a27a6cab57e888844c8bc579bf08066243cf445e852985da09ee0-merged.mount: Deactivated successfully.
Jan 23 04:02:27 np0005593234 podman[79562]: 2026-01-23 09:02:27.171708679 +0000 UTC m=+1.107473564 container remove 4e392d338fbc79006107be156f5cd2915d0060ca242fcc12d1be88e376440a69 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2-activate, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:02:27 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mgr-compute-2-nrjyzu[77444]: 2026-01-23T09:02:27.220+0000 7fd1ba9ce140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:02:27 np0005593234 ceph-mgr[77448]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 04:02:27 np0005593234 ceph-mgr[77448]: ms_deliver_dispatch: unhandled message 0x55c5deb651e0 mon_map magic: 0 v1 from mon.0 v2:192.168.122.100:3300/0
Jan 23 04:02:27 np0005593234 ceph-mgr[77448]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 04:02:27 np0005593234 podman[79750]: 2026-01-23 09:02:27.36121161 +0000 UTC m=+0.040083135 container create 6b9c1993d24c5cca217efd64715a8e0e3abb1c88057a7bde487f74af054385b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 23 04:02:27 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/274bcecb634b87422726b7df025b2b0f63057dde6107405e814767fdd20c1c7e/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:27 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/274bcecb634b87422726b7df025b2b0f63057dde6107405e814767fdd20c1c7e/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:27 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/274bcecb634b87422726b7df025b2b0f63057dde6107405e814767fdd20c1c7e/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:27 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/274bcecb634b87422726b7df025b2b0f63057dde6107405e814767fdd20c1c7e/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:27 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/274bcecb634b87422726b7df025b2b0f63057dde6107405e814767fdd20c1c7e/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:27 np0005593234 podman[79750]: 2026-01-23 09:02:27.421826461 +0000 UTC m=+0.100698006 container init 6b9c1993d24c5cca217efd64715a8e0e3abb1c88057a7bde487f74af054385b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:02:27 np0005593234 podman[79750]: 2026-01-23 09:02:27.428840759 +0000 UTC m=+0.107712284 container start 6b9c1993d24c5cca217efd64715a8e0e3abb1c88057a7bde487f74af054385b0 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2)
Jan 23 04:02:27 np0005593234 bash[79750]: 6b9c1993d24c5cca217efd64715a8e0e3abb1c88057a7bde487f74af054385b0
Jan 23 04:02:27 np0005593234 podman[79750]: 2026-01-23 09:02:27.344361397 +0000 UTC m=+0.023232942 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:27 np0005593234 systemd[1]: Started Ceph osd.2 for e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-osd, pid 2
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: pidfile_write: ignore empty --pid-file
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: bdev(0x55bc37bd7c00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: bdev(0x55bc37bd7c00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: bdev(0x55bc37bd7c00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: bdev(0x55bc37bd7c00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: bdev(0x55bc389e3000 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: bdev(0x55bc389e3000 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: bdev(0x55bc389e3000 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: bdev(0x55bc389e3000 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: bdev(0x55bc389e3000 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: bdev(0x55bc37bd7c00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:02:27 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/174650588' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "volumes", "app": "rbd"}]': finished
Jan 23 04:02:27 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:27 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:27 np0005593234 ceph-osd[79769]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: load: jerasure load: lrc 
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7ac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7ac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7ac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7ac00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7ac00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:02:28 np0005593234 podman[79930]: 2026-01-23 09:02:28.092938081 +0000 UTC m=+0.038467546 container create c7c9e3afddb4b50ee99fd1d1e1ce484dc8f295e6f82ce57f9e122da0c08da4f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_black, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 23 04:02:28 np0005593234 systemd[1]: Started libpod-conmon-c7c9e3afddb4b50ee99fd1d1e1ce484dc8f295e6f82ce57f9e122da0c08da4f5.scope.
Jan 23 04:02:28 np0005593234 podman[79930]: 2026-01-23 09:02:28.075614923 +0000 UTC m=+0.021144418 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:28 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:28 np0005593234 podman[79930]: 2026-01-23 09:02:28.19084548 +0000 UTC m=+0.136374965 container init c7c9e3afddb4b50ee99fd1d1e1ce484dc8f295e6f82ce57f9e122da0c08da4f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_black, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:02:28 np0005593234 podman[79930]: 2026-01-23 09:02:28.200165308 +0000 UTC m=+0.145694773 container start c7c9e3afddb4b50ee99fd1d1e1ce484dc8f295e6f82ce57f9e122da0c08da4f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_black, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 23 04:02:28 np0005593234 podman[79930]: 2026-01-23 09:02:28.204008798 +0000 UTC m=+0.149538383 container attach c7c9e3afddb4b50ee99fd1d1e1ce484dc8f295e6f82ce57f9e122da0c08da4f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_black, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507)
Jan 23 04:02:28 np0005593234 xenodochial_black[79946]: 167 167
Jan 23 04:02:28 np0005593234 systemd[1]: libpod-c7c9e3afddb4b50ee99fd1d1e1ce484dc8f295e6f82ce57f9e122da0c08da4f5.scope: Deactivated successfully.
Jan 23 04:02:28 np0005593234 conmon[79946]: conmon c7c9e3afddb4b50ee99f <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c7c9e3afddb4b50ee99fd1d1e1ce484dc8f295e6f82ce57f9e122da0c08da4f5.scope/container/memory.events
Jan 23 04:02:28 np0005593234 podman[79930]: 2026-01-23 09:02:28.211399337 +0000 UTC m=+0.156928802 container died c7c9e3afddb4b50ee99fd1d1e1ce484dc8f295e6f82ce57f9e122da0c08da4f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_black, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Jan 23 04:02:28 np0005593234 systemd[1]: var-lib-containers-storage-overlay-c280fa33ed698027d4d10a0de6c2735df2dbdaf3f5ecb53bbd4b5bfdf92f8814-merged.mount: Deactivated successfully.
Jan 23 04:02:28 np0005593234 podman[79930]: 2026-01-23 09:02:28.251866363 +0000 UTC m=+0.197395838 container remove c7c9e3afddb4b50ee99fd1d1e1ce484dc8f295e6f82ce57f9e122da0c08da4f5 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=xenodochial_black, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:02:28 np0005593234 systemd[1]: libpod-conmon-c7c9e3afddb4b50ee99fd1d1e1ce484dc8f295e6f82ce57f9e122da0c08da4f5.scope: Deactivated successfully.
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7ac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7ac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7ac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7ac00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7ac00 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:02:28 np0005593234 podman[79974]: 2026-01-23 09:02:28.411970723 +0000 UTC m=+0.046146524 container create 8269aba8a66fdd51cb07ae10aa1401409931ee908ed29e52f6745ff3239eab68 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shockley, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.vendor=CentOS)
Jan 23 04:02:28 np0005593234 systemd[1]: Started libpod-conmon-8269aba8a66fdd51cb07ae10aa1401409931ee908ed29e52f6745ff3239eab68.scope.
Jan 23 04:02:28 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:28 np0005593234 podman[79974]: 2026-01-23 09:02:28.392439456 +0000 UTC m=+0.026615257 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:28 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a9db48de106c528e3d4f11b3bf418a0ad7b0df1db77f0e4e7660060786d043a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:28 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a9db48de106c528e3d4f11b3bf418a0ad7b0df1db77f0e4e7660060786d043a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:28 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a9db48de106c528e3d4f11b3bf418a0ad7b0df1db77f0e4e7660060786d043a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:28 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a9db48de106c528e3d4f11b3bf418a0ad7b0df1db77f0e4e7660060786d043a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:28 np0005593234 podman[79974]: 2026-01-23 09:02:28.508947572 +0000 UTC m=+0.143123383 container init 8269aba8a66fdd51cb07ae10aa1401409931ee908ed29e52f6745ff3239eab68 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shockley, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:02:28 np0005593234 podman[79974]: 2026-01-23 09:02:28.515816156 +0000 UTC m=+0.149991967 container start 8269aba8a66fdd51cb07ae10aa1401409931ee908ed29e52f6745ff3239eab68 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shockley, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:02:28 np0005593234 podman[79974]: 2026-01-23 09:02:28.519264102 +0000 UTC m=+0.153439903 container attach 8269aba8a66fdd51cb07ae10aa1401409931ee908ed29e52f6745ff3239eab68 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shockley, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
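The two mClock figures on the line above are consistent with Ceph's Reef-era HDD defaults, `osd_mclock_max_sequential_bandwidth_hdd` (150 MiB/s) and `osd_mclock_max_capacity_iops_hdd` (315 IOPS); a back-of-the-envelope check, assuming the cost per IO is simply bandwidth divided by IOPS capacity:

```python
# Hedged sanity check of the logged mClock values; the two defaults below are
# assumptions taken from Ceph's HDD profile, not read from this host's config.
bandwidth_bytes_per_sec = 150 * 1024 * 1024  # osd_mclock_max_sequential_bandwidth_hdd -> 157286400
iops_capacity = 315                          # osd_mclock_max_capacity_iops_hdd

cost_per_io = bandwidth_bytes_per_sec / iops_capacity
print(f"osd_bandwidth_cost_per_io: {cost_per_io:.2f} bytes/io")
# -> osd_bandwidth_cost_per_io: 499321.90 bytes/io, matching the log line
```

The per-shard capacity logged (157286400.00 bytes/second) equals the full sequential bandwidth, i.e. it has not been divided across multiple OSD op shards here.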
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7ac00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7ac00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7ac00 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7ac00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
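The `_set_cache_sizes` ratios above (meta 0.45, kv 0.45, kv_onode 0.04, data 0.06) sum to 1.0 and partition the 1 GiB cache; a small sketch of the resulting byte budgets:

```python
# Split the 1 GiB BlueStore cache by the ratios logged in _set_cache_sizes.
cache_size = 1073741824  # 1 GiB, from the log line
ratios = {"meta": 0.45, "kv": 0.45, "kv_onode": 0.04, "data": 0.06}

assert abs(sum(ratios.values()) - 1.0) < 1e-9  # ratios cover the whole cache
for name, ratio in ratios.items():
    print(f"{name}: {int(ratio * cache_size)} bytes")
# meta works out to 483183820 bytes
```

Note that 483183820 is also the `block_cache` capacity reported later in the RocksDB options dump, which suggests (an inference from the numbers, not from the log itself) that the kv slice of this cache backs RocksDB's block cache.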
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7b400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7b400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7b400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7b400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluefs mount
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluefs mount shared_bdev_used = 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
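The db size in `_prepare_db_environment` above appears to be 95% of the 7.0 GiB block device opened earlier (7511998464 bytes); this is an assumption based purely on the logged numbers, but the arithmetic lines up exactly:

```python
# Hypothetical check: db_paths size vs. 95% of the shared block device.
block_device_bytes = 7511998464        # from the earlier bdev open line
db_share = int(block_device_bytes * 0.95)
print(db_share)  # 7136398540, matching "db,7136398540 db.slow,7136398540"
```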
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: RocksDB version: 7.9.2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Git sha 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: DB SUMMARY
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: DB Session ID:  C62R6TK4UIVW8BG31XOD
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: CURRENT file:  CURRENT
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                         Options.error_if_exists: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.create_if_missing: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                                     Options.env: 0x55bc38a7ddc0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                                Options.info_log: 0x55bc37c54c00
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                              Options.statistics: (nil)
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.use_fsync: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                              Options.db_log_dir: 
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                                 Options.wal_dir: db.wal
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.write_buffer_manager: 0x55bc38b6a460
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.unordered_write: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.row_cache: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                              Options.wal_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.two_write_queues: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.wal_compression: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.atomic_flush: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.max_background_jobs: 4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.max_background_compactions: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.max_subcompactions: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.max_open_files: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Compression algorithms supported:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kZSTD supported: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kXpressCompression supported: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kBZip2Compression supported: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kLZ4Compression supported: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kZlibCompression supported: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kSnappyCompression supported: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c54660)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bc37c4add0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c54660)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bc37c4add0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c54660)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bc37c4add0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c54660)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bc37c4add0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c54660)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bc37c4add0
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c54660)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bc37c4add0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c54660)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bc37c4add0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 483183820#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c54620)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bc37c4a430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c54620)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bc37c4a430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c54620)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bc37c4a430#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:635] 	(skipping printing options)
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 6a182f71-4175-4824-ba7f-e6cca47d1e25
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158948583770, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158948583989, "job": 1, "event": "recovery_finished"}
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: freelist init
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: freelist _read_cfg
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluefs umount
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7b400 /var/lib/ceph/osd/ceph-2/block) close
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7b400 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7b400 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7b400 /var/lib/ceph/osd/ceph-2/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bdev(0x55bc38a7b400 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluefs mount
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluefs mount shared_bdev_used = 4718592
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: RocksDB version: 7.9.2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Git sha 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Compile date 2025-05-06 23:30:25
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: DB SUMMARY
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: DB Session ID:  C62R6TK4UIVW8BG31XOC
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: CURRENT file:  CURRENT
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                         Options.error_if_exists: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.create_if_missing: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                                     Options.env: 0x55bc37da23f0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                                Options.info_log: 0x55bc37c55900
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                              Options.statistics: (nil)
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.use_fsync: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                              Options.db_log_dir: 
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                                 Options.wal_dir: db.wal
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.write_buffer_manager: 0x55bc38b6a460
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.unordered_write: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.row_cache: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                              Options.wal_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.two_write_queues: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.wal_compression: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.atomic_flush: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.max_background_jobs: 4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.max_background_compactions: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.max_subcompactions: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.max_open_files: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Compression algorithms supported:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kZSTD supported: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kXpressCompression supported: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kBZip2Compression supported: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kZSTDNotFinalCompression supported: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kLZ4Compression supported: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kZlibCompression supported: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kLZ4HCCompression supported: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: 	kSnappyCompression supported: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c55f60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bc37c4b350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c55f60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bc37c4b350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
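(Editor's note, not part of the journal: the per-column-family `Options.*` dumps above are easier to compare once reduced to dictionaries. A minimal sketch of such a parser is below; `parse_rocksdb_options` is a hypothetical helper name, and the line patterns are assumed from the log format shown in this capture, not from any RocksDB or Ceph API.)

```python
import re

def parse_rocksdb_options(lines):
    """Collect 'Options.<key>: <value>' pairs from rocksdb log lines
    (as forwarded by ceph-osd through syslog) into one dict per
    column family, keyed by the family name from the header line."""
    families = {}
    current = "default"
    cf_re = re.compile(r"Options for column family \[([^\]]+)\]")
    opt_re = re.compile(r"Options\.([A-Za-z0-9_.\[\]]+):\s*(.+)$")
    for line in lines:
        m = cf_re.search(line)
        if m:
            # Header like "--------------- Options for column family [m-0]:"
            current = m.group(1)
            families.setdefault(current, {})
            continue
        m = opt_re.search(line)
        if m:
            families.setdefault(current, {})[m.group(1)] = m.group(2).strip()
    return families
```

With the journal lines above, this would yield e.g. `families["m-0"]["write_buffer_size"] == "16777216"`, making it straightforward to diff the m-0, m-1, and m-2 settings.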
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c55f60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bc37c4b350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c55f60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bc37c4b350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c55f60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bc37c4b350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c55f60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bc37c4b350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c55f60)
  cache_index_and_filter_blocks: 1
  cache_index_and_filter_blocks_with_high_priority: 0
  pin_l0_filter_and_index_blocks_in_cache: 0
  pin_top_level_index_and_filter: 1
  index_type: 0
  data_block_index_type: 0
  index_shortening: 1
  data_block_hash_table_util_ratio: 0.750000
  checksum: 4
  no_block_cache: 0
  block_cache: 0x55bc37c4b350
  block_cache_name: BinnedLRUCache
  block_cache_options:
    capacity : 483183820
    num_shard_bits : 4
    strict_capacity_limit : 0
    high_pri_pool_ratio: 0.000
  block_cache_compressed: (nil)
  persistent_cache: (nil)
  block_size: 4096
  block_size_deviation: 10
  block_restart_interval: 16
  index_block_restart_interval: 1
  metadata_block_size: 4096
  partition_filters: 0
  use_delta_encoding: 1
  filter_policy: bloomfilter
  whole_key_filtering: 1
  verify_compression: 0
  read_amp_bytes_per_bit: 0
  format_version: 5
  enable_index_compression: 1
  block_align: 0
  max_auto_readahead_size: 262144
  prepopulate_block_cache: 0
  initial_auto_readahead_size: 8192
  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c543a0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bc37c4b4b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c543a0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bc37c4b4b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:           Options.merge_operator: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55bc37c543a0)#012  cache_index_and_filter_blocks: 1#012  cache_index_and_filter_blocks_with_high_priority: 0#012  pin_l0_filter_and_index_blocks_in_cache: 0#012  pin_top_level_index_and_filter: 1#012  index_type: 0#012  data_block_index_type: 0#012  index_shortening: 1#012  data_block_hash_table_util_ratio: 0.750000#012  checksum: 4#012  no_block_cache: 0#012  block_cache: 0x55bc37c4b4b0#012  block_cache_name: BinnedLRUCache#012  block_cache_options:#012    capacity : 536870912#012    num_shard_bits : 4#012    strict_capacity_limit : 0#012    high_pri_pool_ratio: 0.000#012  block_cache_compressed: (nil)#012  persistent_cache: (nil)#012  block_size: 4096#012  block_size_deviation: 10#012  block_restart_interval: 16#012  index_block_restart_interval: 1#012  metadata_block_size: 4096#012  partition_filters: 0#012  use_delta_encoding: 1#012  filter_policy: bloomfilter#012  whole_key_filtering: 1#012  verify_compression: 0#012  read_amp_bytes_per_bit: 0#012  format_version: 5#012  enable_index_compression: 1#012  block_align: 0#012  max_auto_readahead_size: 262144#012  prepopulate_block_cache: 0#012  initial_auto_readahead_size: 8192#012  num_file_reads_for_auto_readahead: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.compression: LZ4
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.num_levels: 7
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.bloom_locality: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                               Options.ttl: 2592000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                       Options.enable_blob_files: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                           Options.min_blob_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/column_family.cc:635] #011(skipping printing options)
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 6a182f71-4175-4824-ba7f-e6cca47d1e25
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158948849032, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 23 04:02:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 23 04:02:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e28 e28: 3 total, 2 up, 3 in
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158949160307, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158948, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "6a182f71-4175-4824-ba7f-e6cca47d1e25", "db_session_id": "C62R6TK4UIVW8BG31XOC", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158949290610, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1593, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 467, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158949, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "6a182f71-4175-4824-ba7f-e6cca47d1e25", "db_session_id": "C62R6TK4UIVW8BG31XOC", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158949302234, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158949, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "6a182f71-4175-4824-ba7f-e6cca47d1e25", "db_session_id": "C62R6TK4UIVW8BG31XOC", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:02:29 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/4038784885' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]: dispatch
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769158949305835, "job": 1, "event": "recovery_finished"}
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55bc37d09c00
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: rocksdb: DB pointer 0x55bc38b53a00
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 0.5 total, 0.5 interval#012Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s#012Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s#012Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.31              0.00         1    0.311       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.31              0.00         1    0.311       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.31              0.00         1    0.311       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.31              0.00         1    0.311       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.5 total, 0.5 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.3 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.3 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bc37c4b350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.5 total, 0.5 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bc37c4b350#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 5.5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 0.5 total, 0.5 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/hello/cls_hello.cc:316: loading cls_hello
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: _get_class not permitted to load lua
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: _get_class not permitted to load sdk
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: _get_class not permitted to load test_remote_reads
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Jan 23 04:02:29 np0005593234 sleepy_shockley[79990]: {
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Jan 23 04:02:29 np0005593234 sleepy_shockley[79990]:    "1694e9fb-559e-40c4-a465-98d21c9c2b03": {
Jan 23 04:02:29 np0005593234 sleepy_shockley[79990]:        "ceph_fsid": "e1533653-0a5a-584c-b34b-8689f0d32e77",
Jan 23 04:02:29 np0005593234 sleepy_shockley[79990]:        "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Jan 23 04:02:29 np0005593234 sleepy_shockley[79990]:        "osd_id": 2,
Jan 23 04:02:29 np0005593234 sleepy_shockley[79990]:        "osd_uuid": "1694e9fb-559e-40c4-a465-98d21c9c2b03",
Jan 23 04:02:29 np0005593234 sleepy_shockley[79990]:        "type": "bluestore"
Jan 23 04:02:29 np0005593234 sleepy_shockley[79990]:    }
Jan 23 04:02:29 np0005593234 sleepy_shockley[79990]: }
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: osd.2 0 load_pgs
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: osd.2 0 load_pgs opened 0 pgs
Jan 23 04:02:29 np0005593234 ceph-osd[79769]: osd.2 0 log_to_monitors true
Jan 23 04:02:29 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2[79765]: 2026-01-23T09:02:29.343+0000 7f4ba8547740 -1 osd.2 0 log_to_monitors true
Jan 23 04:02:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]} v 0) v1
Jan 23 04:02:29 np0005593234 ceph-mon[77084]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/2199131998,v1:192.168.122.102:6801/2199131998]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 04:02:29 np0005593234 systemd[1]: libpod-8269aba8a66fdd51cb07ae10aa1401409931ee908ed29e52f6745ff3239eab68.scope: Deactivated successfully.
Jan 23 04:02:29 np0005593234 podman[79974]: 2026-01-23 09:02:29.384435304 +0000 UTC m=+1.018611105 container died 8269aba8a66fdd51cb07ae10aa1401409931ee908ed29e52f6745ff3239eab68 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shockley, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:02:29 np0005593234 systemd[1]: var-lib-containers-storage-overlay-6a9db48de106c528e3d4f11b3bf418a0ad7b0df1db77f0e4e7660060786d043a-merged.mount: Deactivated successfully.
Jan 23 04:02:29 np0005593234 podman[79974]: 2026-01-23 09:02:29.441684951 +0000 UTC m=+1.075860752 container remove 8269aba8a66fdd51cb07ae10aa1401409931ee908ed29e52f6745ff3239eab68 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sleepy_shockley, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:02:29 np0005593234 systemd[1]: libpod-conmon-8269aba8a66fdd51cb07ae10aa1401409931ee908ed29e52f6745ff3239eab68.scope: Deactivated successfully.
Jan 23 04:02:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e28 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:30 np0005593234 ceph-mon[77084]: Health check update: 4 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:02:30 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/4038784885' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Jan 23 04:02:30 np0005593234 ceph-mon[77084]: from='osd.2 [v2:192.168.122.102:6800/2199131998,v1:192.168.122.102:6801/2199131998]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 04:02:30 np0005593234 ceph-mon[77084]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 04:02:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:30 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/1670773661' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Jan 23 04:02:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e29 e29: 3 total, 2 up, 3 in
Jan 23 04:02:30 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Jan 23 04:02:30 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Jan 23 04:02:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]} v 0) v1
Jan 23 04:02:30 np0005593234 ceph-mon[77084]: log_channel(audit) log [INF] : from='osd.2 [v2:192.168.122.102:6800/2199131998,v1:192.168.122.102:6801/2199131998]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 04:02:30 np0005593234 podman[80658]: 2026-01-23 09:02:30.677354392 +0000 UTC m=+0.054279595 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:02:30 np0005593234 podman[80658]: 2026-01-23 09:02:30.98791575 +0000 UTC m=+0.364840933 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 23 04:02:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e30 e30: 3 total, 2 up, 3 in
Jan 23 04:02:31 np0005593234 ceph-mon[77084]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Jan 23 04:02:31 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/1670773661' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Jan 23 04:02:31 np0005593234 ceph-mon[77084]: from='osd.2 [v2:192.168.122.102:6800/2199131998,v1:192.168.122.102:6801/2199131998]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 04:02:31 np0005593234 ceph-mon[77084]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 04:02:31 np0005593234 ceph-osd[79769]: osd.2 0 done with init, starting boot process
Jan 23 04:02:31 np0005593234 ceph-osd[79769]: osd.2 0 start_boot
Jan 23 04:02:31 np0005593234 ceph-osd[79769]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Jan 23 04:02:31 np0005593234 ceph-osd[79769]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Jan 23 04:02:31 np0005593234 ceph-osd[79769]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Jan 23 04:02:31 np0005593234 ceph-osd[79769]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Jan 23 04:02:31 np0005593234 ceph-osd[79769]: osd.2 0  bench count 12288000 bsize 4 KiB
Jan 23 04:02:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:32 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/3627784438' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Jan 23 04:02:32 np0005593234 ceph-mon[77084]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0068, "args": ["host=compute-2", "root=default"]}]': finished
Jan 23 04:02:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e31 e31: 3 total, 2 up, 3 in
Jan 23 04:02:32 np0005593234 podman[81011]: 2026-01-23 09:02:32.823251814 +0000 UTC m=+0.072437230 container create 86adcb22624a82bb2021d20167ac32a77540e160852a8508fdf63b36ba0a867c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_ritchie, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:02:32 np0005593234 podman[81011]: 2026-01-23 09:02:32.773163849 +0000 UTC m=+0.022349305 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:32 np0005593234 systemd[1]: Started libpod-conmon-86adcb22624a82bb2021d20167ac32a77540e160852a8508fdf63b36ba0a867c.scope.
Jan 23 04:02:32 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:32 np0005593234 podman[81011]: 2026-01-23 09:02:32.929836532 +0000 UTC m=+0.179021948 container init 86adcb22624a82bb2021d20167ac32a77540e160852a8508fdf63b36ba0a867c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_ritchie, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:02:32 np0005593234 podman[81011]: 2026-01-23 09:02:32.937490329 +0000 UTC m=+0.186675745 container start 86adcb22624a82bb2021d20167ac32a77540e160852a8508fdf63b36ba0a867c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_ritchie, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Jan 23 04:02:32 np0005593234 agitated_ritchie[81027]: 167 167
Jan 23 04:02:32 np0005593234 systemd[1]: libpod-86adcb22624a82bb2021d20167ac32a77540e160852a8508fdf63b36ba0a867c.scope: Deactivated successfully.
Jan 23 04:02:32 np0005593234 podman[81011]: 2026-01-23 09:02:32.958345947 +0000 UTC m=+0.207531383 container attach 86adcb22624a82bb2021d20167ac32a77540e160852a8508fdf63b36ba0a867c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_ritchie, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:02:32 np0005593234 podman[81011]: 2026-01-23 09:02:32.958829741 +0000 UTC m=+0.208015187 container died 86adcb22624a82bb2021d20167ac32a77540e160852a8508fdf63b36ba0a867c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_ritchie, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:02:33 np0005593234 systemd[1]: var-lib-containers-storage-overlay-6e75d411aaba782dce6ee8f01bac396a40b00e3bdc9dc9cc6d18624e9586d035-merged.mount: Deactivated successfully.
Jan 23 04:02:33 np0005593234 podman[81011]: 2026-01-23 09:02:33.097910368 +0000 UTC m=+0.347095784 container remove 86adcb22624a82bb2021d20167ac32a77540e160852a8508fdf63b36ba0a867c (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=agitated_ritchie, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, io.buildah.version=1.39.3)
Jan 23 04:02:33 np0005593234 systemd[1]: libpod-conmon-86adcb22624a82bb2021d20167ac32a77540e160852a8508fdf63b36ba0a867c.scope: Deactivated successfully.
Jan 23 04:02:33 np0005593234 podman[81053]: 2026-01-23 09:02:33.237089787 +0000 UTC m=+0.036306397 container create 5f8a9c9dabe9ff1636e2b0b912b0ff240c2226b1657d3cb30821f41e9da42425 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_varahamihira, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:02:33 np0005593234 systemd[1]: Started libpod-conmon-5f8a9c9dabe9ff1636e2b0b912b0ff240c2226b1657d3cb30821f41e9da42425.scope.
Jan 23 04:02:33 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:33 np0005593234 podman[81053]: 2026-01-23 09:02:33.221257366 +0000 UTC m=+0.020473986 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:33 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b989abcd53ab60518ada7612d15e23c94c0a02ac6f65af58bdd4ddeb8668285/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:33 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b989abcd53ab60518ada7612d15e23c94c0a02ac6f65af58bdd4ddeb8668285/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:33 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b989abcd53ab60518ada7612d15e23c94c0a02ac6f65af58bdd4ddeb8668285/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:33 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b989abcd53ab60518ada7612d15e23c94c0a02ac6f65af58bdd4ddeb8668285/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:02:33 np0005593234 podman[81053]: 2026-01-23 09:02:33.338846666 +0000 UTC m=+0.138063276 container init 5f8a9c9dabe9ff1636e2b0b912b0ff240c2226b1657d3cb30821f41e9da42425 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_varahamihira, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:02:33 np0005593234 podman[81053]: 2026-01-23 09:02:33.345175772 +0000 UTC m=+0.144392422 container start 5f8a9c9dabe9ff1636e2b0b912b0ff240c2226b1657d3cb30821f41e9da42425 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_varahamihira, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250507)
Jan 23 04:02:33 np0005593234 podman[81053]: 2026-01-23 09:02:33.368722893 +0000 UTC m=+0.167939513 container attach 5f8a9c9dabe9ff1636e2b0b912b0ff240c2226b1657d3cb30821f41e9da42425 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_varahamihira, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2)
Jan 23 04:02:33 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/3627784438' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]: [
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:    {
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:        "available": false,
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:        "ceph_device": false,
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:        "lsm_data": {},
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:        "lvs": [],
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:        "path": "/dev/sr0",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:        "rejected_reasons": [
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "Insufficient space (<5GB)",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "Has a FileSystem"
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:        ],
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:        "sys_api": {
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "actuators": null,
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "device_nodes": "sr0",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "devname": "sr0",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "human_readable_size": "482.00 KB",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "id_bus": "ata",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "model": "QEMU DVD-ROM",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "nr_requests": "2",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "parent": "/dev/sr0",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "partitions": {},
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "path": "/dev/sr0",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "removable": "1",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "rev": "2.5+",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "ro": "0",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "rotational": "1",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "sas_address": "",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "sas_device_handle": "",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "scheduler_mode": "mq-deadline",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "sectors": 0,
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "sectorsize": "2048",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "size": 493568.0,
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "support_discard": "2048",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "type": "disk",
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:            "vendor": "QEMU"
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:        }
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]:    }
Jan 23 04:02:34 np0005593234 nice_varahamihira[81069]: ]
Jan 23 04:02:34 np0005593234 systemd[1]: libpod-5f8a9c9dabe9ff1636e2b0b912b0ff240c2226b1657d3cb30821f41e9da42425.scope: Deactivated successfully.
Jan 23 04:02:34 np0005593234 systemd[1]: libpod-5f8a9c9dabe9ff1636e2b0b912b0ff240c2226b1657d3cb30821f41e9da42425.scope: Consumed 1.140s CPU time.
Jan 23 04:02:34 np0005593234 podman[81053]: 2026-01-23 09:02:34.477460284 +0000 UTC m=+1.276676924 container died 5f8a9c9dabe9ff1636e2b0b912b0ff240c2226b1657d3cb30821f41e9da42425 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_varahamihira, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 23 04:02:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e31 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:34 np0005593234 systemd[1]: var-lib-containers-storage-overlay-8b989abcd53ab60518ada7612d15e23c94c0a02ac6f65af58bdd4ddeb8668285-merged.mount: Deactivated successfully.
Jan 23 04:02:34 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/1955789217' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Jan 23 04:02:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e32 e32: 3 total, 2 up, 3 in
Jan 23 04:02:35 np0005593234 podman[81053]: 2026-01-23 09:02:35.053704519 +0000 UTC m=+1.852921129 container remove 5f8a9c9dabe9ff1636e2b0b912b0ff240c2226b1657d3cb30821f41e9da42425 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_varahamihira, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:02:35 np0005593234 systemd[1]: libpod-conmon-5f8a9c9dabe9ff1636e2b0b912b0ff240c2226b1657d3cb30821f41e9da42425.scope: Deactivated successfully.
Jan 23 04:02:35 np0005593234 ceph-mon[77084]: Health check update: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 04:02:35 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/1955789217' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Jan 23 04:02:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:02:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:02:36 np0005593234 ceph-osd[79769]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 20.247 iops: 5183.157 elapsed_sec: 0.579
Jan 23 04:02:36 np0005593234 ceph-osd[79769]: log_channel(cluster) log [WRN] : OSD bench result of 5183.157191 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 23 04:02:36 np0005593234 ceph-osd[79769]: osd.2 0 waiting for initial osdmap
Jan 23 04:02:36 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2[79765]: 2026-01-23T09:02:36.203+0000 7f4ba4cde640 -1 osd.2 0 waiting for initial osdmap
Jan 23 04:02:36 np0005593234 ceph-osd[79769]: osd.2 32 crush map has features 288514051259236352, adjusting msgr requires for clients
Jan 23 04:02:36 np0005593234 ceph-osd[79769]: osd.2 32 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Jan 23 04:02:36 np0005593234 ceph-osd[79769]: osd.2 32 crush map has features 3314933000852226048, adjusting msgr requires for osds
Jan 23 04:02:36 np0005593234 ceph-osd[79769]: osd.2 32 check_osdmap_features require_osd_release unknown -> reef
Jan 23 04:02:36 np0005593234 ceph-osd[79769]: osd.2 32 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 23 04:02:36 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-osd-2[79765]: 2026-01-23T09:02:36.238+0000 7f4b9faef640 -1 osd.2 32 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 23 04:02:36 np0005593234 ceph-osd[79769]: osd.2 32 set_numa_affinity not setting numa affinity
Jan 23 04:02:36 np0005593234 ceph-osd[79769]: osd.2 32 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Jan 23 04:02:36 np0005593234 ceph-mon[77084]: Adjusting osd_memory_target on compute-2 to 127.9M
Jan 23 04:02:36 np0005593234 ceph-mon[77084]: Unable to set osd_memory_target on compute-2 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Jan 23 04:02:36 np0005593234 ceph-mon[77084]: Updating compute-0:/etc/ceph/ceph.conf
Jan 23 04:02:36 np0005593234 ceph-mon[77084]: Updating compute-1:/etc/ceph/ceph.conf
Jan 23 04:02:36 np0005593234 ceph-mon[77084]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 04:02:36 np0005593234 ceph-mon[77084]: OSD bench result of 5183.157191 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 23 04:02:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e33 e33: 3 total, 3 up, 3 in
Jan 23 04:02:36 np0005593234 ceph-osd[79769]: osd.2 33 state: booting -> active
Jan 23 04:02:36 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 33 pg[5.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:36 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 33 pg[3.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=33) [2] r=0 lpr=33 pi=[15,33)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: Updating compute-0:/var/lib/ceph/e1533653-0a5a-584c-b34b-8689f0d32e77/config/ceph.conf
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: Updating compute-2:/var/lib/ceph/e1533653-0a5a-584c-b34b-8689f0d32e77/config/ceph.conf
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: Updating compute-1:/var/lib/ceph/e1533653-0a5a-584c-b34b-8689f0d32e77/config/ceph.conf
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: Cluster is now healthy
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: osd.2 [v2:192.168.122.102:6800/2199131998,v1:192.168.122.102:6801/2199131998] boot
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:02:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e34 e34: 3 total, 3 up, 3 in
Jan 23 04:02:37 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 34 pg[5.0( empty local-lis/les=33/34 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=33) [2] r=0 lpr=33 pi=[20,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:37 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 34 pg[3.0( empty local-lis/les=33/34 n=0 ec=15/15 lis/c=15/15 les/c/f=16/16/0 sis=33) [2] r=0 lpr=33 pi=[15,33)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e35 e35: 3 total, 3 up, 3 in
Jan 23 04:02:38 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:02:38 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:02:38 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:38 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/2470226038' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 23 04:02:38 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/2470226038' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Jan 23 04:02:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e35 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e36 e36: 3 total, 3 up, 3 in
Jan 23 04:02:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:02:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:02:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:02:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:02:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:02:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e37 e37: 3 total, 3 up, 3 in
Jan 23 04:02:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:40 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/2903992486' entity='client.admin' 
Jan 23 04:02:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:02:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:02:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:02:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e38 e38: 3 total, 3 up, 3 in
Jan 23 04:02:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:02:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:02:42 np0005593234 ceph-mon[77084]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 23 04:02:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:42 np0005593234 ceph-mon[77084]: Saving service ingress.rgw.default spec with placement count:2
Jan 23 04:02:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e39 e39: 3 total, 3 up, 3 in
Jan 23 04:02:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:02:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:02:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e40 e40: 3 total, 3 up, 3 in
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 40 pg[3.0( empty local-lis/les=33/34 n=0 ec=15/15 lis/c=33/33 les/c/f=34/34/0 sis=40 pruub=9.575865746s) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active pruub 24.343467712s@ mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [2], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 40 pg[5.0( empty local-lis/les=33/34 n=0 ec=20/20 lis/c=33/33 les/c/f=34/34/0 sis=40 pruub=9.566013336s) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active pruub 24.333625793s@ mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [2], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 40 pg[5.0( empty local-lis/les=33/34 n=0 ec=20/20 lis/c=33/33 les/c/f=34/34/0 sis=40 pruub=9.566013336s) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown pruub 24.333625793s@ mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 40 pg[3.0( empty local-lis/les=33/34 n=0 ec=15/15 lis/c=33/33 les/c/f=34/34/0 sis=40 pruub=9.575865746s) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown pruub 24.343467712s@ mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e40 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e2 new map
Jan 23 04:02:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e2 print_map
e2
enable_multiple, ever_enabled_multiple: 1,1
default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}
legacy client fscid: 1

Filesystem 'cephfs' (1)
fs_name	cephfs
epoch	2
flags	12 joinable allow_snaps allow_multimds_snaps
created	2026-01-23T09:02:44.728889+0000
modified	2026-01-23T09:02:44.729005+0000
tableserver	0
root	0
session_timeout	60
session_autoclose	300
max_file_size	1099511627776
max_xattr_size	65536
required_client_features	{}
last_failure	0
last_failure_osd_epoch	0
compat	compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}
max_mds	1
in
up	{}
failed
damaged
stopped
data_pools	[7]
metadata_pool	6
inline_data	disabled
balancer
bal_rank_mask	-1
standby_count_wanted	0
Jan 23 04:02:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e41 e41: 3 total, 3 up, 3 in
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.1a( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.1b( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.1c( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.19( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.17( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.1f( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.10( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.15( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.14( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.12( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.13( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.17( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.10( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.f( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.8( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.b( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.d( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.c( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.a( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.6( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.1( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.2( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.3( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.4( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.e( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.9( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.a( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.b( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.6( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.d( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.1b( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.1f( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=33/34 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.1c( empty local-lis/les=33/34 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.1b( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.1a( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.17( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.19( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.18( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.16( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.15( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.14( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.12( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.13( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.10( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.11( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.f( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.8( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.d( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.b( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.c( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.0( empty local-lis/les=40/41 n=0 ec=20/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.e( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.0( empty local-lis/les=40/41 n=0 ec=15/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.1( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.6( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.2( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.4( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.5( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.7( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.8( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.e( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.9( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.a( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.b( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.1d( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.1e( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.1f( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.3( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[3.1c( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 41 pg[5.d( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=33/33 les/c/f=34/34/0 sis=40) [2] r=0 lpr=40 pi=[33,40)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:02:46 np0005593234 ceph-mon[77084]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 04:02:46 np0005593234 ceph-mon[77084]: Reconfiguring mon.compute-0 (monmap changed)...
Jan 23 04:02:46 np0005593234 ceph-mon[77084]: Reconfiguring daemon mon.compute-0 on compute-0
Jan 23 04:02:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.yntofk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 04:02:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e42 e42: 3 total, 3 up, 3 in
Jan 23 04:02:47 np0005593234 ceph-mon[77084]: Reconfiguring mgr.compute-0.yntofk (monmap changed)...
Jan 23 04:02:47 np0005593234 ceph-mon[77084]: Reconfiguring daemon mgr.compute-0.yntofk on compute-0
Jan 23 04:02:47 np0005593234 ceph-mon[77084]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 04:02:47 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:47 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:47 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 04:02:48 np0005593234 ceph-mon[77084]: Reconfiguring crash.compute-0 (monmap changed)...
Jan 23 04:02:48 np0005593234 ceph-mon[77084]: Reconfiguring daemon crash.compute-0 on compute-0
Jan 23 04:02:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:48 np0005593234 ceph-mon[77084]: Reconfiguring osd.0 (monmap changed)...
Jan 23 04:02:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 23 04:02:48 np0005593234 ceph-mon[77084]: Reconfiguring daemon osd.0 on compute-0
Jan 23 04:02:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 04:02:49 np0005593234 ceph-mon[77084]: Reconfiguring crash.compute-1 (monmap changed)...
Jan 23 04:02:49 np0005593234 ceph-mon[77084]: Reconfiguring daemon crash.compute-1 on compute-1
Jan 23 04:02:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 23 04:02:49 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/2579638420' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 23 04:02:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e42 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:50 np0005593234 ceph-mon[77084]: Reconfiguring osd.1 (monmap changed)...
Jan 23 04:02:50 np0005593234 ceph-mon[77084]: Reconfiguring daemon osd.1 on compute-1
Jan 23 04:02:50 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/2579638420' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Jan 23 04:02:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:02:51 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.1 scrub starts
Jan 23 04:02:51 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.1 scrub ok
Jan 23 04:02:51 np0005593234 podman[83176]: 2026-01-23 09:02:51.614729147 +0000 UTC m=+0.054707269 container create c9d78138d9a8d1b11b61cd5719e5fd6fd554753911822da5c8dd621703410a91 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_shamir, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:02:51 np0005593234 systemd[1]: Started libpod-conmon-c9d78138d9a8d1b11b61cd5719e5fd6fd554753911822da5c8dd621703410a91.scope.
Jan 23 04:02:51 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:51 np0005593234 podman[83176]: 2026-01-23 09:02:51.589872076 +0000 UTC m=+0.029850278 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:51 np0005593234 podman[83176]: 2026-01-23 09:02:51.6908828 +0000 UTC m=+0.130860932 container init c9d78138d9a8d1b11b61cd5719e5fd6fd554753911822da5c8dd621703410a91 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_shamir, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, ceph=True, CEPH_REF=reef)
Jan 23 04:02:51 np0005593234 podman[83176]: 2026-01-23 09:02:51.697549427 +0000 UTC m=+0.137527529 container start c9d78138d9a8d1b11b61cd5719e5fd6fd554753911822da5c8dd621703410a91 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_shamir, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:02:51 np0005593234 podman[83176]: 2026-01-23 09:02:51.700886551 +0000 UTC m=+0.140864673 container attach c9d78138d9a8d1b11b61cd5719e5fd6fd554753911822da5c8dd621703410a91 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_shamir, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef)
Jan 23 04:02:51 np0005593234 fervent_shamir[83193]: 167 167
Jan 23 04:02:51 np0005593234 systemd[1]: libpod-c9d78138d9a8d1b11b61cd5719e5fd6fd554753911822da5c8dd621703410a91.scope: Deactivated successfully.
Jan 23 04:02:51 np0005593234 conmon[83193]: conmon c9d78138d9a8d1b11b61 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c9d78138d9a8d1b11b61cd5719e5fd6fd554753911822da5c8dd621703410a91.scope/container/memory.events
Jan 23 04:02:51 np0005593234 podman[83176]: 2026-01-23 09:02:51.707928879 +0000 UTC m=+0.147906991 container died c9d78138d9a8d1b11b61cd5719e5fd6fd554753911822da5c8dd621703410a91 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_shamir, ceph=True, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 23 04:02:51 np0005593234 systemd[1]: var-lib-containers-storage-overlay-5a60165b2f8a61abbd712ff39c4f9133a1c287f30460e1fd1d8e5e8ed9c454c3-merged.mount: Deactivated successfully.
Jan 23 04:02:51 np0005593234 podman[83176]: 2026-01-23 09:02:51.754794734 +0000 UTC m=+0.194772866 container remove c9d78138d9a8d1b11b61cd5719e5fd6fd554753911822da5c8dd621703410a91 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=fervent_shamir, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True)
Jan 23 04:02:51 np0005593234 systemd[1]: libpod-conmon-c9d78138d9a8d1b11b61cd5719e5fd6fd554753911822da5c8dd621703410a91.scope: Deactivated successfully.
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: Reconfiguring mon.compute-1 (monmap changed)...
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: Reconfiguring daemon mon.compute-1 on compute-1
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: Reconfiguring mon.compute-2 (monmap changed)...
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: Reconfiguring daemon mon.compute-2 on compute-2
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.nrjyzu", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:02:52 np0005593234 podman[83329]: 2026-01-23 09:02:52.374151106 +0000 UTC m=+0.079584640 container create 8b7f0aaea69dae19193e23d963c4acef1cd814f2b5b5df75d305498fb3afef86 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_jackson, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:02:52 np0005593234 podman[83329]: 2026-01-23 09:02:52.331783682 +0000 UTC m=+0.037217236 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:02:52 np0005593234 systemd[1]: Started libpod-conmon-8b7f0aaea69dae19193e23d963c4acef1cd814f2b5b5df75d305498fb3afef86.scope.
Jan 23 04:02:52 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:02:52 np0005593234 podman[83329]: 2026-01-23 09:02:52.483999716 +0000 UTC m=+0.189433270 container init 8b7f0aaea69dae19193e23d963c4acef1cd814f2b5b5df75d305498fb3afef86 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_jackson, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS)
Jan 23 04:02:52 np0005593234 podman[83329]: 2026-01-23 09:02:52.491327314 +0000 UTC m=+0.196760848 container start 8b7f0aaea69dae19193e23d963c4acef1cd814f2b5b5df75d305498fb3afef86 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_jackson, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 23 04:02:52 np0005593234 infallible_jackson[83346]: 167 167
Jan 23 04:02:52 np0005593234 systemd[1]: libpod-8b7f0aaea69dae19193e23d963c4acef1cd814f2b5b5df75d305498fb3afef86.scope: Deactivated successfully.
Jan 23 04:02:52 np0005593234 podman[83329]: 2026-01-23 09:02:52.494925075 +0000 UTC m=+0.200358639 container attach 8b7f0aaea69dae19193e23d963c4acef1cd814f2b5b5df75d305498fb3afef86 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_jackson, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True)
Jan 23 04:02:52 np0005593234 podman[83329]: 2026-01-23 09:02:52.49604095 +0000 UTC m=+0.201474494 container died 8b7f0aaea69dae19193e23d963c4acef1cd814f2b5b5df75d305498fb3afef86 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_jackson, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 23 04:02:52 np0005593234 systemd[1]: var-lib-containers-storage-overlay-8a281b947239cc51acddfeeb07ab8f89dcf42613fe576e62fc600ccf201c95ea-merged.mount: Deactivated successfully.
Jan 23 04:02:52 np0005593234 podman[83329]: 2026-01-23 09:02:52.545173404 +0000 UTC m=+0.250606938 container remove 8b7f0aaea69dae19193e23d963c4acef1cd814f2b5b5df75d305498fb3afef86 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=infallible_jackson, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Jan 23 04:02:52 np0005593234 systemd[1]: libpod-conmon-8b7f0aaea69dae19193e23d963c4acef1cd814f2b5b5df75d305498fb3afef86.scope: Deactivated successfully.
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Jan 23 04:02:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e43 e43: 3 total, 3 up, 3 in
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[2.1c( empty local-lis/les=0/0 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[2.1d( empty local-lis/les=0/0 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[2.a( empty local-lis/les=0/0 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[2.b( empty local-lis/les=0/0 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[2.5( empty local-lis/les=0/0 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[7.5( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43) [2] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[2.d( empty local-lis/les=0/0 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[2.c( empty local-lis/les=0/0 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[2.f( empty local-lis/les=0/0 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[7.a( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43) [2] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[7.14( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43) [2] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[2.10( empty local-lis/les=0/0 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[7.16( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43) [2] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[2.13( empty local-lis/les=0/0 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[2.12( empty local-lis/les=0/0 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[2.15( empty local-lis/les=0/0 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[7.11( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43) [2] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[2.18( empty local-lis/les=0/0 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[7.1d( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43) [2] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[2.1b( empty local-lis/les=0/0 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[7.1f( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43) [2] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.858271599s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.431232452s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.853464127s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.426479340s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.853432655s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.426479340s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.19( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.853246689s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.426326752s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.858205795s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.431232452s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.18( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.858040810s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.431201935s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.18( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857975006s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.431201935s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.853051186s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.426353455s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.19( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.853213310s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.426326752s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.17( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.853028297s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.426383972s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.853073120s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.426460266s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.17( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.852993965s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.426383972s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.853019714s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.426460266s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.853026390s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.426353455s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.16( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857659340s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.431240082s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.14( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857734680s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.431354523s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.16( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857626915s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.431240082s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.14( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857690811s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.431354523s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857556343s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.431232452s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857524872s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.431232452s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.13( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857751846s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.431606293s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.13( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857724190s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.431606293s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857637405s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.431587219s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.852953911s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.426383972s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857596397s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.431587219s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.858187675s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432258606s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.858163834s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432258606s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.12( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857392311s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.431484222s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.14( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857332230s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.431491852s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.10( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857501984s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.431709290s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.14( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857263565s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.431491852s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857470512s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.431720734s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.10( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857454300s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.431709290s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857444763s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.431720734s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.f( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857387543s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.431755066s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.852000237s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.426383972s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.12( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857229233s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.431484222s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857290268s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.431735992s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857257843s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.431735992s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.d( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857275963s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.431816101s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.d( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857249260s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.431816101s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.c( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857612610s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432254791s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857666016s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432395935s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.c( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857537270s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432254791s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.6( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857793808s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432571411s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857625008s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432395935s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.6( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857768059s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432571411s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.1( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857667923s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432575226s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857978821s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432933807s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.2( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857611656s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432575226s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.1( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857629776s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432575226s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.2( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857593536s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432575226s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857574463s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432559967s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857537270s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432559967s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.3( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857919693s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.433067322s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.3( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857896805s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.433067322s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857765198s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432933807s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.f( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.856656075s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.431755066s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.4( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857235909s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432643890s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.2( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857248306s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432662964s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.4( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857206345s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432643890s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857671738s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432918549s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.5( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857248306s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432708740s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857429504s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432918549s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.5( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857172012s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432708740s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857136726s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432720184s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.7( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857208252s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432804108s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857118607s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432720184s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.7( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857184410s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432804108s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857113838s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432807922s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857076645s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432807922s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857052803s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432834625s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.a( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857067108s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432868958s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857028008s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432834625s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.a( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857041359s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432868958s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857036591s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432926178s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.b( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857034683s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432929993s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.2( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857209206s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432662964s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.856982231s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432926178s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.b( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.856979370s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432929993s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.1c( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857079506s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.433055878s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.1c( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.857060432s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.433055878s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.856931686s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432952881s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.856904984s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432952881s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.1e( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.856849670s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.432960510s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.856921196s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.433044434s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.1e( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.856829643s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.432960510s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.856899261s) [1] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.433044434s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.1f( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.856778145s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.433032990s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[3.1f( empty local-lis/les=40/41 n=0 ec=40/15 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.856762886s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.433032990s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.856720924s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active pruub 39.433059692s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/20 lis/c=40/40 les/c/f=41/41/0 sis=43 pruub=15.856560707s) [0] r=-1 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 39.433059692s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[4.15( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[6.17( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [2] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[4.14( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[6.12( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [2] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[4.1( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[4.3( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[4.2( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[6.1( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [2] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[6.1b( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [2] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[4.19( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[4.9( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[4.6( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[4.8( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[4.1c( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[6.1e( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [2] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[6.1c( empty local-lis/les=0/0 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [2] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[4.1f( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 43 pg[4.1d( empty local-lis/les=0/0 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:02:53 np0005593234 ceph-mon[77084]: Reconfiguring mgr.compute-2.nrjyzu (monmap changed)...
Jan 23 04:02:53 np0005593234 ceph-mon[77084]: Reconfiguring daemon mgr.compute-2.nrjyzu on compute-2
Jan 23 04:02:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:02:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:02:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:02:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:02:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:02:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:02:53 np0005593234 podman[83537]: 2026-01-23 09:02:53.551142136 +0000 UTC m=+0.058958821 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2)
Jan 23 04:02:53 np0005593234 podman[83537]: 2026-01-23 09:02:53.647008392 +0000 UTC m=+0.154825067 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS)
Jan 23 04:02:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e44 e44: 3 total, 3 up, 3 in
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[6.1e( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [2] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[4.1c( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[4.1d( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[6.1c( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [2] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[2.18( empty local-lis/les=43/44 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[2.1b( empty local-lis/les=43/44 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[7.1d( empty local-lis/les=43/44 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43) [2] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[7.1f( empty local-lis/les=43/44 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43) [2] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[7.11( empty local-lis/les=43/44 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43) [2] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[2.15( empty local-lis/les=43/44 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[4.1f( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[2.12( empty local-lis/les=43/44 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[4.14( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[2.13( empty local-lis/les=43/44 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[6.12( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [2] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[4.15( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[7.14( empty local-lis/les=43/44 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43) [2] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[6.17( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [2] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[2.10( empty local-lis/les=43/44 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[4.8( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[7.16( empty local-lis/les=43/44 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43) [2] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[7.a( empty local-lis/les=43/44 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43) [2] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[2.c( empty local-lis/les=43/44 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[2.d( empty local-lis/les=43/44 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[4.9( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[4.1( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[2.f( empty local-lis/les=43/44 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[7.5( empty local-lis/les=43/44 n=0 ec=40/24 lis/c=40/40 les/c/f=42/42/0 sis=43) [2] r=0 lpr=43 pi=[40,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[4.6( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[6.1( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [2] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[4.3( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[2.5( empty local-lis/les=43/44 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[4.2( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[2.a( empty local-lis/les=43/44 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[2.b( empty local-lis/les=43/44 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[2.1d( empty local-lis/les=43/44 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[2.1c( empty local-lis/les=43/44 n=0 ec=35/13 lis/c=35/35 les/c/f=37/37/0 sis=43) [2] r=0 lpr=43 pi=[35,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[4.19( empty local-lis/les=43/44 n=0 ec=37/17 lis/c=37/37 les/c/f=38/38/0 sis=43) [2] r=0 lpr=43 pi=[37,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 44 pg[6.1b( empty local-lis/les=43/44 n=0 ec=39/22 lis/c=39/39 les/c/f=40/40/0 sis=43) [2] r=0 lpr=43 pi=[39,43)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:02:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:02:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:02:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:02:55 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/2587378952' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Jan 23 04:02:56 np0005593234 systemd[1]: session-19.scope: Deactivated successfully.
Jan 23 04:02:56 np0005593234 systemd[1]: session-19.scope: Consumed 8.114s CPU time.
Jan 23 04:02:56 np0005593234 systemd-logind[794]: Session 19 logged out. Waiting for processes to exit.
Jan 23 04:02:56 np0005593234 systemd-logind[794]: Removed session 19.
Jan 23 04:02:57 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.4 scrub starts
Jan 23 04:02:57 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.4 scrub ok
Jan 23 04:02:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:02:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e44 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.nxrebk", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:03:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.nxrebk", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:03:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:00 np0005593234 podman[83764]: 2026-01-23 09:03:00.801351099 +0000 UTC m=+0.044555734 container create 1a9aba9f1078c1d8f9b71de32bdb5e095db92fbd1fd03953203441760dc39981 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_agnesi, io.buildah.version=1.39.3, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:03:00 np0005593234 systemd[1]: Started libpod-conmon-1a9aba9f1078c1d8f9b71de32bdb5e095db92fbd1fd03953203441760dc39981.scope.
Jan 23 04:03:00 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:03:00 np0005593234 podman[83764]: 2026-01-23 09:03:00.878422441 +0000 UTC m=+0.121627096 container init 1a9aba9f1078c1d8f9b71de32bdb5e095db92fbd1fd03953203441760dc39981 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_agnesi, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3)
Jan 23 04:03:00 np0005593234 podman[83764]: 2026-01-23 09:03:00.785946641 +0000 UTC m=+0.029151326 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:03:00 np0005593234 podman[83764]: 2026-01-23 09:03:00.886297315 +0000 UTC m=+0.129501950 container start 1a9aba9f1078c1d8f9b71de32bdb5e095db92fbd1fd03953203441760dc39981 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_agnesi, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507)
Jan 23 04:03:00 np0005593234 podman[83764]: 2026-01-23 09:03:00.892699834 +0000 UTC m=+0.135904489 container attach 1a9aba9f1078c1d8f9b71de32bdb5e095db92fbd1fd03953203441760dc39981 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_agnesi, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:03:00 np0005593234 affectionate_agnesi[83780]: 167 167
Jan 23 04:03:00 np0005593234 systemd[1]: libpod-1a9aba9f1078c1d8f9b71de32bdb5e095db92fbd1fd03953203441760dc39981.scope: Deactivated successfully.
Jan 23 04:03:00 np0005593234 podman[83764]: 2026-01-23 09:03:00.89544653 +0000 UTC m=+0.138651165 container died 1a9aba9f1078c1d8f9b71de32bdb5e095db92fbd1fd03953203441760dc39981 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_agnesi, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.build-date=20250507, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:03:00 np0005593234 systemd[1]: var-lib-containers-storage-overlay-445ed5fd17fd4147844ff1581167ab94533dcc57806a9d1da2ae50c9503d70c4-merged.mount: Deactivated successfully.
Jan 23 04:03:00 np0005593234 podman[83764]: 2026-01-23 09:03:00.930768686 +0000 UTC m=+0.173973321 container remove 1a9aba9f1078c1d8f9b71de32bdb5e095db92fbd1fd03953203441760dc39981 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=affectionate_agnesi, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:03:00 np0005593234 systemd[1]: libpod-conmon-1a9aba9f1078c1d8f9b71de32bdb5e095db92fbd1fd03953203441760dc39981.scope: Deactivated successfully.
Jan 23 04:03:00 np0005593234 systemd[1]: Reloading.
Jan 23 04:03:01 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:03:01 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:03:01 np0005593234 systemd[1]: Reloading.
Jan 23 04:03:01 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:03:01 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:03:01 np0005593234 systemd[1]: Starting Ceph rgw.rgw.compute-2.nxrebk for e1533653-0a5a-584c-b34b-8689f0d32e77...
Jan 23 04:03:01 np0005593234 ceph-mon[77084]: Deploying daemon rgw.rgw.compute-2.nxrebk on compute-2
Jan 23 04:03:01 np0005593234 podman[83926]: 2026-01-23 09:03:01.784603396 +0000 UTC m=+0.098953963 container create 3688fe693d5d11d25baadc81f0e6cc057fa3cb2645accd948218978cd608ff9e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-rgw-rgw-compute-2-nxrebk, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:03:01 np0005593234 podman[83926]: 2026-01-23 09:03:01.705863052 +0000 UTC m=+0.020213669 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:03:01 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6ba28fdf616af59b7e29647a3ec44c0ad8f3d13d2c4e9189f9fe597c81278a1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:01 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6ba28fdf616af59b7e29647a3ec44c0ad8f3d13d2c4e9189f9fe597c81278a1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:01 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6ba28fdf616af59b7e29647a3ec44c0ad8f3d13d2c4e9189f9fe597c81278a1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:01 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6ba28fdf616af59b7e29647a3ec44c0ad8f3d13d2c4e9189f9fe597c81278a1/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-2.nxrebk supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:01 np0005593234 podman[83926]: 2026-01-23 09:03:01.933889409 +0000 UTC m=+0.248240056 container init 3688fe693d5d11d25baadc81f0e6cc057fa3cb2645accd948218978cd608ff9e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-rgw-rgw-compute-2-nxrebk, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3)
Jan 23 04:03:01 np0005593234 podman[83926]: 2026-01-23 09:03:01.940192034 +0000 UTC m=+0.254542631 container start 3688fe693d5d11d25baadc81f0e6cc057fa3cb2645accd948218978cd608ff9e (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-rgw-rgw-compute-2-nxrebk, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Jan 23 04:03:02 np0005593234 radosgw[83946]: deferred set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:03:02 np0005593234 radosgw[83946]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process radosgw, pid 2
Jan 23 04:03:02 np0005593234 radosgw[83946]: framework: beast
Jan 23 04:03:02 np0005593234 radosgw[83946]: framework conf key: endpoint, val: 192.168.122.102:8082
Jan 23 04:03:02 np0005593234 radosgw[83946]: init_numa not setting numa affinity
Jan 23 04:03:02 np0005593234 bash[83926]: 3688fe693d5d11d25baadc81f0e6cc057fa3cb2645accd948218978cd608ff9e
Jan 23 04:03:02 np0005593234 systemd[1]: Started Ceph rgw.rgw.compute-2.nxrebk for e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:03:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e45 e45: 3 total, 3 up, 3 in
Jan 23 04:03:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"} v 0) v1
Jan 23 04:03:02 np0005593234 ceph-mon[77084]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1632836641' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 04:03:03 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.8 scrub starts
Jan 23 04:03:03 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.8 scrub ok
Jan 23 04:03:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:03 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.102:0/1632836641' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 04:03:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:03 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 04:03:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.odtvxh", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:03:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.odtvxh", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:03:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:03 np0005593234 ceph-mon[77084]: Deploying daemon rgw.rgw.compute-1.odtvxh on compute-1
Jan 23 04:03:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e46 e46: 3 total, 3 up, 3 in
Jan 23 04:03:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e46 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e47 e47: 3 total, 3 up, 3 in
Jan 23 04:03:04 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Jan 23 04:03:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0) v1
Jan 23 04:03:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1632836641' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:03:05 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.b scrub starts
Jan 23 04:03:05 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.b scrub ok
Jan 23 04:03:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:05 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.102:0/1632836641' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:03:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:05 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 04:03:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.jgxhia", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 04:03:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.jgxhia", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 04:03:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:05 np0005593234 ceph-mon[77084]: Deploying daemon rgw.rgw.compute-0.jgxhia on compute-0
Jan 23 04:03:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e48 e48: 3 total, 3 up, 3 in
Jan 23 04:03:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e49 e49: 3 total, 3 up, 3 in
Jan 23 04:03:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0) v1
Jan 23 04:03:06 np0005593234 ceph-mon[77084]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/1632836641' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:03:06 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 23 04:03:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.cfzfln", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 04:03:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.cfzfln", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 04:03:07 np0005593234 podman[84150]: 2026-01-23 09:03:06.970493799 +0000 UTC m=+0.023384667 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:03:07 np0005593234 podman[84150]: 2026-01-23 09:03:07.157637677 +0000 UTC m=+0.210528515 container create b04ad442bfb93e5127a430593e01233ab1f89e5da40d5b3410d39be62e7e891a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_robinson, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Jan 23 04:03:07 np0005593234 systemd[1]: Started libpod-conmon-b04ad442bfb93e5127a430593e01233ab1f89e5da40d5b3410d39be62e7e891a.scope.
Jan 23 04:03:07 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:03:07 np0005593234 podman[84150]: 2026-01-23 09:03:07.250965713 +0000 UTC m=+0.303856571 container init b04ad442bfb93e5127a430593e01233ab1f89e5da40d5b3410d39be62e7e891a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_robinson, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 23 04:03:07 np0005593234 podman[84150]: 2026-01-23 09:03:07.262624265 +0000 UTC m=+0.315515133 container start b04ad442bfb93e5127a430593e01233ab1f89e5da40d5b3410d39be62e7e891a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_robinson, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:03:07 np0005593234 podman[84150]: 2026-01-23 09:03:07.266683192 +0000 UTC m=+0.319574040 container attach b04ad442bfb93e5127a430593e01233ab1f89e5da40d5b3410d39be62e7e891a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_robinson, CEPH_REF=reef, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:03:07 np0005593234 kind_robinson[84167]: 167 167
Jan 23 04:03:07 np0005593234 systemd[1]: libpod-b04ad442bfb93e5127a430593e01233ab1f89e5da40d5b3410d39be62e7e891a.scope: Deactivated successfully.
Jan 23 04:03:07 np0005593234 podman[84150]: 2026-01-23 09:03:07.271986736 +0000 UTC m=+0.324877574 container died b04ad442bfb93e5127a430593e01233ab1f89e5da40d5b3410d39be62e7e891a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_robinson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:03:07 np0005593234 systemd[1]: var-lib-containers-storage-overlay-254ad25584f60eb362afd9f6f7bdbffea8495206f959322d5df49982754f58df-merged.mount: Deactivated successfully.
Jan 23 04:03:07 np0005593234 podman[84150]: 2026-01-23 09:03:07.316178737 +0000 UTC m=+0.369069565 container remove b04ad442bfb93e5127a430593e01233ab1f89e5da40d5b3410d39be62e7e891a (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=kind_robinson, ceph=True, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:03:07 np0005593234 systemd[1]: libpod-conmon-b04ad442bfb93e5127a430593e01233ab1f89e5da40d5b3410d39be62e7e891a.scope: Deactivated successfully.
Jan 23 04:03:07 np0005593234 systemd[1]: Reloading.
Jan 23 04:03:07 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:03:07 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:03:07 np0005593234 systemd[1]: Reloading.
Jan 23 04:03:07 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:03:07 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:03:07 np0005593234 systemd[1]: Starting Ceph mds.cephfs.compute-2.cfzfln for e1533653-0a5a-584c-b34b-8689f0d32e77...
Jan 23 04:03:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e50 e50: 3 total, 3 up, 3 in
Jan 23 04:03:07 np0005593234 ceph-mon[77084]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 23 04:03:07 np0005593234 ceph-mon[77084]: Deploying daemon mds.cephfs.compute-2.cfzfln on compute-2
Jan 23 04:03:07 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/671819638' entity='client.rgw.rgw.compute-0.jgxhia' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:03:07 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.101:0/2162450091' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:03:07 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.102:0/1632836641' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:03:07 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:03:07 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 04:03:08 np0005593234 podman[84322]: 2026-01-23 09:03:08.184144106 +0000 UTC m=+0.038114093 container create e352e7d14f34fb09ce833ee8ee86c791e8262e1ad6dac4747a9118071301c4a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mds-cephfs-compute-2-cfzfln, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507)
Jan 23 04:03:08 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48586e417aa60ffa3dbe7b8c1a426ea4af3c7c26f012660d88f2ee2b4a95f14d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:08 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48586e417aa60ffa3dbe7b8c1a426ea4af3c7c26f012660d88f2ee2b4a95f14d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:08 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48586e417aa60ffa3dbe7b8c1a426ea4af3c7c26f012660d88f2ee2b4a95f14d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:08 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/48586e417aa60ffa3dbe7b8c1a426ea4af3c7c26f012660d88f2ee2b4a95f14d/merged/var/lib/ceph/mds/ceph-cephfs.compute-2.cfzfln supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:08 np0005593234 podman[84322]: 2026-01-23 09:03:08.248884226 +0000 UTC m=+0.102854223 container init e352e7d14f34fb09ce833ee8ee86c791e8262e1ad6dac4747a9118071301c4a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mds-cephfs-compute-2-cfzfln, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:03:08 np0005593234 podman[84322]: 2026-01-23 09:03:08.256861773 +0000 UTC m=+0.110831750 container start e352e7d14f34fb09ce833ee8ee86c791e8262e1ad6dac4747a9118071301c4a8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mds-cephfs-compute-2-cfzfln, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 23 04:03:08 np0005593234 bash[84322]: e352e7d14f34fb09ce833ee8ee86c791e8262e1ad6dac4747a9118071301c4a8
Jan 23 04:03:08 np0005593234 podman[84322]: 2026-01-23 09:03:08.165920891 +0000 UTC m=+0.019890898 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:03:08 np0005593234 systemd[1]: Started Ceph mds.cephfs.compute-2.cfzfln for e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:03:08 np0005593234 ceph-mds[84342]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:03:08 np0005593234 ceph-mds[84342]: ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable), process ceph-mds, pid 2
Jan 23 04:03:08 np0005593234 ceph-mds[84342]: main not setting numa affinity
Jan 23 04:03:08 np0005593234 ceph-mds[84342]: pidfile_write: ignore empty --pid-file
Jan 23 04:03:08 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mds-cephfs-compute-2-cfzfln[84338]: starting mds.cephfs.compute-2.cfzfln at 
Jan 23 04:03:08 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln Updating MDS map to version 2 from mon.1
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e51 e51: 3 total, 3 up, 3 in
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0) v1
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/448850748' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/671819638' entity='client.rgw.rgw.compute-0.jgxhia' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-1.odtvxh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.djntrk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.djntrk", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.102:0/448850748' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.101:0/249273750' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:03:08 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/1806314718' entity='client.rgw.rgw.compute-0.jgxhia' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 04:03:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e3 new map
Jan 23 04:03:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e3 print_map#012e3#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0112#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:02:44.728889+0000#012modified#0112026-01-23T09:02:44.729005+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#011#012up#011{}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-2.cfzfln{-1:24154} state up:standby seq 1 addr [v2:192.168.122.102:6804/817154036,v1:192.168.122.102:6805/817154036] compat {c=[1],r=[1],i=[7ff]}]
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln Updating MDS map to version 3 from mon.1
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln Monitors have assigned me to become a standby.
Jan 23 04:03:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e4 new map
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln Updating MDS map to version 4 from mon.1
Jan 23 04:03:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e4 print_map#012e4#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0114#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:02:44.728889+0000#012modified#0112026-01-23T09:03:08.999863+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24154}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.cfzfln{0:24154} state up:creating seq 1 addr [v2:192.168.122.102:6804/817154036,v1:192.168.122.102:6805/817154036] compat {c=[1],r=[1],i=[7ff]}]#012 #012 
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.4 handle_mds_map i am now mds.0.4
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.4 handle_mds_map state change up:standby --> up:creating
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.cache creating system inode with ino:0x1
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.cache creating system inode with ino:0x100
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.cache creating system inode with ino:0x600
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.cache creating system inode with ino:0x601
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.cache creating system inode with ino:0x602
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.cache creating system inode with ino:0x603
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.cache creating system inode with ino:0x604
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.cache creating system inode with ino:0x605
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.cache creating system inode with ino:0x606
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.cache creating system inode with ino:0x607
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.cache creating system inode with ino:0x608
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.cache creating system inode with ino:0x609
Jan 23 04:03:09 np0005593234 ceph-mds[84342]: mds.0.4 creating_done
Jan 23 04:03:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e51 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e52 e52: 3 total, 3 up, 3 in
Jan 23 04:03:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0) v1
Jan 23 04:03:09 np0005593234 ceph-mon[77084]: log_channel(audit) log [INF] : from='client.? 192.168.122.102:0/448850748' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: Deploying daemon mds.cephfs.compute-0.djntrk on compute-0
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: daemon mds.cephfs.compute-2.cfzfln assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: Cluster is now healthy
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: daemon mds.cephfs.compute-2.cfzfln is now active in filesystem cephfs as rank 0
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-1.odtvxh' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/1806314718' entity='client.rgw.rgw.compute-0.jgxhia' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/1806314718' entity='client.rgw.rgw.compute-0.jgxhia' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.102:0/448850748' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.101:0/249273750' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-1.odtvxh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e5 new map
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e5 print_map#012e5#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:02:44.728889+0000#012modified#0112026-01-23T09:03:10.006927+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24154}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0110#012[mds.cephfs.compute-2.cfzfln{0:24154} state up:active seq 2 addr [v2:192.168.122.102:6804/817154036,v1:192.168.122.102:6805/817154036] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.djntrk{-1:14418} state up:standby seq 1 addr [v2:192.168.122.100:6806/376811981,v1:192.168.122.100:6807/376811981] compat {c=[1],r=[1],i=[7ff]}]
Jan 23 04:03:10 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln Updating MDS map to version 5 from mon.1
Jan 23 04:03:10 np0005593234 ceph-mds[84342]: mds.0.4 handle_mds_map i am now mds.0.4
Jan 23 04:03:10 np0005593234 ceph-mds[84342]: mds.0.4 handle_mds_map state change up:creating --> up:active
Jan 23 04:03:10 np0005593234 ceph-mds[84342]: mds.0.4 recovery_done -- successful recovery!
Jan 23 04:03:10 np0005593234 ceph-mds[84342]: mds.0.4 active_start
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e6 new map
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e6 print_map#012e6#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0115#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:02:44.728889+0000#012modified#0112026-01-23T09:03:10.006927+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24154}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.cfzfln{0:24154} state up:active seq 2 addr [v2:192.168.122.102:6804/817154036,v1:192.168.122.102:6805/817154036] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.djntrk{-1:14418} state up:standby seq 1 addr [v2:192.168.122.100:6806/376811981,v1:192.168.122.100:6807/376811981] compat {c=[1],r=[1],i=[7ff]}]
Jan 23 04:03:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e53 e53: 3 total, 3 up, 3 in
Jan 23 04:03:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.elkrlx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 04:03:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.elkrlx", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 04:03:11 np0005593234 ceph-mon[77084]: Deploying daemon mds.cephfs.compute-1.elkrlx on compute-1
Jan 23 04:03:11 np0005593234 ceph-mon[77084]: from='client.? 192.168.122.100:0/1806314718' entity='client.rgw.rgw.compute-0.jgxhia' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 04:03:11 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-2.nxrebk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 04:03:11 np0005593234 ceph-mon[77084]: from='client.? ' entity='client.rgw.rgw.compute-1.odtvxh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 04:03:11 np0005593234 radosgw[83946]: LDAP not started since no server URIs were provided in the configuration.
Jan 23 04:03:11 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-rgw-rgw-compute-2-nxrebk[83942]: 2026-01-23T09:03:11.031+0000 7f7de1d90940 -1 LDAP not started since no server URIs were provided in the configuration.
Jan 23 04:03:11 np0005593234 radosgw[83946]: framework: beast
Jan 23 04:03:11 np0005593234 radosgw[83946]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Jan 23 04:03:11 np0005593234 radosgw[83946]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Jan 23 04:03:11 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 23 04:03:11 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 04:03:11 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 23 04:03:11 np0005593234 radosgw[83946]: starting handler: beast
Jan 23 04:03:11 np0005593234 radosgw[83946]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 04:03:11 np0005593234 radosgw[83946]: mgrc service_daemon_register rgw.24148 metadata {arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.7 (6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad) reef (stable),ceph_version_short=18.2.7,container_hostname=compute-2,container_image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.102:8082,frontend_type#0=beast,hostname=compute-2,id=rgw.compute-2.nxrebk,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=3e406311-b9dd-4a98-8793-19b1bcf2f2db,zone_name=default,zonegroup_id=b6190aad-4d81-46cc-a15a-858fefbf7de5,zonegroup_name=default}
Jan 23 04:03:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e7 new map
Jan 23 04:03:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e7 print_map#012e7#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0117#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:02:44.728889+0000#012modified#0112026-01-23T09:03:13.047531+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24154}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.cfzfln{0:24154} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/817154036,v1:192.168.122.102:6805/817154036] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.djntrk{-1:14418} state up:standby seq 1 addr [v2:192.168.122.100:6806/376811981,v1:192.168.122.100:6807/376811981] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.elkrlx{-1:24167} state up:standby seq 1 addr [v2:192.168.122.101:6804/4162024387,v1:192.168.122.101:6805/4162024387] compat {c=[1],r=[1],i=[7ff]}]
Jan 23 04:03:13 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln Updating MDS map to version 7 from mon.1
Jan 23 04:03:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:14 np0005593234 ceph-mon[77084]: Deploying daemon haproxy.rgw.default.compute-0.iyrury on compute-0
Jan 23 04:03:14 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:14 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.8 scrub starts
Jan 23 04:03:14 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.8 scrub ok
Jan 23 04:03:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e8 new map
Jan 23 04:03:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e8 print_map#012e8#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0117#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:02:44.728889+0000#012modified#0112026-01-23T09:03:13.047531+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24154}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.cfzfln{0:24154} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/817154036,v1:192.168.122.102:6805/817154036] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.djntrk{-1:14418} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/376811981,v1:192.168.122.100:6807/376811981] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.elkrlx{-1:24167} state up:standby seq 1 addr [v2:192.168.122.101:6804/4162024387,v1:192.168.122.101:6805/4162024387] compat {c=[1],r=[1],i=[7ff]}]
Jan 23 04:03:15 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.d scrub starts
Jan 23 04:03:15 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.d scrub ok
Jan 23 04:03:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e9 new map
Jan 23 04:03:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).mds e9 print_map#012e9#012enable_multiple, ever_enabled_multiple: 1,1#012default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2}#012legacy client fscid: 1#012 #012Filesystem 'cephfs' (1)#012fs_name#011cephfs#012epoch#0117#012flags#01112 joinable allow_snaps allow_multimds_snaps#012created#0112026-01-23T09:02:44.728889+0000#012modified#0112026-01-23T09:03:13.047531+0000#012tableserver#0110#012root#0110#012session_timeout#01160#012session_autoclose#011300#012max_file_size#0111099511627776#012max_xattr_size#01165536#012required_client_features#011{}#012last_failure#0110#012last_failure_osd_epoch#0110#012compat#011compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2}#012max_mds#0111#012in#0110#012up#011{0=24154}#012failed#011#012damaged#011#012stopped#011#012data_pools#011[7]#012metadata_pool#0116#012inline_data#011disabled#012balancer#011#012bal_rank_mask#011-1#012standby_count_wanted#0111#012[mds.cephfs.compute-2.cfzfln{0:24154} state up:active seq 3 join_fscid=1 addr [v2:192.168.122.102:6804/817154036,v1:192.168.122.102:6805/817154036] compat {c=[1],r=[1],i=[7ff]}]#012 #012 #012Standby daemons:#012 #012[mds.cephfs.compute-0.djntrk{-1:14418} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.100:6806/376811981,v1:192.168.122.100:6807/376811981] compat {c=[1],r=[1],i=[7ff]}]#012[mds.cephfs.compute-1.elkrlx{-1:24167} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/4162024387,v1:192.168.122.101:6805/4162024387] compat {c=[1],r=[1],i=[7ff]}]
Jan 23 04:03:18 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:18 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:18 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:18 np0005593234 ceph-mon[77084]: Deploying daemon haproxy.rgw.default.compute-2.xmknsp on compute-2
Jan 23 04:03:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.003000096s ======
Jan 23 04:03:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:19.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000096s
Jan 23 04:03:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:19 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:20 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.9 deep-scrub starts
Jan 23 04:03:20 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.9 deep-scrub ok
Jan 23 04:03:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:21.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:21 np0005593234 podman[85053]: 2026-01-23 09:03:21.371407048 +0000 UTC m=+2.819812192 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 23 04:03:21 np0005593234 podman[85053]: 2026-01-23 09:03:21.38792396 +0000 UTC m=+2.836329094 container create f1023473af551d5aacd1e6ad87440bc6b32d71803fa80e142ee7d8e199eb4f31 (image=quay.io/ceph/haproxy:2.3, name=suspicious_gagarin)
Jan 23 04:03:21 np0005593234 systemd[1]: Started libpod-conmon-f1023473af551d5aacd1e6ad87440bc6b32d71803fa80e142ee7d8e199eb4f31.scope.
Jan 23 04:03:21 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:03:21 np0005593234 podman[85053]: 2026-01-23 09:03:21.47359671 +0000 UTC m=+2.922001864 container init f1023473af551d5aacd1e6ad87440bc6b32d71803fa80e142ee7d8e199eb4f31 (image=quay.io/ceph/haproxy:2.3, name=suspicious_gagarin)
Jan 23 04:03:21 np0005593234 podman[85053]: 2026-01-23 09:03:21.483327178 +0000 UTC m=+2.931732292 container start f1023473af551d5aacd1e6ad87440bc6b32d71803fa80e142ee7d8e199eb4f31 (image=quay.io/ceph/haproxy:2.3, name=suspicious_gagarin)
Jan 23 04:03:21 np0005593234 podman[85053]: 2026-01-23 09:03:21.487027615 +0000 UTC m=+2.935432759 container attach f1023473af551d5aacd1e6ad87440bc6b32d71803fa80e142ee7d8e199eb4f31 (image=quay.io/ceph/haproxy:2.3, name=suspicious_gagarin)
Jan 23 04:03:21 np0005593234 systemd[1]: libpod-f1023473af551d5aacd1e6ad87440bc6b32d71803fa80e142ee7d8e199eb4f31.scope: Deactivated successfully.
Jan 23 04:03:21 np0005593234 suspicious_gagarin[85170]: 0 0
Jan 23 04:03:21 np0005593234 podman[85053]: 2026-01-23 09:03:21.492321082 +0000 UTC m=+2.940726196 container died f1023473af551d5aacd1e6ad87440bc6b32d71803fa80e142ee7d8e199eb4f31 (image=quay.io/ceph/haproxy:2.3, name=suspicious_gagarin)
Jan 23 04:03:21 np0005593234 conmon[85170]: conmon f1023473af551d5aacd1 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-f1023473af551d5aacd1e6ad87440bc6b32d71803fa80e142ee7d8e199eb4f31.scope/container/memory.events
Jan 23 04:03:21 np0005593234 systemd[1]: var-lib-containers-storage-overlay-eda6e3e357cee25b6a2eeb9317297b89eee8a731d62a6a222acc1c3a4b7b87af-merged.mount: Deactivated successfully.
Jan 23 04:03:21 np0005593234 podman[85053]: 2026-01-23 09:03:21.535294962 +0000 UTC m=+2.983700096 container remove f1023473af551d5aacd1e6ad87440bc6b32d71803fa80e142ee7d8e199eb4f31 (image=quay.io/ceph/haproxy:2.3, name=suspicious_gagarin)
Jan 23 04:03:21 np0005593234 systemd[1]: libpod-conmon-f1023473af551d5aacd1e6ad87440bc6b32d71803fa80e142ee7d8e199eb4f31.scope: Deactivated successfully.
Jan 23 04:03:21 np0005593234 systemd[1]: Reloading.
Jan 23 04:03:21 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:03:21 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:03:21 np0005593234 systemd[1]: Reloading.
Jan 23 04:03:21 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:03:21 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:03:22 np0005593234 systemd[1]: Starting Ceph haproxy.rgw.default.compute-2.xmknsp for e1533653-0a5a-584c-b34b-8689f0d32e77...
Jan 23 04:03:22 np0005593234 podman[85317]: 2026-01-23 09:03:22.407465302 +0000 UTC m=+0.045971086 container create 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:03:22 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8da4a1bc1be3adc611a7d32c0685e4164cf5ffe719642c3e71eeaeca3f77bf5e/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:22 np0005593234 podman[85317]: 2026-01-23 09:03:22.465916831 +0000 UTC m=+0.104422635 container init 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:03:22 np0005593234 podman[85317]: 2026-01-23 09:03:22.473657966 +0000 UTC m=+0.112163750 container start 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:03:22 np0005593234 bash[85317]: 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120
Jan 23 04:03:22 np0005593234 podman[85317]: 2026-01-23 09:03:22.387474499 +0000 UTC m=+0.025980343 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 23 04:03:22 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp[85333]: [NOTICE] 022/090322 (2) : New worker #1 (4) forked
Jan 23 04:03:22 np0005593234 systemd[1]: Started Ceph haproxy.rgw.default.compute-2.xmknsp for e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:03:22 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.e scrub starts
Jan 23 04:03:22 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.e scrub ok
Jan 23 04:03:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:23.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:23.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:24 np0005593234 ceph-mon[77084]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 04:03:24 np0005593234 ceph-mon[77084]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 04:03:24 np0005593234 ceph-mon[77084]: Deploying daemon keepalived.rgw.default.compute-2.tkmlem on compute-2
Jan 23 04:03:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:25.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:25.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:26 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.e deep-scrub starts
Jan 23 04:03:26 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.e deep-scrub ok
Jan 23 04:03:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:27.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:27 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.12 scrub starts
Jan 23 04:03:27 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.12 scrub ok
Jan 23 04:03:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:27.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:28 np0005593234 podman[85488]: 2026-01-23 09:03:28.439444605 +0000 UTC m=+5.215989442 container create c5473bb4aa0b08282a161fce04270920ea09f495884f3fbe5cb8f7f9ef15e783 (image=quay.io/ceph/keepalived:2.2.4, name=adoring_buck, version=2.2.4, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, release=1793, com.redhat.component=keepalived-container, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, architecture=x86_64)
Jan 23 04:03:28 np0005593234 podman[85488]: 2026-01-23 09:03:28.424413049 +0000 UTC m=+5.200957836 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 23 04:03:28 np0005593234 systemd[1]: Started libpod-conmon-c5473bb4aa0b08282a161fce04270920ea09f495884f3fbe5cb8f7f9ef15e783.scope.
Jan 23 04:03:28 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:03:28 np0005593234 podman[85488]: 2026-01-23 09:03:28.524388892 +0000 UTC m=+5.300933699 container init c5473bb4aa0b08282a161fce04270920ea09f495884f3fbe5cb8f7f9ef15e783 (image=quay.io/ceph/keepalived:2.2.4, name=adoring_buck, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, architecture=x86_64)
Jan 23 04:03:28 np0005593234 podman[85488]: 2026-01-23 09:03:28.532674504 +0000 UTC m=+5.309219301 container start c5473bb4aa0b08282a161fce04270920ea09f495884f3fbe5cb8f7f9ef15e783 (image=quay.io/ceph/keepalived:2.2.4, name=adoring_buck, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, description=keepalived for Ceph, architecture=x86_64, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, release=1793, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived)
Jan 23 04:03:28 np0005593234 podman[85488]: 2026-01-23 09:03:28.537021261 +0000 UTC m=+5.313566058 container attach c5473bb4aa0b08282a161fce04270920ea09f495884f3fbe5cb8f7f9ef15e783 (image=quay.io/ceph/keepalived:2.2.4, name=adoring_buck, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, architecture=x86_64, com.redhat.component=keepalived-container, name=keepalived, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, release=1793, io.openshift.tags=Ceph keepalived, version=2.2.4, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9)
Jan 23 04:03:28 np0005593234 adoring_buck[85585]: 0 0
Jan 23 04:03:28 np0005593234 systemd[1]: libpod-c5473bb4aa0b08282a161fce04270920ea09f495884f3fbe5cb8f7f9ef15e783.scope: Deactivated successfully.
Jan 23 04:03:28 np0005593234 podman[85488]: 2026-01-23 09:03:28.53982889 +0000 UTC m=+5.316373657 container died c5473bb4aa0b08282a161fce04270920ea09f495884f3fbe5cb8f7f9ef15e783 (image=quay.io/ceph/keepalived:2.2.4, name=adoring_buck, io.k8s.display-name=Keepalived on RHEL 9, architecture=x86_64, com.redhat.component=keepalived-container, name=keepalived, distribution-scope=public, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.28.2)
Jan 23 04:03:28 np0005593234 systemd[1]: var-lib-containers-storage-overlay-d99b016c78881a421c8d5f8233f4f8e4c16a324ff00ffea8422fc4f2c7e073de-merged.mount: Deactivated successfully.
Jan 23 04:03:28 np0005593234 podman[85488]: 2026-01-23 09:03:28.743175502 +0000 UTC m=+5.519720299 container remove c5473bb4aa0b08282a161fce04270920ea09f495884f3fbe5cb8f7f9ef15e783 (image=quay.io/ceph/keepalived:2.2.4, name=adoring_buck, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, release=1793, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, vcs-type=git, version=2.2.4)
Jan 23 04:03:28 np0005593234 systemd[1]: libpod-conmon-c5473bb4aa0b08282a161fce04270920ea09f495884f3fbe5cb8f7f9ef15e783.scope: Deactivated successfully.
Jan 23 04:03:28 np0005593234 systemd[1]: Reloading.
Jan 23 04:03:28 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:03:28 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:03:29 np0005593234 systemd[1]: Reloading.
Jan 23 04:03:29 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:03:29 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:03:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:29.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:29 np0005593234 systemd[1]: Starting Ceph keepalived.rgw.default.compute-2.tkmlem for e1533653-0a5a-584c-b34b-8689f0d32e77...
Jan 23 04:03:29 np0005593234 podman[85727]: 2026-01-23 09:03:29.592956224 +0000 UTC m=+0.033659136 container create 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., name=keepalived, vendor=Red Hat, Inc., architecture=x86_64, release=1793, build-date=2023-02-22T09:23:20)
Jan 23 04:03:29 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.13 scrub starts
Jan 23 04:03:29 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.13 scrub ok
Jan 23 04:03:29 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c4a3c6564029bc22ede4cc607b76732dee5c39351752cdcde3bd751e2c8743a/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:03:29 np0005593234 podman[85727]: 2026-01-23 09:03:29.651401503 +0000 UTC m=+0.092104435 container init 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, build-date=2023-02-22T09:23:20, name=keepalived, vendor=Red Hat, Inc., architecture=x86_64, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, io.buildah.version=1.28.2)
Jan 23 04:03:29 np0005593234 podman[85727]: 2026-01-23 09:03:29.65732612 +0000 UTC m=+0.098029032 container start 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, distribution-scope=public, release=1793, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph, architecture=x86_64, vcs-type=git, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, name=keepalived, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4)
Jan 23 04:03:29 np0005593234 bash[85727]: 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72
Jan 23 04:03:29 np0005593234 podman[85727]: 2026-01-23 09:03:29.578708193 +0000 UTC m=+0.019411125 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 23 04:03:29 np0005593234 systemd[1]: Started Ceph keepalived.rgw.default.compute-2.tkmlem for e1533653-0a5a-584c-b34b-8689f0d32e77.
Jan 23 04:03:29 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem[85742]: Fri Jan 23 09:03:29 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 23 04:03:29 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem[85742]: Fri Jan 23 09:03:29 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 23 04:03:29 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem[85742]: Fri Jan 23 09:03:29 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 23 04:03:29 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem[85742]: Fri Jan 23 09:03:29 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 23 04:03:29 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem[85742]: Fri Jan 23 09:03:29 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 23 04:03:29 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem[85742]: Fri Jan 23 09:03:29 2026: Starting VRRP child process, pid=4
Jan 23 04:03:29 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem[85742]: Fri Jan 23 09:03:29 2026: Startup complete
Jan 23 04:03:29 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem[85742]: Fri Jan 23 09:03:29 2026: (VI_0) Entering BACKUP STATE (init)
Jan 23 04:03:29 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem[85742]: Fri Jan 23 09:03:29 2026: VRRP_Script(check_backend) succeeded
Jan 23 04:03:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:03:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:29.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:03:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:31.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:31 np0005593234 ceph-mon[77084]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 04:03:31 np0005593234 ceph-mon[77084]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 04:03:31 np0005593234 ceph-mon[77084]: Deploying daemon keepalived.rgw.default.compute-0.qsixev on compute-0
Jan 23 04:03:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:31.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:03:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:33.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:03:33 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem[85742]: Fri Jan 23 09:03:33 2026: (VI_0) Entering MASTER STATE
Jan 23 04:03:33 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.11 scrub starts
Jan 23 04:03:33 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.11 scrub ok
Jan 23 04:03:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:33.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:34 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.15 scrub starts
Jan 23 04:03:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:03:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:35.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:03:35 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.1a scrub starts
Jan 23 04:03:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:35.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:36 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 5.1a scrub ok
Jan 23 04:03:36 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.15 scrub ok
Jan 23 04:03:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:03:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:37.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:03:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:37.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:38 np0005593234 podman[86033]: 2026-01-23 09:03:38.824268142 +0000 UTC m=+0.757069930 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:03:38 np0005593234 podman[86033]: 2026-01-23 09:03:38.935003365 +0000 UTC m=+0.867805123 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:03:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:39.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:39 np0005593234 podman[86186]: 2026-01-23 09:03:39.545000791 +0000 UTC m=+0.058479801 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:03:39 np0005593234 podman[86186]: 2026-01-23 09:03:39.559801149 +0000 UTC m=+0.073280159 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:03:39 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Jan 23 04:03:39 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Jan 23 04:03:39 np0005593234 podman[86251]: 2026-01-23 09:03:39.766359093 +0000 UTC m=+0.053152512 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, description=keepalived for Ceph, architecture=x86_64, vcs-type=git)
Jan 23 04:03:39 np0005593234 podman[86251]: 2026-01-23 09:03:39.807821024 +0000 UTC m=+0.094614413 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., build-date=2023-02-22T09:23:20, release=1793, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=)
Jan 23 04:03:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:39.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:40 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem[85742]: Fri Jan 23 09:03:40 2026: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Jan 23 04:03:40 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem[85742]: Fri Jan 23 09:03:40 2026: (VI_0) Entering BACKUP STATE
Jan 23 04:03:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:03:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:03:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:41.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:41 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.1b scrub starts
Jan 23 04:03:41 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.1b scrub ok
Jan 23 04:03:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:41.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:42 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Jan 23 04:03:42 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Jan 23 04:03:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e54 e54: 3 total, 3 up, 3 in
Jan 23 04:03:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:03:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:03:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:43.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:03:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:43.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e55 e55: 3 total, 3 up, 3 in
Jan 23 04:03:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:03:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:03:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:03:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:03:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:03:44 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Jan 23 04:03:44 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Jan 23 04:03:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e56 e56: 3 total, 3 up, 3 in
Jan 23 04:03:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:03:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:03:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 04:03:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:03:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:45.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:03:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:45.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e57 e57: 3 total, 3 up, 3 in
Jan 23 04:03:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:03:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:03:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Jan 23 04:03:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:03:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:03:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:46 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.a scrub starts
Jan 23 04:03:46 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.a scrub ok
Jan 23 04:03:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e58 e58: 3 total, 3 up, 3 in
Jan 23 04:03:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:47.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:47 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:47 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:47.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:48 np0005593234 podman[86641]: 2026-01-23 09:03:48.198085197 +0000 UTC m=+0.075157978 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:03:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e59 e59: 3 total, 3 up, 3 in
Jan 23 04:03:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 04:03:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 04:03:48 np0005593234 podman[86641]: 2026-01-23 09:03:48.345265053 +0000 UTC m=+0.222337834 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:03:48 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.1d deep-scrub starts
Jan 23 04:03:48 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.1d deep-scrub ok
Jan 23 04:03:49 np0005593234 podman[86796]: 2026-01-23 09:03:49.089173155 +0000 UTC m=+0.053889776 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:03:49 np0005593234 podman[86796]: 2026-01-23 09:03:49.10606127 +0000 UTC m=+0.070777871 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:03:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e60 e60: 3 total, 3 up, 3 in
Jan 23 04:03:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:49.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:49 np0005593234 podman[86859]: 2026-01-23 09:03:49.379941843 +0000 UTC m=+0.082582403 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, name=keepalived, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, description=keepalived for Ceph, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, distribution-scope=public)
Jan 23 04:03:49 np0005593234 podman[86859]: 2026-01-23 09:03:49.3959568 +0000 UTC m=+0.098597340 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, vendor=Red Hat, Inc., release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, io.buildah.version=1.28.2, name=keepalived, vcs-type=git, com.redhat.component=keepalived-container, description=keepalived for Ceph)
Jan 23 04:03:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:03:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:03:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:49.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:03:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:51.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:03:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e60 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:51.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e61 e61: 3 total, 3 up, 3 in
Jan 23 04:03:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:03:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:03:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 23 04:03:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[10.3( empty local-lis/les=0/0 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[10.1( empty local-lis/les=0/0 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[10.4( empty local-lis/les=0/0 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[10.1e( empty local-lis/les=0/0 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[10.10( empty local-lis/les=0/0 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[10.12( empty local-lis/les=0/0 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[10.11( empty local-lis/les=0/0 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[10.f( empty local-lis/les=0/0 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.11( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[11.13( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.a( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.6( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[11.a( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.9( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.16( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.5( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[11.16( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.15( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.f( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.d( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[11.e( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[11.8( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.b( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.c( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[11.3( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.2( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.3( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.1f( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[11.19( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[11.17( empty local-lis/les=0/0 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 61 pg[8.1c( empty local-lis/les=0/0 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Jan 23 04:03:52 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Jan 23 04:03:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e62 e62: 3 total, 3 up, 3 in
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[11.16( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.2( v 46'4 (0'0,46'4] local-lis/les=61/62 n=1 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.15( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[10.1( v 50'48 (0'0,50'48] local-lis/les=61/62 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[10.3( v 60'51 lc 50'39 (0'0,60'51] local-lis/les=61/62 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=60'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.16( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[11.a( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.f( v 46'4 lc 0'0 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 mlcod 0'0 active+degraded m=2 mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[11.8( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.9( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.d( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[10.f( v 50'48 (0'0,50'48] local-lis/les=61/62 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.b( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[11.e( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.3( v 46'4 (0'0,46'4] local-lis/les=61/62 n=1 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.5( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[11.19( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.1f( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[10.1e( v 50'48 (0'0,50'48] local-lis/les=61/62 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.1c( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[10.11( v 50'48 (0'0,50'48] local-lis/les=61/62 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.6( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[10.10( v 50'48 (0'0,50'48] local-lis/les=61/62 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[10.12( v 50'48 (0'0,50'48] local-lis/les=61/62 n=0 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.11( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[10.4( v 50'48 (0'0,50'48] local-lis/les=61/62 n=1 ec=57/49 lis/c=57/57 les/c/f=58/58/0 sis=61) [2] r=0 lpr=61 pi=[57,61)/1 crt=50'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[11.17( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.c( v 46'4 lc 0'0 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[11.13( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[11.3( empty local-lis/les=61/62 n=0 ec=59/51 lis/c=59/59 les/c/f=60/60/0 sis=61) [2] r=0 lpr=61 pi=[59,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 62 pg[8.a( v 46'4 (0'0,46'4] local-lis/les=61/62 n=0 ec=55/45 lis/c=55/55 les/c/f=56/56/0 sis=61) [2] r=0 lpr=61 pi=[55,61)/1 crt=46'4 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:03:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:03:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:03:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 23 04:03:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:03:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:03:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:53.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:03:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:53.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:03:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:55.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:03:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:55.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e62 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:03:57 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:57 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:03:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:57.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:03:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:03:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:57.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:03:58 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 23 04:03:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e63 e63: 3 total, 3 up, 3 in
Jan 23 04:03:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 23 04:03:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:03:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:03:59.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:03:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:03:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:03:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:03:59.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 23 04:04:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e64 e64: 3 total, 3 up, 3 in
Jan 23 04:04:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 23 04:04:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:04:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:01.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:04:01 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.d scrub starts
Jan 23 04:04:01 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.d scrub ok
Jan 23 04:04:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e64 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:04:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:01.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:04:01 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 64 pg[9.b( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:01 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 64 pg[9.17( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:01 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 64 pg[9.13( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:01 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 64 pg[9.3( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:01 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 64 pg[9.7( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:01 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 64 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:01 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 64 pg[9.f( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:01 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 64 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=64) [2] r=0 lpr=64 pi=[57,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.f scrub starts
Jan 23 04:04:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e65 e65: 3 total, 3 up, 3 in
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.17( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.b( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.3( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.b( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.17( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.7( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.f( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.3( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.f( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.7( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.13( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.1b( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.1f( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 65 pg[9.13( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=65) [2]/[0] r=-1 lpr=65 pi=[57,65)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:02 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.f scrub ok
Jan 23 04:04:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:03.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e66 e66: 3 total, 3 up, 3 in
Jan 23 04:04:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 23 04:04:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:03.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e67 e67: 3 total, 3 up, 3 in
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.13( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.17( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.13( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.7( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.b( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.17( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.7( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.b( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.f( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.f( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.1b( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.1b( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.3( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:04 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 67 pg[9.3( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:05.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e68 e68: 3 total, 3 up, 3 in
Jan 23 04:04:05 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 68 pg[9.13( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=5 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:05 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 68 pg[9.7( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=6 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:05 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 68 pg[9.b( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=6 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:05 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 68 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=5 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:05 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 68 pg[9.f( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=6 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:05 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 68 pg[9.1b( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=5 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:05 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 68 pg[9.3( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=6 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:05 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 68 pg[9.17( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=5 ec=57/47 lis/c=65/57 les/c/f=66/59/0 sis=67) [2] r=0 lpr=67 pi=[57,67)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:05.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:04:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:07.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:04:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:07.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e69 e69: 3 total, 3 up, 3 in
Jan 23 04:04:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 23 04:04:09 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 69 pg[9.5( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=69) [2] r=0 lpr=69 pi=[57,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:09 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 69 pg[9.d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=69) [2] r=0 lpr=69 pi=[57,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:09 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 69 pg[9.1d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=69) [2] r=0 lpr=69 pi=[57,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:09 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 69 pg[9.15( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=69) [2] r=0 lpr=69 pi=[57,69)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:09.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:09 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.c deep-scrub starts
Jan 23 04:04:09 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.c deep-scrub ok
Jan 23 04:04:09 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 23 04:04:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e70 e70: 3 total, 3 up, 3 in
Jan 23 04:04:09 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 70 pg[9.15( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:09 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 70 pg[9.15( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:09 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 70 pg[9.d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:09 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 70 pg[9.d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:09 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 70 pg[9.1d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:09 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 70 pg[9.1d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:09 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 70 pg[9.5( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:09 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 70 pg[9.5( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=70) [2]/[0] r=-1 lpr=70 pi=[57,70)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:09.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 23 04:04:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e71 e71: 3 total, 3 up, 3 in
Jan 23 04:04:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:11.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e71 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:11.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 23 04:04:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e72 e72: 3 total, 3 up, 3 in
Jan 23 04:04:11 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 72 pg[9.d( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=70/57 les/c/f=71/59/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:11 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 72 pg[9.d( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=70/57 les/c/f=71/59/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:11 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 72 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=70/57 les/c/f=71/59/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:11 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 72 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=70/57 les/c/f=71/59/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:11 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 72 pg[9.5( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=70/57 les/c/f=71/59/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:11 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 72 pg[9.5( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=70/57 les/c/f=71/59/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:11 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 72 pg[9.15( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=70/57 les/c/f=71/59/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:11 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 72 pg[9.15( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=70/57 les/c/f=71/59/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e73 e73: 3 total, 3 up, 3 in
Jan 23 04:04:12 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 73 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=5 ec=57/47 lis/c=70/57 les/c/f=71/59/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:12 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 73 pg[9.15( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=5 ec=57/47 lis/c=70/57 les/c/f=71/59/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:12 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 73 pg[9.5( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=6 ec=57/47 lis/c=70/57 les/c/f=71/59/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:12 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 73 pg[9.d( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=6 ec=57/47 lis/c=70/57 les/c/f=71/59/0 sis=72) [2] r=0 lpr=72 pi=[57,72)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:13.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e74 e74: 3 total, 3 up, 3 in
Jan 23 04:04:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:04:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:13.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:04:14 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 7.a scrub starts
Jan 23 04:04:14 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 7.a scrub ok
Jan 23 04:04:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e75 e75: 3 total, 3 up, 3 in
Jan 23 04:04:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:15.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:15.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e75 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:17.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:17 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 7.5 scrub starts
Jan 23 04:04:17 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 7.5 scrub ok
Jan 23 04:04:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:04:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:17.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:04:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e76 e76: 3 total, 3 up, 3 in
Jan 23 04:04:19 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 23 04:04:19 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 23 04:04:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:19.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:19 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 7.14 scrub starts
Jan 23 04:04:19 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 7.14 scrub ok
Jan 23 04:04:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:19.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 23 04:04:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e77 e77: 3 total, 3 up, 3 in
Jan 23 04:04:20 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 77 pg[9.8( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=77) [2] r=0 lpr=77 pi=[57,77)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:20 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 77 pg[9.18( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=77) [2] r=0 lpr=77 pi=[57,77)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:20 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.10 scrub starts
Jan 23 04:04:20 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.10 scrub ok
Jan 23 04:04:21 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 23 04:04:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e78 e78: 3 total, 3 up, 3 in
Jan 23 04:04:21 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 78 pg[9.18( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=78) [2]/[0] r=-1 lpr=78 pi=[57,78)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:21 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 78 pg[9.18( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=78) [2]/[0] r=-1 lpr=78 pi=[57,78)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:21 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 78 pg[9.8( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=78) [2]/[0] r=-1 lpr=78 pi=[57,78)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:21 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 78 pg[9.8( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=78) [2]/[0] r=-1 lpr=78 pi=[57,78)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:21.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e78 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:21.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e79 e79: 3 total, 3 up, 3 in
Jan 23 04:04:22 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 7.16 deep-scrub starts
Jan 23 04:04:22 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 7.16 deep-scrub ok
Jan 23 04:04:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e80 e80: 3 total, 3 up, 3 in
Jan 23 04:04:23 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 80 pg[9.18( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=78/57 les/c/f=79/59/0 sis=80) [2] r=0 lpr=80 pi=[57,80)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:23 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 80 pg[9.18( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=78/57 les/c/f=79/59/0 sis=80) [2] r=0 lpr=80 pi=[57,80)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:23 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 80 pg[9.8( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=78/57 les/c/f=79/59/0 sis=80) [2] r=0 lpr=80 pi=[57,80)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:23 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 80 pg[9.8( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=78/57 les/c/f=79/59/0 sis=80) [2] r=0 lpr=80 pi=[57,80)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:23.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:23.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e81 e81: 3 total, 3 up, 3 in
Jan 23 04:04:24 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 81 pg[9.18( v 53'1142 (0'0,53'1142] local-lis/les=80/81 n=5 ec=57/47 lis/c=78/57 les/c/f=79/59/0 sis=80) [2] r=0 lpr=80 pi=[57,80)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:24 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 81 pg[9.8( v 53'1142 (0'0,53'1142] local-lis/les=80/81 n=6 ec=57/47 lis/c=78/57 les/c/f=79/59/0 sis=80) [2] r=0 lpr=80 pi=[57,80)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000037s ======
Jan 23 04:04:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:25.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000037s
Jan 23 04:04:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:25.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:26 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Jan 23 04:04:26 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Jan 23 04:04:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e81 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000037s ======
Jan 23 04:04:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:27.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000037s
Jan 23 04:04:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:27.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 23 04:04:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e82 e82: 3 total, 3 up, 3 in
Jan 23 04:04:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:29.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:29 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 23 04:04:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:29.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:30 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.b deep-scrub starts
Jan 23 04:04:30 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.b deep-scrub ok
Jan 23 04:04:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 23 04:04:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e83 e83: 3 total, 3 up, 3 in
Jan 23 04:04:30 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 83 pg[9.9( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=82) [2] r=0 lpr=83 pi=[57,82)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:30 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 83 pg[9.19( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=82) [2] r=0 lpr=83 pi=[57,82)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:31.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:31 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.13 scrub starts
Jan 23 04:04:31 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.13 scrub ok
Jan 23 04:04:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:31 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 23 04:04:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e84 e84: 3 total, 3 up, 3 in
Jan 23 04:04:31 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 84 pg[9.9( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=84) [2]/[0] r=-1 lpr=84 pi=[57,84)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:31 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 84 pg[9.9( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=84) [2]/[0] r=-1 lpr=84 pi=[57,84)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:31 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 84 pg[9.19( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=84) [2]/[0] r=-1 lpr=84 pi=[57,84)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:31 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 84 pg[9.19( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=57/57 les/c/f=59/59/0 sis=84) [2]/[0] r=-1 lpr=84 pi=[57,84)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:31.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e85 e85: 3 total, 3 up, 3 in
Jan 23 04:04:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:33.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e86 e86: 3 total, 3 up, 3 in
Jan 23 04:04:33 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 86 pg[9.9( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=84/57 les/c/f=85/59/0 sis=86) [2] r=0 lpr=86 pi=[57,86)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:33 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 86 pg[9.9( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=6 ec=57/47 lis/c=84/57 les/c/f=85/59/0 sis=86) [2] r=0 lpr=86 pi=[57,86)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:33 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 86 pg[9.19( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=84/57 les/c/f=85/59/0 sis=86) [2] r=0 lpr=86 pi=[57,86)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:33 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 86 pg[9.19( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=84/57 les/c/f=85/59/0 sis=86) [2] r=0 lpr=86 pi=[57,86)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000036s ======
Jan 23 04:04:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:33.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000036s
Jan 23 04:04:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e87 e87: 3 total, 3 up, 3 in
Jan 23 04:04:34 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 87 pg[9.9( v 53'1142 (0'0,53'1142] local-lis/les=86/87 n=6 ec=57/47 lis/c=84/57 les/c/f=85/59/0 sis=86) [2] r=0 lpr=86 pi=[57,86)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:34 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 87 pg[9.19( v 53'1142 (0'0,53'1142] local-lis/les=86/87 n=5 ec=57/47 lis/c=84/57 les/c/f=85/59/0 sis=86) [2] r=0 lpr=86 pi=[57,86)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:35.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:35 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Jan 23 04:04:35 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Jan 23 04:04:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000037s ======
Jan 23 04:04:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:35.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000037s
Jan 23 04:04:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000036s ======
Jan 23 04:04:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:37.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000036s
Jan 23 04:04:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:37.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:38 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 23 04:04:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e88 e88: 3 total, 3 up, 3 in
Jan 23 04:04:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:39.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:39 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 7.11 deep-scrub starts
Jan 23 04:04:39 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 7.11 deep-scrub ok
Jan 23 04:04:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:39.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 23 04:04:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 23 04:04:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:41.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e89 e89: 3 total, 3 up, 3 in
Jan 23 04:04:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:41.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e90 e90: 3 total, 3 up, 3 in
Jan 23 04:04:42 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 90 pg[9.d( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=6 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=90 pruub=10.188756943s) [1] r=-1 lpr=90 pi=[72,90)/1 crt=53'1142 mlcod 0'0 active pruub 143.607788086s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:42 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 90 pg[9.d( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=6 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=90 pruub=10.188677788s) [1] r=-1 lpr=90 pi=[72,90)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 143.607788086s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:42 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 90 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=5 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=90 pruub=10.184926033s) [1] r=-1 lpr=90 pi=[72,90)/1 crt=53'1142 mlcod 0'0 active pruub 143.604156494s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:42 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 90 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=5 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=90 pruub=10.184857368s) [1] r=-1 lpr=90 pi=[72,90)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 143.604156494s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 23 04:04:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 23 04:04:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:43.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:43 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 7.1d scrub starts
Jan 23 04:04:43 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 7.1d scrub ok
Jan 23 04:04:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000037s ======
Jan 23 04:04:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:43.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000037s
Jan 23 04:04:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e91 e91: 3 total, 3 up, 3 in
Jan 23 04:04:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 91 pg[9.d( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=6 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=91) [1]/[2] r=0 lpr=91 pi=[72,91)/1 crt=53'1142 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 91 pg[9.d( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=6 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=91) [1]/[2] r=0 lpr=91 pi=[72,91)/1 crt=53'1142 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 91 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=5 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=91) [1]/[2] r=0 lpr=91 pi=[72,91)/1 crt=53'1142 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 91 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=5 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=91) [1]/[2] r=0 lpr=91 pi=[72,91)/1 crt=53'1142 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 23 04:04:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 23 04:04:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e92 e92: 3 total, 3 up, 3 in
Jan 23 04:04:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 23 04:04:45 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 92 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=91/92 n=5 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[72,91)/1 crt=53'1142 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:45 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 92 pg[9.d( v 53'1142 (0'0,53'1142] local-lis/les=91/92 n=6 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=91) [1]/[2] async=[1] r=0 lpr=91 pi=[72,91)/1 crt=53'1142 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:45 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.1b scrub starts
Jan 23 04:04:45 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.1b scrub ok
Jan 23 04:04:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:45.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:45.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 23 04:04:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e93 e93: 3 total, 3 up, 3 in
Jan 23 04:04:46 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 93 pg[9.d( v 53'1142 (0'0,53'1142] local-lis/les=91/92 n=6 ec=57/47 lis/c=91/72 les/c/f=92/73/0 sis=93 pruub=14.984744072s) [1] async=[1] r=-1 lpr=93 pi=[72,93)/1 crt=53'1142 mlcod 53'1142 active pruub 151.814941406s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:46 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 93 pg[9.d( v 53'1142 (0'0,53'1142] local-lis/les=91/92 n=6 ec=57/47 lis/c=91/72 les/c/f=92/73/0 sis=93 pruub=14.984427452s) [1] r=-1 lpr=93 pi=[72,93)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 151.814941406s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:46 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 93 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=5 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=93 pruub=15.431636810s) [1] r=-1 lpr=93 pi=[67,93)/1 crt=53'1142 mlcod 0'0 active pruub 152.262237549s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:46 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 93 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=5 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=93 pruub=15.431509018s) [1] r=-1 lpr=93 pi=[67,93)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 152.262237549s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:46 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 93 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=91/92 n=5 ec=57/47 lis/c=91/72 les/c/f=92/73/0 sis=93 pruub=14.984263420s) [1] async=[1] r=-1 lpr=93 pi=[72,93)/1 crt=53'1142 mlcod 53'1142 active pruub 151.814865112s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:46 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 93 pg[9.f( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=6 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=93 pruub=15.431105614s) [1] r=-1 lpr=93 pi=[67,93)/1 crt=53'1142 mlcod 0'0 active pruub 152.262298584s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:46 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 93 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=91/92 n=5 ec=57/47 lis/c=91/72 les/c/f=92/73/0 sis=93 pruub=14.983700752s) [1] r=-1 lpr=93 pi=[72,93)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 151.814865112s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:46 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 93 pg[9.f( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=6 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=93 pruub=15.431074142s) [1] r=-1 lpr=93 pi=[67,93)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 152.262298584s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:46 np0005593234 systemd-logind[794]: New session 33 of user zuul.
Jan 23 04:04:46 np0005593234 systemd[1]: Started Session 33 of User zuul.
Jan 23 04:04:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:47 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 23 04:04:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e94 e94: 3 total, 3 up, 3 in
Jan 23 04:04:47 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 94 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=5 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=94) [1]/[2] r=0 lpr=94 pi=[67,94)/1 crt=53'1142 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:47 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 94 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=5 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=94) [1]/[2] r=0 lpr=94 pi=[67,94)/1 crt=53'1142 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:47 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 94 pg[9.f( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=6 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=94) [1]/[2] r=0 lpr=94 pi=[67,94)/1 crt=53'1142 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:47 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 94 pg[9.f( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=6 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=94) [1]/[2] r=0 lpr=94 pi=[67,94)/1 crt=53'1142 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:04:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000036s ======
Jan 23 04:04:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:47.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000036s
Jan 23 04:04:47 np0005593234 python3.9[87277]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:04:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:47.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e95 e95: 3 total, 3 up, 3 in
Jan 23 04:04:48 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 95 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=94/95 n=5 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=94) [1]/[2] async=[1] r=0 lpr=94 pi=[67,94)/1 crt=53'1142 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:48 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 95 pg[9.f( v 53'1142 (0'0,53'1142] local-lis/les=94/95 n=6 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=94) [1]/[2] async=[1] r=0 lpr=94 pi=[67,94)/1 crt=53'1142 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:04:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e96 e96: 3 total, 3 up, 3 in
Jan 23 04:04:49 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 96 pg[9.f( v 53'1142 (0'0,53'1142] local-lis/les=94/95 n=6 ec=57/47 lis/c=94/67 les/c/f=95/68/0 sis=96 pruub=15.426559448s) [1] async=[1] r=-1 lpr=96 pi=[67,96)/1 crt=53'1142 mlcod 53'1142 active pruub 155.096466064s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:49 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 96 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=94/95 n=5 ec=57/47 lis/c=94/67 les/c/f=95/68/0 sis=96 pruub=15.426523209s) [1] async=[1] r=-1 lpr=96 pi=[67,96)/1 crt=53'1142 mlcod 53'1142 active pruub 155.096191406s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:04:49 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 96 pg[9.1f( v 53'1142 (0'0,53'1142] local-lis/les=94/95 n=5 ec=57/47 lis/c=94/67 les/c/f=95/68/0 sis=96 pruub=15.426084518s) [1] r=-1 lpr=96 pi=[67,96)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 155.096191406s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:49 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 96 pg[9.f( v 53'1142 (0'0,53'1142] local-lis/les=94/95 n=6 ec=57/47 lis/c=94/67 les/c/f=95/68/0 sis=96 pruub=15.425621033s) [1] r=-1 lpr=96 pi=[67,96)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 155.096466064s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:04:49 np0005593234 python3.9[87492]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:04:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:49.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e97 e97: 3 total, 3 up, 3 in
Jan 23 04:04:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:04:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:49.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:04:50 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Jan 23 04:04:50 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Jan 23 04:04:51 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Jan 23 04:04:51 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Jan 23 04:04:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:51.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:51.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e98 e98: 3 total, 3 up, 3 in
Jan 23 04:04:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 23 04:04:52 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Jan 23 04:04:52 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Jan 23 04:04:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 23 04:04:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:04:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:53.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:04:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e99 e99: 3 total, 3 up, 3 in
Jan 23 04:04:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:53.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e100 e100: 3 total, 3 up, 3 in
Jan 23 04:04:55 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 6.17 deep-scrub starts
Jan 23 04:04:55 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 6.17 deep-scrub ok
Jan 23 04:04:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:55.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:55.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:56 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.14 scrub starts
Jan 23 04:04:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:57.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:57.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:59 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.14 scrub ok
Jan 23 04:04:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:04:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 23 04:04:59 np0005593234 podman[87690]: 2026-01-23 09:04:59.454195954 +0000 UTC m=+2.087638445 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.license=GPLv2)
Jan 23 04:04:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:04:59.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:04:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:04:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:04:59.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:04:59 np0005593234 podman[87759]: 2026-01-23 09:04:59.936782852 +0000 UTC m=+0.138689650 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:05:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e101 e101: 3 total, 3 up, 3 in
Jan 23 04:05:00 np0005593234 podman[87690]: 2026-01-23 09:05:00.335828019 +0000 UTC m=+2.969270460 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:05:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Jan 23 04:05:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 23 04:05:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:01.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e102 e102: 3 total, 3 up, 3 in
Jan 23 04:05:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:01.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:02 np0005593234 podman[87892]: 2026-01-23 09:05:02.023149086 +0000 UTC m=+1.179682739 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:05:02 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 6.12 deep-scrub starts
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:02 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 6.12 deep-scrub ok
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:05:02.351356) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159102351521, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 6890, "num_deletes": 256, "total_data_size": 12228966, "memory_usage": 12466640, "flush_reason": "Manual Compaction"}
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Jan 23 04:05:02 np0005593234 podman[87914]: 2026-01-23 09:05:02.366160468 +0000 UTC m=+0.313280567 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159102402960, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 7285570, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 257, "largest_seqno": 6895, "table_properties": {"data_size": 7259223, "index_size": 17016, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 8261, "raw_key_size": 76804, "raw_average_key_size": 23, "raw_value_size": 7195772, "raw_average_value_size": 2191, "num_data_blocks": 757, "num_entries": 3284, "num_filter_entries": 3284, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 1769158914, "file_creation_time": 1769159102, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 51718 microseconds, and 25525 cpu microseconds.
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:05:02.403093) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 7285570 bytes OK
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:05:02.403118) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:05:02.405821) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:05:02.405846) EVENT_LOG_v1 {"time_micros": 1769159102405839, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:05:02.405869) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 12192624, prev total WAL file size 12192624, number of live WAL files 2.
Jan 23 04:05:02 np0005593234 podman[87892]: 2026-01-23 09:05:02.406883476 +0000 UTC m=+1.563417109 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:05:02.408186) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(7114KB) 8(1648B)]
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159102408442, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 7287218, "oldest_snapshot_seqno": -1}
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e103 e103: 3 total, 3 up, 3 in
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3031 keys, 7281747 bytes, temperature: kUnknown
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159102695168, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 7281747, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7256110, "index_size": 16951, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7621, "raw_key_size": 72588, "raw_average_key_size": 23, "raw_value_size": 7195830, "raw_average_value_size": 2374, "num_data_blocks": 756, "num_entries": 3031, "num_filter_entries": 3031, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769159102, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:05:02.695508) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 7281747 bytes
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:05:02.700002) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 25.4 rd, 25.4 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(6.9, 0.0 +0.0 blob) out(6.9 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3289, records dropped: 258 output_compression: NoCompression
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:05:02.700030) EVENT_LOG_v1 {"time_micros": 1769159102700018, "job": 4, "event": "compaction_finished", "compaction_time_micros": 286844, "compaction_time_cpu_micros": 32492, "output_level": 6, "num_output_files": 1, "total_output_size": 7281747, "num_input_records": 3289, "num_output_records": 3031, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159102701675, "job": 4, "event": "table_file_deletion", "file_number": 14}
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159102701813, "job": 4, "event": "table_file_deletion", "file_number": 8}
Jan 23 04:05:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:05:02.408078) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:05:02 np0005593234 podman[87962]: 2026-01-23 09:05:02.724048713 +0000 UTC m=+0.058638937 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Jan 23 04:05:02 np0005593234 podman[87962]: 2026-01-23 09:05:02.737987537 +0000 UTC m=+0.072577731 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, name=keepalived, io.openshift.expose-services=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=2.2.4, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1793, distribution-scope=public, vcs-type=git, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc.)
Jan 23 04:05:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:05:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:03.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:05:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:05:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:03.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:05:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Jan 23 04:05:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e103 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e104 e104: 3 total, 3 up, 3 in
Jan 23 04:05:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:05 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 6.1 scrub starts
Jan 23 04:05:05 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 6.1 scrub ok
Jan 23 04:05:05 np0005593234 systemd[1]: session-33.scope: Deactivated successfully.
Jan 23 04:05:05 np0005593234 systemd[1]: session-33.scope: Consumed 8.728s CPU time.
Jan 23 04:05:05 np0005593234 systemd-logind[794]: Session 33 logged out. Waiting for processes to exit.
Jan 23 04:05:05 np0005593234 systemd-logind[794]: Removed session 33.
Jan 23 04:05:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:05:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:05.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:05:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:05:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:05.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:05:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e105 e105: 3 total, 3 up, 3 in
Jan 23 04:05:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:05:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:05:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e106 e106: 3 total, 3 up, 3 in
Jan 23 04:05:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:07.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:07.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:08 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 6.1b scrub starts
Jan 23 04:05:08 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 6.1b scrub ok
Jan 23 04:05:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 23 04:05:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e107 e107: 3 total, 3 up, 3 in
Jan 23 04:05:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:05:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:09.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:05:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:09.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e108 e108: 3 total, 3 up, 3 in
Jan 23 04:05:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Jan 23 04:05:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 23 04:05:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Jan 23 04:05:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:05:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:11.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:05:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:11.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:12 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.19 scrub starts
Jan 23 04:05:12 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.19 scrub ok
Jan 23 04:05:12 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:12 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:05:12 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 23 04:05:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e109 e109: 3 total, 3 up, 3 in
Jan 23 04:05:13 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 109 pg[9.15( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=5 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=109 pruub=11.862911224s) [1] r=-1 lpr=109 pi=[72,109)/1 crt=53'1142 mlcod 0'0 active pruub 175.605148315s@ mbc={}] start_peering_interval up [2] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:13 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 109 pg[9.15( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=5 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=109 pruub=11.862830162s) [1] r=-1 lpr=109 pi=[72,109)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 175.605148315s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:05:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:13.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:05:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e110 e110: 3 total, 3 up, 3 in
Jan 23 04:05:13 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 110 pg[9.15( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=5 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=110) [1]/[2] r=0 lpr=110 pi=[72,110)/1 crt=53'1142 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [1] -> [1], acting [1] -> [2], acting_primary 1 -> 2, up_primary 1 -> 1, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:13 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 110 pg[9.15( v 53'1142 (0'0,53'1142] local-lis/les=72/73 n=5 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=110) [1]/[2] r=0 lpr=110 pi=[72,110)/1 crt=53'1142 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Jan 23 04:05:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:13.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:14 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 23 04:05:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e111 e111: 3 total, 3 up, 3 in
Jan 23 04:05:15 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 111 pg[9.16( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=74/74 les/c/f=75/75/0 sis=111) [2] r=0 lpr=111 pi=[74,111)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:15 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 111 pg[9.15( v 53'1142 (0'0,53'1142] local-lis/les=110/111 n=5 ec=57/47 lis/c=72/72 les/c/f=73/73/0 sis=110) [1]/[2] async=[1] r=0 lpr=110 pi=[72,110)/1 crt=53'1142 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:05:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:05:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:15.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:05:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:05:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:15.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:05:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Jan 23 04:05:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e112 e112: 3 total, 3 up, 3 in
Jan 23 04:05:16 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 112 pg[9.15( v 53'1142 (0'0,53'1142] local-lis/les=110/111 n=5 ec=57/47 lis/c=110/72 les/c/f=111/73/0 sis=112 pruub=14.846530914s) [1] async=[1] r=-1 lpr=112 pi=[72,112)/1 crt=53'1142 mlcod 53'1142 active pruub 181.922927856s@ mbc={255={}}] start_peering_interval up [1] -> [1], acting [2] -> [1], acting_primary 2 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:16 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 112 pg[9.15( v 53'1142 (0'0,53'1142] local-lis/les=110/111 n=5 ec=57/47 lis/c=110/72 les/c/f=111/73/0 sis=112 pruub=14.846430779s) [1] r=-1 lpr=112 pi=[72,112)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 181.922927856s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:16 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 112 pg[9.16( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=74/74 les/c/f=75/75/0 sis=112) [2]/[1] r=-1 lpr=112 pi=[74,112)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:16 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 112 pg[9.16( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=74/74 les/c/f=75/75/0 sis=112) [2]/[1] r=-1 lpr=112 pi=[74,112)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:16 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Jan 23 04:05:16 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Jan 23 04:05:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e113 e113: 3 total, 3 up, 3 in
Jan 23 04:05:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:17.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:17.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:17 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 6.1c deep-scrub starts
Jan 23 04:05:17 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 6.1c deep-scrub ok
Jan 23 04:05:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e114 e114: 3 total, 3 up, 3 in
Jan 23 04:05:18 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 114 pg[9.16( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=112/74 les/c/f=113/75/0 sis=114) [2] r=0 lpr=114 pi=[74,114)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:18 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 114 pg[9.16( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=112/74 les/c/f=113/75/0 sis=114) [2] r=0 lpr=114 pi=[74,114)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:18 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.1f scrub starts
Jan 23 04:05:18 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.1f scrub ok
Jan 23 04:05:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e115 e115: 3 total, 3 up, 3 in
Jan 23 04:05:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:19.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:19 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 115 pg[9.16( v 53'1142 (0'0,53'1142] local-lis/les=114/115 n=5 ec=57/47 lis/c=112/74 les/c/f=113/75/0 sis=114) [2] r=0 lpr=114 pi=[74,114)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:05:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:19.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:19 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.6 scrub starts
Jan 23 04:05:19 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.6 scrub ok
Jan 23 04:05:20 np0005593234 systemd-logind[794]: New session 34 of user zuul.
Jan 23 04:05:20 np0005593234 systemd[1]: Started Session 34 of User zuul.
Jan 23 04:05:20 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.1d scrub starts
Jan 23 04:05:20 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.1d scrub ok
Jan 23 04:05:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:21.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:21 np0005593234 python3.9[88422]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 23 04:05:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:05:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:21.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:05:21 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 6.1e scrub starts
Jan 23 04:05:22 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 6.1e scrub ok
Jan 23 04:05:22 np0005593234 systemd[72548]: Created slice User Background Tasks Slice.
Jan 23 04:05:22 np0005593234 systemd[72548]: Starting Cleanup of User's Temporary Files and Directories...
Jan 23 04:05:22 np0005593234 systemd[72548]: Finished Cleanup of User's Temporary Files and Directories.
Jan 23 04:05:23 np0005593234 python3.9[88598]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:05:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:23.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:23.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:23 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.1c scrub starts
Jan 23 04:05:23 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.1c scrub ok
Jan 23 04:05:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:25.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:25.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:25 np0005593234 python3.9[88755]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:05:26 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Jan 23 04:05:26 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Jan 23 04:05:26 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 23 04:05:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e116 e116: 3 total, 3 up, 3 in
Jan 23 04:05:27 np0005593234 python3.9[88909]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:05:27 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Jan 23 04:05:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:27.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:27.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:28 np0005593234 python3.9[89063]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:05:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 23 04:05:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e117 e117: 3 total, 3 up, 3 in
Jan 23 04:05:29 np0005593234 python3.9[89216]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:05:29 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Jan 23 04:05:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:29.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:29.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:30 np0005593234 python3.9[89366]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:05:30 np0005593234 network[89383]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:05:30 np0005593234 network[89384]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:05:30 np0005593234 network[89385]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:05:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 23 04:05:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e118 e118: 3 total, 3 up, 3 in
Jan 23 04:05:30 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 118 pg[9.19( v 53'1142 (0'0,53'1142] local-lis/les=86/87 n=5 ec=57/47 lis/c=86/86 les/c/f=87/87/0 sis=118 pruub=8.469306946s) [0] r=-1 lpr=118 pi=[86,118)/1 crt=53'1142 mlcod 0'0 active pruub 189.404342651s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:30 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 118 pg[9.19( v 53'1142 (0'0,53'1142] local-lis/les=86/87 n=5 ec=57/47 lis/c=86/86 les/c/f=87/87/0 sis=118 pruub=8.469233513s) [0] r=-1 lpr=118 pi=[86,118)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 189.404342651s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:31 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.3 scrub starts
Jan 23 04:05:31 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.3 scrub ok
Jan 23 04:05:31 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Jan 23 04:05:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e119 e119: 3 total, 3 up, 3 in
Jan 23 04:05:31 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 119 pg[9.19( v 53'1142 (0'0,53'1142] local-lis/les=86/87 n=5 ec=57/47 lis/c=86/86 les/c/f=87/87/0 sis=119) [0]/[2] r=0 lpr=119 pi=[86,119)/1 crt=53'1142 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:31 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 119 pg[9.19( v 53'1142 (0'0,53'1142] local-lis/les=86/87 n=5 ec=57/47 lis/c=86/86 les/c/f=87/87/0 sis=119) [0]/[2] r=0 lpr=119 pi=[86,119)/1 crt=53'1142 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:05:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:31.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:05:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:05:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:31.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:05:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 23 04:05:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e120 e120: 3 total, 3 up, 3 in
Jan 23 04:05:32 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 120 pg[9.19( v 53'1142 (0'0,53'1142] local-lis/les=119/120 n=5 ec=57/47 lis/c=86/86 les/c/f=87/87/0 sis=119) [0]/[2] async=[0] r=0 lpr=119 pi=[86,119)/1 crt=53'1142 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:05:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 23 04:05:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e121 e121: 3 total, 3 up, 3 in
Jan 23 04:05:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:33.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:33 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 121 pg[9.19( v 53'1142 (0'0,53'1142] local-lis/les=119/120 n=5 ec=57/47 lis/c=119/86 les/c/f=120/87/0 sis=121 pruub=14.770365715s) [0] async=[0] r=-1 lpr=121 pi=[86,121)/1 crt=53'1142 mlcod 53'1142 active pruub 199.047668457s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:33 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 121 pg[9.19( v 53'1142 (0'0,53'1142] local-lis/les=119/120 n=5 ec=57/47 lis/c=119/86 les/c/f=120/87/0 sis=121 pruub=14.770031929s) [0] r=-1 lpr=121 pi=[86,121)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 199.047668457s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:33.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e122 e122: 3 total, 3 up, 3 in
Jan 23 04:05:34 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.1 scrub starts
Jan 23 04:05:34 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.1 scrub ok
Jan 23 04:05:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 23 04:05:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e123 e123: 3 total, 3 up, 3 in
Jan 23 04:05:34 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 123 pg[9.1b( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=5 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=123 pruub=14.720006943s) [0] r=-1 lpr=123 pi=[67,123)/1 crt=53'1142 mlcod 0'0 active pruub 200.263092041s@ mbc={}] start_peering_interval up [2] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:34 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 123 pg[9.1b( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=5 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=123 pruub=14.719384193s) [0] r=-1 lpr=123 pi=[67,123)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 200.263092041s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:35 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.4 scrub starts
Jan 23 04:05:35 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.4 scrub ok
Jan 23 04:05:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Jan 23 04:05:35 np0005593234 python3.9[89648]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:05:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:35.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e124 e124: 3 total, 3 up, 3 in
Jan 23 04:05:35 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 124 pg[9.1b( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=5 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=124) [0]/[2] r=0 lpr=124 pi=[67,124)/1 crt=53'1142 mlcod 0'0 remapped NOTIFY mbc={}] start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:35 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 124 pg[9.1b( v 53'1142 (0'0,53'1142] local-lis/les=67/68 n=5 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=124) [0]/[2] r=0 lpr=124 pi=[67,124)/1 crt=53'1142 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:35.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:36 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.10 scrub starts
Jan 23 04:05:36 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.10 scrub ok
Jan 23 04:05:36 np0005593234 python3.9[89798]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:05:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e125 e125: 3 total, 3 up, 3 in
Jan 23 04:05:36 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 23 04:05:36 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 125 pg[9.1b( v 53'1142 (0'0,53'1142] local-lis/les=124/125 n=5 ec=57/47 lis/c=67/67 les/c/f=68/68/0 sis=124) [0]/[2] async=[0] r=0 lpr=124 pi=[67,124)/1 crt=53'1142 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:05:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:05:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:37.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:05:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Jan 23 04:05:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e126 e126: 3 total, 3 up, 3 in
Jan 23 04:05:37 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 126 pg[9.1b( v 53'1142 (0'0,53'1142] local-lis/les=124/125 n=5 ec=57/47 lis/c=124/67 les/c/f=125/68/0 sis=126 pruub=14.985284805s) [0] async=[0] r=-1 lpr=126 pi=[67,126)/1 crt=53'1142 mlcod 53'1142 active pruub 203.591323853s@ mbc={255={}}] start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:37 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 126 pg[9.1b( v 53'1142 (0'0,53'1142] local-lis/les=124/125 n=5 ec=57/47 lis/c=124/67 les/c/f=125/68/0 sis=126 pruub=14.985191345s) [0] r=-1 lpr=126 pi=[67,126)/1 crt=53'1142 mlcod 0'0 unknown NOTIFY pruub 203.591323853s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:05:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:37.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:05:38 np0005593234 python3.9[90002]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:05:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 23 04:05:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e127 e127: 3 total, 3 up, 3 in
Jan 23 04:05:39 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.12 scrub starts
Jan 23 04:05:39 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.12 scrub ok
Jan 23 04:05:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:39.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e127 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:39 np0005593234 python3.9[90162]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:05:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:39.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:40 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 127 pg[9.1d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=93/93 les/c/f=94/94/0 sis=127) [2] r=0 lpr=127 pi=[93,127)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Jan 23 04:05:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e128 e128: 3 total, 3 up, 3 in
Jan 23 04:05:40 np0005593234 python3.9[90247]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:05:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 23 04:05:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Jan 23 04:05:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e129 e129: 3 total, 3 up, 3 in
Jan 23 04:05:41 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[1] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped mbc={}] start_peering_interval up [2] -> [2], acting [2] -> [1], acting_primary 2 -> 1, up_primary 2 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:41 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 129 pg[9.1d( empty local-lis/les=0/0 n=0 ec=57/47 lis/c=93/93 les/c/f=94/94/0 sis=129) [2]/[1] r=-1 lpr=129 pi=[93,129)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 04:05:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:05:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:41.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:05:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:05:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:41.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:05:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 04:05:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e130 e130: 3 total, 3 up, 3 in
Jan 23 04:05:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 04:05:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e131 e131: 3 total, 3 up, 3 in
Jan 23 04:05:43 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 131 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=129/93 les/c/f=130/94/0 sis=131) [2] r=0 lpr=131 pi=[93,131)/1 luod=0'0 crt=53'1142 mlcod 0'0 active mbc={}] start_peering_interval up [2] -> [2], acting [1] -> [2], acting_primary 1 -> 2, up_primary 2 -> 2, role -1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Jan 23 04:05:43 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 131 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=0/0 n=5 ec=57/47 lis/c=129/93 les/c/f=130/94/0 sis=131) [2] r=0 lpr=131 pi=[93,131)/1 crt=53'1142 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 04:05:43 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.11 scrub starts
Jan 23 04:05:43 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.11 scrub ok
Jan 23 04:05:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:05:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:43.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:05:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:43.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e132 e132: 3 total, 3 up, 3 in
Jan 23 04:05:44 np0005593234 ceph-osd[79769]: osd.2 pg_epoch: 132 pg[9.1d( v 53'1142 (0'0,53'1142] local-lis/les=131/132 n=5 ec=57/47 lis/c=129/93 les/c/f=130/94/0 sis=131) [2] r=0 lpr=131 pi=[93,131)/1 crt=53'1142 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 04:05:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e132 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e133 e133: 3 total, 3 up, 3 in
Jan 23 04:05:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:05:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:45.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:05:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:45.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 e134: 3 total, 3 up, 3 in
Jan 23 04:05:47 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.f scrub starts
Jan 23 04:05:47 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.f scrub ok
Jan 23 04:05:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:05:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:47.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:05:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:47.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:48 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.1e deep-scrub starts
Jan 23 04:05:48 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 10.1e deep-scrub ok
Jan 23 04:05:49 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.11 scrub starts
Jan 23 04:05:49 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.11 scrub ok
Jan 23 04:05:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:49.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:49.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:51 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.13 deep-scrub starts
Jan 23 04:05:51 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.13 deep-scrub ok
Jan 23 04:05:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:05:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:51.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:05:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:51.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:53 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.6 scrub starts
Jan 23 04:05:53 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.6 scrub ok
Jan 23 04:05:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:53.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:05:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:53.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:05:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:05:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:55.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:05:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:05:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:55.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:05:56 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.a scrub starts
Jan 23 04:05:56 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.a scrub ok
Jan 23 04:05:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:57.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:05:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:57.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:05:58 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.a scrub starts
Jan 23 04:05:58 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.a scrub ok
Jan 23 04:05:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:05:59.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:05:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:05:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:05:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:05:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:05:59.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:00 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.16 scrub starts
Jan 23 04:06:00 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.16 scrub ok
Jan 23 04:06:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:01.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:01.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:02 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.16 deep-scrub starts
Jan 23 04:06:02 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.16 deep-scrub ok
Jan 23 04:06:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:03.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:03.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:06:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:05.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:06:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:06:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:05.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:06:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:06:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:07.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:06:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:07.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:09.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:09.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:10 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.15 scrub starts
Jan 23 04:06:10 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.15 scrub ok
Jan 23 04:06:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:11.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:11 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.9 scrub starts
Jan 23 04:06:11 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.9 scrub ok
Jan 23 04:06:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:11.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:12 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.5 scrub starts
Jan 23 04:06:12 np0005593234 podman[90628]: 2026-01-23 09:06:12.609558255 +0000 UTC m=+0.072820861 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 23 04:06:12 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.5 scrub ok
Jan 23 04:06:12 np0005593234 podman[90628]: 2026-01-23 09:06:12.703104176 +0000 UTC m=+0.166366752 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:06:13 np0005593234 podman[90781]: 2026-01-23 09:06:13.398013637 +0000 UTC m=+0.159474029 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:06:13 np0005593234 podman[90781]: 2026-01-23 09:06:13.428608118 +0000 UTC m=+0.190068460 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:06:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:13.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:13 np0005593234 podman[90848]: 2026-01-23 09:06:13.643262825 +0000 UTC m=+0.052847184 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, distribution-scope=public, build-date=2023-02-22T09:23:20, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, com.redhat.component=keepalived-container, description=keepalived for Ceph)
Jan 23 04:06:13 np0005593234 podman[90848]: 2026-01-23 09:06:13.656911447 +0000 UTC m=+0.066495786 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, build-date=2023-02-22T09:23:20, release=1793, name=keepalived, description=keepalived for Ceph, io.buildah.version=1.28.2, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9)
Jan 23 04:06:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:06:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:14.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:06:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:15 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:06:15 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:06:15 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.f deep-scrub starts
Jan 23 04:06:15 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.f deep-scrub ok
Jan 23 04:06:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:06:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:15.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:06:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:16.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:06:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:06:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:06:16 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.8 scrub starts
Jan 23 04:06:16 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.8 scrub ok
Jan 23 04:06:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:06:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:06:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:17.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:06:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:18.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:06:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:06:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:19.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:06:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:06:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:20.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:06:20 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.b scrub starts
Jan 23 04:06:20 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.b scrub ok
Jan 23 04:06:21 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.e scrub starts
Jan 23 04:06:21 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.e scrub ok
Jan 23 04:06:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:21.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:22.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:06:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:06:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:23.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:24.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:25.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:06:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:26.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:06:26 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.d scrub starts
Jan 23 04:06:26 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.d scrub ok
Jan 23 04:06:27 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.3 scrub starts
Jan 23 04:06:27 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.3 scrub ok
Jan 23 04:06:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:06:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:27.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:06:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:28.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:28 np0005593234 python3.9[91272]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:06:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:29.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:30.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:30 np0005593234 python3.9[91560]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 23 04:06:31 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.3 scrub starts
Jan 23 04:06:31 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.3 scrub ok
Jan 23 04:06:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:31.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:31 np0005593234 python3.9[91712]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 23 04:06:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:32.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:32 np0005593234 python3.9[91865]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:06:33 np0005593234 python3.9[92017]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 23 04:06:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:33.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:34.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:34 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.2 deep-scrub starts
Jan 23 04:06:34 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.2 deep-scrub ok
Jan 23 04:06:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:35 np0005593234 python3.9[92170]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:06:35 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.c deep-scrub starts
Jan 23 04:06:35 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.c deep-scrub ok
Jan 23 04:06:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:35.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:35 np0005593234 python3.9[92322]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:06:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:36.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:36 np0005593234 python3.9[92400]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:06:37 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.1c scrub starts
Jan 23 04:06:37 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.1c scrub ok
Jan 23 04:06:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:37.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:37 np0005593234 python3.9[92553]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:06:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:38.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:38 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.1f scrub starts
Jan 23 04:06:38 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 8.1f scrub ok
Jan 23 04:06:39 np0005593234 python3.9[92758]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 23 04:06:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:39.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:40.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:40 np0005593234 python3.9[92912]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 23 04:06:40 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.17 scrub starts
Jan 23 04:06:40 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.17 scrub ok
Jan 23 04:06:41 np0005593234 python3.9[93065]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 04:06:41 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.19 scrub starts
Jan 23 04:06:41 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 11.19 scrub ok
Jan 23 04:06:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:41.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:42.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:42 np0005593234 python3.9[93218]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 23 04:06:42 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.17 scrub starts
Jan 23 04:06:42 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.17 scrub ok
Jan 23 04:06:43 np0005593234 python3.9[93370]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:06:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:43.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000060s ======
Jan 23 04:06:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:44.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000060s
Jan 23 04:06:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:45.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:46.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:46 np0005593234 python3.9[93524]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:06:46 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.13 scrub starts
Jan 23 04:06:46 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.13 scrub ok
Jan 23 04:06:47 np0005593234 python3.9[93677]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:06:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:06:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:47.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:06:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:48.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:49 np0005593234 python3.9[93757]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:06:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:49.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:50 np0005593234 python3.9[93909]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:06:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:50.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:50 np0005593234 python3.9[93988]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:06:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:51.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:51 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.b scrub starts
Jan 23 04:06:51 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.b scrub ok
Jan 23 04:06:52 np0005593234 python3.9[94140]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:06:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:52.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:53 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.7 scrub starts
Jan 23 04:06:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:53.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:53 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.7 scrub ok
Jan 23 04:06:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:54.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:06:55 np0005593234 python3.9[94293]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:06:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:55.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:56 np0005593234 python3.9[94445]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 23 04:06:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:56.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:57 np0005593234 python3.9[94596]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:06:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:57.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:58 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.3 scrub starts
Jan 23 04:06:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:06:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:06:58.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:06:58 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.3 scrub ok
Jan 23 04:06:59 np0005593234 python3.9[94799]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:06:59 np0005593234 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 23 04:06:59 np0005593234 systemd[1]: tuned.service: Deactivated successfully.
Jan 23 04:06:59 np0005593234 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 23 04:06:59 np0005593234 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 04:06:59 np0005593234 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 04:06:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:06:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:06:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:06:59.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:06:59 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.5 scrub starts
Jan 23 04:06:59 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.5 scrub ok
Jan 23 04:06:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:00.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:00 np0005593234 python3.9[94962]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 23 04:07:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:01.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:01 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.8 scrub starts
Jan 23 04:07:01 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.8 scrub ok
Jan 23 04:07:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:02.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:03.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:04.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:05.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:05 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.18 scrub starts
Jan 23 04:07:05 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.18 scrub ok
Jan 23 04:07:06 np0005593234 python3.9[95116]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:07:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:06.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:06 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.9 scrub starts
Jan 23 04:07:06 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.9 scrub ok
Jan 23 04:07:06 np0005593234 python3.9[95271]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:07:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:07.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:07 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.16 deep-scrub starts
Jan 23 04:07:07 np0005593234 systemd[1]: session-34.scope: Deactivated successfully.
Jan 23 04:07:07 np0005593234 systemd[1]: session-34.scope: Consumed 1min 9.988s CPU time.
Jan 23 04:07:07 np0005593234 systemd-logind[794]: Session 34 logged out. Waiting for processes to exit.
Jan 23 04:07:07 np0005593234 systemd-logind[794]: Removed session 34.
Jan 23 04:07:07 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.16 deep-scrub ok
Jan 23 04:07:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:08.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:09.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:09 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.1d scrub starts
Jan 23 04:07:09 np0005593234 ceph-osd[79769]: log_channel(cluster) log [DBG] : 9.1d scrub ok
Jan 23 04:07:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:10.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:11.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:12.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:13.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:13 np0005593234 systemd-logind[794]: New session 35 of user zuul.
Jan 23 04:07:13 np0005593234 systemd[1]: Started Session 35 of User zuul.
Jan 23 04:07:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:14.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:15 np0005593234 python3.9[95455]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:07:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:15.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:16.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:17 np0005593234 python3.9[95612]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 23 04:07:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:17.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:18 np0005593234 python3.9[95766]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:07:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:07:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:18.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:07:19 np0005593234 python3.9[95900]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 04:07:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:19.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:20.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:21.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:22 np0005593234 python3.9[96054]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:07:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:22.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:23 np0005593234 podman[96230]: 2026-01-23 09:07:23.617127252 +0000 UTC m=+0.069360159 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=reef, org.label-schema.build-date=20250507, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:07:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:23.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:23 np0005593234 podman[96230]: 2026-01-23 09:07:23.753284972 +0000 UTC m=+0.205517889 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef)
Jan 23 04:07:24 np0005593234 podman[96461]: 2026-01-23 09:07:24.468984613 +0000 UTC m=+0.117401375 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:07:24 np0005593234 podman[96461]: 2026-01-23 09:07:24.505173657 +0000 UTC m=+0.153590399 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:07:24 np0005593234 podman[96550]: 2026-01-23 09:07:24.742707726 +0000 UTC m=+0.059852748 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, distribution-scope=public, name=keepalived, vcs-type=git, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793)
Jan 23 04:07:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 04:07:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:24.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 04:07:24 np0005593234 podman[96550]: 2026-01-23 09:07:24.758041231 +0000 UTC m=+0.075186233 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, release=1793, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, architecture=x86_64, name=keepalived, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=)
Jan 23 04:07:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:25 np0005593234 python3.9[96648]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:07:25 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:07:25 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:07:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:25.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:26 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:07:26 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:07:26 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:07:26 np0005593234 python3.9[96921]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:07:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:26.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:27.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:27 np0005593234 python3.9[97073]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 23 04:07:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:28.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:29 np0005593234 python3.9[97224]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:07:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:29.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:30 np0005593234 python3.9[97382]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:07:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:30.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:31.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:32.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:32 np0005593234 python3.9[97589]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:07:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:07:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:07:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:33.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:34.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:34 np0005593234 python3.9[97877]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 23 04:07:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:35.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:36 np0005593234 python3.9[98029]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:07:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:36.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:36 np0005593234 python3.9[98184]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:07:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:07:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:37.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:07:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:38.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:39 np0005593234 python3.9[98389]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:07:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:07:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:39.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:07:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:40.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:41.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:42 np0005593234 python3.9[98543]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:07:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:42.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:43 np0005593234 python3.9[98698]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Jan 23 04:07:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:43.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:44 np0005593234 systemd[1]: session-35.scope: Deactivated successfully.
Jan 23 04:07:44 np0005593234 systemd[1]: session-35.scope: Consumed 19.570s CPU time.
Jan 23 04:07:44 np0005593234 systemd-logind[794]: Session 35 logged out. Waiting for processes to exit.
Jan 23 04:07:44 np0005593234 systemd-logind[794]: Removed session 35.
Jan 23 04:07:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:44.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:07:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:45.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:07:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:07:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:46.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.744761) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159267744828, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 2819, "num_deletes": 251, "total_data_size": 5711397, "memory_usage": 5779984, "flush_reason": "Manual Compaction"}
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Jan 23 04:07:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:07:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:47.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159267785119, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 3706461, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6900, "largest_seqno": 9714, "table_properties": {"data_size": 3695135, "index_size": 6860, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 29501, "raw_average_key_size": 21, "raw_value_size": 3669857, "raw_average_value_size": 2732, "num_data_blocks": 302, "num_entries": 1343, "num_filter_entries": 1343, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159102, "oldest_key_time": 1769159102, "file_creation_time": 1769159267, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 40437 microseconds, and 10044 cpu microseconds.
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.785204) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 3706461 bytes OK
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.785225) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.787665) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.787681) EVENT_LOG_v1 {"time_micros": 1769159267787676, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.787703) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 5698121, prev total WAL file size 5698121, number of live WAL files 2.
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.789082) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(3619KB)], [15(7111KB)]
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159267789175, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 10988208, "oldest_snapshot_seqno": -1}
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3853 keys, 9441761 bytes, temperature: kUnknown
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159267881730, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 9441761, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9410314, "index_size": 20713, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9669, "raw_key_size": 92913, "raw_average_key_size": 24, "raw_value_size": 9335072, "raw_average_value_size": 2422, "num_data_blocks": 905, "num_entries": 3853, "num_filter_entries": 3853, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769159267, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.882073) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 9441761 bytes
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.883617) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 118.6 rd, 101.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 6.9 +0.0 blob) out(9.0 +0.0 blob), read-write-amplify(5.5) write-amplify(2.5) OK, records in: 4374, records dropped: 521 output_compression: NoCompression
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.883644) EVENT_LOG_v1 {"time_micros": 1769159267883632, "job": 6, "event": "compaction_finished", "compaction_time_micros": 92670, "compaction_time_cpu_micros": 25640, "output_level": 6, "num_output_files": 1, "total_output_size": 9441761, "num_input_records": 4374, "num_output_records": 3853, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159267884849, "job": 6, "event": "table_file_deletion", "file_number": 17}
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159267886401, "job": 6, "event": "table_file_deletion", "file_number": 15}
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.788973) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.886619) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.886626) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.886628) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.886630) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:07:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:07:47.886632) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:07:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:48.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:49.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:50.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:51.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:51 np0005593234 systemd-logind[794]: New session 36 of user zuul.
Jan 23 04:07:51 np0005593234 systemd[1]: Started Session 36 of User zuul.
Jan 23 04:07:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:52.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:52 np0005593234 python3.9[98881]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:07:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:53.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:54 np0005593234 python3.9[99035]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:07:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:07:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:54.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:07:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:07:55 np0005593234 python3.9[99229]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:07:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:55.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:56 np0005593234 systemd[1]: session-36.scope: Deactivated successfully.
Jan 23 04:07:56 np0005593234 systemd[1]: session-36.scope: Consumed 2.310s CPU time.
Jan 23 04:07:56 np0005593234 systemd-logind[794]: Session 36 logged out. Waiting for processes to exit.
Jan 23 04:07:56 np0005593234 systemd-logind[794]: Removed session 36.
Jan 23 04:07:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:56.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:57.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:07:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:07:58.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:07:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:07:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:07:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:07:59.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:08:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:00.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:08:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:01.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:08:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:02.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:02 np0005593234 systemd-logind[794]: New session 37 of user zuul.
Jan 23 04:08:03 np0005593234 systemd[1]: Started Session 37 of User zuul.
Jan 23 04:08:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:03.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:04 np0005593234 python3.9[99462]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:08:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:04.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:05 np0005593234 python3.9[99617]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:08:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:05.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:06 np0005593234 python3.9[99773]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:08:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:06.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:07 np0005593234 python3.9[99858]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:08:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:08:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:07.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:08:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:08:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:08.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:08:09 np0005593234 python3.9[100012]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:08:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:09.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:10.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:11 np0005593234 python3.9[100208]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:08:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:11.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:12 np0005593234 python3.9[100360]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:08:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:12.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:12 np0005593234 python3.9[100526]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:08:13 np0005593234 python3.9[100605]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:08:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:13.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:14 np0005593234 python3.9[100758]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:08:14 np0005593234 python3.9[100837]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:08:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:08:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:14.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:08:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:15 np0005593234 python3.9[100989]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:08:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:15.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:16 np0005593234 python3.9[101141]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:08:16 np0005593234 python3.9[101294]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:08:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:08:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:16.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:08:17 np0005593234 python3.9[101446]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:08:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:08:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:17.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:08:18 np0005593234 python3.9[101598]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:08:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:18.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:19.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:20.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:21 np0005593234 python3.9[101803]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:08:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:21.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:21 np0005593234 python3.9[101957]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:08:22 np0005593234 python3.9[102110]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:08:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:22.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:23 np0005593234 python3.9[102262]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:08:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:23.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:24.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:25.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:26 np0005593234 python3.9[102416]: ansible-service_facts Invoked
Jan 23 04:08:26 np0005593234 network[102434]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:08:26 np0005593234 network[102435]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:08:26 np0005593234 network[102436]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:08:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:26.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:27.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:28.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:08:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:29.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:08:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:30.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:31.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:32.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:33 np0005593234 python3.9[103004]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:08:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:08:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:33.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:08:34 np0005593234 podman[103064]: 2026-01-23 09:08:34.362840698 +0000 UTC m=+0.812783477 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, CEPH_REF=reef)
Jan 23 04:08:34 np0005593234 podman[103085]: 2026-01-23 09:08:34.584874731 +0000 UTC m=+0.114365947 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef)
Jan 23 04:08:34 np0005593234 podman[103064]: 2026-01-23 09:08:34.764161359 +0000 UTC m=+1.214104118 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:08:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:34.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:35.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:36 np0005593234 podman[103241]: 2026-01-23 09:08:36.12098446 +0000 UTC m=+0.077565036 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:08:36 np0005593234 podman[103262]: 2026-01-23 09:08:36.191806765 +0000 UTC m=+0.054298944 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:08:36 np0005593234 podman[103241]: 2026-01-23 09:08:36.307174861 +0000 UTC m=+0.263755417 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:08:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:36 np0005593234 podman[103308]: 2026-01-23 09:08:36.765898252 +0000 UTC m=+0.168370811 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, version=2.2.4, architecture=x86_64, distribution-scope=public, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Jan 23 04:08:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:36.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:37 np0005593234 podman[103308]: 2026-01-23 09:08:37.08100434 +0000 UTC m=+0.483476899 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, architecture=x86_64, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, vcs-type=git, version=2.2.4, distribution-scope=public, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20)
Jan 23 04:08:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:37.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:38.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:08:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:08:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:08:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:08:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:08:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:39.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:40.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:41.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:42.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:43.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:44.658758) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159324658969, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 750, "num_deletes": 254, "total_data_size": 1450901, "memory_usage": 1464712, "flush_reason": "Manual Compaction"}
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Jan 23 04:08:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:44.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159324967416, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 622003, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9719, "largest_seqno": 10464, "table_properties": {"data_size": 618910, "index_size": 1001, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 7879, "raw_average_key_size": 19, "raw_value_size": 612447, "raw_average_value_size": 1527, "num_data_blocks": 45, "num_entries": 401, "num_filter_entries": 401, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159268, "oldest_key_time": 1769159268, "file_creation_time": 1769159324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 308926 microseconds, and 3637 cpu microseconds.
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:44.967750) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 622003 bytes OK
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:44.967816) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:44.974065) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:44.974132) EVENT_LOG_v1 {"time_micros": 1769159324974117, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:44.974166) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 1446933, prev total WAL file size 1462906, number of live WAL files 2.
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:44.975628) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323535' seq:0, type:0; will stop at (end)
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(607KB)], [18(9220KB)]
Jan 23 04:08:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159324975938, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 10063764, "oldest_snapshot_seqno": -1}
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 3755 keys, 7583626 bytes, temperature: kUnknown
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159325045448, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 7583626, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7555723, "index_size": 17425, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9413, "raw_key_size": 91346, "raw_average_key_size": 24, "raw_value_size": 7484976, "raw_average_value_size": 1993, "num_data_blocks": 762, "num_entries": 3755, "num_filter_entries": 3755, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769159324, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:45.045778) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 7583626 bytes
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:45.052627) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 144.6 rd, 109.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 9.0 +0.0 blob) out(7.2 +0.0 blob), read-write-amplify(28.4) write-amplify(12.2) OK, records in: 4254, records dropped: 499 output_compression: NoCompression
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:45.052690) EVENT_LOG_v1 {"time_micros": 1769159325052661, "job": 8, "event": "compaction_finished", "compaction_time_micros": 69591, "compaction_time_cpu_micros": 19033, "output_level": 6, "num_output_files": 1, "total_output_size": 7583626, "num_input_records": 4254, "num_output_records": 3755, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159325053094, "job": 8, "event": "table_file_deletion", "file_number": 20}
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159325056037, "job": 8, "event": "table_file_deletion", "file_number": 18}
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:44.975406) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:45.056138) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:45.056144) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:45.056146) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:45.056148) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:08:45 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:08:45.056150) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:08:45 np0005593234 python3.9[103652]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 23 04:08:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:08:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:45.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:08:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:08:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:46.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:08:47 np0005593234 python3.9[103805]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:08:47 np0005593234 python3.9[103883]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:08:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:08:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:47.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:08:48 np0005593234 python3.9[104036]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:08:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:48.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:50.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:50.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:08:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:08:51 np0005593234 python3.9[104114]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:08:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:08:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:52.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:08:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:52.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:53 np0005593234 python3.9[104318]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:08:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:54.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:54.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:55 np0005593234 python3.9[104471]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:08:56 np0005593234 python3.9[104555]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:08:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:08:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:08:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:56.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:08:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:56.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:08:58.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:08:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:08:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:08:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:08:58.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:00.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:00.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:02.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:02.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:04 np0005593234 systemd[1]: session-37.scope: Deactivated successfully.
Jan 23 04:09:04 np0005593234 systemd[1]: session-37.scope: Consumed 24.007s CPU time.
Jan 23 04:09:04 np0005593234 systemd-logind[794]: Session 37 logged out. Waiting for processes to exit.
Jan 23 04:09:04 np0005593234 systemd-logind[794]: Removed session 37.
Jan 23 04:09:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:04.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:04.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:06.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:06.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:08.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:09:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:08.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:09:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:10.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:10.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:09:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:12.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:09:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:12.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:09:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:14.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:09:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:14.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:16.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:16.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:18 np0005593234 systemd-logind[794]: New session 38 of user zuul.
Jan 23 04:09:18 np0005593234 systemd[1]: Started Session 38 of User zuul.
Jan 23 04:09:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:09:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:18.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:09:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:18.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:19 np0005593234 python3.9[104799]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:20 np0005593234 python3.9[105002]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:20.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:20.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:20 np0005593234 python3.9[105080]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:21 np0005593234 systemd[1]: session-38.scope: Deactivated successfully.
Jan 23 04:09:21 np0005593234 systemd[1]: session-38.scope: Consumed 1.494s CPU time.
Jan 23 04:09:21 np0005593234 systemd-logind[794]: Session 38 logged out. Waiting for processes to exit.
Jan 23 04:09:21 np0005593234 systemd-logind[794]: Removed session 38.
Jan 23 04:09:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:09:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:22.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:09:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:09:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:22.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:09:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:09:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:24.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:09:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:24.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:26.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:26.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:28 np0005593234 systemd-logind[794]: New session 39 of user zuul.
Jan 23 04:09:28 np0005593234 systemd[1]: Started Session 39 of User zuul.
Jan 23 04:09:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:28.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:09:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:28.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:09:29 np0005593234 python3.9[105265]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:09:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:09:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:30.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:09:30 np0005593234 python3.9[105422]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:30.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:31 np0005593234 python3.9[105597]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:32 np0005593234 python3.9[105675]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.m6fjsk6t recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:09:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:32.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:09:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:32.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:33 np0005593234 python3.9[105828]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:33 np0005593234 python3.9[105906]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.bk7m4s_l recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:34 np0005593234 python3.9[106059]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:09:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:34.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:34.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:35 np0005593234 python3.9[106211]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:35 np0005593234 python3.9[106289]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:09:36 np0005593234 python3.9[106442]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:36.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:36 np0005593234 python3.9[106520]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:09:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:36.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:37 np0005593234 python3.9[106672]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:38 np0005593234 python3.9[106825]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:38.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:38 np0005593234 python3.9[106903]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:39.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:39 np0005593234 python3.9[107104]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:40 np0005593234 python3.9[107183]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:40.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:41.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:41 np0005593234 python3.9[107336]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:09:41 np0005593234 systemd[1]: Reloading.
Jan 23 04:09:41 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:09:41 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:09:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:42 np0005593234 python3.9[107527]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:09:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:42.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:09:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:43.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:43 np0005593234 python3.9[107605]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:43 np0005593234 python3.9[107757]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:44 np0005593234 python3.9[107835]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:09:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:44.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:09:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:45.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:45 np0005593234 python3.9[107988]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:09:45 np0005593234 systemd[1]: Reloading.
Jan 23 04:09:45 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:09:45 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:09:45 np0005593234 systemd[1]: Starting Create netns directory...
Jan 23 04:09:45 np0005593234 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 04:09:45 np0005593234 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 04:09:45 np0005593234 systemd[1]: Finished Create netns directory.
Jan 23 04:09:46 np0005593234 python3.9[108179]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:09:46 np0005593234 network[108197]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:09:46 np0005593234 network[108198]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:09:46 np0005593234 network[108199]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:09:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:46.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:47.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:48.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:09:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:49.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:09:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:50.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:51.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:09:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:09:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:09:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:09:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:09:52 np0005593234 python3.9[108595]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:09:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:52.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:09:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:53.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:53 np0005593234 python3.9[108673]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:54 np0005593234 python3.9[108825]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:54 np0005593234 python3.9[108978]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:09:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:54.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:09:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:09:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:55.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:09:55 np0005593234 python3.9[109056]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:56 np0005593234 python3.9[109209]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 04:09:56 np0005593234 systemd[1]: Starting Time & Date Service...
Jan 23 04:09:56 np0005593234 systemd[1]: Started Time & Date Service.
Jan 23 04:09:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:56.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:57.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:09:57 np0005593234 python3.9[109365]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:58 np0005593234 python3.9[109517]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:58 np0005593234 python3.9[109625]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:09:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:09:58.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:09:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:09:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:09:59.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:09:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:09:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:09:59 np0005593234 python3.9[109798]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:09:59 np0005593234 python3.9[109906]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.qkv5ptb3 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:00 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 04:10:00 np0005593234 python3.9[110079]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:00.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:01.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:01 np0005593234 python3.9[110157]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:02 np0005593234 python3.9[110309]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:10:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:02.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:03.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:03 np0005593234 python3[110463]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 04:10:03 np0005593234 python3.9[110615]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:04 np0005593234 python3.9[110693]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:10:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:04.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:10:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:05.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:05 np0005593234 python3.9[110846]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:05 np0005593234 python3.9[110971]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159404.575499-902-126281750080172/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:06 np0005593234 python3.9[111124]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:10:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:06.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:10:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:07.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:07 np0005593234 python3.9[111202]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:08 np0005593234 python3.9[111354]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:08 np0005593234 python3.9[111433]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:10:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:08.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:10:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:10:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:09.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:10:09 np0005593234 python3.9[111585]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:09 np0005593234 python3.9[111663]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:10 np0005593234 python3.9[111816]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:10:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:10:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:10.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:10:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:11.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:11 np0005593234 python3.9[111971]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:12 np0005593234 python3.9[112123]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:10:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:12.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:10:13 np0005593234 python3.9[112276]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:13.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:13 np0005593234 python3.9[112428]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 04:10:14 np0005593234 python3.9[112581]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 04:10:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:14.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:15.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:15 np0005593234 systemd[1]: session-39.scope: Deactivated successfully.
Jan 23 04:10:15 np0005593234 systemd[1]: session-39.scope: Consumed 28.443s CPU time.
Jan 23 04:10:15 np0005593234 systemd-logind[794]: Session 39 logged out. Waiting for processes to exit.
Jan 23 04:10:15 np0005593234 systemd-logind[794]: Removed session 39.
Jan 23 04:10:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:10:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:16.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:10:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:10:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:17.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:10:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:18.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:19.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:20 np0005593234 systemd-logind[794]: New session 40 of user zuul.
Jan 23 04:10:20 np0005593234 systemd[1]: Started Session 40 of User zuul.
Jan 23 04:10:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:10:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:20.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:10:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:21.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:21 np0005593234 python3.9[112814]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 23 04:10:22 np0005593234 python3.9[112966]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:10:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:10:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:22.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:10:23 np0005593234 python3.9[113121]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Jan 23 04:10:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:23.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:23 np0005593234 python3.9[113273]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.z5ssxua5 follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:10:24 np0005593234 python3.9[113399]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.z5ssxua5 mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159423.3111863-110-215744499117866/.source.z5ssxua5 _original_basename=.5fdxvw81 follow=False checksum=10ad371b9444ca89894e9504601831d6af2e14d2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:10:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:24.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:10:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:10:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:25.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:10:25 np0005593234 python3.9[113551]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:10:26 np0005593234 python3.9[113704]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCsLbdPIA8nc52wSKcOItc1xJ6faU3FwhWecUgXZZC+Q1wLSrdN9vgOExBhQSwwodluzJ5/GT9VbCuujyBvk7RMEim1+fw7T58Th56PR8y2lL6F6F3ni4S21QxInTLml+/id8wwEZAkFjbCF/AjCRDyH7a6H4wIZtd5ZuzWJuuBENNdtu/qD1QQYkNegqllogNpkdpAFZgvee26yw2sbCX8kpbJoJsowaQUckoRtT2jj7985CLxErKZ8YO8ZozjfuCDCKbcJT0KFimievJZmKXvGaWG5H+P509XDsfN62aQr22US8FbYjdK1lfrJoetkc/MK4h7QuCs6MH2qYiqXIkJYKMSReM+sH3X7V7pSWSUkr0DHREVvBGcC2lRSx45lUCTEtcTY7XmxGORvCORMYla0l1H3mEIkfYLS4sXYtRSHkyFnyQgbNP5MnrmXlK0vrAA81r5U+dOhIL/H2e7S4xcLItH7weUOHIAmCj266mm9+xJyyd7NZ+eUgS0Md5p4Bc=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUSudroiFEdRPXgUCqRHbNRLelYP5RQGMMCn6zD8pfH#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDJLsx8RxJz6M7PIyGcFdzR+Ldl788501Y8ZWLJ8hnDzMCaRkGjzE+kzO/uN75IEtV3aVEl1jNQlk7wON+lORGQ=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQD7jdzOPltwN8PSb4q9DCiO5zY7TIK6sENpltjjN4gdZgxOTsj/dxnfxJlO2lYI1dFyyFnDdZj88a4x1KI5Bnnvl5KRvvZiianfivZWKq9Ngf9fzf7+5CsDFBiu6a7GAfXMf9FocVpqlXf7fsXmb5Iv2xUpNnye4EFIuW965X3SNrRpujRnDe+i0lIwrOsus4R86qn38MWOLfPBAWFYdBaVfTUYjC0eT/I81Y/T2RKqf7XK/bsuHobZ+/a7lymuPsS9L0DFg25ZoIlvkPUVfZxTO5FCyw8GMR+AgbnMQyHwx2JAmewwH3M2l+zVdDQjsE1ZRFlJCmwle9LBa1oFhuLfxLqsykQploeB5Ch/VppbnRQ/GamwWLU5HEKMH2wZ6IymURW7nSStlEhNWvK+Bb9rIy65M6AFOEW94xId4nc+IraS6rc2cuM3Rp97S/6olqjlFDZisdUwdAlhIKuJjA7SsYZ6HyCEbRN3mvMnWbkqpyY605kewQ6kdmucNeWgRtk=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIE++PPNOKtggGl2mGWEm1DV2WpblvGA/F2TEEVeMrsU2#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBP3uOoytpWGDF46u3wwDFxwF05HMnZd51GvbceZrDgZRmc5sxbF+OawPD9kGTcjnaUTzvqWgbFNvcmpuaNTnpzc=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq2Yxebv3BUxXHPuf6nN00teEMYUUVEWMZOqcwNO1dyibdbyxre6VweeeiBR/lerW1mIcmB67juCuLffEgDo8uPtZx9HrD1psd+ji78YeJuvbKIEcTwdtGF0I8PeogHunx+4KBxFsHeF6JHN9+H7lTHiSSIDFzk9BwDkAKEWsYHe8z+5SPDU//XiYNv0drE59KiQF586rnjPR3VZk6WaR+hp2PiHbUUSOvnyB4kI4bCXSCU/Oxv7HDvgeCJapABjisMZg4aiteZ7EaD1yVndkQiS6OxfOGP1srgtNkRL4Idc/XCFXH754lbRd8GzUF0n8N0HbWTcFDuTU+bvhuIH+3EDNxsDQkSCdJTw2EPb/mqZVdXSFxLXUBcXnYkBWZirpgC3g6okg2RQU2bxigFs7lFwJT6QE+wz0DK7Z3ib0XQxjRlY6PIwn1D2soMwKVarxpeM2FfsGrHMHaHioRTVbKpzBMA1oUICSUCvzyhd0I43cO2rUEK/8EMYSsTVRulKs=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAII4nVnNUbCVQAtKJF7UUtMQxNhMw9eVlRVofBpQ70iUi#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPqfkBgoQjr/gZBK1F9K576GMtkxSY6lVgROItGrW+R9EA2lvnOt71IGO0M0lGVvCkTtLktdNpSsYnBu2cJn+4c=#012 create=True mode=0644 path=/tmp/ansible.z5ssxua5 state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:26 np0005593234 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 04:10:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:26.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:27.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:27 np0005593234 python3.9[113858]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.z5ssxua5' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:10:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:28 np0005593234 python3.9[114012]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.z5ssxua5 state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:28 np0005593234 systemd[1]: session-40.scope: Deactivated successfully.
Jan 23 04:10:28 np0005593234 systemd[1]: session-40.scope: Consumed 4.971s CPU time.
Jan 23 04:10:28 np0005593234 systemd-logind[794]: Session 40 logged out. Waiting for processes to exit.
Jan 23 04:10:28 np0005593234 systemd-logind[794]: Removed session 40.
Jan 23 04:10:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:10:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:28.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:10:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:29.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:10:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:30.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:10:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:31.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:10:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:32.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:10:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:33.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:34 np0005593234 systemd-logind[794]: New session 41 of user zuul.
Jan 23 04:10:34 np0005593234 systemd[1]: Started Session 41 of User zuul.
Jan 23 04:10:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:34.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:35.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:35 np0005593234 python3.9[114195]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:10:36 np0005593234 python3.9[114352]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 04:10:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:10:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:36.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:10:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:37.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:37 np0005593234 python3.9[114506]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:10:38 np0005593234 python3.9[114660]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:10:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:10:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:38.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:10:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:39.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:39 np0005593234 python3.9[114813]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:10:40 np0005593234 python3.9[115016]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:10:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:10:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:40.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:10:41 np0005593234 systemd[1]: session-41.scope: Deactivated successfully.
Jan 23 04:10:41 np0005593234 systemd[1]: session-41.scope: Consumed 3.766s CPU time.
Jan 23 04:10:41 np0005593234 systemd-logind[794]: Session 41 logged out. Waiting for processes to exit.
Jan 23 04:10:41 np0005593234 systemd-logind[794]: Removed session 41.
Jan 23 04:10:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:41.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:10:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:42.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:10:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:43.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:44.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:45.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:46 np0005593234 systemd-logind[794]: New session 42 of user zuul.
Jan 23 04:10:46 np0005593234 systemd[1]: Started Session 42 of User zuul.
Jan 23 04:10:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:46.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:47.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:47 np0005593234 python3.9[115197]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:10:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:48.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:49.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:49 np0005593234 python3.9[115354]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:10:50 np0005593234 python3.9[115438]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 04:10:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:10:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:50.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:10:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:51.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:52 np0005593234 python3.9[115590]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:10:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:10:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:52.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:10:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:10:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:53.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:10:54 np0005593234 python3.9[115742]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 04:10:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:54.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:55.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:55 np0005593234 python3.9[115893]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:10:56 np0005593234 python3.9[116043]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:10:56 np0005593234 systemd[1]: session-42.scope: Deactivated successfully.
Jan 23 04:10:56 np0005593234 systemd[1]: session-42.scope: Consumed 5.697s CPU time.
Jan 23 04:10:56 np0005593234 systemd-logind[794]: Session 42 logged out. Waiting for processes to exit.
Jan 23 04:10:56 np0005593234 systemd-logind[794]: Removed session 42.
Jan 23 04:10:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:10:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:56.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:10:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:57.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:10:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:10:58.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:10:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:10:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:10:59.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:10:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:10:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:11:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:11:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:11:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:11:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:00.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:01.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:02 np0005593234 systemd-logind[794]: New session 43 of user zuul.
Jan 23 04:11:02 np0005593234 systemd[1]: Started Session 43 of User zuul.
Jan 23 04:11:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:11:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:03.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:11:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:03.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:04 np0005593234 python3.9[116407]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:11:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:05.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:05.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:05 np0005593234 python3.9[116564]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:06 np0005593234 python3.9[116767]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:11:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:11:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:07.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:07.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:07 np0005593234 python3.9[116919]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:08 np0005593234 python3.9[117042]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159466.8467524-154-49433932778340/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=aff3851171cbd25aa6d79df0476cea51aa192407 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:08 np0005593234 python3.9[117195]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:09.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:09.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:09 np0005593234 python3.9[117318]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159468.3632596-154-169461609616316/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=b5c90b44c6774a0fb2738dc9aefa548e4239c50f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:09 np0005593234 python3.9[117470]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:10 np0005593234 python3.9[117594]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159469.4519897-154-255818037308118/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=1a2f8c9cd7e4f240af9e85277636496ddc8f4700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:11.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:11 np0005593234 python3.9[117746]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:11.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:12 np0005593234 python3.9[117900]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:12 np0005593234 python3.9[118053]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:13.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:11:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:13.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:11:13 np0005593234 python3.9[118176]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159472.2494233-333-187240163420118/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=38472f817920c06e099fdd20c77ebeb183866d70 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:13 np0005593234 python3.9[118328]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:14 np0005593234 python3.9[118452]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159473.468971-333-101089596946473/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=4d54572c36838e9e23d527be56268c8c0160f31d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:11:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:15.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:11:15 np0005593234 python3.9[118604]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:15.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:15 np0005593234 python3.9[118727]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159474.5896187-333-162779178072150/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=7a3310045a88eb6a710daeae8d683b1c71d94ffb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:16 np0005593234 python3.9[118879]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:17.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:17 np0005593234 python3.9[119032]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:17.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:17 np0005593234 python3.9[119184]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:18 np0005593234 python3.9[119307]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159477.3400452-508-19255080354934/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=31358915077abec5ece657621d23c9c2cec01d48 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:11:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:19.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:11:19 np0005593234 python3.9[119460]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:19.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:19 np0005593234 python3.9[119583]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159478.530248-508-176945490075758/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=4d54572c36838e9e23d527be56268c8c0160f31d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:20 np0005593234 python3.9[119737]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:20 np0005593234 python3.9[119909]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159479.9142156-508-120814108479755/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=7bb94f675b9ef91bf2d0d3f10f60219344f15e1e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:21.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:11:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:21.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:11:22 np0005593234 python3.9[120061]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:22 np0005593234 python3.9[120214]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:23.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:11:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:23.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:11:23 np0005593234 python3.9[120337]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159482.3962173-712-6884419249313/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=193e99f8e1220a4ec0ffff2d0cee79b79a562ce2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:24 np0005593234 python3.9[120489]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:24 np0005593234 python3.9[120642]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:25.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:11:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:25.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:11:25 np0005593234 python3.9[120765]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159484.2873948-777-11569590924210/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=193e99f8e1220a4ec0ffff2d0cee79b79a562ce2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:26 np0005593234 python3.9[120917]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:26 np0005593234 python3.9[121070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:27.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:27.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:27 np0005593234 python3.9[121193]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159486.2028277-842-140547542699839/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=193e99f8e1220a4ec0ffff2d0cee79b79a562ce2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:28 np0005593234 python3.9[121345]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:28 np0005593234 python3.9[121498]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:29.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:29.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:29 np0005593234 python3.9[121621]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159488.484875-911-107046509811628/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=193e99f8e1220a4ec0ffff2d0cee79b79a562ce2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:30 np0005593234 python3.9[121774]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:31.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:31.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:31 np0005593234 python3.9[121926]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:32 np0005593234 python3.9[122049]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159491.0402453-988-46847012077355/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=193e99f8e1220a4ec0ffff2d0cee79b79a562ce2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:32 np0005593234 python3.9[122202]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:11:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:33.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:33.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:33 np0005593234 python3.9[122354]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:34 np0005593234 python3.9[122477]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159493.1138675-1053-262510139833138/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=193e99f8e1220a4ec0ffff2d0cee79b79a562ce2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:11:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:35.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:11:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:35.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:36 np0005593234 systemd[1]: session-43.scope: Deactivated successfully.
Jan 23 04:11:36 np0005593234 systemd[1]: session-43.scope: Consumed 22.197s CPU time.
Jan 23 04:11:36 np0005593234 systemd-logind[794]: Session 43 logged out. Waiting for processes to exit.
Jan 23 04:11:36 np0005593234 systemd-logind[794]: Removed session 43.
Jan 23 04:11:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:37.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:11:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:37.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:11:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:39.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:39.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:41.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.057630) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159501057689, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1832, "num_deletes": 251, "total_data_size": 4654353, "memory_usage": 4726120, "flush_reason": "Manual Compaction"}
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159501087006, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 3042263, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10469, "largest_seqno": 12296, "table_properties": {"data_size": 3034673, "index_size": 4597, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1925, "raw_key_size": 14784, "raw_average_key_size": 19, "raw_value_size": 3019675, "raw_average_value_size": 3962, "num_data_blocks": 207, "num_entries": 762, "num_filter_entries": 762, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159324, "oldest_key_time": 1769159324, "file_creation_time": 1769159501, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 29664 microseconds, and 6529 cpu microseconds.
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.087269) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 3042263 bytes OK
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.087324) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.095003) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.095071) EVENT_LOG_v1 {"time_micros": 1769159501095056, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.095109) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 4646174, prev total WAL file size 4646174, number of live WAL files 2.
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.096865) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(2970KB)], [21(7405KB)]
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159501096997, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 10625889, "oldest_snapshot_seqno": -1}
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4000 keys, 8626082 bytes, temperature: kUnknown
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159501244672, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 8626082, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8596093, "index_size": 18871, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10053, "raw_key_size": 97062, "raw_average_key_size": 24, "raw_value_size": 8520534, "raw_average_value_size": 2130, "num_data_blocks": 817, "num_entries": 4000, "num_filter_entries": 4000, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769159501, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:11:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:41.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.244980) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 8626082 bytes
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.320883) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 71.9 rd, 58.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.9, 7.2 +0.0 blob) out(8.2 +0.0 blob), read-write-amplify(6.3) write-amplify(2.8) OK, records in: 4517, records dropped: 517 output_compression: NoCompression
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.320940) EVENT_LOG_v1 {"time_micros": 1769159501320922, "job": 10, "event": "compaction_finished", "compaction_time_micros": 147786, "compaction_time_cpu_micros": 22755, "output_level": 6, "num_output_files": 1, "total_output_size": 8626082, "num_input_records": 4517, "num_output_records": 4000, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159501321839, "job": 10, "event": "table_file_deletion", "file_number": 23}
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159501323465, "job": 10, "event": "table_file_deletion", "file_number": 21}
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.096799) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.323655) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.323662) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.323664) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.323666) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:11:41.323669) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:11:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:43.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:43.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:44 np0005593234 systemd-logind[794]: New session 44 of user zuul.
Jan 23 04:11:44 np0005593234 systemd[1]: Started Session 44 of User zuul.
Jan 23 04:11:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:11:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:45.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:11:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:45.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:45 np0005593234 python3.9[122714]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:46 np0005593234 python3.9[122867]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:47.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:11:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:47.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:11:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:47 np0005593234 python3.9[122990]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159506.1277816-64-236099448002399/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=9a6a528427b32e6ef98709d36c90302cf328f9ef backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:48 np0005593234 python3.9[123143]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:11:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:49.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:49 np0005593234 python3.9[123266]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159507.7873676-64-271292298563038/.source.conf _original_basename=ceph.conf follow=False checksum=e4aedaaab1f9b40918a770d92609389e4ab78681 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:11:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:49.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:49 np0005593234 systemd[1]: session-44.scope: Deactivated successfully.
Jan 23 04:11:49 np0005593234 systemd[1]: session-44.scope: Consumed 2.441s CPU time.
Jan 23 04:11:49 np0005593234 systemd-logind[794]: Session 44 logged out. Waiting for processes to exit.
Jan 23 04:11:49 np0005593234 systemd-logind[794]: Removed session 44.
Jan 23 04:11:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:51.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:51.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:11:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:53.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:11:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:53.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:11:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 600.0 total, 600.0 interval
Cumulative writes: 2135 writes, 12K keys, 2135 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s
Cumulative WAL: 2135 writes, 2135 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 2135 writes, 12K keys, 2135 commit groups, 1.0 writes per commit group, ingest: 23.19 MB, 0.04 MB/s
Interval WAL: 2135 writes, 2135 syncs, 1.00 writes per sync, written: 0.02 GB, 0.04 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     32.3      0.43              0.05         5    0.087       0      0       0.0       0.0
  L6      1/0    8.23 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   2.2     62.3     52.6      0.60              0.10         4    0.149     16K   1795       0.0       0.0
 Sum      1/0    8.23 MB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.2     36.1     44.1      1.03              0.15         9    0.114     16K   1795       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   3.2     36.2     44.2      1.03              0.15         8    0.128     16K   1795       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0     62.3     52.6      0.60              0.10         4    0.149     16K   1795       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     32.4      0.43              0.05         4    0.108       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 600.0 total, 600.0 interval
Flush(GB): cumulative 0.014, interval 0.014
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.06 MB/s read, 1.0 seconds
Interval compaction: 0.04 GB write, 0.08 MB/s write, 0.04 GB read, 0.06 MB/s read, 1.0 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55f10fc171f0#2 capacity: 308.00 MB usage: 1.51 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 9.5e-05 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(74,1.34 MB,0.436609%) FilterBlock(9,54.48 KB,0.0172751%) IndexBlock(9,118.23 KB,0.0374881%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Jan 23 04:11:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:11:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:55.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:11:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:55.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:57.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:57.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:11:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:11:57 np0005593234 systemd-logind[794]: New session 45 of user zuul.
Jan 23 04:11:57 np0005593234 systemd[1]: Started Session 45 of User zuul.
Jan 23 04:11:58 np0005593234 python3.9[123449]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:11:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:11:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:11:59.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:11:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:11:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:11:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:11:59.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:00 np0005593234 python3.9[123605]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:00 np0005593234 python3.9[123808]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:12:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:01.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:12:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:01.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:01 np0005593234 python3.9[123958]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:12:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:02 np0005593234 python3.9[124111]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 23 04:12:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:12:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:03.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:12:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:03.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:12:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:05.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:12:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:05.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:05 np0005593234 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 23 04:12:05 np0005593234 python3.9[124268]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:12:06 np0005593234 python3.9[124428]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:12:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:07.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:07.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:12:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:12:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:12:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:12:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Jan 23 04:12:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:12:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:12:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:12:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:09.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:09.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:09 np0005593234 python3.9[124757]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:12:10 np0005593234 python3[124913]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
  rule:
    proto: udp
    dport: 4789
- rule_name: 119 neutron geneve networks
  rule:
    proto: udp
    dport: 6081
    state: ["UNTRACKED"]
- rule_name: 120 neutron geneve networks no conntrack
  rule:
    proto: udp
    dport: 6081
    table: raw
    chain: OUTPUT
    jump: NOTRACK
    action: append
    state: []
- rule_name: 121 neutron geneve networks no conntrack
  rule:
    proto: udp
    dport: 6081
    table: raw
    chain: PREROUTING
    jump: NOTRACK
    action: append
    state: []
 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 23 04:12:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:11.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:12:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:11.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:12:11 np0005593234 python3.9[125065]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:12 np0005593234 python3.9[125218]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:12:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:13.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:12:13 np0005593234 python3.9[125296]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:13.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:13 np0005593234 python3.9[125448]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:14 np0005593234 python3.9[125526]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.nzawyxl7 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 23 04:12:15 np0005593234 python3.9[125679]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:12:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:15.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:12:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:15.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:15 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:12:15 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:12:15 np0005593234 python3.9[125807]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:16 np0005593234 python3.9[125960]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:12:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:17.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:12:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:17.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:12:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:17 np0005593234 python3[126113]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 04:12:18 np0005593234 python3.9[126266]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:12:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:19.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:12:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:19.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:19 np0005593234 python3.9[126391]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159538.2253642-434-18494476317745/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:20 np0005593234 python3.9[126544]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:12:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:21.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:12:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:21.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:21 np0005593234 python3.9[126719]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159540.0172255-479-258177781765670/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:22 np0005593234 python3.9[126872]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:23 np0005593234 python3.9[126997]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159541.897251-524-146730255409133/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:12:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:23.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:12:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:23.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:23 np0005593234 python3.9[127149]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:24 np0005593234 python3.9[127275]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159543.3589165-569-162894658804728/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:25.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:25.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:25 np0005593234 python3.9[127427]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:26 np0005593234 python3.9[127552]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159544.9192467-614-170736426229285/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:27.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:27 np0005593234 python3.9[127705]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:27.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:28 np0005593234 python3.9[127857]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:12:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:12:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:29.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:12:29 np0005593234 python3.9[128013]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
include "/etc/nftables/edpm-chains.nft"
include "/etc/nftables/edpm-rules.nft"
include "/etc/nftables/edpm-jumps.nft"
 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:12:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 600.5 total, 600.0 interval#012Cumulative writes: 5377 writes, 23K keys, 5377 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.03 MB/s#012Cumulative WAL: 5377 writes, 782 syncs, 6.88 writes per sync, written: 0.02 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 5377 writes, 23K keys, 5377 commit groups, 1.0 writes per commit group, ingest: 18.64 MB, 0.03 MB/s#012Interval WAL: 5377 writes, 782 syncs, 6.88 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.31              0.00         1    0.311       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.31              0.00         1    0.311       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.31              0.00         1    0.311       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.5 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.3 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bc37c4b350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.5 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bc37c4b350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 5.3e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 600.5 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_s
Jan 23 04:12:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:29.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:30 np0005593234 python3.9[128165]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:12:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:12:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:31.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:12:31 np0005593234 python3.9[128319]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:12:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:12:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:31.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:12:32 np0005593234 python3.9[128473]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:12:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:32 np0005593234 python3.9[128629]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:33.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:33.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:34 np0005593234 python3.9[128780]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:12:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:35.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:35.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:36 np0005593234 python3.9[128933]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:12:36 np0005593234 ovs-vsctl[128934]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 23 04:12:37 np0005593234 python3.9[129087]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:12:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:12:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:37.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:12:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:12:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:37.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:12:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:37 np0005593234 python3.9[129242]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:12:37 np0005593234 ovs-vsctl[129243]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 23 04:12:38 np0005593234 python3.9[129394]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:12:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:39.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:39.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:39 np0005593234 python3.9[129548]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:40 np0005593234 python3.9[129701]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:12:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:41.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:12:41 np0005593234 python3.9[129829]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000028s ======
Jan 23 04:12:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:41.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000028s
Jan 23 04:12:42 np0005593234 python3.9[129981]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:42 np0005593234 python3.9[130060]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:43.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:43.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:43 np0005593234 python3.9[130212]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:44 np0005593234 python3.9[130364]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:44 np0005593234 python3.9[130443]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000027s ======
Jan 23 04:12:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:45.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000027s
Jan 23 04:12:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:45.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:45 np0005593234 python3.9[130595]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:46 np0005593234 python3.9[130673]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:47.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:47.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:47 np0005593234 python3.9[130827]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:12:47 np0005593234 systemd[1]: Reloading.
Jan 23 04:12:47 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:12:47 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:12:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:48 np0005593234 python3.9[131016]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:48 np0005593234 python3.9[131094]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:49.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:49.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:49 np0005593234 python3.9[131247]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:50 np0005593234 python3.9[131325]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:51.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:51 np0005593234 python3.9[131478]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:12:51 np0005593234 systemd[1]: Reloading.
Jan 23 04:12:51 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:12:51 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:12:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:51.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:51 np0005593234 systemd[1]: Starting Create netns directory...
Jan 23 04:12:51 np0005593234 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 04:12:51 np0005593234 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 04:12:51 np0005593234 systemd[1]: Finished Create netns directory.
Jan 23 04:12:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:52 np0005593234 python3.9[131672]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:12:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:53.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:12:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:53.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:53 np0005593234 python3.9[131825]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:54 np0005593234 python3.9[131948]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159572.989994-1367-254738432190415/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:55.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:55.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:55 np0005593234 python3.9[132103]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:56 np0005593234 python3.9[132255]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:12:57 np0005593234 python3.9[132407]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:12:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:57.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:57.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:12:57 np0005593234 python3.9[132531]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159576.526593-1465-280696753571151/.source.json _original_basename=.wjgf8323 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:12:59 np0005593234 python3.9[132682]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:12:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:12:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:12:59.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:12:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:12:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:12:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:12:59.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:01.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:01.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:01 np0005593234 python3.9[133156]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 23 04:13:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:02 np0005593234 python3.9[133308]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 04:13:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:13:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:03.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:13:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:03.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:04 np0005593234 python3[133461]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 04:13:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:05.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:13:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:05.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:13:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:13:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:07.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:13:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:07.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:09.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:09.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:10 np0005593234 podman[133473]: 2026-01-23 09:13:10.921017042 +0000 UTC m=+6.463517610 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 23 04:13:11 np0005593234 podman[133597]: 2026-01-23 09:13:11.061401804 +0000 UTC m=+0.048712810 container create dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 04:13:11 np0005593234 podman[133597]: 2026-01-23 09:13:11.036111 +0000 UTC m=+0.023422036 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 23 04:13:11 np0005593234 python3[133461]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Jan 23 04:13:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:13:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:11.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:13:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:11.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:11 np0005593234 python3.9[133789]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:13:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:12 np0005593234 python3.9[133945]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:13:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:13.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:13 np0005593234 python3.9[134022]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:13:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:13.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:14 np0005593234 python3.9[134173]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769159593.3703442-1699-182917679478438/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:13:14 np0005593234 python3.9[134249]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:13:14 np0005593234 systemd[1]: Reloading.
Jan 23 04:13:14 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:13:14 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:13:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:15.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:15.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:15 np0005593234 python3.9[134418]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:13:15 np0005593234 systemd[1]: Reloading.
Jan 23 04:13:15 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:13:15 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:13:15 np0005593234 systemd[1]: Starting ovn_controller container...
Jan 23 04:13:16 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:13:16 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79402a63c2527faadb4aa30bd498c0c12fa9b808138acfbb6d1cfcbcf4f41013/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 23 04:13:16 np0005593234 systemd[1]: Started /usr/bin/podman healthcheck run dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7.
Jan 23 04:13:16 np0005593234 podman[134532]: 2026-01-23 09:13:16.58093349 +0000 UTC m=+0.651084034 container init dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller)
Jan 23 04:13:16 np0005593234 ovn_controller[134547]: + sudo -E kolla_set_configs
Jan 23 04:13:16 np0005593234 podman[134532]: 2026-01-23 09:13:16.61208302 +0000 UTC m=+0.682233544 container start dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 23 04:13:16 np0005593234 systemd[1]: Created slice User Slice of UID 0.
Jan 23 04:13:16 np0005593234 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 23 04:13:16 np0005593234 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 23 04:13:16 np0005593234 systemd[1]: Starting User Manager for UID 0...
Jan 23 04:13:16 np0005593234 edpm-start-podman-container[134532]: ovn_controller
Jan 23 04:13:16 np0005593234 edpm-start-podman-container[134531]: Creating additional drop-in dependency for "ovn_controller" (dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7)
Jan 23 04:13:16 np0005593234 systemd[134567]: Queued start job for default target Main User Target.
Jan 23 04:13:16 np0005593234 podman[134554]: 2026-01-23 09:13:16.788774746 +0000 UTC m=+0.160725269 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:13:16 np0005593234 systemd[134567]: Created slice User Application Slice.
Jan 23 04:13:16 np0005593234 systemd[134567]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 23 04:13:16 np0005593234 systemd[134567]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:13:16 np0005593234 systemd[134567]: Reached target Paths.
Jan 23 04:13:16 np0005593234 systemd[134567]: Reached target Timers.
Jan 23 04:13:16 np0005593234 systemd[1]: dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7-2ba929d75d248c16.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 04:13:16 np0005593234 systemd[1]: dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7-2ba929d75d248c16.service: Failed with result 'exit-code'.
Jan 23 04:13:16 np0005593234 systemd[134567]: Starting D-Bus User Message Bus Socket...
Jan 23 04:13:16 np0005593234 systemd[134567]: Starting Create User's Volatile Files and Directories...
Jan 23 04:13:16 np0005593234 systemd[1]: Reloading.
Jan 23 04:13:16 np0005593234 systemd[134567]: Finished Create User's Volatile Files and Directories.
Jan 23 04:13:16 np0005593234 systemd[134567]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:13:16 np0005593234 systemd[134567]: Reached target Sockets.
Jan 23 04:13:16 np0005593234 systemd[134567]: Reached target Basic System.
Jan 23 04:13:16 np0005593234 systemd[134567]: Reached target Main User Target.
Jan 23 04:13:16 np0005593234 systemd[134567]: Startup finished in 122ms.
Jan 23 04:13:16 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:13:16 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:13:17 np0005593234 systemd[1]: Started User Manager for UID 0.
Jan 23 04:13:17 np0005593234 systemd[1]: Started ovn_controller container.
Jan 23 04:13:17 np0005593234 systemd[1]: Started Session c1 of User root.
Jan 23 04:13:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:13:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: INFO:__main__:Validating config file
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: INFO:__main__:Writing out command to execute
Jan 23 04:13:17 np0005593234 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: ++ cat /run_command
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: + ARGS=
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: + sudo kolla_copy_cacerts
Jan 23 04:13:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:13:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:17.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:13:17 np0005593234 systemd[1]: Started Session c2 of User root.
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: + [[ ! -n '' ]]
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: + . kolla_extend_start
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: + umask 0022
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 23 04:13:17 np0005593234 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 23 04:13:17 np0005593234 NetworkManager[48942]: <info>  [1769159597.2184] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Jan 23 04:13:17 np0005593234 NetworkManager[48942]: <info>  [1769159597.2194] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:13:17 np0005593234 NetworkManager[48942]: <warn>  [1769159597.2197] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:13:17 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:13:17 np0005593234 NetworkManager[48942]: <info>  [1769159597.2203] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 23 04:13:17 np0005593234 NetworkManager[48942]: <info>  [1769159597.2208] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Jan 23 04:13:17 np0005593234 NetworkManager[48942]: <info>  [1769159597.2210] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 04:13:17 np0005593234 kernel: br-int: entered promiscuous mode
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 23 04:13:17 np0005593234 systemd-udevd[134681]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 04:13:17 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:17Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 04:13:17 np0005593234 NetworkManager[48942]: <info>  [1769159597.3176] manager: (ovn-e9717b-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 23 04:13:17 np0005593234 NetworkManager[48942]: <info>  [1769159597.3180] manager: (ovn-539cfa-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 23 04:13:17 np0005593234 kernel: genev_sys_6081: entered promiscuous mode
Jan 23 04:13:17 np0005593234 NetworkManager[48942]: <info>  [1769159597.3358] device (genev_sys_6081): carrier: link connected
Jan 23 04:13:17 np0005593234 NetworkManager[48942]: <info>  [1769159597.3361] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Jan 23 04:13:17 np0005593234 systemd-udevd[134683]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:13:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:13:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:17.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:13:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:17 np0005593234 NetworkManager[48942]: <info>  [1769159597.7066] manager: (ovn-d80bc7-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Jan 23 04:13:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:19.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:19.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:20 np0005593234 python3.9[134811]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 04:13:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:21.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:21.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:21 np0005593234 python3.9[135015]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:22 np0005593234 python3.9[135138]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159601.0475664-1834-103940839780041/.source.yaml _original_basename=.ewkkm3vp follow=False checksum=d3cbb0a9c550a24d080b6861631678a3f2e708bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:13:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:13:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:13:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:22 np0005593234 python3.9[135290]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:13:22 np0005593234 ovs-vsctl[135292]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 23 04:13:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:13:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:23.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:13:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:13:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:23.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:13:23 np0005593234 python3.9[135444]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:13:23 np0005593234 ovs-vsctl[135496]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 23 04:13:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:13:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:13:24 np0005593234 python3.9[135649]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:13:24 np0005593234 ovs-vsctl[135650]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 23 04:13:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:25.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:25.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:25 np0005593234 systemd[1]: session-45.scope: Deactivated successfully.
Jan 23 04:13:25 np0005593234 systemd[1]: session-45.scope: Consumed 57.006s CPU time.
Jan 23 04:13:25 np0005593234 systemd-logind[794]: Session 45 logged out. Waiting for processes to exit.
Jan 23 04:13:25 np0005593234 systemd-logind[794]: Removed session 45.
Jan 23 04:13:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:27.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:27.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:27 np0005593234 systemd[1]: Stopping User Manager for UID 0...
Jan 23 04:13:27 np0005593234 systemd[134567]: Activating special unit Exit the Session...
Jan 23 04:13:27 np0005593234 systemd[134567]: Stopped target Main User Target.
Jan 23 04:13:27 np0005593234 systemd[134567]: Stopped target Basic System.
Jan 23 04:13:27 np0005593234 systemd[134567]: Stopped target Paths.
Jan 23 04:13:27 np0005593234 systemd[134567]: Stopped target Sockets.
Jan 23 04:13:27 np0005593234 systemd[134567]: Stopped target Timers.
Jan 23 04:13:27 np0005593234 systemd[134567]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 04:13:27 np0005593234 systemd[134567]: Closed D-Bus User Message Bus Socket.
Jan 23 04:13:27 np0005593234 systemd[134567]: Stopped Create User's Volatile Files and Directories.
Jan 23 04:13:27 np0005593234 systemd[134567]: Removed slice User Application Slice.
Jan 23 04:13:27 np0005593234 systemd[134567]: Reached target Shutdown.
Jan 23 04:13:27 np0005593234 systemd[134567]: Finished Exit the Session.
Jan 23 04:13:27 np0005593234 systemd[134567]: Reached target Exit the Session.
Jan 23 04:13:27 np0005593234 systemd[1]: user@0.service: Deactivated successfully.
Jan 23 04:13:27 np0005593234 systemd[1]: Stopped User Manager for UID 0.
Jan 23 04:13:27 np0005593234 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 23 04:13:27 np0005593234 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 23 04:13:27 np0005593234 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 23 04:13:27 np0005593234 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 23 04:13:27 np0005593234 systemd[1]: Removed slice User Slice of UID 0.
Jan 23 04:13:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:13:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:29.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:13:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:13:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:29.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:13:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:31.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:31.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:32 np0005593234 systemd-logind[794]: New session 47 of user zuul.
Jan 23 04:13:32 np0005593234 systemd[1]: Started Session 47 of User zuul.
Jan 23 04:13:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:33 np0005593234 python3.9[135834]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:13:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:33.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:33.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:34 np0005593234 python3.9[135991]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:35.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:35 np0005593234 python3.9[136144]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:35.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:36 np0005593234 python3.9[136297]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:36 np0005593234 python3.9[136449]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:37.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:37.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:37 np0005593234 python3.9[136602]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:38 np0005593234 python3.9[136752]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:13:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:39.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:39 np0005593234 python3.9[136905]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 23 04:13:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:39.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:41.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:41 np0005593234 python3.9[137056]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:13:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:41.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:13:41 np0005593234 python3.9[137227]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159620.683913-221-195971694292058/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:42 np0005593234 python3.9[137377]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:43.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:43 np0005593234 python3.9[137499]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159622.2575312-265-181429307203991/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:43.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:44 np0005593234 python3.9[137651]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:13:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:45.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:45 np0005593234 python3.9[137736]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:13:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:45.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:47.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:13:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:47.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:13:47 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:47Z|00025|memory|INFO|16000 kB peak resident set size after 30.4 seconds
Jan 23 04:13:47 np0005593234 ovn_controller[134547]: 2026-01-23T09:13:47Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Jan 23 04:13:47 np0005593234 podman[137862]: 2026-01-23 09:13:47.673062388 +0000 UTC m=+0.121025278 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 04:13:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:47 np0005593234 python3.9[137904]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:13:48 np0005593234 python3.9[138070]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:49.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:49 np0005593234 python3.9[138192]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159628.2555318-377-135700569563116/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:49.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:49 np0005593234 python3.9[138342]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:50 np0005593234 python3.9[138463]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159629.4547396-377-177422710005065/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:51.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:51.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:52 np0005593234 python3.9[138614]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:52 np0005593234 python3.9[138735]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159631.6012042-508-35572296276483/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:53.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:53 np0005593234 python3.9[138886]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:53.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:53 np0005593234 python3.9[139007]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159632.8411186-508-175687554371810/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:55.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:55 np0005593234 python3.9[139158]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:13:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:55.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:56 np0005593234 python3.9[139312]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:57.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:57 np0005593234 python3.9[139465]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:57.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:13:57 np0005593234 python3.9[139543]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:58 np0005593234 python3.9[139695]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:13:59 np0005593234 python3.9[139774]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:13:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:13:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:13:59.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:13:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:13:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:13:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:13:59.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:13:59 np0005593234 python3.9[139926]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:00 np0005593234 python3.9[140078]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:14:01 np0005593234 python3.9[140157]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:01.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:14:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:01.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:14:02 np0005593234 python3.9[140359]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:14:02 np0005593234 python3.9[140437]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:14:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:03.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:14:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:03.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:04 np0005593234 python3.9[140590]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:14:04 np0005593234 systemd[1]: Reloading.
Jan 23 04:14:04 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:14:04 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:14:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:05.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:05 np0005593234 python3.9[140780]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:14:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:14:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:05.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:14:05 np0005593234 python3.9[140860]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:06 np0005593234 python3.9[141012]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:14:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:14:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:07.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:14:07 np0005593234 python3.9[141091]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:07.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:08 np0005593234 python3.9[141243]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:14:08 np0005593234 systemd[1]: Reloading.
Jan 23 04:14:08 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:14:08 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:14:08 np0005593234 systemd[1]: Starting Create netns directory...
Jan 23 04:14:08 np0005593234 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 04:14:08 np0005593234 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 04:14:08 np0005593234 systemd[1]: Finished Create netns directory.
Jan 23 04:14:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:14:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:09.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:14:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:09.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:09 np0005593234 python3.9[141438]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:14:10 np0005593234 python3.9[141590]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:14:10 np0005593234 python3.9[141713]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769159649.894561-962-110311107077407/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:14:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:11.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:14:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:11.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:14:12 np0005593234 python3.9[141866]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:12 np0005593234 python3.9[142018]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:14:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:13.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:13.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:13 np0005593234 python3.9[142171]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:14:14 np0005593234 python3.9[142294]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159653.2077944-1060-52751603942244/.source.json _original_basename=.rpxsh9s8 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:15 np0005593234 python3.9[142445]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:15.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:15.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:17.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:14:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:17.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:14:17 np0005593234 python3.9[142869]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 23 04:14:18 np0005593234 podman[142993]: 2026-01-23 09:14:18.757958641 +0000 UTC m=+0.091544459 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:14:18 np0005593234 python3.9[143038]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 04:14:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:19.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:19.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:20 np0005593234 python3[143201]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 04:14:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:21.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:21.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:23.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:23.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:25.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:14:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:25.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:14:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:27.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:27.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:29.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:29.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:31.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:14:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:31.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:14:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:33.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:33.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:34 np0005593234 podman[143214]: 2026-01-23 09:14:34.367437574 +0000 UTC m=+14.194418310 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:14:34 np0005593234 podman[143555]: 2026-01-23 09:14:34.520931398 +0000 UTC m=+0.058191689 container create 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:14:34 np0005593234 podman[143555]: 2026-01-23 09:14:34.488081825 +0000 UTC m=+0.025342136 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:14:34 np0005593234 python3[143201]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:14:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:35.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:35 np0005593234 python3.9[143746]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:14:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:35.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:36 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:14:37 np0005593234 python3.9[143903]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:37.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:37.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:37 np0005593234 python3.9[143979]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:14:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:14:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:14:38 np0005593234 python3.9[144130]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769159677.6257167-1294-39549042815427/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:38 np0005593234 python3.9[144206]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:14:38 np0005593234 systemd[1]: Reloading.
Jan 23 04:14:38 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:14:38 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:14:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:39.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:39.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:39 np0005593234 python3.9[144318]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:14:39 np0005593234 systemd[1]: Reloading.
Jan 23 04:14:39 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:14:39 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:14:40 np0005593234 systemd[1]: Starting ovn_metadata_agent container...
Jan 23 04:14:40 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:14:40 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e741abda285ed8009bdd5a53d828062787e9eb11c6deeb8b1ef21c17c7a538e1/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 23 04:14:40 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e741abda285ed8009bdd5a53d828062787e9eb11c6deeb8b1ef21c17c7a538e1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:14:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:40 np0005593234 systemd[1]: Started /usr/bin/podman healthcheck run 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2.
Jan 23 04:14:40 np0005593234 podman[144360]: 2026-01-23 09:14:40.655642192 +0000 UTC m=+0.494720190 container init 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: + sudo -E kolla_set_configs
Jan 23 04:14:40 np0005593234 podman[144360]: 2026-01-23 09:14:40.701067395 +0000 UTC m=+0.540145283 container start 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Validating config file
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Copying service configuration files
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Writing out command to execute
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 23 04:14:40 np0005593234 edpm-start-podman-container[144360]: ovn_metadata_agent
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: ++ cat /run_command
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: + CMD=neutron-ovn-metadata-agent
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: + ARGS=
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: + sudo kolla_copy_cacerts
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: + [[ ! -n '' ]]
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: + . kolla_extend_start
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: Running command: 'neutron-ovn-metadata-agent'
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: + umask 0022
Jan 23 04:14:40 np0005593234 ovn_metadata_agent[144376]: + exec neutron-ovn-metadata-agent
Jan 23 04:14:40 np0005593234 podman[144383]: 2026-01-23 09:14:40.801494074 +0000 UTC m=+0.087826980 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 23 04:14:40 np0005593234 edpm-start-podman-container[144359]: Creating additional drop-in dependency for "ovn_metadata_agent" (2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2)
Jan 23 04:14:40 np0005593234 systemd[1]: Reloading.
Jan 23 04:14:40 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:14:40 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:14:41 np0005593234 systemd[1]: Started ovn_metadata_agent container.
Jan 23 04:14:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:14:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:41.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:14:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:41.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:42 np0005593234 python3.9[144664]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.736 144381 INFO neutron.common.config [-] Logging enabled!#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.737 144381 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.737 144381 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.737 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.737 144381 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.738 144381 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.738 144381 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.738 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.738 144381 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.738 144381 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.738 144381 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.738 144381 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.738 144381 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.738 144381 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.739 144381 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.739 144381 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.739 144381 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.739 144381 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.739 144381 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.739 144381 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.739 144381 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.739 144381 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.739 144381 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.740 144381 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.740 144381 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.740 144381 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.740 144381 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.740 144381 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.740 144381 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.740 144381 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.740 144381 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.740 144381 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.741 144381 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.741 144381 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.741 144381 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.741 144381 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.741 144381 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.741 144381 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.741 144381 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.741 144381 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.742 144381 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.742 144381 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.742 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.742 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.742 144381 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.742 144381 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.742 144381 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.742 144381 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.742 144381 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.742 144381 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.743 144381 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.743 144381 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.743 144381 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.743 144381 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.743 144381 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.743 144381 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.744 144381 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.744 144381 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.744 144381 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.745 144381 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.745 144381 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.745 144381 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.745 144381 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.745 144381 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.745 144381 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.745 144381 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.745 144381 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.746 144381 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.746 144381 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.746 144381 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.746 144381 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.746 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.746 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.746 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.746 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.746 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.747 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.747 144381 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.747 144381 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.747 144381 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.747 144381 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.747 144381 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.747 144381 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.747 144381 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.747 144381 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.748 144381 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.748 144381 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.748 144381 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.748 144381 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.748 144381 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.748 144381 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.748 144381 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.748 144381 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.748 144381 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.749 144381 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.749 144381 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.749 144381 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.749 144381 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.749 144381 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.749 144381 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.749 144381 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.749 144381 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.749 144381 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.749 144381 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.750 144381 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.750 144381 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.750 144381 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.750 144381 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.750 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.750 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.750 144381 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.750 144381 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.751 144381 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.751 144381 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.751 144381 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.751 144381 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.751 144381 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.751 144381 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.751 144381 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.751 144381 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.751 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.751 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.752 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.752 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.752 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.752 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.752 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.752 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.752 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.752 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.752 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.753 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.753 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.753 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.753 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.753 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.753 144381 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.753 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.753 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.753 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.754 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.754 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.754 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.754 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.754 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.754 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.754 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.755 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.755 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.755 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.755 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.755 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.755 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.755 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.755 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.755 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.756 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.756 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.756 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.756 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.756 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.756 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.756 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.757 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.757 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.757 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.757 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.757 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.757 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.757 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.757 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.757 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.758 144381 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.758 144381 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.758 144381 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.758 144381 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.758 144381 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.758 144381 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.758 144381 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.759 144381 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.759 144381 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.759 144381 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.759 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.759 144381 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.759 144381 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.759 144381 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.759 144381 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.760 144381 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.760 144381 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.760 144381 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.760 144381 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.760 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.760 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.760 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.760 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.760 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.761 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.761 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.761 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.761 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.761 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.761 144381 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.761 144381 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.761 144381 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.761 144381 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.762 144381 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.762 144381 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.762 144381 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.762 144381 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.762 144381 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.762 144381 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.762 144381 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.762 144381 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.762 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.763 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.763 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.763 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.763 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.763 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.763 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.763 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.763 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.764 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.764 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.764 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.764 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.764 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.764 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.764 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.764 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.765 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.765 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.765 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.765 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.765 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.765 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.765 144381 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.765 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.765 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.766 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.766 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.766 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.766 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.766 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.766 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.766 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.766 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.766 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.767 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.767 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.767 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.767 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.767 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.767 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.767 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.768 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.768 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.768 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.768 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.768 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.768 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.768 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.768 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.768 144381 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.769 144381 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.769 144381 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.769 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.769 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.769 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.769 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.769 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.769 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.769 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.770 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.770 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.770 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.770 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.770 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.770 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.770 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.770 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.771 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.771 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.771 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.771 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.771 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.771 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.771 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.771 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.771 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.772 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.772 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.772 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.772 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.772 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.772 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.772 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.772 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.773 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.773 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.773 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.773 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.773 144381 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.773 144381 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.783 144381 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.784 144381 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.784 144381 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.785 144381 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.785 144381 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.800 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 3ec410d4-99bb-47ec-9f70-86f8400b2621 (UUID: 3ec410d4-99bb-47ec-9f70-86f8400b2621) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.829 144381 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.829 144381 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.829 144381 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.829 144381 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.833 144381 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.842 144381 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.849 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '3ec410d4-99bb-47ec-9f70-86f8400b2621'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], external_ids={}, name=3ec410d4-99bb-47ec-9f70-86f8400b2621, nb_cfg_timestamp=1769159605241, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.850 144381 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f09f4027f70>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.851 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.852 144381 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.852 144381 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.852 144381 INFO oslo_service.service [-] Starting 1 workers#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.857 144381 DEBUG oslo_service.service [-] Started child 144742 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.860 144742 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-164063'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.860 144381 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpcdvjs2ox/privsep.sock']#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.887 144742 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.888 144742 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.888 144742 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.892 144742 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.900 144742 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Jan 23 04:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:42.907 144742 INFO eventlet.wsgi.server [-] (144742) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Jan 23 04:14:43 np0005593234 python3.9[144821]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:14:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:43.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:43 np0005593234 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 23 04:14:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:43.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:43.563 144381 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 23 04:14:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:43.564 144381 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpcdvjs2ox/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 23 04:14:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:43.430 144923 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 23 04:14:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:43.435 144923 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 23 04:14:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:43.437 144923 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 23 04:14:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:43.437 144923 INFO oslo.privsep.daemon [-] privsep daemon running as pid 144923#033[00m
Jan 23 04:14:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:43.567 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[d34fc27b-1d3a-4840-b4f5-ed5bfa0958b1]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:14:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:14:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:14:43 np0005593234 python3.9[145001]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159682.7802007-1430-54676164867520/.source.yaml _original_basename=.xx99bvb0 follow=False checksum=29c9ae8bd33f53131de391173ae7a464927d83f8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.130 144923 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.130 144923 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.130 144923 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:14:44 np0005593234 systemd[1]: session-47.scope: Deactivated successfully.
Jan 23 04:14:44 np0005593234 systemd[1]: session-47.scope: Consumed 1min 1.216s CPU time.
Jan 23 04:14:44 np0005593234 systemd-logind[794]: Session 47 logged out. Waiting for processes to exit.
Jan 23 04:14:44 np0005593234 systemd-logind[794]: Removed session 47.
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.842 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[6c785387-93b5-48f1-9fa2-f573c23849a6]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.845 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, column=external_ids, values=({'neutron:ovn-metadata-id': '8f0223f0-d72d-510d-8b4d-7df66a9eae34'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.858 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.868 144381 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.868 144381 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.868 144381 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.868 144381 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.868 144381 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.869 144381 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.869 144381 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.869 144381 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.869 144381 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.869 144381 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.869 144381 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.869 144381 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.870 144381 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.870 144381 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.870 144381 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.870 144381 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.870 144381 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.870 144381 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.871 144381 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.871 144381 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.871 144381 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.871 144381 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.871 144381 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.871 144381 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.871 144381 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.872 144381 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.872 144381 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.872 144381 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.872 144381 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.872 144381 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.872 144381 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.872 144381 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.873 144381 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.873 144381 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.873 144381 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.873 144381 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.873 144381 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.874 144381 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.874 144381 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.874 144381 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.874 144381 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.874 144381 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.874 144381 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.874 144381 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.875 144381 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.875 144381 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.875 144381 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.875 144381 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.875 144381 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.875 144381 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.875 144381 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.876 144381 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.876 144381 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.876 144381 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.876 144381 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.876 144381 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.876 144381 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.876 144381 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.876 144381 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.877 144381 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.877 144381 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.877 144381 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.877 144381 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.877 144381 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.877 144381 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.877 144381 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.878 144381 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.878 144381 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.878 144381 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.878 144381 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.878 144381 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.878 144381 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.878 144381 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.879 144381 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.879 144381 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.879 144381 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.879 144381 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.879 144381 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.879 144381 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.879 144381 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.880 144381 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.880 144381 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.880 144381 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.880 144381 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.880 144381 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.880 144381 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.880 144381 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.880 144381 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.880 144381 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.881 144381 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.881 144381 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.881 144381 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.881 144381 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.881 144381 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.881 144381 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.881 144381 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.882 144381 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.882 144381 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.882 144381 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.882 144381 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.882 144381 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.882 144381 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.882 144381 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.882 144381 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.882 144381 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.883 144381 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.883 144381 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.883 144381 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.883 144381 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.883 144381 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.883 144381 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.883 144381 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.884 144381 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.884 144381 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.884 144381 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.884 144381 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.884 144381 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.884 144381 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.884 144381 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.884 144381 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.885 144381 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.885 144381 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.885 144381 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.885 144381 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.885 144381 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.885 144381 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.885 144381 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.886 144381 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.886 144381 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.886 144381 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.886 144381 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.886 144381 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.886 144381 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.886 144381 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.887 144381 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.887 144381 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.887 144381 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.887 144381 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.887 144381 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.887 144381 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.887 144381 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.887 144381 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.888 144381 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.888 144381 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.888 144381 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.888 144381 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.888 144381 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.888 144381 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.888 144381 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.888 144381 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.888 144381 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.889 144381 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.889 144381 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.889 144381 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.889 144381 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.889 144381 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.889 144381 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.889 144381 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.889 144381 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.889 144381 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.889 144381 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.889 144381 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.890 144381 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.890 144381 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.890 144381 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.890 144381 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.890 144381 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.890 144381 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.890 144381 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.890 144381 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.890 144381 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.890 144381 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.891 144381 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.891 144381 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.891 144381 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.891 144381 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.891 144381 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.891 144381 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.891 144381 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.891 144381 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.891 144381 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.892 144381 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.892 144381 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.892 144381 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.892 144381 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.892 144381 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.892 144381 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.892 144381 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.892 144381 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.893 144381 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.893 144381 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.893 144381 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.893 144381 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.893 144381 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.893 144381 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.893 144381 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.893 144381 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.893 144381 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.893 144381 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.894 144381 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.894 144381 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.894 144381 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.894 144381 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.894 144381 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.894 144381 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.894 144381 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.894 144381 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.894 144381 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.894 144381 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.895 144381 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.895 144381 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.895 144381 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.895 144381 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.895 144381 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.895 144381 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.895 144381 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.895 144381 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.895 144381 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.895 144381 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.895 144381 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.896 144381 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.896 144381 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.896 144381 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.896 144381 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.896 144381 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.896 144381 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.896 144381 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.896 144381 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.896 144381 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.896 144381 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.897 144381 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.897 144381 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.897 144381 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.897 144381 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.897 144381 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.897 144381 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.897 144381 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.897 144381 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.897 144381 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.897 144381 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.898 144381 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.898 144381 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.898 144381 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.898 144381 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.898 144381 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.898 144381 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.898 144381 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.898 144381 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.898 144381 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.898 144381 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.899 144381 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.899 144381 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.899 144381 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.899 144381 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.899 144381 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.899 144381 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.899 144381 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.899 144381 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.899 144381 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.900 144381 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.900 144381 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.900 144381 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.900 144381 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.900 144381 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.900 144381 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.900 144381 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.900 144381 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.900 144381 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.901 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.901 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.901 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.901 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.901 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.901 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.901 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.901 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.901 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.902 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.902 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.902 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.902 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.902 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.902 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.902 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.902 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.902 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.903 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.903 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.903 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.903 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.903 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.903 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.903 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.903 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.903 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.903 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.904 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.904 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.904 144381 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.904 144381 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.904 144381 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.904 144381 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.904 144381 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:14:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:14:44.904 144381 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 04:14:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:45.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:45.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:47.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:47.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:14:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:49.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:14:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:49.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:49 np0005593234 podman[145030]: 2026-01-23 09:14:49.826300931 +0000 UTC m=+0.107684850 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 04:14:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:50 np0005593234 systemd-logind[794]: New session 48 of user zuul.
Jan 23 04:14:50 np0005593234 systemd[1]: Started Session 48 of User zuul.
Jan 23 04:14:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:51.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:14:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:51.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:14:51 np0005593234 python3.9[145211]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:14:53 np0005593234 python3.9[145368]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:14:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:53.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:53.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:54 np0005593234 python3.9[145533]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:14:54 np0005593234 systemd[1]: Reloading.
Jan 23 04:14:54 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:14:54 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:14:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:55.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:14:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:55.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:56 np0005593234 python3.9[145719]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:14:56 np0005593234 network[145736]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:14:56 np0005593234 network[145737]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:14:56 np0005593234 network[145738]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:14:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:14:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:57.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:14:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:57.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:14:59.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:14:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:14:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:14:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:14:59.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:01.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:15:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:01.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:15:01 np0005593234 python3.9[146003]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:15:02 np0005593234 python3.9[146206]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:15:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:15:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:03.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:15:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:03.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:04 np0005593234 python3.9[146360]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:15:04 np0005593234 python3.9[146513]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:15:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:05.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:05 np0005593234 python3.9[146667]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:15:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:05.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:06 np0005593234 python3.9[146820]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:15:07 np0005593234 python3.9[146974]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:15:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:15:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:07.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:15:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:15:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:07.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:15:08 np0005593234 python3.9[147127]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:08 np0005593234 python3.9[147279]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:09.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:09 np0005593234 python3.9[147432]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:15:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:09.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:15:10 np0005593234 python3.9[147584]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:10 np0005593234 python3.9[147736]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:11 np0005593234 podman[147861]: 2026-01-23 09:15:11.224838508 +0000 UTC m=+0.069670741 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 04:15:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:15:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:11.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:15:11 np0005593234 python3.9[147903]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:11.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:11 np0005593234 python3.9[148059]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:13 np0005593234 python3.9[148212]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:13.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:13.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:13 np0005593234 python3.9[148364]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:14 np0005593234 python3.9[148516]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:15 np0005593234 python3.9[148669]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:15.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:15.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:15 np0005593234 python3.9[148821]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:16 np0005593234 python3.9[148973]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:17.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:17 np0005593234 python3.9[149126]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:15:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:17.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:18 np0005593234 python3.9[149278]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:15:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:19.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:15:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:19.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:19 np0005593234 python3.9[149431]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 04:15:20 np0005593234 podman[149555]: 2026-01-23 09:15:20.669379978 +0000 UTC m=+0.092682838 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 04:15:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:20 np0005593234 python3.9[149602]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:15:20 np0005593234 systemd[1]: Reloading.
Jan 23 04:15:21 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:15:21 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:15:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:15:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:21.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:15:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:21.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:22 np0005593234 python3.9[149797]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:22 np0005593234 python3.9[150000]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:23 np0005593234 python3.9[150154]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:23.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:23.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:24 np0005593234 python3.9[150307]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:24 np0005593234 python3.9[150460]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:25 np0005593234 python3.9[150614]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:25.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:25.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:25 np0005593234 python3.9[150767]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:15:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:27.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:27.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:28 np0005593234 python3.9[150921]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 23 04:15:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:15:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:29.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:15:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:29.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:29 np0005593234 python3.9[151075]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 04:15:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:31 np0005593234 python3.9[151234]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 04:15:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:31.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:31.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:32 np0005593234 python3.9[151394]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:15:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:15:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:33.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:15:33 np0005593234 python3.9[151479]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:15:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:33.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:15:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:35.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:15:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:35.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:37.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:37.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:39.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:39.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:41.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:41.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:41 np0005593234 podman[151494]: 2026-01-23 09:15:41.866421345 +0000 UTC m=+0.063202715 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:15:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:15:42.787 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:15:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:15:42.788 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:15:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:15:42.789 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:15:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:43.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:15:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:43.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:15:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:15:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:15:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:45.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:45.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:15:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:15:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:15:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:47.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:15:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:47.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:15:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:49.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.597838) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159749597914, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 2671, "num_deletes": 502, "total_data_size": 6247568, "memory_usage": 6333520, "flush_reason": "Manual Compaction"}
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159749631397, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 4103421, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 12301, "largest_seqno": 14967, "table_properties": {"data_size": 4092973, "index_size": 6302, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3077, "raw_key_size": 22417, "raw_average_key_size": 18, "raw_value_size": 4070378, "raw_average_value_size": 3366, "num_data_blocks": 281, "num_entries": 1209, "num_filter_entries": 1209, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159502, "oldest_key_time": 1769159502, "file_creation_time": 1769159749, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 33632 microseconds, and 11217 cpu microseconds.
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.631471) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 4103421 bytes OK
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.631497) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.633579) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.633598) EVENT_LOG_v1 {"time_micros": 1769159749633592, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.633620) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 6235065, prev total WAL file size 6235065, number of live WAL files 2.
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.634992) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323531' seq:0, type:0; will stop at (end)
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(4007KB)], [24(8423KB)]
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159749635299, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 12729503, "oldest_snapshot_seqno": -1}
Jan 23 04:15:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:49.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4186 keys, 10316024 bytes, temperature: kUnknown
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159749935800, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 10316024, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10283214, "index_size": 21244, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10501, "raw_key_size": 103322, "raw_average_key_size": 24, "raw_value_size": 10202629, "raw_average_value_size": 2437, "num_data_blocks": 898, "num_entries": 4186, "num_filter_entries": 4186, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769159749, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.936055) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 10316024 bytes
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.937817) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 42.3 rd, 34.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.9, 8.2 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(5.6) write-amplify(2.5) OK, records in: 5209, records dropped: 1023 output_compression: NoCompression
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.937849) EVENT_LOG_v1 {"time_micros": 1769159749937838, "job": 12, "event": "compaction_finished", "compaction_time_micros": 300584, "compaction_time_cpu_micros": 245664, "output_level": 6, "num_output_files": 1, "total_output_size": 10316024, "num_input_records": 5209, "num_output_records": 4186, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159749938539, "job": 12, "event": "table_file_deletion", "file_number": 26}
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159749939875, "job": 12, "event": "table_file_deletion", "file_number": 24}
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.634903) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.939935) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.939939) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.939941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.939943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:15:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:15:49.939944) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:15:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:51.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:15:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:51.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:15:51 np0005593234 podman[151872]: 2026-01-23 09:15:51.812373013 +0000 UTC m=+0.108187003 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 04:15:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:15:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:15:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:53.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:53.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:55.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:15:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:15:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:55.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:15:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:57.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:57.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:15:59.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:15:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:15:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:15:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:15:59.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:01.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:01.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:03.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:03.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:05.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:05.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:07.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:07.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:09.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:09.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:11.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:11.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:12 np0005593234 podman[152018]: 2026-01-23 09:16:12.824720254 +0000 UTC m=+0.094803613 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 04:16:13 np0005593234 kernel: SELinux:  Converting 2777 SID table entries...
Jan 23 04:16:13 np0005593234 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:16:13 np0005593234 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:16:13 np0005593234 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:16:13 np0005593234 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:16:13 np0005593234 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:16:13 np0005593234 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:16:13 np0005593234 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:16:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:13.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:13.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:15.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:15.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:17.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:17.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:19.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:19.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:21.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:21.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:22 np0005593234 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 23 04:16:22 np0005593234 podman[152072]: 2026-01-23 09:16:22.789492554 +0000 UTC m=+0.103109457 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 04:16:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:23.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:23.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:24 np0005593234 kernel: SELinux:  Converting 2777 SID table entries...
Jan 23 04:16:24 np0005593234 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:16:24 np0005593234 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:16:24 np0005593234 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:16:24 np0005593234 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:16:24 np0005593234 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:16:24 np0005593234 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:16:24 np0005593234 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:16:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:25.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:25.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:27.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:16:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:27.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:16:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:29.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:29.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:31.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:31.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:33.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:33.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:35.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:35.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:37.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:37.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:39.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:39.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:41.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:41.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:16:42.789 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:16:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:16:42.790 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:16:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:16:42.790 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:16:42 np0005593234 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 23 04:16:42 np0005593234 podman[154811]: 2026-01-23 09:16:42.97543342 +0000 UTC m=+0.092892113 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 04:16:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:43.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:43.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:45.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:45.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:47.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:47.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:49.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:49.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:51.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:51.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:53.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:53 np0005593234 podman[162156]: 2026-01-23 09:16:53.806171751 +0000 UTC m=+0.097686454 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:16:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:53.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:55 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:16:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:55.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:16:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:55.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:16:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:16:56 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:16:56 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:16:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:57.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:57 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:16:57 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:16:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:57.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:16:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.003000097s ======
Jan 23 04:16:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:16:59.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000097s
Jan 23 04:16:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:16:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:16:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:16:59.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:01.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:01.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:17:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:17:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:03.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:03.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:05.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:17:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:05.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:17:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:17:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:07.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:17:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:07.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:09.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:17:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:09.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:09.972542) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159829972731, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 980, "num_deletes": 250, "total_data_size": 2163056, "memory_usage": 2191040, "flush_reason": "Manual Compaction"}
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159829981473, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 887777, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14973, "largest_seqno": 15947, "table_properties": {"data_size": 884206, "index_size": 1351, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 9135, "raw_average_key_size": 19, "raw_value_size": 876562, "raw_average_value_size": 1913, "num_data_blocks": 62, "num_entries": 458, "num_filter_entries": 458, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159750, "oldest_key_time": 1769159750, "file_creation_time": 1769159829, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 9065 microseconds, and 4443 cpu microseconds.
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:09.981690) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 887777 bytes OK
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:09.981738) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:09.983164) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:09.983187) EVENT_LOG_v1 {"time_micros": 1769159829983180, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:09.983235) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 2158202, prev total WAL file size 2158202, number of live WAL files 2.
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:09.984510) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323534' seq:72057594037927935, type:22 .. '6D67727374617400353035' seq:0, type:0; will stop at (end)
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(866KB)], [27(10074KB)]
Jan 23 04:17:09 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159829984684, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 11203801, "oldest_snapshot_seqno": -1}
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4165 keys, 7889550 bytes, temperature: kUnknown
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159830083994, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 7889550, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7860424, "index_size": 17624, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10437, "raw_key_size": 103175, "raw_average_key_size": 24, "raw_value_size": 7783661, "raw_average_value_size": 1868, "num_data_blocks": 741, "num_entries": 4165, "num_filter_entries": 4165, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769159829, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:10.084500) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 7889550 bytes
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:10.086042) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 112.5 rd, 79.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 9.8 +0.0 blob) out(7.5 +0.0 blob), read-write-amplify(21.5) write-amplify(8.9) OK, records in: 4644, records dropped: 479 output_compression: NoCompression
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:10.086109) EVENT_LOG_v1 {"time_micros": 1769159830086086, "job": 14, "event": "compaction_finished", "compaction_time_micros": 99568, "compaction_time_cpu_micros": 55363, "output_level": 6, "num_output_files": 1, "total_output_size": 7889550, "num_input_records": 4644, "num_output_records": 4165, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159830087451, "job": 14, "event": "table_file_deletion", "file_number": 29}
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159830089531, "job": 14, "event": "table_file_deletion", "file_number": 27}
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:09.984283) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:10.089722) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:10.089729) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:10.089737) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:10.089739) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:17:10 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:17:10.089741) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:17:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:11.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:11.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:13.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:13 np0005593234 podman[169357]: 2026-01-23 09:17:13.814063934 +0000 UTC m=+0.093685883 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:17:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:13.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:15.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:15.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 04:17:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:17.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 04:17:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:17.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:19.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:19.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:21.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:21 np0005593234 kernel: SELinux:  Converting 2778 SID table entries...
Jan 23 04:17:21 np0005593234 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 04:17:21 np0005593234 kernel: SELinux:  policy capability open_perms=1
Jan 23 04:17:21 np0005593234 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 04:17:21 np0005593234 kernel: SELinux:  policy capability always_check_network=0
Jan 23 04:17:21 np0005593234 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 04:17:21 np0005593234 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 04:17:21 np0005593234 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 04:17:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:21.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:22 np0005593234 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 04:17:22 np0005593234 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 23 04:17:22 np0005593234 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 04:17:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:23.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:23.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:24 np0005593234 podman[169461]: 2026-01-23 09:17:24.034437397 +0000 UTC m=+0.152293959 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 04:17:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:25.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:25.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:27.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:27.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:28 np0005593234 ceph-mgr[77448]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 04:17:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:29.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:29.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:31 np0005593234 systemd[1]: Stopping OpenSSH server daemon...
Jan 23 04:17:31 np0005593234 systemd[1]: sshd.service: Deactivated successfully.
Jan 23 04:17:31 np0005593234 systemd[1]: Stopped OpenSSH server daemon.
Jan 23 04:17:31 np0005593234 systemd[1]: sshd.service: Consumed 2.950s CPU time, read 564.0K from disk, written 64.0K to disk.
Jan 23 04:17:31 np0005593234 systemd[1]: Stopped target sshd-keygen.target.
Jan 23 04:17:31 np0005593234 systemd[1]: Stopping sshd-keygen.target...
Jan 23 04:17:31 np0005593234 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 04:17:31 np0005593234 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 04:17:31 np0005593234 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 04:17:31 np0005593234 systemd[1]: Reached target sshd-keygen.target.
Jan 23 04:17:31 np0005593234 systemd[1]: Starting OpenSSH server daemon...
Jan 23 04:17:31 np0005593234 systemd[1]: Started OpenSSH server daemon.
Jan 23 04:17:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:31.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:31.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:32 np0005593234 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:17:32 np0005593234 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:17:33 np0005593234 systemd[1]: Reloading.
Jan 23 04:17:33 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:33 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:33 np0005593234 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:17:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:33.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:33.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 04:17:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:35.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 04:17:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:35.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 04:17:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:37.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 04:17:37 np0005593234 python3.9[175290]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:17:37 np0005593234 systemd[1]: Reloading.
Jan 23 04:17:37 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:37 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:37.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:38 np0005593234 python3.9[176570]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:17:38 np0005593234 systemd[1]: Reloading.
Jan 23 04:17:38 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:38 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:39.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 04:17:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:39.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 04:17:40 np0005593234 python3.9[177862]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:17:40 np0005593234 systemd[1]: Reloading.
Jan 23 04:17:40 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:40 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:41 np0005593234 python3.9[179174]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:17:41 np0005593234 systemd[1]: Reloading.
Jan 23 04:17:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:41 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:41 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:41.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:41 np0005593234 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:17:41 np0005593234 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:17:41 np0005593234 systemd[1]: man-db-cache-update.service: Consumed 10.885s CPU time.
Jan 23 04:17:41 np0005593234 systemd[1]: run-r761246820d7d4c2bad928c00fcea697e.service: Deactivated successfully.
Jan 23 04:17:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:41.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:17:42.790 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:17:42.792 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:17:42.793 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:17:42 np0005593234 python3.9[179885]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:42 np0005593234 systemd[1]: Reloading.
Jan 23 04:17:42 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:42 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:43.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:43.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:44 np0005593234 python3.9[180126]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:44 np0005593234 systemd[1]: Reloading.
Jan 23 04:17:44 np0005593234 podman[180128]: 2026-01-23 09:17:44.140345093 +0000 UTC m=+0.066743099 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:17:44 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:44 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:45 np0005593234 python3.9[180338]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:45 np0005593234 systemd[1]: Reloading.
Jan 23 04:17:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:45.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:45 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:45 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:45.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:46 np0005593234 python3.9[180527]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:47 np0005593234 python3.9[180683]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:47 np0005593234 systemd[1]: Reloading.
Jan 23 04:17:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:47.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:47 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:47 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:47.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:49 np0005593234 python3.9[180873]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 04:17:49 np0005593234 systemd[1]: Reloading.
Jan 23 04:17:49 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:17:49 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:17:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:49.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:49 np0005593234 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 23 04:17:49 np0005593234 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 23 04:17:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:49.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:50 np0005593234 python3.9[181067]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:51 np0005593234 python3.9[181222]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:51.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:51.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:52 np0005593234 python3.9[181378]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:52 np0005593234 python3.9[181533]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:53.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:53 np0005593234 python3.9[181689]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:53.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:54 np0005593234 python3.9[181844]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:54 np0005593234 podman[181846]: 2026-01-23 09:17:54.427691372 +0000 UTC m=+0.109580341 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 04:17:55 np0005593234 python3.9[182024]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000068s ======
Jan 23 04:17:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:55.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000068s
Jan 23 04:17:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:55.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:56 np0005593234 python3.9[182180]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:17:56 np0005593234 python3.9[182335]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:57.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:57 np0005593234 python3.9[182491]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:57.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:58 np0005593234 python3.9[182646]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:59 np0005593234 python3.9[182801]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:17:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:17:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:17:59.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:17:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:17:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:17:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:17:59.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:17:59 np0005593234 python3.9[182957]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:18:00 np0005593234 python3.9[183112]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 04:18:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:01.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:01.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:03 np0005593234 python3.9[183384]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:18:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:03.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:18:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:03.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:18:03 np0005593234 python3.9[183604]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:18:04 np0005593234 python3.9[183756]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:18:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:18:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:18:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:18:05 np0005593234 python3.9[183908]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:18:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:05.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:05 np0005593234 python3.9[184061]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:18:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:18:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:05.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:18:06 np0005593234 python3.9[184213]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:18:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:07 np0005593234 python3.9[184363]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:18:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:07.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:18:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:07.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:18:08 np0005593234 python3.9[184516]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:09 np0005593234 python3.9[184641]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159887.889678-1649-105094615319131/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:09.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:09 np0005593234 python3.9[184794]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:09.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:10 np0005593234 python3.9[184920]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159889.4247594-1649-272930670707556/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:10 np0005593234 python3.9[185121]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:18:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:18:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:11.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:11 np0005593234 python3.9[185247]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159890.5527887-1649-123773400309724/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:11.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:12 np0005593234 python3.9[185399]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:12 np0005593234 python3.9[185524]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159891.7227027-1649-178867463654497/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:13 np0005593234 python3.9[185676]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:13.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:13 np0005593234 python3.9[185802]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159892.8662217-1649-55466999500495/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:13.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:14 np0005593234 python3.9[185954]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:14 np0005593234 podman[186004]: 2026-01-23 09:18:14.780310705 +0000 UTC m=+0.062168715 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 04:18:15 np0005593234 python3.9[186096]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159894.0436418-1649-56230056433687/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:15.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:15 np0005593234 python3.9[186249]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:18:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:16.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:18:16 np0005593234 python3.9[186372]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159895.1976514-1649-25402324883474/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:16 np0005593234 python3.9[186526]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:17 np0005593234 python3.9[186652]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769159896.365621-1649-125440498754586/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:17.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 04:18:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:18.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 04:18:19 np0005593234 python3.9[186804]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 23 04:18:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 04:18:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:19.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 04:18:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:18:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:20.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:18:20 np0005593234 python3.9[186958]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:21 np0005593234 python3.9[187110]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:21.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:21 np0005593234 python3.9[187263]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:18:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:22.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:18:22 np0005593234 python3.9[187415]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:23 np0005593234 python3.9[187568]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:23.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:23 np0005593234 python3.9[187722]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:18:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:24.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:18:24 np0005593234 python3.9[187924]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:24 np0005593234 podman[188009]: 2026-01-23 09:18:24.805383028 +0000 UTC m=+0.094680944 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 04:18:25 np0005593234 python3.9[188102]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:25.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:26.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:26 np0005593234 python3.9[188255]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:26 np0005593234 python3.9[188407]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:27 np0005593234 python3.9[188560]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:18:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:27.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:18:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:28.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:28 np0005593234 python3.9[188712]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:28 np0005593234 python3.9[188864]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:29 np0005593234 python3.9[189016]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:29.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:30.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:30 np0005593234 python3.9[189169]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:31 np0005593234 python3.9[189292]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159910.1756117-2311-230276209069199/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:31.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:31 np0005593234 python3.9[189445]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:32.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:32 np0005593234 python3.9[189568]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159911.384758-2311-182731849420184/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:33 np0005593234 python3.9[189720]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:33.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:33 np0005593234 python3.9[189844]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159912.5194247-2311-7094409918144/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:34.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:34 np0005593234 python3.9[189996]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:34 np0005593234 python3.9[190119]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159913.769176-2311-229324518407814/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:35 np0005593234 python3.9[190272]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:35.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:35 np0005593234 python3.9[190395]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159914.9636776-2311-100731954860158/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:36.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:36 np0005593234 python3.9[190547]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:37 np0005593234 python3.9[190670]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159916.1561265-2311-10640631660675/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:37.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:37 np0005593234 python3.9[190823]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:18:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:38.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:18:38 np0005593234 python3.9[190946]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159917.3490288-2311-276398164653226/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:39 np0005593234 python3.9[191098]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:39.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:39 np0005593234 python3.9[191222]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159918.5369043-2311-12475020485372/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:18:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:40.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:18:40 np0005593234 python3.9[191374]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:40 np0005593234 python3.9[191497]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159919.7968357-2311-44541534793092/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:41 np0005593234 python3.9[191650]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:18:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:41.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:18:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:41 np0005593234 python3.9[191773]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159920.9754555-2311-182865260210397/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:18:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:42.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:18:42 np0005593234 python3.9[191925]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:18:42.792 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:18:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:18:42.793 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:18:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:18:42.794 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:18:43 np0005593234 python3.9[192048]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159922.0934412-2311-45228515775833/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:43.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:43 np0005593234 auditd[702]: Audit daemon rotating log files
Jan 23 04:18:43 np0005593234 python3.9[192201]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:18:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:44.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:18:44 np0005593234 python3.9[192374]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159923.3113415-2311-155213465179114/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:44 np0005593234 podman[192526]: 2026-01-23 09:18:44.942199152 +0000 UTC m=+0.072723841 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 04:18:45 np0005593234 python3.9[192527]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:45.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:45 np0005593234 python3.9[192669]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159924.5211086-2311-146162986821893/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:18:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:46.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:18:46 np0005593234 python3.9[192821]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:18:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:46 np0005593234 python3.9[192944]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159925.8110895-2311-139073707847684/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:47.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:48.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:48 np0005593234 python3.9[193095]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:18:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:18:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:49.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:18:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:50.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:50 np0005593234 python3.9[193251]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 23 04:18:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:51.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:52.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:52 np0005593234 dbus-broker-launch[772]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 23 04:18:52 np0005593234 python3.9[193408]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:52 np0005593234 python3.9[193560]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:18:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:53.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:18:53 np0005593234 python3.9[193713]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:18:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:54.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:18:54 np0005593234 python3.9[193865]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:54 np0005593234 python3.9[194017]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:55.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:55 np0005593234 podman[194142]: 2026-01-23 09:18:55.637371941 +0000 UTC m=+0.085491258 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 04:18:55 np0005593234 python3.9[194185]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:18:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:56.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:18:56 np0005593234 python3.9[194344]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:57 np0005593234 python3.9[194496]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:18:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:57.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:57 np0005593234 python3.9[194649]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:18:58.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:58 np0005593234 python3.9[194801]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:18:59 np0005593234 python3.9[194953]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:18:59 np0005593234 systemd[1]: Reloading.
Jan 23 04:18:59 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:18:59 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:18:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:18:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:18:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:18:59.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:18:59 np0005593234 systemd[1]: Starting libvirt logging daemon socket...
Jan 23 04:18:59 np0005593234 systemd[1]: Listening on libvirt logging daemon socket.
Jan 23 04:18:59 np0005593234 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 23 04:18:59 np0005593234 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 23 04:18:59 np0005593234 systemd[1]: Starting libvirt logging daemon...
Jan 23 04:18:59 np0005593234 systemd[1]: Started libvirt logging daemon.
Jan 23 04:19:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:00.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:00 np0005593234 python3.9[195146]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:19:00 np0005593234 systemd[1]: Reloading.
Jan 23 04:19:00 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:19:00 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:19:00 np0005593234 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 23 04:19:00 np0005593234 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 23 04:19:00 np0005593234 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 23 04:19:00 np0005593234 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 23 04:19:00 np0005593234 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 23 04:19:00 np0005593234 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 23 04:19:00 np0005593234 systemd[1]: Starting libvirt nodedev daemon...
Jan 23 04:19:00 np0005593234 systemd[1]: Started libvirt nodedev daemon.
Jan 23 04:19:01 np0005593234 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 23 04:19:01 np0005593234 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 23 04:19:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:01.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:01 np0005593234 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 23 04:19:01 np0005593234 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 23 04:19:01 np0005593234 python3.9[195363]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:19:01 np0005593234 systemd[1]: Reloading.
Jan 23 04:19:01 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:19:01 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:19:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:19:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:02.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:19:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:02 np0005593234 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 23 04:19:02 np0005593234 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 23 04:19:02 np0005593234 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 23 04:19:02 np0005593234 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 23 04:19:02 np0005593234 systemd[1]: Starting libvirt proxy daemon...
Jan 23 04:19:02 np0005593234 systemd[1]: Started libvirt proxy daemon.
Jan 23 04:19:02 np0005593234 setroubleshoot[195208]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 65529602-ddee-4f78-82a3-a790f76f526a
Jan 23 04:19:02 np0005593234 setroubleshoot[195208]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                    
                                                    *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                    
                                                    If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                    Then turn on full auditing to get path information about the offending file and generate the error again.
                                                    Do
                                                    
                                                    Turn on full auditing
                                                    # auditctl -w /etc/shadow -p w
                                                    Try to recreate AVC. Then execute
                                                    # ausearch -m avc -ts recent
                                                    If you see PATH record check ownership/permissions on file, and fix it,
                                                    otherwise report as a bugzilla.
                                                    
                                                    *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                    
                                                    If you believe that virtlogd should have the dac_read_search capability by default.
                                                    Then you should report this as a bug.
                                                    You can generate a local policy module to allow this access.
                                                    Do
                                                    allow this access for now by executing:
                                                    # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                    # semodule -X 300 -i my-virtlogd.pp
Jan 23 04:19:03 np0005593234 python3.9[195585]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:19:03 np0005593234 systemd[1]: Reloading.
Jan 23 04:19:03 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:19:03 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:19:03 np0005593234 systemd[1]: Listening on libvirt locking daemon socket.
Jan 23 04:19:03 np0005593234 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 23 04:19:03 np0005593234 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 23 04:19:03 np0005593234 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 23 04:19:03 np0005593234 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 23 04:19:03 np0005593234 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 23 04:19:03 np0005593234 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 23 04:19:03 np0005593234 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 23 04:19:03 np0005593234 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 23 04:19:03 np0005593234 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 23 04:19:03 np0005593234 systemd[1]: Starting libvirt QEMU daemon...
Jan 23 04:19:03 np0005593234 systemd[1]: Started libvirt QEMU daemon.
Jan 23 04:19:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:03.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:04.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:04 np0005593234 python3.9[195851]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:19:04 np0005593234 systemd[1]: Reloading.
Jan 23 04:19:04 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:19:04 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:19:04 np0005593234 systemd[1]: Starting libvirt secret daemon socket...
Jan 23 04:19:04 np0005593234 systemd[1]: Listening on libvirt secret daemon socket.
Jan 23 04:19:04 np0005593234 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 23 04:19:04 np0005593234 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 23 04:19:04 np0005593234 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 23 04:19:04 np0005593234 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 23 04:19:04 np0005593234 systemd[1]: Starting libvirt secret daemon...
Jan 23 04:19:04 np0005593234 systemd[1]: Started libvirt secret daemon.
Jan 23 04:19:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:05.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:05 np0005593234 python3.9[196063]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:06.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:06 np0005593234 python3.9[196215]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 04:19:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:07 np0005593234 python3.9[196367]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;#012echo ceph#012awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:19:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:07.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:08.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:08 np0005593234 python3.9[196522]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 04:19:09 np0005593234 python3.9[196672]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:09.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:09 np0005593234 python3.9[196794]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159948.6594243-3386-106188878358604/.source.xml follow=False _original_basename=secret.xml.j2 checksum=4390443d357de49206cd2f69bdb29495711c4544 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:10.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:10 np0005593234 python3.9[197008]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine e1533653-0a5a-584c-b34b-8689f0d32e77#012virsh secret-define --file /tmp/secret.xml#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:19:11 np0005593234 podman[197168]: 2026-01-23 09:19:11.429869085 +0000 UTC m=+0.361583089 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:19:11 np0005593234 podman[197168]: 2026-01-23 09:19:11.545117828 +0000 UTC m=+0.476831802 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Jan 23 04:19:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:11 np0005593234 python3.9[197293]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:11.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:12.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:12 np0005593234 podman[197564]: 2026-01-23 09:19:12.250474662 +0000 UTC m=+0.094286782 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:19:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:12 np0005593234 podman[197606]: 2026-01-23 09:19:12.35980208 +0000 UTC m=+0.091133584 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:19:12 np0005593234 podman[197564]: 2026-01-23 09:19:12.42447008 +0000 UTC m=+0.268282170 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:19:12 np0005593234 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 23 04:19:12 np0005593234 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.014s CPU time.
Jan 23 04:19:12 np0005593234 podman[197674]: 2026-01-23 09:19:12.756150459 +0000 UTC m=+0.154322558 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, vcs-type=git, version=2.2.4, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9)
Jan 23 04:19:12 np0005593234 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 23 04:19:12 np0005593234 podman[197747]: 2026-01-23 09:19:12.874798087 +0000 UTC m=+0.093365283 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, distribution-scope=public, io.openshift.tags=Ceph keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, version=2.2.4, architecture=x86_64, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, io.openshift.expose-services=, name=keepalived)
Jan 23 04:19:13 np0005593234 podman[197674]: 2026-01-23 09:19:13.125239052 +0000 UTC m=+0.523411141 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.28.2, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=2.2.4, name=keepalived)
Jan 23 04:19:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:13.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:14.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:14 np0005593234 python3.9[197994]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:14 np0005593234 python3.9[198260]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:15 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:19:15 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:19:15 np0005593234 podman[198372]: 2026-01-23 09:19:15.307728928 +0000 UTC m=+0.069307265 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 04:19:15 np0005593234 python3.9[198420]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159954.356684-3551-217801376295401/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:15.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:16.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:19:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:19:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:19:16 np0005593234 python3.9[198572]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:17 np0005593234 python3.9[198724]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:17.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:17 np0005593234 python3.9[198803]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:18.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:18 np0005593234 python3.9[198955]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:18 np0005593234 python3.9[199033]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.fwhuauf7 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:19.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:19 np0005593234 python3.9[199186]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:20.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:20 np0005593234 python3.9[199264]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:20 np0005593234 python3.9[199416]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:19:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:21.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:22.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:22 np0005593234 python3[199620]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 04:19:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:19:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:19:23 np0005593234 python3.9[199772]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:23.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:23 np0005593234 python3.9[199851]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:24.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:24 np0005593234 python3.9[200053]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.017255) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159965017349, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 1463, "num_deletes": 257, "total_data_size": 3554136, "memory_usage": 3584920, "flush_reason": "Manual Compaction"}
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159965045333, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 2337204, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15952, "largest_seqno": 17410, "table_properties": {"data_size": 2330955, "index_size": 3512, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12158, "raw_average_key_size": 18, "raw_value_size": 2318598, "raw_average_value_size": 3567, "num_data_blocks": 159, "num_entries": 650, "num_filter_entries": 650, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159830, "oldest_key_time": 1769159830, "file_creation_time": 1769159965, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 28144 microseconds, and 6088 cpu microseconds.
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.045419) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 2337204 bytes OK
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.045439) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.047643) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.047658) EVENT_LOG_v1 {"time_micros": 1769159965047653, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.047678) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 3547382, prev total WAL file size 3547382, number of live WAL files 2.
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.048622) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323534' seq:0, type:0; will stop at (end)
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(2282KB)], [30(7704KB)]
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159965048728, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 10226754, "oldest_snapshot_seqno": -1}
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4286 keys, 9889365 bytes, temperature: kUnknown
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159965112953, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 9889365, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9857621, "index_size": 19922, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 106754, "raw_average_key_size": 24, "raw_value_size": 9776828, "raw_average_value_size": 2281, "num_data_blocks": 833, "num_entries": 4286, "num_filter_entries": 4286, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769159965, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.113190) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 9889365 bytes
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.114162) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 159.0 rd, 153.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.2, 7.5 +0.0 blob) out(9.4 +0.0 blob), read-write-amplify(8.6) write-amplify(4.2) OK, records in: 4815, records dropped: 529 output_compression: NoCompression
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.114188) EVENT_LOG_v1 {"time_micros": 1769159965114180, "job": 16, "event": "compaction_finished", "compaction_time_micros": 64315, "compaction_time_cpu_micros": 26361, "output_level": 6, "num_output_files": 1, "total_output_size": 9889365, "num_input_records": 4815, "num_output_records": 4286, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159965114657, "job": 16, "event": "table_file_deletion", "file_number": 32}
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159965115930, "job": 16, "event": "table_file_deletion", "file_number": 30}
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.048486) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.116023) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.116030) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.116033) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.116035) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:25.116037) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:25.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:25 np0005593234 python3.9[200179]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159964.321119-3818-210344042076378/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:25 np0005593234 podman[200183]: 2026-01-23 09:19:25.813501623 +0000 UTC m=+0.111351935 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 04:19:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:26.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:26 np0005593234 python3.9[200358]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:27 np0005593234 python3.9[200436]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:27.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:27 np0005593234 python3.9[200589]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:28.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:28 np0005593234 python3.9[200667]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:29 np0005593234 python3.9[200819]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:29.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:29 np0005593234 python3.9[200945]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769159968.6606073-3934-205373784688186/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:30.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:30 np0005593234 python3.9[201097]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.881745) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159970881839, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 306, "num_deletes": 251, "total_data_size": 165554, "memory_usage": 171336, "flush_reason": "Manual Compaction"}
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159970887088, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 108869, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17415, "largest_seqno": 17716, "table_properties": {"data_size": 106898, "index_size": 199, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5054, "raw_average_key_size": 18, "raw_value_size": 103029, "raw_average_value_size": 374, "num_data_blocks": 9, "num_entries": 275, "num_filter_entries": 275, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159965, "oldest_key_time": 1769159965, "file_creation_time": 1769159970, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 5386 microseconds, and 1175 cpu microseconds.
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.887143) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 108869 bytes OK
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.887164) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.888875) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.888891) EVENT_LOG_v1 {"time_micros": 1769159970888885, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.888911) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 163341, prev total WAL file size 163341, number of live WAL files 2.
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.889320) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(106KB)], [33(9657KB)]
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159970889456, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 9998234, "oldest_snapshot_seqno": -1}
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4051 keys, 7956047 bytes, temperature: kUnknown
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159970941498, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 7956047, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7927552, "index_size": 17249, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10181, "raw_key_size": 102554, "raw_average_key_size": 25, "raw_value_size": 7852483, "raw_average_value_size": 1938, "num_data_blocks": 712, "num_entries": 4051, "num_filter_entries": 4051, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769159970, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.941747) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 7956047 bytes
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.949112) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 191.9 rd, 152.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 9.4 +0.0 blob) out(7.6 +0.0 blob), read-write-amplify(164.9) write-amplify(73.1) OK, records in: 4561, records dropped: 510 output_compression: NoCompression
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.949140) EVENT_LOG_v1 {"time_micros": 1769159970949127, "job": 18, "event": "compaction_finished", "compaction_time_micros": 52107, "compaction_time_cpu_micros": 18462, "output_level": 6, "num_output_files": 1, "total_output_size": 7956047, "num_input_records": 4561, "num_output_records": 4051, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159970949278, "job": 18, "event": "table_file_deletion", "file_number": 35}
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769159970951076, "job": 18, "event": "table_file_deletion", "file_number": 33}
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.889238) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.951141) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.951146) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.951148) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.951149) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:19:30.951151) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:19:31 np0005593234 python3.9[201250]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:19:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:31.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:32.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:32 np0005593234 python3.9[201405]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:33.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:33 np0005593234 python3.9[201558]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:19:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:34.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:34 np0005593234 python3.9[201711]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:19:35 np0005593234 python3.9[201866]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:19:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:35.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:36.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:36 np0005593234 python3.9[202021]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:37 np0005593234 python3.9[202173]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:37.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:37 np0005593234 python3.9[202297]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159976.7771726-4151-155406364285873/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:38.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:38 np0005593234 python3.9[202449]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:39 np0005593234 python3.9[202572]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159978.3372517-4196-273005187008271/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:39.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:40.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:40 np0005593234 python3.9[202725]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:19:40 np0005593234 python3.9[202848]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159979.8517704-4241-165156680828966/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:19:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:41.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:41 np0005593234 python3.9[203001]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:19:41 np0005593234 systemd[1]: Reloading.
Jan 23 04:19:42 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:19:42 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:19:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:42.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:42 np0005593234 systemd[1]: Reached target edpm_libvirt.target.
Jan 23 04:19:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:19:42.794 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:19:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:19:42.796 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:19:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:19:42.796 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:19:43 np0005593234 python3.9[203192]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 04:19:43 np0005593234 systemd[1]: Reloading.
Jan 23 04:19:43 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:19:43 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:19:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:43.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:43 np0005593234 systemd[1]: Reloading.
Jan 23 04:19:43 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:19:43 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:19:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:44.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:44 np0005593234 systemd-logind[794]: Session 48 logged out. Waiting for processes to exit.
Jan 23 04:19:44 np0005593234 systemd[1]: session-48.scope: Deactivated successfully.
Jan 23 04:19:44 np0005593234 systemd[1]: session-48.scope: Consumed 3min 34.071s CPU time.
Jan 23 04:19:44 np0005593234 systemd-logind[794]: Removed session 48.
Jan 23 04:19:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:45.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:45 np0005593234 podman[203341]: 2026-01-23 09:19:45.800646553 +0000 UTC m=+0.085291368 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:19:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:19:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:46.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:19:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:47.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:48.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:49.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:50.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:50 np0005593234 systemd-logind[794]: New session 49 of user zuul.
Jan 23 04:19:50 np0005593234 systemd[1]: Started Session 49 of User zuul.
Jan 23 04:19:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:51.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:51 np0005593234 python3.9[203516]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:19:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:19:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:52.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:19:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:53 np0005593234 python3.9[203670]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:19:53 np0005593234 network[203687]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:19:53 np0005593234 network[203688]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:19:53 np0005593234 network[203689]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:19:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:53.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:54.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:55.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:55 np0005593234 podman[203797]: 2026-01-23 09:19:55.955878849 +0000 UTC m=+0.095678550 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 04:19:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:56.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:19:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:57.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:19:58.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:58 np0005593234 python3.9[203991]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 04:19:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:19:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:19:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:19:59.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:19:59 np0005593234 python3.9[204076]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:20:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:00.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:01 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 04:20:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:01.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:02.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:03.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:04.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:05.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:06.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:06 np0005593234 python3.9[204282]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:20:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:07.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:08.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:09.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:10.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:11 np0005593234 python3.9[204438]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:20:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:11.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:12.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:13.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:20:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:14.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:20:15 np0005593234 python3.9[204593]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:20:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:15.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:15 np0005593234 python3.9[204746]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:20:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:16.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:16 np0005593234 podman[204871]: 2026-01-23 09:20:16.550242511 +0000 UTC m=+0.046713210 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 04:20:16 np0005593234 python3.9[204916]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:20:17 np0005593234 python3.9[205041]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769160016.2711625-248-35902709639364/.source.iscsi _original_basename=.ba7zy4tn follow=False checksum=b6f30c323a2c3901ba12760f2629922e0704dc43 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:17.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:18.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:18 np0005593234 python3.9[205193]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:19 np0005593234 python3.9[205346]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:19 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:20:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:19.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:19 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:20:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:20.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:20 np0005593234 python3.9[205499]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:20:20 np0005593234 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 23 04:20:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:21.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:22.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:22 np0005593234 python3.9[205783]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:20:23 np0005593234 systemd[1]: Reloading.
Jan 23 04:20:23 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:20:23 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:20:23 np0005593234 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 23 04:20:23 np0005593234 systemd[1]: Starting Open-iSCSI...
Jan 23 04:20:23 np0005593234 kernel: Loading iSCSI transport class v2.0-870.
Jan 23 04:20:23 np0005593234 systemd[1]: Started Open-iSCSI.
Jan 23 04:20:23 np0005593234 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 23 04:20:23 np0005593234 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 23 04:20:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:23.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:23 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:20:23 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:20:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:20:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:24.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:20:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:20:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:20:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:20:25 np0005593234 python3.9[206037]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:20:25 np0005593234 network[206054]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:20:25 np0005593234 network[206055]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:20:25 np0005593234 network[206056]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:20:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:25.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:26 np0005593234 podman[206064]: 2026-01-23 09:20:26.101358272 +0000 UTC m=+0.088492747 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 04:20:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:26.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:27.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:28.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:29.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:20:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:30.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:20:30 np0005593234 python3.9[206358]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:20:31 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:20:31 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:20:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:20:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:31.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:20:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:32.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:32 np0005593234 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:20:32 np0005593234 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:20:32 np0005593234 systemd[1]: Reloading.
Jan 23 04:20:32 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:20:32 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:20:33 np0005593234 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:20:33 np0005593234 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:20:33 np0005593234 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:20:33 np0005593234 systemd[1]: run-re9a460d028b6493eaa6d36e88c0e72bf.service: Deactivated successfully.
Jan 23 04:20:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:33.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:34.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:35 np0005593234 python3.9[206728]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 23 04:20:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:35.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:36.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:36 np0005593234 python3.9[206880]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 23 04:20:37 np0005593234 python3.9[207036]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:20:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:37.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:37 np0005593234 python3.9[207160]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769160036.8560915-512-43802559958320/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:38.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:38 np0005593234 python3.9[207312]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:39.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:40 np0005593234 python3.9[207465]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:20:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 04:20:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:40.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 04:20:40 np0005593234 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 04:20:40 np0005593234 systemd[1]: Stopped Load Kernel Modules.
Jan 23 04:20:40 np0005593234 systemd[1]: Stopping Load Kernel Modules...
Jan 23 04:20:40 np0005593234 systemd[1]: Starting Load Kernel Modules...
Jan 23 04:20:40 np0005593234 systemd[1]: Finished Load Kernel Modules.
Jan 23 04:20:41 np0005593234 python3.9[207621]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:20:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:41.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:41 np0005593234 python3.9[207775]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:20:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:42.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:42 np0005593234 python3.9[207927]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:20:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:20:42.795 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:20:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:20:42.797 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:20:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:20:42.797 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:20:43 np0005593234 python3.9[208050]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769160042.3053408-664-10940569511356/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:20:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:43.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:20:44 np0005593234 python3.9[208203]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:20:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:44.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:44 np0005593234 python3.9[208406]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:45.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:45 np0005593234 python3.9[208559]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:20:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:46.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:20:46 np0005593234 python3.9[208711]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:46 np0005593234 podman[208712]: 2026-01-23 09:20:46.760370101 +0000 UTC m=+0.053865163 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:20:47 np0005593234 python3.9[208882]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:47.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:48 np0005593234 python3.9[209034]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:48.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:48 np0005593234 python3.9[209186]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:49 np0005593234 python3.9[209338]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:49.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:50 np0005593234 python3.9[209491]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:20:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:20:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:50.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:20:50 np0005593234 python3.9[209645]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:20:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:51.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:52 np0005593234 python3.9[209799]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:20:52 np0005593234 systemd[1]: Listening on multipathd control socket.
Jan 23 04:20:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:52.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:53 np0005593234 python3.9[209955]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:20:53 np0005593234 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 23 04:20:53 np0005593234 udevadm[209960]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 23 04:20:53 np0005593234 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 23 04:20:53 np0005593234 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 23 04:20:53 np0005593234 multipathd[209964]: --------start up--------
Jan 23 04:20:53 np0005593234 multipathd[209964]: read /etc/multipath.conf
Jan 23 04:20:53 np0005593234 multipathd[209964]: path checkers start up
Jan 23 04:20:53 np0005593234 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 23 04:20:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:53.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:20:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:54.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:20:54 np0005593234 python3.9[210124]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 23 04:20:55 np0005593234 python3.9[210276]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 23 04:20:55 np0005593234 kernel: Key type psk registered
Jan 23 04:20:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:55.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:55 np0005593234 python3.9[210441]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:20:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:20:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:56.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:20:56 np0005593234 podman[210536]: 2026-01-23 09:20:56.425685851 +0000 UTC m=+0.124101583 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 04:20:56 np0005593234 python3.9[210580]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769160055.4436216-1055-53663946945556/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:57 np0005593234 python3.9[210742]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:20:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:20:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:57.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:20:58.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:20:58 np0005593234 python3.9[210894]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:20:58 np0005593234 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 04:20:58 np0005593234 systemd[1]: Stopped Load Kernel Modules.
Jan 23 04:20:58 np0005593234 systemd[1]: Stopping Load Kernel Modules...
Jan 23 04:20:58 np0005593234 systemd[1]: Starting Load Kernel Modules...
Jan 23 04:20:58 np0005593234 systemd[1]: Finished Load Kernel Modules.
Jan 23 04:20:59 np0005593234 python3.9[211050]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 04:20:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:20:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:20:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:20:59.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:00.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:00 np0005593234 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 23 04:21:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:01.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:02.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:02 np0005593234 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 23 04:21:02 np0005593234 systemd[1]: Reloading.
Jan 23 04:21:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:02 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:21:02 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:21:02 np0005593234 systemd[1]: Reloading.
Jan 23 04:21:02 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:21:02 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:21:03 np0005593234 systemd-logind[794]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 04:21:03 np0005593234 systemd-logind[794]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 23 04:21:03 np0005593234 lvm[211168]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 04:21:03 np0005593234 lvm[211168]: VG ceph_vg0 finished
Jan 23 04:21:03 np0005593234 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 04:21:03 np0005593234 systemd[1]: Starting man-db-cache-update.service...
Jan 23 04:21:03 np0005593234 systemd[1]: Reloading.
Jan 23 04:21:03 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:21:03 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:21:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:21:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:03.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:21:03 np0005593234 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 04:21:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:04.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:04 np0005593234 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 04:21:04 np0005593234 systemd[1]: Finished man-db-cache-update.service.
Jan 23 04:21:04 np0005593234 systemd[1]: man-db-cache-update.service: Consumed 1.578s CPU time.
Jan 23 04:21:04 np0005593234 systemd[1]: run-re0ab6d95363c44a9a6c31fc783ebc47c.service: Deactivated successfully.
Jan 23 04:21:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:05.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:06.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:06 np0005593234 python3.9[212571]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:21:06 np0005593234 systemd[1]: Stopping Open-iSCSI...
Jan 23 04:21:06 np0005593234 iscsid[205829]: iscsid shutting down.
Jan 23 04:21:06 np0005593234 systemd[1]: iscsid.service: Deactivated successfully.
Jan 23 04:21:06 np0005593234 systemd[1]: Stopped Open-iSCSI.
Jan 23 04:21:06 np0005593234 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 23 04:21:06 np0005593234 systemd[1]: Starting Open-iSCSI...
Jan 23 04:21:06 np0005593234 systemd[1]: Started Open-iSCSI.
Jan 23 04:21:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:07 np0005593234 python3.9[212728]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:21:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:07.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:08.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:08 np0005593234 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 23 04:21:08 np0005593234 multipathd[209964]: exit (signal)
Jan 23 04:21:08 np0005593234 multipathd[209964]: --------shut down-------
Jan 23 04:21:08 np0005593234 systemd[1]: multipathd.service: Deactivated successfully.
Jan 23 04:21:08 np0005593234 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 23 04:21:08 np0005593234 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 23 04:21:08 np0005593234 multipathd[212734]: --------start up--------
Jan 23 04:21:08 np0005593234 multipathd[212734]: read /etc/multipath.conf
Jan 23 04:21:08 np0005593234 multipathd[212734]: path checkers start up
Jan 23 04:21:08 np0005593234 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 23 04:21:09 np0005593234 python3.9[212891]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 04:21:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:09.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:10.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:10 np0005593234 python3.9[213048]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:11.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:11 np0005593234 python3.9[213201]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:21:11 np0005593234 systemd[1]: Reloading.
Jan 23 04:21:11 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:21:11 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:21:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:12.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:12 np0005593234 python3.9[213386]: ansible-ansible.builtin.service_facts Invoked
Jan 23 04:21:12 np0005593234 network[213403]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 04:21:12 np0005593234 network[213404]: 'network-scripts' will be removed from distribution in near future.
Jan 23 04:21:12 np0005593234 network[213405]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 04:21:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:13.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:13 np0005593234 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 23 04:21:13 np0005593234 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 04:21:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:14.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:15.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:16.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:17 np0005593234 podman[213655]: 2026-01-23 09:21:17.557481091 +0000 UTC m=+0.055536806 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 04:21:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:17.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:17 np0005593234 python3.9[213698]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:18.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:18 np0005593234 python3.9[213851]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:19 np0005593234 python3.9[214004]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:19.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:19 np0005593234 python3.9[214158]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:20.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:20 np0005593234 python3.9[214311]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:21 np0005593234 python3.9[214464]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:21.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:22 np0005593234 python3.9[214618]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:22.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:22 np0005593234 python3.9[214771]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:21:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:23.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:24 np0005593234 python3.9[214925]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:24.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:24 np0005593234 python3.9[215077]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:25 np0005593234 python3.9[215279]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:25.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:25 np0005593234 python3.9[215432]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:26.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:26 np0005593234 python3.9[215584]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:26 np0005593234 podman[215632]: 2026-01-23 09:21:26.799912657 +0000 UTC m=+0.092364333 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 23 04:21:27 np0005593234 python3.9[215760]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:27.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:27 np0005593234 python3.9[215913]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:28.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:28 np0005593234 python3.9[216065]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:29 np0005593234 python3.9[216218]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:29.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:30 np0005593234 python3.9[216370]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:30.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:30 np0005593234 python3.9[216522]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:31 np0005593234 python3.9[216674]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:31.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:32 np0005593234 python3.9[216929]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:32.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:32 np0005593234 python3.9[217110]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:21:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:21:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:21:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:21:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:21:33 np0005593234 python3.9[217262]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:33.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:33 np0005593234 python3.9[217415]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:21:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:34.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:35 np0005593234 python3.9[217567]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:35.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:36 np0005593234 python3.9[217720]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 04:21:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:36.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:37 np0005593234 python3.9[217872]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:21:37 np0005593234 systemd[1]: Reloading.
Jan 23 04:21:37 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:21:37 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:21:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:37.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:38.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:38 np0005593234 python3.9[218060]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:39 np0005593234 python3.9[218213]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:39.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:39 np0005593234 python3.9[218417]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:21:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:21:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:40.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:40 np0005593234 python3.9[218570]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:41 np0005593234 python3.9[218723]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:41.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:41 np0005593234 python3.9[218877]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:42.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:42 np0005593234 python3.9[219030]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:21:42.796 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:21:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:21:42.798 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:21:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:21:42.798 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:21:43 np0005593234 python3.9[219183]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 04:21:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:43.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:44.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:45 np0005593234 python3.9[219388]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:45.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:46 np0005593234 python3.9[219540]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:46.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:46 np0005593234 python3.9[219692]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:47 np0005593234 python3.9[219845]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:47 np0005593234 podman[219846]: 2026-01-23 09:21:47.709170793 +0000 UTC m=+0.060688006 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:21:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:47.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:48.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:48 np0005593234 python3.9[220017]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:49 np0005593234 python3.9[220169]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:49.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:49 np0005593234 python3.9[220322]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:50.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:50 np0005593234 python3.9[220474]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:51 np0005593234 python3.9[220626]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:51 np0005593234 python3.9[220779]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:21:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:51.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:52.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:53.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:54.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:21:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.0 total, 600.0 interval#012Cumulative writes: 3509 writes, 18K keys, 3509 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.03 MB/s#012Cumulative WAL: 3509 writes, 3509 syncs, 1.00 writes per sync, written: 0.04 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1374 writes, 6591 keys, 1374 commit groups, 1.0 writes per commit group, ingest: 14.75 MB, 0.02 MB/s#012Interval WAL: 1374 writes, 1374 syncs, 1.00 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     41.3      0.51              0.07         9    0.057       0      0       0.0       0.0#012  L6      1/0    7.59 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.1     71.2     59.1      1.11              0.45         8    0.139     35K   4336       0.0       0.0#012 Sum      1/0    7.59 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.1     48.8     53.5      1.62              0.51        17    0.095     35K   4336       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   5.8     71.0     70.0      0.59              0.37         8    0.074     19K   2541       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0     71.2     59.1      1.11              0.45         8    0.139     35K   4336       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     41.6      0.51              0.07         8    0.063       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.0 total, 600.0 interval#012Flush(GB): cumulative 0.021, interval 0.007#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.08 GB write, 0.07 MB/s write, 0.08 GB read, 0.07 MB/s read, 1.6 seconds#012Interval compaction: 0.04 GB write, 0.07 MB/s write, 0.04 GB read, 0.07 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f10fc171f0#2 capacity: 308.00 MB usage: 4.88 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000142 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(266,4.57 MB,1.48283%) FilterBlock(17,107.73 KB,0.0341589%) IndexBlock(17,214.94 KB,0.0681493%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 04:21:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:55.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:56.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:21:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:57.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:57 np0005593234 podman[220908]: 2026-01-23 09:21:57.772413577 +0000 UTC m=+0.081430527 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 04:21:57 np0005593234 python3.9[220952]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 23 04:21:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:21:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:21:58.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:21:58 np0005593234 python3.9[221115]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 04:21:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:21:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:21:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:21:59.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:21:59 np0005593234 python3.9[221274]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 04:22:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:22:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:00.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:22:01 np0005593234 systemd-logind[794]: New session 50 of user zuul.
Jan 23 04:22:01 np0005593234 systemd[1]: Started Session 50 of User zuul.
Jan 23 04:22:01 np0005593234 systemd[1]: session-50.scope: Deactivated successfully.
Jan 23 04:22:01 np0005593234 systemd-logind[794]: Session 50 logged out. Waiting for processes to exit.
Jan 23 04:22:01 np0005593234 systemd-logind[794]: Removed session 50.
Jan 23 04:22:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:01.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:02 np0005593234 python3.9[221461]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:02.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:02 np0005593234 python3.9[221582]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769160121.688353-2662-173145973531775/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:03 np0005593234 python3.9[221732]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:03 np0005593234 python3.9[221809]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:03.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:22:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:04.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:22:04 np0005593234 python3.9[221959]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:04 np0005593234 python3.9[222080]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769160123.7495456-2662-6157013150834/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:05 np0005593234 python3.9[222281]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:05.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:06 np0005593234 python3.9[222402]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769160125.0584438-2662-139350924104630/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:06.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:06 np0005593234 python3.9[222552]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:07 np0005593234 python3.9[222673]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769160126.193198-2662-106374979916172/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:07 np0005593234 python3.9[222824]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:07.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:08 np0005593234 python3.9[222945]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769160127.2939456-2662-147828370027974/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:08.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:09.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:09 np0005593234 python3.9[223098]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:22:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:10.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:10 np0005593234 python3.9[223250]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:22:11 np0005593234 python3.9[223402]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:22:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:11.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:12 np0005593234 python3.9[223555]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:22:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:12.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:22:12 np0005593234 python3.9[223678]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769160131.561517-2984-258150853659938/.source _original_basename=.m94sye4_ follow=False checksum=7bc84da85246523a754652f4b5895b38f33249c2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 23 04:22:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:13 np0005593234 python3.9[223831]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:22:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:13.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:22:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:14.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:22:14 np0005593234 python3.9[223983]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 23 04:22:14 np0005593234 python3.9[224104]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769160133.9692087-3061-157265377462572/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:15 np0005593234 python3.9[224255]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 04:22:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:15.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:16 np0005593234 python3.9[224376]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769160135.1982396-3107-143167399479632/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 04:22:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:16.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:17 np0005593234 python3.9[224529]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 23 04:22:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:17.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:18.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:18 np0005593234 podman[224653]: 2026-01-23 09:22:18.473664612 +0000 UTC m=+0.062854072 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:22:18 np0005593234 python3.9[224699]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 04:22:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:19.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:19 np0005593234 python3[224853]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 04:22:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:20.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:21.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:22.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:23.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:22:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:24.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:22:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:25.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:26.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:22:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:27.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:22:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:22:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:28.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:22:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:22:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1200.5 total, 600.0 interval#012Cumulative writes: 5824 writes, 24K keys, 5824 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s#012Cumulative WAL: 5824 writes, 1000 syncs, 5.82 writes per sync, written: 0.02 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 447 writes, 670 keys, 447 commit groups, 1.0 writes per commit group, ingest: 0.22 MB, 0.00 MB/s#012Interval WAL: 447 writes, 218 syncs, 2.05 writes per sync, written: 0.00 GB, 0.00 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.31              0.00         1    0.311       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.31              0.00         1    0.311       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.31              0.00         1    0.311       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.5 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.3 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bc37c4b350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) 
Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.5 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bc37c4b350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 
collections: 3 last_copies: 8 last_secs: 5.6e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1200.5 total, 600.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 
0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_
Jan 23 04:22:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:29.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:30.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:31.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:32 np0005593234 podman[224992]: 2026-01-23 09:22:32.18714313 +0000 UTC m=+3.458664039 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 23 04:22:32 np0005593234 podman[224868]: 2026-01-23 09:22:32.202616292 +0000 UTC m=+12.180303152 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 23 04:22:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:32.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:32 np0005593234 podman[225043]: 2026-01-23 09:22:32.373344984 +0000 UTC m=+0.055837980 container create 25e6010f79dac3dc6ca9f8ed46c4472a8b1a70d3dc529d961ab1f6ba09af5f57 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, container_name=nova_compute_init)
Jan 23 04:22:32 np0005593234 podman[225043]: 2026-01-23 09:22:32.341075422 +0000 UTC m=+0.023568408 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 23 04:22:32 np0005593234 python3[224853]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 23 04:22:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:33.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:34.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:34 np0005593234 python3.9[225234]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:22:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:35.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:36 np0005593234 python3.9[225389]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 23 04:22:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:22:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:36.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:22:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:37.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:38 np0005593234 python3.9[225541]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 04:22:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:38.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:39 np0005593234 python3[225694]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 04:22:39 np0005593234 podman[225731]: 2026-01-23 09:22:39.27525728 +0000 UTC m=+0.047505985 container create 0a17c213f6c333c7810a664505c998196fb95a30b87638b18761d474476df8f2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 04:22:39 np0005593234 podman[225731]: 2026-01-23 09:22:39.25319287 +0000 UTC m=+0.025441605 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Jan 23 04:22:39 np0005593234 python3[225694]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Jan 23 04:22:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:39.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:40 np0005593234 python3.9[226115]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:22:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:22:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:40.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:22:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:22:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:22:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:22:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:22:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:22:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:22:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:22:41 np0005593234 python3.9[226325]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:22:41 np0005593234 python3.9[226477]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769160161.090425-3393-156599525256597/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 04:22:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:41.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:42.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:22:42.797 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:22:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:22:42.799 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:22:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:22:42.799 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:22:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:43 np0005593234 python3.9[226553]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 04:22:43 np0005593234 systemd[1]: Reloading.
Jan 23 04:22:43 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:22:43 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:22:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:43.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:44.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:44 np0005593234 python3.9[226666]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 04:22:44 np0005593234 systemd[1]: Reloading.
Jan 23 04:22:44 np0005593234 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 04:22:44 np0005593234 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 04:22:44 np0005593234 systemd[1]: Starting nova_compute container...
Jan 23 04:22:44 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:22:44 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eddfc90c85d206deba6da23b94c7163b279c514685f6aee83992431ae7af4ea7/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:44 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eddfc90c85d206deba6da23b94c7163b279c514685f6aee83992431ae7af4ea7/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:44 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eddfc90c85d206deba6da23b94c7163b279c514685f6aee83992431ae7af4ea7/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:44 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eddfc90c85d206deba6da23b94c7163b279c514685f6aee83992431ae7af4ea7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:44 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eddfc90c85d206deba6da23b94c7163b279c514685f6aee83992431ae7af4ea7/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:44 np0005593234 podman[226706]: 2026-01-23 09:22:44.906081712 +0000 UTC m=+0.110135152 container init 0a17c213f6c333c7810a664505c998196fb95a30b87638b18761d474476df8f2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, container_name=nova_compute, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm)
Jan 23 04:22:44 np0005593234 podman[226706]: 2026-01-23 09:22:44.912833952 +0000 UTC m=+0.116887382 container start 0a17c213f6c333c7810a664505c998196fb95a30b87638b18761d474476df8f2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=nova_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.build-date=20251202)
Jan 23 04:22:44 np0005593234 podman[226706]: nova_compute
Jan 23 04:22:44 np0005593234 nova_compute[226721]: + sudo -E kolla_set_configs
Jan 23 04:22:44 np0005593234 systemd[1]: Started nova_compute container.
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Validating config file
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Copying service configuration files
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Deleting /etc/ceph
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Creating directory /etc/ceph
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /etc/ceph
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Writing out command to execute
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:45 np0005593234 nova_compute[226721]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 04:22:45 np0005593234 nova_compute[226721]: ++ cat /run_command
Jan 23 04:22:45 np0005593234 nova_compute[226721]: + CMD=nova-compute
Jan 23 04:22:45 np0005593234 nova_compute[226721]: + ARGS=
Jan 23 04:22:45 np0005593234 nova_compute[226721]: + sudo kolla_copy_cacerts
Jan 23 04:22:45 np0005593234 nova_compute[226721]: + [[ ! -n '' ]]
Jan 23 04:22:45 np0005593234 nova_compute[226721]: + . kolla_extend_start
Jan 23 04:22:45 np0005593234 nova_compute[226721]: + echo 'Running command: '\''nova-compute'\'''
Jan 23 04:22:45 np0005593234 nova_compute[226721]: Running command: 'nova-compute'
Jan 23 04:22:45 np0005593234 nova_compute[226721]: + umask 0022
Jan 23 04:22:45 np0005593234 nova_compute[226721]: + exec nova-compute
Jan 23 04:22:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:22:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:45.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:22:46 np0005593234 python3.9[226933]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:22:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:22:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:46.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:22:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:47.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:47 np0005593234 python3.9[227085]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:22:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:22:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:48.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:22:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:22:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:22:48 np0005593234 podman[227258]: 2026-01-23 09:22:48.799678001 +0000 UTC m=+0.089379323 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Jan 23 04:22:48 np0005593234 python3.9[227303]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.004 226725 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.005 226725 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.006 226725 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.006 226725 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.200 226725 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.222 226725 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.223 226725 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.726 226725 INFO nova.virt.driver [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 23 04:22:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:49.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.964 226725 INFO nova.compute.provider_config [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.975 226725 DEBUG oslo_concurrency.lockutils [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.975 226725 DEBUG oslo_concurrency.lockutils [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.976 226725 DEBUG oslo_concurrency.lockutils [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.977 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.977 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.977 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.977 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.977 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.978 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.978 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.978 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.978 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.978 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.978 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.979 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.979 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.979 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.979 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.979 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.980 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.980 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.980 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.980 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.980 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.980 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.980 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.981 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.981 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.981 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.981 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.981 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.981 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.982 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.982 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.982 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.982 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.982 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.982 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.983 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.983 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.983 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.983 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.983 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.983 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.984 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.984 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.984 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.984 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.984 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.984 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.985 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.985 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.985 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.985 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.986 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.986 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.986 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.986 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.986 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.986 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.986 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.987 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.987 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.987 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.987 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.987 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.988 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.988 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.988 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.988 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.988 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.988 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.989 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.989 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.989 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.989 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.989 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.989 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.989 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.990 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.990 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.990 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.990 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.990 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.990 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.990 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.991 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.991 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.991 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.991 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.991 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.991 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.991 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.992 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.992 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.992 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.992 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.992 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.992 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.993 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.993 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.993 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.993 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.993 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.993 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.994 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.994 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.994 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.994 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.995 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.995 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.995 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.995 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.995 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.995 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.996 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.996 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.996 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.996 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.996 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.997 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.997 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.997 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.997 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.997 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.998 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.998 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.998 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.998 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.998 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.999 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.999 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.999 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.999 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:49 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.999 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:49.999 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.000 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.000 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.000 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.000 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.000 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.000 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.001 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.001 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.001 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.001 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.001 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.001 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.002 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.002 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.002 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.002 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.002 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.002 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.003 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.003 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.003 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.003 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.003 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.003 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.004 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.004 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.004 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.004 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.004 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.004 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.005 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.005 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.005 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.005 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.005 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.005 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.005 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.006 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.006 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.006 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.006 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.006 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.006 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.007 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.007 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.007 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.007 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.007 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.008 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.008 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.008 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.008 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.008 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.008 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.009 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.009 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.009 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.009 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.009 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.009 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.010 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.010 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.010 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.010 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.010 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.010 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.011 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.011 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.011 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.011 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.011 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.011 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.012 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.012 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.012 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.012 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.012 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.012 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.013 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.013 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.013 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.013 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.013 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.014 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.014 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.014 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.014 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.014 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.014 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.015 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.015 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.015 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.015 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.015 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.015 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.016 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.016 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.016 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.016 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.016 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.016 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.016 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.017 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.017 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.017 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.017 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.017 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.018 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.018 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.018 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.018 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.018 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.019 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.019 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.019 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.019 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.019 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.019 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.020 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.020 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.020 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.020 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.020 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.020 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.020 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.020 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.021 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.021 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.021 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.021 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.021 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.021 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.022 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.022 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.022 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.022 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.022 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.023 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.023 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.023 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.023 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.023 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.023 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.024 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.024 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.024 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.024 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.024 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.024 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.025 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.025 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.025 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.025 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.025 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.025 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.025 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.026 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.026 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.026 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.026 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.026 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.026 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.027 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.027 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.027 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.027 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.027 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.027 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.028 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.028 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.028 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.028 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.028 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.028 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.029 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.029 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.029 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.029 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.029 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.029 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.029 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.030 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.030 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.030 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.030 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.030 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.031 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.031 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.031 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.031 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.031 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.031 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.032 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.032 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.032 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.032 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.032 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.032 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.033 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.033 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.033 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.033 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.033 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.034 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.034 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.034 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.034 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.034 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.035 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.035 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.035 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.035 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.035 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.035 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.035 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.036 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.036 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.036 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.036 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.036 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.036 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.037 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.037 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.037 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.037 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.038 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.038 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.038 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.038 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.038 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.038 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.039 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.039 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.039 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.039 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.039 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.039 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.040 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.040 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.040 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.040 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.040 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.040 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.041 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.041 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.041 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.041 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.041 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.042 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.042 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.042 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.042 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.042 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.043 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.043 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.043 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.043 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.043 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.044 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.044 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.044 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.044 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.044 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.044 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.045 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.045 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.045 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.045 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.045 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.046 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.046 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.046 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.046 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.046 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.046 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.047 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.047 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.047 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.047 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.047 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.047 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.048 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.048 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.048 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.048 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.048 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.048 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.048 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.049 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.049 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.049 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.049 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.049 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.050 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.050 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.050 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.050 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.050 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.050 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.050 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.051 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.051 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.051 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.051 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.051 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.051 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.052 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.052 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.052 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.052 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.052 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.052 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.052 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.053 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.053 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.053 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.053 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.053 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.053 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.054 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.054 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.054 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.054 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.054 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.054 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.054 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.055 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.055 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.055 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.055 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.055 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.055 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.055 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.056 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.056 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.056 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.056 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.056 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.056 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.056 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.057 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.057 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.057 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.057 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.057 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.057 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.058 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.058 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.058 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.058 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.058 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.058 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.058 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.059 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.059 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.059 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.059 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.059 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.060 226725 WARNING oslo_config.cfg [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 23 04:22:50 np0005593234 nova_compute[226721]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 23 04:22:50 np0005593234 nova_compute[226721]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 23 04:22:50 np0005593234 nova_compute[226721]: and ``live_migration_inbound_addr`` respectively.
Jan 23 04:22:50 np0005593234 nova_compute[226721]: ).  Its value may be silently ignored in the future.#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.060 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
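[editor's note] The warning above flags `live_migration_uri` as deprecated in favor of `live_migration_scheme` and `live_migration_inbound_addr`. As a hedged sketch only, the logged value `qemu+tls://%s/system` maps onto the replacement options roughly as below; the address value is deployment-specific and shown purely as a placeholder, not taken from this log:

```ini
# nova.conf — illustrative sketch, not the deployed configuration
[libvirt]
# replaces the "qemu+tls" scheme portion of live_migration_uri
live_migration_scheme = tls
# replaces the "%s" host portion; set to this compute node's
# migration-network address (placeholder below)
live_migration_inbound_addr = <migration-network-ip>
```

Note that `live_migration_scheme = None` and `live_migration_inbound_addr = None` in the surrounding dump confirm the deployment still relies on the deprecated option.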
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.060 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.060 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.060 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.060 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.061 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.061 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.061 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.061 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.061 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.062 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.062 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.062 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.062 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.062 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.062 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.063 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.063 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.063 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.rbd_secret_uuid        = e1533653-0a5a-584c-b34b-8689f0d32e77 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.063 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.063 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.063 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.064 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.064 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.064 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.064 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.064 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.064 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.065 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.065 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.065 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.065 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.065 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.065 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.066 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.066 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.066 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.066 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.066 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.066 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.067 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.067 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.067 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.067 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.067 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.067 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.068 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.068 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.068 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.068 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.068 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.068 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.069 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.069 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.069 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.069 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.069 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.070 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.070 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.070 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.070 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.070 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.071 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.071 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.071 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.071 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.071 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.071 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.071 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.072 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.072 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.072 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.072 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.072 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.072 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.073 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.073 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.073 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.073 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.073 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.073 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.073 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.074 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.074 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.074 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.074 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.074 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.074 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.074 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.075 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.075 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.075 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.075 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.075 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.075 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.076 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.076 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.076 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.076 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.076 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.076 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.076 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.077 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.077 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.077 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.077 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.077 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.077 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.077 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.078 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.078 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.078 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.078 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.078 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.078 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.078 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.079 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.079 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.079 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.079 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.079 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.079 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.079 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.079 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.080 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.080 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.080 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.080 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.080 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.081 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.081 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.081 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.081 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.081 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.081 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.082 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.082 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.082 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.082 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.082 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.083 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.083 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.083 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.083 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.083 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.083 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.084 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.084 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.084 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.084 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.084 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.084 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.085 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.085 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.085 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.085 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.085 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.085 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.086 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.086 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.086 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.086 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.086 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.086 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.086 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.087 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.087 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.087 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.087 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.087 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.087 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.088 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.088 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.088 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.088 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.088 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.088 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.088 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.089 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.089 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.089 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.089 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.089 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.090 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.090 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.090 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.090 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.090 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.091 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.091 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.091 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.091 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.091 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.092 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.092 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.092 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.092 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.092 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.093 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.093 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.093 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.093 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.093 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.093 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.093 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.094 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.094 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.094 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.094 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.094 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.094 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.094 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.095 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.095 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.095 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.095 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.095 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.095 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.095 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.096 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.096 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.096 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.096 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.096 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.096 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.097 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.097 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.097 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.097 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.097 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.097 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.098 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.098 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.098 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.098 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.098 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.099 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.099 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.099 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.099 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.099 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.100 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.100 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.100 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.100 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.100 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.100 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.101 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.101 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.101 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.101 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.102 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.102 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.102 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.102 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.102 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.102 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.103 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.103 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.103 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.103 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.103 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.104 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.104 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.104 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.104 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.104 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.105 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.105 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.105 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.105 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.105 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.105 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.106 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.106 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.106 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.106 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.106 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.106 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.107 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.107 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.107 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.107 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.107 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.107 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.108 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.108 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.108 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.108 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.108 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.109 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.109 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.109 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.109 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.109 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.109 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.110 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.110 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.110 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.110 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.110 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.110 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.111 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.111 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.111 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.111 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.111 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.111 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.112 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.112 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.112 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.112 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.112 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.112 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.113 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.113 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.113 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.113 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.113 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.113 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.114 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.114 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.114 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.114 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.114 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.114 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.114 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.115 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.115 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.115 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.115 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.115 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.115 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.115 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.116 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.116 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.116 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.116 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.116 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.116 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.116 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.117 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.117 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.117 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.117 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.117 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.117 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.118 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.118 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.118 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.118 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.118 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.118 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.119 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.119 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.119 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.119 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.119 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.119 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.120 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.120 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.120 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.120 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.120 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.120 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.121 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.121 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.121 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.121 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.121 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.121 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.122 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.122 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.122 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.122 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.122 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.122 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.123 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.123 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.123 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.123 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.123 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.124 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.124 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.124 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.124 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.124 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.124 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.125 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.125 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.125 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.125 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.125 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.126 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.126 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.126 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.126 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.126 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.127 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.127 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.127 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.127 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.127 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.128 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.128 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.128 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.128 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.128 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.128 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.129 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.129 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.129 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.129 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.129 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.129 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.130 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.130 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.130 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.130 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.130 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.130 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.131 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.131 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.131 226725 DEBUG oslo_service.service [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.132 226725 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.150 226725 DEBUG nova.virt.libvirt.host [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.151 226725 DEBUG nova.virt.libvirt.host [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.151 226725 DEBUG nova.virt.libvirt.host [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.151 226725 DEBUG nova.virt.libvirt.host [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 23 04:22:50 np0005593234 systemd[1]: Starting libvirt QEMU daemon...
Jan 23 04:22:50 np0005593234 python3.9[227461]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 23 04:22:50 np0005593234 systemd[1]: Started libvirt QEMU daemon.
Jan 23 04:22:50 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:22:50 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.254 226725 DEBUG nova.virt.libvirt.host [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fb39d66b490> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.257 226725 DEBUG nova.virt.libvirt.host [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fb39d66b490> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.257 226725 INFO nova.virt.libvirt.driver [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.297 226725 WARNING nova.virt.libvirt.driver [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 23 04:22:50 np0005593234 nova_compute[226721]: 2026-01-23 09:22:50.299 226725 DEBUG nova.virt.libvirt.volume.mount [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Jan 23 04:22:50 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:22:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:50.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:51 np0005593234 python3.9[227695]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 04:22:51 np0005593234 nova_compute[226721]: 2026-01-23 09:22:51.255 226725 INFO nova.virt.libvirt.host [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Libvirt host capabilities <capabilities>
Jan 23 04:22:51 np0005593234 nova_compute[226721]: 
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <host>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <uuid>3e200bf7-7634-42a0-8184-2372f58672f7</uuid>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <cpu>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <arch>x86_64</arch>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model>EPYC-Rome-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <vendor>AMD</vendor>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <microcode version='16777317'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <signature family='23' model='49' stepping='0'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='x2apic'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='tsc-deadline'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='osxsave'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='hypervisor'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='tsc_adjust'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='spec-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='stibp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='arch-capabilities'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='ssbd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='cmp_legacy'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='topoext'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='virt-ssbd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='lbrv'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='tsc-scale'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='vmcb-clean'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='pause-filter'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='pfthreshold'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='svme-addr-chk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='rdctl-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='skip-l1dfl-vmentry'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='mds-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature name='pschange-mc-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <pages unit='KiB' size='4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <pages unit='KiB' size='2048'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <pages unit='KiB' size='1048576'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </cpu>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <power_management>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <suspend_mem/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </power_management>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <iommu support='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <migration_features>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <live/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <uri_transports>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <uri_transport>tcp</uri_transport>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <uri_transport>rdma</uri_transport>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </uri_transports>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </migration_features>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <topology>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <cells num='1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <cell id='0'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:          <memory unit='KiB'>7864316</memory>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:          <pages unit='KiB' size='4'>1966079</pages>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:          <pages unit='KiB' size='2048'>0</pages>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:          <distances>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:            <sibling id='0' value='10'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:          </distances>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:          <cpus num='8'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:          </cpus>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        </cell>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </cells>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </topology>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <cache>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </cache>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <secmodel>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model>selinux</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <doi>0</doi>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </secmodel>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <secmodel>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model>dac</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <doi>0</doi>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </secmodel>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </host>
Jan 23 04:22:51 np0005593234 nova_compute[226721]: 
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <guest>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <os_type>hvm</os_type>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <arch name='i686'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <wordsize>32</wordsize>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <domain type='qemu'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <domain type='kvm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </arch>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <features>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <pae/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <nonpae/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <acpi default='on' toggle='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <apic default='on' toggle='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <cpuselection/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <deviceboot/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <disksnapshot default='on' toggle='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <externalSnapshot/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </features>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </guest>
Jan 23 04:22:51 np0005593234 nova_compute[226721]: 
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <guest>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <os_type>hvm</os_type>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <arch name='x86_64'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <wordsize>64</wordsize>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <domain type='qemu'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <domain type='kvm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </arch>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <features>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <acpi default='on' toggle='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <apic default='on' toggle='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <cpuselection/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <deviceboot/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <disksnapshot default='on' toggle='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <externalSnapshot/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </features>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </guest>
Jan 23 04:22:51 np0005593234 nova_compute[226721]: 
Jan 23 04:22:51 np0005593234 nova_compute[226721]: </capabilities>
Jan 23 04:22:51 np0005593234 nova_compute[226721]: 2026-01-23 09:22:51.262 226725 DEBUG nova.virt.libvirt.host [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 04:22:51 np0005593234 systemd[1]: Stopping nova_compute container...
Jan 23 04:22:51 np0005593234 nova_compute[226721]: 2026-01-23 09:22:51.286 226725 DEBUG nova.virt.libvirt.host [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 23 04:22:51 np0005593234 nova_compute[226721]: <domainCapabilities>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <domain>kvm</domain>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <arch>i686</arch>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <vcpu max='4096'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <iothreads supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <os supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <enum name='firmware'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <loader supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>rom</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pflash</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='readonly'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>yes</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>no</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='secure'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>no</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </loader>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </os>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <cpu>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>on</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>off</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </mode>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='maximumMigratable'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>on</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>off</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </mode>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <vendor>AMD</vendor>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='succor'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </mode>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <mode name='custom' supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ddpd-u'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='intel-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='lam'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sha512'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sm3'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sm4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ddpd-u'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='intel-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='lam'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sha512'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sm3'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sm4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cooperlake'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Denverton'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mpx'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Denverton-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mpx'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Denverton-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Denverton-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='perfmon-v2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='perfmon-v2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbpb'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='perfmon-v2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbpb'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-v5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='GraniteRapids'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-128'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-256'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-512'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-128'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-256'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-512'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='IvyBridge'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='KnightsMill'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512er'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512pf'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512er'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512pf'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Opteron_G4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fma4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xop'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fma4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xop'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Opteron_G5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fma4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tbm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xop'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fma4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tbm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xop'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SierraForest'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='intel-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='lam'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='intel-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='lam'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='core-capability'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mpx'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='split-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='core-capability'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mpx'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='split-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='core-capability'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='split-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='core-capability'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='split-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='athlon'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnow'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnowext'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='athlon-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnow'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnowext'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='core2duo'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='core2duo-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='coreduo'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='coreduo-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='n270'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='n270-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='phenom'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnow'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnowext'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='phenom-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnow'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnowext'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </mode>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </cpu>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <memoryBacking supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <enum name='sourceType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>file</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>anonymous</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>memfd</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </memoryBacking>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <devices>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <disk supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='diskDevice'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>disk</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>cdrom</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>floppy</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>lun</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='bus'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>fdc</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>scsi</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>usb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>sata</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio-transitional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio-non-transitional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </disk>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <graphics supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vnc</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>egl-headless</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>dbus</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </graphics>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <video supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='modelType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vga</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>cirrus</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>none</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>bochs</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>ramfb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </video>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <hostdev supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='mode'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>subsystem</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='startupPolicy'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>default</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>mandatory</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>requisite</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>optional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='subsysType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>usb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pci</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>scsi</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='capsType'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='pciBackend'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </hostdev>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <rng supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio-transitional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio-non-transitional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendModel'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>random</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>egd</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>builtin</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </rng>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <filesystem supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='driverType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>path</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>handle</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtiofs</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </filesystem>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <tpm supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>tpm-tis</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>tpm-crb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendModel'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>emulator</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>external</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendVersion'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>2.0</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </tpm>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <redirdev supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='bus'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>usb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </redirdev>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <channel supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pty</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>unix</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </channel>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <crypto supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>qemu</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendModel'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>builtin</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </crypto>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <interface supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>default</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>passt</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </interface>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <panic supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>isa</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>hyperv</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </panic>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <console supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>null</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vc</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pty</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>dev</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>file</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pipe</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>stdio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>udp</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>tcp</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>unix</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>qemu-vdagent</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>dbus</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </console>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </devices>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <features>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <gic supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <genid supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <backup supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <async-teardown supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <s390-pv supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <ps2 supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <tdx supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <sev supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <sgx supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <hyperv supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='features'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>relaxed</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vapic</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>spinlocks</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vpindex</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>runtime</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>synic</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>stimer</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>reset</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vendor_id</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>frequencies</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>reenlightenment</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>tlbflush</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>ipi</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>avic</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>emsr_bitmap</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>xmm_input</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <defaults>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </defaults>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </hyperv>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <launchSecurity supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </features>
Jan 23 04:22:51 np0005593234 nova_compute[226721]: </domainCapabilities>
Jan 23 04:22:51 np0005593234 nova_compute[226721]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 04:22:51 np0005593234 nova_compute[226721]: 2026-01-23 09:22:51.293 226725 DEBUG nova.virt.libvirt.host [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 23 04:22:51 np0005593234 nova_compute[226721]: <domainCapabilities>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <domain>kvm</domain>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <arch>i686</arch>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <vcpu max='240'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <iothreads supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <os supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <enum name='firmware'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <loader supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>rom</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pflash</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='readonly'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>yes</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>no</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='secure'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>no</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </loader>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </os>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <cpu>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>on</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>off</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </mode>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='maximumMigratable'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>on</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>off</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </mode>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <vendor>AMD</vendor>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='succor'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </mode>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <mode name='custom' supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ddpd-u'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='intel-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='lam'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sha512'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sm3'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sm4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ddpd-u'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='intel-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='lam'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sha512'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sm3'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sm4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cooperlake'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Denverton'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mpx'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Denverton-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mpx'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Denverton-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Denverton-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='perfmon-v2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='perfmon-v2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbpb'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='perfmon-v2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbpb'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-v5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='GraniteRapids'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-128'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-256'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-512'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-128'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-256'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-512'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='IvyBridge'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='KnightsMill'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512er'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512pf'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512er'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512pf'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Opteron_G4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fma4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xop'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fma4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xop'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Opteron_G5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fma4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tbm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xop'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fma4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tbm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xop'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SierraForest'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='intel-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='lam'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='intel-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='lam'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='core-capability'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mpx'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='split-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='core-capability'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mpx'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='split-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='core-capability'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='split-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='core-capability'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='split-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='athlon'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnow'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnowext'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='athlon-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnow'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnowext'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='core2duo'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='core2duo-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='coreduo'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='coreduo-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='n270'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='n270-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='phenom'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnow'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnowext'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='phenom-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnow'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnowext'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </mode>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </cpu>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <memoryBacking supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <enum name='sourceType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>file</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>anonymous</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>memfd</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </memoryBacking>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <devices>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <disk supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='diskDevice'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>disk</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>cdrom</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>floppy</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>lun</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='bus'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>ide</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>fdc</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>scsi</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>usb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>sata</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio-transitional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio-non-transitional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </disk>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <graphics supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vnc</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>egl-headless</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>dbus</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </graphics>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <video supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='modelType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vga</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>cirrus</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>none</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>bochs</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>ramfb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </video>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <hostdev supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='mode'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>subsystem</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='startupPolicy'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>default</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>mandatory</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>requisite</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>optional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='subsysType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>usb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pci</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>scsi</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='capsType'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='pciBackend'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </hostdev>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <rng supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio-transitional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio-non-transitional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendModel'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>random</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>egd</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>builtin</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </rng>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <filesystem supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='driverType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>path</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>handle</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtiofs</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </filesystem>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <tpm supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>tpm-tis</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>tpm-crb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendModel'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>emulator</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>external</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendVersion'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>2.0</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </tpm>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <redirdev supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='bus'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>usb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </redirdev>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <channel supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pty</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>unix</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </channel>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <crypto supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>qemu</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendModel'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>builtin</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </crypto>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <interface supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>default</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>passt</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </interface>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <panic supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>isa</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>hyperv</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </panic>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <console supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>null</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vc</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pty</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>dev</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>file</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pipe</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>stdio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>udp</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>tcp</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>unix</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>qemu-vdagent</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>dbus</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </console>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </devices>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <features>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <gic supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <genid supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <backup supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <async-teardown supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <s390-pv supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <ps2 supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <tdx supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <sev supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <sgx supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <hyperv supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='features'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>relaxed</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vapic</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>spinlocks</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vpindex</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>runtime</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>synic</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>stimer</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>reset</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vendor_id</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>frequencies</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>reenlightenment</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>tlbflush</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>ipi</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>avic</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>emsr_bitmap</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>xmm_input</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <defaults>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </defaults>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </hyperv>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <launchSecurity supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </features>
Jan 23 04:22:51 np0005593234 nova_compute[226721]: </domainCapabilities>
Jan 23 04:22:51 np0005593234 nova_compute[226721]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 04:22:51 np0005593234 nova_compute[226721]: 2026-01-23 09:22:51.359 226725 DEBUG nova.virt.libvirt.host [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 23 04:22:51 np0005593234 nova_compute[226721]: 2026-01-23 09:22:51.364 226725 DEBUG nova.virt.libvirt.host [None req-b6829bc3-c22f-44d9-9c26-61623b9c072a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 23 04:22:51 np0005593234 nova_compute[226721]: <domainCapabilities>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <domain>kvm</domain>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <arch>x86_64</arch>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <vcpu max='4096'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <iothreads supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <os supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <enum name='firmware'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>efi</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <loader supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>rom</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pflash</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='readonly'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>yes</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>no</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='secure'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>yes</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>no</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </loader>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </os>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <cpu>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>on</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>off</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </mode>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='maximumMigratable'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>on</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>off</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </mode>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <vendor>AMD</vendor>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='succor'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </mode>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <mode name='custom' supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ddpd-u'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='intel-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='lam'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sha512'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sm3'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sm4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ddpd-u'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='intel-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='lam'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sha512'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sm3'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sm4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cooperlake'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Denverton'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mpx'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Denverton-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mpx'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Denverton-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Denverton-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='perfmon-v2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='perfmon-v2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbpb'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amd-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='auto-ibrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='perfmon-v2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbpb'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='stibp-always-on'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='EPYC-v5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='GraniteRapids'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-128'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-256'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-512'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-128'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-256'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx10-512'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='prefetchiti'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Haswell-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='IvyBridge'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='KnightsMill'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512er'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512pf'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512er'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512pf'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Opteron_G4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fma4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xop'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fma4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xop'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Opteron_G5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fma4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tbm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xop'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fma4'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tbm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xop'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='amx-tile'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-bf16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-fp16'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bitalg'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrc'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fzrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='la57'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='taa-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SierraForest'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='intel-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='lam'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ifma'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cmpccxadd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fbsdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='fsrs'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ibrs-all'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='intel-psfd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='lam'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mcdt-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pbrsb-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='psdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='serialize'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vaes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='hle'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='rtm'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512bw'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512cd'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512dq'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512f'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='avx512vl'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='invpcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pcid'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='pku'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='core-capability'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mpx'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='split-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='core-capability'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='mpx'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='split-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='core-capability'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='split-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='core-capability'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='split-lock-detect'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='cldemote'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='erms'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='gfni'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdir64b'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='movdiri'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='xsaves'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='athlon'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnow'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnowext'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='athlon-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnow'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnowext'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='core2duo'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='core2duo-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='coreduo'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='coreduo-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='n270'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='n270-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='ss'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='phenom'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnow'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnowext'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <blockers model='phenom-v1'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnow'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <feature name='3dnowext'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </blockers>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </mode>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </cpu>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <memoryBacking supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <enum name='sourceType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>file</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>anonymous</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <value>memfd</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </memoryBacking>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <devices>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <disk supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='diskDevice'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>disk</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>cdrom</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>floppy</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>lun</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='bus'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>fdc</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>scsi</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>usb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>sata</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio-transitional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio-non-transitional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </disk>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <graphics supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vnc</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>egl-headless</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>dbus</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </graphics>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <video supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='modelType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vga</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>cirrus</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>none</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>bochs</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>ramfb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </video>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <hostdev supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='mode'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>subsystem</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='startupPolicy'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>default</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>mandatory</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>requisite</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>optional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='subsysType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>usb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pci</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>scsi</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='capsType'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='pciBackend'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </hostdev>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <rng supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio-transitional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtio-non-transitional</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendModel'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>random</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>egd</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>builtin</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </rng>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <filesystem supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='driverType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>path</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>handle</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>virtiofs</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </filesystem>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <tpm supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>tpm-tis</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>tpm-crb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendModel'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>emulator</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>external</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendVersion'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>2.0</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </tpm>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <redirdev supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='bus'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>usb</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </redirdev>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <channel supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pty</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>unix</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </channel>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <crypto supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>qemu</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendModel'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>builtin</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </crypto>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <interface supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='backendType'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>default</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>passt</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </interface>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <panic supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='model'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>isa</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>hyperv</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </panic>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <console supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='type'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>null</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vc</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pty</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>dev</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>file</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>pipe</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>stdio</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>udp</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>tcp</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>unix</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>qemu-vdagent</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>dbus</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </console>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </devices>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  <features>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <gic supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <genid supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <backup supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <async-teardown supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <s390-pv supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <ps2 supported='yes'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <tdx supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <sev supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <sgx supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <hyperv supported='yes'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <enum name='features'>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>relaxed</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vapic</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>spinlocks</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vpindex</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>runtime</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>synic</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>stimer</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>reset</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>vendor_id</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>frequencies</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>reenlightenment</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>tlbflush</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>ipi</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>avic</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>emsr_bitmap</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <value>xmm_input</value>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </enum>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      <defaults>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:      </defaults>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    </hyperv>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:    <launchSecurity supported='no'/>
Jan 23 04:22:51 np0005593234 nova_compute[226721]:  </features>
Jan 23 04:22:51 np0005593234 nova_compute[226721]: </domainCapabilities>
Jan 23 04:22:51 np0005593234 nova_compute[226721]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 04:22:51 np0005593234 nova_compute[226721]: 2026-01-23 09:22:51.448 226725 DEBUG oslo_concurrency.lockutils [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:22:51 np0005593234 nova_compute[226721]: 2026-01-23 09:22:51.449 226725 DEBUG oslo_concurrency.lockutils [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:22:51 np0005593234 nova_compute[226721]: 2026-01-23 09:22:51.449 226725 DEBUG oslo_concurrency.lockutils [None req-7e89f263-9ede-4ef0-8bb2-c88ddd3106a7 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:22:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:51.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:52 np0005593234 virtqemud[227483]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 23 04:22:52 np0005593234 virtqemud[227483]: hostname: compute-2
Jan 23 04:22:52 np0005593234 virtqemud[227483]: End of file while reading data: Input/output error
Jan 23 04:22:52 np0005593234 systemd[1]: libpod-0a17c213f6c333c7810a664505c998196fb95a30b87638b18761d474476df8f2.scope: Deactivated successfully.
Jan 23 04:22:52 np0005593234 systemd[1]: libpod-0a17c213f6c333c7810a664505c998196fb95a30b87638b18761d474476df8f2.scope: Consumed 3.677s CPU time.
Jan 23 04:22:52 np0005593234 conmon[226721]: conmon 0a17c213f6c333c7810a <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-0a17c213f6c333c7810a664505c998196fb95a30b87638b18761d474476df8f2.scope/container/memory.events
Jan 23 04:22:52 np0005593234 podman[227700]: 2026-01-23 09:22:52.039830698 +0000 UTC m=+0.761778650 container died 0a17c213f6c333c7810a664505c998196fb95a30b87638b18761d474476df8f2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 04:22:52 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0a17c213f6c333c7810a664505c998196fb95a30b87638b18761d474476df8f2-userdata-shm.mount: Deactivated successfully.
Jan 23 04:22:52 np0005593234 systemd[1]: var-lib-containers-storage-overlay-eddfc90c85d206deba6da23b94c7163b279c514685f6aee83992431ae7af4ea7-merged.mount: Deactivated successfully.
Jan 23 04:22:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:52.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:52 np0005593234 podman[227700]: 2026-01-23 09:22:52.929836323 +0000 UTC m=+1.651784275 container cleanup 0a17c213f6c333c7810a664505c998196fb95a30b87638b18761d474476df8f2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.build-date=20251202)
Jan 23 04:22:52 np0005593234 podman[227700]: nova_compute
Jan 23 04:22:52 np0005593234 podman[227733]: nova_compute
Jan 23 04:22:52 np0005593234 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 23 04:22:52 np0005593234 systemd[1]: Stopped nova_compute container.
Jan 23 04:22:53 np0005593234 systemd[1]: Starting nova_compute container...
Jan 23 04:22:53 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:22:53 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eddfc90c85d206deba6da23b94c7163b279c514685f6aee83992431ae7af4ea7/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:53 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eddfc90c85d206deba6da23b94c7163b279c514685f6aee83992431ae7af4ea7/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:53 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eddfc90c85d206deba6da23b94c7163b279c514685f6aee83992431ae7af4ea7/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:53 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eddfc90c85d206deba6da23b94c7163b279c514685f6aee83992431ae7af4ea7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:53 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/eddfc90c85d206deba6da23b94c7163b279c514685f6aee83992431ae7af4ea7/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:53 np0005593234 podman[227746]: 2026-01-23 09:22:53.113784919 +0000 UTC m=+0.095026920 container init 0a17c213f6c333c7810a664505c998196fb95a30b87638b18761d474476df8f2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible)
Jan 23 04:22:53 np0005593234 podman[227746]: 2026-01-23 09:22:53.120640504 +0000 UTC m=+0.101882485 container start 0a17c213f6c333c7810a664505c998196fb95a30b87638b18761d474476df8f2 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Jan 23 04:22:53 np0005593234 podman[227746]: nova_compute
Jan 23 04:22:53 np0005593234 nova_compute[227762]: + sudo -E kolla_set_configs
Jan 23 04:22:53 np0005593234 systemd[1]: Started nova_compute container.
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Validating config file
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Copying service configuration files
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Deleting /etc/ceph
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Creating directory /etc/ceph
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /etc/ceph
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Writing out command to execute
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:53 np0005593234 nova_compute[227762]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 04:22:53 np0005593234 nova_compute[227762]: ++ cat /run_command
Jan 23 04:22:53 np0005593234 nova_compute[227762]: + CMD=nova-compute
Jan 23 04:22:53 np0005593234 nova_compute[227762]: + ARGS=
Jan 23 04:22:53 np0005593234 nova_compute[227762]: + sudo kolla_copy_cacerts
Jan 23 04:22:53 np0005593234 nova_compute[227762]: + [[ ! -n '' ]]
Jan 23 04:22:53 np0005593234 nova_compute[227762]: + . kolla_extend_start
Jan 23 04:22:53 np0005593234 nova_compute[227762]: + echo 'Running command: '\''nova-compute'\'''
Jan 23 04:22:53 np0005593234 nova_compute[227762]: Running command: 'nova-compute'
Jan 23 04:22:53 np0005593234 nova_compute[227762]: + umask 0022
Jan 23 04:22:53 np0005593234 nova_compute[227762]: + exec nova-compute
Jan 23 04:22:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:53.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:54 np0005593234 python3.9[227926]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 23 04:22:54 np0005593234 systemd[1]: Started libpod-conmon-25e6010f79dac3dc6ca9f8ed46c4472a8b1a70d3dc529d961ab1f6ba09af5f57.scope.
Jan 23 04:22:54 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:22:54 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92bc4b588bc06976479977b10128ec314eddd1dd3d1268e92c5ff1234db6328/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:54 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92bc4b588bc06976479977b10128ec314eddd1dd3d1268e92c5ff1234db6328/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:54 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b92bc4b588bc06976479977b10128ec314eddd1dd3d1268e92c5ff1234db6328/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 23 04:22:54 np0005593234 podman[227952]: 2026-01-23 09:22:54.284993859 +0000 UTC m=+0.121075584 container init 25e6010f79dac3dc6ca9f8ed46c4472a8b1a70d3dc529d961ab1f6ba09af5f57 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, tcib_managed=true, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Jan 23 04:22:54 np0005593234 podman[227952]: 2026-01-23 09:22:54.293895947 +0000 UTC m=+0.129977652 container start 25e6010f79dac3dc6ca9f8ed46c4472a8b1a70d3dc529d961ab1f6ba09af5f57 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=edpm)
Jan 23 04:22:54 np0005593234 python3.9[227926]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 23 04:22:54 np0005593234 nova_compute_init[227972]: INFO:nova_statedir:Applying nova statedir ownership
Jan 23 04:22:54 np0005593234 nova_compute_init[227972]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 23 04:22:54 np0005593234 nova_compute_init[227972]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 23 04:22:54 np0005593234 nova_compute_init[227972]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 23 04:22:54 np0005593234 nova_compute_init[227972]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 23 04:22:54 np0005593234 nova_compute_init[227972]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 23 04:22:54 np0005593234 nova_compute_init[227972]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 23 04:22:54 np0005593234 nova_compute_init[227972]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 23 04:22:54 np0005593234 nova_compute_init[227972]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 23 04:22:54 np0005593234 nova_compute_init[227972]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 23 04:22:54 np0005593234 nova_compute_init[227972]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 23 04:22:54 np0005593234 nova_compute_init[227972]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 23 04:22:54 np0005593234 nova_compute_init[227972]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 23 04:22:54 np0005593234 nova_compute_init[227972]: INFO:nova_statedir:Nova statedir ownership complete
Jan 23 04:22:54 np0005593234 systemd[1]: libpod-25e6010f79dac3dc6ca9f8ed46c4472a8b1a70d3dc529d961ab1f6ba09af5f57.scope: Deactivated successfully.
Jan 23 04:22:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:54.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:54 np0005593234 podman[227986]: 2026-01-23 09:22:54.403631935 +0000 UTC m=+0.028842772 container died 25e6010f79dac3dc6ca9f8ed46c4472a8b1a70d3dc529d961ab1f6ba09af5f57 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=nova_compute_init)
Jan 23 04:22:54 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-25e6010f79dac3dc6ca9f8ed46c4472a8b1a70d3dc529d961ab1f6ba09af5f57-userdata-shm.mount: Deactivated successfully.
Jan 23 04:22:54 np0005593234 systemd[1]: var-lib-containers-storage-overlay-b92bc4b588bc06976479977b10128ec314eddd1dd3d1268e92c5ff1234db6328-merged.mount: Deactivated successfully.
Jan 23 04:22:54 np0005593234 podman[227986]: 2026-01-23 09:22:54.452349267 +0000 UTC m=+0.077560074 container cleanup 25e6010f79dac3dc6ca9f8ed46c4472a8b1a70d3dc529d961ab1f6ba09af5f57 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute_init, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 23 04:22:54 np0005593234 systemd[1]: libpod-conmon-25e6010f79dac3dc6ca9f8ed46c4472a8b1a70d3dc529d961ab1f6ba09af5f57.scope: Deactivated successfully.
Jan 23 04:22:55 np0005593234 systemd[1]: session-49.scope: Deactivated successfully.
Jan 23 04:22:55 np0005593234 systemd[1]: session-49.scope: Consumed 1min 59.005s CPU time.
Jan 23 04:22:55 np0005593234 systemd-logind[794]: Session 49 logged out. Waiting for processes to exit.
Jan 23 04:22:55 np0005593234 systemd-logind[794]: Removed session 49.
Jan 23 04:22:55 np0005593234 nova_compute[227762]: 2026-01-23 09:22:55.531 227766 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 04:22:55 np0005593234 nova_compute[227762]: 2026-01-23 09:22:55.531 227766 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 04:22:55 np0005593234 nova_compute[227762]: 2026-01-23 09:22:55.531 227766 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Jan 23 04:22:55 np0005593234 nova_compute[227762]: 2026-01-23 09:22:55.532 227766 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Jan 23 04:22:55 np0005593234 nova_compute[227762]: 2026-01-23 09:22:55.719 227766 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:22:55 np0005593234 nova_compute[227762]: 2026-01-23 09:22:55.732 227766 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:22:55 np0005593234 nova_compute[227762]: 2026-01-23 09:22:55.733 227766 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 04:22:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:55.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.174 227766 INFO nova.virt.driver [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.317 227766 INFO nova.compute.provider_config [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.360 227766 DEBUG oslo_concurrency.lockutils [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.361 227766 DEBUG oslo_concurrency.lockutils [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.361 227766 DEBUG oslo_concurrency.lockutils [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.362 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.362 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.362 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.362 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.362 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.362 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.363 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.363 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.363 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.363 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.363 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.364 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.364 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.364 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.364 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.364 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.365 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.365 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.365 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.365 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.365 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.366 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.366 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.366 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.366 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.366 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.367 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.367 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.367 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.367 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.367 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.367 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.368 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.368 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.368 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.368 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.368 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.368 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.369 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.369 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.369 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.369 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.369 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.370 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.370 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.370 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.370 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.370 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.370 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.371 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.371 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.371 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.371 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.371 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.371 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.371 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.372 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.372 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.372 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.372 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.372 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.372 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.373 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.373 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.373 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.373 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.373 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.374 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.374 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.374 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.374 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.374 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.374 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.375 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.375 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.375 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.375 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.375 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.375 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.375 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.376 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.376 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.376 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.376 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.376 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.376 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.377 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.377 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.377 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.377 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.377 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.377 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.378 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.378 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.378 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.378 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.378 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.378 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.379 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.379 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.379 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.379 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.379 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.380 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.380 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.380 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.380 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.380 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.380 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.381 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.381 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.381 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.381 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.381 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.381 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.382 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.382 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.382 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.382 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.382 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.382 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.383 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.384 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.384 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.384 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:56.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.384 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.385 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.385 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.385 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.385 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.385 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.386 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.386 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.386 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.386 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.386 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.387 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.387 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.387 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.387 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.387 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.388 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.388 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.388 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.388 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.388 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.388 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.389 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.389 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.389 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.389 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.389 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.390 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.390 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.390 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.390 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.390 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.390 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.391 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.391 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.391 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.391 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.391 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.392 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.392 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.392 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.392 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.392 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.392 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.393 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.393 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.393 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.393 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.393 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.394 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.394 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.394 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.394 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.394 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.394 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.395 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.395 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.395 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.395 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.395 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.396 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.396 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.396 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.396 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.396 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.396 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.396 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.397 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.397 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.397 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.397 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.397 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.398 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.398 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.398 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.398 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.398 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.399 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.399 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.399 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.399 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.399 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.399 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.399 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.400 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.400 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.400 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.400 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.400 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.401 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.401 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.401 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.401 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.401 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.402 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.402 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.402 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.402 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.402 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.402 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.403 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.403 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.403 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.403 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.403 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.404 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.404 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.404 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.404 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.404 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.405 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.405 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.405 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.405 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.405 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.406 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.406 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.406 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.406 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.406 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.406 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.407 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.407 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.407 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.407 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.407 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.407 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.408 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.408 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.408 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.408 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.408 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.408 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.408 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.409 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.409 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.409 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.409 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.409 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.409 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.409 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.410 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.410 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.410 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.410 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.410 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.410 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.410 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.411 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.411 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.411 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.411 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.411 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.411 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.412 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.412 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.412 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.412 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.412 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.412 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.412 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.413 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.413 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.413 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.413 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.413 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.413 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.413 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.414 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.414 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.414 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.414 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.414 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.414 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.414 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.415 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.415 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.415 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.415 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.415 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.415 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.415 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.416 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.416 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.416 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.416 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.416 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.416 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.416 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.417 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.417 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.417 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.417 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.417 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.417 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.417 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.418 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.418 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.418 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.418 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.418 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.418 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.418 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.419 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.419 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.419 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.419 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.419 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.420 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.420 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.420 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.420 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.420 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.420 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.421 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.421 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.421 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.421 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.421 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.421 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.421 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.422 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.422 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.422 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.422 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.422 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.422 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.423 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.423 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.423 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.423 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.424 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.424 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.424 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.424 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.424 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.424 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.425 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.425 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.425 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.425 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.425 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.425 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.426 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.426 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.426 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.426 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.426 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.426 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.426 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.427 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.427 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.427 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.427 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.428 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.428 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.428 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.428 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.428 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.429 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.429 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.429 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.429 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.429 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.430 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.430 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.430 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.430 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.430 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.430 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.431 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.431 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.431 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.431 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.431 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.431 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.432 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.432 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.432 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.432 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.432 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.432 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.433 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.433 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.433 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.433 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.433 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.434 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.434 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.434 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.434 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.434 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.434 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.434 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.435 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.435 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.435 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.435 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.435 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.435 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.436 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.436 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.436 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.436 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.436 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.436 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.437 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.437 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.437 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.437 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.437 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.438 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.438 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.438 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.438 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.438 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.438 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.438 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.439 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.439 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.439 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.439 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.439 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.439 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.439 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.440 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.440 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.440 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.440 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.440 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.440 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.441 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.441 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.441 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.441 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.441 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.441 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.441 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.442 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.442 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.442 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.442 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.442 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.442 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.443 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.443 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.443 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.443 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.443 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.444 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.444 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.444 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.444 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.444 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.445 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.445 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.445 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.445 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.445 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.445 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.445 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.446 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.446 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.446 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.446 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.446 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.446 227766 WARNING oslo_config.cfg [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 23 04:22:56 np0005593234 nova_compute[227762]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 23 04:22:56 np0005593234 nova_compute[227762]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 23 04:22:56 np0005593234 nova_compute[227762]: and ``live_migration_inbound_addr`` respectively.
Jan 23 04:22:56 np0005593234 nova_compute[227762]: ).  Its value may be silently ignored in the future.#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.447 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.447 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.447 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.447 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.447 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.447 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.448 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.448 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.448 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.448 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.448 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.448 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.449 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.449 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.449 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.449 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.449 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.450 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.450 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.rbd_secret_uuid        = e1533653-0a5a-584c-b34b-8689f0d32e77 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.450 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.450 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.450 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.450 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.451 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.451 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.451 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.451 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.451 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.451 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.451 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.452 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.452 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.452 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.452 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.452 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.453 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.453 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.453 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.453 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.453 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.454 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.454 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.454 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.454 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.454 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.455 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.455 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.455 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.455 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.455 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.455 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.456 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.456 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.456 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.456 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.456 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.456 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.456 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.457 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.457 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.457 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.457 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.457 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.457 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.457 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.458 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.458 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.458 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.458 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.458 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.458 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.459 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.459 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.459 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.459 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.459 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.459 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.460 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.460 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.460 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.460 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.460 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.460 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.460 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.461 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.461 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.461 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.461 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.461 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.461 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.462 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.462 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.462 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.462 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.462 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.462 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.462 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.463 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.463 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.463 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.463 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.463 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.464 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.464 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.464 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.464 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.464 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.464 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.465 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.465 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.465 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.465 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.465 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.466 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.466 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.466 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.466 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.466 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.466 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.467 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.467 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.467 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.467 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.467 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.467 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.467 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.468 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.468 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.468 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.468 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.468 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.468 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.469 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.469 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.469 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.469 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.469 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.469 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.470 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.470 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.470 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.470 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.471 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.471 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.471 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.471 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.471 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.472 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.472 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.472 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.472 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.472 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.473 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.473 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.473 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.473 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.473 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.474 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.474 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.474 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.474 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.474 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.475 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.475 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.475 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.475 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.475 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.476 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.476 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.476 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.476 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.476 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.477 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.477 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.477 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.477 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.477 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.478 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.478 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.478 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.478 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.479 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.479 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.479 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.479 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.479 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.480 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.480 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.480 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.480 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.480 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.480 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.481 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.481 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.481 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.481 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.481 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.481 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.482 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.482 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.482 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.482 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.483 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.483 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.483 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.483 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.484 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.484 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.484 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.484 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.484 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.485 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.485 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.485 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.485 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.485 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.485 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.486 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.486 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.486 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.486 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.486 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.487 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.487 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.487 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.487 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.487 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.488 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.488 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.488 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.488 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.488 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.489 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.489 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.489 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.489 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.489 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.489 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.490 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.490 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.490 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.490 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.490 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.491 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.491 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.491 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.491 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.492 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.492 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.492 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.492 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.492 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.493 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.493 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.493 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.493 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.493 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.494 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.494 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.494 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.494 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.494 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.495 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.495 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.495 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.495 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.495 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.496 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.496 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.496 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.496 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.496 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.497 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.497 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.497 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.497 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.497 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.498 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.498 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.498 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.498 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.498 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.499 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.499 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.499 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.499 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.499 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.500 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.500 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.500 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.500 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.500 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.501 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.501 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.501 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.501 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.501 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.502 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.502 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.502 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.502 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.502 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.503 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.503 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.503 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.503 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.503 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.504 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.504 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.504 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.504 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.504 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.505 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.505 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.505 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.505 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.505 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.506 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.506 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.506 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.506 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.506 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.507 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.507 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.507 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.507 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.507 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.508 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.508 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.508 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.508 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.508 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.509 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.509 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.509 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.509 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.509 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.510 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.510 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.510 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.510 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.510 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.511 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.511 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.511 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.511 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.511 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.512 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.512 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.512 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.512 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.512 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.513 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.513 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.513 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.513 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.513 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.514 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.514 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.514 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.514 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.514 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.515 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.515 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.515 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.515 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.515 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.515 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.516 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.516 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.516 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.516 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.516 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.517 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.517 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.517 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.517 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.517 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.518 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.518 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.518 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.518 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.518 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.519 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.519 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.519 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.519 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.519 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.520 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.520 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.520 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.520 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.520 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.520 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.521 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.521 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.521 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.521 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.521 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.522 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.522 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.522 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.522 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.522 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.523 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.523 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.523 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.523 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.523 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.524 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.524 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.524 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.524 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.525 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.525 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.525 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.525 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.525 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.525 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.526 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.526 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.526 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.526 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.526 227766 DEBUG oslo_service.service [None req-f58116fc-c545-4b64-95d1-35baf696c878 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.528 227766 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.543 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.543 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.543 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.544 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.556 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f3ecc407790> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.559 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f3ecc407790> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.560 227766 INFO nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Connection event '1' reason 'None'#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.569 227766 INFO nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Libvirt host capabilities <capabilities>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <host>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <uuid>3e200bf7-7634-42a0-8184-2372f58672f7</uuid>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <cpu>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <arch>x86_64</arch>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model>EPYC-Rome-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <vendor>AMD</vendor>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <microcode version='16777317'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <signature family='23' model='49' stepping='0'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <maxphysaddr mode='emulate' bits='40'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='x2apic'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='tsc-deadline'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='osxsave'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='hypervisor'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='tsc_adjust'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='spec-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='stibp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='arch-capabilities'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='ssbd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='cmp_legacy'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='topoext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='virt-ssbd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='lbrv'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='tsc-scale'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='vmcb-clean'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='pause-filter'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='pfthreshold'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='svme-addr-chk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='rdctl-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='skip-l1dfl-vmentry'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='mds-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature name='pschange-mc-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <pages unit='KiB' size='4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <pages unit='KiB' size='2048'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <pages unit='KiB' size='1048576'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </cpu>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <power_management>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <suspend_mem/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </power_management>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <iommu support='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <migration_features>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <live/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <uri_transports>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <uri_transport>tcp</uri_transport>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <uri_transport>rdma</uri_transport>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </uri_transports>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </migration_features>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <topology>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <cells num='1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <cell id='0'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:          <memory unit='KiB'>7864316</memory>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:          <pages unit='KiB' size='4'>1966079</pages>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:          <pages unit='KiB' size='2048'>0</pages>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:          <pages unit='KiB' size='1048576'>0</pages>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:          <distances>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:            <sibling id='0' value='10'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:          </distances>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:          <cpus num='8'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:          </cpus>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        </cell>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </cells>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </topology>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <cache>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </cache>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <secmodel>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model>selinux</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <doi>0</doi>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </secmodel>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <secmodel>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model>dac</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <doi>0</doi>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <baselabel type='kvm'>+107:+107</baselabel>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <baselabel type='qemu'>+107:+107</baselabel>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </secmodel>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </host>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <guest>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <os_type>hvm</os_type>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <arch name='i686'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <wordsize>32</wordsize>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <domain type='qemu'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <domain type='kvm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </arch>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <features>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <pae/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <nonpae/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <acpi default='on' toggle='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <apic default='on' toggle='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <cpuselection/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <deviceboot/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <disksnapshot default='on' toggle='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <externalSnapshot/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </features>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </guest>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <guest>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <os_type>hvm</os_type>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <arch name='x86_64'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <wordsize>64</wordsize>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <domain type='qemu'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <domain type='kvm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </arch>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <features>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <acpi default='on' toggle='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <apic default='on' toggle='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <cpuselection/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <deviceboot/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <disksnapshot default='on' toggle='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <externalSnapshot/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </features>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </guest>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 
Jan 23 04:22:56 np0005593234 nova_compute[227762]: </capabilities>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.575 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.581 227766 WARNING nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.581 227766 DEBUG nova.virt.libvirt.volume.mount [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.581 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 23 04:22:56 np0005593234 nova_compute[227762]: <domainCapabilities>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <domain>kvm</domain>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <arch>i686</arch>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <vcpu max='4096'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <iothreads supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <os supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <enum name='firmware'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <loader supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>rom</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pflash</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='readonly'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>yes</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>no</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='secure'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>no</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </loader>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <cpu>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>on</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>off</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='maximumMigratable'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>on</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>off</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <vendor>AMD</vendor>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='succor'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='custom' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ddpd-u'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sha512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm3'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ddpd-u'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sha512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm3'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cooperlake'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='perfmon-v2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='perfmon-v2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbpb'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='perfmon-v2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbpb'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-128'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-256'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-128'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-256'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='KnightsMill'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512er'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512pf'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512er'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512pf'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tbm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tbm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='athlon'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='athlon-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='core2duo'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='core2duo-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='coreduo'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='coreduo-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='n270'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='n270-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='phenom'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='phenom-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <memoryBacking supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <enum name='sourceType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>file</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>anonymous</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>memfd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </memoryBacking>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <disk supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='diskDevice'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>disk</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>cdrom</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>floppy</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>lun</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='bus'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>fdc</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>scsi</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>usb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>sata</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-non-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <graphics supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vnc</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>egl-headless</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>dbus</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <video supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='modelType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vga</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>cirrus</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>none</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>bochs</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>ramfb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <hostdev supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='mode'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>subsystem</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='startupPolicy'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>default</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>mandatory</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>requisite</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>optional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='subsysType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>usb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pci</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>scsi</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='capsType'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='pciBackend'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </hostdev>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <rng supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-non-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendModel'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>random</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>egd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>builtin</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <filesystem supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='driverType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>path</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>handle</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtiofs</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </filesystem>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <tpm supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tpm-tis</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tpm-crb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendModel'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>emulator</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>external</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendVersion'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>2.0</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </tpm>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <redirdev supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='bus'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>usb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </redirdev>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <channel supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pty</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>unix</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </channel>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <crypto supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>qemu</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendModel'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>builtin</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </crypto>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <interface supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>default</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>passt</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <panic supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>isa</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>hyperv</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </panic>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <console supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>null</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vc</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pty</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>dev</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>file</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pipe</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>stdio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>udp</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tcp</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>unix</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>qemu-vdagent</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>dbus</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </console>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <gic supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <genid supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <backup supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <async-teardown supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <s390-pv supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <ps2 supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <tdx supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <sev supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <sgx supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <hyperv supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='features'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>relaxed</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vapic</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>spinlocks</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vpindex</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>runtime</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>synic</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>stimer</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>reset</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vendor_id</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>frequencies</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>reenlightenment</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tlbflush</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>ipi</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>avic</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>emsr_bitmap</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>xmm_input</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <defaults>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </defaults>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </hyperv>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <launchSecurity supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: </domainCapabilities>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.591 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 23 04:22:56 np0005593234 nova_compute[227762]: <domainCapabilities>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <domain>kvm</domain>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <arch>i686</arch>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <vcpu max='240'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <iothreads supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <os supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <enum name='firmware'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <loader supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>rom</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pflash</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='readonly'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>yes</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>no</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='secure'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>no</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </loader>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <cpu>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>on</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>off</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='maximumMigratable'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>on</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>off</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <vendor>AMD</vendor>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='succor'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='custom' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ddpd-u'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sha512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm3'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ddpd-u'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sha512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm3'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cooperlake'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='perfmon-v2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='perfmon-v2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbpb'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='perfmon-v2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbpb'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-128'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-256'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-128'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-256'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='KnightsMill'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512er'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512pf'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512er'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512pf'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tbm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tbm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='athlon'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='athlon-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='core2duo'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='core2duo-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='coreduo'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='coreduo-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='n270'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='n270-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='phenom'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='phenom-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <memoryBacking supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <enum name='sourceType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>file</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>anonymous</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>memfd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </memoryBacking>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <disk supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='diskDevice'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>disk</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>cdrom</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>floppy</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>lun</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='bus'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>ide</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>fdc</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>scsi</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>usb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>sata</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-non-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <graphics supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vnc</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>egl-headless</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>dbus</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <video supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='modelType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vga</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>cirrus</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>none</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>bochs</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>ramfb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <hostdev supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='mode'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>subsystem</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='startupPolicy'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>default</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>mandatory</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>requisite</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>optional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='subsysType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>usb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pci</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>scsi</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='capsType'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='pciBackend'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </hostdev>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <rng supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-non-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendModel'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>random</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>egd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>builtin</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <filesystem supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='driverType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>path</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>handle</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtiofs</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </filesystem>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <tpm supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tpm-tis</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tpm-crb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendModel'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>emulator</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>external</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendVersion'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>2.0</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </tpm>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <redirdev supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='bus'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>usb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </redirdev>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <channel supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pty</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>unix</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </channel>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <crypto supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>qemu</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendModel'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>builtin</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </crypto>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <interface supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>default</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>passt</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <panic supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>isa</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>hyperv</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </panic>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <console supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>null</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vc</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pty</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>dev</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>file</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pipe</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>stdio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>udp</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tcp</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>unix</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>qemu-vdagent</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>dbus</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </console>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <gic supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <genid supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <backup supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <async-teardown supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <s390-pv supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <ps2 supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <tdx supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <sev supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <sgx supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <hyperv supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='features'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>relaxed</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vapic</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>spinlocks</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vpindex</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>runtime</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>synic</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>stimer</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>reset</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vendor_id</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>frequencies</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>reenlightenment</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tlbflush</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>ipi</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>avic</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>emsr_bitmap</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>xmm_input</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <defaults>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </defaults>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </hyperv>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <launchSecurity supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: </domainCapabilities>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.654 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.661 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 23 04:22:56 np0005593234 nova_compute[227762]: <domainCapabilities>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <domain>kvm</domain>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <machine>pc-q35-rhel9.8.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <arch>x86_64</arch>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <vcpu max='4096'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <iothreads supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <os supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <enum name='firmware'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>efi</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <loader supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>rom</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pflash</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='readonly'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>yes</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>no</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='secure'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>yes</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>no</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </loader>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <cpu>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>on</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>off</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='maximumMigratable'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>on</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>off</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <vendor>AMD</vendor>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='succor'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='custom' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ddpd-u'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sha512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm3'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ddpd-u'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sha512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm3'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cooperlake'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='perfmon-v2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='perfmon-v2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbpb'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='perfmon-v2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbpb'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-128'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-256'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-128'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-256'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='KnightsMill'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512er'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512pf'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512er'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512pf'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tbm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tbm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='athlon'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='athlon-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='core2duo'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='core2duo-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='coreduo'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='coreduo-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='n270'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='n270-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='phenom'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='phenom-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <memoryBacking supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <enum name='sourceType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>file</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>anonymous</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>memfd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </memoryBacking>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <disk supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='diskDevice'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>disk</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>cdrom</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>floppy</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>lun</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='bus'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>fdc</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>scsi</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>usb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>sata</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-non-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <graphics supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vnc</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>egl-headless</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>dbus</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <video supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='modelType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vga</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>cirrus</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>none</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>bochs</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>ramfb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <hostdev supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='mode'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>subsystem</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='startupPolicy'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>default</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>mandatory</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>requisite</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>optional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='subsysType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>usb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pci</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>scsi</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='capsType'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='pciBackend'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </hostdev>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <rng supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-non-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendModel'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>random</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>egd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>builtin</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <filesystem supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='driverType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>path</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>handle</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtiofs</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </filesystem>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <tpm supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tpm-tis</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tpm-crb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendModel'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>emulator</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>external</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendVersion'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>2.0</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </tpm>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <redirdev supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='bus'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>usb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </redirdev>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <channel supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pty</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>unix</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </channel>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <crypto supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>qemu</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendModel'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>builtin</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </crypto>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <interface supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>default</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>passt</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <panic supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>isa</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>hyperv</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </panic>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <console supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>null</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vc</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pty</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>dev</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>file</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pipe</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>stdio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>udp</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tcp</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>unix</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>qemu-vdagent</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>dbus</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </console>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <gic supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <genid supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <backup supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <async-teardown supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <s390-pv supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <ps2 supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <tdx supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <sev supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <sgx supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <hyperv supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='features'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>relaxed</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vapic</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>spinlocks</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vpindex</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>runtime</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>synic</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>stimer</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>reset</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vendor_id</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>frequencies</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>reenlightenment</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tlbflush</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>ipi</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>avic</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>emsr_bitmap</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>xmm_input</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <defaults>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </defaults>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </hyperv>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <launchSecurity supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: </domainCapabilities>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.735 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 23 04:22:56 np0005593234 nova_compute[227762]: <domainCapabilities>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <path>/usr/libexec/qemu-kvm</path>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <domain>kvm</domain>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <arch>x86_64</arch>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <vcpu max='240'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <iothreads supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <os supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <enum name='firmware'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <loader supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>rom</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pflash</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='readonly'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>yes</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>no</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='secure'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>no</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </loader>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <cpu>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='host-passthrough' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='hostPassthroughMigratable'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>on</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>off</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='maximum' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='maximumMigratable'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>on</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>off</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='host-model' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model fallback='forbid'>EPYC-Rome</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <vendor>AMD</vendor>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='x2apic'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='tsc-deadline'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='hypervisor'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='tsc_adjust'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='spec-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='stibp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='ssbd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='cmp_legacy'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='overflow-recov'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='succor'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='amd-ssbd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='virt-ssbd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='lbrv'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='tsc-scale'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='vmcb-clean'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='flushbyasid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='pause-filter'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='pfthreshold'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='svme-addr-chk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='require' name='lfence-always-serializing'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <feature policy='disable' name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <mode name='custom' supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Broadwell-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cascadelake-Server-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='ClearwaterForest'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ddpd-u'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sha512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm3'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='ClearwaterForest-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ddpd-u'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sha512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm3'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sm4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cooperlake'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cooperlake-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Cooperlake-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Denverton-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Dhyana-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Genoa'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Genoa-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Genoa-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='perfmon-v2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Milan-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Rome-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Turin'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='perfmon-v2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbpb'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-Turin-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amd-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='auto-ibrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vp2intersect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fs-gs-base-ns'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibpb-brtype'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='no-nested-data-bp'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='null-sel-clr-base'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='perfmon-v2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbpb'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='srso-user-kernel-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='stibp-always-on'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='EPYC-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-128'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-256'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='GraniteRapids-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-128'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-256'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx10-512'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='prefetchiti'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Haswell-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-noTSX'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v6'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Icelake-Server-v7'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='IvyBridge-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='KnightsMill'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512er'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512pf'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='KnightsMill-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4fmaps'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-4vnniw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512er'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512pf'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G4-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tbm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Opteron_G5-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fma4'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tbm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xop'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SapphireRapids-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='amx-tile'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-bf16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-fp16'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512-vpopcntdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bitalg'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vbmi2'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrc'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fzrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='la57'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='taa-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='tsx-ldtrk'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='SierraForest-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ifma'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-ne-convert'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx-vnni-int8'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bhi-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='bus-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cmpccxadd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fbsdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='fsrs'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ibrs-all'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='intel-psfd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ipred-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='lam'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mcdt-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pbrsb-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='psdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rrsba-ctrl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='sbdr-ssdp-no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='serialize'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vaes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='vpclmulqdq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Client-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='hle'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='rtm'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Skylake-Server-v5'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512bw'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512cd'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512dq'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512f'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='avx512vl'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='invpcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pcid'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='pku'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='mpx'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v2'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v3'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='core-capability'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='split-lock-detect'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='Snowridge-v4'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='cldemote'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='erms'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='gfni'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdir64b'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='movdiri'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='xsaves'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='athlon'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='athlon-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='core2duo'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='core2duo-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='coreduo'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='coreduo-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='n270'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='n270-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='ss'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='phenom'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <blockers model='phenom-v1'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnow'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <feature name='3dnowext'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </blockers>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </mode>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <memoryBacking supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <enum name='sourceType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>file</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>anonymous</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <value>memfd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </memoryBacking>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <disk supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='diskDevice'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>disk</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>cdrom</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>floppy</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>lun</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='bus'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>ide</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>fdc</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>scsi</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>usb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>sata</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-non-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <graphics supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vnc</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>egl-headless</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>dbus</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <video supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='modelType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vga</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>cirrus</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>none</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>bochs</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>ramfb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <hostdev supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='mode'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>subsystem</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='startupPolicy'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>default</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>mandatory</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>requisite</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>optional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='subsysType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>usb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pci</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>scsi</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='capsType'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='pciBackend'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </hostdev>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <rng supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtio-non-transitional</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendModel'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>random</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>egd</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>builtin</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <filesystem supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='driverType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>path</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>handle</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>virtiofs</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </filesystem>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <tpm supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tpm-tis</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tpm-crb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendModel'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>emulator</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>external</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendVersion'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>2.0</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </tpm>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <redirdev supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='bus'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>usb</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </redirdev>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <channel supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pty</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>unix</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </channel>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <crypto supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>qemu</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendModel'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>builtin</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </crypto>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <interface supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='backendType'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>default</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>passt</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <panic supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='model'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>isa</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>hyperv</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </panic>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <console supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='type'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>null</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vc</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pty</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>dev</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>file</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>pipe</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>stdio</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>udp</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tcp</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>unix</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>qemu-vdagent</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>dbus</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </console>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <gic supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <vmcoreinfo supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <genid supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <backingStoreInput supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <backup supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <async-teardown supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <s390-pv supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <ps2 supported='yes'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <tdx supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <sev supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <sgx supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <hyperv supported='yes'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <enum name='features'>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>relaxed</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vapic</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>spinlocks</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vpindex</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>runtime</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>synic</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>stimer</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>reset</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>vendor_id</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>frequencies</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>reenlightenment</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>tlbflush</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>ipi</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>avic</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>emsr_bitmap</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <value>xmm_input</value>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </enum>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      <defaults>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <spinlocks>4095</spinlocks>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <stimer_direct>on</stimer_direct>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <tlbflush_direct>on</tlbflush_direct>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <tlbflush_extended>on</tlbflush_extended>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:        <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:      </defaults>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    </hyperv>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:    <launchSecurity supported='no'/>
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: </domainCapabilities>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.801 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.801 227766 INFO nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Secure Boot support detected#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.804 227766 INFO nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.805 227766 INFO nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.816 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] cpu compare xml: <cpu match="exact">
Jan 23 04:22:56 np0005593234 nova_compute[227762]:  <model>Nehalem</model>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: </cpu>
Jan 23 04:22:56 np0005593234 nova_compute[227762]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.818 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Jan 23 04:22:56 np0005593234 nova_compute[227762]: 2026-01-23 09:22:56.992 227766 INFO nova.virt.node [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Determined node identity 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from /var/lib/nova/compute_id#033[00m
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.090 227766 WARNING nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Compute nodes ['89873210-bee9-46e9-9f9d-0cd7a156c3a8'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.137 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.179 227766 WARNING nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.179 227766 DEBUG oslo_concurrency.lockutils [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.180 227766 DEBUG oslo_concurrency.lockutils [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.180 227766 DEBUG oslo_concurrency.lockutils [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.180 227766 DEBUG nova.compute.resource_tracker [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.180 227766 DEBUG oslo_concurrency.processutils [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:22:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:22:57 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1526734171' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.624 227766 DEBUG oslo_concurrency.processutils [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:22:57 np0005593234 systemd[1]: Starting libvirt nodedev daemon...
Jan 23 04:22:57 np0005593234 systemd[1]: Started libvirt nodedev daemon.
Jan 23 04:22:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:57.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.953 227766 WARNING nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.955 227766 DEBUG nova.compute.resource_tracker [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5256MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.956 227766 DEBUG oslo_concurrency.lockutils [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.956 227766 DEBUG oslo_concurrency.lockutils [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.975 227766 WARNING nova.compute.resource_tracker [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] No compute node record for compute-2.ctlplane.example.com:89873210-bee9-46e9-9f9d-0cd7a156c3a8: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 89873210-bee9-46e9-9f9d-0cd7a156c3a8 could not be found.#033[00m
Jan 23 04:22:57 np0005593234 nova_compute[227762]: 2026-01-23 09:22:57.997 227766 INFO nova.compute.resource_tracker [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: 89873210-bee9-46e9-9f9d-0cd7a156c3a8#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.063 227766 DEBUG nova.compute.resource_tracker [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.063 227766 DEBUG nova.compute.resource_tracker [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.222 227766 INFO nova.scheduler.client.report [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [req-4c1063a4-43ae-4e3d-a67e-59436662b091] Created resource provider record via placement API for resource provider with UUID 89873210-bee9-46e9-9f9d-0cd7a156c3a8 and name compute-2.ctlplane.example.com.#033[00m
Jan 23 04:22:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.279 227766 DEBUG oslo_concurrency.processutils [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:22:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:22:58.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:22:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:22:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/384229591' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.736 227766 DEBUG oslo_concurrency.processutils [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.742 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 23 04:22:58 np0005593234 nova_compute[227762]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.742 227766 INFO nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] kernel doesn't support AMD SEV#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.743 227766 DEBUG nova.compute.provider_tree [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.744 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.747 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Libvirt baseline CPU <cpu>
Jan 23 04:22:58 np0005593234 nova_compute[227762]:  <arch>x86_64</arch>
Jan 23 04:22:58 np0005593234 nova_compute[227762]:  <model>Nehalem</model>
Jan 23 04:22:58 np0005593234 nova_compute[227762]:  <vendor>AMD</vendor>
Jan 23 04:22:58 np0005593234 nova_compute[227762]:  <topology sockets="8" cores="1" threads="1"/>
Jan 23 04:22:58 np0005593234 nova_compute[227762]: </cpu>
Jan 23 04:22:58 np0005593234 nova_compute[227762]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.796 227766 DEBUG nova.scheduler.client.report [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Updated inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.797 227766 DEBUG nova.compute.provider_tree [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Updating resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.797 227766 DEBUG nova.compute.provider_tree [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.867 227766 DEBUG nova.compute.provider_tree [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Updating resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.895 227766 DEBUG nova.compute.resource_tracker [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.895 227766 DEBUG oslo_concurrency.lockutils [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.939s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.896 227766 DEBUG nova.service [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.955 227766 DEBUG nova.service [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Jan 23 04:22:58 np0005593234 nova_compute[227762]: 2026-01-23 09:22:58.955 227766 DEBUG nova.servicegroup.drivers.db [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Jan 23 04:22:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:22:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:22:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:22:59.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:00.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:01.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:02.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:02 np0005593234 podman[228133]: 2026-01-23 09:23:02.831438718 +0000 UTC m=+0.128048001 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:23:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:23:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:03.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:04.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:05.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:06.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:07.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:23:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:08.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:09.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:10.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:11.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:12.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:23:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:13.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:14.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:23:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:15.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:23:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:16.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:17.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:23:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:18.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:19 np0005593234 podman[228220]: 2026-01-23 09:23:19.761599554 +0000 UTC m=+0.052615415 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:23:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:19.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:20.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:21.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:22.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:23:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:23.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:24.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:25.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:26.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.810076) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160207810186, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 2339, "num_deletes": 251, "total_data_size": 6016969, "memory_usage": 6107504, "flush_reason": "Manual Compaction"}
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160207835680, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 3941775, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17721, "largest_seqno": 20055, "table_properties": {"data_size": 3932267, "index_size": 6066, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 18686, "raw_average_key_size": 19, "raw_value_size": 3913459, "raw_average_value_size": 4185, "num_data_blocks": 271, "num_entries": 935, "num_filter_entries": 935, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769159970, "oldest_key_time": 1769159970, "file_creation_time": 1769160207, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 25626 microseconds, and 8412 cpu microseconds.
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.835738) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 3941775 bytes OK
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.835757) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.838203) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.838216) EVENT_LOG_v1 {"time_micros": 1769160207838211, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.838236) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 6006753, prev total WAL file size 6006753, number of live WAL files 2.
Jan 23 04:23:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.839767) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Jan 23 04:23:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(3849KB)], [36(7769KB)]
Jan 23 04:23:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:27.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160207839877, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 11897822, "oldest_snapshot_seqno": -1}
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 4467 keys, 9853502 bytes, temperature: kUnknown
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160207926167, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 9853502, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9820721, "index_size": 20511, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11205, "raw_key_size": 111562, "raw_average_key_size": 24, "raw_value_size": 9736793, "raw_average_value_size": 2179, "num_data_blocks": 854, "num_entries": 4467, "num_filter_entries": 4467, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769160207, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.926463) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 9853502 bytes
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.927852) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 137.7 rd, 114.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.8, 7.6 +0.0 blob) out(9.4 +0.0 blob), read-write-amplify(5.5) write-amplify(2.5) OK, records in: 4986, records dropped: 519 output_compression: NoCompression
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.927867) EVENT_LOG_v1 {"time_micros": 1769160207927860, "job": 20, "event": "compaction_finished", "compaction_time_micros": 86399, "compaction_time_cpu_micros": 26732, "output_level": 6, "num_output_files": 1, "total_output_size": 9853502, "num_input_records": 4986, "num_output_records": 4467, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160207928720, "job": 20, "event": "table_file_deletion", "file_number": 38}
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160207930219, "job": 20, "event": "table_file_deletion", "file_number": 36}
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.839660) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.930311) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.930316) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.930319) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.930320) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:23:27 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:23:27.930322) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:23:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:23:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:23:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:28.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:23:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:29.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:30.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:31.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:23:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2302531878' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:23:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:23:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2302531878' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:23:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:32.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:23:33 np0005593234 podman[228298]: 2026-01-23 09:23:33.781404837 +0000 UTC m=+0.082384805 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 23 04:23:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:33.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:23:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:34.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:23:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:35.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:23:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:36.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:23:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:37.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:23:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:38.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:39.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:40.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:41.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:42.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:23:42.798 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:23:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:23:42.799 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:23:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:23:42.799 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:23:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:23:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:43.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:23:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:44.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:23:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:45.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:46.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:23:47.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:23:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:23:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:23:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:23:48.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:23:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:26:59 np0005593234 rsyslogd[1006]: imjournal: 1028 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 23 04:27:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:00.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:27:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:00.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:27:01 np0005593234 nova_compute[227762]: 2026-01-23 09:27:01.512 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:01 np0005593234 nova_compute[227762]: 2026-01-23 09:27:01.513 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:02.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.132 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.132 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.132 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.161 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.162 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.162 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.162 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.162 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.163 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.163 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.163 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.163 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.201 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.202 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.202 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.202 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.202 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:27:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:27:02 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3745844446' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.633 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:27:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:02.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.772 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.773 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5317MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.774 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.774 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.874 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.875 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:27:02 np0005593234 nova_compute[227762]: 2026-01-23 09:27:02.898 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:27:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:27:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:27:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/399743735' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:27:03 np0005593234 nova_compute[227762]: 2026-01-23 09:27:03.326 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:27:03 np0005593234 nova_compute[227762]: 2026-01-23 09:27:03.332 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:27:03 np0005593234 nova_compute[227762]: 2026-01-23 09:27:03.365 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:27:03 np0005593234 nova_compute[227762]: 2026-01-23 09:27:03.367 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:27:03 np0005593234 nova_compute[227762]: 2026-01-23 09:27:03.368 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:27:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:04.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:04.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:06.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:06.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:08.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:27:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:27:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:08.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:27:08 np0005593234 podman[229995]: 2026-01-23 09:27:08.817312525 +0000 UTC m=+0.096787230 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 23 04:27:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:10.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:10.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.748723) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160431748784, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1147, "num_deletes": 255, "total_data_size": 2491277, "memory_usage": 2536808, "flush_reason": "Manual Compaction"}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160431760167, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 1645726, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21491, "largest_seqno": 22633, "table_properties": {"data_size": 1640703, "index_size": 2547, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10519, "raw_average_key_size": 18, "raw_value_size": 1630543, "raw_average_value_size": 2916, "num_data_blocks": 114, "num_entries": 559, "num_filter_entries": 559, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769160337, "oldest_key_time": 1769160337, "file_creation_time": 1769160431, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 11495 microseconds, and 4372 cpu microseconds.
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.760225) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 1645726 bytes OK
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.760247) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.762327) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.762340) EVENT_LOG_v1 {"time_micros": 1769160431762336, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.762360) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 2485693, prev total WAL file size 2485693, number of live WAL files 2.
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.763035) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323533' seq:72057594037927935, type:22 .. '6C6F676D00353034' seq:0, type:0; will stop at (end)
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(1607KB)], [42(7981KB)]
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160431763121, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 9818901, "oldest_snapshot_seqno": -1}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 4637 keys, 9646035 bytes, temperature: kUnknown
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160431822389, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 9646035, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9613350, "index_size": 19967, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11653, "raw_key_size": 116472, "raw_average_key_size": 25, "raw_value_size": 9527594, "raw_average_value_size": 2054, "num_data_blocks": 827, "num_entries": 4637, "num_filter_entries": 4637, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769160431, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.822763) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 9646035 bytes
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.824047) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 165.4 rd, 162.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 7.8 +0.0 blob) out(9.2 +0.0 blob), read-write-amplify(11.8) write-amplify(5.9) OK, records in: 5162, records dropped: 525 output_compression: NoCompression
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.824066) EVENT_LOG_v1 {"time_micros": 1769160431824057, "job": 24, "event": "compaction_finished", "compaction_time_micros": 59371, "compaction_time_cpu_micros": 19242, "output_level": 6, "num_output_files": 1, "total_output_size": 9646035, "num_input_records": 5162, "num_output_records": 4637, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160431824415, "job": 24, "event": "table_file_deletion", "file_number": 44}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160431825792, "job": 24, "event": "table_file_deletion", "file_number": 42}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.762970) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.825886) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.825891) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.825893) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.825894) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.825896) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.826258) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160431826292, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 257, "num_deletes": 251, "total_data_size": 21496, "memory_usage": 27960, "flush_reason": "Manual Compaction"}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160431828004, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 13305, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22635, "largest_seqno": 22890, "table_properties": {"data_size": 11552, "index_size": 50, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 645, "raw_key_size": 4640, "raw_average_key_size": 18, "raw_value_size": 8177, "raw_average_value_size": 31, "num_data_blocks": 2, "num_entries": 256, "num_filter_entries": 256, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769160431, "oldest_key_time": 1769160431, "file_creation_time": 1769160431, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 1772 microseconds, and 573 cpu microseconds.
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.828029) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 13305 bytes OK
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.828042) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.829785) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.829796) EVENT_LOG_v1 {"time_micros": 1769160431829793, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.829811) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 19470, prev total WAL file size 19470, number of live WAL files 2.
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.830052) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(12KB)], [45(9419KB)]
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160431830093, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 9659340, "oldest_snapshot_seqno": -1}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 4387 keys, 7615027 bytes, temperature: kUnknown
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160431873380, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 7615027, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 7585808, "index_size": 17141, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 11013, "raw_key_size": 111972, "raw_average_key_size": 25, "raw_value_size": 7506134, "raw_average_value_size": 1710, "num_data_blocks": 699, "num_entries": 4387, "num_filter_entries": 4387, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769160431, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.873700) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 7615027 bytes
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.875490) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 222.6 rd, 175.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.0, 9.2 +0.0 blob) out(7.3 +0.0 blob), read-write-amplify(1298.3) write-amplify(572.3) OK, records in: 4893, records dropped: 506 output_compression: NoCompression
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.875513) EVENT_LOG_v1 {"time_micros": 1769160431875503, "job": 26, "event": "compaction_finished", "compaction_time_micros": 43395, "compaction_time_cpu_micros": 16084, "output_level": 6, "num_output_files": 1, "total_output_size": 7615027, "num_input_records": 4893, "num_output_records": 4387, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160431875649, "job": 26, "event": "table_file_deletion", "file_number": 47}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160431877415, "job": 26, "event": "table_file_deletion", "file_number": 45}
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.829982) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.877525) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.877531) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.877533) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.877534) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:27:11 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:27:11.877535) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:27:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:27:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:12.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:27:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:12.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:27:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:14.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:14.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:16.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:27:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:16.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:27:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:27:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:27:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:27:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:27:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:27:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:18.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:27:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:18.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:20.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:20.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=404 latency=0.002000063s ======
Jan 23 04:27:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:20.820 +0000] "GET /healthcheck HTTP/1.1" 404 240 - "python-urllib3/1.26.5" - latency=0.002000063s
Jan 23 04:27:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:27:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:22.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:27:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:27:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:22.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:27:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 04:27:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:27:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:27:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:24.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:27:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:24.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:27:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:26.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:26.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e135 e135: 3 total, 3 up, 3 in
Jan 23 04:27:27 np0005593234 podman[230238]: 2026-01-23 09:27:27.465554863 +0000 UTC m=+0.055857298 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 23 04:27:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:27:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:28.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:27:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e136 e136: 3 total, 3 up, 3 in
Jan 23 04:27:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:27:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:27:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:28.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:27:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:30.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e137 e137: 3 total, 3 up, 3 in
Jan 23 04:27:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:27:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:30.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:27:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:32.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:27:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:32.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:27:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e137 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:27:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e138 e138: 3 total, 3 up, 3 in
Jan 23 04:27:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:27:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:34.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:27:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:27:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:34.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:27:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:36.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:36.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e139 e139: 3 total, 3 up, 3 in
Jan 23 04:27:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:38.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:27:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:27:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:38.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:27:39 np0005593234 podman[230290]: 2026-01-23 09:27:39.7835156 +0000 UTC m=+0.077657095 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:27:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:40.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:40.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:42.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:27:42.803 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:27:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:27:42.804 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:27:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:27:42.804 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:27:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:42.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:27:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:44.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:27:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/425157363' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:27:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:27:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/425157363' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:27:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:44.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:46.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:47.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:48.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:27:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:49.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:50.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:51.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:52.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:27:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:53.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:54.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:27:54.448 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:27:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:27:54.450 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:27:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:27:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:55.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:27:55 np0005593234 nova_compute[227762]: 2026-01-23 09:27:55.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:55 np0005593234 nova_compute[227762]: 2026-01-23 09:27:55.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 04:27:55 np0005593234 nova_compute[227762]: 2026-01-23 09:27:55.768 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 04:27:55 np0005593234 nova_compute[227762]: 2026-01-23 09:27:55.769 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:55 np0005593234 nova_compute[227762]: 2026-01-23 09:27:55.770 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 04:27:55 np0005593234 nova_compute[227762]: 2026-01-23 09:27:55.784 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:56.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:57.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:57 np0005593234 podman[230376]: 2026-01-23 09:27:57.791793375 +0000 UTC m=+0.082851318 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:27:57 np0005593234 nova_compute[227762]: 2026-01-23 09:27:57.800 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:27:58.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:27:58 np0005593234 nova_compute[227762]: 2026-01-23 09:27:58.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:27:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:27:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:27:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:27:59.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:27:59 np0005593234 nova_compute[227762]: 2026-01-23 09:27:59.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:28:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:00.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:00 np0005593234 nova_compute[227762]: 2026-01-23 09:28:00.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:28:00 np0005593234 nova_compute[227762]: 2026-01-23 09:28:00.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:28:00 np0005593234 nova_compute[227762]: 2026-01-23 09:28:00.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:28:00 np0005593234 nova_compute[227762]: 2026-01-23 09:28:00.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:28:00 np0005593234 nova_compute[227762]: 2026-01-23 09:28:00.772 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:28:00 np0005593234 nova_compute[227762]: 2026-01-23 09:28:00.772 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:28:00 np0005593234 nova_compute[227762]: 2026-01-23 09:28:00.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:28:00 np0005593234 nova_compute[227762]: 2026-01-23 09:28:00.773 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:28:00 np0005593234 nova_compute[227762]: 2026-01-23 09:28:00.774 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:28:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:28:01 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/330733492' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:28:01 np0005593234 nova_compute[227762]: 2026-01-23 09:28:01.226 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:28:01 np0005593234 nova_compute[227762]: 2026-01-23 09:28:01.371 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:28:01 np0005593234 nova_compute[227762]: 2026-01-23 09:28:01.372 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5300MB free_disk=20.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:28:01 np0005593234 nova_compute[227762]: 2026-01-23 09:28:01.372 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:28:01 np0005593234 nova_compute[227762]: 2026-01-23 09:28:01.372 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:28:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:28:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:01.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:28:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:28:01.452 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:28:01 np0005593234 nova_compute[227762]: 2026-01-23 09:28:01.565 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:28:01 np0005593234 nova_compute[227762]: 2026-01-23 09:28:01.565 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:28:01 np0005593234 nova_compute[227762]: 2026-01-23 09:28:01.664 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:28:02 np0005593234 nova_compute[227762]: 2026-01-23 09:28:02.015 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:28:02 np0005593234 nova_compute[227762]: 2026-01-23 09:28:02.016 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:28:02 np0005593234 nova_compute[227762]: 2026-01-23 09:28:02.049 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:28:02 np0005593234 nova_compute[227762]: 2026-01-23 09:28:02.080 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:28:02 np0005593234 nova_compute[227762]: 2026-01-23 09:28:02.138 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:28:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:02.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:28:02 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/800715591' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:28:02 np0005593234 nova_compute[227762]: 2026-01-23 09:28:02.606 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:28:02 np0005593234 nova_compute[227762]: 2026-01-23 09:28:02.611 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:28:02 np0005593234 nova_compute[227762]: 2026-01-23 09:28:02.651 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:28:02 np0005593234 nova_compute[227762]: 2026-01-23 09:28:02.652 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:28:02 np0005593234 nova_compute[227762]: 2026-01-23 09:28:02.652 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.280s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:28:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:28:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:03.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:03 np0005593234 nova_compute[227762]: 2026-01-23 09:28:03.652 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:28:03 np0005593234 nova_compute[227762]: 2026-01-23 09:28:03.653 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:28:03 np0005593234 nova_compute[227762]: 2026-01-23 09:28:03.653 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 04:28:03 np0005593234 nova_compute[227762]: 2026-01-23 09:28:03.653 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 04:28:03 np0005593234 nova_compute[227762]: 2026-01-23 09:28:03.684 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 04:28:03 np0005593234 nova_compute[227762]: 2026-01-23 09:28:03.684 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:28:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:28:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:04.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:28:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:05.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:06.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:07.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e140 e140: 3 total, 3 up, 3 in
Jan 23 04:28:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:08.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e140 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:28:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e141 e141: 3 total, 3 up, 3 in
Jan 23 04:28:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:09.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:10.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:10 np0005593234 podman[230496]: 2026-01-23 09:28:10.824777387 +0000 UTC m=+0.118747482 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:28:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:11.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:12.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:28:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:13.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:14.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:15.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:16.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e142 e142: 3 total, 3 up, 3 in
Jan 23 04:28:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:17.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:18.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:28:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:28:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:19.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:28:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:20.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:28:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:21.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:28:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:22.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:23 np0005593234 nova_compute[227762]: 2026-01-23 09:28:23.057 227766 DEBUG oslo_concurrency.lockutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Acquiring lock "7ae8913c-d2d4-493a-86e7-978465ea0f9a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:28:23 np0005593234 nova_compute[227762]: 2026-01-23 09:28:23.057 227766 DEBUG oslo_concurrency.lockutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Lock "7ae8913c-d2d4-493a-86e7-978465ea0f9a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:28:23 np0005593234 nova_compute[227762]: 2026-01-23 09:28:23.101 227766 DEBUG nova.compute.manager [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 04:28:23 np0005593234 nova_compute[227762]: 2026-01-23 09:28:23.262 227766 DEBUG oslo_concurrency.lockutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:28:23 np0005593234 nova_compute[227762]: 2026-01-23 09:28:23.262 227766 DEBUG oslo_concurrency.lockutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:28:23 np0005593234 nova_compute[227762]: 2026-01-23 09:28:23.271 227766 DEBUG nova.virt.hardware [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 04:28:23 np0005593234 nova_compute[227762]: 2026-01-23 09:28:23.271 227766 INFO nova.compute.claims [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Claim successful on node compute-2.ctlplane.example.com
Jan 23 04:28:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:28:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:23.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:23 np0005593234 nova_compute[227762]: 2026-01-23 09:28:23.834 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:28:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:24.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:28:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1410376294' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.299 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.307 227766 DEBUG nova.compute.provider_tree [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.327 227766 DEBUG nova.scheduler.client.report [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.366 227766 DEBUG oslo_concurrency.lockutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.367 227766 DEBUG nova.compute.manager [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.466 227766 DEBUG nova.compute.manager [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.467 227766 DEBUG nova.network.neutron [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.515 227766 INFO nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.542 227766 DEBUG nova.compute.manager [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.674 227766 DEBUG nova.compute.manager [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.676 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.676 227766 INFO nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Creating image(s)
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.706 227766 DEBUG nova.storage.rbd_utils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] rbd image 7ae8913c-d2d4-493a-86e7-978465ea0f9a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.728 227766 DEBUG nova.storage.rbd_utils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] rbd image 7ae8913c-d2d4-493a-86e7-978465ea0f9a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.753 227766 DEBUG nova.storage.rbd_utils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] rbd image 7ae8913c-d2d4-493a-86e7-978465ea0f9a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.757 227766 DEBUG oslo_concurrency.lockutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.758 227766 DEBUG oslo_concurrency.lockutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.763 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Acquiring lock "6ce66043-c3e3-4988-976a-2ba903e63d87" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.764 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:28:24 np0005593234 nova_compute[227762]: 2026-01-23 09:28:24.836 227766 DEBUG nova.compute.manager [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 04:28:25 np0005593234 nova_compute[227762]: 2026-01-23 09:28:25.033 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:28:25 np0005593234 nova_compute[227762]: 2026-01-23 09:28:25.033 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:28:25 np0005593234 nova_compute[227762]: 2026-01-23 09:28:25.043 227766 DEBUG nova.virt.hardware [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 04:28:25 np0005593234 nova_compute[227762]: 2026-01-23 09:28:25.043 227766 INFO nova.compute.claims [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Claim successful on node compute-2.ctlplane.example.com
Jan 23 04:28:25 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:28:25 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:28:25 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:28:25 np0005593234 nova_compute[227762]: 2026-01-23 09:28:25.274 227766 DEBUG oslo_concurrency.processutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:28:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:25.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:28:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3906381425' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:28:25 np0005593234 nova_compute[227762]: 2026-01-23 09:28:25.751 227766 DEBUG oslo_concurrency.processutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:28:25 np0005593234 nova_compute[227762]: 2026-01-23 09:28:25.757 227766 DEBUG nova.compute.provider_tree [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:28:25 np0005593234 nova_compute[227762]: 2026-01-23 09:28:25.781 227766 DEBUG nova.scheduler.client.report [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:28:25 np0005593234 nova_compute[227762]: 2026-01-23 09:28:25.808 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:28:25 np0005593234 nova_compute[227762]: 2026-01-23 09:28:25.809 227766 DEBUG nova.compute.manager [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 04:28:25 np0005593234 nova_compute[227762]: 2026-01-23 09:28:25.872 227766 DEBUG nova.compute.manager [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 04:28:25 np0005593234 nova_compute[227762]: 2026-01-23 09:28:25.873 227766 DEBUG nova.network.neutron [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 04:28:25 np0005593234 nova_compute[227762]: 2026-01-23 09:28:25.915 227766 INFO nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 04:28:25 np0005593234 nova_compute[227762]: 2026-01-23 09:28:25.945 227766 DEBUG nova.compute.manager [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 04:28:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:26.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:26 np0005593234 nova_compute[227762]: 2026-01-23 09:28:26.268 227766 DEBUG nova.compute.manager [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:28:26 np0005593234 nova_compute[227762]: 2026-01-23 09:28:26.269 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:28:26 np0005593234 nova_compute[227762]: 2026-01-23 09:28:26.269 227766 INFO nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Creating image(s)#033[00m
Jan 23 04:28:26 np0005593234 nova_compute[227762]: 2026-01-23 09:28:26.295 227766 DEBUG nova.storage.rbd_utils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] rbd image 6ce66043-c3e3-4988-976a-2ba903e63d87_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:28:26 np0005593234 nova_compute[227762]: 2026-01-23 09:28:26.324 227766 DEBUG nova.storage.rbd_utils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] rbd image 6ce66043-c3e3-4988-976a-2ba903e63d87_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:28:26 np0005593234 nova_compute[227762]: 2026-01-23 09:28:26.353 227766 DEBUG nova.storage.rbd_utils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] rbd image 6ce66043-c3e3-4988-976a-2ba903e63d87_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:28:26 np0005593234 nova_compute[227762]: 2026-01-23 09:28:26.358 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:28:26 np0005593234 nova_compute[227762]: 2026-01-23 09:28:26.360 227766 DEBUG nova.virt.libvirt.imagebackend [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/84c0ef19-7f67-4bd3-95d8-507c3e0942ed/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/84c0ef19-7f67-4bd3-95d8-507c3e0942ed/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 04:28:26 np0005593234 nova_compute[227762]: 2026-01-23 09:28:26.412 227766 DEBUG nova.network.neutron [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 23 04:28:26 np0005593234 nova_compute[227762]: 2026-01-23 09:28:26.412 227766 DEBUG nova.compute.manager [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:28:27 np0005593234 nova_compute[227762]: 2026-01-23 09:28:27.366 227766 DEBUG nova.network.neutron [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Automatically allocating a network for project 7ce4d2b2bd9d4e648ef6fd351b972262. _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2460#033[00m
Jan 23 04:28:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:27.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:27 np0005593234 podman[230863]: 2026-01-23 09:28:27.939238174 +0000 UTC m=+0.062876216 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 04:28:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:28.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.523 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.590 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2.part --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.591 227766 DEBUG nova.virt.images [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] 84c0ef19-7f67-4bd3-95d8-507c3e0942ed was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.593 227766 DEBUG nova.privsep.utils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.593 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2.part /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.768 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2.part /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2.converted" returned: 0 in 0.175s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.774 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.834 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2.converted --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.836 227766 DEBUG oslo_concurrency.lockutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 4.078s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.860 227766 DEBUG nova.storage.rbd_utils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] rbd image 7ae8913c-d2d4-493a-86e7-978465ea0f9a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.863 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 7ae8913c-d2d4-493a-86e7-978465ea0f9a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.880 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 2.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.881 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.904 227766 DEBUG nova.storage.rbd_utils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] rbd image 6ce66043-c3e3-4988-976a-2ba903e63d87_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:28:28 np0005593234 nova_compute[227762]: 2026-01-23 09:28:28.908 227766 DEBUG oslo_concurrency.processutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 6ce66043-c3e3-4988-976a-2ba903e63d87_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:28:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:29.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:30.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:31.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:32.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:32 np0005593234 nova_compute[227762]: 2026-01-23 09:28:32.286 227766 DEBUG oslo_concurrency.processutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 6ce66043-c3e3-4988-976a-2ba903e63d87_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.378s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:28:32 np0005593234 nova_compute[227762]: 2026-01-23 09:28:32.323 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 7ae8913c-d2d4-493a-86e7-978465ea0f9a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:28:32 np0005593234 nova_compute[227762]: 2026-01-23 09:28:32.425 227766 DEBUG nova.storage.rbd_utils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] resizing rbd image 6ce66043-c3e3-4988-976a-2ba903e63d87_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:28:32 np0005593234 nova_compute[227762]: 2026-01-23 09:28:32.760 227766 DEBUG nova.storage.rbd_utils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] resizing rbd image 7ae8913c-d2d4-493a-86e7-978465ea0f9a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.134 227766 DEBUG nova.objects.instance [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Lazy-loading 'migration_context' on Instance uuid 7ae8913c-d2d4-493a-86e7-978465ea0f9a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.140 227766 DEBUG nova.objects.instance [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lazy-loading 'migration_context' on Instance uuid 6ce66043-c3e3-4988-976a-2ba903e63d87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:28:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.342 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.343 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Ensure instance console log exists: /var/lib/nova/instances/7ae8913c-d2d4-493a-86e7-978465ea0f9a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.343 227766 DEBUG oslo_concurrency.lockutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.344 227766 DEBUG oslo_concurrency.lockutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.344 227766 DEBUG oslo_concurrency.lockutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.346 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.349 227766 WARNING nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.361 227766 DEBUG nova.virt.libvirt.host [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.361 227766 DEBUG nova.virt.libvirt.host [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.365 227766 DEBUG nova.virt.libvirt.host [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.365 227766 DEBUG nova.virt.libvirt.host [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.367 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.367 227766 DEBUG nova.virt.hardware [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.367 227766 DEBUG nova.virt.hardware [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.368 227766 DEBUG nova.virt.hardware [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.368 227766 DEBUG nova.virt.hardware [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.368 227766 DEBUG nova.virt.hardware [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.368 227766 DEBUG nova.virt.hardware [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.368 227766 DEBUG nova.virt.hardware [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.368 227766 DEBUG nova.virt.hardware [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.369 227766 DEBUG nova.virt.hardware [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.369 227766 DEBUG nova.virt.hardware [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.369 227766 DEBUG nova.virt.hardware [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.372 227766 DEBUG nova.privsep.utils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.372 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.400 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.401 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Ensure instance console log exists: /var/lib/nova/instances/6ce66043-c3e3-4988-976a-2ba903e63d87/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.402 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.402 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.402 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:28:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:33.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:28:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/664858025' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.916 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.544s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.940 227766 DEBUG nova.storage.rbd_utils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] rbd image 7ae8913c-d2d4-493a-86e7-978465ea0f9a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:28:33 np0005593234 nova_compute[227762]: 2026-01-23 09:28:33.944 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:28:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:34.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:28:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1782880531' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:28:34 np0005593234 nova_compute[227762]: 2026-01-23 09:28:34.447 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:28:34 np0005593234 nova_compute[227762]: 2026-01-23 09:28:34.451 227766 DEBUG nova.objects.instance [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7ae8913c-d2d4-493a-86e7-978465ea0f9a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:28:34 np0005593234 nova_compute[227762]: 2026-01-23 09:28:34.534 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  <uuid>7ae8913c-d2d4-493a-86e7-978465ea0f9a</uuid>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  <name>instance-00000002</name>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <nova:name>tempest-DeleteServersAdminTestJSON-server-1991243709</nova:name>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:28:33</nova:creationTime>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <nova:user uuid="f09d5ce458784af2bc1f1b44acf6a4de">tempest-DeleteServersAdminTestJSON-240543639-project-member</nova:user>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <nova:project uuid="6925f4bcdc5443728129167068ef60b3">tempest-DeleteServersAdminTestJSON-240543639</nova:project>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <entry name="serial">7ae8913c-d2d4-493a-86e7-978465ea0f9a</entry>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <entry name="uuid">7ae8913c-d2d4-493a-86e7-978465ea0f9a</entry>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/7ae8913c-d2d4-493a-86e7-978465ea0f9a_disk">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/7ae8913c-d2d4-493a-86e7-978465ea0f9a_disk.config">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/7ae8913c-d2d4-493a-86e7-978465ea0f9a/console.log" append="off"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:28:34 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:28:34 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:28:34 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:28:34 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 04:28:34 np0005593234 nova_compute[227762]: 2026-01-23 09:28:34.724 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:28:34 np0005593234 nova_compute[227762]: 2026-01-23 09:28:34.724 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:28:34 np0005593234 nova_compute[227762]: 2026-01-23 09:28:34.725 227766 INFO nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Using config drive
Jan 23 04:28:34 np0005593234 nova_compute[227762]: 2026-01-23 09:28:34.747 227766 DEBUG nova.storage.rbd_utils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] rbd image 7ae8913c-d2d4-493a-86e7-978465ea0f9a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:28:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:28:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:28:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:35.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:36.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:37 np0005593234 nova_compute[227762]: 2026-01-23 09:28:37.053 227766 INFO nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Creating config drive at /var/lib/nova/instances/7ae8913c-d2d4-493a-86e7-978465ea0f9a/disk.config
Jan 23 04:28:37 np0005593234 nova_compute[227762]: 2026-01-23 09:28:37.058 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7ae8913c-d2d4-493a-86e7-978465ea0f9a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8lxozmw9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:28:37 np0005593234 nova_compute[227762]: 2026-01-23 09:28:37.184 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7ae8913c-d2d4-493a-86e7-978465ea0f9a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8lxozmw9" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:28:37 np0005593234 nova_compute[227762]: 2026-01-23 09:28:37.211 227766 DEBUG nova.storage.rbd_utils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] rbd image 7ae8913c-d2d4-493a-86e7-978465ea0f9a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:28:37 np0005593234 nova_compute[227762]: 2026-01-23 09:28:37.214 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7ae8913c-d2d4-493a-86e7-978465ea0f9a/disk.config 7ae8913c-d2d4-493a-86e7-978465ea0f9a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:28:37 np0005593234 nova_compute[227762]: 2026-01-23 09:28:37.359 227766 DEBUG oslo_concurrency.processutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7ae8913c-d2d4-493a-86e7-978465ea0f9a/disk.config 7ae8913c-d2d4-493a-86e7-978465ea0f9a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:28:37 np0005593234 nova_compute[227762]: 2026-01-23 09:28:37.360 227766 INFO nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Deleting local config drive /var/lib/nova/instances/7ae8913c-d2d4-493a-86e7-978465ea0f9a/disk.config because it was imported into RBD.
Jan 23 04:28:37 np0005593234 systemd[1]: Starting libvirt secret daemon...
Jan 23 04:28:37 np0005593234 systemd[1]: Started libvirt secret daemon.
Jan 23 04:28:37 np0005593234 systemd-machined[195626]: New machine qemu-1-instance-00000002.
Jan 23 04:28:37 np0005593234 systemd[1]: Started Virtual Machine qemu-1-instance-00000002.
Jan 23 04:28:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:37.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:28:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:38.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:28:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:28:38 np0005593234 nova_compute[227762]: 2026-01-23 09:28:38.515 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160518.505538, 7ae8913c-d2d4-493a-86e7-978465ea0f9a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:28:38 np0005593234 nova_compute[227762]: 2026-01-23 09:28:38.517 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] VM Resumed (Lifecycle Event)
Jan 23 04:28:38 np0005593234 nova_compute[227762]: 2026-01-23 09:28:38.520 227766 DEBUG nova.compute.manager [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:28:38 np0005593234 nova_compute[227762]: 2026-01-23 09:28:38.520 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 04:28:38 np0005593234 nova_compute[227762]: 2026-01-23 09:28:38.526 227766 INFO nova.virt.libvirt.driver [-] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Instance spawned successfully.
Jan 23 04:28:38 np0005593234 nova_compute[227762]: 2026-01-23 09:28:38.527 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:28:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:39.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:40 np0005593234 nova_compute[227762]: 2026-01-23 09:28:40.034 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:28:40 np0005593234 nova_compute[227762]: 2026-01-23 09:28:40.037 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:28:40 np0005593234 nova_compute[227762]: 2026-01-23 09:28:40.096 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:28:40 np0005593234 nova_compute[227762]: 2026-01-23 09:28:40.096 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:28:40 np0005593234 nova_compute[227762]: 2026-01-23 09:28:40.097 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:28:40 np0005593234 nova_compute[227762]: 2026-01-23 09:28:40.097 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:28:40 np0005593234 nova_compute[227762]: 2026-01-23 09:28:40.098 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:28:40 np0005593234 nova_compute[227762]: 2026-01-23 09:28:40.098 227766 DEBUG nova.virt.libvirt.driver [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:28:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:40.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:40 np0005593234 nova_compute[227762]: 2026-01-23 09:28:40.617 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:28:40 np0005593234 nova_compute[227762]: 2026-01-23 09:28:40.618 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160518.5072832, 7ae8913c-d2d4-493a-86e7-978465ea0f9a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:28:40 np0005593234 nova_compute[227762]: 2026-01-23 09:28:40.618 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] VM Started (Lifecycle Event)
Jan 23 04:28:41 np0005593234 nova_compute[227762]: 2026-01-23 09:28:41.028 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:28:41 np0005593234 nova_compute[227762]: 2026-01-23 09:28:41.032 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:28:41 np0005593234 nova_compute[227762]: 2026-01-23 09:28:41.074 227766 INFO nova.compute.manager [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Took 16.40 seconds to spawn the instance on the hypervisor.
Jan 23 04:28:41 np0005593234 nova_compute[227762]: 2026-01-23 09:28:41.076 227766 DEBUG nova.compute.manager [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:28:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:41.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:41 np0005593234 nova_compute[227762]: 2026-01-23 09:28:41.610 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:28:41 np0005593234 podman[231369]: 2026-01-23 09:28:41.796710138 +0000 UTC m=+0.083314851 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 04:28:42 np0005593234 nova_compute[227762]: 2026-01-23 09:28:42.153 227766 INFO nova.compute.manager [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Took 18.94 seconds to build instance.
Jan 23 04:28:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:42.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:42 np0005593234 nova_compute[227762]: 2026-01-23 09:28:42.691 227766 DEBUG oslo_concurrency.lockutils [None req-a28e5ace-dc88-46e1-8405-af005c4e0148 f09d5ce458784af2bc1f1b44acf6a4de 6925f4bcdc5443728129167068ef60b3 - - default default] Lock "7ae8913c-d2d4-493a-86e7-978465ea0f9a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:28:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:28:42.804 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:28:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:28:42.805 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:28:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:28:42.806 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:28:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:28:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:43.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:44.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:28:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1665608111' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:28:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:28:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1665608111' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:28:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:45.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:46.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:47.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:48.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:28:49 np0005593234 nova_compute[227762]: 2026-01-23 09:28:49.472 227766 DEBUG oslo_concurrency.lockutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Acquiring lock "7ae8913c-d2d4-493a-86e7-978465ea0f9a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:28:49 np0005593234 nova_compute[227762]: 2026-01-23 09:28:49.473 227766 DEBUG oslo_concurrency.lockutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Lock "7ae8913c-d2d4-493a-86e7-978465ea0f9a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:28:49 np0005593234 nova_compute[227762]: 2026-01-23 09:28:49.473 227766 DEBUG oslo_concurrency.lockutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Acquiring lock "7ae8913c-d2d4-493a-86e7-978465ea0f9a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:28:49 np0005593234 nova_compute[227762]: 2026-01-23 09:28:49.474 227766 DEBUG oslo_concurrency.lockutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Lock "7ae8913c-d2d4-493a-86e7-978465ea0f9a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:28:49 np0005593234 nova_compute[227762]: 2026-01-23 09:28:49.474 227766 DEBUG oslo_concurrency.lockutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Lock "7ae8913c-d2d4-493a-86e7-978465ea0f9a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:28:49 np0005593234 nova_compute[227762]: 2026-01-23 09:28:49.475 227766 INFO nova.compute.manager [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Terminating instance
Jan 23 04:28:49 np0005593234 nova_compute[227762]: 2026-01-23 09:28:49.476 227766 DEBUG oslo_concurrency.lockutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Acquiring lock "refresh_cache-7ae8913c-d2d4-493a-86e7-978465ea0f9a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:28:49 np0005593234 nova_compute[227762]: 2026-01-23 09:28:49.476 227766 DEBUG oslo_concurrency.lockutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Acquired lock "refresh_cache-7ae8913c-d2d4-493a-86e7-978465ea0f9a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:28:49 np0005593234 nova_compute[227762]: 2026-01-23 09:28:49.476 227766 DEBUG nova.network.neutron [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 04:28:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:49.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:50.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:50 np0005593234 nova_compute[227762]: 2026-01-23 09:28:50.471 227766 DEBUG nova.network.neutron [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:28:50 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 23 04:28:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:28:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:51.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:28:52 np0005593234 nova_compute[227762]: 2026-01-23 09:28:52.105 227766 DEBUG nova.network.neutron [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:28:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:28:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:52.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:28:52 np0005593234 nova_compute[227762]: 2026-01-23 09:28:52.340 227766 DEBUG oslo_concurrency.lockutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Releasing lock "refresh_cache-7ae8913c-d2d4-493a-86e7-978465ea0f9a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:28:52 np0005593234 nova_compute[227762]: 2026-01-23 09:28:52.340 227766 DEBUG nova.compute.manager [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 04:28:52 np0005593234 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Deactivated successfully.
Jan 23 04:28:52 np0005593234 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000002.scope: Consumed 13.492s CPU time.
Jan 23 04:28:52 np0005593234 systemd-machined[195626]: Machine qemu-1-instance-00000002 terminated.
Jan 23 04:28:52 np0005593234 nova_compute[227762]: 2026-01-23 09:28:52.561 227766 INFO nova.virt.libvirt.driver [-] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Instance destroyed successfully.
Jan 23 04:28:52 np0005593234 nova_compute[227762]: 2026-01-23 09:28:52.562 227766 DEBUG nova.objects.instance [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Lazy-loading 'resources' on Instance uuid 7ae8913c-d2d4-493a-86e7-978465ea0f9a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:28:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:28:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:53.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:28:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:54.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:28:55 np0005593234 nova_compute[227762]: 2026-01-23 09:28:55.120 227766 INFO nova.virt.libvirt.driver [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Deleting instance files /var/lib/nova/instances/7ae8913c-d2d4-493a-86e7-978465ea0f9a_del
Jan 23 04:28:55 np0005593234 nova_compute[227762]: 2026-01-23 09:28:55.121 227766 INFO nova.virt.libvirt.driver [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Deletion of /var/lib/nova/instances/7ae8913c-d2d4-493a-86e7-978465ea0f9a_del complete
Jan 23 04:28:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:55.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:56.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:57 np0005593234 nova_compute[227762]: 2026-01-23 09:28:57.106 227766 DEBUG nova.virt.libvirt.host [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 23 04:28:57 np0005593234 nova_compute[227762]: 2026-01-23 09:28:57.107 227766 INFO nova.virt.libvirt.host [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] UEFI support detected
Jan 23 04:28:57 np0005593234 nova_compute[227762]: 2026-01-23 09:28:57.108 227766 INFO nova.compute.manager [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Took 4.77 seconds to destroy the instance on the hypervisor.
Jan 23 04:28:57 np0005593234 nova_compute[227762]: 2026-01-23 09:28:57.109 227766 DEBUG oslo.service.loopingcall [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 04:28:57 np0005593234 nova_compute[227762]: 2026-01-23 09:28:57.109 227766 DEBUG nova.compute.manager [-] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 04:28:57 np0005593234 nova_compute[227762]: 2026-01-23 09:28:57.110 227766 DEBUG nova.network.neutron [-] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 04:28:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:28:57.227 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 04:28:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:28:57.229 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 04:28:57 np0005593234 nova_compute[227762]: 2026-01-23 09:28:57.435 227766 DEBUG nova.network.neutron [-] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:28:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:57.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:58 np0005593234 nova_compute[227762]: 2026-01-23 09:28:58.166 227766 DEBUG nova.network.neutron [-] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:28:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:28:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:28:58.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:28:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:28:58 np0005593234 nova_compute[227762]: 2026-01-23 09:28:58.388 227766 INFO nova.compute.manager [-] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Took 1.28 seconds to deallocate network for instance.
Jan 23 04:28:58 np0005593234 nova_compute[227762]: 2026-01-23 09:28:58.680 227766 DEBUG oslo_concurrency.lockutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:28:58 np0005593234 nova_compute[227762]: 2026-01-23 09:28:58.681 227766 DEBUG oslo_concurrency.lockutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:28:58 np0005593234 nova_compute[227762]: 2026-01-23 09:28:58.752 227766 DEBUG oslo_concurrency.processutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:28:58 np0005593234 podman[231477]: 2026-01-23 09:28:58.7627095 +0000 UTC m=+0.053967949 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 04:28:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:28:59 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2393166179' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:28:59 np0005593234 nova_compute[227762]: 2026-01-23 09:28:59.205 227766 DEBUG oslo_concurrency.processutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:28:59 np0005593234 nova_compute[227762]: 2026-01-23 09:28:59.213 227766 DEBUG nova.compute.provider_tree [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 04:28:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:28:59.231 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:28:59 np0005593234 nova_compute[227762]: 2026-01-23 09:28:59.276 227766 ERROR nova.scheduler.client.report [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] [req-8339c4ac-84db-43c4-a55d-5f12cfc304c2] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 89873210-bee9-46e9-9f9d-0cd7a156c3a8.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-8339c4ac-84db-43c4-a55d-5f12cfc304c2"}]}
Jan 23 04:28:59 np0005593234 nova_compute[227762]: 2026-01-23 09:28:59.301 227766 DEBUG nova.scheduler.client.report [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:28:59 np0005593234 nova_compute[227762]: 2026-01-23 09:28:59.324 227766 DEBUG nova.scheduler.client.report [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:28:59 np0005593234 nova_compute[227762]: 2026-01-23 09:28:59.325 227766 DEBUG nova.compute.provider_tree [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 0, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:28:59 np0005593234 nova_compute[227762]: 2026-01-23 09:28:59.345 227766 DEBUG nova.scheduler.client.report [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:28:59 np0005593234 nova_compute[227762]: 2026-01-23 09:28:59.370 227766 DEBUG nova.scheduler.client.report [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:28:59 np0005593234 nova_compute[227762]: 2026-01-23 09:28:59.462 227766 DEBUG oslo_concurrency.processutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:28:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:28:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:28:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:28:59.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:28:59 np0005593234 nova_compute[227762]: 2026-01-23 09:28:59.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:28:59 np0005593234 nova_compute[227762]: 2026-01-23 09:28:59.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:28:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:28:59 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3687350542' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:28:59 np0005593234 nova_compute[227762]: 2026-01-23 09:28:59.937 227766 DEBUG oslo_concurrency.processutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:28:59 np0005593234 nova_compute[227762]: 2026-01-23 09:28:59.943 227766 DEBUG nova.compute.provider_tree [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:29:00 np0005593234 nova_compute[227762]: 2026-01-23 09:29:00.066 227766 DEBUG nova.scheduler.client.report [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Updated inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 20, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Jan 23 04:29:00 np0005593234 nova_compute[227762]: 2026-01-23 09:29:00.067 227766 DEBUG nova.compute.provider_tree [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Updating resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Jan 23 04:29:00 np0005593234 nova_compute[227762]: 2026-01-23 09:29:00.067 227766 DEBUG nova.compute.provider_tree [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:29:00 np0005593234 nova_compute[227762]: 2026-01-23 09:29:00.122 227766 DEBUG oslo_concurrency.lockutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:00.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:00 np0005593234 nova_compute[227762]: 2026-01-23 09:29:00.315 227766 INFO nova.scheduler.client.report [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Deleted allocations for instance 7ae8913c-d2d4-493a-86e7-978465ea0f9a#033[00m
Jan 23 04:29:00 np0005593234 nova_compute[227762]: 2026-01-23 09:29:00.475 227766 DEBUG oslo_concurrency.lockutils [None req-e6e54a10-16f4-4e5f-830b-ca09beae0aec 5d35ed555832443fb6e89b0efee3d383 72813715a19d4d8d941848a1faa0ecce - - default default] Lock "7ae8913c-d2d4-493a-86e7-978465ea0f9a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 11.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:00 np0005593234 nova_compute[227762]: 2026-01-23 09:29:00.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:29:00 np0005593234 nova_compute[227762]: 2026-01-23 09:29:00.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:29:00 np0005593234 nova_compute[227762]: 2026-01-23 09:29:00.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:29:00 np0005593234 nova_compute[227762]: 2026-01-23 09:29:00.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:29:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:29:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:01.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:29:01 np0005593234 nova_compute[227762]: 2026-01-23 09:29:01.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:29:01 np0005593234 nova_compute[227762]: 2026-01-23 09:29:01.776 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:01 np0005593234 nova_compute[227762]: 2026-01-23 09:29:01.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:01 np0005593234 nova_compute[227762]: 2026-01-23 09:29:01.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:01 np0005593234 nova_compute[227762]: 2026-01-23 09:29:01.777 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:29:01 np0005593234 nova_compute[227762]: 2026-01-23 09:29:01.778 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:29:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:29:02 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1363277558' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:29:02 np0005593234 nova_compute[227762]: 2026-01-23 09:29:02.220 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:29:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:02.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:02 np0005593234 nova_compute[227762]: 2026-01-23 09:29:02.399 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:29:02 np0005593234 nova_compute[227762]: 2026-01-23 09:29:02.401 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5252MB free_disk=20.880550384521484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:29:02 np0005593234 nova_compute[227762]: 2026-01-23 09:29:02.402 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:02 np0005593234 nova_compute[227762]: 2026-01-23 09:29:02.402 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:02 np0005593234 nova_compute[227762]: 2026-01-23 09:29:02.511 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 6ce66043-c3e3-4988-976a-2ba903e63d87 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:29:02 np0005593234 nova_compute[227762]: 2026-01-23 09:29:02.512 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:29:02 np0005593234 nova_compute[227762]: 2026-01-23 09:29:02.512 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:29:02 np0005593234 nova_compute[227762]: 2026-01-23 09:29:02.568 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:29:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:29:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/591400063' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:29:03 np0005593234 nova_compute[227762]: 2026-01-23 09:29:03.027 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:29:03 np0005593234 nova_compute[227762]: 2026-01-23 09:29:03.034 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:29:03 np0005593234 nova_compute[227762]: 2026-01-23 09:29:03.081 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:29:03 np0005593234 nova_compute[227762]: 2026-01-23 09:29:03.122 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:29:03 np0005593234 nova_compute[227762]: 2026-01-23 09:29:03.123 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:29:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:03.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:04 np0005593234 nova_compute[227762]: 2026-01-23 09:29:04.123 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:29:04 np0005593234 nova_compute[227762]: 2026-01-23 09:29:04.148 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:29:04 np0005593234 nova_compute[227762]: 2026-01-23 09:29:04.148 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:29:04 np0005593234 nova_compute[227762]: 2026-01-23 09:29:04.148 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:29:04 np0005593234 nova_compute[227762]: 2026-01-23 09:29:04.176 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 04:29:04 np0005593234 nova_compute[227762]: 2026-01-23 09:29:04.176 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:29:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:04.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:04 np0005593234 nova_compute[227762]: 2026-01-23 09:29:04.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:29:04 np0005593234 nova_compute[227762]: 2026-01-23 09:29:04.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:29:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e143 e143: 3 total, 3 up, 3 in
Jan 23 04:29:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:05.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:06.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e144 e144: 3 total, 3 up, 3 in
Jan 23 04:29:07 np0005593234 nova_compute[227762]: 2026-01-23 09:29:07.559 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160532.5573523, 7ae8913c-d2d4-493a-86e7-978465ea0f9a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:29:07 np0005593234 nova_compute[227762]: 2026-01-23 09:29:07.560 227766 INFO nova.compute.manager [-] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:29:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:07.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:07 np0005593234 nova_compute[227762]: 2026-01-23 09:29:07.659 227766 DEBUG nova.compute.manager [None req-f94e767f-0a32-45cf-b698-e7f5ad86a0aa - - - - - -] [instance: 7ae8913c-d2d4-493a-86e7-978465ea0f9a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:29:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:29:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:08.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:29:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:29:08 np0005593234 nova_compute[227762]: 2026-01-23 09:29:08.326 227766 DEBUG nova.network.neutron [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Automatically allocated network: {'id': 'f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7', 'name': 'auto_allocated_network', 'tenant_id': '7ce4d2b2bd9d4e648ef6fd351b972262', 'admin_state_up': True, 'mtu': 1442, 'status': 'ACTIVE', 'subnets': ['23d287b3-60e5-4a28-b726-19dba2a5fcc9', '327a0f17-5714-4333-a846-376efac64bfd'], 'shared': False, 'availability_zone_hints': [], 'availability_zones': [], 'ipv4_address_scope': None, 'ipv6_address_scope': None, 'router:external': False, 'description': '', 'qos_policy_id': None, 'port_security_enabled': True, 'dns_domain': '', 'l2_adjacency': True, 'tags': [], 'created_at': '2026-01-23T09:28:31Z', 'updated_at': '2026-01-23T09:29:07Z', 'revision_number': 4, 'project_id': '7ce4d2b2bd9d4e648ef6fd351b972262'} _auto_allocate_network /usr/lib/python3.9/site-packages/nova/network/neutron.py:2478#033[00m
Jan 23 04:29:08 np0005593234 nova_compute[227762]: 2026-01-23 09:29:08.337 227766 WARNING oslo_policy.policy [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Jan 23 04:29:08 np0005593234 nova_compute[227762]: 2026-01-23 09:29:08.337 227766 WARNING oslo_policy.policy [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Jan 23 04:29:08 np0005593234 nova_compute[227762]: 2026-01-23 09:29:08.339 227766 DEBUG nova.policy [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3f4fe5f838cb42d0ae4285971b115141', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7ce4d2b2bd9d4e648ef6fd351b972262', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:29:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:09.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:29:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:10.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:29:11 np0005593234 nova_compute[227762]: 2026-01-23 09:29:11.388 227766 DEBUG nova.network.neutron [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Successfully created port: 4f194d42-fe53-4ca6-a4fd-94fc9a92ffed _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:29:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:11.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:12.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:12 np0005593234 podman[231642]: 2026-01-23 09:29:12.803519818 +0000 UTC m=+0.093405985 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 04:29:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e144 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:29:13 np0005593234 nova_compute[227762]: 2026-01-23 09:29:13.335 227766 DEBUG nova.network.neutron [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Successfully updated port: 4f194d42-fe53-4ca6-a4fd-94fc9a92ffed _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:29:13 np0005593234 nova_compute[227762]: 2026-01-23 09:29:13.363 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Acquiring lock "refresh_cache-6ce66043-c3e3-4988-976a-2ba903e63d87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:29:13 np0005593234 nova_compute[227762]: 2026-01-23 09:29:13.364 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Acquired lock "refresh_cache-6ce66043-c3e3-4988-976a-2ba903e63d87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:29:13 np0005593234 nova_compute[227762]: 2026-01-23 09:29:13.364 227766 DEBUG nova.network.neutron [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:29:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:13.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:14 np0005593234 nova_compute[227762]: 2026-01-23 09:29:14.131 227766 DEBUG nova.network.neutron [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:29:14 np0005593234 nova_compute[227762]: 2026-01-23 09:29:14.188 227766 DEBUG nova.compute.manager [req-c5aa0b22-f2dc-42ac-ac09-1cf2136f9a8f req-dd9df5d7-4b01-43ce-97bf-a4d10224debd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Received event network-changed-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:29:14 np0005593234 nova_compute[227762]: 2026-01-23 09:29:14.189 227766 DEBUG nova.compute.manager [req-c5aa0b22-f2dc-42ac-ac09-1cf2136f9a8f req-dd9df5d7-4b01-43ce-97bf-a4d10224debd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Refreshing instance network info cache due to event network-changed-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:29:14 np0005593234 nova_compute[227762]: 2026-01-23 09:29:14.189 227766 DEBUG oslo_concurrency.lockutils [req-c5aa0b22-f2dc-42ac-ac09-1cf2136f9a8f req-dd9df5d7-4b01-43ce-97bf-a4d10224debd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6ce66043-c3e3-4988-976a-2ba903e63d87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:29:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:14.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:29:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:15.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:29:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:16.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 e145: 3 total, 3 up, 3 in
Jan 23 04:29:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:17.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:18.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.267 227766 DEBUG nova.network.neutron [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Updating instance_info_cache with network_info: [{"id": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "address": "fa:16:3e:47:9a:c0", "network": {"id": "f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::136", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ce4d2b2bd9d4e648ef6fd351b972262", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f194d42-fe", "ovs_interfaceid": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:29:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.306 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Releasing lock "refresh_cache-6ce66043-c3e3-4988-976a-2ba903e63d87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.306 227766 DEBUG nova.compute.manager [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Instance network_info: |[{"id": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "address": "fa:16:3e:47:9a:c0", "network": {"id": "f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::136", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ce4d2b2bd9d4e648ef6fd351b972262", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f194d42-fe", "ovs_interfaceid": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.307 227766 DEBUG oslo_concurrency.lockutils [req-c5aa0b22-f2dc-42ac-ac09-1cf2136f9a8f req-dd9df5d7-4b01-43ce-97bf-a4d10224debd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6ce66043-c3e3-4988-976a-2ba903e63d87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.307 227766 DEBUG nova.network.neutron [req-c5aa0b22-f2dc-42ac-ac09-1cf2136f9a8f req-dd9df5d7-4b01-43ce-97bf-a4d10224debd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Refreshing network info cache for port 4f194d42-fe53-4ca6-a4fd-94fc9a92ffed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.310 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Start _get_guest_xml network_info=[{"id": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "address": "fa:16:3e:47:9a:c0", "network": {"id": "f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::136", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ce4d2b2bd9d4e648ef6fd351b972262", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f194d42-fe", "ovs_interfaceid": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.316 227766 WARNING nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.327 227766 DEBUG nova.virt.libvirt.host [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.328 227766 DEBUG nova.virt.libvirt.host [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.336 227766 DEBUG nova.virt.libvirt.host [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.336 227766 DEBUG nova.virt.libvirt.host [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.338 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.338 227766 DEBUG nova.virt.hardware [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.338 227766 DEBUG nova.virt.hardware [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.339 227766 DEBUG nova.virt.hardware [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.339 227766 DEBUG nova.virt.hardware [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.339 227766 DEBUG nova.virt.hardware [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.339 227766 DEBUG nova.virt.hardware [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.340 227766 DEBUG nova.virt.hardware [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.340 227766 DEBUG nova.virt.hardware [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.340 227766 DEBUG nova.virt.hardware [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.340 227766 DEBUG nova.virt.hardware [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.341 227766 DEBUG nova.virt.hardware [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.344 227766 DEBUG oslo_concurrency.processutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:29:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:29:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/899823763' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.811 227766 DEBUG oslo_concurrency.processutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.841 227766 DEBUG nova.storage.rbd_utils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] rbd image 6ce66043-c3e3-4988-976a-2ba903e63d87_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:29:18 np0005593234 nova_compute[227762]: 2026-01-23 09:29:18.845 227766 DEBUG oslo_concurrency.processutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:29:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:29:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2825367113' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.340 227766 DEBUG oslo_concurrency.processutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.342 227766 DEBUG nova.virt.libvirt.vif [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:28:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-590995285-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-590995285-1',id=3,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ce4d2b2bd9d4e648ef6fd351b972262',ramdisk_id='',reservation_id='r-lqgmnwk0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-335645779',owner_user_name='tempest-AutoAllocateNetworkTest-335645779-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:28:26Z,user_data=None,user_id='3f4fe5f838cb42d0ae4285971b115141',uuid=6ce66043-c3e3-4988-976a-2ba903e63d87,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "address": "fa:16:3e:47:9a:c0", "network": {"id": "f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::136", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ce4d2b2bd9d4e648ef6fd351b972262", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f194d42-fe", "ovs_interfaceid": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.342 227766 DEBUG nova.network.os_vif_util [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Converting VIF {"id": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "address": "fa:16:3e:47:9a:c0", "network": {"id": "f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::136", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ce4d2b2bd9d4e648ef6fd351b972262", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f194d42-fe", "ovs_interfaceid": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.344 227766 DEBUG nova.network.os_vif_util [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:9a:c0,bridge_name='br-int',has_traffic_filtering=True,id=4f194d42-fe53-4ca6-a4fd-94fc9a92ffed,network=Network(f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f194d42-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.346 227766 DEBUG nova.objects.instance [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6ce66043-c3e3-4988-976a-2ba903e63d87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.369 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  <uuid>6ce66043-c3e3-4988-976a-2ba903e63d87</uuid>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  <name>instance-00000003</name>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <nova:name>tempest-tempest.common.compute-instance-590995285-1</nova:name>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:29:18</nova:creationTime>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <nova:user uuid="3f4fe5f838cb42d0ae4285971b115141">tempest-AutoAllocateNetworkTest-335645779-project-member</nova:user>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <nova:project uuid="7ce4d2b2bd9d4e648ef6fd351b972262">tempest-AutoAllocateNetworkTest-335645779</nova:project>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <nova:port uuid="4f194d42-fe53-4ca6-a4fd-94fc9a92ffed">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="fdfe:381f:8400:1::136" ipVersion="6"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.1.0.146" ipVersion="4"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <entry name="serial">6ce66043-c3e3-4988-976a-2ba903e63d87</entry>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <entry name="uuid">6ce66043-c3e3-4988-976a-2ba903e63d87</entry>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/6ce66043-c3e3-4988-976a-2ba903e63d87_disk">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/6ce66043-c3e3-4988-976a-2ba903e63d87_disk.config">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:47:9a:c0"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <target dev="tap4f194d42-fe"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/6ce66043-c3e3-4988-976a-2ba903e63d87/console.log" append="off"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:29:19 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:29:19 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:29:19 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:29:19 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.371 227766 DEBUG nova.compute.manager [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Preparing to wait for external event network-vif-plugged-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.371 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Acquiring lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.372 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.372 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.373 227766 DEBUG nova.virt.libvirt.vif [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:28:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-590995285-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-590995285-1',id=3,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7ce4d2b2bd9d4e648ef6fd351b972262',ramdisk_id='',reservation_id='r-lqgmnwk0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AutoAllocateNetworkTest-335645779',owner_user_name='tempest-AutoAllocateNetworkTest-335645779-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:28:26Z,user_data=None,user_id='3f4fe5f838cb42d0ae4285971b115141',uuid=6ce66043-c3e3-4988-976a-2ba903e63d87,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "address": "fa:16:3e:47:9a:c0", "network": {"id": "f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::136", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ce4d2b2bd9d4e648ef6fd351b972262", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f194d42-fe", "ovs_interfaceid": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.373 227766 DEBUG nova.network.os_vif_util [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Converting VIF {"id": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "address": "fa:16:3e:47:9a:c0", "network": {"id": "f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::136", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ce4d2b2bd9d4e648ef6fd351b972262", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f194d42-fe", "ovs_interfaceid": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.374 227766 DEBUG nova.network.os_vif_util [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:9a:c0,bridge_name='br-int',has_traffic_filtering=True,id=4f194d42-fe53-4ca6-a4fd-94fc9a92ffed,network=Network(f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f194d42-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.374 227766 DEBUG os_vif [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:9a:c0,bridge_name='br-int',has_traffic_filtering=True,id=4f194d42-fe53-4ca6-a4fd-94fc9a92ffed,network=Network(f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f194d42-fe') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.421 227766 DEBUG ovsdbapp.backend.ovs_idl [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.421 227766 DEBUG ovsdbapp.backend.ovs_idl [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.421 227766 DEBUG ovsdbapp.backend.ovs_idl [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.422 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.423 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [POLLOUT] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.423 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.424 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.440 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.441 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.441 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.442 227766 INFO oslo.privsep.daemon [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp530isaqo/privsep.sock']#033[00m
Jan 23 04:29:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:19.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.647 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Acquiring lock "872939ff-8eb8-4a0a-a32d-f1268af38264" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.648 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.710 227766 DEBUG nova.compute.manager [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.807 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.808 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.818 227766 DEBUG nova.virt.hardware [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 04:29:19 np0005593234 nova_compute[227762]: 2026-01-23 09:29:19.819 227766 INFO nova.compute.claims [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Claim successful on node compute-2.ctlplane.example.com
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.019 227766 DEBUG oslo_concurrency.processutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.176 227766 INFO oslo.privsep.daemon [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Spawned new privsep daemon via rootwrap
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.027 231740 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.031 231740 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.035 231740 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.035 231740 INFO oslo.privsep.daemon [-] privsep daemon running as pid 231740
Jan 23 04:29:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:29:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:20.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:29:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:29:20 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1896033368' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.475 227766 DEBUG oslo_concurrency.processutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.484 227766 DEBUG nova.compute.provider_tree [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.505 227766 DEBUG nova.scheduler.client.report [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.512 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.512 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f194d42-fe, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.513 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4f194d42-fe, col_values=(('external_ids', {'iface-id': '4f194d42-fe53-4ca6-a4fd-94fc9a92ffed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:47:9a:c0', 'vm-uuid': '6ce66043-c3e3-4988-976a-2ba903e63d87'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.515 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:29:20 np0005593234 NetworkManager[48942]: <info>  [1769160560.5160] manager: (tap4f194d42-fe): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.518 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.521 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.522 227766 INFO os_vif [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:9a:c0,bridge_name='br-int',has_traffic_filtering=True,id=4f194d42-fe53-4ca6-a4fd-94fc9a92ffed,network=Network(f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f194d42-fe')
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.554 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.747s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.555 227766 DEBUG nova.compute.manager [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.632 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.632 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.633 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] No VIF found with MAC fa:16:3e:47:9a:c0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.633 227766 INFO nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Using config drive
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.657 227766 DEBUG nova.storage.rbd_utils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] rbd image 6ce66043-c3e3-4988-976a-2ba903e63d87_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.665 227766 DEBUG nova.compute.manager [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.665 227766 DEBUG nova.network.neutron [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.699 227766 INFO nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.735 227766 DEBUG nova.compute.manager [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.875 227766 DEBUG nova.compute.manager [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.876 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.877 227766 INFO nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Creating image(s)
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.898 227766 DEBUG nova.storage.rbd_utils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] rbd image 872939ff-8eb8-4a0a-a32d-f1268af38264_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.923 227766 DEBUG nova.storage.rbd_utils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] rbd image 872939ff-8eb8-4a0a-a32d-f1268af38264_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.949 227766 DEBUG nova.storage.rbd_utils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] rbd image 872939ff-8eb8-4a0a-a32d-f1268af38264_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:29:20 np0005593234 nova_compute[227762]: 2026-01-23 09:29:20.953 227766 DEBUG oslo_concurrency.processutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:29:21 np0005593234 nova_compute[227762]: 2026-01-23 09:29:21.012 227766 DEBUG oslo_concurrency.processutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:29:21 np0005593234 nova_compute[227762]: 2026-01-23 09:29:21.013 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:29:21 np0005593234 nova_compute[227762]: 2026-01-23 09:29:21.014 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:29:21 np0005593234 nova_compute[227762]: 2026-01-23 09:29:21.015 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:29:21 np0005593234 nova_compute[227762]: 2026-01-23 09:29:21.037 227766 DEBUG nova.storage.rbd_utils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] rbd image 872939ff-8eb8-4a0a-a32d-f1268af38264_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:29:21 np0005593234 nova_compute[227762]: 2026-01-23 09:29:21.041 227766 DEBUG oslo_concurrency.processutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 872939ff-8eb8-4a0a-a32d-f1268af38264_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:29:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:29:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:21.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:29:21 np0005593234 nova_compute[227762]: 2026-01-23 09:29:21.759 227766 DEBUG nova.policy [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a75f3e5fbaff48e6a69b0a34b177d007', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'ebe73c1fb9f04cafa7ccf24cd83451f6', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 04:29:21 np0005593234 nova_compute[227762]: 2026-01-23 09:29:21.868 227766 DEBUG oslo_concurrency.processutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 872939ff-8eb8-4a0a-a32d-f1268af38264_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.827s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:29:21 np0005593234 nova_compute[227762]: 2026-01-23 09:29:21.943 227766 DEBUG nova.storage.rbd_utils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] resizing rbd image 872939ff-8eb8-4a0a-a32d-f1268af38264_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.136 227766 INFO nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Creating config drive at /var/lib/nova/instances/6ce66043-c3e3-4988-976a-2ba903e63d87/disk.config
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.142 227766 DEBUG oslo_concurrency.processutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6ce66043-c3e3-4988-976a-2ba903e63d87/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdrnrvpjh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.238 227766 DEBUG nova.objects.instance [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lazy-loading 'migration_context' on Instance uuid 872939ff-8eb8-4a0a-a32d-f1268af38264 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:29:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:22.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.269 227766 DEBUG oslo_concurrency.processutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6ce66043-c3e3-4988-976a-2ba903e63d87/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdrnrvpjh" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.296 227766 DEBUG nova.storage.rbd_utils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] rbd image 6ce66043-c3e3-4988-976a-2ba903e63d87_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.299 227766 DEBUG oslo_concurrency.processutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6ce66043-c3e3-4988-976a-2ba903e63d87/disk.config 6ce66043-c3e3-4988-976a-2ba903e63d87_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.321 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.322 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Ensure instance console log exists: /var/lib/nova/instances/872939ff-8eb8-4a0a-a32d-f1268af38264/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.322 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.322 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.323 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.490 227766 DEBUG oslo_concurrency.processutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6ce66043-c3e3-4988-976a-2ba903e63d87/disk.config 6ce66043-c3e3-4988-976a-2ba903e63d87_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.191s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.491 227766 INFO nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Deleting local config drive /var/lib/nova/instances/6ce66043-c3e3-4988-976a-2ba903e63d87/disk.config because it was imported into RBD.
Jan 23 04:29:22 np0005593234 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 23 04:29:22 np0005593234 kernel: tap4f194d42-fe: entered promiscuous mode
Jan 23 04:29:22 np0005593234 NetworkManager[48942]: <info>  [1769160562.5501] manager: (tap4f194d42-fe): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Jan 23 04:29:22 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:22Z|00027|binding|INFO|Claiming lport 4f194d42-fe53-4ca6-a4fd-94fc9a92ffed for this chassis.
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.551 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:29:22 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:22Z|00028|binding|INFO|4f194d42-fe53-4ca6-a4fd-94fc9a92ffed: Claiming fa:16:3e:47:9a:c0 10.1.0.146 fdfe:381f:8400:1::136
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.558 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:29:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:22.589 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:9a:c0 10.1.0.146 fdfe:381f:8400:1::136'], port_security=['fa:16:3e:47:9a:c0 10.1.0.146 fdfe:381f:8400:1::136'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.146/26 fdfe:381f:8400:1::136/64', 'neutron:device_id': '6ce66043-c3e3-4988-976a-2ba903e63d87', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ce4d2b2bd9d4e648ef6fd351b972262', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a1793278-8c6f-49e9-be94-8a60e6a54c2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5658ab5-291d-4119-8cb7-9ecc0ad5a8b4, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=4f194d42-fe53-4ca6-a4fd-94fc9a92ffed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 04:29:22 np0005593234 systemd-udevd[232009]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:29:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:22.591 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 4f194d42-fe53-4ca6-a4fd-94fc9a92ffed in datapath f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7 bound to our chassis
Jan 23 04:29:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:22.595 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7
Jan 23 04:29:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:22.598 144381 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmp9kkr6q35/privsep.sock']
Jan 23 04:29:22 np0005593234 NetworkManager[48942]: <info>  [1769160562.6056] device (tap4f194d42-fe): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:29:22 np0005593234 NetworkManager[48942]: <info>  [1769160562.6061] device (tap4f194d42-fe): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:29:22 np0005593234 systemd-machined[195626]: New machine qemu-2-instance-00000003.
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.638 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.640 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:22 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:22Z|00029|binding|INFO|Setting lport 4f194d42-fe53-4ca6-a4fd-94fc9a92ffed ovn-installed in OVS
Jan 23 04:29:22 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:22Z|00030|binding|INFO|Setting lport 4f194d42-fe53-4ca6-a4fd-94fc9a92ffed up in Southbound
Jan 23 04:29:22 np0005593234 systemd[1]: Started Virtual Machine qemu-2-instance-00000003.
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.647 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.688 227766 DEBUG nova.network.neutron [req-c5aa0b22-f2dc-42ac-ac09-1cf2136f9a8f req-dd9df5d7-4b01-43ce-97bf-a4d10224debd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Updated VIF entry in instance network info cache for port 4f194d42-fe53-4ca6-a4fd-94fc9a92ffed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.689 227766 DEBUG nova.network.neutron [req-c5aa0b22-f2dc-42ac-ac09-1cf2136f9a8f req-dd9df5d7-4b01-43ce-97bf-a4d10224debd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Updating instance_info_cache with network_info: [{"id": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "address": "fa:16:3e:47:9a:c0", "network": {"id": "f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::136", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ce4d2b2bd9d4e648ef6fd351b972262", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f194d42-fe", "ovs_interfaceid": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.722 227766 DEBUG oslo_concurrency.lockutils [req-c5aa0b22-f2dc-42ac-ac09-1cf2136f9a8f req-dd9df5d7-4b01-43ce-97bf-a4d10224debd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6ce66043-c3e3-4988-976a-2ba903e63d87" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.979 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160562.979136, 6ce66043-c3e3-4988-976a-2ba903e63d87 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:29:22 np0005593234 nova_compute[227762]: 2026-01-23 09:29:22.980 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] VM Started (Lifecycle Event)#033[00m
Jan 23 04:29:23 np0005593234 nova_compute[227762]: 2026-01-23 09:29:23.009 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:29:23 np0005593234 nova_compute[227762]: 2026-01-23 09:29:23.013 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160562.9803138, 6ce66043-c3e3-4988-976a-2ba903e63d87 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:29:23 np0005593234 nova_compute[227762]: 2026-01-23 09:29:23.013 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:29:23 np0005593234 nova_compute[227762]: 2026-01-23 09:29:23.034 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:29:23 np0005593234 nova_compute[227762]: 2026-01-23 09:29:23.037 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:29:23 np0005593234 nova_compute[227762]: 2026-01-23 09:29:23.060 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:29:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:23.266 144381 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 23 04:29:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:23.267 144381 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp9kkr6q35/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 23 04:29:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:23.141 232070 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 23 04:29:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:23.145 232070 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 23 04:29:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:23.147 232070 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Jan 23 04:29:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:23.148 232070 INFO oslo.privsep.daemon [-] privsep daemon running as pid 232070#033[00m
Jan 23 04:29:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:23.270 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bea2230c-ac3d-483d-8e45-6b1c82c1c1c5]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:29:23 np0005593234 nova_compute[227762]: 2026-01-23 09:29:23.321 227766 DEBUG nova.network.neutron [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Successfully created port: 7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:29:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:23.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:23.782 232070 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:23.782 232070 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:23.782 232070 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:24.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:24.419 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[41d722b2-053e-4172-b964-0e82d287b60f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:24.421 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf0fce0a3-e1 in ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:29:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:24.423 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf0fce0a3-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:29:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:24.423 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4ec9f26f-596d-414d-b429-98e19a1c6156]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:24.426 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3828c0e5-d092-4d53-aa25-22542684dda2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:24.455 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[f12a4c70-c4d2-4967-931d-1b9f727466fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:24.475 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[271dcd7e-9fc5-46e6-ba5a-16bb39819b3d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:24.478 144381 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpw1fco9b5/privsep.sock']#033[00m
Jan 23 04:29:24 np0005593234 nova_compute[227762]: 2026-01-23 09:29:24.616 227766 DEBUG nova.network.neutron [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Successfully updated port: 7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:29:24 np0005593234 nova_compute[227762]: 2026-01-23 09:29:24.639 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Acquiring lock "refresh_cache-872939ff-8eb8-4a0a-a32d-f1268af38264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:29:24 np0005593234 nova_compute[227762]: 2026-01-23 09:29:24.639 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Acquired lock "refresh_cache-872939ff-8eb8-4a0a-a32d-f1268af38264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:29:24 np0005593234 nova_compute[227762]: 2026-01-23 09:29:24.639 227766 DEBUG nova.network.neutron [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:29:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:25.150 144381 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Jan 23 04:29:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:25.152 144381 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpw1fco9b5/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Jan 23 04:29:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:25.011 232085 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 23 04:29:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:25.015 232085 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 23 04:29:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:25.016 232085 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Jan 23 04:29:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:25.017 232085 INFO oslo.privsep.daemon [-] privsep daemon running as pid 232085#033[00m
Jan 23 04:29:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:25.156 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[704a8292-0bcb-4eec-bbee-c702df3b119f]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:25 np0005593234 nova_compute[227762]: 2026-01-23 09:29:25.201 227766 DEBUG nova.compute.manager [req-73a39bd1-f37e-4cf5-b8e7-4c93ac55a4bc req-d1422de0-9e47-4c0f-a8c3-eafabec413e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Received event network-changed-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:29:25 np0005593234 nova_compute[227762]: 2026-01-23 09:29:25.202 227766 DEBUG nova.compute.manager [req-73a39bd1-f37e-4cf5-b8e7-4c93ac55a4bc req-d1422de0-9e47-4c0f-a8c3-eafabec413e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Refreshing instance network info cache due to event network-changed-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:29:25 np0005593234 nova_compute[227762]: 2026-01-23 09:29:25.202 227766 DEBUG oslo_concurrency.lockutils [req-73a39bd1-f37e-4cf5-b8e7-4c93ac55a4bc req-d1422de0-9e47-4c0f-a8c3-eafabec413e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-872939ff-8eb8-4a0a-a32d-f1268af38264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:29:25 np0005593234 nova_compute[227762]: 2026-01-23 09:29:25.227 227766 DEBUG nova.network.neutron [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:29:25 np0005593234 nova_compute[227762]: 2026-01-23 09:29:25.517 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:25.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:25.658 232085 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:25.658 232085 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:25.658 232085 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:26.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.266 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[bbe3d507-f4e0-47e9-ad0d-5c7eae46935a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:26 np0005593234 NetworkManager[48942]: <info>  [1769160566.2952] manager: (tapf0fce0a3-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.294 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8b1916d1-f8e8-48ee-835e-5d054f404518]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:26 np0005593234 systemd-udevd[232098]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.329 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[bd766d2c-b28a-4b4e-8274-0243ada00ba5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.333 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[c3a74302-8f0a-41fe-b126-e4ea03d8b620]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:26 np0005593234 NetworkManager[48942]: <info>  [1769160566.3556] device (tapf0fce0a3-e0): carrier: link connected
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.359 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7736b05d-e360-44d5-9549-11e8ab783e43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.377 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b7cb0fad-f0b1-47d9-8a2c-367d3646ae7f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf0fce0a3-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:e7:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446131, 'reachable_time': 35363, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232116, 'error': None, 'target': 'ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.394 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[425f095c-2100-460f-9e09-a8e8af41d5ca]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:e729'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446131, 'tstamp': 446131}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232117, 'error': None, 'target': 'ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.412 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c0b39c-06d4-4a05-9b88-66affc076289]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf0fce0a3-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:e7:29'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446131, 'reachable_time': 35363, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232118, 'error': None, 'target': 'ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.443 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ede838-73ee-495b-a197-d2f9af44de2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.496 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f14e129e-2b19-4122-8a7a-3a9b5da485be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.497 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0fce0a3-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.498 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.498 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf0fce0a3-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:29:26 np0005593234 kernel: tapf0fce0a3-e0: entered promiscuous mode
Jan 23 04:29:26 np0005593234 NetworkManager[48942]: <info>  [1769160566.5002] manager: (tapf0fce0a3-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.499 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.503 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.504 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf0fce0a3-e0, col_values=(('external_ids', {'iface-id': '7b9b8a00-a18e-4fcd-9afc-7e5a3b0670b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:29:26 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:26Z|00031|binding|INFO|Releasing lport 7b9b8a00-a18e-4fcd-9afc-7e5a3b0670b9 from this chassis (sb_readonly=0)
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.527 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.529 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.532 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[33b61886-ba60-4b7a-8d1b-df38fdf60612]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.535 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7.pid.haproxy
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:29:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:26.536 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7', 'env', 'PROCESS_TAG=haproxy-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.868 227766 DEBUG nova.compute.manager [req-34d2da5f-dd82-40f0-ba8e-dc6988ee9c61 req-02a5a42b-a469-4428-aedf-ed0fe3222ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Received event network-vif-plugged-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.869 227766 DEBUG oslo_concurrency.lockutils [req-34d2da5f-dd82-40f0-ba8e-dc6988ee9c61 req-02a5a42b-a469-4428-aedf-ed0fe3222ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.869 227766 DEBUG oslo_concurrency.lockutils [req-34d2da5f-dd82-40f0-ba8e-dc6988ee9c61 req-02a5a42b-a469-4428-aedf-ed0fe3222ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.869 227766 DEBUG oslo_concurrency.lockutils [req-34d2da5f-dd82-40f0-ba8e-dc6988ee9c61 req-02a5a42b-a469-4428-aedf-ed0fe3222ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.870 227766 DEBUG nova.compute.manager [req-34d2da5f-dd82-40f0-ba8e-dc6988ee9c61 req-02a5a42b-a469-4428-aedf-ed0fe3222ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Processing event network-vif-plugged-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.870 227766 DEBUG nova.compute.manager [req-34d2da5f-dd82-40f0-ba8e-dc6988ee9c61 req-02a5a42b-a469-4428-aedf-ed0fe3222ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Received event network-vif-plugged-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.870 227766 DEBUG oslo_concurrency.lockutils [req-34d2da5f-dd82-40f0-ba8e-dc6988ee9c61 req-02a5a42b-a469-4428-aedf-ed0fe3222ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.871 227766 DEBUG oslo_concurrency.lockutils [req-34d2da5f-dd82-40f0-ba8e-dc6988ee9c61 req-02a5a42b-a469-4428-aedf-ed0fe3222ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.871 227766 DEBUG oslo_concurrency.lockutils [req-34d2da5f-dd82-40f0-ba8e-dc6988ee9c61 req-02a5a42b-a469-4428-aedf-ed0fe3222ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.871 227766 DEBUG nova.compute.manager [req-34d2da5f-dd82-40f0-ba8e-dc6988ee9c61 req-02a5a42b-a469-4428-aedf-ed0fe3222ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] No waiting events found dispatching network-vif-plugged-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.871 227766 WARNING nova.compute.manager [req-34d2da5f-dd82-40f0-ba8e-dc6988ee9c61 req-02a5a42b-a469-4428-aedf-ed0fe3222ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Received unexpected event network-vif-plugged-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.872 227766 DEBUG nova.compute.manager [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.876 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160566.8757982, 6ce66043-c3e3-4988-976a-2ba903e63d87 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.876 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.877 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.881 227766 INFO nova.virt.libvirt.driver [-] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Instance spawned successfully.#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.881 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.901 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.909 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.909 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.910 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.910 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.911 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.911 227766 DEBUG nova.virt.libvirt.driver [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.914 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:29:26 np0005593234 nova_compute[227762]: 2026-01-23 09:29:26.953 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.008 227766 INFO nova.compute.manager [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Took 60.74 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.008 227766 DEBUG nova.compute.manager [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:29:27 np0005593234 podman[232150]: 2026-01-23 09:29:26.935630135 +0000 UTC m=+0.027202771 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.053 227766 DEBUG nova.network.neutron [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Updating instance_info_cache with network_info: [{"id": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "address": "fa:16:3e:83:1c:37", "network": {"id": "ffffc8c5-19e2-44d8-a270-e9ab8f022e29", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-738628532-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ebe73c1fb9f04cafa7ccf24cd83451f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c973301-1b", "ovs_interfaceid": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.090 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Releasing lock "refresh_cache-872939ff-8eb8-4a0a-a32d-f1268af38264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.090 227766 DEBUG nova.compute.manager [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Instance network_info: |[{"id": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "address": "fa:16:3e:83:1c:37", "network": {"id": "ffffc8c5-19e2-44d8-a270-e9ab8f022e29", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-738628532-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ebe73c1fb9f04cafa7ccf24cd83451f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c973301-1b", "ovs_interfaceid": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.091 227766 DEBUG oslo_concurrency.lockutils [req-73a39bd1-f37e-4cf5-b8e7-4c93ac55a4bc req-d1422de0-9e47-4c0f-a8c3-eafabec413e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-872939ff-8eb8-4a0a-a32d-f1268af38264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.091 227766 DEBUG nova.network.neutron [req-73a39bd1-f37e-4cf5-b8e7-4c93ac55a4bc req-d1422de0-9e47-4c0f-a8c3-eafabec413e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Refreshing network info cache for port 7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.095 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Start _get_guest_xml network_info=[{"id": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "address": "fa:16:3e:83:1c:37", "network": {"id": "ffffc8c5-19e2-44d8-a270-e9ab8f022e29", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-738628532-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ebe73c1fb9f04cafa7ccf24cd83451f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c973301-1b", "ovs_interfaceid": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.098 227766 WARNING nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.102 227766 DEBUG nova.virt.libvirt.host [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.103 227766 DEBUG nova.virt.libvirt.host [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.110 227766 INFO nova.compute.manager [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Took 62.13 seconds to build instance.#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.114 227766 DEBUG nova.virt.libvirt.host [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.115 227766 DEBUG nova.virt.libvirt.host [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.116 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.116 227766 DEBUG nova.virt.hardware [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.117 227766 DEBUG nova.virt.hardware [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.117 227766 DEBUG nova.virt.hardware [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.117 227766 DEBUG nova.virt.hardware [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.117 227766 DEBUG nova.virt.hardware [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.118 227766 DEBUG nova.virt.hardware [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.118 227766 DEBUG nova.virt.hardware [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.118 227766 DEBUG nova.virt.hardware [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.124 227766 DEBUG nova.virt.hardware [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.124 227766 DEBUG nova.virt.hardware [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.124 227766 DEBUG nova.virt.hardware [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.127 227766 DEBUG oslo_concurrency.processutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.148 227766 DEBUG oslo_concurrency.lockutils [None req-0d6b6a9c-d266-47ae-bccd-b0f31cda7bf4 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 62.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:27 np0005593234 podman[232150]: 2026-01-23 09:29:27.452815242 +0000 UTC m=+0.544387858 container create 24db0dcb3dba7e275ed14df140b942d9104ebef84367b2e38164237b1613cf27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 04:29:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:29:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3728899512' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:29:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:29:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:27.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.631 227766 DEBUG oslo_concurrency.processutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:29:27 np0005593234 systemd[1]: Started libpod-conmon-24db0dcb3dba7e275ed14df140b942d9104ebef84367b2e38164237b1613cf27.scope.
Jan 23 04:29:27 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.672 227766 DEBUG nova.storage.rbd_utils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] rbd image 872939ff-8eb8-4a0a-a32d-f1268af38264_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:29:27 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ba07ba111c0111442744b3a7d2ab0f7623a57edcb13e5215136ccf28faf0d402/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.684 227766 DEBUG oslo_concurrency.processutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:29:27 np0005593234 nova_compute[227762]: 2026-01-23 09:29:27.702 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:27 np0005593234 podman[232150]: 2026-01-23 09:29:27.831224276 +0000 UTC m=+0.922796922 container init 24db0dcb3dba7e275ed14df140b942d9104ebef84367b2e38164237b1613cf27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 04:29:27 np0005593234 podman[232150]: 2026-01-23 09:29:27.839711028 +0000 UTC m=+0.931283644 container start 24db0dcb3dba7e275ed14df140b942d9104ebef84367b2e38164237b1613cf27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 04:29:27 np0005593234 neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7[232195]: [NOTICE]   (232231) : New worker (232233) forked
Jan 23 04:29:27 np0005593234 neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7[232195]: [NOTICE]   (232231) : Loading success.
Jan 23 04:29:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:29:28 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1898063901' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.205 227766 DEBUG oslo_concurrency.processutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.207 227766 DEBUG nova.virt.libvirt.vif [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:29:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAssistedSnapshotsTest-server-97711755',display_name='tempest-VolumesAssistedSnapshotsTest-server-97711755',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-volumesassistedsnapshotstest-server-97711755',id=7,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCmSmB+IHFq0O9PF55ZzO+5v091xg4xLpQp/CjSpWBRWDFZMjNYWRSoBHQ3LdxG68hlvt7sDiL760W0gnf8/lRN1xqc+p54NWWwkPX2r922HexwZgnT2ckHcQlqvz1V5tw==',key_name='tempest-keypair-736576401',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ebe73c1fb9f04cafa7ccf24cd83451f6',ramdisk_id='',reservation_id='r-94svrqx4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesAssistedSnapshotsTest-2057771284',owner_user_name='tempest-VolumesAssistedSnapshotsTest-2057771284-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:29:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a75f3e5fbaff48e6a69b0a34b177d007',uuid=872939ff-8eb8-4a0a-a32d-f1268af38264,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "address": "fa:16:3e:83:1c:37", "network": {"id": "ffffc8c5-19e2-44d8-a270-e9ab8f022e29", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-738628532-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ebe73c1fb9f04cafa7ccf24cd83451f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c973301-1b", "ovs_interfaceid": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.208 227766 DEBUG nova.network.os_vif_util [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Converting VIF {"id": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "address": "fa:16:3e:83:1c:37", "network": {"id": "ffffc8c5-19e2-44d8-a270-e9ab8f022e29", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-738628532-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ebe73c1fb9f04cafa7ccf24cd83451f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c973301-1b", "ovs_interfaceid": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.209 227766 DEBUG nova.network.os_vif_util [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:1c:37,bridge_name='br-int',has_traffic_filtering=True,id=7c973301-1b33-4f5a-83bf-f8b8bcb22bd7,network=Network(ffffc8c5-19e2-44d8-a270-e9ab8f022e29),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c973301-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.210 227766 DEBUG nova.objects.instance [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lazy-loading 'pci_devices' on Instance uuid 872939ff-8eb8-4a0a-a32d-f1268af38264 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.232 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  <uuid>872939ff-8eb8-4a0a-a32d-f1268af38264</uuid>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  <name>instance-00000007</name>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <nova:name>tempest-VolumesAssistedSnapshotsTest-server-97711755</nova:name>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:29:27</nova:creationTime>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <nova:user uuid="a75f3e5fbaff48e6a69b0a34b177d007">tempest-VolumesAssistedSnapshotsTest-2057771284-project-member</nova:user>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <nova:project uuid="ebe73c1fb9f04cafa7ccf24cd83451f6">tempest-VolumesAssistedSnapshotsTest-2057771284</nova:project>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <nova:port uuid="7c973301-1b33-4f5a-83bf-f8b8bcb22bd7">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <entry name="serial">872939ff-8eb8-4a0a-a32d-f1268af38264</entry>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <entry name="uuid">872939ff-8eb8-4a0a-a32d-f1268af38264</entry>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/872939ff-8eb8-4a0a-a32d-f1268af38264_disk">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/872939ff-8eb8-4a0a-a32d-f1268af38264_disk.config">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:83:1c:37"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <target dev="tap7c973301-1b"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/872939ff-8eb8-4a0a-a32d-f1268af38264/console.log" append="off"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:29:28 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:29:28 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:29:28 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:29:28 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.240 227766 DEBUG nova.compute.manager [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Preparing to wait for external event network-vif-plugged-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.240 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Acquiring lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.241 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.241 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.242 227766 DEBUG nova.virt.libvirt.vif [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:29:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAssistedSnapshotsTest-server-97711755',display_name='tempest-VolumesAssistedSnapshotsTest-server-97711755',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-volumesassistedsnapshotstest-server-97711755',id=7,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCmSmB+IHFq0O9PF55ZzO+5v091xg4xLpQp/CjSpWBRWDFZMjNYWRSoBHQ3LdxG68hlvt7sDiL760W0gnf8/lRN1xqc+p54NWWwkPX2r922HexwZgnT2ckHcQlqvz1V5tw==',key_name='tempest-keypair-736576401',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='ebe73c1fb9f04cafa7ccf24cd83451f6',ramdisk_id='',reservation_id='r-94svrqx4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesAssistedSnapshotsTest-2057771284',owner_user_name='tempest-VolumesAssistedSnapshotsTest-2057771284-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:29:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a75f3e5fbaff48e6a69b0a34b177d007',uuid=872939ff-8eb8-4a0a-a32d-f1268af38264,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "address": "fa:16:3e:83:1c:37", "network": {"id": "ffffc8c5-19e2-44d8-a270-e9ab8f022e29", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-738628532-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ebe73c1fb9f04cafa7ccf24cd83451f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c973301-1b", "ovs_interfaceid": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.243 227766 DEBUG nova.network.os_vif_util [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Converting VIF {"id": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "address": "fa:16:3e:83:1c:37", "network": {"id": "ffffc8c5-19e2-44d8-a270-e9ab8f022e29", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-738628532-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ebe73c1fb9f04cafa7ccf24cd83451f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c973301-1b", "ovs_interfaceid": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.244 227766 DEBUG nova.network.os_vif_util [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:83:1c:37,bridge_name='br-int',has_traffic_filtering=True,id=7c973301-1b33-4f5a-83bf-f8b8bcb22bd7,network=Network(ffffc8c5-19e2-44d8-a270-e9ab8f022e29),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c973301-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.245 227766 DEBUG os_vif [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:1c:37,bridge_name='br-int',has_traffic_filtering=True,id=7c973301-1b33-4f5a-83bf-f8b8bcb22bd7,network=Network(ffffc8c5-19e2-44d8-a270-e9ab8f022e29),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c973301-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.246 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.246 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.247 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.251 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.252 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7c973301-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.252 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7c973301-1b, col_values=(('external_ids', {'iface-id': '7c973301-1b33-4f5a-83bf-f8b8bcb22bd7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:83:1c:37', 'vm-uuid': '872939ff-8eb8-4a0a-a32d-f1268af38264'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.254 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:28 np0005593234 NetworkManager[48942]: <info>  [1769160568.2548] manager: (tap7c973301-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/27)
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.258 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.265 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.268 227766 INFO os_vif [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:83:1c:37,bridge_name='br-int',has_traffic_filtering=True,id=7c973301-1b33-4f5a-83bf-f8b8bcb22bd7,network=Network(ffffc8c5-19e2-44d8-a270-e9ab8f022e29),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c973301-1b')#033[00m
Jan 23 04:29:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:28.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.478 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.479 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.480 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] No VIF found with MAC fa:16:3e:83:1c:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.481 227766 INFO nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Using config drive#033[00m
Jan 23 04:29:28 np0005593234 nova_compute[227762]: 2026-01-23 09:29:28.624 227766 DEBUG nova.storage.rbd_utils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] rbd image 872939ff-8eb8-4a0a-a32d-f1268af38264_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:29:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:29.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:29 np0005593234 nova_compute[227762]: 2026-01-23 09:29:29.644 227766 INFO nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Creating config drive at /var/lib/nova/instances/872939ff-8eb8-4a0a-a32d-f1268af38264/disk.config#033[00m
Jan 23 04:29:29 np0005593234 nova_compute[227762]: 2026-01-23 09:29:29.652 227766 DEBUG oslo_concurrency.processutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/872939ff-8eb8-4a0a-a32d-f1268af38264/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn4rz24jo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:29:29 np0005593234 podman[232317]: 2026-01-23 09:29:29.768896671 +0000 UTC m=+0.060383836 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 04:29:29 np0005593234 nova_compute[227762]: 2026-01-23 09:29:29.767 227766 DEBUG nova.network.neutron [req-73a39bd1-f37e-4cf5-b8e7-4c93ac55a4bc req-d1422de0-9e47-4c0f-a8c3-eafabec413e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Updated VIF entry in instance network info cache for port 7c973301-1b33-4f5a-83bf-f8b8bcb22bd7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:29:29 np0005593234 nova_compute[227762]: 2026-01-23 09:29:29.769 227766 DEBUG nova.network.neutron [req-73a39bd1-f37e-4cf5-b8e7-4c93ac55a4bc req-d1422de0-9e47-4c0f-a8c3-eafabec413e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Updating instance_info_cache with network_info: [{"id": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "address": "fa:16:3e:83:1c:37", "network": {"id": "ffffc8c5-19e2-44d8-a270-e9ab8f022e29", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-738628532-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ebe73c1fb9f04cafa7ccf24cd83451f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c973301-1b", "ovs_interfaceid": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:29:29 np0005593234 nova_compute[227762]: 2026-01-23 09:29:29.791 227766 DEBUG oslo_concurrency.processutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/872939ff-8eb8-4a0a-a32d-f1268af38264/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpn4rz24jo" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:29:29 np0005593234 nova_compute[227762]: 2026-01-23 09:29:29.824 227766 DEBUG nova.storage.rbd_utils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] rbd image 872939ff-8eb8-4a0a-a32d-f1268af38264_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:29:29 np0005593234 nova_compute[227762]: 2026-01-23 09:29:29.830 227766 DEBUG oslo_concurrency.processutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/872939ff-8eb8-4a0a-a32d-f1268af38264/disk.config 872939ff-8eb8-4a0a-a32d-f1268af38264_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:29:29 np0005593234 nova_compute[227762]: 2026-01-23 09:29:29.849 227766 DEBUG oslo_concurrency.lockutils [req-73a39bd1-f37e-4cf5-b8e7-4c93ac55a4bc req-d1422de0-9e47-4c0f-a8c3-eafabec413e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-872939ff-8eb8-4a0a-a32d-f1268af38264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:29:30 np0005593234 nova_compute[227762]: 2026-01-23 09:29:30.014 227766 DEBUG oslo_concurrency.processutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/872939ff-8eb8-4a0a-a32d-f1268af38264/disk.config 872939ff-8eb8-4a0a-a32d-f1268af38264_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.184s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:29:30 np0005593234 nova_compute[227762]: 2026-01-23 09:29:30.014 227766 INFO nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Deleting local config drive /var/lib/nova/instances/872939ff-8eb8-4a0a-a32d-f1268af38264/disk.config because it was imported into RBD.#033[00m
Jan 23 04:29:30 np0005593234 NetworkManager[48942]: <info>  [1769160570.0612] manager: (tap7c973301-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/28)
Jan 23 04:29:30 np0005593234 kernel: tap7c973301-1b: entered promiscuous mode
Jan 23 04:29:30 np0005593234 nova_compute[227762]: 2026-01-23 09:29:30.062 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:30 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:30Z|00032|binding|INFO|Claiming lport 7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 for this chassis.
Jan 23 04:29:30 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:30Z|00033|binding|INFO|7c973301-1b33-4f5a-83bf-f8b8bcb22bd7: Claiming fa:16:3e:83:1c:37 10.100.0.8
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.078 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:1c:37 10.100.0.8'], port_security=['fa:16:3e:83:1c:37 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '872939ff-8eb8-4a0a-a32d-f1268af38264', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ffffc8c5-19e2-44d8-a270-e9ab8f022e29', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ebe73c1fb9f04cafa7ccf24cd83451f6', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4f1c0fb7-f40a-472c-bfb9-989e30daaa92', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a7fe01b-a599-4e74-8d7d-b2f262bde0ef, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=7c973301-1b33-4f5a-83bf-f8b8bcb22bd7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.079 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 in datapath ffffc8c5-19e2-44d8-a270-e9ab8f022e29 bound to our chassis#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.082 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ffffc8c5-19e2-44d8-a270-e9ab8f022e29#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.095 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[95d6bd84-eb5a-4570-90d8-de8568d25a91]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.096 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapffffc8c5-11 in ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.097 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapffffc8c5-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.097 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9e5b1c31-3804-434e-8f32-b919eedc576a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 systemd-machined[195626]: New machine qemu-3-instance-00000007.
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.098 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5d06900d-dd0a-4d67-aa7a-9135e74c7a60]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.128 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[939d8a8d-b126-4c9f-aeb7-30bde8fb4d16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 nova_compute[227762]: 2026-01-23 09:29:30.132 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:30 np0005593234 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Jan 23 04:29:30 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:30Z|00034|binding|INFO|Setting lport 7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 ovn-installed in OVS
Jan 23 04:29:30 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:30Z|00035|binding|INFO|Setting lport 7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 up in Southbound
Jan 23 04:29:30 np0005593234 nova_compute[227762]: 2026-01-23 09:29:30.139 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.152 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b37ff3dc-9b76-46b6-b7f4-ea0f187f07ed]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 systemd-udevd[232393]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:29:30 np0005593234 NetworkManager[48942]: <info>  [1769160570.1924] device (tap7c973301-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:29:30 np0005593234 NetworkManager[48942]: <info>  [1769160570.1932] device (tap7c973301-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.190 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[37860d97-4f1f-4154-a55f-0580fdd8b4b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.197 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7efc5791-f610-4aec-9638-1e250961c4da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 NetworkManager[48942]: <info>  [1769160570.1988] manager: (tapffffc8c5-10): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.236 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f29ebcba-d178-4f3b-88b0-7953025b8ee1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.240 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[02335b4f-732d-45c0-bb07-5ce060f77860]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 NetworkManager[48942]: <info>  [1769160570.2613] device (tapffffc8c5-10): carrier: link connected
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.267 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e4cc15db-0adf-4f3d-ab73-2e61f5fcf316]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:30.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.286 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[63f67802-79bd-4944-8c91-869483506d01]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapffffc8c5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:09:2c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446522, 'reachable_time': 25516, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 232421, 'error': None, 'target': 'ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.304 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c6361da7-3863-4877-89a7-4c208fef542e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef0:92c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 446522, 'tstamp': 446522}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 232422, 'error': None, 'target': 'ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.326 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e09ba4dd-fbcc-4464-81fa-e304e62c683c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapffffc8c5-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f0:09:2c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446522, 'reachable_time': 25516, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 232423, 'error': None, 'target': 'ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.363 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7371b8b4-133e-4445-ac8b-e772ef0ca657]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.438 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[eb7531e4-8e1c-45f0-a214-5f5bcd69f089]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.440 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapffffc8c5-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.441 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.441 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapffffc8c5-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:29:30 np0005593234 kernel: tapffffc8c5-10: entered promiscuous mode
Jan 23 04:29:30 np0005593234 nova_compute[227762]: 2026-01-23 09:29:30.444 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:30 np0005593234 NetworkManager[48942]: <info>  [1769160570.4460] manager: (tapffffc8c5-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.450 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapffffc8c5-10, col_values=(('external_ids', {'iface-id': '071e131e-4e0f-4065-bcb0-9a9a7ef98ac9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:29:30 np0005593234 nova_compute[227762]: 2026-01-23 09:29:30.453 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:30 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:30Z|00036|binding|INFO|Releasing lport 071e131e-4e0f-4065-bcb0-9a9a7ef98ac9 from this chassis (sb_readonly=0)
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.455 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ffffc8c5-19e2-44d8-a270-e9ab8f022e29.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ffffc8c5-19e2-44d8-a270-e9ab8f022e29.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.457 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b58c6551-666e-4210-b6dc-e3211d88e51b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.458 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-ffffc8c5-19e2-44d8-a270-e9ab8f022e29
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/ffffc8c5-19e2-44d8-a270-e9ab8f022e29.pid.haproxy
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID ffffc8c5-19e2-44d8-a270-e9ab8f022e29
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:30.459 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29', 'env', 'PROCESS_TAG=haproxy-ffffc8c5-19e2-44d8-a270-e9ab8f022e29', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ffffc8c5-19e2-44d8-a270-e9ab8f022e29.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:29:30 np0005593234 nova_compute[227762]: 2026-01-23 09:29:30.467 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:30 np0005593234 podman[232455]: 2026-01-23 09:29:30.918957959 +0000 UTC m=+0.064739790 container create 31f23aa1e0b0a21efffc9586382c617e0b6ba872c8ce4c68f43eca3af53de7fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 04:29:30 np0005593234 systemd[1]: Started libpod-conmon-31f23aa1e0b0a21efffc9586382c617e0b6ba872c8ce4c68f43eca3af53de7fb.scope.
Jan 23 04:29:30 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:29:30 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7b36abb81656390c3adb363da5a55699c570a71ab326af7c7eb498746edcdbd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:29:30 np0005593234 podman[232455]: 2026-01-23 09:29:30.887685713 +0000 UTC m=+0.033467564 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:29:30 np0005593234 podman[232455]: 2026-01-23 09:29:30.984762861 +0000 UTC m=+0.130544712 container init 31f23aa1e0b0a21efffc9586382c617e0b6ba872c8ce4c68f43eca3af53de7fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 04:29:30 np0005593234 podman[232455]: 2026-01-23 09:29:30.990428806 +0000 UTC m=+0.136210637 container start 31f23aa1e0b0a21efffc9586382c617e0b6ba872c8ce4c68f43eca3af53de7fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Jan 23 04:29:31 np0005593234 neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29[232506]: [NOTICE]   (232514) : New worker (232516) forked
Jan 23 04:29:31 np0005593234 neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29[232506]: [NOTICE]   (232514) : Loading success.
Jan 23 04:29:31 np0005593234 nova_compute[227762]: 2026-01-23 09:29:31.055 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160571.05454, 872939ff-8eb8-4a0a-a32d-f1268af38264 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:29:31 np0005593234 nova_compute[227762]: 2026-01-23 09:29:31.056 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] VM Started (Lifecycle Event)#033[00m
Jan 23 04:29:31 np0005593234 nova_compute[227762]: 2026-01-23 09:29:31.139 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:29:31 np0005593234 nova_compute[227762]: 2026-01-23 09:29:31.143 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160571.055649, 872939ff-8eb8-4a0a-a32d-f1268af38264 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:29:31 np0005593234 nova_compute[227762]: 2026-01-23 09:29:31.143 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:29:31 np0005593234 nova_compute[227762]: 2026-01-23 09:29:31.485 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:29:31 np0005593234 nova_compute[227762]: 2026-01-23 09:29:31.489 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:29:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:31.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:32.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.469 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.644 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.923 227766 DEBUG nova.compute.manager [req-56367721-53bd-49ed-b810-635e14e1d775 req-8d194a51-1295-405c-921a-ca94258cc3f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Received event network-vif-plugged-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.924 227766 DEBUG oslo_concurrency.lockutils [req-56367721-53bd-49ed-b810-635e14e1d775 req-8d194a51-1295-405c-921a-ca94258cc3f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.924 227766 DEBUG oslo_concurrency.lockutils [req-56367721-53bd-49ed-b810-635e14e1d775 req-8d194a51-1295-405c-921a-ca94258cc3f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.925 227766 DEBUG oslo_concurrency.lockutils [req-56367721-53bd-49ed-b810-635e14e1d775 req-8d194a51-1295-405c-921a-ca94258cc3f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.925 227766 DEBUG nova.compute.manager [req-56367721-53bd-49ed-b810-635e14e1d775 req-8d194a51-1295-405c-921a-ca94258cc3f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Processing event network-vif-plugged-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.926 227766 DEBUG nova.compute.manager [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.929 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160572.9290943, 872939ff-8eb8-4a0a-a32d-f1268af38264 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.930 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.962 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.966 227766 INFO nova.virt.libvirt.driver [-] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Instance spawned successfully.#033[00m
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.967 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.990 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:29:32 np0005593234 nova_compute[227762]: 2026-01-23 09:29:32.994 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:29:33 np0005593234 nova_compute[227762]: 2026-01-23 09:29:33.031 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:29:33 np0005593234 nova_compute[227762]: 2026-01-23 09:29:33.032 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:29:33 np0005593234 nova_compute[227762]: 2026-01-23 09:29:33.032 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:29:33 np0005593234 nova_compute[227762]: 2026-01-23 09:29:33.033 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:29:33 np0005593234 nova_compute[227762]: 2026-01-23 09:29:33.033 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:29:33 np0005593234 nova_compute[227762]: 2026-01-23 09:29:33.034 227766 DEBUG nova.virt.libvirt.driver [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:29:33 np0005593234 nova_compute[227762]: 2026-01-23 09:29:33.130 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:29:33 np0005593234 nova_compute[227762]: 2026-01-23 09:29:33.254 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:29:33 np0005593234 nova_compute[227762]: 2026-01-23 09:29:33.372 227766 INFO nova.compute.manager [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Took 12.50 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:29:33 np0005593234 nova_compute[227762]: 2026-01-23 09:29:33.373 227766 DEBUG nova.compute.manager [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:29:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:33.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:34.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:35 np0005593234 nova_compute[227762]: 2026-01-23 09:29:35.168 227766 INFO nova.compute.manager [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Took 15.41 seconds to build instance.#033[00m
Jan 23 04:29:35 np0005593234 nova_compute[227762]: 2026-01-23 09:29:35.621 227766 DEBUG oslo_concurrency.lockutils [None req-7ecaf42e-051c-45bd-998a-3898c77b58f1 a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.973s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:35.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:35 np0005593234 nova_compute[227762]: 2026-01-23 09:29:35.931 227766 DEBUG nova.compute.manager [req-06ad1ddd-d53f-4e0c-8c97-e3a856e6762c req-d897016a-14c9-4db8-88f2-8342a45216d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Received event network-vif-plugged-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:29:35 np0005593234 nova_compute[227762]: 2026-01-23 09:29:35.931 227766 DEBUG oslo_concurrency.lockutils [req-06ad1ddd-d53f-4e0c-8c97-e3a856e6762c req-d897016a-14c9-4db8-88f2-8342a45216d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:35 np0005593234 nova_compute[227762]: 2026-01-23 09:29:35.931 227766 DEBUG oslo_concurrency.lockutils [req-06ad1ddd-d53f-4e0c-8c97-e3a856e6762c req-d897016a-14c9-4db8-88f2-8342a45216d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:35 np0005593234 nova_compute[227762]: 2026-01-23 09:29:35.931 227766 DEBUG oslo_concurrency.lockutils [req-06ad1ddd-d53f-4e0c-8c97-e3a856e6762c req-d897016a-14c9-4db8-88f2-8342a45216d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:35 np0005593234 nova_compute[227762]: 2026-01-23 09:29:35.931 227766 DEBUG nova.compute.manager [req-06ad1ddd-d53f-4e0c-8c97-e3a856e6762c req-d897016a-14c9-4db8-88f2-8342a45216d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] No waiting events found dispatching network-vif-plugged-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:29:35 np0005593234 nova_compute[227762]: 2026-01-23 09:29:35.932 227766 WARNING nova.compute.manager [req-06ad1ddd-d53f-4e0c-8c97-e3a856e6762c req-d897016a-14c9-4db8-88f2-8342a45216d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Received unexpected event network-vif-plugged-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:29:36 np0005593234 podman[232695]: 2026-01-23 09:29:36.050425032 +0000 UTC m=+0.524497755 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:29:36 np0005593234 podman[232695]: 2026-01-23 09:29:36.233102572 +0000 UTC m=+0.707175285 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:29:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:36.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:36 np0005593234 podman[232844]: 2026-01-23 09:29:36.947155529 +0000 UTC m=+0.066473123 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:29:36 np0005593234 podman[232844]: 2026-01-23 09:29:36.977995881 +0000 UTC m=+0.097313365 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:29:37 np0005593234 podman[232909]: 2026-01-23 09:29:37.460789497 +0000 UTC m=+0.251030362 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., version=2.2.4, description=keepalived for Ceph, architecture=x86_64, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived)
Jan 23 04:29:37 np0005593234 podman[232909]: 2026-01-23 09:29:37.49165358 +0000 UTC m=+0.281894425 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, io.openshift.tags=Ceph keepalived, vcs-type=git, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, io.openshift.expose-services=, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, io.buildah.version=1.28.2, distribution-scope=public, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Jan 23 04:29:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:37.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:37 np0005593234 nova_compute[227762]: 2026-01-23 09:29:37.646 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:38 np0005593234 nova_compute[227762]: 2026-01-23 09:29:38.256 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:38.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:29:39 np0005593234 NetworkManager[48942]: <info>  [1769160579.2031] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/31)
Jan 23 04:29:39 np0005593234 NetworkManager[48942]: <info>  [1769160579.2036] device (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:29:39 np0005593234 NetworkManager[48942]: <warn>  [1769160579.2037] device (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:29:39 np0005593234 NetworkManager[48942]: <info>  [1769160579.2042] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/32)
Jan 23 04:29:39 np0005593234 NetworkManager[48942]: <info>  [1769160579.2051] device (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 04:29:39 np0005593234 NetworkManager[48942]: <warn>  [1769160579.2052] device (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 04:29:39 np0005593234 NetworkManager[48942]: <info>  [1769160579.2057] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Jan 23 04:29:39 np0005593234 NetworkManager[48942]: <info>  [1769160579.2064] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Jan 23 04:29:39 np0005593234 NetworkManager[48942]: <info>  [1769160579.2070] device (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 04:29:39 np0005593234 NetworkManager[48942]: <info>  [1769160579.2072] device (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 04:29:39 np0005593234 nova_compute[227762]: 2026-01-23 09:29:39.217 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:39 np0005593234 nova_compute[227762]: 2026-01-23 09:29:39.328 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:39 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:39Z|00037|binding|INFO|Releasing lport 071e131e-4e0f-4065-bcb0-9a9a7ef98ac9 from this chassis (sb_readonly=0)
Jan 23 04:29:39 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:39Z|00038|binding|INFO|Releasing lport 7b9b8a00-a18e-4fcd-9afc-7e5a3b0670b9 from this chassis (sb_readonly=0)
Jan 23 04:29:39 np0005593234 nova_compute[227762]: 2026-01-23 09:29:39.346 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:39.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:29:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:29:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:40.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:40 np0005593234 nova_compute[227762]: 2026-01-23 09:29:40.532 227766 DEBUG nova.compute.manager [req-fd3ebbe0-2392-4cfe-86d0-7a16cf6cfd80 req-af0f73ad-f4d2-42fd-ae02-ccef504c16e2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Received event network-changed-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:29:40 np0005593234 nova_compute[227762]: 2026-01-23 09:29:40.532 227766 DEBUG nova.compute.manager [req-fd3ebbe0-2392-4cfe-86d0-7a16cf6cfd80 req-af0f73ad-f4d2-42fd-ae02-ccef504c16e2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Refreshing instance network info cache due to event network-changed-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:29:40 np0005593234 nova_compute[227762]: 2026-01-23 09:29:40.532 227766 DEBUG oslo_concurrency.lockutils [req-fd3ebbe0-2392-4cfe-86d0-7a16cf6cfd80 req-af0f73ad-f4d2-42fd-ae02-ccef504c16e2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-872939ff-8eb8-4a0a-a32d-f1268af38264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:29:40 np0005593234 nova_compute[227762]: 2026-01-23 09:29:40.532 227766 DEBUG oslo_concurrency.lockutils [req-fd3ebbe0-2392-4cfe-86d0-7a16cf6cfd80 req-af0f73ad-f4d2-42fd-ae02-ccef504c16e2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-872939ff-8eb8-4a0a-a32d-f1268af38264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:29:40 np0005593234 nova_compute[227762]: 2026-01-23 09:29:40.532 227766 DEBUG nova.network.neutron [req-fd3ebbe0-2392-4cfe-86d0-7a16cf6cfd80 req-af0f73ad-f4d2-42fd-ae02-ccef504c16e2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Refreshing network info cache for port 7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.332 227766 DEBUG oslo_concurrency.lockutils [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Acquiring lock "6ce66043-c3e3-4988-976a-2ba903e63d87" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.333 227766 DEBUG oslo_concurrency.lockutils [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.333 227766 DEBUG oslo_concurrency.lockutils [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Acquiring lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.333 227766 DEBUG oslo_concurrency.lockutils [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.333 227766 DEBUG oslo_concurrency.lockutils [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.335 227766 INFO nova.compute.manager [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Terminating instance#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.335 227766 DEBUG nova.compute.manager [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:29:41 np0005593234 kernel: tap4f194d42-fe (unregistering): left promiscuous mode
Jan 23 04:29:41 np0005593234 NetworkManager[48942]: <info>  [1769160581.4114] device (tap4f194d42-fe): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:29:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:29:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:29:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:29:41 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:41Z|00039|binding|INFO|Releasing lport 4f194d42-fe53-4ca6-a4fd-94fc9a92ffed from this chassis (sb_readonly=0)
Jan 23 04:29:41 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:41Z|00040|binding|INFO|Setting lport 4f194d42-fe53-4ca6-a4fd-94fc9a92ffed down in Southbound
Jan 23 04:29:41 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:41Z|00041|binding|INFO|Removing iface tap4f194d42-fe ovn-installed in OVS
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.428 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.431 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:41.436 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:47:9a:c0 10.1.0.146 fdfe:381f:8400:1::136'], port_security=['fa:16:3e:47:9a:c0 10.1.0.146 fdfe:381f:8400:1::136'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.1.0.146/26 fdfe:381f:8400:1::136/64', 'neutron:device_id': '6ce66043-c3e3-4988-976a-2ba903e63d87', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7ce4d2b2bd9d4e648ef6fd351b972262', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a1793278-8c6f-49e9-be94-8a60e6a54c2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f5658ab5-291d-4119-8cb7-9ecc0ad5a8b4, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=4f194d42-fe53-4ca6-a4fd-94fc9a92ffed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:29:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:41.438 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 4f194d42-fe53-4ca6-a4fd-94fc9a92ffed in datapath f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7 unbound from our chassis#033[00m
Jan 23 04:29:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:41.439 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:29:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:41.440 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[20a40e9d-ad03-465b-ae3c-1221ec543d05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:41.441 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7 namespace which is not needed anymore#033[00m
Jan 23 04:29:41 np0005593234 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Deactivated successfully.
Jan 23 04:29:41 np0005593234 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000003.scope: Consumed 12.112s CPU time.
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.489 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:41 np0005593234 systemd-machined[195626]: Machine qemu-2-instance-00000003 terminated.
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.571 227766 INFO nova.virt.libvirt.driver [-] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Instance destroyed successfully.#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.572 227766 DEBUG nova.objects.instance [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lazy-loading 'resources' on Instance uuid 6ce66043-c3e3-4988-976a-2ba903e63d87 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:29:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:41.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.644 227766 DEBUG nova.virt.libvirt.vif [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:28:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-tempest.common.compute-instance-590995285-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-590995285-1',id=3,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:29:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7ce4d2b2bd9d4e648ef6fd351b972262',ramdisk_id='',reservation_id='r-lqgmnwk0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AutoAllocateNetworkTest-335645779',owner_user_name='tempest-AutoAllocateNetworkTest-335645779-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:29:27Z,user_data=None,user_id='3f4fe5f838cb42d0ae4285971b115141',uuid=6ce66043-c3e3-4988-976a-2ba903e63d87,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "address": "fa:16:3e:47:9a:c0", "network": {"id": "f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::136", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ce4d2b2bd9d4e648ef6fd351b972262", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f194d42-fe", "ovs_interfaceid": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.644 227766 DEBUG nova.network.os_vif_util [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Converting VIF {"id": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "address": "fa:16:3e:47:9a:c0", "network": {"id": "f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7", "bridge": "br-int", "label": "auto_allocated_network", "subnets": [{"cidr": "fdfe:381f:8400:1::/64", "dns": [], "gateway": {"address": "fdfe:381f:8400:1::1", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "fdfe:381f:8400:1::136", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true}}, {"cidr": "10.1.0.128/26", "dns": [], "gateway": {"address": "10.1.0.129", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.1.0.146", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7ce4d2b2bd9d4e648ef6fd351b972262", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4f194d42-fe", "ovs_interfaceid": "4f194d42-fe53-4ca6-a4fd-94fc9a92ffed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.645 227766 DEBUG nova.network.os_vif_util [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:47:9a:c0,bridge_name='br-int',has_traffic_filtering=True,id=4f194d42-fe53-4ca6-a4fd-94fc9a92ffed,network=Network(f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f194d42-fe') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.646 227766 DEBUG os_vif [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:9a:c0,bridge_name='br-int',has_traffic_filtering=True,id=4f194d42-fe53-4ca6-a4fd-94fc9a92ffed,network=Network(f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f194d42-fe') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.647 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.647 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f194d42-fe, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.649 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.650 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.653 227766 INFO os_vif [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:47:9a:c0,bridge_name='br-int',has_traffic_filtering=True,id=4f194d42-fe53-4ca6-a4fd-94fc9a92ffed,network=Network(f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4f194d42-fe')#033[00m
Jan 23 04:29:41 np0005593234 neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7[232195]: [NOTICE]   (232231) : haproxy version is 2.8.14-c23fe91
Jan 23 04:29:41 np0005593234 neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7[232195]: [NOTICE]   (232231) : path to executable is /usr/sbin/haproxy
Jan 23 04:29:41 np0005593234 neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7[232195]: [WARNING]  (232231) : Exiting Master process...
Jan 23 04:29:41 np0005593234 neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7[232195]: [ALERT]    (232231) : Current worker (232233) exited with code 143 (Terminated)
Jan 23 04:29:41 np0005593234 neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7[232195]: [WARNING]  (232231) : All workers exited. Exiting... (0)
Jan 23 04:29:41 np0005593234 systemd[1]: libpod-24db0dcb3dba7e275ed14df140b942d9104ebef84367b2e38164237b1613cf27.scope: Deactivated successfully.
Jan 23 04:29:41 np0005593234 podman[233100]: 2026-01-23 09:29:41.673105152 +0000 UTC m=+0.128901911 container died 24db0dcb3dba7e275ed14df140b942d9104ebef84367b2e38164237b1613cf27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.778 227766 DEBUG nova.compute.manager [req-c7b46612-c61c-48f7-9464-7b1d6955359f req-a2691ea8-d436-4142-b91e-fac22cfc8319 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Received event network-vif-unplugged-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.779 227766 DEBUG oslo_concurrency.lockutils [req-c7b46612-c61c-48f7-9464-7b1d6955359f req-a2691ea8-d436-4142-b91e-fac22cfc8319 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.779 227766 DEBUG oslo_concurrency.lockutils [req-c7b46612-c61c-48f7-9464-7b1d6955359f req-a2691ea8-d436-4142-b91e-fac22cfc8319 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.779 227766 DEBUG oslo_concurrency.lockutils [req-c7b46612-c61c-48f7-9464-7b1d6955359f req-a2691ea8-d436-4142-b91e-fac22cfc8319 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.779 227766 DEBUG nova.compute.manager [req-c7b46612-c61c-48f7-9464-7b1d6955359f req-a2691ea8-d436-4142-b91e-fac22cfc8319 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] No waiting events found dispatching network-vif-unplugged-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:29:41 np0005593234 nova_compute[227762]: 2026-01-23 09:29:41.779 227766 DEBUG nova.compute.manager [req-c7b46612-c61c-48f7-9464-7b1d6955359f req-a2691ea8-d436-4142-b91e-fac22cfc8319 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Received event network-vif-unplugged-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:29:41 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-24db0dcb3dba7e275ed14df140b942d9104ebef84367b2e38164237b1613cf27-userdata-shm.mount: Deactivated successfully.
Jan 23 04:29:41 np0005593234 systemd[1]: var-lib-containers-storage-overlay-ba07ba111c0111442744b3a7d2ab0f7623a57edcb13e5215136ccf28faf0d402-merged.mount: Deactivated successfully.
Jan 23 04:29:41 np0005593234 podman[233100]: 2026-01-23 09:29:41.855231855 +0000 UTC m=+0.311028614 container cleanup 24db0dcb3dba7e275ed14df140b942d9104ebef84367b2e38164237b1613cf27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 04:29:41 np0005593234 systemd[1]: libpod-conmon-24db0dcb3dba7e275ed14df140b942d9104ebef84367b2e38164237b1613cf27.scope: Deactivated successfully.
Jan 23 04:29:41 np0005593234 podman[233161]: 2026-01-23 09:29:41.996146135 +0000 UTC m=+0.119953864 container remove 24db0dcb3dba7e275ed14df140b942d9104ebef84367b2e38164237b1613cf27 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 04:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:42.001 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[39c63293-30b5-4007-ad1b-767bfc208e05]: (4, ('Fri Jan 23 09:29:41 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7 (24db0dcb3dba7e275ed14df140b942d9104ebef84367b2e38164237b1613cf27)\n24db0dcb3dba7e275ed14df140b942d9104ebef84367b2e38164237b1613cf27\nFri Jan 23 09:29:41 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7 (24db0dcb3dba7e275ed14df140b942d9104ebef84367b2e38164237b1613cf27)\n24db0dcb3dba7e275ed14df140b942d9104ebef84367b2e38164237b1613cf27\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:42.003 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8f0773aa-04b0-4ea3-b099-e3108fe1ed77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:42.004 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf0fce0a3-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:29:42 np0005593234 nova_compute[227762]: 2026-01-23 09:29:42.005 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:42 np0005593234 kernel: tapf0fce0a3-e0: left promiscuous mode
Jan 23 04:29:42 np0005593234 nova_compute[227762]: 2026-01-23 09:29:42.007 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:42 np0005593234 nova_compute[227762]: 2026-01-23 09:29:42.026 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:42.027 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[08c7c2d3-d706-4ff2-940f-bf3924697aa0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:42.044 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fca104f2-7788-44af-b71f-fb7ba038c0a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:42.045 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b91781ee-a8f8-4e0f-8df3-a43e09e1ef17]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:42.061 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7183944b-ec87-43a5-8072-5f5ae7b986b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446122, 'reachable_time': 32600, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233174, 'error': None, 'target': 'ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:42.070 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f0fce0a3-e4b8-4f67-b93f-3b825f52a4b7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:42.071 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[a191f298-e2ca-454d-8ec9-fe7df1498542]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:42 np0005593234 systemd[1]: run-netns-ovnmeta\x2df0fce0a3\x2de4b8\x2d4f67\x2db93f\x2d3b825f52a4b7.mount: Deactivated successfully.
Jan 23 04:29:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:42.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:42 np0005593234 nova_compute[227762]: 2026-01-23 09:29:42.648 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:42.805 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:42.806 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:42.807 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:43 np0005593234 nova_compute[227762]: 2026-01-23 09:29:43.290 227766 DEBUG nova.network.neutron [req-fd3ebbe0-2392-4cfe-86d0-7a16cf6cfd80 req-af0f73ad-f4d2-42fd-ae02-ccef504c16e2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Updated VIF entry in instance network info cache for port 7c973301-1b33-4f5a-83bf-f8b8bcb22bd7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:29:43 np0005593234 nova_compute[227762]: 2026-01-23 09:29:43.291 227766 DEBUG nova.network.neutron [req-fd3ebbe0-2392-4cfe-86d0-7a16cf6cfd80 req-af0f73ad-f4d2-42fd-ae02-ccef504c16e2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Updating instance_info_cache with network_info: [{"id": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "address": "fa:16:3e:83:1c:37", "network": {"id": "ffffc8c5-19e2-44d8-a270-e9ab8f022e29", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-738628532-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ebe73c1fb9f04cafa7ccf24cd83451f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c973301-1b", "ovs_interfaceid": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:29:43 np0005593234 nova_compute[227762]: 2026-01-23 09:29:43.328 227766 DEBUG oslo_concurrency.lockutils [req-fd3ebbe0-2392-4cfe-86d0-7a16cf6cfd80 req-af0f73ad-f4d2-42fd-ae02-ccef504c16e2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-872939ff-8eb8-4a0a-a32d-f1268af38264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:29:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:43.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:29:43 np0005593234 podman[233178]: 2026-01-23 09:29:43.786374458 +0000 UTC m=+0.080123345 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 23 04:29:43 np0005593234 nova_compute[227762]: 2026-01-23 09:29:43.996 227766 DEBUG nova.compute.manager [req-5be45649-5ee9-47cc-99fd-486636c59ff6 req-16aecf81-ee38-4608-ab01-66bdfbe7a6e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Received event network-vif-plugged-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:29:43 np0005593234 nova_compute[227762]: 2026-01-23 09:29:43.997 227766 DEBUG oslo_concurrency.lockutils [req-5be45649-5ee9-47cc-99fd-486636c59ff6 req-16aecf81-ee38-4608-ab01-66bdfbe7a6e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:43 np0005593234 nova_compute[227762]: 2026-01-23 09:29:43.998 227766 DEBUG oslo_concurrency.lockutils [req-5be45649-5ee9-47cc-99fd-486636c59ff6 req-16aecf81-ee38-4608-ab01-66bdfbe7a6e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:43 np0005593234 nova_compute[227762]: 2026-01-23 09:29:43.998 227766 DEBUG oslo_concurrency.lockutils [req-5be45649-5ee9-47cc-99fd-486636c59ff6 req-16aecf81-ee38-4608-ab01-66bdfbe7a6e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:43 np0005593234 nova_compute[227762]: 2026-01-23 09:29:43.998 227766 DEBUG nova.compute.manager [req-5be45649-5ee9-47cc-99fd-486636c59ff6 req-16aecf81-ee38-4608-ab01-66bdfbe7a6e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] No waiting events found dispatching network-vif-plugged-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:29:43 np0005593234 nova_compute[227762]: 2026-01-23 09:29:43.999 227766 WARNING nova.compute.manager [req-5be45649-5ee9-47cc-99fd-486636c59ff6 req-16aecf81-ee38-4608-ab01-66bdfbe7a6e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Received unexpected event network-vif-plugged-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:29:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:44.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:29:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2744060476' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:29:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:29:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2744060476' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:29:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:29:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:45.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:29:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:46.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:46 np0005593234 nova_compute[227762]: 2026-01-23 09:29:46.457 227766 INFO nova.virt.libvirt.driver [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Deleting instance files /var/lib/nova/instances/6ce66043-c3e3-4988-976a-2ba903e63d87_del#033[00m
Jan 23 04:29:46 np0005593234 nova_compute[227762]: 2026-01-23 09:29:46.458 227766 INFO nova.virt.libvirt.driver [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Deletion of /var/lib/nova/instances/6ce66043-c3e3-4988-976a-2ba903e63d87_del complete#033[00m
Jan 23 04:29:46 np0005593234 nova_compute[227762]: 2026-01-23 09:29:46.649 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:46 np0005593234 nova_compute[227762]: 2026-01-23 09:29:46.721 227766 INFO nova.compute.manager [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Took 5.39 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:29:46 np0005593234 nova_compute[227762]: 2026-01-23 09:29:46.722 227766 DEBUG oslo.service.loopingcall [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:29:46 np0005593234 nova_compute[227762]: 2026-01-23 09:29:46.722 227766 DEBUG nova.compute.manager [-] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:29:46 np0005593234 nova_compute[227762]: 2026-01-23 09:29:46.722 227766 DEBUG nova.network.neutron [-] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:29:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:47.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:47 np0005593234 nova_compute[227762]: 2026-01-23 09:29:47.651 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:47 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:47Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:83:1c:37 10.100.0.8
Jan 23 04:29:47 np0005593234 ovn_controller[134547]: 2026-01-23T09:29:47Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:83:1c:37 10.100.0.8
Jan 23 04:29:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:48.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:48 np0005593234 nova_compute[227762]: 2026-01-23 09:29:48.351 227766 DEBUG nova.network.neutron [-] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:29:48 np0005593234 nova_compute[227762]: 2026-01-23 09:29:48.377 227766 INFO nova.compute.manager [-] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Took 1.66 seconds to deallocate network for instance.#033[00m
Jan 23 04:29:48 np0005593234 nova_compute[227762]: 2026-01-23 09:29:48.557 227766 DEBUG oslo_concurrency.lockutils [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:48 np0005593234 nova_compute[227762]: 2026-01-23 09:29:48.559 227766 DEBUG oslo_concurrency.lockutils [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:48 np0005593234 nova_compute[227762]: 2026-01-23 09:29:48.593 227766 DEBUG nova.compute.manager [req-331fc082-9646-41f6-b36c-20948050473f req-99cdb5f2-d549-4148-8488-056f3b9c3846 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Received event network-vif-deleted-4f194d42-fe53-4ca6-a4fd-94fc9a92ffed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:29:48 np0005593234 nova_compute[227762]: 2026-01-23 09:29:48.659 227766 DEBUG oslo_concurrency.processutils [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:29:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:29:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:29:49 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4144093949' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:29:49 np0005593234 nova_compute[227762]: 2026-01-23 09:29:49.082 227766 DEBUG oslo_concurrency.processutils [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:29:49 np0005593234 nova_compute[227762]: 2026-01-23 09:29:49.090 227766 DEBUG nova.compute.provider_tree [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:29:49 np0005593234 nova_compute[227762]: 2026-01-23 09:29:49.309 227766 DEBUG nova.scheduler.client.report [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:29:49 np0005593234 nova_compute[227762]: 2026-01-23 09:29:49.501 227766 DEBUG oslo_concurrency.lockutils [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:49 np0005593234 nova_compute[227762]: 2026-01-23 09:29:49.566 227766 INFO nova.scheduler.client.report [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Deleted allocations for instance 6ce66043-c3e3-4988-976a-2ba903e63d87#033[00m
Jan 23 04:29:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:49.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:49 np0005593234 nova_compute[227762]: 2026-01-23 09:29:49.767 227766 DEBUG oslo_concurrency.lockutils [None req-f34d350a-28f7-422c-b825-9fd9bfde1c2f 3f4fe5f838cb42d0ae4285971b115141 7ce4d2b2bd9d4e648ef6fd351b972262 - - default default] Lock "6ce66043-c3e3-4988-976a-2ba903e63d87" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.435s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:29:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:50.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:29:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:29:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:51.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:29:51 np0005593234 nova_compute[227762]: 2026-01-23 09:29:51.662 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:51 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:29:51 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:29:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:52.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:52 np0005593234 nova_compute[227762]: 2026-01-23 09:29:52.655 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:29:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:53.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:29:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:29:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:54.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:29:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:55.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:29:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:56.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:56 np0005593234 nova_compute[227762]: 2026-01-23 09:29:56.570 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160581.5690205, 6ce66043-c3e3-4988-976a-2ba903e63d87 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:29:56 np0005593234 nova_compute[227762]: 2026-01-23 09:29:56.571 227766 INFO nova.compute.manager [-] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:29:56 np0005593234 nova_compute[227762]: 2026-01-23 09:29:56.665 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:56 np0005593234 nova_compute[227762]: 2026-01-23 09:29:56.925 227766 DEBUG nova.compute.manager [None req-604226da-9a69-4679-a390-3c3653d9518a - - - - - -] [instance: 6ce66043-c3e3-4988-976a-2ba903e63d87] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:29:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:57 np0005593234 nova_compute[227762]: 2026-01-23 09:29:57.658 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:57.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:57 np0005593234 nova_compute[227762]: 2026-01-23 09:29:57.816 227766 DEBUG oslo_concurrency.lockutils [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Acquiring lock "872939ff-8eb8-4a0a-a32d-f1268af38264" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:57 np0005593234 nova_compute[227762]: 2026-01-23 09:29:57.816 227766 DEBUG oslo_concurrency.lockutils [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:57 np0005593234 nova_compute[227762]: 2026-01-23 09:29:57.837 227766 DEBUG nova.objects.instance [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Lazy-loading 'flavor' on Instance uuid 872939ff-8eb8-4a0a-a32d-f1268af38264 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:29:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:29:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:29:58.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:29:58 np0005593234 nova_compute[227762]: 2026-01-23 09:29:58.332 227766 DEBUG oslo_concurrency.lockutils [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.516s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:29:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:29:58 np0005593234 nova_compute[227762]: 2026-01-23 09:29:58.740 227766 DEBUG oslo_concurrency.lockutils [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Acquiring lock "872939ff-8eb8-4a0a-a32d-f1268af38264" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:29:58 np0005593234 nova_compute[227762]: 2026-01-23 09:29:58.741 227766 DEBUG oslo_concurrency.lockutils [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:29:58 np0005593234 nova_compute[227762]: 2026-01-23 09:29:58.742 227766 INFO nova.compute.manager [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Attaching volume 3dc5fcfb-69eb-4c5b-a254-b3c8a59b23ea to /dev/vdb#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.001 227766 DEBUG os_brick.utils [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.003 227766 INFO oslo.privsep.daemon [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'os_brick.privileged.default', '--privsep_sock_path', '/tmp/tmpbkw2f5lz/privsep.sock']#033[00m
Jan 23 04:29:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:29:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:29:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:29:59.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.728 227766 INFO oslo.privsep.daemon [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.587 233340 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.591 233340 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.593 233340 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.594 233340 INFO oslo.privsep.daemon [-] privsep daemon running as pid 233340#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.734 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[798b8371-7dac-47e8-8021-1ce87ebb75b4]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:29:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:59.778 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:29:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:59.779 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:29:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:29:59.779 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.822 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.827 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.838 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.838 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[386af5d0-3701-42af-ab2d-de55dd2d6e06]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.840 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.846 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.847 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[c76eb42a-486a-417e-b7fb-617f79ebb7fb]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.854 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.867 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.868 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[3c6885df-dcb2-4669-9f00-0219ffec9166]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.870 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[f38e34a7-397a-47b6-a522-5cdcde5daf02]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.870 227766 DEBUG oslo_concurrency.processutils [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.889 227766 DEBUG oslo_concurrency.processutils [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] CMD "nvme version" returned: 0 in 0.019s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.893 227766 DEBUG os_brick.initiator.connectors.lightos [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.893 227766 DEBUG os_brick.initiator.connectors.lightos [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.893 227766 DEBUG os_brick.initiator.connectors.lightos [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.894 227766 DEBUG os_brick.utils [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] <== get_connector_properties: return (891ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 04:29:59 np0005593234 nova_compute[227762]: 2026-01-23 09:29:59.894 227766 DEBUG nova.virt.block_device [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Updating existing volume attachment record: 4d42beb6-235a-4153-a3c8-6aab60d72429 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 04:30:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:00.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:00 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 04:30:00 np0005593234 nova_compute[227762]: 2026-01-23 09:30:00.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:30:00 np0005593234 podman[233350]: 2026-01-23 09:30:00.756464117 +0000 UTC m=+0.051851402 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 04:30:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:30:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:01.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:30:01 np0005593234 nova_compute[227762]: 2026-01-23 09:30:01.668 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:01 np0005593234 nova_compute[227762]: 2026-01-23 09:30:01.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:30:01 np0005593234 nova_compute[227762]: 2026-01-23 09:30:01.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:30:01 np0005593234 nova_compute[227762]: 2026-01-23 09:30:01.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:30:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:30:02 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3997110014' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:30:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:02.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:02 np0005593234 nova_compute[227762]: 2026-01-23 09:30:02.659 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:02 np0005593234 nova_compute[227762]: 2026-01-23 09:30:02.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:30:02 np0005593234 nova_compute[227762]: 2026-01-23 09:30:02.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.096 227766 DEBUG oslo_concurrency.lockutils [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Acquiring lock "cache_volume_driver" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.097 227766 DEBUG oslo_concurrency.lockutils [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Lock "cache_volume_driver" acquired by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.099 227766 DEBUG oslo_concurrency.lockutils [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Lock "cache_volume_driver" "released" by "nova.virt.libvirt.driver.LibvirtDriver._get_volume_driver.<locals>._cache_volume_driver" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.106 227766 DEBUG nova.objects.instance [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Lazy-loading 'flavor' on Instance uuid 872939ff-8eb8-4a0a-a32d-f1268af38264 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.179 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.179 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.180 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.180 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.181 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.211 227766 DEBUG nova.virt.libvirt.driver [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Attempting to attach volume 3dc5fcfb-69eb-4c5b-a254-b3c8a59b23ea with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.215 227766 DEBUG nova.virt.libvirt.guest [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 04:30:03 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:30:03 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-3dc5fcfb-69eb-4c5b-a254-b3c8a59b23ea">
Jan 23 04:30:03 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 04:30:03 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 04:30:03 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 04:30:03 np0005593234 nova_compute[227762]:  </source>
Jan 23 04:30:03 np0005593234 nova_compute[227762]:  <auth username="openstack">
Jan 23 04:30:03 np0005593234 nova_compute[227762]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:30:03 np0005593234 nova_compute[227762]:  </auth>
Jan 23 04:30:03 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 04:30:03 np0005593234 nova_compute[227762]:  <serial>3dc5fcfb-69eb-4c5b-a254-b3c8a59b23ea</serial>
Jan 23 04:30:03 np0005593234 nova_compute[227762]: </disk>
Jan 23 04:30:03 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.606 227766 DEBUG nova.virt.libvirt.driver [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.606 227766 DEBUG nova.virt.libvirt.driver [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.606 227766 DEBUG nova.virt.libvirt.driver [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.607 227766 DEBUG nova.virt.libvirt.driver [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] No VIF found with MAC fa:16:3e:83:1c:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:30:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:30:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3177828826' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.662 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:30:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:30:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:03.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:30:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.946 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.947 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:30:03 np0005593234 nova_compute[227762]: 2026-01-23 09:30:03.947 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000007 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:30:04 np0005593234 nova_compute[227762]: 2026-01-23 09:30:04.094 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:30:04 np0005593234 nova_compute[227762]: 2026-01-23 09:30:04.095 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4726MB free_disk=20.85190200805664GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:30:04 np0005593234 nova_compute[227762]: 2026-01-23 09:30:04.095 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:04 np0005593234 nova_compute[227762]: 2026-01-23 09:30:04.096 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:04.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:04 np0005593234 nova_compute[227762]: 2026-01-23 09:30:04.561 227766 DEBUG oslo_concurrency.lockutils [None req-3d6b325f-c337-4ee6-b9c1-e7901927f010 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 5.820s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:04 np0005593234 nova_compute[227762]: 2026-01-23 09:30:04.563 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 872939ff-8eb8-4a0a-a32d-f1268af38264 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:30:04 np0005593234 nova_compute[227762]: 2026-01-23 09:30:04.564 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:30:04 np0005593234 nova_compute[227762]: 2026-01-23 09:30:04.564 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:30:04 np0005593234 nova_compute[227762]: 2026-01-23 09:30:04.616 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:30:04 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:30:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:30:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3525952224' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.078 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.084 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.156 227766 DEBUG nova.virt.libvirt.driver [None req-88569c0a-e0d3-49b3-abbd-1d94e7452f81 4993ef3bb88a4f719e0bb3937dc56103 47a2487d7c1a44acbf7b364a8ce8a4ff - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] volume_snapshot_create: create_info: {'snapshot_id': 'd7bd24c5-3c87-41fe-96cb-c8f67f276e21', 'type': 'qcow2', 'new_file': 'new_file'} volume_snapshot_create /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:3572#033[00m
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.162 227766 ERROR nova.virt.libvirt.driver [None req-88569c0a-e0d3-49b3-abbd-1d94e7452f81 4993ef3bb88a4f719e0bb3937dc56103 47a2487d7c1a44acbf7b364a8ce8a4ff - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Error occurred during volume_snapshot_create, sending error status to Cinder.: nova.exception.InternalError: Found no disk to snapshot.
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.162 227766 ERROR nova.virt.libvirt.driver [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Traceback (most recent call last):
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.162 227766 ERROR nova.virt.libvirt.driver [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3590, in volume_snapshot_create
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.162 227766 ERROR nova.virt.libvirt.driver [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264]     self._volume_snapshot_create(context, instance, guest,
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.162 227766 ERROR nova.virt.libvirt.driver [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3477, in _volume_snapshot_create
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.162 227766 ERROR nova.virt.libvirt.driver [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264]     raise exception.InternalError(msg)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.162 227766 ERROR nova.virt.libvirt.driver [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] nova.exception.InternalError: Found no disk to snapshot.
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.162 227766 ERROR nova.virt.libvirt.driver [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] #033[00m
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.220 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.399 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.399 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver [None req-88569c0a-e0d3-49b3-abbd-1d94e7452f81 4993ef3bb88a4f719e0bb3937dc56103 47a2487d7c1a44acbf7b364a8ce8a4ff - - default default] Failed to send updated snapshot status to volume service.: nova.exception.SnapshotNotFound: Snapshot d7bd24c5-3c87-41fe-96cb-c8f67f276e21 could not be found.
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3590, in volume_snapshot_create
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     self._volume_snapshot_create(context, instance, guest,
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3477, in _volume_snapshot_create
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     raise exception.InternalError(msg)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver nova.exception.InternalError: Found no disk to snapshot.
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver 
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver 
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/volume/cinder.py", line 466, in wrapper
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     res = method(self, ctx, snapshot_id, *args, **kwargs)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/volume/cinder.py", line 761, in update_snapshot_status
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     vs.update_snapshot_status(
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/v3/volume_snapshots.py", line 225, in update_snapshot_status
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     return self._action('os-update_snapshot_status',
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/v3/volume_snapshots.py", line 221, in _action
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     resp, body = self.api.client.post(url, body=body)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/client.py", line 223, in post
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     return self._cs_request(url, 'POST', **kwargs)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/client.py", line 211, in _cs_request
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     return self.request(url, method, **kwargs)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/client.py", line 197, in request
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     raise exceptions.from_response(resp, body)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver cinderclient.exceptions.NotFound: Snapshot d7bd24c5-3c87-41fe-96cb-c8f67f276e21 could not be found. (HTTP 404) (Request-ID: req-65aec193-c896-4c12-84ba-f1322c02c9f1)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver 
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver 
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3412, in _volume_snapshot_update_status
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     self._volume_api.update_snapshot_status(context,
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/volume/cinder.py", line 397, in wrapper
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     res = method(self, ctx, *args, **kwargs)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/volume/cinder.py", line 468, in wrapper
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     _reraise(exception.SnapshotNotFound(snapshot_id=snapshot_id))
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/volume/cinder.py", line 488, in _reraise
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     raise desired_exc.with_traceback(sys.exc_info()[2])
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/volume/cinder.py", line 466, in wrapper
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     res = method(self, ctx, snapshot_id, *args, **kwargs)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/volume/cinder.py", line 761, in update_snapshot_status
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     vs.update_snapshot_status(
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/v3/volume_snapshots.py", line 225, in update_snapshot_status
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     return self._action('os-update_snapshot_status',
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/v3/volume_snapshots.py", line 221, in _action
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     resp, body = self.api.client.post(url, body=body)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/client.py", line 223, in post
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     return self._cs_request(url, 'POST', **kwargs)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/client.py", line 211, in _cs_request
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     return self.request(url, method, **kwargs)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/client.py", line 197, in request
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver     raise exceptions.from_response(resp, body)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver nova.exception.SnapshotNotFound: Snapshot d7bd24c5-3c87-41fe-96cb-c8f67f276e21 could not be found.
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.538 227766 ERROR nova.virt.libvirt.driver #033[00m
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server [None req-88569c0a-e0d3-49b3-abbd-1d94e7452f81 4993ef3bb88a4f719e0bb3937dc56103 47a2487d7c1a44acbf7b364a8ce8a4ff - - default default] Exception during message handling: nova.exception.InternalError: Found no disk to snapshot.
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 244, in inner
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server     return func(*args, **kwargs)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server     self.force_reraise()
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server     raise self.value
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 4410, in volume_snapshot_create
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server     self.driver.volume_snapshot_create(context, instance, volume_id,
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3597, in volume_snapshot_create
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server     self._volume_snapshot_update_status(
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server     self.force_reraise()
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server     raise self.value
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3590, in volume_snapshot_create
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server     self._volume_snapshot_create(context, instance, guest,
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3477, in _volume_snapshot_create
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server     raise exception.InternalError(msg)
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server nova.exception.InternalError: Found no disk to snapshot.
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.542 227766 ERROR oslo_messaging.rpc.server
Jan 23 04:30:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:30:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:05.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.800 227766 DEBUG nova.virt.libvirt.driver [None req-c0228507-85ec-4f71-96c6-ec0ea4c0b9d5 4993ef3bb88a4f719e0bb3937dc56103 47a2487d7c1a44acbf7b364a8ce8a4ff - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] volume_snapshot_delete: delete_info: {'volume_id': '3dc5fcfb-69eb-4c5b-a254-b3c8a59b23ea'} _volume_snapshot_delete /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:3673
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.801 227766 ERROR nova.virt.libvirt.driver [None req-c0228507-85ec-4f71-96c6-ec0ea4c0b9d5 4993ef3bb88a4f719e0bb3937dc56103 47a2487d7c1a44acbf7b364a8ce8a4ff - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Error occurred during volume_snapshot_delete, sending error status to Cinder.: KeyError: 'type'
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.801 227766 ERROR nova.virt.libvirt.driver [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Traceback (most recent call last):
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.801 227766 ERROR nova.virt.libvirt.driver [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3846, in volume_snapshot_delete
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.801 227766 ERROR nova.virt.libvirt.driver [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264]     self._volume_snapshot_delete(context, instance, volume_id,
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.801 227766 ERROR nova.virt.libvirt.driver [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264]   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3676, in _volume_snapshot_delete
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.801 227766 ERROR nova.virt.libvirt.driver [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264]     if delete_info['type'] != 'qcow2':
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.801 227766 ERROR nova.virt.libvirt.driver [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] KeyError: 'type'
Jan 23 04:30:05 np0005593234 nova_compute[227762]: 2026-01-23 09:30:05.801 227766 ERROR nova.virt.libvirt.driver [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264]
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver [None req-c0228507-85ec-4f71-96c6-ec0ea4c0b9d5 4993ef3bb88a4f719e0bb3937dc56103 47a2487d7c1a44acbf7b364a8ce8a4ff - - default default] Failed to send updated snapshot status to volume service.: nova.exception.SnapshotNotFound: Snapshot None could not be found.
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3846, in volume_snapshot_delete
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     self._volume_snapshot_delete(context, instance, volume_id,
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3676, in _volume_snapshot_delete
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     if delete_info['type'] != 'qcow2':
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver KeyError: 'type'
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver 
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver 
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/volume/cinder.py", line 466, in wrapper
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     res = method(self, ctx, snapshot_id, *args, **kwargs)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/volume/cinder.py", line 761, in update_snapshot_status
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     vs.update_snapshot_status(
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/v3/volume_snapshots.py", line 225, in update_snapshot_status
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     return self._action('os-update_snapshot_status',
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/v3/volume_snapshots.py", line 221, in _action
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     resp, body = self.api.client.post(url, body=body)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/client.py", line 223, in post
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     return self._cs_request(url, 'POST', **kwargs)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/client.py", line 211, in _cs_request
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     return self.request(url, method, **kwargs)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/client.py", line 197, in request
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     raise exceptions.from_response(resp, body)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver cinderclient.exceptions.NotFound: Snapshot None could not be found. (HTTP 404) (Request-ID: req-3ed99b03-2b65-4b1d-85b9-dd27d6b358a3)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver 
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver 
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3412, in _volume_snapshot_update_status
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     self._volume_api.update_snapshot_status(context,
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/volume/cinder.py", line 397, in wrapper
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     res = method(self, ctx, *args, **kwargs)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/volume/cinder.py", line 468, in wrapper
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     _reraise(exception.SnapshotNotFound(snapshot_id=snapshot_id))
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/volume/cinder.py", line 488, in _reraise
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     raise desired_exc.with_traceback(sys.exc_info()[2])
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/volume/cinder.py", line 466, in wrapper
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     res = method(self, ctx, snapshot_id, *args, **kwargs)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/volume/cinder.py", line 761, in update_snapshot_status
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     vs.update_snapshot_status(
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/v3/volume_snapshots.py", line 225, in update_snapshot_status
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     return self._action('os-update_snapshot_status',
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/v3/volume_snapshots.py", line 221, in _action
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     resp, body = self.api.client.post(url, body=body)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/client.py", line 223, in post
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     return self._cs_request(url, 'POST', **kwargs)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/client.py", line 211, in _cs_request
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     return self.request(url, method, **kwargs)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/cinderclient/client.py", line 197, in request
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver     raise exceptions.from_response(resp, body)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver nova.exception.SnapshotNotFound: Snapshot None could not be found.
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.115 227766 ERROR nova.virt.libvirt.driver
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server [None req-c0228507-85ec-4f71-96c6-ec0ea4c0b9d5 4993ef3bb88a4f719e0bb3937dc56103 47a2487d7c1a44acbf7b364a8ce8a4ff - - default default] Exception during message handling: KeyError: 'type'
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 244, in inner
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server     return func(*args, **kwargs)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server     self.force_reraise()
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server     raise self.value
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 4422, in volume_snapshot_delete
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server     self.driver.volume_snapshot_delete(context, instance, volume_id,
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3853, in volume_snapshot_delete
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server     self._volume_snapshot_update_status(
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server     self.force_reraise()
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server     raise self.value
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3846, in volume_snapshot_delete
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server     self._volume_snapshot_delete(context, instance, volume_id,
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3676, in _volume_snapshot_delete
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server     if delete_info['type'] != 'qcow2':
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server KeyError: 'type'
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.118 227766 ERROR oslo_messaging.rpc.server
Jan 23 04:30:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:30:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:06.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.399 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.401 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.401 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.701 227766 DEBUG oslo_concurrency.lockutils [None req-0901dd1e-fc35-4483-9e19-7d85e178475f 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Acquiring lock "872939ff-8eb8-4a0a-a32d-f1268af38264" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.701 227766 DEBUG oslo_concurrency.lockutils [None req-0901dd1e-fc35-4483-9e19-7d85e178475f 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.712 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.726 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-872939ff-8eb8-4a0a-a32d-f1268af38264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.726 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-872939ff-8eb8-4a0a-a32d-f1268af38264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.726 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.726 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 872939ff-8eb8-4a0a-a32d-f1268af38264 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.729 227766 INFO nova.compute.manager [None req-0901dd1e-fc35-4483-9e19-7d85e178475f 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Detaching volume 3dc5fcfb-69eb-4c5b-a254-b3c8a59b23ea
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.860 227766 INFO nova.virt.block_device [None req-0901dd1e-fc35-4483-9e19-7d85e178475f 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Attempting to driver detach volume 3dc5fcfb-69eb-4c5b-a254-b3c8a59b23ea from mountpoint /dev/vdb
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.868 227766 DEBUG nova.virt.libvirt.driver [None req-0901dd1e-fc35-4483-9e19-7d85e178475f 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Attempting to detach device vdb from instance 872939ff-8eb8-4a0a-a32d-f1268af38264 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.868 227766 DEBUG nova.virt.libvirt.guest [None req-0901dd1e-fc35-4483-9e19-7d85e178475f 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 04:30:06 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:30:06 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-3dc5fcfb-69eb-4c5b-a254-b3c8a59b23ea">
Jan 23 04:30:06 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 04:30:06 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 04:30:06 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 04:30:06 np0005593234 nova_compute[227762]:  </source>
Jan 23 04:30:06 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 04:30:06 np0005593234 nova_compute[227762]:  <serial>3dc5fcfb-69eb-4c5b-a254-b3c8a59b23ea</serial>
Jan 23 04:30:06 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 04:30:06 np0005593234 nova_compute[227762]: </disk>
Jan 23 04:30:06 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.875 227766 INFO nova.virt.libvirt.driver [None req-0901dd1e-fc35-4483-9e19-7d85e178475f 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Successfully detached device vdb from instance 872939ff-8eb8-4a0a-a32d-f1268af38264 from the persistent domain config.
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.875 227766 DEBUG nova.virt.libvirt.driver [None req-0901dd1e-fc35-4483-9e19-7d85e178475f 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 872939ff-8eb8-4a0a-a32d-f1268af38264 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.876 227766 DEBUG nova.virt.libvirt.guest [None req-0901dd1e-fc35-4483-9e19-7d85e178475f 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 04:30:06 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:30:06 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-3dc5fcfb-69eb-4c5b-a254-b3c8a59b23ea">
Jan 23 04:30:06 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 04:30:06 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 04:30:06 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 04:30:06 np0005593234 nova_compute[227762]:  </source>
Jan 23 04:30:06 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 04:30:06 np0005593234 nova_compute[227762]:  <serial>3dc5fcfb-69eb-4c5b-a254-b3c8a59b23ea</serial>
Jan 23 04:30:06 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 04:30:06 np0005593234 nova_compute[227762]: </disk>
Jan 23 04:30:06 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.926 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769160606.9262714, 872939ff-8eb8-4a0a-a32d-f1268af38264 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.927 227766 DEBUG nova.virt.libvirt.driver [None req-0901dd1e-fc35-4483-9e19-7d85e178475f 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 872939ff-8eb8-4a0a-a32d-f1268af38264 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 23 04:30:06 np0005593234 nova_compute[227762]: 2026-01-23 09:30:06.929 227766 INFO nova.virt.libvirt.driver [None req-0901dd1e-fc35-4483-9e19-7d85e178475f 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Successfully detached device vdb from instance 872939ff-8eb8-4a0a-a32d-f1268af38264 from the live domain config.
Jan 23 04:30:07 np0005593234 nova_compute[227762]: 2026-01-23 09:30:07.660 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:07.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:07 np0005593234 nova_compute[227762]: 2026-01-23 09:30:07.946 227766 DEBUG nova.objects.instance [None req-0901dd1e-fc35-4483-9e19-7d85e178475f 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Lazy-loading 'flavor' on Instance uuid 872939ff-8eb8-4a0a-a32d-f1268af38264 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:30:08 np0005593234 nova_compute[227762]: 2026-01-23 09:30:08.062 227766 DEBUG oslo_concurrency.lockutils [None req-0901dd1e-fc35-4483-9e19-7d85e178475f 7490603c6c014277a3f2b3aa497d32a4 649cf144e93c4123b8f11cac66308419 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.361s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:08.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:30:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:09.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:30:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:30:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:10.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:30:10 np0005593234 nova_compute[227762]: 2026-01-23 09:30:10.781 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Updating instance_info_cache with network_info: [{"id": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "address": "fa:16:3e:83:1c:37", "network": {"id": "ffffc8c5-19e2-44d8-a270-e9ab8f022e29", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-738628532-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ebe73c1fb9f04cafa7ccf24cd83451f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c973301-1b", "ovs_interfaceid": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:30:10 np0005593234 nova_compute[227762]: 2026-01-23 09:30:10.807 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-872939ff-8eb8-4a0a-a32d-f1268af38264" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:30:10 np0005593234 nova_compute[227762]: 2026-01-23 09:30:10.808 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:30:10 np0005593234 nova_compute[227762]: 2026-01-23 09:30:10.808 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:30:11 np0005593234 nova_compute[227762]: 2026-01-23 09:30:11.146 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:30:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:11.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:11 np0005593234 nova_compute[227762]: 2026-01-23 09:30:11.714 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:12.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:30:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4118998592' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:30:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:30:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4118998592' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:30:12 np0005593234 nova_compute[227762]: 2026-01-23 09:30:12.677 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.637 227766 DEBUG oslo_concurrency.lockutils [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Acquiring lock "872939ff-8eb8-4a0a-a32d-f1268af38264" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.637 227766 DEBUG oslo_concurrency.lockutils [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.638 227766 DEBUG oslo_concurrency.lockutils [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Acquiring lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.638 227766 DEBUG oslo_concurrency.lockutils [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.638 227766 DEBUG oslo_concurrency.lockutils [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.639 227766 INFO nova.compute.manager [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Terminating instance#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.640 227766 DEBUG nova.compute.manager [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:30:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:13.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:13 np0005593234 kernel: tap7c973301-1b (unregistering): left promiscuous mode
Jan 23 04:30:13 np0005593234 NetworkManager[48942]: <info>  [1769160613.7762] device (tap7c973301-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:30:13 np0005593234 ovn_controller[134547]: 2026-01-23T09:30:13Z|00042|binding|INFO|Releasing lport 7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 from this chassis (sb_readonly=0)
Jan 23 04:30:13 np0005593234 ovn_controller[134547]: 2026-01-23T09:30:13Z|00043|binding|INFO|Setting lport 7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 down in Southbound
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.792 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:13 np0005593234 ovn_controller[134547]: 2026-01-23T09:30:13Z|00044|binding|INFO|Removing iface tap7c973301-1b ovn-installed in OVS
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.795 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:13.799 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:83:1c:37 10.100.0.8'], port_security=['fa:16:3e:83:1c:37 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '872939ff-8eb8-4a0a-a32d-f1268af38264', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ffffc8c5-19e2-44d8-a270-e9ab8f022e29', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ebe73c1fb9f04cafa7ccf24cd83451f6', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4f1c0fb7-f40a-472c-bfb9-989e30daaa92', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.216'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3a7fe01b-a599-4e74-8d7d-b2f262bde0ef, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=7c973301-1b33-4f5a-83bf-f8b8bcb22bd7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:13.801 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 in datapath ffffc8c5-19e2-44d8-a270-e9ab8f022e29 unbound from our chassis#033[00m
Jan 23 04:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:13.803 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ffffc8c5-19e2-44d8-a270-e9ab8f022e29, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:13.805 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[40a39cc9-62f9-4799-930f-5708aa899432]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:13.806 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29 namespace which is not needed anymore#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.815 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:13 np0005593234 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Jan 23 04:30:13 np0005593234 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 15.149s CPU time.
Jan 23 04:30:13 np0005593234 systemd-machined[195626]: Machine qemu-3-instance-00000007 terminated.
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.882 227766 INFO nova.virt.libvirt.driver [-] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Instance destroyed successfully.#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.882 227766 DEBUG nova.objects.instance [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lazy-loading 'resources' on Instance uuid 872939ff-8eb8-4a0a-a32d-f1268af38264 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.897 227766 DEBUG nova.virt.libvirt.vif [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:29:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesAssistedSnapshotsTest-server-97711755',display_name='tempest-VolumesAssistedSnapshotsTest-server-97711755',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-volumesassistedsnapshotstest-server-97711755',id=7,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCmSmB+IHFq0O9PF55ZzO+5v091xg4xLpQp/CjSpWBRWDFZMjNYWRSoBHQ3LdxG68hlvt7sDiL760W0gnf8/lRN1xqc+p54NWWwkPX2r922HexwZgnT2ckHcQlqvz1V5tw==',key_name='tempest-keypair-736576401',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:29:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ebe73c1fb9f04cafa7ccf24cd83451f6',ramdisk_id='',reservation_id='r-94svrqx4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-VolumesAssistedSnapshotsTest-2057771284',owner_user_name='tempest-VolumesAssistedSnapshotsTest-2057771284-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:29:34Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='a75f3e5fbaff48e6a69b0a34b177d007',uuid=872939ff-8eb8-4a0a-a32d-f1268af38264,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "address": "fa:16:3e:83:1c:37", "network": {"id": "ffffc8c5-19e2-44d8-a270-e9ab8f022e29", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-738628532-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ebe73c1fb9f04cafa7ccf24cd83451f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c973301-1b", "ovs_interfaceid": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.898 227766 DEBUG nova.network.os_vif_util [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Converting VIF {"id": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "address": "fa:16:3e:83:1c:37", "network": {"id": "ffffc8c5-19e2-44d8-a270-e9ab8f022e29", "bridge": "br-int", "label": "tempest-VolumesAssistedSnapshotsTest-738628532-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "ebe73c1fb9f04cafa7ccf24cd83451f6", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7c973301-1b", "ovs_interfaceid": "7c973301-1b33-4f5a-83bf-f8b8bcb22bd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.899 227766 DEBUG nova.network.os_vif_util [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:83:1c:37,bridge_name='br-int',has_traffic_filtering=True,id=7c973301-1b33-4f5a-83bf-f8b8bcb22bd7,network=Network(ffffc8c5-19e2-44d8-a270-e9ab8f022e29),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c973301-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.899 227766 DEBUG os_vif [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:1c:37,bridge_name='br-int',has_traffic_filtering=True,id=7c973301-1b33-4f5a-83bf-f8b8bcb22bd7,network=Network(ffffc8c5-19e2-44d8-a270-e9ab8f022e29),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c973301-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.902 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.902 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7c973301-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.903 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.906 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:30:13 np0005593234 nova_compute[227762]: 2026-01-23 09:30:13.909 227766 INFO os_vif [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:83:1c:37,bridge_name='br-int',has_traffic_filtering=True,id=7c973301-1b33-4f5a-83bf-f8b8bcb22bd7,network=Network(ffffc8c5-19e2-44d8-a270-e9ab8f022e29),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7c973301-1b')#033[00m
Jan 23 04:30:13 np0005593234 podman[233496]: 2026-01-23 09:30:13.922858127 +0000 UTC m=+0.099344068 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:30:13 np0005593234 neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29[232506]: [NOTICE]   (232514) : haproxy version is 2.8.14-c23fe91
Jan 23 04:30:13 np0005593234 neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29[232506]: [NOTICE]   (232514) : path to executable is /usr/sbin/haproxy
Jan 23 04:30:13 np0005593234 neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29[232506]: [WARNING]  (232514) : Exiting Master process...
Jan 23 04:30:13 np0005593234 neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29[232506]: [ALERT]    (232514) : Current worker (232516) exited with code 143 (Terminated)
Jan 23 04:30:13 np0005593234 neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29[232506]: [WARNING]  (232514) : All workers exited. Exiting... (0)
Jan 23 04:30:13 np0005593234 systemd[1]: libpod-31f23aa1e0b0a21efffc9586382c617e0b6ba872c8ce4c68f43eca3af53de7fb.scope: Deactivated successfully.
Jan 23 04:30:13 np0005593234 podman[233547]: 2026-01-23 09:30:13.951417549 +0000 UTC m=+0.046382473 container died 31f23aa1e0b0a21efffc9586382c617e0b6ba872c8ce4c68f43eca3af53de7fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 04:30:13 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-31f23aa1e0b0a21efffc9586382c617e0b6ba872c8ce4c68f43eca3af53de7fb-userdata-shm.mount: Deactivated successfully.
Jan 23 04:30:13 np0005593234 systemd[1]: var-lib-containers-storage-overlay-b7b36abb81656390c3adb363da5a55699c570a71ab326af7c7eb498746edcdbd-merged.mount: Deactivated successfully.
Jan 23 04:30:13 np0005593234 podman[233547]: 2026-01-23 09:30:13.996775609 +0000 UTC m=+0.091740523 container cleanup 31f23aa1e0b0a21efffc9586382c617e0b6ba872c8ce4c68f43eca3af53de7fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:30:14 np0005593234 systemd[1]: libpod-conmon-31f23aa1e0b0a21efffc9586382c617e0b6ba872c8ce4c68f43eca3af53de7fb.scope: Deactivated successfully.
Jan 23 04:30:14 np0005593234 podman[233598]: 2026-01-23 09:30:14.069741362 +0000 UTC m=+0.048269431 container remove 31f23aa1e0b0a21efffc9586382c617e0b6ba872c8ce4c68f43eca3af53de7fb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:30:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:14.075 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[58e257ef-2b71-44cf-a08b-a4ae636dd518]: (4, ('Fri Jan 23 09:30:13 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29 (31f23aa1e0b0a21efffc9586382c617e0b6ba872c8ce4c68f43eca3af53de7fb)\n31f23aa1e0b0a21efffc9586382c617e0b6ba872c8ce4c68f43eca3af53de7fb\nFri Jan 23 09:30:14 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29 (31f23aa1e0b0a21efffc9586382c617e0b6ba872c8ce4c68f43eca3af53de7fb)\n31f23aa1e0b0a21efffc9586382c617e0b6ba872c8ce4c68f43eca3af53de7fb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:30:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:14.076 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ca6bdc52-3bcc-4731-83c5-fca08d02afb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:30:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:14.078 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapffffc8c5-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:30:14 np0005593234 nova_compute[227762]: 2026-01-23 09:30:14.079 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:14 np0005593234 kernel: tapffffc8c5-10: left promiscuous mode
Jan 23 04:30:14 np0005593234 nova_compute[227762]: 2026-01-23 09:30:14.095 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:14.097 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f463f75d-66db-463a-8342-efef570ec349]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:30:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:14.111 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e690bff4-e03b-4185-86b8-12a9dd97378b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:30:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:14.112 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2a962aa4-e8c5-4ac3-aacd-6632d80dec8b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:30:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:14.130 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8ae18a55-080a-48ee-9ac3-cc0f7d9a4a0d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 446514, 'reachable_time': 31728, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233613, 'error': None, 'target': 'ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:30:14 np0005593234 systemd[1]: run-netns-ovnmeta\x2dffffc8c5\x2d19e2\x2d44d8\x2da270\x2de9ab8f022e29.mount: Deactivated successfully.
Jan 23 04:30:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:14.135 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ffffc8c5-19e2-44d8-a270-e9ab8f022e29 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:30:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:14.135 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[b829e017-cc8a-42f8-acb5-4505865fae29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:30:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:14.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:14 np0005593234 nova_compute[227762]: 2026-01-23 09:30:14.332 227766 DEBUG nova.compute.manager [req-b8bc1d45-905d-4a25-bbb2-2c33c7d39503 req-2c6390e2-1e2b-430f-b878-ec0b63541e4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Received event network-vif-unplugged-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:30:14 np0005593234 nova_compute[227762]: 2026-01-23 09:30:14.333 227766 DEBUG oslo_concurrency.lockutils [req-b8bc1d45-905d-4a25-bbb2-2c33c7d39503 req-2c6390e2-1e2b-430f-b878-ec0b63541e4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:14 np0005593234 nova_compute[227762]: 2026-01-23 09:30:14.333 227766 DEBUG oslo_concurrency.lockutils [req-b8bc1d45-905d-4a25-bbb2-2c33c7d39503 req-2c6390e2-1e2b-430f-b878-ec0b63541e4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:14 np0005593234 nova_compute[227762]: 2026-01-23 09:30:14.333 227766 DEBUG oslo_concurrency.lockutils [req-b8bc1d45-905d-4a25-bbb2-2c33c7d39503 req-2c6390e2-1e2b-430f-b878-ec0b63541e4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:14 np0005593234 nova_compute[227762]: 2026-01-23 09:30:14.333 227766 DEBUG nova.compute.manager [req-b8bc1d45-905d-4a25-bbb2-2c33c7d39503 req-2c6390e2-1e2b-430f-b878-ec0b63541e4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] No waiting events found dispatching network-vif-unplugged-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:30:14 np0005593234 nova_compute[227762]: 2026-01-23 09:30:14.334 227766 DEBUG nova.compute.manager [req-b8bc1d45-905d-4a25-bbb2-2c33c7d39503 req-2c6390e2-1e2b-430f-b878-ec0b63541e4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Received event network-vif-unplugged-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:30:15 np0005593234 nova_compute[227762]: 2026-01-23 09:30:15.165 227766 INFO nova.virt.libvirt.driver [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Deleting instance files /var/lib/nova/instances/872939ff-8eb8-4a0a-a32d-f1268af38264_del#033[00m
Jan 23 04:30:15 np0005593234 nova_compute[227762]: 2026-01-23 09:30:15.166 227766 INFO nova.virt.libvirt.driver [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Deletion of /var/lib/nova/instances/872939ff-8eb8-4a0a-a32d-f1268af38264_del complete#033[00m
Jan 23 04:30:15 np0005593234 nova_compute[227762]: 2026-01-23 09:30:15.316 227766 INFO nova.compute.manager [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Took 1.68 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:30:15 np0005593234 nova_compute[227762]: 2026-01-23 09:30:15.318 227766 DEBUG oslo.service.loopingcall [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:30:15 np0005593234 nova_compute[227762]: 2026-01-23 09:30:15.319 227766 DEBUG nova.compute.manager [-] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:30:15 np0005593234 nova_compute[227762]: 2026-01-23 09:30:15.320 227766 DEBUG nova.network.neutron [-] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:30:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:30:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:15.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:30:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:30:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:16.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:30:16 np0005593234 nova_compute[227762]: 2026-01-23 09:30:16.444 227766 DEBUG nova.compute.manager [req-0fd6c269-7d67-4e4a-b131-2f88ccd58d45 req-b3ecac6d-a2d5-4419-af92-793dafa0af1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Received event network-vif-plugged-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:30:16 np0005593234 nova_compute[227762]: 2026-01-23 09:30:16.444 227766 DEBUG oslo_concurrency.lockutils [req-0fd6c269-7d67-4e4a-b131-2f88ccd58d45 req-b3ecac6d-a2d5-4419-af92-793dafa0af1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:16 np0005593234 nova_compute[227762]: 2026-01-23 09:30:16.444 227766 DEBUG oslo_concurrency.lockutils [req-0fd6c269-7d67-4e4a-b131-2f88ccd58d45 req-b3ecac6d-a2d5-4419-af92-793dafa0af1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:16 np0005593234 nova_compute[227762]: 2026-01-23 09:30:16.444 227766 DEBUG oslo_concurrency.lockutils [req-0fd6c269-7d67-4e4a-b131-2f88ccd58d45 req-b3ecac6d-a2d5-4419-af92-793dafa0af1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:16 np0005593234 nova_compute[227762]: 2026-01-23 09:30:16.444 227766 DEBUG nova.compute.manager [req-0fd6c269-7d67-4e4a-b131-2f88ccd58d45 req-b3ecac6d-a2d5-4419-af92-793dafa0af1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] No waiting events found dispatching network-vif-plugged-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:30:16 np0005593234 nova_compute[227762]: 2026-01-23 09:30:16.445 227766 WARNING nova.compute.manager [req-0fd6c269-7d67-4e4a-b131-2f88ccd58d45 req-b3ecac6d-a2d5-4419-af92-793dafa0af1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Received unexpected event network-vif-plugged-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:30:17 np0005593234 nova_compute[227762]: 2026-01-23 09:30:17.518 227766 DEBUG nova.network.neutron [-] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:30:17 np0005593234 nova_compute[227762]: 2026-01-23 09:30:17.536 227766 INFO nova.compute.manager [-] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Took 2.22 seconds to deallocate network for instance.#033[00m
Jan 23 04:30:17 np0005593234 nova_compute[227762]: 2026-01-23 09:30:17.593 227766 DEBUG oslo_concurrency.lockutils [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:17 np0005593234 nova_compute[227762]: 2026-01-23 09:30:17.594 227766 DEBUG oslo_concurrency.lockutils [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:17 np0005593234 nova_compute[227762]: 2026-01-23 09:30:17.677 227766 DEBUG nova.compute.manager [req-deb59e6b-fba1-4844-969b-48135e2507f3 req-b4c9870f-cc56-4768-8470-4d6511224275 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Received event network-vif-deleted-7c973301-1b33-4f5a-83bf-f8b8bcb22bd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:30:17 np0005593234 nova_compute[227762]: 2026-01-23 09:30:17.679 227766 DEBUG oslo_concurrency.processutils [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:30:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:17.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:17 np0005593234 nova_compute[227762]: 2026-01-23 09:30:17.703 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:30:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3953647076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:30:18 np0005593234 nova_compute[227762]: 2026-01-23 09:30:18.126 227766 DEBUG oslo_concurrency.processutils [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:30:18 np0005593234 nova_compute[227762]: 2026-01-23 09:30:18.134 227766 DEBUG nova.compute.provider_tree [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:30:18 np0005593234 nova_compute[227762]: 2026-01-23 09:30:18.163 227766 DEBUG nova.scheduler.client.report [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:30:18 np0005593234 nova_compute[227762]: 2026-01-23 09:30:18.195 227766 DEBUG oslo_concurrency.lockutils [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:18 np0005593234 nova_compute[227762]: 2026-01-23 09:30:18.259 227766 INFO nova.scheduler.client.report [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Deleted allocations for instance 872939ff-8eb8-4a0a-a32d-f1268af38264#033[00m
Jan 23 04:30:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:18.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:18 np0005593234 nova_compute[227762]: 2026-01-23 09:30:18.406 227766 DEBUG oslo_concurrency.lockutils [None req-51b63ac2-bfe2-4558-812e-84da3f01957f a75f3e5fbaff48e6a69b0a34b177d007 ebe73c1fb9f04cafa7ccf24cd83451f6 - - default default] Lock "872939ff-8eb8-4a0a-a32d-f1268af38264" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:18 np0005593234 nova_compute[227762]: 2026-01-23 09:30:18.580 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:18 np0005593234 nova_compute[227762]: 2026-01-23 09:30:18.784 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:18 np0005593234 nova_compute[227762]: 2026-01-23 09:30:18.903 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:30:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:19.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:30:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:20.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:30:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:21.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:30:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:22.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:22 np0005593234 nova_compute[227762]: 2026-01-23 09:30:22.706 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:23 np0005593234 nova_compute[227762]: 2026-01-23 09:30:23.686 227766 DEBUG nova.compute.manager [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 23 04:30:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:23.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:23 np0005593234 nova_compute[227762]: 2026-01-23 09:30:23.808 227766 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:23 np0005593234 nova_compute[227762]: 2026-01-23 09:30:23.809 227766 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:23 np0005593234 nova_compute[227762]: 2026-01-23 09:30:23.839 227766 DEBUG nova.objects.instance [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lazy-loading 'pci_requests' on Instance uuid f3277436-85d0-4674-aa69-d7a50448a5d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:30:23 np0005593234 nova_compute[227762]: 2026-01-23 09:30:23.874 227766 DEBUG nova.virt.hardware [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:30:23 np0005593234 nova_compute[227762]: 2026-01-23 09:30:23.874 227766 INFO nova.compute.claims [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:30:23 np0005593234 nova_compute[227762]: 2026-01-23 09:30:23.875 227766 DEBUG nova.objects.instance [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lazy-loading 'resources' on Instance uuid f3277436-85d0-4674-aa69-d7a50448a5d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:30:23 np0005593234 nova_compute[227762]: 2026-01-23 09:30:23.890 227766 DEBUG nova.objects.instance [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lazy-loading 'numa_topology' on Instance uuid f3277436-85d0-4674-aa69-d7a50448a5d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:30:23 np0005593234 nova_compute[227762]: 2026-01-23 09:30:23.905 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:23 np0005593234 nova_compute[227762]: 2026-01-23 09:30:23.912 227766 DEBUG nova.objects.instance [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lazy-loading 'pci_devices' on Instance uuid f3277436-85d0-4674-aa69-d7a50448a5d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:30:23 np0005593234 nova_compute[227762]: 2026-01-23 09:30:23.959 227766 INFO nova.compute.resource_tracker [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Updating resource usage from migration 0d413dd7-db9d-4d9c-972f-46c175c8f097#033[00m
Jan 23 04:30:23 np0005593234 nova_compute[227762]: 2026-01-23 09:30:23.960 227766 DEBUG nova.compute.resource_tracker [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Starting to track incoming migration 0d413dd7-db9d-4d9c-972f-46c175c8f097 with flavor 68d42077-c749-4366-ba3e-07758debb02d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 23 04:30:24 np0005593234 nova_compute[227762]: 2026-01-23 09:30:24.025 227766 DEBUG oslo_concurrency.processutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:30:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:24.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:30:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1002218887' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:30:24 np0005593234 nova_compute[227762]: 2026-01-23 09:30:24.467 227766 DEBUG oslo_concurrency.processutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:30:24 np0005593234 nova_compute[227762]: 2026-01-23 09:30:24.473 227766 DEBUG nova.compute.provider_tree [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:30:24 np0005593234 nova_compute[227762]: 2026-01-23 09:30:24.499 227766 DEBUG nova.scheduler.client.report [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:30:24 np0005593234 nova_compute[227762]: 2026-01-23 09:30:24.524 227766 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.716s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:30:24 np0005593234 nova_compute[227762]: 2026-01-23 09:30:24.525 227766 INFO nova.compute.manager [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Migrating
Jan 23 04:30:24 np0005593234 nova_compute[227762]: 2026-01-23 09:30:24.525 227766 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:30:24 np0005593234 nova_compute[227762]: 2026-01-23 09:30:24.525 227766 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:30:24 np0005593234 nova_compute[227762]: 2026-01-23 09:30:24.536 227766 INFO nova.compute.rpcapi [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Jan 23 04:30:24 np0005593234 nova_compute[227762]: 2026-01-23 09:30:24.537 227766 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:30:25 np0005593234 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 04:30:25 np0005593234 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 04:30:25 np0005593234 systemd-logind[794]: New session 51 of user nova.
Jan 23 04:30:25 np0005593234 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 04:30:25 np0005593234 systemd[1]: Starting User Manager for UID 42436...
Jan 23 04:30:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:25.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:25 np0005593234 systemd[233672]: Queued start job for default target Main User Target.
Jan 23 04:30:25 np0005593234 systemd[233672]: Created slice User Application Slice.
Jan 23 04:30:25 np0005593234 systemd[233672]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:30:25 np0005593234 systemd[233672]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:30:25 np0005593234 systemd[233672]: Reached target Paths.
Jan 23 04:30:25 np0005593234 systemd[233672]: Reached target Timers.
Jan 23 04:30:25 np0005593234 systemd[233672]: Starting D-Bus User Message Bus Socket...
Jan 23 04:30:25 np0005593234 systemd[233672]: Starting Create User's Volatile Files and Directories...
Jan 23 04:30:25 np0005593234 systemd[233672]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:30:25 np0005593234 systemd[233672]: Reached target Sockets.
Jan 23 04:30:25 np0005593234 systemd[233672]: Finished Create User's Volatile Files and Directories.
Jan 23 04:30:25 np0005593234 systemd[233672]: Reached target Basic System.
Jan 23 04:30:25 np0005593234 systemd[233672]: Reached target Main User Target.
Jan 23 04:30:25 np0005593234 systemd[233672]: Startup finished in 132ms.
Jan 23 04:30:25 np0005593234 systemd[1]: Started User Manager for UID 42436.
Jan 23 04:30:25 np0005593234 systemd[1]: Started Session 51 of User nova.
Jan 23 04:30:25 np0005593234 systemd[1]: session-51.scope: Deactivated successfully.
Jan 23 04:30:25 np0005593234 systemd-logind[794]: Session 51 logged out. Waiting for processes to exit.
Jan 23 04:30:25 np0005593234 systemd-logind[794]: Removed session 51.
Jan 23 04:30:26 np0005593234 systemd-logind[794]: New session 53 of user nova.
Jan 23 04:30:26 np0005593234 systemd[1]: Started Session 53 of User nova.
Jan 23 04:30:26 np0005593234 systemd[1]: session-53.scope: Deactivated successfully.
Jan 23 04:30:26 np0005593234 systemd-logind[794]: Session 53 logged out. Waiting for processes to exit.
Jan 23 04:30:26 np0005593234 systemd-logind[794]: Removed session 53.
Jan 23 04:30:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:26.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:27.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:27 np0005593234 nova_compute[227762]: 2026-01-23 09:30:27.709 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:28.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:28 np0005593234 nova_compute[227762]: 2026-01-23 09:30:28.880 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160613.8794038, 872939ff-8eb8-4a0a-a32d-f1268af38264 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:30:28 np0005593234 nova_compute[227762]: 2026-01-23 09:30:28.881 227766 INFO nova.compute.manager [-] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] VM Stopped (Lifecycle Event)
Jan 23 04:30:28 np0005593234 nova_compute[227762]: 2026-01-23 09:30:28.907 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:28 np0005593234 nova_compute[227762]: 2026-01-23 09:30:28.909 227766 DEBUG nova.compute.manager [None req-1d5c8166-6ec3-4376-b4f2-0bae6b4b1822 - - - - - -] [instance: 872939ff-8eb8-4a0a-a32d-f1268af38264] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:30:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:29.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:30.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:30:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:31.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:30:31 np0005593234 podman[233747]: 2026-01-23 09:30:31.780268302 +0000 UTC m=+0.071670224 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 04:30:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:32.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:32 np0005593234 nova_compute[227762]: 2026-01-23 09:30:32.711 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:30:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:33.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:30:33 np0005593234 nova_compute[227762]: 2026-01-23 09:30:33.909 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:30:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:34.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:30:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:30:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:35.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:30:36 np0005593234 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 04:30:36 np0005593234 systemd[233672]: Activating special unit Exit the Session...
Jan 23 04:30:36 np0005593234 systemd[233672]: Stopped target Main User Target.
Jan 23 04:30:36 np0005593234 systemd[233672]: Stopped target Basic System.
Jan 23 04:30:36 np0005593234 systemd[233672]: Stopped target Paths.
Jan 23 04:30:36 np0005593234 systemd[233672]: Stopped target Sockets.
Jan 23 04:30:36 np0005593234 systemd[233672]: Stopped target Timers.
Jan 23 04:30:36 np0005593234 systemd[233672]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:30:36 np0005593234 systemd[233672]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 04:30:36 np0005593234 systemd[233672]: Closed D-Bus User Message Bus Socket.
Jan 23 04:30:36 np0005593234 systemd[233672]: Stopped Create User's Volatile Files and Directories.
Jan 23 04:30:36 np0005593234 systemd[233672]: Removed slice User Application Slice.
Jan 23 04:30:36 np0005593234 systemd[233672]: Reached target Shutdown.
Jan 23 04:30:36 np0005593234 systemd[233672]: Finished Exit the Session.
Jan 23 04:30:36 np0005593234 systemd[233672]: Reached target Exit the Session.
Jan 23 04:30:36 np0005593234 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 04:30:36 np0005593234 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 04:30:36 np0005593234 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 04:30:36 np0005593234 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 04:30:36 np0005593234 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 04:30:36 np0005593234 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 04:30:36 np0005593234 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 04:30:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:36.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.379133) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160637379194, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 2417, "num_deletes": 251, "total_data_size": 5759018, "memory_usage": 5818944, "flush_reason": "Manual Compaction"}
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160637464077, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 3768804, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22895, "largest_seqno": 25307, "table_properties": {"data_size": 3758978, "index_size": 6192, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 20542, "raw_average_key_size": 20, "raw_value_size": 3739190, "raw_average_value_size": 3735, "num_data_blocks": 274, "num_entries": 1001, "num_filter_entries": 1001, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769160432, "oldest_key_time": 1769160432, "file_creation_time": 1769160637, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 85040 microseconds, and 7702 cpu microseconds.
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.464174) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 3768804 bytes OK
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.464194) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.518898) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.518956) EVENT_LOG_v1 {"time_micros": 1769160637518942, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.518985) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 5748256, prev total WAL file size 5748256, number of live WAL files 2.
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.520690) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(3680KB)], [48(7436KB)]
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160637520814, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 11383831, "oldest_snapshot_seqno": -1}
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 4867 keys, 9349869 bytes, temperature: kUnknown
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160637690088, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 9349869, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9316050, "index_size": 20525, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 122836, "raw_average_key_size": 25, "raw_value_size": 9226589, "raw_average_value_size": 1895, "num_data_blocks": 842, "num_entries": 4867, "num_filter_entries": 4867, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769160637, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.690329) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 9349869 bytes
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.701167) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 67.2 rd, 55.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 7.3 +0.0 blob) out(8.9 +0.0 blob), read-write-amplify(5.5) write-amplify(2.5) OK, records in: 5388, records dropped: 521 output_compression: NoCompression
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.701218) EVENT_LOG_v1 {"time_micros": 1769160637701187, "job": 28, "event": "compaction_finished", "compaction_time_micros": 169343, "compaction_time_cpu_micros": 20073, "output_level": 6, "num_output_files": 1, "total_output_size": 9349869, "num_input_records": 5388, "num_output_records": 4867, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160637702293, "job": 28, "event": "table_file_deletion", "file_number": 50}
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160637703700, "job": 28, "event": "table_file_deletion", "file_number": 48}
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.520554) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.703738) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.703742) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.703744) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.703745) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:30:37 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:30:37.703746) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:30:37 np0005593234 nova_compute[227762]: 2026-01-23 09:30:37.713 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:37.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:38.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:38 np0005593234 nova_compute[227762]: 2026-01-23 09:30:38.911 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:39.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:40.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:41.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:42.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:42 np0005593234 nova_compute[227762]: 2026-01-23 09:30:42.715 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:42.805 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:30:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:42.806 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:30:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:30:42.806 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:30:43 np0005593234 nova_compute[227762]: 2026-01-23 09:30:43.542 227766 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquiring lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:30:43 np0005593234 nova_compute[227762]: 2026-01-23 09:30:43.543 227766 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquired lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:30:43 np0005593234 nova_compute[227762]: 2026-01-23 09:30:43.543 227766 DEBUG nova.network.neutron [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:30:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:43.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:43 np0005593234 nova_compute[227762]: 2026-01-23 09:30:43.770 227766 DEBUG nova.network.neutron [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:30:43 np0005593234 nova_compute[227762]: 2026-01-23 09:30:43.914 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:44 np0005593234 nova_compute[227762]: 2026-01-23 09:30:44.281 227766 DEBUG nova.network.neutron [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:30:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:44.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:30:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/116655622' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:30:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:30:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/116655622' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:30:44 np0005593234 nova_compute[227762]: 2026-01-23 09:30:44.462 227766 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Releasing lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:30:44 np0005593234 nova_compute[227762]: 2026-01-23 09:30:44.562 227766 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 23 04:30:44 np0005593234 nova_compute[227762]: 2026-01-23 09:30:44.564 227766 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 04:30:44 np0005593234 nova_compute[227762]: 2026-01-23 09:30:44.565 227766 INFO nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Creating image(s)#033[00m
Jan 23 04:30:44 np0005593234 nova_compute[227762]: 2026-01-23 09:30:44.605 227766 DEBUG nova.storage.rbd_utils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] creating snapshot(nova-resize) on rbd image(f3277436-85d0-4674-aa69-d7a50448a5d0_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:30:44 np0005593234 podman[233811]: 2026-01-23 09:30:44.780623886 +0000 UTC m=+0.077727671 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:30:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:45.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e146 e146: 3 total, 3 up, 3 in
Jan 23 04:30:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:46.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.360 227766 DEBUG nova.objects.instance [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lazy-loading 'trusted_certs' on Instance uuid f3277436-85d0-4674-aa69-d7a50448a5d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.462 227766 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.462 227766 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Ensure instance console log exists: /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.463 227766 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.463 227766 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.463 227766 DEBUG oslo_concurrency.lockutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.465 227766 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.469 227766 WARNING nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.475 227766 DEBUG nova.virt.libvirt.host [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.476 227766 DEBUG nova.virt.libvirt.host [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.478 227766 DEBUG nova.virt.libvirt.host [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.478 227766 DEBUG nova.virt.libvirt.host [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.480 227766 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.480 227766 DEBUG nova.virt.hardware [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.480 227766 DEBUG nova.virt.hardware [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.481 227766 DEBUG nova.virt.hardware [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.481 227766 DEBUG nova.virt.hardware [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.481 227766 DEBUG nova.virt.hardware [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.481 227766 DEBUG nova.virt.hardware [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.482 227766 DEBUG nova.virt.hardware [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.482 227766 DEBUG nova.virt.hardware [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.482 227766 DEBUG nova.virt.hardware [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.482 227766 DEBUG nova.virt.hardware [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.483 227766 DEBUG nova.virt.hardware [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.483 227766 DEBUG nova.objects.instance [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lazy-loading 'vcpu_model' on Instance uuid f3277436-85d0-4674-aa69-d7a50448a5d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.501 227766 DEBUG oslo_concurrency.processutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:30:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:30:46 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1914346700' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.928 227766 DEBUG oslo_concurrency.processutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:30:46 np0005593234 nova_compute[227762]: 2026-01-23 09:30:46.963 227766 DEBUG oslo_concurrency.processutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:30:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:30:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4287987508' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:30:47 np0005593234 nova_compute[227762]: 2026-01-23 09:30:47.716 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:30:47 np0005593234 nova_compute[227762]: 2026-01-23 09:30:47.742 227766 DEBUG oslo_concurrency.processutils [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.779s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:30:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 04:30:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:47.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 04:30:47 np0005593234 nova_compute[227762]: 2026-01-23 09:30:47.746 227766 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  <uuid>f3277436-85d0-4674-aa69-d7a50448a5d0</uuid>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  <name>instance-00000008</name>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <nova:name>tempest-MigrationsAdminTest-server-1070439771</nova:name>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:30:46</nova:creationTime>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <nova:user uuid="7536fa2e625541fba613dc32a49a4c5b">tempest-MigrationsAdminTest-2056264627-project-member</nova:user>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <nova:project uuid="11def90dfdc14cfe928302bec2835794">tempest-MigrationsAdminTest-2056264627</nova:project>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <entry name="serial">f3277436-85d0-4674-aa69-d7a50448a5d0</entry>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <entry name="uuid">f3277436-85d0-4674-aa69-d7a50448a5d0</entry>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/f3277436-85d0-4674-aa69-d7a50448a5d0_disk">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/f3277436-85d0-4674-aa69-d7a50448a5d0_disk.config">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0/console.log" append="off"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:30:47 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:30:47 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:30:47 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:30:47 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 04:30:47 np0005593234 nova_compute[227762]: 2026-01-23 09:30:47.944 227766 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:30:47 np0005593234 nova_compute[227762]: 2026-01-23 09:30:47.944 227766 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:30:47 np0005593234 nova_compute[227762]: 2026-01-23 09:30:47.945 227766 INFO nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Using config drive
Jan 23 04:30:48 np0005593234 systemd-machined[195626]: New machine qemu-4-instance-00000008.
Jan 23 04:30:48 np0005593234 systemd[1]: Started Virtual Machine qemu-4-instance-00000008.
Jan 23 04:30:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:48.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:48 np0005593234 nova_compute[227762]: 2026-01-23 09:30:48.707 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160648.7067342, f3277436-85d0-4674-aa69-d7a50448a5d0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:30:48 np0005593234 nova_compute[227762]: 2026-01-23 09:30:48.708 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] VM Resumed (Lifecycle Event)
Jan 23 04:30:48 np0005593234 nova_compute[227762]: 2026-01-23 09:30:48.710 227766 DEBUG nova.compute.manager [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:30:48 np0005593234 nova_compute[227762]: 2026-01-23 09:30:48.713 227766 INFO nova.virt.libvirt.driver [-] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance running successfully.
Jan 23 04:30:48 np0005593234 virtqemud[227483]: argument unsupported: QEMU guest agent is not configured
Jan 23 04:30:48 np0005593234 nova_compute[227762]: 2026-01-23 09:30:48.716 227766 DEBUG nova.virt.libvirt.guest [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 23 04:30:48 np0005593234 nova_compute[227762]: 2026-01-23 09:30:48.716 227766 DEBUG nova.virt.libvirt.driver [None req-9c85c1b3-7383-485b-bd4a-d5f1b7ae1bb8 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 23 04:30:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:48 np0005593234 nova_compute[227762]: 2026-01-23 09:30:48.734 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:30:48 np0005593234 nova_compute[227762]: 2026-01-23 09:30:48.737 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:30:48 np0005593234 nova_compute[227762]: 2026-01-23 09:30:48.781 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 23 04:30:48 np0005593234 nova_compute[227762]: 2026-01-23 09:30:48.782 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160648.7076428, f3277436-85d0-4674-aa69-d7a50448a5d0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:30:48 np0005593234 nova_compute[227762]: 2026-01-23 09:30:48.782 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] VM Started (Lifecycle Event)
Jan 23 04:30:48 np0005593234 nova_compute[227762]: 2026-01-23 09:30:48.811 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:30:48 np0005593234 nova_compute[227762]: 2026-01-23 09:30:48.815 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:30:48 np0005593234 nova_compute[227762]: 2026-01-23 09:30:48.916 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:49.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:50.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:51.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:52.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:52 np0005593234 nova_compute[227762]: 2026-01-23 09:30:52.719 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:30:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:30:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:30:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:30:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e147 e147: 3 total, 3 up, 3 in
Jan 23 04:30:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:53.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:53 np0005593234 nova_compute[227762]: 2026-01-23 09:30:53.918 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:30:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:54.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:30:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:30:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:30:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:30:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:30:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:30:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:55.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:56.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:57 np0005593234 nova_compute[227762]: 2026-01-23 09:30:57.726 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:30:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:57.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:30:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:30:58.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e147 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:30:58 np0005593234 nova_compute[227762]: 2026-01-23 09:30:58.921 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:30:59 np0005593234 nova_compute[227762]: 2026-01-23 09:30:59.197 227766 DEBUG oslo_concurrency.lockutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "f2d1fdc0-baaf-4566-8655-aafdbcf1f473" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:30:59 np0005593234 nova_compute[227762]: 2026-01-23 09:30:59.197 227766 DEBUG oslo_concurrency.lockutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "f2d1fdc0-baaf-4566-8655-aafdbcf1f473" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:30:59 np0005593234 nova_compute[227762]: 2026-01-23 09:30:59.221 227766 DEBUG nova.compute.manager [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 04:30:59 np0005593234 nova_compute[227762]: 2026-01-23 09:30:59.320 227766 DEBUG oslo_concurrency.lockutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:30:59 np0005593234 nova_compute[227762]: 2026-01-23 09:30:59.320 227766 DEBUG oslo_concurrency.lockutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:30:59 np0005593234 nova_compute[227762]: 2026-01-23 09:30:59.328 227766 DEBUG nova.virt.hardware [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 04:30:59 np0005593234 nova_compute[227762]: 2026-01-23 09:30:59.330 227766 INFO nova.compute.claims [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Claim successful on node compute-2.ctlplane.example.com
Jan 23 04:30:59 np0005593234 nova_compute[227762]: 2026-01-23 09:30:59.503 227766 DEBUG oslo_concurrency.processutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:30:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:30:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:30:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:30:59.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:30:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:30:59 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1120693088' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:30:59 np0005593234 nova_compute[227762]: 2026-01-23 09:30:59.976 227766 DEBUG oslo_concurrency.processutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:30:59 np0005593234 nova_compute[227762]: 2026-01-23 09:30:59.984 227766 DEBUG nova.compute.provider_tree [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.004 227766 DEBUG nova.scheduler.client.report [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.089 227766 DEBUG oslo_concurrency.lockutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.769s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.090 227766 DEBUG nova.compute.manager [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.208 227766 DEBUG nova.compute.manager [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.209 227766 DEBUG nova.network.neutron [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.255 227766 INFO nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.325 227766 DEBUG nova.compute.manager [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 04:31:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:00.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.448 227766 DEBUG nova.compute.manager [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.449 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.450 227766 INFO nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Creating image(s)
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.635 227766 DEBUG nova.storage.rbd_utils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.668 227766 DEBUG nova.storage.rbd_utils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.697 227766 DEBUG nova.storage.rbd_utils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.701 227766 DEBUG oslo_concurrency.processutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.759 227766 DEBUG oslo_concurrency.processutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.760 227766 DEBUG oslo_concurrency.lockutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.760 227766 DEBUG oslo_concurrency.lockutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.761 227766 DEBUG oslo_concurrency.lockutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.789 227766 DEBUG nova.storage.rbd_utils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:31:00 np0005593234 nova_compute[227762]: 2026-01-23 09:31:00.792 227766 DEBUG oslo_concurrency.processutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:31:01 np0005593234 nova_compute[227762]: 2026-01-23 09:31:01.035 227766 DEBUG nova.network.neutron [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188
Jan 23 04:31:01 np0005593234 nova_compute[227762]: 2026-01-23 09:31:01.035 227766 DEBUG nova.compute.manager [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 04:31:01 np0005593234 nova_compute[227762]: 2026-01-23 09:31:01.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:31:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:01.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.216 227766 DEBUG oslo_concurrency.processutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:31:02.217 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:31:02.218 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.252 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.294 227766 DEBUG nova.storage.rbd_utils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] resizing rbd image f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:31:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:31:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:02.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.667 227766 DEBUG nova.objects.instance [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'migration_context' on Instance uuid f2d1fdc0-baaf-4566-8655-aafdbcf1f473 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.685 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.685 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Ensure instance console log exists: /var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.686 227766 DEBUG oslo_concurrency.lockutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.686 227766 DEBUG oslo_concurrency.lockutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.686 227766 DEBUG oslo_concurrency.lockutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.688 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.692 227766 WARNING nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.707 227766 DEBUG nova.virt.libvirt.host [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.707 227766 DEBUG nova.virt.libvirt.host [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.710 227766 DEBUG nova.virt.libvirt.host [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.710 227766 DEBUG nova.virt.libvirt.host [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.711 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.711 227766 DEBUG nova.virt.hardware [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.712 227766 DEBUG nova.virt.hardware [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.712 227766 DEBUG nova.virt.hardware [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.712 227766 DEBUG nova.virt.hardware [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.713 227766 DEBUG nova.virt.hardware [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.713 227766 DEBUG nova.virt.hardware [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.713 227766 DEBUG nova.virt.hardware [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.713 227766 DEBUG nova.virt.hardware [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.713 227766 DEBUG nova.virt.hardware [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.714 227766 DEBUG nova.virt.hardware [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.714 227766 DEBUG nova.virt.hardware [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.717 227766 DEBUG oslo_concurrency.processutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.736 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:02 np0005593234 podman[234392]: 2026-01-23 09:31:02.762292586 +0000 UTC m=+0.053906315 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.796 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.796 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.797 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.797 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:31:02 np0005593234 nova_compute[227762]: 2026-01-23 09:31:02.797 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:31:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3927094359' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:31:03 np0005593234 nova_compute[227762]: 2026-01-23 09:31:03.174 227766 DEBUG oslo_concurrency.processutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:31:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3485796465' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:31:03 np0005593234 nova_compute[227762]: 2026-01-23 09:31:03.204 227766 DEBUG nova.storage.rbd_utils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:31:03 np0005593234 nova_compute[227762]: 2026-01-23 09:31:03.208 227766 DEBUG oslo_concurrency.processutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:03 np0005593234 nova_compute[227762]: 2026-01-23 09:31:03.228 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e148 e148: 3 total, 3 up, 3 in
Jan 23 04:31:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:31:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4169335306' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:31:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:03 np0005593234 ovn_controller[134547]: 2026-01-23T09:31:03Z|00045|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 23 04:31:03 np0005593234 nova_compute[227762]: 2026-01-23 09:31:03.764 227766 DEBUG oslo_concurrency.processutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.556s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:03 np0005593234 nova_compute[227762]: 2026-01-23 09:31:03.766 227766 DEBUG nova.objects.instance [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'pci_devices' on Instance uuid f2d1fdc0-baaf-4566-8655-aafdbcf1f473 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:31:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:03.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:03 np0005593234 nova_compute[227762]: 2026-01-23 09:31:03.856 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  <uuid>f2d1fdc0-baaf-4566-8655-aafdbcf1f473</uuid>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  <name>instance-0000000a</name>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <nova:name>tempest-MigrationsAdminTest-server-1871952171</nova:name>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:31:02</nova:creationTime>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <nova:user uuid="7536fa2e625541fba613dc32a49a4c5b">tempest-MigrationsAdminTest-2056264627-project-member</nova:user>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <nova:project uuid="11def90dfdc14cfe928302bec2835794">tempest-MigrationsAdminTest-2056264627</nova:project>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <entry name="serial">f2d1fdc0-baaf-4566-8655-aafdbcf1f473</entry>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <entry name="uuid">f2d1fdc0-baaf-4566-8655-aafdbcf1f473</entry>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk.config">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473/console.log" append="off"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:31:03 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:31:03 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:31:03 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:31:03 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:31:03 np0005593234 nova_compute[227762]: 2026-01-23 09:31:03.862 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:31:03 np0005593234 nova_compute[227762]: 2026-01-23 09:31:03.862 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:31:03 np0005593234 nova_compute[227762]: 2026-01-23 09:31:03.923 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.007 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.009 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4702MB free_disk=20.922042846679688GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.009 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.009 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.094 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.094 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.095 227766 INFO nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Using config drive#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.130 227766 DEBUG nova.storage.rbd_utils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.174 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance f3277436-85d0-4674-aa69-d7a50448a5d0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.174 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance f2d1fdc0-baaf-4566-8655-aafdbcf1f473 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.175 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.175 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.244 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:04.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.378 227766 INFO nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Creating config drive at /var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473/disk.config#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.384 227766 DEBUG oslo_concurrency.processutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr680k42g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.512 227766 DEBUG oslo_concurrency.processutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr680k42g" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.541 227766 DEBUG nova.storage.rbd_utils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] rbd image f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.545 227766 DEBUG oslo_concurrency.processutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473/disk.config f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:31:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1981966718' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.692 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.698 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.722 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.744 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.745 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.736s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.768 227766 DEBUG oslo_concurrency.processutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473/disk.config f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:04 np0005593234 nova_compute[227762]: 2026-01-23 09:31:04.769 227766 INFO nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Deleting local config drive /var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473/disk.config because it was imported into RBD.#033[00m
Jan 23 04:31:04 np0005593234 systemd-machined[195626]: New machine qemu-5-instance-0000000a.
Jan 23 04:31:04 np0005593234 systemd[1]: Started Virtual Machine qemu-5-instance-0000000a.
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.536 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160665.536365, f2d1fdc0-baaf-4566-8655-aafdbcf1f473 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.538 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.539 227766 DEBUG nova.compute.manager [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.540 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.543 227766 INFO nova.virt.libvirt.driver [-] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Instance spawned successfully.#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.543 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.585 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.589 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.589 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.590 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.590 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.590 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.591 227766 DEBUG nova.virt.libvirt.driver [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.594 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.633 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.634 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160665.5374198, f2d1fdc0-baaf-4566-8655-aafdbcf1f473 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.634 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] VM Started (Lifecycle Event)#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.667 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.673 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.683 227766 INFO nova.compute.manager [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Took 5.23 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.683 227766 DEBUG nova.compute.manager [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.714 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.740 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.750 227766 INFO nova.compute.manager [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Took 6.47 seconds to build instance.#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.768 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.769 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.769 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:31:05 np0005593234 nova_compute[227762]: 2026-01-23 09:31:05.772 227766 DEBUG oslo_concurrency.lockutils [None req-388d2b8a-6541-4315-af9f-4d2fb25e58ac 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "f2d1fdc0-baaf-4566-8655-aafdbcf1f473" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:31:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:05.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:31:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:31:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:31:06 np0005593234 nova_compute[227762]: 2026-01-23 09:31:06.080 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:31:06 np0005593234 nova_compute[227762]: 2026-01-23 09:31:06.081 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:31:06 np0005593234 nova_compute[227762]: 2026-01-23 09:31:06.081 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:31:06 np0005593234 nova_compute[227762]: 2026-01-23 09:31:06.082 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f3277436-85d0-4674-aa69-d7a50448a5d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:31:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:31:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:06.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:31:06 np0005593234 nova_compute[227762]: 2026-01-23 09:31:06.559 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:31:07 np0005593234 nova_compute[227762]: 2026-01-23 09:31:07.374 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:31:07 np0005593234 nova_compute[227762]: 2026-01-23 09:31:07.730 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:07.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:08.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:08 np0005593234 nova_compute[227762]: 2026-01-23 09:31:08.582 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:31:08 np0005593234 nova_compute[227762]: 2026-01-23 09:31:08.583 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:31:08 np0005593234 nova_compute[227762]: 2026-01-23 09:31:08.583 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:08 np0005593234 nova_compute[227762]: 2026-01-23 09:31:08.584 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:08 np0005593234 nova_compute[227762]: 2026-01-23 09:31:08.584 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:08 np0005593234 nova_compute[227762]: 2026-01-23 09:31:08.584 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:08 np0005593234 nova_compute[227762]: 2026-01-23 09:31:08.585 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:31:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:08 np0005593234 nova_compute[227762]: 2026-01-23 09:31:08.927 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:31:09.221 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:31:09 np0005593234 nova_compute[227762]: 2026-01-23 09:31:09.584 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:31:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:31:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:09.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:31:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:10.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:31:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:11.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:31:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:12.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:12 np0005593234 nova_compute[227762]: 2026-01-23 09:31:12.731 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:13 np0005593234 nova_compute[227762]: 2026-01-23 09:31:13.777 227766 DEBUG oslo_concurrency.lockutils [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "refresh_cache-f2d1fdc0-baaf-4566-8655-aafdbcf1f473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:31:13 np0005593234 nova_compute[227762]: 2026-01-23 09:31:13.778 227766 DEBUG oslo_concurrency.lockutils [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquired lock "refresh_cache-f2d1fdc0-baaf-4566-8655-aafdbcf1f473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:31:13 np0005593234 nova_compute[227762]: 2026-01-23 09:31:13.779 227766 DEBUG nova.network.neutron [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:31:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:31:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:13.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:31:13 np0005593234 nova_compute[227762]: 2026-01-23 09:31:13.929 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:14.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:14 np0005593234 nova_compute[227762]: 2026-01-23 09:31:14.547 227766 DEBUG nova.network.neutron [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:31:15 np0005593234 nova_compute[227762]: 2026-01-23 09:31:15.531 227766 DEBUG nova.network.neutron [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:31:15 np0005593234 nova_compute[227762]: 2026-01-23 09:31:15.568 227766 DEBUG oslo_concurrency.lockutils [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Releasing lock "refresh_cache-f2d1fdc0-baaf-4566-8655-aafdbcf1f473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:31:15 np0005593234 nova_compute[227762]: 2026-01-23 09:31:15.735 227766 DEBUG nova.virt.libvirt.driver [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 23 04:31:15 np0005593234 nova_compute[227762]: 2026-01-23 09:31:15.736 227766 DEBUG nova.virt.libvirt.volume.remotefs [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Creating file /var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473/bed1f85392094ef0a65504bcafb4170e.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 23 04:31:15 np0005593234 nova_compute[227762]: 2026-01-23 09:31:15.736 227766 DEBUG oslo_concurrency.processutils [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473/bed1f85392094ef0a65504bcafb4170e.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:15.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:15 np0005593234 podman[234740]: 2026-01-23 09:31:15.819450184 +0000 UTC m=+0.117527520 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 04:31:16 np0005593234 nova_compute[227762]: 2026-01-23 09:31:16.167 227766 DEBUG oslo_concurrency.processutils [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473/bed1f85392094ef0a65504bcafb4170e.tmp" returned: 1 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:16 np0005593234 nova_compute[227762]: 2026-01-23 09:31:16.169 227766 DEBUG oslo_concurrency.processutils [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473/bed1f85392094ef0a65504bcafb4170e.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 04:31:16 np0005593234 nova_compute[227762]: 2026-01-23 09:31:16.170 227766 DEBUG nova.virt.libvirt.volume.remotefs [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Creating directory /var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473 on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 23 04:31:16 np0005593234 nova_compute[227762]: 2026-01-23 09:31:16.170 227766 DEBUG oslo_concurrency.processutils [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:16.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:16 np0005593234 nova_compute[227762]: 2026-01-23 09:31:16.388 227766 DEBUG oslo_concurrency.processutils [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/f2d1fdc0-baaf-4566-8655-aafdbcf1f473" returned: 0 in 0.218s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:16 np0005593234 nova_compute[227762]: 2026-01-23 09:31:16.392 227766 DEBUG nova.virt.libvirt.driver [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 04:31:17 np0005593234 nova_compute[227762]: 2026-01-23 09:31:17.733 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:31:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:17.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:31:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:18.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:18 np0005593234 nova_compute[227762]: 2026-01-23 09:31:18.932 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:19.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:20.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:31:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:21.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:31:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:31:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:22.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:31:22 np0005593234 nova_compute[227762]: 2026-01-23 09:31:22.735 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:23.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:23 np0005593234 nova_compute[227762]: 2026-01-23 09:31:23.935 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:24.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:31:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:25.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:31:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:26.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:26 np0005593234 nova_compute[227762]: 2026-01-23 09:31:26.437 227766 DEBUG nova.virt.libvirt.driver [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 04:31:27 np0005593234 nova_compute[227762]: 2026-01-23 09:31:27.737 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:27.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:31:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:28.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:31:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:28 np0005593234 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 23 04:31:28 np0005593234 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000a.scope: Consumed 16.163s CPU time.
Jan 23 04:31:28 np0005593234 systemd-machined[195626]: Machine qemu-5-instance-0000000a terminated.
Jan 23 04:31:28 np0005593234 nova_compute[227762]: 2026-01-23 09:31:28.937 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:29 np0005593234 nova_compute[227762]: 2026-01-23 09:31:29.450 227766 INFO nova.virt.libvirt.driver [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Instance shutdown successfully after 13 seconds.#033[00m
Jan 23 04:31:29 np0005593234 nova_compute[227762]: 2026-01-23 09:31:29.454 227766 INFO nova.virt.libvirt.driver [-] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Instance destroyed successfully.#033[00m
Jan 23 04:31:29 np0005593234 nova_compute[227762]: 2026-01-23 09:31:29.457 227766 DEBUG nova.virt.libvirt.driver [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:31:29 np0005593234 nova_compute[227762]: 2026-01-23 09:31:29.457 227766 DEBUG nova.virt.libvirt.driver [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:31:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:29.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:30 np0005593234 nova_compute[227762]: 2026-01-23 09:31:30.292 227766 DEBUG oslo_concurrency.lockutils [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "f2d1fdc0-baaf-4566-8655-aafdbcf1f473-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:30 np0005593234 nova_compute[227762]: 2026-01-23 09:31:30.293 227766 DEBUG oslo_concurrency.lockutils [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "f2d1fdc0-baaf-4566-8655-aafdbcf1f473-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:30 np0005593234 nova_compute[227762]: 2026-01-23 09:31:30.293 227766 DEBUG oslo_concurrency.lockutils [None req-cd8134c6-df22-42b8-953d-6e61fa57fbdb 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "f2d1fdc0-baaf-4566-8655-aafdbcf1f473-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:30.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:31.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:32.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:32 np0005593234 nova_compute[227762]: 2026-01-23 09:31:32.739 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:33 np0005593234 podman[234830]: 2026-01-23 09:31:33.770401386 +0000 UTC m=+0.065072550 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 04:31:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:33.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:33 np0005593234 nova_compute[227762]: 2026-01-23 09:31:33.939 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:34.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:31:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:35.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:31:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e149 e149: 3 total, 3 up, 3 in
Jan 23 04:31:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:36.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:37 np0005593234 nova_compute[227762]: 2026-01-23 09:31:37.741 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:37.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:38.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e149 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:38 np0005593234 nova_compute[227762]: 2026-01-23 09:31:38.941 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:39.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:40 np0005593234 nova_compute[227762]: 2026-01-23 09:31:40.183 227766 DEBUG oslo_concurrency.lockutils [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "f2d1fdc0-baaf-4566-8655-aafdbcf1f473" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:40 np0005593234 nova_compute[227762]: 2026-01-23 09:31:40.183 227766 DEBUG oslo_concurrency.lockutils [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "f2d1fdc0-baaf-4566-8655-aafdbcf1f473" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:40 np0005593234 nova_compute[227762]: 2026-01-23 09:31:40.183 227766 DEBUG nova.compute.manager [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Going to confirm migration 3 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 23 04:31:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:40.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:41 np0005593234 nova_compute[227762]: 2026-01-23 09:31:41.009 227766 DEBUG oslo_concurrency.lockutils [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "refresh_cache-f2d1fdc0-baaf-4566-8655-aafdbcf1f473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:31:41 np0005593234 nova_compute[227762]: 2026-01-23 09:31:41.009 227766 DEBUG oslo_concurrency.lockutils [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquired lock "refresh_cache-f2d1fdc0-baaf-4566-8655-aafdbcf1f473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:31:41 np0005593234 nova_compute[227762]: 2026-01-23 09:31:41.010 227766 DEBUG nova.network.neutron [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:31:41 np0005593234 nova_compute[227762]: 2026-01-23 09:31:41.010 227766 DEBUG nova.objects.instance [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'info_cache' on Instance uuid f2d1fdc0-baaf-4566-8655-aafdbcf1f473 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:31:41 np0005593234 nova_compute[227762]: 2026-01-23 09:31:41.317 227766 DEBUG nova.network.neutron [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:31:41 np0005593234 nova_compute[227762]: 2026-01-23 09:31:41.624 227766 DEBUG nova.network.neutron [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:31:41 np0005593234 nova_compute[227762]: 2026-01-23 09:31:41.655 227766 DEBUG oslo_concurrency.lockutils [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Releasing lock "refresh_cache-f2d1fdc0-baaf-4566-8655-aafdbcf1f473" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:31:41 np0005593234 nova_compute[227762]: 2026-01-23 09:31:41.656 227766 DEBUG nova.objects.instance [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'migration_context' on Instance uuid f2d1fdc0-baaf-4566-8655-aafdbcf1f473 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:31:41 np0005593234 nova_compute[227762]: 2026-01-23 09:31:41.790 227766 DEBUG nova.storage.rbd_utils [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] removing snapshot(nova-resize) on rbd image(f2d1fdc0-baaf-4566-8655-aafdbcf1f473_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:31:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:41.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e150 e150: 3 total, 3 up, 3 in
Jan 23 04:31:42 np0005593234 nova_compute[227762]: 2026-01-23 09:31:42.114 227766 DEBUG oslo_concurrency.lockutils [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:42 np0005593234 nova_compute[227762]: 2026-01-23 09:31:42.115 227766 DEBUG oslo_concurrency.lockutils [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:42 np0005593234 nova_compute[227762]: 2026-01-23 09:31:42.308 227766 DEBUG oslo_concurrency.processutils [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:31:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:42.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:31:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2629187586' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:31:42 np0005593234 nova_compute[227762]: 2026-01-23 09:31:42.744 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:42 np0005593234 nova_compute[227762]: 2026-01-23 09:31:42.750 227766 DEBUG oslo_concurrency.processutils [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:31:42 np0005593234 nova_compute[227762]: 2026-01-23 09:31:42.759 227766 DEBUG nova.compute.provider_tree [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:31:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:31:42.806 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:31:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:31:42.807 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:31:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:31:42.808 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:42 np0005593234 nova_compute[227762]: 2026-01-23 09:31:42.824 227766 DEBUG nova.scheduler.client.report [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:31:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e150 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:43.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:43 np0005593234 nova_compute[227762]: 2026-01-23 09:31:43.923 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160688.9223201, f2d1fdc0-baaf-4566-8655-aafdbcf1f473 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:31:43 np0005593234 nova_compute[227762]: 2026-01-23 09:31:43.923 227766 INFO nova.compute.manager [-] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:31:43 np0005593234 nova_compute[227762]: 2026-01-23 09:31:43.942 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:44 np0005593234 nova_compute[227762]: 2026-01-23 09:31:44.119 227766 DEBUG nova.compute.manager [None req-b3ce441b-5834-42b2-b4c4-584a16483085 - - - - - -] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:31:44 np0005593234 nova_compute[227762]: 2026-01-23 09:31:44.318 227766 DEBUG oslo_concurrency.lockutils [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 2.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:31:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1356027591' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:31:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:31:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1356027591' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:31:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:44.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:44 np0005593234 nova_compute[227762]: 2026-01-23 09:31:44.552 227766 INFO nova.scheduler.client.report [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Deleted allocation for migration cc07d23d-e40d-4b7a-98b0-a8f2611399a1#033[00m
Jan 23 04:31:45 np0005593234 nova_compute[227762]: 2026-01-23 09:31:45.057 227766 DEBUG oslo_concurrency.lockutils [None req-b408b253-8312-4607-9a31-bdff17070632 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "f2d1fdc0-baaf-4566-8655-aafdbcf1f473" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 4.874s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:31:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:45.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:46.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:46 np0005593234 podman[234912]: 2026-01-23 09:31:46.786601278 +0000 UTC m=+0.082689589 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:31:47 np0005593234 nova_compute[227762]: 2026-01-23 09:31:47.745 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:31:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:47.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:31:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:48.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e151 e151: 3 total, 3 up, 3 in
Jan 23 04:31:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:48 np0005593234 nova_compute[227762]: 2026-01-23 09:31:48.944 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:49.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:50.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:51.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:52.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:52 np0005593234 nova_compute[227762]: 2026-01-23 09:31:52.748 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:53.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:54 np0005593234 nova_compute[227762]: 2026-01-23 09:31:54.001 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:31:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:54.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:31:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.0 total, 600.0 interval#012Cumulative writes: 4986 writes, 26K keys, 4986 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.03 MB/s#012Cumulative WAL: 4986 writes, 4986 syncs, 1.00 writes per sync, written: 0.05 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1477 writes, 7196 keys, 1477 commit groups, 1.0 writes per commit group, ingest: 15.42 MB, 0.03 MB/s#012Interval WAL: 1477 writes, 1477 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     48.6      0.64              0.09        14    0.046       0      0       0.0       0.0#012  L6      1/0    8.92 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5     84.9     70.4      1.54              0.55        13    0.118     61K   6862       0.0       0.0#012 Sum      1/0    8.92 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5     59.9     64.0      2.18              0.64        27    0.081     61K   6862       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   5.2     92.2     94.5      0.56              0.13        10    0.056     25K   2526       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0     84.9     70.4      1.54              0.55        13    0.118     61K   6862       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     48.8      0.64              0.09        13    0.049       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 1800.0 total, 600.0 interval#012Flush(GB): cumulative 0.030, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.14 GB write, 0.08 MB/s write, 0.13 GB read, 0.07 MB/s read, 2.2 seconds#012Interval compaction: 0.05 GB write, 0.09 MB/s write, 0.05 GB read, 0.09 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f10fc171f0#2 capacity: 304.00 MB usage: 12.43 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000154 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(714,11.92 MB,3.92084%) FilterBlock(27,177.23 KB,0.0569344%) IndexBlock(27,340.78 KB,0.109472%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 04:31:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:55.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:31:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:56.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:31:57 np0005593234 nova_compute[227762]: 2026-01-23 09:31:57.750 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:57.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:31:58.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:31:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:31:59 np0005593234 nova_compute[227762]: 2026-01-23 09:31:59.004 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:31:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:31:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:31:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:31:59.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:00.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:32:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:01.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:32:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:02.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:02 np0005593234 nova_compute[227762]: 2026-01-23 09:32:02.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:02 np0005593234 nova_compute[227762]: 2026-01-23 09:32:02.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:02 np0005593234 nova_compute[227762]: 2026-01-23 09:32:02.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:02 np0005593234 nova_compute[227762]: 2026-01-23 09:32:02.753 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:32:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:03 np0005593234 nova_compute[227762]: 2026-01-23 09:32:03.782 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:03 np0005593234 nova_compute[227762]: 2026-01-23 09:32:03.782 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:03 np0005593234 nova_compute[227762]: 2026-01-23 09:32:03.782 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:32:03 np0005593234 nova_compute[227762]: 2026-01-23 09:32:03.783 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 04:32:03 np0005593234 nova_compute[227762]: 2026-01-23 09:32:03.783 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:03.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:04 np0005593234 nova_compute[227762]: 2026-01-23 09:32:04.006 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:32:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:32:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1160175219' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:32:04 np0005593234 nova_compute[227762]: 2026-01-23 09:32:04.213 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:04 np0005593234 podman[235020]: 2026-01-23 09:32:04.307345417 +0000 UTC m=+0.047373548 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 04:32:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:04.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:04 np0005593234 nova_compute[227762]: 2026-01-23 09:32:04.645 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 04:32:04 np0005593234 nova_compute[227762]: 2026-01-23 09:32:04.646 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 04:32:04 np0005593234 nova_compute[227762]: 2026-01-23 09:32:04.775 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 04:32:04 np0005593234 nova_compute[227762]: 2026-01-23 09:32:04.776 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4684MB free_disk=20.83100128173828GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 04:32:04 np0005593234 nova_compute[227762]: 2026-01-23 09:32:04.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:04 np0005593234 nova_compute[227762]: 2026-01-23 09:32:04.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:32:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:05.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:32:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:06 np0005593234 nova_compute[227762]: 2026-01-23 09:32:06.424 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance f3277436-85d0-4674-aa69-d7a50448a5d0 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 04:32:06 np0005593234 nova_compute[227762]: 2026-01-23 09:32:06.424 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 04:32:06 np0005593234 nova_compute[227762]: 2026-01-23 09:32:06.425 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 04:32:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:06.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:06 np0005593234 nova_compute[227762]: 2026-01-23 09:32:06.489 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:32:06 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/803550551' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:32:06 np0005593234 nova_compute[227762]: 2026-01-23 09:32:06.916 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:06 np0005593234 nova_compute[227762]: 2026-01-23 09:32:06.922 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:32:06 np0005593234 nova_compute[227762]: 2026-01-23 09:32:06.955 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:32:06 np0005593234 nova_compute[227762]: 2026-01-23 09:32:06.996 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 04:32:06 np0005593234 nova_compute[227762]: 2026-01-23 09:32:06.997 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:32:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:07 np0005593234 nova_compute[227762]: 2026-01-23 09:32:07.754 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:32:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:07.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:07 np0005593234 nova_compute[227762]: 2026-01-23 09:32:07.996 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:07 np0005593234 nova_compute[227762]: 2026-01-23 09:32:07.997 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 04:32:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:08.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:08 np0005593234 nova_compute[227762]: 2026-01-23 09:32:08.480 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: f2d1fdc0-baaf-4566-8655-aafdbcf1f473] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902
Jan 23 04:32:08 np0005593234 nova_compute[227762]: 2026-01-23 09:32:08.480 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 04:32:08 np0005593234 nova_compute[227762]: 2026-01-23 09:32:08.481 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:08 np0005593234 nova_compute[227762]: 2026-01-23 09:32:08.481 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:08 np0005593234 nova_compute[227762]: 2026-01-23 09:32:08.481 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:08 np0005593234 nova_compute[227762]: 2026-01-23 09:32:08.481 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 04:32:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:08 np0005593234 nova_compute[227762]: 2026-01-23 09:32:08.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:08 np0005593234 nova_compute[227762]: 2026-01-23 09:32:08.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:32:09 np0005593234 nova_compute[227762]: 2026-01-23 09:32:09.009 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:32:09 np0005593234 nova_compute[227762]: 2026-01-23 09:32:09.298 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:32:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:32:09.297 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 04:32:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:32:09.299 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 04:32:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:32:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:09.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:32:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:10.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:32:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:32:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:11.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/228397196' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:12.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:12 np0005593234 nova_compute[227762]: 2026-01-23 09:32:12.755 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:32:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:13.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:14 np0005593234 nova_compute[227762]: 2026-01-23 09:32:14.073 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:32:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:14.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 23 04:32:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:32:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:15.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:32:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:32:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:32:16.301 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:32:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:16.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:17 np0005593234 nova_compute[227762]: 2026-01-23 09:32:17.517 227766 DEBUG nova.compute.manager [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 23 04:32:17 np0005593234 nova_compute[227762]: 2026-01-23 09:32:17.632 227766 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:17 np0005593234 nova_compute[227762]: 2026-01-23 09:32:17.633 227766 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:17 np0005593234 nova_compute[227762]: 2026-01-23 09:32:17.657 227766 DEBUG nova.objects.instance [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'pci_requests' on Instance uuid 0fb415e8-9c82-4021-9088-cfd399d453a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:17 np0005593234 nova_compute[227762]: 2026-01-23 09:32:17.673 227766 DEBUG nova.virt.hardware [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:32:17 np0005593234 nova_compute[227762]: 2026-01-23 09:32:17.673 227766 INFO nova.compute.claims [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:32:17 np0005593234 nova_compute[227762]: 2026-01-23 09:32:17.673 227766 DEBUG nova.objects.instance [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'resources' on Instance uuid 0fb415e8-9c82-4021-9088-cfd399d453a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:17 np0005593234 nova_compute[227762]: 2026-01-23 09:32:17.688 227766 DEBUG nova.objects.instance [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0fb415e8-9c82-4021-9088-cfd399d453a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:17 np0005593234 nova_compute[227762]: 2026-01-23 09:32:17.747 227766 INFO nova.compute.resource_tracker [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Updating resource usage from migration 47980d7d-34af-4e57-9dbf-4f58fd30ae9c#033[00m
Jan 23 04:32:17 np0005593234 nova_compute[227762]: 2026-01-23 09:32:17.748 227766 DEBUG nova.compute.resource_tracker [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Starting to track incoming migration 47980d7d-34af-4e57-9dbf-4f58fd30ae9c with flavor eebea5f8-9b11-45ad-873d-c4ea90d3de87 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 23 04:32:17 np0005593234 nova_compute[227762]: 2026-01-23 09:32:17.756 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:17 np0005593234 podman[235417]: 2026-01-23 09:32:17.781496383 +0000 UTC m=+0.076094583 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 23 04:32:17 np0005593234 nova_compute[227762]: 2026-01-23 09:32:17.887 227766 DEBUG oslo_concurrency.processutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:32:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:17.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:32:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:32:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3438858860' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:32:18 np0005593234 nova_compute[227762]: 2026-01-23 09:32:18.309 227766 DEBUG oslo_concurrency.processutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:18 np0005593234 nova_compute[227762]: 2026-01-23 09:32:18.315 227766 DEBUG nova.compute.provider_tree [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:32:18 np0005593234 nova_compute[227762]: 2026-01-23 09:32:18.342 227766 DEBUG nova.scheduler.client.report [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:32:18 np0005593234 nova_compute[227762]: 2026-01-23 09:32:18.387 227766 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.755s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:18 np0005593234 nova_compute[227762]: 2026-01-23 09:32:18.388 227766 INFO nova.compute.manager [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Migrating#033[00m
Jan 23 04:32:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:32:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:18.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:32:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:19 np0005593234 nova_compute[227762]: 2026-01-23 09:32:19.075 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:19 np0005593234 systemd-logind[794]: New session 54 of user nova.
Jan 23 04:32:19 np0005593234 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 04:32:19 np0005593234 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 04:32:19 np0005593234 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 04:32:19 np0005593234 systemd[1]: Starting User Manager for UID 42436...
Jan 23 04:32:19 np0005593234 systemd[235469]: Queued start job for default target Main User Target.
Jan 23 04:32:19 np0005593234 systemd[235469]: Created slice User Application Slice.
Jan 23 04:32:19 np0005593234 systemd[235469]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:32:19 np0005593234 systemd[235469]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:32:19 np0005593234 systemd[235469]: Reached target Paths.
Jan 23 04:32:19 np0005593234 systemd[235469]: Reached target Timers.
Jan 23 04:32:19 np0005593234 systemd[235469]: Starting D-Bus User Message Bus Socket...
Jan 23 04:32:19 np0005593234 systemd[235469]: Starting Create User's Volatile Files and Directories...
Jan 23 04:32:19 np0005593234 systemd[235469]: Finished Create User's Volatile Files and Directories.
Jan 23 04:32:19 np0005593234 systemd[235469]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:32:19 np0005593234 systemd[235469]: Reached target Sockets.
Jan 23 04:32:19 np0005593234 systemd[235469]: Reached target Basic System.
Jan 23 04:32:19 np0005593234 systemd[235469]: Reached target Main User Target.
Jan 23 04:32:19 np0005593234 systemd[235469]: Startup finished in 130ms.
Jan 23 04:32:19 np0005593234 systemd[1]: Started User Manager for UID 42436.
Jan 23 04:32:19 np0005593234 systemd[1]: Started Session 54 of User nova.
Jan 23 04:32:19 np0005593234 systemd[1]: session-54.scope: Deactivated successfully.
Jan 23 04:32:19 np0005593234 systemd-logind[794]: Session 54 logged out. Waiting for processes to exit.
Jan 23 04:32:19 np0005593234 systemd-logind[794]: Removed session 54.
Jan 23 04:32:19 np0005593234 systemd-logind[794]: New session 56 of user nova.
Jan 23 04:32:19 np0005593234 systemd[1]: Started Session 56 of User nova.
Jan 23 04:32:19 np0005593234 systemd[1]: session-56.scope: Deactivated successfully.
Jan 23 04:32:19 np0005593234 systemd-logind[794]: Session 56 logged out. Waiting for processes to exit.
Jan 23 04:32:19 np0005593234 systemd-logind[794]: Removed session 56.
Jan 23 04:32:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:19.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:20.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:20 np0005593234 nova_compute[227762]: 2026-01-23 09:32:20.491 227766 DEBUG oslo_concurrency.lockutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Acquiring lock "bbdd11d9-762b-449b-aaee-7037e6742838" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:20 np0005593234 nova_compute[227762]: 2026-01-23 09:32:20.492 227766 DEBUG oslo_concurrency.lockutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "bbdd11d9-762b-449b-aaee-7037e6742838" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:20 np0005593234 nova_compute[227762]: 2026-01-23 09:32:20.512 227766 DEBUG nova.compute.manager [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:32:20 np0005593234 nova_compute[227762]: 2026-01-23 09:32:20.601 227766 DEBUG oslo_concurrency.lockutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:20 np0005593234 nova_compute[227762]: 2026-01-23 09:32:20.602 227766 DEBUG oslo_concurrency.lockutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:20 np0005593234 nova_compute[227762]: 2026-01-23 09:32:20.611 227766 DEBUG nova.virt.hardware [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:32:20 np0005593234 nova_compute[227762]: 2026-01-23 09:32:20.611 227766 INFO nova.compute.claims [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:32:20 np0005593234 nova_compute[227762]: 2026-01-23 09:32:20.764 227766 DEBUG oslo_concurrency.processutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:32:21 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2717954059' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.191 227766 DEBUG oslo_concurrency.processutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.197 227766 DEBUG nova.compute.provider_tree [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.217 227766 DEBUG nova.scheduler.client.report [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.260 227766 DEBUG oslo_concurrency.lockutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.658s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.260 227766 DEBUG nova.compute.manager [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.326 227766 DEBUG nova.compute.manager [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.343 227766 INFO nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.370 227766 DEBUG nova.compute.manager [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.500 227766 DEBUG nova.compute.manager [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.502 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.502 227766 INFO nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Creating image(s)#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.529 227766 DEBUG nova.storage.rbd_utils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.558 227766 DEBUG nova.storage.rbd_utils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.715 227766 DEBUG nova.storage.rbd_utils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.719 227766 DEBUG oslo_concurrency.processutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.780 227766 DEBUG oslo_concurrency.processutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.781 227766 DEBUG oslo_concurrency.lockutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.782 227766 DEBUG oslo_concurrency.lockutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.782 227766 DEBUG oslo_concurrency.lockutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.809 227766 DEBUG nova.storage.rbd_utils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:21 np0005593234 nova_compute[227762]: 2026-01-23 09:32:21.813 227766 DEBUG oslo_concurrency.processutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 bbdd11d9-762b-449b-aaee-7037e6742838_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:21.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.156 227766 DEBUG oslo_concurrency.processutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 bbdd11d9-762b-449b-aaee-7037e6742838_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.343s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.221 227766 DEBUG nova.storage.rbd_utils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] resizing rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.320 227766 DEBUG nova.objects.instance [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lazy-loading 'migration_context' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:22.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.675 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.676 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Ensure instance console log exists: /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.676 227766 DEBUG oslo_concurrency.lockutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.677 227766 DEBUG oslo_concurrency.lockutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.677 227766 DEBUG oslo_concurrency.lockutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.678 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.682 227766 WARNING nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.687 227766 DEBUG nova.virt.libvirt.host [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.687 227766 DEBUG nova.virt.libvirt.host [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.690 227766 DEBUG nova.virt.libvirt.host [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.691 227766 DEBUG nova.virt.libvirt.host [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.692 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.692 227766 DEBUG nova.virt.hardware [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.693 227766 DEBUG nova.virt.hardware [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.693 227766 DEBUG nova.virt.hardware [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.693 227766 DEBUG nova.virt.hardware [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.694 227766 DEBUG nova.virt.hardware [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.694 227766 DEBUG nova.virt.hardware [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.694 227766 DEBUG nova.virt.hardware [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.695 227766 DEBUG nova.virt.hardware [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.695 227766 DEBUG nova.virt.hardware [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.695 227766 DEBUG nova.virt.hardware [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.695 227766 DEBUG nova.virt.hardware [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.698 227766 DEBUG oslo_concurrency.processutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:22 np0005593234 nova_compute[227762]: 2026-01-23 09:32:22.759 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1447678976' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:23 np0005593234 nova_compute[227762]: 2026-01-23 09:32:23.139 227766 DEBUG oslo_concurrency.processutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:23 np0005593234 nova_compute[227762]: 2026-01-23 09:32:23.164 227766 DEBUG nova.storage.rbd_utils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:23 np0005593234 nova_compute[227762]: 2026-01-23 09:32:23.167 227766 DEBUG oslo_concurrency.processutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3407275796' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:23 np0005593234 nova_compute[227762]: 2026-01-23 09:32:23.608 227766 DEBUG oslo_concurrency.processutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:23 np0005593234 nova_compute[227762]: 2026-01-23 09:32:23.610 227766 DEBUG nova.objects.instance [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lazy-loading 'pci_devices' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:23 np0005593234 nova_compute[227762]: 2026-01-23 09:32:23.812 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  <uuid>bbdd11d9-762b-449b-aaee-7037e6742838</uuid>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  <name>instance-00000010</name>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServersAdmin275Test-server-2025545369</nova:name>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:32:22</nova:creationTime>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <nova:user uuid="0772eb0a2fe14d3182d80050e5d94826">tempest-ServersAdmin275Test-1983842130-project-member</nova:user>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <nova:project uuid="430b9216a12b4d41a4333023b00acdff">tempest-ServersAdmin275Test-1983842130</nova:project>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <entry name="serial">bbdd11d9-762b-449b-aaee-7037e6742838</entry>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <entry name="uuid">bbdd11d9-762b-449b-aaee-7037e6742838</entry>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/bbdd11d9-762b-449b-aaee-7037e6742838_disk">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/bbdd11d9-762b-449b-aaee-7037e6742838_disk.config">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/console.log" append="off"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:32:23 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:32:23 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:32:23 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:32:23 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:32:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:32:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:23.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:32:24 np0005593234 nova_compute[227762]: 2026-01-23 09:32:24.078 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:24 np0005593234 nova_compute[227762]: 2026-01-23 09:32:24.123 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:32:24 np0005593234 nova_compute[227762]: 2026-01-23 09:32:24.123 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:32:24 np0005593234 nova_compute[227762]: 2026-01-23 09:32:24.124 227766 INFO nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Using config drive#033[00m
Jan 23 04:32:24 np0005593234 nova_compute[227762]: 2026-01-23 09:32:24.178 227766 DEBUG nova.storage.rbd_utils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:24.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:24 np0005593234 nova_compute[227762]: 2026-01-23 09:32:24.851 227766 INFO nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Creating config drive at /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config#033[00m
Jan 23 04:32:24 np0005593234 nova_compute[227762]: 2026-01-23 09:32:24.858 227766 DEBUG oslo_concurrency.processutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphs0mnmxi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:24 np0005593234 nova_compute[227762]: 2026-01-23 09:32:24.998 227766 DEBUG oslo_concurrency.processutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphs0mnmxi" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:25 np0005593234 nova_compute[227762]: 2026-01-23 09:32:25.143 227766 DEBUG nova.storage.rbd_utils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:25 np0005593234 nova_compute[227762]: 2026-01-23 09:32:25.146 227766 DEBUG oslo_concurrency.processutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config bbdd11d9-762b-449b-aaee-7037e6742838_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:25 np0005593234 nova_compute[227762]: 2026-01-23 09:32:25.579 227766 DEBUG oslo_concurrency.processutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config bbdd11d9-762b-449b-aaee-7037e6742838_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:25 np0005593234 nova_compute[227762]: 2026-01-23 09:32:25.580 227766 INFO nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Deleting local config drive /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config because it was imported into RBD.#033[00m
Jan 23 04:32:25 np0005593234 systemd-machined[195626]: New machine qemu-6-instance-00000010.
Jan 23 04:32:25 np0005593234 systemd[1]: Started Virtual Machine qemu-6-instance-00000010.
Jan 23 04:32:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:32:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:25.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.312 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160746.3111675, bbdd11d9-762b-449b-aaee-7037e6742838 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.314 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.316 227766 DEBUG nova.compute.manager [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.317 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.322 227766 INFO nova.virt.libvirt.driver [-] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance spawned successfully.#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.323 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.340 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.347 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.352 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.353 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.353 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.354 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.354 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.354 227766 DEBUG nova.virt.libvirt.driver [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.393 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.393 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160746.3135045, bbdd11d9-762b-449b-aaee-7037e6742838 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.393 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] VM Started (Lifecycle Event)#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.430 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.434 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:32:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:26.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.460 227766 INFO nova.compute.manager [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Took 4.96 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.460 227766 DEBUG nova.compute.manager [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.461 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.525 227766 INFO nova.compute.manager [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Took 5.95 seconds to build instance.#033[00m
Jan 23 04:32:26 np0005593234 nova_compute[227762]: 2026-01-23 09:32:26.555 227766 DEBUG oslo_concurrency.lockutils [None req-609525c3-a636-4040-83ae-1d160ab4a25c 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "bbdd11d9-762b-449b-aaee-7037e6742838" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:27 np0005593234 nova_compute[227762]: 2026-01-23 09:32:27.761 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:27.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:28 np0005593234 ceph-mgr[77448]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 04:32:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:28.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:29 np0005593234 nova_compute[227762]: 2026-01-23 09:32:29.080 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:32:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 1800.5 total, 600.0 interval#012Cumulative writes: 9889 writes, 40K keys, 9889 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s#012Cumulative WAL: 9889 writes, 2483 syncs, 3.98 writes per sync, written: 0.04 GB, 0.02 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4065 writes, 16K keys, 4065 commit groups, 1.0 writes per commit group, ingest: 18.37 MB, 0.03 MB/s#012Interval WAL: 4065 writes, 1483 syncs, 2.74 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 04:32:29 np0005593234 nova_compute[227762]: 2026-01-23 09:32:29.579 227766 INFO nova.compute.manager [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Rebuilding instance#033[00m
Jan 23 04:32:29 np0005593234 nova_compute[227762]: 2026-01-23 09:32:29.858 227766 DEBUG nova.objects.instance [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lazy-loading 'trusted_certs' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:29 np0005593234 nova_compute[227762]: 2026-01-23 09:32:29.883 227766 DEBUG nova.compute.manager [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:29.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:29 np0005593234 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 04:32:29 np0005593234 systemd[235469]: Activating special unit Exit the Session...
Jan 23 04:32:29 np0005593234 systemd[235469]: Stopped target Main User Target.
Jan 23 04:32:29 np0005593234 systemd[235469]: Stopped target Basic System.
Jan 23 04:32:29 np0005593234 systemd[235469]: Stopped target Paths.
Jan 23 04:32:29 np0005593234 systemd[235469]: Stopped target Sockets.
Jan 23 04:32:29 np0005593234 systemd[235469]: Stopped target Timers.
Jan 23 04:32:29 np0005593234 systemd[235469]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:32:29 np0005593234 systemd[235469]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 04:32:29 np0005593234 systemd[235469]: Closed D-Bus User Message Bus Socket.
Jan 23 04:32:29 np0005593234 systemd[235469]: Stopped Create User's Volatile Files and Directories.
Jan 23 04:32:29 np0005593234 systemd[235469]: Removed slice User Application Slice.
Jan 23 04:32:29 np0005593234 systemd[235469]: Reached target Shutdown.
Jan 23 04:32:29 np0005593234 systemd[235469]: Finished Exit the Session.
Jan 23 04:32:29 np0005593234 systemd[235469]: Reached target Exit the Session.
Jan 23 04:32:29 np0005593234 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 04:32:29 np0005593234 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 04:32:29 np0005593234 nova_compute[227762]: 2026-01-23 09:32:29.947 227766 DEBUG nova.objects.instance [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lazy-loading 'pci_requests' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:29 np0005593234 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 04:32:29 np0005593234 nova_compute[227762]: 2026-01-23 09:32:29.966 227766 DEBUG nova.objects.instance [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lazy-loading 'pci_devices' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:29 np0005593234 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 04:32:29 np0005593234 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 04:32:29 np0005593234 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 04:32:29 np0005593234 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 04:32:29 np0005593234 nova_compute[227762]: 2026-01-23 09:32:29.984 227766 DEBUG nova.objects.instance [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lazy-loading 'resources' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:30 np0005593234 nova_compute[227762]: 2026-01-23 09:32:30.015 227766 DEBUG nova.objects.instance [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lazy-loading 'migration_context' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:30 np0005593234 nova_compute[227762]: 2026-01-23 09:32:30.030 227766 DEBUG nova.objects.instance [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 04:32:30 np0005593234 nova_compute[227762]: 2026-01-23 09:32:30.034 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 04:32:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:30.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:30 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4253371632' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:31.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:32.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:32 np0005593234 nova_compute[227762]: 2026-01-23 09:32:32.763 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:32:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:33.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:32:34 np0005593234 nova_compute[227762]: 2026-01-23 09:32:34.083 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:34.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:34 np0005593234 nova_compute[227762]: 2026-01-23 09:32:34.716 227766 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:32:34 np0005593234 nova_compute[227762]: 2026-01-23 09:32:34.717 227766 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquired lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:32:34 np0005593234 nova_compute[227762]: 2026-01-23 09:32:34.717 227766 DEBUG nova.network.neutron [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:32:34 np0005593234 podman[235916]: 2026-01-23 09:32:34.791518343 +0000 UTC m=+0.077961941 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 04:32:35 np0005593234 nova_compute[227762]: 2026-01-23 09:32:35.061 227766 DEBUG nova.network.neutron [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:32:35 np0005593234 nova_compute[227762]: 2026-01-23 09:32:35.515 227766 DEBUG nova.network.neutron [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:32:35 np0005593234 nova_compute[227762]: 2026-01-23 09:32:35.550 227766 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Releasing lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:32:35 np0005593234 nova_compute[227762]: 2026-01-23 09:32:35.778 227766 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 23 04:32:35 np0005593234 nova_compute[227762]: 2026-01-23 09:32:35.779 227766 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 04:32:35 np0005593234 nova_compute[227762]: 2026-01-23 09:32:35.780 227766 INFO nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Creating image(s)#033[00m
Jan 23 04:32:35 np0005593234 nova_compute[227762]: 2026-01-23 09:32:35.815 227766 DEBUG nova.storage.rbd_utils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] creating snapshot(nova-resize) on rbd image(0fb415e8-9c82-4021-9088-cfd399d453a0_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:32:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:32:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:35.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:32:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e152 e152: 3 total, 3 up, 3 in
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.085 227766 DEBUG nova.objects.instance [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 0fb415e8-9c82-4021-9088-cfd399d453a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:32:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:36.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.469 227766 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.469 227766 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Ensure instance console log exists: /var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.470 227766 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.470 227766 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.471 227766 DEBUG oslo_concurrency.lockutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.472 227766 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.477 227766 WARNING nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.485 227766 DEBUG nova.virt.libvirt.host [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.486 227766 DEBUG nova.virt.libvirt.host [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.490 227766 DEBUG nova.virt.libvirt.host [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.491 227766 DEBUG nova.virt.libvirt.host [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.492 227766 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.492 227766 DEBUG nova.virt.hardware [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eebea5f8-9b11-45ad-873d-c4ea90d3de87',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.493 227766 DEBUG nova.virt.hardware [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.493 227766 DEBUG nova.virt.hardware [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.493 227766 DEBUG nova.virt.hardware [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.494 227766 DEBUG nova.virt.hardware [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.494 227766 DEBUG nova.virt.hardware [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.494 227766 DEBUG nova.virt.hardware [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.494 227766 DEBUG nova.virt.hardware [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.495 227766 DEBUG nova.virt.hardware [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.495 227766 DEBUG nova.virt.hardware [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.495 227766 DEBUG nova.virt.hardware [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.496 227766 DEBUG nova.objects.instance [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 0fb415e8-9c82-4021-9088-cfd399d453a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.537 227766 DEBUG oslo_concurrency.processutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:36 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/296656050' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:36 np0005593234 nova_compute[227762]: 2026-01-23 09:32:36.980 227766 DEBUG oslo_concurrency.processutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:37 np0005593234 nova_compute[227762]: 2026-01-23 09:32:37.024 227766 DEBUG oslo_concurrency.processutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4150379953' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:37 np0005593234 nova_compute[227762]: 2026-01-23 09:32:37.445 227766 DEBUG oslo_concurrency.processutils [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:37 np0005593234 nova_compute[227762]: 2026-01-23 09:32:37.450 227766 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  <uuid>0fb415e8-9c82-4021-9088-cfd399d453a0</uuid>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  <name>instance-0000000e</name>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  <memory>196608</memory>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <nova:name>tempest-MigrationsAdminTest-server-2110965880</nova:name>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:32:36</nova:creationTime>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.micro">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <nova:memory>192</nova:memory>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <nova:user uuid="7536fa2e625541fba613dc32a49a4c5b">tempest-MigrationsAdminTest-2056264627-project-member</nova:user>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <nova:project uuid="11def90dfdc14cfe928302bec2835794">tempest-MigrationsAdminTest-2056264627</nova:project>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <entry name="serial">0fb415e8-9c82-4021-9088-cfd399d453a0</entry>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <entry name="uuid">0fb415e8-9c82-4021-9088-cfd399d453a0</entry>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/0fb415e8-9c82-4021-9088-cfd399d453a0_disk">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/0fb415e8-9c82-4021-9088-cfd399d453a0_disk.config">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/0fb415e8-9c82-4021-9088-cfd399d453a0/console.log" append="off"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:32:37 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:32:37 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:32:37 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:32:37 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:32:37 np0005593234 nova_compute[227762]: 2026-01-23 09:32:37.540 227766 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:32:37 np0005593234 nova_compute[227762]: 2026-01-23 09:32:37.540 227766 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:32:37 np0005593234 nova_compute[227762]: 2026-01-23 09:32:37.541 227766 INFO nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Using config drive#033[00m
Jan 23 04:32:37 np0005593234 systemd-machined[195626]: New machine qemu-7-instance-0000000e.
Jan 23 04:32:37 np0005593234 systemd[1]: Started Virtual Machine qemu-7-instance-0000000e.
Jan 23 04:32:37 np0005593234 nova_compute[227762]: 2026-01-23 09:32:37.764 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:37.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:32:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:38.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:32:38 np0005593234 nova_compute[227762]: 2026-01-23 09:32:38.554 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160758.5538275, 0fb415e8-9c82-4021-9088-cfd399d453a0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:38 np0005593234 nova_compute[227762]: 2026-01-23 09:32:38.555 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:32:38 np0005593234 nova_compute[227762]: 2026-01-23 09:32:38.558 227766 DEBUG nova.compute.manager [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:32:38 np0005593234 nova_compute[227762]: 2026-01-23 09:32:38.563 227766 INFO nova.virt.libvirt.driver [-] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance running successfully.#033[00m
Jan 23 04:32:38 np0005593234 virtqemud[227483]: argument unsupported: QEMU guest agent is not configured
Jan 23 04:32:38 np0005593234 nova_compute[227762]: 2026-01-23 09:32:38.565 227766 DEBUG nova.virt.libvirt.guest [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 23 04:32:38 np0005593234 nova_compute[227762]: 2026-01-23 09:32:38.566 227766 DEBUG nova.virt.libvirt.driver [None req-e1c7ff27-4d9b-433d-8a45-1d888768e584 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 23 04:32:38 np0005593234 nova_compute[227762]: 2026-01-23 09:32:38.593 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:38 np0005593234 nova_compute[227762]: 2026-01-23 09:32:38.601 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:32:38 np0005593234 nova_compute[227762]: 2026-01-23 09:32:38.644 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 23 04:32:38 np0005593234 nova_compute[227762]: 2026-01-23 09:32:38.644 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160758.5549266, 0fb415e8-9c82-4021-9088-cfd399d453a0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:38 np0005593234 nova_compute[227762]: 2026-01-23 09:32:38.644 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] VM Started (Lifecycle Event)#033[00m
Jan 23 04:32:38 np0005593234 nova_compute[227762]: 2026-01-23 09:32:38.675 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:38 np0005593234 nova_compute[227762]: 2026-01-23 09:32:38.678 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:32:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:39 np0005593234 nova_compute[227762]: 2026-01-23 09:32:39.101 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:32:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:39.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:32:40 np0005593234 nova_compute[227762]: 2026-01-23 09:32:40.082 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 04:32:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:32:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:40.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:32:40 np0005593234 nova_compute[227762]: 2026-01-23 09:32:40.591 227766 DEBUG oslo_concurrency.lockutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:32:40 np0005593234 nova_compute[227762]: 2026-01-23 09:32:40.592 227766 DEBUG oslo_concurrency.lockutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquired lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:32:40 np0005593234 nova_compute[227762]: 2026-01-23 09:32:40.592 227766 DEBUG nova.network.neutron [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:32:40 np0005593234 nova_compute[227762]: 2026-01-23 09:32:40.872 227766 DEBUG nova.network.neutron [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:32:41 np0005593234 nova_compute[227762]: 2026-01-23 09:32:41.836 227766 DEBUG nova.network.neutron [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:32:41 np0005593234 nova_compute[227762]: 2026-01-23 09:32:41.861 227766 DEBUG oslo_concurrency.lockutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Releasing lock "refresh_cache-0fb415e8-9c82-4021-9088-cfd399d453a0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:32:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:41.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:42 np0005593234 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Jan 23 04:32:42 np0005593234 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000e.scope: Consumed 4.337s CPU time.
Jan 23 04:32:42 np0005593234 systemd-machined[195626]: Machine qemu-7-instance-0000000e terminated.
Jan 23 04:32:42 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Jan 23 04:32:42 np0005593234 nova_compute[227762]: 2026-01-23 09:32:42.307 227766 INFO nova.virt.libvirt.driver [-] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Instance destroyed successfully.
Jan 23 04:32:42 np0005593234 nova_compute[227762]: 2026-01-23 09:32:42.308 227766 DEBUG nova.objects.instance [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'resources' on Instance uuid 0fb415e8-9c82-4021-9088-cfd399d453a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:32:42 np0005593234 nova_compute[227762]: 2026-01-23 09:32:42.334 227766 DEBUG oslo_concurrency.lockutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:42 np0005593234 nova_compute[227762]: 2026-01-23 09:32:42.334 227766 DEBUG oslo_concurrency.lockutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:42 np0005593234 nova_compute[227762]: 2026-01-23 09:32:42.357 227766 DEBUG nova.objects.instance [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'migration_context' on Instance uuid 0fb415e8-9c82-4021-9088-cfd399d453a0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:32:42 np0005593234 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000010.scope: Deactivated successfully.
Jan 23 04:32:42 np0005593234 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000010.scope: Consumed 13.515s CPU time.
Jan 23 04:32:42 np0005593234 systemd-machined[195626]: Machine qemu-6-instance-00000010 terminated.
Jan 23 04:32:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:42.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:42 np0005593234 nova_compute[227762]: 2026-01-23 09:32:42.509 227766 DEBUG oslo_concurrency.processutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:42 np0005593234 nova_compute[227762]: 2026-01-23 09:32:42.766 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:32:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:32:42.807 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:32:42.808 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:32:42.808 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:32:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:32:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1773655518' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:32:42 np0005593234 nova_compute[227762]: 2026-01-23 09:32:42.957 227766 DEBUG oslo_concurrency.processutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:42 np0005593234 nova_compute[227762]: 2026-01-23 09:32:42.969 227766 DEBUG nova.compute.provider_tree [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:32:42 np0005593234 nova_compute[227762]: 2026-01-23 09:32:42.998 227766 DEBUG nova.scheduler.client.report [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:32:43 np0005593234 nova_compute[227762]: 2026-01-23 09:32:43.086 227766 DEBUG oslo_concurrency.lockutils [None req-e0edf52a-dc07-4097-80b3-66cc32c02770 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.752s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:32:43 np0005593234 nova_compute[227762]: 2026-01-23 09:32:43.097 227766 INFO nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance shutdown successfully after 13 seconds.
Jan 23 04:32:43 np0005593234 nova_compute[227762]: 2026-01-23 09:32:43.103 227766 INFO nova.virt.libvirt.driver [-] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance destroyed successfully.
Jan 23 04:32:43 np0005593234 nova_compute[227762]: 2026-01-23 09:32:43.109 227766 INFO nova.virt.libvirt.driver [-] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance destroyed successfully.
Jan 23 04:32:43 np0005593234 nova_compute[227762]: 2026-01-23 09:32:43.527 227766 INFO nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Deleting instance files /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838_del
Jan 23 04:32:43 np0005593234 nova_compute[227762]: 2026-01-23 09:32:43.528 227766 INFO nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Deletion of /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838_del complete
Jan 23 04:32:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e152 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:43 np0005593234 nova_compute[227762]: 2026-01-23 09:32:43.765 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 04:32:43 np0005593234 nova_compute[227762]: 2026-01-23 09:32:43.766 227766 INFO nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Creating image(s)
Jan 23 04:32:43 np0005593234 nova_compute[227762]: 2026-01-23 09:32:43.792 227766 DEBUG nova.storage.rbd_utils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:32:43 np0005593234 nova_compute[227762]: 2026-01-23 09:32:43.819 227766 DEBUG nova.storage.rbd_utils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:32:43 np0005593234 nova_compute[227762]: 2026-01-23 09:32:43.847 227766 DEBUG nova.storage.rbd_utils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:32:43 np0005593234 nova_compute[227762]: 2026-01-23 09:32:43.852 227766 DEBUG oslo_concurrency.lockutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Acquiring lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:43 np0005593234 nova_compute[227762]: 2026-01-23 09:32:43.853 227766 DEBUG oslo_concurrency.lockutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:43.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:44 np0005593234 nova_compute[227762]: 2026-01-23 09:32:44.103 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:32:44 np0005593234 nova_compute[227762]: 2026-01-23 09:32:44.252 227766 DEBUG nova.virt.libvirt.imagebackend [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/ae1f9e37-418c-462f-81d1-3599a6d89de9/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/ae1f9e37-418c-462f-81d1-3599a6d89de9/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Jan 23 04:32:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:32:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3770974501' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:32:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:32:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3770974501' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:32:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:32:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:44.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:32:45 np0005593234 nova_compute[227762]: 2026-01-23 09:32:45.830 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:45 np0005593234 nova_compute[227762]: 2026-01-23 09:32:45.890 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.part --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:45 np0005593234 nova_compute[227762]: 2026-01-23 09:32:45.891 227766 DEBUG nova.virt.images [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] ae1f9e37-418c-462f-81d1-3599a6d89de9 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 23 04:32:45 np0005593234 nova_compute[227762]: 2026-01-23 09:32:45.892 227766 DEBUG nova.privsep.utils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 04:32:45 np0005593234 nova_compute[227762]: 2026-01-23 09:32:45.893 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.part /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:45.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.041 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.part /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.converted" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.048 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.107 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20.converted --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.108 227766 DEBUG oslo_concurrency.lockutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.134 227766 DEBUG nova.storage.rbd_utils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.137 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 bbdd11d9-762b-449b-aaee-7037e6742838_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e153 e153: 3 total, 3 up, 3 in
Jan 23 04:32:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:46.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.499 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 bbdd11d9-762b-449b-aaee-7037e6742838_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.363s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.566 227766 DEBUG nova.storage.rbd_utils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] resizing rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.676 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.676 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Ensure instance console log exists: /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.677 227766 DEBUG oslo_concurrency.lockutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.677 227766 DEBUG oslo_concurrency.lockutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.678 227766 DEBUG oslo_concurrency.lockutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.679 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.683 227766 WARNING nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.689 227766 DEBUG nova.virt.libvirt.host [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.689 227766 DEBUG nova.virt.libvirt.host [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.693 227766 DEBUG nova.virt.libvirt.host [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.693 227766 DEBUG nova.virt.libvirt.host [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.695 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.695 227766 DEBUG nova.virt.hardware [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.695 227766 DEBUG nova.virt.hardware [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.696 227766 DEBUG nova.virt.hardware [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.696 227766 DEBUG nova.virt.hardware [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.696 227766 DEBUG nova.virt.hardware [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.697 227766 DEBUG nova.virt.hardware [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.697 227766 DEBUG nova.virt.hardware [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.698 227766 DEBUG nova.virt.hardware [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.698 227766 DEBUG nova.virt.hardware [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.699 227766 DEBUG nova.virt.hardware [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.699 227766 DEBUG nova.virt.hardware [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.700 227766 DEBUG nova.objects.instance [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lazy-loading 'vcpu_model' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:32:46 np0005593234 nova_compute[227762]: 2026-01-23 09:32:46.735 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:32:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3793678276' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:47 np0005593234 nova_compute[227762]: 2026-01-23 09:32:47.168 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:47 np0005593234 nova_compute[227762]: 2026-01-23 09:32:47.197 227766 DEBUG nova.storage.rbd_utils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:47 np0005593234 nova_compute[227762]: 2026-01-23 09:32:47.202 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:32:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2074093515' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:32:47 np0005593234 nova_compute[227762]: 2026-01-23 09:32:47.640 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:47 np0005593234 nova_compute[227762]: 2026-01-23 09:32:47.644 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  <uuid>bbdd11d9-762b-449b-aaee-7037e6742838</uuid>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  <name>instance-00000010</name>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServersAdmin275Test-server-2025545369</nova:name>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:32:46</nova:creationTime>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <nova:user uuid="0772eb0a2fe14d3182d80050e5d94826">tempest-ServersAdmin275Test-1983842130-project-member</nova:user>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <nova:project uuid="430b9216a12b4d41a4333023b00acdff">tempest-ServersAdmin275Test-1983842130</nova:project>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="ae1f9e37-418c-462f-81d1-3599a6d89de9"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <entry name="serial">bbdd11d9-762b-449b-aaee-7037e6742838</entry>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <entry name="uuid">bbdd11d9-762b-449b-aaee-7037e6742838</entry>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/bbdd11d9-762b-449b-aaee-7037e6742838_disk">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/bbdd11d9-762b-449b-aaee-7037e6742838_disk.config">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/console.log" append="off"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:32:47 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:32:47 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:32:47 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:32:47 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:32:47 np0005593234 nova_compute[227762]: 2026-01-23 09:32:47.768 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:47 np0005593234 nova_compute[227762]: 2026-01-23 09:32:47.875 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:32:47 np0005593234 nova_compute[227762]: 2026-01-23 09:32:47.876 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:32:47 np0005593234 nova_compute[227762]: 2026-01-23 09:32:47.876 227766 INFO nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Using config drive#033[00m
Jan 23 04:32:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:47.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:47 np0005593234 nova_compute[227762]: 2026-01-23 09:32:47.950 227766 DEBUG nova.storage.rbd_utils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:47 np0005593234 nova_compute[227762]: 2026-01-23 09:32:47.976 227766 DEBUG nova.objects.instance [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lazy-loading 'ec2_ids' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:48 np0005593234 nova_compute[227762]: 2026-01-23 09:32:48.086 227766 DEBUG nova.objects.instance [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lazy-loading 'keypairs' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:32:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:48.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:32:48 np0005593234 nova_compute[227762]: 2026-01-23 09:32:48.711 227766 INFO nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Creating config drive at /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config#033[00m
Jan 23 04:32:48 np0005593234 nova_compute[227762]: 2026-01-23 09:32:48.717 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu6smo20g execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e153 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:48 np0005593234 podman[236455]: 2026-01-23 09:32:48.778469681 +0000 UTC m=+0.076869137 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:32:48 np0005593234 nova_compute[227762]: 2026-01-23 09:32:48.844 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu6smo20g" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:48 np0005593234 nova_compute[227762]: 2026-01-23 09:32:48.874 227766 DEBUG nova.storage.rbd_utils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:32:48 np0005593234 nova_compute[227762]: 2026-01-23 09:32:48.878 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config bbdd11d9-762b-449b-aaee-7037e6742838_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.039 227766 DEBUG oslo_concurrency.processutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config bbdd11d9-762b-449b-aaee-7037e6742838_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.040 227766 INFO nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Deleting local config drive /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config because it was imported into RBD.#033[00m
Jan 23 04:32:49 np0005593234 systemd-machined[195626]: New machine qemu-8-instance-00000010.
Jan 23 04:32:49 np0005593234 systemd[1]: Started Virtual Machine qemu-8-instance-00000010.
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.104 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.541 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for bbdd11d9-762b-449b-aaee-7037e6742838 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.542 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160769.5405452, bbdd11d9-762b-449b-aaee-7037e6742838 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.542 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.545 227766 DEBUG nova.compute.manager [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.545 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.549 227766 INFO nova.virt.libvirt.driver [-] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance spawned successfully.#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.549 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.580 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.585 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.586 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.586 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.586 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.587 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.587 227766 DEBUG nova.virt.libvirt.driver [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.591 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.626 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.627 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160769.5417945, bbdd11d9-762b-449b-aaee-7037e6742838 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.627 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] VM Started (Lifecycle Event)#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.676 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.681 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.695 227766 DEBUG nova.compute.manager [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.736 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.786 227766 DEBUG oslo_concurrency.lockutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.787 227766 DEBUG oslo_concurrency.lockutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.788 227766 DEBUG nova.objects.instance [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 04:32:49 np0005593234 nova_compute[227762]: 2026-01-23 09:32:49.876 227766 DEBUG oslo_concurrency.lockutils [None req-e444debf-315d-4fc1-a596-90ca2f2fe80b 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:32:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:49.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:50.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:51.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:52.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:52 np0005593234 nova_compute[227762]: 2026-01-23 09:32:52.770 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:52 np0005593234 nova_compute[227762]: 2026-01-23 09:32:52.795 227766 INFO nova.compute.manager [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Rebuilding instance#033[00m
Jan 23 04:32:53 np0005593234 nova_compute[227762]: 2026-01-23 09:32:53.326 227766 DEBUG nova.objects.instance [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Lazy-loading 'trusted_certs' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:53 np0005593234 nova_compute[227762]: 2026-01-23 09:32:53.353 227766 DEBUG nova.compute.manager [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:53 np0005593234 nova_compute[227762]: 2026-01-23 09:32:53.443 227766 DEBUG nova.objects.instance [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Lazy-loading 'pci_requests' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e154 e154: 3 total, 3 up, 3 in
Jan 23 04:32:53 np0005593234 nova_compute[227762]: 2026-01-23 09:32:53.464 227766 DEBUG nova.objects.instance [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Lazy-loading 'pci_devices' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:53 np0005593234 nova_compute[227762]: 2026-01-23 09:32:53.481 227766 DEBUG nova.objects.instance [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Lazy-loading 'resources' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:53 np0005593234 nova_compute[227762]: 2026-01-23 09:32:53.514 227766 DEBUG nova.objects.instance [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Lazy-loading 'migration_context' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:32:53 np0005593234 nova_compute[227762]: 2026-01-23 09:32:53.541 227766 DEBUG nova.objects.instance [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 04:32:53 np0005593234 nova_compute[227762]: 2026-01-23 09:32:53.544 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 04:32:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:32:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:53.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:32:54 np0005593234 nova_compute[227762]: 2026-01-23 09:32:54.107 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:32:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:54.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:32:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:55.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:56.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:57 np0005593234 nova_compute[227762]: 2026-01-23 09:32:57.306 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160762.3046422, 0fb415e8-9c82-4021-9088-cfd399d453a0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:32:57 np0005593234 nova_compute[227762]: 2026-01-23 09:32:57.307 227766 INFO nova.compute.manager [-] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:32:57 np0005593234 nova_compute[227762]: 2026-01-23 09:32:57.329 227766 DEBUG nova.compute.manager [None req-b94567d4-32bd-4fa4-895c-2e8312d0b27c - - - - - -] [instance: 0fb415e8-9c82-4021-9088-cfd399d453a0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:32:57 np0005593234 nova_compute[227762]: 2026-01-23 09:32:57.772 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:57.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:32:58.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:32:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:32:59 np0005593234 nova_compute[227762]: 2026-01-23 09:32:59.109 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:32:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:32:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:32:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:32:59.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:33:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:00.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:33:00 np0005593234 nova_compute[227762]: 2026-01-23 09:33:00.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:00 np0005593234 nova_compute[227762]: 2026-01-23 09:33:00.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 04:33:00 np0005593234 nova_compute[227762]: 2026-01-23 09:33:00.777 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 04:33:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:01.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:02.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:02 np0005593234 nova_compute[227762]: 2026-01-23 09:33:02.774 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:02 np0005593234 nova_compute[227762]: 2026-01-23 09:33:02.776 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:02 np0005593234 nova_compute[227762]: 2026-01-23 09:33:02.776 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:02 np0005593234 nova_compute[227762]: 2026-01-23 09:33:02.776 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:02 np0005593234 nova_compute[227762]: 2026-01-23 09:33:02.811 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:02 np0005593234 nova_compute[227762]: 2026-01-23 09:33:02.812 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:02 np0005593234 nova_compute[227762]: 2026-01-23 09:33:02.812 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:02 np0005593234 nova_compute[227762]: 2026-01-23 09:33:02.813 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:33:02 np0005593234 nova_compute[227762]: 2026-01-23 09:33:02.813 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:33:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:33:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/788818493' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.271 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.368 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.368 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.371 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.372 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000010 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.513 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.514 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4395MB free_disk=20.719512939453125GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.514 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.515 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.592 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 04:33:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.748 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance f3277436-85d0-4674-aa69-d7a50448a5d0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.749 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance bbdd11d9-762b-449b-aaee-7037e6742838 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.796 227766 WARNING nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 07c9ba0d-ab3d-4079-ab97-46e91de4911a has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}.#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.796 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.796 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.815 227766 DEBUG nova.compute.manager [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 23 04:33:03 np0005593234 nova_compute[227762]: 2026-01-23 09:33:03.950 227766 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:03.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.105 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.124 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:33:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:04.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:33:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:33:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2913031537' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.541 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.547 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.563 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.584 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.585 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.585 227766 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.607 227766 DEBUG nova.objects.instance [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lazy-loading 'pci_requests' on Instance uuid 07c9ba0d-ab3d-4079-ab97-46e91de4911a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.625 227766 DEBUG nova.virt.hardware [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.625 227766 INFO nova.compute.claims [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.626 227766 DEBUG nova.objects.instance [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lazy-loading 'resources' on Instance uuid 07c9ba0d-ab3d-4079-ab97-46e91de4911a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.639 227766 DEBUG nova.objects.instance [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lazy-loading 'numa_topology' on Instance uuid 07c9ba0d-ab3d-4079-ab97-46e91de4911a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.675 227766 DEBUG nova.objects.instance [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lazy-loading 'pci_devices' on Instance uuid 07c9ba0d-ab3d-4079-ab97-46e91de4911a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.769 227766 INFO nova.compute.resource_tracker [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Updating resource usage from migration 36643aa1-ddbd-427a-beef-052fd4db42bf#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.769 227766 DEBUG nova.compute.resource_tracker [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Starting to track incoming migration 36643aa1-ddbd-427a-beef-052fd4db42bf with flavor 68d42077-c749-4366-ba3e-07758debb02d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 23 04:33:04 np0005593234 nova_compute[227762]: 2026-01-23 09:33:04.884 227766 DEBUG oslo_concurrency.processutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:33:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:33:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1709595499' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:33:05 np0005593234 nova_compute[227762]: 2026-01-23 09:33:05.306 227766 DEBUG oslo_concurrency.processutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:33:05 np0005593234 nova_compute[227762]: 2026-01-23 09:33:05.311 227766 DEBUG nova.compute.provider_tree [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:33:05 np0005593234 nova_compute[227762]: 2026-01-23 09:33:05.334 227766 DEBUG nova.scheduler.client.report [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:33:05 np0005593234 nova_compute[227762]: 2026-01-23 09:33:05.389 227766 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.804s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:05 np0005593234 nova_compute[227762]: 2026-01-23 09:33:05.389 227766 INFO nova.compute.manager [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Migrating#033[00m
Jan 23 04:33:05 np0005593234 nova_compute[227762]: 2026-01-23 09:33:05.548 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:05 np0005593234 nova_compute[227762]: 2026-01-23 09:33:05.573 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:05 np0005593234 nova_compute[227762]: 2026-01-23 09:33:05.573 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:05 np0005593234 nova_compute[227762]: 2026-01-23 09:33:05.573 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:33:05 np0005593234 podman[236704]: 2026-01-23 09:33:05.786801908 +0000 UTC m=+0.077413245 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 04:33:05 np0005593234 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000010.scope: Deactivated successfully.
Jan 23 04:33:05 np0005593234 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000010.scope: Consumed 13.233s CPU time.
Jan 23 04:33:05 np0005593234 systemd-machined[195626]: Machine qemu-8-instance-00000010 terminated.
Jan 23 04:33:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:06.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:06.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:06 np0005593234 nova_compute[227762]: 2026-01-23 09:33:06.611 227766 INFO nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance shutdown successfully after 13 seconds.#033[00m
Jan 23 04:33:06 np0005593234 nova_compute[227762]: 2026-01-23 09:33:06.617 227766 INFO nova.virt.libvirt.driver [-] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance destroyed successfully.#033[00m
Jan 23 04:33:06 np0005593234 nova_compute[227762]: 2026-01-23 09:33:06.621 227766 INFO nova.virt.libvirt.driver [-] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance destroyed successfully.#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.020 227766 INFO nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Deleting instance files /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838_del#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.021 227766 INFO nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Deletion of /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838_del complete#033[00m
Jan 23 04:33:07 np0005593234 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 04:33:07 np0005593234 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 04:33:07 np0005593234 systemd-logind[794]: New session 57 of user nova.
Jan 23 04:33:07 np0005593234 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 04:33:07 np0005593234 systemd[1]: Starting User Manager for UID 42436...
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.263 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.265 227766 INFO nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Creating image(s)#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.287 227766 DEBUG nova.storage.rbd_utils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.314 227766 DEBUG nova.storage.rbd_utils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:33:07 np0005593234 systemd[236749]: Queued start job for default target Main User Target.
Jan 23 04:33:07 np0005593234 systemd[236749]: Created slice User Application Slice.
Jan 23 04:33:07 np0005593234 systemd[236749]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:33:07 np0005593234 systemd[236749]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:33:07 np0005593234 systemd[236749]: Reached target Paths.
Jan 23 04:33:07 np0005593234 systemd[236749]: Reached target Timers.
Jan 23 04:33:07 np0005593234 systemd[236749]: Starting D-Bus User Message Bus Socket...
Jan 23 04:33:07 np0005593234 systemd[236749]: Starting Create User's Volatile Files and Directories...
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.342 227766 DEBUG nova.storage.rbd_utils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.347 227766 DEBUG oslo_concurrency.processutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:33:07 np0005593234 systemd[236749]: Finished Create User's Volatile Files and Directories.
Jan 23 04:33:07 np0005593234 systemd[236749]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:33:07 np0005593234 systemd[236749]: Reached target Sockets.
Jan 23 04:33:07 np0005593234 systemd[236749]: Reached target Basic System.
Jan 23 04:33:07 np0005593234 systemd[236749]: Reached target Main User Target.
Jan 23 04:33:07 np0005593234 systemd[236749]: Startup finished in 152ms.
Jan 23 04:33:07 np0005593234 systemd[1]: Started User Manager for UID 42436.
Jan 23 04:33:07 np0005593234 systemd[1]: Started Session 57 of User nova.
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.410 227766 DEBUG oslo_concurrency.processutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.411 227766 DEBUG oslo_concurrency.lockutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.411 227766 DEBUG oslo_concurrency.lockutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.412 227766 DEBUG oslo_concurrency.lockutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.439 227766 DEBUG nova.storage.rbd_utils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:33:07 np0005593234 systemd[1]: session-57.scope: Deactivated successfully.
Jan 23 04:33:07 np0005593234 systemd-logind[794]: Session 57 logged out. Waiting for processes to exit.
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.444 227766 DEBUG oslo_concurrency.processutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 bbdd11d9-762b-449b-aaee-7037e6742838_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:33:07 np0005593234 systemd-logind[794]: Removed session 57.
Jan 23 04:33:07 np0005593234 systemd-logind[794]: New session 59 of user nova.
Jan 23 04:33:07 np0005593234 systemd[1]: Started Session 59 of User nova.
Jan 23 04:33:07 np0005593234 systemd[1]: session-59.scope: Deactivated successfully.
Jan 23 04:33:07 np0005593234 systemd-logind[794]: Session 59 logged out. Waiting for processes to exit.
Jan 23 04:33:07 np0005593234 systemd-logind[794]: Removed session 59.
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.763 227766 DEBUG oslo_concurrency.processutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 bbdd11d9-762b-449b-aaee-7037e6742838_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.320s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.795 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.850 227766 DEBUG nova.storage.rbd_utils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] resizing rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.952 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.952 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Ensure instance console log exists: /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.953 227766 DEBUG oslo_concurrency.lockutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.953 227766 DEBUG oslo_concurrency.lockutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.953 227766 DEBUG oslo_concurrency.lockutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.955 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.960 227766 WARNING nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.966 227766 DEBUG nova.virt.libvirt.host [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.966 227766 DEBUG nova.virt.libvirt.host [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.969 227766 DEBUG nova.virt.libvirt.host [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.970 227766 DEBUG nova.virt.libvirt.host [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.972 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.972 227766 DEBUG nova.virt.hardware [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.972 227766 DEBUG nova.virt.hardware [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.973 227766 DEBUG nova.virt.hardware [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.973 227766 DEBUG nova.virt.hardware [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.973 227766 DEBUG nova.virt.hardware [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.973 227766 DEBUG nova.virt.hardware [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.974 227766 DEBUG nova.virt.hardware [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.974 227766 DEBUG nova.virt.hardware [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.974 227766 DEBUG nova.virt.hardware [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.974 227766 DEBUG nova.virt.hardware [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.974 227766 DEBUG nova.virt.hardware [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 04:33:07 np0005593234 nova_compute[227762]: 2026-01-23 09:33:07.975 227766 DEBUG nova.objects.instance [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Lazy-loading 'vcpu_model' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:33:08 np0005593234 nova_compute[227762]: 2026-01-23 09:33:08.001 227766 DEBUG oslo_concurrency.processutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:33:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:08.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:08 np0005593234 nova_compute[227762]: 2026-01-23 09:33:08.088 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:33:08 np0005593234 nova_compute[227762]: 2026-01-23 09:33:08.089 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:33:08 np0005593234 nova_compute[227762]: 2026-01-23 09:33:08.089 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 04:33:08 np0005593234 nova_compute[227762]: 2026-01-23 09:33:08.089 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f3277436-85d0-4674-aa69-d7a50448a5d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:33:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:33:08 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3432877880' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:33:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:08.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:08 np0005593234 nova_compute[227762]: 2026-01-23 09:33:08.736 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:33:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.028 227766 DEBUG oslo_concurrency.processutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.058 227766 DEBUG nova.storage.rbd_utils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.062 227766 DEBUG oslo_concurrency.processutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.127 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.219 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.247 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.247 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.247 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.248 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.248 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 04:33:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:33:09 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1500378898' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.481 227766 DEBUG oslo_concurrency.processutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.484 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  <uuid>bbdd11d9-762b-449b-aaee-7037e6742838</uuid>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  <name>instance-00000010</name>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServersAdmin275Test-server-2025545369</nova:name>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:33:07</nova:creationTime>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <nova:user uuid="0772eb0a2fe14d3182d80050e5d94826">tempest-ServersAdmin275Test-1983842130-project-member</nova:user>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <nova:project uuid="430b9216a12b4d41a4333023b00acdff">tempest-ServersAdmin275Test-1983842130</nova:project>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <entry name="serial">bbdd11d9-762b-449b-aaee-7037e6742838</entry>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <entry name="uuid">bbdd11d9-762b-449b-aaee-7037e6742838</entry>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/bbdd11d9-762b-449b-aaee-7037e6742838_disk">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/bbdd11d9-762b-449b-aaee-7037e6742838_disk.config">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/console.log" append="off"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:33:09 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:33:09 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:33:09 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:33:09 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.548 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.548 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.549 227766 INFO nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Using config drive
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.576 227766 DEBUG nova.storage.rbd_utils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.597 227766 DEBUG nova.objects.instance [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Lazy-loading 'ec2_ids' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.630 227766 DEBUG nova.objects.instance [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Lazy-loading 'keypairs' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.811 227766 INFO nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Creating config drive at /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.816 227766 DEBUG oslo_concurrency.processutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfq_7rsj8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.946 227766 DEBUG oslo_concurrency.processutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpfq_7rsj8" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.982 227766 DEBUG nova.storage.rbd_utils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] rbd image bbdd11d9-762b-449b-aaee-7037e6742838_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:33:09 np0005593234 nova_compute[227762]: 2026-01-23 09:33:09.986 227766 DEBUG oslo_concurrency.processutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config bbdd11d9-762b-449b-aaee-7037e6742838_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:33:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:10.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.140 227766 DEBUG oslo_concurrency.processutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config bbdd11d9-762b-449b-aaee-7037e6742838_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.154s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.141 227766 INFO nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Deleting local config drive /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838/disk.config because it was imported into RBD.
Jan 23 04:33:10 np0005593234 systemd-machined[195626]: New machine qemu-9-instance-00000010.
Jan 23 04:33:10 np0005593234 systemd[1]: Started Virtual Machine qemu-9-instance-00000010.
Jan 23 04:33:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:10.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.760 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.761 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.913 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for bbdd11d9-762b-449b-aaee-7037e6742838 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.914 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160790.9132316, bbdd11d9-762b-449b-aaee-7037e6742838 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.914 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] VM Resumed (Lifecycle Event)
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.917 227766 DEBUG nova.compute.manager [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.917 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.921 227766 INFO nova.virt.libvirt.driver [-] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance spawned successfully.
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.921 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.942 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.946 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.950 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.951 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.951 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.951 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.952 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.952 227766 DEBUG nova.virt.libvirt.driver [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.986 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.986 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160790.9171999, bbdd11d9-762b-449b-aaee-7037e6742838 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:33:10 np0005593234 nova_compute[227762]: 2026-01-23 09:33:10.987 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] VM Started (Lifecycle Event)#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.018 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.023 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.032 227766 DEBUG nova.compute.manager [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.061 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.171 227766 DEBUG oslo_concurrency.lockutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.172 227766 DEBUG oslo_concurrency.lockutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.173 227766 DEBUG nova.objects.instance [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.258 227766 DEBUG oslo_concurrency.lockutils [None req-05cf2de9-d3f7-4e83-9f67-c0d14d04d702 1b4c854344cc4ac58a189c40477f71f9 ac5ac2ab13014f62859561f5c476e4f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.911 227766 DEBUG oslo_concurrency.lockutils [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Acquiring lock "bbdd11d9-762b-449b-aaee-7037e6742838" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.911 227766 DEBUG oslo_concurrency.lockutils [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "bbdd11d9-762b-449b-aaee-7037e6742838" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.912 227766 DEBUG oslo_concurrency.lockutils [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Acquiring lock "bbdd11d9-762b-449b-aaee-7037e6742838-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.912 227766 DEBUG oslo_concurrency.lockutils [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "bbdd11d9-762b-449b-aaee-7037e6742838-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.912 227766 DEBUG oslo_concurrency.lockutils [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "bbdd11d9-762b-449b-aaee-7037e6742838-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.914 227766 INFO nova.compute.manager [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Terminating instance#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.915 227766 DEBUG oslo_concurrency.lockutils [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Acquiring lock "refresh_cache-bbdd11d9-762b-449b-aaee-7037e6742838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.915 227766 DEBUG oslo_concurrency.lockutils [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Acquired lock "refresh_cache-bbdd11d9-762b-449b-aaee-7037e6742838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:33:11 np0005593234 nova_compute[227762]: 2026-01-23 09:33:11.916 227766 DEBUG nova.network.neutron [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:33:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:12.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:12.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:12 np0005593234 nova_compute[227762]: 2026-01-23 09:33:12.738 227766 DEBUG nova.network.neutron [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:33:12 np0005593234 nova_compute[227762]: 2026-01-23 09:33:12.777 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:33:13 np0005593234 nova_compute[227762]: 2026-01-23 09:33:13.068 227766 DEBUG nova.network.neutron [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:33:13 np0005593234 nova_compute[227762]: 2026-01-23 09:33:13.085 227766 DEBUG oslo_concurrency.lockutils [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Releasing lock "refresh_cache-bbdd11d9-762b-449b-aaee-7037e6742838" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:33:13 np0005593234 nova_compute[227762]: 2026-01-23 09:33:13.086 227766 DEBUG nova.compute.manager [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 04:33:13 np0005593234 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000010.scope: Deactivated successfully.
Jan 23 04:33:13 np0005593234 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000010.scope: Consumed 2.920s CPU time.
Jan 23 04:33:13 np0005593234 systemd-machined[195626]: Machine qemu-9-instance-00000010 terminated.
Jan 23 04:33:13 np0005593234 nova_compute[227762]: 2026-01-23 09:33:13.304 227766 INFO nova.virt.libvirt.driver [-] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance destroyed successfully.
Jan 23 04:33:13 np0005593234 nova_compute[227762]: 2026-01-23 09:33:13.305 227766 DEBUG nova.objects.instance [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lazy-loading 'resources' on Instance uuid bbdd11d9-762b-449b-aaee-7037e6742838 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:33:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:13 np0005593234 nova_compute[227762]: 2026-01-23 09:33:13.799 227766 INFO nova.virt.libvirt.driver [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Deleting instance files /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838_del
Jan 23 04:33:13 np0005593234 nova_compute[227762]: 2026-01-23 09:33:13.800 227766 INFO nova.virt.libvirt.driver [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Deletion of /var/lib/nova/instances/bbdd11d9-762b-449b-aaee-7037e6742838_del complete
Jan 23 04:33:13 np0005593234 nova_compute[227762]: 2026-01-23 09:33:13.908 227766 INFO nova.compute.manager [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Took 0.82 seconds to destroy the instance on the hypervisor.
Jan 23 04:33:13 np0005593234 nova_compute[227762]: 2026-01-23 09:33:13.908 227766 DEBUG oslo.service.loopingcall [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 04:33:13 np0005593234 nova_compute[227762]: 2026-01-23 09:33:13.909 227766 DEBUG nova.compute.manager [-] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 04:33:13 np0005593234 nova_compute[227762]: 2026-01-23 09:33:13.909 227766 DEBUG nova.network.neutron [-] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 04:33:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:33:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:14.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:33:14 np0005593234 nova_compute[227762]: 2026-01-23 09:33:14.130 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:33:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:33:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:14.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:33:14 np0005593234 nova_compute[227762]: 2026-01-23 09:33:14.509 227766 DEBUG nova.network.neutron [-] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:33:14 np0005593234 nova_compute[227762]: 2026-01-23 09:33:14.535 227766 DEBUG nova.network.neutron [-] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:33:14 np0005593234 nova_compute[227762]: 2026-01-23 09:33:14.620 227766 INFO nova.compute.manager [-] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Took 0.71 seconds to deallocate network for instance.
Jan 23 04:33:14 np0005593234 nova_compute[227762]: 2026-01-23 09:33:14.707 227766 DEBUG oslo_concurrency.lockutils [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:33:14 np0005593234 nova_compute[227762]: 2026-01-23 09:33:14.707 227766 DEBUG oslo_concurrency.lockutils [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:33:14 np0005593234 nova_compute[227762]: 2026-01-23 09:33:14.814 227766 DEBUG oslo_concurrency.processutils [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:33:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:33:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1639549361' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:33:15 np0005593234 nova_compute[227762]: 2026-01-23 09:33:15.228 227766 DEBUG oslo_concurrency.processutils [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:33:15 np0005593234 nova_compute[227762]: 2026-01-23 09:33:15.235 227766 DEBUG nova.compute.provider_tree [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:33:15 np0005593234 nova_compute[227762]: 2026-01-23 09:33:15.269 227766 DEBUG nova.scheduler.client.report [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:33:15 np0005593234 nova_compute[227762]: 2026-01-23 09:33:15.301 227766 DEBUG oslo_concurrency.lockutils [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:33:15 np0005593234 nova_compute[227762]: 2026-01-23 09:33:15.353 227766 INFO nova.scheduler.client.report [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Deleted allocations for instance bbdd11d9-762b-449b-aaee-7037e6742838
Jan 23 04:33:15 np0005593234 nova_compute[227762]: 2026-01-23 09:33:15.470 227766 DEBUG oslo_concurrency.lockutils [None req-16001026-4429-4fba-9b46-ce27b1746422 0772eb0a2fe14d3182d80050e5d94826 430b9216a12b4d41a4333023b00acdff - - default default] Lock "bbdd11d9-762b-449b-aaee-7037e6742838" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.559s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:33:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:16.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:16.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:17 np0005593234 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 04:33:17 np0005593234 systemd[236749]: Activating special unit Exit the Session...
Jan 23 04:33:17 np0005593234 systemd[236749]: Stopped target Main User Target.
Jan 23 04:33:17 np0005593234 systemd[236749]: Stopped target Basic System.
Jan 23 04:33:17 np0005593234 systemd[236749]: Stopped target Paths.
Jan 23 04:33:17 np0005593234 systemd[236749]: Stopped target Sockets.
Jan 23 04:33:17 np0005593234 systemd[236749]: Stopped target Timers.
Jan 23 04:33:17 np0005593234 systemd[236749]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:33:17 np0005593234 systemd[236749]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 04:33:17 np0005593234 systemd[236749]: Closed D-Bus User Message Bus Socket.
Jan 23 04:33:17 np0005593234 systemd[236749]: Stopped Create User's Volatile Files and Directories.
Jan 23 04:33:17 np0005593234 systemd[236749]: Removed slice User Application Slice.
Jan 23 04:33:17 np0005593234 systemd[236749]: Reached target Shutdown.
Jan 23 04:33:17 np0005593234 systemd[236749]: Finished Exit the Session.
Jan 23 04:33:17 np0005593234 systemd[236749]: Reached target Exit the Session.
Jan 23 04:33:17 np0005593234 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 04:33:17 np0005593234 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 04:33:17 np0005593234 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 04:33:17 np0005593234 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 04:33:17 np0005593234 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 04:33:17 np0005593234 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 04:33:17 np0005593234 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 04:33:17 np0005593234 nova_compute[227762]: 2026-01-23 09:33:17.778 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:33:17 np0005593234 nova_compute[227762]: 2026-01-23 09:33:17.966 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:33:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:33:17.965 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 04:33:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:33:17.967 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 04:33:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:33:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:33:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:33:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:33:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:18.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:18 np0005593234 podman[237606]: 2026-01-23 09:33:18.406061948 +0000 UTC m=+0.051399873 container create ff0e733f96f4eabbcaaa1ce8d4bbff581484126929c6734a9128522ee1df24d8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_perlman, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:33:18 np0005593234 systemd[1]: Started libpod-conmon-ff0e733f96f4eabbcaaa1ce8d4bbff581484126929c6734a9128522ee1df24d8.scope.
Jan 23 04:33:18 np0005593234 podman[237606]: 2026-01-23 09:33:18.381387119 +0000 UTC m=+0.026725074 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:33:18 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:33:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:18.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:18 np0005593234 podman[237606]: 2026-01-23 09:33:18.510878556 +0000 UTC m=+0.156216511 container init ff0e733f96f4eabbcaaa1ce8d4bbff581484126929c6734a9128522ee1df24d8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_perlman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:33:18 np0005593234 podman[237606]: 2026-01-23 09:33:18.51871318 +0000 UTC m=+0.164051105 container start ff0e733f96f4eabbcaaa1ce8d4bbff581484126929c6734a9128522ee1df24d8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_perlman, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 04:33:18 np0005593234 podman[237606]: 2026-01-23 09:33:18.522803327 +0000 UTC m=+0.168141292 container attach ff0e733f96f4eabbcaaa1ce8d4bbff581484126929c6734a9128522ee1df24d8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_perlman, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 04:33:18 np0005593234 adoring_perlman[237622]: 167 167
Jan 23 04:33:18 np0005593234 systemd[1]: libpod-ff0e733f96f4eabbcaaa1ce8d4bbff581484126929c6734a9128522ee1df24d8.scope: Deactivated successfully.
Jan 23 04:33:18 np0005593234 podman[237606]: 2026-01-23 09:33:18.529404923 +0000 UTC m=+0.174742848 container died ff0e733f96f4eabbcaaa1ce8d4bbff581484126929c6734a9128522ee1df24d8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_perlman, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Jan 23 04:33:18 np0005593234 systemd[1]: var-lib-containers-storage-overlay-bb16a3f8019693a3c4dc0d6310ae912b6aefbcb1c0644fbfa03fee9613030bd1-merged.mount: Deactivated successfully.
Jan 23 04:33:18 np0005593234 podman[237606]: 2026-01-23 09:33:18.564866829 +0000 UTC m=+0.210204744 container remove ff0e733f96f4eabbcaaa1ce8d4bbff581484126929c6734a9128522ee1df24d8 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=adoring_perlman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3)
Jan 23 04:33:18 np0005593234 systemd[1]: libpod-conmon-ff0e733f96f4eabbcaaa1ce8d4bbff581484126929c6734a9128522ee1df24d8.scope: Deactivated successfully.
Jan 23 04:33:18 np0005593234 podman[237646]: 2026-01-23 09:33:18.717753265 +0000 UTC m=+0.039930046 container create b729989eff4685708a0b6b910e82eda834f48d67bcf610efe3258f656bb00b3f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_gould, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 04:33:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e154 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:18 np0005593234 systemd[1]: Started libpod-conmon-b729989eff4685708a0b6b910e82eda834f48d67bcf610efe3258f656bb00b3f.scope.
Jan 23 04:33:18 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:33:18 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8363430b3909e5a946c1a0d3603fefbd2bb8ae304fbadf52968d354493140e04/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 04:33:18 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8363430b3909e5a946c1a0d3603fefbd2bb8ae304fbadf52968d354493140e04/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 04:33:18 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8363430b3909e5a946c1a0d3603fefbd2bb8ae304fbadf52968d354493140e04/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 04:33:18 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8363430b3909e5a946c1a0d3603fefbd2bb8ae304fbadf52968d354493140e04/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 04:33:18 np0005593234 podman[237646]: 2026-01-23 09:33:18.792792004 +0000 UTC m=+0.114968785 container init b729989eff4685708a0b6b910e82eda834f48d67bcf610efe3258f656bb00b3f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_gould, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3)
Jan 23 04:33:18 np0005593234 podman[237646]: 2026-01-23 09:33:18.700428095 +0000 UTC m=+0.022604896 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 04:33:18 np0005593234 podman[237646]: 2026-01-23 09:33:18.799614017 +0000 UTC m=+0.121790798 container start b729989eff4685708a0b6b910e82eda834f48d67bcf610efe3258f656bb00b3f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_gould, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:33:18 np0005593234 podman[237646]: 2026-01-23 09:33:18.807955547 +0000 UTC m=+0.130132348 container attach b729989eff4685708a0b6b910e82eda834f48d67bcf610efe3258f656bb00b3f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_gould, OSD_FLAVOR=default, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 04:33:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:33:18.970 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:33:19 np0005593234 nova_compute[227762]: 2026-01-23 09:33:19.132 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:19 np0005593234 podman[237680]: 2026-01-23 09:33:19.799788377 +0000 UTC m=+0.091783402 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller)
Jan 23 04:33:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.003000095s ======
Jan 23 04:33:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:20.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000095s
Jan 23 04:33:20 np0005593234 stoic_gould[237663]: [
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:    {
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:        "available": false,
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:        "ceph_device": false,
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:        "lsm_data": {},
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:        "lvs": [],
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:        "path": "/dev/sr0",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:        "rejected_reasons": [
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "Insufficient space (<5GB)",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "Has a FileSystem"
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:        ],
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:        "sys_api": {
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "actuators": null,
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "device_nodes": "sr0",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "devname": "sr0",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "human_readable_size": "482.00 KB",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "id_bus": "ata",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "model": "QEMU DVD-ROM",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "nr_requests": "2",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "parent": "/dev/sr0",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "partitions": {},
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "path": "/dev/sr0",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "removable": "1",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "rev": "2.5+",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "ro": "0",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "rotational": "1",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "sas_address": "",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "sas_device_handle": "",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "scheduler_mode": "mq-deadline",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "sectors": 0,
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "sectorsize": "2048",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "size": 493568.0,
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "support_discard": "2048",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "type": "disk",
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:            "vendor": "QEMU"
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:        }
Jan 23 04:33:20 np0005593234 stoic_gould[237663]:    }
Jan 23 04:33:20 np0005593234 stoic_gould[237663]: ]
Jan 23 04:33:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:33:20 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2945979402' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:33:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:33:20 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2945979402' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:33:20 np0005593234 systemd[1]: libpod-b729989eff4685708a0b6b910e82eda834f48d67bcf610efe3258f656bb00b3f.scope: Deactivated successfully.
Jan 23 04:33:20 np0005593234 systemd[1]: libpod-b729989eff4685708a0b6b910e82eda834f48d67bcf610efe3258f656bb00b3f.scope: Consumed 1.259s CPU time.
Jan 23 04:33:20 np0005593234 podman[237646]: 2026-01-23 09:33:20.084311127 +0000 UTC m=+1.406487908 container died b729989eff4685708a0b6b910e82eda834f48d67bcf610efe3258f656bb00b3f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_gould, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 04:33:20 np0005593234 systemd[1]: var-lib-containers-storage-overlay-8363430b3909e5a946c1a0d3603fefbd2bb8ae304fbadf52968d354493140e04-merged.mount: Deactivated successfully.
Jan 23 04:33:20 np0005593234 podman[237646]: 2026-01-23 09:33:20.186804522 +0000 UTC m=+1.508981313 container remove b729989eff4685708a0b6b910e82eda834f48d67bcf610efe3258f656bb00b3f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=stoic_gould, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 04:33:20 np0005593234 systemd[1]: libpod-conmon-b729989eff4685708a0b6b910e82eda834f48d67bcf610efe3258f656bb00b3f.scope: Deactivated successfully.
Jan 23 04:33:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:33:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:20.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:33:21 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:33:21 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:33:21 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:33:21 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:33:21 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:33:21 np0005593234 nova_compute[227762]: 2026-01-23 09:33:21.240 227766 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquiring lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:33:21 np0005593234 nova_compute[227762]: 2026-01-23 09:33:21.241 227766 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquired lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:33:21 np0005593234 nova_compute[227762]: 2026-01-23 09:33:21.241 227766 DEBUG nova.network.neutron [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:33:21 np0005593234 nova_compute[227762]: 2026-01-23 09:33:21.597 227766 DEBUG nova.network.neutron [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:33:21 np0005593234 nova_compute[227762]: 2026-01-23 09:33:21.954 227766 DEBUG nova.network.neutron [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:33:22 np0005593234 nova_compute[227762]: 2026-01-23 09:33:22.002 227766 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Releasing lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:33:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:22.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:22 np0005593234 nova_compute[227762]: 2026-01-23 09:33:22.243 227766 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 23 04:33:22 np0005593234 nova_compute[227762]: 2026-01-23 09:33:22.246 227766 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 04:33:22 np0005593234 nova_compute[227762]: 2026-01-23 09:33:22.246 227766 INFO nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Creating image(s)#033[00m
Jan 23 04:33:22 np0005593234 nova_compute[227762]: 2026-01-23 09:33:22.284 227766 DEBUG nova.storage.rbd_utils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] creating snapshot(nova-resize) on rbd image(07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:33:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:22.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:22 np0005593234 nova_compute[227762]: 2026-01-23 09:33:22.780 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e155 e155: 3 total, 3 up, 3 in
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.546 227766 DEBUG nova.objects.instance [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 07c9ba0d-ab3d-4079-ab97-46e91de4911a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.654 227766 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.655 227766 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Ensure instance console log exists: /var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.655 227766 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.656 227766 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.656 227766 DEBUG oslo_concurrency.lockutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.658 227766 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.663 227766 WARNING nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.673 227766 DEBUG nova.virt.libvirt.host [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.674 227766 DEBUG nova.virt.libvirt.host [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.678 227766 DEBUG nova.virt.libvirt.host [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.678 227766 DEBUG nova.virt.libvirt.host [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.680 227766 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.680 227766 DEBUG nova.virt.hardware [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.680 227766 DEBUG nova.virt.hardware [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.681 227766 DEBUG nova.virt.hardware [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.681 227766 DEBUG nova.virt.hardware [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.681 227766 DEBUG nova.virt.hardware [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.681 227766 DEBUG nova.virt.hardware [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.681 227766 DEBUG nova.virt.hardware [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.682 227766 DEBUG nova.virt.hardware [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.682 227766 DEBUG nova.virt.hardware [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.682 227766 DEBUG nova.virt.hardware [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.682 227766 DEBUG nova.virt.hardware [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.683 227766 DEBUG nova.objects.instance [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 07c9ba0d-ab3d-4079-ab97-46e91de4911a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:33:23 np0005593234 nova_compute[227762]: 2026-01-23 09:33:23.702 227766 DEBUG oslo_concurrency.processutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:33:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:33:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:24.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:33:24 np0005593234 nova_compute[227762]: 2026-01-23 09:33:24.134 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:33:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:33:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1469271272' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:33:24 np0005593234 nova_compute[227762]: 2026-01-23 09:33:24.189 227766 DEBUG oslo_concurrency.processutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:33:24 np0005593234 nova_compute[227762]: 2026-01-23 09:33:24.234 227766 DEBUG oslo_concurrency.processutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:33:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:24.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:33:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/917104822' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:33:24 np0005593234 nova_compute[227762]: 2026-01-23 09:33:24.754 227766 DEBUG oslo_concurrency.processutils [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.521s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:33:24 np0005593234 nova_compute[227762]: 2026-01-23 09:33:24.758 227766 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  <uuid>07c9ba0d-ab3d-4079-ab97-46e91de4911a</uuid>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  <name>instance-00000012</name>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <nova:name>tempest-MigrationsAdminTest-server-2012645134</nova:name>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:33:23</nova:creationTime>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <nova:user uuid="7536fa2e625541fba613dc32a49a4c5b">tempest-MigrationsAdminTest-2056264627-project-member</nova:user>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <nova:project uuid="11def90dfdc14cfe928302bec2835794">tempest-MigrationsAdminTest-2056264627</nova:project>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <entry name="serial">07c9ba0d-ab3d-4079-ab97-46e91de4911a</entry>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <entry name="uuid">07c9ba0d-ab3d-4079-ab97-46e91de4911a</entry>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/07c9ba0d-ab3d-4079-ab97-46e91de4911a_disk.config">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/07c9ba0d-ab3d-4079-ab97-46e91de4911a/console.log" append="off"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:33:24 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:33:24 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:33:24 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:33:24 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 04:33:24 np0005593234 nova_compute[227762]: 2026-01-23 09:33:24.897 227766 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:33:24 np0005593234 nova_compute[227762]: 2026-01-23 09:33:24.898 227766 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:33:24 np0005593234 nova_compute[227762]: 2026-01-23 09:33:24.898 227766 INFO nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Using config drive
Jan 23 04:33:25 np0005593234 systemd-machined[195626]: New machine qemu-10-instance-00000012.
Jan 23 04:33:25 np0005593234 systemd[1]: Started Virtual Machine qemu-10-instance-00000012.
Jan 23 04:33:25 np0005593234 nova_compute[227762]: 2026-01-23 09:33:25.466 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160805.4657538, 07c9ba0d-ab3d-4079-ab97-46e91de4911a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:33:25 np0005593234 nova_compute[227762]: 2026-01-23 09:33:25.466 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] VM Resumed (Lifecycle Event)
Jan 23 04:33:25 np0005593234 nova_compute[227762]: 2026-01-23 09:33:25.468 227766 DEBUG nova.compute.manager [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:33:25 np0005593234 nova_compute[227762]: 2026-01-23 09:33:25.472 227766 INFO nova.virt.libvirt.driver [-] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance running successfully.
Jan 23 04:33:25 np0005593234 virtqemud[227483]: argument unsupported: QEMU guest agent is not configured
Jan 23 04:33:25 np0005593234 nova_compute[227762]: 2026-01-23 09:33:25.475 227766 DEBUG nova.virt.libvirt.guest [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200
Jan 23 04:33:25 np0005593234 nova_compute[227762]: 2026-01-23 09:33:25.475 227766 DEBUG nova.virt.libvirt.driver [None req-e5f36f35-af43-48a3-a1b1-5d6132f62203 a790e81b73f84621bb26f54bce8c8921 9e48dfffea1c4d81b302c63c3e3ab955 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793
Jan 23 04:33:25 np0005593234 nova_compute[227762]: 2026-01-23 09:33:25.505 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:33:25 np0005593234 nova_compute[227762]: 2026-01-23 09:33:25.509 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:33:25 np0005593234 nova_compute[227762]: 2026-01-23 09:33:25.554 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] During sync_power_state the instance has a pending task (resize_finish). Skip.
Jan 23 04:33:25 np0005593234 nova_compute[227762]: 2026-01-23 09:33:25.554 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160805.4683037, 07c9ba0d-ab3d-4079-ab97-46e91de4911a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:33:25 np0005593234 nova_compute[227762]: 2026-01-23 09:33:25.554 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] VM Started (Lifecycle Event)
Jan 23 04:33:25 np0005593234 nova_compute[227762]: 2026-01-23 09:33:25.586 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:33:25 np0005593234 nova_compute[227762]: 2026-01-23 09:33:25.589 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:33:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:26.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:26.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:27 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:33:27 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:33:27 np0005593234 nova_compute[227762]: 2026-01-23 09:33:27.783 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:33:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:28.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:28 np0005593234 nova_compute[227762]: 2026-01-23 09:33:28.303 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160793.3020341, bbdd11d9-762b-449b-aaee-7037e6742838 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:33:28 np0005593234 nova_compute[227762]: 2026-01-23 09:33:28.304 227766 INFO nova.compute.manager [-] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] VM Stopped (Lifecycle Event)
Jan 23 04:33:28 np0005593234 nova_compute[227762]: 2026-01-23 09:33:28.334 227766 DEBUG nova.compute.manager [None req-e3246156-91b0-4b83-9383-627a840ac285 - - - - - -] [instance: bbdd11d9-762b-449b-aaee-7037e6742838] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:33:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:33:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:28.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:33:28 np0005593234 nova_compute[227762]: 2026-01-23 09:33:28.786 227766 DEBUG oslo_concurrency.lockutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:33:28 np0005593234 nova_compute[227762]: 2026-01-23 09:33:28.786 227766 DEBUG oslo_concurrency.lockutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquired lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:33:28 np0005593234 nova_compute[227762]: 2026-01-23 09:33:28.787 227766 DEBUG nova.network.neutron [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 04:33:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:29 np0005593234 nova_compute[227762]: 2026-01-23 09:33:29.076 227766 DEBUG nova.network.neutron [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:33:29 np0005593234 nova_compute[227762]: 2026-01-23 09:33:29.137 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:33:29 np0005593234 nova_compute[227762]: 2026-01-23 09:33:29.491 227766 DEBUG nova.network.neutron [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:33:29 np0005593234 nova_compute[227762]: 2026-01-23 09:33:29.633 227766 DEBUG oslo_concurrency.lockutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Releasing lock "refresh_cache-07c9ba0d-ab3d-4079-ab97-46e91de4911a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:33:29 np0005593234 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000012.scope: Deactivated successfully.
Jan 23 04:33:29 np0005593234 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000012.scope: Consumed 4.830s CPU time.
Jan 23 04:33:29 np0005593234 systemd-machined[195626]: Machine qemu-10-instance-00000012 terminated.
Jan 23 04:33:29 np0005593234 nova_compute[227762]: 2026-01-23 09:33:29.905 227766 INFO nova.virt.libvirt.driver [-] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Instance destroyed successfully.#033[00m
Jan 23 04:33:29 np0005593234 nova_compute[227762]: 2026-01-23 09:33:29.906 227766 DEBUG nova.objects.instance [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'resources' on Instance uuid 07c9ba0d-ab3d-4079-ab97-46e91de4911a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:33:29 np0005593234 nova_compute[227762]: 2026-01-23 09:33:29.958 227766 DEBUG oslo_concurrency.lockutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:29 np0005593234 nova_compute[227762]: 2026-01-23 09:33:29.959 227766 DEBUG oslo_concurrency.lockutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:30.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:30 np0005593234 nova_compute[227762]: 2026-01-23 09:33:30.094 227766 DEBUG nova.objects.instance [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'migration_context' on Instance uuid 07c9ba0d-ab3d-4079-ab97-46e91de4911a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:33:30 np0005593234 nova_compute[227762]: 2026-01-23 09:33:30.302 227766 DEBUG oslo_concurrency.processutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:33:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:33:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:30.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:33:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:33:30 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3031038550' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:33:30 np0005593234 nova_compute[227762]: 2026-01-23 09:33:30.785 227766 DEBUG oslo_concurrency.processutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.484s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:33:30 np0005593234 nova_compute[227762]: 2026-01-23 09:33:30.792 227766 DEBUG nova.compute.provider_tree [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:33:30 np0005593234 nova_compute[227762]: 2026-01-23 09:33:30.815 227766 DEBUG nova.scheduler.client.report [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:33:30 np0005593234 nova_compute[227762]: 2026-01-23 09:33:30.904 227766 DEBUG oslo_concurrency.lockutils [None req-147c56e1-fd9f-4352-9872-9a5b6353b0f7 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.945s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:32.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:33:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:32.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:33:32 np0005593234 nova_compute[227762]: 2026-01-23 09:33:32.815 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:33:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:34.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:33:34 np0005593234 nova_compute[227762]: 2026-01-23 09:33:34.140 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:33:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:34.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:33:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e156 e156: 3 total, 3 up, 3 in
Jan 23 04:33:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:33:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:36.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:33:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:33:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:36.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:33:36 np0005593234 podman[239326]: 2026-01-23 09:33:36.768629058 +0000 UTC m=+0.054741767 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:33:37 np0005593234 nova_compute[227762]: 2026-01-23 09:33:37.841 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:38.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:33:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:38.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:33:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:39 np0005593234 nova_compute[227762]: 2026-01-23 09:33:39.142 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:40.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:33:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:40.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.059255) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160821059306, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2452, "num_deletes": 254, "total_data_size": 5790536, "memory_usage": 5856800, "flush_reason": "Manual Compaction"}
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160821082300, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 3726777, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25312, "largest_seqno": 27759, "table_properties": {"data_size": 3716845, "index_size": 6234, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 21599, "raw_average_key_size": 20, "raw_value_size": 3696614, "raw_average_value_size": 3557, "num_data_blocks": 274, "num_entries": 1039, "num_filter_entries": 1039, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769160639, "oldest_key_time": 1769160639, "file_creation_time": 1769160821, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 23194 microseconds, and 8091 cpu microseconds.
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.082452) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 3726777 bytes OK
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.082494) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.084725) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.084738) EVENT_LOG_v1 {"time_micros": 1769160821084734, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.084753) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 5779490, prev total WAL file size 5779490, number of live WAL files 2.
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.086348) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(3639KB)], [51(9130KB)]
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160821086434, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 13076646, "oldest_snapshot_seqno": -1}
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5379 keys, 11100667 bytes, temperature: kUnknown
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160821192603, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 11100667, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11062069, "index_size": 24024, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13509, "raw_key_size": 134461, "raw_average_key_size": 24, "raw_value_size": 10962385, "raw_average_value_size": 2037, "num_data_blocks": 991, "num_entries": 5379, "num_filter_entries": 5379, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769160821, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.192873) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 11100667 bytes
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.207476) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 123.0 rd, 104.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 8.9 +0.0 blob) out(10.6 +0.0 blob), read-write-amplify(6.5) write-amplify(3.0) OK, records in: 5906, records dropped: 527 output_compression: NoCompression
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.207523) EVENT_LOG_v1 {"time_micros": 1769160821207506, "job": 30, "event": "compaction_finished", "compaction_time_micros": 106274, "compaction_time_cpu_micros": 22671, "output_level": 6, "num_output_files": 1, "total_output_size": 11100667, "num_input_records": 5906, "num_output_records": 5379, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160821208376, "job": 30, "event": "table_file_deletion", "file_number": 53}
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160821210291, "job": 30, "event": "table_file_deletion", "file_number": 51}
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.086261) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.210325) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.210330) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.210332) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.210334) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:41.210336) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:33:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:42.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:33:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:42.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:33:42.808 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:33:42.810 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:33:42.810 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:42 np0005593234 nova_compute[227762]: 2026-01-23 09:33:42.843 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e157 e157: 3 total, 3 up, 3 in
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.494180) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160823494204, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 291, "num_deletes": 256, "total_data_size": 92417, "memory_usage": 98936, "flush_reason": "Manual Compaction"}
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160823497543, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 60856, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 27764, "largest_seqno": 28050, "table_properties": {"data_size": 58936, "index_size": 148, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 4706, "raw_average_key_size": 16, "raw_value_size": 55011, "raw_average_value_size": 197, "num_data_blocks": 7, "num_entries": 278, "num_filter_entries": 278, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769160822, "oldest_key_time": 1769160822, "file_creation_time": 1769160823, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 3599 microseconds, and 886 cpu microseconds.
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.497777) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 60856 bytes OK
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.497794) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.499301) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.499312) EVENT_LOG_v1 {"time_micros": 1769160823499308, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.499324) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 90232, prev total WAL file size 90232, number of live WAL files 2.
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.499663) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353033' seq:72057594037927935, type:22 .. '6C6F676D00373534' seq:0, type:0; will stop at (end)
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(59KB)], [54(10MB)]
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160823499689, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 11161523, "oldest_snapshot_seqno": -1}
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5133 keys, 11075268 bytes, temperature: kUnknown
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160823566473, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 11075268, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11037578, "index_size": 23756, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12869, "raw_key_size": 130526, "raw_average_key_size": 25, "raw_value_size": 10941501, "raw_average_value_size": 2131, "num_data_blocks": 976, "num_entries": 5133, "num_filter_entries": 5133, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769160823, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.566777) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 11075268 bytes
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.568622) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 166.8 rd, 165.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 10.6 +0.0 blob) out(10.6 +0.0 blob), read-write-amplify(365.4) write-amplify(182.0) OK, records in: 5657, records dropped: 524 output_compression: NoCompression
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.568639) EVENT_LOG_v1 {"time_micros": 1769160823568631, "job": 32, "event": "compaction_finished", "compaction_time_micros": 66924, "compaction_time_cpu_micros": 23904, "output_level": 6, "num_output_files": 1, "total_output_size": 11075268, "num_input_records": 5657, "num_output_records": 5133, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160823568750, "job": 32, "event": "table_file_deletion", "file_number": 56}
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769160823570496, "job": 32, "event": "table_file_deletion", "file_number": 54}
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.499554) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.570555) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.570561) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.570563) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.570590) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:33:43.570593) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:33:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:44.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:44 np0005593234 nova_compute[227762]: 2026-01-23 09:33:44.144 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:33:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2968932037' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:33:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:33:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2968932037' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:33:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:44.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:44 np0005593234 nova_compute[227762]: 2026-01-23 09:33:44.905 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160809.9026806, 07c9ba0d-ab3d-4079-ab97-46e91de4911a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:33:44 np0005593234 nova_compute[227762]: 2026-01-23 09:33:44.905 227766 INFO nova.compute.manager [-] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:33:45 np0005593234 nova_compute[227762]: 2026-01-23 09:33:45.245 227766 DEBUG nova.compute.manager [None req-7c31a8fa-d807-4041-a473-410fc8956d27 - - - - - -] [instance: 07c9ba0d-ab3d-4079-ab97-46e91de4911a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:33:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:46.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:33:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:46.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:33:47 np0005593234 nova_compute[227762]: 2026-01-23 09:33:47.845 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:48.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:48.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:49 np0005593234 nova_compute[227762]: 2026-01-23 09:33:49.201 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:50.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:50 np0005593234 podman[239376]: 2026-01-23 09:33:50.224353304 +0000 UTC m=+0.082834033 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:33:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:50.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:50 np0005593234 nova_compute[227762]: 2026-01-23 09:33:50.962 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:33:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:52.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:52.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:52 np0005593234 nova_compute[227762]: 2026-01-23 09:33:52.847 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:53 np0005593234 nova_compute[227762]: 2026-01-23 09:33:53.153 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Triggering sync for uuid f3277436-85d0-4674-aa69-d7a50448a5d0 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 04:33:53 np0005593234 nova_compute[227762]: 2026-01-23 09:33:53.154 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "f3277436-85d0-4674-aa69-d7a50448a5d0" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:33:53 np0005593234 nova_compute[227762]: 2026-01-23 09:33:53.154 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "f3277436-85d0-4674-aa69-d7a50448a5d0" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:33:53 np0005593234 nova_compute[227762]: 2026-01-23 09:33:53.248 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "f3277436-85d0-4674-aa69-d7a50448a5d0" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:33:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:54.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:54 np0005593234 nova_compute[227762]: 2026-01-23 09:33:54.204 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:33:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:54.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:33:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:56.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:33:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:56.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:33:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e158 e158: 3 total, 3 up, 3 in
Jan 23 04:33:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e159 e159: 3 total, 3 up, 3 in
Jan 23 04:33:57 np0005593234 nova_compute[227762]: 2026-01-23 09:33:57.894 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:33:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:33:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:33:58.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:33:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:33:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:33:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:33:58.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:33:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e160 e160: 3 total, 3 up, 3 in
Jan 23 04:33:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:33:59 np0005593234 nova_compute[227762]: 2026-01-23 09:33:59.204 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:00.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:00.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:02.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:02.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:02 np0005593234 nova_compute[227762]: 2026-01-23 09:34:02.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:02 np0005593234 nova_compute[227762]: 2026-01-23 09:34:02.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:02 np0005593234 nova_compute[227762]: 2026-01-23 09:34:02.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:02 np0005593234 nova_compute[227762]: 2026-01-23 09:34:02.896 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.014 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.015 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.015 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.016 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.016 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:34:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4027757903' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.487 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e161 e161: 3 total, 3 up, 3 in
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.686 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.687 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000008 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.839 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.841 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4598MB free_disk=20.80617904663086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.841 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.841 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.996 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance f3277436-85d0-4674-aa69-d7a50448a5d0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.997 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:34:03 np0005593234 nova_compute[227762]: 2026-01-23 09:34:03.997 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:34:04 np0005593234 nova_compute[227762]: 2026-01-23 09:34:04.021 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:34:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:04 np0005593234 nova_compute[227762]: 2026-01-23 09:34:04.088 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:34:04 np0005593234 nova_compute[227762]: 2026-01-23 09:34:04.089 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:34:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:04.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:04 np0005593234 nova_compute[227762]: 2026-01-23 09:34:04.176 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:34:04 np0005593234 nova_compute[227762]: 2026-01-23 09:34:04.207 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:04 np0005593234 nova_compute[227762]: 2026-01-23 09:34:04.215 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:34:04 np0005593234 nova_compute[227762]: 2026-01-23 09:34:04.310 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:04.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:34:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1139882951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:34:04 np0005593234 nova_compute[227762]: 2026-01-23 09:34:04.788 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:04 np0005593234 nova_compute[227762]: 2026-01-23 09:34:04.794 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:34:04 np0005593234 nova_compute[227762]: 2026-01-23 09:34:04.867 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:34:04 np0005593234 nova_compute[227762]: 2026-01-23 09:34:04.935 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:34:04 np0005593234 nova_compute[227762]: 2026-01-23 09:34:04.936 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.094s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:06.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:06.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:06 np0005593234 nova_compute[227762]: 2026-01-23 09:34:06.937 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:06 np0005593234 nova_compute[227762]: 2026-01-23 09:34:06.938 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:06 np0005593234 nova_compute[227762]: 2026-01-23 09:34:06.938 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:34:07 np0005593234 nova_compute[227762]: 2026-01-23 09:34:07.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:07 np0005593234 podman[239484]: 2026-01-23 09:34:07.770699783 +0000 UTC m=+0.058355560 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 04:34:07 np0005593234 nova_compute[227762]: 2026-01-23 09:34:07.898 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:08.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:34:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:08.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:34:08 np0005593234 nova_compute[227762]: 2026-01-23 09:34:08.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:08 np0005593234 nova_compute[227762]: 2026-01-23 09:34:08.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:34:08 np0005593234 nova_compute[227762]: 2026-01-23 09:34:08.780 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:34:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:09 np0005593234 nova_compute[227762]: 2026-01-23 09:34:09.208 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:10.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:10.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:10 np0005593234 nova_compute[227762]: 2026-01-23 09:34:10.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:11 np0005593234 nova_compute[227762]: 2026-01-23 09:34:11.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:34:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:12.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:12.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:12 np0005593234 nova_compute[227762]: 2026-01-23 09:34:12.900 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:14.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:14 np0005593234 nova_compute[227762]: 2026-01-23 09:34:14.267 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:34:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:14.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:34:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:16.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:34:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:16.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:34:17 np0005593234 nova_compute[227762]: 2026-01-23 09:34:17.903 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:34:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:18.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:34:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:18.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:19 np0005593234 nova_compute[227762]: 2026-01-23 09:34:19.269 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:20.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:34:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:20.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:34:20 np0005593234 podman[239560]: 2026-01-23 09:34:20.80136724 +0000 UTC m=+0.092291837 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 23 04:34:21 np0005593234 nova_compute[227762]: 2026-01-23 09:34:21.021 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:34:21.023 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:34:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:34:21.025 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:34:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:22.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:34:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:22.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:34:22 np0005593234 nova_compute[227762]: 2026-01-23 09:34:22.904 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:34:23.026 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:34:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:24.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:24 np0005593234 nova_compute[227762]: 2026-01-23 09:34:24.271 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:24.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:26.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:34:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:26.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:34:27 np0005593234 nova_compute[227762]: 2026-01-23 09:34:27.905 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:34:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:28.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:34:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:34:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:34:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:34:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:34:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:34:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:28.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:34:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:29 np0005593234 nova_compute[227762]: 2026-01-23 09:34:29.273 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:30.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:30.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:31 np0005593234 nova_compute[227762]: 2026-01-23 09:34:31.554 227766 DEBUG oslo_concurrency.lockutils [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "f3277436-85d0-4674-aa69-d7a50448a5d0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:31 np0005593234 nova_compute[227762]: 2026-01-23 09:34:31.555 227766 DEBUG oslo_concurrency.lockutils [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "f3277436-85d0-4674-aa69-d7a50448a5d0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:31 np0005593234 nova_compute[227762]: 2026-01-23 09:34:31.555 227766 DEBUG oslo_concurrency.lockutils [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "f3277436-85d0-4674-aa69-d7a50448a5d0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:31 np0005593234 nova_compute[227762]: 2026-01-23 09:34:31.556 227766 DEBUG oslo_concurrency.lockutils [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "f3277436-85d0-4674-aa69-d7a50448a5d0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:31 np0005593234 nova_compute[227762]: 2026-01-23 09:34:31.556 227766 DEBUG oslo_concurrency.lockutils [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "f3277436-85d0-4674-aa69-d7a50448a5d0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:31 np0005593234 nova_compute[227762]: 2026-01-23 09:34:31.558 227766 INFO nova.compute.manager [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Terminating instance#033[00m
Jan 23 04:34:31 np0005593234 nova_compute[227762]: 2026-01-23 09:34:31.559 227766 DEBUG oslo_concurrency.lockutils [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:34:31 np0005593234 nova_compute[227762]: 2026-01-23 09:34:31.559 227766 DEBUG oslo_concurrency.lockutils [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquired lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:34:31 np0005593234 nova_compute[227762]: 2026-01-23 09:34:31.559 227766 DEBUG nova.network.neutron [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:34:32 np0005593234 nova_compute[227762]: 2026-01-23 09:34:32.094 227766 DEBUG nova.network.neutron [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:34:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:32.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:32.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:32 np0005593234 nova_compute[227762]: 2026-01-23 09:34:32.850 227766 DEBUG nova.network.neutron [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:34:32 np0005593234 nova_compute[227762]: 2026-01-23 09:34:32.879 227766 DEBUG oslo_concurrency.lockutils [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Releasing lock "refresh_cache-f3277436-85d0-4674-aa69-d7a50448a5d0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:34:32 np0005593234 nova_compute[227762]: 2026-01-23 09:34:32.880 227766 DEBUG nova.compute.manager [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:34:32 np0005593234 nova_compute[227762]: 2026-01-23 09:34:32.907 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:32 np0005593234 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Deactivated successfully.
Jan 23 04:34:32 np0005593234 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d00000008.scope: Consumed 20.762s CPU time.
Jan 23 04:34:32 np0005593234 systemd-machined[195626]: Machine qemu-4-instance-00000008 terminated.
Jan 23 04:34:33 np0005593234 nova_compute[227762]: 2026-01-23 09:34:33.105 227766 INFO nova.virt.libvirt.driver [-] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance destroyed successfully.#033[00m
Jan 23 04:34:33 np0005593234 nova_compute[227762]: 2026-01-23 09:34:33.106 227766 DEBUG nova.objects.instance [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lazy-loading 'resources' on Instance uuid f3277436-85d0-4674-aa69-d7a50448a5d0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:34:33 np0005593234 nova_compute[227762]: 2026-01-23 09:34:33.564 227766 INFO nova.virt.libvirt.driver [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Deleting instance files /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0_del#033[00m
Jan 23 04:34:33 np0005593234 nova_compute[227762]: 2026-01-23 09:34:33.565 227766 INFO nova.virt.libvirt.driver [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Deletion of /var/lib/nova/instances/f3277436-85d0-4674-aa69-d7a50448a5d0_del complete#033[00m
Jan 23 04:34:33 np0005593234 nova_compute[227762]: 2026-01-23 09:34:33.706 227766 INFO nova.compute.manager [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Took 0.83 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:34:33 np0005593234 nova_compute[227762]: 2026-01-23 09:34:33.706 227766 DEBUG oslo.service.loopingcall [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:34:33 np0005593234 nova_compute[227762]: 2026-01-23 09:34:33.707 227766 DEBUG nova.compute.manager [-] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:34:33 np0005593234 nova_compute[227762]: 2026-01-23 09:34:33.707 227766 DEBUG nova.network.neutron [-] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:34:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e161 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:34:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:34.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:34:34 np0005593234 nova_compute[227762]: 2026-01-23 09:34:34.276 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:34.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:34:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:34:35 np0005593234 nova_compute[227762]: 2026-01-23 09:34:35.681 227766 DEBUG nova.network.neutron [-] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:34:35 np0005593234 nova_compute[227762]: 2026-01-23 09:34:35.711 227766 DEBUG nova.network.neutron [-] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:34:35 np0005593234 nova_compute[227762]: 2026-01-23 09:34:35.740 227766 INFO nova.compute.manager [-] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Took 2.03 seconds to deallocate network for instance.#033[00m
Jan 23 04:34:35 np0005593234 nova_compute[227762]: 2026-01-23 09:34:35.827 227766 DEBUG oslo_concurrency.lockutils [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:35 np0005593234 nova_compute[227762]: 2026-01-23 09:34:35.828 227766 DEBUG oslo_concurrency.lockutils [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:35 np0005593234 nova_compute[227762]: 2026-01-23 09:34:35.945 227766 DEBUG oslo_concurrency.processutils [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:36.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:34:36 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1783044448' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:34:36 np0005593234 nova_compute[227762]: 2026-01-23 09:34:36.412 227766 DEBUG oslo_concurrency.processutils [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:36 np0005593234 nova_compute[227762]: 2026-01-23 09:34:36.419 227766 DEBUG nova.compute.provider_tree [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:34:36 np0005593234 nova_compute[227762]: 2026-01-23 09:34:36.466 227766 DEBUG nova.scheduler.client.report [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:34:36 np0005593234 nova_compute[227762]: 2026-01-23 09:34:36.525 227766 DEBUG oslo_concurrency.lockutils [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.697s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:36.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:36 np0005593234 nova_compute[227762]: 2026-01-23 09:34:36.894 227766 INFO nova.scheduler.client.report [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Deleted allocations for instance f3277436-85d0-4674-aa69-d7a50448a5d0#033[00m
Jan 23 04:34:37 np0005593234 nova_compute[227762]: 2026-01-23 09:34:37.241 227766 DEBUG oslo_concurrency.lockutils [None req-670a82db-af0e-4877-a6cd-0c4ad67ca21f 7536fa2e625541fba613dc32a49a4c5b 11def90dfdc14cfe928302bec2835794 - - default default] Lock "f3277436-85d0-4674-aa69-d7a50448a5d0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:37 np0005593234 nova_compute[227762]: 2026-01-23 09:34:37.910 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:38.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e162 e162: 3 total, 3 up, 3 in
Jan 23 04:34:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:34:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:38.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:34:38 np0005593234 podman[239869]: 2026-01-23 09:34:38.760453704 +0000 UTC m=+0.054403887 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 04:34:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e162 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:39 np0005593234 nova_compute[227762]: 2026-01-23 09:34:39.278 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:40.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:34:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:40.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:34:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:42.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:42.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:34:42.811 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:34:42.812 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:34:42.812 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:42 np0005593234 nova_compute[227762]: 2026-01-23 09:34:42.912 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e163 e163: 3 total, 3 up, 3 in
Jan 23 04:34:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:44.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:44 np0005593234 nova_compute[227762]: 2026-01-23 09:34:44.280 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:34:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:44.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:34:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:46.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:46.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:47 np0005593234 nova_compute[227762]: 2026-01-23 09:34:47.914 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:48 np0005593234 nova_compute[227762]: 2026-01-23 09:34:48.104 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160873.1028671, f3277436-85d0-4674-aa69-d7a50448a5d0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:34:48 np0005593234 nova_compute[227762]: 2026-01-23 09:34:48.104 227766 INFO nova.compute.manager [-] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:34:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:48.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:48 np0005593234 nova_compute[227762]: 2026-01-23 09:34:48.187 227766 DEBUG nova.compute.manager [None req-d8e90848-ea60-4315-8726-b7b9cf3ad75a - - - - - -] [instance: f3277436-85d0-4674-aa69-d7a50448a5d0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:34:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:48.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:49 np0005593234 nova_compute[227762]: 2026-01-23 09:34:49.327 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:50.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:50.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:51 np0005593234 podman[239945]: 2026-01-23 09:34:51.786375213 +0000 UTC m=+0.080207200 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 04:34:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:52.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:52.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:52 np0005593234 nova_compute[227762]: 2026-01-23 09:34:52.916 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:34:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:54.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:34:54 np0005593234 nova_compute[227762]: 2026-01-23 09:34:54.329 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:34:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:54.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:34:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:56.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:56 np0005593234 nova_compute[227762]: 2026-01-23 09:34:56.341 227766 DEBUG oslo_concurrency.lockutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "0f44d94b-c404-4501-9396-3fb093b808a6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:56 np0005593234 nova_compute[227762]: 2026-01-23 09:34:56.341 227766 DEBUG oslo_concurrency.lockutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "0f44d94b-c404-4501-9396-3fb093b808a6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:56 np0005593234 nova_compute[227762]: 2026-01-23 09:34:56.371 227766 DEBUG nova.compute.manager [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:34:56 np0005593234 nova_compute[227762]: 2026-01-23 09:34:56.497 227766 DEBUG oslo_concurrency.lockutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:56 np0005593234 nova_compute[227762]: 2026-01-23 09:34:56.497 227766 DEBUG oslo_concurrency.lockutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:56 np0005593234 nova_compute[227762]: 2026-01-23 09:34:56.504 227766 DEBUG nova.virt.hardware [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:34:56 np0005593234 nova_compute[227762]: 2026-01-23 09:34:56.505 227766 INFO nova.compute.claims [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:34:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:56.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:56 np0005593234 nova_compute[227762]: 2026-01-23 09:34:56.729 227766 DEBUG oslo_concurrency.processutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:34:57 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3611179412' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:34:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e164 e164: 3 total, 3 up, 3 in
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.166 227766 DEBUG oslo_concurrency.processutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.173 227766 DEBUG nova.compute.provider_tree [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.220 227766 DEBUG nova.scheduler.client.report [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.260 227766 DEBUG oslo_concurrency.lockutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.762s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.261 227766 DEBUG nova.compute.manager [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.388 227766 DEBUG nova.compute.manager [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.388 227766 DEBUG nova.network.neutron [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.433 227766 INFO nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.460 227766 DEBUG nova.compute.manager [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.609 227766 DEBUG nova.compute.manager [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.610 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.610 227766 INFO nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Creating image(s)#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.635 227766 DEBUG nova.storage.rbd_utils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] rbd image 0f44d94b-c404-4501-9396-3fb093b808a6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.659 227766 DEBUG nova.storage.rbd_utils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] rbd image 0f44d94b-c404-4501-9396-3fb093b808a6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.686 227766 DEBUG nova.storage.rbd_utils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] rbd image 0f44d94b-c404-4501-9396-3fb093b808a6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.690 227766 DEBUG oslo_concurrency.processutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.749 227766 DEBUG oslo_concurrency.processutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.750 227766 DEBUG oslo_concurrency.lockutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.750 227766 DEBUG oslo_concurrency.lockutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.751 227766 DEBUG oslo_concurrency.lockutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.775 227766 DEBUG nova.storage.rbd_utils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] rbd image 0f44d94b-c404-4501-9396-3fb093b808a6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.778 227766 DEBUG oslo_concurrency.processutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0f44d94b-c404-4501-9396-3fb093b808a6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:57 np0005593234 nova_compute[227762]: 2026-01-23 09:34:57.918 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.052 227766 DEBUG oslo_concurrency.processutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0f44d94b-c404-4501-9396-3fb093b808a6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.274s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.114 227766 DEBUG nova.storage.rbd_utils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] resizing rbd image 0f44d94b-c404-4501-9396-3fb093b808a6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:34:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:34:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:34:58.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:34:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e165 e165: 3 total, 3 up, 3 in
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.221 227766 DEBUG nova.objects.instance [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lazy-loading 'migration_context' on Instance uuid 0f44d94b-c404-4501-9396-3fb093b808a6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.237 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.238 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Ensure instance console log exists: /var/lib/nova/instances/0f44d94b-c404-4501-9396-3fb093b808a6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.238 227766 DEBUG oslo_concurrency.lockutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.239 227766 DEBUG oslo_concurrency.lockutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.239 227766 DEBUG oslo_concurrency.lockutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.312 227766 DEBUG nova.network.neutron [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.312 227766 DEBUG nova.compute.manager [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.314 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.317 227766 WARNING nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.332 227766 DEBUG nova.virt.libvirt.host [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.332 227766 DEBUG nova.virt.libvirt.host [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.336 227766 DEBUG nova.virt.libvirt.host [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.337 227766 DEBUG nova.virt.libvirt.host [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.338 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.338 227766 DEBUG nova.virt.hardware [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.339 227766 DEBUG nova.virt.hardware [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.339 227766 DEBUG nova.virt.hardware [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.339 227766 DEBUG nova.virt.hardware [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.339 227766 DEBUG nova.virt.hardware [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.340 227766 DEBUG nova.virt.hardware [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.340 227766 DEBUG nova.virt.hardware [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.340 227766 DEBUG nova.virt.hardware [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.340 227766 DEBUG nova.virt.hardware [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.341 227766 DEBUG nova.virt.hardware [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.341 227766 DEBUG nova.virt.hardware [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.344 227766 DEBUG oslo_concurrency.processutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:34:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:34:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:34:58.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:34:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:34:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1453123447' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.779 227766 DEBUG oslo_concurrency.processutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.809 227766 DEBUG nova.storage.rbd_utils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] rbd image 0f44d94b-c404-4501-9396-3fb093b808a6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:34:58 np0005593234 nova_compute[227762]: 2026-01-23 09:34:58.813 227766 DEBUG oslo_concurrency.processutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:34:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:34:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e166 e166: 3 total, 3 up, 3 in
Jan 23 04:34:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:34:59 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2479491880' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:34:59 np0005593234 nova_compute[227762]: 2026-01-23 09:34:59.249 227766 DEBUG oslo_concurrency.processutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:34:59 np0005593234 nova_compute[227762]: 2026-01-23 09:34:59.251 227766 DEBUG nova.objects.instance [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lazy-loading 'pci_devices' on Instance uuid 0f44d94b-c404-4501-9396-3fb093b808a6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:34:59 np0005593234 nova_compute[227762]: 2026-01-23 09:34:59.279 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  <uuid>0f44d94b-c404-4501-9396-3fb093b808a6</uuid>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  <name>instance-00000015</name>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <nova:name>tempest-LiveMigrationNegativeTest-server-1205718794</nova:name>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:34:58</nova:creationTime>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <nova:user uuid="3e17ce3f8d5246daad6b3964a2b6df05">tempest-LiveMigrationNegativeTest-202193021-project-member</nova:user>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <nova:project uuid="ab9c85124a434b1390041a9ca5c05ddd">tempest-LiveMigrationNegativeTest-202193021</nova:project>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <entry name="serial">0f44d94b-c404-4501-9396-3fb093b808a6</entry>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <entry name="uuid">0f44d94b-c404-4501-9396-3fb093b808a6</entry>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/0f44d94b-c404-4501-9396-3fb093b808a6_disk">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/0f44d94b-c404-4501-9396-3fb093b808a6_disk.config">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/0f44d94b-c404-4501-9396-3fb093b808a6/console.log" append="off"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:34:59 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:34:59 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:34:59 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:34:59 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 04:34:59 np0005593234 nova_compute[227762]: 2026-01-23 09:34:59.331 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:34:59 np0005593234 nova_compute[227762]: 2026-01-23 09:34:59.352 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:34:59 np0005593234 nova_compute[227762]: 2026-01-23 09:34:59.353 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 04:34:59 np0005593234 nova_compute[227762]: 2026-01-23 09:34:59.353 227766 INFO nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Using config drive
Jan 23 04:34:59 np0005593234 nova_compute[227762]: 2026-01-23 09:34:59.384 227766 DEBUG nova.storage.rbd_utils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] rbd image 0f44d94b-c404-4501-9396-3fb093b808a6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:34:59 np0005593234 nova_compute[227762]: 2026-01-23 09:34:59.826 227766 INFO nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Creating config drive at /var/lib/nova/instances/0f44d94b-c404-4501-9396-3fb093b808a6/disk.config
Jan 23 04:34:59 np0005593234 nova_compute[227762]: 2026-01-23 09:34:59.831 227766 DEBUG oslo_concurrency.processutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0f44d94b-c404-4501-9396-3fb093b808a6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdby3wic5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:34:59 np0005593234 nova_compute[227762]: 2026-01-23 09:34:59.957 227766 DEBUG oslo_concurrency.processutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0f44d94b-c404-4501-9396-3fb093b808a6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdby3wic5" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:34:59 np0005593234 nova_compute[227762]: 2026-01-23 09:34:59.985 227766 DEBUG nova.storage.rbd_utils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] rbd image 0f44d94b-c404-4501-9396-3fb093b808a6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:34:59 np0005593234 nova_compute[227762]: 2026-01-23 09:34:59.988 227766 DEBUG oslo_concurrency.processutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0f44d94b-c404-4501-9396-3fb093b808a6/disk.config 0f44d94b-c404-4501-9396-3fb093b808a6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:35:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:00.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:00 np0005593234 nova_compute[227762]: 2026-01-23 09:35:00.417 227766 DEBUG oslo_concurrency.processutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0f44d94b-c404-4501-9396-3fb093b808a6/disk.config 0f44d94b-c404-4501-9396-3fb093b808a6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:35:00 np0005593234 nova_compute[227762]: 2026-01-23 09:35:00.418 227766 INFO nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Deleting local config drive /var/lib/nova/instances/0f44d94b-c404-4501-9396-3fb093b808a6/disk.config because it was imported into RBD.
Jan 23 04:35:00 np0005593234 systemd-machined[195626]: New machine qemu-11-instance-00000015.
Jan 23 04:35:00 np0005593234 systemd[1]: Started Virtual Machine qemu-11-instance-00000015.
Jan 23 04:35:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:35:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:00.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.056 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160901.0550914, 0f44d94b-c404-4501-9396-3fb093b808a6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.058 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] VM Resumed (Lifecycle Event)
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.061 227766 DEBUG nova.compute.manager [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.061 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.066 227766 INFO nova.virt.libvirt.driver [-] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Instance spawned successfully.
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.066 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.096 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.101 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.101 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.102 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.102 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.102 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.103 227766 DEBUG nova.virt.libvirt.driver [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.107 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.160 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.161 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160901.0570097, 0f44d94b-c404-4501-9396-3fb093b808a6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.161 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] VM Started (Lifecycle Event)
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.206 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.209 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.243 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.251 227766 INFO nova.compute.manager [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Took 3.64 seconds to spawn the instance on the hypervisor.
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.252 227766 DEBUG nova.compute.manager [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.403 227766 INFO nova.compute.manager [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Took 4.97 seconds to build instance.
Jan 23 04:35:01 np0005593234 nova_compute[227762]: 2026-01-23 09:35:01.422 227766 DEBUG oslo_concurrency.lockutils [None req-8f898bf0-16bd-42c9-a433-d7b46bc6775b 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "0f44d94b-c404-4501-9396-3fb093b808a6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:35:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:35:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:02.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:35:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:02.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:02 np0005593234 nova_compute[227762]: 2026-01-23 09:35:02.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:35:02 np0005593234 nova_compute[227762]: 2026-01-23 09:35:02.920 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:35:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e167 e167: 3 total, 3 up, 3 in
Jan 23 04:35:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:35:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:04.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.334 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:35:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:04.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.733 227766 DEBUG nova.objects.instance [None req-f940e641-12c8-4406-ac62-0d5d4ba6d80c 6029e83c4f47403aa9bf951949d7a0af 0d5606436cdc46a083fe5a1b1d0754a3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0f44d94b-c404-4501-9396-3fb093b808a6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.761 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160904.7615008, 0f44d94b-c404-4501-9396-3fb093b808a6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.762 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] VM Paused (Lifecycle Event)
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.778 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.778 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.779 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.779 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.779 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.805 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.809 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:35:04 np0005593234 nova_compute[227762]: 2026-01-23 09:35:04.834 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] During sync_power_state the instance has a pending task (suspending). Skip.
Jan 23 04:35:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:35:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2538507702' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:35:05 np0005593234 nova_compute[227762]: 2026-01-23 09:35:05.487 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.708s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:05 np0005593234 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000015.scope: Deactivated successfully.
Jan 23 04:35:05 np0005593234 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000015.scope: Consumed 4.429s CPU time.
Jan 23 04:35:05 np0005593234 systemd-machined[195626]: Machine qemu-11-instance-00000015 terminated.
Jan 23 04:35:05 np0005593234 nova_compute[227762]: 2026-01-23 09:35:05.670 227766 DEBUG nova.compute.manager [None req-f940e641-12c8-4406-ac62-0d5d4ba6d80c 6029e83c4f47403aa9bf951949d7a0af 0d5606436cdc46a083fe5a1b1d0754a3 - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:35:05 np0005593234 nova_compute[227762]: 2026-01-23 09:35:05.866 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000015 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:35:05 np0005593234 nova_compute[227762]: 2026-01-23 09:35:05.867 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000015 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:35:06 np0005593234 nova_compute[227762]: 2026-01-23 09:35:06.021 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:35:06 np0005593234 nova_compute[227762]: 2026-01-23 09:35:06.022 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4799MB free_disk=20.87649154663086GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:35:06 np0005593234 nova_compute[227762]: 2026-01-23 09:35:06.022 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:06 np0005593234 nova_compute[227762]: 2026-01-23 09:35:06.023 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:06.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:06 np0005593234 nova_compute[227762]: 2026-01-23 09:35:06.225 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 0f44d94b-c404-4501-9396-3fb093b808a6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:35:06 np0005593234 nova_compute[227762]: 2026-01-23 09:35:06.226 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:35:06 np0005593234 nova_compute[227762]: 2026-01-23 09:35:06.226 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:35:06 np0005593234 nova_compute[227762]: 2026-01-23 09:35:06.447 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:06.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:35:06 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3441687791' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:35:06 np0005593234 nova_compute[227762]: 2026-01-23 09:35:06.913 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:06 np0005593234 nova_compute[227762]: 2026-01-23 09:35:06.918 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:35:06 np0005593234 nova_compute[227762]: 2026-01-23 09:35:06.939 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:35:06 np0005593234 nova_compute[227762]: 2026-01-23 09:35:06.988 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:35:06 np0005593234 nova_compute[227762]: 2026-01-23 09:35:06.989 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.966s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:07 np0005593234 nova_compute[227762]: 2026-01-23 09:35:07.922 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:08.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:35:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:08.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:35:08 np0005593234 nova_compute[227762]: 2026-01-23 09:35:08.983 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.013 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.014 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.014 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:35:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.319 227766 DEBUG oslo_concurrency.lockutils [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "0f44d94b-c404-4501-9396-3fb093b808a6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.320 227766 DEBUG oslo_concurrency.lockutils [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "0f44d94b-c404-4501-9396-3fb093b808a6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.320 227766 DEBUG oslo_concurrency.lockutils [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "0f44d94b-c404-4501-9396-3fb093b808a6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.321 227766 DEBUG oslo_concurrency.lockutils [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "0f44d94b-c404-4501-9396-3fb093b808a6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.321 227766 DEBUG oslo_concurrency.lockutils [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "0f44d94b-c404-4501-9396-3fb093b808a6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.322 227766 INFO nova.compute.manager [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Terminating instance#033[00m
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.323 227766 DEBUG oslo_concurrency.lockutils [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "refresh_cache-0f44d94b-c404-4501-9396-3fb093b808a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.324 227766 DEBUG oslo_concurrency.lockutils [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquired lock "refresh_cache-0f44d94b-c404-4501-9396-3fb093b808a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.324 227766 DEBUG nova.network.neutron [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.329 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-0f44d94b-c404-4501-9396-3fb093b808a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.335 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:09 np0005593234 nova_compute[227762]: 2026-01-23 09:35:09.594 227766 DEBUG nova.network.neutron [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:35:09 np0005593234 podman[240398]: 2026-01-23 09:35:09.779495757 +0000 UTC m=+0.061192878 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.000 227766 DEBUG nova.network.neutron [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.021 227766 DEBUG oslo_concurrency.lockutils [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Releasing lock "refresh_cache-0f44d94b-c404-4501-9396-3fb093b808a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.022 227766 DEBUG nova.compute.manager [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.023 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-0f44d94b-c404-4501-9396-3fb093b808a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.023 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.023 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0f44d94b-c404-4501-9396-3fb093b808a6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.029 227766 INFO nova.virt.libvirt.driver [-] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Instance destroyed successfully.#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.030 227766 DEBUG nova.objects.instance [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lazy-loading 'resources' on Instance uuid 0f44d94b-c404-4501-9396-3fb093b808a6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:35:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:10.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.234 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.508 227766 INFO nova.virt.libvirt.driver [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Deleting instance files /var/lib/nova/instances/0f44d94b-c404-4501-9396-3fb093b808a6_del#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.509 227766 INFO nova.virt.libvirt.driver [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Deletion of /var/lib/nova/instances/0f44d94b-c404-4501-9396-3fb093b808a6_del complete#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.577 227766 INFO nova.compute.manager [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Took 0.55 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.577 227766 DEBUG oslo.service.loopingcall [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.578 227766 DEBUG nova.compute.manager [-] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.578 227766 DEBUG nova.network.neutron [-] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:35:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:10.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.833 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.854 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-0f44d94b-c404-4501-9396-3fb093b808a6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.855 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.856 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.856 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.860 227766 DEBUG nova.network.neutron [-] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.875 227766 DEBUG nova.network.neutron [-] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.899 227766 INFO nova.compute.manager [-] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Took 0.32 seconds to deallocate network for instance.#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.951 227766 DEBUG oslo_concurrency.lockutils [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:10 np0005593234 nova_compute[227762]: 2026-01-23 09:35:10.952 227766 DEBUG oslo_concurrency.lockutils [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:11 np0005593234 nova_compute[227762]: 2026-01-23 09:35:11.017 227766 DEBUG oslo_concurrency.processutils [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:35:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:35:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3870429015' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:35:11 np0005593234 nova_compute[227762]: 2026-01-23 09:35:11.473 227766 DEBUG oslo_concurrency.processutils [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:35:11 np0005593234 nova_compute[227762]: 2026-01-23 09:35:11.479 227766 DEBUG nova.compute.provider_tree [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:35:11 np0005593234 nova_compute[227762]: 2026-01-23 09:35:11.508 227766 DEBUG nova.scheduler.client.report [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:35:11 np0005593234 nova_compute[227762]: 2026-01-23 09:35:11.538 227766 DEBUG oslo_concurrency.lockutils [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:11 np0005593234 nova_compute[227762]: 2026-01-23 09:35:11.562 227766 INFO nova.scheduler.client.report [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Deleted allocations for instance 0f44d94b-c404-4501-9396-3fb093b808a6#033[00m
Jan 23 04:35:11 np0005593234 nova_compute[227762]: 2026-01-23 09:35:11.635 227766 DEBUG oslo_concurrency.lockutils [None req-f5fcf9fa-b797-4663-9670-f328e9c10e6c 3e17ce3f8d5246daad6b3964a2b6df05 ab9c85124a434b1390041a9ca5c05ddd - - default default] Lock "0f44d94b-c404-4501-9396-3fb093b808a6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:11 np0005593234 nova_compute[227762]: 2026-01-23 09:35:11.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:12.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:35:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:12.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:35:12 np0005593234 nova_compute[227762]: 2026-01-23 09:35:12.925 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:13 np0005593234 nova_compute[227762]: 2026-01-23 09:35:13.737 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:35:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:14.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:14 np0005593234 nova_compute[227762]: 2026-01-23 09:35:14.338 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:35:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:14.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:35:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:16.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:16.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:17 np0005593234 nova_compute[227762]: 2026-01-23 09:35:17.928 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:18.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:18.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:19 np0005593234 nova_compute[227762]: 2026-01-23 09:35:19.361 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e168 e168: 3 total, 3 up, 3 in
Jan 23 04:35:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:20.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:35:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:20.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:35:20 np0005593234 nova_compute[227762]: 2026-01-23 09:35:20.672 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160905.6711218, 0f44d94b-c404-4501-9396-3fb093b808a6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:35:20 np0005593234 nova_compute[227762]: 2026-01-23 09:35:20.673 227766 INFO nova.compute.manager [-] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:35:20 np0005593234 nova_compute[227762]: 2026-01-23 09:35:20.696 227766 DEBUG nova.compute.manager [None req-14f6e27d-970f-4e84-a482-e84988ecf4e9 - - - - - -] [instance: 0f44d94b-c404-4501-9396-3fb093b808a6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:35:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:22.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:35:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:22.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:35:22 np0005593234 podman[240514]: 2026-01-23 09:35:22.791691898 +0000 UTC m=+0.088762187 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Jan 23 04:35:22 np0005593234 nova_compute[227762]: 2026-01-23 09:35:22.929 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 e169: 3 total, 3 up, 3 in
Jan 23 04:35:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:24.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:24 np0005593234 nova_compute[227762]: 2026-01-23 09:35:24.363 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:24.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:35:24.947 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:35:24 np0005593234 nova_compute[227762]: 2026-01-23 09:35:24.948 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:35:24.948 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:35:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:35:25.950 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:35:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:26.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:35:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:26.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:35:27 np0005593234 nova_compute[227762]: 2026-01-23 09:35:27.930 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:35:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:28.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:35:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:28.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:29 np0005593234 nova_compute[227762]: 2026-01-23 09:35:29.366 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:30.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:30.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:32.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:32.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:32 np0005593234 nova_compute[227762]: 2026-01-23 09:35:32.932 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:34.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:34 np0005593234 nova_compute[227762]: 2026-01-23 09:35:34.369 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:34.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:36.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:36 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:35:36 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:35:36 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:35:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:36.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:37 np0005593234 nova_compute[227762]: 2026-01-23 09:35:37.934 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:35:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:38.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:35:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:38.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:39 np0005593234 nova_compute[227762]: 2026-01-23 09:35:39.371 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:40.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:35:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:40.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:35:40 np0005593234 podman[240731]: 2026-01-23 09:35:40.784551951 +0000 UTC m=+0.082682209 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:35:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:42.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:42.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:35:42.812 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:35:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:35:42.813 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:35:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:35:42.813 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:35:42 np0005593234 nova_compute[227762]: 2026-01-23 09:35:42.936 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:35:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:35:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:44.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:35:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3701273073' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:35:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:35:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3701273073' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:35:44 np0005593234 nova_compute[227762]: 2026-01-23 09:35:44.373 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:44.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:35:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:46.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:35:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:35:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:46.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:35:47 np0005593234 nova_compute[227762]: 2026-01-23 09:35:47.938 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:48.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:48.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:49 np0005593234 nova_compute[227762]: 2026-01-23 09:35:49.374 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:50.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:50.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:52.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:52.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:52 np0005593234 nova_compute[227762]: 2026-01-23 09:35:52.940 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:53 np0005593234 podman[240857]: 2026-01-23 09:35:53.807429514 +0000 UTC m=+0.084980279 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 04:35:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:54.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:54 np0005593234 nova_compute[227762]: 2026-01-23 09:35:54.376 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:54.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:56.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:56.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:57 np0005593234 nova_compute[227762]: 2026-01-23 09:35:57.947 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:35:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:35:58.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:35:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:35:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:35:58.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:35:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:35:59 np0005593234 nova_compute[227762]: 2026-01-23 09:35:59.379 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:36:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:00.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:36:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:36:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:00.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:36:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:36:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:02.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:36:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:36:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:02.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:36:02 np0005593234 nova_compute[227762]: 2026-01-23 09:36:02.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:02 np0005593234 nova_compute[227762]: 2026-01-23 09:36:02.949 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:04.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:04 np0005593234 nova_compute[227762]: 2026-01-23 09:36:04.380 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:36:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:04.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:36:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:06.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:06.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:06 np0005593234 nova_compute[227762]: 2026-01-23 09:36:06.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:06 np0005593234 nova_compute[227762]: 2026-01-23 09:36:06.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:06 np0005593234 nova_compute[227762]: 2026-01-23 09:36:06.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:36:06 np0005593234 nova_compute[227762]: 2026-01-23 09:36:06.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:07 np0005593234 nova_compute[227762]: 2026-01-23 09:36:07.311 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:07 np0005593234 nova_compute[227762]: 2026-01-23 09:36:07.311 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:07 np0005593234 nova_compute[227762]: 2026-01-23 09:36:07.312 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:07 np0005593234 nova_compute[227762]: 2026-01-23 09:36:07.312 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:36:07 np0005593234 nova_compute[227762]: 2026-01-23 09:36:07.312 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:36:07 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/126814809' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:36:07 np0005593234 nova_compute[227762]: 2026-01-23 09:36:07.736 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:07 np0005593234 nova_compute[227762]: 2026-01-23 09:36:07.877 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:36:07 np0005593234 nova_compute[227762]: 2026-01-23 09:36:07.878 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4817MB free_disk=20.94662857055664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:36:07 np0005593234 nova_compute[227762]: 2026-01-23 09:36:07.879 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:07 np0005593234 nova_compute[227762]: 2026-01-23 09:36:07.879 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:07 np0005593234 nova_compute[227762]: 2026-01-23 09:36:07.975 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:36:07 np0005593234 nova_compute[227762]: 2026-01-23 09:36:07.976 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:36:07 np0005593234 nova_compute[227762]: 2026-01-23 09:36:07.994 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:08 np0005593234 nova_compute[227762]: 2026-01-23 09:36:08.007 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:08.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:36:08 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1494788499' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:36:08 np0005593234 nova_compute[227762]: 2026-01-23 09:36:08.435 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:08 np0005593234 nova_compute[227762]: 2026-01-23 09:36:08.442 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:36:08 np0005593234 nova_compute[227762]: 2026-01-23 09:36:08.459 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:36:08 np0005593234 nova_compute[227762]: 2026-01-23 09:36:08.494 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:36:08 np0005593234 nova_compute[227762]: 2026-01-23 09:36:08.494 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.615s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:08.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.284 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquiring lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.285 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.305 227766 DEBUG nova.compute.manager [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.382 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.385 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.386 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.391 227766 DEBUG nova.virt.hardware [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.392 227766 INFO nova.compute.claims [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.494 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.494 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.494 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.514 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.515 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.515 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:09 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.552 227766 DEBUG oslo_concurrency.processutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:36:09 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3726516418' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:09.999 227766 DEBUG oslo_concurrency.processutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
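The `ceph df --format=json` probe above is how nova's RBD image backend reads pool capacity. A minimal sketch of consuming that JSON, using a hypothetical trimmed sample (the real command returns a much larger document; only the `pools`/`stats` shape assumed here is used):

```python
import json

# Hypothetical sample shaped like `ceph df --format=json` output; values
# are invented, and only the fields read below are included.
SAMPLE = json.dumps({
    "stats": {"total_bytes": 64424509440, "total_avail_bytes": 42949672960},
    "pools": [
        {"name": "vms",
         "stats": {"bytes_used": 1073741824, "max_avail": 21474836480}},
    ],
})

def pool_usage(ceph_df_json: str) -> dict:
    """Map pool name -> (bytes_used, max_avail) from ceph df JSON."""
    doc = json.loads(ceph_df_json)
    return {p["name"]: (p["stats"]["bytes_used"], p["stats"]["max_avail"])
            for p in doc["pools"]}

usage = pool_usage(SAMPLE)
print(usage["vms"])  # (1073741824, 21474836480)
```

In the log, the round trip through the monitor took 0.446s, which is why nova shells out to the CLI asynchronously inside its resource-tracker update rather than blocking the build path.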
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.006 227766 DEBUG nova.compute.provider_tree [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.031 227766 DEBUG nova.scheduler.client.report [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
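The inventory dict logged above is what the compute node reports to Placement; schedulable capacity per resource class is derived as `(total - reserved) * allocation_ratio`. A sketch recomputing that from the logged values (not nova's actual code path):

```python
# Recompute effective capacity from the Placement inventory record above.
# VCPU: (8 - 0) * 4.0 = 32; MEMORY_MB: (7679 - 512) * 1.0 = 7167;
# DISK_GB: (20 - 1) * 0.9 = 17.1, truncated to 17.
def capacity(inv: dict) -> dict:
    return {rc: int((v["total"] - v["reserved"]) * v["allocation_ratio"])
            for rc, v in inv.items()}

inventory = {
    "VCPU": {"total": 8, "reserved": 0, "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB": {"total": 20, "reserved": 1, "allocation_ratio": 0.9},
}
caps = capacity(inventory)
print(caps)  # {'VCPU': 32, 'MEMORY_MB': 7167, 'DISK_GB': 17}
```

The "Inventory has not changed" message means this report matched what Placement already holds, so no PUT was issued.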
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.072 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.686s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.073 227766 DEBUG nova.compute.manager [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.133 227766 DEBUG nova.compute.manager [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.134 227766 DEBUG nova.network.neutron [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.163 227766 INFO nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.190 227766 DEBUG nova.compute.manager [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:36:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:10.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.345 227766 DEBUG nova.compute.manager [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.347 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.347 227766 INFO nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Creating image(s)#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.373 227766 DEBUG nova.storage.rbd_utils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] rbd image 5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.401 227766 DEBUG nova.storage.rbd_utils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] rbd image 5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.429 227766 DEBUG nova.storage.rbd_utils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] rbd image 5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.433 227766 DEBUG oslo_concurrency.processutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.501 227766 DEBUG oslo_concurrency.processutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.502 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.503 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.503 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.527 227766 DEBUG nova.storage.rbd_utils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] rbd image 5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.530 227766 DEBUG oslo_concurrency.processutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:10.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:10 np0005593234 nova_compute[227762]: 2026-01-23 09:36:10.928 227766 DEBUG oslo_concurrency.processutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.398s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:11 np0005593234 nova_compute[227762]: 2026-01-23 09:36:11.005 227766 DEBUG nova.storage.rbd_utils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] resizing rbd image 5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
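The resize target logged above, 1073741824 bytes, is the flavor's 1 GiB root disk converted to bytes: after `rbd import` copies the 21 MB qcow2-derived base image in, nova grows the RBD image up to the flavor size. The arithmetic, as a trivial sketch:

```python
# Flavor root_gb (GiB) to the byte count passed to the RBD resize call.
def root_gb_to_bytes(root_gb: int) -> int:
    return root_gb * 1024 ** 3

print(root_gb_to_bytes(1))  # 1073741824
```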
Jan 23 04:36:11 np0005593234 nova_compute[227762]: 2026-01-23 09:36:11.120 227766 DEBUG nova.objects.instance [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lazy-loading 'migration_context' on Instance uuid 5cea9bfc-e97a-4d07-a251-8ca3978b5f98 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:36:11 np0005593234 nova_compute[227762]: 2026-01-23 09:36:11.174 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:36:11 np0005593234 nova_compute[227762]: 2026-01-23 09:36:11.175 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Ensure instance console log exists: /var/lib/nova/instances/5cea9bfc-e97a-4d07-a251-8ca3978b5f98/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:36:11 np0005593234 nova_compute[227762]: 2026-01-23 09:36:11.175 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:11 np0005593234 nova_compute[227762]: 2026-01-23 09:36:11.175 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:11 np0005593234 nova_compute[227762]: 2026-01-23 09:36:11.176 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:11 np0005593234 podman[241151]: 2026-01-23 09:36:11.25368438 +0000 UTC m=+0.052223519 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 04:36:11 np0005593234 nova_compute[227762]: 2026-01-23 09:36:11.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:11 np0005593234 nova_compute[227762]: 2026-01-23 09:36:11.793 227766 DEBUG nova.policy [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f72965e950c4761bfedd99fdc411a83', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd0dce6e339c349d4ab97cee5e49fff3a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:36:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:12.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:12.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:12 np0005593234 nova_compute[227762]: 2026-01-23 09:36:12.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:12 np0005593234 nova_compute[227762]: 2026-01-23 09:36:12.995 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:14.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:14 np0005593234 nova_compute[227762]: 2026-01-23 09:36:14.257 227766 DEBUG nova.network.neutron [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Successfully updated port: a19a3bde-2463-4f15-afe7-f8df8c608bb7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:36:14 np0005593234 nova_compute[227762]: 2026-01-23 09:36:14.286 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquiring lock "refresh_cache-5cea9bfc-e97a-4d07-a251-8ca3978b5f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:36:14 np0005593234 nova_compute[227762]: 2026-01-23 09:36:14.286 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquired lock "refresh_cache-5cea9bfc-e97a-4d07-a251-8ca3978b5f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:36:14 np0005593234 nova_compute[227762]: 2026-01-23 09:36:14.287 227766 DEBUG nova.network.neutron [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:36:14 np0005593234 nova_compute[227762]: 2026-01-23 09:36:14.385 227766 DEBUG nova.compute.manager [req-12e7c2fd-568d-4fb5-ac12-e92a32d9edd2 req-bfc2030f-f01b-4d01-baf8-8d31b7194342 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-changed-a19a3bde-2463-4f15-afe7-f8df8c608bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:14 np0005593234 nova_compute[227762]: 2026-01-23 09:36:14.386 227766 DEBUG nova.compute.manager [req-12e7c2fd-568d-4fb5-ac12-e92a32d9edd2 req-bfc2030f-f01b-4d01-baf8-8d31b7194342 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Refreshing instance network info cache due to event network-changed-a19a3bde-2463-4f15-afe7-f8df8c608bb7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:36:14 np0005593234 nova_compute[227762]: 2026-01-23 09:36:14.387 227766 DEBUG oslo_concurrency.lockutils [req-12e7c2fd-568d-4fb5-ac12-e92a32d9edd2 req-bfc2030f-f01b-4d01-baf8-8d31b7194342 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-5cea9bfc-e97a-4d07-a251-8ca3978b5f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:36:14 np0005593234 nova_compute[227762]: 2026-01-23 09:36:14.387 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:14.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:14 np0005593234 nova_compute[227762]: 2026-01-23 09:36:14.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:36:15 np0005593234 nova_compute[227762]: 2026-01-23 09:36:15.041 227766 DEBUG nova.network.neutron [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:36:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:16.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:16.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.139 227766 DEBUG nova.network.neutron [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Updating instance_info_cache with network_info: [{"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "address": "fa:16:3e:7b:ec:dc", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa19a3bde-24", "ovs_interfaceid": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.195 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Releasing lock "refresh_cache-5cea9bfc-e97a-4d07-a251-8ca3978b5f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.195 227766 DEBUG nova.compute.manager [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Instance network_info: |[{"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "address": "fa:16:3e:7b:ec:dc", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa19a3bde-24", "ovs_interfaceid": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
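The `network_info` record above is what the libvirt driver consumes next in `_get_guest_xml` to build the guest's interface definition. A sketch that pulls the relevant fields (port id, MAC, fixed IPs, MTU) out of a trimmed copy of that record — an illustrative helper, not nova's own `NetworkInfo` model:

```python
import json

# Trimmed copy of the network_info logged above; only fields read below.
NETWORK_INFO = json.loads('''[{"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7",
  "address": "fa:16:3e:7b:ec:dc",
  "network": {"subnets": [{"cidr": "10.100.0.0/28",
      "ips": [{"address": "10.100.0.6", "type": "fixed"}]}],
    "meta": {"mtu": 1442}}}]''')

def summarize(vifs):
    """Extract per-VIF fields a hypervisor driver would need."""
    out = []
    for vif in vifs:
        ips = [ip["address"]
               for subnet in vif["network"]["subnets"]
               for ip in subnet["ips"] if ip["type"] == "fixed"]
        out.append({"port": vif["id"], "mac": vif["address"],
                    "ips": ips, "mtu": vif["network"]["meta"]["mtu"]})
    return out

summary = summarize(NETWORK_INFO)
print(summary[0]["ips"])  # ['10.100.0.6']
```

Note `"active": false` in the logged record: the OVN port is bound but not yet live; nova will wait for the `network-vif-plugged` event before declaring the instance ACTIVE.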
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.196 227766 DEBUG oslo_concurrency.lockutils [req-12e7c2fd-568d-4fb5-ac12-e92a32d9edd2 req-bfc2030f-f01b-4d01-baf8-8d31b7194342 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-5cea9bfc-e97a-4d07-a251-8ca3978b5f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.196 227766 DEBUG nova.network.neutron [req-12e7c2fd-568d-4fb5-ac12-e92a32d9edd2 req-bfc2030f-f01b-4d01-baf8-8d31b7194342 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Refreshing network info cache for port a19a3bde-2463-4f15-afe7-f8df8c608bb7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.199 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Start _get_guest_xml network_info=[{"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "address": "fa:16:3e:7b:ec:dc", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa19a3bde-24", "ovs_interfaceid": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.204 227766 WARNING nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.212 227766 DEBUG nova.virt.libvirt.host [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.214 227766 DEBUG nova.virt.libvirt.host [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.220 227766 DEBUG nova.virt.libvirt.host [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.221 227766 DEBUG nova.virt.libvirt.host [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.223 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.224 227766 DEBUG nova.virt.hardware [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.224 227766 DEBUG nova.virt.hardware [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.225 227766 DEBUG nova.virt.hardware [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.225 227766 DEBUG nova.virt.hardware [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.226 227766 DEBUG nova.virt.hardware [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.226 227766 DEBUG nova.virt.hardware [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.227 227766 DEBUG nova.virt.hardware [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.227 227766 DEBUG nova.virt.hardware [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.228 227766 DEBUG nova.virt.hardware [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.228 227766 DEBUG nova.virt.hardware [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.228 227766 DEBUG nova.virt.hardware [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.234 227766 DEBUG oslo_concurrency.processutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:36:17 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/463106357' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.689 227766 DEBUG oslo_concurrency.processutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.715 227766 DEBUG nova.storage.rbd_utils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] rbd image 5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.720 227766 DEBUG oslo_concurrency.processutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:17 np0005593234 nova_compute[227762]: 2026-01-23 09:36:17.996 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:36:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3409966921' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.165 227766 DEBUG oslo_concurrency.processutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.167 227766 DEBUG nova.virt.libvirt.vif [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:36:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1674522276',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1674522276',id=26,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d0dce6e339c349d4ab97cee5e49fff3a',ramdisk_id='',reservation_id='r-lql1hhru',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1207260646',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-120726
0646-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:36:10Z,user_data=None,user_id='4f72965e950c4761bfedd99fdc411a83',uuid=5cea9bfc-e97a-4d07-a251-8ca3978b5f98,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "address": "fa:16:3e:7b:ec:dc", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa19a3bde-24", "ovs_interfaceid": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.168 227766 DEBUG nova.network.os_vif_util [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Converting VIF {"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "address": "fa:16:3e:7b:ec:dc", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa19a3bde-24", "ovs_interfaceid": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.169 227766 DEBUG nova.network.os_vif_util [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:ec:dc,bridge_name='br-int',has_traffic_filtering=True,id=a19a3bde-2463-4f15-afe7-f8df8c608bb7,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa19a3bde-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.170 227766 DEBUG nova.objects.instance [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lazy-loading 'pci_devices' on Instance uuid 5cea9bfc-e97a-4d07-a251-8ca3978b5f98 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:36:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:18.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.337 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  <uuid>5cea9bfc-e97a-4d07-a251-8ca3978b5f98</uuid>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  <name>instance-0000001a</name>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-1674522276</nova:name>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:36:17</nova:creationTime>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <nova:user uuid="4f72965e950c4761bfedd99fdc411a83">tempest-LiveAutoBlockMigrationV225Test-1207260646-project-member</nova:user>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <nova:project uuid="d0dce6e339c349d4ab97cee5e49fff3a">tempest-LiveAutoBlockMigrationV225Test-1207260646</nova:project>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <nova:port uuid="a19a3bde-2463-4f15-afe7-f8df8c608bb7">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <entry name="serial">5cea9bfc-e97a-4d07-a251-8ca3978b5f98</entry>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <entry name="uuid">5cea9bfc-e97a-4d07-a251-8ca3978b5f98</entry>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk.config">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:7b:ec:dc"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <target dev="tapa19a3bde-24"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/5cea9bfc-e97a-4d07-a251-8ca3978b5f98/console.log" append="off"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:36:18 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:36:18 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:36:18 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:36:18 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.338 227766 DEBUG nova.compute.manager [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Preparing to wait for external event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.339 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquiring lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.339 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.339 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.340 227766 DEBUG nova.virt.libvirt.vif [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:36:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1674522276',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1674522276',id=26,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d0dce6e339c349d4ab97cee5e49fff3a',ramdisk_id='',reservation_id='r-lql1hhru',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1207260646',owner_user_name='tempest-LiveAutoBlockMigrationV225T
est-1207260646-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:36:10Z,user_data=None,user_id='4f72965e950c4761bfedd99fdc411a83',uuid=5cea9bfc-e97a-4d07-a251-8ca3978b5f98,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "address": "fa:16:3e:7b:ec:dc", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa19a3bde-24", "ovs_interfaceid": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.340 227766 DEBUG nova.network.os_vif_util [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Converting VIF {"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "address": "fa:16:3e:7b:ec:dc", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa19a3bde-24", "ovs_interfaceid": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.341 227766 DEBUG nova.network.os_vif_util [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:ec:dc,bridge_name='br-int',has_traffic_filtering=True,id=a19a3bde-2463-4f15-afe7-f8df8c608bb7,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa19a3bde-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.341 227766 DEBUG os_vif [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:ec:dc,bridge_name='br-int',has_traffic_filtering=True,id=a19a3bde-2463-4f15-afe7-f8df8c608bb7,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa19a3bde-24') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.342 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.342 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.343 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.347 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.347 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa19a3bde-24, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.347 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa19a3bde-24, col_values=(('external_ids', {'iface-id': 'a19a3bde-2463-4f15-afe7-f8df8c608bb7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:ec:dc', 'vm-uuid': '5cea9bfc-e97a-4d07-a251-8ca3978b5f98'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.349 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:18 np0005593234 NetworkManager[48942]: <info>  [1769160978.3504] manager: (tapa19a3bde-24): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.351 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.357 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.358 227766 INFO os_vif [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:ec:dc,bridge_name='br-int',has_traffic_filtering=True,id=a19a3bde-2463-4f15-afe7-f8df8c608bb7,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa19a3bde-24')#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.501 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.502 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.502 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] No VIF found with MAC fa:16:3e:7b:ec:dc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.503 227766 INFO nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Using config drive#033[00m
Jan 23 04:36:18 np0005593234 nova_compute[227762]: 2026-01-23 09:36:18.532 227766 DEBUG nova.storage.rbd_utils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] rbd image 5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:18.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:19 np0005593234 nova_compute[227762]: 2026-01-23 09:36:19.312 227766 INFO nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Creating config drive at /var/lib/nova/instances/5cea9bfc-e97a-4d07-a251-8ca3978b5f98/disk.config#033[00m
Jan 23 04:36:19 np0005593234 nova_compute[227762]: 2026-01-23 09:36:19.317 227766 DEBUG oslo_concurrency.processutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/5cea9bfc-e97a-4d07-a251-8ca3978b5f98/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwhxenekw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:19 np0005593234 nova_compute[227762]: 2026-01-23 09:36:19.450 227766 DEBUG oslo_concurrency.processutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/5cea9bfc-e97a-4d07-a251-8ca3978b5f98/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwhxenekw" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:19 np0005593234 nova_compute[227762]: 2026-01-23 09:36:19.478 227766 DEBUG nova.storage.rbd_utils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] rbd image 5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:36:19 np0005593234 nova_compute[227762]: 2026-01-23 09:36:19.483 227766 DEBUG oslo_concurrency.processutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/5cea9bfc-e97a-4d07-a251-8ca3978b5f98/disk.config 5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:19 np0005593234 nova_compute[227762]: 2026-01-23 09:36:19.628 227766 DEBUG oslo_concurrency.processutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/5cea9bfc-e97a-4d07-a251-8ca3978b5f98/disk.config 5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.145s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:19 np0005593234 nova_compute[227762]: 2026-01-23 09:36:19.630 227766 INFO nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Deleting local config drive /var/lib/nova/instances/5cea9bfc-e97a-4d07-a251-8ca3978b5f98/disk.config because it was imported into RBD.#033[00m
Jan 23 04:36:19 np0005593234 kernel: tapa19a3bde-24: entered promiscuous mode
Jan 23 04:36:19 np0005593234 NetworkManager[48942]: <info>  [1769160979.6943] manager: (tapa19a3bde-24): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Jan 23 04:36:19 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:19Z|00046|binding|INFO|Claiming lport a19a3bde-2463-4f15-afe7-f8df8c608bb7 for this chassis.
Jan 23 04:36:19 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:19Z|00047|binding|INFO|a19a3bde-2463-4f15-afe7-f8df8c608bb7: Claiming fa:16:3e:7b:ec:dc 10.100.0.6
Jan 23 04:36:19 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:19Z|00048|binding|INFO|Claiming lport 9852d6c7-7b56-465e-865f-eb8c24e61417 for this chassis.
Jan 23 04:36:19 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:19Z|00049|binding|INFO|9852d6c7-7b56-465e-865f-eb8c24e61417: Claiming fa:16:3e:9e:36:a6 19.80.0.21
Jan 23 04:36:19 np0005593234 nova_compute[227762]: 2026-01-23 09:36:19.696 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:19 np0005593234 nova_compute[227762]: 2026-01-23 09:36:19.699 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:19 np0005593234 nova_compute[227762]: 2026-01-23 09:36:19.703 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.712 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:36:a6 19.80.0.21'], port_security=['fa:16:3e:9e:36:a6 19.80.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['a19a3bde-2463-4f15-afe7-f8df8c608bb7'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1709862236', 'neutron:cidrs': '19.80.0.21/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-914246be-3a6e-47b3-afc0-463db5fa1dae', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1709862236', 'neutron:project_id': 'd0dce6e339c349d4ab97cee5e49fff3a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0179c400-b2f2-4914-b563-942a61ef1858', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=6bd4ce00-6348-4d9c-ba3b-d576a6d3e856, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9852d6c7-7b56-465e-865f-eb8c24e61417) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.713 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:ec:dc 10.100.0.6'], port_security=['fa:16:3e:7b:ec:dc 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-178752437', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5cea9bfc-e97a-4d07-a251-8ca3978b5f98', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-178752437', 'neutron:project_id': 'd0dce6e339c349d4ab97cee5e49fff3a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0179c400-b2f2-4914-b563-942a61ef1858', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cbb60528-b878-42fd-9c2f-0a3345010b1a, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=a19a3bde-2463-4f15-afe7-f8df8c608bb7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.714 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 9852d6c7-7b56-465e-865f-eb8c24e61417 in datapath 914246be-3a6e-47b3-afc0-463db5fa1dae bound to our chassis#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.716 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 914246be-3a6e-47b3-afc0-463db5fa1dae#033[00m
Jan 23 04:36:19 np0005593234 systemd-udevd[241339]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:36:19 np0005593234 systemd-machined[195626]: New machine qemu-12-instance-0000001a.
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.728 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a2207479-a5d3-4f15-9bd2-fc8cffbe33b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.730 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap914246be-31 in ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.732 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap914246be-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.732 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4ecec55b-73bb-4f1b-a837-8281d3ab4bb9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.733 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1579c96a-c626-407d-b2f3-3002356a37f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 NetworkManager[48942]: <info>  [1769160979.7423] device (tapa19a3bde-24): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:36:19 np0005593234 NetworkManager[48942]: <info>  [1769160979.7432] device (tapa19a3bde-24): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.747 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[7ac4f09e-56ac-42e5-a9a0-b638a2b46b39]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.771 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[22c6a3a2-ee6a-4957-b6bf-eb1469091904]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 systemd[1]: Started Virtual Machine qemu-12-instance-0000001a.
Jan 23 04:36:19 np0005593234 nova_compute[227762]: 2026-01-23 09:36:19.799 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.798 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d294ef05-1fda-4067-ba9c-edfb7e5e1dcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 NetworkManager[48942]: <info>  [1769160979.8051] manager: (tap914246be-30): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.804 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d548d4e9-327c-4cb2-830d-812a7f89a92a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:19Z|00050|binding|INFO|Setting lport a19a3bde-2463-4f15-afe7-f8df8c608bb7 ovn-installed in OVS
Jan 23 04:36:19 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:19Z|00051|binding|INFO|Setting lport a19a3bde-2463-4f15-afe7-f8df8c608bb7 up in Southbound
Jan 23 04:36:19 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:19Z|00052|binding|INFO|Setting lport 9852d6c7-7b56-465e-865f-eb8c24e61417 up in Southbound
Jan 23 04:36:19 np0005593234 nova_compute[227762]: 2026-01-23 09:36:19.809 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.836 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9c9f1986-40a7-46eb-a8fc-28808297cf91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.839 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[be3006fa-9ac7-4f7b-9936-6bf06fde3203]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 NetworkManager[48942]: <info>  [1769160979.8596] device (tap914246be-30): carrier: link connected
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.865 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4776099f-5c5f-4752-8000-5d73e94aeb8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.881 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[042a0087-90fa-45cd-aa50-e4850d4306be]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap914246be-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:ab:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487481, 'reachable_time': 42045, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241371, 'error': None, 'target': 'ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.896 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e7d3a133-4c9f-4f93-9291-cdfabc69a2f8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:abf6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487481, 'tstamp': 487481}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241372, 'error': None, 'target': 'ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.912 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f308700a-d725-4bfc-98e7-80565d390151]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap914246be-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:ab:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487481, 'reachable_time': 42045, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241373, 'error': None, 'target': 'ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.943 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[05d0f476-d796-4e5e-9597-ee17e2809138]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.993 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1afd00d3-4ace-41be-a2d0-62482fe67920]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.995 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap914246be-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.995 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:36:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:19.996 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap914246be-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:19 np0005593234 nova_compute[227762]: 2026-01-23 09:36:19.997 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:19 np0005593234 kernel: tap914246be-30: entered promiscuous mode
Jan 23 04:36:19 np0005593234 NetworkManager[48942]: <info>  [1769160979.9984] manager: (tap914246be-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.000 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap914246be-30, col_values=(('external_ids', {'iface-id': '3fd7a4e0-3e1c-454c-a93e-1fa905fbcde2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:20 np0005593234 nova_compute[227762]: 2026-01-23 09:36:20.001 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:20 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:20Z|00053|binding|INFO|Releasing lport 3fd7a4e0-3e1c-454c-a93e-1fa905fbcde2 from this chassis (sb_readonly=0)
Jan 23 04:36:20 np0005593234 nova_compute[227762]: 2026-01-23 09:36:20.017 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.019 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/914246be-3a6e-47b3-afc0-463db5fa1dae.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/914246be-3a6e-47b3-afc0-463db5fa1dae.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.020 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c2cf58-588e-4df7-ba88-b5d95f24eaf5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.021 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-914246be-3a6e-47b3-afc0-463db5fa1dae
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/914246be-3a6e-47b3-afc0-463db5fa1dae.pid.haproxy
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 914246be-3a6e-47b3-afc0-463db5fa1dae
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.022 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae', 'env', 'PROCESS_TAG=haproxy-914246be-3a6e-47b3-afc0-463db5fa1dae', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/914246be-3a6e-47b3-afc0-463db5fa1dae.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:36:20 np0005593234 nova_compute[227762]: 2026-01-23 09:36:20.057 227766 DEBUG nova.network.neutron [req-12e7c2fd-568d-4fb5-ac12-e92a32d9edd2 req-bfc2030f-f01b-4d01-baf8-8d31b7194342 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Updated VIF entry in instance network info cache for port a19a3bde-2463-4f15-afe7-f8df8c608bb7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:36:20 np0005593234 nova_compute[227762]: 2026-01-23 09:36:20.058 227766 DEBUG nova.network.neutron [req-12e7c2fd-568d-4fb5-ac12-e92a32d9edd2 req-bfc2030f-f01b-4d01-baf8-8d31b7194342 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Updating instance_info_cache with network_info: [{"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "address": "fa:16:3e:7b:ec:dc", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa19a3bde-24", "ovs_interfaceid": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:36:20 np0005593234 nova_compute[227762]: 2026-01-23 09:36:20.086 227766 DEBUG oslo_concurrency.lockutils [req-12e7c2fd-568d-4fb5-ac12-e92a32d9edd2 req-bfc2030f-f01b-4d01-baf8-8d31b7194342 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-5cea9bfc-e97a-4d07-a251-8ca3978b5f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:36:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:20.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:20 np0005593234 nova_compute[227762]: 2026-01-23 09:36:20.402 227766 DEBUG nova.compute.manager [req-ec072edf-78a2-4996-b6ba-7ee3a884f2f6 req-074e87d8-2d4b-4cb9-9446-db8f7a245f37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:20 np0005593234 nova_compute[227762]: 2026-01-23 09:36:20.403 227766 DEBUG oslo_concurrency.lockutils [req-ec072edf-78a2-4996-b6ba-7ee3a884f2f6 req-074e87d8-2d4b-4cb9-9446-db8f7a245f37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:20 np0005593234 nova_compute[227762]: 2026-01-23 09:36:20.404 227766 DEBUG oslo_concurrency.lockutils [req-ec072edf-78a2-4996-b6ba-7ee3a884f2f6 req-074e87d8-2d4b-4cb9-9446-db8f7a245f37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:20 np0005593234 nova_compute[227762]: 2026-01-23 09:36:20.404 227766 DEBUG oslo_concurrency.lockutils [req-ec072edf-78a2-4996-b6ba-7ee3a884f2f6 req-074e87d8-2d4b-4cb9-9446-db8f7a245f37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:20 np0005593234 nova_compute[227762]: 2026-01-23 09:36:20.404 227766 DEBUG nova.compute.manager [req-ec072edf-78a2-4996-b6ba-7ee3a884f2f6 req-074e87d8-2d4b-4cb9-9446-db8f7a245f37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Processing event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:36:20 np0005593234 podman[241405]: 2026-01-23 09:36:20.441069004 +0000 UTC m=+0.086166688 container create 56bccab4f71c030f41ac04cb787ba6a7b722f78a7b5e0486d44c8472dedff99b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:36:20 np0005593234 podman[241405]: 2026-01-23 09:36:20.381524128 +0000 UTC m=+0.026621842 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:36:20 np0005593234 systemd[1]: Started libpod-conmon-56bccab4f71c030f41ac04cb787ba6a7b722f78a7b5e0486d44c8472dedff99b.scope.
Jan 23 04:36:20 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:36:20 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9703e522fbaca3bbf9294fa95ea9c4233414ccd60c805dd733871fe83f25ceda/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:36:20 np0005593234 podman[241405]: 2026-01-23 09:36:20.533556766 +0000 UTC m=+0.178654450 container init 56bccab4f71c030f41ac04cb787ba6a7b722f78a7b5e0486d44c8472dedff99b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 04:36:20 np0005593234 podman[241405]: 2026-01-23 09:36:20.539352617 +0000 UTC m=+0.184450301 container start 56bccab4f71c030f41ac04cb787ba6a7b722f78a7b5e0486d44c8472dedff99b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:36:20 np0005593234 neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae[241420]: [NOTICE]   (241424) : New worker (241426) forked
Jan 23 04:36:20 np0005593234 neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae[241420]: [NOTICE]   (241424) : Loading success.
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.609 144381 INFO neutron.agent.ovn.metadata.agent [-] Port a19a3bde-2463-4f15-afe7-f8df8c608bb7 in datapath 8eab8076-0848-4daf-bbac-f3f8b65ca750 unbound from our chassis#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.612 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8eab8076-0848-4daf-bbac-f3f8b65ca750#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.621 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dedf35ed-08b6-491a-8d06-25b9b0ac7504]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.622 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8eab8076-01 in ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.623 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8eab8076-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.624 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6301d206-6d02-47a6-bbb1-1e999c454ee4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.624 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[065498b4-635b-4e5e-9a43-87a4263724aa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.636 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[6a3a8c02-2cd2-4eb6-82e5-9bb6fc203a37]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.649 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b169cf33-d364-4f67-b5e5-b9c808c5d8fe]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.676 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[176e7f42-cad2-4e19-89c3-f7f5c264303f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 NetworkManager[48942]: <info>  [1769160980.6835] manager: (tap8eab8076-00): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Jan 23 04:36:20 np0005593234 systemd-udevd[241356]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.682 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[88b1b14d-49b4-4674-b232-e8b8764bbc51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:20.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.720 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[92674f00-94f5-4e40-b2ad-5bb7f5d3f6f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.724 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[1db87be9-b08b-4f8a-a84e-1bb17c0d5f3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 NetworkManager[48942]: <info>  [1769160980.7462] device (tap8eab8076-00): carrier: link connected
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.753 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7e0bd832-ff5e-4b45-8afc-96d466b5770b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.771 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c6ff4574-20ee-4704-b19e-594e6d9e5b8e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8eab8076-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:5b:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487570, 'reachable_time': 27976, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241445, 'error': None, 'target': 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.787 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0907536d-630c-4b48-af39-5964052959e1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe46:5b99'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 487570, 'tstamp': 487570}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 241446, 'error': None, 'target': 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.802 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0443444c-ba7a-4e9d-b18e-710024d23c7f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8eab8076-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:5b:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 20], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487570, 'reachable_time': 27976, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 241447, 'error': None, 'target': 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.831 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[51ea2d89-ca9b-4fb2-9f3e-a35fdd0a1728]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.887 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ed7bd5c1-52c7-4d86-bd1c-5d6eba2c72e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.890 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8eab8076-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.890 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.891 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8eab8076-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:20 np0005593234 kernel: tap8eab8076-00: entered promiscuous mode
Jan 23 04:36:20 np0005593234 nova_compute[227762]: 2026-01-23 09:36:20.894 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:20 np0005593234 NetworkManager[48942]: <info>  [1769160980.8953] manager: (tap8eab8076-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.897 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8eab8076-00, col_values=(('external_ids', {'iface-id': 'b545a870-aa18-4f64-a8a7-f8512824c4cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:20 np0005593234 nova_compute[227762]: 2026-01-23 09:36:20.898 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:20 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:20Z|00054|binding|INFO|Releasing lport b545a870-aa18-4f64-a8a7-f8512824c4cc from this chassis (sb_readonly=0)
Jan 23 04:36:20 np0005593234 nova_compute[227762]: 2026-01-23 09:36:20.915 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.916 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8eab8076-0848-4daf-bbac-f3f8b65ca750.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8eab8076-0848-4daf-bbac-f3f8b65ca750.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.917 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[436c4fee-2d84-4e28-a6b3-736d590a0cb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.918 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-8eab8076-0848-4daf-bbac-f3f8b65ca750
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/8eab8076-0848-4daf-bbac-f3f8b65ca750.pid.haproxy
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 8eab8076-0848-4daf-bbac-f3f8b65ca750
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:36:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:20.918 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'env', 'PROCESS_TAG=haproxy-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8eab8076-0848-4daf-bbac-f3f8b65ca750.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:36:21 np0005593234 podman[241511]: 2026-01-23 09:36:21.321649888 +0000 UTC m=+0.090677426 container create 6059864ed45dd7ee917bcde5bcebb350145827c124e905db4551c49006ab0bfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 04:36:21 np0005593234 podman[241511]: 2026-01-23 09:36:21.252922327 +0000 UTC m=+0.021949885 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:36:21 np0005593234 systemd[1]: Started libpod-conmon-6059864ed45dd7ee917bcde5bcebb350145827c124e905db4551c49006ab0bfa.scope.
Jan 23 04:36:21 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.368 227766 DEBUG nova.compute.manager [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.369 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160981.369218, 5cea9bfc-e97a-4d07-a251-8ca3978b5f98 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.369 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] VM Started (Lifecycle Event)#033[00m
Jan 23 04:36:21 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6ff5f08a81d7c11ab816363e295da578b655f2ccc82e07b41f3f2c7253544933/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.390 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.395 227766 INFO nova.virt.libvirt.driver [-] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Instance spawned successfully.#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.395 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:36:21 np0005593234 podman[241511]: 2026-01-23 09:36:21.398886156 +0000 UTC m=+0.167913714 container init 6059864ed45dd7ee917bcde5bcebb350145827c124e905db4551c49006ab0bfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.400 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.403 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:36:21 np0005593234 podman[241511]: 2026-01-23 09:36:21.405600695 +0000 UTC m=+0.174628233 container start 6059864ed45dd7ee917bcde5bcebb350145827c124e905db4551c49006ab0bfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 04:36:21 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[241532]: [NOTICE]   (241536) : New worker (241538) forked
Jan 23 04:36:21 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[241532]: [NOTICE]   (241536) : Loading success.
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.453 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.454 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.454 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.455 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.455 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.455 227766 DEBUG nova.virt.libvirt.driver [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.459 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.459 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160981.3710997, 5cea9bfc-e97a-4d07-a251-8ca3978b5f98 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.459 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.493 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.496 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160981.3809097, 5cea9bfc-e97a-4d07-a251-8ca3978b5f98 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.497 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.530 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.534 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.566 227766 INFO nova.compute.manager [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Took 11.22 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.567 227766 DEBUG nova.compute.manager [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.568 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.646 227766 INFO nova.compute.manager [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Took 12.28 seconds to build instance.#033[00m
Jan 23 04:36:21 np0005593234 nova_compute[227762]: 2026-01-23 09:36:21.671 227766 DEBUG oslo_concurrency.lockutils [None req-a1622aa8-ff31-49e8-a8d8-2cd5615b9393 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.386s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:22.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:22 np0005593234 nova_compute[227762]: 2026-01-23 09:36:22.514 227766 DEBUG nova.compute.manager [req-f7c113cf-0dee-4e56-a73f-f49286e7b2ab req-a6363d3c-5673-4213-acc7-fabacf248485 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:22 np0005593234 nova_compute[227762]: 2026-01-23 09:36:22.514 227766 DEBUG oslo_concurrency.lockutils [req-f7c113cf-0dee-4e56-a73f-f49286e7b2ab req-a6363d3c-5673-4213-acc7-fabacf248485 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:22 np0005593234 nova_compute[227762]: 2026-01-23 09:36:22.514 227766 DEBUG oslo_concurrency.lockutils [req-f7c113cf-0dee-4e56-a73f-f49286e7b2ab req-a6363d3c-5673-4213-acc7-fabacf248485 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:22 np0005593234 nova_compute[227762]: 2026-01-23 09:36:22.514 227766 DEBUG oslo_concurrency.lockutils [req-f7c113cf-0dee-4e56-a73f-f49286e7b2ab req-a6363d3c-5673-4213-acc7-fabacf248485 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:22 np0005593234 nova_compute[227762]: 2026-01-23 09:36:22.515 227766 DEBUG nova.compute.manager [req-f7c113cf-0dee-4e56-a73f-f49286e7b2ab req-a6363d3c-5673-4213-acc7-fabacf248485 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] No waiting events found dispatching network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:36:22 np0005593234 nova_compute[227762]: 2026-01-23 09:36:22.515 227766 WARNING nova.compute.manager [req-f7c113cf-0dee-4e56-a73f-f49286e7b2ab req-a6363d3c-5673-4213-acc7-fabacf248485 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received unexpected event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:36:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:22.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:22 np0005593234 nova_compute[227762]: 2026-01-23 09:36:22.999 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:23 np0005593234 nova_compute[227762]: 2026-01-23 09:36:23.350 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:36:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:24.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:36:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:24.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:24 np0005593234 podman[241549]: 2026-01-23 09:36:24.780202771 +0000 UTC m=+0.076486725 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 23 04:36:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:26.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:26.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:27 np0005593234 nova_compute[227762]: 2026-01-23 09:36:27.282 227766 DEBUG nova.virt.libvirt.driver [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Check if temp file /var/lib/nova/instances/tmpet99ib7x exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 23 04:36:27 np0005593234 nova_compute[227762]: 2026-01-23 09:36:27.284 227766 DEBUG nova.compute.manager [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpet99ib7x',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='5cea9bfc-e97a-4d07-a251-8ca3978b5f98',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 23 04:36:28 np0005593234 nova_compute[227762]: 2026-01-23 09:36:28.001 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:28.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:28 np0005593234 nova_compute[227762]: 2026-01-23 09:36:28.351 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:28 np0005593234 nova_compute[227762]: 2026-01-23 09:36:28.439 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:28.439 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:36:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:28.441 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:36:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:28.442 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:28.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:30.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:30.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:32.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:36:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:32.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:36:33 np0005593234 nova_compute[227762]: 2026-01-23 09:36:33.003 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:33 np0005593234 nova_compute[227762]: 2026-01-23 09:36:33.354 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:33 np0005593234 nova_compute[227762]: 2026-01-23 09:36:33.733 227766 DEBUG nova.compute.manager [req-ca25bc42-83c2-463a-9037-88f19efb022c req-7c408968-fe9a-47d1-848e-6511b210f6ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-vif-unplugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:33 np0005593234 nova_compute[227762]: 2026-01-23 09:36:33.733 227766 DEBUG oslo_concurrency.lockutils [req-ca25bc42-83c2-463a-9037-88f19efb022c req-7c408968-fe9a-47d1-848e-6511b210f6ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:33 np0005593234 nova_compute[227762]: 2026-01-23 09:36:33.734 227766 DEBUG oslo_concurrency.lockutils [req-ca25bc42-83c2-463a-9037-88f19efb022c req-7c408968-fe9a-47d1-848e-6511b210f6ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:33 np0005593234 nova_compute[227762]: 2026-01-23 09:36:33.734 227766 DEBUG oslo_concurrency.lockutils [req-ca25bc42-83c2-463a-9037-88f19efb022c req-7c408968-fe9a-47d1-848e-6511b210f6ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:33 np0005593234 nova_compute[227762]: 2026-01-23 09:36:33.734 227766 DEBUG nova.compute.manager [req-ca25bc42-83c2-463a-9037-88f19efb022c req-7c408968-fe9a-47d1-848e-6511b210f6ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] No waiting events found dispatching network-vif-unplugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:36:33 np0005593234 nova_compute[227762]: 2026-01-23 09:36:33.734 227766 DEBUG nova.compute.manager [req-ca25bc42-83c2-463a-9037-88f19efb022c req-7c408968-fe9a-47d1-848e-6511b210f6ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-vif-unplugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:36:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:36:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:34.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:36:34 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:34Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7b:ec:dc 10.100.0.6
Jan 23 04:36:34 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:34Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:ec:dc 10.100.0.6
Jan 23 04:36:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:34.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:35 np0005593234 nova_compute[227762]: 2026-01-23 09:36:35.638 227766 INFO nova.compute.manager [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Took 6.24 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Jan 23 04:36:35 np0005593234 nova_compute[227762]: 2026-01-23 09:36:35.639 227766 DEBUG nova.compute.manager [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:36:35 np0005593234 nova_compute[227762]: 2026-01-23 09:36:35.662 227766 DEBUG nova.compute.manager [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpet99ib7x',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='5cea9bfc-e97a-4d07-a251-8ca3978b5f98',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(4584031b-3dfa-4ba7-880d-b5ee586cd7c0),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 23 04:36:35 np0005593234 nova_compute[227762]: 2026-01-23 09:36:35.668 227766 DEBUG nova.objects.instance [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lazy-loading 'migration_context' on Instance uuid 5cea9bfc-e97a-4d07-a251-8ca3978b5f98 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:36:35 np0005593234 nova_compute[227762]: 2026-01-23 09:36:35.669 227766 DEBUG nova.virt.libvirt.driver [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 23 04:36:35 np0005593234 nova_compute[227762]: 2026-01-23 09:36:35.671 227766 DEBUG nova.virt.libvirt.driver [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 23 04:36:35 np0005593234 nova_compute[227762]: 2026-01-23 09:36:35.672 227766 DEBUG nova.virt.libvirt.driver [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 23 04:36:35 np0005593234 nova_compute[227762]: 2026-01-23 09:36:35.692 227766 DEBUG nova.virt.libvirt.vif [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:36:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1674522276',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1674522276',id=26,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:36:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d0dce6e339c349d4ab97cee5e49fff3a',ramdisk_id='',reservation_id='r-lql1hhru',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1207260646',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1207260646-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:36:21Z,user_data=None,user_id='4f72965e950c4761bfedd99fdc411a83',uuid=5cea9bfc-e97a-4d07-a251-8ca3978b5f98,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "address": "fa:16:3e:7b:ec:dc", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa19a3bde-24", "ovs_interfaceid": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:36:35 np0005593234 nova_compute[227762]: 2026-01-23 09:36:35.692 227766 DEBUG nova.network.os_vif_util [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Converting VIF {"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "address": "fa:16:3e:7b:ec:dc", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapa19a3bde-24", "ovs_interfaceid": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:36:35 np0005593234 nova_compute[227762]: 2026-01-23 09:36:35.694 227766 DEBUG nova.network.os_vif_util [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:ec:dc,bridge_name='br-int',has_traffic_filtering=True,id=a19a3bde-2463-4f15-afe7-f8df8c608bb7,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa19a3bde-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:36:35 np0005593234 nova_compute[227762]: 2026-01-23 09:36:35.694 227766 DEBUG nova.virt.libvirt.migration [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Updating guest XML with vif config: <interface type="ethernet">
Jan 23 04:36:35 np0005593234 nova_compute[227762]:  <mac address="fa:16:3e:7b:ec:dc"/>
Jan 23 04:36:35 np0005593234 nova_compute[227762]:  <model type="virtio"/>
Jan 23 04:36:35 np0005593234 nova_compute[227762]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:36:35 np0005593234 nova_compute[227762]:  <mtu size="1442"/>
Jan 23 04:36:35 np0005593234 nova_compute[227762]:  <target dev="tapa19a3bde-24"/>
Jan 23 04:36:35 np0005593234 nova_compute[227762]: </interface>
Jan 23 04:36:35 np0005593234 nova_compute[227762]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 23 04:36:35 np0005593234 nova_compute[227762]: 2026-01-23 09:36:35.695 227766 DEBUG nova.virt.libvirt.driver [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.011 227766 DEBUG nova.compute.manager [req-77bc3267-d592-4421-8142-9697b5bd7437 req-aeb2a4ce-af1e-472a-8f93-0b4f85fc1ab4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.013 227766 DEBUG oslo_concurrency.lockutils [req-77bc3267-d592-4421-8142-9697b5bd7437 req-aeb2a4ce-af1e-472a-8f93-0b4f85fc1ab4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.013 227766 DEBUG oslo_concurrency.lockutils [req-77bc3267-d592-4421-8142-9697b5bd7437 req-aeb2a4ce-af1e-472a-8f93-0b4f85fc1ab4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.013 227766 DEBUG oslo_concurrency.lockutils [req-77bc3267-d592-4421-8142-9697b5bd7437 req-aeb2a4ce-af1e-472a-8f93-0b4f85fc1ab4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.013 227766 DEBUG nova.compute.manager [req-77bc3267-d592-4421-8142-9697b5bd7437 req-aeb2a4ce-af1e-472a-8f93-0b4f85fc1ab4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] No waiting events found dispatching network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.014 227766 WARNING nova.compute.manager [req-77bc3267-d592-4421-8142-9697b5bd7437 req-aeb2a4ce-af1e-472a-8f93-0b4f85fc1ab4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received unexpected event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.014 227766 DEBUG nova.compute.manager [req-77bc3267-d592-4421-8142-9697b5bd7437 req-aeb2a4ce-af1e-472a-8f93-0b4f85fc1ab4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-changed-a19a3bde-2463-4f15-afe7-f8df8c608bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.014 227766 DEBUG nova.compute.manager [req-77bc3267-d592-4421-8142-9697b5bd7437 req-aeb2a4ce-af1e-472a-8f93-0b4f85fc1ab4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Refreshing instance network info cache due to event network-changed-a19a3bde-2463-4f15-afe7-f8df8c608bb7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.014 227766 DEBUG oslo_concurrency.lockutils [req-77bc3267-d592-4421-8142-9697b5bd7437 req-aeb2a4ce-af1e-472a-8f93-0b4f85fc1ab4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-5cea9bfc-e97a-4d07-a251-8ca3978b5f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.015 227766 DEBUG oslo_concurrency.lockutils [req-77bc3267-d592-4421-8142-9697b5bd7437 req-aeb2a4ce-af1e-472a-8f93-0b4f85fc1ab4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-5cea9bfc-e97a-4d07-a251-8ca3978b5f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.015 227766 DEBUG nova.network.neutron [req-77bc3267-d592-4421-8142-9697b5bd7437 req-aeb2a4ce-af1e-472a-8f93-0b4f85fc1ab4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Refreshing network info cache for port a19a3bde-2463-4f15-afe7-f8df8c608bb7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.175 227766 DEBUG nova.virt.libvirt.migration [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.175 227766 INFO nova.virt.libvirt.migration [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 23 04:36:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:36:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:36.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.280 227766 INFO nova.virt.libvirt.driver [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 23 04:36:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:36.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.784 227766 DEBUG nova.virt.libvirt.migration [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 23 04:36:36 np0005593234 nova_compute[227762]: 2026-01-23 09:36:36.785 227766 DEBUG nova.virt.libvirt.migration [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.090 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769160997.0899785, 5cea9bfc-e97a-4d07-a251-8ca3978b5f98 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.090 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.116 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.120 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.147 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 23 04:36:37 np0005593234 kernel: tapa19a3bde-24 (unregistering): left promiscuous mode
Jan 23 04:36:37 np0005593234 NetworkManager[48942]: <info>  [1769160997.2957] device (tapa19a3bde-24): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:36:37 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:37Z|00055|binding|INFO|Releasing lport a19a3bde-2463-4f15-afe7-f8df8c608bb7 from this chassis (sb_readonly=0)
Jan 23 04:36:37 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:37Z|00056|binding|INFO|Setting lport a19a3bde-2463-4f15-afe7-f8df8c608bb7 down in Southbound
Jan 23 04:36:37 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:37Z|00057|binding|INFO|Releasing lport 9852d6c7-7b56-465e-865f-eb8c24e61417 from this chassis (sb_readonly=0)
Jan 23 04:36:37 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:37Z|00058|binding|INFO|Setting lport 9852d6c7-7b56-465e-865f-eb8c24e61417 down in Southbound
Jan 23 04:36:37 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:37Z|00059|binding|INFO|Removing iface tapa19a3bde-24 ovn-installed in OVS
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.302 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.304 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:37 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:37Z|00060|binding|INFO|Releasing lport 3fd7a4e0-3e1c-454c-a93e-1fa905fbcde2 from this chassis (sb_readonly=0)
Jan 23 04:36:37 np0005593234 ovn_controller[134547]: 2026-01-23T09:36:37Z|00061|binding|INFO|Releasing lport b545a870-aa18-4f64-a8a7-f8512824c4cc from this chassis (sb_readonly=0)
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.310 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9e:36:a6 19.80.0.21'], port_security=['fa:16:3e:9e:36:a6 19.80.0.21'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['a19a3bde-2463-4f15-afe7-f8df8c608bb7'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1709862236', 'neutron:cidrs': '19.80.0.21/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-914246be-3a6e-47b3-afc0-463db5fa1dae', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1709862236', 'neutron:project_id': 'd0dce6e339c349d4ab97cee5e49fff3a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '0179c400-b2f2-4914-b563-942a61ef1858', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=6bd4ce00-6348-4d9c-ba3b-d576a6d3e856, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=9852d6c7-7b56-465e-865f-eb8c24e61417) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.311 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:ec:dc 10.100.0.6'], port_security=['fa:16:3e:7b:ec:dc 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'd80bc768-e67f-4e48-bcf3-42912cda98f1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-178752437', 'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '5cea9bfc-e97a-4d07-a251-8ca3978b5f98', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-178752437', 'neutron:project_id': 'd0dce6e339c349d4ab97cee5e49fff3a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0179c400-b2f2-4914-b563-942a61ef1858', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cbb60528-b878-42fd-9c2f-0a3345010b1a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=a19a3bde-2463-4f15-afe7-f8df8c608bb7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.312 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 9852d6c7-7b56-465e-865f-eb8c24e61417 in datapath 914246be-3a6e-47b3-afc0-463db5fa1dae unbound from our chassis#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.313 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 914246be-3a6e-47b3-afc0-463db5fa1dae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.316 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e71ce37f-4be3-40fb-84cf-aa2cf82371d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.318 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae namespace which is not needed anymore#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.334 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:37 np0005593234 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Jan 23 04:36:37 np0005593234 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d0000001a.scope: Consumed 14.548s CPU time.
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.419 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:37 np0005593234 systemd-machined[195626]: Machine qemu-12-instance-0000001a terminated.
Jan 23 04:36:37 np0005593234 virtqemud[227483]: Unable to get XATTR trusted.libvirt.security.ref_selinux on vms/5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk: No such file or directory
Jan 23 04:36:37 np0005593234 virtqemud[227483]: Unable to get XATTR trusted.libvirt.security.ref_dac on vms/5cea9bfc-e97a-4d07-a251-8ca3978b5f98_disk: No such file or directory
Jan 23 04:36:37 np0005593234 neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae[241420]: [NOTICE]   (241424) : haproxy version is 2.8.14-c23fe91
Jan 23 04:36:37 np0005593234 neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae[241420]: [NOTICE]   (241424) : path to executable is /usr/sbin/haproxy
Jan 23 04:36:37 np0005593234 neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae[241420]: [WARNING]  (241424) : Exiting Master process...
Jan 23 04:36:37 np0005593234 neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae[241420]: [ALERT]    (241424) : Current worker (241426) exited with code 143 (Terminated)
Jan 23 04:36:37 np0005593234 neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae[241420]: [WARNING]  (241424) : All workers exited. Exiting... (0)
Jan 23 04:36:37 np0005593234 systemd[1]: libpod-56bccab4f71c030f41ac04cb787ba6a7b722f78a7b5e0486d44c8472dedff99b.scope: Deactivated successfully.
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.463 227766 DEBUG nova.virt.libvirt.guest [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.464 227766 INFO nova.virt.libvirt.driver [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Migration operation has completed#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.464 227766 INFO nova.compute.manager [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] _post_live_migration() is started..#033[00m
Jan 23 04:36:37 np0005593234 podman[241661]: 2026-01-23 09:36:37.469469557 +0000 UTC m=+0.051812305 container died 56bccab4f71c030f41ac04cb787ba6a7b722f78a7b5e0486d44c8472dedff99b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.474 227766 DEBUG nova.virt.libvirt.driver [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.474 227766 DEBUG nova.virt.libvirt.driver [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.474 227766 DEBUG nova.virt.libvirt.driver [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 23 04:36:37 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-56bccab4f71c030f41ac04cb787ba6a7b722f78a7b5e0486d44c8472dedff99b-userdata-shm.mount: Deactivated successfully.
Jan 23 04:36:37 np0005593234 systemd[1]: var-lib-containers-storage-overlay-9703e522fbaca3bbf9294fa95ea9c4233414ccd60c805dd733871fe83f25ceda-merged.mount: Deactivated successfully.
Jan 23 04:36:37 np0005593234 podman[241661]: 2026-01-23 09:36:37.524234955 +0000 UTC m=+0.106577683 container cleanup 56bccab4f71c030f41ac04cb787ba6a7b722f78a7b5e0486d44c8472dedff99b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:36:37 np0005593234 systemd[1]: libpod-conmon-56bccab4f71c030f41ac04cb787ba6a7b722f78a7b5e0486d44c8472dedff99b.scope: Deactivated successfully.
Jan 23 04:36:37 np0005593234 podman[241701]: 2026-01-23 09:36:37.592525593 +0000 UTC m=+0.047492702 container remove 56bccab4f71c030f41ac04cb787ba6a7b722f78a7b5e0486d44c8472dedff99b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.599 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6f8ca464-d2a3-4c33-8e68-80a3303a3431]: (4, ('Fri Jan 23 09:36:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae (56bccab4f71c030f41ac04cb787ba6a7b722f78a7b5e0486d44c8472dedff99b)\n56bccab4f71c030f41ac04cb787ba6a7b722f78a7b5e0486d44c8472dedff99b\nFri Jan 23 09:36:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae (56bccab4f71c030f41ac04cb787ba6a7b722f78a7b5e0486d44c8472dedff99b)\n56bccab4f71c030f41ac04cb787ba6a7b722f78a7b5e0486d44c8472dedff99b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.601 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bbcba9b8-962f-43a0-95d0-b7d84060632f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.602 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap914246be-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.604 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:37 np0005593234 kernel: tap914246be-30: left promiscuous mode
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.621 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.625 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5d54051d-599d-417c-b30e-146cdbe8f75b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.641 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c22a6bff-ed34-4168-813f-89d61ecafd5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.642 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[47e1b06c-8170-4fb1-8718-9b17bf36a9c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.659 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ff8c362c-881d-46be-8cfa-5e5673dfb395]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487475, 'reachable_time': 34570, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241721, 'error': None, 'target': 'ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.662 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-914246be-3a6e-47b3-afc0-463db5fa1dae deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.663 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[e8c19a6b-2c59-4445-a8fb-cf98c5b3b93f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.663 144381 INFO neutron.agent.ovn.metadata.agent [-] Port a19a3bde-2463-4f15-afe7-f8df8c608bb7 in datapath 8eab8076-0848-4daf-bbac-f3f8b65ca750 unbound from our chassis#033[00m
Jan 23 04:36:37 np0005593234 systemd[1]: run-netns-ovnmeta\x2d914246be\x2d3a6e\x2d47b3\x2dafc0\x2d463db5fa1dae.mount: Deactivated successfully.
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.665 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8eab8076-0848-4daf-bbac-f3f8b65ca750, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.665 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[68c3999c-f27d-490d-ad78-d84144b7a239]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.666 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 namespace which is not needed anymore#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.749 227766 DEBUG nova.compute.manager [req-d5eba890-8179-4579-a209-e3a3dcae5c45 req-e9e2219a-593d-41c7-bed5-cd0d0fcb3c38 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-vif-unplugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.749 227766 DEBUG oslo_concurrency.lockutils [req-d5eba890-8179-4579-a209-e3a3dcae5c45 req-e9e2219a-593d-41c7-bed5-cd0d0fcb3c38 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.750 227766 DEBUG oslo_concurrency.lockutils [req-d5eba890-8179-4579-a209-e3a3dcae5c45 req-e9e2219a-593d-41c7-bed5-cd0d0fcb3c38 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.750 227766 DEBUG oslo_concurrency.lockutils [req-d5eba890-8179-4579-a209-e3a3dcae5c45 req-e9e2219a-593d-41c7-bed5-cd0d0fcb3c38 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.750 227766 DEBUG nova.compute.manager [req-d5eba890-8179-4579-a209-e3a3dcae5c45 req-e9e2219a-593d-41c7-bed5-cd0d0fcb3c38 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] No waiting events found dispatching network-vif-unplugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.750 227766 DEBUG nova.compute.manager [req-d5eba890-8179-4579-a209-e3a3dcae5c45 req-e9e2219a-593d-41c7-bed5-cd0d0fcb3c38 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-vif-unplugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:36:37 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[241532]: [NOTICE]   (241536) : haproxy version is 2.8.14-c23fe91
Jan 23 04:36:37 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[241532]: [NOTICE]   (241536) : path to executable is /usr/sbin/haproxy
Jan 23 04:36:37 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[241532]: [WARNING]  (241536) : Exiting Master process...
Jan 23 04:36:37 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[241532]: [ALERT]    (241536) : Current worker (241538) exited with code 143 (Terminated)
Jan 23 04:36:37 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[241532]: [WARNING]  (241536) : All workers exited. Exiting... (0)
Jan 23 04:36:37 np0005593234 systemd[1]: libpod-6059864ed45dd7ee917bcde5bcebb350145827c124e905db4551c49006ab0bfa.scope: Deactivated successfully.
Jan 23 04:36:37 np0005593234 podman[241740]: 2026-01-23 09:36:37.802072753 +0000 UTC m=+0.045057545 container died 6059864ed45dd7ee917bcde5bcebb350145827c124e905db4551c49006ab0bfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:36:37 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6059864ed45dd7ee917bcde5bcebb350145827c124e905db4551c49006ab0bfa-userdata-shm.mount: Deactivated successfully.
Jan 23 04:36:37 np0005593234 systemd[1]: var-lib-containers-storage-overlay-6ff5f08a81d7c11ab816363e295da578b655f2ccc82e07b41f3f2c7253544933-merged.mount: Deactivated successfully.
Jan 23 04:36:37 np0005593234 podman[241740]: 2026-01-23 09:36:37.841187412 +0000 UTC m=+0.084172214 container cleanup 6059864ed45dd7ee917bcde5bcebb350145827c124e905db4551c49006ab0bfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 04:36:37 np0005593234 systemd[1]: libpod-conmon-6059864ed45dd7ee917bcde5bcebb350145827c124e905db4551c49006ab0bfa.scope: Deactivated successfully.
Jan 23 04:36:37 np0005593234 podman[241772]: 2026-01-23 09:36:37.9056089 +0000 UTC m=+0.043930399 container remove 6059864ed45dd7ee917bcde5bcebb350145827c124e905db4551c49006ab0bfa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.911 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[25c94bf3-4b39-47cc-af6c-fe18a86c4d03]: (4, ('Fri Jan 23 09:36:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 (6059864ed45dd7ee917bcde5bcebb350145827c124e905db4551c49006ab0bfa)\n6059864ed45dd7ee917bcde5bcebb350145827c124e905db4551c49006ab0bfa\nFri Jan 23 09:36:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 (6059864ed45dd7ee917bcde5bcebb350145827c124e905db4551c49006ab0bfa)\n6059864ed45dd7ee917bcde5bcebb350145827c124e905db4551c49006ab0bfa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.913 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f67872b1-9c28-44d9-b6fd-ea14b0dd6168]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.914 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8eab8076-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:37 np0005593234 kernel: tap8eab8076-00: left promiscuous mode
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.916 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:37 np0005593234 nova_compute[227762]: 2026-01-23 09:36:37.931 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.935 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6d4f9d49-33dc-4663-a7e4-601f61ea9fcf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.950 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[05167682-2a9d-449f-bf4d-d90eda907d61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.951 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fb179c93-1ff4-4668-a257-7ad682a690bc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.967 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c9834384-253a-444a-9d16-388c2b83eabb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 487563, 'reachable_time': 27899, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 241791, 'error': None, 'target': 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.969 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:36:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:37.970 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[ed6141a5-8832-4f87-8f50-af4d22556f27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:36:38 np0005593234 nova_compute[227762]: 2026-01-23 09:36:38.005 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:36:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:38.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:36:38 np0005593234 nova_compute[227762]: 2026-01-23 09:36:38.356 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:38 np0005593234 systemd[1]: run-netns-ovnmeta\x2d8eab8076\x2d0848\x2d4daf\x2dbbac\x2df3f8b65ca750.mount: Deactivated successfully.
Jan 23 04:36:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:38.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:39 np0005593234 nova_compute[227762]: 2026-01-23 09:36:39.531 227766 DEBUG nova.network.neutron [req-77bc3267-d592-4421-8142-9697b5bd7437 req-aeb2a4ce-af1e-472a-8f93-0b4f85fc1ab4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Updated VIF entry in instance network info cache for port a19a3bde-2463-4f15-afe7-f8df8c608bb7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:36:39 np0005593234 nova_compute[227762]: 2026-01-23 09:36:39.532 227766 DEBUG nova.network.neutron [req-77bc3267-d592-4421-8142-9697b5bd7437 req-aeb2a4ce-af1e-472a-8f93-0b4f85fc1ab4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Updating instance_info_cache with network_info: [{"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "address": "fa:16:3e:7b:ec:dc", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa19a3bde-24", "ovs_interfaceid": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:36:39 np0005593234 nova_compute[227762]: 2026-01-23 09:36:39.558 227766 DEBUG oslo_concurrency.lockutils [req-77bc3267-d592-4421-8142-9697b5bd7437 req-aeb2a4ce-af1e-472a-8f93-0b4f85fc1ab4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-5cea9bfc-e97a-4d07-a251-8ca3978b5f98" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:36:39 np0005593234 nova_compute[227762]: 2026-01-23 09:36:39.876 227766 DEBUG nova.compute.manager [req-9378b9cc-6468-43d6-b8cb-7f4d515d4112 req-dc8e1e61-3e77-4677-9d0a-dedb168bdc0b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:39 np0005593234 nova_compute[227762]: 2026-01-23 09:36:39.877 227766 DEBUG oslo_concurrency.lockutils [req-9378b9cc-6468-43d6-b8cb-7f4d515d4112 req-dc8e1e61-3e77-4677-9d0a-dedb168bdc0b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:39 np0005593234 nova_compute[227762]: 2026-01-23 09:36:39.877 227766 DEBUG oslo_concurrency.lockutils [req-9378b9cc-6468-43d6-b8cb-7f4d515d4112 req-dc8e1e61-3e77-4677-9d0a-dedb168bdc0b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:39 np0005593234 nova_compute[227762]: 2026-01-23 09:36:39.877 227766 DEBUG oslo_concurrency.lockutils [req-9378b9cc-6468-43d6-b8cb-7f4d515d4112 req-dc8e1e61-3e77-4677-9d0a-dedb168bdc0b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:39 np0005593234 nova_compute[227762]: 2026-01-23 09:36:39.878 227766 DEBUG nova.compute.manager [req-9378b9cc-6468-43d6-b8cb-7f4d515d4112 req-dc8e1e61-3e77-4677-9d0a-dedb168bdc0b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] No waiting events found dispatching network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:36:39 np0005593234 nova_compute[227762]: 2026-01-23 09:36:39.878 227766 WARNING nova.compute.manager [req-9378b9cc-6468-43d6-b8cb-7f4d515d4112 req-dc8e1e61-3e77-4677-9d0a-dedb168bdc0b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received unexpected event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.247 227766 DEBUG nova.compute.manager [req-6d8f6fb6-1636-4cf1-bf66-be9564711a50 req-a4cc75e5-83cb-40f6-99b2-db3d8a4a505c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-vif-unplugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.248 227766 DEBUG oslo_concurrency.lockutils [req-6d8f6fb6-1636-4cf1-bf66-be9564711a50 req-a4cc75e5-83cb-40f6-99b2-db3d8a4a505c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.248 227766 DEBUG oslo_concurrency.lockutils [req-6d8f6fb6-1636-4cf1-bf66-be9564711a50 req-a4cc75e5-83cb-40f6-99b2-db3d8a4a505c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.248 227766 DEBUG oslo_concurrency.lockutils [req-6d8f6fb6-1636-4cf1-bf66-be9564711a50 req-a4cc75e5-83cb-40f6-99b2-db3d8a4a505c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.248 227766 DEBUG nova.compute.manager [req-6d8f6fb6-1636-4cf1-bf66-be9564711a50 req-a4cc75e5-83cb-40f6-99b2-db3d8a4a505c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] No waiting events found dispatching network-vif-unplugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.248 227766 DEBUG nova.compute.manager [req-6d8f6fb6-1636-4cf1-bf66-be9564711a50 req-a4cc75e5-83cb-40f6-99b2-db3d8a4a505c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-vif-unplugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:36:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:40.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.478 227766 DEBUG nova.network.neutron [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Activated binding for port a19a3bde-2463-4f15-afe7-f8df8c608bb7 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.478 227766 DEBUG nova.compute.manager [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "address": "fa:16:3e:7b:ec:dc", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa19a3bde-24", "ovs_interfaceid": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.479 227766 DEBUG nova.virt.libvirt.vif [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:36:05Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-1674522276',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-1674522276',id=26,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:36:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d0dce6e339c349d4ab97cee5e49fff3a',ramdisk_id='',reservation_id='r-lql1hhru',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1207260646',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1207260646-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:36:26Z,user_data=None,user_id='4f72965e950c4761bfedd99fdc411a83',uuid=5cea9bfc-e97a-4d07-a251-8ca3978b5f98,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "address": "fa:16:3e:7b:ec:dc", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa19a3bde-24", "ovs_interfaceid": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.479 227766 DEBUG nova.network.os_vif_util [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Converting VIF {"id": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "address": "fa:16:3e:7b:ec:dc", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa19a3bde-24", "ovs_interfaceid": "a19a3bde-2463-4f15-afe7-f8df8c608bb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.480 227766 DEBUG nova.network.os_vif_util [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:ec:dc,bridge_name='br-int',has_traffic_filtering=True,id=a19a3bde-2463-4f15-afe7-f8df8c608bb7,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa19a3bde-24') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.480 227766 DEBUG os_vif [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:ec:dc,bridge_name='br-int',has_traffic_filtering=True,id=a19a3bde-2463-4f15-afe7-f8df8c608bb7,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa19a3bde-24') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.482 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.482 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa19a3bde-24, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.484 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.484 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.487 227766 INFO os_vif [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:ec:dc,bridge_name='br-int',has_traffic_filtering=True,id=a19a3bde-2463-4f15-afe7-f8df8c608bb7,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapa19a3bde-24')#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.487 227766 DEBUG oslo_concurrency.lockutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.488 227766 DEBUG oslo_concurrency.lockutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.488 227766 DEBUG oslo_concurrency.lockutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.488 227766 DEBUG nova.compute.manager [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.489 227766 INFO nova.virt.libvirt.driver [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Deleting instance files /var/lib/nova/instances/5cea9bfc-e97a-4d07-a251-8ca3978b5f98_del#033[00m
Jan 23 04:36:40 np0005593234 nova_compute[227762]: 2026-01-23 09:36:40.489 227766 INFO nova.virt.libvirt.driver [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Deletion of /var/lib/nova/instances/5cea9bfc-e97a-4d07-a251-8ca3978b5f98_del complete#033[00m
Jan 23 04:36:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:36:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:40.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:36:41 np0005593234 podman[241793]: 2026-01-23 09:36:41.760073103 +0000 UTC m=+0.051839987 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.051 227766 DEBUG nova.compute.manager [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.051 227766 DEBUG oslo_concurrency.lockutils [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.051 227766 DEBUG oslo_concurrency.lockutils [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.051 227766 DEBUG oslo_concurrency.lockutils [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.052 227766 DEBUG nova.compute.manager [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] No waiting events found dispatching network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.052 227766 WARNING nova.compute.manager [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received unexpected event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.052 227766 DEBUG nova.compute.manager [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.052 227766 DEBUG oslo_concurrency.lockutils [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.052 227766 DEBUG oslo_concurrency.lockutils [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.053 227766 DEBUG oslo_concurrency.lockutils [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.053 227766 DEBUG nova.compute.manager [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] No waiting events found dispatching network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.053 227766 WARNING nova.compute.manager [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received unexpected event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.053 227766 DEBUG nova.compute.manager [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.053 227766 DEBUG oslo_concurrency.lockutils [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.053 227766 DEBUG oslo_concurrency.lockutils [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.053 227766 DEBUG oslo_concurrency.lockutils [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.054 227766 DEBUG nova.compute.manager [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] No waiting events found dispatching network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:36:42 np0005593234 nova_compute[227762]: 2026-01-23 09:36:42.054 227766 WARNING nova.compute.manager [req-92fc33bd-b1fc-4b99-a28c-66945006c430 req-8fb15a5d-d29e-4085-8ff8-a1dcccf1ef25 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Received unexpected event network-vif-plugged-a19a3bde-2463-4f15-afe7-f8df8c608bb7 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 04:36:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:42.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:42.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:42.813 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:42.814 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:36:42.814 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:43 np0005593234 nova_compute[227762]: 2026-01-23 09:36:43.006 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e169 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:44.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e170 e170: 3 total, 3 up, 3 in
Jan 23 04:36:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:36:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2617608864' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:36:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:36:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2617608864' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:36:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:44.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:45 np0005593234 nova_compute[227762]: 2026-01-23 09:36:45.486 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:36:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:36:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:46.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:46.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:36:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:36:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:36:48 np0005593234 nova_compute[227762]: 2026-01-23 09:36:48.007 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:36:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:48.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:36:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:48.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e171 e171: 3 total, 3 up, 3 in
Jan 23 04:36:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:50.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:50 np0005593234 nova_compute[227762]: 2026-01-23 09:36:50.421 227766 DEBUG oslo_concurrency.lockutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Acquiring lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:50 np0005593234 nova_compute[227762]: 2026-01-23 09:36:50.422 227766 DEBUG oslo_concurrency.lockutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:50 np0005593234 nova_compute[227762]: 2026-01-23 09:36:50.422 227766 DEBUG oslo_concurrency.lockutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "5cea9bfc-e97a-4d07-a251-8ca3978b5f98-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:50 np0005593234 nova_compute[227762]: 2026-01-23 09:36:50.445 227766 DEBUG oslo_concurrency.lockutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:50 np0005593234 nova_compute[227762]: 2026-01-23 09:36:50.445 227766 DEBUG oslo_concurrency.lockutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:50 np0005593234 nova_compute[227762]: 2026-01-23 09:36:50.445 227766 DEBUG oslo_concurrency.lockutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:50 np0005593234 nova_compute[227762]: 2026-01-23 09:36:50.446 227766 DEBUG nova.compute.resource_tracker [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:36:50 np0005593234 nova_compute[227762]: 2026-01-23 09:36:50.446 227766 DEBUG oslo_concurrency.processutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:50 np0005593234 nova_compute[227762]: 2026-01-23 09:36:50.490 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:50.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:36:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/642153338' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:36:50 np0005593234 nova_compute[227762]: 2026-01-23 09:36:50.902 227766 DEBUG oslo_concurrency.processutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:51 np0005593234 nova_compute[227762]: 2026-01-23 09:36:51.130 227766 WARNING nova.virt.libvirt.driver [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:36:51 np0005593234 nova_compute[227762]: 2026-01-23 09:36:51.131 227766 DEBUG nova.compute.resource_tracker [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4814MB free_disk=20.922042846679688GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": 
"0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:36:51 np0005593234 nova_compute[227762]: 2026-01-23 09:36:51.131 227766 DEBUG oslo_concurrency.lockutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:36:51 np0005593234 nova_compute[227762]: 2026-01-23 09:36:51.132 227766 DEBUG oslo_concurrency.lockutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:36:51 np0005593234 nova_compute[227762]: 2026-01-23 09:36:51.232 227766 DEBUG nova.compute.resource_tracker [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Migration for instance 5cea9bfc-e97a-4d07-a251-8ca3978b5f98 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 23 04:36:51 np0005593234 nova_compute[227762]: 2026-01-23 09:36:51.482 227766 DEBUG nova.compute.resource_tracker [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 23 04:36:51 np0005593234 nova_compute[227762]: 2026-01-23 09:36:51.521 227766 DEBUG nova.compute.resource_tracker [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Migration 4584031b-3dfa-4ba7-880d-b5ee586cd7c0 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 23 04:36:51 np0005593234 nova_compute[227762]: 2026-01-23 09:36:51.522 227766 DEBUG nova.compute.resource_tracker [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:36:51 np0005593234 nova_compute[227762]: 2026-01-23 09:36:51.522 227766 DEBUG nova.compute.resource_tracker [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:36:51 np0005593234 nova_compute[227762]: 2026-01-23 09:36:51.574 227766 DEBUG oslo_concurrency.processutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:36:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:36:52 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2272036881' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:36:52 np0005593234 nova_compute[227762]: 2026-01-23 09:36:52.085 227766 DEBUG oslo_concurrency.processutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:36:52 np0005593234 nova_compute[227762]: 2026-01-23 09:36:52.093 227766 DEBUG nova.compute.provider_tree [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:36:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:36:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:36:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:52.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:52 np0005593234 nova_compute[227762]: 2026-01-23 09:36:52.463 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769160997.4624932, 5cea9bfc-e97a-4d07-a251-8ca3978b5f98 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:36:52 np0005593234 nova_compute[227762]: 2026-01-23 09:36:52.464 227766 INFO nova.compute.manager [-] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:36:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:36:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:52.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:36:53 np0005593234 nova_compute[227762]: 2026-01-23 09:36:53.009 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e171 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:36:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:36:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:54.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:36:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:54.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:54 np0005593234 nova_compute[227762]: 2026-01-23 09:36:54.819 227766 DEBUG nova.scheduler.client.report [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:36:54 np0005593234 nova_compute[227762]: 2026-01-23 09:36:54.894 227766 DEBUG nova.compute.manager [None req-f7b5b7ee-e73a-4a50-a843-cd918842d267 - - - - - -] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:36:55 np0005593234 nova_compute[227762]: 2026-01-23 09:36:55.060 227766 DEBUG nova.compute.resource_tracker [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:36:55 np0005593234 nova_compute[227762]: 2026-01-23 09:36:55.060 227766 DEBUG oslo_concurrency.lockutils [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:36:55 np0005593234 nova_compute[227762]: 2026-01-23 09:36:55.066 227766 INFO nova.compute.manager [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Jan 23 04:36:55 np0005593234 nova_compute[227762]: 2026-01-23 09:36:55.403 227766 INFO nova.scheduler.client.report [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Deleted allocation for migration 4584031b-3dfa-4ba7-880d-b5ee586cd7c0#033[00m
Jan 23 04:36:55 np0005593234 nova_compute[227762]: 2026-01-23 09:36:55.403 227766 DEBUG nova.virt.libvirt.driver [None req-91d1a2e7-3db5-44bc-a87f-8b8335c98ccb 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 5cea9bfc-e97a-4d07-a251-8ca3978b5f98] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 23 04:36:55 np0005593234 nova_compute[227762]: 2026-01-23 09:36:55.494 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:55 np0005593234 podman[242096]: 2026-01-23 09:36:55.856750773 +0000 UTC m=+0.135047070 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:55.906996) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161015907059, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 2654, "num_deletes": 505, "total_data_size": 5486874, "memory_usage": 5571904, "flush_reason": "Manual Compaction"}
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161015933445, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 3579232, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28056, "largest_seqno": 30704, "table_properties": {"data_size": 3568832, "index_size": 6126, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3205, "raw_key_size": 25701, "raw_average_key_size": 20, "raw_value_size": 3545803, "raw_average_value_size": 2820, "num_data_blocks": 266, "num_entries": 1257, "num_filter_entries": 1257, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769160823, "oldest_key_time": 1769160823, "file_creation_time": 1769161015, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 26780 microseconds, and 7651 cpu microseconds.
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:55.933754) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 3579232 bytes OK
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:55.933801) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:55.935733) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:55.935762) EVENT_LOG_v1 {"time_micros": 1769161015935752, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:55.935782) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 5474127, prev total WAL file size 5474127, number of live WAL files 2.
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:55.937088) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(3495KB)], [57(10MB)]
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161015937200, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 14654500, "oldest_snapshot_seqno": -1}
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5363 keys, 8870245 bytes, temperature: kUnknown
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161015997682, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 8870245, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8834325, "index_size": 21410, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13445, "raw_key_size": 136800, "raw_average_key_size": 25, "raw_value_size": 8737495, "raw_average_value_size": 1629, "num_data_blocks": 864, "num_entries": 5363, "num_filter_entries": 5363, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769161015, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:55.997937) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 8870245 bytes
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:55.999613) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 241.9 rd, 146.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 10.6 +0.0 blob) out(8.5 +0.0 blob), read-write-amplify(6.6) write-amplify(2.5) OK, records in: 6390, records dropped: 1027 output_compression: NoCompression
Jan 23 04:36:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:55.999643) EVENT_LOG_v1 {"time_micros": 1769161015999626, "job": 34, "event": "compaction_finished", "compaction_time_micros": 60570, "compaction_time_cpu_micros": 20314, "output_level": 6, "num_output_files": 1, "total_output_size": 8870245, "num_input_records": 6390, "num_output_records": 5363, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:36:56 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:36:56 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161016000405, "job": 34, "event": "table_file_deletion", "file_number": 59}
Jan 23 04:36:56 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:36:56 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161016002754, "job": 34, "event": "table_file_deletion", "file_number": 57}
Jan 23 04:36:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:55.936974) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:36:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:56.002795) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:36:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:56.002801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:36:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:56.002802) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:36:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:56.002804) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:36:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:36:56.002806) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:36:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:56.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:56.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:58 np0005593234 nova_compute[227762]: 2026-01-23 09:36:58.011 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:36:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:36:58.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:36:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:36:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:36:58.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:36:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e172 e172: 3 total, 3 up, 3 in
Jan 23 04:36:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e172 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:37:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:00.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:37:00 np0005593234 nova_compute[227762]: 2026-01-23 09:37:00.500 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:00.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e173 e173: 3 total, 3 up, 3 in
Jan 23 04:37:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:37:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:02.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:37:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:02.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:03 np0005593234 nova_compute[227762]: 2026-01-23 09:37:03.012 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:04.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:04.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:04 np0005593234 nova_compute[227762]: 2026-01-23 09:37:04.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:05 np0005593234 nova_compute[227762]: 2026-01-23 09:37:05.504 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:06.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:06.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:06 np0005593234 nova_compute[227762]: 2026-01-23 09:37:06.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:07 np0005593234 nova_compute[227762]: 2026-01-23 09:37:07.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:07 np0005593234 nova_compute[227762]: 2026-01-23 09:37:07.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:07 np0005593234 nova_compute[227762]: 2026-01-23 09:37:07.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:07 np0005593234 nova_compute[227762]: 2026-01-23 09:37:07.778 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:07 np0005593234 nova_compute[227762]: 2026-01-23 09:37:07.778 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:07 np0005593234 nova_compute[227762]: 2026-01-23 09:37:07.778 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:37:07 np0005593234 nova_compute[227762]: 2026-01-23 09:37:07.778 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:08 np0005593234 nova_compute[227762]: 2026-01-23 09:37:08.014 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:37:08 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/749713741' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:37:08 np0005593234 nova_compute[227762]: 2026-01-23 09:37:08.204 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:37:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:08.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:37:08 np0005593234 nova_compute[227762]: 2026-01-23 09:37:08.380 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:37:08 np0005593234 nova_compute[227762]: 2026-01-23 09:37:08.381 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4833MB free_disk=20.921859741210938GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:37:08 np0005593234 nova_compute[227762]: 2026-01-23 09:37:08.381 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:08 np0005593234 nova_compute[227762]: 2026-01-23 09:37:08.381 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:08 np0005593234 nova_compute[227762]: 2026-01-23 09:37:08.671 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:37:08 np0005593234 nova_compute[227762]: 2026-01-23 09:37:08.671 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:37:08 np0005593234 nova_compute[227762]: 2026-01-23 09:37:08.697 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:08.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:37:09 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/243173174' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:37:09 np0005593234 nova_compute[227762]: 2026-01-23 09:37:09.125 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:09 np0005593234 nova_compute[227762]: 2026-01-23 09:37:09.132 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:37:09 np0005593234 nova_compute[227762]: 2026-01-23 09:37:09.175 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:37:09 np0005593234 nova_compute[227762]: 2026-01-23 09:37:09.177 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:37:09 np0005593234 nova_compute[227762]: 2026-01-23 09:37:09.177 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:10 np0005593234 nova_compute[227762]: 2026-01-23 09:37:10.176 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:10 np0005593234 nova_compute[227762]: 2026-01-23 09:37:10.177 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:37:10 np0005593234 nova_compute[227762]: 2026-01-23 09:37:10.177 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:37:10 np0005593234 nova_compute[227762]: 2026-01-23 09:37:10.196 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:37:10 np0005593234 nova_compute[227762]: 2026-01-23 09:37:10.196 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:10 np0005593234 nova_compute[227762]: 2026-01-23 09:37:10.196 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:10 np0005593234 nova_compute[227762]: 2026-01-23 09:37:10.196 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:37:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:37:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:10.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:37:10 np0005593234 nova_compute[227762]: 2026-01-23 09:37:10.507 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:10.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:12.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:12.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:12 np0005593234 podman[242226]: 2026-01-23 09:37:12.755537417 +0000 UTC m=+0.053686525 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 23 04:37:13 np0005593234 nova_compute[227762]: 2026-01-23 09:37:13.015 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:13 np0005593234 nova_compute[227762]: 2026-01-23 09:37:13.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:14.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:14 np0005593234 nova_compute[227762]: 2026-01-23 09:37:14.685 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquiring lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:14 np0005593234 nova_compute[227762]: 2026-01-23 09:37:14.686 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:14 np0005593234 nova_compute[227762]: 2026-01-23 09:37:14.709 227766 DEBUG nova.compute.manager [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:37:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:14.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:14 np0005593234 nova_compute[227762]: 2026-01-23 09:37:14.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:14 np0005593234 nova_compute[227762]: 2026-01-23 09:37:14.792 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:14 np0005593234 nova_compute[227762]: 2026-01-23 09:37:14.792 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:14 np0005593234 nova_compute[227762]: 2026-01-23 09:37:14.800 227766 DEBUG nova.virt.hardware [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:37:14 np0005593234 nova_compute[227762]: 2026-01-23 09:37:14.800 227766 INFO nova.compute.claims [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:37:14 np0005593234 nova_compute[227762]: 2026-01-23 09:37:14.953 227766 DEBUG oslo_concurrency.processutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:37:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2113290337' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.393 227766 DEBUG oslo_concurrency.processutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.399 227766 DEBUG nova.compute.provider_tree [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.489 227766 DEBUG nova.scheduler.client.report [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.510 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.516 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.724s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.516 227766 DEBUG nova.compute.manager [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.576 227766 DEBUG nova.compute.manager [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.576 227766 DEBUG nova.network.neutron [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.604 227766 INFO nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.640 227766 DEBUG nova.compute.manager [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.706 227766 INFO nova.virt.block_device [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Booting with volume b06791ec-66fd-4114-8448-7ea0b7f88f25 at /dev/vda#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.876 227766 DEBUG os_brick.utils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.877 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.889 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.889 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[4999f0f3-a10c-4475-994c-35a7f73185df]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.891 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.899 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.899 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[93c2e942-f073-4600-9b16-320b667a911a]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.901 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.909 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.909 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[5d335a4e-1cd3-45b5-b1a3-5338da161cc6]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.911 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[ed71287e-c5ae-40b9-a189-cf141942ca39]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.911 227766 DEBUG oslo_concurrency.processutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.932 227766 DEBUG oslo_concurrency.processutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CMD "nvme version" returned: 0 in 0.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.935 227766 DEBUG os_brick.initiator.connectors.lightos [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.935 227766 DEBUG os_brick.initiator.connectors.lightos [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.936 227766 DEBUG os_brick.initiator.connectors.lightos [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.936 227766 DEBUG os_brick.utils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] <== get_connector_properties: return (59ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 04:37:15 np0005593234 nova_compute[227762]: 2026-01-23 09:37:15.936 227766 DEBUG nova.virt.block_device [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Updating existing volume attachment record: 157a81cb-fd76-48d4-abf5-e6fb564e20a5 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 04:37:16 np0005593234 nova_compute[227762]: 2026-01-23 09:37:16.261 227766 DEBUG nova.policy [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '4f72965e950c4761bfedd99fdc411a83', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd0dce6e339c349d4ab97cee5e49fff3a', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:37:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:37:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:16.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:37:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:16.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:37:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3752954385' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:37:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:17.253 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:37:17 np0005593234 nova_compute[227762]: 2026-01-23 09:37:17.254 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:17.254 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:37:17 np0005593234 nova_compute[227762]: 2026-01-23 09:37:17.405 227766 DEBUG nova.network.neutron [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Successfully created port: 27e277b3-2135-4e3e-b336-e0da87509465 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:37:17 np0005593234 nova_compute[227762]: 2026-01-23 09:37:17.469 227766 DEBUG nova.compute.manager [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:37:17 np0005593234 nova_compute[227762]: 2026-01-23 09:37:17.471 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:37:17 np0005593234 nova_compute[227762]: 2026-01-23 09:37:17.471 227766 INFO nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Creating image(s)#033[00m
Jan 23 04:37:17 np0005593234 nova_compute[227762]: 2026-01-23 09:37:17.472 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 04:37:17 np0005593234 nova_compute[227762]: 2026-01-23 09:37:17.472 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Ensure instance console log exists: /var/lib/nova/instances/261ab1ec-f79b-4867-bcb6-1c1d7491120e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:37:17 np0005593234 nova_compute[227762]: 2026-01-23 09:37:17.472 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:17 np0005593234 nova_compute[227762]: 2026-01-23 09:37:17.473 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:17 np0005593234 nova_compute[227762]: 2026-01-23 09:37:17.473 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:18 np0005593234 nova_compute[227762]: 2026-01-23 09:37:18.021 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:37:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:18.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:37:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:18.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:19 np0005593234 nova_compute[227762]: 2026-01-23 09:37:19.580 227766 DEBUG nova.network.neutron [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Successfully updated port: 27e277b3-2135-4e3e-b336-e0da87509465 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:37:19 np0005593234 nova_compute[227762]: 2026-01-23 09:37:19.614 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquiring lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:37:19 np0005593234 nova_compute[227762]: 2026-01-23 09:37:19.615 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquired lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:37:19 np0005593234 nova_compute[227762]: 2026-01-23 09:37:19.615 227766 DEBUG nova.network.neutron [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:37:19 np0005593234 nova_compute[227762]: 2026-01-23 09:37:19.736 227766 DEBUG nova.compute.manager [req-aa67140f-84fb-4ed0-b905-61ae97838929 req-b84452a9-944a-4dfb-8708-e7d2a702bfd3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-changed-27e277b3-2135-4e3e-b336-e0da87509465 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:19 np0005593234 nova_compute[227762]: 2026-01-23 09:37:19.737 227766 DEBUG nova.compute.manager [req-aa67140f-84fb-4ed0-b905-61ae97838929 req-b84452a9-944a-4dfb-8708-e7d2a702bfd3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Refreshing instance network info cache due to event network-changed-27e277b3-2135-4e3e-b336-e0da87509465. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:37:19 np0005593234 nova_compute[227762]: 2026-01-23 09:37:19.737 227766 DEBUG oslo_concurrency.lockutils [req-aa67140f-84fb-4ed0-b905-61ae97838929 req-b84452a9-944a-4dfb-8708-e7d2a702bfd3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:37:20 np0005593234 nova_compute[227762]: 2026-01-23 09:37:20.177 227766 DEBUG nova.network.neutron [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:37:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:20.256 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:20.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:20 np0005593234 nova_compute[227762]: 2026-01-23 09:37:20.514 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:37:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:20.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.538 227766 DEBUG nova.network.neutron [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Updating instance_info_cache with network_info: [{"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.565 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Releasing lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.566 227766 DEBUG nova.compute.manager [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Instance network_info: |[{"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.566 227766 DEBUG oslo_concurrency.lockutils [req-aa67140f-84fb-4ed0-b905-61ae97838929 req-b84452a9-944a-4dfb-8708-e7d2a702bfd3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.566 227766 DEBUG nova.network.neutron [req-aa67140f-84fb-4ed0-b905-61ae97838929 req-b84452a9-944a-4dfb-8708-e7d2a702bfd3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Refreshing network info cache for port 27e277b3-2135-4e3e-b336-e0da87509465 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.572 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Start _get_guest_xml network_info=[{"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'boot_index': 0, 'mount_device': '/dev/vda', 'delete_on_termination': True, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-b06791ec-66fd-4114-8448-7ea0b7f88f25', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'b06791ec-66fd-4114-8448-7ea0b7f88f25', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '261ab1ec-f79b-4867-bcb6-1c1d7491120e', 'attached_at': '', 'detached_at': '', 'volume_id': 'b06791ec-66fd-4114-8448-7ea0b7f88f25', 'serial': 'b06791ec-66fd-4114-8448-7ea0b7f88f25'}, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '157a81cb-fd76-48d4-abf5-e6fb564e20a5', 'volume_type': None}], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.577 227766 WARNING nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.582 227766 DEBUG nova.virt.libvirt.host [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.584 227766 DEBUG nova.virt.libvirt.host [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.587 227766 DEBUG nova.virt.libvirt.host [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.588 227766 DEBUG nova.virt.libvirt.host [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.589 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.589 227766 DEBUG nova.virt.hardware [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.589 227766 DEBUG nova.virt.hardware [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.590 227766 DEBUG nova.virt.hardware [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.590 227766 DEBUG nova.virt.hardware [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.590 227766 DEBUG nova.virt.hardware [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.590 227766 DEBUG nova.virt.hardware [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.591 227766 DEBUG nova.virt.hardware [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.591 227766 DEBUG nova.virt.hardware [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.591 227766 DEBUG nova.virt.hardware [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.591 227766 DEBUG nova.virt.hardware [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.592 227766 DEBUG nova.virt.hardware [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.625 227766 DEBUG nova.storage.rbd_utils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] rbd image 261ab1ec-f79b-4867-bcb6-1c1d7491120e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:21 np0005593234 nova_compute[227762]: 2026-01-23 09:37:21.631 227766 DEBUG oslo_concurrency.processutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:37:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3042404784' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.053 227766 DEBUG oslo_concurrency.processutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.083 227766 DEBUG nova.virt.libvirt.vif [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-724421301',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-724421301',id=29,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d0dce6e339c349d4ab97cee5e49fff3a',ramdisk_id='',reservation_id='r-106tqp53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1207260646',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1207260646-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,upd
ated_at=2026-01-23T09:37:15Z,user_data=None,user_id='4f72965e950c4761bfedd99fdc411a83',uuid=261ab1ec-f79b-4867-bcb6-1c1d7491120e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.084 227766 DEBUG nova.network.os_vif_util [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Converting VIF {"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.085 227766 DEBUG nova.network.os_vif_util [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:06:0e,bridge_name='br-int',has_traffic_filtering=True,id=27e277b3-2135-4e3e-b336-e0da87509465,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27e277b3-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.086 227766 DEBUG nova.objects.instance [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lazy-loading 'pci_devices' on Instance uuid 261ab1ec-f79b-4867-bcb6-1c1d7491120e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.119 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  <uuid>261ab1ec-f79b-4867-bcb6-1c1d7491120e</uuid>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  <name>instance-0000001d</name>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <nova:name>tempest-LiveAutoBlockMigrationV225Test-server-724421301</nova:name>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:37:21</nova:creationTime>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <nova:user uuid="4f72965e950c4761bfedd99fdc411a83">tempest-LiveAutoBlockMigrationV225Test-1207260646-project-member</nova:user>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <nova:project uuid="d0dce6e339c349d4ab97cee5e49fff3a">tempest-LiveAutoBlockMigrationV225Test-1207260646</nova:project>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <nova:port uuid="27e277b3-2135-4e3e-b336-e0da87509465">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <entry name="serial">261ab1ec-f79b-4867-bcb6-1c1d7491120e</entry>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <entry name="uuid">261ab1ec-f79b-4867-bcb6-1c1d7491120e</entry>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/261ab1ec-f79b-4867-bcb6-1c1d7491120e_disk.config">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="volumes/volume-b06791ec-66fd-4114-8448-7ea0b7f88f25">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <serial>b06791ec-66fd-4114-8448-7ea0b7f88f25</serial>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:34:06:0e"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <target dev="tap27e277b3-21"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/261ab1ec-f79b-4867-bcb6-1c1d7491120e/console.log" append="off"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:37:22 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:37:22 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:37:22 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:37:22 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.121 227766 DEBUG nova.compute.manager [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Preparing to wait for external event network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.121 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquiring lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.121 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.122 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.122 227766 DEBUG nova.virt.libvirt.vif [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-724421301',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-724421301',id=29,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d0dce6e339c349d4ab97cee5e49fff3a',ramdisk_id='',reservation_id='r-106tqp53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1207260646',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1207260646-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:37:15Z,user_data=None,user_id='4f72965e950c4761bfedd99fdc411a83',uuid=261ab1ec-f79b-4867-bcb6-1c1d7491120e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.123 227766 DEBUG nova.network.os_vif_util [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Converting VIF {"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.123 227766 DEBUG nova.network.os_vif_util [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:06:0e,bridge_name='br-int',has_traffic_filtering=True,id=27e277b3-2135-4e3e-b336-e0da87509465,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27e277b3-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.124 227766 DEBUG os_vif [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:06:0e,bridge_name='br-int',has_traffic_filtering=True,id=27e277b3-2135-4e3e-b336-e0da87509465,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27e277b3-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.124 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.125 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.125 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.128 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.129 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27e277b3-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.129 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap27e277b3-21, col_values=(('external_ids', {'iface-id': '27e277b3-2135-4e3e-b336-e0da87509465', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:06:0e', 'vm-uuid': '261ab1ec-f79b-4867-bcb6-1c1d7491120e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.130 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:22 np0005593234 NetworkManager[48942]: <info>  [1769161042.1315] manager: (tap27e277b3-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.133 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.136 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.137 227766 INFO os_vif [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:06:0e,bridge_name='br-int',has_traffic_filtering=True,id=27e277b3-2135-4e3e-b336-e0da87509465,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27e277b3-21')#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.236 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.237 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.237 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] No VIF found with MAC fa:16:3e:34:06:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.238 227766 INFO nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Using config drive#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.266 227766 DEBUG nova.storage.rbd_utils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] rbd image 261ab1ec-f79b-4867-bcb6-1c1d7491120e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:22.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:22.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.775 227766 INFO nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Creating config drive at /var/lib/nova/instances/261ab1ec-f79b-4867-bcb6-1c1d7491120e/disk.config#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.780 227766 DEBUG oslo_concurrency.processutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/261ab1ec-f79b-4867-bcb6-1c1d7491120e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqv6cj2zq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.908 227766 DEBUG oslo_concurrency.processutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/261ab1ec-f79b-4867-bcb6-1c1d7491120e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqv6cj2zq" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.940 227766 DEBUG nova.storage.rbd_utils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] rbd image 261ab1ec-f79b-4867-bcb6-1c1d7491120e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:22 np0005593234 nova_compute[227762]: 2026-01-23 09:37:22.943 227766 DEBUG oslo_concurrency.processutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/261ab1ec-f79b-4867-bcb6-1c1d7491120e/disk.config 261ab1ec-f79b-4867-bcb6-1c1d7491120e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.023 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.096 227766 DEBUG oslo_concurrency.processutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/261ab1ec-f79b-4867-bcb6-1c1d7491120e/disk.config 261ab1ec-f79b-4867-bcb6-1c1d7491120e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.097 227766 INFO nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Deleting local config drive /var/lib/nova/instances/261ab1ec-f79b-4867-bcb6-1c1d7491120e/disk.config because it was imported into RBD.#033[00m
Jan 23 04:37:23 np0005593234 kernel: tap27e277b3-21: entered promiscuous mode
Jan 23 04:37:23 np0005593234 NetworkManager[48942]: <info>  [1769161043.1476] manager: (tap27e277b3-21): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 23 04:37:23 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:23Z|00062|binding|INFO|Claiming lport 27e277b3-2135-4e3e-b336-e0da87509465 for this chassis.
Jan 23 04:37:23 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:23Z|00063|binding|INFO|27e277b3-2135-4e3e-b336-e0da87509465: Claiming fa:16:3e:34:06:0e 10.100.0.11
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.148 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.159 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:23 np0005593234 systemd-udevd[242390]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:37:23 np0005593234 NetworkManager[48942]: <info>  [1769161043.1848] device (tap27e277b3-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:37:23 np0005593234 NetworkManager[48942]: <info>  [1769161043.1853] device (tap27e277b3-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:37:23 np0005593234 systemd-machined[195626]: New machine qemu-13-instance-0000001d.
Jan 23 04:37:23 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:23Z|00064|binding|INFO|Setting lport 27e277b3-2135-4e3e-b336-e0da87509465 ovn-installed in OVS
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.228 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:23 np0005593234 systemd[1]: Started Virtual Machine qemu-13-instance-0000001d.
Jan 23 04:37:23 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:23Z|00065|binding|INFO|Setting lport 27e277b3-2135-4e3e-b336-e0da87509465 up in Southbound
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.415 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:06:0e 10.100.0.11'], port_security=['fa:16:3e:34:06:0e 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '261ab1ec-f79b-4867-bcb6-1c1d7491120e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0dce6e339c349d4ab97cee5e49fff3a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0179c400-b2f2-4914-b563-942a61ef1858', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cbb60528-b878-42fd-9c2f-0a3345010b1a, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=27e277b3-2135-4e3e-b336-e0da87509465) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.416 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 27e277b3-2135-4e3e-b336-e0da87509465 in datapath 8eab8076-0848-4daf-bbac-f3f8b65ca750 bound to our chassis#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.418 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8eab8076-0848-4daf-bbac-f3f8b65ca750#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.429 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[881c4516-ea54-4d3e-8c12-e2e43f507f90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.430 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8eab8076-01 in ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.432 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8eab8076-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.432 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5b4447cf-ea6d-44f2-b139-e1c62d3a288e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.433 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[43003290-5010-491c-8d99-dd2f4513dd57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.449 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[1d658bd7-5cc4-4e61-af39-4019682d9992]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.475 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d8840876-34f7-4fd8-8fcc-2856fb9f3b97]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.509 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[adba433d-c0bb-4833-9cab-d7c80b3fcff5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.514 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f3f51bc6-36d7-4786-9392-a34f0191a443]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 NetworkManager[48942]: <info>  [1769161043.5157] manager: (tap8eab8076-00): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.550 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[5aefcbec-a607-49d2-a4d0-2036204eb6c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.553 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[48673648-4923-4e29-a939-f8b3b5fa788f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 NetworkManager[48942]: <info>  [1769161043.5758] device (tap8eab8076-00): carrier: link connected
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.581 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7d992cc1-6f90-4f1e-87df-3517b7ad8ab4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.595 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e07870-85a7-4299-bf6a-ffb04c14e77d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8eab8076-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:5b:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493853, 'reachable_time': 15189, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242462, 'error': None, 'target': 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.613 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d6cc8711-8dd5-4199-a5c1-a2439bb1b774]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe46:5b99'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 493853, 'tstamp': 493853}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242464, 'error': None, 'target': 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.629 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[72f67619-0459-4f67-a2b8-64f898468199]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8eab8076-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:5b:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493853, 'reachable_time': 15189, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242469, 'error': None, 'target': 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.658 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e7bc7185-89f1-4a10-9f40-3dfa9b8626d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.709 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161043.7083647, 261ab1ec-f79b-4867-bcb6-1c1d7491120e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.709 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] VM Started (Lifecycle Event)#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.716 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4b412bd9-a25f-4e6f-a89d-d56b6132e4d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.717 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8eab8076-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.718 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.718 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8eab8076-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.719 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:23 np0005593234 NetworkManager[48942]: <info>  [1769161043.7205] manager: (tap8eab8076-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 23 04:37:23 np0005593234 kernel: tap8eab8076-00: entered promiscuous mode
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.722 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.723 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8eab8076-00, col_values=(('external_ids', {'iface-id': 'b545a870-aa18-4f64-a8a7-f8512824c4cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.724 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:23 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:23Z|00066|binding|INFO|Releasing lport b545a870-aa18-4f64-a8a7-f8512824c4cc from this chassis (sb_readonly=0)
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.725 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.725 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8eab8076-0848-4daf-bbac-f3f8b65ca750.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8eab8076-0848-4daf-bbac-f3f8b65ca750.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.726 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bd97e35a-9e0b-4694-bc46-314affc936be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.727 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-8eab8076-0848-4daf-bbac-f3f8b65ca750
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/8eab8076-0848-4daf-bbac-f3f8b65ca750.pid.haproxy
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 8eab8076-0848-4daf-bbac-f3f8b65ca750
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:23.727 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'env', 'PROCESS_TAG=haproxy-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8eab8076-0848-4daf-bbac-f3f8b65ca750.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.730 227766 DEBUG nova.network.neutron [req-aa67140f-84fb-4ed0-b905-61ae97838929 req-b84452a9-944a-4dfb-8708-e7d2a702bfd3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Updated VIF entry in instance network info cache for port 27e277b3-2135-4e3e-b336-e0da87509465. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.731 227766 DEBUG nova.network.neutron [req-aa67140f-84fb-4ed0-b905-61ae97838929 req-b84452a9-944a-4dfb-8708-e7d2a702bfd3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Updating instance_info_cache with network_info: [{"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.738 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.759 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.764 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161043.70978, 261ab1ec-f79b-4867-bcb6-1c1d7491120e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.765 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.768 227766 DEBUG oslo_concurrency.lockutils [req-aa67140f-84fb-4ed0-b905-61ae97838929 req-b84452a9-944a-4dfb-8708-e7d2a702bfd3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.802 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.806 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:37:23 np0005593234 nova_compute[227762]: 2026-01-23 09:37:23.843 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:37:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e173 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:24 np0005593234 podman[242502]: 2026-01-23 09:37:24.12949833 +0000 UTC m=+0.047312305 container create 36185456e99a85b362d355c80746e5c727667e55a2c3abc85a136840ce8132db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 04:37:24 np0005593234 systemd[1]: Started libpod-conmon-36185456e99a85b362d355c80746e5c727667e55a2c3abc85a136840ce8132db.scope.
Jan 23 04:37:24 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:37:24 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55b160c5437a68b90df682d321357d8ee13a6b68a1f12db6fd33ae577d3074f4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:37:24 np0005593234 podman[242502]: 2026-01-23 09:37:24.199389949 +0000 UTC m=+0.117203944 container init 36185456e99a85b362d355c80746e5c727667e55a2c3abc85a136840ce8132db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:37:24 np0005593234 podman[242502]: 2026-01-23 09:37:24.103947784 +0000 UTC m=+0.021761779 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:37:24 np0005593234 podman[242502]: 2026-01-23 09:37:24.206964845 +0000 UTC m=+0.124778830 container start 36185456e99a85b362d355c80746e5c727667e55a2c3abc85a136840ce8132db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 04:37:24 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[242518]: [NOTICE]   (242522) : New worker (242524) forked
Jan 23 04:37:24 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[242518]: [NOTICE]   (242522) : Loading success.
Jan 23 04:37:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:37:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:24.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:37:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:24.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.174 227766 DEBUG nova.compute.manager [req-f57bf3ed-f472-4f66-a48c-7978ef9673f6 req-1d840444-fd26-4d1e-b509-8569f3d1e86a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.174 227766 DEBUG oslo_concurrency.lockutils [req-f57bf3ed-f472-4f66-a48c-7978ef9673f6 req-1d840444-fd26-4d1e-b509-8569f3d1e86a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.175 227766 DEBUG oslo_concurrency.lockutils [req-f57bf3ed-f472-4f66-a48c-7978ef9673f6 req-1d840444-fd26-4d1e-b509-8569f3d1e86a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.175 227766 DEBUG oslo_concurrency.lockutils [req-f57bf3ed-f472-4f66-a48c-7978ef9673f6 req-1d840444-fd26-4d1e-b509-8569f3d1e86a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.175 227766 DEBUG nova.compute.manager [req-f57bf3ed-f472-4f66-a48c-7978ef9673f6 req-1d840444-fd26-4d1e-b509-8569f3d1e86a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Processing event network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.175 227766 DEBUG nova.compute.manager [req-f57bf3ed-f472-4f66-a48c-7978ef9673f6 req-1d840444-fd26-4d1e-b509-8569f3d1e86a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.175 227766 DEBUG oslo_concurrency.lockutils [req-f57bf3ed-f472-4f66-a48c-7978ef9673f6 req-1d840444-fd26-4d1e-b509-8569f3d1e86a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.175 227766 DEBUG oslo_concurrency.lockutils [req-f57bf3ed-f472-4f66-a48c-7978ef9673f6 req-1d840444-fd26-4d1e-b509-8569f3d1e86a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.175 227766 DEBUG oslo_concurrency.lockutils [req-f57bf3ed-f472-4f66-a48c-7978ef9673f6 req-1d840444-fd26-4d1e-b509-8569f3d1e86a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.176 227766 DEBUG nova.compute.manager [req-f57bf3ed-f472-4f66-a48c-7978ef9673f6 req-1d840444-fd26-4d1e-b509-8569f3d1e86a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] No waiting events found dispatching network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.176 227766 WARNING nova.compute.manager [req-f57bf3ed-f472-4f66-a48c-7978ef9673f6 req-1d840444-fd26-4d1e-b509-8569f3d1e86a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received unexpected event network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.176 227766 DEBUG nova.compute.manager [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.180 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.181 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161046.181419, 261ab1ec-f79b-4867-bcb6-1c1d7491120e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.181 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.185 227766 INFO nova.virt.libvirt.driver [-] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Instance spawned successfully.#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.186 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.259 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.266 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.269 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.269 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.270 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.270 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.271 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.271 227766 DEBUG nova.virt.libvirt.driver [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:26.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.325 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.345 227766 INFO nova.compute.manager [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Took 8.88 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.345 227766 DEBUG nova.compute.manager [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.421 227766 INFO nova.compute.manager [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Took 11.66 seconds to build instance.#033[00m
Jan 23 04:37:26 np0005593234 nova_compute[227762]: 2026-01-23 09:37:26.443 227766 DEBUG oslo_concurrency.lockutils [None req-b8c739d5-82e3-4aec-aab1-d8618732a986 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.757s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e174 e174: 3 total, 3 up, 3 in
Jan 23 04:37:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:26.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:26 np0005593234 podman[242534]: 2026-01-23 09:37:26.79861264 +0000 UTC m=+0.089628625 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 04:37:27 np0005593234 nova_compute[227762]: 2026-01-23 09:37:27.196 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:28 np0005593234 nova_compute[227762]: 2026-01-23 09:37:28.031 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:37:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:28.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:37:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:28.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:30.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:30.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:32 np0005593234 nova_compute[227762]: 2026-01-23 09:37:32.199 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:32.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:32.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:33 np0005593234 nova_compute[227762]: 2026-01-23 09:37:33.070 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:33 np0005593234 nova_compute[227762]: 2026-01-23 09:37:33.716 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "aede4522-9d5a-4475-9dd9-46c044901917" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:33 np0005593234 nova_compute[227762]: 2026-01-23 09:37:33.717 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:33 np0005593234 nova_compute[227762]: 2026-01-23 09:37:33.769 227766 DEBUG nova.compute.manager [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:37:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 e175: 3 total, 3 up, 3 in
Jan 23 04:37:33 np0005593234 nova_compute[227762]: 2026-01-23 09:37:33.789 227766 DEBUG nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Check if temp file /var/lib/nova/instances/tmpc7fzoq9f exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065#033[00m
Jan 23 04:37:33 np0005593234 nova_compute[227762]: 2026-01-23 09:37:33.789 227766 DEBUG nova.compute.manager [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc7fzoq9f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='261ab1ec-f79b-4867-bcb6-1c1d7491120e',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587#033[00m
Jan 23 04:37:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:34 np0005593234 nova_compute[227762]: 2026-01-23 09:37:34.176 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:34 np0005593234 nova_compute[227762]: 2026-01-23 09:37:34.177 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:34 np0005593234 nova_compute[227762]: 2026-01-23 09:37:34.188 227766 DEBUG nova.virt.hardware [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:37:34 np0005593234 nova_compute[227762]: 2026-01-23 09:37:34.190 227766 INFO nova.compute.claims [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:37:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:37:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:34.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:37:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:37:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:34.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.054 227766 DEBUG oslo_concurrency.processutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:37:35 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3971830178' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.497 227766 DEBUG oslo_concurrency.processutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.504 227766 DEBUG nova.compute.provider_tree [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.521 227766 DEBUG nova.scheduler.client.report [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.554 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.377s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.555 227766 DEBUG nova.compute.manager [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.618 227766 DEBUG nova.compute.manager [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.618 227766 DEBUG nova.network.neutron [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.641 227766 INFO nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.694 227766 DEBUG nova.compute.manager [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.825 227766 DEBUG nova.compute.manager [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.826 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.826 227766 INFO nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Creating image(s)#033[00m
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.864 227766 DEBUG nova.storage.rbd_utils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] rbd image aede4522-9d5a-4475-9dd9-46c044901917_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:35 np0005593234 nova_compute[227762]: 2026-01-23 09:37:35.957 227766 DEBUG nova.storage.rbd_utils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] rbd image aede4522-9d5a-4475-9dd9-46c044901917_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:36 np0005593234 nova_compute[227762]: 2026-01-23 09:37:36.050 227766 DEBUG nova.storage.rbd_utils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] rbd image aede4522-9d5a-4475-9dd9-46c044901917_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:36 np0005593234 nova_compute[227762]: 2026-01-23 09:37:36.054 227766 DEBUG oslo_concurrency.processutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:36 np0005593234 nova_compute[227762]: 2026-01-23 09:37:36.112 227766 DEBUG oslo_concurrency.processutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:36 np0005593234 nova_compute[227762]: 2026-01-23 09:37:36.113 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:36 np0005593234 nova_compute[227762]: 2026-01-23 09:37:36.114 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:36 np0005593234 nova_compute[227762]: 2026-01-23 09:37:36.115 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:36 np0005593234 nova_compute[227762]: 2026-01-23 09:37:36.141 227766 DEBUG nova.storage.rbd_utils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] rbd image aede4522-9d5a-4475-9dd9-46c044901917_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:36 np0005593234 nova_compute[227762]: 2026-01-23 09:37:36.144 227766 DEBUG oslo_concurrency.processutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 aede4522-9d5a-4475-9dd9-46c044901917_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:36.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:36 np0005593234 nova_compute[227762]: 2026-01-23 09:37:36.411 227766 DEBUG nova.policy [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '726bd44b7ec443a0a4b8b632b06c622e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9f68f8c2203944c9a6e44a6756c8b4b9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:37:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:37:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:36.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:37:36 np0005593234 nova_compute[227762]: 2026-01-23 09:37:36.813 227766 DEBUG oslo_concurrency.processutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 aede4522-9d5a-4475-9dd9-46c044901917_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.669s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:36 np0005593234 nova_compute[227762]: 2026-01-23 09:37:36.885 227766 DEBUG nova.storage.rbd_utils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] resizing rbd image aede4522-9d5a-4475-9dd9-46c044901917_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:37:37 np0005593234 nova_compute[227762]: 2026-01-23 09:37:37.027 227766 DEBUG nova.objects.instance [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lazy-loading 'migration_context' on Instance uuid aede4522-9d5a-4475-9dd9-46c044901917 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:37:37 np0005593234 nova_compute[227762]: 2026-01-23 09:37:37.043 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:37:37 np0005593234 nova_compute[227762]: 2026-01-23 09:37:37.044 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Ensure instance console log exists: /var/lib/nova/instances/aede4522-9d5a-4475-9dd9-46c044901917/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:37:37 np0005593234 nova_compute[227762]: 2026-01-23 09:37:37.044 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:37 np0005593234 nova_compute[227762]: 2026-01-23 09:37:37.045 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:37 np0005593234 nova_compute[227762]: 2026-01-23 09:37:37.045 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:37 np0005593234 nova_compute[227762]: 2026-01-23 09:37:37.202 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:38 np0005593234 nova_compute[227762]: 2026-01-23 09:37:38.072 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:37:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:38.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:37:38 np0005593234 nova_compute[227762]: 2026-01-23 09:37:38.414 227766 DEBUG nova.network.neutron [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Successfully created port: d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:37:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:38.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:39 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:39Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:34:06:0e 10.100.0.11
Jan 23 04:37:39 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:39Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:34:06:0e 10.100.0.11
Jan 23 04:37:39 np0005593234 nova_compute[227762]: 2026-01-23 09:37:39.492 227766 DEBUG nova.network.neutron [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Successfully updated port: d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:37:39 np0005593234 nova_compute[227762]: 2026-01-23 09:37:39.520 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "refresh_cache-aede4522-9d5a-4475-9dd9-46c044901917" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:37:39 np0005593234 nova_compute[227762]: 2026-01-23 09:37:39.521 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquired lock "refresh_cache-aede4522-9d5a-4475-9dd9-46c044901917" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:37:39 np0005593234 nova_compute[227762]: 2026-01-23 09:37:39.521 227766 DEBUG nova.network.neutron [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:37:39 np0005593234 nova_compute[227762]: 2026-01-23 09:37:39.680 227766 DEBUG nova.compute.manager [req-30a52f08-f6b2-4287-a24b-feb6f3bf6fdb req-fdc95ae4-68c0-42bf-86c2-aeaeff9aab08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Received event network-changed-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:39 np0005593234 nova_compute[227762]: 2026-01-23 09:37:39.681 227766 DEBUG nova.compute.manager [req-30a52f08-f6b2-4287-a24b-feb6f3bf6fdb req-fdc95ae4-68c0-42bf-86c2-aeaeff9aab08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Refreshing instance network info cache due to event network-changed-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:37:39 np0005593234 nova_compute[227762]: 2026-01-23 09:37:39.681 227766 DEBUG oslo_concurrency.lockutils [req-30a52f08-f6b2-4287-a24b-feb6f3bf6fdb req-fdc95ae4-68c0-42bf-86c2-aeaeff9aab08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-aede4522-9d5a-4475-9dd9-46c044901917" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:37:39 np0005593234 nova_compute[227762]: 2026-01-23 09:37:39.799 227766 DEBUG nova.network.neutron [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:37:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:37:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:40.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:37:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:40.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.435 227766 DEBUG nova.network.neutron [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Updating instance_info_cache with network_info: [{"id": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "address": "fa:16:3e:ff:c5:b8", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6a90f4f-8b", "ovs_interfaceid": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.480 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Releasing lock "refresh_cache-aede4522-9d5a-4475-9dd9-46c044901917" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.481 227766 DEBUG nova.compute.manager [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Instance network_info: |[{"id": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "address": "fa:16:3e:ff:c5:b8", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6a90f4f-8b", "ovs_interfaceid": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.482 227766 DEBUG oslo_concurrency.lockutils [req-30a52f08-f6b2-4287-a24b-feb6f3bf6fdb req-fdc95ae4-68c0-42bf-86c2-aeaeff9aab08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-aede4522-9d5a-4475-9dd9-46c044901917" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.483 227766 DEBUG nova.network.neutron [req-30a52f08-f6b2-4287-a24b-feb6f3bf6fdb req-fdc95ae4-68c0-42bf-86c2-aeaeff9aab08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Refreshing network info cache for port d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.487 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Start _get_guest_xml network_info=[{"id": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "address": "fa:16:3e:ff:c5:b8", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6a90f4f-8b", "ovs_interfaceid": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.495 227766 WARNING nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.502 227766 DEBUG nova.virt.libvirt.host [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.503 227766 DEBUG nova.virt.libvirt.host [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.506 227766 DEBUG nova.virt.libvirt.host [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.507 227766 DEBUG nova.virt.libvirt.host [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.509 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.509 227766 DEBUG nova.virt.hardware [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.510 227766 DEBUG nova.virt.hardware [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.510 227766 DEBUG nova.virt.hardware [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.510 227766 DEBUG nova.virt.hardware [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.511 227766 DEBUG nova.virt.hardware [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.511 227766 DEBUG nova.virt.hardware [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.511 227766 DEBUG nova.virt.hardware [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.511 227766 DEBUG nova.virt.hardware [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.512 227766 DEBUG nova.virt.hardware [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.512 227766 DEBUG nova.virt.hardware [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.512 227766 DEBUG nova.virt.hardware [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.516 227766 DEBUG oslo_concurrency.processutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.923 227766 DEBUG nova.compute.manager [req-ea20144b-2b00-488e-bc37-bf6c573add81 req-82f911d2-acad-4f05-8836-b8fdf00b7903 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-vif-unplugged-27e277b3-2135-4e3e-b336-e0da87509465 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.924 227766 DEBUG oslo_concurrency.lockutils [req-ea20144b-2b00-488e-bc37-bf6c573add81 req-82f911d2-acad-4f05-8836-b8fdf00b7903 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.924 227766 DEBUG oslo_concurrency.lockutils [req-ea20144b-2b00-488e-bc37-bf6c573add81 req-82f911d2-acad-4f05-8836-b8fdf00b7903 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.925 227766 DEBUG oslo_concurrency.lockutils [req-ea20144b-2b00-488e-bc37-bf6c573add81 req-82f911d2-acad-4f05-8836-b8fdf00b7903 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.925 227766 DEBUG nova.compute.manager [req-ea20144b-2b00-488e-bc37-bf6c573add81 req-82f911d2-acad-4f05-8836-b8fdf00b7903 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] No waiting events found dispatching network-vif-unplugged-27e277b3-2135-4e3e-b336-e0da87509465 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.926 227766 DEBUG nova.compute.manager [req-ea20144b-2b00-488e-bc37-bf6c573add81 req-82f911d2-acad-4f05-8836-b8fdf00b7903 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-vif-unplugged-27e277b3-2135-4e3e-b336-e0da87509465 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:37:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:37:41 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/111306356' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:37:41 np0005593234 nova_compute[227762]: 2026-01-23 09:37:41.982 227766 DEBUG oslo_concurrency.processutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.007 227766 DEBUG nova.storage.rbd_utils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] rbd image aede4522-9d5a-4475-9dd9-46c044901917_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.011 227766 DEBUG oslo_concurrency.processutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.206 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:42.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:37:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1135046651' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.455 227766 DEBUG oslo_concurrency.processutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.457 227766 DEBUG nova.virt.libvirt.vif [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:37:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-107272333',display_name='tempest-VolumesAdminNegativeTest-server-107272333',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-volumesadminnegativetest-server-107272333',id=32,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f68f8c2203944c9a6e44a6756c8b4b9',ramdisk_id='',reservation_id='r-9tfd70gd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-905168495',owner_user_name='tempest-VolumesAdmi
nNegativeTest-905168495-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:37:35Z,user_data=None,user_id='726bd44b7ec443a0a4b8b632b06c622e',uuid=aede4522-9d5a-4475-9dd9-46c044901917,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "address": "fa:16:3e:ff:c5:b8", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6a90f4f-8b", "ovs_interfaceid": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.457 227766 DEBUG nova.network.os_vif_util [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Converting VIF {"id": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "address": "fa:16:3e:ff:c5:b8", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6a90f4f-8b", "ovs_interfaceid": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.458 227766 DEBUG nova.network.os_vif_util [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:c5:b8,bridge_name='br-int',has_traffic_filtering=True,id=d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa,network=Network(ef05741c-2d3e-419c-adbb-a2a3bca97f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6a90f4f-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.460 227766 DEBUG nova.objects.instance [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lazy-loading 'pci_devices' on Instance uuid aede4522-9d5a-4475-9dd9-46c044901917 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.485 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <uuid>aede4522-9d5a-4475-9dd9-46c044901917</uuid>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <name>instance-00000020</name>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <nova:name>tempest-VolumesAdminNegativeTest-server-107272333</nova:name>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:37:41</nova:creationTime>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <nova:user uuid="726bd44b7ec443a0a4b8b632b06c622e">tempest-VolumesAdminNegativeTest-905168495-project-member</nova:user>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <nova:project uuid="9f68f8c2203944c9a6e44a6756c8b4b9">tempest-VolumesAdminNegativeTest-905168495</nova:project>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <nova:port uuid="d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <entry name="serial">aede4522-9d5a-4475-9dd9-46c044901917</entry>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <entry name="uuid">aede4522-9d5a-4475-9dd9-46c044901917</entry>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/aede4522-9d5a-4475-9dd9-46c044901917_disk">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/aede4522-9d5a-4475-9dd9-46c044901917_disk.config">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:ff:c5:b8"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <target dev="tapd6a90f4f-8b"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/aede4522-9d5a-4475-9dd9-46c044901917/console.log" append="off"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:37:42 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:37:42 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.486 227766 DEBUG nova.compute.manager [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Preparing to wait for external event network-vif-plugged-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.487 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "aede4522-9d5a-4475-9dd9-46c044901917-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.487 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.488 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.489 227766 DEBUG nova.virt.libvirt.vif [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:37:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-107272333',display_name='tempest-VolumesAdminNegativeTest-server-107272333',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-volumesadminnegativetest-server-107272333',id=32,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9f68f8c2203944c9a6e44a6756c8b4b9',ramdisk_id='',reservation_id='r-9tfd70gd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-VolumesAdminNegativeTest-905168495',owner_user_name='tempest-V
olumesAdminNegativeTest-905168495-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:37:35Z,user_data=None,user_id='726bd44b7ec443a0a4b8b632b06c622e',uuid=aede4522-9d5a-4475-9dd9-46c044901917,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "address": "fa:16:3e:ff:c5:b8", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6a90f4f-8b", "ovs_interfaceid": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.489 227766 DEBUG nova.network.os_vif_util [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Converting VIF {"id": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "address": "fa:16:3e:ff:c5:b8", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6a90f4f-8b", "ovs_interfaceid": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.489 227766 DEBUG nova.network.os_vif_util [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:c5:b8,bridge_name='br-int',has_traffic_filtering=True,id=d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa,network=Network(ef05741c-2d3e-419c-adbb-a2a3bca97f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6a90f4f-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.490 227766 DEBUG os_vif [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:c5:b8,bridge_name='br-int',has_traffic_filtering=True,id=d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa,network=Network(ef05741c-2d3e-419c-adbb-a2a3bca97f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6a90f4f-8b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.490 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.491 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.491 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.496 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.496 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6a90f4f-8b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.497 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd6a90f4f-8b, col_values=(('external_ids', {'iface-id': 'd6a90f4f-8b5b-441d-a945-cdba5f1aa2fa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:c5:b8', 'vm-uuid': 'aede4522-9d5a-4475-9dd9-46c044901917'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.498 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:42 np0005593234 NetworkManager[48942]: <info>  [1769161062.4996] manager: (tapd6a90f4f-8b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.501 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.505 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.506 227766 INFO os_vif [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:c5:b8,bridge_name='br-int',has_traffic_filtering=True,id=d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa,network=Network(ef05741c-2d3e-419c-adbb-a2a3bca97f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6a90f4f-8b')#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.570 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.570 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.570 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] No VIF found with MAC fa:16:3e:ff:c5:b8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.571 227766 INFO nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Using config drive#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.592 227766 DEBUG nova.storage.rbd_utils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] rbd image aede4522-9d5a-4475-9dd9-46c044901917_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:42.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:42.814 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:42.815 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:42.816 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.823 227766 INFO nova.compute.manager [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Took 7.08 seconds for pre_live_migration on destination host compute-0.ctlplane.example.com.#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.824 227766 DEBUG nova.compute.manager [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.844 227766 DEBUG nova.compute.manager [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpc7fzoq9f',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='261ab1ec-f79b-4867-bcb6-1c1d7491120e',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=Migration(b0b4513c-fd12-42eb-894f-3243a9897e2b),old_vol_attachment_ids={b06791ec-66fd-4114-8448-7ea0b7f88f25='157a81cb-fd76-48d4-abf5-e6fb564e20a5'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.848 227766 DEBUG nova.objects.instance [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lazy-loading 'migration_context' on Instance uuid 261ab1ec-f79b-4867-bcb6-1c1d7491120e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.849 227766 DEBUG nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.850 227766 DEBUG nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.850 227766 DEBUG nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.869 227766 DEBUG nova.virt.libvirt.migration [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Find same serial number: pos=1, serial=b06791ec-66fd-4114-8448-7ea0b7f88f25 _update_volume_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:242#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.870 227766 DEBUG nova.virt.libvirt.vif [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-724421301',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-724421301',id=29,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:37:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d0dce6e339c349d4ab97cee5e49fff3a',ramdisk_id='',reservation_id='r-106tqp53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1207260646',owner
_user_name='tempest-LiveAutoBlockMigrationV225Test-1207260646-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:37:26Z,user_data=None,user_id='4f72965e950c4761bfedd99fdc411a83',uuid=261ab1ec-f79b-4867-bcb6-1c1d7491120e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.871 227766 DEBUG nova.network.os_vif_util [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Converting VIF {"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.871 227766 DEBUG nova.network.os_vif_util [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:06:0e,bridge_name='br-int',has_traffic_filtering=True,id=27e277b3-2135-4e3e-b336-e0da87509465,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27e277b3-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.872 227766 DEBUG nova.virt.libvirt.migration [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Updating guest XML with vif config: <interface type="ethernet">
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <mac address="fa:16:3e:34:06:0e"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <model type="virtio"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <mtu size="1442"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]:  <target dev="tap27e277b3-21"/>
Jan 23 04:37:42 np0005593234 nova_compute[227762]: </interface>
Jan 23 04:37:42 np0005593234 nova_compute[227762]: _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.872 227766 DEBUG nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.889 227766 DEBUG nova.network.neutron [req-30a52f08-f6b2-4287-a24b-feb6f3bf6fdb req-fdc95ae4-68c0-42bf-86c2-aeaeff9aab08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Updated VIF entry in instance network info cache for port d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.889 227766 DEBUG nova.network.neutron [req-30a52f08-f6b2-4287-a24b-feb6f3bf6fdb req-fdc95ae4-68c0-42bf-86c2-aeaeff9aab08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Updating instance_info_cache with network_info: [{"id": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "address": "fa:16:3e:ff:c5:b8", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6a90f4f-8b", "ovs_interfaceid": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:37:42 np0005593234 nova_compute[227762]: 2026-01-23 09:37:42.919 227766 DEBUG oslo_concurrency.lockutils [req-30a52f08-f6b2-4287-a24b-feb6f3bf6fdb req-fdc95ae4-68c0-42bf-86c2-aeaeff9aab08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-aede4522-9d5a-4475-9dd9-46c044901917" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.075 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.264 227766 INFO nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Creating config drive at /var/lib/nova/instances/aede4522-9d5a-4475-9dd9-46c044901917/disk.config#033[00m
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.269 227766 DEBUG oslo_concurrency.processutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aede4522-9d5a-4475-9dd9-46c044901917/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpku24ry7v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.353 227766 DEBUG nova.virt.libvirt.migration [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.354 227766 INFO nova.virt.libvirt.migration [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Increasing downtime to 50 ms after 0 sec elapsed time#033[00m
Jan 23 04:37:43 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.403 227766 DEBUG oslo_concurrency.processutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aede4522-9d5a-4475-9dd9-46c044901917/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpku24ry7v" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.428 227766 DEBUG nova.storage.rbd_utils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] rbd image aede4522-9d5a-4475-9dd9-46c044901917_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.432 227766 DEBUG oslo_concurrency.processutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/aede4522-9d5a-4475-9dd9-46c044901917/disk.config aede4522-9d5a-4475-9dd9-46c044901917_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.460 227766 INFO nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).#033[00m
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.593 227766 DEBUG oslo_concurrency.processutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/aede4522-9d5a-4475-9dd9-46c044901917/disk.config aede4522-9d5a-4475-9dd9-46c044901917_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.593 227766 INFO nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Deleting local config drive /var/lib/nova/instances/aede4522-9d5a-4475-9dd9-46c044901917/disk.config because it was imported into RBD.#033[00m
Jan 23 04:37:43 np0005593234 kernel: tapd6a90f4f-8b: entered promiscuous mode
Jan 23 04:37:43 np0005593234 NetworkManager[48942]: <info>  [1769161063.6414] manager: (tapd6a90f4f-8b): new Tun device (/org/freedesktop/NetworkManager/Devices/46)
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.665 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:43 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:43Z|00067|binding|INFO|Claiming lport d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa for this chassis.
Jan 23 04:37:43 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:43Z|00068|binding|INFO|d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa: Claiming fa:16:3e:ff:c5:b8 10.100.0.4
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.673 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:43 np0005593234 NetworkManager[48942]: <info>  [1769161063.6866] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.685 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:43 np0005593234 NetworkManager[48942]: <info>  [1769161063.6872] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.690 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:c5:b8 10.100.0.4'], port_security=['fa:16:3e:ff:c5:b8 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'aede4522-9d5a-4475-9dd9-46c044901917', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f68f8c2203944c9a6e44a6756c8b4b9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '45d222ef-6053-4ab3-9207-38013b247762', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2448bc-0bf3-4fe3-aeb3-04d125f323ad, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.691 144381 INFO neutron.agent.ovn.metadata.agent [-] Port d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa in datapath ef05741c-2d3e-419c-adbb-a2a3bca97f59 bound to our chassis#033[00m
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.693 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ef05741c-2d3e-419c-adbb-a2a3bca97f59#033[00m
Jan 23 04:37:43 np0005593234 systemd-machined[195626]: New machine qemu-14-instance-00000020.
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.706 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7a19979f-6d11-401e-ac72-9fe7c966184e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.708 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapef05741c-21 in ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.709 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapef05741c-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.709 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[864b714f-df32-4384-85e3-d7e0e4c7459c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.710 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2b65068e-ce38-41e3-ba75-82850567a391]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.723 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[bb6dd25f-a5f4-472e-adec-5643ff8f1870]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:43 np0005593234 systemd[1]: Started Virtual Machine qemu-14-instance-00000020.
Jan 23 04:37:43 np0005593234 systemd-udevd[242961]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:37:43 np0005593234 podman[242940]: 2026-01-23 09:37:43.752072506 +0000 UTC m=+0.054180109 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.752 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7cf8eb91-23d4-4efb-a48a-93bfd3ee47c1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:43 np0005593234 NetworkManager[48942]: <info>  [1769161063.7575] device (tapd6a90f4f-8b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:37:43 np0005593234 NetworkManager[48942]: <info>  [1769161063.7584] device (tapd6a90f4f-8b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.786 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[8836a320-bf64-47d1-907c-53adddcf7add]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:43 np0005593234 NetworkManager[48942]: <info>  [1769161063.7966] manager: (tapef05741c-20): new Veth device (/org/freedesktop/NetworkManager/Devices/49)
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.796 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0c5089f8-4cba-42b5-96e3-4c91da2742be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:43 np0005593234 systemd-udevd[242965]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.826 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[3c14b1a0-36c6-48e0-9750-8b47d9f05b19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.829 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[786bbea6-666f-4314-9788-8d87cb4e54f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:43 np0005593234 NetworkManager[48942]: <info>  [1769161063.8491] device (tapef05741c-20): carrier: link connected
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.855 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f7d93106-c18b-46d5-a0c5-6ea07af6f2a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.873 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[307470e3-856f-4dfe-813e-1603d3455f70]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef05741c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8a:3d:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495880, 'reachable_time': 28427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 242995, 'error': None, 'target': 'ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.890 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4a259446-db02-4578-ab44-01da6a200914]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8a:3dba'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 495880, 'tstamp': 495880}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 242997, 'error': None, 'target': 'ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.907 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c285287b-8914-4c7b-89c1-800a63787735]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef05741c-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8a:3d:ba'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495880, 'reachable_time': 28427, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 242998, 'error': None, 'target': 'ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:43.941 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0c005db9-d726-441c-86c8-756f24d962fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.958 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:43 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:43Z|00069|binding|INFO|Releasing lport b545a870-aa18-4f64-a8a7-f8512824c4cc from this chassis (sb_readonly=0)
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.991 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.995 227766 DEBUG nova.virt.libvirt.migration [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 23 04:37:43 np0005593234 nova_compute[227762]: 2026-01-23 09:37:43.995 227766 DEBUG nova.virt.libvirt.migration [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:44.001 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f2e97277-ec89-4d37-8707-f9be0cbe471e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:44.002 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef05741c-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:44.003 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:44.003 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef05741c-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:44 np0005593234 kernel: tapef05741c-20: entered promiscuous mode
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.005 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:44 np0005593234 NetworkManager[48942]: <info>  [1769161064.0063] manager: (tapef05741c-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Jan 23 04:37:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:44Z|00070|binding|INFO|Setting lport d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa ovn-installed in OVS
Jan 23 04:37:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:44Z|00071|binding|INFO|Setting lport d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa up in Southbound
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.007 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.008 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:44.009 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapef05741c-20, col_values=(('external_ids', {'iface-id': 'f8b32530-de7e-473a-a3e9-65c6259bc8bb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.010 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:44Z|00072|binding|INFO|Releasing lport f8b32530-de7e-473a-a3e9-65c6259bc8bb from this chassis (sb_readonly=1)
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.011 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:44.011 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ef05741c-2d3e-419c-adbb-a2a3bca97f59.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ef05741c-2d3e-419c-adbb-a2a3bca97f59.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:44.012 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6ab8bd7e-99a0-433f-9e57-ad77310c1ede]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:44.015 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-ef05741c-2d3e-419c-adbb-a2a3bca97f59
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/ef05741c-2d3e-419c-adbb-a2a3bca97f59.pid.haproxy
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID ef05741c-2d3e-419c-adbb-a2a3bca97f59
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:44.016 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'env', 'PROCESS_TAG=haproxy-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ef05741c-2d3e-419c-adbb-a2a3bca97f59.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.025 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.067 227766 DEBUG nova.compute.manager [req-69779057-a351-49be-9b3f-a25b6ef39086 req-cf5727d3-ce8e-4404-89f6-de03df696df3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.067 227766 DEBUG oslo_concurrency.lockutils [req-69779057-a351-49be-9b3f-a25b6ef39086 req-cf5727d3-ce8e-4404-89f6-de03df696df3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.068 227766 DEBUG oslo_concurrency.lockutils [req-69779057-a351-49be-9b3f-a25b6ef39086 req-cf5727d3-ce8e-4404-89f6-de03df696df3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.068 227766 DEBUG oslo_concurrency.lockutils [req-69779057-a351-49be-9b3f-a25b6ef39086 req-cf5727d3-ce8e-4404-89f6-de03df696df3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.068 227766 DEBUG nova.compute.manager [req-69779057-a351-49be-9b3f-a25b6ef39086 req-cf5727d3-ce8e-4404-89f6-de03df696df3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] No waiting events found dispatching network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.068 227766 WARNING nova.compute.manager [req-69779057-a351-49be-9b3f-a25b6ef39086 req-cf5727d3-ce8e-4404-89f6-de03df696df3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received unexpected event network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.069 227766 DEBUG nova.compute.manager [req-69779057-a351-49be-9b3f-a25b6ef39086 req-cf5727d3-ce8e-4404-89f6-de03df696df3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-changed-27e277b3-2135-4e3e-b336-e0da87509465 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.069 227766 DEBUG nova.compute.manager [req-69779057-a351-49be-9b3f-a25b6ef39086 req-cf5727d3-ce8e-4404-89f6-de03df696df3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Refreshing instance network info cache due to event network-changed-27e277b3-2135-4e3e-b336-e0da87509465. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.069 227766 DEBUG oslo_concurrency.lockutils [req-69779057-a351-49be-9b3f-a25b6ef39086 req-cf5727d3-ce8e-4404-89f6-de03df696df3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.070 227766 DEBUG oslo_concurrency.lockutils [req-69779057-a351-49be-9b3f-a25b6ef39086 req-cf5727d3-ce8e-4404-89f6-de03df696df3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.070 227766 DEBUG nova.network.neutron [req-69779057-a351-49be-9b3f-a25b6ef39086 req-cf5727d3-ce8e-4404-89f6-de03df696df3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Refreshing network info cache for port 27e277b3-2135-4e3e-b336-e0da87509465 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:37:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:44.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:44 np0005593234 podman[243032]: 2026-01-23 09:37:44.4452151 +0000 UTC m=+0.049748362 container create c212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:37:44 np0005593234 systemd[1]: Started libpod-conmon-c212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a.scope.
Jan 23 04:37:44 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:37:44 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3da1a6b1e3b8ba163b3d82725d1bd3e448056ea1acfaba03a472f97d8be4721a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:37:44 np0005593234 podman[243032]: 2026-01-23 09:37:44.418829777 +0000 UTC m=+0.023363069 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.515 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161064.5144272, 261ab1ec-f79b-4867-bcb6-1c1d7491120e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.517 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.520 227766 DEBUG nova.virt.libvirt.migration [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.520 227766 DEBUG nova.virt.libvirt.migration [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525#033[00m
Jan 23 04:37:44 np0005593234 podman[243032]: 2026-01-23 09:37:44.527218035 +0000 UTC m=+0.131751297 container init c212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:37:44 np0005593234 podman[243032]: 2026-01-23 09:37:44.533451849 +0000 UTC m=+0.137985111 container start c212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.548 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.553 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:37:44 np0005593234 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[243047]: [NOTICE]   (243051) : New worker (243053) forked
Jan 23 04:37:44 np0005593234 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[243047]: [NOTICE]   (243051) : Loading success.
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.582 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] During sync_power_state the instance has a pending task (migrating). Skip.#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.616 227766 DEBUG nova.compute.manager [req-0b35d240-849d-441e-a1e9-4b870e55ed77 req-ed6b33bd-14a3-46d2-946a-d3fbbe428b79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Received event network-vif-plugged-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.617 227766 DEBUG oslo_concurrency.lockutils [req-0b35d240-849d-441e-a1e9-4b870e55ed77 req-ed6b33bd-14a3-46d2-946a-d3fbbe428b79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "aede4522-9d5a-4475-9dd9-46c044901917-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.617 227766 DEBUG oslo_concurrency.lockutils [req-0b35d240-849d-441e-a1e9-4b870e55ed77 req-ed6b33bd-14a3-46d2-946a-d3fbbe428b79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.618 227766 DEBUG oslo_concurrency.lockutils [req-0b35d240-849d-441e-a1e9-4b870e55ed77 req-ed6b33bd-14a3-46d2-946a-d3fbbe428b79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.618 227766 DEBUG nova.compute.manager [req-0b35d240-849d-441e-a1e9-4b870e55ed77 req-ed6b33bd-14a3-46d2-946a-d3fbbe428b79 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Processing event network-vif-plugged-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.742 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:44 np0005593234 kernel: tap27e277b3-21 (unregistering): left promiscuous mode
Jan 23 04:37:44 np0005593234 NetworkManager[48942]: <info>  [1769161064.7570] device (tap27e277b3-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:37:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:44Z|00073|binding|INFO|Releasing lport 27e277b3-2135-4e3e-b336-e0da87509465 from this chassis (sb_readonly=0)
Jan 23 04:37:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:44Z|00074|binding|INFO|Setting lport 27e277b3-2135-4e3e-b336-e0da87509465 down in Southbound
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.764 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:44Z|00075|binding|INFO|Removing iface tap27e277b3-21 ovn-installed in OVS
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.766 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:44.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:44.774 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:06:0e 10.100.0.11'], port_security=['fa:16:3e:34:06:0e 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com,compute-0.ctlplane.example.com', 'activation-strategy': 'rarp', 'additional-chassis-activated': 'd80bc768-e67f-4e48-bcf3-42912cda98f1'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '261ab1ec-f79b-4867-bcb6-1c1d7491120e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0dce6e339c349d4ab97cee5e49fff3a', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0179c400-b2f2-4914-b563-942a61ef1858', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cbb60528-b878-42fd-9c2f-0a3345010b1a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=27e277b3-2135-4e3e-b336-e0da87509465) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:44.776 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 27e277b3-2135-4e3e-b336-e0da87509465 in datapath 8eab8076-0848-4daf-bbac-f3f8b65ca750 unbound from our chassis#033[00m
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:44.779 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8eab8076-0848-4daf-bbac-f3f8b65ca750, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:44.780 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e28980db-e17d-492b-b9c2-5cd82e0c6ba1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:44.780 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 namespace which is not needed anymore#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.783 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:44 np0005593234 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 23 04:37:44 np0005593234 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001d.scope: Consumed 14.447s CPU time.
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.832 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161064.8325222, aede4522-9d5a-4475-9dd9-46c044901917 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.833 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aede4522-9d5a-4475-9dd9-46c044901917] VM Started (Lifecycle Event)#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.835 227766 DEBUG nova.compute.manager [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:37:44 np0005593234 systemd-machined[195626]: Machine qemu-13-instance-0000001d terminated.
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.841 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.850 227766 INFO nova.virt.libvirt.driver [-] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Instance spawned successfully.#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.850 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.867 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.870 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.878 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.878 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.879 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.879 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:44 np0005593234 virtqemud[227483]: Unable to get XATTR trusted.libvirt.security.ref_selinux on volumes/volume-b06791ec-66fd-4114-8448-7ea0b7f88f25: No such file or directory
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.879 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:44 np0005593234 virtqemud[227483]: Unable to get XATTR trusted.libvirt.security.ref_dac on volumes/volume-b06791ec-66fd-4114-8448-7ea0b7f88f25: No such file or directory
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.880 227766 DEBUG nova.virt.libvirt.driver [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.906 227766 DEBUG nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.907 227766 DEBUG nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.907 227766 DEBUG nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630#033[00m
Jan 23 04:37:44 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[242518]: [NOTICE]   (242522) : haproxy version is 2.8.14-c23fe91
Jan 23 04:37:44 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[242518]: [NOTICE]   (242522) : path to executable is /usr/sbin/haproxy
Jan 23 04:37:44 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[242518]: [WARNING]  (242522) : Exiting Master process...
Jan 23 04:37:44 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[242518]: [ALERT]    (242522) : Current worker (242524) exited with code 143 (Terminated)
Jan 23 04:37:44 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[242518]: [WARNING]  (242522) : All workers exited. Exiting... (0)
Jan 23 04:37:44 np0005593234 systemd[1]: libpod-36185456e99a85b362d355c80746e5c727667e55a2c3abc85a136840ce8132db.scope: Deactivated successfully.
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.918 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aede4522-9d5a-4475-9dd9-46c044901917] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.918 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161064.8326526, aede4522-9d5a-4475-9dd9-46c044901917 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.918 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aede4522-9d5a-4475-9dd9-46c044901917] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:37:44 np0005593234 podman[243124]: 2026-01-23 09:37:44.919236455 +0000 UTC m=+0.043265872 container died 36185456e99a85b362d355c80746e5c727667e55a2c3abc85a136840ce8132db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 04:37:44 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36185456e99a85b362d355c80746e5c727667e55a2c3abc85a136840ce8132db-userdata-shm.mount: Deactivated successfully.
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.951 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:44 np0005593234 systemd[1]: var-lib-containers-storage-overlay-55b160c5437a68b90df682d321357d8ee13a6b68a1f12db6fd33ae577d3074f4-merged.mount: Deactivated successfully.
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.957 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161064.8400707, aede4522-9d5a-4475-9dd9-46c044901917 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.957 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aede4522-9d5a-4475-9dd9-46c044901917] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.963 227766 INFO nova.compute.manager [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Took 9.14 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.964 227766 DEBUG nova.compute.manager [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:44 np0005593234 podman[243124]: 2026-01-23 09:37:44.966655535 +0000 UTC m=+0.090684952 container cleanup 36185456e99a85b362d355c80746e5c727667e55a2c3abc85a136840ce8132db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 04:37:44 np0005593234 systemd[1]: libpod-conmon-36185456e99a85b362d355c80746e5c727667e55a2c3abc85a136840ce8132db.scope: Deactivated successfully.
Jan 23 04:37:44 np0005593234 nova_compute[227762]: 2026-01-23 09:37:44.999 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:37:45 np0005593234 nova_compute[227762]: 2026-01-23 09:37:45.002 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:37:45 np0005593234 nova_compute[227762]: 2026-01-23 09:37:45.022 227766 DEBUG nova.virt.libvirt.guest [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Domain has shutdown/gone away: Domain not found: no domain with matching uuid '261ab1ec-f79b-4867-bcb6-1c1d7491120e' (instance-0000001d) get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688#033[00m
Jan 23 04:37:45 np0005593234 nova_compute[227762]: 2026-01-23 09:37:45.022 227766 INFO nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Migration operation has completed#033[00m
Jan 23 04:37:45 np0005593234 nova_compute[227762]: 2026-01-23 09:37:45.023 227766 INFO nova.compute.manager [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] _post_live_migration() is started..#033[00m
Jan 23 04:37:45 np0005593234 podman[243165]: 2026-01-23 09:37:45.036763914 +0000 UTC m=+0.043702305 container remove 36185456e99a85b362d355c80746e5c727667e55a2c3abc85a136840ce8132db (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 04:37:45 np0005593234 nova_compute[227762]: 2026-01-23 09:37:45.041 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aede4522-9d5a-4475-9dd9-46c044901917] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:37:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:45.042 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f106f4d6-7550-4a68-abe8-dd2e966bc12b]: (4, ('Fri Jan 23 09:37:44 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 (36185456e99a85b362d355c80746e5c727667e55a2c3abc85a136840ce8132db)\n36185456e99a85b362d355c80746e5c727667e55a2c3abc85a136840ce8132db\nFri Jan 23 09:37:44 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 (36185456e99a85b362d355c80746e5c727667e55a2c3abc85a136840ce8132db)\n36185456e99a85b362d355c80746e5c727667e55a2c3abc85a136840ce8132db\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:45.044 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e9194292-3157-40e4-9a35-e75b115a21c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:45.045 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8eab8076-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:45 np0005593234 nova_compute[227762]: 2026-01-23 09:37:45.046 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:45 np0005593234 kernel: tap8eab8076-00: left promiscuous mode
Jan 23 04:37:45 np0005593234 nova_compute[227762]: 2026-01-23 09:37:45.060 227766 INFO nova.compute.manager [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Took 10.93 seconds to build instance.#033[00m
Jan 23 04:37:45 np0005593234 nova_compute[227762]: 2026-01-23 09:37:45.066 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:45.069 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac213dc-767a-4c26-9cec-1ef6efae68bf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:45 np0005593234 nova_compute[227762]: 2026-01-23 09:37:45.081 227766 DEBUG oslo_concurrency.lockutils [None req-e4829e19-2d5b-42a2-b126-7c80a658e810 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.364s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:45.088 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1a738213-15f8-4490-83e2-17a9880588a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:45.089 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[16298d61-ca4c-4449-b715-1eed30764f07]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:45.107 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d769eb0a-b01e-4123-bafd-9949f8c144de]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 493846, 'reachable_time': 21998, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243183, 'error': None, 'target': 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:45 np0005593234 systemd[1]: run-netns-ovnmeta\x2d8eab8076\x2d0848\x2d4daf\x2dbbac\x2df3f8b65ca750.mount: Deactivated successfully.
Jan 23 04:37:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:45.112 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:37:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:45.112 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[e5347608-1df6-43e5-9021-6e71c9907d52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.030 227766 DEBUG nova.network.neutron [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Activated binding for port 27e277b3-2135-4e3e-b336-e0da87509465 and host compute-0.ctlplane.example.com migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.031 227766 DEBUG nova.compute.manager [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.032 227766 DEBUG nova.virt.libvirt.vif [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-724421301',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-724421301',id=29,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:37:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d0dce6e339c349d4ab97cee5e49fff3a',ramdisk_id='',reservation_id='r-106tqp53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1207260646',owner
_user_name='tempest-LiveAutoBlockMigrationV225Test-1207260646-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:37:32Z,user_data=None,user_id='4f72965e950c4761bfedd99fdc411a83',uuid=261ab1ec-f79b-4867-bcb6-1c1d7491120e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.033 227766 DEBUG nova.network.os_vif_util [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Converting VIF {"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.034 227766 DEBUG nova.network.os_vif_util [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:06:0e,bridge_name='br-int',has_traffic_filtering=True,id=27e277b3-2135-4e3e-b336-e0da87509465,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27e277b3-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.034 227766 DEBUG os_vif [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:06:0e,bridge_name='br-int',has_traffic_filtering=True,id=27e277b3-2135-4e3e-b336-e0da87509465,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27e277b3-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.037 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.037 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27e277b3-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.039 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.040 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.042 227766 INFO os_vif [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:06:0e,bridge_name='br-int',has_traffic_filtering=True,id=27e277b3-2135-4e3e-b336-e0da87509465,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27e277b3-21')#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.043 227766 DEBUG oslo_concurrency.lockutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.043 227766 DEBUG oslo_concurrency.lockutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.043 227766 DEBUG oslo_concurrency.lockutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.044 227766 DEBUG nova.compute.manager [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.044 227766 INFO nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Deleting instance files /var/lib/nova/instances/261ab1ec-f79b-4867-bcb6-1c1d7491120e_del#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.045 227766 INFO nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Deletion of /var/lib/nova/instances/261ab1ec-f79b-4867-bcb6-1c1d7491120e_del complete#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.244 227766 DEBUG nova.compute.manager [req-b0afce19-4989-4c0b-9f44-074b6a0cd134 req-5001cad9-38b3-40da-8f39-59f5f4e56f70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-vif-unplugged-27e277b3-2135-4e3e-b336-e0da87509465 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.244 227766 DEBUG oslo_concurrency.lockutils [req-b0afce19-4989-4c0b-9f44-074b6a0cd134 req-5001cad9-38b3-40da-8f39-59f5f4e56f70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.245 227766 DEBUG oslo_concurrency.lockutils [req-b0afce19-4989-4c0b-9f44-074b6a0cd134 req-5001cad9-38b3-40da-8f39-59f5f4e56f70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.245 227766 DEBUG oslo_concurrency.lockutils [req-b0afce19-4989-4c0b-9f44-074b6a0cd134 req-5001cad9-38b3-40da-8f39-59f5f4e56f70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.245 227766 DEBUG nova.compute.manager [req-b0afce19-4989-4c0b-9f44-074b6a0cd134 req-5001cad9-38b3-40da-8f39-59f5f4e56f70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] No waiting events found dispatching network-vif-unplugged-27e277b3-2135-4e3e-b336-e0da87509465 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.245 227766 DEBUG nova.compute.manager [req-b0afce19-4989-4c0b-9f44-074b6a0cd134 req-5001cad9-38b3-40da-8f39-59f5f4e56f70 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-vif-unplugged-27e277b3-2135-4e3e-b336-e0da87509465 for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:37:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:46.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:46.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.784 227766 DEBUG nova.compute.manager [req-f857a6ce-8460-440d-96fb-004c834de55b req-b27e485d-0edc-4d08-9768-e456f87cb20e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Received event network-vif-plugged-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.784 227766 DEBUG oslo_concurrency.lockutils [req-f857a6ce-8460-440d-96fb-004c834de55b req-b27e485d-0edc-4d08-9768-e456f87cb20e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "aede4522-9d5a-4475-9dd9-46c044901917-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.784 227766 DEBUG oslo_concurrency.lockutils [req-f857a6ce-8460-440d-96fb-004c834de55b req-b27e485d-0edc-4d08-9768-e456f87cb20e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.784 227766 DEBUG oslo_concurrency.lockutils [req-f857a6ce-8460-440d-96fb-004c834de55b req-b27e485d-0edc-4d08-9768-e456f87cb20e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.785 227766 DEBUG nova.compute.manager [req-f857a6ce-8460-440d-96fb-004c834de55b req-b27e485d-0edc-4d08-9768-e456f87cb20e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] No waiting events found dispatching network-vif-plugged-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:37:46 np0005593234 nova_compute[227762]: 2026-01-23 09:37:46.785 227766 WARNING nova.compute.manager [req-f857a6ce-8460-440d-96fb-004c834de55b req-b27e485d-0edc-4d08-9768-e456f87cb20e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Received unexpected event network-vif-plugged-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa for instance with vm_state active and task_state None.#033[00m
Jan 23 04:37:47 np0005593234 nova_compute[227762]: 2026-01-23 09:37:47.518 227766 DEBUG nova.network.neutron [req-69779057-a351-49be-9b3f-a25b6ef39086 req-cf5727d3-ce8e-4404-89f6-de03df696df3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Updated VIF entry in instance network info cache for port 27e277b3-2135-4e3e-b336-e0da87509465. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:37:47 np0005593234 nova_compute[227762]: 2026-01-23 09:37:47.519 227766 DEBUG nova.network.neutron [req-69779057-a351-49be-9b3f-a25b6ef39086 req-cf5727d3-ce8e-4404-89f6-de03df696df3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Updating instance_info_cache with network_info: [{"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "compute-0.ctlplane.example.com"}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:37:47 np0005593234 nova_compute[227762]: 2026-01-23 09:37:47.847 227766 DEBUG oslo_concurrency.lockutils [req-69779057-a351-49be-9b3f-a25b6ef39086 req-cf5727d3-ce8e-4404-89f6-de03df696df3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:37:48 np0005593234 nova_compute[227762]: 2026-01-23 09:37:48.124 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:48.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:48 np0005593234 nova_compute[227762]: 2026-01-23 09:37:48.388 227766 DEBUG nova.compute.manager [req-f07aff13-67cf-4f8c-b681-bf7e03e41113 req-814c2042-1cf6-4009-ae7b-c4eef642719e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:48 np0005593234 nova_compute[227762]: 2026-01-23 09:37:48.389 227766 DEBUG oslo_concurrency.lockutils [req-f07aff13-67cf-4f8c-b681-bf7e03e41113 req-814c2042-1cf6-4009-ae7b-c4eef642719e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:48 np0005593234 nova_compute[227762]: 2026-01-23 09:37:48.389 227766 DEBUG oslo_concurrency.lockutils [req-f07aff13-67cf-4f8c-b681-bf7e03e41113 req-814c2042-1cf6-4009-ae7b-c4eef642719e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:48 np0005593234 nova_compute[227762]: 2026-01-23 09:37:48.389 227766 DEBUG oslo_concurrency.lockutils [req-f07aff13-67cf-4f8c-b681-bf7e03e41113 req-814c2042-1cf6-4009-ae7b-c4eef642719e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:48 np0005593234 nova_compute[227762]: 2026-01-23 09:37:48.389 227766 DEBUG nova.compute.manager [req-f07aff13-67cf-4f8c-b681-bf7e03e41113 req-814c2042-1cf6-4009-ae7b-c4eef642719e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] No waiting events found dispatching network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:37:48 np0005593234 nova_compute[227762]: 2026-01-23 09:37:48.389 227766 WARNING nova.compute.manager [req-f07aff13-67cf-4f8c-b681-bf7e03e41113 req-814c2042-1cf6-4009-ae7b-c4eef642719e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received unexpected event network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 04:37:48 np0005593234 nova_compute[227762]: 2026-01-23 09:37:48.390 227766 DEBUG nova.compute.manager [req-f07aff13-67cf-4f8c-b681-bf7e03e41113 req-814c2042-1cf6-4009-ae7b-c4eef642719e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:48 np0005593234 nova_compute[227762]: 2026-01-23 09:37:48.390 227766 DEBUG oslo_concurrency.lockutils [req-f07aff13-67cf-4f8c-b681-bf7e03e41113 req-814c2042-1cf6-4009-ae7b-c4eef642719e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:48 np0005593234 nova_compute[227762]: 2026-01-23 09:37:48.390 227766 DEBUG oslo_concurrency.lockutils [req-f07aff13-67cf-4f8c-b681-bf7e03e41113 req-814c2042-1cf6-4009-ae7b-c4eef642719e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:48 np0005593234 nova_compute[227762]: 2026-01-23 09:37:48.390 227766 DEBUG oslo_concurrency.lockutils [req-f07aff13-67cf-4f8c-b681-bf7e03e41113 req-814c2042-1cf6-4009-ae7b-c4eef642719e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:48 np0005593234 nova_compute[227762]: 2026-01-23 09:37:48.390 227766 DEBUG nova.compute.manager [req-f07aff13-67cf-4f8c-b681-bf7e03e41113 req-814c2042-1cf6-4009-ae7b-c4eef642719e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] No waiting events found dispatching network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:37:48 np0005593234 nova_compute[227762]: 2026-01-23 09:37:48.390 227766 WARNING nova.compute.manager [req-f07aff13-67cf-4f8c-b681-bf7e03e41113 req-814c2042-1cf6-4009-ae7b-c4eef642719e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received unexpected event network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 for instance with vm_state active and task_state migrating.#033[00m
Jan 23 04:37:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:48.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:50.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:50 np0005593234 nova_compute[227762]: 2026-01-23 09:37:50.687 227766 DEBUG oslo_concurrency.lockutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Acquiring lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:50 np0005593234 nova_compute[227762]: 2026-01-23 09:37:50.689 227766 DEBUG oslo_concurrency.lockutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:50 np0005593234 nova_compute[227762]: 2026-01-23 09:37:50.690 227766 DEBUG oslo_concurrency.lockutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:50 np0005593234 nova_compute[227762]: 2026-01-23 09:37:50.718 227766 DEBUG oslo_concurrency.lockutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:50 np0005593234 nova_compute[227762]: 2026-01-23 09:37:50.719 227766 DEBUG oslo_concurrency.lockutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:50 np0005593234 nova_compute[227762]: 2026-01-23 09:37:50.719 227766 DEBUG oslo_concurrency.lockutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:50 np0005593234 nova_compute[227762]: 2026-01-23 09:37:50.720 227766 DEBUG nova.compute.resource_tracker [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:37:50 np0005593234 nova_compute[227762]: 2026-01-23 09:37:50.720 227766 DEBUG oslo_concurrency.processutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:37:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:50.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.041 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:37:51 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1383031627' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.249 227766 DEBUG oslo_concurrency.processutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.529s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.325 227766 DEBUG nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] skipping disk for instance-00000020 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.325 227766 DEBUG nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] skipping disk for instance-00000020 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.476 227766 WARNING nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.477 227766 DEBUG nova.compute.resource_tracker [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4538MB free_disk=20.780529022216797GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.478 227766 DEBUG oslo_concurrency.lockutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.478 227766 DEBUG oslo_concurrency.lockutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.533 227766 DEBUG nova.compute.resource_tracker [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Migration for instance 261ab1ec-f79b-4867-bcb6-1c1d7491120e refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.556 227766 DEBUG nova.compute.resource_tracker [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.607 227766 DEBUG nova.compute.resource_tracker [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Migration b0b4513c-fd12-42eb-894f-3243a9897e2b is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.608 227766 DEBUG nova.compute.resource_tracker [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Instance aede4522-9d5a-4475-9dd9-46c044901917 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.608 227766 DEBUG nova.compute.resource_tracker [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.608 227766 DEBUG nova.compute.resource_tracker [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.688 227766 DEBUG oslo_concurrency.processutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.921 227766 DEBUG oslo_concurrency.lockutils [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "aede4522-9d5a-4475-9dd9-46c044901917" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.921 227766 DEBUG oslo_concurrency.lockutils [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.921 227766 DEBUG oslo_concurrency.lockutils [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "aede4522-9d5a-4475-9dd9-46c044901917-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.922 227766 DEBUG oslo_concurrency.lockutils [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.922 227766 DEBUG oslo_concurrency.lockutils [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.924 227766 INFO nova.compute.manager [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Terminating instance#033[00m
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.925 227766 DEBUG nova.compute.manager [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:37:51 np0005593234 kernel: tapd6a90f4f-8b (unregistering): left promiscuous mode
Jan 23 04:37:51 np0005593234 NetworkManager[48942]: <info>  [1769161071.9730] device (tapd6a90f4f-8b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:37:51 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:51Z|00076|binding|INFO|Releasing lport d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa from this chassis (sb_readonly=0)
Jan 23 04:37:51 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:51Z|00077|binding|INFO|Setting lport d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa down in Southbound
Jan 23 04:37:51 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:51Z|00078|binding|INFO|Removing iface tapd6a90f4f-8b ovn-installed in OVS
Jan 23 04:37:51 np0005593234 nova_compute[227762]: 2026-01-23 09:37:51.983 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:51.992 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:c5:b8 10.100.0.4'], port_security=['fa:16:3e:ff:c5:b8 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'aede4522-9d5a-4475-9dd9-46c044901917', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9f68f8c2203944c9a6e44a6756c8b4b9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '45d222ef-6053-4ab3-9207-38013b247762', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ee2448bc-0bf3-4fe3-aeb3-04d125f323ad, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:37:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:51.994 144381 INFO neutron.agent.ovn.metadata.agent [-] Port d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa in datapath ef05741c-2d3e-419c-adbb-a2a3bca97f59 unbound from our chassis#033[00m
Jan 23 04:37:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:51.996 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ef05741c-2d3e-419c-adbb-a2a3bca97f59, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:37:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:51.997 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cf24b9e7-e5f0-4bc9-96ba-bdbe384458a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:51.998 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59 namespace which is not needed anymore#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.016 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:52 np0005593234 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000020.scope: Deactivated successfully.
Jan 23 04:37:52 np0005593234 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d00000020.scope: Consumed 8.352s CPU time.
Jan 23 04:37:52 np0005593234 systemd-machined[195626]: Machine qemu-14-instance-00000020 terminated.
Jan 23 04:37:52 np0005593234 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[243047]: [NOTICE]   (243051) : haproxy version is 2.8.14-c23fe91
Jan 23 04:37:52 np0005593234 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[243047]: [NOTICE]   (243051) : path to executable is /usr/sbin/haproxy
Jan 23 04:37:52 np0005593234 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[243047]: [WARNING]  (243051) : Exiting Master process...
Jan 23 04:37:52 np0005593234 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[243047]: [ALERT]    (243051) : Current worker (243053) exited with code 143 (Terminated)
Jan 23 04:37:52 np0005593234 neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59[243047]: [WARNING]  (243051) : All workers exited. Exiting... (0)
Jan 23 04:37:52 np0005593234 systemd[1]: libpod-c212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a.scope: Deactivated successfully.
Jan 23 04:37:52 np0005593234 conmon[243047]: conmon c212998c014c7d8d9b4e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a.scope/container/memory.events
Jan 23 04:37:52 np0005593234 podman[243306]: 2026-01-23 09:37:52.143675283 +0000 UTC m=+0.055214556 container died c212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.167 227766 INFO nova.virt.libvirt.driver [-] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Instance destroyed successfully.#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.168 227766 DEBUG nova.objects.instance [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lazy-loading 'resources' on Instance uuid aede4522-9d5a-4475-9dd9-46c044901917 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:37:52 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a-userdata-shm.mount: Deactivated successfully.
Jan 23 04:37:52 np0005593234 systemd[1]: var-lib-containers-storage-overlay-3da1a6b1e3b8ba163b3d82725d1bd3e448056ea1acfaba03a472f97d8be4721a-merged.mount: Deactivated successfully.
Jan 23 04:37:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:37:52 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/180373957' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:37:52 np0005593234 podman[243306]: 2026-01-23 09:37:52.185292292 +0000 UTC m=+0.096831555 container cleanup c212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.189 227766 DEBUG nova.virt.libvirt.vif [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:37:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-VolumesAdminNegativeTest-server-107272333',display_name='tempest-VolumesAdminNegativeTest-server-107272333',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-volumesadminnegativetest-server-107272333',id=32,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:37:44Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9f68f8c2203944c9a6e44a6756c8b4b9',ramdisk_id='',reservation_id='r-9tfd70gd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-VolumesAdminNegativeTest-905168495',owner_user_name='tempest-VolumesAdminNegativeTest-905168495-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:37:45Z,user_data=None,user_id='726bd44b7ec443a0a4b8b632b06c622e',uuid=aede4522-9d5a-4475-9dd9-46c044901917,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "address": "fa:16:3e:ff:c5:b8", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6a90f4f-8b", "ovs_interfaceid": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.189 227766 DEBUG nova.network.os_vif_util [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Converting VIF {"id": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "address": "fa:16:3e:ff:c5:b8", "network": {"id": "ef05741c-2d3e-419c-adbb-a2a3bca97f59", "bridge": "br-int", "label": "tempest-VolumesAdminNegativeTest-964592179-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9f68f8c2203944c9a6e44a6756c8b4b9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd6a90f4f-8b", "ovs_interfaceid": "d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.190 227766 DEBUG nova.network.os_vif_util [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:c5:b8,bridge_name='br-int',has_traffic_filtering=True,id=d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa,network=Network(ef05741c-2d3e-419c-adbb-a2a3bca97f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6a90f4f-8b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.190 227766 DEBUG os_vif [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:c5:b8,bridge_name='br-int',has_traffic_filtering=True,id=d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa,network=Network(ef05741c-2d3e-419c-adbb-a2a3bca97f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6a90f4f-8b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.195 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.196 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6a90f4f-8b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.197 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:52 np0005593234 systemd[1]: libpod-conmon-c212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a.scope: Deactivated successfully.
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.203 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.204 227766 DEBUG oslo_concurrency.processutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.205 227766 INFO os_vif [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:c5:b8,bridge_name='br-int',has_traffic_filtering=True,id=d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa,network=Network(ef05741c-2d3e-419c-adbb-a2a3bca97f59),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd6a90f4f-8b')#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.241 227766 DEBUG nova.compute.provider_tree [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.263 227766 DEBUG nova.scheduler.client.report [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:37:52 np0005593234 podman[243349]: 2026-01-23 09:37:52.2710974 +0000 UTC m=+0.059775776 container remove c212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:37:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:52.276 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c987e82b-5f4e-4f18-8aea-22b866e5a184]: (4, ('Fri Jan 23 09:37:52 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59 (c212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a)\nc212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a\nFri Jan 23 09:37:52 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59 (c212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a)\nc212998c014c7d8d9b4efb7e29c9b21785f9c37bf0470194a24a8294e8f6794a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:52.278 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c9c62a76-4746-44ec-8daf-c75de26c64e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:52.279 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef05741c-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.281 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:52 np0005593234 kernel: tapef05741c-20: left promiscuous mode
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.294 227766 DEBUG nova.compute.resource_tracker [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.295 227766 DEBUG oslo_concurrency.lockutils [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.817s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.298 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:52.300 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8a709f13-0f1b-4b84-ae82-1ca7c785cdef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.302 227766 INFO nova.compute.manager [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Migrating instance to compute-0.ctlplane.example.com finished successfully.#033[00m
Jan 23 04:37:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:52.316 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2d03df14-9736-41d1-a4ca-00e1c72e48e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:52.317 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[11666507-1fec-4a7c-a6d5-279714de3410]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:52.331 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3be06b83-2c8c-4d7b-abe6-88b6271a3c04]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 495874, 'reachable_time': 30608, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 243379, 'error': None, 'target': 'ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:52 np0005593234 systemd[1]: run-netns-ovnmeta\x2def05741c\x2d2d3e\x2d419c\x2dadbb\x2da2a3bca97f59.mount: Deactivated successfully.
Jan 23 04:37:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:52.334 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ef05741c-2d3e-419c-adbb-a2a3bca97f59 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:37:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:52.335 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[022169d7-8b94-4a0c-81f7-52730a47b64d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:52.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.394 227766 INFO nova.scheduler.client.report [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Deleted allocation for migration b0b4513c-fd12-42eb-894f-3243a9897e2b#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.395 227766 DEBUG nova.virt.libvirt.driver [None req-5cef1da9-f042-4a0d-a3f1-4c769aeb546d 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.690 227766 INFO nova.virt.libvirt.driver [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Deleting instance files /var/lib/nova/instances/aede4522-9d5a-4475-9dd9-46c044901917_del#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.691 227766 INFO nova.virt.libvirt.driver [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Deletion of /var/lib/nova/instances/aede4522-9d5a-4475-9dd9-46c044901917_del complete#033[00m
Jan 23 04:37:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:52.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.794 227766 INFO nova.compute.manager [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Took 0.87 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.795 227766 DEBUG oslo.service.loopingcall [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.795 227766 DEBUG nova.compute.manager [-] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.795 227766 DEBUG nova.network.neutron [-] [instance: aede4522-9d5a-4475-9dd9-46c044901917] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.855 227766 DEBUG nova.compute.manager [req-92b06636-db54-4548-900b-efea2640ee87 req-e51c2951-b7da-418c-8f31-d7418afcb11f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Received event network-vif-unplugged-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.855 227766 DEBUG oslo_concurrency.lockutils [req-92b06636-db54-4548-900b-efea2640ee87 req-e51c2951-b7da-418c-8f31-d7418afcb11f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "aede4522-9d5a-4475-9dd9-46c044901917-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.855 227766 DEBUG oslo_concurrency.lockutils [req-92b06636-db54-4548-900b-efea2640ee87 req-e51c2951-b7da-418c-8f31-d7418afcb11f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.855 227766 DEBUG oslo_concurrency.lockutils [req-92b06636-db54-4548-900b-efea2640ee87 req-e51c2951-b7da-418c-8f31-d7418afcb11f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.856 227766 DEBUG nova.compute.manager [req-92b06636-db54-4548-900b-efea2640ee87 req-e51c2951-b7da-418c-8f31-d7418afcb11f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] No waiting events found dispatching network-vif-unplugged-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:37:52 np0005593234 nova_compute[227762]: 2026-01-23 09:37:52.856 227766 DEBUG nova.compute.manager [req-92b06636-db54-4548-900b-efea2640ee87 req-e51c2951-b7da-418c-8f31-d7418afcb11f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Received event network-vif-unplugged-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:37:53 np0005593234 nova_compute[227762]: 2026-01-23 09:37:53.107 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:53 np0005593234 nova_compute[227762]: 2026-01-23 09:37:53.125 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:53 np0005593234 nova_compute[227762]: 2026-01-23 09:37:53.776 227766 DEBUG nova.network.neutron [-] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:37:53 np0005593234 nova_compute[227762]: 2026-01-23 09:37:53.805 227766 INFO nova.compute.manager [-] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Took 1.01 seconds to deallocate network for instance.#033[00m
Jan 23 04:37:53 np0005593234 nova_compute[227762]: 2026-01-23 09:37:53.913 227766 DEBUG oslo_concurrency.lockutils [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:53 np0005593234 nova_compute[227762]: 2026-01-23 09:37:53.913 227766 DEBUG oslo_concurrency.lockutils [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:54 np0005593234 nova_compute[227762]: 2026-01-23 09:37:54.003 227766 DEBUG oslo_concurrency.processutils [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:54 np0005593234 nova_compute[227762]: 2026-01-23 09:37:54.065 227766 DEBUG nova.virt.libvirt.driver [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Creating tmpfile /var/lib/nova/instances/tmphmsj_e0s to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 23 04:37:54 np0005593234 nova_compute[227762]: 2026-01-23 09:37:54.275 227766 DEBUG nova.compute.manager [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphmsj_e0s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 23 04:37:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:37:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:54.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:37:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:37:54 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/274539304' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:37:54 np0005593234 nova_compute[227762]: 2026-01-23 09:37:54.466 227766 DEBUG oslo_concurrency.processutils [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:54 np0005593234 nova_compute[227762]: 2026-01-23 09:37:54.473 227766 DEBUG nova.compute.provider_tree [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:37:54 np0005593234 nova_compute[227762]: 2026-01-23 09:37:54.490 227766 DEBUG nova.scheduler.client.report [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:37:54 np0005593234 nova_compute[227762]: 2026-01-23 09:37:54.525 227766 DEBUG oslo_concurrency.lockutils [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:54 np0005593234 nova_compute[227762]: 2026-01-23 09:37:54.558 227766 INFO nova.scheduler.client.report [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Deleted allocations for instance aede4522-9d5a-4475-9dd9-46c044901917#033[00m
Jan 23 04:37:54 np0005593234 nova_compute[227762]: 2026-01-23 09:37:54.649 227766 DEBUG oslo_concurrency.lockutils [None req-2b41db55-bc8f-440b-ad61-5b8934b1c28e 726bd44b7ec443a0a4b8b632b06c622e 9f68f8c2203944c9a6e44a6756c8b4b9 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.728s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:54.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:55 np0005593234 nova_compute[227762]: 2026-01-23 09:37:55.305 227766 DEBUG nova.compute.manager [req-46991de4-1351-440a-8faf-c72f8f1d481e req-5f962852-c090-40da-a4d3-eb419ee7cc6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Received event network-vif-plugged-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:55 np0005593234 nova_compute[227762]: 2026-01-23 09:37:55.305 227766 DEBUG oslo_concurrency.lockutils [req-46991de4-1351-440a-8faf-c72f8f1d481e req-5f962852-c090-40da-a4d3-eb419ee7cc6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "aede4522-9d5a-4475-9dd9-46c044901917-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:55 np0005593234 nova_compute[227762]: 2026-01-23 09:37:55.306 227766 DEBUG oslo_concurrency.lockutils [req-46991de4-1351-440a-8faf-c72f8f1d481e req-5f962852-c090-40da-a4d3-eb419ee7cc6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:55 np0005593234 nova_compute[227762]: 2026-01-23 09:37:55.307 227766 DEBUG oslo_concurrency.lockutils [req-46991de4-1351-440a-8faf-c72f8f1d481e req-5f962852-c090-40da-a4d3-eb419ee7cc6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aede4522-9d5a-4475-9dd9-46c044901917-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:55 np0005593234 nova_compute[227762]: 2026-01-23 09:37:55.307 227766 DEBUG nova.compute.manager [req-46991de4-1351-440a-8faf-c72f8f1d481e req-5f962852-c090-40da-a4d3-eb419ee7cc6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] No waiting events found dispatching network-vif-plugged-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:37:55 np0005593234 nova_compute[227762]: 2026-01-23 09:37:55.308 227766 WARNING nova.compute.manager [req-46991de4-1351-440a-8faf-c72f8f1d481e req-5f962852-c090-40da-a4d3-eb419ee7cc6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Received unexpected event network-vif-plugged-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa for instance with vm_state deleted and task_state None.#033[00m
Jan 23 04:37:55 np0005593234 nova_compute[227762]: 2026-01-23 09:37:55.308 227766 DEBUG nova.compute.manager [req-46991de4-1351-440a-8faf-c72f8f1d481e req-5f962852-c090-40da-a4d3-eb419ee7cc6c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Received event network-vif-deleted-d6a90f4f-8b5b-441d-a945-cdba5f1aa2fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:37:55 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:37:55 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:37:55 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:37:55 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:37:55 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:37:55 np0005593234 nova_compute[227762]: 2026-01-23 09:37:55.431 227766 DEBUG nova.compute.manager [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphmsj_e0s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='261ab1ec-f79b-4867-bcb6-1c1d7491120e',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 23 04:37:55 np0005593234 nova_compute[227762]: 2026-01-23 09:37:55.501 227766 DEBUG oslo_concurrency.lockutils [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Acquiring lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:37:55 np0005593234 nova_compute[227762]: 2026-01-23 09:37:55.502 227766 DEBUG oslo_concurrency.lockutils [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Acquired lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:37:55 np0005593234 nova_compute[227762]: 2026-01-23 09:37:55.502 227766 DEBUG nova.network.neutron [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:37:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:56.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:37:56 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3889034530' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:37:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:37:56 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3889034530' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:37:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:56.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.198 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.549 227766 DEBUG nova.network.neutron [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Updating instance_info_cache with network_info: [{"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.586 227766 DEBUG oslo_concurrency.lockutils [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Releasing lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.589 227766 DEBUG os_brick.utils [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.591 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.601 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.601 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[9ddb3810-1b62-4eaa-965c-9d9d81a82369]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.603 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.609 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.609 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[51f9a0c2-8850-49ce-bbab-fe395c8b97b1]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.612 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.619 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.620 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[5be23a75-4e36-4fa8-8ea1-3324c626cccf]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.621 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c7bbbd-61b9-4f42-a9d0-d05802176907]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.622 227766 DEBUG oslo_concurrency.processutils [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.645 227766 DEBUG oslo_concurrency.processutils [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] CMD "nvme version" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.647 227766 DEBUG os_brick.initiator.connectors.lightos [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.648 227766 DEBUG os_brick.initiator.connectors.lightos [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.648 227766 DEBUG os_brick.initiator.connectors.lightos [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.648 227766 DEBUG os_brick.utils [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] <== get_connector_properties: return (58ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.728 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Acquiring lock "e8375e53-0781-4214-93e9-725707aab45d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.729 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.757 227766 DEBUG nova.compute.manager [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:37:57 np0005593234 podman[243544]: 2026-01-23 09:37:57.808838531 +0000 UTC m=+0.091222759 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.872 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.872 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.878 227766 DEBUG nova.virt.hardware [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:37:57 np0005593234 nova_compute[227762]: 2026-01-23 09:37:57.879 227766 INFO nova.compute.claims [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:37:57 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:57Z|00079|memory|INFO|peak resident set size grew 51% in last 1450.3 seconds, from 16000 kB to 24124 kB
Jan 23 04:37:57 np0005593234 ovn_controller[134547]: 2026-01-23T09:37:57Z|00080|memory|INFO|idl-cells-OVN_Southbound:10903 idl-cells-Open_vSwitch:756 lflow-cache-entries-cache-expr:380 lflow-cache-entries-cache-matches:296 lflow-cache-size-KB:1612 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:634 ofctrl_installed_flow_usage-KB:464 ofctrl_sb_flow_ref_usage-KB:237
Jan 23 04:37:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:58.082 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:37:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:37:58.083 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.083 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.111 227766 DEBUG oslo_concurrency.processutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.131 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:37:58.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:37:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1081486001' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.534 227766 DEBUG oslo_concurrency.processutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.539 227766 DEBUG nova.compute.provider_tree [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.562 227766 DEBUG nova.scheduler.client.report [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.603 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.731s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.604 227766 DEBUG nova.compute.manager [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.663 227766 DEBUG nova.compute.manager [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.664 227766 DEBUG nova.network.neutron [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.702 227766 INFO nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.736 227766 DEBUG nova.compute.manager [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:37:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:37:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:37:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:37:58.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.862 227766 DEBUG nova.virt.libvirt.driver [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphmsj_e0s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='261ab1ec-f79b-4867-bcb6-1c1d7491120e',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={b06791ec-66fd-4114-8448-7ea0b7f88f25='1efcf993-49fb-4692-9e1a-35930d237781'},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.863 227766 DEBUG nova.virt.libvirt.driver [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Creating instance directory: /var/lib/nova/instances/261ab1ec-f79b-4867-bcb6-1c1d7491120e pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.863 227766 DEBUG nova.virt.libvirt.driver [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Ensure instance console log exists: /var/lib/nova/instances/261ab1ec-f79b-4867-bcb6-1c1d7491120e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.864 227766 DEBUG nova.virt.libvirt.driver [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Connecting volumes before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10901#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.866 227766 DEBUG nova.virt.libvirt.driver [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.868 227766 DEBUG nova.virt.libvirt.vif [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-724421301',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-724421301',id=29,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:37:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d0dce6e339c349d4ab97cee5e49fff3a',ramdisk_id='',reservation_id='r-106tqp53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1207260646',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1207260646-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:37:50Z,user_data=None,user_id='4f72965e950c4761bfedd99fdc411a83',uuid=261ab1ec-f79b-4867-bcb6-1c1d7491120e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.868 227766 DEBUG nova.network.os_vif_util [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Converting VIF {"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.869 227766 DEBUG nova.network.os_vif_util [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:06:0e,bridge_name='br-int',has_traffic_filtering=True,id=27e277b3-2135-4e3e-b336-e0da87509465,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27e277b3-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.869 227766 DEBUG os_vif [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:06:0e,bridge_name='br-int',has_traffic_filtering=True,id=27e277b3-2135-4e3e-b336-e0da87509465,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27e277b3-21') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.870 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.871 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.873 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.874 227766 DEBUG nova.compute.manager [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.875 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.876 227766 INFO nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Creating image(s)#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.899 227766 DEBUG nova.storage.rbd_utils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] rbd image e8375e53-0781-4214-93e9-725707aab45d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.922 227766 DEBUG nova.storage.rbd_utils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] rbd image e8375e53-0781-4214-93e9-725707aab45d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.943 227766 DEBUG nova.storage.rbd_utils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] rbd image e8375e53-0781-4214-93e9-725707aab45d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.947 227766 DEBUG oslo_concurrency.processutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.974 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.976 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap27e277b3-21, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.977 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap27e277b3-21, col_values=(('external_ids', {'iface-id': '27e277b3-2135-4e3e-b336-e0da87509465', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:06:0e', 'vm-uuid': '261ab1ec-f79b-4867-bcb6-1c1d7491120e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.978 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:58 np0005593234 NetworkManager[48942]: <info>  [1769161078.9796] manager: (tap27e277b3-21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.982 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.984 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.984 227766 INFO os_vif [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:06:0e,bridge_name='br-int',has_traffic_filtering=True,id=27e277b3-2135-4e3e-b336-e0da87509465,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27e277b3-21')#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.988 227766 DEBUG nova.virt.libvirt.driver [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 23 04:37:58 np0005593234 nova_compute[227762]: 2026-01-23 09:37:58.988 227766 DEBUG nova.compute.manager [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphmsj_e0s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='261ab1ec-f79b-4867-bcb6-1c1d7491120e',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={b06791ec-66fd-4114-8448-7ea0b7f88f25='1efcf993-49fb-4692-9e1a-35930d237781'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.017 227766 DEBUG oslo_concurrency.processutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.018 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.018 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.019 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.039 227766 DEBUG nova.storage.rbd_utils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] rbd image e8375e53-0781-4214-93e9-725707aab45d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.042 227766 DEBUG oslo_concurrency.processutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 e8375e53-0781-4214-93e9-725707aab45d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:37:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.359 227766 DEBUG oslo_concurrency.processutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 e8375e53-0781-4214-93e9-725707aab45d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.317s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.422 227766 DEBUG nova.storage.rbd_utils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] resizing rbd image e8375e53-0781-4214-93e9-725707aab45d_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.452 227766 DEBUG nova.policy [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0b90ffd889434d4992770d9c8694044d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6efe15a8c8b44f02b78e989774efff46', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.522 227766 DEBUG nova.objects.instance [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lazy-loading 'migration_context' on Instance uuid e8375e53-0781-4214-93e9-725707aab45d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.539 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.539 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Ensure instance console log exists: /var/lib/nova/instances/e8375e53-0781-4214-93e9-725707aab45d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.540 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.540 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:37:59 np0005593234 nova_compute[227762]: 2026-01-23 09:37:59.540 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:00 np0005593234 nova_compute[227762]: 2026-01-23 09:38:00.042 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161064.9020452, 261ab1ec-f79b-4867-bcb6-1c1d7491120e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:38:00 np0005593234 nova_compute[227762]: 2026-01-23 09:38:00.042 227766 INFO nova.compute.manager [-] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:38:00 np0005593234 nova_compute[227762]: 2026-01-23 09:38:00.155 227766 DEBUG nova.compute.manager [None req-02308e03-09ef-4b41-a540-3872b0689bb2 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:00.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:00.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:01 np0005593234 nova_compute[227762]: 2026-01-23 09:38:01.067 227766 DEBUG nova.network.neutron [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Port 27e277b3-2135-4e3e-b336-e0da87509465 updated with migration profile {'os_vif_delegation': True, 'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 23 04:38:01 np0005593234 nova_compute[227762]: 2026-01-23 09:38:01.395 227766 DEBUG nova.compute.manager [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[LibvirtLiveMigrateBDMInfo],block_migration=False,disk_available_mb=19456,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmphmsj_e0s',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='261ab1ec-f79b-4867-bcb6-1c1d7491120e',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=True,migration=<?>,old_vol_attachment_ids={b06791ec-66fd-4114-8448-7ea0b7f88f25='1efcf993-49fb-4692-9e1a-35930d237781'},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 23 04:38:01 np0005593234 systemd[1]: Starting libvirt proxy daemon...
Jan 23 04:38:01 np0005593234 systemd[1]: Started libvirt proxy daemon.
Jan 23 04:38:01 np0005593234 kernel: tap27e277b3-21: entered promiscuous mode
Jan 23 04:38:01 np0005593234 NetworkManager[48942]: <info>  [1769161081.6823] manager: (tap27e277b3-21): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Jan 23 04:38:01 np0005593234 nova_compute[227762]: 2026-01-23 09:38:01.682 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:01 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:01Z|00081|binding|INFO|Claiming lport 27e277b3-2135-4e3e-b336-e0da87509465 for this additional chassis.
Jan 23 04:38:01 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:01Z|00082|binding|INFO|27e277b3-2135-4e3e-b336-e0da87509465: Claiming fa:16:3e:34:06:0e 10.100.0.11
Jan 23 04:38:01 np0005593234 nova_compute[227762]: 2026-01-23 09:38:01.701 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:01 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:01Z|00083|binding|INFO|Setting lport 27e277b3-2135-4e3e-b336-e0da87509465 ovn-installed in OVS
Jan 23 04:38:01 np0005593234 nova_compute[227762]: 2026-01-23 09:38:01.705 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:01 np0005593234 systemd-udevd[243794]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:38:01 np0005593234 nova_compute[227762]: 2026-01-23 09:38:01.712 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:01 np0005593234 systemd-machined[195626]: New machine qemu-15-instance-0000001d.
Jan 23 04:38:01 np0005593234 NetworkManager[48942]: <info>  [1769161081.7219] device (tap27e277b3-21): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:38:01 np0005593234 NetworkManager[48942]: <info>  [1769161081.7226] device (tap27e277b3-21): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:38:01 np0005593234 systemd[1]: Started Virtual Machine qemu-15-instance-0000001d.
Jan 23 04:38:01 np0005593234 nova_compute[227762]: 2026-01-23 09:38:01.962 227766 DEBUG nova.network.neutron [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Successfully created port: 6c46396c-25f3-4442-b5bf-7ba6361eed17 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:38:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:38:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:38:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:02.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:02 np0005593234 nova_compute[227762]: 2026-01-23 09:38:02.641 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161082.641525, 261ab1ec-f79b-4867-bcb6-1c1d7491120e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:38:02 np0005593234 nova_compute[227762]: 2026-01-23 09:38:02.642 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] VM Started (Lifecycle Event)#033[00m
Jan 23 04:38:02 np0005593234 nova_compute[227762]: 2026-01-23 09:38:02.667 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:02 np0005593234 nova_compute[227762]: 2026-01-23 09:38:02.719 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "52908364-c256-4f35-8ea4-1904a14fa399" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:02 np0005593234 nova_compute[227762]: 2026-01-23 09:38:02.719 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:02 np0005593234 nova_compute[227762]: 2026-01-23 09:38:02.738 227766 DEBUG nova.compute.manager [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:38:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:02.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:03 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:03.084 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.105 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161083.1046784, 261ab1ec-f79b-4867-bcb6-1c1d7491120e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.106 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.128 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.156 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.159 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.195 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.273 227766 DEBUG nova.network.neutron [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Successfully updated port: 6c46396c-25f3-4442-b5bf-7ba6361eed17 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.279 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.279 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.284 227766 DEBUG nova.virt.hardware [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.284 227766 INFO nova.compute.claims [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.322 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Acquiring lock "refresh_cache-e8375e53-0781-4214-93e9-725707aab45d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.323 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Acquired lock "refresh_cache-e8375e53-0781-4214-93e9-725707aab45d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.323 227766 DEBUG nova.network.neutron [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.562 227766 DEBUG nova.compute.manager [req-8057924d-56e1-441c-8531-f4f86ac64d4b req-58246398-3102-4d42-88e7-4a92a8dca51d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Received event network-changed-6c46396c-25f3-4442-b5bf-7ba6361eed17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.562 227766 DEBUG nova.compute.manager [req-8057924d-56e1-441c-8531-f4f86ac64d4b req-58246398-3102-4d42-88e7-4a92a8dca51d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Refreshing instance network info cache due to event network-changed-6c46396c-25f3-4442-b5bf-7ba6361eed17. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.563 227766 DEBUG oslo_concurrency.lockutils [req-8057924d-56e1-441c-8531-f4f86ac64d4b req-58246398-3102-4d42-88e7-4a92a8dca51d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-e8375e53-0781-4214-93e9-725707aab45d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.606 227766 DEBUG nova.network.neutron [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.642 227766 DEBUG oslo_concurrency.processutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:03 np0005593234 nova_compute[227762]: 2026-01-23 09:38:03.978 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:38:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1883236069' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:38:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.061 227766 DEBUG oslo_concurrency.processutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.067 227766 DEBUG nova.compute.provider_tree [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.081 227766 DEBUG nova.scheduler.client.report [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.131 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.852s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.132 227766 DEBUG nova.compute.manager [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.228 227766 DEBUG nova.compute.manager [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.229 227766 DEBUG nova.network.neutron [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.259 227766 INFO nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.284 227766 DEBUG nova.compute.manager [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:38:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:38:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:04.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.421 227766 DEBUG nova.compute.manager [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.423 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.423 227766 INFO nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Creating image(s)#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.447 227766 DEBUG nova.storage.rbd_utils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 52908364-c256-4f35-8ea4-1904a14fa399_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.474 227766 DEBUG nova.storage.rbd_utils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 52908364-c256-4f35-8ea4-1904a14fa399_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.502 227766 DEBUG nova.storage.rbd_utils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 52908364-c256-4f35-8ea4-1904a14fa399_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.506 227766 DEBUG oslo_concurrency.processutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.565 227766 DEBUG oslo_concurrency.processutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.566 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.566 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.566 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.592 227766 DEBUG nova.storage.rbd_utils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 52908364-c256-4f35-8ea4-1904a14fa399_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.598 227766 DEBUG oslo_concurrency.processutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 52908364-c256-4f35-8ea4-1904a14fa399_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.690 227766 DEBUG nova.network.neutron [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Updating instance_info_cache with network_info: [{"id": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "address": "fa:16:3e:ab:f6:9f", "network": {"id": "1b41d31a-47ee-41d6-9860-bbbbe4b282f2", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1925335251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6efe15a8c8b44f02b78e989774efff46", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46396c-25", "ovs_interfaceid": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.730 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Releasing lock "refresh_cache-e8375e53-0781-4214-93e9-725707aab45d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.731 227766 DEBUG nova.compute.manager [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Instance network_info: |[{"id": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "address": "fa:16:3e:ab:f6:9f", "network": {"id": "1b41d31a-47ee-41d6-9860-bbbbe4b282f2", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1925335251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6efe15a8c8b44f02b78e989774efff46", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46396c-25", "ovs_interfaceid": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.732 227766 DEBUG oslo_concurrency.lockutils [req-8057924d-56e1-441c-8531-f4f86ac64d4b req-58246398-3102-4d42-88e7-4a92a8dca51d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-e8375e53-0781-4214-93e9-725707aab45d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.732 227766 DEBUG nova.network.neutron [req-8057924d-56e1-441c-8531-f4f86ac64d4b req-58246398-3102-4d42-88e7-4a92a8dca51d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Refreshing network info cache for port 6c46396c-25f3-4442-b5bf-7ba6361eed17 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.736 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Start _get_guest_xml network_info=[{"id": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "address": "fa:16:3e:ab:f6:9f", "network": {"id": "1b41d31a-47ee-41d6-9860-bbbbe4b282f2", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1925335251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6efe15a8c8b44f02b78e989774efff46", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46396c-25", "ovs_interfaceid": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.744 227766 WARNING nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.750 227766 DEBUG nova.virt.libvirt.host [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.751 227766 DEBUG nova.virt.libvirt.host [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.760 227766 DEBUG nova.virt.libvirt.host [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.761 227766 DEBUG nova.virt.libvirt.host [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.763 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.764 227766 DEBUG nova.virt.hardware [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.765 227766 DEBUG nova.virt.hardware [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.765 227766 DEBUG nova.virt.hardware [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.765 227766 DEBUG nova.virt.hardware [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.765 227766 DEBUG nova.virt.hardware [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.765 227766 DEBUG nova.virt.hardware [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.766 227766 DEBUG nova.virt.hardware [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.766 227766 DEBUG nova.virt.hardware [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.766 227766 DEBUG nova.virt.hardware [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.766 227766 DEBUG nova.virt.hardware [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.767 227766 DEBUG nova.virt.hardware [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.771 227766 DEBUG oslo_concurrency.processutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:04.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.912 227766 DEBUG oslo_concurrency.processutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 52908364-c256-4f35-8ea4-1904a14fa399_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.314s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.950 227766 DEBUG nova.policy [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '191a72cfd0a841e9806246e07eb62fa6', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1a5f46b255cd4387bd3e4c0acaa39466', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:38:04 np0005593234 nova_compute[227762]: 2026-01-23 09:38:04.996 227766 DEBUG nova.storage.rbd_utils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] resizing rbd image 52908364-c256-4f35-8ea4-1904a14fa399_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.125 227766 DEBUG nova.objects.instance [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'migration_context' on Instance uuid 52908364-c256-4f35-8ea4-1904a14fa399 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.150 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.150 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Ensure instance console log exists: /var/lib/nova/instances/52908364-c256-4f35-8ea4-1904a14fa399/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.151 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.151 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.152 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:38:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3094546404' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.222 227766 DEBUG oslo_concurrency.processutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.248 227766 DEBUG nova.storage.rbd_utils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] rbd image e8375e53-0781-4214-93e9-725707aab45d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.252 227766 DEBUG oslo_concurrency.processutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:05 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:05Z|00084|binding|INFO|Claiming lport 27e277b3-2135-4e3e-b336-e0da87509465 for this chassis.
Jan 23 04:38:05 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:05Z|00085|binding|INFO|27e277b3-2135-4e3e-b336-e0da87509465: Claiming fa:16:3e:34:06:0e 10.100.0.11
Jan 23 04:38:05 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:05Z|00086|binding|INFO|Setting lport 27e277b3-2135-4e3e-b336-e0da87509465 up in Southbound
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.615 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:06:0e 10.100.0.11'], port_security=['fa:16:3e:34:06:0e 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '261ab1ec-f79b-4867-bcb6-1c1d7491120e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0dce6e339c349d4ab97cee5e49fff3a', 'neutron:revision_number': '20', 'neutron:security_group_ids': '0179c400-b2f2-4914-b563-942a61ef1858', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cbb60528-b878-42fd-9c2f-0a3345010b1a, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=27e277b3-2135-4e3e-b336-e0da87509465) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.616 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 27e277b3-2135-4e3e-b336-e0da87509465 in datapath 8eab8076-0848-4daf-bbac-f3f8b65ca750 bound to our chassis#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.618 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8eab8076-0848-4daf-bbac-f3f8b65ca750#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.630 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bd2adb4d-db1b-447f-98bf-71175d122929]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.631 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8eab8076-01 in ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.633 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8eab8076-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.633 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aa386d63-ea2a-4cc0-ac79-51acae39e054]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.634 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f5fc9576-7a9b-422f-baa4-b3c7ad051e41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.647 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[45816c2d-24a4-4732-a75e-434fede80542]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:38:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3774858833' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.670 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2de0e915-d6b0-4391-9026-b28e053992d8]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.675 227766 DEBUG oslo_concurrency.processutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.677 227766 DEBUG nova.virt.libvirt.vif [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:37:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1271327709',display_name='tempest-ImagesNegativeTestJSON-server-1271327709',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1271327709',id=34,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6efe15a8c8b44f02b78e989774efff46',ramdisk_id='',reservation_id='r-gme042xa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-79321248',owner_user_name='tempest-ImagesNegativeTestJSON-79321248-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:37:58Z,user_data=None,user_id='0b90ffd889434d4992770d9c8694044d',uuid=e8375e53-0781-4214-93e9-725707aab45d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "address": "fa:16:3e:ab:f6:9f", "network": {"id": "1b41d31a-47ee-41d6-9860-bbbbe4b282f2", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1925335251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6efe15a8c8b44f02b78e989774efff46", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46396c-25", "ovs_interfaceid": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.677 227766 DEBUG nova.network.os_vif_util [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Converting VIF {"id": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "address": "fa:16:3e:ab:f6:9f", "network": {"id": "1b41d31a-47ee-41d6-9860-bbbbe4b282f2", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1925335251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6efe15a8c8b44f02b78e989774efff46", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46396c-25", "ovs_interfaceid": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.680 227766 DEBUG nova.network.os_vif_util [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:f6:9f,bridge_name='br-int',has_traffic_filtering=True,id=6c46396c-25f3-4442-b5bf-7ba6361eed17,network=Network(1b41d31a-47ee-41d6-9860-bbbbe4b282f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46396c-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.683 227766 DEBUG nova.objects.instance [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lazy-loading 'pci_devices' on Instance uuid e8375e53-0781-4214-93e9-725707aab45d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.700 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d3089524-813d-455d-b9d4-203614fc7a5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.707 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b06e8b7d-e09d-4760-a313-1112e70a8384]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 NetworkManager[48942]: <info>  [1769161085.7084] manager: (tap8eab8076-00): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.708 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  <uuid>e8375e53-0781-4214-93e9-725707aab45d</uuid>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  <name>instance-00000022</name>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <nova:name>tempest-ImagesNegativeTestJSON-server-1271327709</nova:name>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:38:04</nova:creationTime>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <nova:user uuid="0b90ffd889434d4992770d9c8694044d">tempest-ImagesNegativeTestJSON-79321248-project-member</nova:user>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <nova:project uuid="6efe15a8c8b44f02b78e989774efff46">tempest-ImagesNegativeTestJSON-79321248</nova:project>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <nova:port uuid="6c46396c-25f3-4442-b5bf-7ba6361eed17">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <entry name="serial">e8375e53-0781-4214-93e9-725707aab45d</entry>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <entry name="uuid">e8375e53-0781-4214-93e9-725707aab45d</entry>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/e8375e53-0781-4214-93e9-725707aab45d_disk">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/e8375e53-0781-4214-93e9-725707aab45d_disk.config">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:ab:f6:9f"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <target dev="tap6c46396c-25"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/e8375e53-0781-4214-93e9-725707aab45d/console.log" append="off"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:38:05 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:38:05 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:38:05 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:38:05 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.710 227766 DEBUG nova.compute.manager [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Preparing to wait for external event network-vif-plugged-6c46396c-25f3-4442-b5bf-7ba6361eed17 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.711 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Acquiring lock "e8375e53-0781-4214-93e9-725707aab45d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.711 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.711 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.712 227766 DEBUG nova.virt.libvirt.vif [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:37:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1271327709',display_name='tempest-ImagesNegativeTestJSON-server-1271327709',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1271327709',id=34,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6efe15a8c8b44f02b78e989774efff46',ramdisk_id='',reservation_id='r-gme042xa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesNegativeTestJSON-79321248',owner_user_name='tempest-ImagesN
egativeTestJSON-79321248-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:37:58Z,user_data=None,user_id='0b90ffd889434d4992770d9c8694044d',uuid=e8375e53-0781-4214-93e9-725707aab45d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "address": "fa:16:3e:ab:f6:9f", "network": {"id": "1b41d31a-47ee-41d6-9860-bbbbe4b282f2", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1925335251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6efe15a8c8b44f02b78e989774efff46", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46396c-25", "ovs_interfaceid": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.713 227766 DEBUG nova.network.os_vif_util [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Converting VIF {"id": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "address": "fa:16:3e:ab:f6:9f", "network": {"id": "1b41d31a-47ee-41d6-9860-bbbbe4b282f2", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1925335251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6efe15a8c8b44f02b78e989774efff46", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46396c-25", "ovs_interfaceid": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.714 227766 DEBUG nova.network.os_vif_util [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:f6:9f,bridge_name='br-int',has_traffic_filtering=True,id=6c46396c-25f3-4442-b5bf-7ba6361eed17,network=Network(1b41d31a-47ee-41d6-9860-bbbbe4b282f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46396c-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.714 227766 DEBUG os_vif [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:f6:9f,bridge_name='br-int',has_traffic_filtering=True,id=6c46396c-25f3-4442-b5bf-7ba6361eed17,network=Network(1b41d31a-47ee-41d6-9860-bbbbe4b282f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46396c-25') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.715 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.715 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.716 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.719 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.719 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c46396c-25, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.720 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6c46396c-25, col_values=(('external_ids', {'iface-id': '6c46396c-25f3-4442-b5bf-7ba6361eed17', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:f6:9f', 'vm-uuid': 'e8375e53-0781-4214-93e9-725707aab45d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:05 np0005593234 NetworkManager[48942]: <info>  [1769161085.7223] manager: (tap6c46396c-25): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.723 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.727 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.729 227766 INFO os_vif [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:f6:9f,bridge_name='br-int',has_traffic_filtering=True,id=6c46396c-25f3-4442-b5bf-7ba6361eed17,network=Network(1b41d31a-47ee-41d6-9860-bbbbe4b282f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46396c-25')#033[00m
Jan 23 04:38:05 np0005593234 systemd-udevd[244157]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.739 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[3b44baf2-00e4-4365-82a6-bcac46f19afc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.742 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f1a8f220-6bda-4744-84d3-c369f7664640]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 NetworkManager[48942]: <info>  [1769161085.7649] device (tap8eab8076-00): carrier: link connected
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.769 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b21f4b-16a3-4622-b754-90482338dd19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.785 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8b828592-ca47-4c08-8c09-91dc22e2393d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8eab8076-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:5b:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498072, 'reachable_time': 22665, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244179, 'error': None, 'target': 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.799 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d7df95dd-97f6-4c2f-b296-a8ef6ac563f9]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe46:5b99'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498072, 'tstamp': 498072}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244180, 'error': None, 'target': 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.815 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4ec72a28-f66f-4524-9ca1-b32fc3259313]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8eab8076-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:46:5b:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498072, 'reachable_time': 22665, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244181, 'error': None, 'target': 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.828 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.828 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.828 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] No VIF found with MAC fa:16:3e:ab:f6:9f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.829 227766 INFO nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Using config drive#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.845 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4cb3f3b0-6f54-44a1-8599-983ac699effa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.854 227766 DEBUG nova.storage.rbd_utils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] rbd image e8375e53-0781-4214-93e9-725707aab45d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.860 227766 INFO nova.compute.manager [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Post operation of migration started#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.903 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[732284da-f3ab-46e0-ad17-541119a3c41e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.905 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8eab8076-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.905 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.905 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8eab8076-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:05 np0005593234 NetworkManager[48942]: <info>  [1769161085.9078] manager: (tap8eab8076-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.908 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:05 np0005593234 kernel: tap8eab8076-00: entered promiscuous mode
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.913 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8eab8076-00, col_values=(('external_ids', {'iface-id': 'b545a870-aa18-4f64-a8a7-f8512824c4cc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:05 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:05Z|00087|binding|INFO|Releasing lport b545a870-aa18-4f64-a8a7-f8512824c4cc from this chassis (sb_readonly=0)
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.914 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.929 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:05 np0005593234 nova_compute[227762]: 2026-01-23 09:38:05.933 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.933 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8eab8076-0848-4daf-bbac-f3f8b65ca750.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8eab8076-0848-4daf-bbac-f3f8b65ca750.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.934 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0b195fef-bde6-4469-9796-47b3803b8ca7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.935 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-8eab8076-0848-4daf-bbac-f3f8b65ca750
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/8eab8076-0848-4daf-bbac-f3f8b65ca750.pid.haproxy
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 8eab8076-0848-4daf-bbac-f3f8b65ca750
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:38:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:05.935 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'env', 'PROCESS_TAG=haproxy-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8eab8076-0848-4daf-bbac-f3f8b65ca750.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:38:06 np0005593234 podman[244234]: 2026-01-23 09:38:06.313594027 +0000 UTC m=+0.049083232 container create 10b198ef7c9966bb20a2a9bd4638dc189529376ccfb13926cd5484eae9f0c1b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 04:38:06 np0005593234 systemd[1]: Started libpod-conmon-10b198ef7c9966bb20a2a9bd4638dc189529376ccfb13926cd5484eae9f0c1b1.scope.
Jan 23 04:38:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:06.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:06 np0005593234 nova_compute[227762]: 2026-01-23 09:38:06.372 227766 DEBUG nova.network.neutron [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Successfully created port: 805027f4-e7d4-48b3-9fac-a3e7901dbd9f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:38:06 np0005593234 podman[244234]: 2026-01-23 09:38:06.286235024 +0000 UTC m=+0.021724269 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:38:06 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:38:06 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c7e94588b446a6e33816455026bc4edc3bb9e14cedb0e85cec3d198149c28eb2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:38:06 np0005593234 podman[244234]: 2026-01-23 09:38:06.398812898 +0000 UTC m=+0.134302123 container init 10b198ef7c9966bb20a2a9bd4638dc189529376ccfb13926cd5484eae9f0c1b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 23 04:38:06 np0005593234 podman[244234]: 2026-01-23 09:38:06.404495836 +0000 UTC m=+0.139985031 container start 10b198ef7c9966bb20a2a9bd4638dc189529376ccfb13926cd5484eae9f0c1b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 04:38:06 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[244250]: [NOTICE]   (244254) : New worker (244256) forked
Jan 23 04:38:06 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[244250]: [NOTICE]   (244254) : Loading success.
Jan 23 04:38:06 np0005593234 nova_compute[227762]: 2026-01-23 09:38:06.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:06 np0005593234 nova_compute[227762]: 2026-01-23 09:38:06.785 227766 DEBUG oslo_concurrency.lockutils [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Acquiring lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:38:06 np0005593234 nova_compute[227762]: 2026-01-23 09:38:06.786 227766 DEBUG oslo_concurrency.lockutils [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Acquired lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:38:06 np0005593234 nova_compute[227762]: 2026-01-23 09:38:06.786 227766 DEBUG nova.network.neutron [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:38:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:06.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:06 np0005593234 nova_compute[227762]: 2026-01-23 09:38:06.927 227766 DEBUG nova.network.neutron [req-8057924d-56e1-441c-8531-f4f86ac64d4b req-58246398-3102-4d42-88e7-4a92a8dca51d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Updated VIF entry in instance network info cache for port 6c46396c-25f3-4442-b5bf-7ba6361eed17. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:38:06 np0005593234 nova_compute[227762]: 2026-01-23 09:38:06.928 227766 DEBUG nova.network.neutron [req-8057924d-56e1-441c-8531-f4f86ac64d4b req-58246398-3102-4d42-88e7-4a92a8dca51d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Updating instance_info_cache with network_info: [{"id": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "address": "fa:16:3e:ab:f6:9f", "network": {"id": "1b41d31a-47ee-41d6-9860-bbbbe4b282f2", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1925335251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6efe15a8c8b44f02b78e989774efff46", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46396c-25", "ovs_interfaceid": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:38:06 np0005593234 nova_compute[227762]: 2026-01-23 09:38:06.946 227766 DEBUG oslo_concurrency.lockutils [req-8057924d-56e1-441c-8531-f4f86ac64d4b req-58246398-3102-4d42-88e7-4a92a8dca51d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-e8375e53-0781-4214-93e9-725707aab45d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.097 227766 INFO nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Creating config drive at /var/lib/nova/instances/e8375e53-0781-4214-93e9-725707aab45d/disk.config#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.103 227766 DEBUG oslo_concurrency.processutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e8375e53-0781-4214-93e9-725707aab45d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdcuva0m4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.165 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161072.1638741, aede4522-9d5a-4475-9dd9-46c044901917 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.166 227766 INFO nova.compute.manager [-] [instance: aede4522-9d5a-4475-9dd9-46c044901917] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.198 227766 DEBUG nova.compute.manager [None req-02659d19-5ba4-470c-b6fd-ecd963c9e321 - - - - - -] [instance: aede4522-9d5a-4475-9dd9-46c044901917] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.232 227766 DEBUG oslo_concurrency.processutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e8375e53-0781-4214-93e9-725707aab45d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdcuva0m4" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.263 227766 DEBUG nova.storage.rbd_utils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] rbd image e8375e53-0781-4214-93e9-725707aab45d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.266 227766 DEBUG oslo_concurrency.processutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/e8375e53-0781-4214-93e9-725707aab45d/disk.config e8375e53-0781-4214-93e9-725707aab45d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.416 227766 DEBUG oslo_concurrency.processutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/e8375e53-0781-4214-93e9-725707aab45d/disk.config e8375e53-0781-4214-93e9-725707aab45d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.417 227766 INFO nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Deleting local config drive /var/lib/nova/instances/e8375e53-0781-4214-93e9-725707aab45d/disk.config because it was imported into RBD.#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.453 227766 DEBUG nova.network.neutron [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Successfully updated port: 805027f4-e7d4-48b3-9fac-a3e7901dbd9f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:38:07 np0005593234 kernel: tap6c46396c-25: entered promiscuous mode
Jan 23 04:38:07 np0005593234 NetworkManager[48942]: <info>  [1769161087.4654] manager: (tap6c46396c-25): new Tun device (/org/freedesktop/NetworkManager/Devices/56)
Jan 23 04:38:07 np0005593234 systemd-udevd[244166]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:38:07 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:07Z|00088|binding|INFO|Claiming lport 6c46396c-25f3-4442-b5bf-7ba6361eed17 for this chassis.
Jan 23 04:38:07 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:07Z|00089|binding|INFO|6c46396c-25f3-4442-b5bf-7ba6361eed17: Claiming fa:16:3e:ab:f6:9f 10.100.0.3
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.468 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:07 np0005593234 NetworkManager[48942]: <info>  [1769161087.4794] device (tap6c46396c-25): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:38:07 np0005593234 NetworkManager[48942]: <info>  [1769161087.4799] device (tap6c46396c-25): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.480 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:f6:9f 10.100.0.3'], port_security=['fa:16:3e:ab:f6:9f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e8375e53-0781-4214-93e9-725707aab45d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b41d31a-47ee-41d6-9860-bbbbe4b282f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6efe15a8c8b44f02b78e989774efff46', 'neutron:revision_number': '2', 'neutron:security_group_ids': '65c0b0df-0cf0-47ef-b51a-18c2db044988', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed06a6cf-fa19-40a7-84a1-3338b2cd7022, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=6c46396c-25f3-4442-b5bf-7ba6361eed17) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.482 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 6c46396c-25f3-4442-b5bf-7ba6361eed17 in datapath 1b41d31a-47ee-41d6-9860-bbbbe4b282f2 bound to our chassis#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.485 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1b41d31a-47ee-41d6-9860-bbbbe4b282f2#033[00m
Jan 23 04:38:07 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:07Z|00090|binding|INFO|Setting lport 6c46396c-25f3-4442-b5bf-7ba6361eed17 ovn-installed in OVS
Jan 23 04:38:07 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:07Z|00091|binding|INFO|Setting lport 6c46396c-25f3-4442-b5bf-7ba6361eed17 up in Southbound
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.487 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.489 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.497 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ee6eff81-a84b-44e3-abc3-6590598c7df9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.498 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1b41d31a-41 in ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:38:07 np0005593234 systemd-machined[195626]: New machine qemu-16-instance-00000022.
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.500 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1b41d31a-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.501 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5d64268f-c2d9-4ab0-8c6a-9981def73f7d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.502 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2451125b-e116-422a-8865-9620bd0e253a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.507 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "refresh_cache-52908364-c256-4f35-8ea4-1904a14fa399" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.508 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquired lock "refresh_cache-52908364-c256-4f35-8ea4-1904a14fa399" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.508 227766 DEBUG nova.network.neutron [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:38:07 np0005593234 systemd[1]: Started Virtual Machine qemu-16-instance-00000022.
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.516 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[7e5c91ac-4a07-4de9-a64d-c179f14f462e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.549 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e01d03dd-f55e-419c-bbe2-cc0a04b2529e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.583 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e7b6e404-4f00-46ab-8d0a-d00abd469a3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.591 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e1bc77b9-5cbf-483e-9d7e-d074af64d9ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 NetworkManager[48942]: <info>  [1769161087.5922] manager: (tap1b41d31a-40): new Veth device (/org/freedesktop/NetworkManager/Devices/57)
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.623 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[872e2538-40c7-4e78-9653-ce023b1ecf23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.625 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[671eb83a-9790-4c43-95f2-eb894af4a82c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 NetworkManager[48942]: <info>  [1769161087.6447] device (tap1b41d31a-40): carrier: link connected
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.649 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e12ac5bc-e869-4db7-bf50-26b8f650d2ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.666 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e58289ce-2171-4d8d-9622-fa7072a75020]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1b41d31a-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:3f:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498260, 'reachable_time': 24931, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244335, 'error': None, 'target': 'ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.683 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[185b0e27-cf7d-4ee8-aa77-751377b02fd0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe13:3f4c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498260, 'tstamp': 498260}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244336, 'error': None, 'target': 'ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.701 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[37b3c0cc-ea39-49ca-8262-64672424bd8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1b41d31a-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:13:3f:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498260, 'reachable_time': 24931, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244337, 'error': None, 'target': 'ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.707 227766 DEBUG nova.network.neutron [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.730 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[364df4bf-2c21-4eb1-92b1-3966fd321b6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.774 227766 DEBUG nova.compute.manager [req-ba16bbf6-5202-46b7-98ea-2bd3ec044c74 req-8d0c7ec3-1353-433f-8916-394f0dafe576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Received event network-vif-plugged-6c46396c-25f3-4442-b5bf-7ba6361eed17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.775 227766 DEBUG oslo_concurrency.lockutils [req-ba16bbf6-5202-46b7-98ea-2bd3ec044c74 req-8d0c7ec3-1353-433f-8916-394f0dafe576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "e8375e53-0781-4214-93e9-725707aab45d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.775 227766 DEBUG oslo_concurrency.lockutils [req-ba16bbf6-5202-46b7-98ea-2bd3ec044c74 req-8d0c7ec3-1353-433f-8916-394f0dafe576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.775 227766 DEBUG oslo_concurrency.lockutils [req-ba16bbf6-5202-46b7-98ea-2bd3ec044c74 req-8d0c7ec3-1353-433f-8916-394f0dafe576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.775 227766 DEBUG nova.compute.manager [req-ba16bbf6-5202-46b7-98ea-2bd3ec044c74 req-8d0c7ec3-1353-433f-8916-394f0dafe576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Processing event network-vif-plugged-6c46396c-25f3-4442-b5bf-7ba6361eed17 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.788 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b12580f0-4092-4d27-b66e-c08d67b3d66c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.790 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b41d31a-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.791 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.792 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b41d31a-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:07 np0005593234 kernel: tap1b41d31a-40: entered promiscuous mode
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.793 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:07 np0005593234 NetworkManager[48942]: <info>  [1769161087.7946] manager: (tap1b41d31a-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.800 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1b41d31a-40, col_values=(('external_ids', {'iface-id': '79373b0f-3f99-47b0-9478-649addef5661'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.801 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:07 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:07Z|00092|binding|INFO|Releasing lport 79373b0f-3f99-47b0-9478-649addef5661 from this chassis (sb_readonly=0)
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.802 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.804 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1b41d31a-47ee-41d6-9860-bbbbe4b282f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1b41d31a-47ee-41d6-9860-bbbbe4b282f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.806 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a13ae6fe-ba03-4f3a-99e9-0d7f73244a2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.815 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.817 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-1b41d31a-47ee-41d6-9860-bbbbe4b282f2
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/1b41d31a-47ee-41d6-9860-bbbbe4b282f2.pid.haproxy
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 1b41d31a-47ee-41d6-9860-bbbbe4b282f2
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:38:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:07.818 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2', 'env', 'PROCESS_TAG=haproxy-1b41d31a-47ee-41d6-9860-bbbbe4b282f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1b41d31a-47ee-41d6-9860-bbbbe4b282f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.897 227766 DEBUG nova.compute.manager [req-29068447-5e4a-437e-92d0-784e8add60f7 req-7e744c9b-fae9-4275-b94b-e97dc9d4999d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Received event network-changed-805027f4-e7d4-48b3-9fac-a3e7901dbd9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.897 227766 DEBUG nova.compute.manager [req-29068447-5e4a-437e-92d0-784e8add60f7 req-7e744c9b-fae9-4275-b94b-e97dc9d4999d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Refreshing instance network info cache due to event network-changed-805027f4-e7d4-48b3-9fac-a3e7901dbd9f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:38:07 np0005593234 nova_compute[227762]: 2026-01-23 09:38:07.897 227766 DEBUG oslo_concurrency.lockutils [req-29068447-5e4a-437e-92d0-784e8add60f7 req-7e744c9b-fae9-4275-b94b-e97dc9d4999d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-52908364-c256-4f35-8ea4-1904a14fa399" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.130 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:08 np0005593234 podman[244407]: 2026-01-23 09:38:08.185012741 +0000 UTC m=+0.054529383 container create 4b73412f49bf7af5d9eaecacaee7de08f7ac840c4af6cfc8e0ef0a8515765b61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 04:38:08 np0005593234 systemd[1]: Started libpod-conmon-4b73412f49bf7af5d9eaecacaee7de08f7ac840c4af6cfc8e0ef0a8515765b61.scope.
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.226 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161088.2264287, e8375e53-0781-4214-93e9-725707aab45d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.227 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: e8375e53-0781-4214-93e9-725707aab45d] VM Started (Lifecycle Event)#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.229 227766 DEBUG nova.compute.manager [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.232 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.237 227766 INFO nova.virt.libvirt.driver [-] [instance: e8375e53-0781-4214-93e9-725707aab45d] Instance spawned successfully.#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.237 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:38:08 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:38:08 np0005593234 podman[244407]: 2026-01-23 09:38:08.15552725 +0000 UTC m=+0.025043902 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:38:08 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4e44fd7a179bc6f2fbf9b7b34047814c49b1745e922828ab51c1e8f2c428568/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.259 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: e8375e53-0781-4214-93e9-725707aab45d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:08 np0005593234 podman[244407]: 2026-01-23 09:38:08.264509573 +0000 UTC m=+0.134026225 container init 4b73412f49bf7af5d9eaecacaee7de08f7ac840c4af6cfc8e0ef0a8515765b61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.264 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.265 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.265 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.265 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.266 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.266 227766 DEBUG nova.virt.libvirt.driver [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:38:08 np0005593234 podman[244407]: 2026-01-23 09:38:08.270374726 +0000 UTC m=+0.139891358 container start 4b73412f49bf7af5d9eaecacaee7de08f7ac840c4af6cfc8e0ef0a8515765b61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.271 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: e8375e53-0781-4214-93e9-725707aab45d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:38:08 np0005593234 neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2[244427]: [NOTICE]   (244431) : New worker (244433) forked
Jan 23 04:38:08 np0005593234 neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2[244427]: [NOTICE]   (244431) : Loading success.
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.315 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: e8375e53-0781-4214-93e9-725707aab45d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.316 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161088.2292144, e8375e53-0781-4214-93e9-725707aab45d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.317 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: e8375e53-0781-4214-93e9-725707aab45d] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:38:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:08.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.679 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: e8375e53-0781-4214-93e9-725707aab45d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.683 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161088.2315404, e8375e53-0781-4214-93e9-725707aab45d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.683 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: e8375e53-0781-4214-93e9-725707aab45d] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.726 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: e8375e53-0781-4214-93e9-725707aab45d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.729 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: e8375e53-0781-4214-93e9-725707aab45d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.765 227766 INFO nova.compute.manager [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Took 9.89 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.765 227766 DEBUG nova.compute.manager [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:08.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.800 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: e8375e53-0781-4214-93e9-725707aab45d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.869 227766 INFO nova.compute.manager [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Took 11.02 seconds to build instance.#033[00m
Jan 23 04:38:08 np0005593234 nova_compute[227762]: 2026-01-23 09:38:08.896 227766 DEBUG oslo_concurrency.lockutils [None req-b2cc64c5-13b7-4ae6-be23-2e2b640ccb4d 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.167s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.426 227766 DEBUG nova.network.neutron [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Updating instance_info_cache with network_info: [{"id": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "address": "fa:16:3e:36:e2:ca", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap805027f4-e7", "ovs_interfaceid": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.460 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Releasing lock "refresh_cache-52908364-c256-4f35-8ea4-1904a14fa399" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.460 227766 DEBUG nova.compute.manager [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Instance network_info: |[{"id": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "address": "fa:16:3e:36:e2:ca", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap805027f4-e7", "ovs_interfaceid": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.461 227766 DEBUG oslo_concurrency.lockutils [req-29068447-5e4a-437e-92d0-784e8add60f7 req-7e744c9b-fae9-4275-b94b-e97dc9d4999d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-52908364-c256-4f35-8ea4-1904a14fa399" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.461 227766 DEBUG nova.network.neutron [req-29068447-5e4a-437e-92d0-784e8add60f7 req-7e744c9b-fae9-4275-b94b-e97dc9d4999d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Refreshing network info cache for port 805027f4-e7d4-48b3-9fac-a3e7901dbd9f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.466 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Start _get_guest_xml network_info=[{"id": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "address": "fa:16:3e:36:e2:ca", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap805027f4-e7", "ovs_interfaceid": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.469 227766 WARNING nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.477 227766 DEBUG nova.virt.libvirt.host [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.478 227766 DEBUG nova.virt.libvirt.host [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.483 227766 DEBUG nova.virt.libvirt.host [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.484 227766 DEBUG nova.virt.libvirt.host [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.486 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.486 227766 DEBUG nova.virt.hardware [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.487 227766 DEBUG nova.virt.hardware [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.487 227766 DEBUG nova.virt.hardware [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.488 227766 DEBUG nova.virt.hardware [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.488 227766 DEBUG nova.virt.hardware [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.488 227766 DEBUG nova.virt.hardware [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.489 227766 DEBUG nova.virt.hardware [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.489 227766 DEBUG nova.virt.hardware [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.490 227766 DEBUG nova.virt.hardware [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.490 227766 DEBUG nova.virt.hardware [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.490 227766 DEBUG nova.virt.hardware [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.493 227766 DEBUG oslo_concurrency.processutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.671 227766 DEBUG nova.network.neutron [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Updating instance_info_cache with network_info: [{"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.705 227766 DEBUG oslo_concurrency.lockutils [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Releasing lock "refresh_cache-261ab1ec-f79b-4867-bcb6-1c1d7491120e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.734 227766 DEBUG oslo_concurrency.lockutils [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.735 227766 DEBUG oslo_concurrency.lockutils [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.735 227766 DEBUG oslo_concurrency.lockutils [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.740 227766 INFO nova.virt.libvirt.driver [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:38:09 np0005593234 virtqemud[227483]: Domain id=15 name='instance-0000001d' uuid=261ab1ec-f79b-4867-bcb6-1c1d7491120e is tainted: custom-monitor
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.825 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.890 227766 DEBUG nova.compute.manager [req-eb30050d-004c-4fa3-b3ea-889cc138c4d6 req-3001f41b-e3df-4a50-89fb-2b319cb896ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Received event network-vif-plugged-6c46396c-25f3-4442-b5bf-7ba6361eed17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.890 227766 DEBUG oslo_concurrency.lockutils [req-eb30050d-004c-4fa3-b3ea-889cc138c4d6 req-3001f41b-e3df-4a50-89fb-2b319cb896ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "e8375e53-0781-4214-93e9-725707aab45d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.891 227766 DEBUG oslo_concurrency.lockutils [req-eb30050d-004c-4fa3-b3ea-889cc138c4d6 req-3001f41b-e3df-4a50-89fb-2b319cb896ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.891 227766 DEBUG oslo_concurrency.lockutils [req-eb30050d-004c-4fa3-b3ea-889cc138c4d6 req-3001f41b-e3df-4a50-89fb-2b319cb896ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.891 227766 DEBUG nova.compute.manager [req-eb30050d-004c-4fa3-b3ea-889cc138c4d6 req-3001f41b-e3df-4a50-89fb-2b319cb896ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] No waiting events found dispatching network-vif-plugged-6c46396c-25f3-4442-b5bf-7ba6361eed17 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.891 227766 WARNING nova.compute.manager [req-eb30050d-004c-4fa3-b3ea-889cc138c4d6 req-3001f41b-e3df-4a50-89fb-2b319cb896ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Received unexpected event network-vif-plugged-6c46396c-25f3-4442-b5bf-7ba6361eed17 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:38:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:38:09 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2597416812' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.932 227766 DEBUG oslo_concurrency.processutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.957 227766 DEBUG nova.storage.rbd_utils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 52908364-c256-4f35-8ea4-1904a14fa399_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:09 np0005593234 nova_compute[227762]: 2026-01-23 09:38:09.963 227766 DEBUG oslo_concurrency.processutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:10.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:38:10 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1245002121' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.388 227766 DEBUG oslo_concurrency.processutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.389 227766 DEBUG nova.virt.libvirt.vif [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:38:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1665006440',display_name='tempest-ServersAdminTestJSON-server-1665006440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1665006440',id=35,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-ez7ohlpn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdminTestJSON-1167530593-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:38:04Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=52908364-c256-4f35-8ea4-1904a14fa399,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "address": "fa:16:3e:36:e2:ca", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap805027f4-e7", "ovs_interfaceid": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.390 227766 DEBUG nova.network.os_vif_util [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "address": "fa:16:3e:36:e2:ca", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap805027f4-e7", "ovs_interfaceid": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.390 227766 DEBUG nova.network.os_vif_util [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:e2:ca,bridge_name='br-int',has_traffic_filtering=True,id=805027f4-e7d4-48b3-9fac-a3e7901dbd9f,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap805027f4-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.391 227766 DEBUG nova.objects.instance [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'pci_devices' on Instance uuid 52908364-c256-4f35-8ea4-1904a14fa399 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.406 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-e8375e53-0781-4214-93e9-725707aab45d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.406 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-e8375e53-0781-4214-93e9-725707aab45d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.407 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: e8375e53-0781-4214-93e9-725707aab45d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.408 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid e8375e53-0781-4214-93e9-725707aab45d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.419 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  <uuid>52908364-c256-4f35-8ea4-1904a14fa399</uuid>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  <name>instance-00000023</name>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServersAdminTestJSON-server-1665006440</nova:name>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:38:09</nova:creationTime>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <nova:user uuid="191a72cfd0a841e9806246e07eb62fa6">tempest-ServersAdminTestJSON-1167530593-project-member</nova:user>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <nova:project uuid="1a5f46b255cd4387bd3e4c0acaa39466">tempest-ServersAdminTestJSON-1167530593</nova:project>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <nova:port uuid="805027f4-e7d4-48b3-9fac-a3e7901dbd9f">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <entry name="serial">52908364-c256-4f35-8ea4-1904a14fa399</entry>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <entry name="uuid">52908364-c256-4f35-8ea4-1904a14fa399</entry>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/52908364-c256-4f35-8ea4-1904a14fa399_disk">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/52908364-c256-4f35-8ea4-1904a14fa399_disk.config">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:36:e2:ca"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <target dev="tap805027f4-e7"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/52908364-c256-4f35-8ea4-1904a14fa399/console.log" append="off"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:38:10 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:38:10 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:38:10 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:38:10 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.424 227766 DEBUG nova.compute.manager [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Preparing to wait for external event network-vif-plugged-805027f4-e7d4-48b3-9fac-a3e7901dbd9f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.425 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "52908364-c256-4f35-8ea4-1904a14fa399-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.426 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.426 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.429 227766 DEBUG nova.virt.libvirt.vif [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:38:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1665006440',display_name='tempest-ServersAdminTestJSON-server-1665006440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1665006440',id=35,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-ez7ohlpn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdminTestJSON-1167530593-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:38:04Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=52908364-c256-4f35-8ea4-1904a14fa399,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "address": "fa:16:3e:36:e2:ca", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap805027f4-e7", "ovs_interfaceid": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.430 227766 DEBUG nova.network.os_vif_util [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "address": "fa:16:3e:36:e2:ca", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap805027f4-e7", "ovs_interfaceid": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.432 227766 DEBUG nova.network.os_vif_util [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:36:e2:ca,bridge_name='br-int',has_traffic_filtering=True,id=805027f4-e7d4-48b3-9fac-a3e7901dbd9f,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap805027f4-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.433 227766 DEBUG os_vif [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:e2:ca,bridge_name='br-int',has_traffic_filtering=True,id=805027f4-e7d4-48b3-9fac-a3e7901dbd9f,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap805027f4-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.435 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.436 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.438 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.450 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.451 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap805027f4-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.452 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap805027f4-e7, col_values=(('external_ids', {'iface-id': '805027f4-e7d4-48b3-9fac-a3e7901dbd9f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:36:e2:ca', 'vm-uuid': '52908364-c256-4f35-8ea4-1904a14fa399'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:10 np0005593234 NetworkManager[48942]: <info>  [1769161090.4541] manager: (tap805027f4-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.455 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.460 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.461 227766 INFO os_vif [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:36:e2:ca,bridge_name='br-int',has_traffic_filtering=True,id=805027f4-e7d4-48b3-9fac-a3e7901dbd9f,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap805027f4-e7')#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.531 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.531 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.531 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] No VIF found with MAC fa:16:3e:36:e2:ca, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.532 227766 INFO nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Using config drive#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.566 227766 DEBUG nova.storage.rbd_utils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 52908364-c256-4f35-8ea4-1904a14fa399_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.631 227766 DEBUG oslo_concurrency.lockutils [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Acquiring lock "e8375e53-0781-4214-93e9-725707aab45d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.631 227766 DEBUG oslo_concurrency.lockutils [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.632 227766 DEBUG oslo_concurrency.lockutils [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Acquiring lock "e8375e53-0781-4214-93e9-725707aab45d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.632 227766 DEBUG oslo_concurrency.lockutils [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.632 227766 DEBUG oslo_concurrency.lockutils [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.633 227766 INFO nova.compute.manager [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Terminating instance#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.634 227766 DEBUG nova.compute.manager [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:38:10 np0005593234 kernel: tap6c46396c-25 (unregistering): left promiscuous mode
Jan 23 04:38:10 np0005593234 NetworkManager[48942]: <info>  [1769161090.6819] device (tap6c46396c-25): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:38:10 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:10Z|00093|binding|INFO|Releasing lport 6c46396c-25f3-4442-b5bf-7ba6361eed17 from this chassis (sb_readonly=0)
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.690 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:10 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:10Z|00094|binding|INFO|Setting lport 6c46396c-25f3-4442-b5bf-7ba6361eed17 down in Southbound
Jan 23 04:38:10 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:10Z|00095|binding|INFO|Removing iface tap6c46396c-25 ovn-installed in OVS
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.693 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:10.699 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:f6:9f 10.100.0.3'], port_security=['fa:16:3e:ab:f6:9f 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'e8375e53-0781-4214-93e9-725707aab45d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b41d31a-47ee-41d6-9860-bbbbe4b282f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6efe15a8c8b44f02b78e989774efff46', 'neutron:revision_number': '4', 'neutron:security_group_ids': '65c0b0df-0cf0-47ef-b51a-18c2db044988', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ed06a6cf-fa19-40a7-84a1-3338b2cd7022, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=6c46396c-25f3-4442-b5bf-7ba6361eed17) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:38:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:10.700 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 6c46396c-25f3-4442-b5bf-7ba6361eed17 in datapath 1b41d31a-47ee-41d6-9860-bbbbe4b282f2 unbound from our chassis#033[00m
Jan 23 04:38:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:10.702 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1b41d31a-47ee-41d6-9860-bbbbe4b282f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:38:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:10.704 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[07f5ff6c-8c91-441e-a729-baf9de42f174]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:10.704 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2 namespace which is not needed anymore#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.716 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:10 np0005593234 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000022.scope: Deactivated successfully.
Jan 23 04:38:10 np0005593234 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000022.scope: Consumed 3.162s CPU time.
Jan 23 04:38:10 np0005593234 systemd-machined[195626]: Machine qemu-16-instance-00000022 terminated.
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.753 227766 INFO nova.virt.libvirt.driver [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 23 04:38:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:10.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:10 np0005593234 neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2[244427]: [NOTICE]   (244431) : haproxy version is 2.8.14-c23fe91
Jan 23 04:38:10 np0005593234 neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2[244427]: [NOTICE]   (244431) : path to executable is /usr/sbin/haproxy
Jan 23 04:38:10 np0005593234 neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2[244427]: [WARNING]  (244431) : Exiting Master process...
Jan 23 04:38:10 np0005593234 neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2[244427]: [ALERT]    (244431) : Current worker (244433) exited with code 143 (Terminated)
Jan 23 04:38:10 np0005593234 neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2[244427]: [WARNING]  (244431) : All workers exited. Exiting... (0)
Jan 23 04:38:10 np0005593234 systemd[1]: libpod-4b73412f49bf7af5d9eaecacaee7de08f7ac840c4af6cfc8e0ef0a8515765b61.scope: Deactivated successfully.
Jan 23 04:38:10 np0005593234 podman[244549]: 2026-01-23 09:38:10.830684646 +0000 UTC m=+0.042001202 container died 4b73412f49bf7af5d9eaecacaee7de08f7ac840c4af6cfc8e0ef0a8515765b61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:38:10 np0005593234 NetworkManager[48942]: <info>  [1769161090.8513] manager: (tap6c46396c-25): new Tun device (/org/freedesktop/NetworkManager/Devices/60)
Jan 23 04:38:10 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4b73412f49bf7af5d9eaecacaee7de08f7ac840c4af6cfc8e0ef0a8515765b61-userdata-shm.mount: Deactivated successfully.
Jan 23 04:38:10 np0005593234 systemd[1]: var-lib-containers-storage-overlay-b4e44fd7a179bc6f2fbf9b7b34047814c49b1745e922828ab51c1e8f2c428568-merged.mount: Deactivated successfully.
Jan 23 04:38:10 np0005593234 podman[244549]: 2026-01-23 09:38:10.878112146 +0000 UTC m=+0.089428672 container cleanup 4b73412f49bf7af5d9eaecacaee7de08f7ac840c4af6cfc8e0ef0a8515765b61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.877 227766 INFO nova.virt.libvirt.driver [-] [instance: e8375e53-0781-4214-93e9-725707aab45d] Instance destroyed successfully.#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.878 227766 DEBUG nova.objects.instance [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lazy-loading 'resources' on Instance uuid e8375e53-0781-4214-93e9-725707aab45d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:10 np0005593234 systemd[1]: libpod-conmon-4b73412f49bf7af5d9eaecacaee7de08f7ac840c4af6cfc8e0ef0a8515765b61.scope: Deactivated successfully.
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.894 227766 DEBUG nova.virt.libvirt.vif [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:37:56Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesNegativeTestJSON-server-1271327709',display_name='tempest-ImagesNegativeTestJSON-server-1271327709',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesnegativetestjson-server-1271327709',id=34,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:38:08Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6efe15a8c8b44f02b78e989774efff46',ramdisk_id='',reservation_id='r-gme042xa',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',ima
ge_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesNegativeTestJSON-79321248',owner_user_name='tempest-ImagesNegativeTestJSON-79321248-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:38:08Z,user_data=None,user_id='0b90ffd889434d4992770d9c8694044d',uuid=e8375e53-0781-4214-93e9-725707aab45d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "address": "fa:16:3e:ab:f6:9f", "network": {"id": "1b41d31a-47ee-41d6-9860-bbbbe4b282f2", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1925335251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6efe15a8c8b44f02b78e989774efff46", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46396c-25", "ovs_interfaceid": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.894 227766 DEBUG nova.network.os_vif_util [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Converting VIF {"id": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "address": "fa:16:3e:ab:f6:9f", "network": {"id": "1b41d31a-47ee-41d6-9860-bbbbe4b282f2", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1925335251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6efe15a8c8b44f02b78e989774efff46", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46396c-25", "ovs_interfaceid": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.895 227766 DEBUG nova.network.os_vif_util [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:f6:9f,bridge_name='br-int',has_traffic_filtering=True,id=6c46396c-25f3-4442-b5bf-7ba6361eed17,network=Network(1b41d31a-47ee-41d6-9860-bbbbe4b282f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46396c-25') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.895 227766 DEBUG os_vif [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:f6:9f,bridge_name='br-int',has_traffic_filtering=True,id=6c46396c-25f3-4442-b5bf-7ba6361eed17,network=Network(1b41d31a-47ee-41d6-9860-bbbbe4b282f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46396c-25') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.897 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.897 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c46396c-25, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.898 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.900 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.903 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.905 227766 INFO os_vif [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:f6:9f,bridge_name='br-int',has_traffic_filtering=True,id=6c46396c-25f3-4442-b5bf-7ba6361eed17,network=Network(1b41d31a-47ee-41d6-9860-bbbbe4b282f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6c46396c-25')#033[00m
Jan 23 04:38:10 np0005593234 podman[244589]: 2026-01-23 09:38:10.94069294 +0000 UTC m=+0.040514686 container remove 4b73412f49bf7af5d9eaecacaee7de08f7ac840c4af6cfc8e0ef0a8515765b61 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:38:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:10.946 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[faf1ec7f-d8a2-4d54-a6c0-4967bb6d6346]: (4, ('Fri Jan 23 09:38:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2 (4b73412f49bf7af5d9eaecacaee7de08f7ac840c4af6cfc8e0ef0a8515765b61)\n4b73412f49bf7af5d9eaecacaee7de08f7ac840c4af6cfc8e0ef0a8515765b61\nFri Jan 23 09:38:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2 (4b73412f49bf7af5d9eaecacaee7de08f7ac840c4af6cfc8e0ef0a8515765b61)\n4b73412f49bf7af5d9eaecacaee7de08f7ac840c4af6cfc8e0ef0a8515765b61\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:10.947 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6d207873-ff5a-4d8d-b2a9-dfa86fc5b38f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:10.948 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b41d31a-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.950 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:10 np0005593234 kernel: tap1b41d31a-40: left promiscuous mode
Jan 23 04:38:10 np0005593234 nova_compute[227762]: 2026-01-23 09:38:10.967 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:10.970 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bdc62642-dd82-4646-bd37-e01b0774fd68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:10.985 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d45eca0f-47ce-4ee1-ac28-f318890a9865]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:10.986 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a6636c5e-2fa9-4738-9d9a-87de32e48689]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:11.000 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1b43a935-b244-460a-b1fa-54d0f6ed7230]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498253, 'reachable_time': 32444, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244626, 'error': None, 'target': 'ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:11.004 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1b41d31a-47ee-41d6-9860-bbbbe4b282f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:38:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:11.004 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[3307b87a-3dd8-4176-ad0f-a389eaa933c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:11 np0005593234 systemd[1]: run-netns-ovnmeta\x2d1b41d31a\x2d47ee\x2d41d6\x2d9860\x2dbbbbe4b282f2.mount: Deactivated successfully.
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.264 227766 INFO nova.virt.libvirt.driver [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Deleting instance files /var/lib/nova/instances/e8375e53-0781-4214-93e9-725707aab45d_del#033[00m
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.265 227766 INFO nova.virt.libvirt.driver [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Deletion of /var/lib/nova/instances/e8375e53-0781-4214-93e9-725707aab45d_del complete#033[00m
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.527 227766 INFO nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Creating config drive at /var/lib/nova/instances/52908364-c256-4f35-8ea4-1904a14fa399/disk.config#033[00m
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.533 227766 DEBUG oslo_concurrency.processutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/52908364-c256-4f35-8ea4-1904a14fa399/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpag__5sq2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.661 227766 DEBUG oslo_concurrency.processutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/52908364-c256-4f35-8ea4-1904a14fa399/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpag__5sq2" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.691 227766 DEBUG nova.storage.rbd_utils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] rbd image 52908364-c256-4f35-8ea4-1904a14fa399_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.695 227766 DEBUG oslo_concurrency.processutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/52908364-c256-4f35-8ea4-1904a14fa399/disk.config 52908364-c256-4f35-8ea4-1904a14fa399_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.718 227766 DEBUG nova.network.neutron [req-29068447-5e4a-437e-92d0-784e8add60f7 req-7e744c9b-fae9-4275-b94b-e97dc9d4999d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Updated VIF entry in instance network info cache for port 805027f4-e7d4-48b3-9fac-a3e7901dbd9f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.719 227766 DEBUG nova.network.neutron [req-29068447-5e4a-437e-92d0-784e8add60f7 req-7e744c9b-fae9-4275-b94b-e97dc9d4999d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Updating instance_info_cache with network_info: [{"id": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "address": "fa:16:3e:36:e2:ca", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap805027f4-e7", "ovs_interfaceid": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.759 227766 INFO nova.virt.libvirt.driver [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.763 227766 DEBUG nova.compute.manager [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.889 227766 DEBUG oslo_concurrency.processutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/52908364-c256-4f35-8ea4-1904a14fa399/disk.config 52908364-c256-4f35-8ea4-1904a14fa399_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.194s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.890 227766 INFO nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Deleting local config drive /var/lib/nova/instances/52908364-c256-4f35-8ea4-1904a14fa399/disk.config because it was imported into RBD.#033[00m
Jan 23 04:38:11 np0005593234 systemd-udevd[244531]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:38:11 np0005593234 kernel: tap805027f4-e7: entered promiscuous mode
Jan 23 04:38:11 np0005593234 NetworkManager[48942]: <info>  [1769161091.9400] manager: (tap805027f4-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/61)
Jan 23 04:38:11 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:11Z|00096|binding|INFO|Claiming lport 805027f4-e7d4-48b3-9fac-a3e7901dbd9f for this chassis.
Jan 23 04:38:11 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:11Z|00097|binding|INFO|805027f4-e7d4-48b3-9fac-a3e7901dbd9f: Claiming fa:16:3e:36:e2:ca 10.100.0.10
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.941 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:11 np0005593234 NetworkManager[48942]: <info>  [1769161091.9507] device (tap805027f4-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:38:11 np0005593234 NetworkManager[48942]: <info>  [1769161091.9517] device (tap805027f4-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:38:11 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:11Z|00098|binding|INFO|Setting lport 805027f4-e7d4-48b3-9fac-a3e7901dbd9f ovn-installed in OVS
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.960 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:11 np0005593234 nova_compute[227762]: 2026-01-23 09:38:11.962 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:11 np0005593234 systemd-machined[195626]: New machine qemu-17-instance-00000023.
Jan 23 04:38:11 np0005593234 systemd[1]: Started Virtual Machine qemu-17-instance-00000023.
Jan 23 04:38:12 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:12Z|00099|binding|INFO|Setting lport 805027f4-e7d4-48b3-9fac-a3e7901dbd9f up in Southbound
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.187 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:e2:ca 10.100.0.10'], port_security=['fa:16:3e:36:e2:ca 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '52908364-c256-4f35-8ea4-1904a14fa399', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a5f46b255cd4387bd3e4c0acaa39466', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7d939c30-94ef-4237-8ee8-7374d4fefcd9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bd55a4d-ba72-4dcd-bf4e-ec1dab31b370, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=805027f4-e7d4-48b3-9fac-a3e7901dbd9f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.188 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 805027f4-e7d4-48b3-9fac-a3e7901dbd9f in datapath 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c bound to our chassis#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.190 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.200 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bfdf1e6f-2dfd-49cf-8f76-5bbcbb4baaf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.201 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1f2b13ad-71 in ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.203 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1f2b13ad-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.203 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ac41c4a6-1710-470f-825a-1c77f69bb72b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.204 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[82bca535-7409-4fac-99bb-4c0ea57c7bd6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.212 227766 INFO nova.compute.manager [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Took 1.58 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.216 227766 DEBUG oslo.service.loopingcall [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.235 227766 DEBUG nova.compute.manager [-] [instance: e8375e53-0781-4214-93e9-725707aab45d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.235 227766 DEBUG nova.network.neutron [-] [instance: e8375e53-0781-4214-93e9-725707aab45d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.237 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[4c218a71-44b0-4b82-bf66-79492d230dbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.240 227766 DEBUG nova.objects.instance [None req-5be3b4d0-1559-4f86-9a92-4b6b08004a67 933b0942f8ee41568f9bab0377f99d4a cf9f72c217124dc988cfe3d1b549fa02 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.242 227766 DEBUG oslo_concurrency.lockutils [req-29068447-5e4a-437e-92d0-784e8add60f7 req-7e744c9b-fae9-4275-b94b-e97dc9d4999d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-52908364-c256-4f35-8ea4-1904a14fa399" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.245 227766 DEBUG nova.compute.manager [req-156aa33d-c6f9-4f85-8243-3bf73a23713e req-b4c26264-1d8a-40b2-8fb4-84dd6fe88313 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Received event network-vif-unplugged-6c46396c-25f3-4442-b5bf-7ba6361eed17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.246 227766 DEBUG oslo_concurrency.lockutils [req-156aa33d-c6f9-4f85-8243-3bf73a23713e req-b4c26264-1d8a-40b2-8fb4-84dd6fe88313 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "e8375e53-0781-4214-93e9-725707aab45d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.246 227766 DEBUG oslo_concurrency.lockutils [req-156aa33d-c6f9-4f85-8243-3bf73a23713e req-b4c26264-1d8a-40b2-8fb4-84dd6fe88313 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.246 227766 DEBUG oslo_concurrency.lockutils [req-156aa33d-c6f9-4f85-8243-3bf73a23713e req-b4c26264-1d8a-40b2-8fb4-84dd6fe88313 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.246 227766 DEBUG nova.compute.manager [req-156aa33d-c6f9-4f85-8243-3bf73a23713e req-b4c26264-1d8a-40b2-8fb4-84dd6fe88313 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] No waiting events found dispatching network-vif-unplugged-6c46396c-25f3-4442-b5bf-7ba6361eed17 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.247 227766 DEBUG nova.compute.manager [req-156aa33d-c6f9-4f85-8243-3bf73a23713e req-b4c26264-1d8a-40b2-8fb4-84dd6fe88313 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Received event network-vif-unplugged-6c46396c-25f3-4442-b5bf-7ba6361eed17 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.253 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b45e144a-6865-4324-a74f-406b328d370f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.281 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d80446d1-6bfc-4ea1-bee1-6a41b5514e0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 NetworkManager[48942]: <info>  [1769161092.2885] manager: (tap1f2b13ad-70): new Veth device (/org/freedesktop/NetworkManager/Devices/62)
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.290 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[eba879ca-9f7a-4ff8-8d58-2981cb3b142c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.321 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7e323f2b-1ed6-4878-9bbf-7351f0227046]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.323 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d4b7e8a4-df7f-4c56-a7c8-9c1a702765c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 NetworkManager[48942]: <info>  [1769161092.3444] device (tap1f2b13ad-70): carrier: link connected
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.350 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4834abba-c557-4e0e-b904-3a584e08df63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:12.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.366 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[160eb676-9d5a-4169-b907-61be1cacd9db]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f2b13ad-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:78:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498730, 'reachable_time': 21933, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 244763, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.382 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3f1736ba-d58d-45a6-adc1-ea9dacb2bdb8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec8:78b8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 498730, 'tstamp': 498730}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 244764, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.399 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[71860fac-2a53-4ce4-b155-b265c9af749d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1f2b13ad-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c8:78:b8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498730, 'reachable_time': 21933, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 244765, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.427 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[634f28c8-9950-4fd0-bf32-9fbf6927c28f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.485 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[33fb98e1-b04e-4f26-b8a1-620adcaf2195]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.487 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f2b13ad-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.487 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.487 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f2b13ad-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:12 np0005593234 kernel: tap1f2b13ad-70: entered promiscuous mode
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.490 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:12 np0005593234 NetworkManager[48942]: <info>  [1769161092.4914] manager: (tap1f2b13ad-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.494 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1f2b13ad-70, col_values=(('external_ids', {'iface-id': '5880c863-f7b0-4399-b221-f31849823320'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:12 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:12Z|00100|binding|INFO|Releasing lport 5880c863-f7b0-4399-b221-f31849823320 from this chassis (sb_readonly=0)
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.496 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.499 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.503 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ab944888-693d-4111-b4ce-11a24276aa54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.503 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c.pid.haproxy
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:38:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:12.504 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'env', 'PROCESS_TAG=haproxy-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.511 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.766 227766 DEBUG nova.compute.manager [req-25b5b226-9bfd-4b3a-9d0c-12f3f2e76e60 req-23c96503-327c-4a66-8309-2e0f885f009e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Received event network-vif-plugged-805027f4-e7d4-48b3-9fac-a3e7901dbd9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.767 227766 DEBUG oslo_concurrency.lockutils [req-25b5b226-9bfd-4b3a-9d0c-12f3f2e76e60 req-23c96503-327c-4a66-8309-2e0f885f009e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "52908364-c256-4f35-8ea4-1904a14fa399-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.767 227766 DEBUG oslo_concurrency.lockutils [req-25b5b226-9bfd-4b3a-9d0c-12f3f2e76e60 req-23c96503-327c-4a66-8309-2e0f885f009e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.768 227766 DEBUG oslo_concurrency.lockutils [req-25b5b226-9bfd-4b3a-9d0c-12f3f2e76e60 req-23c96503-327c-4a66-8309-2e0f885f009e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.768 227766 DEBUG nova.compute.manager [req-25b5b226-9bfd-4b3a-9d0c-12f3f2e76e60 req-23c96503-327c-4a66-8309-2e0f885f009e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Processing event network-vif-plugged-805027f4-e7d4-48b3-9fac-a3e7901dbd9f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:38:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:12.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.845 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161092.8453643, 52908364-c256-4f35-8ea4-1904a14fa399 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.846 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] VM Started (Lifecycle Event)#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.850 227766 DEBUG nova.compute.manager [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.853 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.856 227766 INFO nova.virt.libvirt.driver [-] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Instance spawned successfully.#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.856 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:38:12 np0005593234 podman[244839]: 2026-01-23 09:38:12.878172135 +0000 UTC m=+0.051623862 container create 92cc1dbfbaf1bf4d8d68e9cafe64fdb197dc0902831187165831bdb12cbe3140 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.907 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.911 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.911 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.912 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.912 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.913 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.913 227766 DEBUG nova.virt.libvirt.driver [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:38:12 np0005593234 systemd[1]: Started libpod-conmon-92cc1dbfbaf1bf4d8d68e9cafe64fdb197dc0902831187165831bdb12cbe3140.scope.
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.917 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:38:12 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:38:12 np0005593234 podman[244839]: 2026-01-23 09:38:12.853511115 +0000 UTC m=+0.026962852 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:38:12 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60c1ad6f58447939381a32aa37e51b4b6f3d6df1bc60cac87b18765a1f3e4371/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:38:12 np0005593234 podman[244839]: 2026-01-23 09:38:12.96227394 +0000 UTC m=+0.135725667 container init 92cc1dbfbaf1bf4d8d68e9cafe64fdb197dc0902831187165831bdb12cbe3140 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:38:12 np0005593234 podman[244839]: 2026-01-23 09:38:12.967592616 +0000 UTC m=+0.141044343 container start 92cc1dbfbaf1bf4d8d68e9cafe64fdb197dc0902831187165831bdb12cbe3140 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 23 04:38:12 np0005593234 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[244854]: [NOTICE]   (244858) : New worker (244860) forked
Jan 23 04:38:12 np0005593234 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[244854]: [NOTICE]   (244858) : Loading success.
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.989 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.990 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161092.8455443, 52908364-c256-4f35-8ea4-1904a14fa399 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:38:12 np0005593234 nova_compute[227762]: 2026-01-23 09:38:12.990 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:38:13 np0005593234 nova_compute[227762]: 2026-01-23 09:38:13.081 227766 INFO nova.compute.manager [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Took 8.66 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:38:13 np0005593234 nova_compute[227762]: 2026-01-23 09:38:13.084 227766 DEBUG nova.compute.manager [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:13 np0005593234 nova_compute[227762]: 2026-01-23 09:38:13.087 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:13 np0005593234 nova_compute[227762]: 2026-01-23 09:38:13.093 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161092.8527818, 52908364-c256-4f35-8ea4-1904a14fa399 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:38:13 np0005593234 nova_compute[227762]: 2026-01-23 09:38:13.094 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:38:13 np0005593234 nova_compute[227762]: 2026-01-23 09:38:13.131 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:13 np0005593234 nova_compute[227762]: 2026-01-23 09:38:13.160 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:13 np0005593234 nova_compute[227762]: 2026-01-23 09:38:13.163 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:38:13 np0005593234 nova_compute[227762]: 2026-01-23 09:38:13.241 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:38:13 np0005593234 nova_compute[227762]: 2026-01-23 09:38:13.309 227766 INFO nova.compute.manager [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Took 10.32 seconds to build instance.#033[00m
Jan 23 04:38:13 np0005593234 nova_compute[227762]: 2026-01-23 09:38:13.374 227766 DEBUG oslo_concurrency.lockutils [None req-72c30c51-7ce5-41ca-b538-8e375d96cf7b 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:38:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:14.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:38:14 np0005593234 podman[244870]: 2026-01-23 09:38:14.7512418 +0000 UTC m=+0.045366797 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 04:38:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:38:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:14.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:38:14 np0005593234 nova_compute[227762]: 2026-01-23 09:38:14.886 227766 DEBUG nova.compute.manager [req-793ba24c-844f-42eb-811a-25710602ebb8 req-6113ce84-5ff2-4ead-aee1-e699ca0e889a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Received event network-vif-plugged-6c46396c-25f3-4442-b5bf-7ba6361eed17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:14 np0005593234 nova_compute[227762]: 2026-01-23 09:38:14.886 227766 DEBUG oslo_concurrency.lockutils [req-793ba24c-844f-42eb-811a-25710602ebb8 req-6113ce84-5ff2-4ead-aee1-e699ca0e889a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "e8375e53-0781-4214-93e9-725707aab45d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:14 np0005593234 nova_compute[227762]: 2026-01-23 09:38:14.886 227766 DEBUG oslo_concurrency.lockutils [req-793ba24c-844f-42eb-811a-25710602ebb8 req-6113ce84-5ff2-4ead-aee1-e699ca0e889a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:14 np0005593234 nova_compute[227762]: 2026-01-23 09:38:14.886 227766 DEBUG oslo_concurrency.lockutils [req-793ba24c-844f-42eb-811a-25710602ebb8 req-6113ce84-5ff2-4ead-aee1-e699ca0e889a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:14 np0005593234 nova_compute[227762]: 2026-01-23 09:38:14.887 227766 DEBUG nova.compute.manager [req-793ba24c-844f-42eb-811a-25710602ebb8 req-6113ce84-5ff2-4ead-aee1-e699ca0e889a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] No waiting events found dispatching network-vif-plugged-6c46396c-25f3-4442-b5bf-7ba6361eed17 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:38:14 np0005593234 nova_compute[227762]: 2026-01-23 09:38:14.887 227766 WARNING nova.compute.manager [req-793ba24c-844f-42eb-811a-25710602ebb8 req-6113ce84-5ff2-4ead-aee1-e699ca0e889a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Received unexpected event network-vif-plugged-6c46396c-25f3-4442-b5bf-7ba6361eed17 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:38:15 np0005593234 nova_compute[227762]: 2026-01-23 09:38:15.121 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: e8375e53-0781-4214-93e9-725707aab45d] Updating instance_info_cache with network_info: [{"id": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "address": "fa:16:3e:ab:f6:9f", "network": {"id": "1b41d31a-47ee-41d6-9860-bbbbe4b282f2", "bridge": "br-int", "label": "tempest-ImagesNegativeTestJSON-1925335251-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6efe15a8c8b44f02b78e989774efff46", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6c46396c-25", "ovs_interfaceid": "6c46396c-25f3-4442-b5bf-7ba6361eed17", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:38:15 np0005593234 nova_compute[227762]: 2026-01-23 09:38:15.900 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:15 np0005593234 nova_compute[227762]: 2026-01-23 09:38:15.910 227766 DEBUG nova.compute.manager [req-711c394a-19f6-42a3-8abe-244fce0ca821 req-cce65117-dc52-457a-9ec6-772425e5b441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Received event network-vif-plugged-805027f4-e7d4-48b3-9fac-a3e7901dbd9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:15 np0005593234 nova_compute[227762]: 2026-01-23 09:38:15.910 227766 DEBUG oslo_concurrency.lockutils [req-711c394a-19f6-42a3-8abe-244fce0ca821 req-cce65117-dc52-457a-9ec6-772425e5b441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "52908364-c256-4f35-8ea4-1904a14fa399-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:15 np0005593234 nova_compute[227762]: 2026-01-23 09:38:15.911 227766 DEBUG oslo_concurrency.lockutils [req-711c394a-19f6-42a3-8abe-244fce0ca821 req-cce65117-dc52-457a-9ec6-772425e5b441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:15 np0005593234 nova_compute[227762]: 2026-01-23 09:38:15.911 227766 DEBUG oslo_concurrency.lockutils [req-711c394a-19f6-42a3-8abe-244fce0ca821 req-cce65117-dc52-457a-9ec6-772425e5b441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:15 np0005593234 nova_compute[227762]: 2026-01-23 09:38:15.911 227766 DEBUG nova.compute.manager [req-711c394a-19f6-42a3-8abe-244fce0ca821 req-cce65117-dc52-457a-9ec6-772425e5b441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] No waiting events found dispatching network-vif-plugged-805027f4-e7d4-48b3-9fac-a3e7901dbd9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:38:15 np0005593234 nova_compute[227762]: 2026-01-23 09:38:15.912 227766 WARNING nova.compute.manager [req-711c394a-19f6-42a3-8abe-244fce0ca821 req-cce65117-dc52-457a-9ec6-772425e5b441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Received unexpected event network-vif-plugged-805027f4-e7d4-48b3-9fac-a3e7901dbd9f for instance with vm_state active and task_state None.#033[00m
Jan 23 04:38:15 np0005593234 nova_compute[227762]: 2026-01-23 09:38:15.914 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-e8375e53-0781-4214-93e9-725707aab45d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:38:15 np0005593234 nova_compute[227762]: 2026-01-23 09:38:15.915 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: e8375e53-0781-4214-93e9-725707aab45d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:38:15 np0005593234 nova_compute[227762]: 2026-01-23 09:38:15.915 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:15 np0005593234 nova_compute[227762]: 2026-01-23 09:38:15.916 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:15 np0005593234 nova_compute[227762]: 2026-01-23 09:38:15.916 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:16 np0005593234 nova_compute[227762]: 2026-01-23 09:38:16.005 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:16 np0005593234 nova_compute[227762]: 2026-01-23 09:38:16.006 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:16 np0005593234 nova_compute[227762]: 2026-01-23 09:38:16.006 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:16 np0005593234 nova_compute[227762]: 2026-01-23 09:38:16.006 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:38:16 np0005593234 nova_compute[227762]: 2026-01-23 09:38:16.007 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:16 np0005593234 nova_compute[227762]: 2026-01-23 09:38:16.050 227766 DEBUG nova.network.neutron [-] [instance: e8375e53-0781-4214-93e9-725707aab45d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:38:16 np0005593234 nova_compute[227762]: 2026-01-23 09:38:16.125 227766 INFO nova.compute.manager [-] [instance: e8375e53-0781-4214-93e9-725707aab45d] Took 3.89 seconds to deallocate network for instance.#033[00m
Jan 23 04:38:16 np0005593234 nova_compute[227762]: 2026-01-23 09:38:16.259 227766 DEBUG oslo_concurrency.lockutils [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:16 np0005593234 nova_compute[227762]: 2026-01-23 09:38:16.260 227766 DEBUG oslo_concurrency.lockutils [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:38:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:16.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:38:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:38:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1875989465' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:38:16 np0005593234 nova_compute[227762]: 2026-01-23 09:38:16.532 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:16 np0005593234 nova_compute[227762]: 2026-01-23 09:38:16.723 227766 DEBUG oslo_concurrency.processutils [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:16.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.030 227766 DEBUG nova.compute.manager [req-5bb801ee-f2c0-4a3d-bb8e-b4244d3fdccb req-a7f03164-e965-41ae-8d67-d58c5d0e707e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: e8375e53-0781-4214-93e9-725707aab45d] Received event network-vif-deleted-6c46396c-25f3-4442-b5bf-7ba6361eed17 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:38:17 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/82722639' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.126 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000023 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.127 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000023 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.132 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000001d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.133 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000001d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.143 227766 DEBUG oslo_concurrency.processutils [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.151 227766 DEBUG nova.compute.provider_tree [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.203 227766 DEBUG nova.scheduler.client.report [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.261 227766 DEBUG oslo_concurrency.lockutils [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.338 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.340 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4392MB free_disk=20.78537368774414GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.340 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.341 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.347 227766 INFO nova.scheduler.client.report [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Deleted allocations for instance e8375e53-0781-4214-93e9-725707aab45d#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.518 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Applying migration context for instance 261ab1ec-f79b-4867-bcb6-1c1d7491120e as it has an incoming, in-progress migration 6e07873c-3bce-4fe1-8af8-b92dbe7fb0df. Migration status is running _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.519 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.520 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.589 227766 DEBUG oslo_concurrency.lockutils [None req-aef46b9f-e7a7-49ba-99bb-1d3035d51602 0b90ffd889434d4992770d9c8694044d 6efe15a8c8b44f02b78e989774efff46 - - default default] Lock "e8375e53-0781-4214-93e9-725707aab45d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.958s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.635 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 261ab1ec-f79b-4867-bcb6-1c1d7491120e actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.635 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 52908364-c256-4f35-8ea4-1904a14fa399 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.636 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.636 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:38:17 np0005593234 nova_compute[227762]: 2026-01-23 09:38:17.821 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:18 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:18Z|00101|binding|INFO|Releasing lport 5880c863-f7b0-4399-b221-f31849823320 from this chassis (sb_readonly=0)
Jan 23 04:38:18 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:18Z|00102|binding|INFO|Releasing lport b545a870-aa18-4f64-a8a7-f8512824c4cc from this chassis (sb_readonly=0)
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.046 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:38:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2805103317' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.262 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.269 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.307 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.319 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:18 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:18Z|00103|binding|INFO|Releasing lport 5880c863-f7b0-4399-b221-f31849823320 from this chassis (sb_readonly=0)
Jan 23 04:38:18 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:18Z|00104|binding|INFO|Releasing lport b545a870-aa18-4f64-a8a7-f8512824c4cc from this chassis (sb_readonly=0)
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.337 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:18.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.461 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.462 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.463 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.463 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.490 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.491 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.491 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.536 227766 DEBUG oslo_concurrency.lockutils [None req-2f30f094-ba33-4111-b5b6-c424fe38a546 7ef4bd97218845f7ab8bcdbdae714c9a f15de556837a4728b2af7d6cfc3cbfaf - - default default] Acquiring lock "refresh_cache-52908364-c256-4f35-8ea4-1904a14fa399" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.536 227766 DEBUG oslo_concurrency.lockutils [None req-2f30f094-ba33-4111-b5b6-c424fe38a546 7ef4bd97218845f7ab8bcdbdae714c9a f15de556837a4728b2af7d6cfc3cbfaf - - default default] Acquired lock "refresh_cache-52908364-c256-4f35-8ea4-1904a14fa399" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.537 227766 DEBUG nova.network.neutron [None req-2f30f094-ba33-4111-b5b6-c424fe38a546 7ef4bd97218845f7ab8bcdbdae714c9a f15de556837a4728b2af7d6cfc3cbfaf - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.542 227766 DEBUG oslo_concurrency.lockutils [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquiring lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.542 227766 DEBUG oslo_concurrency.lockutils [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.543 227766 DEBUG oslo_concurrency.lockutils [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquiring lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.543 227766 DEBUG oslo_concurrency.lockutils [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.543 227766 DEBUG oslo_concurrency.lockutils [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.545 227766 INFO nova.compute.manager [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Terminating instance#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.546 227766 DEBUG nova.compute.manager [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:38:18 np0005593234 kernel: tap27e277b3-21 (unregistering): left promiscuous mode
Jan 23 04:38:18 np0005593234 NetworkManager[48942]: <info>  [1769161098.6183] device (tap27e277b3-21): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:38:18 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:18Z|00105|binding|INFO|Releasing lport 27e277b3-2135-4e3e-b336-e0da87509465 from this chassis (sb_readonly=0)
Jan 23 04:38:18 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:18Z|00106|binding|INFO|Setting lport 27e277b3-2135-4e3e-b336-e0da87509465 down in Southbound
Jan 23 04:38:18 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:18Z|00107|binding|INFO|Removing iface tap27e277b3-21 ovn-installed in OVS
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.627 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:18.641 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:06:0e 10.100.0.11'], port_security=['fa:16:3e:34:06:0e 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '261ab1ec-f79b-4867-bcb6-1c1d7491120e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd0dce6e339c349d4ab97cee5e49fff3a', 'neutron:revision_number': '22', 'neutron:security_group_ids': '0179c400-b2f2-4914-b563-942a61ef1858', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cbb60528-b878-42fd-9c2f-0a3345010b1a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=27e277b3-2135-4e3e-b336-e0da87509465) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.644 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:18.643 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 27e277b3-2135-4e3e-b336-e0da87509465 in datapath 8eab8076-0848-4daf-bbac-f3f8b65ca750 unbound from our chassis#033[00m
Jan 23 04:38:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:18.646 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8eab8076-0848-4daf-bbac-f3f8b65ca750, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:38:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:18.648 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b34b9662-2a31-497d-a34c-f3790e97a859]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:18.649 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 namespace which is not needed anymore#033[00m
Jan 23 04:38:18 np0005593234 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001d.scope: Deactivated successfully.
Jan 23 04:38:18 np0005593234 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001d.scope: Consumed 1.949s CPU time.
Jan 23 04:38:18 np0005593234 systemd-machined[195626]: Machine qemu-15-instance-0000001d terminated.
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.762 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.769 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.776 227766 INFO nova.virt.libvirt.driver [-] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Instance destroyed successfully.#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.777 227766 DEBUG nova.objects.instance [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lazy-loading 'resources' on Instance uuid 261ab1ec-f79b-4867-bcb6-1c1d7491120e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:38:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:38:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:18.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:38:18 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[244250]: [NOTICE]   (244254) : haproxy version is 2.8.14-c23fe91
Jan 23 04:38:18 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[244250]: [NOTICE]   (244254) : path to executable is /usr/sbin/haproxy
Jan 23 04:38:18 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[244250]: [WARNING]  (244254) : Exiting Master process...
Jan 23 04:38:18 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[244250]: [WARNING]  (244254) : Exiting Master process...
Jan 23 04:38:18 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[244250]: [ALERT]    (244254) : Current worker (244256) exited with code 143 (Terminated)
Jan 23 04:38:18 np0005593234 neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750[244250]: [WARNING]  (244254) : All workers exited. Exiting... (0)
Jan 23 04:38:18 np0005593234 systemd[1]: libpod-10b198ef7c9966bb20a2a9bd4638dc189529376ccfb13926cd5484eae9f0c1b1.scope: Deactivated successfully.
Jan 23 04:38:18 np0005593234 podman[244983]: 2026-01-23 09:38:18.826707051 +0000 UTC m=+0.083696264 container died 10b198ef7c9966bb20a2a9bd4638dc189529376ccfb13926cd5484eae9f0c1b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.848 227766 DEBUG nova.virt.libvirt.vif [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:37:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-724421301',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-724421301',id=29,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:37:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d0dce6e339c349d4ab97cee5e49fff3a',ramdisk_id='',reservation_id='r-106tqp53',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='2',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1207260646',owne
r_user_name='tempest-LiveAutoBlockMigrationV225Test-1207260646-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:38:12Z,user_data=None,user_id='4f72965e950c4761bfedd99fdc411a83',uuid=261ab1ec-f79b-4867-bcb6-1c1d7491120e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.849 227766 DEBUG nova.network.os_vif_util [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Converting VIF {"id": "27e277b3-2135-4e3e-b336-e0da87509465", "address": "fa:16:3e:34:06:0e", "network": {"id": "8eab8076-0848-4daf-bbac-f3f8b65ca750", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1369330040-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d0dce6e339c349d4ab97cee5e49fff3a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap27e277b3-21", "ovs_interfaceid": "27e277b3-2135-4e3e-b336-e0da87509465", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.850 227766 DEBUG nova.network.os_vif_util [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:34:06:0e,bridge_name='br-int',has_traffic_filtering=True,id=27e277b3-2135-4e3e-b336-e0da87509465,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27e277b3-21') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.850 227766 DEBUG os_vif [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:06:0e,bridge_name='br-int',has_traffic_filtering=True,id=27e277b3-2135-4e3e-b336-e0da87509465,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27e277b3-21') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.852 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.852 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap27e277b3-21, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.854 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.856 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:18 np0005593234 nova_compute[227762]: 2026-01-23 09:38:18.859 227766 INFO os_vif [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:06:0e,bridge_name='br-int',has_traffic_filtering=True,id=27e277b3-2135-4e3e-b336-e0da87509465,network=Network(8eab8076-0848-4daf-bbac-f3f8b65ca750),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap27e277b3-21')#033[00m
Jan 23 04:38:18 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-10b198ef7c9966bb20a2a9bd4638dc189529376ccfb13926cd5484eae9f0c1b1-userdata-shm.mount: Deactivated successfully.
Jan 23 04:38:18 np0005593234 systemd[1]: var-lib-containers-storage-overlay-c7e94588b446a6e33816455026bc4edc3bb9e14cedb0e85cec3d198149c28eb2-merged.mount: Deactivated successfully.
Jan 23 04:38:18 np0005593234 podman[244983]: 2026-01-23 09:38:18.928892711 +0000 UTC m=+0.185881894 container cleanup 10b198ef7c9966bb20a2a9bd4638dc189529376ccfb13926cd5484eae9f0c1b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 04:38:18 np0005593234 systemd[1]: libpod-conmon-10b198ef7c9966bb20a2a9bd4638dc189529376ccfb13926cd5484eae9f0c1b1.scope: Deactivated successfully.
Jan 23 04:38:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:19 np0005593234 podman[245037]: 2026-01-23 09:38:19.070909914 +0000 UTC m=+0.122302309 container remove 10b198ef7c9966bb20a2a9bd4638dc189529376ccfb13926cd5484eae9f0c1b1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:38:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:19.076 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[917bab08-ac64-4a12-a023-9ebe7bbe93cc]: (4, ('Fri Jan 23 09:38:18 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 (10b198ef7c9966bb20a2a9bd4638dc189529376ccfb13926cd5484eae9f0c1b1)\n10b198ef7c9966bb20a2a9bd4638dc189529376ccfb13926cd5484eae9f0c1b1\nFri Jan 23 09:38:18 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 (10b198ef7c9966bb20a2a9bd4638dc189529376ccfb13926cd5484eae9f0c1b1)\n10b198ef7c9966bb20a2a9bd4638dc189529376ccfb13926cd5484eae9f0c1b1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:19.078 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c5930566-0ea5-4946-98ab-613ac9bcf700]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:19.079 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8eab8076-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:19 np0005593234 nova_compute[227762]: 2026-01-23 09:38:19.080 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:19 np0005593234 kernel: tap8eab8076-00: left promiscuous mode
Jan 23 04:38:19 np0005593234 nova_compute[227762]: 2026-01-23 09:38:19.082 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:19.094 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[42ce85a7-fca8-4ba9-8389-027cfca1e810]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:19 np0005593234 nova_compute[227762]: 2026-01-23 09:38:19.104 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:19.112 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e1ec494d-8e79-4d3c-acb5-27d594c19d14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:19.112 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f57e9228-707b-40a9-83be-bfb9ea701cc9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:19.133 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a866db60-5699-4b9b-aab5-d8fe5b203ced]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498065, 'reachable_time': 26631, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245052, 'error': None, 'target': 'ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:19.136 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8eab8076-0848-4daf-bbac-f3f8b65ca750 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:38:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:19.137 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[dbc03198-3a9c-4be5-8858-efc6595844fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:38:19 np0005593234 systemd[1]: run-netns-ovnmeta\x2d8eab8076\x2d0848\x2d4daf\x2dbbac\x2df3f8b65ca750.mount: Deactivated successfully.
Jan 23 04:38:19 np0005593234 nova_compute[227762]: 2026-01-23 09:38:19.231 227766 INFO nova.virt.libvirt.driver [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Deleting instance files /var/lib/nova/instances/261ab1ec-f79b-4867-bcb6-1c1d7491120e_del#033[00m
Jan 23 04:38:19 np0005593234 nova_compute[227762]: 2026-01-23 09:38:19.232 227766 INFO nova.virt.libvirt.driver [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Deletion of /var/lib/nova/instances/261ab1ec-f79b-4867-bcb6-1c1d7491120e_del complete#033[00m
Jan 23 04:38:19 np0005593234 nova_compute[227762]: 2026-01-23 09:38:19.487 227766 INFO nova.compute.manager [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Took 0.94 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:38:19 np0005593234 nova_compute[227762]: 2026-01-23 09:38:19.487 227766 DEBUG oslo.service.loopingcall [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:38:19 np0005593234 nova_compute[227762]: 2026-01-23 09:38:19.488 227766 DEBUG nova.compute.manager [-] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:38:19 np0005593234 nova_compute[227762]: 2026-01-23 09:38:19.488 227766 DEBUG nova.network.neutron [-] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:38:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:38:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:20.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:38:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:20.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:21 np0005593234 nova_compute[227762]: 2026-01-23 09:38:21.511 227766 DEBUG nova.compute.manager [req-041e7c2c-1e5a-410c-ac8a-b4f0afb43d2d req-b808d131-458e-4340-94b3-e340b68d0b91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-vif-unplugged-27e277b3-2135-4e3e-b336-e0da87509465 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:21 np0005593234 nova_compute[227762]: 2026-01-23 09:38:21.512 227766 DEBUG oslo_concurrency.lockutils [req-041e7c2c-1e5a-410c-ac8a-b4f0afb43d2d req-b808d131-458e-4340-94b3-e340b68d0b91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:21 np0005593234 nova_compute[227762]: 2026-01-23 09:38:21.512 227766 DEBUG oslo_concurrency.lockutils [req-041e7c2c-1e5a-410c-ac8a-b4f0afb43d2d req-b808d131-458e-4340-94b3-e340b68d0b91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:21 np0005593234 nova_compute[227762]: 2026-01-23 09:38:21.512 227766 DEBUG oslo_concurrency.lockutils [req-041e7c2c-1e5a-410c-ac8a-b4f0afb43d2d req-b808d131-458e-4340-94b3-e340b68d0b91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:21 np0005593234 nova_compute[227762]: 2026-01-23 09:38:21.512 227766 DEBUG nova.compute.manager [req-041e7c2c-1e5a-410c-ac8a-b4f0afb43d2d req-b808d131-458e-4340-94b3-e340b68d0b91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] No waiting events found dispatching network-vif-unplugged-27e277b3-2135-4e3e-b336-e0da87509465 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:38:21 np0005593234 nova_compute[227762]: 2026-01-23 09:38:21.512 227766 DEBUG nova.compute.manager [req-041e7c2c-1e5a-410c-ac8a-b4f0afb43d2d req-b808d131-458e-4340-94b3-e340b68d0b91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-vif-unplugged-27e277b3-2135-4e3e-b336-e0da87509465 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:38:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:22.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:22.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:23 np0005593234 nova_compute[227762]: 2026-01-23 09:38:23.322 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:23 np0005593234 nova_compute[227762]: 2026-01-23 09:38:23.501 227766 DEBUG nova.network.neutron [-] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:38:23 np0005593234 nova_compute[227762]: 2026-01-23 09:38:23.672 227766 INFO nova.compute.manager [-] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Took 4.18 seconds to deallocate network for instance.#033[00m
Jan 23 04:38:23 np0005593234 nova_compute[227762]: 2026-01-23 09:38:23.854 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:23 np0005593234 nova_compute[227762]: 2026-01-23 09:38:23.894 227766 DEBUG nova.compute.manager [req-74107f9f-773f-44af-93c1-2b2626ed3f61 req-0c0c0018-a5f2-4964-8d74-f192025cfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:23 np0005593234 nova_compute[227762]: 2026-01-23 09:38:23.894 227766 DEBUG oslo_concurrency.lockutils [req-74107f9f-773f-44af-93c1-2b2626ed3f61 req-0c0c0018-a5f2-4964-8d74-f192025cfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:23 np0005593234 nova_compute[227762]: 2026-01-23 09:38:23.895 227766 DEBUG oslo_concurrency.lockutils [req-74107f9f-773f-44af-93c1-2b2626ed3f61 req-0c0c0018-a5f2-4964-8d74-f192025cfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:23 np0005593234 nova_compute[227762]: 2026-01-23 09:38:23.895 227766 DEBUG oslo_concurrency.lockutils [req-74107f9f-773f-44af-93c1-2b2626ed3f61 req-0c0c0018-a5f2-4964-8d74-f192025cfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:23 np0005593234 nova_compute[227762]: 2026-01-23 09:38:23.896 227766 DEBUG nova.compute.manager [req-74107f9f-773f-44af-93c1-2b2626ed3f61 req-0c0c0018-a5f2-4964-8d74-f192025cfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] No waiting events found dispatching network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:38:23 np0005593234 nova_compute[227762]: 2026-01-23 09:38:23.896 227766 WARNING nova.compute.manager [req-74107f9f-773f-44af-93c1-2b2626ed3f61 req-0c0c0018-a5f2-4964-8d74-f192025cfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received unexpected event network-vif-plugged-27e277b3-2135-4e3e-b336-e0da87509465 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:38:23 np0005593234 nova_compute[227762]: 2026-01-23 09:38:23.896 227766 DEBUG nova.compute.manager [req-74107f9f-773f-44af-93c1-2b2626ed3f61 req-0c0c0018-a5f2-4964-8d74-f192025cfda7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Received event network-vif-deleted-27e277b3-2135-4e3e-b336-e0da87509465 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:38:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:24 np0005593234 nova_compute[227762]: 2026-01-23 09:38:24.266 227766 DEBUG nova.network.neutron [None req-2f30f094-ba33-4111-b5b6-c424fe38a546 7ef4bd97218845f7ab8bcdbdae714c9a f15de556837a4728b2af7d6cfc3cbfaf - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Updating instance_info_cache with network_info: [{"id": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "address": "fa:16:3e:36:e2:ca", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap805027f4-e7", "ovs_interfaceid": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:38:24 np0005593234 nova_compute[227762]: 2026-01-23 09:38:24.301 227766 DEBUG oslo_concurrency.lockutils [None req-2f30f094-ba33-4111-b5b6-c424fe38a546 7ef4bd97218845f7ab8bcdbdae714c9a f15de556837a4728b2af7d6cfc3cbfaf - - default default] Releasing lock "refresh_cache-52908364-c256-4f35-8ea4-1904a14fa399" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:38:24 np0005593234 nova_compute[227762]: 2026-01-23 09:38:24.302 227766 DEBUG nova.compute.manager [None req-2f30f094-ba33-4111-b5b6-c424fe38a546 7ef4bd97218845f7ab8bcdbdae714c9a f15de556837a4728b2af7d6cfc3cbfaf - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Inject network info _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7144#033[00m
Jan 23 04:38:24 np0005593234 nova_compute[227762]: 2026-01-23 09:38:24.302 227766 DEBUG nova.compute.manager [None req-2f30f094-ba33-4111-b5b6-c424fe38a546 7ef4bd97218845f7ab8bcdbdae714c9a f15de556837a4728b2af7d6cfc3cbfaf - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] network_info to inject: |[{"id": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "address": "fa:16:3e:36:e2:ca", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap805027f4-e7", "ovs_interfaceid": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _inject_network_info /usr/lib/python3.9/site-packages/nova/compute/manager.py:7145#033[00m
Jan 23 04:38:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:24.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:24 np0005593234 nova_compute[227762]: 2026-01-23 09:38:24.676 227766 INFO nova.compute.manager [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Took 1.00 seconds to detach 1 volumes for instance.#033[00m
Jan 23 04:38:24 np0005593234 nova_compute[227762]: 2026-01-23 09:38:24.677 227766 DEBUG nova.compute.manager [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Deleting volume: b06791ec-66fd-4114-8448-7ea0b7f88f25 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 23 04:38:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:24.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:25 np0005593234 nova_compute[227762]: 2026-01-23 09:38:25.118 227766 DEBUG oslo_concurrency.lockutils [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:25 np0005593234 nova_compute[227762]: 2026-01-23 09:38:25.119 227766 DEBUG oslo_concurrency.lockutils [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:25 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:25Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:36:e2:ca 10.100.0.10
Jan 23 04:38:25 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:25Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:36:e2:ca 10.100.0.10
Jan 23 04:38:25 np0005593234 nova_compute[227762]: 2026-01-23 09:38:25.292 227766 DEBUG oslo_concurrency.processutils [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:38:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:38:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1308046281' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:38:25 np0005593234 nova_compute[227762]: 2026-01-23 09:38:25.728 227766 DEBUG oslo_concurrency.processutils [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:38:25 np0005593234 nova_compute[227762]: 2026-01-23 09:38:25.734 227766 DEBUG nova.compute.provider_tree [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:38:25 np0005593234 nova_compute[227762]: 2026-01-23 09:38:25.764 227766 DEBUG nova.scheduler.client.report [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:38:25 np0005593234 nova_compute[227762]: 2026-01-23 09:38:25.861 227766 DEBUG oslo_concurrency.lockutils [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.742s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:25 np0005593234 nova_compute[227762]: 2026-01-23 09:38:25.872 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161090.8713107, e8375e53-0781-4214-93e9-725707aab45d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:38:25 np0005593234 nova_compute[227762]: 2026-01-23 09:38:25.872 227766 INFO nova.compute.manager [-] [instance: e8375e53-0781-4214-93e9-725707aab45d] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:38:25 np0005593234 nova_compute[227762]: 2026-01-23 09:38:25.938 227766 DEBUG nova.compute.manager [None req-8b72f2c4-26aa-4c85-9fb6-1208814c7e92 - - - - - -] [instance: e8375e53-0781-4214-93e9-725707aab45d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:26 np0005593234 nova_compute[227762]: 2026-01-23 09:38:26.167 227766 INFO nova.scheduler.client.report [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Deleted allocations for instance 261ab1ec-f79b-4867-bcb6-1c1d7491120e#033[00m
Jan 23 04:38:26 np0005593234 nova_compute[227762]: 2026-01-23 09:38:26.346 227766 DEBUG oslo_concurrency.lockutils [None req-c8b7202d-5bc9-4dd3-b0d7-ba0718210298 4f72965e950c4761bfedd99fdc411a83 d0dce6e339c349d4ab97cee5e49fff3a - - default default] Lock "261ab1ec-f79b-4867-bcb6-1c1d7491120e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.804s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:26.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:26.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:28 np0005593234 nova_compute[227762]: 2026-01-23 09:38:28.325 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:28.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:28 np0005593234 podman[245081]: 2026-01-23 09:38:28.785711977 +0000 UTC m=+0.076644843 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 04:38:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:28.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:28 np0005593234 nova_compute[227762]: 2026-01-23 09:38:28.856 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:30.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:30.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:32.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:32.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:33 np0005593234 nova_compute[227762]: 2026-01-23 09:38:33.327 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:33 np0005593234 nova_compute[227762]: 2026-01-23 09:38:33.774 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161098.773702, 261ab1ec-f79b-4867-bcb6-1c1d7491120e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:38:33 np0005593234 nova_compute[227762]: 2026-01-23 09:38:33.775 227766 INFO nova.compute.manager [-] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:38:33 np0005593234 nova_compute[227762]: 2026-01-23 09:38:33.808 227766 DEBUG nova.compute.manager [None req-2c24273c-50fc-445a-a0e7-30286db0d701 - - - - - -] [instance: 261ab1ec-f79b-4867-bcb6-1c1d7491120e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:38:33 np0005593234 nova_compute[227762]: 2026-01-23 09:38:33.858 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:38:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:34.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:38:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:34.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:36.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:36.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:38 np0005593234 nova_compute[227762]: 2026-01-23 09:38:38.329 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:38.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:38.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:38 np0005593234 nova_compute[227762]: 2026-01-23 09:38:38.859 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:38:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:40.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:38:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:40.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:42.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:42.815 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:38:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:42.815 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:38:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:42.816 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:38:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:42.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:43 np0005593234 nova_compute[227762]: 2026-01-23 09:38:43.331 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:43 np0005593234 nova_compute[227762]: 2026-01-23 09:38:43.862 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:44.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:38:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/764414784' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:38:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:38:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/764414784' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:38:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:44.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:45 np0005593234 podman[245167]: 2026-01-23 09:38:45.75935485 +0000 UTC m=+0.054709548 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 23 04:38:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:38:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:46.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:38:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:46.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:48 np0005593234 nova_compute[227762]: 2026-01-23 09:38:48.333 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:48.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:48.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:48 np0005593234 nova_compute[227762]: 2026-01-23 09:38:48.864 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:38:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:50.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:38:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:50.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:52.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:52.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:53 np0005593234 nova_compute[227762]: 2026-01-23 09:38:53.251 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:53.251 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:38:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:53.254 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:38:53 np0005593234 nova_compute[227762]: 2026-01-23 09:38:53.336 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:53 np0005593234 nova_compute[227762]: 2026-01-23 09:38:53.865 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.057001799s ======
Jan 23 04:38:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:54.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.057001799s
Jan 23 04:38:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:54.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:38:56.257 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:38:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:38:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:56.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:38:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:56.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:58 np0005593234 nova_compute[227762]: 2026-01-23 09:38:58.337 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:38:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:38:58.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:38:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:38:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:38:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:38:58.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:38:58 np0005593234 nova_compute[227762]: 2026-01-23 09:38:58.866 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:58 np0005593234 ovn_controller[134547]: 2026-01-23T09:38:58Z|00108|binding|INFO|Releasing lport 5880c863-f7b0-4399-b221-f31849823320 from this chassis (sb_readonly=0)
Jan 23 04:38:59 np0005593234 nova_compute[227762]: 2026-01-23 09:38:59.032 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:38:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:38:59 np0005593234 podman[245243]: 2026-01-23 09:38:59.819010983 +0000 UTC m=+0.119372208 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Jan 23 04:39:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:39:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:00.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:39:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:00.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:39:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:02.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:39:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:02.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:39:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:39:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:39:03 np0005593234 nova_compute[227762]: 2026-01-23 09:39:03.340 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:03 np0005593234 nova_compute[227762]: 2026-01-23 09:39:03.869 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:04.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:04.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:06.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:39:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:06.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:39:07 np0005593234 nova_compute[227762]: 2026-01-23 09:39:07.797 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:08 np0005593234 nova_compute[227762]: 2026-01-23 09:39:08.341 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:08.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:08.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:08 np0005593234 nova_compute[227762]: 2026-01-23 09:39:08.870 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:09 np0005593234 nova_compute[227762]: 2026-01-23 09:39:09.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:09 np0005593234 nova_compute[227762]: 2026-01-23 09:39:09.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:39:09 np0005593234 nova_compute[227762]: 2026-01-23 09:39:09.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:39:10 np0005593234 nova_compute[227762]: 2026-01-23 09:39:10.127 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-52908364-c256-4f35-8ea4-1904a14fa399" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:39:10 np0005593234 nova_compute[227762]: 2026-01-23 09:39:10.128 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-52908364-c256-4f35-8ea4-1904a14fa399" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:39:10 np0005593234 nova_compute[227762]: 2026-01-23 09:39:10.128 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:39:10 np0005593234 nova_compute[227762]: 2026-01-23 09:39:10.129 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 52908364-c256-4f35-8ea4-1904a14fa399 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:39:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:10.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:10.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:39:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:39:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:12.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:12.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:13 np0005593234 nova_compute[227762]: 2026-01-23 09:39:13.344 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:13 np0005593234 nova_compute[227762]: 2026-01-23 09:39:13.872 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.122 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Updating instance_info_cache with network_info: [{"id": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "address": "fa:16:3e:36:e2:ca", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap805027f4-e7", "ovs_interfaceid": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.160 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-52908364-c256-4f35-8ea4-1904a14fa399" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.160 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.160 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.160 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.161 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.161 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.161 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.161 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.213 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.213 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.214 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.214 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.214 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:39:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:14.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:39:14 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/801505846' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.671 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.763 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000023 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.763 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000023 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:39:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:14.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.910 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.911 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4566MB free_disk=20.806194305419922GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.911 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:14 np0005593234 nova_compute[227762]: 2026-01-23 09:39:14.911 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:15 np0005593234 nova_compute[227762]: 2026-01-23 09:39:15.014 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 52908364-c256-4f35-8ea4-1904a14fa399 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:39:15 np0005593234 nova_compute[227762]: 2026-01-23 09:39:15.015 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:39:15 np0005593234 nova_compute[227762]: 2026-01-23 09:39:15.015 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:39:15 np0005593234 nova_compute[227762]: 2026-01-23 09:39:15.034 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:39:15 np0005593234 nova_compute[227762]: 2026-01-23 09:39:15.068 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:39:15 np0005593234 nova_compute[227762]: 2026-01-23 09:39:15.069 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:39:15 np0005593234 nova_compute[227762]: 2026-01-23 09:39:15.095 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:39:15 np0005593234 nova_compute[227762]: 2026-01-23 09:39:15.150 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:39:15 np0005593234 nova_compute[227762]: 2026-01-23 09:39:15.266 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:39:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:39:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4157839132' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:39:15 np0005593234 nova_compute[227762]: 2026-01-23 09:39:15.703 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:39:15 np0005593234 nova_compute[227762]: 2026-01-23 09:39:15.708 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:39:15 np0005593234 nova_compute[227762]: 2026-01-23 09:39:15.738 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:39:15 np0005593234 nova_compute[227762]: 2026-01-23 09:39:15.768 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:39:15 np0005593234 nova_compute[227762]: 2026-01-23 09:39:15.768 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.857s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:16.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:16 np0005593234 podman[245558]: 2026-01-23 09:39:16.761423173 +0000 UTC m=+0.055624708 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:39:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:16.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:17 np0005593234 nova_compute[227762]: 2026-01-23 09:39:17.763 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:17 np0005593234 nova_compute[227762]: 2026-01-23 09:39:17.763 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:17 np0005593234 nova_compute[227762]: 2026-01-23 09:39:17.804 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:39:18 np0005593234 nova_compute[227762]: 2026-01-23 09:39:18.347 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:39:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:18.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:39:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:18.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:18 np0005593234 nova_compute[227762]: 2026-01-23 09:39:18.873 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:20.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:20.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:22.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:22.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:23 np0005593234 nova_compute[227762]: 2026-01-23 09:39:23.349 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:23 np0005593234 nova_compute[227762]: 2026-01-23 09:39:23.875 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:39:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:24.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:39:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:24.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:26.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:26.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:28 np0005593234 nova_compute[227762]: 2026-01-23 09:39:28.350 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:39:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:28.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:39:28 np0005593234 nova_compute[227762]: 2026-01-23 09:39:28.876 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:28.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:30.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:30 np0005593234 podman[245586]: 2026-01-23 09:39:30.768114963 +0000 UTC m=+0.067840438 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:39:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:30.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:32.517 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:39:32 np0005593234 nova_compute[227762]: 2026-01-23 09:39:32.518 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:32.519 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:39:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:32.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:39:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:32.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.241 227766 DEBUG oslo_concurrency.lockutils [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "52908364-c256-4f35-8ea4-1904a14fa399" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.242 227766 DEBUG oslo_concurrency.lockutils [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.242 227766 DEBUG oslo_concurrency.lockutils [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "52908364-c256-4f35-8ea4-1904a14fa399-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.243 227766 DEBUG oslo_concurrency.lockutils [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.243 227766 DEBUG oslo_concurrency.lockutils [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.244 227766 INFO nova.compute.manager [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Terminating instance#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.245 227766 DEBUG nova.compute.manager [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:39:33 np0005593234 kernel: tap805027f4-e7 (unregistering): left promiscuous mode
Jan 23 04:39:33 np0005593234 NetworkManager[48942]: <info>  [1769161173.3074] device (tap805027f4-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:39:33 np0005593234 ovn_controller[134547]: 2026-01-23T09:39:33Z|00109|binding|INFO|Releasing lport 805027f4-e7d4-48b3-9fac-a3e7901dbd9f from this chassis (sb_readonly=0)
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.315 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:33 np0005593234 ovn_controller[134547]: 2026-01-23T09:39:33Z|00110|binding|INFO|Setting lport 805027f4-e7d4-48b3-9fac-a3e7901dbd9f down in Southbound
Jan 23 04:39:33 np0005593234 ovn_controller[134547]: 2026-01-23T09:39:33Z|00111|binding|INFO|Removing iface tap805027f4-e7 ovn-installed in OVS
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.336 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:33.343 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:36:e2:ca 10.100.0.10'], port_security=['fa:16:3e:36:e2:ca 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '52908364-c256-4f35-8ea4-1904a14fa399', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1a5f46b255cd4387bd3e4c0acaa39466', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7d939c30-94ef-4237-8ee8-7374d4fefcd9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2bd55a4d-ba72-4dcd-bf4e-ec1dab31b370, chassis=[], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=805027f4-e7d4-48b3-9fac-a3e7901dbd9f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:39:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:33.344 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 805027f4-e7d4-48b3-9fac-a3e7901dbd9f in datapath 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c unbound from our chassis#033[00m
Jan 23 04:39:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:33.345 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:39:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:33.347 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8147d85b-fc84-4b7f-950f-ed72280db88d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:33.347 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c namespace which is not needed anymore#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.354 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:33 np0005593234 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000023.scope: Deactivated successfully.
Jan 23 04:39:33 np0005593234 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000023.scope: Consumed 15.953s CPU time.
Jan 23 04:39:33 np0005593234 systemd-machined[195626]: Machine qemu-17-instance-00000023 terminated.
Jan 23 04:39:33 np0005593234 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[244854]: [NOTICE]   (244858) : haproxy version is 2.8.14-c23fe91
Jan 23 04:39:33 np0005593234 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[244854]: [NOTICE]   (244858) : path to executable is /usr/sbin/haproxy
Jan 23 04:39:33 np0005593234 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[244854]: [WARNING]  (244858) : Exiting Master process...
Jan 23 04:39:33 np0005593234 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[244854]: [WARNING]  (244858) : Exiting Master process...
Jan 23 04:39:33 np0005593234 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[244854]: [ALERT]    (244858) : Current worker (244860) exited with code 143 (Terminated)
Jan 23 04:39:33 np0005593234 neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c[244854]: [WARNING]  (244858) : All workers exited. Exiting... (0)
Jan 23 04:39:33 np0005593234 systemd[1]: libpod-92cc1dbfbaf1bf4d8d68e9cafe64fdb197dc0902831187165831bdb12cbe3140.scope: Deactivated successfully.
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.491 227766 INFO nova.virt.libvirt.driver [-] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Instance destroyed successfully.#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.492 227766 DEBUG nova.objects.instance [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lazy-loading 'resources' on Instance uuid 52908364-c256-4f35-8ea4-1904a14fa399 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:39:33 np0005593234 podman[245688]: 2026-01-23 09:39:33.498714479 +0000 UTC m=+0.056182375 container died 92cc1dbfbaf1bf4d8d68e9cafe64fdb197dc0902831187165831bdb12cbe3140 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 04:39:33 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-92cc1dbfbaf1bf4d8d68e9cafe64fdb197dc0902831187165831bdb12cbe3140-userdata-shm.mount: Deactivated successfully.
Jan 23 04:39:33 np0005593234 systemd[1]: var-lib-containers-storage-overlay-60c1ad6f58447939381a32aa37e51b4b6f3d6df1bc60cac87b18765a1f3e4371-merged.mount: Deactivated successfully.
Jan 23 04:39:33 np0005593234 podman[245688]: 2026-01-23 09:39:33.541424332 +0000 UTC m=+0.098892228 container cleanup 92cc1dbfbaf1bf4d8d68e9cafe64fdb197dc0902831187165831bdb12cbe3140 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:39:33 np0005593234 systemd[1]: libpod-conmon-92cc1dbfbaf1bf4d8d68e9cafe64fdb197dc0902831187165831bdb12cbe3140.scope: Deactivated successfully.
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.570 227766 DEBUG nova.virt.libvirt.vif [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:38:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersAdminTestJSON-server-1665006440',display_name='tempest-ServersAdminTestJSON-server-1665006440',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serversadmintestjson-server-1665006440',id=35,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:38:13Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1a5f46b255cd4387bd3e4c0acaa39466',ramdisk_id='',reservation_id='r-ez7ohlpn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',owner_project_name='tempest-ServersAdminTestJSON-1167530593',owner_user_name='tempest-ServersAdminTestJSON-1167530593-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:38:13Z,user_data=None,user_id='191a72cfd0a841e9806246e07eb62fa6',uuid=52908364-c256-4f35-8ea4-1904a14fa399,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "address": "fa:16:3e:36:e2:ca", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap805027f4-e7", "ovs_interfaceid": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.570 227766 DEBUG nova.network.os_vif_util [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converting VIF {"id": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "address": "fa:16:3e:36:e2:ca", "network": {"id": "1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c", "bridge": "br-int", "label": "tempest-ServersAdminTestJSON-62484463-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1a5f46b255cd4387bd3e4c0acaa39466", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap805027f4-e7", "ovs_interfaceid": "805027f4-e7d4-48b3-9fac-a3e7901dbd9f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.571 227766 DEBUG nova.network.os_vif_util [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:36:e2:ca,bridge_name='br-int',has_traffic_filtering=True,id=805027f4-e7d4-48b3-9fac-a3e7901dbd9f,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap805027f4-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.571 227766 DEBUG os_vif [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:e2:ca,bridge_name='br-int',has_traffic_filtering=True,id=805027f4-e7d4-48b3-9fac-a3e7901dbd9f,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap805027f4-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.573 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.574 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap805027f4-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.575 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.576 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.580 227766 INFO os_vif [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:36:e2:ca,bridge_name='br-int',has_traffic_filtering=True,id=805027f4-e7d4-48b3-9fac-a3e7901dbd9f,network=Network(1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap805027f4-e7')#033[00m
Jan 23 04:39:33 np0005593234 podman[245730]: 2026-01-23 09:39:33.637265294 +0000 UTC m=+0.064490594 container remove 92cc1dbfbaf1bf4d8d68e9cafe64fdb197dc0902831187165831bdb12cbe3140 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:39:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:33.645 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[265a9c20-810e-4335-ba24-f7c3154d068b]: (4, ('Fri Jan 23 09:39:33 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c (92cc1dbfbaf1bf4d8d68e9cafe64fdb197dc0902831187165831bdb12cbe3140)\n92cc1dbfbaf1bf4d8d68e9cafe64fdb197dc0902831187165831bdb12cbe3140\nFri Jan 23 09:39:33 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c (92cc1dbfbaf1bf4d8d68e9cafe64fdb197dc0902831187165831bdb12cbe3140)\n92cc1dbfbaf1bf4d8d68e9cafe64fdb197dc0902831187165831bdb12cbe3140\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:33.648 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[817ef02f-1005-4683-aa73-4de18222c6b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:33.649 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f2b13ad-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.651 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:33 np0005593234 kernel: tap1f2b13ad-70: left promiscuous mode
Jan 23 04:39:33 np0005593234 nova_compute[227762]: 2026-01-23 09:39:33.664 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:33.667 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[01e2650e-6b07-4a37-9f17-cc3b90f0d968]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:33.683 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[67a402b9-6c1e-483d-965c-f17eaab8835b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:33.684 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[70f946b7-447c-481b-a81f-83fba907ece2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:33.703 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[db326280-d2b9-4d77-8417-6a927421ad57]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 498723, 'reachable_time': 43182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 245761, 'error': None, 'target': 'ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:33 np0005593234 systemd[1]: run-netns-ovnmeta\x2d1f2b13ad\x2d7b25\x2d4a2b\x2db4d5\x2d7432a67ce12c.mount: Deactivated successfully.
Jan 23 04:39:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:33.706 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1f2b13ad-7b25-4a2b-b4d5-7432a67ce12c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:39:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:33.707 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[4faf561a-477b-4425-a05f-ce1e036738a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:39:34 np0005593234 nova_compute[227762]: 2026-01-23 09:39:34.012 227766 INFO nova.virt.libvirt.driver [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Deleting instance files /var/lib/nova/instances/52908364-c256-4f35-8ea4-1904a14fa399_del#033[00m
Jan 23 04:39:34 np0005593234 nova_compute[227762]: 2026-01-23 09:39:34.013 227766 INFO nova.virt.libvirt.driver [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Deletion of /var/lib/nova/instances/52908364-c256-4f35-8ea4-1904a14fa399_del complete#033[00m
Jan 23 04:39:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:34 np0005593234 nova_compute[227762]: 2026-01-23 09:39:34.111 227766 INFO nova.compute.manager [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Took 0.87 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:39:34 np0005593234 nova_compute[227762]: 2026-01-23 09:39:34.112 227766 DEBUG oslo.service.loopingcall [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:39:34 np0005593234 nova_compute[227762]: 2026-01-23 09:39:34.112 227766 DEBUG nova.compute.manager [-] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:39:34 np0005593234 nova_compute[227762]: 2026-01-23 09:39:34.112 227766 DEBUG nova.network.neutron [-] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:39:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:39:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:34.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:39:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:34.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:35 np0005593234 nova_compute[227762]: 2026-01-23 09:39:35.032 227766 DEBUG nova.compute.manager [req-077edd4e-58a6-42c7-8b3f-e29ee11df4ed req-43123c41-2167-445f-9b26-f76546230e76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Received event network-vif-unplugged-805027f4-e7d4-48b3-9fac-a3e7901dbd9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:39:35 np0005593234 nova_compute[227762]: 2026-01-23 09:39:35.033 227766 DEBUG oslo_concurrency.lockutils [req-077edd4e-58a6-42c7-8b3f-e29ee11df4ed req-43123c41-2167-445f-9b26-f76546230e76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "52908364-c256-4f35-8ea4-1904a14fa399-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:35 np0005593234 nova_compute[227762]: 2026-01-23 09:39:35.034 227766 DEBUG oslo_concurrency.lockutils [req-077edd4e-58a6-42c7-8b3f-e29ee11df4ed req-43123c41-2167-445f-9b26-f76546230e76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:35 np0005593234 nova_compute[227762]: 2026-01-23 09:39:35.034 227766 DEBUG oslo_concurrency.lockutils [req-077edd4e-58a6-42c7-8b3f-e29ee11df4ed req-43123c41-2167-445f-9b26-f76546230e76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:35 np0005593234 nova_compute[227762]: 2026-01-23 09:39:35.035 227766 DEBUG nova.compute.manager [req-077edd4e-58a6-42c7-8b3f-e29ee11df4ed req-43123c41-2167-445f-9b26-f76546230e76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] No waiting events found dispatching network-vif-unplugged-805027f4-e7d4-48b3-9fac-a3e7901dbd9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:39:35 np0005593234 nova_compute[227762]: 2026-01-23 09:39:35.035 227766 DEBUG nova.compute.manager [req-077edd4e-58a6-42c7-8b3f-e29ee11df4ed req-43123c41-2167-445f-9b26-f76546230e76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Received event network-vif-unplugged-805027f4-e7d4-48b3-9fac-a3e7901dbd9f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:39:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:36.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:36.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:37 np0005593234 nova_compute[227762]: 2026-01-23 09:39:37.970 227766 DEBUG nova.compute.manager [req-f957ad79-b41c-475e-8224-85d670c19633 req-8496ab28-8366-4ec3-abe8-9bcfb66b4a8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Received event network-vif-plugged-805027f4-e7d4-48b3-9fac-a3e7901dbd9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:39:37 np0005593234 nova_compute[227762]: 2026-01-23 09:39:37.971 227766 DEBUG oslo_concurrency.lockutils [req-f957ad79-b41c-475e-8224-85d670c19633 req-8496ab28-8366-4ec3-abe8-9bcfb66b4a8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "52908364-c256-4f35-8ea4-1904a14fa399-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:37 np0005593234 nova_compute[227762]: 2026-01-23 09:39:37.971 227766 DEBUG oslo_concurrency.lockutils [req-f957ad79-b41c-475e-8224-85d670c19633 req-8496ab28-8366-4ec3-abe8-9bcfb66b4a8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:37 np0005593234 nova_compute[227762]: 2026-01-23 09:39:37.971 227766 DEBUG oslo_concurrency.lockutils [req-f957ad79-b41c-475e-8224-85d670c19633 req-8496ab28-8366-4ec3-abe8-9bcfb66b4a8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:37 np0005593234 nova_compute[227762]: 2026-01-23 09:39:37.971 227766 DEBUG nova.compute.manager [req-f957ad79-b41c-475e-8224-85d670c19633 req-8496ab28-8366-4ec3-abe8-9bcfb66b4a8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] No waiting events found dispatching network-vif-plugged-805027f4-e7d4-48b3-9fac-a3e7901dbd9f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:39:37 np0005593234 nova_compute[227762]: 2026-01-23 09:39:37.972 227766 WARNING nova.compute.manager [req-f957ad79-b41c-475e-8224-85d670c19633 req-8496ab28-8366-4ec3-abe8-9bcfb66b4a8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Received unexpected event network-vif-plugged-805027f4-e7d4-48b3-9fac-a3e7901dbd9f for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:39:38 np0005593234 nova_compute[227762]: 2026-01-23 09:39:38.357 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:38.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:38 np0005593234 nova_compute[227762]: 2026-01-23 09:39:38.629 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:38 np0005593234 nova_compute[227762]: 2026-01-23 09:39:38.855 227766 DEBUG nova.network.neutron [-] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:39:38 np0005593234 nova_compute[227762]: 2026-01-23 09:39:38.877 227766 INFO nova.compute.manager [-] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Took 4.77 seconds to deallocate network for instance.#033[00m
Jan 23 04:39:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:38.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:38 np0005593234 nova_compute[227762]: 2026-01-23 09:39:38.942 227766 DEBUG oslo_concurrency.lockutils [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:38 np0005593234 nova_compute[227762]: 2026-01-23 09:39:38.943 227766 DEBUG oslo_concurrency.lockutils [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:39 np0005593234 nova_compute[227762]: 2026-01-23 09:39:39.026 227766 DEBUG oslo_concurrency.processutils [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:39:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:39:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/310599573' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:39:39 np0005593234 nova_compute[227762]: 2026-01-23 09:39:39.721 227766 DEBUG oslo_concurrency.processutils [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.695s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:39:39 np0005593234 nova_compute[227762]: 2026-01-23 09:39:39.729 227766 DEBUG nova.compute.provider_tree [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:39:39 np0005593234 nova_compute[227762]: 2026-01-23 09:39:39.761 227766 DEBUG nova.scheduler.client.report [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:39:39 np0005593234 nova_compute[227762]: 2026-01-23 09:39:39.809 227766 DEBUG oslo_concurrency.lockutils [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.866s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:39 np0005593234 nova_compute[227762]: 2026-01-23 09:39:39.914 227766 INFO nova.scheduler.client.report [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Deleted allocations for instance 52908364-c256-4f35-8ea4-1904a14fa399#033[00m
Jan 23 04:39:40 np0005593234 nova_compute[227762]: 2026-01-23 09:39:40.003 227766 DEBUG oslo_concurrency.lockutils [None req-af5d4006-d36c-47cc-bb46-5f158a483107 191a72cfd0a841e9806246e07eb62fa6 1a5f46b255cd4387bd3e4c0acaa39466 - - default default] Lock "52908364-c256-4f35-8ea4-1904a14fa399" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.761s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:40 np0005593234 nova_compute[227762]: 2026-01-23 09:39:40.208 227766 DEBUG nova.compute.manager [req-ec5e3df0-c827-4b75-860a-c3e5678556ae req-997d0e46-747e-4ca9-94f6-ff5fc01bad31 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Received event network-vif-deleted-805027f4-e7d4-48b3-9fac-a3e7901dbd9f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:39:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:40.521 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:39:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:40.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:39:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:40.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:39:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:42.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:42.816 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:39:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:42.817 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:39:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:39:42.817 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:39:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:42.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:43 np0005593234 nova_compute[227762]: 2026-01-23 09:39:43.359 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:43 np0005593234 nova_compute[227762]: 2026-01-23 09:39:43.631 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:44.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:39:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:44.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:39:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:46.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:46.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:47 np0005593234 podman[245792]: 2026-01-23 09:39:47.752523445 +0000 UTC m=+0.051363025 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:39:48 np0005593234 nova_compute[227762]: 2026-01-23 09:39:48.361 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:48 np0005593234 nova_compute[227762]: 2026-01-23 09:39:48.489 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161173.488184, 52908364-c256-4f35-8ea4-1904a14fa399 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:39:48 np0005593234 nova_compute[227762]: 2026-01-23 09:39:48.490 227766 INFO nova.compute.manager [-] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:39:48 np0005593234 nova_compute[227762]: 2026-01-23 09:39:48.517 227766 DEBUG nova.compute.manager [None req-a9ae6a2e-9a73-4967-aa1f-a5bddc001ffa - - - - - -] [instance: 52908364-c256-4f35-8ea4-1904a14fa399] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:39:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:48.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:48 np0005593234 nova_compute[227762]: 2026-01-23 09:39:48.633 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:48.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:50.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:50.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:52.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:52.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:53 np0005593234 nova_compute[227762]: 2026-01-23 09:39:53.399 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:53 np0005593234 nova_compute[227762]: 2026-01-23 09:39:53.635 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:39:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:39:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:54.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:39:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:54.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:56.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:56.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:58 np0005593234 nova_compute[227762]: 2026-01-23 09:39:58.401 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:39:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:39:58.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:39:58 np0005593234 nova_compute[227762]: 2026-01-23 09:39:58.636 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:39:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:39:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:39:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:39:58.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:39:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:00 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 04:40:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:40:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:00.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:40:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:00.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:01 np0005593234 nova_compute[227762]: 2026-01-23 09:40:01.799 227766 DEBUG oslo_concurrency.lockutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "0edc214c-75fd-434c-bc75-940bef41d987" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:01 np0005593234 nova_compute[227762]: 2026-01-23 09:40:01.799 227766 DEBUG oslo_concurrency.lockutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "0edc214c-75fd-434c-bc75-940bef41d987" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:01 np0005593234 podman[245867]: 2026-01-23 09:40:01.808360828 +0000 UTC m=+0.101006134 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 04:40:01 np0005593234 nova_compute[227762]: 2026-01-23 09:40:01.874 227766 DEBUG nova.compute.manager [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:40:02 np0005593234 nova_compute[227762]: 2026-01-23 09:40:02.047 227766 DEBUG oslo_concurrency.lockutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:02 np0005593234 nova_compute[227762]: 2026-01-23 09:40:02.048 227766 DEBUG oslo_concurrency.lockutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:02 np0005593234 nova_compute[227762]: 2026-01-23 09:40:02.054 227766 DEBUG nova.virt.hardware [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:40:02 np0005593234 nova_compute[227762]: 2026-01-23 09:40:02.054 227766 INFO nova.compute.claims [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:40:02 np0005593234 nova_compute[227762]: 2026-01-23 09:40:02.321 227766 DEBUG oslo_concurrency.processutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:02.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:40:02 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1731613302' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:40:02 np0005593234 nova_compute[227762]: 2026-01-23 09:40:02.785 227766 DEBUG oslo_concurrency.processutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:02 np0005593234 nova_compute[227762]: 2026-01-23 09:40:02.791 227766 DEBUG nova.compute.provider_tree [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:40:02 np0005593234 nova_compute[227762]: 2026-01-23 09:40:02.840 227766 DEBUG nova.scheduler.client.report [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:40:02 np0005593234 nova_compute[227762]: 2026-01-23 09:40:02.889 227766 DEBUG oslo_concurrency.lockutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.841s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:02 np0005593234 nova_compute[227762]: 2026-01-23 09:40:02.889 227766 DEBUG nova.compute.manager [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:40:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:02.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:02 np0005593234 nova_compute[227762]: 2026-01-23 09:40:02.969 227766 DEBUG nova.compute.manager [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:40:02 np0005593234 nova_compute[227762]: 2026-01-23 09:40:02.970 227766 DEBUG nova.network.neutron [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:40:02 np0005593234 nova_compute[227762]: 2026-01-23 09:40:02.999 227766 INFO nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.027 227766 DEBUG nova.compute.manager [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.147 227766 DEBUG nova.compute.manager [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.148 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.149 227766 INFO nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Creating image(s)#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.178 227766 DEBUG nova.storage.rbd_utils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 0edc214c-75fd-434c-bc75-940bef41d987_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.209 227766 DEBUG nova.storage.rbd_utils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 0edc214c-75fd-434c-bc75-940bef41d987_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.240 227766 DEBUG nova.storage.rbd_utils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 0edc214c-75fd-434c-bc75-940bef41d987_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.244 227766 DEBUG oslo_concurrency.processutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.304 227766 DEBUG oslo_concurrency.processutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.305 227766 DEBUG oslo_concurrency.lockutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.305 227766 DEBUG oslo_concurrency.lockutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.306 227766 DEBUG oslo_concurrency.lockutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.332 227766 DEBUG nova.storage.rbd_utils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 0edc214c-75fd-434c-bc75-940bef41d987_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.336 227766 DEBUG oslo_concurrency.processutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0edc214c-75fd-434c-bc75-940bef41d987_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.450 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.637 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.688 227766 DEBUG oslo_concurrency.processutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0edc214c-75fd-434c-bc75-940bef41d987_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.352s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.776 227766 DEBUG nova.storage.rbd_utils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] resizing rbd image 0edc214c-75fd-434c-bc75-940bef41d987_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.897 227766 DEBUG nova.objects.instance [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lazy-loading 'migration_context' on Instance uuid 0edc214c-75fd-434c-bc75-940bef41d987 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.922 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.922 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Ensure instance console log exists: /var/lib/nova/instances/0edc214c-75fd-434c-bc75-940bef41d987/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.923 227766 DEBUG oslo_concurrency.lockutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.923 227766 DEBUG oslo_concurrency.lockutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:03 np0005593234 nova_compute[227762]: 2026-01-23 09:40:03.924 227766 DEBUG oslo_concurrency.lockutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:03 np0005593234 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 23 04:40:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.129 227766 DEBUG nova.network.neutron [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.130 227766 DEBUG nova.compute.manager [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.131 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.137 227766 WARNING nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.145 227766 DEBUG nova.virt.libvirt.host [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.147 227766 DEBUG nova.virt.libvirt.host [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.150 227766 DEBUG nova.virt.libvirt.host [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.151 227766 DEBUG nova.virt.libvirt.host [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.153 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.153 227766 DEBUG nova.virt.hardware [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.154 227766 DEBUG nova.virt.hardware [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.154 227766 DEBUG nova.virt.hardware [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.154 227766 DEBUG nova.virt.hardware [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.154 227766 DEBUG nova.virt.hardware [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.154 227766 DEBUG nova.virt.hardware [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.155 227766 DEBUG nova.virt.hardware [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.155 227766 DEBUG nova.virt.hardware [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.155 227766 DEBUG nova.virt.hardware [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.155 227766 DEBUG nova.virt.hardware [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.156 227766 DEBUG nova.virt.hardware [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.159 227766 DEBUG oslo_concurrency.processutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:40:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1361839921' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:40:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:40:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:04.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.594 227766 DEBUG oslo_concurrency.processutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.617 227766 DEBUG nova.storage.rbd_utils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 0edc214c-75fd-434c-bc75-940bef41d987_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:04 np0005593234 nova_compute[227762]: 2026-01-23 09:40:04.621 227766 DEBUG oslo_concurrency.processutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:04.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:40:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1240931046' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:40:05 np0005593234 nova_compute[227762]: 2026-01-23 09:40:05.049 227766 DEBUG oslo_concurrency.processutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:05 np0005593234 nova_compute[227762]: 2026-01-23 09:40:05.051 227766 DEBUG nova.objects.instance [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0edc214c-75fd-434c-bc75-940bef41d987 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:40:05 np0005593234 nova_compute[227762]: 2026-01-23 09:40:05.210 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  <uuid>0edc214c-75fd-434c-bc75-940bef41d987</uuid>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  <name>instance-00000024</name>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServersOnMultiNodesTest-server-593939490</nova:name>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:40:04</nova:creationTime>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <nova:user uuid="a3b5a7f627074988a8a05a20558595fe">tempest-ServersOnMultiNodesTest-288318576-project-member</nova:user>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <nova:project uuid="e8778f3a187440f3879f9d9533d45855">tempest-ServersOnMultiNodesTest-288318576</nova:project>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <entry name="serial">0edc214c-75fd-434c-bc75-940bef41d987</entry>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <entry name="uuid">0edc214c-75fd-434c-bc75-940bef41d987</entry>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/0edc214c-75fd-434c-bc75-940bef41d987_disk">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/0edc214c-75fd-434c-bc75-940bef41d987_disk.config">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/0edc214c-75fd-434c-bc75-940bef41d987/console.log" append="off"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:40:05 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:40:05 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:40:05 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:40:05 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:40:05 np0005593234 nova_compute[227762]: 2026-01-23 09:40:05.419 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:40:05 np0005593234 nova_compute[227762]: 2026-01-23 09:40:05.419 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:40:05 np0005593234 nova_compute[227762]: 2026-01-23 09:40:05.420 227766 INFO nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Using config drive#033[00m
Jan 23 04:40:05 np0005593234 nova_compute[227762]: 2026-01-23 09:40:05.445 227766 DEBUG nova.storage.rbd_utils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 0edc214c-75fd-434c-bc75-940bef41d987_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:05 np0005593234 nova_compute[227762]: 2026-01-23 09:40:05.724 227766 INFO nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Creating config drive at /var/lib/nova/instances/0edc214c-75fd-434c-bc75-940bef41d987/disk.config#033[00m
Jan 23 04:40:05 np0005593234 nova_compute[227762]: 2026-01-23 09:40:05.729 227766 DEBUG oslo_concurrency.processutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0edc214c-75fd-434c-bc75-940bef41d987/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpon18egsj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:05 np0005593234 nova_compute[227762]: 2026-01-23 09:40:05.860 227766 DEBUG oslo_concurrency.processutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0edc214c-75fd-434c-bc75-940bef41d987/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpon18egsj" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:05 np0005593234 nova_compute[227762]: 2026-01-23 09:40:05.890 227766 DEBUG nova.storage.rbd_utils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 0edc214c-75fd-434c-bc75-940bef41d987_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:05 np0005593234 nova_compute[227762]: 2026-01-23 09:40:05.894 227766 DEBUG oslo_concurrency.processutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0edc214c-75fd-434c-bc75-940bef41d987/disk.config 0edc214c-75fd-434c-bc75-940bef41d987_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.088 227766 DEBUG oslo_concurrency.processutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0edc214c-75fd-434c-bc75-940bef41d987/disk.config 0edc214c-75fd-434c-bc75-940bef41d987_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.194s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.089 227766 INFO nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Deleting local config drive /var/lib/nova/instances/0edc214c-75fd-434c-bc75-940bef41d987/disk.config because it was imported into RBD.#033[00m
Jan 23 04:40:06 np0005593234 systemd-machined[195626]: New machine qemu-18-instance-00000024.
Jan 23 04:40:06 np0005593234 systemd[1]: Started Virtual Machine qemu-18-instance-00000024.
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.552 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161206.5523655, 0edc214c-75fd-434c-bc75-940bef41d987 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.554 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.558 227766 DEBUG nova.compute.manager [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.558 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.561 227766 INFO nova.virt.libvirt.driver [-] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Instance spawned successfully.#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.562 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:40:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:06.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.599 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.603 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.603 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.604 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.604 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.604 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.605 227766 DEBUG nova.virt.libvirt.driver [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.609 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.653 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.654 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161206.55331, 0edc214c-75fd-434c-bc75-940bef41d987 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.654 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] VM Started (Lifecycle Event)#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.695 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.698 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.701 227766 INFO nova.compute.manager [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Took 3.55 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.702 227766 DEBUG nova.compute.manager [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.837 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.896 227766 INFO nova.compute.manager [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Took 4.92 seconds to build instance.#033[00m
Jan 23 04:40:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:40:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:06.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:40:06 np0005593234 nova_compute[227762]: 2026-01-23 09:40:06.930 227766 DEBUG oslo_concurrency.lockutils [None req-32a60252-bd04-4e19-957b-b8e966a8364d a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "0edc214c-75fd-434c-bc75-940bef41d987" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:07 np0005593234 nova_compute[227762]: 2026-01-23 09:40:07.087 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:07 np0005593234 nova_compute[227762]: 2026-01-23 09:40:07.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:08 np0005593234 nova_compute[227762]: 2026-01-23 09:40:08.453 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:08.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:08 np0005593234 nova_compute[227762]: 2026-01-23 09:40:08.638 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:08.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:10.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:10 np0005593234 nova_compute[227762]: 2026-01-23 09:40:10.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:10 np0005593234 nova_compute[227762]: 2026-01-23 09:40:10.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:40:10 np0005593234 nova_compute[227762]: 2026-01-23 09:40:10.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:40:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:10.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:10 np0005593234 podman[246435]: 2026-01-23 09:40:10.992868366 +0000 UTC m=+0.061616475 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507)
Jan 23 04:40:11 np0005593234 podman[246435]: 2026-01-23 09:40:11.089015888 +0000 UTC m=+0.157763977 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, ceph=True)
Jan 23 04:40:11 np0005593234 nova_compute[227762]: 2026-01-23 09:40:11.452 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-0edc214c-75fd-434c-bc75-940bef41d987" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:40:11 np0005593234 nova_compute[227762]: 2026-01-23 09:40:11.453 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-0edc214c-75fd-434c-bc75-940bef41d987" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:40:11 np0005593234 nova_compute[227762]: 2026-01-23 09:40:11.454 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:40:11 np0005593234 nova_compute[227762]: 2026-01-23 09:40:11.454 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 0edc214c-75fd-434c-bc75-940bef41d987 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:40:11 np0005593234 podman[246593]: 2026-01-23 09:40:11.732053032 +0000 UTC m=+0.060933433 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:40:11 np0005593234 podman[246593]: 2026-01-23 09:40:11.743022375 +0000 UTC m=+0.071902766 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:40:11 np0005593234 podman[246659]: 2026-01-23 09:40:11.969752253 +0000 UTC m=+0.061027806 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, name=keepalived, architecture=x86_64, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container)
Jan 23 04:40:11 np0005593234 podman[246659]: 2026-01-23 09:40:11.979740695 +0000 UTC m=+0.071016228 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, release=1793, name=keepalived, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=Ceph keepalived, com.redhat.component=keepalived-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., vcs-type=git, build-date=2023-02-22T09:23:20, description=keepalived for Ceph)
Jan 23 04:40:12 np0005593234 nova_compute[227762]: 2026-01-23 09:40:12.262 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:40:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:12.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:12 np0005593234 nova_compute[227762]: 2026-01-23 09:40:12.891 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:40:12 np0005593234 nova_compute[227762]: 2026-01-23 09:40:12.917 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-0edc214c-75fd-434c-bc75-940bef41d987" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:40:12 np0005593234 nova_compute[227762]: 2026-01-23 09:40:12.918 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:40:12 np0005593234 nova_compute[227762]: 2026-01-23 09:40:12.919 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:12 np0005593234 nova_compute[227762]: 2026-01-23 09:40:12.919 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:12 np0005593234 nova_compute[227762]: 2026-01-23 09:40:12.919 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:40:12 np0005593234 nova_compute[227762]: 2026-01-23 09:40:12.920 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:12.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:12 np0005593234 nova_compute[227762]: 2026-01-23 09:40:12.947 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:12 np0005593234 nova_compute[227762]: 2026-01-23 09:40:12.948 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:12 np0005593234 nova_compute[227762]: 2026-01-23 09:40:12.948 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:12 np0005593234 nova_compute[227762]: 2026-01-23 09:40:12.948 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:40:12 np0005593234 nova_compute[227762]: 2026-01-23 09:40:12.949 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:40:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:40:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:40:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:40:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:40:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:40:13 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/298863387' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:40:13 np0005593234 nova_compute[227762]: 2026-01-23 09:40:13.408 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:13 np0005593234 nova_compute[227762]: 2026-01-23 09:40:13.456 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:13 np0005593234 nova_compute[227762]: 2026-01-23 09:40:13.612 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000024 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:40:13 np0005593234 nova_compute[227762]: 2026-01-23 09:40:13.613 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000024 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:40:13 np0005593234 nova_compute[227762]: 2026-01-23 09:40:13.683 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:13 np0005593234 nova_compute[227762]: 2026-01-23 09:40:13.804 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:40:13 np0005593234 nova_compute[227762]: 2026-01-23 09:40:13.805 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4644MB free_disk=20.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:40:13 np0005593234 nova_compute[227762]: 2026-01-23 09:40:13.805 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:13 np0005593234 nova_compute[227762]: 2026-01-23 09:40:13.806 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:13 np0005593234 nova_compute[227762]: 2026-01-23 09:40:13.913 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 0edc214c-75fd-434c-bc75-940bef41d987 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:40:13 np0005593234 nova_compute[227762]: 2026-01-23 09:40:13.914 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:40:13 np0005593234 nova_compute[227762]: 2026-01-23 09:40:13.914 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:40:13 np0005593234 nova_compute[227762]: 2026-01-23 09:40:13.976 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:40:14 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2865637109' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:40:14 np0005593234 nova_compute[227762]: 2026-01-23 09:40:14.418 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:14 np0005593234 nova_compute[227762]: 2026-01-23 09:40:14.425 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:40:14 np0005593234 nova_compute[227762]: 2026-01-23 09:40:14.457 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:40:14 np0005593234 nova_compute[227762]: 2026-01-23 09:40:14.497 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:40:14 np0005593234 nova_compute[227762]: 2026-01-23 09:40:14.498 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:40:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:14.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:40:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:14.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:15 np0005593234 nova_compute[227762]: 2026-01-23 09:40:15.324 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:15 np0005593234 nova_compute[227762]: 2026-01-23 09:40:15.325 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.418802) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161216418931, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2631, "num_deletes": 508, "total_data_size": 5437027, "memory_usage": 5511472, "flush_reason": "Manual Compaction"}
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161216450076, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 3563880, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30709, "largest_seqno": 33335, "table_properties": {"data_size": 3553785, "index_size": 5885, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3141, "raw_key_size": 24606, "raw_average_key_size": 19, "raw_value_size": 3531218, "raw_average_value_size": 2827, "num_data_blocks": 256, "num_entries": 1249, "num_filter_entries": 1249, "num_deletions": 508, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161017, "oldest_key_time": 1769161017, "file_creation_time": 1769161216, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 31390 microseconds, and 7176 cpu microseconds.
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.450229) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 3563880 bytes OK
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.450275) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.451786) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.451800) EVENT_LOG_v1 {"time_micros": 1769161216451795, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.451816) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 5424414, prev total WAL file size 5424414, number of live WAL files 2.
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.453512) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(3480KB)], [60(8662KB)]
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161216453641, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 12434125, "oldest_snapshot_seqno": -1}
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 5576 keys, 10381568 bytes, temperature: kUnknown
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161216520623, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 10381568, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10342405, "index_size": 24141, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13957, "raw_key_size": 143294, "raw_average_key_size": 25, "raw_value_size": 10240155, "raw_average_value_size": 1836, "num_data_blocks": 974, "num_entries": 5576, "num_filter_entries": 5576, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769161216, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.520845) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 10381568 bytes
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.522356) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 185.8 rd, 155.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 8.5 +0.0 blob) out(9.9 +0.0 blob), read-write-amplify(6.4) write-amplify(2.9) OK, records in: 6612, records dropped: 1036 output_compression: NoCompression
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.522374) EVENT_LOG_v1 {"time_micros": 1769161216522365, "job": 36, "event": "compaction_finished", "compaction_time_micros": 66906, "compaction_time_cpu_micros": 20765, "output_level": 6, "num_output_files": 1, "total_output_size": 10381568, "num_input_records": 6612, "num_output_records": 5576, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161216523113, "job": 36, "event": "table_file_deletion", "file_number": 62}
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161216525139, "job": 36, "event": "table_file_deletion", "file_number": 60}
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.453413) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.525168) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.525171) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.525172) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.525174) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:40:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:40:16.525175) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:40:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:16.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:40:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:16.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:40:18 np0005593234 nova_compute[227762]: 2026-01-23 09:40:18.458 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:40:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:18.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:40:18 np0005593234 nova_compute[227762]: 2026-01-23 09:40:18.685 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:18 np0005593234 nova_compute[227762]: 2026-01-23 09:40:18.740 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:18 np0005593234 nova_compute[227762]: 2026-01-23 09:40:18.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:40:18 np0005593234 podman[246921]: 2026-01-23 09:40:18.781505847 +0000 UTC m=+0.057639940 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 04:40:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:18.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:20.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:40:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:40:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:20.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:22.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:40:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:22.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:40:23 np0005593234 nova_compute[227762]: 2026-01-23 09:40:23.459 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:23 np0005593234 nova_compute[227762]: 2026-01-23 09:40:23.686 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:24.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:24.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:26 np0005593234 nova_compute[227762]: 2026-01-23 09:40:26.294 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "47e7874b-1f9f-46aa-9227-d6f9b2bf1e51" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:26 np0005593234 nova_compute[227762]: 2026-01-23 09:40:26.295 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "47e7874b-1f9f-46aa-9227-d6f9b2bf1e51" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:26 np0005593234 nova_compute[227762]: 2026-01-23 09:40:26.325 227766 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:40:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:40:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:26.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:40:26 np0005593234 nova_compute[227762]: 2026-01-23 09:40:26.861 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:26 np0005593234 nova_compute[227762]: 2026-01-23 09:40:26.862 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:26 np0005593234 nova_compute[227762]: 2026-01-23 09:40:26.875 227766 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:40:26 np0005593234 nova_compute[227762]: 2026-01-23 09:40:26.875 227766 INFO nova.compute.claims [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:40:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:26.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.105 227766 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:40:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3994313133' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.543 227766 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.549 227766 DEBUG nova.compute.provider_tree [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.565 227766 DEBUG nova.scheduler.client.report [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.594 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.632 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "59d13db1-5c8c-4a9f-bd88-e66f1a7b46ce" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.633 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "59d13db1-5c8c-4a9f-bd88-e66f1a7b46ce" acquired by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.659 227766 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] No node specified, defaulting to compute-2.ctlplane.example.com _get_nodename /usr/lib/python3.9/site-packages/nova/compute/manager.py:10505#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.697 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "59d13db1-5c8c-4a9f-bd88-e66f1a7b46ce" "released" by "nova.compute.manager.ComputeManager._validate_instance_group_policy.<locals>._do_validation" :: held 0.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.698 227766 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.752 227766 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.752 227766 DEBUG nova.network.neutron [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.791 227766 INFO nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.823 227766 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.962 227766 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.963 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.963 227766 INFO nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Creating image(s)#033[00m
Jan 23 04:40:27 np0005593234 nova_compute[227762]: 2026-01-23 09:40:27.993 227766 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:28 np0005593234 nova_compute[227762]: 2026-01-23 09:40:28.030 227766 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:28 np0005593234 nova_compute[227762]: 2026-01-23 09:40:28.057 227766 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:28 np0005593234 nova_compute[227762]: 2026-01-23 09:40:28.061 227766 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:28 np0005593234 nova_compute[227762]: 2026-01-23 09:40:28.118 227766 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:28 np0005593234 nova_compute[227762]: 2026-01-23 09:40:28.119 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:28 np0005593234 nova_compute[227762]: 2026-01-23 09:40:28.120 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:28 np0005593234 nova_compute[227762]: 2026-01-23 09:40:28.120 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:28 np0005593234 nova_compute[227762]: 2026-01-23 09:40:28.153 227766 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:28 np0005593234 nova_compute[227762]: 2026-01-23 09:40:28.157 227766 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:28 np0005593234 nova_compute[227762]: 2026-01-23 09:40:28.198 227766 DEBUG nova.network.neutron [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] No network configured allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1188#033[00m
Jan 23 04:40:28 np0005593234 nova_compute[227762]: 2026-01-23 09:40:28.199 227766 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Instance network_info: |[]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:40:28 np0005593234 nova_compute[227762]: 2026-01-23 09:40:28.461 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:28.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:28 np0005593234 nova_compute[227762]: 2026-01-23 09:40:28.687 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:28 np0005593234 nova_compute[227762]: 2026-01-23 09:40:28.949 227766 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.792s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:28.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.027 227766 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] resizing rbd image 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:40:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.670 227766 DEBUG nova.objects.instance [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lazy-loading 'migration_context' on Instance uuid 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.694 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.694 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Ensure instance console log exists: /var/lib/nova/instances/47e7874b-1f9f-46aa-9227-d6f9b2bf1e51/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.695 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.695 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.696 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.697 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.702 227766 WARNING nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.708 227766 DEBUG nova.virt.libvirt.host [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.708 227766 DEBUG nova.virt.libvirt.host [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.711 227766 DEBUG nova.virt.libvirt.host [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.711 227766 DEBUG nova.virt.libvirt.host [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.712 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.713 227766 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.713 227766 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.713 227766 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.714 227766 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.714 227766 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.714 227766 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.714 227766 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.715 227766 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.715 227766 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.715 227766 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.715 227766 DEBUG nova.virt.hardware [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:40:29 np0005593234 nova_compute[227762]: 2026-01-23 09:40:29.718 227766 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:40:30 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2479704602' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:40:30 np0005593234 nova_compute[227762]: 2026-01-23 09:40:30.219 227766 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:30 np0005593234 nova_compute[227762]: 2026-01-23 09:40:30.563 227766 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:30 np0005593234 nova_compute[227762]: 2026-01-23 09:40:30.568 227766 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:30.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:30.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:40:30 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2803211524' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:40:31 np0005593234 nova_compute[227762]: 2026-01-23 09:40:31.447 227766 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.879s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:31 np0005593234 nova_compute[227762]: 2026-01-23 09:40:31.450 227766 DEBUG nova.objects.instance [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lazy-loading 'pci_devices' on Instance uuid 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:40:31 np0005593234 nova_compute[227762]: 2026-01-23 09:40:31.473 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  <uuid>47e7874b-1f9f-46aa-9227-d6f9b2bf1e51</uuid>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  <name>instance-00000028</name>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServersOnMultiNodesTest-server-673627176-2</nova:name>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:40:29</nova:creationTime>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <nova:user uuid="a3b5a7f627074988a8a05a20558595fe">tempest-ServersOnMultiNodesTest-288318576-project-member</nova:user>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <nova:project uuid="e8778f3a187440f3879f9d9533d45855">tempest-ServersOnMultiNodesTest-288318576</nova:project>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <entry name="serial">47e7874b-1f9f-46aa-9227-d6f9b2bf1e51</entry>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <entry name="uuid">47e7874b-1f9f-46aa-9227-d6f9b2bf1e51</entry>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_disk">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_disk.config">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/47e7874b-1f9f-46aa-9227-d6f9b2bf1e51/console.log" append="off"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:40:31 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:40:31 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:40:31 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:40:31 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:40:31 np0005593234 nova_compute[227762]: 2026-01-23 09:40:31.534 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:40:31 np0005593234 nova_compute[227762]: 2026-01-23 09:40:31.535 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:40:31 np0005593234 nova_compute[227762]: 2026-01-23 09:40:31.536 227766 INFO nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Using config drive#033[00m
Jan 23 04:40:31 np0005593234 nova_compute[227762]: 2026-01-23 09:40:31.557 227766 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:32.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:32 np0005593234 nova_compute[227762]: 2026-01-23 09:40:32.666 227766 INFO nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Creating config drive at /var/lib/nova/instances/47e7874b-1f9f-46aa-9227-d6f9b2bf1e51/disk.config#033[00m
Jan 23 04:40:32 np0005593234 nova_compute[227762]: 2026-01-23 09:40:32.679 227766 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/47e7874b-1f9f-46aa-9227-d6f9b2bf1e51/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxax9tf0e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:32 np0005593234 podman[247267]: 2026-01-23 09:40:32.798172266 +0000 UTC m=+0.096183413 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:40:32 np0005593234 nova_compute[227762]: 2026-01-23 09:40:32.818 227766 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/47e7874b-1f9f-46aa-9227-d6f9b2bf1e51/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxax9tf0e" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:32 np0005593234 nova_compute[227762]: 2026-01-23 09:40:32.859 227766 DEBUG nova.storage.rbd_utils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] rbd image 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:40:32 np0005593234 nova_compute[227762]: 2026-01-23 09:40:32.863 227766 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/47e7874b-1f9f-46aa-9227-d6f9b2bf1e51/disk.config 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:40:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:32.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:40:33 np0005593234 nova_compute[227762]: 2026-01-23 09:40:33.463 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:33 np0005593234 nova_compute[227762]: 2026-01-23 09:40:33.526 227766 DEBUG oslo_concurrency.processutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/47e7874b-1f9f-46aa-9227-d6f9b2bf1e51/disk.config 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.663s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:33 np0005593234 nova_compute[227762]: 2026-01-23 09:40:33.526 227766 INFO nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Deleting local config drive /var/lib/nova/instances/47e7874b-1f9f-46aa-9227-d6f9b2bf1e51/disk.config because it was imported into RBD.#033[00m
Jan 23 04:40:33 np0005593234 systemd-machined[195626]: New machine qemu-19-instance-00000028.
Jan 23 04:40:33 np0005593234 systemd[1]: Started Virtual Machine qemu-19-instance-00000028.
Jan 23 04:40:33 np0005593234 nova_compute[227762]: 2026-01-23 09:40:33.740 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.554 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161234.5540493, 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.556 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.560 227766 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.560 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.565 227766 INFO nova.virt.libvirt.driver [-] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Instance spawned successfully.#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.566 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.586 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.597 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.602 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.602 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.603 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.603 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.604 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.604 227766 DEBUG nova.virt.libvirt.driver [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.634 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.634 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161234.5556283, 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.634 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] VM Started (Lifecycle Event)#033[00m
Jan 23 04:40:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:40:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:34.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.668 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.672 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.679 227766 INFO nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Took 6.72 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.679 227766 DEBUG nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.691 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.726 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:40:34.726 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:40:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:40:34.728 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.777 227766 INFO nova.compute.manager [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Took 7.99 seconds to build instance.#033[00m
Jan 23 04:40:34 np0005593234 nova_compute[227762]: 2026-01-23 09:40:34.800 227766 DEBUG oslo_concurrency.lockutils [None req-ad0bb384-49c0-4098-b995-6a8ce88963c6 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "47e7874b-1f9f-46aa-9227-d6f9b2bf1e51" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.505s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:34.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:40:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:36.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:40:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:40:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:36.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:40:38 np0005593234 nova_compute[227762]: 2026-01-23 09:40:38.465 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:38.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:38 np0005593234 nova_compute[227762]: 2026-01-23 09:40:38.772 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:38 np0005593234 nova_compute[227762]: 2026-01-23 09:40:38.801 227766 DEBUG oslo_concurrency.lockutils [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "47e7874b-1f9f-46aa-9227-d6f9b2bf1e51" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:38 np0005593234 nova_compute[227762]: 2026-01-23 09:40:38.802 227766 DEBUG oslo_concurrency.lockutils [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "47e7874b-1f9f-46aa-9227-d6f9b2bf1e51" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:38 np0005593234 nova_compute[227762]: 2026-01-23 09:40:38.802 227766 DEBUG oslo_concurrency.lockutils [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "47e7874b-1f9f-46aa-9227-d6f9b2bf1e51-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:38 np0005593234 nova_compute[227762]: 2026-01-23 09:40:38.802 227766 DEBUG oslo_concurrency.lockutils [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "47e7874b-1f9f-46aa-9227-d6f9b2bf1e51-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:38 np0005593234 nova_compute[227762]: 2026-01-23 09:40:38.802 227766 DEBUG oslo_concurrency.lockutils [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "47e7874b-1f9f-46aa-9227-d6f9b2bf1e51-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:38 np0005593234 nova_compute[227762]: 2026-01-23 09:40:38.804 227766 INFO nova.compute.manager [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Terminating instance#033[00m
Jan 23 04:40:38 np0005593234 nova_compute[227762]: 2026-01-23 09:40:38.805 227766 DEBUG oslo_concurrency.lockutils [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "refresh_cache-47e7874b-1f9f-46aa-9227-d6f9b2bf1e51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:40:38 np0005593234 nova_compute[227762]: 2026-01-23 09:40:38.805 227766 DEBUG oslo_concurrency.lockutils [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquired lock "refresh_cache-47e7874b-1f9f-46aa-9227-d6f9b2bf1e51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:40:38 np0005593234 nova_compute[227762]: 2026-01-23 09:40:38.805 227766 DEBUG nova.network.neutron [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:40:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:40:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:38.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:40:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:39 np0005593234 nova_compute[227762]: 2026-01-23 09:40:39.232 227766 DEBUG nova.network.neutron [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:40:39 np0005593234 nova_compute[227762]: 2026-01-23 09:40:39.767 227766 DEBUG nova.network.neutron [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:40:39 np0005593234 nova_compute[227762]: 2026-01-23 09:40:39.782 227766 DEBUG oslo_concurrency.lockutils [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Releasing lock "refresh_cache-47e7874b-1f9f-46aa-9227-d6f9b2bf1e51" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:40:39 np0005593234 nova_compute[227762]: 2026-01-23 09:40:39.783 227766 DEBUG nova.compute.manager [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:40:39 np0005593234 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000028.scope: Deactivated successfully.
Jan 23 04:40:39 np0005593234 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000028.scope: Consumed 6.093s CPU time.
Jan 23 04:40:39 np0005593234 systemd-machined[195626]: Machine qemu-19-instance-00000028 terminated.
Jan 23 04:40:40 np0005593234 nova_compute[227762]: 2026-01-23 09:40:40.003 227766 INFO nova.virt.libvirt.driver [-] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Instance destroyed successfully.#033[00m
Jan 23 04:40:40 np0005593234 nova_compute[227762]: 2026-01-23 09:40:40.003 227766 DEBUG nova.objects.instance [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lazy-loading 'resources' on Instance uuid 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:40:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:40:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:40.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:40:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:40.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:41 np0005593234 nova_compute[227762]: 2026-01-23 09:40:41.214 227766 INFO nova.virt.libvirt.driver [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Deleting instance files /var/lib/nova/instances/47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_del#033[00m
Jan 23 04:40:41 np0005593234 nova_compute[227762]: 2026-01-23 09:40:41.215 227766 INFO nova.virt.libvirt.driver [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Deletion of /var/lib/nova/instances/47e7874b-1f9f-46aa-9227-d6f9b2bf1e51_del complete#033[00m
Jan 23 04:40:41 np0005593234 nova_compute[227762]: 2026-01-23 09:40:41.281 227766 INFO nova.compute.manager [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Took 1.50 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:40:41 np0005593234 nova_compute[227762]: 2026-01-23 09:40:41.282 227766 DEBUG oslo.service.loopingcall [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:40:41 np0005593234 nova_compute[227762]: 2026-01-23 09:40:41.282 227766 DEBUG nova.compute.manager [-] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:40:41 np0005593234 nova_compute[227762]: 2026-01-23 09:40:41.282 227766 DEBUG nova.network.neutron [-] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:40:41 np0005593234 nova_compute[227762]: 2026-01-23 09:40:41.683 227766 DEBUG nova.network.neutron [-] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:40:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:40:41.730 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:40:41 np0005593234 nova_compute[227762]: 2026-01-23 09:40:41.737 227766 DEBUG nova.network.neutron [-] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:40:41 np0005593234 nova_compute[227762]: 2026-01-23 09:40:41.761 227766 INFO nova.compute.manager [-] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Took 0.48 seconds to deallocate network for instance.#033[00m
Jan 23 04:40:41 np0005593234 nova_compute[227762]: 2026-01-23 09:40:41.878 227766 DEBUG oslo_concurrency.lockutils [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:41 np0005593234 nova_compute[227762]: 2026-01-23 09:40:41.878 227766 DEBUG oslo_concurrency.lockutils [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:41 np0005593234 nova_compute[227762]: 2026-01-23 09:40:41.961 227766 DEBUG oslo_concurrency.processutils [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:40:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3059910261' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:40:42 np0005593234 nova_compute[227762]: 2026-01-23 09:40:42.428 227766 DEBUG oslo_concurrency.processutils [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:42 np0005593234 nova_compute[227762]: 2026-01-23 09:40:42.434 227766 DEBUG nova.compute.provider_tree [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:40:42 np0005593234 nova_compute[227762]: 2026-01-23 09:40:42.455 227766 DEBUG nova.scheduler.client.report [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:40:42 np0005593234 nova_compute[227762]: 2026-01-23 09:40:42.482 227766 DEBUG oslo_concurrency.lockutils [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:42 np0005593234 nova_compute[227762]: 2026-01-23 09:40:42.520 227766 INFO nova.scheduler.client.report [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Deleted allocations for instance 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51#033[00m
Jan 23 04:40:42 np0005593234 nova_compute[227762]: 2026-01-23 09:40:42.599 227766 DEBUG oslo_concurrency.lockutils [None req-01302722-366e-4d32-9aa9-b24ad5a51f84 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "47e7874b-1f9f-46aa-9227-d6f9b2bf1e51" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.798s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:42.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:40:42.816 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:40:42.817 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:40:42.817 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:42.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:43 np0005593234 nova_compute[227762]: 2026-01-23 09:40:43.500 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:43 np0005593234 nova_compute[227762]: 2026-01-23 09:40:43.775 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:43 np0005593234 ovn_controller[134547]: 2026-01-23T09:40:43Z|00112|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 04:40:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:40:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:44.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:40:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:44.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:46.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:46.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:48 np0005593234 nova_compute[227762]: 2026-01-23 09:40:48.502 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:48.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:48 np0005593234 nova_compute[227762]: 2026-01-23 09:40:48.777 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:48.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:49 np0005593234 podman[247492]: 2026-01-23 09:40:49.771815059 +0000 UTC m=+0.064953018 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:40:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:50.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:50.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:51 np0005593234 nova_compute[227762]: 2026-01-23 09:40:51.694 227766 DEBUG oslo_concurrency.lockutils [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "0edc214c-75fd-434c-bc75-940bef41d987" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:51 np0005593234 nova_compute[227762]: 2026-01-23 09:40:51.695 227766 DEBUG oslo_concurrency.lockutils [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "0edc214c-75fd-434c-bc75-940bef41d987" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:51 np0005593234 nova_compute[227762]: 2026-01-23 09:40:51.695 227766 DEBUG oslo_concurrency.lockutils [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "0edc214c-75fd-434c-bc75-940bef41d987-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:51 np0005593234 nova_compute[227762]: 2026-01-23 09:40:51.695 227766 DEBUG oslo_concurrency.lockutils [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "0edc214c-75fd-434c-bc75-940bef41d987-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:51 np0005593234 nova_compute[227762]: 2026-01-23 09:40:51.696 227766 DEBUG oslo_concurrency.lockutils [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "0edc214c-75fd-434c-bc75-940bef41d987-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:51 np0005593234 nova_compute[227762]: 2026-01-23 09:40:51.697 227766 INFO nova.compute.manager [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Terminating instance#033[00m
Jan 23 04:40:51 np0005593234 nova_compute[227762]: 2026-01-23 09:40:51.697 227766 DEBUG oslo_concurrency.lockutils [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "refresh_cache-0edc214c-75fd-434c-bc75-940bef41d987" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:40:51 np0005593234 nova_compute[227762]: 2026-01-23 09:40:51.698 227766 DEBUG oslo_concurrency.lockutils [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquired lock "refresh_cache-0edc214c-75fd-434c-bc75-940bef41d987" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:40:51 np0005593234 nova_compute[227762]: 2026-01-23 09:40:51.698 227766 DEBUG nova.network.neutron [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:40:51 np0005593234 nova_compute[227762]: 2026-01-23 09:40:51.993 227766 DEBUG nova.network.neutron [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:40:52 np0005593234 nova_compute[227762]: 2026-01-23 09:40:52.429 227766 DEBUG nova.network.neutron [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:40:52 np0005593234 nova_compute[227762]: 2026-01-23 09:40:52.448 227766 DEBUG oslo_concurrency.lockutils [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Releasing lock "refresh_cache-0edc214c-75fd-434c-bc75-940bef41d987" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:40:52 np0005593234 nova_compute[227762]: 2026-01-23 09:40:52.449 227766 DEBUG nova.compute.manager [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:40:52 np0005593234 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000024.scope: Deactivated successfully.
Jan 23 04:40:52 np0005593234 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000024.scope: Consumed 14.012s CPU time.
Jan 23 04:40:52 np0005593234 systemd-machined[195626]: Machine qemu-18-instance-00000024 terminated.
Jan 23 04:40:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:40:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:52.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:40:52 np0005593234 nova_compute[227762]: 2026-01-23 09:40:52.674 227766 INFO nova.virt.libvirt.driver [-] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Instance destroyed successfully.#033[00m
Jan 23 04:40:52 np0005593234 nova_compute[227762]: 2026-01-23 09:40:52.674 227766 DEBUG nova.objects.instance [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lazy-loading 'resources' on Instance uuid 0edc214c-75fd-434c-bc75-940bef41d987 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:40:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:40:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:52.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:40:53 np0005593234 nova_compute[227762]: 2026-01-23 09:40:53.101 227766 INFO nova.virt.libvirt.driver [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Deleting instance files /var/lib/nova/instances/0edc214c-75fd-434c-bc75-940bef41d987_del#033[00m
Jan 23 04:40:53 np0005593234 nova_compute[227762]: 2026-01-23 09:40:53.103 227766 INFO nova.virt.libvirt.driver [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Deletion of /var/lib/nova/instances/0edc214c-75fd-434c-bc75-940bef41d987_del complete#033[00m
Jan 23 04:40:53 np0005593234 nova_compute[227762]: 2026-01-23 09:40:53.218 227766 INFO nova.compute.manager [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Took 0.77 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:40:53 np0005593234 nova_compute[227762]: 2026-01-23 09:40:53.219 227766 DEBUG oslo.service.loopingcall [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:40:53 np0005593234 nova_compute[227762]: 2026-01-23 09:40:53.219 227766 DEBUG nova.compute.manager [-] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:40:53 np0005593234 nova_compute[227762]: 2026-01-23 09:40:53.220 227766 DEBUG nova.network.neutron [-] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:40:53 np0005593234 nova_compute[227762]: 2026-01-23 09:40:53.504 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:53 np0005593234 nova_compute[227762]: 2026-01-23 09:40:53.669 227766 DEBUG nova.network.neutron [-] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:40:53 np0005593234 nova_compute[227762]: 2026-01-23 09:40:53.691 227766 DEBUG nova.network.neutron [-] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:40:53 np0005593234 nova_compute[227762]: 2026-01-23 09:40:53.706 227766 INFO nova.compute.manager [-] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Took 0.49 seconds to deallocate network for instance.#033[00m
Jan 23 04:40:53 np0005593234 nova_compute[227762]: 2026-01-23 09:40:53.764 227766 DEBUG oslo_concurrency.lockutils [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:40:53 np0005593234 nova_compute[227762]: 2026-01-23 09:40:53.765 227766 DEBUG oslo_concurrency.lockutils [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:40:53 np0005593234 nova_compute[227762]: 2026-01-23 09:40:53.778 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:53 np0005593234 nova_compute[227762]: 2026-01-23 09:40:53.882 227766 DEBUG oslo_concurrency.processutils [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:40:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e175 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:40:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:40:54 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1359575590' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:40:54 np0005593234 nova_compute[227762]: 2026-01-23 09:40:54.326 227766 DEBUG oslo_concurrency.processutils [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:40:54 np0005593234 nova_compute[227762]: 2026-01-23 09:40:54.332 227766 DEBUG nova.compute.provider_tree [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:40:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:40:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:54.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:40:54 np0005593234 nova_compute[227762]: 2026-01-23 09:40:54.766 227766 DEBUG nova.scheduler.client.report [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:40:54 np0005593234 nova_compute[227762]: 2026-01-23 09:40:54.803 227766 DEBUG oslo_concurrency.lockutils [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:54 np0005593234 nova_compute[227762]: 2026-01-23 09:40:54.835 227766 INFO nova.scheduler.client.report [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Deleted allocations for instance 0edc214c-75fd-434c-bc75-940bef41d987#033[00m
Jan 23 04:40:54 np0005593234 nova_compute[227762]: 2026-01-23 09:40:54.906 227766 DEBUG oslo_concurrency.lockutils [None req-10b5b3f1-e4af-44aa-9db7-603ff3d52914 a3b5a7f627074988a8a05a20558595fe e8778f3a187440f3879f9d9533d45855 - - default default] Lock "0edc214c-75fd-434c-bc75-940bef41d987" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:40:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:54.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:55 np0005593234 nova_compute[227762]: 2026-01-23 09:40:55.001 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161240.000586, 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:40:55 np0005593234 nova_compute[227762]: 2026-01-23 09:40:55.002 227766 INFO nova.compute.manager [-] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:40:55 np0005593234 nova_compute[227762]: 2026-01-23 09:40:55.044 227766 DEBUG nova.compute.manager [None req-630eba55-15ac-4680-9db7-6bd707b747f6 - - - - - -] [instance: 47e7874b-1f9f-46aa-9227-d6f9b2bf1e51] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:40:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e176 e176: 3 total, 3 up, 3 in
Jan 23 04:40:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:56.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:56.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e177 e177: 3 total, 3 up, 3 in
Jan 23 04:40:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e178 e178: 3 total, 3 up, 3 in
Jan 23 04:40:58 np0005593234 nova_compute[227762]: 2026-01-23 09:40:58.541 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:40:58.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:58 np0005593234 nova_compute[227762]: 2026-01-23 09:40:58.779 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:40:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:40:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:40:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:40:58.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:40:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:00.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:00.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:02.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:02.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:03 np0005593234 nova_compute[227762]: 2026-01-23 09:41:03.542 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:03 np0005593234 nova_compute[227762]: 2026-01-23 09:41:03.780 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:03 np0005593234 podman[247612]: 2026-01-23 09:41:03.831450101 +0000 UTC m=+0.116311321 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 04:41:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e179 e179: 3 total, 3 up, 3 in
Jan 23 04:41:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:04.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:41:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:04.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:41:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:06.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:41:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:06.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:41:07 np0005593234 nova_compute[227762]: 2026-01-23 09:41:07.673 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161252.6717458, 0edc214c-75fd-434c-bc75-940bef41d987 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:41:07 np0005593234 nova_compute[227762]: 2026-01-23 09:41:07.674 227766 INFO nova.compute.manager [-] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:41:07 np0005593234 nova_compute[227762]: 2026-01-23 09:41:07.696 227766 DEBUG nova.compute.manager [None req-e2eb181e-0087-4e84-a1f8-0d62708574c2 - - - - - -] [instance: 0edc214c-75fd-434c-bc75-940bef41d987] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:41:07 np0005593234 nova_compute[227762]: 2026-01-23 09:41:07.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:08 np0005593234 nova_compute[227762]: 2026-01-23 09:41:08.580 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:41:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:08.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:41:08 np0005593234 nova_compute[227762]: 2026-01-23 09:41:08.782 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:08.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e179 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:10.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:10.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:12.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:12 np0005593234 nova_compute[227762]: 2026-01-23 09:41:12.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:12 np0005593234 nova_compute[227762]: 2026-01-23 09:41:12.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:41:12 np0005593234 nova_compute[227762]: 2026-01-23 09:41:12.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:41:12 np0005593234 nova_compute[227762]: 2026-01-23 09:41:12.762 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:41:12 np0005593234 nova_compute[227762]: 2026-01-23 09:41:12.763 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:12 np0005593234 nova_compute[227762]: 2026-01-23 09:41:12.763 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:12 np0005593234 nova_compute[227762]: 2026-01-23 09:41:12.797 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:12 np0005593234 nova_compute[227762]: 2026-01-23 09:41:12.797 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:12 np0005593234 nova_compute[227762]: 2026-01-23 09:41:12.797 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:12 np0005593234 nova_compute[227762]: 2026-01-23 09:41:12.797 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:41:12 np0005593234 nova_compute[227762]: 2026-01-23 09:41:12.798 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:12.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:41:13 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1040011522' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:41:13 np0005593234 nova_compute[227762]: 2026-01-23 09:41:13.242 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:13 np0005593234 nova_compute[227762]: 2026-01-23 09:41:13.458 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:41:13 np0005593234 nova_compute[227762]: 2026-01-23 09:41:13.459 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4778MB free_disk=20.978435516357422GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:41:13 np0005593234 nova_compute[227762]: 2026-01-23 09:41:13.459 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:13 np0005593234 nova_compute[227762]: 2026-01-23 09:41:13.460 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:13 np0005593234 nova_compute[227762]: 2026-01-23 09:41:13.582 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:13 np0005593234 nova_compute[227762]: 2026-01-23 09:41:13.783 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:13 np0005593234 nova_compute[227762]: 2026-01-23 09:41:13.784 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:41:13 np0005593234 nova_compute[227762]: 2026-01-23 09:41:13.785 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:41:13 np0005593234 nova_compute[227762]: 2026-01-23 09:41:13.906 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e180 e180: 3 total, 3 up, 3 in
Jan 23 04:41:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:41:14 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3455255923' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:41:14 np0005593234 nova_compute[227762]: 2026-01-23 09:41:14.382 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:14 np0005593234 nova_compute[227762]: 2026-01-23 09:41:14.388 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:41:14 np0005593234 nova_compute[227762]: 2026-01-23 09:41:14.406 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:41:14 np0005593234 nova_compute[227762]: 2026-01-23 09:41:14.457 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:41:14 np0005593234 nova_compute[227762]: 2026-01-23 09:41:14.458 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.998s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:14.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:14.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:15 np0005593234 nova_compute[227762]: 2026-01-23 09:41:15.440 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:15 np0005593234 nova_compute[227762]: 2026-01-23 09:41:15.464 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:15 np0005593234 nova_compute[227762]: 2026-01-23 09:41:15.465 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:15 np0005593234 nova_compute[227762]: 2026-01-23 09:41:15.465 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:41:15 np0005593234 nova_compute[227762]: 2026-01-23 09:41:15.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:16.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:41:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:16.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:41:18 np0005593234 nova_compute[227762]: 2026-01-23 09:41:18.583 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:18.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:18 np0005593234 nova_compute[227762]: 2026-01-23 09:41:18.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:18 np0005593234 nova_compute[227762]: 2026-01-23 09:41:18.784 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:41:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:19.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:41:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:20 np0005593234 podman[247767]: 2026-01-23 09:41:20.439505153 +0000 UTC m=+0.053069918 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 23 04:41:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:20.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:20 np0005593234 nova_compute[227762]: 2026-01-23 09:41:20.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:41:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:21.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e181 e181: 3 total, 3 up, 3 in
Jan 23 04:41:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:41:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:41:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:41:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:22.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:23.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:23 np0005593234 nova_compute[227762]: 2026-01-23 09:41:23.633 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:23 np0005593234 nova_compute[227762]: 2026-01-23 09:41:23.785 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:41:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:41:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:24.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:25.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:26.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:27.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:28 np0005593234 nova_compute[227762]: 2026-01-23 09:41:28.635 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:28.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:28 np0005593234 nova_compute[227762]: 2026-01-23 09:41:28.786 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:29.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:41:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:30.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:41:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:41:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:41:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:41:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:31.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:41:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:32.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:33.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:33 np0005593234 nova_compute[227762]: 2026-01-23 09:41:33.636 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:33 np0005593234 nova_compute[227762]: 2026-01-23 09:41:33.787 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:34.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:34 np0005593234 podman[247997]: 2026-01-23 09:41:34.783499742 +0000 UTC m=+0.079712939 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 04:41:34 np0005593234 nova_compute[227762]: 2026-01-23 09:41:34.908 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Acquiring lock "a8411989-0134-41c7-85e7-36173b393043" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:34 np0005593234 nova_compute[227762]: 2026-01-23 09:41:34.908 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:34 np0005593234 nova_compute[227762]: 2026-01-23 09:41:34.937 227766 DEBUG nova.compute.manager [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:41:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:41:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:35.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:41:35 np0005593234 nova_compute[227762]: 2026-01-23 09:41:35.047 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:35 np0005593234 nova_compute[227762]: 2026-01-23 09:41:35.048 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:35 np0005593234 nova_compute[227762]: 2026-01-23 09:41:35.055 227766 DEBUG nova.virt.hardware [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:41:35 np0005593234 nova_compute[227762]: 2026-01-23 09:41:35.055 227766 INFO nova.compute.claims [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:41:35 np0005593234 nova_compute[227762]: 2026-01-23 09:41:35.265 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:41:35 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2141966402' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:41:35 np0005593234 nova_compute[227762]: 2026-01-23 09:41:35.730 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:35 np0005593234 nova_compute[227762]: 2026-01-23 09:41:35.736 227766 DEBUG nova.compute.provider_tree [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:41:35 np0005593234 nova_compute[227762]: 2026-01-23 09:41:35.754 227766 DEBUG nova.scheduler.client.report [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:41:35 np0005593234 nova_compute[227762]: 2026-01-23 09:41:35.787 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.739s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:35 np0005593234 nova_compute[227762]: 2026-01-23 09:41:35.788 227766 DEBUG nova.compute.manager [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:41:35 np0005593234 nova_compute[227762]: 2026-01-23 09:41:35.902 227766 DEBUG nova.compute.manager [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:41:35 np0005593234 nova_compute[227762]: 2026-01-23 09:41:35.903 227766 DEBUG nova.network.neutron [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:41:35 np0005593234 nova_compute[227762]: 2026-01-23 09:41:35.930 227766 INFO nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Ignoring supplied device name: /dev/sda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:41:35 np0005593234 nova_compute[227762]: 2026-01-23 09:41:35.950 227766 DEBUG nova.compute.manager [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:41:36 np0005593234 nova_compute[227762]: 2026-01-23 09:41:36.126 227766 DEBUG nova.compute.manager [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:41:36 np0005593234 nova_compute[227762]: 2026-01-23 09:41:36.127 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:41:36 np0005593234 nova_compute[227762]: 2026-01-23 09:41:36.128 227766 INFO nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Creating image(s)#033[00m
Jan 23 04:41:36 np0005593234 nova_compute[227762]: 2026-01-23 09:41:36.152 227766 DEBUG nova.storage.rbd_utils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] rbd image a8411989-0134-41c7-85e7-36173b393043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:41:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:36.160 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:41:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:36.161 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:41:36 np0005593234 nova_compute[227762]: 2026-01-23 09:41:36.205 227766 DEBUG nova.storage.rbd_utils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] rbd image a8411989-0134-41c7-85e7-36173b393043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:41:36 np0005593234 nova_compute[227762]: 2026-01-23 09:41:36.228 227766 DEBUG nova.storage.rbd_utils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] rbd image a8411989-0134-41c7-85e7-36173b393043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:41:36 np0005593234 nova_compute[227762]: 2026-01-23 09:41:36.232 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Acquiring lock "042c073dd2256184660c2c54412f562524aad4af" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:36 np0005593234 nova_compute[227762]: 2026-01-23 09:41:36.232 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "042c073dd2256184660c2c54412f562524aad4af" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:36 np0005593234 nova_compute[227762]: 2026-01-23 09:41:36.237 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:36 np0005593234 nova_compute[227762]: 2026-01-23 09:41:36.258 227766 DEBUG nova.policy [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a4618f86429416889ef239f4b21bacc', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7bb32481db2547b49bc4f3a10883baef', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:41:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:36.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:36 np0005593234 nova_compute[227762]: 2026-01-23 09:41:36.784 227766 DEBUG nova.virt.libvirt.imagebackend [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/de0f1f21-0106-4885-a7ac-14a7ec714eff/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/de0f1f21-0106-4885-a7ac-14a7ec714eff/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 04:41:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:41:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:37.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:41:38 np0005593234 nova_compute[227762]: 2026-01-23 09:41:38.498 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:38 np0005593234 nova_compute[227762]: 2026-01-23 09:41:38.559 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af.part --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:38 np0005593234 nova_compute[227762]: 2026-01-23 09:41:38.561 227766 DEBUG nova.virt.images [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] de0f1f21-0106-4885-a7ac-14a7ec714eff was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 23 04:41:38 np0005593234 nova_compute[227762]: 2026-01-23 09:41:38.562 227766 DEBUG nova.privsep.utils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 23 04:41:38 np0005593234 nova_compute[227762]: 2026-01-23 09:41:38.562 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af.part /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:38 np0005593234 nova_compute[227762]: 2026-01-23 09:41:38.638 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:38.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:38 np0005593234 nova_compute[227762]: 2026-01-23 09:41:38.788 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:38 np0005593234 nova_compute[227762]: 2026-01-23 09:41:38.900 227766 DEBUG nova.network.neutron [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Successfully created port: 4c05c2ff-d433-43af-8e32-d6197dee340f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:41:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:39.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:39 np0005593234 nova_compute[227762]: 2026-01-23 09:41:39.091 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af.part /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af.converted" returned: 0 in 0.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:39 np0005593234 nova_compute[227762]: 2026-01-23 09:41:39.097 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:39 np0005593234 nova_compute[227762]: 2026-01-23 09:41:39.158 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af.converted --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:39 np0005593234 nova_compute[227762]: 2026-01-23 09:41:39.159 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "042c073dd2256184660c2c54412f562524aad4af" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.927s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:39 np0005593234 nova_compute[227762]: 2026-01-23 09:41:39.184 227766 DEBUG nova.storage.rbd_utils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] rbd image a8411989-0134-41c7-85e7-36173b393043_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:41:39 np0005593234 nova_compute[227762]: 2026-01-23 09:41:39.188 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af a8411989-0134-41c7-85e7-36173b393043_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:41:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:40.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:41:40 np0005593234 nova_compute[227762]: 2026-01-23 09:41:40.747 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af a8411989-0134-41c7-85e7-36173b393043_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.559s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:40 np0005593234 nova_compute[227762]: 2026-01-23 09:41:40.813 227766 DEBUG nova.storage.rbd_utils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] resizing rbd image a8411989-0134-41c7-85e7-36173b393043_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:41:40 np0005593234 nova_compute[227762]: 2026-01-23 09:41:40.910 227766 DEBUG nova.objects.instance [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lazy-loading 'migration_context' on Instance uuid a8411989-0134-41c7-85e7-36173b393043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:41:40 np0005593234 nova_compute[227762]: 2026-01-23 09:41:40.945 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:41:40 np0005593234 nova_compute[227762]: 2026-01-23 09:41:40.946 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Ensure instance console log exists: /var/lib/nova/instances/a8411989-0134-41c7-85e7-36173b393043/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:41:40 np0005593234 nova_compute[227762]: 2026-01-23 09:41:40.946 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:40 np0005593234 nova_compute[227762]: 2026-01-23 09:41:40.946 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:40 np0005593234 nova_compute[227762]: 2026-01-23 09:41:40.947 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:41.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:41 np0005593234 nova_compute[227762]: 2026-01-23 09:41:41.889 227766 DEBUG nova.network.neutron [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Successfully updated port: 4c05c2ff-d433-43af-8e32-d6197dee340f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:41:41 np0005593234 nova_compute[227762]: 2026-01-23 09:41:41.911 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Acquiring lock "refresh_cache-a8411989-0134-41c7-85e7-36173b393043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:41:41 np0005593234 nova_compute[227762]: 2026-01-23 09:41:41.911 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Acquired lock "refresh_cache-a8411989-0134-41c7-85e7-36173b393043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:41:41 np0005593234 nova_compute[227762]: 2026-01-23 09:41:41.911 227766 DEBUG nova.network.neutron [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:41:42 np0005593234 nova_compute[227762]: 2026-01-23 09:41:42.070 227766 DEBUG nova.compute.manager [req-f9bb12d4-0906-4625-bc99-756003c0517d req-add2cb4a-ab24-42eb-a612-3ab312097e6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Received event network-changed-4c05c2ff-d433-43af-8e32-d6197dee340f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:42 np0005593234 nova_compute[227762]: 2026-01-23 09:41:42.071 227766 DEBUG nova.compute.manager [req-f9bb12d4-0906-4625-bc99-756003c0517d req-add2cb4a-ab24-42eb-a612-3ab312097e6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Refreshing instance network info cache due to event network-changed-4c05c2ff-d433-43af-8e32-d6197dee340f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:41:42 np0005593234 nova_compute[227762]: 2026-01-23 09:41:42.071 227766 DEBUG oslo_concurrency.lockutils [req-f9bb12d4-0906-4625-bc99-756003c0517d req-add2cb4a-ab24-42eb-a612-3ab312097e6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a8411989-0134-41c7-85e7-36173b393043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:41:42 np0005593234 nova_compute[227762]: 2026-01-23 09:41:42.227 227766 DEBUG nova.network.neutron [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:41:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:41:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:42.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:41:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:42.817 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:42.818 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:42.818 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:43.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:43.163 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.640 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.789 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.816 227766 DEBUG nova.network.neutron [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Updating instance_info_cache with network_info: [{"id": "4c05c2ff-d433-43af-8e32-d6197dee340f", "address": "fa:16:3e:b0:f2:72", "network": {"id": "727f22a6-57c8-4d50-9d8c-b9e831c4902b", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1510828903-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bb32481db2547b49bc4f3a10883baef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c05c2ff-d4", "ovs_interfaceid": "4c05c2ff-d433-43af-8e32-d6197dee340f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.840 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Releasing lock "refresh_cache-a8411989-0134-41c7-85e7-36173b393043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.840 227766 DEBUG nova.compute.manager [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Instance network_info: |[{"id": "4c05c2ff-d433-43af-8e32-d6197dee340f", "address": "fa:16:3e:b0:f2:72", "network": {"id": "727f22a6-57c8-4d50-9d8c-b9e831c4902b", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1510828903-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bb32481db2547b49bc4f3a10883baef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c05c2ff-d4", "ovs_interfaceid": "4c05c2ff-d433-43af-8e32-d6197dee340f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.840 227766 DEBUG oslo_concurrency.lockutils [req-f9bb12d4-0906-4625-bc99-756003c0517d req-add2cb4a-ab24-42eb-a612-3ab312097e6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a8411989-0134-41c7-85e7-36173b393043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.841 227766 DEBUG nova.network.neutron [req-f9bb12d4-0906-4625-bc99-756003c0517d req-add2cb4a-ab24-42eb-a612-3ab312097e6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Refreshing network info cache for port 4c05c2ff-d433-43af-8e32-d6197dee340f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.843 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Start _get_guest_xml network_info=[{"id": "4c05c2ff-d433-43af-8e32-d6197dee340f", "address": "fa:16:3e:b0:f2:72", "network": {"id": "727f22a6-57c8-4d50-9d8c-b9e831c4902b", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1510828903-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bb32481db2547b49bc4f3a10883baef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c05c2ff-d4", "ovs_interfaceid": "4c05c2ff-d433-43af-8e32-d6197dee340f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'scsi', 'cdrom_bus': 'scsi', 'mapping': {'root': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'scsi', 'dev': 'sda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'scsi', 'dev': 'sdb', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:41:16Z,direct_url=<?>,disk_format='qcow2',id=de0f1f21-0106-4885-a7ac-14a7ec714eff,min_disk=0,min_ram=0,name='',owner='bfaf98c50275412bb160829c8fe02fe3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:41:22Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/sda', 'image': [{'device_name': '/dev/sda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'scsi', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': 'de0f1f21-0106-4885-a7ac-14a7ec714eff'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.847 227766 WARNING nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.851 227766 DEBUG nova.virt.libvirt.host [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.851 227766 DEBUG nova.virt.libvirt.host [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.855 227766 DEBUG nova.virt.libvirt.host [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.856 227766 DEBUG nova.virt.libvirt.host [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.857 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.857 227766 DEBUG nova.virt.hardware [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:41:16Z,direct_url=<?>,disk_format='qcow2',id=de0f1f21-0106-4885-a7ac-14a7ec714eff,min_disk=0,min_ram=0,name='',owner='bfaf98c50275412bb160829c8fe02fe3',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:41:22Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.857 227766 DEBUG nova.virt.hardware [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.857 227766 DEBUG nova.virt.hardware [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.858 227766 DEBUG nova.virt.hardware [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.858 227766 DEBUG nova.virt.hardware [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.858 227766 DEBUG nova.virt.hardware [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.858 227766 DEBUG nova.virt.hardware [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.858 227766 DEBUG nova.virt.hardware [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.858 227766 DEBUG nova.virt.hardware [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.859 227766 DEBUG nova.virt.hardware [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.859 227766 DEBUG nova.virt.hardware [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:41:43 np0005593234 nova_compute[227762]: 2026-01-23 09:41:43.861 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:44 np0005593234 ceph-osd[79769]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/18.2.7/rpm/el9/BUILD/ceph-18.2.7/src/cls/lock/cls_lock.cc:291: Could not read list of current lockers off disk: (2) No such file or directory
Jan 23 04:41:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:41:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3099231953' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.329 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.354 227766 DEBUG nova.storage.rbd_utils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] rbd image a8411989-0134-41c7-85e7-36173b393043_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.357 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:41:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:44.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:41:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:41:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3991618643' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.768 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.770 227766 DEBUG nova.virt.libvirt.vif [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:41:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-657713700',display_name='tempest-AttachSCSIVolumeTestJSON-server-657713700',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachscsivolumetestjson-server-657713700',id=44,image_ref='de0f1f21-0106-4885-a7ac-14a7ec714eff',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBALeFNUs+GYVQQqhQHFjDO5lw5M0y1Tt3vRvgYug2ng5cQxIzuhK3ImpNfCbdsCHy2Pzir3e9qqZTtkd5tTXp6+vUgC/FcyOXc6V9AcjGWqLg5yX6ioTlpK31gM0J5JdhA==',key_name='tempest-keypair-392233361',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7bb32481db2547b49bc4f3a10883baef',ramdisk_id='',reservation_id='r-aq9ryp00',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='de0f1f21-0106-4885-a7ac-14a7ec714eff',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='q35',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1359766574',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1359766574-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:41:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a4618f86429416889ef239f4b21bacc',uuid=a8411989-0134-41c7-85e7-36173b393043,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c05c2ff-d433-43af-8e32-d6197dee340f", "address": "fa:16:3e:b0:f2:72", "network": {"id": "727f22a6-57c8-4d50-9d8c-b9e831c4902b", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1510828903-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bb32481db2547b49bc4f3a10883baef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c05c2ff-d4", "ovs_interfaceid": "4c05c2ff-d433-43af-8e32-d6197dee340f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.770 227766 DEBUG nova.network.os_vif_util [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Converting VIF {"id": "4c05c2ff-d433-43af-8e32-d6197dee340f", "address": "fa:16:3e:b0:f2:72", "network": {"id": "727f22a6-57c8-4d50-9d8c-b9e831c4902b", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1510828903-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bb32481db2547b49bc4f3a10883baef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c05c2ff-d4", "ovs_interfaceid": "4c05c2ff-d433-43af-8e32-d6197dee340f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.772 227766 DEBUG nova.network.os_vif_util [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:f2:72,bridge_name='br-int',has_traffic_filtering=True,id=4c05c2ff-d433-43af-8e32-d6197dee340f,network=Network(727f22a6-57c8-4d50-9d8c-b9e831c4902b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c05c2ff-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.773 227766 DEBUG nova.objects.instance [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lazy-loading 'pci_devices' on Instance uuid a8411989-0134-41c7-85e7-36173b393043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.801 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  <uuid>a8411989-0134-41c7-85e7-36173b393043</uuid>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  <name>instance-0000002c</name>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <nova:name>tempest-AttachSCSIVolumeTestJSON-server-657713700</nova:name>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:41:43</nova:creationTime>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <nova:user uuid="9a4618f86429416889ef239f4b21bacc">tempest-AttachSCSIVolumeTestJSON-1359766574-project-member</nova:user>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <nova:project uuid="7bb32481db2547b49bc4f3a10883baef">tempest-AttachSCSIVolumeTestJSON-1359766574</nova:project>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="de0f1f21-0106-4885-a7ac-14a7ec714eff"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <nova:port uuid="4c05c2ff-d433-43af-8e32-d6197dee340f">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <entry name="serial">a8411989-0134-41c7-85e7-36173b393043</entry>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <entry name="uuid">a8411989-0134-41c7-85e7-36173b393043</entry>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a8411989-0134-41c7-85e7-36173b393043_disk">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <target dev="sda" bus="scsi"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <address type="drive" controller="0" unit="0"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a8411989-0134-41c7-85e7-36173b393043_disk.config">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <target dev="sdb" bus="scsi"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <address type="drive" controller="0" unit="1"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="scsi" index="0" model="virtio-scsi"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:b0:f2:72"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <target dev="tap4c05c2ff-d4"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/a8411989-0134-41c7-85e7-36173b393043/console.log" append="off"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:41:44 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:41:44 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:41:44 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:41:44 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.803 227766 DEBUG nova.compute.manager [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Preparing to wait for external event network-vif-plugged-4c05c2ff-d433-43af-8e32-d6197dee340f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.803 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Acquiring lock "a8411989-0134-41c7-85e7-36173b393043-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.803 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.804 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.805 227766 DEBUG nova.virt.libvirt.vif [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:41:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-657713700',display_name='tempest-AttachSCSIVolumeTestJSON-server-657713700',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachscsivolumetestjson-server-657713700',id=44,image_ref='de0f1f21-0106-4885-a7ac-14a7ec714eff',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBALeFNUs+GYVQQqhQHFjDO5lw5M0y1Tt3vRvgYug2ng5cQxIzuhK3ImpNfCbdsCHy2Pzir3e9qqZTtkd5tTXp6+vUgC/FcyOXc6V9AcjGWqLg5yX6ioTlpK31gM0J5JdhA==',key_name='tempest-keypair-392233361',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7bb32481db2547b49bc4f3a10883baef',ramdisk_id='',reservation_id='r-aq9ryp00',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='de0f1f21-0106-4885-a7ac-14a7ec714eff',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_machine_type='q35',image_hw_scsi_model='virtio-scsi',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1359766574',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1359766574-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:41:35Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a4618f86429416889ef239f4b21bacc',uuid=a8411989-0134-41c7-85e7-36173b393043,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4c05c2ff-d433-43af-8e32-d6197dee340f", "address": "fa:16:3e:b0:f2:72", "network": {"id": "727f22a6-57c8-4d50-9d8c-b9e831c4902b", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1510828903-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": 
"gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bb32481db2547b49bc4f3a10883baef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c05c2ff-d4", "ovs_interfaceid": "4c05c2ff-d433-43af-8e32-d6197dee340f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.805 227766 DEBUG nova.network.os_vif_util [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Converting VIF {"id": "4c05c2ff-d433-43af-8e32-d6197dee340f", "address": "fa:16:3e:b0:f2:72", "network": {"id": "727f22a6-57c8-4d50-9d8c-b9e831c4902b", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1510828903-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bb32481db2547b49bc4f3a10883baef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c05c2ff-d4", "ovs_interfaceid": "4c05c2ff-d433-43af-8e32-d6197dee340f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.806 227766 DEBUG nova.network.os_vif_util [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b0:f2:72,bridge_name='br-int',has_traffic_filtering=True,id=4c05c2ff-d433-43af-8e32-d6197dee340f,network=Network(727f22a6-57c8-4d50-9d8c-b9e831c4902b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c05c2ff-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.806 227766 DEBUG os_vif [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:f2:72,bridge_name='br-int',has_traffic_filtering=True,id=4c05c2ff-d433-43af-8e32-d6197dee340f,network=Network(727f22a6-57c8-4d50-9d8c-b9e831c4902b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c05c2ff-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.807 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.807 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.808 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.811 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.811 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4c05c2ff-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.812 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4c05c2ff-d4, col_values=(('external_ids', {'iface-id': '4c05c2ff-d433-43af-8e32-d6197dee340f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b0:f2:72', 'vm-uuid': 'a8411989-0134-41c7-85e7-36173b393043'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.813 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:44 np0005593234 NetworkManager[48942]: <info>  [1769161304.8140] manager: (tap4c05c2ff-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/64)
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.816 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.819 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.820 227766 INFO os_vif [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b0:f2:72,bridge_name='br-int',has_traffic_filtering=True,id=4c05c2ff-d433-43af-8e32-d6197dee340f,network=Network(727f22a6-57c8-4d50-9d8c-b9e831c4902b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c05c2ff-d4')#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.866 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.866 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.867 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] No VIF found with MAC fa:16:3e:b0:f2:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.867 227766 INFO nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Using config drive#033[00m
Jan 23 04:41:44 np0005593234 nova_compute[227762]: 2026-01-23 09:41:44.895 227766 DEBUG nova.storage.rbd_utils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] rbd image a8411989-0134-41c7-85e7-36173b393043_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:41:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:45.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:45 np0005593234 nova_compute[227762]: 2026-01-23 09:41:45.542 227766 INFO nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Creating config drive at /var/lib/nova/instances/a8411989-0134-41c7-85e7-36173b393043/disk.config#033[00m
Jan 23 04:41:45 np0005593234 nova_compute[227762]: 2026-01-23 09:41:45.549 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a8411989-0134-41c7-85e7-36173b393043/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg_httwwt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:45 np0005593234 nova_compute[227762]: 2026-01-23 09:41:45.678 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a8411989-0134-41c7-85e7-36173b393043/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpg_httwwt" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:45 np0005593234 nova_compute[227762]: 2026-01-23 09:41:45.729 227766 DEBUG nova.storage.rbd_utils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] rbd image a8411989-0134-41c7-85e7-36173b393043_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:41:45 np0005593234 nova_compute[227762]: 2026-01-23 09:41:45.732 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a8411989-0134-41c7-85e7-36173b393043/disk.config a8411989-0134-41c7-85e7-36173b393043_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:41:45 np0005593234 nova_compute[227762]: 2026-01-23 09:41:45.879 227766 DEBUG nova.network.neutron [req-f9bb12d4-0906-4625-bc99-756003c0517d req-add2cb4a-ab24-42eb-a612-3ab312097e6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Updated VIF entry in instance network info cache for port 4c05c2ff-d433-43af-8e32-d6197dee340f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:41:45 np0005593234 nova_compute[227762]: 2026-01-23 09:41:45.880 227766 DEBUG nova.network.neutron [req-f9bb12d4-0906-4625-bc99-756003c0517d req-add2cb4a-ab24-42eb-a612-3ab312097e6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Updating instance_info_cache with network_info: [{"id": "4c05c2ff-d433-43af-8e32-d6197dee340f", "address": "fa:16:3e:b0:f2:72", "network": {"id": "727f22a6-57c8-4d50-9d8c-b9e831c4902b", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1510828903-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bb32481db2547b49bc4f3a10883baef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c05c2ff-d4", "ovs_interfaceid": "4c05c2ff-d433-43af-8e32-d6197dee340f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:41:45 np0005593234 nova_compute[227762]: 2026-01-23 09:41:45.898 227766 DEBUG oslo_concurrency.lockutils [req-f9bb12d4-0906-4625-bc99-756003c0517d req-add2cb4a-ab24-42eb-a612-3ab312097e6e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a8411989-0134-41c7-85e7-36173b393043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:41:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:41:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:46.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:41:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:47.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.036 227766 DEBUG oslo_concurrency.processutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a8411989-0134-41c7-85e7-36173b393043/disk.config a8411989-0134-41c7-85e7-36173b393043_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.303s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.037 227766 INFO nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Deleting local config drive /var/lib/nova/instances/a8411989-0134-41c7-85e7-36173b393043/disk.config because it was imported into RBD.#033[00m
Jan 23 04:41:47 np0005593234 kernel: tap4c05c2ff-d4: entered promiscuous mode
Jan 23 04:41:47 np0005593234 NetworkManager[48942]: <info>  [1769161307.0850] manager: (tap4c05c2ff-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/65)
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.083 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:47 np0005593234 ovn_controller[134547]: 2026-01-23T09:41:47Z|00113|binding|INFO|Claiming lport 4c05c2ff-d433-43af-8e32-d6197dee340f for this chassis.
Jan 23 04:41:47 np0005593234 ovn_controller[134547]: 2026-01-23T09:41:47Z|00114|binding|INFO|4c05c2ff-d433-43af-8e32-d6197dee340f: Claiming fa:16:3e:b0:f2:72 10.100.0.11
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.088 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.100 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:f2:72 10.100.0.11'], port_security=['fa:16:3e:b0:f2:72 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a8411989-0134-41c7-85e7-36173b393043', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-727f22a6-57c8-4d50-9d8c-b9e831c4902b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7bb32481db2547b49bc4f3a10883baef', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a146fc4c-89c7-4383-805c-0fc50ff3724e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f333c0b6-b6d9-4b7d-8543-88d3b087f640, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=4c05c2ff-d433-43af-8e32-d6197dee340f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.102 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 4c05c2ff-d433-43af-8e32-d6197dee340f in datapath 727f22a6-57c8-4d50-9d8c-b9e831c4902b bound to our chassis#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.103 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 727f22a6-57c8-4d50-9d8c-b9e831c4902b#033[00m
Jan 23 04:41:47 np0005593234 systemd-machined[195626]: New machine qemu-20-instance-0000002c.
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.117 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5fdbbfd5-abfb-4c5c-8011-6d8fad16d18f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.118 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap727f22a6-51 in ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.120 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap727f22a6-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.120 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[593a46ae-ab45-4d58-93d0-b3cf55c9c71b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.121 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[02c0d338-303a-4825-8ddf-67660244fcbe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 systemd[1]: Started Virtual Machine qemu-20-instance-0000002c.
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.132 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[5d9f8c2e-edbb-4f84-8d91-1558260b2d5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 systemd-udevd[248369]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.147 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fbca7c39-158a-4612-a087-42cb386a50c1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 NetworkManager[48942]: <info>  [1769161307.1515] device (tap4c05c2ff-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:41:47 np0005593234 NetworkManager[48942]: <info>  [1769161307.1524] device (tap4c05c2ff-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.152 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:47 np0005593234 ovn_controller[134547]: 2026-01-23T09:41:47Z|00115|binding|INFO|Setting lport 4c05c2ff-d433-43af-8e32-d6197dee340f ovn-installed in OVS
Jan 23 04:41:47 np0005593234 ovn_controller[134547]: 2026-01-23T09:41:47Z|00116|binding|INFO|Setting lport 4c05c2ff-d433-43af-8e32-d6197dee340f up in Southbound
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.158 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.176 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[c38ab70b-62ba-4e36-9ba1-4f6f4007bed0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 systemd-udevd[248373]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:41:47 np0005593234 NetworkManager[48942]: <info>  [1769161307.1840] manager: (tap727f22a6-50): new Veth device (/org/freedesktop/NetworkManager/Devices/66)
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.182 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fee629ec-38cd-4d9a-a027-24c5e020f7ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.217 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[051ddb4e-0e27-455f-b04d-7def34f1bc7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.221 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[bd1dd676-412c-4d17-a9cf-643adc3adff3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 NetworkManager[48942]: <info>  [1769161307.2451] device (tap727f22a6-50): carrier: link connected
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.249 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a01fad77-c402-4bde-81da-d82444f784bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.265 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bb0a803c-aa5b-449b-a70f-2f4e83d8c6e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap727f22a6-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:86:5a:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520220, 'reachable_time': 39922, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 248399, 'error': None, 'target': 'ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.280 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[eb8a1155-5556-44be-afba-10efe463cdc0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe86:5aea'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 520220, 'tstamp': 520220}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 248400, 'error': None, 'target': 'ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.293 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[76623a49-0951-4feb-b48b-ce5dfe4f1812]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap727f22a6-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:86:5a:ea'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520220, 'reachable_time': 39922, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 248401, 'error': None, 'target': 'ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.321 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[81934586-91c3-4724-a7c0-287e621a2a27]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.373 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e93de2a0-f079-4f3b-a0b9-c624ded8ecd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.375 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap727f22a6-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.375 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.376 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap727f22a6-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:47 np0005593234 NetworkManager[48942]: <info>  [1769161307.3784] manager: (tap727f22a6-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Jan 23 04:41:47 np0005593234 kernel: tap727f22a6-50: entered promiscuous mode
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.378 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.379 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.380 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap727f22a6-50, col_values=(('external_ids', {'iface-id': '7fce7ad3-7f13-4075-b117-8fe35169a722'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.381 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:47 np0005593234 ovn_controller[134547]: 2026-01-23T09:41:47Z|00117|binding|INFO|Releasing lport 7fce7ad3-7f13-4075-b117-8fe35169a722 from this chassis (sb_readonly=0)
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.396 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.397 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/727f22a6-57c8-4d50-9d8c-b9e831c4902b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/727f22a6-57c8-4d50-9d8c-b9e831c4902b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.398 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[844e8c69-7b91-4763-b032-a1a190be2465]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.399 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-727f22a6-57c8-4d50-9d8c-b9e831c4902b
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/727f22a6-57c8-4d50-9d8c-b9e831c4902b.pid.haproxy
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 727f22a6-57c8-4d50-9d8c-b9e831c4902b
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:41:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:41:47.399 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b', 'env', 'PROCESS_TAG=haproxy-727f22a6-57c8-4d50-9d8c-b9e831c4902b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/727f22a6-57c8-4d50-9d8c-b9e831c4902b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.732 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161307.7314577, a8411989-0134-41c7-85e7-36173b393043 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.733 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a8411989-0134-41c7-85e7-36173b393043] VM Started (Lifecycle Event)#033[00m
Jan 23 04:41:47 np0005593234 podman[248476]: 2026-01-23 09:41:47.749912967 +0000 UTC m=+0.025846478 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.903 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a8411989-0134-41c7-85e7-36173b393043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.907 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161307.731707, a8411989-0134-41c7-85e7-36173b393043 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.907 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a8411989-0134-41c7-85e7-36173b393043] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.990 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a8411989-0134-41c7-85e7-36173b393043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:41:47 np0005593234 nova_compute[227762]: 2026-01-23 09:41:47.993 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a8411989-0134-41c7-85e7-36173b393043] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.019 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a8411989-0134-41c7-85e7-36173b393043] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:41:48 np0005593234 podman[248476]: 2026-01-23 09:41:48.188451528 +0000 UTC m=+0.464385009 container create ddfdc040626fa4478b965f63f2dc82f1a679c9105faea322f30c7621d2d28254 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 04:41:48 np0005593234 systemd[1]: Started libpod-conmon-ddfdc040626fa4478b965f63f2dc82f1a679c9105faea322f30c7621d2d28254.scope.
Jan 23 04:41:48 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:41:48 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5b46fa6c7ad0fbe305a70c958364971884c42f26e8f5626fea64cb4df2419959/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:41:48 np0005593234 podman[248476]: 2026-01-23 09:41:48.303721206 +0000 UTC m=+0.579654697 container init ddfdc040626fa4478b965f63f2dc82f1a679c9105faea322f30c7621d2d28254 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 04:41:48 np0005593234 podman[248476]: 2026-01-23 09:41:48.308752263 +0000 UTC m=+0.584685714 container start ddfdc040626fa4478b965f63f2dc82f1a679c9105faea322f30c7621d2d28254 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:41:48 np0005593234 neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b[248491]: [NOTICE]   (248495) : New worker (248497) forked
Jan 23 04:41:48 np0005593234 neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b[248491]: [NOTICE]   (248495) : Loading success.
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.483 227766 DEBUG nova.compute.manager [req-fa7e56d4-682f-4582-b964-ec4604d2da86 req-4ff622e1-f190-44d6-9549-6c045c65169d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Received event network-vif-plugged-4c05c2ff-d433-43af-8e32-d6197dee340f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.483 227766 DEBUG oslo_concurrency.lockutils [req-fa7e56d4-682f-4582-b964-ec4604d2da86 req-4ff622e1-f190-44d6-9549-6c045c65169d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a8411989-0134-41c7-85e7-36173b393043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.484 227766 DEBUG oslo_concurrency.lockutils [req-fa7e56d4-682f-4582-b964-ec4604d2da86 req-4ff622e1-f190-44d6-9549-6c045c65169d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.484 227766 DEBUG oslo_concurrency.lockutils [req-fa7e56d4-682f-4582-b964-ec4604d2da86 req-4ff622e1-f190-44d6-9549-6c045c65169d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.484 227766 DEBUG nova.compute.manager [req-fa7e56d4-682f-4582-b964-ec4604d2da86 req-4ff622e1-f190-44d6-9549-6c045c65169d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Processing event network-vif-plugged-4c05c2ff-d433-43af-8e32-d6197dee340f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.485 227766 DEBUG nova.compute.manager [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.489 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.489 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161308.4887252, a8411989-0134-41c7-85e7-36173b393043 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.490 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a8411989-0134-41c7-85e7-36173b393043] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.494 227766 INFO nova.virt.libvirt.driver [-] [instance: a8411989-0134-41c7-85e7-36173b393043] Instance spawned successfully.#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.495 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Attempting to register defaults for the following image properties: ['hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.501 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.502 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.503 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.503 227766 DEBUG nova.virt.libvirt.driver [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.513 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a8411989-0134-41c7-85e7-36173b393043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.517 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a8411989-0134-41c7-85e7-36173b393043] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.554 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a8411989-0134-41c7-85e7-36173b393043] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.582 227766 INFO nova.compute.manager [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Took 12.46 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.583 227766 DEBUG nova.compute.manager [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.642 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.661 227766 INFO nova.compute.manager [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Took 13.67 seconds to build instance.#033[00m
Jan 23 04:41:48 np0005593234 nova_compute[227762]: 2026-01-23 09:41:48.682 227766 DEBUG oslo_concurrency.lockutils [None req-67c4165e-15ef-49f6-9292-809068ce1fa8 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:48.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:49.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:49 np0005593234 nova_compute[227762]: 2026-01-23 09:41:49.814 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:50 np0005593234 nova_compute[227762]: 2026-01-23 09:41:50.629 227766 DEBUG nova.compute.manager [req-c4b12e89-9b9d-433a-9520-f4e456d4ac41 req-c2d7f64f-9a8c-4edb-8b26-9363fce16c78 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Received event network-vif-plugged-4c05c2ff-d433-43af-8e32-d6197dee340f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:50 np0005593234 nova_compute[227762]: 2026-01-23 09:41:50.629 227766 DEBUG oslo_concurrency.lockutils [req-c4b12e89-9b9d-433a-9520-f4e456d4ac41 req-c2d7f64f-9a8c-4edb-8b26-9363fce16c78 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a8411989-0134-41c7-85e7-36173b393043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:41:50 np0005593234 nova_compute[227762]: 2026-01-23 09:41:50.630 227766 DEBUG oslo_concurrency.lockutils [req-c4b12e89-9b9d-433a-9520-f4e456d4ac41 req-c2d7f64f-9a8c-4edb-8b26-9363fce16c78 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:41:50 np0005593234 nova_compute[227762]: 2026-01-23 09:41:50.631 227766 DEBUG oslo_concurrency.lockutils [req-c4b12e89-9b9d-433a-9520-f4e456d4ac41 req-c2d7f64f-9a8c-4edb-8b26-9363fce16c78 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:41:50 np0005593234 nova_compute[227762]: 2026-01-23 09:41:50.631 227766 DEBUG nova.compute.manager [req-c4b12e89-9b9d-433a-9520-f4e456d4ac41 req-c2d7f64f-9a8c-4edb-8b26-9363fce16c78 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] No waiting events found dispatching network-vif-plugged-4c05c2ff-d433-43af-8e32-d6197dee340f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:41:50 np0005593234 nova_compute[227762]: 2026-01-23 09:41:50.631 227766 WARNING nova.compute.manager [req-c4b12e89-9b9d-433a-9520-f4e456d4ac41 req-c2d7f64f-9a8c-4edb-8b26-9363fce16c78 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Received unexpected event network-vif-plugged-4c05c2ff-d433-43af-8e32-d6197dee340f for instance with vm_state active and task_state None.#033[00m
Jan 23 04:41:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:41:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:50.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:41:50 np0005593234 podman[248507]: 2026-01-23 09:41:50.758438839 +0000 UTC m=+0.053033896 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 04:41:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:51.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:51 np0005593234 nova_compute[227762]: 2026-01-23 09:41:51.823 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:51 np0005593234 NetworkManager[48942]: <info>  [1769161311.8234] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Jan 23 04:41:51 np0005593234 NetworkManager[48942]: <info>  [1769161311.8247] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Jan 23 04:41:52 np0005593234 nova_compute[227762]: 2026-01-23 09:41:52.077 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:52 np0005593234 ovn_controller[134547]: 2026-01-23T09:41:52Z|00118|binding|INFO|Releasing lport 7fce7ad3-7f13-4075-b117-8fe35169a722 from this chassis (sb_readonly=0)
Jan 23 04:41:52 np0005593234 nova_compute[227762]: 2026-01-23 09:41:52.103 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:41:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:52.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:41:52 np0005593234 nova_compute[227762]: 2026-01-23 09:41:52.803 227766 DEBUG nova.compute.manager [req-dd37fc24-1b99-4252-a90d-e17ecd31bf8a req-2fb7bc66-b758-4e5b-ba4c-5f17ba96a9ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Received event network-changed-4c05c2ff-d433-43af-8e32-d6197dee340f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:41:52 np0005593234 nova_compute[227762]: 2026-01-23 09:41:52.803 227766 DEBUG nova.compute.manager [req-dd37fc24-1b99-4252-a90d-e17ecd31bf8a req-2fb7bc66-b758-4e5b-ba4c-5f17ba96a9ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Refreshing instance network info cache due to event network-changed-4c05c2ff-d433-43af-8e32-d6197dee340f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:41:52 np0005593234 nova_compute[227762]: 2026-01-23 09:41:52.803 227766 DEBUG oslo_concurrency.lockutils [req-dd37fc24-1b99-4252-a90d-e17ecd31bf8a req-2fb7bc66-b758-4e5b-ba4c-5f17ba96a9ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a8411989-0134-41c7-85e7-36173b393043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:41:52 np0005593234 nova_compute[227762]: 2026-01-23 09:41:52.804 227766 DEBUG oslo_concurrency.lockutils [req-dd37fc24-1b99-4252-a90d-e17ecd31bf8a req-2fb7bc66-b758-4e5b-ba4c-5f17ba96a9ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a8411989-0134-41c7-85e7-36173b393043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:41:52 np0005593234 nova_compute[227762]: 2026-01-23 09:41:52.804 227766 DEBUG nova.network.neutron [req-dd37fc24-1b99-4252-a90d-e17ecd31bf8a req-2fb7bc66-b758-4e5b-ba4c-5f17ba96a9ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Refreshing network info cache for port 4c05c2ff-d433-43af-8e32-d6197dee340f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:41:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:53.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:53 np0005593234 nova_compute[227762]: 2026-01-23 09:41:53.644 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:41:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.0 total, 600.0 interval#012Cumulative writes: 6629 writes, 34K keys, 6629 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.03 MB/s#012Cumulative WAL: 6628 writes, 6628 syncs, 1.00 writes per sync, written: 0.07 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1643 writes, 8220 keys, 1643 commit groups, 1.0 writes per commit group, ingest: 16.45 MB, 0.03 MB/s#012Interval WAL: 1642 writes, 1642 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     57.2      0.73              0.12        18    0.040       0      0       0.0       0.0#012  L6      1/0    9.90 MB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   3.6     97.7     80.4      1.84              0.64        17    0.108     85K   9976       0.0       0.0#012 Sum      1/0    9.90 MB   0.0      0.2     0.0      0.1       0.2      0.1       0.0   4.6     70.0     73.8      2.57              0.75        35    0.073     85K   9976       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   4.8    126.9    129.5      0.39              0.11         8    0.048     24K   3114       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   0.0     97.7     80.4      1.84              0.64        17    0.108     85K   9976       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     57.5      0.72              0.12        17    0.043       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 2400.0 total, 600.0 interval#012Flush(GB): cumulative 0.041, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.19 GB write, 0.08 MB/s write, 0.18 GB read, 0.07 MB/s read, 2.6 seconds#012Interval compaction: 0.05 GB write, 0.08 MB/s write, 0.05 GB read, 0.08 MB/s read, 0.4 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f10fc171f0#2 capacity: 304.00 MB usage: 19.80 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000196 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1147,19.12 MB,6.28788%) FilterBlock(35,248.48 KB,0.0798225%) IndexBlock(35,456.06 KB,0.146504%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 04:41:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:41:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:54.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:41:54 np0005593234 nova_compute[227762]: 2026-01-23 09:41:54.815 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:41:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:55.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:41:55 np0005593234 nova_compute[227762]: 2026-01-23 09:41:55.204 227766 DEBUG nova.network.neutron [req-dd37fc24-1b99-4252-a90d-e17ecd31bf8a req-2fb7bc66-b758-4e5b-ba4c-5f17ba96a9ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Updated VIF entry in instance network info cache for port 4c05c2ff-d433-43af-8e32-d6197dee340f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:41:55 np0005593234 nova_compute[227762]: 2026-01-23 09:41:55.205 227766 DEBUG nova.network.neutron [req-dd37fc24-1b99-4252-a90d-e17ecd31bf8a req-2fb7bc66-b758-4e5b-ba4c-5f17ba96a9ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Updating instance_info_cache with network_info: [{"id": "4c05c2ff-d433-43af-8e32-d6197dee340f", "address": "fa:16:3e:b0:f2:72", "network": {"id": "727f22a6-57c8-4d50-9d8c-b9e831c4902b", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1510828903-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bb32481db2547b49bc4f3a10883baef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c05c2ff-d4", "ovs_interfaceid": "4c05c2ff-d433-43af-8e32-d6197dee340f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:41:55 np0005593234 nova_compute[227762]: 2026-01-23 09:41:55.513 227766 DEBUG oslo_concurrency.lockutils [req-dd37fc24-1b99-4252-a90d-e17ecd31bf8a req-2fb7bc66-b758-4e5b-ba4c-5f17ba96a9ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a8411989-0134-41c7-85e7-36173b393043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:41:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:56.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:57.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:58 np0005593234 nova_compute[227762]: 2026-01-23 09:41:58.646 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:41:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:41:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:41:58.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:41:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:41:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:41:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:41:59.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:41:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e181 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:41:59 np0005593234 nova_compute[227762]: 2026-01-23 09:41:59.818 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:00 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #46. Immutable memtables: 3.
Jan 23 04:42:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:00.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:01.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:42:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:02.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:42:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 04:42:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:03.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 04:42:03 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:03Z|00119|binding|INFO|Releasing lport 7fce7ad3-7f13-4075-b117-8fe35169a722 from this chassis (sb_readonly=0)
Jan 23 04:42:03 np0005593234 nova_compute[227762]: 2026-01-23 09:42:03.290 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:03 np0005593234 nova_compute[227762]: 2026-01-23 09:42:03.648 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:03 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:03Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b0:f2:72 10.100.0.11
Jan 23 04:42:03 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:03Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b0:f2:72 10.100.0.11
Jan 23 04:42:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e182 e182: 3 total, 3 up, 3 in
Jan 23 04:42:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e182 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:04.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:04 np0005593234 nova_compute[227762]: 2026-01-23 09:42:04.820 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:05.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e183 e183: 3 total, 3 up, 3 in
Jan 23 04:42:05 np0005593234 podman[248586]: 2026-01-23 09:42:05.839103628 +0000 UTC m=+0.132863943 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 23 04:42:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e184 e184: 3 total, 3 up, 3 in
Jan 23 04:42:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:42:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:06.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:42:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:42:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:07.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:42:07 np0005593234 nova_compute[227762]: 2026-01-23 09:42:07.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:08 np0005593234 nova_compute[227762]: 2026-01-23 09:42:08.651 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:42:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:08.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:42:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:42:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:09.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:42:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e184 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:09 np0005593234 nova_compute[227762]: 2026-01-23 09:42:09.823 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:10.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:42:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:11.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:42:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e185 e185: 3 total, 3 up, 3 in
Jan 23 04:42:12 np0005593234 nova_compute[227762]: 2026-01-23 09:42:12.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:42:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:12.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:42:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:42:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:13.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:42:13 np0005593234 nova_compute[227762]: 2026-01-23 09:42:13.653 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:13 np0005593234 nova_compute[227762]: 2026-01-23 09:42:13.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:13 np0005593234 nova_compute[227762]: 2026-01-23 09:42:13.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:13 np0005593234 nova_compute[227762]: 2026-01-23 09:42:13.776 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:13 np0005593234 nova_compute[227762]: 2026-01-23 09:42:13.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:13 np0005593234 nova_compute[227762]: 2026-01-23 09:42:13.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:13 np0005593234 nova_compute[227762]: 2026-01-23 09:42:13.777 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:42:13 np0005593234 nova_compute[227762]: 2026-01-23 09:42:13.777 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e186 e186: 3 total, 3 up, 3 in
Jan 23 04:42:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:42:14 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1708712138' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:42:14 np0005593234 nova_compute[227762]: 2026-01-23 09:42:14.391 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.614s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:14 np0005593234 nova_compute[227762]: 2026-01-23 09:42:14.516 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000002c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:42:14 np0005593234 nova_compute[227762]: 2026-01-23 09:42:14.517 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000002c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:42:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:14 np0005593234 nova_compute[227762]: 2026-01-23 09:42:14.676 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:42:14 np0005593234 nova_compute[227762]: 2026-01-23 09:42:14.678 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4576MB free_disk=20.876258850097656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:42:14 np0005593234 nova_compute[227762]: 2026-01-23 09:42:14.679 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:14 np0005593234 nova_compute[227762]: 2026-01-23 09:42:14.679 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 23 04:42:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:14.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:14 np0005593234 nova_compute[227762]: 2026-01-23 09:42:14.779 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance a8411989-0134-41c7-85e7-36173b393043 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:42:14 np0005593234 nova_compute[227762]: 2026-01-23 09:42:14.780 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:42:14 np0005593234 nova_compute[227762]: 2026-01-23 09:42:14.780 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:42:14 np0005593234 nova_compute[227762]: 2026-01-23 09:42:14.825 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:14 np0005593234 nova_compute[227762]: 2026-01-23 09:42:14.849 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:42:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:15.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:42:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:42:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2152416502' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:42:15 np0005593234 nova_compute[227762]: 2026-01-23 09:42:15.284 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:15 np0005593234 nova_compute[227762]: 2026-01-23 09:42:15.288 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:42:15 np0005593234 nova_compute[227762]: 2026-01-23 09:42:15.324 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:42:15 np0005593234 nova_compute[227762]: 2026-01-23 09:42:15.360 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:42:15 np0005593234 nova_compute[227762]: 2026-01-23 09:42:15.361 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:16 np0005593234 nova_compute[227762]: 2026-01-23 09:42:16.039 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:16 np0005593234 nova_compute[227762]: 2026-01-23 09:42:16.361 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:16 np0005593234 nova_compute[227762]: 2026-01-23 09:42:16.362 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:42:16 np0005593234 nova_compute[227762]: 2026-01-23 09:42:16.362 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:42:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:42:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:16.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:42:16 np0005593234 nova_compute[227762]: 2026-01-23 09:42:16.890 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-a8411989-0134-41c7-85e7-36173b393043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:42:16 np0005593234 nova_compute[227762]: 2026-01-23 09:42:16.891 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-a8411989-0134-41c7-85e7-36173b393043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:42:16 np0005593234 nova_compute[227762]: 2026-01-23 09:42:16.891 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a8411989-0134-41c7-85e7-36173b393043] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:42:16 np0005593234 nova_compute[227762]: 2026-01-23 09:42:16.891 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a8411989-0134-41c7-85e7-36173b393043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:42:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:17.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:18 np0005593234 nova_compute[227762]: 2026-01-23 09:42:18.654 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:18 np0005593234 nova_compute[227762]: 2026-01-23 09:42:18.745 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a8411989-0134-41c7-85e7-36173b393043] Updating instance_info_cache with network_info: [{"id": "4c05c2ff-d433-43af-8e32-d6197dee340f", "address": "fa:16:3e:b0:f2:72", "network": {"id": "727f22a6-57c8-4d50-9d8c-b9e831c4902b", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1510828903-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bb32481db2547b49bc4f3a10883baef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c05c2ff-d4", "ovs_interfaceid": "4c05c2ff-d433-43af-8e32-d6197dee340f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:42:18 np0005593234 nova_compute[227762]: 2026-01-23 09:42:18.767 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-a8411989-0134-41c7-85e7-36173b393043" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:42:18 np0005593234 nova_compute[227762]: 2026-01-23 09:42:18.768 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a8411989-0134-41c7-85e7-36173b393043] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:42:18 np0005593234 nova_compute[227762]: 2026-01-23 09:42:18.768 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:18 np0005593234 nova_compute[227762]: 2026-01-23 09:42:18.769 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:18 np0005593234 nova_compute[227762]: 2026-01-23 09:42:18.769 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:42:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:42:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:18.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:42:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:19.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.109769) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161339109875, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 1627, "num_deletes": 253, "total_data_size": 3551516, "memory_usage": 3595720, "flush_reason": "Manual Compaction"}
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161339201243, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 1478971, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33340, "largest_seqno": 34962, "table_properties": {"data_size": 1473559, "index_size": 2616, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 14502, "raw_average_key_size": 21, "raw_value_size": 1461617, "raw_average_value_size": 2146, "num_data_blocks": 116, "num_entries": 681, "num_filter_entries": 681, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161217, "oldest_key_time": 1769161217, "file_creation_time": 1769161339, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 91530 microseconds, and 4285 cpu microseconds.
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.201328) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 1478971 bytes OK
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.201353) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.226768) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.226818) EVENT_LOG_v1 {"time_micros": 1769161339226808, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.226842) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 3543918, prev total WAL file size 3543918, number of live WAL files 2.
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.227965) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303035' seq:72057594037927935, type:22 .. '6D6772737461740031323536' seq:0, type:0; will stop at (end)
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(1444KB)], [63(10138KB)]
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161339228067, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 11860539, "oldest_snapshot_seqno": -1}
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 5790 keys, 8860188 bytes, temperature: kUnknown
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161339304251, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 8860188, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8822456, "index_size": 22134, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14533, "raw_key_size": 148125, "raw_average_key_size": 25, "raw_value_size": 8719377, "raw_average_value_size": 1505, "num_data_blocks": 894, "num_entries": 5790, "num_filter_entries": 5790, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769161339, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.304531) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 8860188 bytes
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.306334) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 155.5 rd, 116.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 9.9 +0.0 blob) out(8.4 +0.0 blob), read-write-amplify(14.0) write-amplify(6.0) OK, records in: 6257, records dropped: 467 output_compression: NoCompression
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.306351) EVENT_LOG_v1 {"time_micros": 1769161339306343, "job": 38, "event": "compaction_finished", "compaction_time_micros": 76298, "compaction_time_cpu_micros": 19720, "output_level": 6, "num_output_files": 1, "total_output_size": 8860188, "num_input_records": 6257, "num_output_records": 5790, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161339306661, "job": 38, "event": "table_file_deletion", "file_number": 65}
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161339308471, "job": 38, "event": "table_file_deletion", "file_number": 63}
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.227907) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.308521) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.308525) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.308527) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.308528) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:42:19.308530) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:42:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:19 np0005593234 nova_compute[227762]: 2026-01-23 09:42:19.827 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:20.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:21.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:21 np0005593234 nova_compute[227762]: 2026-01-23 09:42:21.146 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:21 np0005593234 nova_compute[227762]: 2026-01-23 09:42:21.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:42:21 np0005593234 podman[248715]: 2026-01-23 09:42:21.762865372 +0000 UTC m=+0.057203274 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:42:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:42:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:22.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:42:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:23.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:23 np0005593234 nova_compute[227762]: 2026-01-23 09:42:23.656 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:23 np0005593234 nova_compute[227762]: 2026-01-23 09:42:23.759 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e187 e187: 3 total, 3 up, 3 in
Jan 23 04:42:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:24.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:24 np0005593234 nova_compute[227762]: 2026-01-23 09:42:24.829 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:25.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:26 np0005593234 nova_compute[227762]: 2026-01-23 09:42:26.756 227766 DEBUG oslo_concurrency.lockutils [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Acquiring lock "a8411989-0134-41c7-85e7-36173b393043" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:26 np0005593234 nova_compute[227762]: 2026-01-23 09:42:26.757 227766 DEBUG oslo_concurrency.lockutils [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:26 np0005593234 nova_compute[227762]: 2026-01-23 09:42:26.776 227766 DEBUG nova.objects.instance [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lazy-loading 'flavor' on Instance uuid a8411989-0134-41c7-85e7-36173b393043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:42:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:26.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:26 np0005593234 nova_compute[227762]: 2026-01-23 09:42:26.830 227766 DEBUG oslo_concurrency.lockutils [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.073s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:42:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:27.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.212 227766 DEBUG oslo_concurrency.lockutils [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Acquiring lock "a8411989-0134-41c7-85e7-36173b393043" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.213 227766 DEBUG oslo_concurrency.lockutils [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.213 227766 INFO nova.compute.manager [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Attaching volume 5bcdefce-ec76-4a6c-b618-d67828743aff to /dev/sdc#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.377 227766 DEBUG os_brick.utils [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.379 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.390 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.390 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[846da13f-e705-4236-aaf4-952ec492eae0]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.391 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.399 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.399 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[30b4f387-76b4-4dd2-b866-4195e633c76e]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.401 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.409 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.409 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[d9c8a951-f8a3-44a0-8274-6d09a2eabf74]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.411 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[089c2da0-44db-4143-a135-e01e11787e51]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.411 227766 DEBUG oslo_concurrency.processutils [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.434 227766 DEBUG oslo_concurrency.processutils [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] CMD "nvme version" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.436 227766 DEBUG os_brick.initiator.connectors.lightos [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.437 227766 DEBUG os_brick.initiator.connectors.lightos [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.437 227766 DEBUG os_brick.initiator.connectors.lightos [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.438 227766 DEBUG os_brick.utils [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] <== get_connector_properties: return (59ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 04:42:27 np0005593234 nova_compute[227762]: 2026-01-23 09:42:27.438 227766 DEBUG nova.virt.block_device [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Updating existing volume attachment record: 65001924-ce89-4e0f-8d35-4ab6801db4ec _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 04:42:28 np0005593234 nova_compute[227762]: 2026-01-23 09:42:28.504 227766 DEBUG nova.objects.instance [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lazy-loading 'flavor' on Instance uuid a8411989-0134-41c7-85e7-36173b393043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:42:28 np0005593234 nova_compute[227762]: 2026-01-23 09:42:28.530 227766 DEBUG nova.virt.libvirt.guest [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 04:42:28 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:42:28 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-5bcdefce-ec76-4a6c-b618-d67828743aff">
Jan 23 04:42:28 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 04:42:28 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 04:42:28 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 04:42:28 np0005593234 nova_compute[227762]:  </source>
Jan 23 04:42:28 np0005593234 nova_compute[227762]:  <auth username="openstack">
Jan 23 04:42:28 np0005593234 nova_compute[227762]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:42:28 np0005593234 nova_compute[227762]:  </auth>
Jan 23 04:42:28 np0005593234 nova_compute[227762]:  <target dev="sdc" bus="scsi"/>
Jan 23 04:42:28 np0005593234 nova_compute[227762]:  <serial>5bcdefce-ec76-4a6c-b618-d67828743aff</serial>
Jan 23 04:42:28 np0005593234 nova_compute[227762]:  <address type="drive" controller="0" unit="2"/>
Jan 23 04:42:28 np0005593234 nova_compute[227762]: </disk>
Jan 23 04:42:28 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 04:42:28 np0005593234 nova_compute[227762]: 2026-01-23 09:42:28.658 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:28.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:28 np0005593234 nova_compute[227762]: 2026-01-23 09:42:28.803 227766 DEBUG nova.virt.libvirt.driver [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:42:28 np0005593234 nova_compute[227762]: 2026-01-23 09:42:28.804 227766 DEBUG nova.virt.libvirt.driver [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:42:28 np0005593234 nova_compute[227762]: 2026-01-23 09:42:28.804 227766 DEBUG nova.virt.libvirt.driver [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] No BDM found with device name sdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:42:28 np0005593234 nova_compute[227762]: 2026-01-23 09:42:28.804 227766 DEBUG nova.virt.libvirt.driver [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] No VIF found with MAC fa:16:3e:b0:f2:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:42:29 np0005593234 nova_compute[227762]: 2026-01-23 09:42:29.031 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Acquiring lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:29 np0005593234 nova_compute[227762]: 2026-01-23 09:42:29.031 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:29 np0005593234 nova_compute[227762]: 2026-01-23 09:42:29.055 227766 DEBUG nova.compute.manager [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:42:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:42:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:29.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:42:29 np0005593234 nova_compute[227762]: 2026-01-23 09:42:29.120 227766 DEBUG oslo_concurrency.lockutils [None req-4f3a4d5b-43cb-457a-85c2-b8f24711aa2a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.907s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:29 np0005593234 nova_compute[227762]: 2026-01-23 09:42:29.154 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:29 np0005593234 nova_compute[227762]: 2026-01-23 09:42:29.155 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:29 np0005593234 nova_compute[227762]: 2026-01-23 09:42:29.161 227766 DEBUG nova.virt.hardware [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:42:29 np0005593234 nova_compute[227762]: 2026-01-23 09:42:29.161 227766 INFO nova.compute.claims [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:42:29 np0005593234 nova_compute[227762]: 2026-01-23 09:42:29.303 227766 DEBUG oslo_concurrency.processutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:42:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 2400.5 total, 600.0 interval#012Cumulative writes: 20K writes, 86K keys, 20K commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.04 MB/s#012Cumulative WAL: 20K writes, 6628 syncs, 3.12 writes per sync, written: 0.08 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 10K writes, 45K keys, 10K commit groups, 1.0 writes per commit group, ingest: 47.10 MB, 0.08 MB/s#012Interval WAL: 10K writes, 4145 syncs, 2.60 writes per sync, written: 0.05 GB, 0.08 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 04:42:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:29 np0005593234 nova_compute[227762]: 2026-01-23 09:42:29.831 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:42:29 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1442574037' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:42:29 np0005593234 nova_compute[227762]: 2026-01-23 09:42:29.959 227766 DEBUG oslo_concurrency.processutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.656s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:29 np0005593234 nova_compute[227762]: 2026-01-23 09:42:29.964 227766 DEBUG nova.compute.provider_tree [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:42:29 np0005593234 nova_compute[227762]: 2026-01-23 09:42:29.982 227766 DEBUG nova.scheduler.client.report [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.015 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.016 227766 DEBUG nova.compute.manager [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.096 227766 DEBUG nova.compute.manager [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.096 227766 DEBUG nova.network.neutron [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.125 227766 INFO nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.187 227766 DEBUG nova.compute.manager [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.292 227766 DEBUG nova.compute.manager [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.293 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.294 227766 INFO nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Creating image(s)#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.322 227766 DEBUG nova.storage.rbd_utils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] rbd image 4ef48fbd-b990-487c-94a4-0149ee9204c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.350 227766 DEBUG nova.storage.rbd_utils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] rbd image 4ef48fbd-b990-487c-94a4-0149ee9204c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.375 227766 DEBUG nova.storage.rbd_utils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] rbd image 4ef48fbd-b990-487c-94a4-0149ee9204c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.379 227766 DEBUG oslo_concurrency.processutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.403 227766 DEBUG nova.policy [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '187ce0cedde344a3b09ca4560410580e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4fd9229340ed4bf3a3a72baa6985a3e3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.441 227766 DEBUG oslo_concurrency.processutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.442 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.442 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.442 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.467 227766 DEBUG nova.storage.rbd_utils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] rbd image 4ef48fbd-b990-487c-94a4-0149ee9204c9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.471 227766 DEBUG oslo_concurrency.processutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 4ef48fbd-b990-487c-94a4-0149ee9204c9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:30.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.882 227766 DEBUG oslo_concurrency.lockutils [None req-a545939a-5c2e-49fa-babe-590469b6b33a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Acquiring lock "a8411989-0134-41c7-85e7-36173b393043" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.882 227766 DEBUG oslo_concurrency.lockutils [None req-a545939a-5c2e-49fa-babe-590469b6b33a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:30 np0005593234 nova_compute[227762]: 2026-01-23 09:42:30.898 227766 INFO nova.compute.manager [None req-a545939a-5c2e-49fa-babe-590469b6b33a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Detaching volume 5bcdefce-ec76-4a6c-b618-d67828743aff#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.059 227766 DEBUG oslo_concurrency.processutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 4ef48fbd-b990-487c-94a4-0149ee9204c9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.588s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:31.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.562 227766 INFO nova.virt.block_device [None req-a545939a-5c2e-49fa-babe-590469b6b33a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Attempting to driver detach volume 5bcdefce-ec76-4a6c-b618-d67828743aff from mountpoint /dev/sdc#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.564 227766 DEBUG nova.network.neutron [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Successfully created port: bd040948-a661-431e-8f76-623ac2452642 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.615 227766 DEBUG nova.storage.rbd_utils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] resizing rbd image 4ef48fbd-b990-487c-94a4-0149ee9204c9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.701 227766 DEBUG nova.virt.libvirt.driver [None req-a545939a-5c2e-49fa-babe-590469b6b33a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Attempting to detach device sdc from instance a8411989-0134-41c7-85e7-36173b393043 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.701 227766 DEBUG nova.virt.libvirt.guest [None req-a545939a-5c2e-49fa-babe-590469b6b33a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 04:42:31 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:42:31 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-5bcdefce-ec76-4a6c-b618-d67828743aff">
Jan 23 04:42:31 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 04:42:31 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 04:42:31 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 04:42:31 np0005593234 nova_compute[227762]:  </source>
Jan 23 04:42:31 np0005593234 nova_compute[227762]:  <target dev="sdc" bus="scsi"/>
Jan 23 04:42:31 np0005593234 nova_compute[227762]:  <serial>5bcdefce-ec76-4a6c-b618-d67828743aff</serial>
Jan 23 04:42:31 np0005593234 nova_compute[227762]:  <address type="drive" controller="0" bus="0" target="0" unit="2"/>
Jan 23 04:42:31 np0005593234 nova_compute[227762]: </disk>
Jan 23 04:42:31 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.823 227766 INFO nova.virt.libvirt.driver [None req-a545939a-5c2e-49fa-babe-590469b6b33a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Successfully detached device sdc from instance a8411989-0134-41c7-85e7-36173b393043 from the persistent domain config.#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.824 227766 DEBUG nova.virt.libvirt.driver [None req-a545939a-5c2e-49fa-babe-590469b6b33a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] (1/8): Attempting to detach device sdc with device alias scsi0-0-0-2 from instance a8411989-0134-41c7-85e7-36173b393043 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.825 227766 DEBUG nova.virt.libvirt.guest [None req-a545939a-5c2e-49fa-babe-590469b6b33a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 04:42:31 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:42:31 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-5bcdefce-ec76-4a6c-b618-d67828743aff">
Jan 23 04:42:31 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 04:42:31 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 04:42:31 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 04:42:31 np0005593234 nova_compute[227762]:  </source>
Jan 23 04:42:31 np0005593234 nova_compute[227762]:  <target dev="sdc" bus="scsi"/>
Jan 23 04:42:31 np0005593234 nova_compute[227762]:  <serial>5bcdefce-ec76-4a6c-b618-d67828743aff</serial>
Jan 23 04:42:31 np0005593234 nova_compute[227762]:  <address type="drive" controller="0" bus="0" target="0" unit="2"/>
Jan 23 04:42:31 np0005593234 nova_compute[227762]: </disk>
Jan 23 04:42:31 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.894 227766 DEBUG nova.objects.instance [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lazy-loading 'migration_context' on Instance uuid 4ef48fbd-b990-487c-94a4-0149ee9204c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.904 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769161351.9044566, a8411989-0134-41c7-85e7-36173b393043 => scsi0-0-0-2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.905 227766 DEBUG nova.virt.libvirt.driver [None req-a545939a-5c2e-49fa-babe-590469b6b33a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Start waiting for the detach event from libvirt for device sdc with device alias scsi0-0-0-2 for instance a8411989-0134-41c7-85e7-36173b393043 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.907 227766 INFO nova.virt.libvirt.driver [None req-a545939a-5c2e-49fa-babe-590469b6b33a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Successfully detached device sdc from instance a8411989-0134-41c7-85e7-36173b393043 from the live domain config.#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.922 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.922 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Ensure instance console log exists: /var/lib/nova/instances/4ef48fbd-b990-487c-94a4-0149ee9204c9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.923 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.923 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:31 np0005593234 nova_compute[227762]: 2026-01-23 09:42:31.924 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:42:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:42:32 np0005593234 nova_compute[227762]: 2026-01-23 09:42:32.274 227766 DEBUG nova.objects.instance [None req-a545939a-5c2e-49fa-babe-590469b6b33a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lazy-loading 'flavor' on Instance uuid a8411989-0134-41c7-85e7-36173b393043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:42:32 np0005593234 nova_compute[227762]: 2026-01-23 09:42:32.336 227766 DEBUG oslo_concurrency.lockutils [None req-a545939a-5c2e-49fa-babe-590469b6b33a 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.454s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:32 np0005593234 nova_compute[227762]: 2026-01-23 09:42:32.510 227766 DEBUG nova.network.neutron [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Successfully updated port: bd040948-a661-431e-8f76-623ac2452642 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:42:32 np0005593234 nova_compute[227762]: 2026-01-23 09:42:32.527 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Acquiring lock "refresh_cache-4ef48fbd-b990-487c-94a4-0149ee9204c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:42:32 np0005593234 nova_compute[227762]: 2026-01-23 09:42:32.528 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Acquired lock "refresh_cache-4ef48fbd-b990-487c-94a4-0149ee9204c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:42:32 np0005593234 nova_compute[227762]: 2026-01-23 09:42:32.528 227766 DEBUG nova.network.neutron [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:42:32 np0005593234 nova_compute[227762]: 2026-01-23 09:42:32.654 227766 DEBUG nova.compute.manager [req-7fcdf584-fd17-4170-83c6-afd94e59402d req-358863ac-6b87-4d21-b832-e8bbc662f020 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Received event network-changed-bd040948-a661-431e-8f76-623ac2452642 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:42:32 np0005593234 nova_compute[227762]: 2026-01-23 09:42:32.655 227766 DEBUG nova.compute.manager [req-7fcdf584-fd17-4170-83c6-afd94e59402d req-358863ac-6b87-4d21-b832-e8bbc662f020 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Refreshing instance network info cache due to event network-changed-bd040948-a661-431e-8f76-623ac2452642. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:42:32 np0005593234 nova_compute[227762]: 2026-01-23 09:42:32.655 227766 DEBUG oslo_concurrency.lockutils [req-7fcdf584-fd17-4170-83c6-afd94e59402d req-358863ac-6b87-4d21-b832-e8bbc662f020 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-4ef48fbd-b990-487c-94a4-0149ee9204c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:42:32 np0005593234 nova_compute[227762]: 2026-01-23 09:42:32.694 227766 DEBUG nova.network.neutron [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:42:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:42:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:32.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:42:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:42:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:42:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:42:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:42:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:33.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:42:33 np0005593234 nova_compute[227762]: 2026-01-23 09:42:33.660 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.348 227766 DEBUG oslo_concurrency.lockutils [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Acquiring lock "a8411989-0134-41c7-85e7-36173b393043" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.349 227766 DEBUG oslo_concurrency.lockutils [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.349 227766 DEBUG oslo_concurrency.lockutils [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Acquiring lock "a8411989-0134-41c7-85e7-36173b393043-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.349 227766 DEBUG oslo_concurrency.lockutils [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.349 227766 DEBUG oslo_concurrency.lockutils [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.351 227766 INFO nova.compute.manager [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Terminating instance#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.351 227766 DEBUG nova.compute.manager [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:42:34 np0005593234 kernel: tap4c05c2ff-d4 (unregistering): left promiscuous mode
Jan 23 04:42:34 np0005593234 NetworkManager[48942]: <info>  [1769161354.4285] device (tap4c05c2ff-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:42:34 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:34Z|00120|binding|INFO|Releasing lport 4c05c2ff-d433-43af-8e32-d6197dee340f from this chassis (sb_readonly=0)
Jan 23 04:42:34 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:34Z|00121|binding|INFO|Setting lport 4c05c2ff-d433-43af-8e32-d6197dee340f down in Southbound
Jan 23 04:42:34 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:34Z|00122|binding|INFO|Removing iface tap4c05c2ff-d4 ovn-installed in OVS
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.438 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.440 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:34.448 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:f2:72 10.100.0.11'], port_security=['fa:16:3e:b0:f2:72 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a8411989-0134-41c7-85e7-36173b393043', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-727f22a6-57c8-4d50-9d8c-b9e831c4902b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7bb32481db2547b49bc4f3a10883baef', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a146fc4c-89c7-4383-805c-0fc50ff3724e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.193'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f333c0b6-b6d9-4b7d-8543-88d3b087f640, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=4c05c2ff-d433-43af-8e32-d6197dee340f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:42:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:34.450 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 4c05c2ff-d433-43af-8e32-d6197dee340f in datapath 727f22a6-57c8-4d50-9d8c-b9e831c4902b unbound from our chassis#033[00m
Jan 23 04:42:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:34.452 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 727f22a6-57c8-4d50-9d8c-b9e831c4902b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:42:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:34.454 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[35d1fe7d-161a-4ad0-a75f-b15d9760c6f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:34.454 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b namespace which is not needed anymore#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.460 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:34 np0005593234 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002c.scope: Deactivated successfully.
Jan 23 04:42:34 np0005593234 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002c.scope: Consumed 14.833s CPU time.
Jan 23 04:42:34 np0005593234 systemd-machined[195626]: Machine qemu-20-instance-0000002c terminated.
Jan 23 04:42:34 np0005593234 neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b[248491]: [NOTICE]   (248495) : haproxy version is 2.8.14-c23fe91
Jan 23 04:42:34 np0005593234 neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b[248491]: [NOTICE]   (248495) : path to executable is /usr/sbin/haproxy
Jan 23 04:42:34 np0005593234 neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b[248491]: [WARNING]  (248495) : Exiting Master process...
Jan 23 04:42:34 np0005593234 neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b[248491]: [ALERT]    (248495) : Current worker (248497) exited with code 143 (Terminated)
Jan 23 04:42:34 np0005593234 neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b[248491]: [WARNING]  (248495) : All workers exited. Exiting... (0)
Jan 23 04:42:34 np0005593234 systemd[1]: libpod-ddfdc040626fa4478b965f63f2dc82f1a679c9105faea322f30c7621d2d28254.scope: Deactivated successfully.
Jan 23 04:42:34 np0005593234 kernel: tap4c05c2ff-d4: entered promiscuous mode
Jan 23 04:42:34 np0005593234 kernel: tap4c05c2ff-d4 (unregistering): left promiscuous mode
Jan 23 04:42:34 np0005593234 NetworkManager[48942]: <info>  [1769161354.5722] manager: (tap4c05c2ff-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/70)
Jan 23 04:42:34 np0005593234 podman[249165]: 2026-01-23 09:42:34.574195776 +0000 UTC m=+0.040277466 container died ddfdc040626fa4478b965f63f2dc82f1a679c9105faea322f30c7621d2d28254 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.578 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.588 227766 INFO nova.virt.libvirt.driver [-] [instance: a8411989-0134-41c7-85e7-36173b393043] Instance destroyed successfully.#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.589 227766 DEBUG nova.objects.instance [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lazy-loading 'resources' on Instance uuid a8411989-0134-41c7-85e7-36173b393043 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:42:34 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ddfdc040626fa4478b965f63f2dc82f1a679c9105faea322f30c7621d2d28254-userdata-shm.mount: Deactivated successfully.
Jan 23 04:42:34 np0005593234 systemd[1]: var-lib-containers-storage-overlay-5b46fa6c7ad0fbe305a70c958364971884c42f26e8f5626fea64cb4df2419959-merged.mount: Deactivated successfully.
Jan 23 04:42:34 np0005593234 podman[249165]: 2026-01-23 09:42:34.613263064 +0000 UTC m=+0.079344744 container cleanup ddfdc040626fa4478b965f63f2dc82f1a679c9105faea322f30c7621d2d28254 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.623 227766 DEBUG nova.virt.libvirt.vif [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:41:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachSCSIVolumeTestJSON-server-657713700',display_name='tempest-AttachSCSIVolumeTestJSON-server-657713700',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachscsivolumetestjson-server-657713700',id=44,image_ref='de0f1f21-0106-4885-a7ac-14a7ec714eff',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBALeFNUs+GYVQQqhQHFjDO5lw5M0y1Tt3vRvgYug2ng5cQxIzuhK3ImpNfCbdsCHy2Pzir3e9qqZTtkd5tTXp6+vUgC/FcyOXc6V9AcjGWqLg5yX6ioTlpK31gM0J5JdhA==',key_name='tempest-keypair-392233361',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:41:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7bb32481db2547b49bc4f3a10883baef',ramdisk_id='',reservation_id='r-aq9ryp00',resources=None,root_device_name='/dev/sda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='de0f1f21-0106-4885-a7ac-14a7ec714eff',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='scsi',image_hw_disk_bus='scsi',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_scsi_model='virtio-scsi',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachSCSIVolumeTestJSON-1359766574',owner_user_name='tempest-AttachSCSIVolumeTestJSON-1359766574-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:41:48Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9a4618f86429416889ef239f4b21bacc',uuid=a8411989-0134-41c7-85e7-36173b393043,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4c05c2ff-d433-43af-8e32-d6197dee340f", "address": "fa:16:3e:b0:f2:72", "network": {"id": "727f22a6-57c8-4d50-9d8c-b9e831c4902b", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1510828903-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": 
[], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bb32481db2547b49bc4f3a10883baef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c05c2ff-d4", "ovs_interfaceid": "4c05c2ff-d433-43af-8e32-d6197dee340f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.624 227766 DEBUG nova.network.os_vif_util [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Converting VIF {"id": "4c05c2ff-d433-43af-8e32-d6197dee340f", "address": "fa:16:3e:b0:f2:72", "network": {"id": "727f22a6-57c8-4d50-9d8c-b9e831c4902b", "bridge": "br-int", "label": "tempest-AttachSCSIVolumeTestJSON-1510828903-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7bb32481db2547b49bc4f3a10883baef", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4c05c2ff-d4", "ovs_interfaceid": "4c05c2ff-d433-43af-8e32-d6197dee340f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.625 227766 DEBUG nova.network.os_vif_util [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b0:f2:72,bridge_name='br-int',has_traffic_filtering=True,id=4c05c2ff-d433-43af-8e32-d6197dee340f,network=Network(727f22a6-57c8-4d50-9d8c-b9e831c4902b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c05c2ff-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.625 227766 DEBUG os_vif [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:f2:72,bridge_name='br-int',has_traffic_filtering=True,id=4c05c2ff-d433-43af-8e32-d6197dee340f,network=Network(727f22a6-57c8-4d50-9d8c-b9e831c4902b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c05c2ff-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:42:34 np0005593234 systemd[1]: libpod-conmon-ddfdc040626fa4478b965f63f2dc82f1a679c9105faea322f30c7621d2d28254.scope: Deactivated successfully.
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.630 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.630 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4c05c2ff-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.632 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.634 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.636 227766 INFO os_vif [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b0:f2:72,bridge_name='br-int',has_traffic_filtering=True,id=4c05c2ff-d433-43af-8e32-d6197dee340f,network=Network(727f22a6-57c8-4d50-9d8c-b9e831c4902b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4c05c2ff-d4')#033[00m
Jan 23 04:42:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e187 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:34 np0005593234 podman[249204]: 2026-01-23 09:42:34.682291757 +0000 UTC m=+0.045198981 container remove ddfdc040626fa4478b965f63f2dc82f1a679c9105faea322f30c7621d2d28254 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:42:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:34.688 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[545f8fc8-4876-460d-8e3f-15e1f8eb5c0e]: (4, ('Fri Jan 23 09:42:34 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b (ddfdc040626fa4478b965f63f2dc82f1a679c9105faea322f30c7621d2d28254)\nddfdc040626fa4478b965f63f2dc82f1a679c9105faea322f30c7621d2d28254\nFri Jan 23 09:42:34 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b (ddfdc040626fa4478b965f63f2dc82f1a679c9105faea322f30c7621d2d28254)\nddfdc040626fa4478b965f63f2dc82f1a679c9105faea322f30c7621d2d28254\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:34.690 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[586c744e-23c2-4a0e-b397-f46f970e6dd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:34.691 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap727f22a6-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.693 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:34 np0005593234 kernel: tap727f22a6-50: left promiscuous mode
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.707 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:34.709 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1099d339-b370-43a5-a10c-192cc0c06b28]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:34.725 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[93b30193-d1f8-4bb8-9d26-e1e3d96fd2f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:34.726 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9beea17f-11b7-461d-bb5c-b43396385014]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:34.741 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[899b55cf-e81f-4640-9c0f-6dda6665e37b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 520213, 'reachable_time': 19679, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249238, 'error': None, 'target': 'ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:34.745 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-727f22a6-57c8-4d50-9d8c-b9e831c4902b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:42:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:34.745 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[7daef263-e30f-4598-812a-c84eb20c9aa1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:34 np0005593234 systemd[1]: run-netns-ovnmeta\x2d727f22a6\x2d57c8\x2d4d50\x2d9d8c\x2db9e831c4902b.mount: Deactivated successfully.
Jan 23 04:42:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:34.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.893 227766 DEBUG nova.network.neutron [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Updating instance_info_cache with network_info: [{"id": "bd040948-a661-431e-8f76-623ac2452642", "address": "fa:16:3e:3e:ce:a8", "network": {"id": "f19933f5-cfe3-4319-a83b-b72dde692ab6", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-1169169895-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4fd9229340ed4bf3a3a72baa6985a3e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd040948-a6", "ovs_interfaceid": "bd040948-a661-431e-8f76-623ac2452642", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.936 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Releasing lock "refresh_cache-4ef48fbd-b990-487c-94a4-0149ee9204c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.936 227766 DEBUG nova.compute.manager [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Instance network_info: |[{"id": "bd040948-a661-431e-8f76-623ac2452642", "address": "fa:16:3e:3e:ce:a8", "network": {"id": "f19933f5-cfe3-4319-a83b-b72dde692ab6", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-1169169895-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4fd9229340ed4bf3a3a72baa6985a3e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd040948-a6", "ovs_interfaceid": "bd040948-a661-431e-8f76-623ac2452642", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.936 227766 DEBUG oslo_concurrency.lockutils [req-7fcdf584-fd17-4170-83c6-afd94e59402d req-358863ac-6b87-4d21-b832-e8bbc662f020 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-4ef48fbd-b990-487c-94a4-0149ee9204c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.937 227766 DEBUG nova.network.neutron [req-7fcdf584-fd17-4170-83c6-afd94e59402d req-358863ac-6b87-4d21-b832-e8bbc662f020 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Refreshing network info cache for port bd040948-a661-431e-8f76-623ac2452642 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.939 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Start _get_guest_xml network_info=[{"id": "bd040948-a661-431e-8f76-623ac2452642", "address": "fa:16:3e:3e:ce:a8", "network": {"id": "f19933f5-cfe3-4319-a83b-b72dde692ab6", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-1169169895-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4fd9229340ed4bf3a3a72baa6985a3e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd040948-a6", "ovs_interfaceid": "bd040948-a661-431e-8f76-623ac2452642", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.943 227766 WARNING nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.950 227766 DEBUG nova.virt.libvirt.host [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.951 227766 DEBUG nova.virt.libvirt.host [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.958 227766 DEBUG nova.virt.libvirt.host [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.959 227766 DEBUG nova.virt.libvirt.host [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.960 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.960 227766 DEBUG nova.virt.hardware [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.960 227766 DEBUG nova.virt.hardware [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.961 227766 DEBUG nova.virt.hardware [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.961 227766 DEBUG nova.virt.hardware [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.961 227766 DEBUG nova.virt.hardware [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.961 227766 DEBUG nova.virt.hardware [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.961 227766 DEBUG nova.virt.hardware [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.965 227766 DEBUG nova.virt.hardware [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.966 227766 DEBUG nova.virt.hardware [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.966 227766 DEBUG nova.virt.hardware [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.966 227766 DEBUG nova.virt.hardware [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:42:34 np0005593234 nova_compute[227762]: 2026-01-23 09:42:34.968 227766 DEBUG oslo_concurrency.processutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:35.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:35 np0005593234 nova_compute[227762]: 2026-01-23 09:42:35.109 227766 DEBUG nova.compute.manager [req-3172e3f4-b58c-487c-bb6d-0320a1e8cf60 req-20ef7092-85ae-4c43-b2c0-b61cf3c06bca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Received event network-vif-unplugged-4c05c2ff-d433-43af-8e32-d6197dee340f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:42:35 np0005593234 nova_compute[227762]: 2026-01-23 09:42:35.109 227766 DEBUG oslo_concurrency.lockutils [req-3172e3f4-b58c-487c-bb6d-0320a1e8cf60 req-20ef7092-85ae-4c43-b2c0-b61cf3c06bca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a8411989-0134-41c7-85e7-36173b393043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:35 np0005593234 nova_compute[227762]: 2026-01-23 09:42:35.109 227766 DEBUG oslo_concurrency.lockutils [req-3172e3f4-b58c-487c-bb6d-0320a1e8cf60 req-20ef7092-85ae-4c43-b2c0-b61cf3c06bca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:35 np0005593234 nova_compute[227762]: 2026-01-23 09:42:35.110 227766 DEBUG oslo_concurrency.lockutils [req-3172e3f4-b58c-487c-bb6d-0320a1e8cf60 req-20ef7092-85ae-4c43-b2c0-b61cf3c06bca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:35 np0005593234 nova_compute[227762]: 2026-01-23 09:42:35.110 227766 DEBUG nova.compute.manager [req-3172e3f4-b58c-487c-bb6d-0320a1e8cf60 req-20ef7092-85ae-4c43-b2c0-b61cf3c06bca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] No waiting events found dispatching network-vif-unplugged-4c05c2ff-d433-43af-8e32-d6197dee340f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:42:35 np0005593234 nova_compute[227762]: 2026-01-23 09:42:35.110 227766 DEBUG nova.compute.manager [req-3172e3f4-b58c-487c-bb6d-0320a1e8cf60 req-20ef7092-85ae-4c43-b2c0-b61cf3c06bca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Received event network-vif-unplugged-4c05c2ff-d433-43af-8e32-d6197dee340f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:42:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:42:35 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3381225868' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:42:35 np0005593234 nova_compute[227762]: 2026-01-23 09:42:35.457 227766 DEBUG oslo_concurrency.processutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:35 np0005593234 nova_compute[227762]: 2026-01-23 09:42:35.480 227766 DEBUG nova.storage.rbd_utils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] rbd image 4ef48fbd-b990-487c-94a4-0149ee9204c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:35 np0005593234 nova_compute[227762]: 2026-01-23 09:42:35.484 227766 DEBUG oslo_concurrency.processutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:42:35 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1199371794' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:42:35 np0005593234 nova_compute[227762]: 2026-01-23 09:42:35.922 227766 DEBUG oslo_concurrency.processutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:35 np0005593234 nova_compute[227762]: 2026-01-23 09:42:35.924 227766 DEBUG nova.virt.libvirt.vif [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:42:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-UpdateMultiattachVolumeNegativeTest-server-1063948538',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-updatemultiattachvolumenegativetest-server-1063948538',id=47,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOW/D7eFeqMjBvekfY9VqlM3EY9Lv7j0wpym0wwbZXZxi5xiYHs3Y+SGaRgTVfBABcO7R/jAYgVwXr4x4dmhbR/VewPXJyWaKlJux19vulauSxlm5JZb+T430JhpaEya2w==',key_name='tempest-keypair-1370438234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4fd9229340ed4bf3a3a72baa6985a3e3',ramdisk_id='',reservation_id='r-rg1r02jt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-UpdateMultiattachVolumeNegativeTest-1520463047',owner_user_name='tempest-UpdateMultiattachVolumeNegativeTest-1520463047-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:42:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='187ce0cedde344a3b09ca4560410580e',uuid=4ef48fbd-b990-487c-94a4-0149ee9204c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bd040948-a661-431e-8f76-623ac2452642", "address": "fa:16:3e:3e:ce:a8", "network": {"id": "f19933f5-cfe3-4319-a83b-b72dde692ab6", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-1169169895-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4fd9229340ed4bf3a3a72baa6985a3e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd040948-a6", "ovs_interfaceid": "bd040948-a661-431e-8f76-623ac2452642", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:42:35 np0005593234 nova_compute[227762]: 2026-01-23 09:42:35.924 227766 DEBUG nova.network.os_vif_util [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Converting VIF {"id": "bd040948-a661-431e-8f76-623ac2452642", "address": "fa:16:3e:3e:ce:a8", "network": {"id": "f19933f5-cfe3-4319-a83b-b72dde692ab6", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-1169169895-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4fd9229340ed4bf3a3a72baa6985a3e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd040948-a6", "ovs_interfaceid": "bd040948-a661-431e-8f76-623ac2452642", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:42:35 np0005593234 nova_compute[227762]: 2026-01-23 09:42:35.925 227766 DEBUG nova.network.os_vif_util [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:ce:a8,bridge_name='br-int',has_traffic_filtering=True,id=bd040948-a661-431e-8f76-623ac2452642,network=Network(f19933f5-cfe3-4319-a83b-b72dde692ab6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd040948-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:42:35 np0005593234 nova_compute[227762]: 2026-01-23 09:42:35.927 227766 DEBUG nova.objects.instance [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4ef48fbd-b990-487c-94a4-0149ee9204c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.008 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  <uuid>4ef48fbd-b990-487c-94a4-0149ee9204c9</uuid>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  <name>instance-0000002f</name>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <nova:name>tempest-UpdateMultiattachVolumeNegativeTest-server-1063948538</nova:name>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:42:34</nova:creationTime>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <nova:user uuid="187ce0cedde344a3b09ca4560410580e">tempest-UpdateMultiattachVolumeNegativeTest-1520463047-project-member</nova:user>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <nova:project uuid="4fd9229340ed4bf3a3a72baa6985a3e3">tempest-UpdateMultiattachVolumeNegativeTest-1520463047</nova:project>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <nova:port uuid="bd040948-a661-431e-8f76-623ac2452642">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <entry name="serial">4ef48fbd-b990-487c-94a4-0149ee9204c9</entry>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <entry name="uuid">4ef48fbd-b990-487c-94a4-0149ee9204c9</entry>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/4ef48fbd-b990-487c-94a4-0149ee9204c9_disk">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/4ef48fbd-b990-487c-94a4-0149ee9204c9_disk.config">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:3e:ce:a8"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <target dev="tapbd040948-a6"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/4ef48fbd-b990-487c-94a4-0149ee9204c9/console.log" append="off"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:42:36 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:42:36 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:42:36 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:42:36 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.010 227766 DEBUG nova.compute.manager [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Preparing to wait for external event network-vif-plugged-bd040948-a661-431e-8f76-623ac2452642 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.010 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Acquiring lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.011 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.011 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.012 227766 DEBUG nova.virt.libvirt.vif [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:42:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-UpdateMultiattachVolumeNegativeTest-server-1063948538',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-updatemultiattachvolumenegativetest-server-1063948538',id=47,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOW/D7eFeqMjBvekfY9VqlM3EY9Lv7j0wpym0wwbZXZxi5xiYHs3Y+SGaRgTVfBABcO7R/jAYgVwXr4x4dmhbR/VewPXJyWaKlJux19vulauSxlm5JZb+T430JhpaEya2w==',key_name='tempest-keypair-1370438234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4fd9229340ed4bf3a3a72baa6985a3e3',ramdisk_id='',reservation_id='r-rg1r02jt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-UpdateMultiattachVolumeNegativeTest-1520463047',owner_user_name='tempest-UpdateMultiattachVolumeNegativeTest-1520463047-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:42:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='187ce0cedde344a3b09ca4560410580e',uuid=4ef48fbd-b990-487c-94a4-0149ee9204c9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "bd040948-a661-431e-8f76-623ac2452642", "address": "fa:16:3e:3e:ce:a8", "network": {"id": "f19933f5-cfe3-4319-a83b-b72dde692ab6", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-1169169895-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, 
"meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4fd9229340ed4bf3a3a72baa6985a3e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd040948-a6", "ovs_interfaceid": "bd040948-a661-431e-8f76-623ac2452642", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.012 227766 DEBUG nova.network.os_vif_util [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Converting VIF {"id": "bd040948-a661-431e-8f76-623ac2452642", "address": "fa:16:3e:3e:ce:a8", "network": {"id": "f19933f5-cfe3-4319-a83b-b72dde692ab6", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-1169169895-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4fd9229340ed4bf3a3a72baa6985a3e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd040948-a6", "ovs_interfaceid": "bd040948-a661-431e-8f76-623ac2452642", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.013 227766 DEBUG nova.network.os_vif_util [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3e:ce:a8,bridge_name='br-int',has_traffic_filtering=True,id=bd040948-a661-431e-8f76-623ac2452642,network=Network(f19933f5-cfe3-4319-a83b-b72dde692ab6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd040948-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.014 227766 DEBUG os_vif [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:ce:a8,bridge_name='br-int',has_traffic_filtering=True,id=bd040948-a661-431e-8f76-623ac2452642,network=Network(f19933f5-cfe3-4319-a83b-b72dde692ab6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd040948-a6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.014 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.015 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.015 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.018 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.019 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd040948-a6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.019 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbd040948-a6, col_values=(('external_ids', {'iface-id': 'bd040948-a661-431e-8f76-623ac2452642', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3e:ce:a8', 'vm-uuid': '4ef48fbd-b990-487c-94a4-0149ee9204c9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.021 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:36 np0005593234 NetworkManager[48942]: <info>  [1769161356.0218] manager: (tapbd040948-a6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/71)
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.027 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.029 227766 INFO os_vif [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3e:ce:a8,bridge_name='br-int',has_traffic_filtering=True,id=bd040948-a661-431e-8f76-623ac2452642,network=Network(f19933f5-cfe3-4319-a83b-b72dde692ab6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd040948-a6')#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.131 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.131 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.132 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] No VIF found with MAC fa:16:3e:3e:ce:a8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.132 227766 INFO nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Using config drive#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.161 227766 DEBUG nova.storage.rbd_utils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] rbd image 4ef48fbd-b990-487c-94a4-0149ee9204c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:36 np0005593234 podman[249306]: 2026-01-23 09:42:36.168602377 +0000 UTC m=+0.107761731 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.178 227766 INFO nova.virt.libvirt.driver [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Deleting instance files /var/lib/nova/instances/a8411989-0134-41c7-85e7-36173b393043_del#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.178 227766 INFO nova.virt.libvirt.driver [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Deletion of /var/lib/nova/instances/a8411989-0134-41c7-85e7-36173b393043_del complete#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.245 227766 INFO nova.compute.manager [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Took 1.89 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.246 227766 DEBUG oslo.service.loopingcall [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.246 227766 DEBUG nova.compute.manager [-] [instance: a8411989-0134-41c7-85e7-36173b393043] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.246 227766 DEBUG nova.network.neutron [-] [instance: a8411989-0134-41c7-85e7-36173b393043] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:42:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:36.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.831 227766 INFO nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Creating config drive at /var/lib/nova/instances/4ef48fbd-b990-487c-94a4-0149ee9204c9/disk.config#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.837 227766 DEBUG oslo_concurrency.processutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4ef48fbd-b990-487c-94a4-0149ee9204c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplhm98r8i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.856 227766 DEBUG nova.network.neutron [req-7fcdf584-fd17-4170-83c6-afd94e59402d req-358863ac-6b87-4d21-b832-e8bbc662f020 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Updated VIF entry in instance network info cache for port bd040948-a661-431e-8f76-623ac2452642. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.857 227766 DEBUG nova.network.neutron [req-7fcdf584-fd17-4170-83c6-afd94e59402d req-358863ac-6b87-4d21-b832-e8bbc662f020 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Updating instance_info_cache with network_info: [{"id": "bd040948-a661-431e-8f76-623ac2452642", "address": "fa:16:3e:3e:ce:a8", "network": {"id": "f19933f5-cfe3-4319-a83b-b72dde692ab6", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-1169169895-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4fd9229340ed4bf3a3a72baa6985a3e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd040948-a6", "ovs_interfaceid": "bd040948-a661-431e-8f76-623ac2452642", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:42:36 np0005593234 nova_compute[227762]: 2026-01-23 09:42:36.964 227766 DEBUG oslo_concurrency.processutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4ef48fbd-b990-487c-94a4-0149ee9204c9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplhm98r8i" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:42:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:37.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:42:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e188 e188: 3 total, 3 up, 3 in
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.259 227766 DEBUG nova.storage.rbd_utils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] rbd image 4ef48fbd-b990-487c-94a4-0149ee9204c9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.262 227766 DEBUG oslo_concurrency.processutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4ef48fbd-b990-487c-94a4-0149ee9204c9/disk.config 4ef48fbd-b990-487c-94a4-0149ee9204c9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.287 227766 DEBUG oslo_concurrency.lockutils [req-7fcdf584-fd17-4170-83c6-afd94e59402d req-358863ac-6b87-4d21-b832-e8bbc662f020 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-4ef48fbd-b990-487c-94a4-0149ee9204c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.289 227766 DEBUG nova.compute.manager [req-dacaaba2-086b-4e8a-9917-2f4a5ca68a9a req-9c37cffb-9ec5-48af-b58f-a8881f2fca90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Received event network-vif-plugged-4c05c2ff-d433-43af-8e32-d6197dee340f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.290 227766 DEBUG oslo_concurrency.lockutils [req-dacaaba2-086b-4e8a-9917-2f4a5ca68a9a req-9c37cffb-9ec5-48af-b58f-a8881f2fca90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a8411989-0134-41c7-85e7-36173b393043-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.290 227766 DEBUG oslo_concurrency.lockutils [req-dacaaba2-086b-4e8a-9917-2f4a5ca68a9a req-9c37cffb-9ec5-48af-b58f-a8881f2fca90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.290 227766 DEBUG oslo_concurrency.lockutils [req-dacaaba2-086b-4e8a-9917-2f4a5ca68a9a req-9c37cffb-9ec5-48af-b58f-a8881f2fca90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.290 227766 DEBUG nova.compute.manager [req-dacaaba2-086b-4e8a-9917-2f4a5ca68a9a req-9c37cffb-9ec5-48af-b58f-a8881f2fca90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] No waiting events found dispatching network-vif-plugged-4c05c2ff-d433-43af-8e32-d6197dee340f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.291 227766 WARNING nova.compute.manager [req-dacaaba2-086b-4e8a-9917-2f4a5ca68a9a req-9c37cffb-9ec5-48af-b58f-a8881f2fca90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Received unexpected event network-vif-plugged-4c05c2ff-d433-43af-8e32-d6197dee340f for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.405 227766 DEBUG oslo_concurrency.processutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4ef48fbd-b990-487c-94a4-0149ee9204c9/disk.config 4ef48fbd-b990-487c-94a4-0149ee9204c9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.406 227766 INFO nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Deleting local config drive /var/lib/nova/instances/4ef48fbd-b990-487c-94a4-0149ee9204c9/disk.config because it was imported into RBD.#033[00m
Jan 23 04:42:37 np0005593234 kernel: tapbd040948-a6: entered promiscuous mode
Jan 23 04:42:37 np0005593234 NetworkManager[48942]: <info>  [1769161357.4597] manager: (tapbd040948-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Jan 23 04:42:37 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:37Z|00123|binding|INFO|Claiming lport bd040948-a661-431e-8f76-623ac2452642 for this chassis.
Jan 23 04:42:37 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:37Z|00124|binding|INFO|bd040948-a661-431e-8f76-623ac2452642: Claiming fa:16:3e:3e:ce:a8 10.100.0.11
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.461 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.468 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:ce:a8 10.100.0.11'], port_security=['fa:16:3e:3e:ce:a8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4ef48fbd-b990-487c-94a4-0149ee9204c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f19933f5-cfe3-4319-a83b-b72dde692ab6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4fd9229340ed4bf3a3a72baa6985a3e3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '232ea62b-b441-41b5-8457-7d5744ac9ac2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=91ae19d5-b9ed-444d-b1cd-8fb0c58abf8d, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=bd040948-a661-431e-8f76-623ac2452642) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.469 144381 INFO neutron.agent.ovn.metadata.agent [-] Port bd040948-a661-431e-8f76-623ac2452642 in datapath f19933f5-cfe3-4319-a83b-b72dde692ab6 bound to our chassis#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.471 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f19933f5-cfe3-4319-a83b-b72dde692ab6#033[00m
Jan 23 04:42:37 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:37Z|00125|binding|INFO|Setting lport bd040948-a661-431e-8f76-623ac2452642 ovn-installed in OVS
Jan 23 04:42:37 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:37Z|00126|binding|INFO|Setting lport bd040948-a661-431e-8f76-623ac2452642 up in Southbound
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.477 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.479 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.486 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d9af3a57-1f36-4cdc-8291-6d71b69e1bfb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.487 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf19933f5-c1 in ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.488 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf19933f5-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.488 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0df76e17-3629-4a9b-95d6-fab61668da19]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.490 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[afde1a7d-864c-46fa-9f32-cc145d7dcbe3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 systemd-udevd[249404]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:42:37 np0005593234 systemd-machined[195626]: New machine qemu-21-instance-0000002f.
Jan 23 04:42:37 np0005593234 NetworkManager[48942]: <info>  [1769161357.5035] device (tapbd040948-a6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:42:37 np0005593234 NetworkManager[48942]: <info>  [1769161357.5042] device (tapbd040948-a6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.505 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[5e6d5a69-560a-45a2-ae58-d9032806243b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 systemd[1]: Started Virtual Machine qemu-21-instance-0000002f.
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.518 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[165819d7-6b65-4cec-988c-8e2174e15f45]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.543 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d41f05c0-d1f7-4fdc-b11c-d367f6e83c1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 systemd-udevd[249407]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.549 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5b2ffc8d-4e84-485c-929e-cf801ba908f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 NetworkManager[48942]: <info>  [1769161357.5498] manager: (tapf19933f5-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.582 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f61d93aa-2679-449f-8010-b44333cc4ffd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.585 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[40aa8716-a9d6-41fd-9608-0a5582bbb24c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 NetworkManager[48942]: <info>  [1769161357.6064] device (tapf19933f5-c0): carrier: link connected
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.612 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[928a98ef-117a-4818-91a2-eac38cd4f2bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.628 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[161e184a-dbc1-4c70-b5eb-9ae0b82c5ecb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf19933f5-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:fb:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525256, 'reachable_time': 29890, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 249436, 'error': None, 'target': 'ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.639 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.674 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.682 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[901617da-bbc0-4bb3-bd15-a49bacc8bcdd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feec:fb09'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 525256, 'tstamp': 525256}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 249437, 'error': None, 'target': 'ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.699 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5a3c0ecb-3365-479a-ad79-08a049bcf7f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf19933f5-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ec:fb:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525256, 'reachable_time': 29890, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 249438, 'error': None, 'target': 'ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.728 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dffd1fbf-48f7-483a-a8c8-34820968504a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.793 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ba2d4133-f141-4be5-95bc-657c39f729b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.795 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf19933f5-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.795 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.796 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf19933f5-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:37 np0005593234 NetworkManager[48942]: <info>  [1769161357.7989] manager: (tapf19933f5-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.798 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:37 np0005593234 kernel: tapf19933f5-c0: entered promiscuous mode
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.800 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.802 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf19933f5-c0, col_values=(('external_ids', {'iface-id': '03e2fba1-8299-41b0-8205-575cd62d3292'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.803 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:37 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:37Z|00127|binding|INFO|Releasing lport 03e2fba1-8299-41b0-8205-575cd62d3292 from this chassis (sb_readonly=0)
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.804 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.805 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f19933f5-cfe3-4319-a83b-b72dde692ab6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f19933f5-cfe3-4319-a83b-b72dde692ab6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.806 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[476af5ef-73c5-44a4-8b7f-819834cc99f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.807 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-f19933f5-cfe3-4319-a83b-b72dde692ab6
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/f19933f5-cfe3-4319-a83b-b72dde692ab6.pid.haproxy
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID f19933f5-cfe3-4319-a83b-b72dde692ab6
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:42:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:37.809 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6', 'env', 'PROCESS_TAG=haproxy-f19933f5-cfe3-4319-a83b-b72dde692ab6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f19933f5-cfe3-4319-a83b-b72dde692ab6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.818 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.832 227766 DEBUG nova.network.neutron [-] [instance: a8411989-0134-41c7-85e7-36173b393043] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.868 227766 INFO nova.compute.manager [-] [instance: a8411989-0134-41c7-85e7-36173b393043] Took 1.62 seconds to deallocate network for instance.#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.926 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161357.9255185, 4ef48fbd-b990-487c-94a4-0149ee9204c9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.926 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] VM Started (Lifecycle Event)#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.978 227766 DEBUG oslo_concurrency.lockutils [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.979 227766 DEBUG oslo_concurrency.lockutils [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.980 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.989 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161357.9267347, 4ef48fbd-b990-487c-94a4-0149ee9204c9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:42:37 np0005593234 nova_compute[227762]: 2026-01-23 09:42:37.989 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:42:38 np0005593234 nova_compute[227762]: 2026-01-23 09:42:38.025 227766 DEBUG nova.compute.manager [req-1b81ba31-6788-4485-b0b4-f083fbf00e6d req-dad6f8f7-cb97-447a-975a-85b006386314 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a8411989-0134-41c7-85e7-36173b393043] Received event network-vif-deleted-4c05c2ff-d433-43af-8e32-d6197dee340f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:42:38 np0005593234 nova_compute[227762]: 2026-01-23 09:42:38.029 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:42:38 np0005593234 nova_compute[227762]: 2026-01-23 09:42:38.032 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:42:38 np0005593234 nova_compute[227762]: 2026-01-23 09:42:38.055 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:42:38 np0005593234 nova_compute[227762]: 2026-01-23 09:42:38.083 227766 DEBUG oslo_concurrency.processutils [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:42:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e189 e189: 3 total, 3 up, 3 in
Jan 23 04:42:38 np0005593234 podman[249511]: 2026-01-23 09:42:38.150217931 +0000 UTC m=+0.020676456 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:42:38 np0005593234 podman[249511]: 2026-01-23 09:42:38.413380836 +0000 UTC m=+0.283839341 container create dd0c97a4c624ad9fccfed7cebb7a07d619360ba4b54b5a5ab331c38b3fbb615c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 04:42:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:42:38 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1033354146' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:42:38 np0005593234 nova_compute[227762]: 2026-01-23 09:42:38.550 227766 DEBUG oslo_concurrency.processutils [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:42:38 np0005593234 systemd[1]: Started libpod-conmon-dd0c97a4c624ad9fccfed7cebb7a07d619360ba4b54b5a5ab331c38b3fbb615c.scope.
Jan 23 04:42:38 np0005593234 nova_compute[227762]: 2026-01-23 09:42:38.560 227766 DEBUG nova.compute.provider_tree [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:42:38 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:42:38 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88f0b9f8bba97f04567158b247cd9b226a95c540c09a336a51f14cb86bbf97d2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:42:38 np0005593234 nova_compute[227762]: 2026-01-23 09:42:38.588 227766 DEBUG nova.scheduler.client.report [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:42:38 np0005593234 nova_compute[227762]: 2026-01-23 09:42:38.650 227766 DEBUG oslo_concurrency.lockutils [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:38 np0005593234 nova_compute[227762]: 2026-01-23 09:42:38.663 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:38 np0005593234 nova_compute[227762]: 2026-01-23 09:42:38.676 227766 INFO nova.scheduler.client.report [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Deleted allocations for instance a8411989-0134-41c7-85e7-36173b393043#033[00m
Jan 23 04:42:38 np0005593234 podman[249511]: 2026-01-23 09:42:38.704855744 +0000 UTC m=+0.575314329 container init dd0c97a4c624ad9fccfed7cebb7a07d619360ba4b54b5a5ab331c38b3fbb615c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 04:42:38 np0005593234 podman[249511]: 2026-01-23 09:42:38.710469439 +0000 UTC m=+0.580927974 container start dd0c97a4c624ad9fccfed7cebb7a07d619360ba4b54b5a5ab331c38b3fbb615c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 04:42:38 np0005593234 neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6[249546]: [NOTICE]   (249550) : New worker (249552) forked
Jan 23 04:42:38 np0005593234 neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6[249546]: [NOTICE]   (249550) : Loading success.
Jan 23 04:42:38 np0005593234 nova_compute[227762]: 2026-01-23 09:42:38.752 227766 DEBUG oslo_concurrency.lockutils [None req-8eec6058-c0bf-4413-967c-b5fe2d750c18 9a4618f86429416889ef239f4b21bacc 7bb32481db2547b49bc4f3a10883baef - - default default] Lock "a8411989-0134-41c7-85e7-36173b393043" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.404s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:38.775 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:42:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:38.777 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:42:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:42:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:38.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:42:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:39.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e190 e190: 3 total, 3 up, 3 in
Jan 23 04:42:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:42:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.381 227766 DEBUG nova.compute.manager [req-4b844bbd-403e-483f-b3ba-c1c7776b7ce2 req-f9372f0d-7f6c-4c27-b368-50b157fed47f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Received event network-vif-plugged-bd040948-a661-431e-8f76-623ac2452642 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.382 227766 DEBUG oslo_concurrency.lockutils [req-4b844bbd-403e-483f-b3ba-c1c7776b7ce2 req-f9372f0d-7f6c-4c27-b368-50b157fed47f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.382 227766 DEBUG oslo_concurrency.lockutils [req-4b844bbd-403e-483f-b3ba-c1c7776b7ce2 req-f9372f0d-7f6c-4c27-b368-50b157fed47f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.382 227766 DEBUG oslo_concurrency.lockutils [req-4b844bbd-403e-483f-b3ba-c1c7776b7ce2 req-f9372f0d-7f6c-4c27-b368-50b157fed47f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.383 227766 DEBUG nova.compute.manager [req-4b844bbd-403e-483f-b3ba-c1c7776b7ce2 req-f9372f0d-7f6c-4c27-b368-50b157fed47f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Processing event network-vif-plugged-bd040948-a661-431e-8f76-623ac2452642 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.383 227766 DEBUG nova.compute.manager [req-4b844bbd-403e-483f-b3ba-c1c7776b7ce2 req-f9372f0d-7f6c-4c27-b368-50b157fed47f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Received event network-vif-plugged-bd040948-a661-431e-8f76-623ac2452642 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.383 227766 DEBUG oslo_concurrency.lockutils [req-4b844bbd-403e-483f-b3ba-c1c7776b7ce2 req-f9372f0d-7f6c-4c27-b368-50b157fed47f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.383 227766 DEBUG oslo_concurrency.lockutils [req-4b844bbd-403e-483f-b3ba-c1c7776b7ce2 req-f9372f0d-7f6c-4c27-b368-50b157fed47f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.384 227766 DEBUG oslo_concurrency.lockutils [req-4b844bbd-403e-483f-b3ba-c1c7776b7ce2 req-f9372f0d-7f6c-4c27-b368-50b157fed47f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.384 227766 DEBUG nova.compute.manager [req-4b844bbd-403e-483f-b3ba-c1c7776b7ce2 req-f9372f0d-7f6c-4c27-b368-50b157fed47f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] No waiting events found dispatching network-vif-plugged-bd040948-a661-431e-8f76-623ac2452642 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.384 227766 WARNING nova.compute.manager [req-4b844bbd-403e-483f-b3ba-c1c7776b7ce2 req-f9372f0d-7f6c-4c27-b368-50b157fed47f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Received unexpected event network-vif-plugged-bd040948-a661-431e-8f76-623ac2452642 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.385 227766 DEBUG nova.compute.manager [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.388 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161360.3885279, 4ef48fbd-b990-487c-94a4-0149ee9204c9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.389 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.391 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.394 227766 INFO nova.virt.libvirt.driver [-] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Instance spawned successfully.#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.394 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.433 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.440 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.444 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.444 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.445 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.445 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.445 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.446 227766 DEBUG nova.virt.libvirt.driver [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.472 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.509 227766 INFO nova.compute.manager [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Took 10.22 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.510 227766 DEBUG nova.compute.manager [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.595 227766 INFO nova.compute.manager [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Took 11.47 seconds to build instance.#033[00m
Jan 23 04:42:40 np0005593234 nova_compute[227762]: 2026-01-23 09:42:40.658 227766 DEBUG oslo_concurrency.lockutils [None req-93ee25ec-6e38-4e25-ba08-c1af8dff58bc 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:40.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:41 np0005593234 nova_compute[227762]: 2026-01-23 09:42:41.024 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:42:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:41.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:42:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e191 e191: 3 total, 3 up, 3 in
Jan 23 04:42:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:42.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:42.818 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:42:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:42.819 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:42:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:42:42.820 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:42:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:43.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:43 np0005593234 nova_compute[227762]: 2026-01-23 09:42:43.675 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e191 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:42:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/720066135' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:42:44 np0005593234 nova_compute[227762]: 2026-01-23 09:42:44.789 227766 DEBUG nova.compute.manager [req-1c447a3c-aba8-470c-9d8b-85ec295494d1 req-05bcf3a5-5dd0-4662-89af-fa353bf646aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Received event network-changed-bd040948-a661-431e-8f76-623ac2452642 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:42:44 np0005593234 nova_compute[227762]: 2026-01-23 09:42:44.790 227766 DEBUG nova.compute.manager [req-1c447a3c-aba8-470c-9d8b-85ec295494d1 req-05bcf3a5-5dd0-4662-89af-fa353bf646aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Refreshing instance network info cache due to event network-changed-bd040948-a661-431e-8f76-623ac2452642. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:42:44 np0005593234 nova_compute[227762]: 2026-01-23 09:42:44.791 227766 DEBUG oslo_concurrency.lockutils [req-1c447a3c-aba8-470c-9d8b-85ec295494d1 req-05bcf3a5-5dd0-4662-89af-fa353bf646aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-4ef48fbd-b990-487c-94a4-0149ee9204c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:42:44 np0005593234 nova_compute[227762]: 2026-01-23 09:42:44.791 227766 DEBUG oslo_concurrency.lockutils [req-1c447a3c-aba8-470c-9d8b-85ec295494d1 req-05bcf3a5-5dd0-4662-89af-fa353bf646aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-4ef48fbd-b990-487c-94a4-0149ee9204c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:42:44 np0005593234 nova_compute[227762]: 2026-01-23 09:42:44.792 227766 DEBUG nova.network.neutron [req-1c447a3c-aba8-470c-9d8b-85ec295494d1 req-05bcf3a5-5dd0-4662-89af-fa353bf646aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Refreshing network info cache for port bd040948-a661-431e-8f76-623ac2452642 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:42:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:44.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:45.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e192 e192: 3 total, 3 up, 3 in
Jan 23 04:42:46 np0005593234 nova_compute[227762]: 2026-01-23 09:42:46.026 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:46 np0005593234 nova_compute[227762]: 2026-01-23 09:42:46.642 227766 DEBUG nova.network.neutron [req-1c447a3c-aba8-470c-9d8b-85ec295494d1 req-05bcf3a5-5dd0-4662-89af-fa353bf646aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Updated VIF entry in instance network info cache for port bd040948-a661-431e-8f76-623ac2452642. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:42:46 np0005593234 nova_compute[227762]: 2026-01-23 09:42:46.643 227766 DEBUG nova.network.neutron [req-1c447a3c-aba8-470c-9d8b-85ec295494d1 req-05bcf3a5-5dd0-4662-89af-fa353bf646aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Updating instance_info_cache with network_info: [{"id": "bd040948-a661-431e-8f76-623ac2452642", "address": "fa:16:3e:3e:ce:a8", "network": {"id": "f19933f5-cfe3-4319-a83b-b72dde692ab6", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-1169169895-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4fd9229340ed4bf3a3a72baa6985a3e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd040948-a6", "ovs_interfaceid": "bd040948-a661-431e-8f76-623ac2452642", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:42:46 np0005593234 nova_compute[227762]: 2026-01-23 09:42:46.682 227766 DEBUG oslo_concurrency.lockutils [req-1c447a3c-aba8-470c-9d8b-85ec295494d1 req-05bcf3a5-5dd0-4662-89af-fa353bf646aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-4ef48fbd-b990-487c-94a4-0149ee9204c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:42:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:46.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:42:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:47.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:42:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:48.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:49.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:49 np0005593234 nova_compute[227762]: 2026-01-23 09:42:49.106 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e193 e193: 3 total, 3 up, 3 in
Jan 23 04:42:49 np0005593234 nova_compute[227762]: 2026-01-23 09:42:49.588 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161354.5870843, a8411989-0134-41c7-85e7-36173b393043 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:42:49 np0005593234 nova_compute[227762]: 2026-01-23 09:42:49.588 227766 INFO nova.compute.manager [-] [instance: a8411989-0134-41c7-85e7-36173b393043] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:42:49 np0005593234 nova_compute[227762]: 2026-01-23 09:42:49.615 227766 DEBUG nova.compute.manager [None req-2e19b2a0-a181-4af4-a5ec-d7e975440798 - - - - - -] [instance: a8411989-0134-41c7-85e7-36173b393043] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:42:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e193 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:50.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:51 np0005593234 nova_compute[227762]: 2026-01-23 09:42:51.060 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:42:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:51.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:42:52 np0005593234 podman[249618]: 2026-01-23 09:42:52.760813772 +0000 UTC m=+0.051831047 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 04:42:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:52.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:42:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:53.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:42:53 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:53Z|00128|binding|INFO|Releasing lport 03e2fba1-8299-41b0-8205-575cd62d3292 from this chassis (sb_readonly=0)
Jan 23 04:42:53 np0005593234 nova_compute[227762]: 2026-01-23 09:42:53.691 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:53 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:53Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3e:ce:a8 10.100.0.11
Jan 23 04:42:53 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:53Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3e:ce:a8 10.100.0.11
Jan 23 04:42:54 np0005593234 nova_compute[227762]: 2026-01-23 09:42:54.107 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e194 e194: 3 total, 3 up, 3 in
Jan 23 04:42:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:42:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:54.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:55.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:56 np0005593234 nova_compute[227762]: 2026-01-23 09:42:56.062 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:56.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:57.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:58 np0005593234 ovn_controller[134547]: 2026-01-23T09:42:58Z|00129|binding|INFO|Releasing lport 03e2fba1-8299-41b0-8205-575cd62d3292 from this chassis (sb_readonly=0)
Jan 23 04:42:58 np0005593234 nova_compute[227762]: 2026-01-23 09:42:58.393 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:42:58.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:42:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:42:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:42:59.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:42:59 np0005593234 nova_compute[227762]: 2026-01-23 09:42:59.109 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:42:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:00.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:01 np0005593234 nova_compute[227762]: 2026-01-23 09:43:01.064 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:01.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:02.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:03.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:04 np0005593234 nova_compute[227762]: 2026-01-23 09:43:04.111 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:04.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:43:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:05.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:43:06 np0005593234 nova_compute[227762]: 2026-01-23 09:43:06.068 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:06 np0005593234 podman[249696]: 2026-01-23 09:43:06.805592793 +0000 UTC m=+0.090334247 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 04:43:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:06.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e195 e195: 3 total, 3 up, 3 in
Jan 23 04:43:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:43:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:07.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:43:08 np0005593234 nova_compute[227762]: 2026-01-23 09:43:08.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e196 e196: 3 total, 3 up, 3 in
Jan 23 04:43:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:08.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:09 np0005593234 nova_compute[227762]: 2026-01-23 09:43:09.112 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:09.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e197 e197: 3 total, 3 up, 3 in
Jan 23 04:43:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e197 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:10.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:11 np0005593234 nova_compute[227762]: 2026-01-23 09:43:11.071 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:11.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:11 np0005593234 nova_compute[227762]: 2026-01-23 09:43:11.338 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:12.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:43:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:13.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:43:13 np0005593234 nova_compute[227762]: 2026-01-23 09:43:13.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:13 np0005593234 nova_compute[227762]: 2026-01-23 09:43:13.774 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:13 np0005593234 nova_compute[227762]: 2026-01-23 09:43:13.774 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:13 np0005593234 nova_compute[227762]: 2026-01-23 09:43:13.775 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:13 np0005593234 nova_compute[227762]: 2026-01-23 09:43:13.775 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:43:13 np0005593234 nova_compute[227762]: 2026-01-23 09:43:13.776 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:14 np0005593234 nova_compute[227762]: 2026-01-23 09:43:14.114 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e198 e198: 3 total, 3 up, 3 in
Jan 23 04:43:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:43:14 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1092188621' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:43:14 np0005593234 nova_compute[227762]: 2026-01-23 09:43:14.239 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:14 np0005593234 nova_compute[227762]: 2026-01-23 09:43:14.326 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000002f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:43:14 np0005593234 nova_compute[227762]: 2026-01-23 09:43:14.327 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000002f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:43:14 np0005593234 nova_compute[227762]: 2026-01-23 09:43:14.498 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:43:14 np0005593234 nova_compute[227762]: 2026-01-23 09:43:14.499 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4549MB free_disk=20.87606430053711GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:43:14 np0005593234 nova_compute[227762]: 2026-01-23 09:43:14.500 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:14 np0005593234 nova_compute[227762]: 2026-01-23 09:43:14.500 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:14 np0005593234 nova_compute[227762]: 2026-01-23 09:43:14.582 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 4ef48fbd-b990-487c-94a4-0149ee9204c9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:43:14 np0005593234 nova_compute[227762]: 2026-01-23 09:43:14.583 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:43:14 np0005593234 nova_compute[227762]: 2026-01-23 09:43:14.583 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:43:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:14.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:14 np0005593234 nova_compute[227762]: 2026-01-23 09:43:14.898 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:43:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:15.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:43:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:43:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4266186955' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:43:15 np0005593234 nova_compute[227762]: 2026-01-23 09:43:15.311 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:15 np0005593234 nova_compute[227762]: 2026-01-23 09:43:15.315 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:43:15 np0005593234 nova_compute[227762]: 2026-01-23 09:43:15.389 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:43:15 np0005593234 nova_compute[227762]: 2026-01-23 09:43:15.423 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:43:15 np0005593234 nova_compute[227762]: 2026-01-23 09:43:15.424 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.924s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:15 np0005593234 nova_compute[227762]: 2026-01-23 09:43:15.424 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:15 np0005593234 nova_compute[227762]: 2026-01-23 09:43:15.425 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 04:43:16 np0005593234 nova_compute[227762]: 2026-01-23 09:43:16.074 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:16 np0005593234 nova_compute[227762]: 2026-01-23 09:43:16.458 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:16 np0005593234 nova_compute[227762]: 2026-01-23 09:43:16.490 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:16 np0005593234 nova_compute[227762]: 2026-01-23 09:43:16.491 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:16 np0005593234 nova_compute[227762]: 2026-01-23 09:43:16.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:16 np0005593234 nova_compute[227762]: 2026-01-23 09:43:16.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:43:16 np0005593234 nova_compute[227762]: 2026-01-23 09:43:16.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:43:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:16.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:16 np0005593234 nova_compute[227762]: 2026-01-23 09:43:16.960 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-4ef48fbd-b990-487c-94a4-0149ee9204c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:43:16 np0005593234 nova_compute[227762]: 2026-01-23 09:43:16.960 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-4ef48fbd-b990-487c-94a4-0149ee9204c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:43:16 np0005593234 nova_compute[227762]: 2026-01-23 09:43:16.961 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:43:16 np0005593234 nova_compute[227762]: 2026-01-23 09:43:16.961 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4ef48fbd-b990-487c-94a4-0149ee9204c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:43:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:17.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:18 np0005593234 nova_compute[227762]: 2026-01-23 09:43:18.136 227766 DEBUG oslo_concurrency.lockutils [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Acquiring lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:18 np0005593234 nova_compute[227762]: 2026-01-23 09:43:18.137 227766 DEBUG oslo_concurrency.lockutils [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:18 np0005593234 nova_compute[227762]: 2026-01-23 09:43:18.171 227766 DEBUG nova.objects.instance [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lazy-loading 'flavor' on Instance uuid 4ef48fbd-b990-487c-94a4-0149ee9204c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:43:18 np0005593234 nova_compute[227762]: 2026-01-23 09:43:18.218 227766 DEBUG oslo_concurrency.lockutils [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.081s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:18 np0005593234 nova_compute[227762]: 2026-01-23 09:43:18.589 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "b157065e-5625-4012-8e6f-9b22cef56ddc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:18 np0005593234 nova_compute[227762]: 2026-01-23 09:43:18.590 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:18 np0005593234 nova_compute[227762]: 2026-01-23 09:43:18.624 227766 DEBUG oslo_concurrency.lockutils [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Acquiring lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:18 np0005593234 nova_compute[227762]: 2026-01-23 09:43:18.625 227766 DEBUG oslo_concurrency.lockutils [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:18 np0005593234 nova_compute[227762]: 2026-01-23 09:43:18.625 227766 INFO nova.compute.manager [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Attaching volume 99bccbe9-de42-409d-aa8f-e509f6080e7b to /dev/vdb#033[00m
Jan 23 04:43:18 np0005593234 nova_compute[227762]: 2026-01-23 09:43:18.630 227766 DEBUG nova.compute.manager [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:43:18 np0005593234 nova_compute[227762]: 2026-01-23 09:43:18.748 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:18 np0005593234 nova_compute[227762]: 2026-01-23 09:43:18.748 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:18 np0005593234 nova_compute[227762]: 2026-01-23 09:43:18.756 227766 DEBUG nova.virt.hardware [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:43:18 np0005593234 nova_compute[227762]: 2026-01-23 09:43:18.756 227766 INFO nova.compute.claims [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:43:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:18.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.066 227766 DEBUG os_brick.utils [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.068 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.080 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.080 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[d0931340-ddfa-465a-ba39-3ecf1e1d1ccb]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.083 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.092 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.092 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[a5f3bb61-5e7e-41c3-867c-a01accc00b8a]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.094 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.102 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.102 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[f1e80f68-ce81-450b-946a-6994dbfc3936]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.104 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[ba84e4c4-1739-4a63-9aa3-ebf9b5e9bc6b]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.104 227766 DEBUG oslo_concurrency.processutils [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.125 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:19.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.130 227766 DEBUG oslo_concurrency.processutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.150 227766 DEBUG oslo_concurrency.processutils [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] CMD "nvme version" returned: 0 in 0.046s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.153 227766 DEBUG os_brick.initiator.connectors.lightos [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.154 227766 DEBUG os_brick.initiator.connectors.lightos [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.154 227766 DEBUG os_brick.initiator.connectors.lightos [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.154 227766 DEBUG os_brick.utils [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] <== get_connector_properties: return (88ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.155 227766 DEBUG nova.virt.block_device [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Updating existing volume attachment record: bd199762-e898-4cb0-8325-dc78f5aba525 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.415 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Updating instance_info_cache with network_info: [{"id": "bd040948-a661-431e-8f76-623ac2452642", "address": "fa:16:3e:3e:ce:a8", "network": {"id": "f19933f5-cfe3-4319-a83b-b72dde692ab6", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-1169169895-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4fd9229340ed4bf3a3a72baa6985a3e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd040948-a6", "ovs_interfaceid": "bd040948-a661-431e-8f76-623ac2452642", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.440 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-4ef48fbd-b990-487c-94a4-0149ee9204c9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.440 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.441 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.441 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.441 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.480 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:43:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3777926192' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.587 227766 DEBUG oslo_concurrency.processutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.592 227766 DEBUG nova.compute.provider_tree [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.607 227766 DEBUG nova.scheduler.client.report [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.629 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.881s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.630 227766 DEBUG nova.compute.manager [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.695 227766 DEBUG nova.compute.manager [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.696 227766 DEBUG nova.network.neutron [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.715 227766 INFO nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.738 227766 DEBUG nova.compute.manager [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:43:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.874 227766 DEBUG nova.compute.manager [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.875 227766 DEBUG nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.876 227766 INFO nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Creating image(s)#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.897 227766 DEBUG nova.storage.rbd_utils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image b157065e-5625-4012-8e6f-9b22cef56ddc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.920 227766 DEBUG nova.storage.rbd_utils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image b157065e-5625-4012-8e6f-9b22cef56ddc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.945 227766 DEBUG nova.storage.rbd_utils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image b157065e-5625-4012-8e6f-9b22cef56ddc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.949 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "c2364ff9c0ac135923a2025898d799fc1efe3717" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:19 np0005593234 nova_compute[227762]: 2026-01-23 09:43:19.950 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "c2364ff9c0ac135923a2025898d799fc1efe3717" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.099 227766 DEBUG nova.policy [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56da68482e3a4fb582dcccad45f8f71b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '05bc71a77710455e8b34ead7fec81a31', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.102 227766 DEBUG nova.objects.instance [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lazy-loading 'flavor' on Instance uuid 4ef48fbd-b990-487c-94a4-0149ee9204c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.161 227766 DEBUG nova.virt.libvirt.driver [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Attempting to attach volume 99bccbe9-de42-409d-aa8f-e509f6080e7b with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.164 227766 DEBUG nova.virt.libvirt.guest [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 04:43:20 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:43:20 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-99bccbe9-de42-409d-aa8f-e509f6080e7b">
Jan 23 04:43:20 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 04:43:20 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 04:43:20 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 04:43:20 np0005593234 nova_compute[227762]:  </source>
Jan 23 04:43:20 np0005593234 nova_compute[227762]:  <auth username="openstack">
Jan 23 04:43:20 np0005593234 nova_compute[227762]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:43:20 np0005593234 nova_compute[227762]:  </auth>
Jan 23 04:43:20 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 04:43:20 np0005593234 nova_compute[227762]:  <serial>99bccbe9-de42-409d-aa8f-e509f6080e7b</serial>
Jan 23 04:43:20 np0005593234 nova_compute[227762]:  <shareable/>
Jan 23 04:43:20 np0005593234 nova_compute[227762]: </disk>
Jan 23 04:43:20 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.340 227766 DEBUG nova.virt.libvirt.driver [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.342 227766 DEBUG nova.virt.libvirt.driver [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.342 227766 DEBUG nova.virt.libvirt.driver [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.342 227766 DEBUG nova.virt.libvirt.driver [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] No VIF found with MAC fa:16:3e:3e:ce:a8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.452 227766 DEBUG nova.virt.libvirt.imagebackend [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/b5a73d98-c27b-4745-95e1-6675f24e35ae/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/b5a73d98-c27b-4745-95e1-6675f24e35ae/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.524 227766 DEBUG nova.virt.libvirt.imagebackend [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Selected location: {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/b5a73d98-c27b-4745-95e1-6675f24e35ae/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.526 227766 DEBUG nova.storage.rbd_utils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] cloning images/b5a73d98-c27b-4745-95e1-6675f24e35ae@snap to None/b157065e-5625-4012-8e6f-9b22cef56ddc_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.645 227766 DEBUG oslo_concurrency.lockutils [None req-a54ee15a-0923-4509-a6f4-f80ae619d8a6 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.020s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.661 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "c2364ff9c0ac135923a2025898d799fc1efe3717" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.711s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.806 227766 DEBUG nova.objects.instance [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'migration_context' on Instance uuid b157065e-5625-4012-8e6f-9b22cef56ddc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.823 227766 DEBUG nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.823 227766 DEBUG nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Ensure instance console log exists: /var/lib/nova/instances/b157065e-5625-4012-8e6f-9b22cef56ddc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.824 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.824 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:20 np0005593234 nova_compute[227762]: 2026-01-23 09:43:20.824 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:20.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:21 np0005593234 nova_compute[227762]: 2026-01-23 09:43:21.077 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:21.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:21 np0005593234 nova_compute[227762]: 2026-01-23 09:43:21.436 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:21 np0005593234 nova_compute[227762]: 2026-01-23 09:43:21.802 227766 DEBUG nova.network.neutron [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Successfully created port: fa5ce613-0317-4fe5-8ae8-93daf23d11c0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.570716) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161402570795, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 1111, "num_deletes": 256, "total_data_size": 2059179, "memory_usage": 2080960, "flush_reason": "Manual Compaction"}
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161402579867, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 1356601, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34967, "largest_seqno": 36073, "table_properties": {"data_size": 1351584, "index_size": 2477, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11783, "raw_average_key_size": 20, "raw_value_size": 1341192, "raw_average_value_size": 2348, "num_data_blocks": 107, "num_entries": 571, "num_filter_entries": 571, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161340, "oldest_key_time": 1769161340, "file_creation_time": 1769161402, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 9189 microseconds, and 4050 cpu microseconds.
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.579917) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 1356601 bytes OK
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.579935) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.581505) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.581520) EVENT_LOG_v1 {"time_micros": 1769161402581514, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.581537) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 2053618, prev total WAL file size 2053618, number of live WAL files 2.
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.582296) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(1324KB)], [66(8652KB)]
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161402582402, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 10216789, "oldest_snapshot_seqno": -1}
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 5829 keys, 8326839 bytes, temperature: kUnknown
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161402637313, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 8326839, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8289352, "index_size": 21825, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14597, "raw_key_size": 149781, "raw_average_key_size": 25, "raw_value_size": 8185938, "raw_average_value_size": 1404, "num_data_blocks": 874, "num_entries": 5829, "num_filter_entries": 5829, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769161402, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.637763) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 8326839 bytes
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.640676) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 185.6 rd, 151.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 8.4 +0.0 blob) out(7.9 +0.0 blob), read-write-amplify(13.7) write-amplify(6.1) OK, records in: 6361, records dropped: 532 output_compression: NoCompression
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.640724) EVENT_LOG_v1 {"time_micros": 1769161402640707, "job": 40, "event": "compaction_finished", "compaction_time_micros": 55040, "compaction_time_cpu_micros": 21090, "output_level": 6, "num_output_files": 1, "total_output_size": 8326839, "num_input_records": 6361, "num_output_records": 5829, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161402641237, "job": 40, "event": "table_file_deletion", "file_number": 68}
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161402642644, "job": 40, "event": "table_file_deletion", "file_number": 66}
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.582215) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.642741) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.642746) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.642748) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.642750) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:43:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:43:22.642752) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:43:22 np0005593234 nova_compute[227762]: 2026-01-23 09:43:22.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:22 np0005593234 nova_compute[227762]: 2026-01-23 09:43:22.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 04:43:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:22.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:22 np0005593234 nova_compute[227762]: 2026-01-23 09:43:22.939 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 04:43:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:43:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:23.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:43:23 np0005593234 podman[250054]: 2026-01-23 09:43:23.764850853 +0000 UTC m=+0.051992022 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:43:23 np0005593234 nova_compute[227762]: 2026-01-23 09:43:23.799 227766 DEBUG nova.network.neutron [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Successfully updated port: fa5ce613-0317-4fe5-8ae8-93daf23d11c0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:43:23 np0005593234 nova_compute[227762]: 2026-01-23 09:43:23.824 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "refresh_cache-b157065e-5625-4012-8e6f-9b22cef56ddc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:43:23 np0005593234 nova_compute[227762]: 2026-01-23 09:43:23.824 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquired lock "refresh_cache-b157065e-5625-4012-8e6f-9b22cef56ddc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:43:23 np0005593234 nova_compute[227762]: 2026-01-23 09:43:23.825 227766 DEBUG nova.network.neutron [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:43:23 np0005593234 nova_compute[227762]: 2026-01-23 09:43:23.917 227766 DEBUG nova.compute.manager [req-886a421b-e397-4c0e-b207-21a84cc71e5e req-8e961fbf-7cb5-4dd5-a85d-e572f118603a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Received event network-changed-fa5ce613-0317-4fe5-8ae8-93daf23d11c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:23 np0005593234 nova_compute[227762]: 2026-01-23 09:43:23.918 227766 DEBUG nova.compute.manager [req-886a421b-e397-4c0e-b207-21a84cc71e5e req-8e961fbf-7cb5-4dd5-a85d-e572f118603a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Refreshing instance network info cache due to event network-changed-fa5ce613-0317-4fe5-8ae8-93daf23d11c0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:43:23 np0005593234 nova_compute[227762]: 2026-01-23 09:43:23.918 227766 DEBUG oslo_concurrency.lockutils [req-886a421b-e397-4c0e-b207-21a84cc71e5e req-8e961fbf-7cb5-4dd5-a85d-e572f118603a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-b157065e-5625-4012-8e6f-9b22cef56ddc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:43:23 np0005593234 nova_compute[227762]: 2026-01-23 09:43:23.938 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:24 np0005593234 nova_compute[227762]: 2026-01-23 09:43:24.094 227766 DEBUG nova.network.neutron [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:43:24 np0005593234 nova_compute[227762]: 2026-01-23 09:43:24.118 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:24.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:24 np0005593234 nova_compute[227762]: 2026-01-23 09:43:24.966 227766 DEBUG oslo_concurrency.lockutils [None req-f73b2e30-2bec-47fe-8f39-3d3afafa3a90 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Acquiring lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:24 np0005593234 nova_compute[227762]: 2026-01-23 09:43:24.966 227766 DEBUG oslo_concurrency.lockutils [None req-f73b2e30-2bec-47fe-8f39-3d3afafa3a90 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:24 np0005593234 nova_compute[227762]: 2026-01-23 09:43:24.985 227766 INFO nova.compute.manager [None req-f73b2e30-2bec-47fe-8f39-3d3afafa3a90 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Detaching volume 99bccbe9-de42-409d-aa8f-e509f6080e7b#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.084 227766 DEBUG nova.network.neutron [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Updating instance_info_cache with network_info: [{"id": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "address": "fa:16:3e:fe:69:cb", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa5ce613-03", "ovs_interfaceid": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.107 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Releasing lock "refresh_cache-b157065e-5625-4012-8e6f-9b22cef56ddc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.107 227766 DEBUG nova.compute.manager [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Instance network_info: |[{"id": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "address": "fa:16:3e:fe:69:cb", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa5ce613-03", "ovs_interfaceid": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.107 227766 DEBUG oslo_concurrency.lockutils [req-886a421b-e397-4c0e-b207-21a84cc71e5e req-8e961fbf-7cb5-4dd5-a85d-e572f118603a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-b157065e-5625-4012-8e6f-9b22cef56ddc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.108 227766 DEBUG nova.network.neutron [req-886a421b-e397-4c0e-b207-21a84cc71e5e req-8e961fbf-7cb5-4dd5-a85d-e572f118603a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Refreshing network info cache for port fa5ce613-0317-4fe5-8ae8-93daf23d11c0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.111 227766 DEBUG nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Start _get_guest_xml network_info=[{"id": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "address": "fa:16:3e:fe:69:cb", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa5ce613-03", "ovs_interfaceid": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-23T09:43:04Z,direct_url=<?>,disk_format='raw',id=b5a73d98-c27b-4745-95e1-6675f24e35ae,min_disk=1,min_ram=0,name='tempest-test-snap-903138486',owner='05bc71a77710455e8b34ead7fec81a31',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-23T09:43:11Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': 'b5a73d98-c27b-4745-95e1-6675f24e35ae'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.115 227766 WARNING nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.121 227766 DEBUG nova.virt.libvirt.host [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.122 227766 DEBUG nova.virt.libvirt.host [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.124 227766 DEBUG nova.virt.libvirt.host [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.125 227766 DEBUG nova.virt.libvirt.host [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.126 227766 DEBUG nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.126 227766 DEBUG nova.virt.hardware [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-23T09:43:04Z,direct_url=<?>,disk_format='raw',id=b5a73d98-c27b-4745-95e1-6675f24e35ae,min_disk=1,min_ram=0,name='tempest-test-snap-903138486',owner='05bc71a77710455e8b34ead7fec81a31',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-23T09:43:11Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.127 227766 DEBUG nova.virt.hardware [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.127 227766 DEBUG nova.virt.hardware [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.127 227766 DEBUG nova.virt.hardware [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.127 227766 DEBUG nova.virt.hardware [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.128 227766 DEBUG nova.virt.hardware [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.128 227766 DEBUG nova.virt.hardware [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.128 227766 DEBUG nova.virt.hardware [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.128 227766 DEBUG nova.virt.hardware [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.128 227766 DEBUG nova.virt.hardware [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.129 227766 DEBUG nova.virt.hardware [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.132 227766 DEBUG oslo_concurrency.processutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:43:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:25.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.168 227766 INFO nova.virt.block_device [None req-f73b2e30-2bec-47fe-8f39-3d3afafa3a90 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Attempting to driver detach volume 99bccbe9-de42-409d-aa8f-e509f6080e7b from mountpoint /dev/vdb#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.181 227766 DEBUG nova.virt.libvirt.driver [None req-f73b2e30-2bec-47fe-8f39-3d3afafa3a90 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Attempting to detach device vdb from instance 4ef48fbd-b990-487c-94a4-0149ee9204c9 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.182 227766 DEBUG nova.virt.libvirt.guest [None req-f73b2e30-2bec-47fe-8f39-3d3afafa3a90 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 04:43:25 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-99bccbe9-de42-409d-aa8f-e509f6080e7b">
Jan 23 04:43:25 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:  </source>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:  <serial>99bccbe9-de42-409d-aa8f-e509f6080e7b</serial>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:  <shareable/>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 04:43:25 np0005593234 nova_compute[227762]: </disk>
Jan 23 04:43:25 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.200 227766 INFO nova.virt.libvirt.driver [None req-f73b2e30-2bec-47fe-8f39-3d3afafa3a90 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Successfully detached device vdb from instance 4ef48fbd-b990-487c-94a4-0149ee9204c9 from the persistent domain config.#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.201 227766 DEBUG nova.virt.libvirt.driver [None req-f73b2e30-2bec-47fe-8f39-3d3afafa3a90 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 4ef48fbd-b990-487c-94a4-0149ee9204c9 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.202 227766 DEBUG nova.virt.libvirt.guest [None req-f73b2e30-2bec-47fe-8f39-3d3afafa3a90 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 04:43:25 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-99bccbe9-de42-409d-aa8f-e509f6080e7b">
Jan 23 04:43:25 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:  </source>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:  <serial>99bccbe9-de42-409d-aa8f-e509f6080e7b</serial>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:  <shareable/>
Jan 23 04:43:25 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 04:43:25 np0005593234 nova_compute[227762]: </disk>
Jan 23 04:43:25 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.255 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769161405.2548347, 4ef48fbd-b990-487c-94a4-0149ee9204c9 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.256 227766 DEBUG nova.virt.libvirt.driver [None req-f73b2e30-2bec-47fe-8f39-3d3afafa3a90 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 4ef48fbd-b990-487c-94a4-0149ee9204c9 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.258 227766 INFO nova.virt.libvirt.driver [None req-f73b2e30-2bec-47fe-8f39-3d3afafa3a90 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Successfully detached device vdb from instance 4ef48fbd-b990-487c-94a4-0149ee9204c9 from the live domain config.#033[00m
Jan 23 04:43:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:43:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3651022041' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.679 227766 DEBUG oslo_concurrency.processutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.548s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.711 227766 DEBUG nova.storage.rbd_utils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image b157065e-5625-4012-8e6f-9b22cef56ddc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:25 np0005593234 nova_compute[227762]: 2026-01-23 09:43:25.716 227766 DEBUG oslo_concurrency.processutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.082 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.114 227766 DEBUG nova.objects.instance [None req-f73b2e30-2bec-47fe-8f39-3d3afafa3a90 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lazy-loading 'flavor' on Instance uuid 4ef48fbd-b990-487c-94a4-0149ee9204c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:43:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:43:26 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/758487584' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.159 227766 DEBUG oslo_concurrency.processutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.161 227766 DEBUG nova.virt.libvirt.vif [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:43:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1954358251',display_name='tempest-ImagesTestJSON-server-1954358251',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1954358251',id=50,image_ref='b5a73d98-c27b-4745-95e1-6675f24e35ae',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-w1scufj5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_mo
del='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='47eda3a7-c47a-48cc-8381-a702e2e27bfc',image_min_disk='1',image_min_ram='0',image_owner_id='05bc71a77710455e8b34ead7fec81a31',image_owner_project_name='tempest-ImagesTestJSON-1507872051',image_owner_user_name='tempest-ImagesTestJSON-1507872051-project-member',image_user_id='56da68482e3a4fb582dcccad45f8f71b',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:43:19Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=b157065e-5625-4012-8e6f-9b22cef56ddc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "address": "fa:16:3e:fe:69:cb", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa5ce613-03", "ovs_interfaceid": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.161 227766 DEBUG nova.network.os_vif_util [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "address": "fa:16:3e:fe:69:cb", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa5ce613-03", "ovs_interfaceid": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.162 227766 DEBUG nova.network.os_vif_util [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:69:cb,bridge_name='br-int',has_traffic_filtering=True,id=fa5ce613-0317-4fe5-8ae8-93daf23d11c0,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa5ce613-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.163 227766 DEBUG nova.objects.instance [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'pci_devices' on Instance uuid b157065e-5625-4012-8e6f-9b22cef56ddc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.783 227766 DEBUG nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  <uuid>b157065e-5625-4012-8e6f-9b22cef56ddc</uuid>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  <name>instance-00000032</name>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <nova:name>tempest-ImagesTestJSON-server-1954358251</nova:name>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:43:25</nova:creationTime>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <nova:user uuid="56da68482e3a4fb582dcccad45f8f71b">tempest-ImagesTestJSON-1507872051-project-member</nova:user>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <nova:project uuid="05bc71a77710455e8b34ead7fec81a31">tempest-ImagesTestJSON-1507872051</nova:project>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="b5a73d98-c27b-4745-95e1-6675f24e35ae"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <nova:port uuid="fa5ce613-0317-4fe5-8ae8-93daf23d11c0">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <entry name="serial">b157065e-5625-4012-8e6f-9b22cef56ddc</entry>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <entry name="uuid">b157065e-5625-4012-8e6f-9b22cef56ddc</entry>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/b157065e-5625-4012-8e6f-9b22cef56ddc_disk">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/b157065e-5625-4012-8e6f-9b22cef56ddc_disk.config">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:fe:69:cb"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <target dev="tapfa5ce613-03"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/b157065e-5625-4012-8e6f-9b22cef56ddc/console.log" append="off"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <input type="keyboard" bus="usb"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:43:26 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:43:26 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:43:26 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:43:26 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.785 227766 DEBUG nova.compute.manager [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Preparing to wait for external event network-vif-plugged-fa5ce613-0317-4fe5-8ae8-93daf23d11c0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.785 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.786 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.786 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.787 227766 DEBUG nova.virt.libvirt.vif [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:43:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1954358251',display_name='tempest-ImagesTestJSON-server-1954358251',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1954358251',id=50,image_ref='b5a73d98-c27b-4745-95e1-6675f24e35ae',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-w1scufj5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image
_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='47eda3a7-c47a-48cc-8381-a702e2e27bfc',image_min_disk='1',image_min_ram='0',image_owner_id='05bc71a77710455e8b34ead7fec81a31',image_owner_project_name='tempest-ImagesTestJSON-1507872051',image_owner_user_name='tempest-ImagesTestJSON-1507872051-project-member',image_user_id='56da68482e3a4fb582dcccad45f8f71b',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:43:19Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=b157065e-5625-4012-8e6f-9b22cef56ddc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "address": "fa:16:3e:fe:69:cb", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa5ce613-03", "ovs_interfaceid": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.787 227766 DEBUG nova.network.os_vif_util [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "address": "fa:16:3e:fe:69:cb", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa5ce613-03", "ovs_interfaceid": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.788 227766 DEBUG nova.network.os_vif_util [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:69:cb,bridge_name='br-int',has_traffic_filtering=True,id=fa5ce613-0317-4fe5-8ae8-93daf23d11c0,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa5ce613-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.788 227766 DEBUG os_vif [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:69:cb,bridge_name='br-int',has_traffic_filtering=True,id=fa5ce613-0317-4fe5-8ae8-93daf23d11c0,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa5ce613-03') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.789 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.789 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.790 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.794 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.795 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa5ce613-03, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.795 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfa5ce613-03, col_values=(('external_ids', {'iface-id': 'fa5ce613-0317-4fe5-8ae8-93daf23d11c0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fe:69:cb', 'vm-uuid': 'b157065e-5625-4012-8e6f-9b22cef56ddc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.842 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:26 np0005593234 NetworkManager[48942]: <info>  [1769161406.8438] manager: (tapfa5ce613-03): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.846 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.849 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.850 227766 INFO os_vif [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:69:cb,bridge_name='br-int',has_traffic_filtering=True,id=fa5ce613-0317-4fe5-8ae8-93daf23d11c0,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa5ce613-03')#033[00m
Jan 23 04:43:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:26.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:26 np0005593234 nova_compute[227762]: 2026-01-23 09:43:26.995 227766 DEBUG oslo_concurrency.lockutils [None req-f73b2e30-2bec-47fe-8f39-3d3afafa3a90 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 2.029s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.039 227766 DEBUG nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.040 227766 DEBUG nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.040 227766 DEBUG nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No VIF found with MAC fa:16:3e:fe:69:cb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.040 227766 INFO nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Using config drive#033[00m
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.068 227766 DEBUG nova.storage.rbd_utils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image b157065e-5625-4012-8e6f-9b22cef56ddc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:43:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:27.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.568 227766 INFO nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Creating config drive at /var/lib/nova/instances/b157065e-5625-4012-8e6f-9b22cef56ddc/disk.config#033[00m
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.572 227766 DEBUG oslo_concurrency.processutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b157065e-5625-4012-8e6f-9b22cef56ddc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgsnb8r53 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.699 227766 DEBUG oslo_concurrency.processutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b157065e-5625-4012-8e6f-9b22cef56ddc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgsnb8r53" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.731 227766 DEBUG nova.storage.rbd_utils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image b157065e-5625-4012-8e6f-9b22cef56ddc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.735 227766 DEBUG oslo_concurrency.processutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b157065e-5625-4012-8e6f-9b22cef56ddc/disk.config b157065e-5625-4012-8e6f-9b22cef56ddc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.921 227766 DEBUG oslo_concurrency.processutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b157065e-5625-4012-8e6f-9b22cef56ddc/disk.config b157065e-5625-4012-8e6f-9b22cef56ddc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.186s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.922 227766 INFO nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Deleting local config drive /var/lib/nova/instances/b157065e-5625-4012-8e6f-9b22cef56ddc/disk.config because it was imported into RBD.#033[00m
Jan 23 04:43:27 np0005593234 kernel: tapfa5ce613-03: entered promiscuous mode
Jan 23 04:43:27 np0005593234 NetworkManager[48942]: <info>  [1769161407.9664] manager: (tapfa5ce613-03): new Tun device (/org/freedesktop/NetworkManager/Devices/76)
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.966 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:27 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:27Z|00130|binding|INFO|Claiming lport fa5ce613-0317-4fe5-8ae8-93daf23d11c0 for this chassis.
Jan 23 04:43:27 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:27Z|00131|binding|INFO|fa5ce613-0317-4fe5-8ae8-93daf23d11c0: Claiming fa:16:3e:fe:69:cb 10.100.0.4
Jan 23 04:43:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:27.975 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:69:cb 10.100.0.4'], port_security=['fa:16:3e:fe:69:cb 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b157065e-5625-4012-8e6f-9b22cef56ddc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2696fd4-5fd7-4934-88ac-40162fad555d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05bc71a77710455e8b34ead7fec81a31', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ab8b868e-d8b1-4e1d-87d5-538f88b95e73', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3459fea4-e2ba-482e-8d51-91ef5b74d71a, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=fa5ce613-0317-4fe5-8ae8-93daf23d11c0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:43:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:27.977 144381 INFO neutron.agent.ovn.metadata.agent [-] Port fa5ce613-0317-4fe5-8ae8-93daf23d11c0 in datapath c2696fd4-5fd7-4934-88ac-40162fad555d bound to our chassis#033[00m
Jan 23 04:43:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:27.979 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c2696fd4-5fd7-4934-88ac-40162fad555d#033[00m
Jan 23 04:43:27 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:27Z|00132|binding|INFO|Setting lport fa5ce613-0317-4fe5-8ae8-93daf23d11c0 ovn-installed in OVS
Jan 23 04:43:27 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:27Z|00133|binding|INFO|Setting lport fa5ce613-0317-4fe5-8ae8-93daf23d11c0 up in Southbound
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.984 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:27 np0005593234 nova_compute[227762]: 2026-01-23 09:43:27.987 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:27.995 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[41db0cfa-c520-4bec-8e36-be26ade8a455]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:27.996 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc2696fd4-51 in ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:43:27 np0005593234 systemd-udevd[250215]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:43:28 np0005593234 systemd-machined[195626]: New machine qemu-22-instance-00000032.
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.000 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc2696fd4-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.001 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d32a7e9e-bea0-4cb0-aeaf-d73a9119731d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.002 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cb2904c1-e819-437f-8bc4-ebb1753c465b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 NetworkManager[48942]: <info>  [1769161408.0116] device (tapfa5ce613-03): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:43:28 np0005593234 NetworkManager[48942]: <info>  [1769161408.0124] device (tapfa5ce613-03): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:43:28 np0005593234 systemd[1]: Started Virtual Machine qemu-22-instance-00000032.
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.015 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c6c987-d51b-418d-9f9f-305bfaacdb7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.032 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5f94ff3d-7e57-46e1-b560-1fa64eb970d7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.060 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[62d2785b-b915-4ba5-9536-05291e258f13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 systemd-udevd[250218]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.066 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[527b1367-c402-4c97-8e17-2f1022e8971d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 NetworkManager[48942]: <info>  [1769161408.0678] manager: (tapc2696fd4-50): new Veth device (/org/freedesktop/NetworkManager/Devices/77)
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.098 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[07281238-4563-4bd7-ad4c-ab6cc9ea3401]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.100 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[2c709674-6f08-4dff-bc4f-16d78f0a550d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 NetworkManager[48942]: <info>  [1769161408.1190] device (tapc2696fd4-50): carrier: link connected
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.124 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ce38bbcf-1e8a-4fff-be52-2e1dcc7c1959]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.139 227766 DEBUG nova.network.neutron [req-886a421b-e397-4c0e-b207-21a84cc71e5e req-8e961fbf-7cb5-4dd5-a85d-e572f118603a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Updated VIF entry in instance network info cache for port fa5ce613-0317-4fe5-8ae8-93daf23d11c0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.139 227766 DEBUG nova.network.neutron [req-886a421b-e397-4c0e-b207-21a84cc71e5e req-8e961fbf-7cb5-4dd5-a85d-e572f118603a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Updating instance_info_cache with network_info: [{"id": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "address": "fa:16:3e:fe:69:cb", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa5ce613-03", "ovs_interfaceid": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.141 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6f573dd7-e561-4ba9-9307-80f7c6d074fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc2696fd4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:02:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530307, 'reachable_time': 35307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250247, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.156 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d74aa9d3-81a3-4b5c-a3b4-7ef2be09fae5]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:20d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 530307, 'tstamp': 530307}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 250248, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.161 227766 DEBUG oslo_concurrency.lockutils [req-886a421b-e397-4c0e-b207-21a84cc71e5e req-8e961fbf-7cb5-4dd5-a85d-e572f118603a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-b157065e-5625-4012-8e6f-9b22cef56ddc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.170 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f2df64e1-43d3-4f4a-bebd-bd4d030418bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc2696fd4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:02:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530307, 'reachable_time': 35307, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 250249, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.202 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[22a09681-90f8-4123-a7db-98072276b8b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.286 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c5135f3f-5670-4572-aa39-a4ac7ad9c2b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.289 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2696fd4-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.289 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.290 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc2696fd4-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:28 np0005593234 kernel: tapc2696fd4-50: entered promiscuous mode
Jan 23 04:43:28 np0005593234 NetworkManager[48942]: <info>  [1769161408.2931] manager: (tapc2696fd4-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.294 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.295 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc2696fd4-50, col_values=(('external_ids', {'iface-id': '38b24332-af6b-47d2-95fe-400f5feeadcb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:28 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:28Z|00134|binding|INFO|Releasing lport 38b24332-af6b-47d2-95fe-400f5feeadcb from this chassis (sb_readonly=0)
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.312 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.313 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.314 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8cdbfe2d-6408-4cd1-ae9c-096ecb349e8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.316 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-c2696fd4-5fd7-4934-88ac-40162fad555d
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID c2696fd4-5fd7-4934-88ac-40162fad555d
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.316 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'env', 'PROCESS_TAG=haproxy-c2696fd4-5fd7-4934-88ac-40162fad555d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c2696fd4-5fd7-4934-88ac-40162fad555d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.486 227766 DEBUG nova.compute.manager [req-400c91c0-204e-4d47-b691-9d4590807ded req-38923b5f-9f4d-46cd-9125-eb27ce940ec4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Received event network-vif-plugged-fa5ce613-0317-4fe5-8ae8-93daf23d11c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.486 227766 DEBUG oslo_concurrency.lockutils [req-400c91c0-204e-4d47-b691-9d4590807ded req-38923b5f-9f4d-46cd-9125-eb27ce940ec4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.492 227766 DEBUG oslo_concurrency.lockutils [req-400c91c0-204e-4d47-b691-9d4590807ded req-38923b5f-9f4d-46cd-9125-eb27ce940ec4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.005s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.492 227766 DEBUG oslo_concurrency.lockutils [req-400c91c0-204e-4d47-b691-9d4590807ded req-38923b5f-9f4d-46cd-9125-eb27ce940ec4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.492 227766 DEBUG nova.compute.manager [req-400c91c0-204e-4d47-b691-9d4590807ded req-38923b5f-9f4d-46cd-9125-eb27ce940ec4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Processing event network-vif-plugged-fa5ce613-0317-4fe5-8ae8-93daf23d11c0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:43:28 np0005593234 podman[250281]: 2026-01-23 09:43:28.717479527 +0000 UTC m=+0.056477591 container create 5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 04:43:28 np0005593234 systemd[1]: Started libpod-conmon-5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6.scope.
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.767 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.768 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:28 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:43:28 np0005593234 podman[250281]: 2026-01-23 09:43:28.684558241 +0000 UTC m=+0.023556325 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:43:28 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2c77d33942b0aecdfdcb4d862e9a5a49e267917ba8ea78009c9e706dba707010/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:43:28 np0005593234 podman[250281]: 2026-01-23 09:43:28.795610783 +0000 UTC m=+0.134608847 container init 5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:43:28 np0005593234 podman[250281]: 2026-01-23 09:43:28.803721596 +0000 UTC m=+0.142719660 container start 5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 04:43:28 np0005593234 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[250331]: [NOTICE]   (250342) : New worker (250345) forked
Jan 23 04:43:28 np0005593234 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[250331]: [NOTICE]   (250342) : Loading success.
Jan 23 04:43:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:28.868 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.872 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161408.8722749, b157065e-5625-4012-8e6f-9b22cef56ddc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.873 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] VM Started (Lifecycle Event)#033[00m
Jan 23 04:43:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:28.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.875 227766 DEBUG nova.compute.manager [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.878 227766 DEBUG nova.virt.libvirt.driver [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.880 227766 INFO nova.virt.libvirt.driver [-] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Instance spawned successfully.#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.881 227766 INFO nova.compute.manager [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Took 9.01 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.881 227766 DEBUG nova.compute.manager [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.914 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.917 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.945 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.946 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161408.8724504, b157065e-5625-4012-8e6f-9b22cef56ddc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.947 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.964 227766 INFO nova.compute.manager [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Took 10.25 seconds to build instance.#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.976 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.981 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161408.8777306, b157065e-5625-4012-8e6f-9b22cef56ddc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:43:28 np0005593234 nova_compute[227762]: 2026-01-23 09:43:28.982 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.003 227766 DEBUG oslo_concurrency.lockutils [None req-0a0efcaa-173b-49d6-98ff-019a8e6da9cf 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.413s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.007 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.011 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.119 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:29.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.320 227766 DEBUG oslo_concurrency.lockutils [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Acquiring lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.321 227766 DEBUG oslo_concurrency.lockutils [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.321 227766 DEBUG oslo_concurrency.lockutils [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Acquiring lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.322 227766 DEBUG oslo_concurrency.lockutils [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.322 227766 DEBUG oslo_concurrency.lockutils [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.323 227766 INFO nova.compute.manager [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Terminating instance#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.324 227766 DEBUG nova.compute.manager [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:43:29 np0005593234 kernel: tapbd040948-a6 (unregistering): left promiscuous mode
Jan 23 04:43:29 np0005593234 NetworkManager[48942]: <info>  [1769161409.3876] device (tapbd040948-a6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:43:29 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:29Z|00135|binding|INFO|Releasing lport bd040948-a661-431e-8f76-623ac2452642 from this chassis (sb_readonly=0)
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.397 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:29 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:29Z|00136|binding|INFO|Setting lport bd040948-a661-431e-8f76-623ac2452642 down in Southbound
Jan 23 04:43:29 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:29Z|00137|binding|INFO|Removing iface tapbd040948-a6 ovn-installed in OVS
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.400 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:29.407 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3e:ce:a8 10.100.0.11'], port_security=['fa:16:3e:3e:ce:a8 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4ef48fbd-b990-487c-94a4-0149ee9204c9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f19933f5-cfe3-4319-a83b-b72dde692ab6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4fd9229340ed4bf3a3a72baa6985a3e3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '232ea62b-b441-41b5-8457-7d5744ac9ac2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.199'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=91ae19d5-b9ed-444d-b1cd-8fb0c58abf8d, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=bd040948-a661-431e-8f76-623ac2452642) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:43:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:29.408 144381 INFO neutron.agent.ovn.metadata.agent [-] Port bd040948-a661-431e-8f76-623ac2452642 in datapath f19933f5-cfe3-4319-a83b-b72dde692ab6 unbound from our chassis#033[00m
Jan 23 04:43:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:29.410 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f19933f5-cfe3-4319-a83b-b72dde692ab6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:43:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:29.410 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9624deef-d2e7-4f1b-bdd1-b956b2a9de16]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:29.411 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6 namespace which is not needed anymore#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.413 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:29 np0005593234 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Jan 23 04:43:29 np0005593234 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000002f.scope: Consumed 14.335s CPU time.
Jan 23 04:43:29 np0005593234 systemd-machined[195626]: Machine qemu-21-instance-0000002f terminated.
Jan 23 04:43:29 np0005593234 NetworkManager[48942]: <info>  [1769161409.5442] manager: (tapbd040948-a6): new Tun device (/org/freedesktop/NetworkManager/Devices/79)
Jan 23 04:43:29 np0005593234 neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6[249546]: [NOTICE]   (249550) : haproxy version is 2.8.14-c23fe91
Jan 23 04:43:29 np0005593234 neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6[249546]: [NOTICE]   (249550) : path to executable is /usr/sbin/haproxy
Jan 23 04:43:29 np0005593234 neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6[249546]: [WARNING]  (249550) : Exiting Master process...
Jan 23 04:43:29 np0005593234 neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6[249546]: [ALERT]    (249550) : Current worker (249552) exited with code 143 (Terminated)
Jan 23 04:43:29 np0005593234 neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6[249546]: [WARNING]  (249550) : All workers exited. Exiting... (0)
Jan 23 04:43:29 np0005593234 systemd[1]: libpod-dd0c97a4c624ad9fccfed7cebb7a07d619360ba4b54b5a5ab331c38b3fbb615c.scope: Deactivated successfully.
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.558 227766 INFO nova.virt.libvirt.driver [-] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Instance destroyed successfully.#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.559 227766 DEBUG nova.objects.instance [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lazy-loading 'resources' on Instance uuid 4ef48fbd-b990-487c-94a4-0149ee9204c9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:43:29 np0005593234 podman[250375]: 2026-01-23 09:43:29.565300681 +0000 UTC m=+0.048789582 container died dd0c97a4c624ad9fccfed7cebb7a07d619360ba4b54b5a5ab331c38b3fbb615c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.577 227766 DEBUG nova.virt.libvirt.vif [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:42:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-UpdateMultiattachVolumeNegativeTest-server-1063948538',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-updatemultiattachvolumenegativetest-server-1063948538',id=47,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOW/D7eFeqMjBvekfY9VqlM3EY9Lv7j0wpym0wwbZXZxi5xiYHs3Y+SGaRgTVfBABcO7R/jAYgVwXr4x4dmhbR/VewPXJyWaKlJux19vulauSxlm5JZb+T430JhpaEya2w==',key_name='tempest-keypair-1370438234',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:42:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4fd9229340ed4bf3a3a72baa6985a3e3',ramdisk_id='',reservation_id='r-rg1r02jt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-UpdateMultiattachVolumeNegativeTest-1520463047',owner_user_name='tempest-UpdateMultiattachVolumeNegativeTest-1520463047-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:42:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='187ce0cedde344a3b09ca4560410580e',uuid=4ef48fbd-b990-487c-94a4-0149ee9204c9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "bd040948-a661-431e-8f76-623ac2452642", "address": "fa:16:3e:3e:ce:a8", "network": {"id": "f19933f5-cfe3-4319-a83b-b72dde692ab6", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-1169169895-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4fd9229340ed4bf3a3a72baa6985a3e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd040948-a6", "ovs_interfaceid": "bd040948-a661-431e-8f76-623ac2452642", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.578 227766 DEBUG nova.network.os_vif_util [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Converting VIF {"id": "bd040948-a661-431e-8f76-623ac2452642", "address": "fa:16:3e:3e:ce:a8", "network": {"id": "f19933f5-cfe3-4319-a83b-b72dde692ab6", "bridge": "br-int", "label": "tempest-UpdateMultiattachVolumeNegativeTest-1169169895-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4fd9229340ed4bf3a3a72baa6985a3e3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbd040948-a6", "ovs_interfaceid": "bd040948-a661-431e-8f76-623ac2452642", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.579 227766 DEBUG nova.network.os_vif_util [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3e:ce:a8,bridge_name='br-int',has_traffic_filtering=True,id=bd040948-a661-431e-8f76-623ac2452642,network=Network(f19933f5-cfe3-4319-a83b-b72dde692ab6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd040948-a6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.579 227766 DEBUG os_vif [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:ce:a8,bridge_name='br-int',has_traffic_filtering=True,id=bd040948-a661-431e-8f76-623ac2452642,network=Network(f19933f5-cfe3-4319-a83b-b72dde692ab6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd040948-a6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.581 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.582 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd040948-a6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.583 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.585 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.587 227766 INFO os_vif [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3e:ce:a8,bridge_name='br-int',has_traffic_filtering=True,id=bd040948-a661-431e-8f76-623ac2452642,network=Network(f19933f5-cfe3-4319-a83b-b72dde692ab6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbd040948-a6')#033[00m
Jan 23 04:43:29 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-dd0c97a4c624ad9fccfed7cebb7a07d619360ba4b54b5a5ab331c38b3fbb615c-userdata-shm.mount: Deactivated successfully.
Jan 23 04:43:29 np0005593234 systemd[1]: var-lib-containers-storage-overlay-88f0b9f8bba97f04567158b247cd9b226a95c540c09a336a51f14cb86bbf97d2-merged.mount: Deactivated successfully.
Jan 23 04:43:29 np0005593234 podman[250375]: 2026-01-23 09:43:29.616088424 +0000 UTC m=+0.099577325 container cleanup dd0c97a4c624ad9fccfed7cebb7a07d619360ba4b54b5a5ab331c38b3fbb615c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:43:29 np0005593234 systemd[1]: libpod-conmon-dd0c97a4c624ad9fccfed7cebb7a07d619360ba4b54b5a5ab331c38b3fbb615c.scope: Deactivated successfully.
Jan 23 04:43:29 np0005593234 podman[250432]: 2026-01-23 09:43:29.682263238 +0000 UTC m=+0.041904948 container remove dd0c97a4c624ad9fccfed7cebb7a07d619360ba4b54b5a5ab331c38b3fbb615c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:43:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:29.688 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0d0be9dc-6268-40ae-8b79-de9ea082c58e]: (4, ('Fri Jan 23 09:43:29 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6 (dd0c97a4c624ad9fccfed7cebb7a07d619360ba4b54b5a5ab331c38b3fbb615c)\ndd0c97a4c624ad9fccfed7cebb7a07d619360ba4b54b5a5ab331c38b3fbb615c\nFri Jan 23 09:43:29 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6 (dd0c97a4c624ad9fccfed7cebb7a07d619360ba4b54b5a5ab331c38b3fbb615c)\ndd0c97a4c624ad9fccfed7cebb7a07d619360ba4b54b5a5ab331c38b3fbb615c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:29.690 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[deb968bc-49db-4f06-ae17-cd2bf90b7e60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:29.691 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf19933f5-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.692 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:29 np0005593234 kernel: tapf19933f5-c0: left promiscuous mode
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.694 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:29.697 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5ef2db40-6d10-43ac-a4f2-47926c5e8eb7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:29 np0005593234 nova_compute[227762]: 2026-01-23 09:43:29.708 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:29.711 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[effdb565-0af5-4436-8c58-85bb1f03f592]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:29.712 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5ced11e8-3417-4824-9107-b01acdb038b2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:29.726 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec15932-b87d-4241-8387-085edb8964d7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 525249, 'reachable_time': 32258, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250451, 'error': None, 'target': 'ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:29 np0005593234 systemd[1]: run-netns-ovnmeta\x2df19933f5\x2dcfe3\x2d4319\x2da83b\x2db72dde692ab6.mount: Deactivated successfully.
Jan 23 04:43:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:29.731 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f19933f5-cfe3-4319-a83b-b72dde692ab6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:43:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:29.731 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f1ef74-7026-4c30-9b98-466bc67c62ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:30 np0005593234 nova_compute[227762]: 2026-01-23 09:43:30.048 227766 INFO nova.virt.libvirt.driver [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Deleting instance files /var/lib/nova/instances/4ef48fbd-b990-487c-94a4-0149ee9204c9_del#033[00m
Jan 23 04:43:30 np0005593234 nova_compute[227762]: 2026-01-23 09:43:30.049 227766 INFO nova.virt.libvirt.driver [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Deletion of /var/lib/nova/instances/4ef48fbd-b990-487c-94a4-0149ee9204c9_del complete#033[00m
Jan 23 04:43:30 np0005593234 nova_compute[227762]: 2026-01-23 09:43:30.282 227766 INFO nova.compute.manager [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Took 0.96 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:43:30 np0005593234 nova_compute[227762]: 2026-01-23 09:43:30.283 227766 DEBUG oslo.service.loopingcall [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:43:30 np0005593234 nova_compute[227762]: 2026-01-23 09:43:30.284 227766 DEBUG nova.compute.manager [-] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:43:30 np0005593234 nova_compute[227762]: 2026-01-23 09:43:30.284 227766 DEBUG nova.network.neutron [-] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:43:30 np0005593234 nova_compute[227762]: 2026-01-23 09:43:30.609 227766 DEBUG nova.compute.manager [req-c56fec95-33a9-48ae-bcdd-15d7fe36dc7f req-b750f56e-f58a-45ed-9261-5e7f1d5b1daa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Received event network-vif-plugged-fa5ce613-0317-4fe5-8ae8-93daf23d11c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:30 np0005593234 nova_compute[227762]: 2026-01-23 09:43:30.610 227766 DEBUG oslo_concurrency.lockutils [req-c56fec95-33a9-48ae-bcdd-15d7fe36dc7f req-b750f56e-f58a-45ed-9261-5e7f1d5b1daa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:30 np0005593234 nova_compute[227762]: 2026-01-23 09:43:30.611 227766 DEBUG oslo_concurrency.lockutils [req-c56fec95-33a9-48ae-bcdd-15d7fe36dc7f req-b750f56e-f58a-45ed-9261-5e7f1d5b1daa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:30 np0005593234 nova_compute[227762]: 2026-01-23 09:43:30.612 227766 DEBUG oslo_concurrency.lockutils [req-c56fec95-33a9-48ae-bcdd-15d7fe36dc7f req-b750f56e-f58a-45ed-9261-5e7f1d5b1daa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:30 np0005593234 nova_compute[227762]: 2026-01-23 09:43:30.613 227766 DEBUG nova.compute.manager [req-c56fec95-33a9-48ae-bcdd-15d7fe36dc7f req-b750f56e-f58a-45ed-9261-5e7f1d5b1daa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] No waiting events found dispatching network-vif-plugged-fa5ce613-0317-4fe5-8ae8-93daf23d11c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:43:30 np0005593234 nova_compute[227762]: 2026-01-23 09:43:30.614 227766 WARNING nova.compute.manager [req-c56fec95-33a9-48ae-bcdd-15d7fe36dc7f req-b750f56e-f58a-45ed-9261-5e7f1d5b1daa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Received unexpected event network-vif-plugged-fa5ce613-0317-4fe5-8ae8-93daf23d11c0 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:43:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:30.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:31.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.166 227766 DEBUG oslo_concurrency.lockutils [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "b157065e-5625-4012-8e6f-9b22cef56ddc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.167 227766 DEBUG oslo_concurrency.lockutils [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.167 227766 DEBUG oslo_concurrency.lockutils [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.167 227766 DEBUG oslo_concurrency.lockutils [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.168 227766 DEBUG oslo_concurrency.lockutils [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:31 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.175 227766 INFO nova.compute.manager [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Terminating instance#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.176 227766 DEBUG nova.compute.manager [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:43:31 np0005593234 kernel: tapfa5ce613-03 (unregistering): left promiscuous mode
Jan 23 04:43:31 np0005593234 NetworkManager[48942]: <info>  [1769161411.2178] device (tapfa5ce613-03): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.219 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:31 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:31Z|00138|binding|INFO|Releasing lport fa5ce613-0317-4fe5-8ae8-93daf23d11c0 from this chassis (sb_readonly=0)
Jan 23 04:43:31 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:31Z|00139|binding|INFO|Setting lport fa5ce613-0317-4fe5-8ae8-93daf23d11c0 down in Southbound
Jan 23 04:43:31 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:31Z|00140|binding|INFO|Removing iface tapfa5ce613-03 ovn-installed in OVS
Jan 23 04:43:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:31.227 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:69:cb 10.100.0.4'], port_security=['fa:16:3e:fe:69:cb 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'b157065e-5625-4012-8e6f-9b22cef56ddc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2696fd4-5fd7-4934-88ac-40162fad555d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05bc71a77710455e8b34ead7fec81a31', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ab8b868e-d8b1-4e1d-87d5-538f88b95e73', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3459fea4-e2ba-482e-8d51-91ef5b74d71a, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=fa5ce613-0317-4fe5-8ae8-93daf23d11c0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:43:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:31.228 144381 INFO neutron.agent.ovn.metadata.agent [-] Port fa5ce613-0317-4fe5-8ae8-93daf23d11c0 in datapath c2696fd4-5fd7-4934-88ac-40162fad555d unbound from our chassis#033[00m
Jan 23 04:43:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:31.230 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c2696fd4-5fd7-4934-88ac-40162fad555d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:43:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:31.231 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e1efc3cb-dba2-4e3a-83c7-25c302990fe1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:31.231 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d namespace which is not needed anymore#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.240 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.249 227766 DEBUG nova.compute.manager [req-1d0d5163-b8f1-444d-a46f-649cfcbb74b9 req-30c7244b-25f4-4763-8046-d46247b6437d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Received event network-vif-unplugged-bd040948-a661-431e-8f76-623ac2452642 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.250 227766 DEBUG oslo_concurrency.lockutils [req-1d0d5163-b8f1-444d-a46f-649cfcbb74b9 req-30c7244b-25f4-4763-8046-d46247b6437d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.250 227766 DEBUG oslo_concurrency.lockutils [req-1d0d5163-b8f1-444d-a46f-649cfcbb74b9 req-30c7244b-25f4-4763-8046-d46247b6437d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.250 227766 DEBUG oslo_concurrency.lockutils [req-1d0d5163-b8f1-444d-a46f-649cfcbb74b9 req-30c7244b-25f4-4763-8046-d46247b6437d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.251 227766 DEBUG nova.compute.manager [req-1d0d5163-b8f1-444d-a46f-649cfcbb74b9 req-30c7244b-25f4-4763-8046-d46247b6437d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] No waiting events found dispatching network-vif-unplugged-bd040948-a661-431e-8f76-623ac2452642 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.251 227766 DEBUG nova.compute.manager [req-1d0d5163-b8f1-444d-a46f-649cfcbb74b9 req-30c7244b-25f4-4763-8046-d46247b6437d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Received event network-vif-unplugged-bd040948-a661-431e-8f76-623ac2452642 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.251 227766 DEBUG nova.compute.manager [req-1d0d5163-b8f1-444d-a46f-649cfcbb74b9 req-30c7244b-25f4-4763-8046-d46247b6437d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Received event network-vif-plugged-bd040948-a661-431e-8f76-623ac2452642 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.251 227766 DEBUG oslo_concurrency.lockutils [req-1d0d5163-b8f1-444d-a46f-649cfcbb74b9 req-30c7244b-25f4-4763-8046-d46247b6437d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.252 227766 DEBUG oslo_concurrency.lockutils [req-1d0d5163-b8f1-444d-a46f-649cfcbb74b9 req-30c7244b-25f4-4763-8046-d46247b6437d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.252 227766 DEBUG oslo_concurrency.lockutils [req-1d0d5163-b8f1-444d-a46f-649cfcbb74b9 req-30c7244b-25f4-4763-8046-d46247b6437d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.252 227766 DEBUG nova.compute.manager [req-1d0d5163-b8f1-444d-a46f-649cfcbb74b9 req-30c7244b-25f4-4763-8046-d46247b6437d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] No waiting events found dispatching network-vif-plugged-bd040948-a661-431e-8f76-623ac2452642 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.253 227766 WARNING nova.compute.manager [req-1d0d5163-b8f1-444d-a46f-649cfcbb74b9 req-30c7244b-25f4-4763-8046-d46247b6437d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Received unexpected event network-vif-plugged-bd040948-a661-431e-8f76-623ac2452642 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:43:31 np0005593234 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000032.scope: Deactivated successfully.
Jan 23 04:43:31 np0005593234 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d00000032.scope: Consumed 3.317s CPU time.
Jan 23 04:43:31 np0005593234 systemd-machined[195626]: Machine qemu-22-instance-00000032 terminated.
Jan 23 04:43:31 np0005593234 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[250331]: [NOTICE]   (250342) : haproxy version is 2.8.14-c23fe91
Jan 23 04:43:31 np0005593234 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[250331]: [NOTICE]   (250342) : path to executable is /usr/sbin/haproxy
Jan 23 04:43:31 np0005593234 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[250331]: [WARNING]  (250342) : Exiting Master process...
Jan 23 04:43:31 np0005593234 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[250331]: [ALERT]    (250342) : Current worker (250345) exited with code 143 (Terminated)
Jan 23 04:43:31 np0005593234 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[250331]: [WARNING]  (250342) : All workers exited. Exiting... (0)
Jan 23 04:43:31 np0005593234 systemd[1]: libpod-5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6.scope: Deactivated successfully.
Jan 23 04:43:31 np0005593234 conmon[250331]: conmon 5c4646b22495fe33b160 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6.scope/container/memory.events
Jan 23 04:43:31 np0005593234 podman[250478]: 2026-01-23 09:43:31.361820873 +0000 UTC m=+0.041080303 container died 5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 04:43:31 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6-userdata-shm.mount: Deactivated successfully.
Jan 23 04:43:31 np0005593234 systemd[1]: var-lib-containers-storage-overlay-2c77d33942b0aecdfdcb4d862e9a5a49e267917ba8ea78009c9e706dba707010-merged.mount: Deactivated successfully.
Jan 23 04:43:31 np0005593234 podman[250478]: 2026-01-23 09:43:31.392879601 +0000 UTC m=+0.072139031 container cleanup 5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 04:43:31 np0005593234 systemd[1]: libpod-conmon-5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6.scope: Deactivated successfully.
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.408 227766 INFO nova.virt.libvirt.driver [-] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Instance destroyed successfully.#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.408 227766 DEBUG nova.objects.instance [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'resources' on Instance uuid b157065e-5625-4012-8e6f-9b22cef56ddc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.434 227766 DEBUG nova.virt.libvirt.vif [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:43:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1954358251',display_name='tempest-ImagesTestJSON-server-1954358251',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1954358251',id=50,image_ref='b5a73d98-c27b-4745-95e1-6675f24e35ae',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:43:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-w1scufj5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_boot_roles='member,reader',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virt
io',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='47eda3a7-c47a-48cc-8381-a702e2e27bfc',image_min_disk='1',image_min_ram='0',image_owner_id='05bc71a77710455e8b34ead7fec81a31',image_owner_project_name='tempest-ImagesTestJSON-1507872051',image_owner_user_name='tempest-ImagesTestJSON-1507872051-project-member',image_user_id='56da68482e3a4fb582dcccad45f8f71b',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:43:28Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=b157065e-5625-4012-8e6f-9b22cef56ddc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "address": "fa:16:3e:fe:69:cb", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa5ce613-03", "ovs_interfaceid": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.435 227766 DEBUG nova.network.os_vif_util [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "address": "fa:16:3e:fe:69:cb", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa5ce613-03", "ovs_interfaceid": "fa5ce613-0317-4fe5-8ae8-93daf23d11c0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.435 227766 DEBUG nova.network.os_vif_util [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:69:cb,bridge_name='br-int',has_traffic_filtering=True,id=fa5ce613-0317-4fe5-8ae8-93daf23d11c0,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa5ce613-03') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.436 227766 DEBUG os_vif [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:69:cb,bridge_name='br-int',has_traffic_filtering=True,id=fa5ce613-0317-4fe5-8ae8-93daf23d11c0,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa5ce613-03') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.437 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.437 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa5ce613-03, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:31 np0005593234 podman[250512]: 2026-01-23 09:43:31.450385994 +0000 UTC m=+0.036830340 container remove 5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.481 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.482 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.484 227766 INFO os_vif [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:69:cb,bridge_name='br-int',has_traffic_filtering=True,id=fa5ce613-0317-4fe5-8ae8-93daf23d11c0,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa5ce613-03')#033[00m
Jan 23 04:43:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:31.485 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d99f7480-e128-45ab-a6b7-a0bcb664ed2b]: (4, ('Fri Jan 23 09:43:31 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d (5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6)\n5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6\nFri Jan 23 09:43:31 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d (5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6)\n5c4646b22495fe33b16040d4b470a0ef606a44989d0c1d84369492604013e7c6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:31.489 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[10451010-b55a-47c8-8402-de4a7b963319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:31.490 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2696fd4-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:31 np0005593234 kernel: tapc2696fd4-50: left promiscuous mode
Jan 23 04:43:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:31.496 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[02debe1c-426e-4162-acc0-b26923516900]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.508 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.513 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:31.514 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[62fc0229-f9c3-4636-b04d-5a85eca96bf4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:31.515 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8d9555b7-edbb-4a9f-9240-61d92f7e58a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:31.530 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1df37d20-5138-4d0f-befa-cb2eb61571df]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 530301, 'reachable_time': 22619, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 250546, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:31 np0005593234 systemd[1]: run-netns-ovnmeta\x2dc2696fd4\x2d5fd7\x2d4934\x2d88ac\x2d40162fad555d.mount: Deactivated successfully.
Jan 23 04:43:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:31.535 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:43:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:31.535 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[299b706e-a14a-4c3f-b187-467a96d1ebfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.894 227766 INFO nova.virt.libvirt.driver [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Deleting instance files /var/lib/nova/instances/b157065e-5625-4012-8e6f-9b22cef56ddc_del#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.895 227766 INFO nova.virt.libvirt.driver [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Deletion of /var/lib/nova/instances/b157065e-5625-4012-8e6f-9b22cef56ddc_del complete#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.974 227766 INFO nova.compute.manager [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Took 0.80 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.975 227766 DEBUG oslo.service.loopingcall [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.975 227766 DEBUG nova.compute.manager [-] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:43:31 np0005593234 nova_compute[227762]: 2026-01-23 09:43:31.975 227766 DEBUG nova.network.neutron [-] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:43:32 np0005593234 nova_compute[227762]: 2026-01-23 09:43:32.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:43:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:32.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:33.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.447 227766 DEBUG nova.network.neutron [-] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.454 227766 DEBUG nova.compute.manager [req-c0641b19-fcdd-4a1e-bd8c-77a9fcdf82d4 req-53decf1b-6888-4a3c-85f9-eb70f1b9da8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Received event network-vif-unplugged-fa5ce613-0317-4fe5-8ae8-93daf23d11c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.454 227766 DEBUG oslo_concurrency.lockutils [req-c0641b19-fcdd-4a1e-bd8c-77a9fcdf82d4 req-53decf1b-6888-4a3c-85f9-eb70f1b9da8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.455 227766 DEBUG oslo_concurrency.lockutils [req-c0641b19-fcdd-4a1e-bd8c-77a9fcdf82d4 req-53decf1b-6888-4a3c-85f9-eb70f1b9da8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.455 227766 DEBUG oslo_concurrency.lockutils [req-c0641b19-fcdd-4a1e-bd8c-77a9fcdf82d4 req-53decf1b-6888-4a3c-85f9-eb70f1b9da8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.456 227766 DEBUG nova.compute.manager [req-c0641b19-fcdd-4a1e-bd8c-77a9fcdf82d4 req-53decf1b-6888-4a3c-85f9-eb70f1b9da8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] No waiting events found dispatching network-vif-unplugged-fa5ce613-0317-4fe5-8ae8-93daf23d11c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.456 227766 DEBUG nova.compute.manager [req-c0641b19-fcdd-4a1e-bd8c-77a9fcdf82d4 req-53decf1b-6888-4a3c-85f9-eb70f1b9da8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Received event network-vif-unplugged-fa5ce613-0317-4fe5-8ae8-93daf23d11c0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.456 227766 DEBUG nova.compute.manager [req-c0641b19-fcdd-4a1e-bd8c-77a9fcdf82d4 req-53decf1b-6888-4a3c-85f9-eb70f1b9da8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Received event network-vif-plugged-fa5ce613-0317-4fe5-8ae8-93daf23d11c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.456 227766 DEBUG oslo_concurrency.lockutils [req-c0641b19-fcdd-4a1e-bd8c-77a9fcdf82d4 req-53decf1b-6888-4a3c-85f9-eb70f1b9da8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.456 227766 DEBUG oslo_concurrency.lockutils [req-c0641b19-fcdd-4a1e-bd8c-77a9fcdf82d4 req-53decf1b-6888-4a3c-85f9-eb70f1b9da8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.457 227766 DEBUG oslo_concurrency.lockutils [req-c0641b19-fcdd-4a1e-bd8c-77a9fcdf82d4 req-53decf1b-6888-4a3c-85f9-eb70f1b9da8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.457 227766 DEBUG nova.compute.manager [req-c0641b19-fcdd-4a1e-bd8c-77a9fcdf82d4 req-53decf1b-6888-4a3c-85f9-eb70f1b9da8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] No waiting events found dispatching network-vif-plugged-fa5ce613-0317-4fe5-8ae8-93daf23d11c0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.457 227766 WARNING nova.compute.manager [req-c0641b19-fcdd-4a1e-bd8c-77a9fcdf82d4 req-53decf1b-6888-4a3c-85f9-eb70f1b9da8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Received unexpected event network-vif-plugged-fa5ce613-0317-4fe5-8ae8-93daf23d11c0 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.507 227766 INFO nova.compute.manager [-] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Took 3.22 seconds to deallocate network for instance.#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.684 227766 DEBUG oslo_concurrency.lockutils [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.684 227766 DEBUG oslo_concurrency.lockutils [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:33 np0005593234 nova_compute[227762]: 2026-01-23 09:43:33.726 227766 DEBUG nova.compute.manager [req-74e6067d-bae6-4b2f-805f-655bdde40564 req-7f3aeaaf-39af-41d1-826d-a6363c29588f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Received event network-vif-deleted-bd040948-a661-431e-8f76-623ac2452642 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:34 np0005593234 nova_compute[227762]: 2026-01-23 09:43:34.122 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:34 np0005593234 nova_compute[227762]: 2026-01-23 09:43:34.197 227766 DEBUG oslo_concurrency.processutils [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:34 np0005593234 nova_compute[227762]: 2026-01-23 09:43:34.422 227766 DEBUG nova.network.neutron [-] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:43:34 np0005593234 nova_compute[227762]: 2026-01-23 09:43:34.465 227766 INFO nova.compute.manager [-] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Took 2.49 seconds to deallocate network for instance.#033[00m
Jan 23 04:43:34 np0005593234 nova_compute[227762]: 2026-01-23 09:43:34.545 227766 DEBUG oslo_concurrency.lockutils [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:43:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3604094648' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:43:34 np0005593234 nova_compute[227762]: 2026-01-23 09:43:34.653 227766 DEBUG oslo_concurrency.processutils [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:34 np0005593234 nova_compute[227762]: 2026-01-23 09:43:34.660 227766 DEBUG nova.compute.provider_tree [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:43:34 np0005593234 nova_compute[227762]: 2026-01-23 09:43:34.690 227766 DEBUG nova.scheduler.client.report [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:43:34 np0005593234 nova_compute[227762]: 2026-01-23 09:43:34.720 227766 DEBUG oslo_concurrency.lockutils [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.036s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:34 np0005593234 nova_compute[227762]: 2026-01-23 09:43:34.723 227766 DEBUG oslo_concurrency.lockutils [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.178s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:34 np0005593234 nova_compute[227762]: 2026-01-23 09:43:34.758 227766 INFO nova.scheduler.client.report [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Deleted allocations for instance 4ef48fbd-b990-487c-94a4-0149ee9204c9#033[00m
Jan 23 04:43:34 np0005593234 nova_compute[227762]: 2026-01-23 09:43:34.797 227766 DEBUG oslo_concurrency.processutils [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:34 np0005593234 nova_compute[227762]: 2026-01-23 09:43:34.841 227766 DEBUG oslo_concurrency.lockutils [None req-199f15c8-95de-4e94-847a-28eafd4a594d 187ce0cedde344a3b09ca4560410580e 4fd9229340ed4bf3a3a72baa6985a3e3 - - default default] Lock "4ef48fbd-b990-487c-94a4-0149ee9204c9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.520s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e198 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:34.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:35.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:43:35 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1728650748' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:43:35 np0005593234 nova_compute[227762]: 2026-01-23 09:43:35.228 227766 DEBUG oslo_concurrency.processutils [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:35 np0005593234 nova_compute[227762]: 2026-01-23 09:43:35.234 227766 DEBUG nova.compute.provider_tree [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:43:35 np0005593234 nova_compute[227762]: 2026-01-23 09:43:35.253 227766 DEBUG nova.scheduler.client.report [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:43:35 np0005593234 nova_compute[227762]: 2026-01-23 09:43:35.290 227766 DEBUG oslo_concurrency.lockutils [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.567s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:35 np0005593234 nova_compute[227762]: 2026-01-23 09:43:35.414 227766 INFO nova.scheduler.client.report [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Deleted allocations for instance b157065e-5625-4012-8e6f-9b22cef56ddc#033[00m
Jan 23 04:43:35 np0005593234 nova_compute[227762]: 2026-01-23 09:43:35.498 227766 DEBUG oslo_concurrency.lockutils [None req-0e36e3b6-2d83-49e2-876a-109bb60f21ad 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "b157065e-5625-4012-8e6f-9b22cef56ddc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:35 np0005593234 nova_compute[227762]: 2026-01-23 09:43:35.654 227766 DEBUG nova.compute.manager [req-96bf71e5-d9b0-4fab-9a5b-3386cb545b11 req-9177ed59-67fb-4d98-a294-a792c73b7946 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Received event network-vif-deleted-fa5ce613-0317-4fe5-8ae8-93daf23d11c0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:36 np0005593234 nova_compute[227762]: 2026-01-23 09:43:36.484 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:36.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e199 e199: 3 total, 3 up, 3 in
Jan 23 04:43:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:43:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:37.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:43:37 np0005593234 podman[250648]: 2026-01-23 09:43:37.7931433 +0000 UTC m=+0.090353708 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 23 04:43:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:37.871 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:38.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:39 np0005593234 nova_compute[227762]: 2026-01-23 09:43:39.123 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:39.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:43:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:40.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:43:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:43:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:43:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:41.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:41 np0005593234 nova_compute[227762]: 2026-01-23 09:43:41.488 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:43:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:43:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:43:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:43:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:43:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:42.818 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:42.819 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:42.819 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:42.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:43.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:43 np0005593234 nova_compute[227762]: 2026-01-23 09:43:43.516 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:43 np0005593234 nova_compute[227762]: 2026-01-23 09:43:43.516 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:43 np0005593234 nova_compute[227762]: 2026-01-23 09:43:43.549 227766 DEBUG nova.compute.manager [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:43:43 np0005593234 nova_compute[227762]: 2026-01-23 09:43:43.667 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:43 np0005593234 nova_compute[227762]: 2026-01-23 09:43:43.667 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:43 np0005593234 nova_compute[227762]: 2026-01-23 09:43:43.674 227766 DEBUG nova.virt.hardware [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:43:43 np0005593234 nova_compute[227762]: 2026-01-23 09:43:43.674 227766 INFO nova.compute.claims [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:43:43 np0005593234 nova_compute[227762]: 2026-01-23 09:43:43.787 227766 DEBUG oslo_concurrency.processutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.125 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:43:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4002376692' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.214 227766 DEBUG oslo_concurrency.processutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.218 227766 DEBUG nova.compute.provider_tree [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.238 227766 DEBUG nova.scheduler.client.report [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.267 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.599s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.267 227766 DEBUG nova.compute.manager [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.327 227766 DEBUG nova.compute.manager [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.328 227766 DEBUG nova.network.neutron [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.352 227766 INFO nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.376 227766 DEBUG nova.compute.manager [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.489 227766 DEBUG nova.compute.manager [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.490 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.490 227766 INFO nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Creating image(s)#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.515 227766 DEBUG nova.storage.rbd_utils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.541 227766 DEBUG nova.storage.rbd_utils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.564 227766 DEBUG nova.storage.rbd_utils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.567 227766 DEBUG oslo_concurrency.processutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.589 227766 DEBUG nova.policy [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '56da68482e3a4fb582dcccad45f8f71b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '05bc71a77710455e8b34ead7fec81a31', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.592 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161409.5557075, 4ef48fbd-b990-487c-94a4-0149ee9204c9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.592 227766 INFO nova.compute.manager [-] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.624 227766 DEBUG nova.compute.manager [None req-751672c3-f42e-4dc4-a59b-04ea05d307a4 - - - - - -] [instance: 4ef48fbd-b990-487c-94a4-0149ee9204c9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.626 227766 DEBUG oslo_concurrency.processutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.626 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.627 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.627 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.650 227766 DEBUG nova.storage.rbd_utils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.654 227766 DEBUG oslo_concurrency.processutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e199 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:43:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:44.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.921 227766 DEBUG oslo_concurrency.processutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.267s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:44 np0005593234 nova_compute[227762]: 2026-01-23 09:43:44.994 227766 DEBUG nova.storage.rbd_utils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] resizing rbd image 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:43:45 np0005593234 nova_compute[227762]: 2026-01-23 09:43:45.112 227766 DEBUG nova.objects.instance [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'migration_context' on Instance uuid 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:43:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:43:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:45.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:43:45 np0005593234 nova_compute[227762]: 2026-01-23 09:43:45.436 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:43:45 np0005593234 nova_compute[227762]: 2026-01-23 09:43:45.437 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Ensure instance console log exists: /var/lib/nova/instances/52f4c5ec-125c-4f64-86ef-2e4af50dbd4c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:43:45 np0005593234 nova_compute[227762]: 2026-01-23 09:43:45.437 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:45 np0005593234 nova_compute[227762]: 2026-01-23 09:43:45.438 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:45 np0005593234 nova_compute[227762]: 2026-01-23 09:43:45.438 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:46 np0005593234 nova_compute[227762]: 2026-01-23 09:43:46.234 227766 DEBUG nova.network.neutron [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Successfully created port: 20d99ef4-421c-4778-8024-ee47a467e6ac _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:43:46 np0005593234 nova_compute[227762]: 2026-01-23 09:43:46.408 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161411.4066343, b157065e-5625-4012-8e6f-9b22cef56ddc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:43:46 np0005593234 nova_compute[227762]: 2026-01-23 09:43:46.408 227766 INFO nova.compute.manager [-] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:43:46 np0005593234 nova_compute[227762]: 2026-01-23 09:43:46.429 227766 DEBUG nova.compute.manager [None req-957f7a25-ee41-4a78-ac3f-0a85b88d95fd - - - - - -] [instance: b157065e-5625-4012-8e6f-9b22cef56ddc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:43:46 np0005593234 nova_compute[227762]: 2026-01-23 09:43:46.805 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:46.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:43:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:47.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:43:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:43:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:43:48 np0005593234 nova_compute[227762]: 2026-01-23 09:43:48.409 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:48 np0005593234 nova_compute[227762]: 2026-01-23 09:43:48.680 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:48 np0005593234 nova_compute[227762]: 2026-01-23 09:43:48.749 227766 DEBUG nova.network.neutron [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Successfully updated port: 20d99ef4-421c-4778-8024-ee47a467e6ac _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:43:48 np0005593234 nova_compute[227762]: 2026-01-23 09:43:48.771 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "refresh_cache-52f4c5ec-125c-4f64-86ef-2e4af50dbd4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:43:48 np0005593234 nova_compute[227762]: 2026-01-23 09:43:48.771 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquired lock "refresh_cache-52f4c5ec-125c-4f64-86ef-2e4af50dbd4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:43:48 np0005593234 nova_compute[227762]: 2026-01-23 09:43:48.771 227766 DEBUG nova.network.neutron [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:43:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:48.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:48 np0005593234 nova_compute[227762]: 2026-01-23 09:43:48.923 227766 DEBUG nova.compute.manager [req-1d114451-56ca-4e3a-94db-30c42f0e25fc req-9d27c444-f785-4903-8c17-4ebd3fa12fae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Received event network-changed-20d99ef4-421c-4778-8024-ee47a467e6ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:43:48 np0005593234 nova_compute[227762]: 2026-01-23 09:43:48.924 227766 DEBUG nova.compute.manager [req-1d114451-56ca-4e3a-94db-30c42f0e25fc req-9d27c444-f785-4903-8c17-4ebd3fa12fae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Refreshing instance network info cache due to event network-changed-20d99ef4-421c-4778-8024-ee47a467e6ac. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:43:48 np0005593234 nova_compute[227762]: 2026-01-23 09:43:48.924 227766 DEBUG oslo_concurrency.lockutils [req-1d114451-56ca-4e3a-94db-30c42f0e25fc req-9d27c444-f785-4903-8c17-4ebd3fa12fae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-52f4c5ec-125c-4f64-86ef-2e4af50dbd4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:43:49 np0005593234 nova_compute[227762]: 2026-01-23 09:43:49.126 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e200 e200: 3 total, 3 up, 3 in
Jan 23 04:43:49 np0005593234 nova_compute[227762]: 2026-01-23 09:43:49.139 227766 DEBUG nova.network.neutron [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:43:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:49.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.107 227766 DEBUG nova.network.neutron [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Updating instance_info_cache with network_info: [{"id": "20d99ef4-421c-4778-8024-ee47a467e6ac", "address": "fa:16:3e:de:bc:f0", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20d99ef4-42", "ovs_interfaceid": "20d99ef4-421c-4778-8024-ee47a467e6ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.152 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Releasing lock "refresh_cache-52f4c5ec-125c-4f64-86ef-2e4af50dbd4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.152 227766 DEBUG nova.compute.manager [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Instance network_info: |[{"id": "20d99ef4-421c-4778-8024-ee47a467e6ac", "address": "fa:16:3e:de:bc:f0", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20d99ef4-42", "ovs_interfaceid": "20d99ef4-421c-4778-8024-ee47a467e6ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.152 227766 DEBUG oslo_concurrency.lockutils [req-1d114451-56ca-4e3a-94db-30c42f0e25fc req-9d27c444-f785-4903-8c17-4ebd3fa12fae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-52f4c5ec-125c-4f64-86ef-2e4af50dbd4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.153 227766 DEBUG nova.network.neutron [req-1d114451-56ca-4e3a-94db-30c42f0e25fc req-9d27c444-f785-4903-8c17-4ebd3fa12fae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Refreshing network info cache for port 20d99ef4-421c-4778-8024-ee47a467e6ac _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.156 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Start _get_guest_xml network_info=[{"id": "20d99ef4-421c-4778-8024-ee47a467e6ac", "address": "fa:16:3e:de:bc:f0", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20d99ef4-42", "ovs_interfaceid": "20d99ef4-421c-4778-8024-ee47a467e6ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.160 227766 WARNING nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.168 227766 DEBUG nova.virt.libvirt.host [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.169 227766 DEBUG nova.virt.libvirt.host [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.196 227766 DEBUG nova.virt.libvirt.host [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.197 227766 DEBUG nova.virt.libvirt.host [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.198 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.198 227766 DEBUG nova.virt.hardware [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.198 227766 DEBUG nova.virt.hardware [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.199 227766 DEBUG nova.virt.hardware [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.199 227766 DEBUG nova.virt.hardware [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.199 227766 DEBUG nova.virt.hardware [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.199 227766 DEBUG nova.virt.hardware [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.199 227766 DEBUG nova.virt.hardware [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.199 227766 DEBUG nova.virt.hardware [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.200 227766 DEBUG nova.virt.hardware [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.200 227766 DEBUG nova.virt.hardware [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.200 227766 DEBUG nova.virt.hardware [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.203 227766 DEBUG oslo_concurrency.processutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:43:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4147509936' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.635 227766 DEBUG oslo_concurrency.processutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.663 227766 DEBUG nova.storage.rbd_utils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:50 np0005593234 nova_compute[227762]: 2026-01-23 09:43:50.666 227766 DEBUG oslo_concurrency.processutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:50.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:43:51 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2752553455' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.116 227766 DEBUG oslo_concurrency.processutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.118 227766 DEBUG nova.virt.libvirt.vif [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:43:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1923234229',display_name='tempest-ImagesTestJSON-server-1923234229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1923234229',id=52,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-bzida3pu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-member'},tags
=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:43:44Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=52f4c5ec-125c-4f64-86ef-2e4af50dbd4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "20d99ef4-421c-4778-8024-ee47a467e6ac", "address": "fa:16:3e:de:bc:f0", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20d99ef4-42", "ovs_interfaceid": "20d99ef4-421c-4778-8024-ee47a467e6ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.119 227766 DEBUG nova.network.os_vif_util [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "20d99ef4-421c-4778-8024-ee47a467e6ac", "address": "fa:16:3e:de:bc:f0", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20d99ef4-42", "ovs_interfaceid": "20d99ef4-421c-4778-8024-ee47a467e6ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.120 227766 DEBUG nova.network.os_vif_util [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:bc:f0,bridge_name='br-int',has_traffic_filtering=True,id=20d99ef4-421c-4778-8024-ee47a467e6ac,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20d99ef4-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.121 227766 DEBUG nova.objects.instance [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'pci_devices' on Instance uuid 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:43:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:43:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:51.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.542 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  <uuid>52f4c5ec-125c-4f64-86ef-2e4af50dbd4c</uuid>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  <name>instance-00000034</name>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <nova:name>tempest-ImagesTestJSON-server-1923234229</nova:name>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:43:50</nova:creationTime>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <nova:user uuid="56da68482e3a4fb582dcccad45f8f71b">tempest-ImagesTestJSON-1507872051-project-member</nova:user>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <nova:project uuid="05bc71a77710455e8b34ead7fec81a31">tempest-ImagesTestJSON-1507872051</nova:project>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <nova:port uuid="20d99ef4-421c-4778-8024-ee47a467e6ac">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <entry name="serial">52f4c5ec-125c-4f64-86ef-2e4af50dbd4c</entry>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <entry name="uuid">52f4c5ec-125c-4f64-86ef-2e4af50dbd4c</entry>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk.config">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:de:bc:f0"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <target dev="tap20d99ef4-42"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/52f4c5ec-125c-4f64-86ef-2e4af50dbd4c/console.log" append="off"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:43:51 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:43:51 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:43:51 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:43:51 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.544 227766 DEBUG nova.compute.manager [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Preparing to wait for external event network-vif-plugged-20d99ef4-421c-4778-8024-ee47a467e6ac prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.544 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.545 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.545 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.546 227766 DEBUG nova.virt.libvirt.vif [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:43:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1923234229',display_name='tempest-ImagesTestJSON-server-1923234229',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1923234229',id=52,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-bzida3pu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:43:44Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=52f4c5ec-125c-4f64-86ef-2e4af50dbd4c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "20d99ef4-421c-4778-8024-ee47a467e6ac", "address": "fa:16:3e:de:bc:f0", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20d99ef4-42", "ovs_interfaceid": "20d99ef4-421c-4778-8024-ee47a467e6ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.546 227766 DEBUG nova.network.os_vif_util [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "20d99ef4-421c-4778-8024-ee47a467e6ac", "address": "fa:16:3e:de:bc:f0", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20d99ef4-42", "ovs_interfaceid": "20d99ef4-421c-4778-8024-ee47a467e6ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.547 227766 DEBUG nova.network.os_vif_util [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:bc:f0,bridge_name='br-int',has_traffic_filtering=True,id=20d99ef4-421c-4778-8024-ee47a467e6ac,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20d99ef4-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.548 227766 DEBUG os_vif [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:bc:f0,bridge_name='br-int',has_traffic_filtering=True,id=20d99ef4-421c-4778-8024-ee47a467e6ac,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20d99ef4-42') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.548 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.549 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.549 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.553 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.554 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20d99ef4-42, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.554 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap20d99ef4-42, col_values=(('external_ids', {'iface-id': '20d99ef4-421c-4778-8024-ee47a467e6ac', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:de:bc:f0', 'vm-uuid': '52f4c5ec-125c-4f64-86ef-2e4af50dbd4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.556 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:51 np0005593234 NetworkManager[48942]: <info>  [1769161431.5570] manager: (tap20d99ef4-42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.558 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.562 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:51 np0005593234 nova_compute[227762]: 2026-01-23 09:43:51.563 227766 INFO os_vif [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:bc:f0,bridge_name='br-int',has_traffic_filtering=True,id=20d99ef4-421c-4778-8024-ee47a467e6ac,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20d99ef4-42')#033[00m
Jan 23 04:43:52 np0005593234 nova_compute[227762]: 2026-01-23 09:43:52.862 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:43:52 np0005593234 nova_compute[227762]: 2026-01-23 09:43:52.862 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:43:52 np0005593234 nova_compute[227762]: 2026-01-23 09:43:52.862 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No VIF found with MAC fa:16:3e:de:bc:f0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:43:52 np0005593234 nova_compute[227762]: 2026-01-23 09:43:52.863 227766 INFO nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Using config drive#033[00m
Jan 23 04:43:52 np0005593234 nova_compute[227762]: 2026-01-23 09:43:52.887 227766 DEBUG nova.storage.rbd_utils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:43:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:52.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:43:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:53.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:54 np0005593234 nova_compute[227762]: 2026-01-23 09:43:54.128 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:54 np0005593234 nova_compute[227762]: 2026-01-23 09:43:54.682 227766 INFO nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Creating config drive at /var/lib/nova/instances/52f4c5ec-125c-4f64-86ef-2e4af50dbd4c/disk.config#033[00m
Jan 23 04:43:54 np0005593234 nova_compute[227762]: 2026-01-23 09:43:54.687 227766 DEBUG oslo_concurrency.processutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/52f4c5ec-125c-4f64-86ef-2e4af50dbd4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8mfzzbx8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:54 np0005593234 podman[251256]: 2026-01-23 09:43:54.756666141 +0000 UTC m=+0.052745075 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:43:54 np0005593234 nova_compute[227762]: 2026-01-23 09:43:54.815 227766 DEBUG oslo_concurrency.processutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/52f4c5ec-125c-4f64-86ef-2e4af50dbd4c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8mfzzbx8" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:54 np0005593234 nova_compute[227762]: 2026-01-23 09:43:54.845 227766 DEBUG nova.storage.rbd_utils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] rbd image 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:43:54 np0005593234 nova_compute[227762]: 2026-01-23 09:43:54.849 227766 DEBUG oslo_concurrency.processutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/52f4c5ec-125c-4f64-86ef-2e4af50dbd4c/disk.config 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:43:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:43:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:54.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:55 np0005593234 nova_compute[227762]: 2026-01-23 09:43:55.018 227766 DEBUG oslo_concurrency.processutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/52f4c5ec-125c-4f64-86ef-2e4af50dbd4c/disk.config 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.169s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:43:55 np0005593234 nova_compute[227762]: 2026-01-23 09:43:55.019 227766 INFO nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Deleting local config drive /var/lib/nova/instances/52f4c5ec-125c-4f64-86ef-2e4af50dbd4c/disk.config because it was imported into RBD.#033[00m
Jan 23 04:43:55 np0005593234 kernel: tap20d99ef4-42: entered promiscuous mode
Jan 23 04:43:55 np0005593234 NetworkManager[48942]: <info>  [1769161435.0669] manager: (tap20d99ef4-42): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Jan 23 04:43:55 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:55Z|00141|binding|INFO|Claiming lport 20d99ef4-421c-4778-8024-ee47a467e6ac for this chassis.
Jan 23 04:43:55 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:55Z|00142|binding|INFO|20d99ef4-421c-4778-8024-ee47a467e6ac: Claiming fa:16:3e:de:bc:f0 10.100.0.3
Jan 23 04:43:55 np0005593234 nova_compute[227762]: 2026-01-23 09:43:55.068 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:55 np0005593234 nova_compute[227762]: 2026-01-23 09:43:55.071 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.093 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:bc:f0 10.100.0.3'], port_security=['fa:16:3e:de:bc:f0 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '52f4c5ec-125c-4f64-86ef-2e4af50dbd4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2696fd4-5fd7-4934-88ac-40162fad555d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05bc71a77710455e8b34ead7fec81a31', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ab8b868e-d8b1-4e1d-87d5-538f88b95e73', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3459fea4-e2ba-482e-8d51-91ef5b74d71a, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=20d99ef4-421c-4778-8024-ee47a467e6ac) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.094 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 20d99ef4-421c-4778-8024-ee47a467e6ac in datapath c2696fd4-5fd7-4934-88ac-40162fad555d bound to our chassis#033[00m
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.096 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c2696fd4-5fd7-4934-88ac-40162fad555d#033[00m
Jan 23 04:43:55 np0005593234 systemd-udevd[251379]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:43:55 np0005593234 systemd-machined[195626]: New machine qemu-23-instance-00000034.
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.110 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3778a57c-cbd9-4433-bf0b-535dcb9e2617]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:55 np0005593234 NetworkManager[48942]: <info>  [1769161435.1128] device (tap20d99ef4-42): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.111 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc2696fd4-51 in ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:43:55 np0005593234 NetworkManager[48942]: <info>  [1769161435.1134] device (tap20d99ef4-42): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.113 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc2696fd4-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.113 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[02006a9e-20af-40d6-becf-a50d26c117da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.115 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[440d507d-c24c-4cc4-b8ec-7930c57fa153]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:43:55 np0005593234 systemd[1]: Started Virtual Machine qemu-23-instance-00000034.
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.126 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[654d5717-f218-465f-a148-13ecace82afa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:43:55 np0005593234 nova_compute[227762]: 2026-01-23 09:43:55.136 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:43:55 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:55Z|00143|binding|INFO|Setting lport 20d99ef4-421c-4778-8024-ee47a467e6ac ovn-installed in OVS
Jan 23 04:43:55 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:55Z|00144|binding|INFO|Setting lport 20d99ef4-421c-4778-8024-ee47a467e6ac up in Southbound
Jan 23 04:43:55 np0005593234 nova_compute[227762]: 2026-01-23 09:43:55.143 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.152 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9f3ce9c7-1191-4606-bb63-8acdbc52a491]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:43:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:55.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.181 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[1baff416-5a3e-4567-a1a2-77185c9fb82f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.187 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6bcf6975-0904-4860-9831-a3da624bc700]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:43:55 np0005593234 NetworkManager[48942]: <info>  [1769161435.1878] manager: (tapc2696fd4-50): new Veth device (/org/freedesktop/NetworkManager/Devices/82)
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.215 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a89b6920-5c8c-4e88-9ed8-da384bb85538]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.217 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[112b09d1-e274-45e0-971b-144d0f471e84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:43:55 np0005593234 NetworkManager[48942]: <info>  [1769161435.2359] device (tapc2696fd4-50): carrier: link connected
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.243 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4e3aff9d-8c5a-4d97-a12f-53316501999c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.258 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d0eb311c-5098-41cb-aabe-ebd2700a6b14]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc2696fd4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:02:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533019, 'reachable_time': 38269, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251413, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.274 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f238d397-46dc-45c5-a88a-857b8b67a2a3]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:20d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 533019, 'tstamp': 533019}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 251414, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.290 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a0ada53c-512c-452f-877b-fc86449b279f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc2696fd4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:02:0d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 47], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533019, 'reachable_time': 38269, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 251415, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.314 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[75d056ed-e32a-4c4d-ac4c-beb608642962]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.371 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[afb71a8a-990a-4315-837e-6d3c603f0897]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.372 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2696fd4-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.372 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.373 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc2696fd4-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:43:55 np0005593234 NetworkManager[48942]: <info>  [1769161435.3756] manager: (tapc2696fd4-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Jan 23 04:43:55 np0005593234 kernel: tapc2696fd4-50: entered promiscuous mode
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.377 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc2696fd4-50, col_values=(('external_ids', {'iface-id': '38b24332-af6b-47d2-95fe-400f5feeadcb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:43:55 np0005593234 ovn_controller[134547]: 2026-01-23T09:43:55Z|00145|binding|INFO|Releasing lport 38b24332-af6b-47d2-95fe-400f5feeadcb from this chassis (sb_readonly=0)
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.393 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.394 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[23698627-d846-4be0-a8db-a61721cbdbde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:43:55 np0005593234 nova_compute[227762]: 2026-01-23 09:43:55.392 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.395 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-c2696fd4-5fd7-4934-88ac-40162fad555d
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/c2696fd4-5fd7-4934-88ac-40162fad555d.pid.haproxy
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID c2696fd4-5fd7-4934-88ac-40162fad555d
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 04:43:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:43:55.395 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'env', 'PROCESS_TAG=haproxy-c2696fd4-5fd7-4934-88ac-40162fad555d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c2696fd4-5fd7-4934-88ac-40162fad555d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 04:43:55 np0005593234 podman[251447]: 2026-01-23 09:43:55.746673448 +0000 UTC m=+0.048785032 container create eb1da8796eab8b6812acb893d407798b8a47be455b0cd7027cd35b94af15ba01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:43:55 np0005593234 systemd[1]: Started libpod-conmon-eb1da8796eab8b6812acb893d407798b8a47be455b0cd7027cd35b94af15ba01.scope.
Jan 23 04:43:55 np0005593234 podman[251447]: 2026-01-23 09:43:55.71981738 +0000 UTC m=+0.021928984 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:43:55 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:43:55 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/76e504772396622afdc82b8d8c0237674ec7defac981f914d96f4b7e16244c84/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:43:55 np0005593234 podman[251447]: 2026-01-23 09:43:55.835678973 +0000 UTC m=+0.137790587 container init eb1da8796eab8b6812acb893d407798b8a47be455b0cd7027cd35b94af15ba01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 04:43:55 np0005593234 podman[251447]: 2026-01-23 09:43:55.840524854 +0000 UTC m=+0.142636438 container start eb1da8796eab8b6812acb893d407798b8a47be455b0cd7027cd35b94af15ba01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 04:43:55 np0005593234 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[251464]: [NOTICE]   (251486) : New worker (251488) forked
Jan 23 04:43:55 np0005593234 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[251464]: [NOTICE]   (251486) : Loading success.
Jan 23 04:43:55 np0005593234 nova_compute[227762]: 2026-01-23 09:43:55.914 227766 DEBUG nova.network.neutron [req-1d114451-56ca-4e3a-94db-30c42f0e25fc req-9d27c444-f785-4903-8c17-4ebd3fa12fae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Updated VIF entry in instance network info cache for port 20d99ef4-421c-4778-8024-ee47a467e6ac. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 04:43:55 np0005593234 nova_compute[227762]: 2026-01-23 09:43:55.915 227766 DEBUG nova.network.neutron [req-1d114451-56ca-4e3a-94db-30c42f0e25fc req-9d27c444-f785-4903-8c17-4ebd3fa12fae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Updating instance_info_cache with network_info: [{"id": "20d99ef4-421c-4778-8024-ee47a467e6ac", "address": "fa:16:3e:de:bc:f0", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20d99ef4-42", "ovs_interfaceid": "20d99ef4-421c-4778-8024-ee47a467e6ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:43:55 np0005593234 nova_compute[227762]: 2026-01-23 09:43:55.953 227766 DEBUG oslo_concurrency.lockutils [req-1d114451-56ca-4e3a-94db-30c42f0e25fc req-9d27c444-f785-4903-8c17-4ebd3fa12fae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-52f4c5ec-125c-4f64-86ef-2e4af50dbd4c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:43:56 np0005593234 nova_compute[227762]: 2026-01-23 09:43:56.053 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161436.053377, 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:43:56 np0005593234 nova_compute[227762]: 2026-01-23 09:43:56.054 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] VM Started (Lifecycle Event)
Jan 23 04:43:56 np0005593234 nova_compute[227762]: 2026-01-23 09:43:56.307 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:43:56 np0005593234 nova_compute[227762]: 2026-01-23 09:43:56.313 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161436.0562618, 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:43:56 np0005593234 nova_compute[227762]: 2026-01-23 09:43:56.313 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] VM Paused (Lifecycle Event)
Jan 23 04:43:56 np0005593234 nova_compute[227762]: 2026-01-23 09:43:56.557 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:43:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:56.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:57.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.250 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.255 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.263 227766 DEBUG nova.compute.manager [req-8227e578-8022-40cc-b655-3055e8128b37 req-3ad013ff-eda9-4976-ab64-4703b717caec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Received event network-vif-plugged-20d99ef4-421c-4778-8024-ee47a467e6ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.264 227766 DEBUG oslo_concurrency.lockutils [req-8227e578-8022-40cc-b655-3055e8128b37 req-3ad013ff-eda9-4976-ab64-4703b717caec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.264 227766 DEBUG oslo_concurrency.lockutils [req-8227e578-8022-40cc-b655-3055e8128b37 req-3ad013ff-eda9-4976-ab64-4703b717caec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.264 227766 DEBUG oslo_concurrency.lockutils [req-8227e578-8022-40cc-b655-3055e8128b37 req-3ad013ff-eda9-4976-ab64-4703b717caec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.265 227766 DEBUG nova.compute.manager [req-8227e578-8022-40cc-b655-3055e8128b37 req-3ad013ff-eda9-4976-ab64-4703b717caec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Processing event network-vif-plugged-20d99ef4-421c-4778-8024-ee47a467e6ac _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.265 227766 DEBUG nova.compute.manager [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.269 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.272 227766 INFO nova.virt.libvirt.driver [-] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Instance spawned successfully.
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.272 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.338 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.339 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161438.2688863, 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.339 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] VM Resumed (Lifecycle Event)
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.440 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.445 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.450 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.450 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.451 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.451 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.452 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.453 227766 DEBUG nova.virt.libvirt.driver [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.490 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.528 227766 INFO nova.compute.manager [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Took 14.04 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.529 227766 DEBUG nova.compute.manager [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.633 227766 INFO nova.compute.manager [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Took 14.99 seconds to build instance.#033[00m
Jan 23 04:43:58 np0005593234 nova_compute[227762]: 2026-01-23 09:43:58.665 227766 DEBUG oslo_concurrency.lockutils [None req-6c1e09e8-50b1-435a-b2b0-a8984097ff1e 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.149s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:43:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:43:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:43:58.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:43:59 np0005593234 nova_compute[227762]: 2026-01-23 09:43:59.130 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:43:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:43:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:43:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:43:59.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:43:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:00 np0005593234 nova_compute[227762]: 2026-01-23 09:44:00.590 227766 DEBUG nova.compute.manager [req-7e82f419-04ce-413b-91d2-5a72244ee47a req-85a2333e-f46c-4f80-b84c-78b3a2f96003 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Received event network-vif-plugged-20d99ef4-421c-4778-8024-ee47a467e6ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:44:00 np0005593234 nova_compute[227762]: 2026-01-23 09:44:00.590 227766 DEBUG oslo_concurrency.lockutils [req-7e82f419-04ce-413b-91d2-5a72244ee47a req-85a2333e-f46c-4f80-b84c-78b3a2f96003 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:44:00 np0005593234 nova_compute[227762]: 2026-01-23 09:44:00.590 227766 DEBUG oslo_concurrency.lockutils [req-7e82f419-04ce-413b-91d2-5a72244ee47a req-85a2333e-f46c-4f80-b84c-78b3a2f96003 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:44:00 np0005593234 nova_compute[227762]: 2026-01-23 09:44:00.590 227766 DEBUG oslo_concurrency.lockutils [req-7e82f419-04ce-413b-91d2-5a72244ee47a req-85a2333e-f46c-4f80-b84c-78b3a2f96003 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:44:00 np0005593234 nova_compute[227762]: 2026-01-23 09:44:00.591 227766 DEBUG nova.compute.manager [req-7e82f419-04ce-413b-91d2-5a72244ee47a req-85a2333e-f46c-4f80-b84c-78b3a2f96003 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] No waiting events found dispatching network-vif-plugged-20d99ef4-421c-4778-8024-ee47a467e6ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:44:00 np0005593234 nova_compute[227762]: 2026-01-23 09:44:00.591 227766 WARNING nova.compute.manager [req-7e82f419-04ce-413b-91d2-5a72244ee47a req-85a2333e-f46c-4f80-b84c-78b3a2f96003 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Received unexpected event network-vif-plugged-20d99ef4-421c-4778-8024-ee47a467e6ac for instance with vm_state active and task_state None.
Jan 23 04:44:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:00.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:01.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:01 np0005593234 nova_compute[227762]: 2026-01-23 09:44:01.559 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:44:02 np0005593234 nova_compute[227762]: 2026-01-23 09:44:02.191 227766 DEBUG nova.compute.manager [None req-30691854-ed67-4642-b08e-9c209e0e7863 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:44:02 np0005593234 nova_compute[227762]: 2026-01-23 09:44:02.257 227766 INFO nova.compute.manager [None req-30691854-ed67-4642-b08e-9c209e0e7863 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] instance snapshotting
Jan 23 04:44:02 np0005593234 nova_compute[227762]: 2026-01-23 09:44:02.599 227766 INFO nova.virt.libvirt.driver [None req-30691854-ed67-4642-b08e-9c209e0e7863 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Beginning live snapshot process
Jan 23 04:44:02 np0005593234 nova_compute[227762]: 2026-01-23 09:44:02.767 227766 DEBUG nova.virt.libvirt.imagebackend [None req-30691854-ed67-4642-b08e-9c209e0e7863 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163
Jan 23 04:44:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:02.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:03 np0005593234 nova_compute[227762]: 2026-01-23 09:44:03.042 227766 DEBUG nova.storage.rbd_utils [None req-30691854-ed67-4642-b08e-9c209e0e7863 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] creating snapshot(aaec4a34bf024c6f95fccd4a794571f0) on rbd image(52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 23 04:44:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:44:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:03.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:44:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e201 e201: 3 total, 3 up, 3 in
Jan 23 04:44:03 np0005593234 nova_compute[227762]: 2026-01-23 09:44:03.503 227766 DEBUG nova.storage.rbd_utils [None req-30691854-ed67-4642-b08e-9c209e0e7863 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] cloning vms/52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk@aaec4a34bf024c6f95fccd4a794571f0 to images/4a8fa58a-d2a5-4de0-8047-4939888c3f46 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Jan 23 04:44:03 np0005593234 nova_compute[227762]: 2026-01-23 09:44:03.631 227766 DEBUG nova.storage.rbd_utils [None req-30691854-ed67-4642-b08e-9c209e0e7863 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] flattening images/4a8fa58a-d2a5-4de0-8047-4939888c3f46 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Jan 23 04:44:04 np0005593234 nova_compute[227762]: 2026-01-23 09:44:04.086 227766 DEBUG nova.storage.rbd_utils [None req-30691854-ed67-4642-b08e-9c209e0e7863 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] removing snapshot(aaec4a34bf024c6f95fccd4a794571f0) on rbd image(52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Jan 23 04:44:04 np0005593234 nova_compute[227762]: 2026-01-23 09:44:04.132 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:44:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e202 e202: 3 total, 3 up, 3 in
Jan 23 04:44:04 np0005593234 nova_compute[227762]: 2026-01-23 09:44:04.605 227766 DEBUG nova.storage.rbd_utils [None req-30691854-ed67-4642-b08e-9c209e0e7863 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] creating snapshot(snap) on rbd image(4a8fa58a-d2a5-4de0-8047-4939888c3f46) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 23 04:44:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e202 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:04.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:05.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e203 e203: 3 total, 3 up, 3 in
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver [None req-30691854-ed67-4642-b08e-9c209e0e7863 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Failed to snapshot image: nova.exception.ImageNotFound: Image 4a8fa58a-d2a5-4de0-8047-4939888c3f46 could not be found.
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 691, in update
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     image = self._update_v2(context, sent_service_image_meta, data)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 700, in _update_v2
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     image = self._client.call(
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 191, in call
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     result = getattr(controller, method)(*args, **kwargs)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 440, in update
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     unvalidated_image = self.get(image_id)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 197, in get
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     return self._get(image_id)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/utils.py", line 649, in inner
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     return RequestIdProxy(wrapped(*args, **kwargs))
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 190, in _get
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     resp, body = self.http_client.get(url, headers=header)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py", line 395, in get
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     return self.request(url, 'GET', **kwargs)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 380, in request
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     return self._handle_response(resp)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 120, in _handle_response
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     raise exc.from_response(resp, resp.content)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver glanceclient.exc.HTTPNotFound: HTTP 404 Not Found: No image found with ID 4a8fa58a-d2a5-4de0-8047-4939888c3f46
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver 
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver During handling of the above exception, another exception occurred:
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver 
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver Traceback (most recent call last):
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py", line 3082, in snapshot
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     self._image_api.update(context, image_id, metadata,
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 1243, in update
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     return session.update(context, image_id, image_info, data=data,
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 693, in update
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     _reraise_translated_image_exception(image_id)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 1031, in _reraise_translated_image_exception
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     raise new_exc.with_traceback(exc_trace)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 691, in update
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     image = self._update_v2(context, sent_service_image_meta, data)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 700, in _update_v2
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     image = self._client.call(
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/nova/image/glance.py", line 191, in call
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     result = getattr(controller, method)(*args, **kwargs)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 440, in update
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     unvalidated_image = self.get(image_id)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 197, in get
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     return self._get(image_id)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/utils.py", line 649, in inner
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     return RequestIdProxy(wrapped(*args, **kwargs))
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/v2/images.py", line 190, in _get
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     resp, body = self.http_client.get(url, headers=header)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/keystoneauth1/adapter.py", line 395, in get
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     return self.request(url, 'GET', **kwargs)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 380, in request
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     return self._handle_response(resp)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver   File "/usr/lib/python3.9/site-packages/glanceclient/common/http.py", line 120, in _handle_response
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver     raise exc.from_response(resp, resp.content)
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver nova.exception.ImageNotFound: Image 4a8fa58a-d2a5-4de0-8047-4939888c3f46 could not be found.
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.750 227766 ERROR nova.virt.libvirt.driver 
Jan 23 04:44:05 np0005593234 nova_compute[227762]: 2026-01-23 09:44:05.800 227766 DEBUG nova.storage.rbd_utils [None req-30691854-ed67-4642-b08e-9c209e0e7863 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] removing snapshot(snap) on rbd image(4a8fa58a-d2a5-4de0-8047-4939888c3f46) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Jan 23 04:44:06 np0005593234 nova_compute[227762]: 2026-01-23 09:44:06.598 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:44:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e204 e204: 3 total, 3 up, 3 in
Jan 23 04:44:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:44:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:06.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:44:06 np0005593234 nova_compute[227762]: 2026-01-23 09:44:06.928 227766 WARNING nova.compute.manager [None req-30691854-ed67-4642-b08e-9c209e0e7863 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Image not found during snapshot: nova.exception.ImageNotFound: Image 4a8fa58a-d2a5-4de0-8047-4939888c3f46 could not be found.
Jan 23 04:44:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:44:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:07.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.365 227766 DEBUG oslo_concurrency.lockutils [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.365 227766 DEBUG oslo_concurrency.lockutils [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.366 227766 DEBUG oslo_concurrency.lockutils [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.366 227766 DEBUG oslo_concurrency.lockutils [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.367 227766 DEBUG oslo_concurrency.lockutils [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.369 227766 INFO nova.compute.manager [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Terminating instance
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.370 227766 DEBUG nova.compute.manager [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 04:44:08 np0005593234 kernel: tap20d99ef4-42 (unregistering): left promiscuous mode
Jan 23 04:44:08 np0005593234 NetworkManager[48942]: <info>  [1769161448.4153] device (tap20d99ef4-42): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.424 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:44:08Z|00146|binding|INFO|Releasing lport 20d99ef4-421c-4778-8024-ee47a467e6ac from this chassis (sb_readonly=0)
Jan 23 04:44:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:44:08Z|00147|binding|INFO|Setting lport 20d99ef4-421c-4778-8024-ee47a467e6ac down in Southbound
Jan 23 04:44:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:44:08Z|00148|binding|INFO|Removing iface tap20d99ef4-42 ovn-installed in OVS
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.426 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:08.437 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:de:bc:f0 10.100.0.3'], port_security=['fa:16:3e:de:bc:f0 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '52f4c5ec-125c-4f64-86ef-2e4af50dbd4c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c2696fd4-5fd7-4934-88ac-40162fad555d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '05bc71a77710455e8b34ead7fec81a31', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ab8b868e-d8b1-4e1d-87d5-538f88b95e73', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3459fea4-e2ba-482e-8d51-91ef5b74d71a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=20d99ef4-421c-4778-8024-ee47a467e6ac) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:08.439 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 20d99ef4-421c-4778-8024-ee47a467e6ac in datapath c2696fd4-5fd7-4934-88ac-40162fad555d unbound from our chassis#033[00m
Jan 23 04:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:08.441 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c2696fd4-5fd7-4934-88ac-40162fad555d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:08.443 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2d96b803-bcb4-4d3a-8efe-61cefe2a56f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:08.444 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d namespace which is not needed anymore#033[00m
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.446 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:08 np0005593234 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000034.scope: Deactivated successfully.
Jan 23 04:44:08 np0005593234 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d00000034.scope: Consumed 11.216s CPU time.
Jan 23 04:44:08 np0005593234 systemd-machined[195626]: Machine qemu-23-instance-00000034 terminated.
Jan 23 04:44:08 np0005593234 podman[251706]: 2026-01-23 09:44:08.567759876 +0000 UTC m=+0.122285024 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 04:44:08 np0005593234 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[251464]: [NOTICE]   (251486) : haproxy version is 2.8.14-c23fe91
Jan 23 04:44:08 np0005593234 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[251464]: [NOTICE]   (251486) : path to executable is /usr/sbin/haproxy
Jan 23 04:44:08 np0005593234 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[251464]: [WARNING]  (251486) : Exiting Master process...
Jan 23 04:44:08 np0005593234 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[251464]: [ALERT]    (251486) : Current worker (251488) exited with code 143 (Terminated)
Jan 23 04:44:08 np0005593234 neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d[251464]: [WARNING]  (251486) : All workers exited. Exiting... (0)
Jan 23 04:44:08 np0005593234 systemd[1]: libpod-eb1da8796eab8b6812acb893d407798b8a47be455b0cd7027cd35b94af15ba01.scope: Deactivated successfully.
Jan 23 04:44:08 np0005593234 podman[251747]: 2026-01-23 09:44:08.581921447 +0000 UTC m=+0.043397934 container died eb1da8796eab8b6812acb893d407798b8a47be455b0cd7027cd35b94af15ba01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.591 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.596 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.605 227766 INFO nova.virt.libvirt.driver [-] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Instance destroyed successfully.#033[00m
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.606 227766 DEBUG nova.objects.instance [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lazy-loading 'resources' on Instance uuid 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:44:08 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eb1da8796eab8b6812acb893d407798b8a47be455b0cd7027cd35b94af15ba01-userdata-shm.mount: Deactivated successfully.
Jan 23 04:44:08 np0005593234 systemd[1]: var-lib-containers-storage-overlay-76e504772396622afdc82b8d8c0237674ec7defac981f914d96f4b7e16244c84-merged.mount: Deactivated successfully.
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.626 227766 DEBUG nova.virt.libvirt.vif [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:43:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesTestJSON-server-1923234229',display_name='tempest-ImagesTestJSON-server-1923234229',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagestestjson-server-1923234229',id=52,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:43:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='05bc71a77710455e8b34ead7fec81a31',ramdisk_id='',reservation_id='r-bzida3pu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesTestJSON-1507872051',owner_user_name='tempest-ImagesTestJSON-1507872051-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:44:06Z,user_data=None,user_id='56da68482e3a4fb582dcccad45f8f71b',uuid=52f4c5ec-125c-4f64-86ef-2e4af50dbd4c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "20d99ef4-421c-4778-8024-ee47a467e6ac", "address": "fa:16:3e:de:bc:f0", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20d99ef4-42", "ovs_interfaceid": "20d99ef4-421c-4778-8024-ee47a467e6ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.627 227766 DEBUG nova.network.os_vif_util [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converting VIF {"id": "20d99ef4-421c-4778-8024-ee47a467e6ac", "address": "fa:16:3e:de:bc:f0", "network": {"id": "c2696fd4-5fd7-4934-88ac-40162fad555d", "bridge": "br-int", "label": "tempest-ImagesTestJSON-113670604-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "05bc71a77710455e8b34ead7fec81a31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20d99ef4-42", "ovs_interfaceid": "20d99ef4-421c-4778-8024-ee47a467e6ac", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.628 227766 DEBUG nova.network.os_vif_util [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:de:bc:f0,bridge_name='br-int',has_traffic_filtering=True,id=20d99ef4-421c-4778-8024-ee47a467e6ac,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20d99ef4-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.629 227766 DEBUG os_vif [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:bc:f0,bridge_name='br-int',has_traffic_filtering=True,id=20d99ef4-421c-4778-8024-ee47a467e6ac,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20d99ef4-42') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:44:08 np0005593234 podman[251747]: 2026-01-23 09:44:08.63142651 +0000 UTC m=+0.092902997 container cleanup eb1da8796eab8b6812acb893d407798b8a47be455b0cd7027cd35b94af15ba01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.631 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.631 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20d99ef4-42, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.672 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.674 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:08 np0005593234 systemd[1]: libpod-conmon-eb1da8796eab8b6812acb893d407798b8a47be455b0cd7027cd35b94af15ba01.scope: Deactivated successfully.
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.678 227766 INFO os_vif [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:de:bc:f0,bridge_name='br-int',has_traffic_filtering=True,id=20d99ef4-421c-4778-8024-ee47a467e6ac,network=Network(c2696fd4-5fd7-4934-88ac-40162fad555d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20d99ef4-42')#033[00m
Jan 23 04:44:08 np0005593234 podman[251787]: 2026-01-23 09:44:08.735489005 +0000 UTC m=+0.041243727 container remove eb1da8796eab8b6812acb893d407798b8a47be455b0cd7027cd35b94af15ba01 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:08.741 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[30ba6f05-2e55-4eb0-b674-12c8fc03f7ef]: (4, ('Fri Jan 23 09:44:08 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d (eb1da8796eab8b6812acb893d407798b8a47be455b0cd7027cd35b94af15ba01)\neb1da8796eab8b6812acb893d407798b8a47be455b0cd7027cd35b94af15ba01\nFri Jan 23 09:44:08 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d (eb1da8796eab8b6812acb893d407798b8a47be455b0cd7027cd35b94af15ba01)\neb1da8796eab8b6812acb893d407798b8a47be455b0cd7027cd35b94af15ba01\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:08.743 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[218bb7e7-3216-4d43-b2bf-5dc2bbb7a4c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:08.744 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2696fd4-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.746 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:08 np0005593234 kernel: tapc2696fd4-50: left promiscuous mode
Jan 23 04:44:08 np0005593234 nova_compute[227762]: 2026-01-23 09:44:08.760 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:08.762 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aaabc103-1d5a-49a1-9701-fc769bef4df9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:08.786 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8a150fd3-133d-40e4-9b5b-c3993917f453]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:08.787 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5f480c4b-d7e8-438c-be2c-e177dd999b1a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:08.804 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6e2af38d-09aa-4ced-8fee-25c445e5ef3e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 533013, 'reachable_time': 30826, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 251820, 'error': None, 'target': 'ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:08 np0005593234 systemd[1]: run-netns-ovnmeta\x2dc2696fd4\x2d5fd7\x2d4934\x2d88ac\x2d40162fad555d.mount: Deactivated successfully.
Jan 23 04:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:08.811 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c2696fd4-5fd7-4934-88ac-40162fad555d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:08.812 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[13f88395-f56d-4c9e-a8cb-e3d87ca33844]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:08.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:09 np0005593234 nova_compute[227762]: 2026-01-23 09:44:09.134 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:44:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:09.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:44:09 np0005593234 nova_compute[227762]: 2026-01-23 09:44:09.215 227766 INFO nova.virt.libvirt.driver [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Deleting instance files /var/lib/nova/instances/52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_del#033[00m
Jan 23 04:44:09 np0005593234 nova_compute[227762]: 2026-01-23 09:44:09.216 227766 INFO nova.virt.libvirt.driver [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Deletion of /var/lib/nova/instances/52f4c5ec-125c-4f64-86ef-2e4af50dbd4c_del complete#033[00m
Jan 23 04:44:09 np0005593234 nova_compute[227762]: 2026-01-23 09:44:09.279 227766 INFO nova.compute.manager [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Took 0.91 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:44:09 np0005593234 nova_compute[227762]: 2026-01-23 09:44:09.279 227766 DEBUG oslo.service.loopingcall [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:44:09 np0005593234 nova_compute[227762]: 2026-01-23 09:44:09.280 227766 DEBUG nova.compute.manager [-] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:44:09 np0005593234 nova_compute[227762]: 2026-01-23 09:44:09.280 227766 DEBUG nova.network.neutron [-] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:44:09 np0005593234 nova_compute[227762]: 2026-01-23 09:44:09.486 227766 DEBUG nova.compute.manager [req-9e3bc15a-926e-4883-9a69-56a267125d2c req-f0dd64c0-2635-4ccb-b0b3-e3418bbbc714 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Received event network-vif-unplugged-20d99ef4-421c-4778-8024-ee47a467e6ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:44:09 np0005593234 nova_compute[227762]: 2026-01-23 09:44:09.487 227766 DEBUG oslo_concurrency.lockutils [req-9e3bc15a-926e-4883-9a69-56a267125d2c req-f0dd64c0-2635-4ccb-b0b3-e3418bbbc714 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:09 np0005593234 nova_compute[227762]: 2026-01-23 09:44:09.488 227766 DEBUG oslo_concurrency.lockutils [req-9e3bc15a-926e-4883-9a69-56a267125d2c req-f0dd64c0-2635-4ccb-b0b3-e3418bbbc714 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:09 np0005593234 nova_compute[227762]: 2026-01-23 09:44:09.488 227766 DEBUG oslo_concurrency.lockutils [req-9e3bc15a-926e-4883-9a69-56a267125d2c req-f0dd64c0-2635-4ccb-b0b3-e3418bbbc714 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:09 np0005593234 nova_compute[227762]: 2026-01-23 09:44:09.488 227766 DEBUG nova.compute.manager [req-9e3bc15a-926e-4883-9a69-56a267125d2c req-f0dd64c0-2635-4ccb-b0b3-e3418bbbc714 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] No waiting events found dispatching network-vif-unplugged-20d99ef4-421c-4778-8024-ee47a467e6ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:44:09 np0005593234 nova_compute[227762]: 2026-01-23 09:44:09.488 227766 DEBUG nova.compute.manager [req-9e3bc15a-926e-4883-9a69-56a267125d2c req-f0dd64c0-2635-4ccb-b0b3-e3418bbbc714 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Received event network-vif-unplugged-20d99ef4-421c-4778-8024-ee47a467e6ac for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:44:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e204 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:10 np0005593234 nova_compute[227762]: 2026-01-23 09:44:10.138 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:44:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:10.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.172 227766 DEBUG nova.network.neutron [-] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:44:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:11.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.196 227766 INFO nova.compute.manager [-] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Took 1.92 seconds to deallocate network for instance.#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.244 227766 DEBUG oslo_concurrency.lockutils [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.244 227766 DEBUG oslo_concurrency.lockutils [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.318 227766 DEBUG oslo_concurrency.processutils [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.679 227766 DEBUG nova.compute.manager [req-28e6f603-47cc-425b-901c-f48ab6cd1cdd req-942a5041-e993-459c-a97c-e20bdfad0792 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Received event network-vif-plugged-20d99ef4-421c-4778-8024-ee47a467e6ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.680 227766 DEBUG oslo_concurrency.lockutils [req-28e6f603-47cc-425b-901c-f48ab6cd1cdd req-942a5041-e993-459c-a97c-e20bdfad0792 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.680 227766 DEBUG oslo_concurrency.lockutils [req-28e6f603-47cc-425b-901c-f48ab6cd1cdd req-942a5041-e993-459c-a97c-e20bdfad0792 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.680 227766 DEBUG oslo_concurrency.lockutils [req-28e6f603-47cc-425b-901c-f48ab6cd1cdd req-942a5041-e993-459c-a97c-e20bdfad0792 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.680 227766 DEBUG nova.compute.manager [req-28e6f603-47cc-425b-901c-f48ab6cd1cdd req-942a5041-e993-459c-a97c-e20bdfad0792 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] No waiting events found dispatching network-vif-plugged-20d99ef4-421c-4778-8024-ee47a467e6ac pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.681 227766 WARNING nova.compute.manager [req-28e6f603-47cc-425b-901c-f48ab6cd1cdd req-942a5041-e993-459c-a97c-e20bdfad0792 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Received unexpected event network-vif-plugged-20d99ef4-421c-4778-8024-ee47a467e6ac for instance with vm_state deleted and task_state None.#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.681 227766 DEBUG nova.compute.manager [req-28e6f603-47cc-425b-901c-f48ab6cd1cdd req-942a5041-e993-459c-a97c-e20bdfad0792 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Received event network-vif-deleted-20d99ef4-421c-4778-8024-ee47a467e6ac external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:44:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:44:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1575505418' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.760 227766 DEBUG oslo_concurrency.processutils [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.767 227766 DEBUG nova.compute.provider_tree [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.799 227766 DEBUG nova.scheduler.client.report [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.825 227766 DEBUG oslo_concurrency.lockutils [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.868 227766 INFO nova.scheduler.client.report [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Deleted allocations for instance 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c#033[00m
Jan 23 04:44:11 np0005593234 nova_compute[227762]: 2026-01-23 09:44:11.942 227766 DEBUG oslo_concurrency.lockutils [None req-227cb6e9-9a0e-48f9-b4bd-c15fa4135018 56da68482e3a4fb582dcccad45f8f71b 05bc71a77710455e8b34ead7fec81a31 - - default default] Lock "52f4c5ec-125c-4f64-86ef-2e4af50dbd4c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:12.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:13.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:13 np0005593234 nova_compute[227762]: 2026-01-23 09:44:13.674 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:14 np0005593234 nova_compute[227762]: 2026-01-23 09:44:14.136 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e205 e205: 3 total, 3 up, 3 in
Jan 23 04:44:14 np0005593234 nova_compute[227762]: 2026-01-23 09:44:14.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:14 np0005593234 nova_compute[227762]: 2026-01-23 09:44:14.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:44:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:14.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:44:14 np0005593234 nova_compute[227762]: 2026-01-23 09:44:14.961 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:14 np0005593234 nova_compute[227762]: 2026-01-23 09:44:14.961 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:14 np0005593234 nova_compute[227762]: 2026-01-23 09:44:14.961 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:14 np0005593234 nova_compute[227762]: 2026-01-23 09:44:14.962 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:44:14 np0005593234 nova_compute[227762]: 2026-01-23 09:44:14.963 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:15.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:44:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2076529521' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:44:15 np0005593234 nova_compute[227762]: 2026-01-23 09:44:15.410 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:15 np0005593234 nova_compute[227762]: 2026-01-23 09:44:15.569 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:44:15 np0005593234 nova_compute[227762]: 2026-01-23 09:44:15.571 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4680MB free_disk=20.942752838134766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:44:15 np0005593234 nova_compute[227762]: 2026-01-23 09:44:15.571 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:15 np0005593234 nova_compute[227762]: 2026-01-23 09:44:15.571 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:15 np0005593234 nova_compute[227762]: 2026-01-23 09:44:15.771 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:44:15 np0005593234 nova_compute[227762]: 2026-01-23 09:44:15.772 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:44:15 np0005593234 nova_compute[227762]: 2026-01-23 09:44:15.791 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:44:15 np0005593234 nova_compute[227762]: 2026-01-23 09:44:15.811 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:44:15 np0005593234 nova_compute[227762]: 2026-01-23 09:44:15.812 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:44:15 np0005593234 nova_compute[227762]: 2026-01-23 09:44:15.831 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:44:15 np0005593234 nova_compute[227762]: 2026-01-23 09:44:15.858 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:44:15 np0005593234 nova_compute[227762]: 2026-01-23 09:44:15.877 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:44:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/951437207' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:44:16 np0005593234 nova_compute[227762]: 2026-01-23 09:44:16.331 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:16 np0005593234 nova_compute[227762]: 2026-01-23 09:44:16.339 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:44:16 np0005593234 nova_compute[227762]: 2026-01-23 09:44:16.363 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:44:16 np0005593234 nova_compute[227762]: 2026-01-23 09:44:16.389 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:44:16 np0005593234 nova_compute[227762]: 2026-01-23 09:44:16.389 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:44:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:16.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:44:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:17.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:18 np0005593234 nova_compute[227762]: 2026-01-23 09:44:18.391 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:18 np0005593234 nova_compute[227762]: 2026-01-23 09:44:18.391 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:18 np0005593234 nova_compute[227762]: 2026-01-23 09:44:18.391 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:44:18 np0005593234 nova_compute[227762]: 2026-01-23 09:44:18.677 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:18 np0005593234 nova_compute[227762]: 2026-01-23 09:44:18.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:18 np0005593234 nova_compute[227762]: 2026-01-23 09:44:18.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:18 np0005593234 nova_compute[227762]: 2026-01-23 09:44:18.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:44:18 np0005593234 nova_compute[227762]: 2026-01-23 09:44:18.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:44:18 np0005593234 nova_compute[227762]: 2026-01-23 09:44:18.764 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:44:18 np0005593234 nova_compute[227762]: 2026-01-23 09:44:18.764 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:18.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:18 np0005593234 nova_compute[227762]: 2026-01-23 09:44:18.960 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:19 np0005593234 nova_compute[227762]: 2026-01-23 09:44:19.161 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:19.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:20 np0005593234 nova_compute[227762]: 2026-01-23 09:44:20.032 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:20.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:44:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:21.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:44:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:22.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:23.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:23 np0005593234 nova_compute[227762]: 2026-01-23 09:44:23.602 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161448.6003346, 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:44:23 np0005593234 nova_compute[227762]: 2026-01-23 09:44:23.602 227766 INFO nova.compute.manager [-] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:44:23 np0005593234 nova_compute[227762]: 2026-01-23 09:44:23.632 227766 DEBUG nova.compute.manager [None req-686ea835-cce2-4cf1-a82f-c3600b44287a - - - - - -] [instance: 52f4c5ec-125c-4f64-86ef-2e4af50dbd4c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:44:23 np0005593234 nova_compute[227762]: 2026-01-23 09:44:23.680 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:23 np0005593234 nova_compute[227762]: 2026-01-23 09:44:23.762 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:44:24 np0005593234 nova_compute[227762]: 2026-01-23 09:44:24.163 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:44:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:24.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:44:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:25.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:25 np0005593234 podman[251947]: 2026-01-23 09:44:25.754347632 +0000 UTC m=+0.048562675 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 04:44:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:44:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:26.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:44:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:44:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:27.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:44:28 np0005593234 nova_compute[227762]: 2026-01-23 09:44:28.682 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:28.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:29 np0005593234 nova_compute[227762]: 2026-01-23 09:44:29.165 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:44:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:29.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:44:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:44:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:30.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:44:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:31.145 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:44:31 np0005593234 nova_compute[227762]: 2026-01-23 09:44:31.146 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:31.146 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:44:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:31.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:32.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:33.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:33 np0005593234 nova_compute[227762]: 2026-01-23 09:44:33.685 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:34 np0005593234 nova_compute[227762]: 2026-01-23 09:44:34.167 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:34.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:35.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:36.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:37.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:38 np0005593234 nova_compute[227762]: 2026-01-23 09:44:38.688 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:38 np0005593234 podman[252025]: 2026-01-23 09:44:38.779774573 +0000 UTC m=+0.076121644 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:44:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:38.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:39 np0005593234 nova_compute[227762]: 2026-01-23 09:44:39.168 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:39.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:44:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:40.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:44:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:41.149 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:41.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:42.819 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:42.820 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:42.820 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:42 np0005593234 nova_compute[227762]: 2026-01-23 09:44:42.937 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Acquiring lock "1608bb7b-ae4b-40c1-b404-16dabe957e37" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:42 np0005593234 nova_compute[227762]: 2026-01-23 09:44:42.938 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:42 np0005593234 nova_compute[227762]: 2026-01-23 09:44:42.961 227766 DEBUG nova.compute.manager [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:44:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:42.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.058 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.059 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.065 227766 DEBUG nova.virt.hardware [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.065 227766 INFO nova.compute.claims [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.219 227766 DEBUG oslo_concurrency.processutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:44:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:43.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:44:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:44:43 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2756596775' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.691 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.696 227766 DEBUG oslo_concurrency.processutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.702 227766 DEBUG nova.compute.provider_tree [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.727 227766 DEBUG nova.scheduler.client.report [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.756 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.698s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.758 227766 DEBUG nova.compute.manager [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.811 227766 DEBUG nova.compute.manager [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.812 227766 DEBUG nova.network.neutron [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.851 227766 INFO nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:44:43 np0005593234 nova_compute[227762]: 2026-01-23 09:44:43.893 227766 DEBUG nova.compute.manager [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.023 227766 DEBUG nova.compute.manager [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.024 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.025 227766 INFO nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Creating image(s)#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.052 227766 DEBUG nova.storage.rbd_utils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] rbd image 1608bb7b-ae4b-40c1-b404-16dabe957e37_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.082 227766 DEBUG nova.storage.rbd_utils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] rbd image 1608bb7b-ae4b-40c1-b404-16dabe957e37_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.109 227766 DEBUG nova.storage.rbd_utils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] rbd image 1608bb7b-ae4b-40c1-b404-16dabe957e37_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.113 227766 DEBUG oslo_concurrency.processutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.136 227766 DEBUG nova.policy [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e2eb5d0826b74d23b502201e3cd116a3', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '24b47af7a3f745a7bc14b9a64c920144', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.170 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.173 227766 DEBUG oslo_concurrency.processutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.174 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.174 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.175 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.201 227766 DEBUG nova.storage.rbd_utils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] rbd image 1608bb7b-ae4b-40c1-b404-16dabe957e37_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.205 227766 DEBUG oslo_concurrency.processutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 1608bb7b-ae4b-40c1-b404-16dabe957e37_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:44.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:44 np0005593234 nova_compute[227762]: 2026-01-23 09:44:44.976 227766 DEBUG oslo_concurrency.processutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 1608bb7b-ae4b-40c1-b404-16dabe957e37_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.771s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:45 np0005593234 nova_compute[227762]: 2026-01-23 09:44:45.045 227766 DEBUG nova.storage.rbd_utils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] resizing rbd image 1608bb7b-ae4b-40c1-b404-16dabe957e37_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:44:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:45.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:45 np0005593234 nova_compute[227762]: 2026-01-23 09:44:45.345 227766 DEBUG nova.objects.instance [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lazy-loading 'migration_context' on Instance uuid 1608bb7b-ae4b-40c1-b404-16dabe957e37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:44:45 np0005593234 nova_compute[227762]: 2026-01-23 09:44:45.745 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:44:45 np0005593234 nova_compute[227762]: 2026-01-23 09:44:45.745 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Ensure instance console log exists: /var/lib/nova/instances/1608bb7b-ae4b-40c1-b404-16dabe957e37/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:44:45 np0005593234 nova_compute[227762]: 2026-01-23 09:44:45.746 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:45 np0005593234 nova_compute[227762]: 2026-01-23 09:44:45.746 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:45 np0005593234 nova_compute[227762]: 2026-01-23 09:44:45.746 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:45 np0005593234 nova_compute[227762]: 2026-01-23 09:44:45.962 227766 DEBUG nova.network.neutron [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Successfully created port: 57475f81-ae6b-4bd0-ad78-c49eec106db7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:44:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:46.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:47.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:48 np0005593234 nova_compute[227762]: 2026-01-23 09:44:48.695 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:48 np0005593234 nova_compute[227762]: 2026-01-23 09:44:48.780 227766 DEBUG nova.network.neutron [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Successfully updated port: 57475f81-ae6b-4bd0-ad78-c49eec106db7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:44:48 np0005593234 nova_compute[227762]: 2026-01-23 09:44:48.795 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Acquiring lock "refresh_cache-1608bb7b-ae4b-40c1-b404-16dabe957e37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:44:48 np0005593234 nova_compute[227762]: 2026-01-23 09:44:48.796 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Acquired lock "refresh_cache-1608bb7b-ae4b-40c1-b404-16dabe957e37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:44:48 np0005593234 nova_compute[227762]: 2026-01-23 09:44:48.796 227766 DEBUG nova.network.neutron [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:44:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:44:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:48.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:44:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:44:49 np0005593234 nova_compute[227762]: 2026-01-23 09:44:49.055 227766 DEBUG nova.compute.manager [req-b815a2bd-86e9-4c62-a67b-30777e049d7c req-53c65ea8-5ccd-4c14-9844-12d72cd253c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Received event network-changed-57475f81-ae6b-4bd0-ad78-c49eec106db7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:44:49 np0005593234 nova_compute[227762]: 2026-01-23 09:44:49.055 227766 DEBUG nova.compute.manager [req-b815a2bd-86e9-4c62-a67b-30777e049d7c req-53c65ea8-5ccd-4c14-9844-12d72cd253c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Refreshing instance network info cache due to event network-changed-57475f81-ae6b-4bd0-ad78-c49eec106db7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:44:49 np0005593234 nova_compute[227762]: 2026-01-23 09:44:49.056 227766 DEBUG oslo_concurrency.lockutils [req-b815a2bd-86e9-4c62-a67b-30777e049d7c req-53c65ea8-5ccd-4c14-9844-12d72cd253c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-1608bb7b-ae4b-40c1-b404-16dabe957e37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:44:49 np0005593234 nova_compute[227762]: 2026-01-23 09:44:49.172 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:44:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:49.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:44:49 np0005593234 nova_compute[227762]: 2026-01-23 09:44:49.964 227766 DEBUG nova.network.neutron [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:44:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:44:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:44:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:44:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:50.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:51.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:52 np0005593234 nova_compute[227762]: 2026-01-23 09:44:52.235 227766 DEBUG nova.network.neutron [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Updating instance_info_cache with network_info: [{"id": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "address": "fa:16:3e:b1:32:2c", "network": {"id": "6775b063-5172-4226-8ee7-bbb7bf41d574", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1005289239-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "24b47af7a3f745a7bc14b9a64c920144", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57475f81-ae", "ovs_interfaceid": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:44:52 np0005593234 ovn_controller[134547]: 2026-01-23T09:44:52Z|00149|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 04:44:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:52.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:53.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.365 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Releasing lock "refresh_cache-1608bb7b-ae4b-40c1-b404-16dabe957e37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.365 227766 DEBUG nova.compute.manager [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Instance network_info: |[{"id": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "address": "fa:16:3e:b1:32:2c", "network": {"id": "6775b063-5172-4226-8ee7-bbb7bf41d574", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1005289239-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "24b47af7a3f745a7bc14b9a64c920144", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57475f81-ae", "ovs_interfaceid": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.366 227766 DEBUG oslo_concurrency.lockutils [req-b815a2bd-86e9-4c62-a67b-30777e049d7c req-53c65ea8-5ccd-4c14-9844-12d72cd253c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-1608bb7b-ae4b-40c1-b404-16dabe957e37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.366 227766 DEBUG nova.network.neutron [req-b815a2bd-86e9-4c62-a67b-30777e049d7c req-53c65ea8-5ccd-4c14-9844-12d72cd253c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Refreshing network info cache for port 57475f81-ae6b-4bd0-ad78-c49eec106db7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.369 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Start _get_guest_xml network_info=[{"id": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "address": "fa:16:3e:b1:32:2c", "network": {"id": "6775b063-5172-4226-8ee7-bbb7bf41d574", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1005289239-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "24b47af7a3f745a7bc14b9a64c920144", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57475f81-ae", "ovs_interfaceid": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.374 227766 WARNING nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.380 227766 DEBUG nova.virt.libvirt.host [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.380 227766 DEBUG nova.virt.libvirt.host [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.383 227766 DEBUG nova.virt.libvirt.host [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.383 227766 DEBUG nova.virt.libvirt.host [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.385 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.385 227766 DEBUG nova.virt.hardware [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.386 227766 DEBUG nova.virt.hardware [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.386 227766 DEBUG nova.virt.hardware [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.386 227766 DEBUG nova.virt.hardware [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.386 227766 DEBUG nova.virt.hardware [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.387 227766 DEBUG nova.virt.hardware [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.387 227766 DEBUG nova.virt.hardware [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.387 227766 DEBUG nova.virt.hardware [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.387 227766 DEBUG nova.virt.hardware [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.387 227766 DEBUG nova.virt.hardware [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.388 227766 DEBUG nova.virt.hardware [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.391 227766 DEBUG oslo_concurrency.processutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.699 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:44:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1628783758' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.854 227766 DEBUG oslo_concurrency.processutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.877 227766 DEBUG nova.storage.rbd_utils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] rbd image 1608bb7b-ae4b-40c1-b404-16dabe957e37_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:44:53 np0005593234 nova_compute[227762]: 2026-01-23 09:44:53.880 227766 DEBUG oslo_concurrency.processutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.173 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:44:54 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1817328578' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.285 227766 DEBUG oslo_concurrency.processutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.287 227766 DEBUG nova.virt.libvirt.vif [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:44:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-795664987',display_name='tempest-ImagesOneServerTestJSON-server-795664987',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-795664987',id=55,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='24b47af7a3f745a7bc14b9a64c920144',ramdisk_id='',reservation_id='r-t1s0asbx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-2122530602',owner_user_name='tempest-ImagesOneServerTestJSON-2122530602-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:44:43Z,user_data=None,user_id='e2eb5d0826b74d23b502201e3cd116a3',uuid=1608bb7b-ae4b-40c1-b404-16dabe957e37,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "address": "fa:16:3e:b1:32:2c", "network": {"id": "6775b063-5172-4226-8ee7-bbb7bf41d574", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1005289239-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "24b47af7a3f745a7bc14b9a64c920144", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57475f81-ae", "ovs_interfaceid": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.287 227766 DEBUG nova.network.os_vif_util [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Converting VIF {"id": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "address": "fa:16:3e:b1:32:2c", "network": {"id": "6775b063-5172-4226-8ee7-bbb7bf41d574", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1005289239-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "24b47af7a3f745a7bc14b9a64c920144", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57475f81-ae", "ovs_interfaceid": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.288 227766 DEBUG nova.network.os_vif_util [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:32:2c,bridge_name='br-int',has_traffic_filtering=True,id=57475f81-ae6b-4bd0-ad78-c49eec106db7,network=Network(6775b063-5172-4226-8ee7-bbb7bf41d574),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57475f81-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.289 227766 DEBUG nova.objects.instance [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1608bb7b-ae4b-40c1-b404-16dabe957e37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.447 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  <uuid>1608bb7b-ae4b-40c1-b404-16dabe957e37</uuid>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  <name>instance-00000037</name>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <nova:name>tempest-ImagesOneServerTestJSON-server-795664987</nova:name>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:44:53</nova:creationTime>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <nova:user uuid="e2eb5d0826b74d23b502201e3cd116a3">tempest-ImagesOneServerTestJSON-2122530602-project-member</nova:user>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <nova:project uuid="24b47af7a3f745a7bc14b9a64c920144">tempest-ImagesOneServerTestJSON-2122530602</nova:project>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <nova:port uuid="57475f81-ae6b-4bd0-ad78-c49eec106db7">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <entry name="serial">1608bb7b-ae4b-40c1-b404-16dabe957e37</entry>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <entry name="uuid">1608bb7b-ae4b-40c1-b404-16dabe957e37</entry>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1608bb7b-ae4b-40c1-b404-16dabe957e37_disk">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1608bb7b-ae4b-40c1-b404-16dabe957e37_disk.config">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:b1:32:2c"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <target dev="tap57475f81-ae"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/1608bb7b-ae4b-40c1-b404-16dabe957e37/console.log" append="off"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:44:54 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:44:54 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:44:54 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:44:54 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.449 227766 DEBUG nova.compute.manager [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Preparing to wait for external event network-vif-plugged-57475f81-ae6b-4bd0-ad78-c49eec106db7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.450 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Acquiring lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.450 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.451 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.452 227766 DEBUG nova.virt.libvirt.vif [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:44:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-795664987',display_name='tempest-ImagesOneServerTestJSON-server-795664987',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-795664987',id=55,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='24b47af7a3f745a7bc14b9a64c920144',ramdisk_id='',reservation_id='r-t1s0asbx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ImagesOneServerTestJSON-2122530602',owner_user_name='tempest-ImagesOneServerTestJSON-2122530602-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:44:43Z,user_data=None,user_id='e2eb5d0826b74d23b502201e3cd116a3',uuid=1608bb7b-ae4b-40c1-b404-16dabe957e37,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "address": "fa:16:3e:b1:32:2c", "network": {"id": "6775b063-5172-4226-8ee7-bbb7bf41d574", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1005289239-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "24b47af7a3f745a7bc14b9a64c920144", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57475f81-ae", "ovs_interfaceid": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.453 227766 DEBUG nova.network.os_vif_util [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Converting VIF {"id": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "address": "fa:16:3e:b1:32:2c", "network": {"id": "6775b063-5172-4226-8ee7-bbb7bf41d574", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1005289239-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "24b47af7a3f745a7bc14b9a64c920144", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57475f81-ae", "ovs_interfaceid": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.453 227766 DEBUG nova.network.os_vif_util [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:32:2c,bridge_name='br-int',has_traffic_filtering=True,id=57475f81-ae6b-4bd0-ad78-c49eec106db7,network=Network(6775b063-5172-4226-8ee7-bbb7bf41d574),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57475f81-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.454 227766 DEBUG os_vif [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:32:2c,bridge_name='br-int',has_traffic_filtering=True,id=57475f81-ae6b-4bd0-ad78-c49eec106db7,network=Network(6775b063-5172-4226-8ee7-bbb7bf41d574),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57475f81-ae') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.455 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.455 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.456 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.460 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.461 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap57475f81-ae, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.462 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap57475f81-ae, col_values=(('external_ids', {'iface-id': '57475f81-ae6b-4bd0-ad78-c49eec106db7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:32:2c', 'vm-uuid': '1608bb7b-ae4b-40c1-b404-16dabe957e37'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.464 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:54 np0005593234 NetworkManager[48942]: <info>  [1769161494.4649] manager: (tap57475f81-ae): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/84)
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.466 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.471 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.472 227766 INFO os_vif [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:32:2c,bridge_name='br-int',has_traffic_filtering=True,id=57475f81-ae6b-4bd0-ad78-c49eec106db7,network=Network(6775b063-5172-4226-8ee7-bbb7bf41d574),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57475f81-ae')#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.684 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.684 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.684 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] No VIF found with MAC fa:16:3e:b1:32:2c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.685 227766 INFO nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Using config drive#033[00m
Jan 23 04:44:54 np0005593234 nova_compute[227762]: 2026-01-23 09:44:54.707 227766 DEBUG nova.storage.rbd_utils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] rbd image 1608bb7b-ae4b-40c1-b404-16dabe957e37_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:44:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:44:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:54.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:55.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:55 np0005593234 nova_compute[227762]: 2026-01-23 09:44:55.676 227766 INFO nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Creating config drive at /var/lib/nova/instances/1608bb7b-ae4b-40c1-b404-16dabe957e37/disk.config#033[00m
Jan 23 04:44:55 np0005593234 nova_compute[227762]: 2026-01-23 09:44:55.680 227766 DEBUG oslo_concurrency.processutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1608bb7b-ae4b-40c1-b404-16dabe957e37/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph_0mhzkq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:55 np0005593234 nova_compute[227762]: 2026-01-23 09:44:55.814 227766 DEBUG oslo_concurrency.processutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1608bb7b-ae4b-40c1-b404-16dabe957e37/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph_0mhzkq" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:55 np0005593234 nova_compute[227762]: 2026-01-23 09:44:55.841 227766 DEBUG nova.storage.rbd_utils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] rbd image 1608bb7b-ae4b-40c1-b404-16dabe957e37_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:44:55 np0005593234 nova_compute[227762]: 2026-01-23 09:44:55.845 227766 DEBUG oslo_concurrency.processutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1608bb7b-ae4b-40c1-b404-16dabe957e37/disk.config 1608bb7b-ae4b-40c1-b404-16dabe957e37_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:44:56 np0005593234 nova_compute[227762]: 2026-01-23 09:44:56.000 227766 DEBUG oslo_concurrency.processutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1608bb7b-ae4b-40c1-b404-16dabe957e37/disk.config 1608bb7b-ae4b-40c1-b404-16dabe957e37_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.155s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:44:56 np0005593234 nova_compute[227762]: 2026-01-23 09:44:56.001 227766 INFO nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Deleting local config drive /var/lib/nova/instances/1608bb7b-ae4b-40c1-b404-16dabe957e37/disk.config because it was imported into RBD.#033[00m
Jan 23 04:44:56 np0005593234 kernel: tap57475f81-ae: entered promiscuous mode
Jan 23 04:44:56 np0005593234 ovn_controller[134547]: 2026-01-23T09:44:56Z|00150|binding|INFO|Claiming lport 57475f81-ae6b-4bd0-ad78-c49eec106db7 for this chassis.
Jan 23 04:44:56 np0005593234 NetworkManager[48942]: <info>  [1769161496.0748] manager: (tap57475f81-ae): new Tun device (/org/freedesktop/NetworkManager/Devices/85)
Jan 23 04:44:56 np0005593234 ovn_controller[134547]: 2026-01-23T09:44:56Z|00151|binding|INFO|57475f81-ae6b-4bd0-ad78-c49eec106db7: Claiming fa:16:3e:b1:32:2c 10.100.0.7
Jan 23 04:44:56 np0005593234 nova_compute[227762]: 2026-01-23 09:44:56.075 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:56 np0005593234 systemd-udevd[252616]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.125 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:32:2c 10.100.0.7'], port_security=['fa:16:3e:b1:32:2c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1608bb7b-ae4b-40c1-b404-16dabe957e37', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6775b063-5172-4226-8ee7-bbb7bf41d574', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '24b47af7a3f745a7bc14b9a64c920144', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c82ffd29-94e3-4adc-8eb5-200f3da764b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ec6bf9e-f40f-4a6b-ac38-5129d9705bbd, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=57475f81-ae6b-4bd0-ad78-c49eec106db7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.126 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 57475f81-ae6b-4bd0-ad78-c49eec106db7 in datapath 6775b063-5172-4226-8ee7-bbb7bf41d574 bound to our chassis#033[00m
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.128 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6775b063-5172-4226-8ee7-bbb7bf41d574#033[00m
Jan 23 04:44:56 np0005593234 NetworkManager[48942]: <info>  [1769161496.1354] device (tap57475f81-ae): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:44:56 np0005593234 NetworkManager[48942]: <info>  [1769161496.1361] device (tap57475f81-ae): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:44:56 np0005593234 systemd-machined[195626]: New machine qemu-24-instance-00000037.
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.139 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[568f9e5d-2023-4148-98f9-ef0c214f1801]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.140 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6775b063-51 in ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.143 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6775b063-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.143 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f24fa9d7-e67f-4819-a1c1-5229e9901d39]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.144 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd8b5b7-a72b-47d9-a9b6-316d54d0c3fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.159 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[97c46119-dd71-4916-864f-2057c7781cd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:56 np0005593234 nova_compute[227762]: 2026-01-23 09:44:56.161 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:56 np0005593234 systemd[1]: Started Virtual Machine qemu-24-instance-00000037.
Jan 23 04:44:56 np0005593234 ovn_controller[134547]: 2026-01-23T09:44:56Z|00152|binding|INFO|Setting lport 57475f81-ae6b-4bd0-ad78-c49eec106db7 ovn-installed in OVS
Jan 23 04:44:56 np0005593234 ovn_controller[134547]: 2026-01-23T09:44:56Z|00153|binding|INFO|Setting lport 57475f81-ae6b-4bd0-ad78-c49eec106db7 up in Southbound
Jan 23 04:44:56 np0005593234 nova_compute[227762]: 2026-01-23 09:44:56.174 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:44:56 np0005593234 podman[252610]: 2026-01-23 09:44:56.17768659 +0000 UTC m=+0.068812047 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.181 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[61a3340f-2229-44e0-9f10-da9d24361e80]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.210 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4215278a-e142-47a4-b263-5c9e58693841]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:56 np0005593234 systemd-udevd[252626]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:44:56 np0005593234 NetworkManager[48942]: <info>  [1769161496.2162] manager: (tap6775b063-50): new Veth device (/org/freedesktop/NetworkManager/Devices/86)
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.215 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[32bd91cb-f370-43e2-ac0a-cc2fa3b5f990]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.244 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a7a974c6-276e-42ed-a8ac-73a12c3d151f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.247 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f50d7107-e781-4f1d-b328-3ff7eecbd14d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:56 np0005593234 NetworkManager[48942]: <info>  [1769161496.2661] device (tap6775b063-50): carrier: link connected
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.271 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[29f96219-df56-42dc-9aef-95bbce402b1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:56 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:44:56 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.286 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f74a41a7-19cb-4816-9c1e-551a21dcb263]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6775b063-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:d5:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539122, 'reachable_time': 35438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 252666, 'error': None, 'target': 'ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.300 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f23e2fc9-6857-4077-9994-df93284a1358]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:d573'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 539122, 'tstamp': 539122}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 252667, 'error': None, 'target': 'ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.313 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec1dd3d-b9f0-4bec-9790-9c40a421d502]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6775b063-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:d5:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539122, 'reachable_time': 35438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 252668, 'error': None, 'target': 'ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:44:56 np0005593234 nova_compute[227762]: 2026-01-23 09:44:56.314 227766 DEBUG nova.network.neutron [req-b815a2bd-86e9-4c62-a67b-30777e049d7c req-53c65ea8-5ccd-4c14-9844-12d72cd253c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Updated VIF entry in instance network info cache for port 57475f81-ae6b-4bd0-ad78-c49eec106db7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:44:56 np0005593234 nova_compute[227762]: 2026-01-23 09:44:56.315 227766 DEBUG nova.network.neutron [req-b815a2bd-86e9-4c62-a67b-30777e049d7c req-53c65ea8-5ccd-4c14-9844-12d72cd253c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Updating instance_info_cache with network_info: [{"id": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "address": "fa:16:3e:b1:32:2c", "network": {"id": "6775b063-5172-4226-8ee7-bbb7bf41d574", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1005289239-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "24b47af7a3f745a7bc14b9a64c920144", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57475f81-ae", "ovs_interfaceid": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.339 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c0ac1ee2-bd68-4446-ae93-4ec23b9f8046]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.386 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ad9579ea-2ecb-46d8-b345-9e1132989b60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.387 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6775b063-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.388 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.388 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6775b063-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:44:56 np0005593234 kernel: tap6775b063-50: entered promiscuous mode
Jan 23 04:44:56 np0005593234 nova_compute[227762]: 2026-01-23 09:44:56.390 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.392 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6775b063-50, col_values=(('external_ids', {'iface-id': '19633461-d38d-4e48-af36-1cbe0e5d5d09'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:44:56 np0005593234 ovn_controller[134547]: 2026-01-23T09:44:56Z|00154|binding|INFO|Releasing lport 19633461-d38d-4e48-af36-1cbe0e5d5d09 from this chassis (sb_readonly=0)
Jan 23 04:44:56 np0005593234 NetworkManager[48942]: <info>  [1769161496.3932] manager: (tap6775b063-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/87)
Jan 23 04:44:56 np0005593234 nova_compute[227762]: 2026-01-23 09:44:56.393 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:44:56 np0005593234 nova_compute[227762]: 2026-01-23 09:44:56.405 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.406 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6775b063-5172-4226-8ee7-bbb7bf41d574.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6775b063-5172-4226-8ee7-bbb7bf41d574.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.406 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ab9cad74-4a7d-4049-963d-5271ac7b6a30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.407 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-6775b063-5172-4226-8ee7-bbb7bf41d574
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/6775b063-5172-4226-8ee7-bbb7bf41d574.pid.haproxy
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 6775b063-5172-4226-8ee7-bbb7bf41d574
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 04:44:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:44:56.408 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574', 'env', 'PROCESS_TAG=haproxy-6775b063-5172-4226-8ee7-bbb7bf41d574', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6775b063-5172-4226-8ee7-bbb7bf41d574.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 04:44:56 np0005593234 podman[252700]: 2026-01-23 09:44:56.7787642 +0000 UTC m=+0.050526216 container create 116f508c5161f4eb03c7c20caba690c8b949634c3e7a47419596e26a2e50923f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:44:56 np0005593234 systemd[1]: Started libpod-conmon-116f508c5161f4eb03c7c20caba690c8b949634c3e7a47419596e26a2e50923f.scope.
Jan 23 04:44:56 np0005593234 podman[252700]: 2026-01-23 09:44:56.75150585 +0000 UTC m=+0.023267886 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:44:56 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:44:56 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b9c2f85e95b4282fa4c1e92267374ab86bba255571f3ad1269b7bd2c4311fb7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:44:56 np0005593234 podman[252700]: 2026-01-23 09:44:56.864528514 +0000 UTC m=+0.136290550 container init 116f508c5161f4eb03c7c20caba690c8b949634c3e7a47419596e26a2e50923f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:44:56 np0005593234 podman[252700]: 2026-01-23 09:44:56.86951904 +0000 UTC m=+0.141281046 container start 116f508c5161f4eb03c7c20caba690c8b949634c3e7a47419596e26a2e50923f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:44:56 np0005593234 neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574[252716]: [NOTICE]   (252720) : New worker (252722) forked
Jan 23 04:44:56 np0005593234 neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574[252716]: [NOTICE]   (252720) : Loading success.
Jan 23 04:44:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:56.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:57 np0005593234 nova_compute[227762]: 2026-01-23 09:44:57.072 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161497.0722206, 1608bb7b-ae4b-40c1-b404-16dabe957e37 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:44:57 np0005593234 nova_compute[227762]: 2026-01-23 09:44:57.073 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] VM Started (Lifecycle Event)
Jan 23 04:44:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:57.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:58 np0005593234 nova_compute[227762]: 2026-01-23 09:44:58.563 227766 DEBUG oslo_concurrency.lockutils [req-b815a2bd-86e9-4c62-a67b-30777e049d7c req-53c65ea8-5ccd-4c14-9844-12d72cd253c7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-1608bb7b-ae4b-40c1-b404-16dabe957e37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:44:58 np0005593234 nova_compute[227762]: 2026-01-23 09:44:58.655 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:44:58 np0005593234 nova_compute[227762]: 2026-01-23 09:44:58.660 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161497.072366, 1608bb7b-ae4b-40c1-b404-16dabe957e37 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:44:58 np0005593234 nova_compute[227762]: 2026-01-23 09:44:58.660 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] VM Paused (Lifecycle Event)
Jan 23 04:44:58 np0005593234 nova_compute[227762]: 2026-01-23 09:44:58.802 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:44:58 np0005593234 nova_compute[227762]: 2026-01-23 09:44:58.805 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:44:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:44:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:44:58.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:44:59 np0005593234 nova_compute[227762]: 2026-01-23 09:44:59.048 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:44:59 np0005593234 nova_compute[227762]: 2026-01-23 09:44:59.176 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:44:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:44:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:44:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:44:59.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:44:59 np0005593234 nova_compute[227762]: 2026-01-23 09:44:59.464 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:44:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:00.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.013 227766 DEBUG nova.compute.manager [req-5046b0c0-dec6-46be-af06-031837bbf507 req-e89b11cf-e26e-4f12-9845-a52d9907f595 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Received event network-vif-plugged-57475f81-ae6b-4bd0-ad78-c49eec106db7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.014 227766 DEBUG oslo_concurrency.lockutils [req-5046b0c0-dec6-46be-af06-031837bbf507 req-e89b11cf-e26e-4f12-9845-a52d9907f595 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.014 227766 DEBUG oslo_concurrency.lockutils [req-5046b0c0-dec6-46be-af06-031837bbf507 req-e89b11cf-e26e-4f12-9845-a52d9907f595 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.014 227766 DEBUG oslo_concurrency.lockutils [req-5046b0c0-dec6-46be-af06-031837bbf507 req-e89b11cf-e26e-4f12-9845-a52d9907f595 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.015 227766 DEBUG nova.compute.manager [req-5046b0c0-dec6-46be-af06-031837bbf507 req-e89b11cf-e26e-4f12-9845-a52d9907f595 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Processing event network-vif-plugged-57475f81-ae6b-4bd0-ad78-c49eec106db7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.015 227766 DEBUG nova.compute.manager [req-5046b0c0-dec6-46be-af06-031837bbf507 req-e89b11cf-e26e-4f12-9845-a52d9907f595 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Received event network-vif-plugged-57475f81-ae6b-4bd0-ad78-c49eec106db7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.015 227766 DEBUG oslo_concurrency.lockutils [req-5046b0c0-dec6-46be-af06-031837bbf507 req-e89b11cf-e26e-4f12-9845-a52d9907f595 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.015 227766 DEBUG oslo_concurrency.lockutils [req-5046b0c0-dec6-46be-af06-031837bbf507 req-e89b11cf-e26e-4f12-9845-a52d9907f595 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.015 227766 DEBUG oslo_concurrency.lockutils [req-5046b0c0-dec6-46be-af06-031837bbf507 req-e89b11cf-e26e-4f12-9845-a52d9907f595 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.016 227766 DEBUG nova.compute.manager [req-5046b0c0-dec6-46be-af06-031837bbf507 req-e89b11cf-e26e-4f12-9845-a52d9907f595 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] No waiting events found dispatching network-vif-plugged-57475f81-ae6b-4bd0-ad78-c49eec106db7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.016 227766 WARNING nova.compute.manager [req-5046b0c0-dec6-46be-af06-031837bbf507 req-e89b11cf-e26e-4f12-9845-a52d9907f595 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Received unexpected event network-vif-plugged-57475f81-ae6b-4bd0-ad78-c49eec106db7 for instance with vm_state building and task_state spawning.
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.016 227766 DEBUG nova.compute.manager [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.020 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161501.0200717, 1608bb7b-ae4b-40c1-b404-16dabe957e37 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.020 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] VM Resumed (Lifecycle Event)
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.022 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.025 227766 INFO nova.virt.libvirt.driver [-] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Instance spawned successfully.
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.025 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.125 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.128 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.136 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.137 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.137 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.138 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.138 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.138 227766 DEBUG nova.virt.libvirt.driver [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:45:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:01.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.266 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.506 227766 INFO nova.compute.manager [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Took 17.48 seconds to spawn the instance on the hypervisor.
Jan 23 04:45:01 np0005593234 nova_compute[227762]: 2026-01-23 09:45:01.508 227766 DEBUG nova.compute.manager [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:45:02 np0005593234 nova_compute[227762]: 2026-01-23 09:45:02.504 227766 INFO nova.compute.manager [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Took 19.49 seconds to build instance.
Jan 23 04:45:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:45:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:02.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:45:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:45:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:03.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:45:04 np0005593234 nova_compute[227762]: 2026-01-23 09:45:04.179 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:45:04 np0005593234 nova_compute[227762]: 2026-01-23 09:45:04.236 227766 DEBUG oslo_concurrency.lockutils [None req-f9c8bdf7-1e54-4734-95cf-addc7f9b85b6 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.298s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:45:04 np0005593234 nova_compute[227762]: 2026-01-23 09:45:04.465 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:45:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:04.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:05.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:07.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:45:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:07.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:45:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:09.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:09 np0005593234 nova_compute[227762]: 2026-01-23 09:45:09.183 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:45:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:09.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:45:09 np0005593234 nova_compute[227762]: 2026-01-23 09:45:09.467 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:09 np0005593234 podman[252780]: 2026-01-23 09:45:09.794274062 +0000 UTC m=+0.089627405 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller)
Jan 23 04:45:09 np0005593234 nova_compute[227762]: 2026-01-23 09:45:09.876 227766 DEBUG nova.compute.manager [None req-6b91eec1-c3f6-42b5-9ca2-1d9f7d558c74 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:45:09 np0005593234 nova_compute[227762]: 2026-01-23 09:45:09.923 227766 INFO nova.compute.manager [None req-6b91eec1-c3f6-42b5-9ca2-1d9f7d558c74 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] instance snapshotting#033[00m
Jan 23 04:45:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:10 np0005593234 nova_compute[227762]: 2026-01-23 09:45:10.342 227766 INFO nova.virt.libvirt.driver [None req-6b91eec1-c3f6-42b5-9ca2-1d9f7d558c74 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Beginning live snapshot process#033[00m
Jan 23 04:45:10 np0005593234 nova_compute[227762]: 2026-01-23 09:45:10.590 227766 DEBUG nova.virt.libvirt.imagebackend [None req-6b91eec1-c3f6-42b5-9ca2-1d9f7d558c74 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 04:45:10 np0005593234 nova_compute[227762]: 2026-01-23 09:45:10.874 227766 DEBUG nova.storage.rbd_utils [None req-6b91eec1-c3f6-42b5-9ca2-1d9f7d558c74 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] creating snapshot(17163fdc6a5f4d47bee1e276da483d4a) on rbd image(1608bb7b-ae4b-40c1-b404-16dabe957e37_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:45:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:11.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:11.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e206 e206: 3 total, 3 up, 3 in
Jan 23 04:45:11 np0005593234 nova_compute[227762]: 2026-01-23 09:45:11.612 227766 DEBUG nova.storage.rbd_utils [None req-6b91eec1-c3f6-42b5-9ca2-1d9f7d558c74 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] cloning vms/1608bb7b-ae4b-40c1-b404-16dabe957e37_disk@17163fdc6a5f4d47bee1e276da483d4a to images/985f1d15-bd4f-4eb9-b8ae-0616b8fcb8b9 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 04:45:11 np0005593234 nova_compute[227762]: 2026-01-23 09:45:11.733 227766 DEBUG nova.storage.rbd_utils [None req-6b91eec1-c3f6-42b5-9ca2-1d9f7d558c74 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] flattening images/985f1d15-bd4f-4eb9-b8ae-0616b8fcb8b9 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 04:45:11 np0005593234 nova_compute[227762]: 2026-01-23 09:45:11.793 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:12 np0005593234 nova_compute[227762]: 2026-01-23 09:45:12.059 227766 DEBUG nova.storage.rbd_utils [None req-6b91eec1-c3f6-42b5-9ca2-1d9f7d558c74 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] removing snapshot(17163fdc6a5f4d47bee1e276da483d4a) on rbd image(1608bb7b-ae4b-40c1-b404-16dabe957e37_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:45:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e207 e207: 3 total, 3 up, 3 in
Jan 23 04:45:12 np0005593234 nova_compute[227762]: 2026-01-23 09:45:12.605 227766 DEBUG nova.storage.rbd_utils [None req-6b91eec1-c3f6-42b5-9ca2-1d9f7d558c74 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] creating snapshot(snap) on rbd image(985f1d15-bd4f-4eb9-b8ae-0616b8fcb8b9) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:45:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:13.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:13.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e208 e208: 3 total, 3 up, 3 in
Jan 23 04:45:14 np0005593234 nova_compute[227762]: 2026-01-23 09:45:14.186 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:14 np0005593234 ovn_controller[134547]: 2026-01-23T09:45:14Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b1:32:2c 10.100.0.7
Jan 23 04:45:14 np0005593234 ovn_controller[134547]: 2026-01-23T09:45:14Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:32:2c 10.100.0.7
Jan 23 04:45:14 np0005593234 nova_compute[227762]: 2026-01-23 09:45:14.469 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:45:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:15.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:45:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:15.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:15 np0005593234 nova_compute[227762]: 2026-01-23 09:45:15.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:15 np0005593234 nova_compute[227762]: 2026-01-23 09:45:15.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:15 np0005593234 nova_compute[227762]: 2026-01-23 09:45:15.780 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:45:15 np0005593234 nova_compute[227762]: 2026-01-23 09:45:15.780 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:45:15 np0005593234 nova_compute[227762]: 2026-01-23 09:45:15.781 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:45:15 np0005593234 nova_compute[227762]: 2026-01-23 09:45:15.781 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:45:15 np0005593234 nova_compute[227762]: 2026-01-23 09:45:15.781 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:45:16 np0005593234 nova_compute[227762]: 2026-01-23 09:45:16.213 227766 INFO nova.virt.libvirt.driver [None req-6b91eec1-c3f6-42b5-9ca2-1d9f7d558c74 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Snapshot image upload complete#033[00m
Jan 23 04:45:16 np0005593234 nova_compute[227762]: 2026-01-23 09:45:16.214 227766 INFO nova.compute.manager [None req-6b91eec1-c3f6-42b5-9ca2-1d9f7d558c74 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Took 6.29 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 23 04:45:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:45:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1994590513' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:45:16 np0005593234 nova_compute[227762]: 2026-01-23 09:45:16.247 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:45:16 np0005593234 nova_compute[227762]: 2026-01-23 09:45:16.361 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000037 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:45:16 np0005593234 nova_compute[227762]: 2026-01-23 09:45:16.361 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000037 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:45:16 np0005593234 nova_compute[227762]: 2026-01-23 09:45:16.503 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:45:16 np0005593234 nova_compute[227762]: 2026-01-23 09:45:16.505 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4513MB free_disk=20.901145935058594GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:45:16 np0005593234 nova_compute[227762]: 2026-01-23 09:45:16.505 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:45:16 np0005593234 nova_compute[227762]: 2026-01-23 09:45:16.505 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:45:16 np0005593234 nova_compute[227762]: 2026-01-23 09:45:16.584 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 1608bb7b-ae4b-40c1-b404-16dabe957e37 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:45:16 np0005593234 nova_compute[227762]: 2026-01-23 09:45:16.584 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:45:16 np0005593234 nova_compute[227762]: 2026-01-23 09:45:16.584 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:45:16 np0005593234 nova_compute[227762]: 2026-01-23 09:45:16.637 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:45:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:45:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:17.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:45:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:45:17 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2870520970' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:45:17 np0005593234 nova_compute[227762]: 2026-01-23 09:45:17.049 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:45:17 np0005593234 nova_compute[227762]: 2026-01-23 09:45:17.055 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:45:17 np0005593234 nova_compute[227762]: 2026-01-23 09:45:17.078 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:45:17 np0005593234 nova_compute[227762]: 2026-01-23 09:45:17.141 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:45:17 np0005593234 nova_compute[227762]: 2026-01-23 09:45:17.141 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:45:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:17.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:18 np0005593234 nova_compute[227762]: 2026-01-23 09:45:18.136 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:18 np0005593234 nova_compute[227762]: 2026-01-23 09:45:18.164 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:18 np0005593234 nova_compute[227762]: 2026-01-23 09:45:18.164 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:18 np0005593234 nova_compute[227762]: 2026-01-23 09:45:18.164 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:45:18 np0005593234 nova_compute[227762]: 2026-01-23 09:45:18.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:18 np0005593234 nova_compute[227762]: 2026-01-23 09:45:18.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:45:18 np0005593234 nova_compute[227762]: 2026-01-23 09:45:18.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:45:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:19.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:19 np0005593234 nova_compute[227762]: 2026-01-23 09:45:19.224 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:19.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:19 np0005593234 nova_compute[227762]: 2026-01-23 09:45:19.326 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-1608bb7b-ae4b-40c1-b404-16dabe957e37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:45:19 np0005593234 nova_compute[227762]: 2026-01-23 09:45:19.327 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-1608bb7b-ae4b-40c1-b404-16dabe957e37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:45:19 np0005593234 nova_compute[227762]: 2026-01-23 09:45:19.327 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:45:19 np0005593234 nova_compute[227762]: 2026-01-23 09:45:19.328 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1608bb7b-ae4b-40c1-b404-16dabe957e37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:45:19 np0005593234 nova_compute[227762]: 2026-01-23 09:45:19.471 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e208 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e209 e209: 3 total, 3 up, 3 in
Jan 23 04:45:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:45:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:21.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:45:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:21.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:21 np0005593234 nova_compute[227762]: 2026-01-23 09:45:21.625 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Updating instance_info_cache with network_info: [{"id": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "address": "fa:16:3e:b1:32:2c", "network": {"id": "6775b063-5172-4226-8ee7-bbb7bf41d574", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1005289239-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "24b47af7a3f745a7bc14b9a64c920144", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57475f81-ae", "ovs_interfaceid": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:45:21 np0005593234 nova_compute[227762]: 2026-01-23 09:45:21.652 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-1608bb7b-ae4b-40c1-b404-16dabe957e37" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:45:21 np0005593234 nova_compute[227762]: 2026-01-23 09:45:21.653 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:45:21 np0005593234 nova_compute[227762]: 2026-01-23 09:45:21.653 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:22 np0005593234 nova_compute[227762]: 2026-01-23 09:45:22.340 227766 DEBUG nova.compute.manager [None req-41820385-a1d7-4e5d-b018-8555a2c95844 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:45:22 np0005593234 nova_compute[227762]: 2026-01-23 09:45:22.405 227766 INFO nova.compute.manager [None req-41820385-a1d7-4e5d-b018-8555a2c95844 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] instance snapshotting#033[00m
Jan 23 04:45:22 np0005593234 nova_compute[227762]: 2026-01-23 09:45:22.758 227766 INFO nova.virt.libvirt.driver [None req-41820385-a1d7-4e5d-b018-8555a2c95844 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Beginning live snapshot process#033[00m
Jan 23 04:45:23 np0005593234 nova_compute[227762]: 2026-01-23 09:45:23.003 227766 DEBUG nova.virt.libvirt.imagebackend [None req-41820385-a1d7-4e5d-b018-8555a2c95844 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 04:45:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:45:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:23.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:45:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:23.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:23 np0005593234 nova_compute[227762]: 2026-01-23 09:45:23.647 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:23 np0005593234 nova_compute[227762]: 2026-01-23 09:45:23.672 227766 DEBUG nova.storage.rbd_utils [None req-41820385-a1d7-4e5d-b018-8555a2c95844 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] creating snapshot(7b403c2c14134d6094e03c489d517ff0) on rbd image(1608bb7b-ae4b-40c1-b404-16dabe957e37_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:45:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e210 e210: 3 total, 3 up, 3 in
Jan 23 04:45:24 np0005593234 nova_compute[227762]: 2026-01-23 09:45:24.232 227766 DEBUG nova.storage.rbd_utils [None req-41820385-a1d7-4e5d-b018-8555a2c95844 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] cloning vms/1608bb7b-ae4b-40c1-b404-16dabe957e37_disk@7b403c2c14134d6094e03c489d517ff0 to images/975acd30-ef25-4077-8c73-a70095688223 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 04:45:24 np0005593234 nova_compute[227762]: 2026-01-23 09:45:24.265 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:24 np0005593234 nova_compute[227762]: 2026-01-23 09:45:24.381 227766 DEBUG nova.storage.rbd_utils [None req-41820385-a1d7-4e5d-b018-8555a2c95844 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] flattening images/975acd30-ef25-4077-8c73-a70095688223 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 04:45:24 np0005593234 nova_compute[227762]: 2026-01-23 09:45:24.472 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:24 np0005593234 nova_compute[227762]: 2026-01-23 09:45:24.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:45:24 np0005593234 nova_compute[227762]: 2026-01-23 09:45:24.882 227766 DEBUG nova.storage.rbd_utils [None req-41820385-a1d7-4e5d-b018-8555a2c95844 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] removing snapshot(7b403c2c14134d6094e03c489d517ff0) on rbd image(1608bb7b-ae4b-40c1-b404-16dabe957e37_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:45:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:25.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:25.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e211 e211: 3 total, 3 up, 3 in
Jan 23 04:45:25 np0005593234 nova_compute[227762]: 2026-01-23 09:45:25.524 227766 DEBUG nova.storage.rbd_utils [None req-41820385-a1d7-4e5d-b018-8555a2c95844 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] creating snapshot(snap) on rbd image(975acd30-ef25-4077-8c73-a70095688223) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 04:45:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e212 e212: 3 total, 3 up, 3 in
Jan 23 04:45:26 np0005593234 podman[253194]: 2026-01-23 09:45:26.768953002 +0000 UTC m=+0.061920571 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:45:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:27.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:27.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:45:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:29.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:45:29 np0005593234 nova_compute[227762]: 2026-01-23 09:45:29.230 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:45:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:29.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:45:29 np0005593234 nova_compute[227762]: 2026-01-23 09:45:29.474 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:31.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:31 np0005593234 nova_compute[227762]: 2026-01-23 09:45:31.223 227766 INFO nova.virt.libvirt.driver [None req-41820385-a1d7-4e5d-b018-8555a2c95844 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Snapshot image upload complete#033[00m
Jan 23 04:45:31 np0005593234 nova_compute[227762]: 2026-01-23 09:45:31.224 227766 INFO nova.compute.manager [None req-41820385-a1d7-4e5d-b018-8555a2c95844 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Took 8.82 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 23 04:45:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:45:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:31.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:45:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:45:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:33.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:45:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:45:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:33.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:45:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e213 e213: 3 total, 3 up, 3 in
Jan 23 04:45:34 np0005593234 nova_compute[227762]: 2026-01-23 09:45:34.232 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:34 np0005593234 nova_compute[227762]: 2026-01-23 09:45:34.475 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e213 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:35.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:45:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:35.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:45:36 np0005593234 nova_compute[227762]: 2026-01-23 09:45:36.453 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:36.455 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:45:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:36.456 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:45:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e214 e214: 3 total, 3 up, 3 in
Jan 23 04:45:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:37.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:37.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.289 227766 DEBUG oslo_concurrency.lockutils [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Acquiring lock "1608bb7b-ae4b-40c1-b404-16dabe957e37" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.290 227766 DEBUG oslo_concurrency.lockutils [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.290 227766 DEBUG oslo_concurrency.lockutils [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Acquiring lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.290 227766 DEBUG oslo_concurrency.lockutils [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.291 227766 DEBUG oslo_concurrency.lockutils [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.292 227766 INFO nova.compute.manager [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Terminating instance#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.293 227766 DEBUG nova.compute.manager [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:45:38 np0005593234 kernel: tap57475f81-ae (unregistering): left promiscuous mode
Jan 23 04:45:38 np0005593234 NetworkManager[48942]: <info>  [1769161538.3451] device (tap57475f81-ae): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.358 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:38 np0005593234 ovn_controller[134547]: 2026-01-23T09:45:38Z|00155|binding|INFO|Releasing lport 57475f81-ae6b-4bd0-ad78-c49eec106db7 from this chassis (sb_readonly=0)
Jan 23 04:45:38 np0005593234 ovn_controller[134547]: 2026-01-23T09:45:38Z|00156|binding|INFO|Setting lport 57475f81-ae6b-4bd0-ad78-c49eec106db7 down in Southbound
Jan 23 04:45:38 np0005593234 ovn_controller[134547]: 2026-01-23T09:45:38Z|00157|binding|INFO|Removing iface tap57475f81-ae ovn-installed in OVS
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.360 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:38.371 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:32:2c 10.100.0.7'], port_security=['fa:16:3e:b1:32:2c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '1608bb7b-ae4b-40c1-b404-16dabe957e37', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6775b063-5172-4226-8ee7-bbb7bf41d574', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '24b47af7a3f745a7bc14b9a64c920144', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c82ffd29-94e3-4adc-8eb5-200f3da764b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3ec6bf9e-f40f-4a6b-ac38-5129d9705bbd, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=57475f81-ae6b-4bd0-ad78-c49eec106db7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:45:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:38.373 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 57475f81-ae6b-4bd0-ad78-c49eec106db7 in datapath 6775b063-5172-4226-8ee7-bbb7bf41d574 unbound from our chassis#033[00m
Jan 23 04:45:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:38.375 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6775b063-5172-4226-8ee7-bbb7bf41d574, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:45:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:38.377 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5c6ab89d-4ac0-4e7b-abdc-5569f4d103fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:45:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:38.378 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574 namespace which is not needed anymore#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.389 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:38 np0005593234 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000037.scope: Deactivated successfully.
Jan 23 04:45:38 np0005593234 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000037.scope: Consumed 15.152s CPU time.
Jan 23 04:45:38 np0005593234 systemd-machined[195626]: Machine qemu-24-instance-00000037 terminated.
Jan 23 04:45:38 np0005593234 neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574[252716]: [NOTICE]   (252720) : haproxy version is 2.8.14-c23fe91
Jan 23 04:45:38 np0005593234 neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574[252716]: [NOTICE]   (252720) : path to executable is /usr/sbin/haproxy
Jan 23 04:45:38 np0005593234 neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574[252716]: [WARNING]  (252720) : Exiting Master process...
Jan 23 04:45:38 np0005593234 neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574[252716]: [ALERT]    (252720) : Current worker (252722) exited with code 143 (Terminated)
Jan 23 04:45:38 np0005593234 neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574[252716]: [WARNING]  (252720) : All workers exited. Exiting... (0)
Jan 23 04:45:38 np0005593234 systemd[1]: libpod-116f508c5161f4eb03c7c20caba690c8b949634c3e7a47419596e26a2e50923f.scope: Deactivated successfully.
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.516 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:38 np0005593234 podman[253294]: 2026-01-23 09:45:38.52108854 +0000 UTC m=+0.051220648 container died 116f508c5161f4eb03c7c20caba690c8b949634c3e7a47419596e26a2e50923f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.521 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.530 227766 INFO nova.virt.libvirt.driver [-] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Instance destroyed successfully.#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.530 227766 DEBUG nova.objects.instance [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lazy-loading 'resources' on Instance uuid 1608bb7b-ae4b-40c1-b404-16dabe957e37 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:45:38 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-116f508c5161f4eb03c7c20caba690c8b949634c3e7a47419596e26a2e50923f-userdata-shm.mount: Deactivated successfully.
Jan 23 04:45:38 np0005593234 systemd[1]: var-lib-containers-storage-overlay-8b9c2f85e95b4282fa4c1e92267374ab86bba255571f3ad1269b7bd2c4311fb7-merged.mount: Deactivated successfully.
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.554 227766 DEBUG nova.virt.libvirt.vif [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:44:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ImagesOneServerTestJSON-server-795664987',display_name='tempest-ImagesOneServerTestJSON-server-795664987',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-imagesoneservertestjson-server-795664987',id=55,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:45:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='24b47af7a3f745a7bc14b9a64c920144',ramdisk_id='',reservation_id='r-t1s0asbx',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ImagesOneServerTestJSON-2122530602',owner_user_name='tempest-ImagesOneServerTestJSON-2122530602-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:45:31Z,user_data=None,user_id='e2eb5d0826b74d23b502201e3cd116a3',uuid=1608bb7b-ae4b-40c1-b404-16dabe957e37,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "address": "fa:16:3e:b1:32:2c", "network": {"id": "6775b063-5172-4226-8ee7-bbb7bf41d574", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1005289239-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "24b47af7a3f745a7bc14b9a64c920144", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57475f81-ae", "ovs_interfaceid": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.555 227766 DEBUG nova.network.os_vif_util [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Converting VIF {"id": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "address": "fa:16:3e:b1:32:2c", "network": {"id": "6775b063-5172-4226-8ee7-bbb7bf41d574", "bridge": "br-int", "label": "tempest-ImagesOneServerTestJSON-1005289239-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "24b47af7a3f745a7bc14b9a64c920144", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap57475f81-ae", "ovs_interfaceid": "57475f81-ae6b-4bd0-ad78-c49eec106db7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.556 227766 DEBUG nova.network.os_vif_util [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:32:2c,bridge_name='br-int',has_traffic_filtering=True,id=57475f81-ae6b-4bd0-ad78-c49eec106db7,network=Network(6775b063-5172-4226-8ee7-bbb7bf41d574),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57475f81-ae') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.556 227766 DEBUG os_vif [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:32:2c,bridge_name='br-int',has_traffic_filtering=True,id=57475f81-ae6b-4bd0-ad78-c49eec106db7,network=Network(6775b063-5172-4226-8ee7-bbb7bf41d574),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57475f81-ae') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.558 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.559 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap57475f81-ae, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.560 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:38 np0005593234 podman[253294]: 2026-01-23 09:45:38.562295945 +0000 UTC m=+0.092428033 container cleanup 116f508c5161f4eb03c7c20caba690c8b949634c3e7a47419596e26a2e50923f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.563 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.566 227766 INFO os_vif [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:32:2c,bridge_name='br-int',has_traffic_filtering=True,id=57475f81-ae6b-4bd0-ad78-c49eec106db7,network=Network(6775b063-5172-4226-8ee7-bbb7bf41d574),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap57475f81-ae')#033[00m
Jan 23 04:45:38 np0005593234 systemd[1]: libpod-conmon-116f508c5161f4eb03c7c20caba690c8b949634c3e7a47419596e26a2e50923f.scope: Deactivated successfully.
Jan 23 04:45:38 np0005593234 podman[253331]: 2026-01-23 09:45:38.62887216 +0000 UTC m=+0.044056754 container remove 116f508c5161f4eb03c7c20caba690c8b949634c3e7a47419596e26a2e50923f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 04:45:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:38.635 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[64d98406-8627-475c-b35e-2cd70b0f4224]: (4, ('Fri Jan 23 09:45:38 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574 (116f508c5161f4eb03c7c20caba690c8b949634c3e7a47419596e26a2e50923f)\n116f508c5161f4eb03c7c20caba690c8b949634c3e7a47419596e26a2e50923f\nFri Jan 23 09:45:38 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574 (116f508c5161f4eb03c7c20caba690c8b949634c3e7a47419596e26a2e50923f)\n116f508c5161f4eb03c7c20caba690c8b949634c3e7a47419596e26a2e50923f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:45:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:38.636 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[34dfef33-ae29-4970-a9e5-556f583dded9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:45:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:38.637 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6775b063-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.639 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:38 np0005593234 kernel: tap6775b063-50: left promiscuous mode
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.653 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:38.655 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[09fa67a5-4f73-4029-8571-6d1d87769db2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:45:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:38.673 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4eb85916-0577-4b1c-83de-2f2a71d494cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:45:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:38.674 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6d2cb47b-6a64-4431-8d0c-f353c39d4a45]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:45:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:38.689 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2af95ce8-197c-4896-a040-f2e60a07bfe5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 539116, 'reachable_time': 18158, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 253364, 'error': None, 'target': 'ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:45:38 np0005593234 systemd[1]: run-netns-ovnmeta\x2d6775b063\x2d5172\x2d4226\x2d8ee7\x2dbbb7bf41d574.mount: Deactivated successfully.
Jan 23 04:45:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:38.694 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6775b063-5172-4226-8ee7-bbb7bf41d574 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:45:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:38.694 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[bc162973-b8ba-48ca-9437-90aa66717634]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.701 227766 DEBUG nova.compute.manager [req-96dea23e-89e4-4ba1-b747-2a506c363de9 req-da1ac1b5-1770-48be-b773-2f29ea615a8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Received event network-vif-unplugged-57475f81-ae6b-4bd0-ad78-c49eec106db7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.702 227766 DEBUG oslo_concurrency.lockutils [req-96dea23e-89e4-4ba1-b747-2a506c363de9 req-da1ac1b5-1770-48be-b773-2f29ea615a8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.702 227766 DEBUG oslo_concurrency.lockutils [req-96dea23e-89e4-4ba1-b747-2a506c363de9 req-da1ac1b5-1770-48be-b773-2f29ea615a8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.702 227766 DEBUG oslo_concurrency.lockutils [req-96dea23e-89e4-4ba1-b747-2a506c363de9 req-da1ac1b5-1770-48be-b773-2f29ea615a8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.703 227766 DEBUG nova.compute.manager [req-96dea23e-89e4-4ba1-b747-2a506c363de9 req-da1ac1b5-1770-48be-b773-2f29ea615a8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] No waiting events found dispatching network-vif-unplugged-57475f81-ae6b-4bd0-ad78-c49eec106db7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:45:38 np0005593234 nova_compute[227762]: 2026-01-23 09:45:38.703 227766 DEBUG nova.compute.manager [req-96dea23e-89e4-4ba1-b747-2a506c363de9 req-da1ac1b5-1770-48be-b773-2f29ea615a8b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Received event network-vif-unplugged-57475f81-ae6b-4bd0-ad78-c49eec106db7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:45:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:39.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:39 np0005593234 nova_compute[227762]: 2026-01-23 09:45:39.105 227766 INFO nova.virt.libvirt.driver [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Deleting instance files /var/lib/nova/instances/1608bb7b-ae4b-40c1-b404-16dabe957e37_del#033[00m
Jan 23 04:45:39 np0005593234 nova_compute[227762]: 2026-01-23 09:45:39.106 227766 INFO nova.virt.libvirt.driver [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Deletion of /var/lib/nova/instances/1608bb7b-ae4b-40c1-b404-16dabe957e37_del complete#033[00m
Jan 23 04:45:39 np0005593234 nova_compute[227762]: 2026-01-23 09:45:39.177 227766 INFO nova.compute.manager [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Took 0.88 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:45:39 np0005593234 nova_compute[227762]: 2026-01-23 09:45:39.178 227766 DEBUG oslo.service.loopingcall [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:45:39 np0005593234 nova_compute[227762]: 2026-01-23 09:45:39.178 227766 DEBUG nova.compute.manager [-] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:45:39 np0005593234 nova_compute[227762]: 2026-01-23 09:45:39.179 227766 DEBUG nova.network.neutron [-] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:45:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:39.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:39 np0005593234 nova_compute[227762]: 2026-01-23 09:45:39.371 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e214 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:40 np0005593234 podman[253367]: 2026-01-23 09:45:40.785434649 +0000 UTC m=+0.078173388 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, 
managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:45:40 np0005593234 nova_compute[227762]: 2026-01-23 09:45:40.838 227766 DEBUG nova.compute.manager [req-0e8cfe81-89d1-4ee3-810c-a0f50b10412c req-22684a3b-b15e-4f9c-b479-4a6bb2896561 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Received event network-vif-plugged-57475f81-ae6b-4bd0-ad78-c49eec106db7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:45:40 np0005593234 nova_compute[227762]: 2026-01-23 09:45:40.838 227766 DEBUG oslo_concurrency.lockutils [req-0e8cfe81-89d1-4ee3-810c-a0f50b10412c req-22684a3b-b15e-4f9c-b479-4a6bb2896561 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:45:40 np0005593234 nova_compute[227762]: 2026-01-23 09:45:40.839 227766 DEBUG oslo_concurrency.lockutils [req-0e8cfe81-89d1-4ee3-810c-a0f50b10412c req-22684a3b-b15e-4f9c-b479-4a6bb2896561 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:45:40 np0005593234 nova_compute[227762]: 2026-01-23 09:45:40.839 227766 DEBUG oslo_concurrency.lockutils [req-0e8cfe81-89d1-4ee3-810c-a0f50b10412c req-22684a3b-b15e-4f9c-b479-4a6bb2896561 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:45:40 np0005593234 nova_compute[227762]: 2026-01-23 09:45:40.839 227766 DEBUG nova.compute.manager [req-0e8cfe81-89d1-4ee3-810c-a0f50b10412c req-22684a3b-b15e-4f9c-b479-4a6bb2896561 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] No waiting events found dispatching network-vif-plugged-57475f81-ae6b-4bd0-ad78-c49eec106db7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:45:40 np0005593234 nova_compute[227762]: 2026-01-23 09:45:40.839 227766 WARNING nova.compute.manager [req-0e8cfe81-89d1-4ee3-810c-a0f50b10412c req-22684a3b-b15e-4f9c-b479-4a6bb2896561 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Received unexpected event network-vif-plugged-57475f81-ae6b-4bd0-ad78-c49eec106db7 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:45:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:45:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:41.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:45:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:45:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:41.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:45:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:41.458 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:45:41 np0005593234 nova_compute[227762]: 2026-01-23 09:45:41.677 227766 DEBUG nova.network.neutron [-] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:45:41 np0005593234 nova_compute[227762]: 2026-01-23 09:45:41.863 227766 INFO nova.compute.manager [-] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Took 2.68 seconds to deallocate network for instance.#033[00m
Jan 23 04:45:41 np0005593234 nova_compute[227762]: 2026-01-23 09:45:41.944 227766 DEBUG oslo_concurrency.lockutils [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:45:41 np0005593234 nova_compute[227762]: 2026-01-23 09:45:41.944 227766 DEBUG oslo_concurrency.lockutils [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:45:42 np0005593234 nova_compute[227762]: 2026-01-23 09:45:42.083 227766 DEBUG oslo_concurrency.processutils [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:45:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:45:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4146912579' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:45:42 np0005593234 nova_compute[227762]: 2026-01-23 09:45:42.521 227766 DEBUG oslo_concurrency.processutils [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:45:42 np0005593234 nova_compute[227762]: 2026-01-23 09:45:42.528 227766 DEBUG nova.compute.provider_tree [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:45:42 np0005593234 nova_compute[227762]: 2026-01-23 09:45:42.560 227766 DEBUG nova.scheduler.client.report [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:45:42 np0005593234 nova_compute[227762]: 2026-01-23 09:45:42.662 227766 DEBUG oslo_concurrency.lockutils [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.718s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:45:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:42.820 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:45:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:42.821 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:45:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:45:42.822 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:45:43 np0005593234 nova_compute[227762]: 2026-01-23 09:45:43.009 227766 DEBUG nova.compute.manager [req-9f5a9fe6-7ce1-47a2-a954-9355d8a0c29e req-eb512d36-876d-419a-b99a-251ad9e61c62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Received event network-vif-deleted-57475f81-ae6b-4bd0-ad78-c49eec106db7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:45:43 np0005593234 nova_compute[227762]: 2026-01-23 09:45:43.037 227766 INFO nova.scheduler.client.report [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Deleted allocations for instance 1608bb7b-ae4b-40c1-b404-16dabe957e37
Jan 23 04:45:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:45:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:43.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:45:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:43.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:43 np0005593234 nova_compute[227762]: 2026-01-23 09:45:43.381 227766 DEBUG oslo_concurrency.lockutils [None req-957052fc-c2a9-449e-acbe-c0d4b79c8b63 e2eb5d0826b74d23b502201e3cd116a3 24b47af7a3f745a7bc14b9a64c920144 - - default default] Lock "1608bb7b-ae4b-40c1-b404-16dabe957e37" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.091s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:45:43 np0005593234 nova_compute[227762]: 2026-01-23 09:45:43.561 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:45:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e215 e215: 3 total, 3 up, 3 in
Jan 23 04:45:44 np0005593234 nova_compute[227762]: 2026-01-23 09:45:44.372 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:45:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:45.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:45:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:45.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:45:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:47.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:47.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:48 np0005593234 nova_compute[227762]: 2026-01-23 09:45:48.565 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:45:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:49.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:45:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:49.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:45:49 np0005593234 nova_compute[227762]: 2026-01-23 09:45:49.416 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:45:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e216 e216: 3 total, 3 up, 3 in
Jan 23 04:45:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e216 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:50 np0005593234 nova_compute[227762]: 2026-01-23 09:45:50.586 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Acquiring lock "4312368d-83d8-4d95-98a1-18bc91f966ce" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:45:50 np0005593234 nova_compute[227762]: 2026-01-23 09:45:50.586 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:45:50 np0005593234 nova_compute[227762]: 2026-01-23 09:45:50.639 227766 DEBUG nova.compute.manager [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 04:45:50 np0005593234 nova_compute[227762]: 2026-01-23 09:45:50.805 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:45:50 np0005593234 nova_compute[227762]: 2026-01-23 09:45:50.806 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:45:50 np0005593234 nova_compute[227762]: 2026-01-23 09:45:50.814 227766 DEBUG nova.virt.hardware [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 04:45:50 np0005593234 nova_compute[227762]: 2026-01-23 09:45:50.814 227766 INFO nova.compute.claims [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Claim successful on node compute-2.ctlplane.example.com
Jan 23 04:45:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:45:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:51.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.110 227766 DEBUG oslo_concurrency.processutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:45:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:51.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:45:51 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1451420404' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.531 227766 DEBUG oslo_concurrency.processutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.536 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.538 227766 DEBUG nova.compute.provider_tree [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.577 227766 DEBUG nova.scheduler.client.report [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.630 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.824s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.631 227766 DEBUG nova.compute.manager [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.730 227766 DEBUG nova.compute.manager [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.731 227766 DEBUG nova.network.neutron [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.755 227766 INFO nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.776 227766 DEBUG nova.compute.manager [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.905 227766 DEBUG nova.compute.manager [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.906 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.907 227766 INFO nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Creating image(s)
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.934 227766 DEBUG nova.storage.rbd_utils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] rbd image 4312368d-83d8-4d95-98a1-18bc91f966ce_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.964 227766 DEBUG nova.storage.rbd_utils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] rbd image 4312368d-83d8-4d95-98a1-18bc91f966ce_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.993 227766 DEBUG nova.storage.rbd_utils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] rbd image 4312368d-83d8-4d95-98a1-18bc91f966ce_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:45:51 np0005593234 nova_compute[227762]: 2026-01-23 09:45:51.996 227766 DEBUG oslo_concurrency.processutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.057 227766 DEBUG oslo_concurrency.processutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.058 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.058 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.059 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.087 227766 DEBUG nova.storage.rbd_utils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] rbd image 4312368d-83d8-4d95-98a1-18bc91f966ce_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.090 227766 DEBUG oslo_concurrency.processutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 4312368d-83d8-4d95-98a1-18bc91f966ce_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.114 227766 DEBUG nova.policy [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '69cd789fe82e4fb6a1a6fc06333e456b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'bb5dcf0180e448f3971332f4d58c11f9', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.385 227766 DEBUG oslo_concurrency.processutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 4312368d-83d8-4d95-98a1-18bc91f966ce_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.295s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.444 227766 DEBUG nova.storage.rbd_utils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] resizing rbd image 4312368d-83d8-4d95-98a1-18bc91f966ce_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.545 227766 DEBUG nova.objects.instance [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lazy-loading 'migration_context' on Instance uuid 4312368d-83d8-4d95-98a1-18bc91f966ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.584 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.585 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Ensure instance console log exists: /var/lib/nova/instances/4312368d-83d8-4d95-98a1-18bc91f966ce/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.585 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.586 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:45:52 np0005593234 nova_compute[227762]: 2026-01-23 09:45:52.586 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:45:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:53.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:53.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:53 np0005593234 nova_compute[227762]: 2026-01-23 09:45:53.529 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161538.5278504, 1608bb7b-ae4b-40c1-b404-16dabe957e37 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:45:53 np0005593234 nova_compute[227762]: 2026-01-23 09:45:53.530 227766 INFO nova.compute.manager [-] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] VM Stopped (Lifecycle Event)
Jan 23 04:45:53 np0005593234 nova_compute[227762]: 2026-01-23 09:45:53.568 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:45:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e217 e217: 3 total, 3 up, 3 in
Jan 23 04:45:54 np0005593234 nova_compute[227762]: 2026-01-23 09:45:54.091 227766 DEBUG nova.compute.manager [None req-82fab160-c0cb-4072-9417-fa508e1ce2dd - - - - - -] [instance: 1608bb7b-ae4b-40c1-b404-16dabe957e37] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:45:54 np0005593234 nova_compute[227762]: 2026-01-23 09:45:54.461 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:45:54 np0005593234 nova_compute[227762]: 2026-01-23 09:45:54.562 227766 DEBUG nova.network.neutron [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Successfully created port: f53d0ca7-d346-4e78-831f-33d2d4b1ec75 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 04:45:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e217 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:45:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:45:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:55.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:45:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:55.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:57.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:57.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:57 np0005593234 podman[253795]: 2026-01-23 09:45:57.751753249 +0000 UTC m=+0.046590375 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 23 04:45:57 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:45:57 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:45:57 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:45:58 np0005593234 nova_compute[227762]: 2026-01-23 09:45:58.121 227766 DEBUG nova.network.neutron [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Successfully updated port: f53d0ca7-d346-4e78-831f-33d2d4b1ec75 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:45:58 np0005593234 nova_compute[227762]: 2026-01-23 09:45:58.570 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e218 e218: 3 total, 3 up, 3 in
Jan 23 04:45:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:45:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:45:59.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:45:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:45:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:45:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:45:59.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:45:59 np0005593234 nova_compute[227762]: 2026-01-23 09:45:59.502 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:45:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:00 np0005593234 nova_compute[227762]: 2026-01-23 09:46:00.936 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Acquiring lock "refresh_cache-4312368d-83d8-4d95-98a1-18bc91f966ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:46:00 np0005593234 nova_compute[227762]: 2026-01-23 09:46:00.936 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Acquired lock "refresh_cache-4312368d-83d8-4d95-98a1-18bc91f966ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:46:00 np0005593234 nova_compute[227762]: 2026-01-23 09:46:00.936 227766 DEBUG nova.network.neutron [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:46:00 np0005593234 nova_compute[227762]: 2026-01-23 09:46:00.990 227766 DEBUG nova.compute.manager [req-f40dd6a6-94c3-4f70-9b06-d08373452820 req-ec1d8ee1-22d6-4087-9f4d-b7cbd8650970 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Received event network-changed-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:46:00 np0005593234 nova_compute[227762]: 2026-01-23 09:46:00.990 227766 DEBUG nova.compute.manager [req-f40dd6a6-94c3-4f70-9b06-d08373452820 req-ec1d8ee1-22d6-4087-9f4d-b7cbd8650970 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Refreshing instance network info cache due to event network-changed-f53d0ca7-d346-4e78-831f-33d2d4b1ec75. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:46:00 np0005593234 nova_compute[227762]: 2026-01-23 09:46:00.990 227766 DEBUG oslo_concurrency.lockutils [req-f40dd6a6-94c3-4f70-9b06-d08373452820 req-ec1d8ee1-22d6-4087-9f4d-b7cbd8650970 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-4312368d-83d8-4d95-98a1-18bc91f966ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:46:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:46:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:01.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:46:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:46:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:01.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:46:01 np0005593234 nova_compute[227762]: 2026-01-23 09:46:01.755 227766 DEBUG nova.network.neutron [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:46:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:03.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:03.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:03 np0005593234 nova_compute[227762]: 2026-01-23 09:46:03.574 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:46:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:46:04 np0005593234 nova_compute[227762]: 2026-01-23 09:46:04.506 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:05.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:05.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:07.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:07.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:08 np0005593234 nova_compute[227762]: 2026-01-23 09:46:08.279 227766 DEBUG nova.network.neutron [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Updating instance_info_cache with network_info: [{"id": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "address": "fa:16:3e:d7:f8:1b", "network": {"id": "3c924c14-6a63-4f18-9be4-d068c6154900", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691052156-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb5dcf0180e448f3971332f4d58c11f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d0ca7-d3", "ovs_interfaceid": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:46:08 np0005593234 nova_compute[227762]: 2026-01-23 09:46:08.578 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:09.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:09.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:09 np0005593234 nova_compute[227762]: 2026-01-23 09:46:09.507 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.239 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Releasing lock "refresh_cache-4312368d-83d8-4d95-98a1-18bc91f966ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.239 227766 DEBUG nova.compute.manager [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Instance network_info: |[{"id": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "address": "fa:16:3e:d7:f8:1b", "network": {"id": "3c924c14-6a63-4f18-9be4-d068c6154900", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691052156-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb5dcf0180e448f3971332f4d58c11f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d0ca7-d3", "ovs_interfaceid": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.240 227766 DEBUG oslo_concurrency.lockutils [req-f40dd6a6-94c3-4f70-9b06-d08373452820 req-ec1d8ee1-22d6-4087-9f4d-b7cbd8650970 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-4312368d-83d8-4d95-98a1-18bc91f966ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.240 227766 DEBUG nova.network.neutron [req-f40dd6a6-94c3-4f70-9b06-d08373452820 req-ec1d8ee1-22d6-4087-9f4d-b7cbd8650970 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Refreshing network info cache for port f53d0ca7-d346-4e78-831f-33d2d4b1ec75 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.243 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Start _get_guest_xml network_info=[{"id": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "address": "fa:16:3e:d7:f8:1b", "network": {"id": "3c924c14-6a63-4f18-9be4-d068c6154900", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691052156-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb5dcf0180e448f3971332f4d58c11f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d0ca7-d3", "ovs_interfaceid": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.246 227766 WARNING nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.251 227766 DEBUG nova.virt.libvirt.host [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.252 227766 DEBUG nova.virt.libvirt.host [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.254 227766 DEBUG nova.virt.libvirt.host [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.255 227766 DEBUG nova.virt.libvirt.host [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.256 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.256 227766 DEBUG nova.virt.hardware [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.257 227766 DEBUG nova.virt.hardware [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.257 227766 DEBUG nova.virt.hardware [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.257 227766 DEBUG nova.virt.hardware [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.258 227766 DEBUG nova.virt.hardware [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.258 227766 DEBUG nova.virt.hardware [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.258 227766 DEBUG nova.virt.hardware [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.258 227766 DEBUG nova.virt.hardware [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.259 227766 DEBUG nova.virt.hardware [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.259 227766 DEBUG nova.virt.hardware [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.259 227766 DEBUG nova.virt.hardware [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.262 227766 DEBUG oslo_concurrency.processutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:46:10 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3141605574' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.692 227766 DEBUG oslo_concurrency.processutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.714 227766 DEBUG nova.storage.rbd_utils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] rbd image 4312368d-83d8-4d95-98a1-18bc91f966ce_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:10 np0005593234 nova_compute[227762]: 2026-01-23 09:46:10.718 227766 DEBUG oslo_concurrency.processutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:46:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:11.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:46:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:46:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/172909784' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:46:11 np0005593234 nova_compute[227762]: 2026-01-23 09:46:11.166 227766 DEBUG oslo_concurrency.processutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:11 np0005593234 nova_compute[227762]: 2026-01-23 09:46:11.168 227766 DEBUG nova.virt.libvirt.vif [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:45:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=57,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFmiYxtuVv6oSYUp9Qbidd/GX1flYjM4Z3DDAWV39PaoPurD8XRzELTjykQToXYbOq/Z+fKKM6KxeYr3Xqkk4Uuft9sDZA+ttBS2CbmK3386CyurHb3d/o1d00leJXKUww==',key_name='tempest-keypair-2142523924',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bb5dcf0180e448f3971332f4d58c11f9',ramdisk_id='',reservation_id='r-iqwb588k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-718255531',owner_user_name='tempest-ServersV294TestFqdnHostnames-718255531-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:45:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='69cd789fe82e4fb6a1a6fc06333e456b',uuid=4312368d-83d8-4d95-98a1-18bc91f966ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "address": "fa:16:3e:d7:f8:1b", "network": {"id": "3c924c14-6a63-4f18-9be4-d068c6154900", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691052156-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb5dcf0180e448f3971332f4d58c11f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d0ca7-d3", "ovs_interfaceid": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:46:11 np0005593234 nova_compute[227762]: 2026-01-23 09:46:11.168 227766 DEBUG nova.network.os_vif_util [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Converting VIF {"id": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "address": "fa:16:3e:d7:f8:1b", "network": {"id": "3c924c14-6a63-4f18-9be4-d068c6154900", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691052156-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb5dcf0180e448f3971332f4d58c11f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d0ca7-d3", "ovs_interfaceid": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:46:11 np0005593234 nova_compute[227762]: 2026-01-23 09:46:11.169 227766 DEBUG nova.network.os_vif_util [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:f8:1b,bridge_name='br-int',has_traffic_filtering=True,id=f53d0ca7-d346-4e78-831f-33d2d4b1ec75,network=Network(3c924c14-6a63-4f18-9be4-d068c6154900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf53d0ca7-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:46:11 np0005593234 nova_compute[227762]: 2026-01-23 09:46:11.170 227766 DEBUG nova.objects.instance [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4312368d-83d8-4d95-98a1-18bc91f966ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:46:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:11.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:11 np0005593234 podman[253934]: 2026-01-23 09:46:11.77053042 +0000 UTC m=+0.068733334 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 23 04:46:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:13.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:13.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:13 np0005593234 nova_compute[227762]: 2026-01-23 09:46:13.580 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:13 np0005593234 nova_compute[227762]: 2026-01-23 09:46:13.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.312284) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161574312363, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 2272, "num_deletes": 263, "total_data_size": 5093194, "memory_usage": 5171088, "flush_reason": "Manual Compaction"}
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161574332865, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 3336685, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36078, "largest_seqno": 38345, "table_properties": {"data_size": 3327268, "index_size": 5911, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19761, "raw_average_key_size": 20, "raw_value_size": 3308268, "raw_average_value_size": 3446, "num_data_blocks": 256, "num_entries": 960, "num_filter_entries": 960, "num_deletions": 263, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161403, "oldest_key_time": 1769161403, "file_creation_time": 1769161574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 20634 microseconds, and 9398 cpu microseconds.
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.332929) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 3336685 bytes OK
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.332954) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.334977) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.334989) EVENT_LOG_v1 {"time_micros": 1769161574334985, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.335006) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 5082943, prev total WAL file size 5082943, number of live WAL files 2.
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.336080) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303033' seq:72057594037927935, type:22 .. '6C6F676D0031323535' seq:0, type:0; will stop at (end)
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(3258KB)], [69(8131KB)]
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161574336164, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 11663524, "oldest_snapshot_seqno": -1}
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.411 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  <uuid>4312368d-83d8-4d95-98a1-18bc91f966ce</uuid>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  <name>instance-00000039</name>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <nova:name>guest-instance-1</nova:name>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:46:10</nova:creationTime>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <nova:user uuid="69cd789fe82e4fb6a1a6fc06333e456b">tempest-ServersV294TestFqdnHostnames-718255531-project-member</nova:user>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <nova:project uuid="bb5dcf0180e448f3971332f4d58c11f9">tempest-ServersV294TestFqdnHostnames-718255531</nova:project>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <nova:port uuid="f53d0ca7-d346-4e78-831f-33d2d4b1ec75">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <entry name="serial">4312368d-83d8-4d95-98a1-18bc91f966ce</entry>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <entry name="uuid">4312368d-83d8-4d95-98a1-18bc91f966ce</entry>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/4312368d-83d8-4d95-98a1-18bc91f966ce_disk">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/4312368d-83d8-4d95-98a1-18bc91f966ce_disk.config">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:d7:f8:1b"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <target dev="tapf53d0ca7-d3"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/4312368d-83d8-4d95-98a1-18bc91f966ce/console.log" append="off"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:46:14 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:46:14 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:46:14 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:46:14 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.413 227766 DEBUG nova.compute.manager [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Preparing to wait for external event network-vif-plugged-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.413 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Acquiring lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.413 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.414 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.414 227766 DEBUG nova.virt.libvirt.vif [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:45:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=57,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFmiYxtuVv6oSYUp9Qbidd/GX1flYjM4Z3DDAWV39PaoPurD8XRzELTjykQToXYbOq/Z+fKKM6KxeYr3Xqkk4Uuft9sDZA+ttBS2CbmK3386CyurHb3d/o1d00leJXKUww==',key_name='tempest-keypair-2142523924',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='bb5dcf0180e448f3971332f4d58c11f9',ramdisk_id='',reservation_id='r-iqwb588k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersV294TestFqdnHostnames-718255531',owner_user_name='tempest-ServersV294TestFqdnHostnames-718255531-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:45:51Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='69cd789fe82e4fb6a1a6fc06333e456b',uuid=4312368d-83d8-4d95-98a1-18bc91f966ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "address": "fa:16:3e:d7:f8:1b", "network": {"id": "3c924c14-6a63-4f18-9be4-d068c6154900", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691052156-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb5dcf0180e448f3971332f4d58c11f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d0ca7-d3", "ovs_interfaceid": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.415 227766 DEBUG nova.network.os_vif_util [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Converting VIF {"id": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "address": "fa:16:3e:d7:f8:1b", "network": {"id": "3c924c14-6a63-4f18-9be4-d068c6154900", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691052156-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb5dcf0180e448f3971332f4d58c11f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d0ca7-d3", "ovs_interfaceid": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.415 227766 DEBUG nova.network.os_vif_util [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d7:f8:1b,bridge_name='br-int',has_traffic_filtering=True,id=f53d0ca7-d346-4e78-831f-33d2d4b1ec75,network=Network(3c924c14-6a63-4f18-9be4-d068c6154900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf53d0ca7-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.416 227766 DEBUG os_vif [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:f8:1b,bridge_name='br-int',has_traffic_filtering=True,id=f53d0ca7-d346-4e78-831f-33d2d4b1ec75,network=Network(3c924c14-6a63-4f18-9be4-d068c6154900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf53d0ca7-d3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.416 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.417 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.417 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6249 keys, 11511150 bytes, temperature: kUnknown
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161574418049, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 11511150, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11467279, "index_size": 27121, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15685, "raw_key_size": 159724, "raw_average_key_size": 25, "raw_value_size": 11353023, "raw_average_value_size": 1816, "num_data_blocks": 1096, "num_entries": 6249, "num_filter_entries": 6249, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769161574, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.418320) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 11511150 bytes
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.419540) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 142.3 rd, 140.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.2, 7.9 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(6.9) write-amplify(3.4) OK, records in: 6789, records dropped: 540 output_compression: NoCompression
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.419557) EVENT_LOG_v1 {"time_micros": 1769161574419548, "job": 42, "event": "compaction_finished", "compaction_time_micros": 81985, "compaction_time_cpu_micros": 23310, "output_level": 6, "num_output_files": 1, "total_output_size": 11511150, "num_input_records": 6789, "num_output_records": 6249, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161574420215, "job": 42, "event": "table_file_deletion", "file_number": 71}
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.420 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.421 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf53d0ca7-d3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161574421744, "job": 42, "event": "table_file_deletion", "file_number": 69}
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.421 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf53d0ca7-d3, col_values=(('external_ids', {'iface-id': 'f53d0ca7-d346-4e78-831f-33d2d4b1ec75', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d7:f8:1b', 'vm-uuid': '4312368d-83d8-4d95-98a1-18bc91f966ce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.336004) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.421842) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.421846) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.421848) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.421850) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:14.421852) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.423 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:14 np0005593234 NetworkManager[48942]: <info>  [1769161574.4238] manager: (tapf53d0ca7-d3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.425 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.430 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.431 227766 INFO os_vif [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d7:f8:1b,bridge_name='br-int',has_traffic_filtering=True,id=f53d0ca7-d346-4e78-831f-33d2d4b1ec75,network=Network(3c924c14-6a63-4f18-9be4-d068c6154900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf53d0ca7-d3')#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.508 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.721 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.722 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.722 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] No VIF found with MAC fa:16:3e:d7:f8:1b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.723 227766 INFO nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Using config drive#033[00m
Jan 23 04:46:14 np0005593234 nova_compute[227762]: 2026-01-23 09:46:14.744 227766 DEBUG nova.storage.rbd_utils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] rbd image 4312368d-83d8-4d95-98a1-18bc91f966ce_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:15.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:46:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:15.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:46:15 np0005593234 nova_compute[227762]: 2026-01-23 09:46:15.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:16 np0005593234 nova_compute[227762]: 2026-01-23 09:46:16.203 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:16 np0005593234 nova_compute[227762]: 2026-01-23 09:46:16.204 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:16 np0005593234 nova_compute[227762]: 2026-01-23 09:46:16.204 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:16 np0005593234 nova_compute[227762]: 2026-01-23 09:46:16.204 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:46:16 np0005593234 nova_compute[227762]: 2026-01-23 09:46:16.204 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:46:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3260828330' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:46:16 np0005593234 nova_compute[227762]: 2026-01-23 09:46:16.644 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:46:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:17.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.240 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000039 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.241 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000039 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:46:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:17.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.400 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.401 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4675MB free_disk=20.922027587890625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.401 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.401 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.457 227766 INFO nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Creating config drive at /var/lib/nova/instances/4312368d-83d8-4d95-98a1-18bc91f966ce/disk.config#033[00m
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.466 227766 DEBUG oslo_concurrency.processutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4312368d-83d8-4d95-98a1-18bc91f966ce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9tocy63v execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.598 227766 DEBUG oslo_concurrency.processutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4312368d-83d8-4d95-98a1-18bc91f966ce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9tocy63v" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.623 227766 DEBUG nova.storage.rbd_utils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] rbd image 4312368d-83d8-4d95-98a1-18bc91f966ce_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.627 227766 DEBUG oslo_concurrency.processutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4312368d-83d8-4d95-98a1-18bc91f966ce/disk.config 4312368d-83d8-4d95-98a1-18bc91f966ce_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.715 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 4312368d-83d8-4d95-98a1-18bc91f966ce actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.716 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.716 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:46:17 np0005593234 nova_compute[227762]: 2026-01-23 09:46:17.811 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:18 np0005593234 nova_compute[227762]: 2026-01-23 09:46:18.196 227766 DEBUG nova.network.neutron [req-f40dd6a6-94c3-4f70-9b06-d08373452820 req-ec1d8ee1-22d6-4087-9f4d-b7cbd8650970 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Updated VIF entry in instance network info cache for port f53d0ca7-d346-4e78-831f-33d2d4b1ec75. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:46:18 np0005593234 nova_compute[227762]: 2026-01-23 09:46:18.197 227766 DEBUG nova.network.neutron [req-f40dd6a6-94c3-4f70-9b06-d08373452820 req-ec1d8ee1-22d6-4087-9f4d-b7cbd8650970 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Updating instance_info_cache with network_info: [{"id": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "address": "fa:16:3e:d7:f8:1b", "network": {"id": "3c924c14-6a63-4f18-9be4-d068c6154900", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691052156-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb5dcf0180e448f3971332f4d58c11f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d0ca7-d3", "ovs_interfaceid": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:46:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:46:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/546304430' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:46:18 np0005593234 nova_compute[227762]: 2026-01-23 09:46:18.250 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:18 np0005593234 nova_compute[227762]: 2026-01-23 09:46:18.256 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:46:18 np0005593234 nova_compute[227762]: 2026-01-23 09:46:18.982 227766 DEBUG oslo_concurrency.processutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4312368d-83d8-4d95-98a1-18bc91f966ce/disk.config 4312368d-83d8-4d95-98a1-18bc91f966ce_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.355s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:18 np0005593234 nova_compute[227762]: 2026-01-23 09:46:18.982 227766 INFO nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Deleting local config drive /var/lib/nova/instances/4312368d-83d8-4d95-98a1-18bc91f966ce/disk.config because it was imported into RBD.#033[00m
Jan 23 04:46:19 np0005593234 kernel: tapf53d0ca7-d3: entered promiscuous mode
Jan 23 04:46:19 np0005593234 ovn_controller[134547]: 2026-01-23T09:46:19Z|00158|binding|INFO|Claiming lport f53d0ca7-d346-4e78-831f-33d2d4b1ec75 for this chassis.
Jan 23 04:46:19 np0005593234 ovn_controller[134547]: 2026-01-23T09:46:19Z|00159|binding|INFO|f53d0ca7-d346-4e78-831f-33d2d4b1ec75: Claiming fa:16:3e:d7:f8:1b 10.100.0.13
Jan 23 04:46:19 np0005593234 nova_compute[227762]: 2026-01-23 09:46:19.030 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:19 np0005593234 NetworkManager[48942]: <info>  [1769161579.0313] manager: (tapf53d0ca7-d3): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Jan 23 04:46:19 np0005593234 nova_compute[227762]: 2026-01-23 09:46:19.034 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:19 np0005593234 systemd-udevd[254130]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:46:19 np0005593234 NetworkManager[48942]: <info>  [1769161579.0691] device (tapf53d0ca7-d3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:46:19 np0005593234 NetworkManager[48942]: <info>  [1769161579.0696] device (tapf53d0ca7-d3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:46:19 np0005593234 systemd-machined[195626]: New machine qemu-25-instance-00000039.
Jan 23 04:46:19 np0005593234 nova_compute[227762]: 2026-01-23 09:46:19.097 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:19 np0005593234 systemd[1]: Started Virtual Machine qemu-25-instance-00000039.
Jan 23 04:46:19 np0005593234 ovn_controller[134547]: 2026-01-23T09:46:19Z|00160|binding|INFO|Setting lport f53d0ca7-d346-4e78-831f-33d2d4b1ec75 ovn-installed in OVS
Jan 23 04:46:19 np0005593234 nova_compute[227762]: 2026-01-23 09:46:19.106 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:19.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:19.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:19 np0005593234 nova_compute[227762]: 2026-01-23 09:46:19.423 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:19 np0005593234 nova_compute[227762]: 2026-01-23 09:46:19.510 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:19 np0005593234 nova_compute[227762]: 2026-01-23 09:46:19.707 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161579.7069712, 4312368d-83d8-4d95-98a1-18bc91f966ce => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:46:19 np0005593234 nova_compute[227762]: 2026-01-23 09:46:19.707 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] VM Started (Lifecycle Event)#033[00m
Jan 23 04:46:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:20 np0005593234 ovn_controller[134547]: 2026-01-23T09:46:20Z|00161|binding|INFO|Setting lport f53d0ca7-d346-4e78-831f-33d2d4b1ec75 up in Southbound
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.242 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:f8:1b 10.100.0.13'], port_security=['fa:16:3e:d7:f8:1b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4312368d-83d8-4d95-98a1-18bc91f966ce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c924c14-6a63-4f18-9be4-d068c6154900', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bb5dcf0180e448f3971332f4d58c11f9', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3b690f9b-0d4f-4f60-98e0-a388f02a35fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a4e9f06-27dd-4315-89ba-882116d1424d, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=f53d0ca7-d346-4e78-831f-33d2d4b1ec75) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.243 144381 INFO neutron.agent.ovn.metadata.agent [-] Port f53d0ca7-d346-4e78-831f-33d2d4b1ec75 in datapath 3c924c14-6a63-4f18-9be4-d068c6154900 bound to our chassis#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.245 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3c924c14-6a63-4f18-9be4-d068c6154900#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.257 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6eb08877-2dc9-48b5-a24f-9782f437690c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.258 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3c924c14-61 in ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.260 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3c924c14-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.260 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4c3ac7e8-be2a-4884-b537-46ea7ebc5fa8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.261 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5aa378a4-eebe-424f-b627-7aaeabe86c64]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.277 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[ef2668f9-0b96-48c0-9adb-9e5f2087c96c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.305 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb243ec-1803-4f04-9ad8-7a512b1920e4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.338 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[256553ab-327f-45dd-99ae-33e2f5c5c7d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.344 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c4d85e-1713-412e-8154-7865acbcf171]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 NetworkManager[48942]: <info>  [1769161580.3458] manager: (tap3c924c14-60): new Veth device (/org/freedesktop/NetworkManager/Devices/90)
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.373 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7445a4f4-1dc2-4859-839f-f21ebdbd03c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.375 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9e9681e0-d883-4911-8060-df863f4f38cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 NetworkManager[48942]: <info>  [1769161580.3963] device (tap3c924c14-60): carrier: link connected
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.402 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e00af0a3-cb5e-4375-9620-9c9fa33dc8dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.418 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[81c62424-78e1-4df7-8ed3-c2a171fe1034]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c924c14-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:d0:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547535, 'reachable_time': 19057, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254209, 'error': None, 'target': 'ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.432 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[92857365-a02b-4179-b948-79e8f283c0b2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe41:d010'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 547535, 'tstamp': 547535}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 254210, 'error': None, 'target': 'ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.447 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3b241f95-457d-4de3-92e7-ce7909ac8623]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c924c14-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:41:d0:10'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547535, 'reachable_time': 19057, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 254211, 'error': None, 'target': 'ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.474 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5f7ee2e0-6c15-4cc8-970e-5b471ab15b72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.530 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d7e55000-3af7-4229-96e7-14207a1522a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.532 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c924c14-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.532 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.533 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c924c14-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:20 np0005593234 nova_compute[227762]: 2026-01-23 09:46:20.535 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:20 np0005593234 NetworkManager[48942]: <info>  [1769161580.5359] manager: (tap3c924c14-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/91)
Jan 23 04:46:20 np0005593234 kernel: tap3c924c14-60: entered promiscuous mode
Jan 23 04:46:20 np0005593234 nova_compute[227762]: 2026-01-23 09:46:20.537 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.538 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3c924c14-60, col_values=(('external_ids', {'iface-id': 'ac03bb92-c2b9-42f2-8262-31c869e0dff8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:20 np0005593234 nova_compute[227762]: 2026-01-23 09:46:20.538 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:20 np0005593234 ovn_controller[134547]: 2026-01-23T09:46:20Z|00162|binding|INFO|Releasing lport ac03bb92-c2b9-42f2-8262-31c869e0dff8 from this chassis (sb_readonly=0)
Jan 23 04:46:20 np0005593234 nova_compute[227762]: 2026-01-23 09:46:20.539 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.540 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3c924c14-6a63-4f18-9be4-d068c6154900.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3c924c14-6a63-4f18-9be4-d068c6154900.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.541 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0cf58272-eeef-4f59-8f35-c68dd2ef83db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.542 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-3c924c14-6a63-4f18-9be4-d068c6154900
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/3c924c14-6a63-4f18-9be4-d068c6154900.pid.haproxy
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 3c924c14-6a63-4f18-9be4-d068c6154900
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:46:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:20.543 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900', 'env', 'PROCESS_TAG=haproxy-3c924c14-6a63-4f18-9be4-d068c6154900', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3c924c14-6a63-4f18-9be4-d068c6154900.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:46:20 np0005593234 nova_compute[227762]: 2026-01-23 09:46:20.554 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:20 np0005593234 nova_compute[227762]: 2026-01-23 09:46:20.960 227766 DEBUG oslo_concurrency.lockutils [req-f40dd6a6-94c3-4f70-9b06-d08373452820 req-ec1d8ee1-22d6-4087-9f4d-b7cbd8650970 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-4312368d-83d8-4d95-98a1-18bc91f966ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:46:20 np0005593234 podman[254244]: 2026-01-23 09:46:20.874081441 +0000 UTC m=+0.022799891 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:46:20 np0005593234 podman[254244]: 2026-01-23 09:46:20.976430073 +0000 UTC m=+0.125148513 container create 43bfffa9bc558cb93fbe120947b6aa9fdb3354cb22efc6abfad934359d9dec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.021 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:46:21 np0005593234 systemd[1]: Started libpod-conmon-43bfffa9bc558cb93fbe120947b6aa9fdb3354cb22efc6abfad934359d9dec8f.scope.
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.038 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.042 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161579.710082, 4312368d-83d8-4d95-98a1-18bc91f966ce => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.043 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:46:21 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:46:21 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f47f610d964c686c647541ad555c86a842be4c2067a66e43b6bfc6d981366069/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:46:21 np0005593234 podman[254244]: 2026-01-23 09:46:21.072776536 +0000 UTC m=+0.221494996 container init 43bfffa9bc558cb93fbe120947b6aa9fdb3354cb22efc6abfad934359d9dec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 04:46:21 np0005593234 podman[254244]: 2026-01-23 09:46:21.078638119 +0000 UTC m=+0.227356559 container start 43bfffa9bc558cb93fbe120947b6aa9fdb3354cb22efc6abfad934359d9dec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:46:21 np0005593234 neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900[254259]: [NOTICE]   (254263) : New worker (254265) forked
Jan 23 04:46:21 np0005593234 neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900[254259]: [NOTICE]   (254263) : Loading success.
Jan 23 04:46:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:21.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.133 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.134 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.147 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.150 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.227 227766 DEBUG nova.compute.manager [req-05e427e7-ab60-44e7-8913-c740633382fc req-f1f2738a-44cf-422a-966a-58fa81b6a8d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Received event network-vif-plugged-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.228 227766 DEBUG oslo_concurrency.lockutils [req-05e427e7-ab60-44e7-8913-c740633382fc req-f1f2738a-44cf-422a-966a-58fa81b6a8d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.228 227766 DEBUG oslo_concurrency.lockutils [req-05e427e7-ab60-44e7-8913-c740633382fc req-f1f2738a-44cf-422a-966a-58fa81b6a8d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.228 227766 DEBUG oslo_concurrency.lockutils [req-05e427e7-ab60-44e7-8913-c740633382fc req-f1f2738a-44cf-422a-966a-58fa81b6a8d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.229 227766 DEBUG nova.compute.manager [req-05e427e7-ab60-44e7-8913-c740633382fc req-f1f2738a-44cf-422a-966a-58fa81b6a8d1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Processing event network-vif-plugged-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.229 227766 DEBUG nova.compute.manager [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.233 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.236 227766 INFO nova.virt.libvirt.driver [-] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Instance spawned successfully.#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.237 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.267 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.268 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161581.2326748, 4312368d-83d8-4d95-98a1-18bc91f966ce => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.269 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.282 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.283 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.283 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.284 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.284 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.285 227766 DEBUG nova.virt.libvirt.driver [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:46:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:21.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.337 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.340 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.391 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.481 227766 INFO nova.compute.manager [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Took 29.58 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:46:21 np0005593234 nova_compute[227762]: 2026-01-23 09:46:21.528 227766 DEBUG nova.compute.manager [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:46:22 np0005593234 nova_compute[227762]: 2026-01-23 09:46:22.134 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:22 np0005593234 nova_compute[227762]: 2026-01-23 09:46:22.135 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:22 np0005593234 nova_compute[227762]: 2026-01-23 09:46:22.136 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:46:22 np0005593234 nova_compute[227762]: 2026-01-23 09:46:22.136 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:46:22 np0005593234 nova_compute[227762]: 2026-01-23 09:46:22.412 227766 INFO nova.compute.manager [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Took 31.67 seconds to build instance.#033[00m
Jan 23 04:46:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:23.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:46:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:23.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:46:24 np0005593234 nova_compute[227762]: 2026-01-23 09:46:24.425 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:24 np0005593234 nova_compute[227762]: 2026-01-23 09:46:24.512 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:24 np0005593234 nova_compute[227762]: 2026-01-23 09:46:24.762 227766 DEBUG nova.compute.manager [req-e6758064-f194-4354-a543-09bb767e074b req-f787c781-8924-4064-af52-d25260aac1fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Received event network-vif-plugged-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:46:24 np0005593234 nova_compute[227762]: 2026-01-23 09:46:24.763 227766 DEBUG oslo_concurrency.lockutils [req-e6758064-f194-4354-a543-09bb767e074b req-f787c781-8924-4064-af52-d25260aac1fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:24 np0005593234 nova_compute[227762]: 2026-01-23 09:46:24.763 227766 DEBUG oslo_concurrency.lockutils [req-e6758064-f194-4354-a543-09bb767e074b req-f787c781-8924-4064-af52-d25260aac1fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:24 np0005593234 nova_compute[227762]: 2026-01-23 09:46:24.763 227766 DEBUG oslo_concurrency.lockutils [req-e6758064-f194-4354-a543-09bb767e074b req-f787c781-8924-4064-af52-d25260aac1fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:24 np0005593234 nova_compute[227762]: 2026-01-23 09:46:24.764 227766 DEBUG nova.compute.manager [req-e6758064-f194-4354-a543-09bb767e074b req-f787c781-8924-4064-af52-d25260aac1fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] No waiting events found dispatching network-vif-plugged-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:46:24 np0005593234 nova_compute[227762]: 2026-01-23 09:46:24.764 227766 WARNING nova.compute.manager [req-e6758064-f194-4354-a543-09bb767e074b req-f787c781-8924-4064-af52-d25260aac1fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Received unexpected event network-vif-plugged-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:46:24 np0005593234 nova_compute[227762]: 2026-01-23 09:46:24.774 227766 DEBUG oslo_concurrency.lockutils [None req-87dcd864-ee16-4e8c-b489-aad6630d2b14 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 34.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:46:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:25.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:46:25 np0005593234 nova_compute[227762]: 2026-01-23 09:46:25.301 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-4312368d-83d8-4d95-98a1-18bc91f966ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:46:25 np0005593234 nova_compute[227762]: 2026-01-23 09:46:25.301 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-4312368d-83d8-4d95-98a1-18bc91f966ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:46:25 np0005593234 nova_compute[227762]: 2026-01-23 09:46:25.302 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:46:25 np0005593234 nova_compute[227762]: 2026-01-23 09:46:25.302 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4312368d-83d8-4d95-98a1-18bc91f966ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:46:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:46:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:25.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:46:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:27.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:46:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:27.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:46:28 np0005593234 podman[254278]: 2026-01-23 09:46:28.76868329 +0000 UTC m=+0.056331229 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:46:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:29.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:46:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:29.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:46:29 np0005593234 nova_compute[227762]: 2026-01-23 09:46:29.428 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:29 np0005593234 nova_compute[227762]: 2026-01-23 09:46:29.513 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:30 np0005593234 NetworkManager[48942]: <info>  [1769161590.1698] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Jan 23 04:46:30 np0005593234 NetworkManager[48942]: <info>  [1769161590.1708] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Jan 23 04:46:30 np0005593234 nova_compute[227762]: 2026-01-23 09:46:30.169 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:30 np0005593234 nova_compute[227762]: 2026-01-23 09:46:30.254 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:30 np0005593234 ovn_controller[134547]: 2026-01-23T09:46:30Z|00163|binding|INFO|Releasing lport ac03bb92-c2b9-42f2-8262-31c869e0dff8 from this chassis (sb_readonly=0)
Jan 23 04:46:30 np0005593234 nova_compute[227762]: 2026-01-23 09:46:30.266 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:30 np0005593234 nova_compute[227762]: 2026-01-23 09:46:30.290 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:30 np0005593234 nova_compute[227762]: 2026-01-23 09:46:30.777 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Updating instance_info_cache with network_info: [{"id": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "address": "fa:16:3e:d7:f8:1b", "network": {"id": "3c924c14-6a63-4f18-9be4-d068c6154900", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691052156-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb5dcf0180e448f3971332f4d58c11f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d0ca7-d3", "ovs_interfaceid": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:46:30 np0005593234 nova_compute[227762]: 2026-01-23 09:46:30.965 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-4312368d-83d8-4d95-98a1-18bc91f966ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:46:30 np0005593234 nova_compute[227762]: 2026-01-23 09:46:30.965 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:46:30 np0005593234 nova_compute[227762]: 2026-01-23 09:46:30.965 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:30 np0005593234 nova_compute[227762]: 2026-01-23 09:46:30.965 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:30 np0005593234 nova_compute[227762]: 2026-01-23 09:46:30.966 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:30 np0005593234 nova_compute[227762]: 2026-01-23 09:46:30.966 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:30 np0005593234 nova_compute[227762]: 2026-01-23 09:46:30.966 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:46:30 np0005593234 nova_compute[227762]: 2026-01-23 09:46:30.966 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:46:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:31.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:46:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:31.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:46:31 np0005593234 nova_compute[227762]: 2026-01-23 09:46:31.563 227766 DEBUG nova.compute.manager [req-543aa89a-97b0-47da-bb92-304fdbfe58cc req-53736548-fbbd-4f62-8a9c-7542826f42cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Received event network-changed-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:46:31 np0005593234 nova_compute[227762]: 2026-01-23 09:46:31.563 227766 DEBUG nova.compute.manager [req-543aa89a-97b0-47da-bb92-304fdbfe58cc req-53736548-fbbd-4f62-8a9c-7542826f42cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Refreshing instance network info cache due to event network-changed-f53d0ca7-d346-4e78-831f-33d2d4b1ec75. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:46:31 np0005593234 nova_compute[227762]: 2026-01-23 09:46:31.564 227766 DEBUG oslo_concurrency.lockutils [req-543aa89a-97b0-47da-bb92-304fdbfe58cc req-53736548-fbbd-4f62-8a9c-7542826f42cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-4312368d-83d8-4d95-98a1-18bc91f966ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:46:31 np0005593234 nova_compute[227762]: 2026-01-23 09:46:31.564 227766 DEBUG oslo_concurrency.lockutils [req-543aa89a-97b0-47da-bb92-304fdbfe58cc req-53736548-fbbd-4f62-8a9c-7542826f42cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-4312368d-83d8-4d95-98a1-18bc91f966ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:46:31 np0005593234 nova_compute[227762]: 2026-01-23 09:46:31.564 227766 DEBUG nova.network.neutron [req-543aa89a-97b0-47da-bb92-304fdbfe58cc req-53736548-fbbd-4f62-8a9c-7542826f42cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Refreshing network info cache for port f53d0ca7-d346-4e78-831f-33d2d4b1ec75 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.613952) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161592613997, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 444, "num_deletes": 251, "total_data_size": 478353, "memory_usage": 487128, "flush_reason": "Manual Compaction"}
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161592623670, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 315025, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38350, "largest_seqno": 38789, "table_properties": {"data_size": 312605, "index_size": 520, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6165, "raw_average_key_size": 18, "raw_value_size": 307724, "raw_average_value_size": 943, "num_data_blocks": 23, "num_entries": 326, "num_filter_entries": 326, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161574, "oldest_key_time": 1769161574, "file_creation_time": 1769161592, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 9745 microseconds, and 1811 cpu microseconds.
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.623700) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 315025 bytes OK
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.623716) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.633359) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.633397) EVENT_LOG_v1 {"time_micros": 1769161592633388, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.633419) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 475570, prev total WAL file size 475570, number of live WAL files 2.
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.633891) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(307KB)], [72(10MB)]
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161592633920, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 11826175, "oldest_snapshot_seqno": -1}
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6065 keys, 10005873 bytes, temperature: kUnknown
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161592702277, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 10005873, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9964554, "index_size": 25081, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15173, "raw_key_size": 156635, "raw_average_key_size": 25, "raw_value_size": 9854738, "raw_average_value_size": 1624, "num_data_blocks": 1003, "num_entries": 6065, "num_filter_entries": 6065, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769161592, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.702543) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 10005873 bytes
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.704937) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 172.7 rd, 146.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 11.0 +0.0 blob) out(9.5 +0.0 blob), read-write-amplify(69.3) write-amplify(31.8) OK, records in: 6575, records dropped: 510 output_compression: NoCompression
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.704959) EVENT_LOG_v1 {"time_micros": 1769161592704948, "job": 44, "event": "compaction_finished", "compaction_time_micros": 68468, "compaction_time_cpu_micros": 24319, "output_level": 6, "num_output_files": 1, "total_output_size": 10005873, "num_input_records": 6575, "num_output_records": 6065, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161592705136, "job": 44, "event": "table_file_deletion", "file_number": 74}
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161592706953, "job": 44, "event": "table_file_deletion", "file_number": 72}
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.633853) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.706984) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.706988) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.706990) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.706991) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:32 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:46:32.706993) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:46:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:33.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:46:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:33.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:46:34 np0005593234 nova_compute[227762]: 2026-01-23 09:46:34.430 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:34 np0005593234 nova_compute[227762]: 2026-01-23 09:46:34.515 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e218 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:35.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:46:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:35.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:46:35 np0005593234 ovn_controller[134547]: 2026-01-23T09:46:35Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d7:f8:1b 10.100.0.13
Jan 23 04:46:35 np0005593234 ovn_controller[134547]: 2026-01-23T09:46:35Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d7:f8:1b 10.100.0.13
Jan 23 04:46:35 np0005593234 nova_compute[227762]: 2026-01-23 09:46:35.648 227766 DEBUG nova.network.neutron [req-543aa89a-97b0-47da-bb92-304fdbfe58cc req-53736548-fbbd-4f62-8a9c-7542826f42cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Updated VIF entry in instance network info cache for port f53d0ca7-d346-4e78-831f-33d2d4b1ec75. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:46:35 np0005593234 nova_compute[227762]: 2026-01-23 09:46:35.649 227766 DEBUG nova.network.neutron [req-543aa89a-97b0-47da-bb92-304fdbfe58cc req-53736548-fbbd-4f62-8a9c-7542826f42cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Updating instance_info_cache with network_info: [{"id": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "address": "fa:16:3e:d7:f8:1b", "network": {"id": "3c924c14-6a63-4f18-9be4-d068c6154900", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691052156-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb5dcf0180e448f3971332f4d58c11f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d0ca7-d3", "ovs_interfaceid": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:46:35 np0005593234 nova_compute[227762]: 2026-01-23 09:46:35.744 227766 DEBUG oslo_concurrency.lockutils [req-543aa89a-97b0-47da-bb92-304fdbfe58cc req-53736548-fbbd-4f62-8a9c-7542826f42cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-4312368d-83d8-4d95-98a1-18bc91f966ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:46:36 np0005593234 nova_compute[227762]: 2026-01-23 09:46:36.546 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:36.545 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:46:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:36.547 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:46:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e219 e219: 3 total, 3 up, 3 in
Jan 23 04:46:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:37.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:46:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:37.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:46:37 np0005593234 nova_compute[227762]: 2026-01-23 09:46:37.711 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e220 e220: 3 total, 3 up, 3 in
Jan 23 04:46:38 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #47. Immutable memtables: 4.
Jan 23 04:46:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e221 e221: 3 total, 3 up, 3 in
Jan 23 04:46:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:39.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:39.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:39 np0005593234 nova_compute[227762]: 2026-01-23 09:46:39.479 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:39 np0005593234 nova_compute[227762]: 2026-01-23 09:46:39.517 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:41.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:41.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:42 np0005593234 podman[254354]: 2026-01-23 09:46:42.800691162 +0000 UTC m=+0.090554263 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:46:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:42.821 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:42.822 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:42.822 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:43.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:43.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e222 e222: 3 total, 3 up, 3 in
Jan 23 04:46:44 np0005593234 nova_compute[227762]: 2026-01-23 09:46:44.482 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:44 np0005593234 nova_compute[227762]: 2026-01-23 09:46:44.519 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:46:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2416432550' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:46:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:46:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2416432550' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:46:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e222 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:45.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:45.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:46.591 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:47.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:47.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e223 e223: 3 total, 3 up, 3 in
Jan 23 04:46:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:49.100 144742 DEBUG eventlet.wsgi.server [-] (144742) accepted '' server /usr/lib/python3.9/site-packages/eventlet/wsgi.py:1004#033[00m
Jan 23 04:46:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:49.103 144742 DEBUG neutron.agent.ovn.metadata.server [-] Request: GET /openstack/latest/meta_data.json HTTP/1.0#015
Jan 23 04:46:49 np0005593234 ovn_metadata_agent[144376]: Accept: */*#015
Jan 23 04:46:49 np0005593234 ovn_metadata_agent[144376]: Connection: close#015
Jan 23 04:46:49 np0005593234 ovn_metadata_agent[144376]: Content-Type: text/plain#015
Jan 23 04:46:49 np0005593234 ovn_metadata_agent[144376]: Host: 169.254.169.254#015
Jan 23 04:46:49 np0005593234 ovn_metadata_agent[144376]: User-Agent: curl/7.84.0#015
Jan 23 04:46:49 np0005593234 ovn_metadata_agent[144376]: X-Forwarded-For: 10.100.0.13#015
Jan 23 04:46:49 np0005593234 ovn_metadata_agent[144376]: X-Ovn-Network-Id: 3c924c14-6a63-4f18-9be4-d068c6154900 __call__ /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:82#033[00m
Jan 23 04:46:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:49.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:49.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:49 np0005593234 nova_compute[227762]: 2026-01-23 09:46:49.521 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:46:49 np0005593234 nova_compute[227762]: 2026-01-23 09:46:49.523 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:46:49 np0005593234 nova_compute[227762]: 2026-01-23 09:46:49.523 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 04:46:49 np0005593234 nova_compute[227762]: 2026-01-23 09:46:49.523 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 04:46:49 np0005593234 nova_compute[227762]: 2026-01-23 09:46:49.538 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:49 np0005593234 nova_compute[227762]: 2026-01-23 09:46:49.539 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 04:46:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e224 e224: 3 total, 3 up, 3 in
Jan 23 04:46:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e224 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e225 e225: 3 total, 3 up, 3 in
Jan 23 04:46:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:50.834 144742 DEBUG neutron.agent.ovn.metadata.server [-] <Response [200]> _proxy_request /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/server.py:161#033[00m
Jan 23 04:46:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:50.834 144742 INFO eventlet.wsgi.server [-] 10.100.0.13,<local> "GET /openstack/latest/meta_data.json HTTP/1.1" status: 200  len: 1673 time: 1.7318549#033[00m
Jan 23 04:46:50 np0005593234 haproxy-metadata-proxy-3c924c14-6a63-4f18-9be4-d068c6154900[254265]: 10.100.0.13:51298 [23/Jan/2026:09:46:49.098] listener listener/metadata 0/0/0/1735/1735 200 1657 - - ---- 1/1/0/0/0 0/0 "GET /openstack/latest/meta_data.json HTTP/1.1"
Jan 23 04:46:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:51.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.289 227766 DEBUG oslo_concurrency.lockutils [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Acquiring lock "4312368d-83d8-4d95-98a1-18bc91f966ce" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.290 227766 DEBUG oslo_concurrency.lockutils [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.290 227766 DEBUG oslo_concurrency.lockutils [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Acquiring lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.290 227766 DEBUG oslo_concurrency.lockutils [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.291 227766 DEBUG oslo_concurrency.lockutils [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.292 227766 INFO nova.compute.manager [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Terminating instance#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.292 227766 DEBUG nova.compute.manager [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:46:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:51.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:51 np0005593234 kernel: tapf53d0ca7-d3 (unregistering): left promiscuous mode
Jan 23 04:46:51 np0005593234 NetworkManager[48942]: <info>  [1769161611.3686] device (tapf53d0ca7-d3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:46:51 np0005593234 ovn_controller[134547]: 2026-01-23T09:46:51Z|00164|binding|INFO|Releasing lport f53d0ca7-d346-4e78-831f-33d2d4b1ec75 from this chassis (sb_readonly=0)
Jan 23 04:46:51 np0005593234 ovn_controller[134547]: 2026-01-23T09:46:51Z|00165|binding|INFO|Setting lport f53d0ca7-d346-4e78-831f-33d2d4b1ec75 down in Southbound
Jan 23 04:46:51 np0005593234 ovn_controller[134547]: 2026-01-23T09:46:51Z|00166|binding|INFO|Removing iface tapf53d0ca7-d3 ovn-installed in OVS
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.374 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.376 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.395 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:51.398 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d7:f8:1b 10.100.0.13'], port_security=['fa:16:3e:d7:f8:1b 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '4312368d-83d8-4d95-98a1-18bc91f966ce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c924c14-6a63-4f18-9be4-d068c6154900', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'bb5dcf0180e448f3971332f4d58c11f9', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3b690f9b-0d4f-4f60-98e0-a388f02a35fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.194'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9a4e9f06-27dd-4315-89ba-882116d1424d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=f53d0ca7-d346-4e78-831f-33d2d4b1ec75) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:46:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:51.399 144381 INFO neutron.agent.ovn.metadata.agent [-] Port f53d0ca7-d346-4e78-831f-33d2d4b1ec75 in datapath 3c924c14-6a63-4f18-9be4-d068c6154900 unbound from our chassis#033[00m
Jan 23 04:46:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:51.400 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3c924c14-6a63-4f18-9be4-d068c6154900, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:46:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:51.401 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[98798964-ae7f-4ce6-b2e5-06debc842e91]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:51.402 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900 namespace which is not needed anymore#033[00m
Jan 23 04:46:51 np0005593234 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000039.scope: Deactivated successfully.
Jan 23 04:46:51 np0005593234 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000039.scope: Consumed 14.102s CPU time.
Jan 23 04:46:51 np0005593234 systemd-machined[195626]: Machine qemu-25-instance-00000039 terminated.
Jan 23 04:46:51 np0005593234 ovn_controller[134547]: 2026-01-23T09:46:51Z|00167|binding|INFO|Releasing lport ac03bb92-c2b9-42f2-8262-31c869e0dff8 from this chassis (sb_readonly=0)
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.435 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:51 np0005593234 neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900[254259]: [NOTICE]   (254263) : haproxy version is 2.8.14-c23fe91
Jan 23 04:46:51 np0005593234 neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900[254259]: [NOTICE]   (254263) : path to executable is /usr/sbin/haproxy
Jan 23 04:46:51 np0005593234 neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900[254259]: [WARNING]  (254263) : Exiting Master process...
Jan 23 04:46:51 np0005593234 neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900[254259]: [WARNING]  (254263) : Exiting Master process...
Jan 23 04:46:51 np0005593234 neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900[254259]: [ALERT]    (254263) : Current worker (254265) exited with code 143 (Terminated)
Jan 23 04:46:51 np0005593234 neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900[254259]: [WARNING]  (254263) : All workers exited. Exiting... (0)
Jan 23 04:46:51 np0005593234 systemd[1]: libpod-43bfffa9bc558cb93fbe120947b6aa9fdb3354cb22efc6abfad934359d9dec8f.scope: Deactivated successfully.
Jan 23 04:46:51 np0005593234 podman[254409]: 2026-01-23 09:46:51.536762379 +0000 UTC m=+0.046462860 container died 43bfffa9bc558cb93fbe120947b6aa9fdb3354cb22efc6abfad934359d9dec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.540 227766 INFO nova.virt.libvirt.driver [-] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Instance destroyed successfully.#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.543 227766 DEBUG nova.objects.instance [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lazy-loading 'resources' on Instance uuid 4312368d-83d8-4d95-98a1-18bc91f966ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:46:51 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-43bfffa9bc558cb93fbe120947b6aa9fdb3354cb22efc6abfad934359d9dec8f-userdata-shm.mount: Deactivated successfully.
Jan 23 04:46:51 np0005593234 systemd[1]: var-lib-containers-storage-overlay-f47f610d964c686c647541ad555c86a842be4c2067a66e43b6bfc6d981366069-merged.mount: Deactivated successfully.
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.578 227766 DEBUG nova.virt.libvirt.vif [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=1.1.1.1,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:45:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='guest-instance-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx-guest-test.domaintest.com',id=57,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFmiYxtuVv6oSYUp9Qbidd/GX1flYjM4Z3DDAWV39PaoPurD8XRzELTjykQToXYbOq/Z+fKKM6KxeYr3Xqkk4Uuft9sDZA+ttBS2CbmK3386CyurHb3d/o1d00leJXKUww==',key_name='tempest-keypair-2142523924',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:46:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='bb5dcf0180e448f3971332f4d58c11f9',ramdisk_id='',reservation_id='r-iqwb588k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersV294TestFqdnHostnames-718255531',owner_user_name='tempest-ServersV294TestFqdnHostnames-718255531-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:46:21Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='69cd789fe82e4fb6a1a6fc06333e456b',uuid=4312368d-83d8-4d95-98a1-18bc91f966ce,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "address": "fa:16:3e:d7:f8:1b", "network": {"id": "3c924c14-6a63-4f18-9be4-d068c6154900", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691052156-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb5dcf0180e448f3971332f4d58c11f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d0ca7-d3", "ovs_interfaceid": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.578 227766 DEBUG nova.network.os_vif_util [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Converting VIF {"id": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "address": "fa:16:3e:d7:f8:1b", "network": {"id": "3c924c14-6a63-4f18-9be4-d068c6154900", "bridge": "br-int", "label": "tempest-ServersV294TestFqdnHostnames-1691052156-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "bb5dcf0180e448f3971332f4d58c11f9", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf53d0ca7-d3", "ovs_interfaceid": "f53d0ca7-d346-4e78-831f-33d2d4b1ec75", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.579 227766 DEBUG nova.network.os_vif_util [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d7:f8:1b,bridge_name='br-int',has_traffic_filtering=True,id=f53d0ca7-d346-4e78-831f-33d2d4b1ec75,network=Network(3c924c14-6a63-4f18-9be4-d068c6154900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf53d0ca7-d3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.579 227766 DEBUG os_vif [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:f8:1b,bridge_name='br-int',has_traffic_filtering=True,id=f53d0ca7-d346-4e78-831f-33d2d4b1ec75,network=Network(3c924c14-6a63-4f18-9be4-d068c6154900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf53d0ca7-d3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.581 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.581 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf53d0ca7-d3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:51 np0005593234 podman[254409]: 2026-01-23 09:46:51.582298508 +0000 UTC m=+0.091998969 container cleanup 43bfffa9bc558cb93fbe120947b6aa9fdb3354cb22efc6abfad934359d9dec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.584 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:46:51 np0005593234 systemd[1]: libpod-conmon-43bfffa9bc558cb93fbe120947b6aa9fdb3354cb22efc6abfad934359d9dec8f.scope: Deactivated successfully.
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.596 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.599 227766 INFO os_vif [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d7:f8:1b,bridge_name='br-int',has_traffic_filtering=True,id=f53d0ca7-d346-4e78-831f-33d2d4b1ec75,network=Network(3c924c14-6a63-4f18-9be4-d068c6154900),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf53d0ca7-d3')#033[00m
Jan 23 04:46:51 np0005593234 ovn_controller[134547]: 2026-01-23T09:46:51Z|00168|binding|INFO|Releasing lport ac03bb92-c2b9-42f2-8262-31c869e0dff8 from this chassis (sb_readonly=0)
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.618 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:51 np0005593234 podman[254446]: 2026-01-23 09:46:51.642424295 +0000 UTC m=+0.037450667 container remove 43bfffa9bc558cb93fbe120947b6aa9fdb3354cb22efc6abfad934359d9dec8f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 04:46:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:51.647 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fc499564-d132-4efb-8685-8b918d2ffc4e]: (4, ('Fri Jan 23 09:46:51 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900 (43bfffa9bc558cb93fbe120947b6aa9fdb3354cb22efc6abfad934359d9dec8f)\n43bfffa9bc558cb93fbe120947b6aa9fdb3354cb22efc6abfad934359d9dec8f\nFri Jan 23 09:46:51 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900 (43bfffa9bc558cb93fbe120947b6aa9fdb3354cb22efc6abfad934359d9dec8f)\n43bfffa9bc558cb93fbe120947b6aa9fdb3354cb22efc6abfad934359d9dec8f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:51.649 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b39912af-33de-40f7-acbb-9c472121fa09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:51.650 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c924c14-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:46:51 np0005593234 kernel: tap3c924c14-60: left promiscuous mode
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.654 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:51 np0005593234 nova_compute[227762]: 2026-01-23 09:46:51.666 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:51.669 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f755930c-e0f9-41ad-97de-77c5b5097647]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:51.684 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cfe7f843-52e2-4589-9a33-b15f21d0b343]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:51.686 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[df0e5747-5b8c-4a5e-8eca-dbbbf8158d6e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:51.701 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d81741db-c5cd-43c3-b456-2b0caa0cd2d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 547529, 'reachable_time': 34081, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 254477, 'error': None, 'target': 'ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:51.703 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3c924c14-6a63-4f18-9be4-d068c6154900 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:46:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:46:51.703 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[ca102a8f-8816-42c7-b857-48415ad3e570]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:46:51 np0005593234 systemd[1]: run-netns-ovnmeta\x2d3c924c14\x2d6a63\x2d4f18\x2d9be4\x2dd068c6154900.mount: Deactivated successfully.
Jan 23 04:46:52 np0005593234 nova_compute[227762]: 2026-01-23 09:46:52.087 227766 INFO nova.virt.libvirt.driver [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Deleting instance files /var/lib/nova/instances/4312368d-83d8-4d95-98a1-18bc91f966ce_del#033[00m
Jan 23 04:46:52 np0005593234 nova_compute[227762]: 2026-01-23 09:46:52.088 227766 INFO nova.virt.libvirt.driver [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Deletion of /var/lib/nova/instances/4312368d-83d8-4d95-98a1-18bc91f966ce_del complete#033[00m
Jan 23 04:46:52 np0005593234 nova_compute[227762]: 2026-01-23 09:46:52.176 227766 DEBUG nova.compute.manager [req-deffd58f-a72f-4f55-95cf-78f4f964fe22 req-1a4667e0-4cff-4907-ab2d-55c89897d4a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Received event network-vif-unplugged-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:46:52 np0005593234 nova_compute[227762]: 2026-01-23 09:46:52.176 227766 DEBUG oslo_concurrency.lockutils [req-deffd58f-a72f-4f55-95cf-78f4f964fe22 req-1a4667e0-4cff-4907-ab2d-55c89897d4a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:52 np0005593234 nova_compute[227762]: 2026-01-23 09:46:52.176 227766 DEBUG oslo_concurrency.lockutils [req-deffd58f-a72f-4f55-95cf-78f4f964fe22 req-1a4667e0-4cff-4907-ab2d-55c89897d4a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:52 np0005593234 nova_compute[227762]: 2026-01-23 09:46:52.177 227766 DEBUG oslo_concurrency.lockutils [req-deffd58f-a72f-4f55-95cf-78f4f964fe22 req-1a4667e0-4cff-4907-ab2d-55c89897d4a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:52 np0005593234 nova_compute[227762]: 2026-01-23 09:46:52.177 227766 DEBUG nova.compute.manager [req-deffd58f-a72f-4f55-95cf-78f4f964fe22 req-1a4667e0-4cff-4907-ab2d-55c89897d4a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] No waiting events found dispatching network-vif-unplugged-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:46:52 np0005593234 nova_compute[227762]: 2026-01-23 09:46:52.177 227766 DEBUG nova.compute.manager [req-deffd58f-a72f-4f55-95cf-78f4f964fe22 req-1a4667e0-4cff-4907-ab2d-55c89897d4a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Received event network-vif-unplugged-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:46:52 np0005593234 nova_compute[227762]: 2026-01-23 09:46:52.210 227766 INFO nova.compute.manager [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Took 0.92 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:46:52 np0005593234 nova_compute[227762]: 2026-01-23 09:46:52.211 227766 DEBUG oslo.service.loopingcall [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:46:52 np0005593234 nova_compute[227762]: 2026-01-23 09:46:52.211 227766 DEBUG nova.compute.manager [-] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:46:52 np0005593234 nova_compute[227762]: 2026-01-23 09:46:52.211 227766 DEBUG nova.network.neutron [-] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:46:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:53.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:53.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:54 np0005593234 nova_compute[227762]: 2026-01-23 09:46:54.545 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:54 np0005593234 nova_compute[227762]: 2026-01-23 09:46:54.859 227766 DEBUG nova.compute.manager [req-e8a6b34a-4488-41a7-8cc9-7418a81d457d req-28a8309d-2876-4ae3-b9cc-2d51078586fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Received event network-vif-plugged-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:46:54 np0005593234 nova_compute[227762]: 2026-01-23 09:46:54.860 227766 DEBUG oslo_concurrency.lockutils [req-e8a6b34a-4488-41a7-8cc9-7418a81d457d req-28a8309d-2876-4ae3-b9cc-2d51078586fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:54 np0005593234 nova_compute[227762]: 2026-01-23 09:46:54.860 227766 DEBUG oslo_concurrency.lockutils [req-e8a6b34a-4488-41a7-8cc9-7418a81d457d req-28a8309d-2876-4ae3-b9cc-2d51078586fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:54 np0005593234 nova_compute[227762]: 2026-01-23 09:46:54.861 227766 DEBUG oslo_concurrency.lockutils [req-e8a6b34a-4488-41a7-8cc9-7418a81d457d req-28a8309d-2876-4ae3-b9cc-2d51078586fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:54 np0005593234 nova_compute[227762]: 2026-01-23 09:46:54.861 227766 DEBUG nova.compute.manager [req-e8a6b34a-4488-41a7-8cc9-7418a81d457d req-28a8309d-2876-4ae3-b9cc-2d51078586fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] No waiting events found dispatching network-vif-plugged-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:46:54 np0005593234 nova_compute[227762]: 2026-01-23 09:46:54.861 227766 WARNING nova.compute.manager [req-e8a6b34a-4488-41a7-8cc9-7418a81d457d req-28a8309d-2876-4ae3-b9cc-2d51078586fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Received unexpected event network-vif-plugged-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:46:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e225 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:46:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e226 e226: 3 total, 3 up, 3 in
Jan 23 04:46:55 np0005593234 nova_compute[227762]: 2026-01-23 09:46:55.008 227766 DEBUG nova.network.neutron [-] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:46:55 np0005593234 nova_compute[227762]: 2026-01-23 09:46:55.043 227766 INFO nova.compute.manager [-] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Took 2.83 seconds to deallocate network for instance.#033[00m
Jan 23 04:46:55 np0005593234 nova_compute[227762]: 2026-01-23 09:46:55.153 227766 DEBUG oslo_concurrency.lockutils [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:46:55 np0005593234 nova_compute[227762]: 2026-01-23 09:46:55.154 227766 DEBUG oslo_concurrency.lockutils [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:46:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:46:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:55.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:46:55 np0005593234 nova_compute[227762]: 2026-01-23 09:46:55.235 227766 DEBUG oslo_concurrency.processutils [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:46:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:55.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:46:55 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2101524369' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:46:55 np0005593234 nova_compute[227762]: 2026-01-23 09:46:55.646 227766 DEBUG oslo_concurrency.processutils [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:46:55 np0005593234 nova_compute[227762]: 2026-01-23 09:46:55.652 227766 DEBUG nova.compute.provider_tree [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:46:55 np0005593234 nova_compute[227762]: 2026-01-23 09:46:55.810 227766 DEBUG nova.scheduler.client.report [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:46:55 np0005593234 nova_compute[227762]: 2026-01-23 09:46:55.829 227766 DEBUG nova.compute.manager [req-fadfd249-e96c-410b-bb57-995fbace356e req-2fec22aa-5b5d-46a9-b7e7-684775395487 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Received event network-vif-deleted-f53d0ca7-d346-4e78-831f-33d2d4b1ec75 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:46:55 np0005593234 nova_compute[227762]: 2026-01-23 09:46:55.855 227766 DEBUG oslo_concurrency.lockutils [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:55 np0005593234 nova_compute[227762]: 2026-01-23 09:46:55.949 227766 INFO nova.scheduler.client.report [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Deleted allocations for instance 4312368d-83d8-4d95-98a1-18bc91f966ce#033[00m
Jan 23 04:46:56 np0005593234 nova_compute[227762]: 2026-01-23 09:46:56.189 227766 DEBUG oslo_concurrency.lockutils [None req-d316f11c-a413-48f3-b2d7-130171d437b4 69cd789fe82e4fb6a1a6fc06333e456b bb5dcf0180e448f3971332f4d58c11f9 - - default default] Lock "4312368d-83d8-4d95-98a1-18bc91f966ce" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.899s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:46:56 np0005593234 nova_compute[227762]: 2026-01-23 09:46:56.586 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:57.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:57.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e227 e227: 3 total, 3 up, 3 in
Jan 23 04:46:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:46:59.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:46:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:46:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:46:59.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:46:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e228 e228: 3 total, 3 up, 3 in
Jan 23 04:46:59 np0005593234 nova_compute[227762]: 2026-01-23 09:46:59.547 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:46:59 np0005593234 podman[254556]: 2026-01-23 09:46:59.763471303 +0000 UTC m=+0.048717870 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 04:46:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e228 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e229 e229: 3 total, 3 up, 3 in
Jan 23 04:47:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:01.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:01.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:01 np0005593234 nova_compute[227762]: 2026-01-23 09:47:01.589 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:03.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:47:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:03.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:47:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e230 e230: 3 total, 3 up, 3 in
Jan 23 04:47:04 np0005593234 nova_compute[227762]: 2026-01-23 09:47:04.548 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:05.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:47:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:05.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:47:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:47:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:47:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:47:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:47:06 np0005593234 nova_compute[227762]: 2026-01-23 09:47:06.539 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161611.537754, 4312368d-83d8-4d95-98a1-18bc91f966ce => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:47:06 np0005593234 nova_compute[227762]: 2026-01-23 09:47:06.540 227766 INFO nova.compute.manager [-] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:47:06 np0005593234 nova_compute[227762]: 2026-01-23 09:47:06.592 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:06 np0005593234 nova_compute[227762]: 2026-01-23 09:47:06.718 227766 DEBUG nova.compute.manager [None req-b7b28cb5-d570-417a-9f56-ae686edb2c9e - - - - - -] [instance: 4312368d-83d8-4d95-98a1-18bc91f966ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:07.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:47:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:47:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:07.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:47:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:47:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:09.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:47:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:09.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:09 np0005593234 nova_compute[227762]: 2026-01-23 09:47:09.586 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:11.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:11.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:11 np0005593234 nova_compute[227762]: 2026-01-23 09:47:11.596 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:12 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:47:12 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:47:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:47:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:13.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:47:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:13.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:13 np0005593234 podman[254762]: 2026-01-23 09:47:13.804346251 +0000 UTC m=+0.096713077 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 04:47:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e231 e231: 3 total, 3 up, 3 in
Jan 23 04:47:14 np0005593234 nova_compute[227762]: 2026-01-23 09:47:14.588 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:14 np0005593234 nova_compute[227762]: 2026-01-23 09:47:14.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:15.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:15.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e232 e232: 3 total, 3 up, 3 in
Jan 23 04:47:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e233 e233: 3 total, 3 up, 3 in
Jan 23 04:47:16 np0005593234 nova_compute[227762]: 2026-01-23 09:47:16.599 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:17.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:17.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e234 e234: 3 total, 3 up, 3 in
Jan 23 04:47:17 np0005593234 nova_compute[227762]: 2026-01-23 09:47:17.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:17 np0005593234 nova_compute[227762]: 2026-01-23 09:47:17.769 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:17 np0005593234 nova_compute[227762]: 2026-01-23 09:47:17.769 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:17 np0005593234 nova_compute[227762]: 2026-01-23 09:47:17.769 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:17 np0005593234 nova_compute[227762]: 2026-01-23 09:47:17.769 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:47:17 np0005593234 nova_compute[227762]: 2026-01-23 09:47:17.769 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:47:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1545439500' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:47:18 np0005593234 nova_compute[227762]: 2026-01-23 09:47:18.286 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:18 np0005593234 nova_compute[227762]: 2026-01-23 09:47:18.446 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:47:18 np0005593234 nova_compute[227762]: 2026-01-23 09:47:18.448 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4609MB free_disk=20.858360290527344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:47:18 np0005593234 nova_compute[227762]: 2026-01-23 09:47:18.448 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:18 np0005593234 nova_compute[227762]: 2026-01-23 09:47:18.448 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:18 np0005593234 nova_compute[227762]: 2026-01-23 09:47:18.537 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:47:18 np0005593234 nova_compute[227762]: 2026-01-23 09:47:18.538 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:47:18 np0005593234 nova_compute[227762]: 2026-01-23 09:47:18.556 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:47:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3023340630' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:47:19 np0005593234 nova_compute[227762]: 2026-01-23 09:47:19.083 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:19 np0005593234 nova_compute[227762]: 2026-01-23 09:47:19.089 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:47:19 np0005593234 nova_compute[227762]: 2026-01-23 09:47:19.106 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:47:19 np0005593234 nova_compute[227762]: 2026-01-23 09:47:19.127 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:47:19 np0005593234 nova_compute[227762]: 2026-01-23 09:47:19.128 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:19.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:47:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:19.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:47:19 np0005593234 nova_compute[227762]: 2026-01-23 09:47:19.589 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e235 e235: 3 total, 3 up, 3 in
Jan 23 04:47:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e235 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:20 np0005593234 nova_compute[227762]: 2026-01-23 09:47:20.128 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:20 np0005593234 nova_compute[227762]: 2026-01-23 09:47:20.151 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:20 np0005593234 nova_compute[227762]: 2026-01-23 09:47:20.151 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:20 np0005593234 nova_compute[227762]: 2026-01-23 09:47:20.152 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:47:20 np0005593234 nova_compute[227762]: 2026-01-23 09:47:20.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:20 np0005593234 nova_compute[227762]: 2026-01-23 09:47:20.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:47:20 np0005593234 nova_compute[227762]: 2026-01-23 09:47:20.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:47:20 np0005593234 nova_compute[227762]: 2026-01-23 09:47:20.766 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:47:20 np0005593234 nova_compute[227762]: 2026-01-23 09:47:20.767 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:21.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:47:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:21.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:47:21 np0005593234 nova_compute[227762]: 2026-01-23 09:47:21.602 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:22 np0005593234 nova_compute[227762]: 2026-01-23 09:47:22.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:22 np0005593234 nova_compute[227762]: 2026-01-23 09:47:22.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:47:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:23.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:47:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:47:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:23.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:47:24 np0005593234 nova_compute[227762]: 2026-01-23 09:47:24.605 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:24 np0005593234 nova_compute[227762]: 2026-01-23 09:47:24.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:47:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e236 e236: 3 total, 3 up, 3 in
Jan 23 04:47:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e236 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:25.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:25.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:26 np0005593234 nova_compute[227762]: 2026-01-23 09:47:26.605 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:27.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:27.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e237 e237: 3 total, 3 up, 3 in
Jan 23 04:47:28 np0005593234 ceph-mgr[77448]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 04:47:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:47:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:29.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:47:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:29.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e238 e238: 3 total, 3 up, 3 in
Jan 23 04:47:29 np0005593234 nova_compute[227762]: 2026-01-23 09:47:29.647 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e238 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:30 np0005593234 podman[254892]: 2026-01-23 09:47:30.755406601 +0000 UTC m=+0.048451272 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 04:47:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:47:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:31.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:47:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:31.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:31 np0005593234 nova_compute[227762]: 2026-01-23 09:47:31.609 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:33.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e239 e239: 3 total, 3 up, 3 in
Jan 23 04:47:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:33.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:34 np0005593234 nova_compute[227762]: 2026-01-23 09:47:34.651 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e239 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:35.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:35.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:36 np0005593234 nova_compute[227762]: 2026-01-23 09:47:36.612 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:37.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:37.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:39.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e240 e240: 3 total, 3 up, 3 in
Jan 23 04:47:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:47:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:39.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:47:39 np0005593234 nova_compute[227762]: 2026-01-23 09:47:39.695 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:41.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:41.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:41 np0005593234 nova_compute[227762]: 2026-01-23 09:47:41.616 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:42.823 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:42.824 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:42.824 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:47:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:43.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:47:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:43.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e241 e241: 3 total, 3 up, 3 in
Jan 23 04:47:44 np0005593234 nova_compute[227762]: 2026-01-23 09:47:44.697 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:44 np0005593234 podman[254968]: 2026-01-23 09:47:44.77430146 +0000 UTC m=+0.072763805 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 04:47:44 np0005593234 nova_compute[227762]: 2026-01-23 09:47:44.804 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:44 np0005593234 nova_compute[227762]: 2026-01-23 09:47:44.805 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:44 np0005593234 nova_compute[227762]: 2026-01-23 09:47:44.825 227766 DEBUG nova.compute.manager [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:47:44 np0005593234 nova_compute[227762]: 2026-01-23 09:47:44.919 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:44 np0005593234 nova_compute[227762]: 2026-01-23 09:47:44.920 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:44 np0005593234 nova_compute[227762]: 2026-01-23 09:47:44.926 227766 DEBUG nova.virt.hardware [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:47:44 np0005593234 nova_compute[227762]: 2026-01-23 09:47:44.926 227766 INFO nova.compute.claims [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:47:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.068 227766 DEBUG oslo_concurrency.processutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:45.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:45.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:47:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3135547613' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.490 227766 DEBUG oslo_concurrency.processutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.496 227766 DEBUG nova.compute.provider_tree [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.525 227766 DEBUG nova.scheduler.client.report [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:47:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:45.561 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:47:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:45.562 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.565 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.567 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.648s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.568 227766 DEBUG nova.compute.manager [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.669 227766 DEBUG nova.compute.manager [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.670 227766 DEBUG nova.network.neutron [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.697 227766 INFO nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.724 227766 DEBUG nova.compute.manager [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.832 227766 DEBUG nova.compute.manager [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.834 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.834 227766 INFO nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Creating image(s)#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.866 227766 DEBUG nova.storage.rbd_utils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.898 227766 DEBUG nova.storage.rbd_utils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.923 227766 DEBUG nova.storage.rbd_utils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.926 227766 DEBUG oslo_concurrency.processutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.993 227766 DEBUG oslo_concurrency.processutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.994 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.995 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:45 np0005593234 nova_compute[227762]: 2026-01-23 09:47:45.995 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:46 np0005593234 nova_compute[227762]: 2026-01-23 09:47:46.021 227766 DEBUG nova.storage.rbd_utils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:47:46 np0005593234 nova_compute[227762]: 2026-01-23 09:47:46.024 227766 DEBUG oslo_concurrency.processutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:46 np0005593234 nova_compute[227762]: 2026-01-23 09:47:46.524 227766 DEBUG nova.policy [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77cda1e9a0404425a06c34637e696603', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '390d19f683334995a5268cf9b4d5e464', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:47:46 np0005593234 nova_compute[227762]: 2026-01-23 09:47:46.882 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:47:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:47.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:47:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:47.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:47 np0005593234 nova_compute[227762]: 2026-01-23 09:47:47.610 227766 DEBUG oslo_concurrency.processutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.586s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:47 np0005593234 nova_compute[227762]: 2026-01-23 09:47:47.684 227766 DEBUG nova.storage.rbd_utils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] resizing rbd image ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:47:48 np0005593234 nova_compute[227762]: 2026-01-23 09:47:48.284 227766 DEBUG nova.objects.instance [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'migration_context' on Instance uuid ba01649c-a6ef-4784-b3dd-49d03f96cef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:47:48 np0005593234 nova_compute[227762]: 2026-01-23 09:47:48.489 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:47:48 np0005593234 nova_compute[227762]: 2026-01-23 09:47:48.489 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Ensure instance console log exists: /var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:47:48 np0005593234 nova_compute[227762]: 2026-01-23 09:47:48.501 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:48 np0005593234 nova_compute[227762]: 2026-01-23 09:47:48.502 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:48 np0005593234 nova_compute[227762]: 2026-01-23 09:47:48.502 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:48 np0005593234 nova_compute[227762]: 2026-01-23 09:47:48.754 227766 DEBUG nova.network.neutron [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Successfully created port: cafea320-d23e-45bb-a6b2-46cfa7bd8741 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:47:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:47:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:49.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:47:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:49.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:49 np0005593234 nova_compute[227762]: 2026-01-23 09:47:49.698 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:49 np0005593234 nova_compute[227762]: 2026-01-23 09:47:49.872 227766 DEBUG nova.network.neutron [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Successfully updated port: cafea320-d23e-45bb-a6b2-46cfa7bd8741 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:47:49 np0005593234 nova_compute[227762]: 2026-01-23 09:47:49.886 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:47:49 np0005593234 nova_compute[227762]: 2026-01-23 09:47:49.887 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquired lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:47:49 np0005593234 nova_compute[227762]: 2026-01-23 09:47:49.887 227766 DEBUG nova.network.neutron [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:47:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:50 np0005593234 nova_compute[227762]: 2026-01-23 09:47:50.010 227766 DEBUG nova.compute.manager [req-56f51411-edd3-4c3f-b973-686b471ec77a req-4aa90cba-fd2b-431f-8c07-0cad7165b1cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received event network-changed-cafea320-d23e-45bb-a6b2-46cfa7bd8741 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:47:50 np0005593234 nova_compute[227762]: 2026-01-23 09:47:50.010 227766 DEBUG nova.compute.manager [req-56f51411-edd3-4c3f-b973-686b471ec77a req-4aa90cba-fd2b-431f-8c07-0cad7165b1cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Refreshing instance network info cache due to event network-changed-cafea320-d23e-45bb-a6b2-46cfa7bd8741. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:47:50 np0005593234 nova_compute[227762]: 2026-01-23 09:47:50.011 227766 DEBUG oslo_concurrency.lockutils [req-56f51411-edd3-4c3f-b973-686b471ec77a req-4aa90cba-fd2b-431f-8c07-0cad7165b1cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:47:50 np0005593234 nova_compute[227762]: 2026-01-23 09:47:50.430 227766 DEBUG nova.network.neutron [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:47:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:50.563 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:51.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:51.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.704 227766 DEBUG nova.network.neutron [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Updating instance_info_cache with network_info: [{"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.885 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Releasing lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.885 227766 DEBUG nova.compute.manager [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Instance network_info: |[{"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.886 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.886 227766 DEBUG oslo_concurrency.lockutils [req-56f51411-edd3-4c3f-b973-686b471ec77a req-4aa90cba-fd2b-431f-8c07-0cad7165b1cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.886 227766 DEBUG nova.network.neutron [req-56f51411-edd3-4c3f-b973-686b471ec77a req-4aa90cba-fd2b-431f-8c07-0cad7165b1cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Refreshing network info cache for port cafea320-d23e-45bb-a6b2-46cfa7bd8741 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.889 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Start _get_guest_xml network_info=[{"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.893 227766 WARNING nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.898 227766 DEBUG nova.virt.libvirt.host [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.899 227766 DEBUG nova.virt.libvirt.host [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.905 227766 DEBUG nova.virt.libvirt.host [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.905 227766 DEBUG nova.virt.libvirt.host [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.906 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.906 227766 DEBUG nova.virt.hardware [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.907 227766 DEBUG nova.virt.hardware [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.907 227766 DEBUG nova.virt.hardware [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.907 227766 DEBUG nova.virt.hardware [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.907 227766 DEBUG nova.virt.hardware [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.908 227766 DEBUG nova.virt.hardware [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.908 227766 DEBUG nova.virt.hardware [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.908 227766 DEBUG nova.virt.hardware [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.909 227766 DEBUG nova.virt.hardware [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.909 227766 DEBUG nova.virt.hardware [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.909 227766 DEBUG nova.virt.hardware [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:47:51 np0005593234 nova_compute[227762]: 2026-01-23 09:47:51.913 227766 DEBUG oslo_concurrency.processutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:51 np0005593234 ovn_controller[134547]: 2026-01-23T09:47:51Z|00169|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Jan 23 04:47:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:47:52 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/419652368' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.368 227766 DEBUG oslo_concurrency.processutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.394 227766 DEBUG nova.storage.rbd_utils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.398 227766 DEBUG oslo_concurrency.processutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:47:52 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3039988979' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.842 227766 DEBUG oslo_concurrency.processutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.844 227766 DEBUG nova.virt.libvirt.vif [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:47:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1172485769',display_name='tempest-AttachInterfacesTestJSON-server-1172485769',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1172485769',id=62,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDHG7P7LPqhPdhdP9QeVe34Yjpl7KzdYfgP2NH94mbs2nERHgn5Mq0EdsvgIZnluIsywiR5553mEm6eEs6CP9hgYZdCuMOYdAYJrW3IIj+b2hJ9DHiMcy6FZA0BTPNAN1g==',key_name='tempest-keypair-1309682370',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-8nyczwr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:47:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=ba01649c-a6ef-4784-b3dd-49d03f96cef4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.844 227766 DEBUG nova.network.os_vif_util [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.845 227766 DEBUG nova.network.os_vif_util [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:9b:f9,bridge_name='br-int',has_traffic_filtering=True,id=cafea320-d23e-45bb-a6b2-46cfa7bd8741,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcafea320-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.846 227766 DEBUG nova.objects.instance [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'pci_devices' on Instance uuid ba01649c-a6ef-4784-b3dd-49d03f96cef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.860 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  <uuid>ba01649c-a6ef-4784-b3dd-49d03f96cef4</uuid>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  <name>instance-0000003e</name>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <nova:name>tempest-AttachInterfacesTestJSON-server-1172485769</nova:name>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:47:51</nova:creationTime>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <nova:port uuid="cafea320-d23e-45bb-a6b2-46cfa7bd8741">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <entry name="serial">ba01649c-a6ef-4784-b3dd-49d03f96cef4</entry>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <entry name="uuid">ba01649c-a6ef-4784-b3dd-49d03f96cef4</entry>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk.config">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:06:9b:f9"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <target dev="tapcafea320-d2"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/console.log" append="off"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:47:52 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:47:52 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:47:52 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:47:52 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.862 227766 DEBUG nova.compute.manager [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Preparing to wait for external event network-vif-plugged-cafea320-d23e-45bb-a6b2-46cfa7bd8741 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.862 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.863 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.863 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.864 227766 DEBUG nova.virt.libvirt.vif [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:47:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1172485769',display_name='tempest-AttachInterfacesTestJSON-server-1172485769',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1172485769',id=62,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDHG7P7LPqhPdhdP9QeVe34Yjpl7KzdYfgP2NH94mbs2nERHgn5Mq0EdsvgIZnluIsywiR5553mEm6eEs6CP9hgYZdCuMOYdAYJrW3IIj+b2hJ9DHiMcy6FZA0BTPNAN1g==',key_name='tempest-keypair-1309682370',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-8nyczwr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:47:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=ba01649c-a6ef-4784-b3dd-49d03f96cef4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.864 227766 DEBUG nova.network.os_vif_util [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.865 227766 DEBUG nova.network.os_vif_util [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:9b:f9,bridge_name='br-int',has_traffic_filtering=True,id=cafea320-d23e-45bb-a6b2-46cfa7bd8741,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcafea320-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.865 227766 DEBUG os_vif [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:9b:f9,bridge_name='br-int',has_traffic_filtering=True,id=cafea320-d23e-45bb-a6b2-46cfa7bd8741,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcafea320-d2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.866 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.866 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.867 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.870 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.870 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapcafea320-d2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.871 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapcafea320-d2, col_values=(('external_ids', {'iface-id': 'cafea320-d23e-45bb-a6b2-46cfa7bd8741', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:9b:f9', 'vm-uuid': 'ba01649c-a6ef-4784-b3dd-49d03f96cef4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.872 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:52 np0005593234 NetworkManager[48942]: <info>  [1769161672.8734] manager: (tapcafea320-d2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.875 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.879 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.880 227766 INFO os_vif [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:9b:f9,bridge_name='br-int',has_traffic_filtering=True,id=cafea320-d23e-45bb-a6b2-46cfa7bd8741,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcafea320-d2')#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.935 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.936 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.936 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:06:9b:f9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.937 227766 INFO nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Using config drive#033[00m
Jan 23 04:47:52 np0005593234 nova_compute[227762]: 2026-01-23 09:47:52.960 227766 DEBUG nova.storage.rbd_utils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:47:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:47:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:53.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:47:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:47:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:53.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:47:53 np0005593234 nova_compute[227762]: 2026-01-23 09:47:53.767 227766 INFO nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Creating config drive at /var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/disk.config#033[00m
Jan 23 04:47:53 np0005593234 nova_compute[227762]: 2026-01-23 09:47:53.775 227766 DEBUG oslo_concurrency.processutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpykecg1dv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:53 np0005593234 nova_compute[227762]: 2026-01-23 09:47:53.871 227766 DEBUG nova.network.neutron [req-56f51411-edd3-4c3f-b973-686b471ec77a req-4aa90cba-fd2b-431f-8c07-0cad7165b1cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Updated VIF entry in instance network info cache for port cafea320-d23e-45bb-a6b2-46cfa7bd8741. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:47:53 np0005593234 nova_compute[227762]: 2026-01-23 09:47:53.873 227766 DEBUG nova.network.neutron [req-56f51411-edd3-4c3f-b973-686b471ec77a req-4aa90cba-fd2b-431f-8c07-0cad7165b1cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Updating instance_info_cache with network_info: [{"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:47:53 np0005593234 nova_compute[227762]: 2026-01-23 09:47:53.899 227766 DEBUG oslo_concurrency.lockutils [req-56f51411-edd3-4c3f-b973-686b471ec77a req-4aa90cba-fd2b-431f-8c07-0cad7165b1cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:47:53 np0005593234 nova_compute[227762]: 2026-01-23 09:47:53.917 227766 DEBUG oslo_concurrency.processutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpykecg1dv" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:53 np0005593234 nova_compute[227762]: 2026-01-23 09:47:53.965 227766 DEBUG nova.storage.rbd_utils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:47:53 np0005593234 nova_compute[227762]: 2026-01-23 09:47:53.971 227766 DEBUG oslo_concurrency.processutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/disk.config ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.236 227766 DEBUG oslo_concurrency.processutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/disk.config ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.265s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.237 227766 INFO nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Deleting local config drive /var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/disk.config because it was imported into RBD.#033[00m
Jan 23 04:47:54 np0005593234 kernel: tapcafea320-d2: entered promiscuous mode
Jan 23 04:47:54 np0005593234 NetworkManager[48942]: <info>  [1769161674.2854] manager: (tapcafea320-d2): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Jan 23 04:47:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:47:54Z|00170|binding|INFO|Claiming lport cafea320-d23e-45bb-a6b2-46cfa7bd8741 for this chassis.
Jan 23 04:47:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:47:54Z|00171|binding|INFO|cafea320-d23e-45bb-a6b2-46cfa7bd8741: Claiming fa:16:3e:06:9b:f9 10.100.0.3
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.287 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.291 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.294 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:54 np0005593234 systemd-machined[195626]: New machine qemu-26-instance-0000003e.
Jan 23 04:47:54 np0005593234 systemd-udevd[255325]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.325 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:9b:f9 10.100.0.3'], port_security=['fa:16:3e:06:9b:f9 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ba01649c-a6ef-4784-b3dd-49d03f96cef4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '2', 'neutron:security_group_ids': '4347dab6-bc98-4fea-9c51-e02238fc830c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=cafea320-d23e-45bb-a6b2-46cfa7bd8741) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.326 144381 INFO neutron.agent.ovn.metadata.agent [-] Port cafea320-d23e-45bb-a6b2-46cfa7bd8741 in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 bound to our chassis#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.327 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4#033[00m
Jan 23 04:47:54 np0005593234 NetworkManager[48942]: <info>  [1769161674.3350] device (tapcafea320-d2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:47:54 np0005593234 NetworkManager[48942]: <info>  [1769161674.3363] device (tapcafea320-d2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.338 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6e43bb95-b3be-4453-8bd2-456bdb28b518]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.339 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7808328e-21 in ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.341 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7808328e-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.341 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8d2fa950-3a49-45db-affb-10a3bca690d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.342 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9487564c-b047-4d46-89ac-f34930039c2c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.354 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[0915577a-37d9-4f77-a73c-544e44b0cbf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 systemd[1]: Started Virtual Machine qemu-26-instance-0000003e.
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.359 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:47:54Z|00172|binding|INFO|Setting lport cafea320-d23e-45bb-a6b2-46cfa7bd8741 ovn-installed in OVS
Jan 23 04:47:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:47:54Z|00173|binding|INFO|Setting lport cafea320-d23e-45bb-a6b2-46cfa7bd8741 up in Southbound
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.363 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.379 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[441e5b6f-5559-43c8-89bd-06e4e6d29f34]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.401 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[29c60543-70aa-4815-9442-e1563fbb69dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 NetworkManager[48942]: <info>  [1769161674.4086] manager: (tap7808328e-20): new Veth device (/org/freedesktop/NetworkManager/Devices/96)
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.407 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a26f48d9-dced-4834-821c-c36267da3a1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.443 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b9761044-5c88-4975-b189-80f38f50002c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.446 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[3cf6c3fb-e865-4975-9d65-12376d259bea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 NetworkManager[48942]: <info>  [1769161674.4689] device (tap7808328e-20): carrier: link connected
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.473 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[99008fa8-f1c1-44f0-9113-3fa59cafac3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.491 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a35347eb-2acd-4e84-9f8e-a58e65de4727]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7808328e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:22:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556942, 'reachable_time': 35606, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255358, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.513 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[486f00d5-49f2-4045-8844-25ce8b232ccc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:22ae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556942, 'tstamp': 556942}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255359, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.531 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bffc091e-96bf-42dc-b2e4-15987d051e8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7808328e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:22:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556942, 'reachable_time': 35606, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 255360, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.561 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[769256a3-41ef-4a13-b2f4-697049505fae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.619 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[316d3ab2-0dfd-4fb2-8960-412784496264]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.621 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7808328e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.621 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.622 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7808328e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:54 np0005593234 NetworkManager[48942]: <info>  [1769161674.6239] manager: (tap7808328e-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Jan 23 04:47:54 np0005593234 kernel: tap7808328e-20: entered promiscuous mode
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.625 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.632 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7808328e-20, col_values=(('external_ids', {'iface-id': 'db11772c-e758-43ff-997c-e8c835433e90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.633 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:47:54Z|00174|binding|INFO|Releasing lport db11772c-e758-43ff-997c-e8c835433e90 from this chassis (sb_readonly=0)
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.646 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.648 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7808328e-22f9-46df-ac06-f8c3d6ad10c4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7808328e-22f9-46df-ac06-f8c3d6ad10c4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.649 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2a141ddf-dc07-4198-b06b-249fc00ef072]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.650 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-7808328e-22f9-46df-ac06-f8c3d6ad10c4
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/7808328e-22f9-46df-ac06-f8c3d6ad10c4.pid.haproxy
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 7808328e-22f9-46df-ac06-f8c3d6ad10c4
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:47:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:47:54.650 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'env', 'PROCESS_TAG=haproxy-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7808328e-22f9-46df-ac06-f8c3d6ad10c4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.700 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.892 227766 DEBUG nova.compute.manager [req-8dcceffa-5ab6-40a2-a170-6101395c3248 req-dfe301ac-fe65-47ee-81ea-1b2a0800b940 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received event network-vif-plugged-cafea320-d23e-45bb-a6b2-46cfa7bd8741 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.894 227766 DEBUG oslo_concurrency.lockutils [req-8dcceffa-5ab6-40a2-a170-6101395c3248 req-dfe301ac-fe65-47ee-81ea-1b2a0800b940 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.894 227766 DEBUG oslo_concurrency.lockutils [req-8dcceffa-5ab6-40a2-a170-6101395c3248 req-dfe301ac-fe65-47ee-81ea-1b2a0800b940 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.894 227766 DEBUG oslo_concurrency.lockutils [req-8dcceffa-5ab6-40a2-a170-6101395c3248 req-dfe301ac-fe65-47ee-81ea-1b2a0800b940 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:54 np0005593234 nova_compute[227762]: 2026-01-23 09:47:54.895 227766 DEBUG nova.compute.manager [req-8dcceffa-5ab6-40a2-a170-6101395c3248 req-dfe301ac-fe65-47ee-81ea-1b2a0800b940 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Processing event network-vif-plugged-cafea320-d23e-45bb-a6b2-46cfa7bd8741 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:47:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e241 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:47:55 np0005593234 podman[255392]: 2026-01-23 09:47:55.01325835 +0000 UTC m=+0.046555462 container create 7baff80253ea45598ce3a4d67e111af1d3ecf58434fca032eb988c1569e74f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 23 04:47:55 np0005593234 systemd[1]: Started libpod-conmon-7baff80253ea45598ce3a4d67e111af1d3ecf58434fca032eb988c1569e74f09.scope.
Jan 23 04:47:55 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:47:55 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b11dfd44ec2069e9c136ecc6944ac62cdc13805a958d923318bf4f2cb8dcb1bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:47:55 np0005593234 podman[255392]: 2026-01-23 09:47:55.07217753 +0000 UTC m=+0.105474662 container init 7baff80253ea45598ce3a4d67e111af1d3ecf58434fca032eb988c1569e74f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 04:47:55 np0005593234 podman[255392]: 2026-01-23 09:47:55.080591804 +0000 UTC m=+0.113888936 container start 7baff80253ea45598ce3a4d67e111af1d3ecf58434fca032eb988c1569e74f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:47:55 np0005593234 podman[255392]: 2026-01-23 09:47:54.988039339 +0000 UTC m=+0.021336471 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:47:55 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[255407]: [NOTICE]   (255419) : New worker (255429) forked
Jan 23 04:47:55 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[255407]: [NOTICE]   (255419) : Loading success.
Jan 23 04:47:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e242 e242: 3 total, 3 up, 3 in
Jan 23 04:47:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:47:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:55.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.266 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161675.2661314, ba01649c-a6ef-4784-b3dd-49d03f96cef4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.267 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] VM Started (Lifecycle Event)#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.269 227766 DEBUG nova.compute.manager [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.272 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.276 227766 INFO nova.virt.libvirt.driver [-] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Instance spawned successfully.#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.276 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.298 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.301 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.360 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.360 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.361 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.361 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.362 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.362 227766 DEBUG nova.virt.libvirt.driver [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.365 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.365 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161675.266301, ba01649c-a6ef-4784-b3dd-49d03f96cef4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.366 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.414 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.417 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161675.2717757, ba01649c-a6ef-4784-b3dd-49d03f96cef4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.418 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:47:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:55.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.449 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.453 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.472 227766 INFO nova.compute.manager [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Took 9.64 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.473 227766 DEBUG nova.compute.manager [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.485 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.545 227766 INFO nova.compute.manager [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Took 10.65 seconds to build instance.#033[00m
Jan 23 04:47:55 np0005593234 nova_compute[227762]: 2026-01-23 09:47:55.568 227766 DEBUG oslo_concurrency.lockutils [None req-b8580837-24cb-468d-a238-ea64a29d8036 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.763s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:56 np0005593234 nova_compute[227762]: 2026-01-23 09:47:56.979 227766 DEBUG nova.compute.manager [req-715b728c-c4af-4ebf-9daf-878a98fe815a req-4c881e34-5c42-48ce-a8be-c8421bcd589f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received event network-vif-plugged-cafea320-d23e-45bb-a6b2-46cfa7bd8741 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:47:56 np0005593234 nova_compute[227762]: 2026-01-23 09:47:56.980 227766 DEBUG oslo_concurrency.lockutils [req-715b728c-c4af-4ebf-9daf-878a98fe815a req-4c881e34-5c42-48ce-a8be-c8421bcd589f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:47:56 np0005593234 nova_compute[227762]: 2026-01-23 09:47:56.980 227766 DEBUG oslo_concurrency.lockutils [req-715b728c-c4af-4ebf-9daf-878a98fe815a req-4c881e34-5c42-48ce-a8be-c8421bcd589f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:47:56 np0005593234 nova_compute[227762]: 2026-01-23 09:47:56.980 227766 DEBUG oslo_concurrency.lockutils [req-715b728c-c4af-4ebf-9daf-878a98fe815a req-4c881e34-5c42-48ce-a8be-c8421bcd589f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:47:56 np0005593234 nova_compute[227762]: 2026-01-23 09:47:56.980 227766 DEBUG nova.compute.manager [req-715b728c-c4af-4ebf-9daf-878a98fe815a req-4c881e34-5c42-48ce-a8be-c8421bcd589f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] No waiting events found dispatching network-vif-plugged-cafea320-d23e-45bb-a6b2-46cfa7bd8741 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:47:56 np0005593234 nova_compute[227762]: 2026-01-23 09:47:56.980 227766 WARNING nova.compute.manager [req-715b728c-c4af-4ebf-9daf-878a98fe815a req-4c881e34-5c42-48ce-a8be-c8421bcd589f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received unexpected event network-vif-plugged-cafea320-d23e-45bb-a6b2-46cfa7bd8741 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:47:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:57.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:47:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:57.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:47:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e243 e243: 3 total, 3 up, 3 in
Jan 23 04:47:57 np0005593234 nova_compute[227762]: 2026-01-23 09:47:57.873 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:58 np0005593234 NetworkManager[48942]: <info>  [1769161678.9832] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/98)
Jan 23 04:47:58 np0005593234 nova_compute[227762]: 2026-01-23 09:47:58.982 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:58 np0005593234 NetworkManager[48942]: <info>  [1769161678.9845] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Jan 23 04:47:59 np0005593234 nova_compute[227762]: 2026-01-23 09:47:59.145 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:59 np0005593234 ovn_controller[134547]: 2026-01-23T09:47:59Z|00175|binding|INFO|Releasing lport db11772c-e758-43ff-997c-e8c835433e90 from this chassis (sb_readonly=0)
Jan 23 04:47:59 np0005593234 nova_compute[227762]: 2026-01-23 09:47:59.160 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:47:59.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:59 np0005593234 nova_compute[227762]: 2026-01-23 09:47:59.427 227766 DEBUG nova.compute.manager [req-b8f8f11a-c191-4006-ac8b-1b590dfef1c5 req-01591148-df85-45fb-805b-bf750e5e4a1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received event network-changed-cafea320-d23e-45bb-a6b2-46cfa7bd8741 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:47:59 np0005593234 nova_compute[227762]: 2026-01-23 09:47:59.428 227766 DEBUG nova.compute.manager [req-b8f8f11a-c191-4006-ac8b-1b590dfef1c5 req-01591148-df85-45fb-805b-bf750e5e4a1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Refreshing instance network info cache due to event network-changed-cafea320-d23e-45bb-a6b2-46cfa7bd8741. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:47:59 np0005593234 nova_compute[227762]: 2026-01-23 09:47:59.429 227766 DEBUG oslo_concurrency.lockutils [req-b8f8f11a-c191-4006-ac8b-1b590dfef1c5 req-01591148-df85-45fb-805b-bf750e5e4a1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:47:59 np0005593234 nova_compute[227762]: 2026-01-23 09:47:59.429 227766 DEBUG oslo_concurrency.lockutils [req-b8f8f11a-c191-4006-ac8b-1b590dfef1c5 req-01591148-df85-45fb-805b-bf750e5e4a1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:47:59 np0005593234 nova_compute[227762]: 2026-01-23 09:47:59.430 227766 DEBUG nova.network.neutron [req-b8f8f11a-c191-4006-ac8b-1b590dfef1c5 req-01591148-df85-45fb-805b-bf750e5e4a1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Refreshing network info cache for port cafea320-d23e-45bb-a6b2-46cfa7bd8741 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:47:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:47:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:47:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:47:59.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:47:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e244 e244: 3 total, 3 up, 3 in
Jan 23 04:47:59 np0005593234 nova_compute[227762]: 2026-01-23 09:47:59.701 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:47:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:01.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:01 np0005593234 nova_compute[227762]: 2026-01-23 09:48:01.298 227766 DEBUG nova.network.neutron [req-b8f8f11a-c191-4006-ac8b-1b590dfef1c5 req-01591148-df85-45fb-805b-bf750e5e4a1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Updated VIF entry in instance network info cache for port cafea320-d23e-45bb-a6b2-46cfa7bd8741. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:48:01 np0005593234 nova_compute[227762]: 2026-01-23 09:48:01.298 227766 DEBUG nova.network.neutron [req-b8f8f11a-c191-4006-ac8b-1b590dfef1c5 req-01591148-df85-45fb-805b-bf750e5e4a1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Updating instance_info_cache with network_info: [{"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:48:01 np0005593234 nova_compute[227762]: 2026-01-23 09:48:01.328 227766 DEBUG oslo_concurrency.lockutils [req-b8f8f11a-c191-4006-ac8b-1b590dfef1c5 req-01591148-df85-45fb-805b-bf750e5e4a1c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:48:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:01.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e245 e245: 3 total, 3 up, 3 in
Jan 23 04:48:01 np0005593234 podman[255518]: 2026-01-23 09:48:01.748327252 +0000 UTC m=+0.047229153 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:48:01 np0005593234 nova_compute[227762]: 2026-01-23 09:48:01.949 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:02 np0005593234 nova_compute[227762]: 2026-01-23 09:48:02.876 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:03.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:03.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:04 np0005593234 nova_compute[227762]: 2026-01-23 09:48:04.702 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e246 e246: 3 total, 3 up, 3 in
Jan 23 04:48:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e246 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:05.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:05.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:07.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:07.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:07 np0005593234 nova_compute[227762]: 2026-01-23 09:48:07.878 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:09 np0005593234 nova_compute[227762]: 2026-01-23 09:48:09.086 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:09.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:09.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:09 np0005593234 nova_compute[227762]: 2026-01-23 09:48:09.780 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 e247: 3 total, 3 up, 3 in
Jan 23 04:48:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:11.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:11.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:11 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:11Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:06:9b:f9 10.100.0.3
Jan 23 04:48:11 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:11Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:06:9b:f9 10.100.0.3
Jan 23 04:48:12 np0005593234 nova_compute[227762]: 2026-01-23 09:48:12.880 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:13.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:13.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:14 np0005593234 nova_compute[227762]: 2026-01-23 09:48:14.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:14 np0005593234 nova_compute[227762]: 2026-01-23 09:48:14.781 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:15.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:15.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:15 np0005593234 podman[255678]: 2026-01-23 09:48:15.802496858 +0000 UTC m=+0.084251336 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:48:16 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:16Z|00176|binding|INFO|Releasing lport db11772c-e758-43ff-997c-e8c835433e90 from this chassis (sb_readonly=0)
Jan 23 04:48:16 np0005593234 nova_compute[227762]: 2026-01-23 09:48:16.144 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:17.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:17.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:17 np0005593234 nova_compute[227762]: 2026-01-23 09:48:17.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:17 np0005593234 nova_compute[227762]: 2026-01-23 09:48:17.882 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:18 np0005593234 nova_compute[227762]: 2026-01-23 09:48:18.073 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:18 np0005593234 nova_compute[227762]: 2026-01-23 09:48:18.073 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:18 np0005593234 nova_compute[227762]: 2026-01-23 09:48:18.073 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:18 np0005593234 nova_compute[227762]: 2026-01-23 09:48:18.074 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:48:18 np0005593234 nova_compute[227762]: 2026-01-23 09:48:18.074 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:48:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3613533948' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:48:18 np0005593234 nova_compute[227762]: 2026-01-23 09:48:18.514 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:18 np0005593234 nova_compute[227762]: 2026-01-23 09:48:18.862 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000003e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:48:18 np0005593234 nova_compute[227762]: 2026-01-23 09:48:18.862 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000003e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:48:19 np0005593234 nova_compute[227762]: 2026-01-23 09:48:19.009 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:48:19 np0005593234 nova_compute[227762]: 2026-01-23 09:48:19.011 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4464MB free_disk=20.922149658203125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:48:19 np0005593234 nova_compute[227762]: 2026-01-23 09:48:19.011 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:19 np0005593234 nova_compute[227762]: 2026-01-23 09:48:19.012 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:19 np0005593234 nova_compute[227762]: 2026-01-23 09:48:19.166 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance ba01649c-a6ef-4784-b3dd-49d03f96cef4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:48:19 np0005593234 nova_compute[227762]: 2026-01-23 09:48:19.166 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:48:19 np0005593234 nova_compute[227762]: 2026-01-23 09:48:19.166 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:48:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:19.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:19 np0005593234 nova_compute[227762]: 2026-01-23 09:48:19.285 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:19.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:48:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2973063721' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:48:19 np0005593234 nova_compute[227762]: 2026-01-23 09:48:19.735 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:19 np0005593234 nova_compute[227762]: 2026-01-23 09:48:19.740 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:48:19 np0005593234 nova_compute[227762]: 2026-01-23 09:48:19.757 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:48:19 np0005593234 nova_compute[227762]: 2026-01-23 09:48:19.785 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:48:19 np0005593234 nova_compute[227762]: 2026-01-23 09:48:19.785 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:19 np0005593234 nova_compute[227762]: 2026-01-23 09:48:19.832 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:48:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:48:20 np0005593234 nova_compute[227762]: 2026-01-23 09:48:20.786 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:20 np0005593234 nova_compute[227762]: 2026-01-23 09:48:20.786 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:20 np0005593234 nova_compute[227762]: 2026-01-23 09:48:20.787 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:20 np0005593234 nova_compute[227762]: 2026-01-23 09:48:20.787 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:48:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000064s ======
Jan 23 04:48:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:21.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 23 04:48:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:21.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:48:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:48:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:48:22 np0005593234 nova_compute[227762]: 2026-01-23 09:48:22.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:22 np0005593234 nova_compute[227762]: 2026-01-23 09:48:22.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:48:22 np0005593234 nova_compute[227762]: 2026-01-23 09:48:22.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:48:22 np0005593234 nova_compute[227762]: 2026-01-23 09:48:22.884 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:23.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:23 np0005593234 nova_compute[227762]: 2026-01-23 09:48:23.450 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:48:23 np0005593234 nova_compute[227762]: 2026-01-23 09:48:23.450 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:48:23 np0005593234 nova_compute[227762]: 2026-01-23 09:48:23.451 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:48:23 np0005593234 nova_compute[227762]: 2026-01-23 09:48:23.451 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ba01649c-a6ef-4784-b3dd-49d03f96cef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:48:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:23.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:24 np0005593234 nova_compute[227762]: 2026-01-23 09:48:24.868 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:25.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:48:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:25.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:48:25 np0005593234 nova_compute[227762]: 2026-01-23 09:48:25.462 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Updating instance_info_cache with network_info: [{"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:48:25 np0005593234 nova_compute[227762]: 2026-01-23 09:48:25.487 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:48:25 np0005593234 nova_compute[227762]: 2026-01-23 09:48:25.488 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:48:25 np0005593234 nova_compute[227762]: 2026-01-23 09:48:25.488 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:25 np0005593234 nova_compute[227762]: 2026-01-23 09:48:25.488 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:25 np0005593234 nova_compute[227762]: 2026-01-23 09:48:25.489 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:25 np0005593234 nova_compute[227762]: 2026-01-23 09:48:25.489 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 04:48:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:27.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:27.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:27 np0005593234 nova_compute[227762]: 2026-01-23 09:48:27.498 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:27 np0005593234 nova_compute[227762]: 2026-01-23 09:48:27.587 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:27 np0005593234 nova_compute[227762]: 2026-01-23 09:48:27.885 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:29.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:29.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:29 np0005593234 nova_compute[227762]: 2026-01-23 09:48:29.871 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:31.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:31.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:31 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:48:31 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:48:32 np0005593234 nova_compute[227762]: 2026-01-23 09:48:32.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:32 np0005593234 podman[255860]: 2026-01-23 09:48:32.767812397 +0000 UTC m=+0.051902500 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Jan 23 04:48:32 np0005593234 nova_compute[227762]: 2026-01-23 09:48:32.888 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:33.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:33.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:34 np0005593234 nova_compute[227762]: 2026-01-23 09:48:34.873 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:35.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:35.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:35 np0005593234 nova_compute[227762]: 2026-01-23 09:48:35.557 227766 DEBUG oslo_concurrency.lockutils [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "interface-ba01649c-a6ef-4784-b3dd-49d03f96cef4-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:35 np0005593234 nova_compute[227762]: 2026-01-23 09:48:35.558 227766 DEBUG oslo_concurrency.lockutils [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-ba01649c-a6ef-4784-b3dd-49d03f96cef4-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:35 np0005593234 nova_compute[227762]: 2026-01-23 09:48:35.558 227766 DEBUG nova.objects.instance [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'flavor' on Instance uuid ba01649c-a6ef-4784-b3dd-49d03f96cef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:48:35 np0005593234 nova_compute[227762]: 2026-01-23 09:48:35.858 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:48:35 np0005593234 nova_compute[227762]: 2026-01-23 09:48:35.858 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 04:48:35 np0005593234 nova_compute[227762]: 2026-01-23 09:48:35.923 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 04:48:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:48:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:37.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:48:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:37.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:37 np0005593234 nova_compute[227762]: 2026-01-23 09:48:37.492 227766 DEBUG nova.objects.instance [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'pci_requests' on Instance uuid ba01649c-a6ef-4784-b3dd-49d03f96cef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:48:37 np0005593234 nova_compute[227762]: 2026-01-23 09:48:37.515 227766 DEBUG nova.network.neutron [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:48:37 np0005593234 nova_compute[227762]: 2026-01-23 09:48:37.890 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:38 np0005593234 nova_compute[227762]: 2026-01-23 09:48:38.031 227766 DEBUG nova.policy [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77cda1e9a0404425a06c34637e696603', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '390d19f683334995a5268cf9b4d5e464', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:48:38 np0005593234 nova_compute[227762]: 2026-01-23 09:48:38.883 227766 DEBUG nova.network.neutron [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Successfully created port: b91e97fc-d648-4053-8fdc-b4fe912b3f69 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:48:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:39.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:39.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:39 np0005593234 nova_compute[227762]: 2026-01-23 09:48:39.736 227766 DEBUG nova.network.neutron [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Successfully updated port: b91e97fc-d648-4053-8fdc-b4fe912b3f69 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:48:39 np0005593234 nova_compute[227762]: 2026-01-23 09:48:39.752 227766 DEBUG oslo_concurrency.lockutils [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:48:39 np0005593234 nova_compute[227762]: 2026-01-23 09:48:39.753 227766 DEBUG oslo_concurrency.lockutils [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquired lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:48:39 np0005593234 nova_compute[227762]: 2026-01-23 09:48:39.753 227766 DEBUG nova.network.neutron [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:48:39 np0005593234 nova_compute[227762]: 2026-01-23 09:48:39.837 227766 DEBUG nova.compute.manager [req-b1fe01e3-b75e-4945-a445-ca0bf223ab6d req-2d8adde0-e2fa-45be-9b11-9b68b1120744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received event network-changed-b91e97fc-d648-4053-8fdc-b4fe912b3f69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:39 np0005593234 nova_compute[227762]: 2026-01-23 09:48:39.838 227766 DEBUG nova.compute.manager [req-b1fe01e3-b75e-4945-a445-ca0bf223ab6d req-2d8adde0-e2fa-45be-9b11-9b68b1120744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Refreshing instance network info cache due to event network-changed-b91e97fc-d648-4053-8fdc-b4fe912b3f69. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:48:39 np0005593234 nova_compute[227762]: 2026-01-23 09:48:39.838 227766 DEBUG oslo_concurrency.lockutils [req-b1fe01e3-b75e-4945-a445-ca0bf223ab6d req-2d8adde0-e2fa-45be-9b11-9b68b1120744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:48:39 np0005593234 nova_compute[227762]: 2026-01-23 09:48:39.874 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:39 np0005593234 nova_compute[227762]: 2026-01-23 09:48:39.904 227766 WARNING nova.network.neutron [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] 7808328e-22f9-46df-ac06-f8c3d6ad10c4 already exists in list: networks containing: ['7808328e-22f9-46df-ac06-f8c3d6ad10c4']. ignoring it#033[00m
Jan 23 04:48:40 np0005593234 nova_compute[227762]: 2026-01-23 09:48:40.032 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:41.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:41.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.510 227766 DEBUG nova.network.neutron [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Updating instance_info_cache with network_info: [{"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.534 227766 DEBUG oslo_concurrency.lockutils [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Releasing lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.535 227766 DEBUG oslo_concurrency.lockutils [req-b1fe01e3-b75e-4945-a445-ca0bf223ab6d req-2d8adde0-e2fa-45be-9b11-9b68b1120744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.535 227766 DEBUG nova.network.neutron [req-b1fe01e3-b75e-4945-a445-ca0bf223ab6d req-2d8adde0-e2fa-45be-9b11-9b68b1120744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Refreshing network info cache for port b91e97fc-d648-4053-8fdc-b4fe912b3f69 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.539 227766 DEBUG nova.virt.libvirt.vif [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:47:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1172485769',display_name='tempest-AttachInterfacesTestJSON-server-1172485769',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1172485769',id=62,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDHG7P7LPqhPdhdP9QeVe34Yjpl7KzdYfgP2NH94mbs2nERHgn5Mq0EdsvgIZnluIsywiR5553mEm6eEs6CP9hgYZdCuMOYdAYJrW3IIj+b2hJ9DHiMcy6FZA0BTPNAN1g==',key_name='tempest-keypair-1309682370',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:47:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-8nyczwr2',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:47:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=ba01649c-a6ef-4784-b3dd-49d03f96cef4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.539 227766 DEBUG nova.network.os_vif_util [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.540 227766 DEBUG nova.network.os_vif_util [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.541 227766 DEBUG os_vif [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.541 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.542 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.542 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.546 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.546 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb91e97fc-d6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.546 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb91e97fc-d6, col_values=(('external_ids', {'iface-id': 'b91e97fc-d648-4053-8fdc-b4fe912b3f69', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bb:87:13', 'vm-uuid': 'ba01649c-a6ef-4784-b3dd-49d03f96cef4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.548 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:42 np0005593234 NetworkManager[48942]: <info>  [1769161722.5490] manager: (tapb91e97fc-d6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.550 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.556 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.557 227766 INFO os_vif [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6')#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.558 227766 DEBUG nova.virt.libvirt.vif [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:47:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1172485769',display_name='tempest-AttachInterfacesTestJSON-server-1172485769',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1172485769',id=62,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDHG7P7LPqhPdhdP9QeVe34Yjpl7KzdYfgP2NH94mbs2nERHgn5Mq0EdsvgIZnluIsywiR5553mEm6eEs6CP9hgYZdCuMOYdAYJrW3IIj+b2hJ9DHiMcy6FZA0BTPNAN1g==',key_name='tempest-keypair-1309682370',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:47:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-8nyczwr2',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:47:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=ba01649c-a6ef-4784-b3dd-49d03f96cef4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.559 227766 DEBUG nova.network.os_vif_util [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.559 227766 DEBUG nova.network.os_vif_util [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.562 227766 DEBUG nova.virt.libvirt.guest [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] attach device xml: <interface type="ethernet">
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  <mac address="fa:16:3e:bb:87:13"/>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  <model type="virtio"/>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  <mtu size="1442"/>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  <target dev="tapb91e97fc-d6"/>
Jan 23 04:48:42 np0005593234 nova_compute[227762]: </interface>
Jan 23 04:48:42 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 04:48:42 np0005593234 kernel: tapb91e97fc-d6: entered promiscuous mode
Jan 23 04:48:42 np0005593234 NetworkManager[48942]: <info>  [1769161722.5758] manager: (tapb91e97fc-d6): new Tun device (/org/freedesktop/NetworkManager/Devices/101)
Jan 23 04:48:42 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:42Z|00177|binding|INFO|Claiming lport b91e97fc-d648-4053-8fdc-b4fe912b3f69 for this chassis.
Jan 23 04:48:42 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:42Z|00178|binding|INFO|b91e97fc-d648-4053-8fdc-b4fe912b3f69: Claiming fa:16:3e:bb:87:13 10.100.0.14
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.577 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.588 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:87:13 10.100.0.14'], port_security=['fa:16:3e:bb:87:13 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ba01649c-a6ef-4784-b3dd-49d03f96cef4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e14d2748-8402-4583-8740-ef7703629f43', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=b91e97fc-d648-4053-8fdc-b4fe912b3f69) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.590 144381 INFO neutron.agent.ovn.metadata.agent [-] Port b91e97fc-d648-4053-8fdc-b4fe912b3f69 in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 bound to our chassis#033[00m
Jan 23 04:48:42 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:42Z|00179|binding|INFO|Setting lport b91e97fc-d648-4053-8fdc-b4fe912b3f69 ovn-installed in OVS
Jan 23 04:48:42 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:42Z|00180|binding|INFO|Setting lport b91e97fc-d648-4053-8fdc-b4fe912b3f69 up in Southbound
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.592 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.593 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.596 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.610 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1dce1781-41a1-4446-9a57-62f748f3569e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:42 np0005593234 systemd-udevd[255941]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:48:42 np0005593234 NetworkManager[48942]: <info>  [1769161722.6249] device (tapb91e97fc-d6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:48:42 np0005593234 NetworkManager[48942]: <info>  [1769161722.6253] device (tapb91e97fc-d6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.642 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[309f56d3-f90f-41ce-b92e-275d895d48a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.645 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[28f0fe12-ba2e-46bb-be2a-23bf37b9d501]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.664 227766 DEBUG nova.virt.libvirt.driver [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.665 227766 DEBUG nova.virt.libvirt.driver [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.665 227766 DEBUG nova.virt.libvirt.driver [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:06:9b:f9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.665 227766 DEBUG nova.virt.libvirt.driver [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:bb:87:13, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.671 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[be197dd2-077d-4c44-b1b5-9a4dc84221bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.689 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8871179b-36b7-484f-b5c1-c4abf061550f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7808328e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:22:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556942, 'reachable_time': 35606, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255948, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.696 227766 DEBUG nova.virt.libvirt.guest [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1172485769</nova:name>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:48:42</nova:creationTime>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:48:42 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:    <nova:port uuid="cafea320-d23e-45bb-a6b2-46cfa7bd8741">
Jan 23 04:48:42 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:    <nova:port uuid="b91e97fc-d648-4053-8fdc-b4fe912b3f69">
Jan 23 04:48:42 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:48:42 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:48:42 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:48:42 np0005593234 nova_compute[227762]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.705 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[68c49db9-ba65-4bc1-a6ad-82d64a5619d7]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556954, 'tstamp': 556954}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255949, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556957, 'tstamp': 556957}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255949, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.707 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7808328e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.709 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.710 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7808328e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.711 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.711 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7808328e-20, col_values=(('external_ids', {'iface-id': 'db11772c-e758-43ff-997c-e8c835433e90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.711 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:48:42 np0005593234 nova_compute[227762]: 2026-01-23 09:48:42.722 227766 DEBUG oslo_concurrency.lockutils [None req-4113be98-70f0-431d-9423-e4fa245e4239 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-ba01649c-a6ef-4784-b3dd-49d03f96cef4-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 7.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.824 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.824 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:42.825 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:43.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:43.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:43 np0005593234 nova_compute[227762]: 2026-01-23 09:48:43.681 227766 DEBUG nova.compute.manager [req-4420ef3a-656f-4a06-a575-85672466cc3d req-1babb1bc-72e8-4265-8104-e36ccb503044 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received event network-vif-plugged-b91e97fc-d648-4053-8fdc-b4fe912b3f69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:43 np0005593234 nova_compute[227762]: 2026-01-23 09:48:43.681 227766 DEBUG oslo_concurrency.lockutils [req-4420ef3a-656f-4a06-a575-85672466cc3d req-1babb1bc-72e8-4265-8104-e36ccb503044 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:43 np0005593234 nova_compute[227762]: 2026-01-23 09:48:43.682 227766 DEBUG oslo_concurrency.lockutils [req-4420ef3a-656f-4a06-a575-85672466cc3d req-1babb1bc-72e8-4265-8104-e36ccb503044 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:43 np0005593234 nova_compute[227762]: 2026-01-23 09:48:43.682 227766 DEBUG oslo_concurrency.lockutils [req-4420ef3a-656f-4a06-a575-85672466cc3d req-1babb1bc-72e8-4265-8104-e36ccb503044 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:43 np0005593234 nova_compute[227762]: 2026-01-23 09:48:43.682 227766 DEBUG nova.compute.manager [req-4420ef3a-656f-4a06-a575-85672466cc3d req-1babb1bc-72e8-4265-8104-e36ccb503044 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] No waiting events found dispatching network-vif-plugged-b91e97fc-d648-4053-8fdc-b4fe912b3f69 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:48:43 np0005593234 nova_compute[227762]: 2026-01-23 09:48:43.682 227766 WARNING nova.compute.manager [req-4420ef3a-656f-4a06-a575-85672466cc3d req-1babb1bc-72e8-4265-8104-e36ccb503044 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received unexpected event network-vif-plugged-b91e97fc-d648-4053-8fdc-b4fe912b3f69 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:48:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:44Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bb:87:13 10.100.0.14
Jan 23 04:48:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:44Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bb:87:13 10.100.0.14
Jan 23 04:48:44 np0005593234 nova_compute[227762]: 2026-01-23 09:48:44.876 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:44 np0005593234 nova_compute[227762]: 2026-01-23 09:48:44.914 227766 DEBUG oslo_concurrency.lockutils [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "interface-ba01649c-a6ef-4784-b3dd-49d03f96cef4-b91e97fc-d648-4053-8fdc-b4fe912b3f69" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:44 np0005593234 nova_compute[227762]: 2026-01-23 09:48:44.914 227766 DEBUG oslo_concurrency.lockutils [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-ba01649c-a6ef-4784-b3dd-49d03f96cef4-b91e97fc-d648-4053-8fdc-b4fe912b3f69" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:44 np0005593234 nova_compute[227762]: 2026-01-23 09:48:44.931 227766 DEBUG nova.objects.instance [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'flavor' on Instance uuid ba01649c-a6ef-4784-b3dd-49d03f96cef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:48:44 np0005593234 nova_compute[227762]: 2026-01-23 09:48:44.955 227766 DEBUG nova.virt.libvirt.vif [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:47:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1172485769',display_name='tempest-AttachInterfacesTestJSON-server-1172485769',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1172485769',id=62,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDHG7P7LPqhPdhdP9QeVe34Yjpl7KzdYfgP2NH94mbs2nERHgn5Mq0EdsvgIZnluIsywiR5553mEm6eEs6CP9hgYZdCuMOYdAYJrW3IIj+b2hJ9DHiMcy6FZA0BTPNAN1g==',key_name='tempest-keypair-1309682370',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:47:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-8nyczwr2',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:47:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=ba01649c-a6ef-4784-b3dd-49d03f96cef4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:48:44 np0005593234 nova_compute[227762]: 2026-01-23 09:48:44.956 227766 DEBUG nova.network.os_vif_util [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:48:44 np0005593234 nova_compute[227762]: 2026-01-23 09:48:44.957 227766 DEBUG nova.network.os_vif_util [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:48:44 np0005593234 nova_compute[227762]: 2026-01-23 09:48:44.961 227766 DEBUG nova.virt.libvirt.guest [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bb:87:13"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb91e97fc-d6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 04:48:44 np0005593234 nova_compute[227762]: 2026-01-23 09:48:44.963 227766 DEBUG nova.virt.libvirt.guest [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bb:87:13"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb91e97fc-d6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 04:48:44 np0005593234 nova_compute[227762]: 2026-01-23 09:48:44.965 227766 DEBUG nova.virt.libvirt.driver [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Attempting to detach device tapb91e97fc-d6 from instance ba01649c-a6ef-4784-b3dd-49d03f96cef4 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 04:48:44 np0005593234 nova_compute[227762]: 2026-01-23 09:48:44.966 227766 DEBUG nova.virt.libvirt.guest [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] detach device xml: <interface type="ethernet">
Jan 23 04:48:44 np0005593234 nova_compute[227762]:  <mac address="fa:16:3e:bb:87:13"/>
Jan 23 04:48:44 np0005593234 nova_compute[227762]:  <model type="virtio"/>
Jan 23 04:48:44 np0005593234 nova_compute[227762]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:48:44 np0005593234 nova_compute[227762]:  <mtu size="1442"/>
Jan 23 04:48:44 np0005593234 nova_compute[227762]:  <target dev="tapb91e97fc-d6"/>
Jan 23 04:48:44 np0005593234 nova_compute[227762]: </interface>
Jan 23 04:48:44 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 04:48:44 np0005593234 nova_compute[227762]: 2026-01-23 09:48:44.999 227766 DEBUG nova.virt.libvirt.guest [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bb:87:13"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb91e97fc-d6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.003 227766 DEBUG nova.virt.libvirt.guest [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:bb:87:13"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb91e97fc-d6"/></interface>not found in domain: <domain type='kvm' id='26'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <name>instance-0000003e</name>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <uuid>ba01649c-a6ef-4784-b3dd-49d03f96cef4</uuid>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1172485769</nova:name>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:48:42</nova:creationTime>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:port uuid="cafea320-d23e-45bb-a6b2-46cfa7bd8741">
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:port uuid="b91e97fc-d648-4053-8fdc-b4fe912b3f69">
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:48:45 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <memory unit='KiB'>131072</memory>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <vcpu placement='static'>1</vcpu>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <resource>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <partition>/machine</partition>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </resource>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <sysinfo type='smbios'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <entry name='manufacturer'>RDO</entry>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <entry name='serial'>ba01649c-a6ef-4784-b3dd-49d03f96cef4</entry>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <entry name='uuid'>ba01649c-a6ef-4784-b3dd-49d03f96cef4</entry>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <entry name='family'>Virtual Machine</entry>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <boot dev='hd'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <smbios mode='sysinfo'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <vmcoreinfo state='on'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <model fallback='forbid'>Nehalem</model>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <feature policy='require' name='x2apic'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <feature policy='require' name='hypervisor'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <feature policy='require' name='vme'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <clock offset='utc'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <timer name='hpet' present='no'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <on_poweroff>destroy</on_poweroff>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <on_reboot>restart</on_reboot>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <on_crash>destroy</on_crash>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <disk type='network' device='disk'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk' index='2'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target dev='vda' bus='virtio'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='virtio-disk0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <disk type='network' device='cdrom'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk.config' index='1'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target dev='sda' bus='sata'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <readonly/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='sata0-0-0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pcie.0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='1' port='0x10'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='2' port='0x11'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.2'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='3' port='0x12'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.3'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='4' port='0x13'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.4'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='5' port='0x14'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.5'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='6' port='0x15'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.6'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='7' port='0x16'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.7'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='8' port='0x17'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.8'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='9' port='0x18'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.9'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='10' port='0x19'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.10'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='11' port='0x1a'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.11'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='12' port='0x1b'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.12'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='13' port='0x1c'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.13'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='14' port='0x1d'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.14'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='15' port='0x1e'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.15'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='16' port='0x1f'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.16'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='17' port='0x20'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.17'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='18' port='0x21'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.18'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='19' port='0x22'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.19'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='20' port='0x23'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.20'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='21' port='0x24'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.21'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='22' port='0x25'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.22'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='23' port='0x26'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.23'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='24' port='0x27'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.24'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='25' port='0x28'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.25'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-pci-bridge'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.26'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='usb'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='sata' index='0'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='ide'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:06:9b:f9'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target dev='tapcafea320-d2'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='net0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:bb:87:13'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target dev='tapb91e97fc-d6'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='net1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <serial type='pty'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/console.log' append='off'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target type='isa-serial' port='0'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <model name='isa-serial'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      </target>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/console.log' append='off'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target type='serial' port='0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </console>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <input type='tablet' bus='usb'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='input0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='usb' bus='0' port='1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <input type='mouse' bus='ps2'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='input1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <input type='keyboard' bus='ps2'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='input2'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <listen type='address' address='::0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <audio id='1' type='none'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='video0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <watchdog model='itco' action='reset'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='watchdog0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </watchdog>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <memballoon model='virtio'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <stats period='10'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='balloon0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <rng model='virtio'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <backend model='random'>/dev/urandom</backend>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='rng0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <label>system_u:system_r:svirt_t:s0:c45,c581</label>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c45,c581</imagelabel>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <label>+107:+107</label>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <imagelabel>+107:+107</imagelabel>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:48:45 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:48:45 np0005593234 nova_compute[227762]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.004 227766 INFO nova.virt.libvirt.driver [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully detached device tapb91e97fc-d6 from instance ba01649c-a6ef-4784-b3dd-49d03f96cef4 from the persistent domain config.
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.005 227766 DEBUG nova.virt.libvirt.driver [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] (1/8): Attempting to detach device tapb91e97fc-d6 with device alias net1 from instance ba01649c-a6ef-4784-b3dd-49d03f96cef4 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.005 227766 DEBUG nova.virt.libvirt.guest [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] detach device xml: <interface type="ethernet">
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <mac address="fa:16:3e:bb:87:13"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <model type="virtio"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <mtu size="1442"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <target dev="tapb91e97fc-d6"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]: </interface>
Jan 23 04:48:45 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 04:48:45 np0005593234 kernel: tapb91e97fc-d6 (unregistering): left promiscuous mode
Jan 23 04:48:45 np0005593234 NetworkManager[48942]: <info>  [1769161725.1013] device (tapb91e97fc-d6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:48:45 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:45Z|00181|binding|INFO|Releasing lport b91e97fc-d648-4053-8fdc-b4fe912b3f69 from this chassis (sb_readonly=0)
Jan 23 04:48:45 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:45Z|00182|binding|INFO|Setting lport b91e97fc-d648-4053-8fdc-b4fe912b3f69 down in Southbound
Jan 23 04:48:45 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:45Z|00183|binding|INFO|Removing iface tapb91e97fc-d6 ovn-installed in OVS
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.109 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.111 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:48:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.187 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769161725.1137578, ba01649c-a6ef-4784-b3dd-49d03f96cef4 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.189 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bb:87:13 10.100.0.14'], port_security=['fa:16:3e:bb:87:13 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'ba01649c-a6ef-4784-b3dd-49d03f96cef4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e14d2748-8402-4583-8740-ef7703629f43', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=b91e97fc-d648-4053-8fdc-b4fe912b3f69) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.190 144381 INFO neutron.agent.ovn.metadata.agent [-] Port b91e97fc-d648-4053-8fdc-b4fe912b3f69 in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 unbound from our chassis#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.192 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.191 227766 DEBUG nova.virt.libvirt.driver [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Start waiting for the detach event from libvirt for device tapb91e97fc-d6 with device alias net1 for instance ba01649c-a6ef-4784-b3dd-49d03f96cef4 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.192 227766 DEBUG nova.virt.libvirt.guest [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bb:87:13"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb91e97fc-d6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.200 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.202 227766 DEBUG nova.virt.libvirt.guest [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:bb:87:13"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb91e97fc-d6"/></interface>not found in domain: <domain type='kvm' id='26'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <name>instance-0000003e</name>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <uuid>ba01649c-a6ef-4784-b3dd-49d03f96cef4</uuid>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1172485769</nova:name>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:48:42</nova:creationTime>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:port uuid="cafea320-d23e-45bb-a6b2-46cfa7bd8741">
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:port uuid="b91e97fc-d648-4053-8fdc-b4fe912b3f69">
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:48:45 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <memory unit='KiB'>131072</memory>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <vcpu placement='static'>1</vcpu>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <resource>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <partition>/machine</partition>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </resource>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <sysinfo type='smbios'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <entry name='manufacturer'>RDO</entry>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <entry name='serial'>ba01649c-a6ef-4784-b3dd-49d03f96cef4</entry>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <entry name='uuid'>ba01649c-a6ef-4784-b3dd-49d03f96cef4</entry>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <entry name='family'>Virtual Machine</entry>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <boot dev='hd'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <smbios mode='sysinfo'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <vmcoreinfo state='on'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <model fallback='forbid'>Nehalem</model>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <feature policy='require' name='x2apic'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <feature policy='require' name='hypervisor'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <feature policy='require' name='vme'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <clock offset='utc'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <timer name='hpet' present='no'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <on_poweroff>destroy</on_poweroff>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <on_reboot>restart</on_reboot>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <on_crash>destroy</on_crash>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <disk type='network' device='disk'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk' index='2'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target dev='vda' bus='virtio'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='virtio-disk0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <disk type='network' device='cdrom'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk.config' index='1'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target dev='sda' bus='sata'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <readonly/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='sata0-0-0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pcie.0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='1' port='0x10'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='2' port='0x11'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.2'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='3' port='0x12'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.3'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='4' port='0x13'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.4'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='5' port='0x14'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.5'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='6' port='0x15'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.6'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='7' port='0x16'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.7'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='8' port='0x17'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.8'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='9' port='0x18'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.9'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='10' port='0x19'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.10'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='11' port='0x1a'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.11'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='12' port='0x1b'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.12'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='13' port='0x1c'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.13'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='14' port='0x1d'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.14'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='15' port='0x1e'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.15'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='16' port='0x1f'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.16'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='17' port='0x20'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.17'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='18' port='0x21'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.18'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='19' port='0x22'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.19'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='20' port='0x23'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.20'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='21' port='0x24'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.21'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='22' port='0x25'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.22'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='23' port='0x26'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.23'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='24' port='0x27'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.24'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target chassis='25' port='0x28'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.25'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model name='pcie-pci-bridge'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='pci.26'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='usb'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <controller type='sata' index='0'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='ide'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:06:9b:f9'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target dev='tapcafea320-d2'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='net0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <serial type='pty'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/console.log' append='off'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target type='isa-serial' port='0'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:        <model name='isa-serial'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      </target>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/console.log' append='off'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <target type='serial' port='0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </console>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <input type='tablet' bus='usb'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='input0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='usb' bus='0' port='1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <input type='mouse' bus='ps2'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='input1'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <input type='keyboard' bus='ps2'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='input2'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <listen type='address' address='::0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <audio id='1' type='none'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='video0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <watchdog model='itco' action='reset'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='watchdog0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </watchdog>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <memballoon model='virtio'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <stats period='10'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='balloon0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <rng model='virtio'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <backend model='random'>/dev/urandom</backend>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <alias name='rng0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <label>system_u:system_r:svirt_t:s0:c45,c581</label>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c45,c581</imagelabel>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <label>+107:+107</label>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <imagelabel>+107:+107</imagelabel>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:48:45 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:48:45 np0005593234 nova_compute[227762]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.203 227766 INFO nova.virt.libvirt.driver [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully detached device tapb91e97fc-d6 from instance ba01649c-a6ef-4784-b3dd-49d03f96cef4 from the live domain config.#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.204 227766 DEBUG nova.virt.libvirt.vif [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:47:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1172485769',display_name='tempest-AttachInterfacesTestJSON-server-1172485769',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1172485769',id=62,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDHG7P7LPqhPdhdP9QeVe34Yjpl7KzdYfgP2NH94mbs2nERHgn5Mq0EdsvgIZnluIsywiR5553mEm6eEs6CP9hgYZdCuMOYdAYJrW3IIj+b2hJ9DHiMcy6FZA0BTPNAN1g==',key_name='tempest-keypair-1309682370',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:47:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-8nyczwr2',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:47:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=ba01649c-a6ef-4784-b3dd-49d03f96cef4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.205 227766 DEBUG nova.network.os_vif_util [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.206 227766 DEBUG nova.network.os_vif_util [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.206 227766 DEBUG os_vif [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.208 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.208 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb91e97fc-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.208 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b87c00bf-11e8-4bb7-869a-7584851d94d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.209 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.211 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.214 227766 INFO os_vif [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6')#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.215 227766 DEBUG nova.virt.libvirt.guest [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1172485769</nova:name>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:48:45</nova:creationTime>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    <nova:port uuid="cafea320-d23e-45bb-a6b2-46cfa7bd8741">
Jan 23 04:48:45 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:48:45 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:48:45 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:48:45 np0005593234 nova_compute[227762]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.239 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4e209635-0ad2-45cd-b9be-66faa6f92baf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.241 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[6d5aa64e-7949-4320-92ae-769ac0d5fa44]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.266 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ff403ffc-8bdb-4e49-a150-5a0e7da04941]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.282 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1324f0a4-df53-4408-abca-564dc57bc7f5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7808328e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:22:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 7, 'rx_bytes': 616, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 56], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556942, 'reachable_time': 35606, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 255961, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.296 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5b1e9b-6dcf-41bf-9b55-532592ef45c0]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556954, 'tstamp': 556954}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255962, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 556957, 'tstamp': 556957}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 255962, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.298 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7808328e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.300 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7808328e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.300 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.301 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.301 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7808328e-20, col_values=(('external_ids', {'iface-id': 'db11772c-e758-43ff-997c-e8c835433e90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.301 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:48:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:45.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:45.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.793 227766 DEBUG nova.compute.manager [req-8594bac1-db74-44ef-b448-29254b65bcbf req-e3da56d5-9ede-4dce-8def-80f1c84b8617 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received event network-vif-plugged-b91e97fc-d648-4053-8fdc-b4fe912b3f69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.793 227766 DEBUG oslo_concurrency.lockutils [req-8594bac1-db74-44ef-b448-29254b65bcbf req-e3da56d5-9ede-4dce-8def-80f1c84b8617 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.794 227766 DEBUG oslo_concurrency.lockutils [req-8594bac1-db74-44ef-b448-29254b65bcbf req-e3da56d5-9ede-4dce-8def-80f1c84b8617 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.794 227766 DEBUG oslo_concurrency.lockutils [req-8594bac1-db74-44ef-b448-29254b65bcbf req-e3da56d5-9ede-4dce-8def-80f1c84b8617 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.794 227766 DEBUG nova.compute.manager [req-8594bac1-db74-44ef-b448-29254b65bcbf req-e3da56d5-9ede-4dce-8def-80f1c84b8617 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] No waiting events found dispatching network-vif-plugged-b91e97fc-d648-4053-8fdc-b4fe912b3f69 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.794 227766 WARNING nova.compute.manager [req-8594bac1-db74-44ef-b448-29254b65bcbf req-e3da56d5-9ede-4dce-8def-80f1c84b8617 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received unexpected event network-vif-plugged-b91e97fc-d648-4053-8fdc-b4fe912b3f69 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.795 227766 DEBUG nova.compute.manager [req-8594bac1-db74-44ef-b448-29254b65bcbf req-e3da56d5-9ede-4dce-8def-80f1c84b8617 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received event network-vif-unplugged-b91e97fc-d648-4053-8fdc-b4fe912b3f69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.795 227766 DEBUG oslo_concurrency.lockutils [req-8594bac1-db74-44ef-b448-29254b65bcbf req-e3da56d5-9ede-4dce-8def-80f1c84b8617 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.795 227766 DEBUG oslo_concurrency.lockutils [req-8594bac1-db74-44ef-b448-29254b65bcbf req-e3da56d5-9ede-4dce-8def-80f1c84b8617 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.795 227766 DEBUG oslo_concurrency.lockutils [req-8594bac1-db74-44ef-b448-29254b65bcbf req-e3da56d5-9ede-4dce-8def-80f1c84b8617 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.795 227766 DEBUG nova.compute.manager [req-8594bac1-db74-44ef-b448-29254b65bcbf req-e3da56d5-9ede-4dce-8def-80f1c84b8617 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] No waiting events found dispatching network-vif-unplugged-b91e97fc-d648-4053-8fdc-b4fe912b3f69 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.796 227766 WARNING nova.compute.manager [req-8594bac1-db74-44ef-b448-29254b65bcbf req-e3da56d5-9ede-4dce-8def-80f1c84b8617 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received unexpected event network-vif-unplugged-b91e97fc-d648-4053-8fdc-b4fe912b3f69 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.869 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.869 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.871 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:48:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:45.871 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:45 np0005593234 nova_compute[227762]: 2026-01-23 09:48:45.998 227766 DEBUG oslo_concurrency.lockutils [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.115 227766 DEBUG nova.compute.manager [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received event network-vif-deleted-b91e97fc-d648-4053-8fdc-b4fe912b3f69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.116 227766 INFO nova.compute.manager [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Neutron deleted interface b91e97fc-d648-4053-8fdc-b4fe912b3f69; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.116 227766 DEBUG nova.network.neutron [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Updating instance_info_cache with network_info: [{"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.155 227766 DEBUG nova.objects.instance [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lazy-loading 'system_metadata' on Instance uuid ba01649c-a6ef-4784-b3dd-49d03f96cef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.208 227766 DEBUG nova.objects.instance [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lazy-loading 'flavor' on Instance uuid ba01649c-a6ef-4784-b3dd-49d03f96cef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.230 227766 DEBUG nova.virt.libvirt.vif [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:47:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1172485769',display_name='tempest-AttachInterfacesTestJSON-server-1172485769',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1172485769',id=62,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDHG7P7LPqhPdhdP9QeVe34Yjpl7KzdYfgP2NH94mbs2nERHgn5Mq0EdsvgIZnluIsywiR5553mEm6eEs6CP9hgYZdCuMOYdAYJrW3IIj+b2hJ9DHiMcy6FZA0BTPNAN1g==',key_name='tempest-keypair-1309682370',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:47:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-8nyczwr2',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:47:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=ba01649c-a6ef-4784-b3dd-49d03f96cef4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.231 227766 DEBUG nova.network.os_vif_util [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Converting VIF {"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.231 227766 DEBUG nova.network.os_vif_util [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.234 227766 DEBUG nova.virt.libvirt.guest [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bb:87:13"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb91e97fc-d6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.237 227766 DEBUG nova.virt.libvirt.guest [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:bb:87:13"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb91e97fc-d6"/></interface>not found in domain: <domain type='kvm' id='26'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <name>instance-0000003e</name>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <uuid>ba01649c-a6ef-4784-b3dd-49d03f96cef4</uuid>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1172485769</nova:name>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:48:45</nova:creationTime>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:port uuid="cafea320-d23e-45bb-a6b2-46cfa7bd8741">
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:48:46 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <memory unit='KiB'>131072</memory>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <vcpu placement='static'>1</vcpu>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <resource>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <partition>/machine</partition>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </resource>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <sysinfo type='smbios'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <entry name='manufacturer'>RDO</entry>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <entry name='serial'>ba01649c-a6ef-4784-b3dd-49d03f96cef4</entry>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <entry name='uuid'>ba01649c-a6ef-4784-b3dd-49d03f96cef4</entry>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <entry name='family'>Virtual Machine</entry>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <boot dev='hd'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <smbios mode='sysinfo'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <vmcoreinfo state='on'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <model fallback='forbid'>Nehalem</model>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <feature policy='require' name='x2apic'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <feature policy='require' name='hypervisor'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <feature policy='require' name='vme'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <clock offset='utc'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <timer name='hpet' present='no'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <on_poweroff>destroy</on_poweroff>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <on_reboot>restart</on_reboot>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <on_crash>destroy</on_crash>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <disk type='network' device='disk'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk' index='2'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target dev='vda' bus='virtio'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='virtio-disk0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <disk type='network' device='cdrom'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk.config' index='1'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target dev='sda' bus='sata'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <readonly/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='sata0-0-0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pcie.0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='1' port='0x10'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.1'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='2' port='0x11'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.2'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='3' port='0x12'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.3'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='4' port='0x13'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.4'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='5' port='0x14'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.5'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='6' port='0x15'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.6'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='7' port='0x16'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.7'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='8' port='0x17'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.8'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='9' port='0x18'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.9'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='10' port='0x19'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.10'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='11' port='0x1a'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.11'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='12' port='0x1b'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.12'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='13' port='0x1c'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.13'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='14' port='0x1d'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.14'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='15' port='0x1e'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.15'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='16' port='0x1f'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.16'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='17' port='0x20'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.17'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='18' port='0x21'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.18'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='19' port='0x22'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.19'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='20' port='0x23'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.20'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='21' port='0x24'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.21'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='22' port='0x25'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.22'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='23' port='0x26'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.23'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='24' port='0x27'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.24'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='25' port='0x28'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.25'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-pci-bridge'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.26'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='usb'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='sata' index='0'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='ide'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:06:9b:f9'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target dev='tapcafea320-d2'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='net0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <serial type='pty'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/console.log' append='off'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target type='isa-serial' port='0'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <model name='isa-serial'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      </target>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/console.log' append='off'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target type='serial' port='0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </console>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <input type='tablet' bus='usb'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='input0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='usb' bus='0' port='1'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <input type='mouse' bus='ps2'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='input1'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <input type='keyboard' bus='ps2'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='input2'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <listen type='address' address='::0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <audio id='1' type='none'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='video0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <watchdog model='itco' action='reset'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='watchdog0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </watchdog>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <memballoon model='virtio'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <stats period='10'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='balloon0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <rng model='virtio'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <backend model='random'>/dev/urandom</backend>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='rng0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <label>system_u:system_r:svirt_t:s0:c45,c581</label>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c45,c581</imagelabel>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <label>+107:+107</label>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <imagelabel>+107:+107</imagelabel>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:48:46 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:48:46 np0005593234 nova_compute[227762]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.238 227766 DEBUG nova.virt.libvirt.guest [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:bb:87:13"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb91e97fc-d6"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.242 227766 DEBUG nova.virt.libvirt.guest [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:bb:87:13"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapb91e97fc-d6"/></interface> not found in domain: <domain type='kvm' id='26'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <name>instance-0000003e</name>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <uuid>ba01649c-a6ef-4784-b3dd-49d03f96cef4</uuid>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1172485769</nova:name>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:48:45</nova:creationTime>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:port uuid="cafea320-d23e-45bb-a6b2-46cfa7bd8741">
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:48:46 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <memory unit='KiB'>131072</memory>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <vcpu placement='static'>1</vcpu>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <resource>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <partition>/machine</partition>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </resource>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <sysinfo type='smbios'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <entry name='manufacturer'>RDO</entry>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <entry name='serial'>ba01649c-a6ef-4784-b3dd-49d03f96cef4</entry>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <entry name='uuid'>ba01649c-a6ef-4784-b3dd-49d03f96cef4</entry>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <entry name='family'>Virtual Machine</entry>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <boot dev='hd'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <smbios mode='sysinfo'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <vmcoreinfo state='on'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <model fallback='forbid'>Nehalem</model>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <feature policy='require' name='x2apic'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <feature policy='require' name='hypervisor'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <feature policy='require' name='vme'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <clock offset='utc'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <timer name='hpet' present='no'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <on_poweroff>destroy</on_poweroff>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <on_reboot>restart</on_reboot>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <on_crash>destroy</on_crash>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <disk type='network' device='disk'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk' index='2'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target dev='vda' bus='virtio'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='virtio-disk0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <disk type='network' device='cdrom'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/ba01649c-a6ef-4784-b3dd-49d03f96cef4_disk.config' index='1'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target dev='sda' bus='sata'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <readonly/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='sata0-0-0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pcie.0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='1' port='0x10'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.1'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='2' port='0x11'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.2'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='3' port='0x12'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.3'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='4' port='0x13'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.4'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='5' port='0x14'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.5'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='6' port='0x15'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.6'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='7' port='0x16'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.7'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='8' port='0x17'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.8'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='9' port='0x18'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.9'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='10' port='0x19'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.10'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='11' port='0x1a'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.11'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='12' port='0x1b'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.12'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='13' port='0x1c'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.13'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='14' port='0x1d'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.14'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='15' port='0x1e'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.15'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='16' port='0x1f'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.16'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='17' port='0x20'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.17'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='18' port='0x21'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.18'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='19' port='0x22'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.19'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='20' port='0x23'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.20'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='21' port='0x24'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.21'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='22' port='0x25'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.22'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='23' port='0x26'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.23'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='24' port='0x27'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.24'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target chassis='25' port='0x28'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.25'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model name='pcie-pci-bridge'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='pci.26'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='usb'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <controller type='sata' index='0'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='ide'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:06:9b:f9'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target dev='tapcafea320-d2'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='net0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <serial type='pty'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/console.log' append='off'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target type='isa-serial' port='0'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:        <model name='isa-serial'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      </target>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4/console.log' append='off'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <target type='serial' port='0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </console>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <input type='tablet' bus='usb'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='input0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='usb' bus='0' port='1'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <input type='mouse' bus='ps2'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='input1'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <input type='keyboard' bus='ps2'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='input2'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <listen type='address' address='::0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <audio id='1' type='none'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='video0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <watchdog model='itco' action='reset'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='watchdog0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </watchdog>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <memballoon model='virtio'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <stats period='10'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='balloon0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <rng model='virtio'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <backend model='random'>/dev/urandom</backend>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <alias name='rng0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <label>system_u:system_r:svirt_t:s0:c45,c581</label>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c45,c581</imagelabel>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <label>+107:+107</label>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <imagelabel>+107:+107</imagelabel>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:48:46 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:48:46 np0005593234 nova_compute[227762]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.245 227766 WARNING nova.virt.libvirt.driver [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Detaching interface fa:16:3e:bb:87:13 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapb91e97fc-d6' not found.#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.245 227766 DEBUG nova.virt.libvirt.vif [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:47:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1172485769',display_name='tempest-AttachInterfacesTestJSON-server-1172485769',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1172485769',id=62,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDHG7P7LPqhPdhdP9QeVe34Yjpl7KzdYfgP2NH94mbs2nERHgn5Mq0EdsvgIZnluIsywiR5553mEm6eEs6CP9hgYZdCuMOYdAYJrW3IIj+b2hJ9DHiMcy6FZA0BTPNAN1g==',key_name='tempest-keypair-1309682370',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:47:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-8nyczwr2',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:47:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=ba01649c-a6ef-4784-b3dd-49d03f96cef4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.246 227766 DEBUG nova.network.os_vif_util [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Converting VIF {"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.246 227766 DEBUG nova.network.os_vif_util [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.247 227766 DEBUG os_vif [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.248 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.248 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb91e97fc-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.248 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.250 227766 INFO os_vif [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6')#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.251 227766 DEBUG nova.virt.libvirt.guest [req-b903f92f-98e2-49ce-9f84-3c9b3d126fd8 req-3db26cac-4d3a-4406-bd37-e49e3ea1570f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-1172485769</nova:name>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:48:46</nova:creationTime>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    <nova:port uuid="cafea320-d23e-45bb-a6b2-46cfa7bd8741">
Jan 23 04:48:46 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:48:46 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:48:46 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:48:46 np0005593234 nova_compute[227762]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.597 227766 DEBUG nova.network.neutron [req-b1fe01e3-b75e-4945-a445-ca0bf223ab6d req-2d8adde0-e2fa-45be-9b11-9b68b1120744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Updated VIF entry in instance network info cache for port b91e97fc-d648-4053-8fdc-b4fe912b3f69. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.598 227766 DEBUG nova.network.neutron [req-b1fe01e3-b75e-4945-a445-ca0bf223ab6d req-2d8adde0-e2fa-45be-9b11-9b68b1120744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Updating instance_info_cache with network_info: [{"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.627 227766 DEBUG oslo_concurrency.lockutils [req-b1fe01e3-b75e-4945-a445-ca0bf223ab6d req-2d8adde0-e2fa-45be-9b11-9b68b1120744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.627 227766 DEBUG oslo_concurrency.lockutils [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquired lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:48:46 np0005593234 nova_compute[227762]: 2026-01-23 09:48:46.628 227766 DEBUG nova.network.neutron [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:48:46 np0005593234 podman[255964]: 2026-01-23 09:48:46.785237442 +0000 UTC m=+0.079232997 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 23 04:48:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:47.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:47.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:47 np0005593234 nova_compute[227762]: 2026-01-23 09:48:47.731 227766 DEBUG oslo_concurrency.lockutils [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:47 np0005593234 nova_compute[227762]: 2026-01-23 09:48:47.732 227766 DEBUG oslo_concurrency.lockutils [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:47 np0005593234 nova_compute[227762]: 2026-01-23 09:48:47.733 227766 DEBUG oslo_concurrency.lockutils [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:47 np0005593234 nova_compute[227762]: 2026-01-23 09:48:47.733 227766 DEBUG oslo_concurrency.lockutils [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:47 np0005593234 nova_compute[227762]: 2026-01-23 09:48:47.734 227766 DEBUG oslo_concurrency.lockutils [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:47 np0005593234 nova_compute[227762]: 2026-01-23 09:48:47.736 227766 INFO nova.compute.manager [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Terminating instance#033[00m
Jan 23 04:48:47 np0005593234 nova_compute[227762]: 2026-01-23 09:48:47.737 227766 DEBUG nova.compute.manager [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:48:47 np0005593234 kernel: tapcafea320-d2 (unregistering): left promiscuous mode
Jan 23 04:48:47 np0005593234 NetworkManager[48942]: <info>  [1769161727.7962] device (tapcafea320-d2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:48:47 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:47Z|00184|binding|INFO|Releasing lport cafea320-d23e-45bb-a6b2-46cfa7bd8741 from this chassis (sb_readonly=0)
Jan 23 04:48:47 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:47Z|00185|binding|INFO|Setting lport cafea320-d23e-45bb-a6b2-46cfa7bd8741 down in Southbound
Jan 23 04:48:47 np0005593234 nova_compute[227762]: 2026-01-23 09:48:47.862 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:47 np0005593234 ovn_controller[134547]: 2026-01-23T09:48:47Z|00186|binding|INFO|Removing iface tapcafea320-d2 ovn-installed in OVS
Jan 23 04:48:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:47.872 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:9b:f9 10.100.0.3'], port_security=['fa:16:3e:06:9b:f9 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ba01649c-a6ef-4784-b3dd-49d03f96cef4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '4', 'neutron:security_group_ids': '4347dab6-bc98-4fea-9c51-e02238fc830c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=cafea320-d23e-45bb-a6b2-46cfa7bd8741) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:48:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:47.873 144381 INFO neutron.agent.ovn.metadata.agent [-] Port cafea320-d23e-45bb-a6b2-46cfa7bd8741 in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 unbound from our chassis#033[00m
Jan 23 04:48:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:47.874 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:48:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:47.875 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[11fe8165-0357-4300-a163-1f7805a64dac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:47.875 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 namespace which is not needed anymore#033[00m
Jan 23 04:48:47 np0005593234 nova_compute[227762]: 2026-01-23 09:48:47.887 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:47 np0005593234 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003e.scope: Deactivated successfully.
Jan 23 04:48:47 np0005593234 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d0000003e.scope: Consumed 15.000s CPU time.
Jan 23 04:48:47 np0005593234 systemd-machined[195626]: Machine qemu-26-instance-0000003e terminated.
Jan 23 04:48:47 np0005593234 nova_compute[227762]: 2026-01-23 09:48:47.956 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:47 np0005593234 nova_compute[227762]: 2026-01-23 09:48:47.961 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:47 np0005593234 nova_compute[227762]: 2026-01-23 09:48:47.970 227766 INFO nova.virt.libvirt.driver [-] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Instance destroyed successfully.#033[00m
Jan 23 04:48:47 np0005593234 nova_compute[227762]: 2026-01-23 09:48:47.971 227766 DEBUG nova.objects.instance [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'resources' on Instance uuid ba01649c-a6ef-4784-b3dd-49d03f96cef4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:48:47 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[255407]: [NOTICE]   (255419) : haproxy version is 2.8.14-c23fe91
Jan 23 04:48:47 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[255407]: [NOTICE]   (255419) : path to executable is /usr/sbin/haproxy
Jan 23 04:48:47 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[255407]: [WARNING]  (255419) : Exiting Master process...
Jan 23 04:48:47 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[255407]: [ALERT]    (255419) : Current worker (255429) exited with code 143 (Terminated)
Jan 23 04:48:47 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[255407]: [WARNING]  (255419) : All workers exited. Exiting... (0)
Jan 23 04:48:47 np0005593234 systemd[1]: libpod-7baff80253ea45598ce3a4d67e111af1d3ecf58434fca032eb988c1569e74f09.scope: Deactivated successfully.
Jan 23 04:48:48 np0005593234 podman[256017]: 2026-01-23 09:48:48.002193842 +0000 UTC m=+0.042531906 container died 7baff80253ea45598ce3a4d67e111af1d3ecf58434fca032eb988c1569e74f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:48:48 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7baff80253ea45598ce3a4d67e111af1d3ecf58434fca032eb988c1569e74f09-userdata-shm.mount: Deactivated successfully.
Jan 23 04:48:48 np0005593234 systemd[1]: var-lib-containers-storage-overlay-b11dfd44ec2069e9c136ecc6944ac62cdc13805a958d923318bf4f2cb8dcb1bb-merged.mount: Deactivated successfully.
Jan 23 04:48:48 np0005593234 podman[256017]: 2026-01-23 09:48:48.04037249 +0000 UTC m=+0.080710564 container cleanup 7baff80253ea45598ce3a4d67e111af1d3ecf58434fca032eb988c1569e74f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:48:48 np0005593234 systemd[1]: libpod-conmon-7baff80253ea45598ce3a4d67e111af1d3ecf58434fca032eb988c1569e74f09.scope: Deactivated successfully.
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.062 227766 DEBUG nova.virt.libvirt.vif [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:47:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1172485769',display_name='tempest-AttachInterfacesTestJSON-server-1172485769',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1172485769',id=62,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDHG7P7LPqhPdhdP9QeVe34Yjpl7KzdYfgP2NH94mbs2nERHgn5Mq0EdsvgIZnluIsywiR5553mEm6eEs6CP9hgYZdCuMOYdAYJrW3IIj+b2hJ9DHiMcy6FZA0BTPNAN1g==',key_name='tempest-keypair-1309682370',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:47:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-8nyczwr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:47:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=ba01649c-a6ef-4784-b3dd-49d03f96cef4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.063 227766 DEBUG nova.network.os_vif_util [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.064 227766 DEBUG nova.network.os_vif_util [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:06:9b:f9,bridge_name='br-int',has_traffic_filtering=True,id=cafea320-d23e-45bb-a6b2-46cfa7bd8741,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcafea320-d2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.064 227766 DEBUG os_vif [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:9b:f9,bridge_name='br-int',has_traffic_filtering=True,id=cafea320-d23e-45bb-a6b2-46cfa7bd8741,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcafea320-d2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.066 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.066 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapcafea320-d2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.067 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.070 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.072 227766 INFO os_vif [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:9b:f9,bridge_name='br-int',has_traffic_filtering=True,id=cafea320-d23e-45bb-a6b2-46cfa7bd8741,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapcafea320-d2')#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.073 227766 DEBUG nova.virt.libvirt.vif [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:47:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-1172485769',display_name='tempest-AttachInterfacesTestJSON-server-1172485769',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-1172485769',id=62,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDHG7P7LPqhPdhdP9QeVe34Yjpl7KzdYfgP2NH94mbs2nERHgn5Mq0EdsvgIZnluIsywiR5553mEm6eEs6CP9hgYZdCuMOYdAYJrW3IIj+b2hJ9DHiMcy6FZA0BTPNAN1g==',key_name='tempest-keypair-1309682370',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:47:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-8nyczwr2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:47:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=ba01649c-a6ef-4784-b3dd-49d03f96cef4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.073 227766 DEBUG nova.network.os_vif_util [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.075 227766 DEBUG nova.network.os_vif_util [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.075 227766 DEBUG os_vif [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.076 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.077 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb91e97fc-d6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.077 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.079 227766 INFO os_vif [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bb:87:13,bridge_name='br-int',has_traffic_filtering=True,id=b91e97fc-d648-4053-8fdc-b4fe912b3f69,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb91e97fc-d6')#033[00m
Jan 23 04:48:48 np0005593234 podman[256055]: 2026-01-23 09:48:48.102099298 +0000 UTC m=+0.041594377 container remove 7baff80253ea45598ce3a4d67e111af1d3ecf58434fca032eb988c1569e74f09 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 23 04:48:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:48.107 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4ffca528-930c-41a2-ace3-1e29e8547e53]: (4, ('Fri Jan 23 09:48:47 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 (7baff80253ea45598ce3a4d67e111af1d3ecf58434fca032eb988c1569e74f09)\n7baff80253ea45598ce3a4d67e111af1d3ecf58434fca032eb988c1569e74f09\nFri Jan 23 09:48:48 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 (7baff80253ea45598ce3a4d67e111af1d3ecf58434fca032eb988c1569e74f09)\n7baff80253ea45598ce3a4d67e111af1d3ecf58434fca032eb988c1569e74f09\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:48.109 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1102c247-79de-484e-a3f6-60b6537d9a45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:48.110 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7808328e-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.112 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:48 np0005593234 kernel: tap7808328e-20: left promiscuous mode
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.124 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.125 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:48.126 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b8938f4b-1f6d-406a-85d9-d3e2d9823fa5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:48.155 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[59d86fce-2a84-4644-9d65-c15fbb0f3a20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:48.156 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c157f959-2341-400d-a696-1ffe304765a5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:48.172 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[14769682-859b-4a85-b4f8-7c2c70c85941]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 556935, 'reachable_time': 29687, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256088, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:48 np0005593234 systemd[1]: run-netns-ovnmeta\x2d7808328e\x2d22f9\x2d46df\x2dac06\x2df8c3d6ad10c4.mount: Deactivated successfully.
Jan 23 04:48:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:48.177 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:48:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:48:48.177 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[7a8da5f2-ccd5-45cd-ab55-f6a172396c2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.543 227766 INFO nova.virt.libvirt.driver [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Deleting instance files /var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4_del#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.544 227766 INFO nova.virt.libvirt.driver [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Deletion of /var/lib/nova/instances/ba01649c-a6ef-4784-b3dd-49d03f96cef4_del complete#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.605 227766 INFO nova.compute.manager [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Took 0.87 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.606 227766 DEBUG oslo.service.loopingcall [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.606 227766 DEBUG nova.compute.manager [-] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.606 227766 DEBUG nova.network.neutron [-] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.693 227766 DEBUG nova.compute.manager [req-a7ff4648-9334-431f-933f-e4530d4da231 req-54d2cce3-e5c5-431c-9ede-5fed937d8a17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received event network-vif-plugged-b91e97fc-d648-4053-8fdc-b4fe912b3f69 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.694 227766 DEBUG oslo_concurrency.lockutils [req-a7ff4648-9334-431f-933f-e4530d4da231 req-54d2cce3-e5c5-431c-9ede-5fed937d8a17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.694 227766 DEBUG oslo_concurrency.lockutils [req-a7ff4648-9334-431f-933f-e4530d4da231 req-54d2cce3-e5c5-431c-9ede-5fed937d8a17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.695 227766 DEBUG oslo_concurrency.lockutils [req-a7ff4648-9334-431f-933f-e4530d4da231 req-54d2cce3-e5c5-431c-9ede-5fed937d8a17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.695 227766 DEBUG nova.compute.manager [req-a7ff4648-9334-431f-933f-e4530d4da231 req-54d2cce3-e5c5-431c-9ede-5fed937d8a17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] No waiting events found dispatching network-vif-plugged-b91e97fc-d648-4053-8fdc-b4fe912b3f69 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:48:48 np0005593234 nova_compute[227762]: 2026-01-23 09:48:48.695 227766 WARNING nova.compute.manager [req-a7ff4648-9334-431f-933f-e4530d4da231 req-54d2cce3-e5c5-431c-9ede-5fed937d8a17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received unexpected event network-vif-plugged-b91e97fc-d648-4053-8fdc-b4fe912b3f69 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:48:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:49.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:49.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.221 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.229 227766 DEBUG nova.compute.manager [req-9791b1d9-5b1d-4f1b-abfa-81284a8992c7 req-a6c7a703-ffee-47d1-ac5a-9c1d4f982829 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received event network-vif-deleted-cafea320-d23e-45bb-a6b2-46cfa7bd8741 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.230 227766 INFO nova.compute.manager [req-9791b1d9-5b1d-4f1b-abfa-81284a8992c7 req-a6c7a703-ffee-47d1-ac5a-9c1d4f982829 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Neutron deleted interface cafea320-d23e-45bb-a6b2-46cfa7bd8741; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.230 227766 DEBUG nova.network.neutron [req-9791b1d9-5b1d-4f1b-abfa-81284a8992c7 req-a6c7a703-ffee-47d1-ac5a-9c1d4f982829 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Updating instance_info_cache with network_info: [{"id": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "address": "fa:16:3e:bb:87:13", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb91e97fc-d6", "ovs_interfaceid": "b91e97fc-d648-4053-8fdc-b4fe912b3f69", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.249 227766 DEBUG nova.network.neutron [-] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.255 227766 DEBUG nova.compute.manager [req-9791b1d9-5b1d-4f1b-abfa-81284a8992c7 req-a6c7a703-ffee-47d1-ac5a-9c1d4f982829 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Detach interface failed, port_id=cafea320-d23e-45bb-a6b2-46cfa7bd8741, reason: Instance ba01649c-a6ef-4784-b3dd-49d03f96cef4 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.273 227766 INFO nova.compute.manager [-] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Took 1.67 seconds to deallocate network for instance.#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.367 227766 DEBUG oslo_concurrency.lockutils [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.368 227766 DEBUG oslo_concurrency.lockutils [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.425 227766 DEBUG oslo_concurrency.processutils [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.627 227766 INFO nova.network.neutron [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Port b91e97fc-d648-4053-8fdc-b4fe912b3f69 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.628 227766 DEBUG nova.network.neutron [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Updating instance_info_cache with network_info: [{"id": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "address": "fa:16:3e:06:9b:f9", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapcafea320-d2", "ovs_interfaceid": "cafea320-d23e-45bb-a6b2-46cfa7bd8741", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.654 227766 DEBUG oslo_concurrency.lockutils [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Releasing lock "refresh_cache-ba01649c-a6ef-4784-b3dd-49d03f96cef4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.678 227766 DEBUG oslo_concurrency.lockutils [None req-cdc3b142-2e56-4c47-b2b3-e795adf239dd 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-ba01649c-a6ef-4784-b3dd-49d03f96cef4-b91e97fc-d648-4053-8fdc-b4fe912b3f69" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.764s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.790 227766 DEBUG nova.compute.manager [req-535b5aa6-3947-43a6-870c-00b04a34cbfb req-d38e9e14-1d64-401d-b89b-e5a8d7c53d87 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received event network-vif-unplugged-cafea320-d23e-45bb-a6b2-46cfa7bd8741 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.790 227766 DEBUG oslo_concurrency.lockutils [req-535b5aa6-3947-43a6-870c-00b04a34cbfb req-d38e9e14-1d64-401d-b89b-e5a8d7c53d87 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.791 227766 DEBUG oslo_concurrency.lockutils [req-535b5aa6-3947-43a6-870c-00b04a34cbfb req-d38e9e14-1d64-401d-b89b-e5a8d7c53d87 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.791 227766 DEBUG oslo_concurrency.lockutils [req-535b5aa6-3947-43a6-870c-00b04a34cbfb req-d38e9e14-1d64-401d-b89b-e5a8d7c53d87 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.791 227766 DEBUG nova.compute.manager [req-535b5aa6-3947-43a6-870c-00b04a34cbfb req-d38e9e14-1d64-401d-b89b-e5a8d7c53d87 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] No waiting events found dispatching network-vif-unplugged-cafea320-d23e-45bb-a6b2-46cfa7bd8741 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.791 227766 WARNING nova.compute.manager [req-535b5aa6-3947-43a6-870c-00b04a34cbfb req-d38e9e14-1d64-401d-b89b-e5a8d7c53d87 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received unexpected event network-vif-unplugged-cafea320-d23e-45bb-a6b2-46cfa7bd8741 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.791 227766 DEBUG nova.compute.manager [req-535b5aa6-3947-43a6-870c-00b04a34cbfb req-d38e9e14-1d64-401d-b89b-e5a8d7c53d87 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received event network-vif-plugged-cafea320-d23e-45bb-a6b2-46cfa7bd8741 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.792 227766 DEBUG oslo_concurrency.lockutils [req-535b5aa6-3947-43a6-870c-00b04a34cbfb req-d38e9e14-1d64-401d-b89b-e5a8d7c53d87 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.792 227766 DEBUG oslo_concurrency.lockutils [req-535b5aa6-3947-43a6-870c-00b04a34cbfb req-d38e9e14-1d64-401d-b89b-e5a8d7c53d87 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.792 227766 DEBUG oslo_concurrency.lockutils [req-535b5aa6-3947-43a6-870c-00b04a34cbfb req-d38e9e14-1d64-401d-b89b-e5a8d7c53d87 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.792 227766 DEBUG nova.compute.manager [req-535b5aa6-3947-43a6-870c-00b04a34cbfb req-d38e9e14-1d64-401d-b89b-e5a8d7c53d87 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] No waiting events found dispatching network-vif-plugged-cafea320-d23e-45bb-a6b2-46cfa7bd8741 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.793 227766 WARNING nova.compute.manager [req-535b5aa6-3947-43a6-870c-00b04a34cbfb req-d38e9e14-1d64-401d-b89b-e5a8d7c53d87 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Received unexpected event network-vif-plugged-cafea320-d23e-45bb-a6b2-46cfa7bd8741 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 04:48:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:48:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2372965574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.888 227766 DEBUG oslo_concurrency.processutils [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.894 227766 DEBUG nova.compute.provider_tree [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.910 227766 DEBUG nova.scheduler.client.report [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.937 227766 DEBUG oslo_concurrency.lockutils [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.569s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:50 np0005593234 nova_compute[227762]: 2026-01-23 09:48:50.974 227766 INFO nova.scheduler.client.report [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Deleted allocations for instance ba01649c-a6ef-4784-b3dd-49d03f96cef4#033[00m
Jan 23 04:48:51 np0005593234 nova_compute[227762]: 2026-01-23 09:48:51.089 227766 DEBUG oslo_concurrency.lockutils [None req-cc79e5c1-1233-47b8-affb-dfc2a4eb1ee5 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "ba01649c-a6ef-4784-b3dd-49d03f96cef4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:48:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:51.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:51.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:52 np0005593234 nova_compute[227762]: 2026-01-23 09:48:52.005 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:52 np0005593234 nova_compute[227762]: 2026-01-23 09:48:52.277 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:53 np0005593234 nova_compute[227762]: 2026-01-23 09:48:53.068 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:53.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:53.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:54 np0005593234 nova_compute[227762]: 2026-01-23 09:48:54.938 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:48:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:55.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:55.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:57.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:48:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:57.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:48:58 np0005593234 nova_compute[227762]: 2026-01-23 09:48:58.069 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:48:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:48:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2017555438' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:48:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:48:59.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:48:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:48:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:48:59.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:48:59 np0005593234 nova_compute[227762]: 2026-01-23 09:48:59.940 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:01.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:49:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:01.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:49:01 np0005593234 nova_compute[227762]: 2026-01-23 09:49:01.882 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:01 np0005593234 nova_compute[227762]: 2026-01-23 09:49:01.883 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:01 np0005593234 nova_compute[227762]: 2026-01-23 09:49:01.909 227766 DEBUG nova.compute.manager [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:49:01 np0005593234 nova_compute[227762]: 2026-01-23 09:49:01.998 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:01 np0005593234 nova_compute[227762]: 2026-01-23 09:49:01.998 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.005 227766 DEBUG nova.virt.hardware [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.005 227766 INFO nova.compute.claims [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.192 227766 DEBUG oslo_concurrency.processutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:49:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:49:02 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1142537263' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.637 227766 DEBUG oslo_concurrency.processutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.644 227766 DEBUG nova.compute.provider_tree [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.677 227766 DEBUG nova.scheduler.client.report [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.708 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.709 227766 DEBUG nova.compute.manager [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.769 227766 DEBUG nova.compute.manager [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.769 227766 DEBUG nova.network.neutron [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.795 227766 INFO nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.814 227766 DEBUG nova.compute.manager [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.940 227766 DEBUG nova.compute.manager [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.942 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.942 227766 INFO nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Creating image(s)#033[00m
Jan 23 04:49:02 np0005593234 nova_compute[227762]: 2026-01-23 09:49:02.973 227766 DEBUG nova.storage.rbd_utils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:49:03 np0005593234 nova_compute[227762]: 2026-01-23 09:49:03.003 227766 DEBUG nova.storage.rbd_utils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:49:03 np0005593234 nova_compute[227762]: 2026-01-23 09:49:03.026 227766 DEBUG nova.storage.rbd_utils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:49:03 np0005593234 nova_compute[227762]: 2026-01-23 09:49:03.029 227766 DEBUG oslo_concurrency.processutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:49:03 np0005593234 nova_compute[227762]: 2026-01-23 09:49:03.049 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161727.9676805, ba01649c-a6ef-4784-b3dd-49d03f96cef4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:49:03 np0005593234 nova_compute[227762]: 2026-01-23 09:49:03.050 227766 INFO nova.compute.manager [-] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:49:03 np0005593234 nova_compute[227762]: 2026-01-23 09:49:03.057 227766 DEBUG nova.policy [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77cda1e9a0404425a06c34637e696603', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '390d19f683334995a5268cf9b4d5e464', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:49:03 np0005593234 nova_compute[227762]: 2026-01-23 09:49:03.071 227766 DEBUG nova.compute.manager [None req-1cc472ab-f519-44fa-858b-92c715a92248 - - - - - -] [instance: ba01649c-a6ef-4784-b3dd-49d03f96cef4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:49:03 np0005593234 nova_compute[227762]: 2026-01-23 09:49:03.072 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:03 np0005593234 nova_compute[227762]: 2026-01-23 09:49:03.092 227766 DEBUG oslo_concurrency.processutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:49:03 np0005593234 nova_compute[227762]: 2026-01-23 09:49:03.092 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:03 np0005593234 nova_compute[227762]: 2026-01-23 09:49:03.093 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:03 np0005593234 nova_compute[227762]: 2026-01-23 09:49:03.093 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:03 np0005593234 nova_compute[227762]: 2026-01-23 09:49:03.120 227766 DEBUG nova.storage.rbd_utils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:49:03 np0005593234 nova_compute[227762]: 2026-01-23 09:49:03.124 227766 DEBUG oslo_concurrency.processutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:49:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:03.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:03.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:03 np0005593234 podman[256283]: 2026-01-23 09:49:03.761946287 +0000 UTC m=+0.052426396 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 04:49:04 np0005593234 nova_compute[227762]: 2026-01-23 09:49:04.111 227766 DEBUG nova.network.neutron [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Successfully created port: d4963f79-ec1b-4e35-b34d-22edfeb2fd2f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:49:04 np0005593234 nova_compute[227762]: 2026-01-23 09:49:04.175 227766 DEBUG oslo_concurrency.processutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:49:04 np0005593234 nova_compute[227762]: 2026-01-23 09:49:04.245 227766 DEBUG nova.storage.rbd_utils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] resizing rbd image a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:49:04 np0005593234 nova_compute[227762]: 2026-01-23 09:49:04.360 227766 DEBUG nova.objects.instance [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'migration_context' on Instance uuid a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:49:04 np0005593234 nova_compute[227762]: 2026-01-23 09:49:04.375 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:49:04 np0005593234 nova_compute[227762]: 2026-01-23 09:49:04.376 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Ensure instance console log exists: /var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:49:04 np0005593234 nova_compute[227762]: 2026-01-23 09:49:04.376 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:04 np0005593234 nova_compute[227762]: 2026-01-23 09:49:04.377 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:04 np0005593234 nova_compute[227762]: 2026-01-23 09:49:04.377 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:04 np0005593234 nova_compute[227762]: 2026-01-23 09:49:04.994 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:05.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:05.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:05 np0005593234 nova_compute[227762]: 2026-01-23 09:49:05.586 227766 DEBUG nova.network.neutron [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Successfully updated port: d4963f79-ec1b-4e35-b34d-22edfeb2fd2f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:49:05 np0005593234 nova_compute[227762]: 2026-01-23 09:49:05.623 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:49:05 np0005593234 nova_compute[227762]: 2026-01-23 09:49:05.623 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquired lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:49:05 np0005593234 nova_compute[227762]: 2026-01-23 09:49:05.623 227766 DEBUG nova.network.neutron [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:49:05 np0005593234 nova_compute[227762]: 2026-01-23 09:49:05.868 227766 DEBUG nova.network.neutron [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.142 227766 DEBUG nova.compute.manager [req-15eb2d9c-6bb2-4799-9f0f-75fbbd54ce1c req-2c53ec6f-8ac4-4db6-8458-1868a61262df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-changed-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.143 227766 DEBUG nova.compute.manager [req-15eb2d9c-6bb2-4799-9f0f-75fbbd54ce1c req-2c53ec6f-8ac4-4db6-8458-1868a61262df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Refreshing instance network info cache due to event network-changed-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.143 227766 DEBUG oslo_concurrency.lockutils [req-15eb2d9c-6bb2-4799-9f0f-75fbbd54ce1c req-2c53ec6f-8ac4-4db6-8458-1868a61262df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:49:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:07.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.455 227766 DEBUG nova.network.neutron [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.479 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Releasing lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.480 227766 DEBUG nova.compute.manager [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Instance network_info: |[{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.480 227766 DEBUG oslo_concurrency.lockutils [req-15eb2d9c-6bb2-4799-9f0f-75fbbd54ce1c req-2c53ec6f-8ac4-4db6-8458-1868a61262df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.480 227766 DEBUG nova.network.neutron [req-15eb2d9c-6bb2-4799-9f0f-75fbbd54ce1c req-2c53ec6f-8ac4-4db6-8458-1868a61262df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Refreshing network info cache for port d4963f79-ec1b-4e35-b34d-22edfeb2fd2f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.483 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Start _get_guest_xml network_info=[{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.487 227766 WARNING nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.491 227766 DEBUG nova.virt.libvirt.host [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.492 227766 DEBUG nova.virt.libvirt.host [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.495 227766 DEBUG nova.virt.libvirt.host [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.495 227766 DEBUG nova.virt.libvirt.host [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.496 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.497 227766 DEBUG nova.virt.hardware [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.497 227766 DEBUG nova.virt.hardware [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.497 227766 DEBUG nova.virt.hardware [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.498 227766 DEBUG nova.virt.hardware [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.498 227766 DEBUG nova.virt.hardware [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.498 227766 DEBUG nova.virt.hardware [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.498 227766 DEBUG nova.virt.hardware [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.498 227766 DEBUG nova.virt.hardware [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:49:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:07.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.499 227766 DEBUG nova.virt.hardware [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.499 227766 DEBUG nova.virt.hardware [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.499 227766 DEBUG nova.virt.hardware [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.502 227766 DEBUG oslo_concurrency.processutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:49:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:49:07 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3371917627' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.946 227766 DEBUG oslo_concurrency.processutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.973 227766 DEBUG nova.storage.rbd_utils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:49:07 np0005593234 nova_compute[227762]: 2026-01-23 09:49:07.978 227766 DEBUG oslo_concurrency.processutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.075 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:49:08 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2407959735' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.535 227766 DEBUG oslo_concurrency.processutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.557s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.537 227766 DEBUG nova.virt.libvirt.vif [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:49:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.537 227766 DEBUG nova.network.os_vif_util [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.538 227766 DEBUG nova.network.os_vif_util [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:8d,bridge_name='br-int',has_traffic_filtering=True,id=d4963f79-ec1b-4e35-b34d-22edfeb2fd2f,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4963f79-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.539 227766 DEBUG nova.objects.instance [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'pci_devices' on Instance uuid a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.651 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  <uuid>a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</uuid>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  <name>instance-00000042</name>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <nova:name>tempest-AttachInterfacesTestJSON-server-195762306</nova:name>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:49:07</nova:creationTime>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <nova:port uuid="d4963f79-ec1b-4e35-b34d-22edfeb2fd2f">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <entry name="serial">a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</entry>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <entry name="uuid">a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</entry>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk.config">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:fc:6f:8d"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <target dev="tapd4963f79-ec"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/console.log" append="off"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:49:08 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:49:08 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:49:08 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:49:08 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.652 227766 DEBUG nova.compute.manager [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Preparing to wait for external event network-vif-plugged-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.652 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.652 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.652 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.653 227766 DEBUG nova.virt.libvirt.vif [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:49:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.654 227766 DEBUG nova.network.os_vif_util [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.654 227766 DEBUG nova.network.os_vif_util [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:8d,bridge_name='br-int',has_traffic_filtering=True,id=d4963f79-ec1b-4e35-b34d-22edfeb2fd2f,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4963f79-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.655 227766 DEBUG os_vif [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:8d,bridge_name='br-int',has_traffic_filtering=True,id=d4963f79-ec1b-4e35-b34d-22edfeb2fd2f,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4963f79-ec') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.655 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.656 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.656 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.658 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.658 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4963f79-ec, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.659 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd4963f79-ec, col_values=(('external_ids', {'iface-id': 'd4963f79-ec1b-4e35-b34d-22edfeb2fd2f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fc:6f:8d', 'vm-uuid': 'a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.660 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:08 np0005593234 NetworkManager[48942]: <info>  [1769161748.6611] manager: (tapd4963f79-ec): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/102)
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.662 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.666 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.668 227766 INFO os_vif [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:6f:8d,bridge_name='br-int',has_traffic_filtering=True,id=d4963f79-ec1b-4e35-b34d-22edfeb2fd2f,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4963f79-ec')#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.780 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.781 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.781 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:fc:6f:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.781 227766 INFO nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Using config drive#033[00m
Jan 23 04:49:08 np0005593234 nova_compute[227762]: 2026-01-23 09:49:08.806 227766 DEBUG nova.storage.rbd_utils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:49:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:09.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:09.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:09 np0005593234 nova_compute[227762]: 2026-01-23 09:49:09.565 227766 DEBUG nova.network.neutron [req-15eb2d9c-6bb2-4799-9f0f-75fbbd54ce1c req-2c53ec6f-8ac4-4db6-8458-1868a61262df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updated VIF entry in instance network info cache for port d4963f79-ec1b-4e35-b34d-22edfeb2fd2f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:49:09 np0005593234 nova_compute[227762]: 2026-01-23 09:49:09.566 227766 DEBUG nova.network.neutron [req-15eb2d9c-6bb2-4799-9f0f-75fbbd54ce1c req-2c53ec6f-8ac4-4db6-8458-1868a61262df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:49:09 np0005593234 nova_compute[227762]: 2026-01-23 09:49:09.593 227766 DEBUG oslo_concurrency.lockutils [req-15eb2d9c-6bb2-4799-9f0f-75fbbd54ce1c req-2c53ec6f-8ac4-4db6-8458-1868a61262df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:49:09 np0005593234 nova_compute[227762]: 2026-01-23 09:49:09.775 227766 INFO nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Creating config drive at /var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/disk.config#033[00m
Jan 23 04:49:09 np0005593234 nova_compute[227762]: 2026-01-23 09:49:09.782 227766 DEBUG oslo_concurrency.processutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd3vbg6jc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:49:09 np0005593234 nova_compute[227762]: 2026-01-23 09:49:09.916 227766 DEBUG oslo_concurrency.processutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd3vbg6jc" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:49:09 np0005593234 nova_compute[227762]: 2026-01-23 09:49:09.948 227766 DEBUG nova.storage.rbd_utils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] rbd image a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:49:09 np0005593234 nova_compute[227762]: 2026-01-23 09:49:09.952 227766 DEBUG oslo_concurrency.processutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/disk.config a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:49:09 np0005593234 nova_compute[227762]: 2026-01-23 09:49:09.997 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.096 227766 DEBUG oslo_concurrency.processutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/disk.config a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.096 227766 INFO nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Deleting local config drive /var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/disk.config because it was imported into RBD.#033[00m
Jan 23 04:49:10 np0005593234 kernel: tapd4963f79-ec: entered promiscuous mode
Jan 23 04:49:10 np0005593234 NetworkManager[48942]: <info>  [1769161750.1436] manager: (tapd4963f79-ec): new Tun device (/org/freedesktop/NetworkManager/Devices/103)
Jan 23 04:49:10 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:10Z|00187|binding|INFO|Claiming lport d4963f79-ec1b-4e35-b34d-22edfeb2fd2f for this chassis.
Jan 23 04:49:10 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:10Z|00188|binding|INFO|d4963f79-ec1b-4e35-b34d-22edfeb2fd2f: Claiming fa:16:3e:fc:6f:8d 10.100.0.11
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.145 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.152 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.161 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:6f:8d 10.100.0.11'], port_security=['fa:16:3e:fc:6f:8d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cd689bfc-53a9-43da-a4d7-90eb165eac13', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=d4963f79-ec1b-4e35-b34d-22edfeb2fd2f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.164 144381 INFO neutron.agent.ovn.metadata.agent [-] Port d4963f79-ec1b-4e35-b34d-22edfeb2fd2f in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 bound to our chassis#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.166 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4#033[00m
Jan 23 04:49:10 np0005593234 systemd-udevd[256518]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.179 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5af59cb9-20d1-4a3a-b278-2baf67591124]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.181 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7808328e-21 in ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.182 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7808328e-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.183 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7486ea3e-b93d-4d5a-aca3-701f43658895]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 systemd-machined[195626]: New machine qemu-27-instance-00000042.
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.184 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e21eec2c-233d-41f5-99cb-9d30deaf131f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 NetworkManager[48942]: <info>  [1769161750.1919] device (tapd4963f79-ec): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:49:10 np0005593234 NetworkManager[48942]: <info>  [1769161750.1924] device (tapd4963f79-ec): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.197 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[da3e2f8b-7d75-4386-a227-566fde349197]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 systemd[1]: Started Virtual Machine qemu-27-instance-00000042.
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.212 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0dc1e50c-8d19-4f89-9e47-4fe5b0b8f609]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.217 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:10 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:10Z|00189|binding|INFO|Setting lport d4963f79-ec1b-4e35-b34d-22edfeb2fd2f ovn-installed in OVS
Jan 23 04:49:10 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:10Z|00190|binding|INFO|Setting lport d4963f79-ec1b-4e35-b34d-22edfeb2fd2f up in Southbound
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.224 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.238 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[487c7dd3-ab14-4955-a82e-1ad0757bedae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 systemd-udevd[256522]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:49:10 np0005593234 NetworkManager[48942]: <info>  [1769161750.2443] manager: (tap7808328e-20): new Veth device (/org/freedesktop/NetworkManager/Devices/104)
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.244 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[25269edb-61f7-467c-ad57-9f7fd347465f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.272 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[bc85965e-693d-4c01-8ac0-043dee0abdae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.276 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a8acb3ec-7178-4c45-ae0f-d2f44e9de59f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 NetworkManager[48942]: <info>  [1769161750.2981] device (tap7808328e-20): carrier: link connected
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.303 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f4736873-289f-4574-88b9-df4300cee10b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.321 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[059e6587-ec39-4647-9f1a-499663f0b852]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7808328e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:22:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564525, 'reachable_time': 36528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 256551, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.339 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[74967017-78a1-4206-ad71-362d2711abb0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febb:22ae'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564525, 'tstamp': 564525}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 256552, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.356 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b4eedbf9-c2b1-4733-8569-d9483ccb2614]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7808328e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:22:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564525, 'reachable_time': 36528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 256553, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.390 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cadfa6f9-2987-4440-a986-a16952a7dd86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.441 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a2362d1b-e44b-4151-bb77-985f43b6245a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.443 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7808328e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.443 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.443 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7808328e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.445 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:10 np0005593234 kernel: tap7808328e-20: entered promiscuous mode
Jan 23 04:49:10 np0005593234 NetworkManager[48942]: <info>  [1769161750.4457] manager: (tap7808328e-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.447 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.448 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7808328e-20, col_values=(('external_ids', {'iface-id': 'db11772c-e758-43ff-997c-e8c835433e90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.449 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:10 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:10Z|00191|binding|INFO|Releasing lport db11772c-e758-43ff-997c-e8c835433e90 from this chassis (sb_readonly=0)
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.462 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.464 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7808328e-22f9-46df-ac06-f8c3d6ad10c4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7808328e-22f9-46df-ac06-f8c3d6ad10c4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.465 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a6d44ebf-d3e5-4ee4-90cd-d4242fb0063a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.466 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-7808328e-22f9-46df-ac06-f8c3d6ad10c4
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/7808328e-22f9-46df-ac06-f8c3d6ad10c4.pid.haproxy
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 7808328e-22f9-46df-ac06-f8c3d6ad10c4
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:49:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:10.467 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'env', 'PROCESS_TAG=haproxy-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7808328e-22f9-46df-ac06-f8c3d6ad10c4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.887 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161750.886729, a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.888 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] VM Started (Lifecycle Event)#033[00m
Jan 23 04:49:10 np0005593234 podman[256621]: 2026-01-23 09:49:10.813091222 +0000 UTC m=+0.023574972 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.912 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.917 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161750.8868656, a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.917 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.941 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.944 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:49:10 np0005593234 nova_compute[227762]: 2026-01-23 09:49:10.969 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:49:11 np0005593234 podman[256621]: 2026-01-23 09:49:11.039776497 +0000 UTC m=+0.250260227 container create c1d0a2c0d16e3f97d0faa53df03391cd813f95e54f064a4838af00332af89e71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 04:49:11 np0005593234 systemd[1]: Started libpod-conmon-c1d0a2c0d16e3f97d0faa53df03391cd813f95e54f064a4838af00332af89e71.scope.
Jan 23 04:49:11 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:49:11 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b80bad23c72592126a26e53b18dd5c550f825c5ce0005262243f369de9b0637e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:49:11 np0005593234 podman[256621]: 2026-01-23 09:49:11.11634249 +0000 UTC m=+0.326826240 container init c1d0a2c0d16e3f97d0faa53df03391cd813f95e54f064a4838af00332af89e71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 04:49:11 np0005593234 podman[256621]: 2026-01-23 09:49:11.122838094 +0000 UTC m=+0.333321824 container start c1d0a2c0d16e3f97d0faa53df03391cd813f95e54f064a4838af00332af89e71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:49:11 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[256642]: [NOTICE]   (256646) : New worker (256648) forked
Jan 23 04:49:11 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[256642]: [NOTICE]   (256646) : Loading success.
Jan 23 04:49:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:11.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:11.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.705 227766 DEBUG nova.compute.manager [req-7ebed210-f4e2-40c7-bd50-78d147fd7f3f req-64e93c68-f64d-418e-b0e7-7fc48aa12edc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-plugged-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.706 227766 DEBUG oslo_concurrency.lockutils [req-7ebed210-f4e2-40c7-bd50-78d147fd7f3f req-64e93c68-f64d-418e-b0e7-7fc48aa12edc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.706 227766 DEBUG oslo_concurrency.lockutils [req-7ebed210-f4e2-40c7-bd50-78d147fd7f3f req-64e93c68-f64d-418e-b0e7-7fc48aa12edc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.706 227766 DEBUG oslo_concurrency.lockutils [req-7ebed210-f4e2-40c7-bd50-78d147fd7f3f req-64e93c68-f64d-418e-b0e7-7fc48aa12edc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.706 227766 DEBUG nova.compute.manager [req-7ebed210-f4e2-40c7-bd50-78d147fd7f3f req-64e93c68-f64d-418e-b0e7-7fc48aa12edc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Processing event network-vif-plugged-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.707 227766 DEBUG nova.compute.manager [req-7ebed210-f4e2-40c7-bd50-78d147fd7f3f req-64e93c68-f64d-418e-b0e7-7fc48aa12edc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-plugged-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.707 227766 DEBUG oslo_concurrency.lockutils [req-7ebed210-f4e2-40c7-bd50-78d147fd7f3f req-64e93c68-f64d-418e-b0e7-7fc48aa12edc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.707 227766 DEBUG oslo_concurrency.lockutils [req-7ebed210-f4e2-40c7-bd50-78d147fd7f3f req-64e93c68-f64d-418e-b0e7-7fc48aa12edc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.707 227766 DEBUG oslo_concurrency.lockutils [req-7ebed210-f4e2-40c7-bd50-78d147fd7f3f req-64e93c68-f64d-418e-b0e7-7fc48aa12edc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.707 227766 DEBUG nova.compute.manager [req-7ebed210-f4e2-40c7-bd50-78d147fd7f3f req-64e93c68-f64d-418e-b0e7-7fc48aa12edc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] No waiting events found dispatching network-vif-plugged-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.708 227766 WARNING nova.compute.manager [req-7ebed210-f4e2-40c7-bd50-78d147fd7f3f req-64e93c68-f64d-418e-b0e7-7fc48aa12edc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received unexpected event network-vif-plugged-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.708 227766 DEBUG nova.compute.manager [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.712 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161752.7122393, a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.712 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.714 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.717 227766 INFO nova.virt.libvirt.driver [-] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Instance spawned successfully.#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.718 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.740 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.746 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.750 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.751 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.751 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.752 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.752 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.752 227766 DEBUG nova.virt.libvirt.driver [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.789 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.841 227766 INFO nova.compute.manager [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Took 9.90 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.842 227766 DEBUG nova.compute.manager [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.923 227766 INFO nova.compute.manager [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Took 10.95 seconds to build instance.#033[00m
Jan 23 04:49:12 np0005593234 nova_compute[227762]: 2026-01-23 09:49:12.946 227766 DEBUG oslo_concurrency.lockutils [None req-b1b5da06-198b-4154-8f52-eebe632f15b8 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.063s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:13.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:13.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:13 np0005593234 nova_compute[227762]: 2026-01-23 09:49:13.702 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:15 np0005593234 nova_compute[227762]: 2026-01-23 09:49:15.039 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:15.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:15.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:15 np0005593234 nova_compute[227762]: 2026-01-23 09:49:15.811 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:16 np0005593234 NetworkManager[48942]: <info>  [1769161756.2081] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/106)
Jan 23 04:49:16 np0005593234 nova_compute[227762]: 2026-01-23 09:49:16.207 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:16 np0005593234 NetworkManager[48942]: <info>  [1769161756.2092] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Jan 23 04:49:16 np0005593234 nova_compute[227762]: 2026-01-23 09:49:16.395 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:16 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:16Z|00192|binding|INFO|Releasing lport db11772c-e758-43ff-997c-e8c835433e90 from this chassis (sb_readonly=0)
Jan 23 04:49:16 np0005593234 nova_compute[227762]: 2026-01-23 09:49:16.411 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:17.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:17 np0005593234 nova_compute[227762]: 2026-01-23 09:49:17.508 227766 DEBUG nova.compute.manager [req-00fa8188-e250-41c1-8eb0-3aaa932d3e91 req-80b2672b-3bfa-46af-9d5e-bdb2e0473219 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-changed-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:49:17 np0005593234 nova_compute[227762]: 2026-01-23 09:49:17.509 227766 DEBUG nova.compute.manager [req-00fa8188-e250-41c1-8eb0-3aaa932d3e91 req-80b2672b-3bfa-46af-9d5e-bdb2e0473219 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Refreshing instance network info cache due to event network-changed-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:49:17 np0005593234 nova_compute[227762]: 2026-01-23 09:49:17.509 227766 DEBUG oslo_concurrency.lockutils [req-00fa8188-e250-41c1-8eb0-3aaa932d3e91 req-80b2672b-3bfa-46af-9d5e-bdb2e0473219 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:49:17 np0005593234 nova_compute[227762]: 2026-01-23 09:49:17.509 227766 DEBUG oslo_concurrency.lockutils [req-00fa8188-e250-41c1-8eb0-3aaa932d3e91 req-80b2672b-3bfa-46af-9d5e-bdb2e0473219 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:49:17 np0005593234 nova_compute[227762]: 2026-01-23 09:49:17.510 227766 DEBUG nova.network.neutron [req-00fa8188-e250-41c1-8eb0-3aaa932d3e91 req-80b2672b-3bfa-46af-9d5e-bdb2e0473219 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Refreshing network info cache for port d4963f79-ec1b-4e35-b34d-22edfeb2fd2f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:49:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:17.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:17 np0005593234 podman[256661]: 2026-01-23 09:49:17.802991722 +0000 UTC m=+0.091795612 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:49:18 np0005593234 nova_compute[227762]: 2026-01-23 09:49:18.705 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:18 np0005593234 nova_compute[227762]: 2026-01-23 09:49:18.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:18 np0005593234 nova_compute[227762]: 2026-01-23 09:49:18.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:49:18 np0005593234 nova_compute[227762]: 2026-01-23 09:49:18.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:18 np0005593234 nova_compute[227762]: 2026-01-23 09:49:18.781 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:18 np0005593234 nova_compute[227762]: 2026-01-23 09:49:18.781 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:18 np0005593234 nova_compute[227762]: 2026-01-23 09:49:18.782 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:18 np0005593234 nova_compute[227762]: 2026-01-23 09:49:18.782 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:49:18 np0005593234 nova_compute[227762]: 2026-01-23 09:49:18.782 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:49:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:49:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/36154979' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.203 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.335 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000042 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.336 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000042 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:49:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:19.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.485 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.486 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4510MB free_disk=20.92261505126953GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.486 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.487 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:19.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.641 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.641 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.642 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.688 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.711 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.712 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.738 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.792 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:49:19 np0005593234 nova_compute[227762]: 2026-01-23 09:49:19.865 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:49:20 np0005593234 nova_compute[227762]: 2026-01-23 09:49:20.041 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:20 np0005593234 nova_compute[227762]: 2026-01-23 09:49:20.129 227766 DEBUG nova.network.neutron [req-00fa8188-e250-41c1-8eb0-3aaa932d3e91 req-80b2672b-3bfa-46af-9d5e-bdb2e0473219 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updated VIF entry in instance network info cache for port d4963f79-ec1b-4e35-b34d-22edfeb2fd2f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:49:20 np0005593234 nova_compute[227762]: 2026-01-23 09:49:20.130 227766 DEBUG nova.network.neutron [req-00fa8188-e250-41c1-8eb0-3aaa932d3e91 req-80b2672b-3bfa-46af-9d5e-bdb2e0473219 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:49:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:49:20 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/542422848' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:49:20 np0005593234 nova_compute[227762]: 2026-01-23 09:49:20.304 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:49:20 np0005593234 nova_compute[227762]: 2026-01-23 09:49:20.313 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:49:20 np0005593234 nova_compute[227762]: 2026-01-23 09:49:20.701 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:49:20 np0005593234 nova_compute[227762]: 2026-01-23 09:49:20.726 227766 DEBUG oslo_concurrency.lockutils [req-00fa8188-e250-41c1-8eb0-3aaa932d3e91 req-80b2672b-3bfa-46af-9d5e-bdb2e0473219 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:49:20 np0005593234 nova_compute[227762]: 2026-01-23 09:49:20.787 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:49:20 np0005593234 nova_compute[227762]: 2026-01-23 09:49:20.788 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.301s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:49:20 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1173367794' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:49:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:49:20 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1173367794' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:49:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:21.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:21.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:22 np0005593234 nova_compute[227762]: 2026-01-23 09:49:22.885 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:23.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:23.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:23 np0005593234 nova_compute[227762]: 2026-01-23 09:49:23.707 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:23 np0005593234 nova_compute[227762]: 2026-01-23 09:49:23.783 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:23 np0005593234 nova_compute[227762]: 2026-01-23 09:49:23.857 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:23 np0005593234 nova_compute[227762]: 2026-01-23 09:49:23.858 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:23 np0005593234 nova_compute[227762]: 2026-01-23 09:49:23.858 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:24 np0005593234 nova_compute[227762]: 2026-01-23 09:49:24.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:24 np0005593234 nova_compute[227762]: 2026-01-23 09:49:24.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:49:24 np0005593234 nova_compute[227762]: 2026-01-23 09:49:24.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:49:25 np0005593234 nova_compute[227762]: 2026-01-23 09:49:25.042 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:25 np0005593234 nova_compute[227762]: 2026-01-23 09:49:25.148 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:49:25 np0005593234 nova_compute[227762]: 2026-01-23 09:49:25.149 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:49:25 np0005593234 nova_compute[227762]: 2026-01-23 09:49:25.149 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:49:25 np0005593234 nova_compute[227762]: 2026-01-23 09:49:25.150 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:49:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:25.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:25 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:25Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:fc:6f:8d 10.100.0.11
Jan 23 04:49:25 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:25Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:fc:6f:8d 10.100.0.11
Jan 23 04:49:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:25.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:27.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:27.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:28 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:28Z|00193|binding|INFO|Releasing lport db11772c-e758-43ff-997c-e8c835433e90 from this chassis (sb_readonly=0)
Jan 23 04:49:28 np0005593234 nova_compute[227762]: 2026-01-23 09:49:28.186 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:28 np0005593234 nova_compute[227762]: 2026-01-23 09:49:28.709 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:29.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:29.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:29 np0005593234 nova_compute[227762]: 2026-01-23 09:49:29.593 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:49:29 np0005593234 nova_compute[227762]: 2026-01-23 09:49:29.612 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:49:29 np0005593234 nova_compute[227762]: 2026-01-23 09:49:29.612 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:49:29 np0005593234 nova_compute[227762]: 2026-01-23 09:49:29.612 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:30 np0005593234 nova_compute[227762]: 2026-01-23 09:49:30.043 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:31.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:31.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:31 np0005593234 nova_compute[227762]: 2026-01-23 09:49:31.605 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:49:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:49:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:49:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:49:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:33.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:33.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:33 np0005593234 nova_compute[227762]: 2026-01-23 09:49:33.712 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:34 np0005593234 podman[256925]: 2026-01-23 09:49:34.771335245 +0000 UTC m=+0.058927011 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 04:49:35 np0005593234 nova_compute[227762]: 2026-01-23 09:49:35.046 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:35.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:35.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:37.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:37.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:37 np0005593234 nova_compute[227762]: 2026-01-23 09:49:37.898 227766 DEBUG oslo_concurrency.lockutils [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "interface-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:37 np0005593234 nova_compute[227762]: 2026-01-23 09:49:37.898 227766 DEBUG oslo_concurrency.lockutils [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:37 np0005593234 nova_compute[227762]: 2026-01-23 09:49:37.899 227766 DEBUG nova.objects.instance [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'flavor' on Instance uuid a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:49:37 np0005593234 nova_compute[227762]: 2026-01-23 09:49:37.944 227766 DEBUG nova.objects.instance [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'pci_requests' on Instance uuid a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:49:38 np0005593234 nova_compute[227762]: 2026-01-23 09:49:38.192 227766 DEBUG nova.network.neutron [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:49:38 np0005593234 nova_compute[227762]: 2026-01-23 09:49:38.714 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.375767) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161779375838, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 2521, "num_deletes": 263, "total_data_size": 5698567, "memory_usage": 5779488, "flush_reason": "Manual Compaction"}
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Jan 23 04:49:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:39.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161779499975, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 3733236, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 38794, "largest_seqno": 41310, "table_properties": {"data_size": 3722897, "index_size": 6641, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 21975, "raw_average_key_size": 21, "raw_value_size": 3702066, "raw_average_value_size": 3549, "num_data_blocks": 286, "num_entries": 1043, "num_filter_entries": 1043, "num_deletions": 263, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161593, "oldest_key_time": 1769161593, "file_creation_time": 1769161779, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 124444 microseconds, and 7872 cpu microseconds.
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.500214) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 3733236 bytes OK
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.500269) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.502608) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.502621) EVENT_LOG_v1 {"time_micros": 1769161779502616, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.502637) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 5687257, prev total WAL file size 5707856, number of live WAL files 2.
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.504138) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(3645KB)], [75(9771KB)]
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161779504224, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 13739109, "oldest_snapshot_seqno": -1}
Jan 23 04:49:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:39.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:39 np0005593234 nova_compute[227762]: 2026-01-23 09:49:39.550 227766 DEBUG nova.policy [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77cda1e9a0404425a06c34637e696603', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '390d19f683334995a5268cf9b4d5e464', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6573 keys, 11839725 bytes, temperature: kUnknown
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161779683377, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 11839725, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11793136, "index_size": 29069, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16453, "raw_key_size": 168274, "raw_average_key_size": 25, "raw_value_size": 11672559, "raw_average_value_size": 1775, "num_data_blocks": 1165, "num_entries": 6573, "num_filter_entries": 6573, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769161779, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.683629) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 11839725 bytes
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.797117) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 76.7 rd, 66.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 9.5 +0.0 blob) out(11.3 +0.0 blob), read-write-amplify(6.9) write-amplify(3.2) OK, records in: 7108, records dropped: 535 output_compression: NoCompression
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.797155) EVENT_LOG_v1 {"time_micros": 1769161779797141, "job": 46, "event": "compaction_finished", "compaction_time_micros": 179223, "compaction_time_cpu_micros": 25339, "output_level": 6, "num_output_files": 1, "total_output_size": 11839725, "num_input_records": 7108, "num_output_records": 6573, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161779798337, "job": 46, "event": "table_file_deletion", "file_number": 77}
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161779800273, "job": 46, "event": "table_file_deletion", "file_number": 75}
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.504088) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.800383) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.800387) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.800389) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.800390) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:49:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:49:39.800395) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:49:40 np0005593234 nova_compute[227762]: 2026-01-23 09:49:40.048 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:49:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:49:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:41.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:41.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:41 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:41Z|00194|binding|INFO|Releasing lport db11772c-e758-43ff-997c-e8c835433e90 from this chassis (sb_readonly=0)
Jan 23 04:49:41 np0005593234 nova_compute[227762]: 2026-01-23 09:49:41.729 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:42.825 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:42.826 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:42.827 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:42 np0005593234 nova_compute[227762]: 2026-01-23 09:49:42.894 227766 DEBUG nova.network.neutron [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Successfully created port: aa6f3a20-d469-4e97-90f4-60d418a600e6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:49:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:43.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:49:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:43.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:49:43 np0005593234 nova_compute[227762]: 2026-01-23 09:49:43.716 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:45 np0005593234 nova_compute[227762]: 2026-01-23 09:49:45.050 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:45.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:45.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:46 np0005593234 nova_compute[227762]: 2026-01-23 09:49:46.529 227766 DEBUG nova.network.neutron [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Successfully updated port: aa6f3a20-d469-4e97-90f4-60d418a600e6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:49:46 np0005593234 nova_compute[227762]: 2026-01-23 09:49:46.566 227766 DEBUG oslo_concurrency.lockutils [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:49:46 np0005593234 nova_compute[227762]: 2026-01-23 09:49:46.567 227766 DEBUG oslo_concurrency.lockutils [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquired lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:49:46 np0005593234 nova_compute[227762]: 2026-01-23 09:49:46.567 227766 DEBUG nova.network.neutron [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:49:47 np0005593234 nova_compute[227762]: 2026-01-23 09:49:47.122 227766 WARNING nova.network.neutron [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] 7808328e-22f9-46df-ac06-f8c3d6ad10c4 already exists in list: networks containing: ['7808328e-22f9-46df-ac06-f8c3d6ad10c4']. ignoring it#033[00m
Jan 23 04:49:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:47.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:47.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:48 np0005593234 nova_compute[227762]: 2026-01-23 09:49:48.718 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:48 np0005593234 podman[257051]: 2026-01-23 09:49:48.799149096 +0000 UTC m=+0.093955539 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 23 04:49:49 np0005593234 nova_compute[227762]: 2026-01-23 09:49:49.103 227766 DEBUG nova.compute.manager [req-468c5f69-f69c-44cc-8e25-23f32117e1ea req-c28b05cc-4b95-4d9d-9bd2-1bd6f972b183 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-changed-aa6f3a20-d469-4e97-90f4-60d418a600e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:49:49 np0005593234 nova_compute[227762]: 2026-01-23 09:49:49.104 227766 DEBUG nova.compute.manager [req-468c5f69-f69c-44cc-8e25-23f32117e1ea req-c28b05cc-4b95-4d9d-9bd2-1bd6f972b183 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Refreshing instance network info cache due to event network-changed-aa6f3a20-d469-4e97-90f4-60d418a600e6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:49:49 np0005593234 nova_compute[227762]: 2026-01-23 09:49:49.104 227766 DEBUG oslo_concurrency.lockutils [req-468c5f69-f69c-44cc-8e25-23f32117e1ea req-c28b05cc-4b95-4d9d-9bd2-1bd6f972b183 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:49:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:49.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:49.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:50 np0005593234 nova_compute[227762]: 2026-01-23 09:49:50.051 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:51.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:51.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.880 227766 DEBUG nova.network.neutron [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.901 227766 DEBUG oslo_concurrency.lockutils [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Releasing lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.903 227766 DEBUG oslo_concurrency.lockutils [req-468c5f69-f69c-44cc-8e25-23f32117e1ea req-c28b05cc-4b95-4d9d-9bd2-1bd6f972b183 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.904 227766 DEBUG nova.network.neutron [req-468c5f69-f69c-44cc-8e25-23f32117e1ea req-c28b05cc-4b95-4d9d-9bd2-1bd6f972b183 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Refreshing network info cache for port aa6f3a20-d469-4e97-90f4-60d418a600e6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.908 227766 DEBUG nova.virt.libvirt.vif [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:49:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.908 227766 DEBUG nova.network.os_vif_util [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.909 227766 DEBUG nova.network.os_vif_util [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:5e:fc,bridge_name='br-int',has_traffic_filtering=True,id=aa6f3a20-d469-4e97-90f4-60d418a600e6,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa6f3a20-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.910 227766 DEBUG os_vif [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:5e:fc,bridge_name='br-int',has_traffic_filtering=True,id=aa6f3a20-d469-4e97-90f4-60d418a600e6,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa6f3a20-d4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.910 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.911 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.911 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.915 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.915 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaa6f3a20-d4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.916 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaa6f3a20-d4, col_values=(('external_ids', {'iface-id': 'aa6f3a20-d469-4e97-90f4-60d418a600e6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a2:5e:fc', 'vm-uuid': 'a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.928 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:51 np0005593234 NetworkManager[48942]: <info>  [1769161791.9291] manager: (tapaa6f3a20-d4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/108)
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.931 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.935 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.936 227766 INFO os_vif [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a2:5e:fc,bridge_name='br-int',has_traffic_filtering=True,id=aa6f3a20-d469-4e97-90f4-60d418a600e6,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa6f3a20-d4')#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.937 227766 DEBUG nova.virt.libvirt.vif [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:49:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.938 227766 DEBUG nova.network.os_vif_util [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.938 227766 DEBUG nova.network.os_vif_util [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a2:5e:fc,bridge_name='br-int',has_traffic_filtering=True,id=aa6f3a20-d469-4e97-90f4-60d418a600e6,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa6f3a20-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.942 227766 DEBUG nova.virt.libvirt.guest [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] attach device xml: <interface type="ethernet">
Jan 23 04:49:51 np0005593234 nova_compute[227762]:  <mac address="fa:16:3e:a2:5e:fc"/>
Jan 23 04:49:51 np0005593234 nova_compute[227762]:  <model type="virtio"/>
Jan 23 04:49:51 np0005593234 nova_compute[227762]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:49:51 np0005593234 nova_compute[227762]:  <mtu size="1442"/>
Jan 23 04:49:51 np0005593234 nova_compute[227762]:  <target dev="tapaa6f3a20-d4"/>
Jan 23 04:49:51 np0005593234 nova_compute[227762]: </interface>
Jan 23 04:49:51 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 04:49:51 np0005593234 kernel: tapaa6f3a20-d4: entered promiscuous mode
Jan 23 04:49:51 np0005593234 NetworkManager[48942]: <info>  [1769161791.9559] manager: (tapaa6f3a20-d4): new Tun device (/org/freedesktop/NetworkManager/Devices/109)
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.957 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:51 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:51Z|00195|binding|INFO|Claiming lport aa6f3a20-d469-4e97-90f4-60d418a600e6 for this chassis.
Jan 23 04:49:51 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:51Z|00196|binding|INFO|aa6f3a20-d469-4e97-90f4-60d418a600e6: Claiming fa:16:3e:a2:5e:fc 10.100.0.13
Jan 23 04:49:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:51.969 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:5e:fc 10.100.0.13'], port_security=['fa:16:3e:a2:5e:fc 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e14d2748-8402-4583-8740-ef7703629f43', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=aa6f3a20-d469-4e97-90f4-60d418a600e6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:49:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:51.971 144381 INFO neutron.agent.ovn.metadata.agent [-] Port aa6f3a20-d469-4e97-90f4-60d418a600e6 in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 bound to our chassis#033[00m
Jan 23 04:49:51 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:51Z|00197|binding|INFO|Setting lport aa6f3a20-d469-4e97-90f4-60d418a600e6 ovn-installed in OVS
Jan 23 04:49:51 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:51Z|00198|binding|INFO|Setting lport aa6f3a20-d469-4e97-90f4-60d418a600e6 up in Southbound
Jan 23 04:49:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:51.973 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.973 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:51 np0005593234 nova_compute[227762]: 2026-01-23 09:49:51.980 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:51 np0005593234 systemd-udevd[257086]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:49:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:51.990 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[83f1cb10-b915-4ba4-a46f-4524f6404bc1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:52 np0005593234 NetworkManager[48942]: <info>  [1769161792.0044] device (tapaa6f3a20-d4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:49:52 np0005593234 NetworkManager[48942]: <info>  [1769161792.0050] device (tapaa6f3a20-d4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:49:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:52.026 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[54c6abfb-cc6d-4644-bb7c-34c42a7eccc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:52.030 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e1860bf1-6360-4929-ba81-f096fa999e64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:52.059 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[40caec85-7aa9-4cdb-a31f-646ab67fcc9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.071 227766 DEBUG nova.virt.libvirt.driver [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.071 227766 DEBUG nova.virt.libvirt.driver [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.071 227766 DEBUG nova.virt.libvirt.driver [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:fc:6f:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.072 227766 DEBUG nova.virt.libvirt.driver [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:a2:5e:fc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:49:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:52.076 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ebbda87e-48a4-41b0-9470-9376d41fcd78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7808328e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:22:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564525, 'reachable_time': 36528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257093, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:52.093 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ad603ca1-8d6b-42cb-a31a-0f5338c58703]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564537, 'tstamp': 564537}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257094, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564539, 'tstamp': 564539}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257094, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:49:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:52.095 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7808328e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.096 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.097 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:52.098 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7808328e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:52.098 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:49:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:52.098 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7808328e-20, col_values=(('external_ids', {'iface-id': 'db11772c-e758-43ff-997c-e8c835433e90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:52.099 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.116 227766 DEBUG nova.virt.libvirt.guest [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:49:52 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-195762306</nova:name>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:49:52</nova:creationTime>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:49:52 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:    <nova:port uuid="d4963f79-ec1b-4e35-b34d-22edfeb2fd2f">
Jan 23 04:49:52 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:    <nova:port uuid="aa6f3a20-d469-4e97-90f4-60d418a600e6">
Jan 23 04:49:52 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:49:52 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:49:52 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:49:52 np0005593234 nova_compute[227762]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.152 227766 DEBUG oslo_concurrency.lockutils [None req-2b1c0737-9571-4ace-a532-77a968a8528c 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 14.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.606 227766 DEBUG nova.compute.manager [req-89a55f89-5f4b-4b72-b195-eb29f0473e8d req-9f900848-e035-432b-868a-bf06035079aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-plugged-aa6f3a20-d469-4e97-90f4-60d418a600e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.607 227766 DEBUG oslo_concurrency.lockutils [req-89a55f89-5f4b-4b72-b195-eb29f0473e8d req-9f900848-e035-432b-868a-bf06035079aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.607 227766 DEBUG oslo_concurrency.lockutils [req-89a55f89-5f4b-4b72-b195-eb29f0473e8d req-9f900848-e035-432b-868a-bf06035079aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.607 227766 DEBUG oslo_concurrency.lockutils [req-89a55f89-5f4b-4b72-b195-eb29f0473e8d req-9f900848-e035-432b-868a-bf06035079aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.607 227766 DEBUG nova.compute.manager [req-89a55f89-5f4b-4b72-b195-eb29f0473e8d req-9f900848-e035-432b-868a-bf06035079aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] No waiting events found dispatching network-vif-plugged-aa6f3a20-d469-4e97-90f4-60d418a600e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.608 227766 WARNING nova.compute.manager [req-89a55f89-5f4b-4b72-b195-eb29f0473e8d req-9f900848-e035-432b-868a-bf06035079aa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received unexpected event network-vif-plugged-aa6f3a20-d469-4e97-90f4-60d418a600e6 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:49:52 np0005593234 nova_compute[227762]: 2026-01-23 09:49:52.609 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:52.609 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=25, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=24) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:49:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:52.611 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:49:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:49:52.611 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '25'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:49:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:53.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:53.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:53 np0005593234 nova_compute[227762]: 2026-01-23 09:49:53.913 227766 DEBUG oslo_concurrency.lockutils [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "interface-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:53 np0005593234 nova_compute[227762]: 2026-01-23 09:49:53.913 227766 DEBUG oslo_concurrency.lockutils [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:53 np0005593234 nova_compute[227762]: 2026-01-23 09:49:53.914 227766 DEBUG nova.objects.instance [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'flavor' on Instance uuid a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:49:54 np0005593234 nova_compute[227762]: 2026-01-23 09:49:54.557 227766 DEBUG nova.objects.instance [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'pci_requests' on Instance uuid a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:49:54 np0005593234 nova_compute[227762]: 2026-01-23 09:49:54.581 227766 DEBUG nova.network.neutron [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:49:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:54Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a2:5e:fc 10.100.0.13
Jan 23 04:49:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:49:54Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a2:5e:fc 10.100.0.13
Jan 23 04:49:55 np0005593234 nova_compute[227762]: 2026-01-23 09:49:55.014 227766 DEBUG nova.compute.manager [req-03e1196d-13a5-49ef-9f24-987525540050 req-19387e8b-909f-4b2d-b46e-0b41e923678c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-plugged-aa6f3a20-d469-4e97-90f4-60d418a600e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:49:55 np0005593234 nova_compute[227762]: 2026-01-23 09:49:55.015 227766 DEBUG oslo_concurrency.lockutils [req-03e1196d-13a5-49ef-9f24-987525540050 req-19387e8b-909f-4b2d-b46e-0b41e923678c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:49:55 np0005593234 nova_compute[227762]: 2026-01-23 09:49:55.015 227766 DEBUG oslo_concurrency.lockutils [req-03e1196d-13a5-49ef-9f24-987525540050 req-19387e8b-909f-4b2d-b46e-0b41e923678c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:49:55 np0005593234 nova_compute[227762]: 2026-01-23 09:49:55.015 227766 DEBUG oslo_concurrency.lockutils [req-03e1196d-13a5-49ef-9f24-987525540050 req-19387e8b-909f-4b2d-b46e-0b41e923678c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:49:55 np0005593234 nova_compute[227762]: 2026-01-23 09:49:55.015 227766 DEBUG nova.compute.manager [req-03e1196d-13a5-49ef-9f24-987525540050 req-19387e8b-909f-4b2d-b46e-0b41e923678c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] No waiting events found dispatching network-vif-plugged-aa6f3a20-d469-4e97-90f4-60d418a600e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:49:55 np0005593234 nova_compute[227762]: 2026-01-23 09:49:55.016 227766 WARNING nova.compute.manager [req-03e1196d-13a5-49ef-9f24-987525540050 req-19387e8b-909f-4b2d-b46e-0b41e923678c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received unexpected event network-vif-plugged-aa6f3a20-d469-4e97-90f4-60d418a600e6 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:49:55 np0005593234 nova_compute[227762]: 2026-01-23 09:49:55.053 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:49:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:55.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:55.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:55 np0005593234 nova_compute[227762]: 2026-01-23 09:49:55.612 227766 DEBUG nova.policy [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77cda1e9a0404425a06c34637e696603', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '390d19f683334995a5268cf9b4d5e464', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:49:55 np0005593234 nova_compute[227762]: 2026-01-23 09:49:55.675 227766 DEBUG nova.network.neutron [req-468c5f69-f69c-44cc-8e25-23f32117e1ea req-c28b05cc-4b95-4d9d-9bd2-1bd6f972b183 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updated VIF entry in instance network info cache for port aa6f3a20-d469-4e97-90f4-60d418a600e6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:49:55 np0005593234 nova_compute[227762]: 2026-01-23 09:49:55.675 227766 DEBUG nova.network.neutron [req-468c5f69-f69c-44cc-8e25-23f32117e1ea req-c28b05cc-4b95-4d9d-9bd2-1bd6f972b183 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:49:55 np0005593234 nova_compute[227762]: 2026-01-23 09:49:55.700 227766 DEBUG oslo_concurrency.lockutils [req-468c5f69-f69c-44cc-8e25-23f32117e1ea req-c28b05cc-4b95-4d9d-9bd2-1bd6f972b183 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:49:56 np0005593234 nova_compute[227762]: 2026-01-23 09:49:56.931 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:57.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:57.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:57 np0005593234 nova_compute[227762]: 2026-01-23 09:49:57.699 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:49:58 np0005593234 nova_compute[227762]: 2026-01-23 09:49:58.604 227766 DEBUG nova.network.neutron [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Successfully created port: fd8d3b77-47c1-41fa-b4ad-e9a868b18abe _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:49:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:49:59.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:49:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:49:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:49:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:49:59.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:00 np0005593234 nova_compute[227762]: 2026-01-23 09:50:00.056 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:00 np0005593234 nova_compute[227762]: 2026-01-23 09:50:00.529 227766 DEBUG nova.network.neutron [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Successfully updated port: fd8d3b77-47c1-41fa-b4ad-e9a868b18abe _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:50:00 np0005593234 nova_compute[227762]: 2026-01-23 09:50:00.588 227766 DEBUG oslo_concurrency.lockutils [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:00 np0005593234 nova_compute[227762]: 2026-01-23 09:50:00.588 227766 DEBUG oslo_concurrency.lockutils [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquired lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:00 np0005593234 nova_compute[227762]: 2026-01-23 09:50:00.588 227766 DEBUG nova.network.neutron [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:50:00 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 04:50:00 np0005593234 nova_compute[227762]: 2026-01-23 09:50:00.671 227766 DEBUG nova.compute.manager [req-504383ce-bb6b-4480-af79-9b81c938443f req-6e9086e0-75da-4ce4-bb16-d861b09c3306 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-changed-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:00 np0005593234 nova_compute[227762]: 2026-01-23 09:50:00.671 227766 DEBUG nova.compute.manager [req-504383ce-bb6b-4480-af79-9b81c938443f req-6e9086e0-75da-4ce4-bb16-d861b09c3306 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Refreshing instance network info cache due to event network-changed-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:50:00 np0005593234 nova_compute[227762]: 2026-01-23 09:50:00.671 227766 DEBUG oslo_concurrency.lockutils [req-504383ce-bb6b-4480-af79-9b81c938443f req-6e9086e0-75da-4ce4-bb16-d861b09c3306 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:00 np0005593234 nova_compute[227762]: 2026-01-23 09:50:00.825 227766 WARNING nova.network.neutron [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] 7808328e-22f9-46df-ac06-f8c3d6ad10c4 already exists in list: networks containing: ['7808328e-22f9-46df-ac06-f8c3d6ad10c4']. ignoring it#033[00m
Jan 23 04:50:00 np0005593234 nova_compute[227762]: 2026-01-23 09:50:00.825 227766 WARNING nova.network.neutron [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] 7808328e-22f9-46df-ac06-f8c3d6ad10c4 already exists in list: networks containing: ['7808328e-22f9-46df-ac06-f8c3d6ad10c4']. ignoring it#033[00m
Jan 23 04:50:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:01.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:01.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:01 np0005593234 nova_compute[227762]: 2026-01-23 09:50:01.752 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:01 np0005593234 nova_compute[227762]: 2026-01-23 09:50:01.971 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:03 np0005593234 nova_compute[227762]: 2026-01-23 09:50:03.344 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:03.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:03.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.059 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.467 227766 DEBUG nova.network.neutron [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "address": "fa:16:3e:a3:8d:2a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd8d3b77-47", "ovs_interfaceid": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:05.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.492 227766 DEBUG oslo_concurrency.lockutils [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Releasing lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.493 227766 DEBUG oslo_concurrency.lockutils [req-504383ce-bb6b-4480-af79-9b81c938443f req-6e9086e0-75da-4ce4-bb16-d861b09c3306 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.493 227766 DEBUG nova.network.neutron [req-504383ce-bb6b-4480-af79-9b81c938443f req-6e9086e0-75da-4ce4-bb16-d861b09c3306 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Refreshing network info cache for port fd8d3b77-47c1-41fa-b4ad-e9a868b18abe _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.496 227766 DEBUG nova.virt.libvirt.vif [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:49:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "address": "fa:16:3e:a3:8d:2a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd8d3b77-47", "ovs_interfaceid": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.496 227766 DEBUG nova.network.os_vif_util [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "address": "fa:16:3e:a3:8d:2a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd8d3b77-47", "ovs_interfaceid": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.497 227766 DEBUG nova.network.os_vif_util [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:8d:2a,bridge_name='br-int',has_traffic_filtering=True,id=fd8d3b77-47c1-41fa-b4ad-e9a868b18abe,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd8d3b77-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.497 227766 DEBUG os_vif [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:8d:2a,bridge_name='br-int',has_traffic_filtering=True,id=fd8d3b77-47c1-41fa-b4ad-e9a868b18abe,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd8d3b77-47') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.497 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.498 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.498 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.501 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.502 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfd8d3b77-47, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.502 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfd8d3b77-47, col_values=(('external_ids', {'iface-id': 'fd8d3b77-47c1-41fa-b4ad-e9a868b18abe', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a3:8d:2a', 'vm-uuid': 'a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.503 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:05 np0005593234 NetworkManager[48942]: <info>  [1769161805.5054] manager: (tapfd8d3b77-47): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.506 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.516 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.517 227766 INFO os_vif [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a3:8d:2a,bridge_name='br-int',has_traffic_filtering=True,id=fd8d3b77-47c1-41fa-b4ad-e9a868b18abe,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd8d3b77-47')#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.518 227766 DEBUG nova.virt.libvirt.vif [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:49:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "address": "fa:16:3e:a3:8d:2a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd8d3b77-47", "ovs_interfaceid": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.519 227766 DEBUG nova.network.os_vif_util [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "address": "fa:16:3e:a3:8d:2a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd8d3b77-47", "ovs_interfaceid": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.519 227766 DEBUG nova.network.os_vif_util [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a3:8d:2a,bridge_name='br-int',has_traffic_filtering=True,id=fd8d3b77-47c1-41fa-b4ad-e9a868b18abe,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd8d3b77-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.522 227766 DEBUG nova.virt.libvirt.guest [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] attach device xml: <interface type="ethernet">
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  <mac address="fa:16:3e:a3:8d:2a"/>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  <model type="virtio"/>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  <mtu size="1442"/>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  <target dev="tapfd8d3b77-47"/>
Jan 23 04:50:05 np0005593234 nova_compute[227762]: </interface>
Jan 23 04:50:05 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 04:50:05 np0005593234 kernel: tapfd8d3b77-47: entered promiscuous mode
Jan 23 04:50:05 np0005593234 NetworkManager[48942]: <info>  [1769161805.5421] manager: (tapfd8d3b77-47): new Tun device (/org/freedesktop/NetworkManager/Devices/111)
Jan 23 04:50:05 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:05Z|00199|binding|INFO|Claiming lport fd8d3b77-47c1-41fa-b4ad-e9a868b18abe for this chassis.
Jan 23 04:50:05 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:05Z|00200|binding|INFO|fd8d3b77-47c1-41fa-b4ad-e9a868b18abe: Claiming fa:16:3e:a3:8d:2a 10.100.0.4
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.542 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:05.553 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:8d:2a 10.100.0.4'], port_security=['fa:16:3e:a3:8d:2a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e14d2748-8402-4583-8740-ef7703629f43', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=fd8d3b77-47c1-41fa-b4ad-e9a868b18abe) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:50:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:05.555 144381 INFO neutron.agent.ovn.metadata.agent [-] Port fd8d3b77-47c1-41fa-b4ad-e9a868b18abe in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 bound to our chassis#033[00m
Jan 23 04:50:05 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:05Z|00201|binding|INFO|Setting lport fd8d3b77-47c1-41fa-b4ad-e9a868b18abe ovn-installed in OVS
Jan 23 04:50:05 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:05Z|00202|binding|INFO|Setting lport fd8d3b77-47c1-41fa-b4ad-e9a868b18abe up in Southbound
Jan 23 04:50:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:05.557 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4#033[00m
Jan 23 04:50:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:05.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.559 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.562 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:05.573 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9e5e5f67-4c51-49fe-9fb0-b284d386f205]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:05 np0005593234 systemd-udevd[257171]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:50:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:05.608 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4d110a56-00c6-4a94-a6fb-59751a9ddf56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:05 np0005593234 NetworkManager[48942]: <info>  [1769161805.6112] device (tapfd8d3b77-47): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:50:05 np0005593234 NetworkManager[48942]: <info>  [1769161805.6118] device (tapfd8d3b77-47): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:50:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:05.614 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b13729eb-a1af-4112-9139-294e5efd1c5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:05 np0005593234 podman[257159]: 2026-01-23 09:50:05.635503385 +0000 UTC m=+0.056683620 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 04:50:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:05.641 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a151eb09-e7f1-4165-b0a8-d38c03e5f9b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:05.657 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7a4d1788-d4ba-494b-a476-668ec5bed69d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7808328e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:22:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564525, 'reachable_time': 36528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257186, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:05.669 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5f9c9f4d-ac77-42ff-b801-e7b4386c3912]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564537, 'tstamp': 564537}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257187, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564539, 'tstamp': 564539}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257187, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:05.670 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7808328e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.672 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:05.673 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7808328e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:05.673 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:05.673 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7808328e-20, col_values=(('external_ids', {'iface-id': 'db11772c-e758-43ff-997c-e8c835433e90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:05.673 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.731 227766 DEBUG nova.virt.libvirt.driver [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.731 227766 DEBUG nova.virt.libvirt.driver [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.732 227766 DEBUG nova.virt.libvirt.driver [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:fc:6f:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.732 227766 DEBUG nova.virt.libvirt.driver [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:a2:5e:fc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.732 227766 DEBUG nova.virt.libvirt.driver [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:a3:8d:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.772 227766 DEBUG nova.virt.libvirt.guest [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-195762306</nova:name>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:50:05</nova:creationTime>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:50:05 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:    <nova:port uuid="d4963f79-ec1b-4e35-b34d-22edfeb2fd2f">
Jan 23 04:50:05 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:    <nova:port uuid="aa6f3a20-d469-4e97-90f4-60d418a600e6">
Jan 23 04:50:05 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:    <nova:port uuid="fd8d3b77-47c1-41fa-b4ad-e9a868b18abe">
Jan 23 04:50:05 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:05 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:50:05 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:50:05 np0005593234 nova_compute[227762]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 04:50:05 np0005593234 nova_compute[227762]: 2026-01-23 09:50:05.796 227766 DEBUG oslo_concurrency.lockutils [None req-2ecca26d-6e77-4d86-a302-55620f8997e0 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 11.883s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:50:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:07.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:50:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:50:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:07.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:50:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:08Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a3:8d:2a 10.100.0.4
Jan 23 04:50:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:08Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a3:8d:2a 10.100.0.4
Jan 23 04:50:08 np0005593234 nova_compute[227762]: 2026-01-23 09:50:08.351 227766 DEBUG nova.compute.manager [req-2408da50-f07c-44de-a51c-503a44590110 req-0000bc96-be1e-4a14-9ad8-a463e6197b65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-plugged-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:08 np0005593234 nova_compute[227762]: 2026-01-23 09:50:08.351 227766 DEBUG oslo_concurrency.lockutils [req-2408da50-f07c-44de-a51c-503a44590110 req-0000bc96-be1e-4a14-9ad8-a463e6197b65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:08 np0005593234 nova_compute[227762]: 2026-01-23 09:50:08.352 227766 DEBUG oslo_concurrency.lockutils [req-2408da50-f07c-44de-a51c-503a44590110 req-0000bc96-be1e-4a14-9ad8-a463e6197b65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:08 np0005593234 nova_compute[227762]: 2026-01-23 09:50:08.352 227766 DEBUG oslo_concurrency.lockutils [req-2408da50-f07c-44de-a51c-503a44590110 req-0000bc96-be1e-4a14-9ad8-a463e6197b65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:08 np0005593234 nova_compute[227762]: 2026-01-23 09:50:08.352 227766 DEBUG nova.compute.manager [req-2408da50-f07c-44de-a51c-503a44590110 req-0000bc96-be1e-4a14-9ad8-a463e6197b65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] No waiting events found dispatching network-vif-plugged-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:08 np0005593234 nova_compute[227762]: 2026-01-23 09:50:08.352 227766 WARNING nova.compute.manager [req-2408da50-f07c-44de-a51c-503a44590110 req-0000bc96-be1e-4a14-9ad8-a463e6197b65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received unexpected event network-vif-plugged-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe for instance with vm_state active and task_state None.#033[00m
Jan 23 04:50:08 np0005593234 nova_compute[227762]: 2026-01-23 09:50:08.930 227766 DEBUG nova.network.neutron [req-504383ce-bb6b-4480-af79-9b81c938443f req-6e9086e0-75da-4ce4-bb16-d861b09c3306 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updated VIF entry in instance network info cache for port fd8d3b77-47c1-41fa-b4ad-e9a868b18abe. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:50:08 np0005593234 nova_compute[227762]: 2026-01-23 09:50:08.931 227766 DEBUG nova.network.neutron [req-504383ce-bb6b-4480-af79-9b81c938443f req-6e9086e0-75da-4ce4-bb16-d861b09c3306 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "address": "fa:16:3e:a3:8d:2a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd8d3b77-47", "ovs_interfaceid": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:08 np0005593234 nova_compute[227762]: 2026-01-23 09:50:08.958 227766 DEBUG oslo_concurrency.lockutils [req-504383ce-bb6b-4480-af79-9b81c938443f req-6e9086e0-75da-4ce4-bb16-d861b09c3306 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:09.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:09.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:10 np0005593234 nova_compute[227762]: 2026-01-23 09:50:10.101 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:10 np0005593234 nova_compute[227762]: 2026-01-23 09:50:10.452 227766 DEBUG nova.compute.manager [req-ad138834-69bd-4547-b349-76b58feca728 req-9aa0b7c5-d540-414a-b3a1-f219d6f355b4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-plugged-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:10 np0005593234 nova_compute[227762]: 2026-01-23 09:50:10.452 227766 DEBUG oslo_concurrency.lockutils [req-ad138834-69bd-4547-b349-76b58feca728 req-9aa0b7c5-d540-414a-b3a1-f219d6f355b4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:10 np0005593234 nova_compute[227762]: 2026-01-23 09:50:10.453 227766 DEBUG oslo_concurrency.lockutils [req-ad138834-69bd-4547-b349-76b58feca728 req-9aa0b7c5-d540-414a-b3a1-f219d6f355b4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:10 np0005593234 nova_compute[227762]: 2026-01-23 09:50:10.453 227766 DEBUG oslo_concurrency.lockutils [req-ad138834-69bd-4547-b349-76b58feca728 req-9aa0b7c5-d540-414a-b3a1-f219d6f355b4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:10 np0005593234 nova_compute[227762]: 2026-01-23 09:50:10.453 227766 DEBUG nova.compute.manager [req-ad138834-69bd-4547-b349-76b58feca728 req-9aa0b7c5-d540-414a-b3a1-f219d6f355b4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] No waiting events found dispatching network-vif-plugged-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:10 np0005593234 nova_compute[227762]: 2026-01-23 09:50:10.453 227766 WARNING nova.compute.manager [req-ad138834-69bd-4547-b349-76b58feca728 req-9aa0b7c5-d540-414a-b3a1-f219d6f355b4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received unexpected event network-vif-plugged-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe for instance with vm_state active and task_state None.#033[00m
Jan 23 04:50:10 np0005593234 nova_compute[227762]: 2026-01-23 09:50:10.504 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:11 np0005593234 nova_compute[227762]: 2026-01-23 09:50:11.027 227766 DEBUG oslo_concurrency.lockutils [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "interface-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-ab21c4f3-543f-40b2-824b-45f30fc09046" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:11 np0005593234 nova_compute[227762]: 2026-01-23 09:50:11.028 227766 DEBUG oslo_concurrency.lockutils [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-ab21c4f3-543f-40b2-824b-45f30fc09046" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:11 np0005593234 nova_compute[227762]: 2026-01-23 09:50:11.029 227766 DEBUG nova.objects.instance [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'flavor' on Instance uuid a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:11 np0005593234 nova_compute[227762]: 2026-01-23 09:50:11.238 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:11.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:11.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:11 np0005593234 nova_compute[227762]: 2026-01-23 09:50:11.742 227766 DEBUG nova.objects.instance [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'pci_requests' on Instance uuid a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:11 np0005593234 nova_compute[227762]: 2026-01-23 09:50:11.757 227766 DEBUG nova.network.neutron [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:50:12 np0005593234 nova_compute[227762]: 2026-01-23 09:50:12.187 227766 DEBUG nova.policy [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '77cda1e9a0404425a06c34637e696603', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '390d19f683334995a5268cf9b4d5e464', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:50:13 np0005593234 nova_compute[227762]: 2026-01-23 09:50:13.426 227766 DEBUG nova.network.neutron [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Successfully updated port: ab21c4f3-543f-40b2-824b-45f30fc09046 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:50:13 np0005593234 nova_compute[227762]: 2026-01-23 09:50:13.446 227766 DEBUG oslo_concurrency.lockutils [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:13 np0005593234 nova_compute[227762]: 2026-01-23 09:50:13.447 227766 DEBUG oslo_concurrency.lockutils [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquired lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:13 np0005593234 nova_compute[227762]: 2026-01-23 09:50:13.447 227766 DEBUG nova.network.neutron [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:50:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:50:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:13.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:50:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:50:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:13.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:50:13 np0005593234 nova_compute[227762]: 2026-01-23 09:50:13.641 227766 DEBUG nova.compute.manager [req-d6e16a47-a69b-4d88-a8da-1b5a9ef9283c req-cdac7b53-6c06-4028-8f02-cc9216570f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-changed-ab21c4f3-543f-40b2-824b-45f30fc09046 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:13 np0005593234 nova_compute[227762]: 2026-01-23 09:50:13.641 227766 DEBUG nova.compute.manager [req-d6e16a47-a69b-4d88-a8da-1b5a9ef9283c req-cdac7b53-6c06-4028-8f02-cc9216570f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Refreshing instance network info cache due to event network-changed-ab21c4f3-543f-40b2-824b-45f30fc09046. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:50:13 np0005593234 nova_compute[227762]: 2026-01-23 09:50:13.642 227766 DEBUG oslo_concurrency.lockutils [req-d6e16a47-a69b-4d88-a8da-1b5a9ef9283c req-cdac7b53-6c06-4028-8f02-cc9216570f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:13 np0005593234 nova_compute[227762]: 2026-01-23 09:50:13.690 227766 WARNING nova.network.neutron [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] 7808328e-22f9-46df-ac06-f8c3d6ad10c4 already exists in list: networks containing: ['7808328e-22f9-46df-ac06-f8c3d6ad10c4']. ignoring it#033[00m
Jan 23 04:50:13 np0005593234 nova_compute[227762]: 2026-01-23 09:50:13.691 227766 WARNING nova.network.neutron [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] 7808328e-22f9-46df-ac06-f8c3d6ad10c4 already exists in list: networks containing: ['7808328e-22f9-46df-ac06-f8c3d6ad10c4']. ignoring it#033[00m
Jan 23 04:50:13 np0005593234 nova_compute[227762]: 2026-01-23 09:50:13.691 227766 WARNING nova.network.neutron [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] 7808328e-22f9-46df-ac06-f8c3d6ad10c4 already exists in list: networks containing: ['7808328e-22f9-46df-ac06-f8c3d6ad10c4']. ignoring it#033[00m
Jan 23 04:50:14 np0005593234 nova_compute[227762]: 2026-01-23 09:50:14.716 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:14 np0005593234 nova_compute[227762]: 2026-01-23 09:50:14.717 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:14 np0005593234 nova_compute[227762]: 2026-01-23 09:50:14.738 227766 DEBUG nova.compute.manager [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:50:14 np0005593234 nova_compute[227762]: 2026-01-23 09:50:14.827 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:14 np0005593234 nova_compute[227762]: 2026-01-23 09:50:14.827 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:14 np0005593234 nova_compute[227762]: 2026-01-23 09:50:14.835 227766 DEBUG nova.virt.hardware [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:50:14 np0005593234 nova_compute[227762]: 2026-01-23 09:50:14.836 227766 INFO nova.compute.claims [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:50:14 np0005593234 nova_compute[227762]: 2026-01-23 09:50:14.996 227766 DEBUG oslo_concurrency.processutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:15 np0005593234 nova_compute[227762]: 2026-01-23 09:50:15.108 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:50:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4122899827' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:50:15 np0005593234 nova_compute[227762]: 2026-01-23 09:50:15.431 227766 DEBUG oslo_concurrency.processutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:15 np0005593234 nova_compute[227762]: 2026-01-23 09:50:15.439 227766 DEBUG nova.compute.provider_tree [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:50:15 np0005593234 nova_compute[227762]: 2026-01-23 09:50:15.470 227766 DEBUG nova.scheduler.client.report [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:50:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:15.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:15 np0005593234 nova_compute[227762]: 2026-01-23 09:50:15.505 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:15.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:15 np0005593234 nova_compute[227762]: 2026-01-23 09:50:15.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:16 np0005593234 nova_compute[227762]: 2026-01-23 09:50:16.040 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.213s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:16 np0005593234 nova_compute[227762]: 2026-01-23 09:50:16.041 227766 DEBUG nova.compute.manager [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:50:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:50:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:17.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:50:17 np0005593234 nova_compute[227762]: 2026-01-23 09:50:17.511 227766 DEBUG nova.compute.manager [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:50:17 np0005593234 nova_compute[227762]: 2026-01-23 09:50:17.511 227766 DEBUG nova.network.neutron [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:50:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:50:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:17.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:50:17 np0005593234 nova_compute[227762]: 2026-01-23 09:50:17.763 227766 INFO nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:50:18 np0005593234 nova_compute[227762]: 2026-01-23 09:50:18.009 227766 DEBUG nova.compute.manager [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:50:18 np0005593234 nova_compute[227762]: 2026-01-23 09:50:18.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:18 np0005593234 nova_compute[227762]: 2026-01-23 09:50:18.956 227766 DEBUG nova.compute.manager [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:50:18 np0005593234 nova_compute[227762]: 2026-01-23 09:50:18.958 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:50:18 np0005593234 nova_compute[227762]: 2026-01-23 09:50:18.959 227766 INFO nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Creating image(s)#033[00m
Jan 23 04:50:18 np0005593234 nova_compute[227762]: 2026-01-23 09:50:18.992 227766 DEBUG nova.storage.rbd_utils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:19 np0005593234 podman[257241]: 2026-01-23 09:50:19.007746901 +0000 UTC m=+0.089504721 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.024 227766 DEBUG nova.storage.rbd_utils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.049 227766 DEBUG nova.storage.rbd_utils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.053 227766 DEBUG oslo_concurrency.processutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.077 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.078 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.078 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.078 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.079 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.125 227766 DEBUG oslo_concurrency.processutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.126 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.126 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.127 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.156 227766 DEBUG nova.storage.rbd_utils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.159 227766 DEBUG oslo_concurrency.processutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:50:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/405221398' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.493 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:19.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:50:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:19.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.586 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000042 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.587 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000042 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.752 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.753 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4465MB free_disk=20.876388549804688GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.753 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.753 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.871 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.872 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 0409b666-6d7a-4831-9ba3-08afe2d0c46b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.872 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.872 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:50:19 np0005593234 nova_compute[227762]: 2026-01-23 09:50:19.930 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:20 np0005593234 nova_compute[227762]: 2026-01-23 09:50:20.145 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:50:20 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2641839503' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:50:20 np0005593234 nova_compute[227762]: 2026-01-23 09:50:20.387 227766 DEBUG oslo_concurrency.processutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.227s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:20 np0005593234 nova_compute[227762]: 2026-01-23 09:50:20.420 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:20 np0005593234 nova_compute[227762]: 2026-01-23 09:50:20.454 227766 DEBUG nova.storage.rbd_utils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] resizing rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:50:20 np0005593234 nova_compute[227762]: 2026-01-23 09:50:20.640 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:20 np0005593234 nova_compute[227762]: 2026-01-23 09:50:20.644 227766 DEBUG nova.policy [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0cfac2191989448ead77e75ca3910ac4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '86d938c8e2bb41a79012befd500d1088', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:50:20 np0005593234 nova_compute[227762]: 2026-01-23 09:50:20.650 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:50:20 np0005593234 nova_compute[227762]: 2026-01-23 09:50:20.668 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:50:20 np0005593234 nova_compute[227762]: 2026-01-23 09:50:20.690 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:50:20 np0005593234 nova_compute[227762]: 2026-01-23 09:50:20.691 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.937s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:21 np0005593234 nova_compute[227762]: 2026-01-23 09:50:21.019 227766 DEBUG nova.objects.instance [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'migration_context' on Instance uuid 0409b666-6d7a-4831-9ba3-08afe2d0c46b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:21 np0005593234 nova_compute[227762]: 2026-01-23 09:50:21.039 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:50:21 np0005593234 nova_compute[227762]: 2026-01-23 09:50:21.039 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Ensure instance console log exists: /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:50:21 np0005593234 nova_compute[227762]: 2026-01-23 09:50:21.040 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:21 np0005593234 nova_compute[227762]: 2026-01-23 09:50:21.040 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:21 np0005593234 nova_compute[227762]: 2026-01-23 09:50:21.040 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:50:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:21.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:50:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:21.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:21 np0005593234 nova_compute[227762]: 2026-01-23 09:50:21.691 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:21 np0005593234 nova_compute[227762]: 2026-01-23 09:50:21.692 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:50:23 np0005593234 nova_compute[227762]: 2026-01-23 09:50:23.459 227766 DEBUG nova.network.neutron [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Successfully created port: 0b1311fa-410f-4d76-a118-cd5f14a68f51 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:50:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:23.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:23.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:23 np0005593234 nova_compute[227762]: 2026-01-23 09:50:23.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:23 np0005593234 nova_compute[227762]: 2026-01-23 09:50:23.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:23 np0005593234 nova_compute[227762]: 2026-01-23 09:50:23.747 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.181 227766 DEBUG nova.network.neutron [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "address": "fa:16:3e:a3:8d:2a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd8d3b77-47", "ovs_interfaceid": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab21c4f3-543f-40b2-824b-45f30fc09046", "address": "fa:16:3e:c2:9a:9a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab21c4f3-54", "ovs_interfaceid": "ab21c4f3-543f-40b2-824b-45f30fc09046", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.217 227766 DEBUG oslo_concurrency.lockutils [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Releasing lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.218 227766 DEBUG oslo_concurrency.lockutils [req-d6e16a47-a69b-4d88-a8da-1b5a9ef9283c req-cdac7b53-6c06-4028-8f02-cc9216570f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.218 227766 DEBUG nova.network.neutron [req-d6e16a47-a69b-4d88-a8da-1b5a9ef9283c req-cdac7b53-6c06-4028-8f02-cc9216570f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Refreshing network info cache for port ab21c4f3-543f-40b2-824b-45f30fc09046 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.221 227766 DEBUG nova.virt.libvirt.vif [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:49:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab21c4f3-543f-40b2-824b-45f30fc09046", "address": "fa:16:3e:c2:9a:9a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab21c4f3-54", "ovs_interfaceid": "ab21c4f3-543f-40b2-824b-45f30fc09046", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.222 227766 DEBUG nova.network.os_vif_util [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "ab21c4f3-543f-40b2-824b-45f30fc09046", "address": "fa:16:3e:c2:9a:9a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab21c4f3-54", "ovs_interfaceid": "ab21c4f3-543f-40b2-824b-45f30fc09046", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.222 227766 DEBUG nova.network.os_vif_util [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:9a:9a,bridge_name='br-int',has_traffic_filtering=True,id=ab21c4f3-543f-40b2-824b-45f30fc09046,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapab21c4f3-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.223 227766 DEBUG os_vif [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:9a:9a,bridge_name='br-int',has_traffic_filtering=True,id=ab21c4f3-543f-40b2-824b-45f30fc09046,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapab21c4f3-54') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.224 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.224 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.224 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.226 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.227 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapab21c4f3-54, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.227 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapab21c4f3-54, col_values=(('external_ids', {'iface-id': 'ab21c4f3-543f-40b2-824b-45f30fc09046', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c2:9a:9a', 'vm-uuid': 'a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.229 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:24 np0005593234 NetworkManager[48942]: <info>  [1769161824.2301] manager: (tapab21c4f3-54): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/112)
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.231 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.239 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.240 227766 INFO os_vif [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:9a:9a,bridge_name='br-int',has_traffic_filtering=True,id=ab21c4f3-543f-40b2-824b-45f30fc09046,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapab21c4f3-54')#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.241 227766 DEBUG nova.virt.libvirt.vif [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:49:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab21c4f3-543f-40b2-824b-45f30fc09046", "address": "fa:16:3e:c2:9a:9a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab21c4f3-54", "ovs_interfaceid": "ab21c4f3-543f-40b2-824b-45f30fc09046", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.241 227766 DEBUG nova.network.os_vif_util [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "ab21c4f3-543f-40b2-824b-45f30fc09046", "address": "fa:16:3e:c2:9a:9a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab21c4f3-54", "ovs_interfaceid": "ab21c4f3-543f-40b2-824b-45f30fc09046", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.242 227766 DEBUG nova.network.os_vif_util [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:9a:9a,bridge_name='br-int',has_traffic_filtering=True,id=ab21c4f3-543f-40b2-824b-45f30fc09046,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapab21c4f3-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.244 227766 DEBUG nova.virt.libvirt.guest [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] attach device xml: <interface type="ethernet">
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  <mac address="fa:16:3e:c2:9a:9a"/>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  <model type="virtio"/>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  <mtu size="1442"/>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  <target dev="tapab21c4f3-54"/>
Jan 23 04:50:24 np0005593234 nova_compute[227762]: </interface>
Jan 23 04:50:24 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 04:50:24 np0005593234 kernel: tapab21c4f3-54: entered promiscuous mode
Jan 23 04:50:24 np0005593234 NetworkManager[48942]: <info>  [1769161824.2539] manager: (tapab21c4f3-54): new Tun device (/org/freedesktop/NetworkManager/Devices/113)
Jan 23 04:50:24 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:24Z|00203|binding|INFO|Claiming lport ab21c4f3-543f-40b2-824b-45f30fc09046 for this chassis.
Jan 23 04:50:24 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:24Z|00204|binding|INFO|ab21c4f3-543f-40b2-824b-45f30fc09046: Claiming fa:16:3e:c2:9a:9a 10.100.0.9
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.257 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:24.263 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:9a:9a 10.100.0.9'], port_security=['fa:16:3e:c2:9a:9a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1797455705', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1797455705', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e14d2748-8402-4583-8740-ef7703629f43', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=ab21c4f3-543f-40b2-824b-45f30fc09046) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:50:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:24.265 144381 INFO neutron.agent.ovn.metadata.agent [-] Port ab21c4f3-543f-40b2-824b-45f30fc09046 in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 bound to our chassis#033[00m
Jan 23 04:50:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:24.266 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4#033[00m
Jan 23 04:50:24 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:24Z|00205|binding|INFO|Setting lport ab21c4f3-543f-40b2-824b-45f30fc09046 ovn-installed in OVS
Jan 23 04:50:24 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:24Z|00206|binding|INFO|Setting lport ab21c4f3-543f-40b2-824b-45f30fc09046 up in Southbound
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.273 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.277 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:24.281 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[19dac452-24b0-413f-a57f-360ffeacd5b8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:24 np0005593234 systemd-udevd[257514]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:50:24 np0005593234 NetworkManager[48942]: <info>  [1769161824.2999] device (tapab21c4f3-54): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:50:24 np0005593234 NetworkManager[48942]: <info>  [1769161824.3010] device (tapab21c4f3-54): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:50:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:24.311 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9641dae1-ce21-4398-9709-6a3a5fe7c1c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:24.313 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b462cb21-d3dc-45e5-a11b-0cd92ffb7a6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:24.338 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e9de12d0-8a4d-423b-badb-07d8d99136f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:24.353 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[206193de-bcb4-47f9-8a81-c923ef5051cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7808328e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:22:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 9, 'rx_bytes': 784, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564525, 'reachable_time': 36528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257521, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:24.370 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4349d2d8-b688-45f1-a698-9a012c1b5e2e]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564537, 'tstamp': 564537}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257522, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564539, 'tstamp': 564539}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257522, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:24.372 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7808328e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.373 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.374 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:24.375 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7808328e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:24.375 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:24.375 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7808328e-20, col_values=(('external_ids', {'iface-id': 'db11772c-e758-43ff-997c-e8c835433e90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:24.376 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.519 227766 DEBUG nova.virt.libvirt.driver [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.519 227766 DEBUG nova.virt.libvirt.driver [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.519 227766 DEBUG nova.virt.libvirt.driver [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:fc:6f:8d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.520 227766 DEBUG nova.virt.libvirt.driver [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:a2:5e:fc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.520 227766 DEBUG nova.virt.libvirt.driver [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:a3:8d:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.520 227766 DEBUG nova.virt.libvirt.driver [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] No VIF found with MAC fa:16:3e:c2:9a:9a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.593 227766 DEBUG nova.virt.libvirt.guest [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-195762306</nova:name>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:50:24</nova:creationTime>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    <nova:port uuid="d4963f79-ec1b-4e35-b34d-22edfeb2fd2f">
Jan 23 04:50:24 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    <nova:port uuid="aa6f3a20-d469-4e97-90f4-60d418a600e6">
Jan 23 04:50:24 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    <nova:port uuid="fd8d3b77-47c1-41fa-b4ad-e9a868b18abe">
Jan 23 04:50:24 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    <nova:port uuid="ab21c4f3-543f-40b2-824b-45f30fc09046">
Jan 23 04:50:24 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:24 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:50:24 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:50:24 np0005593234 nova_compute[227762]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.642 227766 DEBUG oslo_concurrency.lockutils [None req-a1a09d57-dd91-460d-b53b-81526f6dcdd2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-ab21c4f3-543f-40b2-824b-45f30fc09046" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 13.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.718 227766 DEBUG nova.compute.manager [req-1d278589-9dd3-4129-97b4-9fa4e21b5de7 req-32117231-e70a-4582-a985-9d2129ab114e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-plugged-ab21c4f3-543f-40b2-824b-45f30fc09046 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.718 227766 DEBUG oslo_concurrency.lockutils [req-1d278589-9dd3-4129-97b4-9fa4e21b5de7 req-32117231-e70a-4582-a985-9d2129ab114e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.719 227766 DEBUG oslo_concurrency.lockutils [req-1d278589-9dd3-4129-97b4-9fa4e21b5de7 req-32117231-e70a-4582-a985-9d2129ab114e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.719 227766 DEBUG oslo_concurrency.lockutils [req-1d278589-9dd3-4129-97b4-9fa4e21b5de7 req-32117231-e70a-4582-a985-9d2129ab114e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.719 227766 DEBUG nova.compute.manager [req-1d278589-9dd3-4129-97b4-9fa4e21b5de7 req-32117231-e70a-4582-a985-9d2129ab114e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] No waiting events found dispatching network-vif-plugged-ab21c4f3-543f-40b2-824b-45f30fc09046 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.719 227766 WARNING nova.compute.manager [req-1d278589-9dd3-4129-97b4-9fa4e21b5de7 req-32117231-e70a-4582-a985-9d2129ab114e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received unexpected event network-vif-plugged-ab21c4f3-543f-40b2-824b-45f30fc09046 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.747 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.747 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.748 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.764 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 04:50:24 np0005593234 nova_compute[227762]: 2026-01-23 09:50:24.989 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:25 np0005593234 nova_compute[227762]: 2026-01-23 09:50:25.244 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:25.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:50:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:25.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:50:25 np0005593234 nova_compute[227762]: 2026-01-23 09:50:25.628 227766 DEBUG nova.network.neutron [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Successfully updated port: 0b1311fa-410f-4d76-a118-cd5f14a68f51 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:50:25 np0005593234 nova_compute[227762]: 2026-01-23 09:50:25.986 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "refresh_cache-0409b666-6d7a-4831-9ba3-08afe2d0c46b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:25 np0005593234 nova_compute[227762]: 2026-01-23 09:50:25.986 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquired lock "refresh_cache-0409b666-6d7a-4831-9ba3-08afe2d0c46b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:25 np0005593234 nova_compute[227762]: 2026-01-23 09:50:25.986 227766 DEBUG nova.network.neutron [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:50:26 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:26Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c2:9a:9a 10.100.0.9
Jan 23 04:50:26 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:26Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c2:9a:9a 10.100.0.9
Jan 23 04:50:26 np0005593234 nova_compute[227762]: 2026-01-23 09:50:26.685 227766 DEBUG nova.network.neutron [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:50:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:27.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:27.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:28 np0005593234 nova_compute[227762]: 2026-01-23 09:50:28.412 227766 DEBUG nova.compute.manager [req-8e14439f-7ed6-4589-8221-89477e41dabe req-0a87b0a4-6040-485a-8773-2cd501a625cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received event network-changed-0b1311fa-410f-4d76-a118-cd5f14a68f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:28 np0005593234 nova_compute[227762]: 2026-01-23 09:50:28.413 227766 DEBUG nova.compute.manager [req-8e14439f-7ed6-4589-8221-89477e41dabe req-0a87b0a4-6040-485a-8773-2cd501a625cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Refreshing instance network info cache due to event network-changed-0b1311fa-410f-4d76-a118-cd5f14a68f51. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:50:28 np0005593234 nova_compute[227762]: 2026-01-23 09:50:28.413 227766 DEBUG oslo_concurrency.lockutils [req-8e14439f-7ed6-4589-8221-89477e41dabe req-0a87b0a4-6040-485a-8773-2cd501a625cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-0409b666-6d7a-4831-9ba3-08afe2d0c46b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:28 np0005593234 nova_compute[227762]: 2026-01-23 09:50:28.764 227766 DEBUG nova.network.neutron [req-d6e16a47-a69b-4d88-a8da-1b5a9ef9283c req-cdac7b53-6c06-4028-8f02-cc9216570f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updated VIF entry in instance network info cache for port ab21c4f3-543f-40b2-824b-45f30fc09046. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:50:28 np0005593234 nova_compute[227762]: 2026-01-23 09:50:28.765 227766 DEBUG nova.network.neutron [req-d6e16a47-a69b-4d88-a8da-1b5a9ef9283c req-cdac7b53-6c06-4028-8f02-cc9216570f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "address": "fa:16:3e:a3:8d:2a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd8d3b77-47", "ovs_interfaceid": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab21c4f3-543f-40b2-824b-45f30fc09046", "address": "fa:16:3e:c2:9a:9a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab21c4f3-54", "ovs_interfaceid": "ab21c4f3-543f-40b2-824b-45f30fc09046", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:29 np0005593234 nova_compute[227762]: 2026-01-23 09:50:29.230 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:29 np0005593234 nova_compute[227762]: 2026-01-23 09:50:29.305 227766 DEBUG nova.compute.manager [req-494f84a6-aa79-4f52-bb8a-f73891bf1511 req-0e61a4bb-2fd0-4dd0-b23b-5c92d034d1da 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-plugged-ab21c4f3-543f-40b2-824b-45f30fc09046 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:29 np0005593234 nova_compute[227762]: 2026-01-23 09:50:29.305 227766 DEBUG oslo_concurrency.lockutils [req-494f84a6-aa79-4f52-bb8a-f73891bf1511 req-0e61a4bb-2fd0-4dd0-b23b-5c92d034d1da 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:29 np0005593234 nova_compute[227762]: 2026-01-23 09:50:29.306 227766 DEBUG oslo_concurrency.lockutils [req-494f84a6-aa79-4f52-bb8a-f73891bf1511 req-0e61a4bb-2fd0-4dd0-b23b-5c92d034d1da 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:29 np0005593234 nova_compute[227762]: 2026-01-23 09:50:29.306 227766 DEBUG oslo_concurrency.lockutils [req-494f84a6-aa79-4f52-bb8a-f73891bf1511 req-0e61a4bb-2fd0-4dd0-b23b-5c92d034d1da 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:29 np0005593234 nova_compute[227762]: 2026-01-23 09:50:29.306 227766 DEBUG nova.compute.manager [req-494f84a6-aa79-4f52-bb8a-f73891bf1511 req-0e61a4bb-2fd0-4dd0-b23b-5c92d034d1da 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] No waiting events found dispatching network-vif-plugged-ab21c4f3-543f-40b2-824b-45f30fc09046 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:29 np0005593234 nova_compute[227762]: 2026-01-23 09:50:29.306 227766 WARNING nova.compute.manager [req-494f84a6-aa79-4f52-bb8a-f73891bf1511 req-0e61a4bb-2fd0-4dd0-b23b-5c92d034d1da 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received unexpected event network-vif-plugged-ab21c4f3-543f-40b2-824b-45f30fc09046 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:50:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:50:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:29.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:50:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:29.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:30 np0005593234 nova_compute[227762]: 2026-01-23 09:50:30.245 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:30 np0005593234 nova_compute[227762]: 2026-01-23 09:50:30.686 227766 DEBUG nova.network.neutron [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Updating instance_info_cache with network_info: [{"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:31.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:31.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:31 np0005593234 nova_compute[227762]: 2026-01-23 09:50:31.965 227766 DEBUG oslo_concurrency.lockutils [req-d6e16a47-a69b-4d88-a8da-1b5a9ef9283c req-cdac7b53-6c06-4028-8f02-cc9216570f2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:31 np0005593234 nova_compute[227762]: 2026-01-23 09:50:31.965 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:31 np0005593234 nova_compute[227762]: 2026-01-23 09:50:31.966 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:50:31 np0005593234 nova_compute[227762]: 2026-01-23 09:50:31.966 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.452 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Releasing lock "refresh_cache-0409b666-6d7a-4831-9ba3-08afe2d0c46b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.453 227766 DEBUG nova.compute.manager [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Instance network_info: |[{"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.453 227766 DEBUG oslo_concurrency.lockutils [req-8e14439f-7ed6-4589-8221-89477e41dabe req-0a87b0a4-6040-485a-8773-2cd501a625cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-0409b666-6d7a-4831-9ba3-08afe2d0c46b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.453 227766 DEBUG nova.network.neutron [req-8e14439f-7ed6-4589-8221-89477e41dabe req-0a87b0a4-6040-485a-8773-2cd501a625cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Refreshing network info cache for port 0b1311fa-410f-4d76-a118-cd5f14a68f51 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.457 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Start _get_guest_xml network_info=[{"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.462 227766 WARNING nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.467 227766 DEBUG oslo_concurrency.lockutils [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "interface-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-aa6f3a20-d469-4e97-90f4-60d418a600e6" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.467 227766 DEBUG oslo_concurrency.lockutils [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-aa6f3a20-d469-4e97-90f4-60d418a600e6" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.473 227766 DEBUG nova.virt.libvirt.host [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.474 227766 DEBUG nova.virt.libvirt.host [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.481 227766 DEBUG nova.virt.libvirt.host [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.481 227766 DEBUG nova.virt.libvirt.host [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.483 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.483 227766 DEBUG nova.virt.hardware [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.484 227766 DEBUG nova.virt.hardware [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.484 227766 DEBUG nova.virt.hardware [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.485 227766 DEBUG nova.virt.hardware [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.485 227766 DEBUG nova.virt.hardware [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.485 227766 DEBUG nova.virt.hardware [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.485 227766 DEBUG nova.virt.hardware [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.486 227766 DEBUG nova.virt.hardware [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.486 227766 DEBUG nova.virt.hardware [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.486 227766 DEBUG nova.virt.hardware [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.487 227766 DEBUG nova.virt.hardware [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.491 227766 DEBUG oslo_concurrency.processutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.537 227766 DEBUG nova.objects.instance [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'flavor' on Instance uuid a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.574 227766 DEBUG nova.virt.libvirt.vif [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:49:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.574 227766 DEBUG nova.network.os_vif_util [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.575 227766 DEBUG nova.network.os_vif_util [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a2:5e:fc,bridge_name='br-int',has_traffic_filtering=True,id=aa6f3a20-d469-4e97-90f4-60d418a600e6,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa6f3a20-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.578 227766 DEBUG nova.virt.libvirt.guest [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a2:5e:fc"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaa6f3a20-d4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.580 227766 DEBUG nova.virt.libvirt.guest [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a2:5e:fc"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaa6f3a20-d4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.582 227766 DEBUG nova.virt.libvirt.driver [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Attempting to detach device tapaa6f3a20-d4 from instance a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.582 227766 DEBUG nova.virt.libvirt.guest [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] detach device xml: <interface type="ethernet">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <mac address="fa:16:3e:a2:5e:fc"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <model type="virtio"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <mtu size="1442"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <target dev="tapaa6f3a20-d4"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]: </interface>
Jan 23 04:50:32 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.589 227766 DEBUG nova.virt.libvirt.guest [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a2:5e:fc"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaa6f3a20-d4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.593 227766 DEBUG nova.virt.libvirt.guest [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a2:5e:fc"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaa6f3a20-d4"/></interface>not found in domain: <domain type='kvm' id='27'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <name>instance-00000042</name>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <uuid>a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</uuid>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-195762306</nova:name>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:50:24</nova:creationTime>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:port uuid="d4963f79-ec1b-4e35-b34d-22edfeb2fd2f">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:port uuid="aa6f3a20-d469-4e97-90f4-60d418a600e6">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:port uuid="fd8d3b77-47c1-41fa-b4ad-e9a868b18abe">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:port uuid="ab21c4f3-543f-40b2-824b-45f30fc09046">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:50:32 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <memory unit='KiB'>131072</memory>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <vcpu placement='static'>1</vcpu>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <resource>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <partition>/machine</partition>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </resource>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <sysinfo type='smbios'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <entry name='manufacturer'>RDO</entry>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <entry name='serial'>a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</entry>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <entry name='uuid'>a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</entry>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <entry name='family'>Virtual Machine</entry>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <boot dev='hd'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <smbios mode='sysinfo'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <vmcoreinfo state='on'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <model fallback='forbid'>Nehalem</model>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <feature policy='require' name='x2apic'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <feature policy='require' name='hypervisor'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <feature policy='require' name='vme'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <clock offset='utc'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <timer name='hpet' present='no'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <on_poweroff>destroy</on_poweroff>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <on_reboot>restart</on_reboot>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <on_crash>destroy</on_crash>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <disk type='network' device='disk'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk' index='2'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target dev='vda' bus='virtio'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='virtio-disk0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <disk type='network' device='cdrom'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk.config' index='1'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target dev='sda' bus='sata'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <readonly/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='sata0-0-0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pcie.0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='1' port='0x10'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='2' port='0x11'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.2'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='3' port='0x12'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.3'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='4' port='0x13'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.4'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='5' port='0x14'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.5'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='6' port='0x15'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.6'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='7' port='0x16'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.7'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='8' port='0x17'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.8'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='9' port='0x18'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.9'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='10' port='0x19'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.10'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='11' port='0x1a'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.11'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='12' port='0x1b'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.12'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='13' port='0x1c'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.13'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='14' port='0x1d'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.14'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='15' port='0x1e'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.15'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='16' port='0x1f'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.16'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='17' port='0x20'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.17'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='18' port='0x21'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.18'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='19' port='0x22'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.19'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='20' port='0x23'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.20'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='21' port='0x24'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.21'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='22' port='0x25'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.22'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='23' port='0x26'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.23'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='24' port='0x27'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.24'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='25' port='0x28'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.25'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-pci-bridge'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.26'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='usb'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='sata' index='0'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='ide'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:fc:6f:8d'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target dev='tapd4963f79-ec'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='net0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:a2:5e:fc'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target dev='tapaa6f3a20-d4'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='net1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:a3:8d:2a'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target dev='tapfd8d3b77-47'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='net2'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:c2:9a:9a'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target dev='tapab21c4f3-54'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='net3'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <serial type='pty'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/console.log' append='off'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target type='isa-serial' port='0'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <model name='isa-serial'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      </target>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/console.log' append='off'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target type='serial' port='0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </console>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <input type='tablet' bus='usb'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='input0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='usb' bus='0' port='1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <input type='mouse' bus='ps2'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='input1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <input type='keyboard' bus='ps2'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='input2'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <listen type='address' address='::0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <audio id='1' type='none'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='video0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <watchdog model='itco' action='reset'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='watchdog0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </watchdog>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <memballoon model='virtio'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <stats period='10'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='balloon0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <rng model='virtio'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <backend model='random'>/dev/urandom</backend>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='rng0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <label>system_u:system_r:svirt_t:s0:c395,c549</label>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c395,c549</imagelabel>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <label>+107:+107</label>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <imagelabel>+107:+107</imagelabel>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:50:32 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:50:32 np0005593234 nova_compute[227762]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.595 227766 INFO nova.virt.libvirt.driver [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully detached device tapaa6f3a20-d4 from instance a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 from the persistent domain config.#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.596 227766 DEBUG nova.virt.libvirt.driver [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] (1/8): Attempting to detach device tapaa6f3a20-d4 with device alias net1 from instance a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.596 227766 DEBUG nova.virt.libvirt.guest [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] detach device xml: <interface type="ethernet">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <mac address="fa:16:3e:a2:5e:fc"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <model type="virtio"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <mtu size="1442"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <target dev="tapaa6f3a20-d4"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]: </interface>
Jan 23 04:50:32 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 04:50:32 np0005593234 kernel: tapaa6f3a20-d4 (unregistering): left promiscuous mode
Jan 23 04:50:32 np0005593234 NetworkManager[48942]: <info>  [1769161832.6902] device (tapaa6f3a20-d4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:50:32 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:32Z|00207|binding|INFO|Releasing lport aa6f3a20-d469-4e97-90f4-60d418a600e6 from this chassis (sb_readonly=0)
Jan 23 04:50:32 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:32Z|00208|binding|INFO|Setting lport aa6f3a20-d469-4e97-90f4-60d418a600e6 down in Southbound
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.693 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:32 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:32Z|00209|binding|INFO|Removing iface tapaa6f3a20-d4 ovn-installed in OVS
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.695 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.699 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769161832.6993911, a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.701 227766 DEBUG nova.virt.libvirt.driver [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Start waiting for the detach event from libvirt for device tapaa6f3a20-d4 with device alias net1 for instance a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.702 227766 DEBUG nova.virt.libvirt.guest [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a2:5e:fc"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaa6f3a20-d4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 04:50:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:32.702 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a2:5e:fc 10.100.0.13'], port_security=['fa:16:3e:a2:5e:fc 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e14d2748-8402-4583-8740-ef7703629f43', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=aa6f3a20-d469-4e97-90f4-60d418a600e6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:50:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:32.703 144381 INFO neutron.agent.ovn.metadata.agent [-] Port aa6f3a20-d469-4e97-90f4-60d418a600e6 in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 unbound from our chassis#033[00m
Jan 23 04:50:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:32.705 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.709 227766 DEBUG nova.virt.libvirt.guest [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a2:5e:fc"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaa6f3a20-d4"/></interface>not found in domain: <domain type='kvm' id='27'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <name>instance-00000042</name>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <uuid>a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</uuid>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-195762306</nova:name>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:50:24</nova:creationTime>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:port uuid="d4963f79-ec1b-4e35-b34d-22edfeb2fd2f">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:port uuid="aa6f3a20-d469-4e97-90f4-60d418a600e6">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:port uuid="fd8d3b77-47c1-41fa-b4ad-e9a868b18abe">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:port uuid="ab21c4f3-543f-40b2-824b-45f30fc09046">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:50:32 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <memory unit='KiB'>131072</memory>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <vcpu placement='static'>1</vcpu>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <resource>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <partition>/machine</partition>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </resource>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <sysinfo type='smbios'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <entry name='manufacturer'>RDO</entry>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <entry name='serial'>a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</entry>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <entry name='uuid'>a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</entry>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <entry name='family'>Virtual Machine</entry>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <boot dev='hd'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <smbios mode='sysinfo'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <vmcoreinfo state='on'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <model fallback='forbid'>Nehalem</model>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <feature policy='require' name='x2apic'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <feature policy='require' name='hypervisor'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <feature policy='require' name='vme'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <clock offset='utc'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <timer name='hpet' present='no'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <on_poweroff>destroy</on_poweroff>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <on_reboot>restart</on_reboot>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <on_crash>destroy</on_crash>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <disk type='network' device='disk'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk' index='2'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target dev='vda' bus='virtio'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='virtio-disk0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <disk type='network' device='cdrom'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk.config' index='1'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target dev='sda' bus='sata'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <readonly/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='sata0-0-0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pcie.0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='1' port='0x10'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='2' port='0x11'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.2'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='3' port='0x12'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.3'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='4' port='0x13'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.4'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='5' port='0x14'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.5'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='6' port='0x15'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.6'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='7' port='0x16'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.7'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='8' port='0x17'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.8'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='9' port='0x18'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.9'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='10' port='0x19'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.10'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='11' port='0x1a'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.11'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='12' port='0x1b'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.12'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='13' port='0x1c'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.13'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='14' port='0x1d'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.14'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='15' port='0x1e'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.15'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='16' port='0x1f'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.16'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='17' port='0x20'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.17'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='18' port='0x21'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.18'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='19' port='0x22'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.19'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='20' port='0x23'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.20'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='21' port='0x24'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.21'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='22' port='0x25'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.22'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='23' port='0x26'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.23'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='24' port='0x27'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.24'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target chassis='25' port='0x28'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.25'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model name='pcie-pci-bridge'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='pci.26'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='usb'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <controller type='sata' index='0'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='ide'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:fc:6f:8d'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target dev='tapd4963f79-ec'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='net0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:a3:8d:2a'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target dev='tapfd8d3b77-47'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='net2'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:c2:9a:9a'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target dev='tapab21c4f3-54'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='net3'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <serial type='pty'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/console.log' append='off'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target type='isa-serial' port='0'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:        <model name='isa-serial'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      </target>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/console.log' append='off'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <target type='serial' port='0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </console>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <input type='tablet' bus='usb'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='input0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='usb' bus='0' port='1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <input type='mouse' bus='ps2'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='input1'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <input type='keyboard' bus='ps2'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='input2'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <listen type='address' address='::0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <audio id='1' type='none'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='video0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <watchdog model='itco' action='reset'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='watchdog0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </watchdog>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <memballoon model='virtio'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <stats period='10'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='balloon0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <rng model='virtio'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <backend model='random'>/dev/urandom</backend>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <alias name='rng0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <label>system_u:system_r:svirt_t:s0:c395,c549</label>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c395,c549</imagelabel>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <label>+107:+107</label>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <imagelabel>+107:+107</imagelabel>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:50:32 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:50:32 np0005593234 nova_compute[227762]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.709 227766 INFO nova.virt.libvirt.driver [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully detached device tapaa6f3a20-d4 from instance a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 from the live domain config.#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.710 227766 DEBUG nova.virt.libvirt.vif [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:49:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.710 227766 DEBUG nova.network.os_vif_util [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.713 227766 DEBUG nova.network.os_vif_util [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a2:5e:fc,bridge_name='br-int',has_traffic_filtering=True,id=aa6f3a20-d469-4e97-90f4-60d418a600e6,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa6f3a20-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.713 227766 DEBUG os_vif [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:5e:fc,bridge_name='br-int',has_traffic_filtering=True,id=aa6f3a20-d469-4e97-90f4-60d418a600e6,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa6f3a20-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.715 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.716 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa6f3a20-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.718 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.719 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.720 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.722 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:32.725 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[112c2b78-688b-4ff0-875b-e0eb312b68df]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.726 227766 INFO os_vif [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:5e:fc,bridge_name='br-int',has_traffic_filtering=True,id=aa6f3a20-d469-4e97-90f4-60d418a600e6,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa6f3a20-d4')#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.727 227766 DEBUG nova.virt.libvirt.guest [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-195762306</nova:name>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:50:32</nova:creationTime>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:port uuid="d4963f79-ec1b-4e35-b34d-22edfeb2fd2f">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:port uuid="fd8d3b77-47c1-41fa-b4ad-e9a868b18abe">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    <nova:port uuid="ab21c4f3-543f-40b2-824b-45f30fc09046">
Jan 23 04:50:32 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:32 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:50:32 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:50:32 np0005593234 nova_compute[227762]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 04:50:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:32.755 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b9bc2ce9-da8c-443a-9a08-27eed82824fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:32.757 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[cdfbebaa-4534-41f2-8af7-39b9a097f553]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:32.781 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[067f4e87-7d65-44d2-95d1-0c798e69c9eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:32.799 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[908c52ef-2ad3-45d1-899d-38954c5bd75c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7808328e-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:bb:22:ae'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 13, 'tx_packets': 11, 'rx_bytes': 826, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 13, 'tx_packets': 11, 'rx_bytes': 826, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 60], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564525, 'reachable_time': 36528, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257559, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:32.821 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9c73251d-6202-4104-90dc-acb5b1ae06e1]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564537, 'tstamp': 564537}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257560, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap7808328e-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 564539, 'tstamp': 564539}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257560, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:32.822 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7808328e-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.824 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.825 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:32.825 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7808328e-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:32.825 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:32.825 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7808328e-20, col_values=(('external_ids', {'iface-id': 'db11772c-e758-43ff-997c-e8c835433e90'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:32.826 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:50:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/978982576' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.934 227766 DEBUG oslo_concurrency.processutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.962 227766 DEBUG nova.storage.rbd_utils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:32 np0005593234 nova_compute[227762]: 2026-01-23 09:50:32.966 227766 DEBUG oslo_concurrency.processutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:50:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/274597971' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.416 227766 DEBUG oslo_concurrency.processutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.418 227766 DEBUG nova.virt.libvirt.vif [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:50:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-680599882',display_name='tempest-ServerDiskConfigTestJSON-server-680599882',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-680599882',id=69,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-wqu3xxlv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskC
onfigTestJSON-211417238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:50:18Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=0409b666-6d7a-4831-9ba3-08afe2d0c46b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.418 227766 DEBUG nova.network.os_vif_util [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.419 227766 DEBUG nova.network.os_vif_util [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:53:fd,bridge_name='br-int',has_traffic_filtering=True,id=0b1311fa-410f-4d76-a118-cd5f14a68f51,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b1311fa-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.420 227766 DEBUG nova.objects.instance [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0409b666-6d7a-4831-9ba3-08afe2d0c46b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:33.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:33.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.616 227766 DEBUG nova.compute.manager [req-9746c240-c8e8-47fb-9cf8-20715db2a578 req-2ea23829-cd40-46d8-ad10-0d1c169dc5e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-unplugged-aa6f3a20-d469-4e97-90f4-60d418a600e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.616 227766 DEBUG oslo_concurrency.lockutils [req-9746c240-c8e8-47fb-9cf8-20715db2a578 req-2ea23829-cd40-46d8-ad10-0d1c169dc5e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.617 227766 DEBUG oslo_concurrency.lockutils [req-9746c240-c8e8-47fb-9cf8-20715db2a578 req-2ea23829-cd40-46d8-ad10-0d1c169dc5e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.617 227766 DEBUG oslo_concurrency.lockutils [req-9746c240-c8e8-47fb-9cf8-20715db2a578 req-2ea23829-cd40-46d8-ad10-0d1c169dc5e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.617 227766 DEBUG nova.compute.manager [req-9746c240-c8e8-47fb-9cf8-20715db2a578 req-2ea23829-cd40-46d8-ad10-0d1c169dc5e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] No waiting events found dispatching network-vif-unplugged-aa6f3a20-d469-4e97-90f4-60d418a600e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.618 227766 WARNING nova.compute.manager [req-9746c240-c8e8-47fb-9cf8-20715db2a578 req-2ea23829-cd40-46d8-ad10-0d1c169dc5e0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received unexpected event network-vif-unplugged-aa6f3a20-d469-4e97-90f4-60d418a600e6 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.643 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  <uuid>0409b666-6d7a-4831-9ba3-08afe2d0c46b</uuid>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  <name>instance-00000045</name>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-680599882</nova:name>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:50:32</nova:creationTime>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <nova:user uuid="0cfac2191989448ead77e75ca3910ac4">tempest-ServerDiskConfigTestJSON-211417238-project-member</nova:user>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <nova:project uuid="86d938c8e2bb41a79012befd500d1088">tempest-ServerDiskConfigTestJSON-211417238</nova:project>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <nova:port uuid="0b1311fa-410f-4d76-a118-cd5f14a68f51">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <entry name="serial">0409b666-6d7a-4831-9ba3-08afe2d0c46b</entry>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <entry name="uuid">0409b666-6d7a-4831-9ba3-08afe2d0c46b</entry>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk.config">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:63:53:fd"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <target dev="tap0b1311fa-41"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/console.log" append="off"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:50:33 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:50:33 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:50:33 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:50:33 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.644 227766 DEBUG nova.compute.manager [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Preparing to wait for external event network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.644 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.644 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.644 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.645 227766 DEBUG nova.virt.libvirt.vif [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:50:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-680599882',display_name='tempest-ServerDiskConfigTestJSON-server-680599882',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-680599882',id=69,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-wqu3xxlv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-S
erverDiskConfigTestJSON-211417238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:50:18Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=0409b666-6d7a-4831-9ba3-08afe2d0c46b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.645 227766 DEBUG nova.network.os_vif_util [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.646 227766 DEBUG nova.network.os_vif_util [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:53:fd,bridge_name='br-int',has_traffic_filtering=True,id=0b1311fa-410f-4d76-a118-cd5f14a68f51,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b1311fa-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.646 227766 DEBUG os_vif [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:53:fd,bridge_name='br-int',has_traffic_filtering=True,id=0b1311fa-410f-4d76-a118-cd5f14a68f51,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b1311fa-41') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.647 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.647 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.648 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.653 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.653 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b1311fa-41, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.654 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b1311fa-41, col_values=(('external_ids', {'iface-id': '0b1311fa-410f-4d76-a118-cd5f14a68f51', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:63:53:fd', 'vm-uuid': '0409b666-6d7a-4831-9ba3-08afe2d0c46b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:33 np0005593234 NetworkManager[48942]: <info>  [1769161833.6566] manager: (tap0b1311fa-41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.655 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.658 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.660 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.661 227766 INFO os_vif [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:53:fd,bridge_name='br-int',has_traffic_filtering=True,id=0b1311fa-410f-4d76-a118-cd5f14a68f51,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b1311fa-41')#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.763 227766 DEBUG oslo_concurrency.lockutils [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.767 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.767 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.768 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No VIF found with MAC fa:16:3e:63:53:fd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.768 227766 INFO nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Using config drive#033[00m
Jan 23 04:50:33 np0005593234 nova_compute[227762]: 2026-01-23 09:50:33.793 227766 DEBUG nova.storage.rbd_utils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.154 227766 INFO nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Creating config drive at /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/disk.config#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.159 227766 DEBUG oslo_concurrency.processutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5zo5lvgw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.286 227766 DEBUG oslo_concurrency.processutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5zo5lvgw" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.311 227766 DEBUG nova.storage.rbd_utils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.315 227766 DEBUG oslo_concurrency.processutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/disk.config 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.453 227766 DEBUG oslo_concurrency.processutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/disk.config 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.454 227766 INFO nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Deleting local config drive /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/disk.config because it was imported into RBD.#033[00m
Jan 23 04:50:34 np0005593234 kernel: tap0b1311fa-41: entered promiscuous mode
Jan 23 04:50:34 np0005593234 systemd-udevd[257551]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:50:34 np0005593234 NetworkManager[48942]: <info>  [1769161834.5015] manager: (tap0b1311fa-41): new Tun device (/org/freedesktop/NetworkManager/Devices/115)
Jan 23 04:50:34 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:34Z|00210|binding|INFO|Claiming lport 0b1311fa-410f-4d76-a118-cd5f14a68f51 for this chassis.
Jan 23 04:50:34 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:34Z|00211|binding|INFO|0b1311fa-410f-4d76-a118-cd5f14a68f51: Claiming fa:16:3e:63:53:fd 10.100.0.4
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.504 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:34 np0005593234 NetworkManager[48942]: <info>  [1769161834.5110] device (tap0b1311fa-41): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:50:34 np0005593234 NetworkManager[48942]: <info>  [1769161834.5117] device (tap0b1311fa-41): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:50:34 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:34Z|00212|binding|INFO|Setting lport 0b1311fa-410f-4d76-a118-cd5f14a68f51 ovn-installed in OVS
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.523 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.525 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:34 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:34Z|00213|binding|INFO|Setting lport 0b1311fa-410f-4d76-a118-cd5f14a68f51 up in Southbound
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.530 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:53:fd 10.100.0.4'], port_security=['fa:16:3e:63:53:fd 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0409b666-6d7a-4831-9ba3-08afe2d0c46b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=0b1311fa-410f-4d76-a118-cd5f14a68f51) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.531 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 0b1311fa-410f-4d76-a118-cd5f14a68f51 in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 bound to our chassis#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.533 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6d2cdc4c-47a0-475b-8e71-39465d365de3#033[00m
Jan 23 04:50:34 np0005593234 systemd-machined[195626]: New machine qemu-28-instance-00000045.
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.544 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a13323c7-f902-4564-9f6d-a7e7b043023e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.544 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6d2cdc4c-41 in ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.546 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6d2cdc4c-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.546 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[18876a2c-634e-42bf-804b-bf2616c531c5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.546 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1a4155b1-0f8d-4909-a942-01750bcdee43]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 systemd[1]: Started Virtual Machine qemu-28-instance-00000045.
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.566 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[4d94cbf8-96fd-423a-bd30-00a2c78b8786]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.594 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6dc29971-23d1-4071-b6c7-f532828dd37d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.624 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d6b65ac7-68b3-46ca-9924-ee27d68ab05e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.630 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[938ef7f3-ff62-4833-8c14-537664d136cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 NetworkManager[48942]: <info>  [1769161834.6312] manager: (tap6d2cdc4c-40): new Veth device (/org/freedesktop/NetworkManager/Devices/116)
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.657 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a84cf8ed-f06f-4754-a6e5-c3ac3f8ae9ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.660 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[359d4c8a-fe30-4dcc-9662-1bc60a91a9ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 NetworkManager[48942]: <info>  [1769161834.6803] device (tap6d2cdc4c-40): carrier: link connected
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.686 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[be4e7b22-e3e4-4ec7-8825-8d1b31dae07e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.698 227766 DEBUG nova.network.neutron [req-8e14439f-7ed6-4589-8221-89477e41dabe req-0a87b0a4-6040-485a-8773-2cd501a625cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Updated VIF entry in instance network info cache for port 0b1311fa-410f-4d76-a118-cd5f14a68f51. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.699 227766 DEBUG nova.network.neutron [req-8e14439f-7ed6-4589-8221-89477e41dabe req-0a87b0a4-6040-485a-8773-2cd501a625cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Updating instance_info_cache with network_info: [{"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.701 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c1a2054c-ea56-425d-b2f4-bd7f837a7e36]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d2cdc4c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:5a:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572964, 'reachable_time': 17485, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257706, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.714 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9141fd0e-f51a-413d-8ff5-a6d7f778ef77]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:5a26'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 572964, 'tstamp': 572964}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 257707, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.731 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[36695b43-427d-44e9-b8b2-e19fecac6ddc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d2cdc4c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:5a:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572964, 'reachable_time': 17485, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 257708, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.734 227766 DEBUG oslo_concurrency.lockutils [req-8e14439f-7ed6-4589-8221-89477e41dabe req-0a87b0a4-6040-485a-8773-2cd501a625cc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-0409b666-6d7a-4831-9ba3-08afe2d0c46b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.758 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[925323cc-b94a-4bf5-a6f5-5de326ded5ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.774 227766 DEBUG nova.compute.manager [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-deleted-aa6f3a20-d469-4e97-90f4-60d418a600e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.775 227766 INFO nova.compute.manager [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Neutron deleted interface aa6f3a20-d469-4e97-90f4-60d418a600e6; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.775 227766 DEBUG nova.network.neutron [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "address": "fa:16:3e:a3:8d:2a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd8d3b77-47", "ovs_interfaceid": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab21c4f3-543f-40b2-824b-45f30fc09046", "address": "fa:16:3e:c2:9a:9a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab21c4f3-54", "ovs_interfaceid": "ab21c4f3-543f-40b2-824b-45f30fc09046", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.790 227766 DEBUG nova.compute.manager [req-9dc3883d-e928-4171-9f11-00bf8292e0a8 req-f8ec7369-2009-4277-ad8f-aa08537b52ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received event network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.791 227766 DEBUG oslo_concurrency.lockutils [req-9dc3883d-e928-4171-9f11-00bf8292e0a8 req-f8ec7369-2009-4277-ad8f-aa08537b52ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.791 227766 DEBUG oslo_concurrency.lockutils [req-9dc3883d-e928-4171-9f11-00bf8292e0a8 req-f8ec7369-2009-4277-ad8f-aa08537b52ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.791 227766 DEBUG oslo_concurrency.lockutils [req-9dc3883d-e928-4171-9f11-00bf8292e0a8 req-f8ec7369-2009-4277-ad8f-aa08537b52ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.791 227766 DEBUG nova.compute.manager [req-9dc3883d-e928-4171-9f11-00bf8292e0a8 req-f8ec7369-2009-4277-ad8f-aa08537b52ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Processing event network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.808 227766 DEBUG nova.objects.instance [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lazy-loading 'system_metadata' on Instance uuid a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.823 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5f300581-e75e-4f01-b4da-36bb454aa35f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.824 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d2cdc4c-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.825 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.825 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d2cdc4c-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.827 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:34 np0005593234 NetworkManager[48942]: <info>  [1769161834.8282] manager: (tap6d2cdc4c-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Jan 23 04:50:34 np0005593234 kernel: tap6d2cdc4c-40: entered promiscuous mode
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.828 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.830 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6d2cdc4c-40, col_values=(('external_ids', {'iface-id': '04f6c0b6-99ee-4958-bc01-68fa310042f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.831 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.833 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.834 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:50:34 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:34Z|00214|binding|INFO|Releasing lport 04f6c0b6-99ee-4958-bc01-68fa310042f0 from this chassis (sb_readonly=0)
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.835 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6f5ccd0a-ecc4-4cd3-8232-92e0ce22c9e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.836 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.836 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'env', 'PROCESS_TAG=haproxy-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6d2cdc4c-47a0-475b-8e71-39465d365de3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.836 227766 DEBUG nova.objects.instance [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lazy-loading 'flavor' on Instance uuid a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.847 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.859 227766 DEBUG nova.virt.libvirt.vif [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:49:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.860 227766 DEBUG nova.network.os_vif_util [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Converting VIF {"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.861 227766 DEBUG nova.network.os_vif_util [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a2:5e:fc,bridge_name='br-int',has_traffic_filtering=True,id=aa6f3a20-d469-4e97-90f4-60d418a600e6,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa6f3a20-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.864 227766 DEBUG nova.virt.libvirt.guest [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a2:5e:fc"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaa6f3a20-d4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.868 227766 DEBUG nova.virt.libvirt.guest [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a2:5e:fc"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaa6f3a20-d4"/></interface>not found in domain: <domain type='kvm' id='27'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <name>instance-00000042</name>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <uuid>a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</uuid>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-195762306</nova:name>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:50:32</nova:creationTime>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:port uuid="d4963f79-ec1b-4e35-b34d-22edfeb2fd2f">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:port uuid="fd8d3b77-47c1-41fa-b4ad-e9a868b18abe">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:port uuid="ab21c4f3-543f-40b2-824b-45f30fc09046">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:50:34 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <memory unit='KiB'>131072</memory>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <vcpu placement='static'>1</vcpu>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <resource>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <partition>/machine</partition>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </resource>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <sysinfo type='smbios'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <entry name='manufacturer'>RDO</entry>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <entry name='serial'>a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</entry>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <entry name='uuid'>a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</entry>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <entry name='family'>Virtual Machine</entry>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <boot dev='hd'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <smbios mode='sysinfo'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <vmcoreinfo state='on'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <model fallback='forbid'>Nehalem</model>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <feature policy='require' name='x2apic'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <feature policy='require' name='hypervisor'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <feature policy='require' name='vme'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <clock offset='utc'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <timer name='hpet' present='no'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <on_poweroff>destroy</on_poweroff>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <on_reboot>restart</on_reboot>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <on_crash>destroy</on_crash>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <disk type='network' device='disk'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk' index='2'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target dev='vda' bus='virtio'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='virtio-disk0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <disk type='network' device='cdrom'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk.config' index='1'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target dev='sda' bus='sata'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <readonly/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='sata0-0-0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pcie.0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='1' port='0x10'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.1'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='2' port='0x11'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.2'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='3' port='0x12'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.3'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='4' port='0x13'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.4'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='5' port='0x14'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.5'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='6' port='0x15'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.6'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='7' port='0x16'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.7'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='8' port='0x17'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.8'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='9' port='0x18'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.9'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='10' port='0x19'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.10'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='11' port='0x1a'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.11'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='12' port='0x1b'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.12'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='13' port='0x1c'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.13'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='14' port='0x1d'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.14'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='15' port='0x1e'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.15'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='16' port='0x1f'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.16'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='17' port='0x20'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.17'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='18' port='0x21'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.18'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='19' port='0x22'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.19'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='20' port='0x23'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.20'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='21' port='0x24'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.21'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='22' port='0x25'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.22'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='23' port='0x26'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.23'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='24' port='0x27'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.24'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='25' port='0x28'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.25'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-pci-bridge'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.26'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='usb'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='sata' index='0'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='ide'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:fc:6f:8d'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target dev='tapd4963f79-ec'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='net0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:a3:8d:2a'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target dev='tapfd8d3b77-47'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='net2'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:c2:9a:9a'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target dev='tapab21c4f3-54'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='net3'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <serial type='pty'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/console.log' append='off'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target type='isa-serial' port='0'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <model name='isa-serial'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      </target>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/console.log' append='off'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target type='serial' port='0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </console>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <input type='tablet' bus='usb'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='input0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='usb' bus='0' port='1'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <input type='mouse' bus='ps2'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='input1'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <input type='keyboard' bus='ps2'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='input2'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <listen type='address' address='::0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <audio id='1' type='none'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='video0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <watchdog model='itco' action='reset'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='watchdog0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </watchdog>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <memballoon model='virtio'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <stats period='10'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='balloon0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <rng model='virtio'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <backend model='random'>/dev/urandom</backend>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='rng0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <label>system_u:system_r:svirt_t:s0:c395,c549</label>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c395,c549</imagelabel>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <label>+107:+107</label>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <imagelabel>+107:+107</imagelabel>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:50:34 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:50:34 np0005593234 nova_compute[227762]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.871 227766 DEBUG nova.virt.libvirt.guest [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:a2:5e:fc"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaa6f3a20-d4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.875 227766 DEBUG nova.virt.libvirt.guest [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:a2:5e:fc"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapaa6f3a20-d4"/></interface>not found in domain: <domain type='kvm' id='27'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <name>instance-00000042</name>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <uuid>a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</uuid>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-195762306</nova:name>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:50:32</nova:creationTime>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:port uuid="d4963f79-ec1b-4e35-b34d-22edfeb2fd2f">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:port uuid="fd8d3b77-47c1-41fa-b4ad-e9a868b18abe">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:port uuid="ab21c4f3-543f-40b2-824b-45f30fc09046">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:50:34 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <memory unit='KiB'>131072</memory>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <vcpu placement='static'>1</vcpu>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <resource>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <partition>/machine</partition>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </resource>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <sysinfo type='smbios'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <entry name='manufacturer'>RDO</entry>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <entry name='serial'>a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</entry>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <entry name='uuid'>a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92</entry>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <entry name='family'>Virtual Machine</entry>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <boot dev='hd'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <smbios mode='sysinfo'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <vmcoreinfo state='on'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <model fallback='forbid'>Nehalem</model>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <feature policy='require' name='x2apic'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <feature policy='require' name='hypervisor'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <feature policy='require' name='vme'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <clock offset='utc'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <timer name='hpet' present='no'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <on_poweroff>destroy</on_poweroff>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <on_reboot>restart</on_reboot>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <on_crash>destroy</on_crash>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <disk type='network' device='disk'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk' index='2'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target dev='vda' bus='virtio'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='virtio-disk0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <disk type='network' device='cdrom'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_disk.config' index='1'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target dev='sda' bus='sata'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <readonly/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='sata0-0-0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pcie.0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='1' port='0x10'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.1'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='2' port='0x11'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.2'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='3' port='0x12'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.3'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='4' port='0x13'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.4'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='5' port='0x14'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.5'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='6' port='0x15'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.6'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='7' port='0x16'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.7'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='8' port='0x17'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.8'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='9' port='0x18'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.9'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='10' port='0x19'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.10'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='11' port='0x1a'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.11'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='12' port='0x1b'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.12'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='13' port='0x1c'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.13'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='14' port='0x1d'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.14'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='15' port='0x1e'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.15'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='16' port='0x1f'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.16'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='17' port='0x20'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.17'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='18' port='0x21'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.18'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='19' port='0x22'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.19'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='20' port='0x23'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.20'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='21' port='0x24'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.21'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='22' port='0x25'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.22'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='23' port='0x26'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.23'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='24' port='0x27'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.24'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target chassis='25' port='0x28'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.25'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model name='pcie-pci-bridge'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='pci.26'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='usb'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <controller type='sata' index='0'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='ide'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </controller>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:fc:6f:8d'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target dev='tapd4963f79-ec'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='net0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:a3:8d:2a'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target dev='tapfd8d3b77-47'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='net2'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x07' slot='0x00' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:c2:9a:9a'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target dev='tapab21c4f3-54'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='net3'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x08' slot='0x00' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <serial type='pty'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/console.log' append='off'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target type='isa-serial' port='0'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:        <model name='isa-serial'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      </target>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <console type='pty' tty='/dev/pts/0'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <source path='/dev/pts/0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92/console.log' append='off'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <target type='serial' port='0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </console>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <input type='tablet' bus='usb'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='input0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='usb' bus='0' port='1'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <input type='mouse' bus='ps2'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='input1'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <input type='keyboard' bus='ps2'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='input2'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </input>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <listen type='address' address='::0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <audio id='1' type='none'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='video0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <watchdog model='itco' action='reset'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='watchdog0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </watchdog>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <memballoon model='virtio'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <stats period='10'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='balloon0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <rng model='virtio'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <backend model='random'>/dev/urandom</backend>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <alias name='rng0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <label>system_u:system_r:svirt_t:s0:c395,c549</label>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c395,c549</imagelabel>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <label>+107:+107</label>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <imagelabel>+107:+107</imagelabel>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 04:50:34 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:50:34 np0005593234 nova_compute[227762]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.878 227766 WARNING nova.virt.libvirt.driver [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Detaching interface fa:16:3e:a2:5e:fc failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapaa6f3a20-d4' not found.#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.879 227766 DEBUG nova.virt.libvirt.vif [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:49:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.879 227766 DEBUG nova.network.os_vif_util [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Converting VIF {"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.880 227766 DEBUG nova.network.os_vif_util [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a2:5e:fc,bridge_name='br-int',has_traffic_filtering=True,id=aa6f3a20-d469-4e97-90f4-60d418a600e6,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa6f3a20-d4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.881 227766 DEBUG os_vif [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:5e:fc,bridge_name='br-int',has_traffic_filtering=True,id=aa6f3a20-d469-4e97-90f4-60d418a600e6,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa6f3a20-d4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.886 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.886 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaa6f3a20-d4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.887 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.889 227766 INFO os_vif [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a2:5e:fc,bridge_name='br-int',has_traffic_filtering=True,id=aa6f3a20-d469-4e97-90f4-60d418a600e6,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaa6f3a20-d4')#033[00m
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.890 227766 DEBUG nova.virt.libvirt.guest [req-8cb647e5-353e-4c02-99f9-4a5d5533ed1a req-26e9ee4c-1f59-42b6-83ed-98f9f8c42710 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:name>tempest-AttachInterfacesTestJSON-server-195762306</nova:name>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 09:50:34</nova:creationTime>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:user uuid="77cda1e9a0404425a06c34637e696603">tempest-AttachInterfacesTestJSON-746967993-project-member</nova:user>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:project uuid="390d19f683334995a5268cf9b4d5e464">tempest-AttachInterfacesTestJSON-746967993</nova:project>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:port uuid="d4963f79-ec1b-4e35-b34d-22edfeb2fd2f">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:port uuid="fd8d3b77-47c1-41fa-b4ad-e9a868b18abe">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    <nova:port uuid="ab21c4f3-543f-40b2-824b-45f30fc09046">
Jan 23 04:50:34 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 04:50:34 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 04:50:34 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 04:50:34 np0005593234 nova_compute[227762]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.936 144381 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 80a2aae4-afa9-4e60-bde7-8518c5bafa50 with type ""#033[00m
Jan 23 04:50:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:34.938 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:9a:9a 10.100.0.9'], port_security=['fa:16:3e:c2:9a:9a 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-AttachInterfacesTestJSON-1797455705', 'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-AttachInterfacesTestJSON-1797455705', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e14d2748-8402-4583-8740-ef7703629f43', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=6, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=ab21c4f3-543f-40b2-824b-45f30fc09046) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:50:34 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:34Z|00215|binding|INFO|Removing iface tapab21c4f3-54 ovn-installed in OVS
Jan 23 04:50:34 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:34Z|00216|binding|INFO|Removing lport ab21c4f3-543f-40b2-824b-45f30fc09046 ovn-installed in OVS
Jan 23 04:50:34 np0005593234 nova_compute[227762]: 2026-01-23 09:50:34.959 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.212 227766 DEBUG oslo_concurrency.lockutils [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.213 227766 DEBUG oslo_concurrency.lockutils [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.213 227766 DEBUG oslo_concurrency.lockutils [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.213 227766 DEBUG oslo_concurrency.lockutils [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.213 227766 DEBUG oslo_concurrency.lockutils [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.214 227766 INFO nova.compute.manager [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Terminating instance#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.215 227766 DEBUG nova.compute.manager [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:50:35 np0005593234 podman[257737]: 2026-01-23 09:50:35.230944917 +0000 UTC m=+0.048434261 container create 71f8a163e6b2dd2a9d2ce9225cbf4ab7e2198ee3b4a6fdcd1aebfbd957e71536 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 04:50:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.284 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 systemd[1]: Started libpod-conmon-71f8a163e6b2dd2a9d2ce9225cbf4ab7e2198ee3b4a6fdcd1aebfbd957e71536.scope.
Jan 23 04:50:35 np0005593234 podman[257737]: 2026-01-23 09:50:35.204207027 +0000 UTC m=+0.021696391 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:50:35 np0005593234 kernel: tapd4963f79-ec (unregistering): left promiscuous mode
Jan 23 04:50:35 np0005593234 NetworkManager[48942]: <info>  [1769161835.3066] device (tapd4963f79-ec): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.320 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:35Z|00217|binding|INFO|Releasing lport d4963f79-ec1b-4e35-b34d-22edfeb2fd2f from this chassis (sb_readonly=0)
Jan 23 04:50:35 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:35Z|00218|binding|INFO|Setting lport d4963f79-ec1b-4e35-b34d-22edfeb2fd2f down in Southbound
Jan 23 04:50:35 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:35Z|00219|binding|INFO|Removing iface tapd4963f79-ec ovn-installed in OVS
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.324 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.330 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:6f:8d 10.100.0.11'], port_security=['fa:16:3e:fc:6f:8d 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cd689bfc-53a9-43da-a4d7-90eb165eac13', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.209'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=d4963f79-ec1b-4e35-b34d-22edfeb2fd2f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:50:35 np0005593234 kernel: tapfd8d3b77-47 (unregistering): left promiscuous mode
Jan 23 04:50:35 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c7ca396c721ffbd395929abd0d07121d9cc0a0c755b49ef90cbd361fee3d7c00/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:50:35 np0005593234 NetworkManager[48942]: <info>  [1769161835.3406] device (tapfd8d3b77-47): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.341 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.347 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:35Z|00220|binding|INFO|Releasing lport fd8d3b77-47c1-41fa-b4ad-e9a868b18abe from this chassis (sb_readonly=0)
Jan 23 04:50:35 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:35Z|00221|binding|INFO|Setting lport fd8d3b77-47c1-41fa-b4ad-e9a868b18abe down in Southbound
Jan 23 04:50:35 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:35Z|00222|binding|INFO|Removing iface tapfd8d3b77-47 ovn-installed in OVS
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.348 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 podman[257737]: 2026-01-23 09:50:35.357133718 +0000 UTC m=+0.174623092 container init 71f8a163e6b2dd2a9d2ce9225cbf4ab7e2198ee3b4a6fdcd1aebfbd957e71536 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.359 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a3:8d:2a 10.100.0.4'], port_security=['fa:16:3e:a3:8d:2a 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '390d19f683334995a5268cf9b4d5e464', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e14d2748-8402-4583-8740-ef7703629f43', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=396f5815-d5dc-4484-bb15-e71911e6f8a2, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=fd8d3b77-47c1-41fa-b4ad-e9a868b18abe) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:50:35 np0005593234 kernel: tapab21c4f3-54 (unregistering): left promiscuous mode
Jan 23 04:50:35 np0005593234 podman[257737]: 2026-01-23 09:50:35.362979772 +0000 UTC m=+0.180469106 container start 71f8a163e6b2dd2a9d2ce9225cbf4ab7e2198ee3b4a6fdcd1aebfbd957e71536 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 23 04:50:35 np0005593234 NetworkManager[48942]: <info>  [1769161835.3656] device (tapab21c4f3-54): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.367 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.374 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[257753]: [NOTICE]   (257766) : New worker (257771) forked
Jan 23 04:50:35 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[257753]: [NOTICE]   (257766) : Loading success.
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.416 144381 INFO neutron.agent.ovn.metadata.agent [-] Port ab21c4f3-543f-40b2-824b-45f30fc09046 in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 unbound from our chassis#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.417 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.418 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b3f9bf3e-9f7b-40af-b431-0d31e92c5593]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.418 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 namespace which is not needed anymore#033[00m
Jan 23 04:50:35 np0005593234 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000042.scope: Deactivated successfully.
Jan 23 04:50:35 np0005593234 systemd[1]: machine-qemu\x2d27\x2dinstance\x2d00000042.scope: Consumed 16.108s CPU time.
Jan 23 04:50:35 np0005593234 systemd-machined[195626]: Machine qemu-27-instance-00000042 terminated.
Jan 23 04:50:35 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[256642]: [NOTICE]   (256646) : haproxy version is 2.8.14-c23fe91
Jan 23 04:50:35 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[256642]: [NOTICE]   (256646) : path to executable is /usr/sbin/haproxy
Jan 23 04:50:35 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[256642]: [WARNING]  (256646) : Exiting Master process...
Jan 23 04:50:35 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[256642]: [WARNING]  (256646) : Exiting Master process...
Jan 23 04:50:35 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[256642]: [ALERT]    (256646) : Current worker (256648) exited with code 143 (Terminated)
Jan 23 04:50:35 np0005593234 neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4[256642]: [WARNING]  (256646) : All workers exited. Exiting... (0)
Jan 23 04:50:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:50:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:35.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:50:35 np0005593234 systemd[1]: libpod-c1d0a2c0d16e3f97d0faa53df03391cd813f95e54f064a4838af00332af89e71.scope: Deactivated successfully.
Jan 23 04:50:35 np0005593234 podman[257797]: 2026-01-23 09:50:35.527700072 +0000 UTC m=+0.040752750 container died c1d0a2c0d16e3f97d0faa53df03391cd813f95e54f064a4838af00332af89e71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 23 04:50:35 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c1d0a2c0d16e3f97d0faa53df03391cd813f95e54f064a4838af00332af89e71-userdata-shm.mount: Deactivated successfully.
Jan 23 04:50:35 np0005593234 systemd[1]: var-lib-containers-storage-overlay-b80bad23c72592126a26e53b18dd5c550f825c5ce0005262243f369de9b0637e-merged.mount: Deactivated successfully.
Jan 23 04:50:35 np0005593234 podman[257797]: 2026-01-23 09:50:35.566512951 +0000 UTC m=+0.079565629 container cleanup c1d0a2c0d16e3f97d0faa53df03391cd813f95e54f064a4838af00332af89e71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 04:50:35 np0005593234 systemd[1]: libpod-conmon-c1d0a2c0d16e3f97d0faa53df03391cd813f95e54f064a4838af00332af89e71.scope: Deactivated successfully.
Jan 23 04:50:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:35.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:35 np0005593234 podman[257827]: 2026-01-23 09:50:35.619557236 +0000 UTC m=+0.035619199 container remove c1d0a2c0d16e3f97d0faa53df03391cd813f95e54f064a4838af00332af89e71 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.625 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1be5f068-9d86-4d0c-976e-047023bb02e8]: (4, ('Fri Jan 23 09:50:35 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 (c1d0a2c0d16e3f97d0faa53df03391cd813f95e54f064a4838af00332af89e71)\nc1d0a2c0d16e3f97d0faa53df03391cd813f95e54f064a4838af00332af89e71\nFri Jan 23 09:50:35 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 (c1d0a2c0d16e3f97d0faa53df03391cd813f95e54f064a4838af00332af89e71)\nc1d0a2c0d16e3f97d0faa53df03391cd813f95e54f064a4838af00332af89e71\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.627 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[04c596bd-dd2a-4118-b692-d4a36864e893]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.627 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7808328e-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.629 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.632 227766 DEBUG nova.compute.manager [req-0b2ce91d-ca32-4195-a2d4-f85a683da824 req-a68ba1c9-eda6-4f7a-9763-71466b16cae7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-unplugged-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.633 227766 DEBUG oslo_concurrency.lockutils [req-0b2ce91d-ca32-4195-a2d4-f85a683da824 req-a68ba1c9-eda6-4f7a-9763-71466b16cae7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.633 227766 DEBUG oslo_concurrency.lockutils [req-0b2ce91d-ca32-4195-a2d4-f85a683da824 req-a68ba1c9-eda6-4f7a-9763-71466b16cae7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.633 227766 DEBUG oslo_concurrency.lockutils [req-0b2ce91d-ca32-4195-a2d4-f85a683da824 req-a68ba1c9-eda6-4f7a-9763-71466b16cae7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.633 227766 DEBUG nova.compute.manager [req-0b2ce91d-ca32-4195-a2d4-f85a683da824 req-a68ba1c9-eda6-4f7a-9763-71466b16cae7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] No waiting events found dispatching network-vif-unplugged-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:35 np0005593234 kernel: tap7808328e-20: left promiscuous mode
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.634 227766 DEBUG nova.compute.manager [req-0b2ce91d-ca32-4195-a2d4-f85a683da824 req-a68ba1c9-eda6-4f7a-9763-71466b16cae7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-unplugged-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:50:35 np0005593234 NetworkManager[48942]: <info>  [1769161835.6516] manager: (tapfd8d3b77-47): new Tun device (/org/freedesktop/NetworkManager/Devices/118)
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.659 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.667 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fd125f16-0f83-471a-a7af-a50b4e4ed3ec]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.677 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.682 227766 INFO nova.virt.libvirt.driver [-] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Instance destroyed successfully.#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.682 227766 DEBUG nova.objects.instance [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lazy-loading 'resources' on Instance uuid a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.683 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9b27a870-1f79-419c-91bd-ab8d9f60803a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.685 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4ff32d31-bc0d-4d68-81b9-2b9d8a58d997]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.691 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.700 227766 DEBUG nova.virt.libvirt.vif [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:49:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.701 227766 DEBUG nova.network.os_vif_util [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.701 227766 DEBUG nova.network.os_vif_util [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fc:6f:8d,bridge_name='br-int',has_traffic_filtering=True,id=d4963f79-ec1b-4e35-b34d-22edfeb2fd2f,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4963f79-ec') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.702 227766 DEBUG os_vif [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fc:6f:8d,bridge_name='br-int',has_traffic_filtering=True,id=d4963f79-ec1b-4e35-b34d-22edfeb2fd2f,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4963f79-ec') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.703 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.703 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4963f79-ec, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.699 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2e1a6d18-92c1-44b6-8b5f-2b75218b9f67]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 564519, 'reachable_time': 24869, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 257878, 'error': None, 'target': 'ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.704 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.706 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:50:35 np0005593234 systemd[1]: run-netns-ovnmeta\x2d7808328e\x2d22f9\x2d46df\x2dac06\x2df8c3d6ad10c4.mount: Deactivated successfully.
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.710 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7808328e-22f9-46df-ac06-f8c3d6ad10c4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.710 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[8cf75025-4a0e-43ef-85a4-753bf65500d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.711 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.711 144381 INFO neutron.agent.ovn.metadata.agent [-] Port d4963f79-ec1b-4e35-b34d-22edfeb2fd2f in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 unbound from our chassis#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.712 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.712 227766 INFO os_vif [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fc:6f:8d,bridge_name='br-int',has_traffic_filtering=True,id=d4963f79-ec1b-4e35-b34d-22edfeb2fd2f,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4963f79-ec')#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.713 227766 DEBUG nova.virt.libvirt.vif [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:49:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "address": "fa:16:3e:a3:8d:2a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd8d3b77-47", "ovs_interfaceid": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.713 227766 DEBUG nova.network.os_vif_util [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "address": "fa:16:3e:a3:8d:2a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd8d3b77-47", "ovs_interfaceid": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.713 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[89f225e6-2304-410a-bd32-1b2ad8232772]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.714 144381 INFO neutron.agent.ovn.metadata.agent [-] Port fd8d3b77-47c1-41fa-b4ad-e9a868b18abe in datapath 7808328e-22f9-46df-ac06-f8c3d6ad10c4 unbound from our chassis#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.714 227766 DEBUG nova.network.os_vif_util [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a3:8d:2a,bridge_name='br-int',has_traffic_filtering=True,id=fd8d3b77-47c1-41fa-b4ad-e9a868b18abe,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd8d3b77-47') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.714 227766 DEBUG os_vif [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:8d:2a,bridge_name='br-int',has_traffic_filtering=True,id=fd8d3b77-47c1-41fa-b4ad-e9a868b18abe,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd8d3b77-47') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.715 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7808328e-22f9-46df-ac06-f8c3d6ad10c4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.715 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.715 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfd8d3b77-47, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:35.715 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e35bbc6a-1667-4331-9bed-998a31ab46a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.716 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.718 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.722 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.724 227766 INFO os_vif [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a3:8d:2a,bridge_name='br-int',has_traffic_filtering=True,id=fd8d3b77-47c1-41fa-b4ad-e9a868b18abe,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfd8d3b77-47')#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.725 227766 DEBUG nova.virt.libvirt.vif [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:49:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachInterfacesTestJSON-server-195762306',display_name='tempest-AttachInterfacesTestJSON-server-195762306',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachinterfacestestjson-server-195762306',id=66,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMKDXyUsq2IpB3CeC/QRd+GAkbVbRm3n46urNe3KoaPDhQtHAmmc6lOr2aLr55mKmrtT+BGMeHEmcelobR3kjxuZAMTkWOef6DX29hPC/Zh3zLJkWcxmmCMg4iVg+vcDag==',key_name='tempest-keypair-117314675',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:49:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='390d19f683334995a5268cf9b4d5e464',ramdisk_id='',reservation_id='r-wt5r4cdg',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachInterfacesTestJSON-746967993',owner_user_name='tempest-AttachInterfacesTestJSON-746967993-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:49:12Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='77cda1e9a0404425a06c34637e696603',uuid=a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ab21c4f3-543f-40b2-824b-45f30fc09046", "address": "fa:16:3e:c2:9a:9a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab21c4f3-54", "ovs_interfaceid": "ab21c4f3-543f-40b2-824b-45f30fc09046", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.725 227766 DEBUG nova.network.os_vif_util [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converting VIF {"id": "ab21c4f3-543f-40b2-824b-45f30fc09046", "address": "fa:16:3e:c2:9a:9a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab21c4f3-54", "ovs_interfaceid": "ab21c4f3-543f-40b2-824b-45f30fc09046", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.725 227766 DEBUG nova.network.os_vif_util [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c2:9a:9a,bridge_name='br-int',has_traffic_filtering=True,id=ab21c4f3-543f-40b2-824b-45f30fc09046,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapab21c4f3-54') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.726 227766 DEBUG os_vif [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:9a:9a,bridge_name='br-int',has_traffic_filtering=True,id=ab21c4f3-543f-40b2-824b-45f30fc09046,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapab21c4f3-54') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.727 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.727 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapab21c4f3-54, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.728 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.730 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.731 227766 INFO os_vif [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:9a:9a,bridge_name='br-int',has_traffic_filtering=True,id=ab21c4f3-543f-40b2-824b-45f30fc09046,network=Network(7808328e-22f9-46df-ac06-f8c3d6ad10c4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapab21c4f3-54')#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.754 227766 DEBUG nova.compute.manager [req-42d710cd-efe6-4a7c-b45c-9d0be4fcb878 req-c1834300-7233-43c9-8266-e244d3a9539a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-plugged-aa6f3a20-d469-4e97-90f4-60d418a600e6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.755 227766 DEBUG oslo_concurrency.lockutils [req-42d710cd-efe6-4a7c-b45c-9d0be4fcb878 req-c1834300-7233-43c9-8266-e244d3a9539a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.755 227766 DEBUG oslo_concurrency.lockutils [req-42d710cd-efe6-4a7c-b45c-9d0be4fcb878 req-c1834300-7233-43c9-8266-e244d3a9539a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.755 227766 DEBUG oslo_concurrency.lockutils [req-42d710cd-efe6-4a7c-b45c-9d0be4fcb878 req-c1834300-7233-43c9-8266-e244d3a9539a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.755 227766 DEBUG nova.compute.manager [req-42d710cd-efe6-4a7c-b45c-9d0be4fcb878 req-c1834300-7233-43c9-8266-e244d3a9539a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] No waiting events found dispatching network-vif-plugged-aa6f3a20-d469-4e97-90f4-60d418a600e6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:35 np0005593234 nova_compute[227762]: 2026-01-23 09:50:35.756 227766 WARNING nova.compute.manager [req-42d710cd-efe6-4a7c-b45c-9d0be4fcb878 req-c1834300-7233-43c9-8266-e244d3a9539a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received unexpected event network-vif-plugged-aa6f3a20-d469-4e97-90f4-60d418a600e6 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:50:35 np0005593234 podman[257852]: 2026-01-23 09:50:35.764487915 +0000 UTC m=+0.088367645 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.377 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161836.3767495, 0409b666-6d7a-4831-9ba3-08afe2d0c46b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.377 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] VM Started (Lifecycle Event)#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.380 227766 DEBUG nova.compute.manager [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.384 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.387 227766 INFO nova.virt.libvirt.driver [-] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Instance spawned successfully.#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.387 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.410 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.418 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.418 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.419 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.419 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.419 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.420 227766 DEBUG nova.virt.libvirt.driver [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.423 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.451 227766 INFO nova.virt.libvirt.driver [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Deleting instance files /var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_del#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.452 227766 INFO nova.virt.libvirt.driver [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Deletion of /var/lib/nova/instances/a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92_del complete#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.482 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.482 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161836.3797402, 0409b666-6d7a-4831-9ba3-08afe2d0c46b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.482 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.520 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.522 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161836.382682, 0409b666-6d7a-4831-9ba3-08afe2d0c46b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.523 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.545 227766 INFO nova.compute.manager [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Took 17.59 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.546 227766 DEBUG nova.compute.manager [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.547 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.547 227766 INFO nova.compute.manager [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Took 1.33 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.548 227766 DEBUG oslo.service.loopingcall [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.548 227766 DEBUG nova.compute.manager [-] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.548 227766 DEBUG nova.network.neutron [-] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.555 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.588 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.624 227766 INFO nova.compute.manager [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Took 21.82 seconds to build instance.#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.651 227766 DEBUG oslo_concurrency.lockutils [None req-d01c2c85-0d15-4f7c-a895-56b4cf35d862 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.934s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.895 227766 DEBUG nova.compute.manager [req-137d88b8-19a8-4519-8fff-043ce7b842eb req-90a98228-f169-4e35-bbe9-05da9b52d295 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-deleted-ab21c4f3-543f-40b2-824b-45f30fc09046 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.896 227766 INFO nova.compute.manager [req-137d88b8-19a8-4519-8fff-043ce7b842eb req-90a98228-f169-4e35-bbe9-05da9b52d295 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Neutron deleted interface ab21c4f3-543f-40b2-824b-45f30fc09046; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.896 227766 DEBUG nova.network.neutron [req-137d88b8-19a8-4519-8fff-043ce7b842eb req-90a98228-f169-4e35-bbe9-05da9b52d295 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "address": "fa:16:3e:a3:8d:2a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd8d3b77-47", "ovs_interfaceid": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.918 227766 DEBUG nova.compute.manager [req-137d88b8-19a8-4519-8fff-043ce7b842eb req-90a98228-f169-4e35-bbe9-05da9b52d295 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Detach interface failed, port_id=ab21c4f3-543f-40b2-824b-45f30fc09046, reason: Instance a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.919 227766 DEBUG nova.compute.manager [req-137d88b8-19a8-4519-8fff-043ce7b842eb req-90a98228-f169-4e35-bbe9-05da9b52d295 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-unplugged-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.919 227766 DEBUG oslo_concurrency.lockutils [req-137d88b8-19a8-4519-8fff-043ce7b842eb req-90a98228-f169-4e35-bbe9-05da9b52d295 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.919 227766 DEBUG oslo_concurrency.lockutils [req-137d88b8-19a8-4519-8fff-043ce7b842eb req-90a98228-f169-4e35-bbe9-05da9b52d295 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.919 227766 DEBUG oslo_concurrency.lockutils [req-137d88b8-19a8-4519-8fff-043ce7b842eb req-90a98228-f169-4e35-bbe9-05da9b52d295 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.919 227766 DEBUG nova.compute.manager [req-137d88b8-19a8-4519-8fff-043ce7b842eb req-90a98228-f169-4e35-bbe9-05da9b52d295 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] No waiting events found dispatching network-vif-unplugged-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:36 np0005593234 nova_compute[227762]: 2026-01-23 09:50:36.920 227766 DEBUG nova.compute.manager [req-137d88b8-19a8-4519-8fff-043ce7b842eb req-90a98228-f169-4e35-bbe9-05da9b52d295 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-unplugged-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:50:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:37.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:37.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:37 np0005593234 nova_compute[227762]: 2026-01-23 09:50:37.750 227766 DEBUG nova.compute.manager [req-2d3dd40f-40ed-42a0-8a71-bf54f0aa532b req-9e3a11c9-d6bb-45d8-9b59-e40ed9de1655 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-plugged-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:37 np0005593234 nova_compute[227762]: 2026-01-23 09:50:37.750 227766 DEBUG oslo_concurrency.lockutils [req-2d3dd40f-40ed-42a0-8a71-bf54f0aa532b req-9e3a11c9-d6bb-45d8-9b59-e40ed9de1655 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:37 np0005593234 nova_compute[227762]: 2026-01-23 09:50:37.751 227766 DEBUG oslo_concurrency.lockutils [req-2d3dd40f-40ed-42a0-8a71-bf54f0aa532b req-9e3a11c9-d6bb-45d8-9b59-e40ed9de1655 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:37 np0005593234 nova_compute[227762]: 2026-01-23 09:50:37.751 227766 DEBUG oslo_concurrency.lockutils [req-2d3dd40f-40ed-42a0-8a71-bf54f0aa532b req-9e3a11c9-d6bb-45d8-9b59-e40ed9de1655 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:37 np0005593234 nova_compute[227762]: 2026-01-23 09:50:37.751 227766 DEBUG nova.compute.manager [req-2d3dd40f-40ed-42a0-8a71-bf54f0aa532b req-9e3a11c9-d6bb-45d8-9b59-e40ed9de1655 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] No waiting events found dispatching network-vif-plugged-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:37 np0005593234 nova_compute[227762]: 2026-01-23 09:50:37.751 227766 WARNING nova.compute.manager [req-2d3dd40f-40ed-42a0-8a71-bf54f0aa532b req-9e3a11c9-d6bb-45d8-9b59-e40ed9de1655 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received unexpected event network-vif-plugged-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:50:38 np0005593234 nova_compute[227762]: 2026-01-23 09:50:38.202 227766 DEBUG neutronclient.v2_0.client [-] Error message: {"NeutronError": {"type": "PortNotFound", "message": "Port ab21c4f3-543f-40b2-824b-45f30fc09046 could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 23 04:50:38 np0005593234 nova_compute[227762]: 2026-01-23 09:50:38.203 227766 DEBUG nova.network.neutron [-] Unable to show port ab21c4f3-543f-40b2-824b-45f30fc09046 as it no longer exists. _unbind_ports /usr/lib/python3.9/site-packages/nova/network/neutron.py:666#033[00m
Jan 23 04:50:38 np0005593234 nova_compute[227762]: 2026-01-23 09:50:38.355 227766 DEBUG nova.compute.manager [req-f94dfebf-360f-423b-b05a-af84553bdb8f req-91d12102-503f-4b3e-8ef6-ffa920e2dc02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received event network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:38 np0005593234 nova_compute[227762]: 2026-01-23 09:50:38.356 227766 DEBUG oslo_concurrency.lockutils [req-f94dfebf-360f-423b-b05a-af84553bdb8f req-91d12102-503f-4b3e-8ef6-ffa920e2dc02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:38 np0005593234 nova_compute[227762]: 2026-01-23 09:50:38.356 227766 DEBUG oslo_concurrency.lockutils [req-f94dfebf-360f-423b-b05a-af84553bdb8f req-91d12102-503f-4b3e-8ef6-ffa920e2dc02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:38 np0005593234 nova_compute[227762]: 2026-01-23 09:50:38.356 227766 DEBUG oslo_concurrency.lockutils [req-f94dfebf-360f-423b-b05a-af84553bdb8f req-91d12102-503f-4b3e-8ef6-ffa920e2dc02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:38 np0005593234 nova_compute[227762]: 2026-01-23 09:50:38.357 227766 DEBUG nova.compute.manager [req-f94dfebf-360f-423b-b05a-af84553bdb8f req-91d12102-503f-4b3e-8ef6-ffa920e2dc02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] No waiting events found dispatching network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:38 np0005593234 nova_compute[227762]: 2026-01-23 09:50:38.357 227766 WARNING nova.compute.manager [req-f94dfebf-360f-423b-b05a-af84553bdb8f req-91d12102-503f-4b3e-8ef6-ffa920e2dc02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received unexpected event network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:50:38 np0005593234 nova_compute[227762]: 2026-01-23 09:50:38.975 227766 DEBUG nova.compute.manager [req-5aca0116-955d-4daf-a693-d093d16bc152 req-cf29dc7a-1214-44f7-8df1-ba4f7f156510 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-plugged-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:38 np0005593234 nova_compute[227762]: 2026-01-23 09:50:38.976 227766 DEBUG oslo_concurrency.lockutils [req-5aca0116-955d-4daf-a693-d093d16bc152 req-cf29dc7a-1214-44f7-8df1-ba4f7f156510 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:38 np0005593234 nova_compute[227762]: 2026-01-23 09:50:38.976 227766 DEBUG oslo_concurrency.lockutils [req-5aca0116-955d-4daf-a693-d093d16bc152 req-cf29dc7a-1214-44f7-8df1-ba4f7f156510 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:38 np0005593234 nova_compute[227762]: 2026-01-23 09:50:38.977 227766 DEBUG oslo_concurrency.lockutils [req-5aca0116-955d-4daf-a693-d093d16bc152 req-cf29dc7a-1214-44f7-8df1-ba4f7f156510 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:38 np0005593234 nova_compute[227762]: 2026-01-23 09:50:38.977 227766 DEBUG nova.compute.manager [req-5aca0116-955d-4daf-a693-d093d16bc152 req-cf29dc7a-1214-44f7-8df1-ba4f7f156510 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] No waiting events found dispatching network-vif-plugged-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:38 np0005593234 nova_compute[227762]: 2026-01-23 09:50:38.977 227766 WARNING nova.compute.manager [req-5aca0116-955d-4daf-a693-d093d16bc152 req-cf29dc7a-1214-44f7-8df1-ba4f7f156510 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received unexpected event network-vif-plugged-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:50:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:50:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:39.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:50:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:50:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:39.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.761208) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161839761302, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 806, "num_deletes": 250, "total_data_size": 1567811, "memory_usage": 1583592, "flush_reason": "Manual Compaction"}
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161839769251, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 668299, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 41315, "largest_seqno": 42116, "table_properties": {"data_size": 665073, "index_size": 1070, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8668, "raw_average_key_size": 20, "raw_value_size": 658303, "raw_average_value_size": 1567, "num_data_blocks": 49, "num_entries": 420, "num_filter_entries": 420, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161779, "oldest_key_time": 1769161779, "file_creation_time": 1769161839, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 8110 microseconds, and 3000 cpu microseconds.
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.769326) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 668299 bytes OK
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.769344) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.770620) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.770635) EVENT_LOG_v1 {"time_micros": 1769161839770630, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.770651) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 1563599, prev total WAL file size 1563599, number of live WAL files 2.
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.771240) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323535' seq:72057594037927935, type:22 .. '6D6772737461740031353036' seq:0, type:0; will stop at (end)
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(652KB)], [78(11MB)]
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161839771292, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 12508024, "oldest_snapshot_seqno": -1}
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 6507 keys, 9021237 bytes, temperature: kUnknown
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161839842710, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 9021237, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8979130, "index_size": 24717, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16325, "raw_key_size": 167128, "raw_average_key_size": 25, "raw_value_size": 8863762, "raw_average_value_size": 1362, "num_data_blocks": 985, "num_entries": 6507, "num_filter_entries": 6507, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769161839, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.843165) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 9021237 bytes
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.845105) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.8 rd, 126.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 11.3 +0.0 blob) out(8.6 +0.0 blob), read-write-amplify(32.2) write-amplify(13.5) OK, records in: 6993, records dropped: 486 output_compression: NoCompression
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.845141) EVENT_LOG_v1 {"time_micros": 1769161839845124, "job": 48, "event": "compaction_finished", "compaction_time_micros": 71547, "compaction_time_cpu_micros": 27625, "output_level": 6, "num_output_files": 1, "total_output_size": 9021237, "num_input_records": 6993, "num_output_records": 6507, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161839845559, "job": 48, "event": "table_file_deletion", "file_number": 80}
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161839850271, "job": 48, "event": "table_file_deletion", "file_number": 78}
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.771161) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.850425) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.850433) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.850434) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.850436) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:50:39 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:50:39.850437) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.082 227766 DEBUG nova.compute.manager [req-2e557695-1515-43a6-83c0-9dd2375e91e6 req-82e8e3da-dcf3-420f-9c7e-5537f5f29705 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-deleted-d4963f79-ec1b-4e35-b34d-22edfeb2fd2f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.083 227766 INFO nova.compute.manager [req-2e557695-1515-43a6-83c0-9dd2375e91e6 req-82e8e3da-dcf3-420f-9c7e-5537f5f29705 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Neutron deleted interface d4963f79-ec1b-4e35-b34d-22edfeb2fd2f; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.084 227766 DEBUG nova.network.neutron [req-2e557695-1515-43a6-83c0-9dd2375e91e6 req-82e8e3da-dcf3-420f-9c7e-5537f5f29705 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [{"id": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "address": "fa:16:3e:a3:8d:2a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd8d3b77-47", "ovs_interfaceid": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.108 227766 DEBUG nova.compute.manager [req-2e557695-1515-43a6-83c0-9dd2375e91e6 req-82e8e3da-dcf3-420f-9c7e-5537f5f29705 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Detach interface failed, port_id=d4963f79-ec1b-4e35-b34d-22edfeb2fd2f, reason: Instance a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 04:50:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.287 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.596 227766 DEBUG nova.network.neutron [-] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.623 227766 INFO nova.compute.manager [-] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Took 4.07 seconds to deallocate network for instance.#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.632 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [{"id": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "address": "fa:16:3e:fc:6f:8d", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4963f79-ec", "ovs_interfaceid": "d4963f79-ec1b-4e35-b34d-22edfeb2fd2f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "address": "fa:16:3e:a2:5e:fc", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaa6f3a20-d4", "ovs_interfaceid": "aa6f3a20-d469-4e97-90f4-60d418a600e6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "address": "fa:16:3e:a3:8d:2a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfd8d3b77-47", "ovs_interfaceid": "fd8d3b77-47c1-41fa-b4ad-e9a868b18abe", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "ab21c4f3-543f-40b2-824b-45f30fc09046", "address": "fa:16:3e:c2:9a:9a", "network": {"id": "7808328e-22f9-46df-ac06-f8c3d6ad10c4", "bridge": "br-int", "label": "tempest-AttachInterfacesTestJSON-1070463615-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "390d19f683334995a5268cf9b4d5e464", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapab21c4f3-54", "ovs_interfaceid": "ab21c4f3-543f-40b2-824b-45f30fc09046", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.671 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.671 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.671 227766 DEBUG oslo_concurrency.lockutils [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquired lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.672 227766 DEBUG nova.network.neutron [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.673 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.680 227766 DEBUG oslo_concurrency.lockutils [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.681 227766 DEBUG oslo_concurrency.lockutils [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.728 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:40 np0005593234 nova_compute[227762]: 2026-01-23 09:50:40.765 227766 DEBUG oslo_concurrency.processutils [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:40 np0005593234 podman[258181]: 2026-01-23 09:50:40.884406218 +0000 UTC m=+0.059998614 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 04:50:40 np0005593234 podman[258181]: 2026-01-23 09:50:40.97400555 +0000 UTC m=+0.149597936 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0)
Jan 23 04:50:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:50:41 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1757528810' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:50:41 np0005593234 nova_compute[227762]: 2026-01-23 09:50:41.218 227766 DEBUG oslo_concurrency.processutils [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:41 np0005593234 nova_compute[227762]: 2026-01-23 09:50:41.226 227766 DEBUG nova.compute.provider_tree [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:50:41 np0005593234 nova_compute[227762]: 2026-01-23 09:50:41.243 227766 DEBUG nova.scheduler.client.report [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:50:41 np0005593234 nova_compute[227762]: 2026-01-23 09:50:41.264 227766 DEBUG oslo_concurrency.lockutils [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.584s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:41 np0005593234 nova_compute[227762]: 2026-01-23 09:50:41.296 227766 INFO nova.scheduler.client.report [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Deleted allocations for instance a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92#033[00m
Jan 23 04:50:41 np0005593234 nova_compute[227762]: 2026-01-23 09:50:41.368 227766 DEBUG oslo_concurrency.lockutils [None req-d2588d35-82fb-408e-ae80-17158d747fb2 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:41.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:41.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:41 np0005593234 nova_compute[227762]: 2026-01-23 09:50:41.607 227766 INFO nova.network.neutron [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Port d4963f79-ec1b-4e35-b34d-22edfeb2fd2f from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 23 04:50:41 np0005593234 nova_compute[227762]: 2026-01-23 09:50:41.607 227766 INFO nova.network.neutron [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Port aa6f3a20-d469-4e97-90f4-60d418a600e6 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 23 04:50:41 np0005593234 nova_compute[227762]: 2026-01-23 09:50:41.607 227766 INFO nova.network.neutron [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Port fd8d3b77-47c1-41fa-b4ad-e9a868b18abe from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 23 04:50:41 np0005593234 nova_compute[227762]: 2026-01-23 09:50:41.607 227766 INFO nova.network.neutron [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Port ab21c4f3-543f-40b2-824b-45f30fc09046 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 23 04:50:41 np0005593234 nova_compute[227762]: 2026-01-23 09:50:41.608 227766 DEBUG nova.network.neutron [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:50:41 np0005593234 podman[258349]: 2026-01-23 09:50:41.648853743 +0000 UTC m=+0.110307903 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:50:41 np0005593234 nova_compute[227762]: 2026-01-23 09:50:41.666 227766 DEBUG oslo_concurrency.lockutils [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Releasing lock "refresh_cache-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:50:41 np0005593234 nova_compute[227762]: 2026-01-23 09:50:41.707 227766 DEBUG oslo_concurrency.lockutils [None req-6bba6f65-14a7-487b-bccb-cf834454fc29 77cda1e9a0404425a06c34637e696603 390d19f683334995a5268cf9b4d5e464 - - default default] Lock "interface-a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92-aa6f3a20-d469-4e97-90f4-60d418a600e6" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 9.240s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:41 np0005593234 podman[258371]: 2026-01-23 09:50:41.723771466 +0000 UTC m=+0.057488756 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:50:41 np0005593234 podman[258349]: 2026-01-23 09:50:41.729287159 +0000 UTC m=+0.190741299 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 04:50:41 np0005593234 podman[258420]: 2026-01-23 09:50:41.966443173 +0000 UTC m=+0.058791977 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, com.redhat.component=keepalived-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, release=1793, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, io.buildah.version=1.28.2, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc.)
Jan 23 04:50:41 np0005593234 podman[258420]: 2026-01-23 09:50:41.985973273 +0000 UTC m=+0.078322067 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, build-date=2023-02-22T09:23:20, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, architecture=x86_64, com.redhat.component=keepalived-container, io.openshift.expose-services=, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, vcs-type=git)
Jan 23 04:50:42 np0005593234 nova_compute[227762]: 2026-01-23 09:50:42.230 227766 DEBUG nova.compute.manager [req-5076f218-aa35-4d42-b82b-82192a438ddb req-44457759-6f79-4433-bd90-dcc2265809c1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Received event network-vif-deleted-fd8d3b77-47c1-41fa-b4ad-e9a868b18abe external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:42 np0005593234 nova_compute[227762]: 2026-01-23 09:50:42.231 227766 INFO nova.compute.manager [req-5076f218-aa35-4d42-b82b-82192a438ddb req-44457759-6f79-4433-bd90-dcc2265809c1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Neutron deleted interface fd8d3b77-47c1-41fa-b4ad-e9a868b18abe; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 04:50:42 np0005593234 nova_compute[227762]: 2026-01-23 09:50:42.231 227766 DEBUG nova.network.neutron [req-5076f218-aa35-4d42-b82b-82192a438ddb req-44457759-6f79-4433-bd90-dcc2265809c1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Jan 23 04:50:42 np0005593234 nova_compute[227762]: 2026-01-23 09:50:42.233 227766 DEBUG nova.compute.manager [req-5076f218-aa35-4d42-b82b-82192a438ddb req-44457759-6f79-4433-bd90-dcc2265809c1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Detach interface failed, port_id=fd8d3b77-47c1-41fa-b4ad-e9a868b18abe, reason: Instance a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 04:50:42 np0005593234 nova_compute[227762]: 2026-01-23 09:50:42.583 227766 INFO nova.compute.manager [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Rebuilding instance#033[00m
Jan 23 04:50:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:42.825 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:42.826 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:42.827 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:42 np0005593234 nova_compute[227762]: 2026-01-23 09:50:42.885 227766 DEBUG nova.objects.instance [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 0409b666-6d7a-4831-9ba3-08afe2d0c46b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:42 np0005593234 nova_compute[227762]: 2026-01-23 09:50:42.903 227766 DEBUG nova.compute.manager [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:42 np0005593234 nova_compute[227762]: 2026-01-23 09:50:42.954 227766 DEBUG nova.objects.instance [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'pci_requests' on Instance uuid 0409b666-6d7a-4831-9ba3-08afe2d0c46b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:42 np0005593234 nova_compute[227762]: 2026-01-23 09:50:42.970 227766 DEBUG nova.objects.instance [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0409b666-6d7a-4831-9ba3-08afe2d0c46b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:42 np0005593234 nova_compute[227762]: 2026-01-23 09:50:42.986 227766 DEBUG nova.objects.instance [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'resources' on Instance uuid 0409b666-6d7a-4831-9ba3-08afe2d0c46b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:43 np0005593234 nova_compute[227762]: 2026-01-23 09:50:43.000 227766 DEBUG nova.objects.instance [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'migration_context' on Instance uuid 0409b666-6d7a-4831-9ba3-08afe2d0c46b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:43 np0005593234 nova_compute[227762]: 2026-01-23 09:50:43.012 227766 DEBUG nova.objects.instance [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 04:50:43 np0005593234 nova_compute[227762]: 2026-01-23 09:50:43.016 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 04:50:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:50:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:50:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:50:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:43.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:43.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:50:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:50:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:50:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2555136528' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:50:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:50:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2555136528' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:50:44 np0005593234 nova_compute[227762]: 2026-01-23 09:50:44.665 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:50:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:45 np0005593234 nova_compute[227762]: 2026-01-23 09:50:45.288 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:45.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:50:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:45.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:50:45 np0005593234 nova_compute[227762]: 2026-01-23 09:50:45.730 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:47.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:47.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:49.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:50:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:49.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:50:49 np0005593234 podman[258588]: 2026-01-23 09:50:49.790476968 +0000 UTC m=+0.087763611 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:50:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:50 np0005593234 nova_compute[227762]: 2026-01-23 09:50:50.290 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:50 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #48. Immutable memtables: 5.
Jan 23 04:50:50 np0005593234 nova_compute[227762]: 2026-01-23 09:50:50.680 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161835.6793206, a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:50:50 np0005593234 nova_compute[227762]: 2026-01-23 09:50:50.680 227766 INFO nova.compute.manager [-] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:50:50 np0005593234 nova_compute[227762]: 2026-01-23 09:50:50.700 227766 DEBUG nova.compute.manager [None req-71dc672e-ba3f-47f9-ba7a-1bd161e39eb6 - - - - - -] [instance: a4f3d38e-d52c-40ba-aa02-f8bd63ebfc92] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:50:50 np0005593234 nova_compute[227762]: 2026-01-23 09:50:50.731 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:51.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:51.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:52 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:52Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:63:53:fd 10.100.0.4
Jan 23 04:50:52 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:52Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:63:53:fd 10.100.0.4
Jan 23 04:50:53 np0005593234 nova_compute[227762]: 2026-01-23 09:50:53.057 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 04:50:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:50:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:53.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:50:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:53.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:50:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:50:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:54.291 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=26, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=25) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:50:54 np0005593234 nova_compute[227762]: 2026-01-23 09:50:54.291 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:54.292 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:50:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:50:55 np0005593234 nova_compute[227762]: 2026-01-23 09:50:55.293 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:55.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:55.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:55 np0005593234 nova_compute[227762]: 2026-01-23 09:50:55.732 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.070 227766 INFO nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Instance shutdown successfully after 13 seconds.#033[00m
Jan 23 04:50:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:56.294 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '26'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:56 np0005593234 kernel: tap0b1311fa-41 (unregistering): left promiscuous mode
Jan 23 04:50:56 np0005593234 NetworkManager[48942]: <info>  [1769161856.6557] device (tap0b1311fa-41): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.665 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:56 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:56Z|00223|binding|INFO|Releasing lport 0b1311fa-410f-4d76-a118-cd5f14a68f51 from this chassis (sb_readonly=0)
Jan 23 04:50:56 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:56Z|00224|binding|INFO|Setting lport 0b1311fa-410f-4d76-a118-cd5f14a68f51 down in Southbound
Jan 23 04:50:56 np0005593234 ovn_controller[134547]: 2026-01-23T09:50:56Z|00225|binding|INFO|Removing iface tap0b1311fa-41 ovn-installed in OVS
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.668 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:56.694 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:53:fd 10.100.0.4'], port_security=['fa:16:3e:63:53:fd 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0409b666-6d7a-4831-9ba3-08afe2d0c46b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=0b1311fa-410f-4d76-a118-cd5f14a68f51) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.696 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:56.697 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 0b1311fa-410f-4d76-a118-cd5f14a68f51 in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 unbound from our chassis#033[00m
Jan 23 04:50:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:56.698 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d2cdc4c-47a0-475b-8e71-39465d365de3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:50:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:56.700 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[da7410bb-1cf4-4c7c-9167-2fd1a1bea021]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:56.700 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 namespace which is not needed anymore#033[00m
Jan 23 04:50:56 np0005593234 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000045.scope: Deactivated successfully.
Jan 23 04:50:56 np0005593234 systemd[1]: machine-qemu\x2d28\x2dinstance\x2d00000045.scope: Consumed 14.559s CPU time.
Jan 23 04:50:56 np0005593234 systemd-machined[195626]: Machine qemu-28-instance-00000045 terminated.
Jan 23 04:50:56 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[257753]: [NOTICE]   (257766) : haproxy version is 2.8.14-c23fe91
Jan 23 04:50:56 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[257753]: [NOTICE]   (257766) : path to executable is /usr/sbin/haproxy
Jan 23 04:50:56 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[257753]: [WARNING]  (257766) : Exiting Master process...
Jan 23 04:50:56 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[257753]: [ALERT]    (257766) : Current worker (257771) exited with code 143 (Terminated)
Jan 23 04:50:56 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[257753]: [WARNING]  (257766) : All workers exited. Exiting... (0)
Jan 23 04:50:56 np0005593234 systemd[1]: libpod-71f8a163e6b2dd2a9d2ce9225cbf4ab7e2198ee3b4a6fdcd1aebfbd957e71536.scope: Deactivated successfully.
Jan 23 04:50:56 np0005593234 podman[258692]: 2026-01-23 09:50:56.839357553 +0000 UTC m=+0.048198626 container died 71f8a163e6b2dd2a9d2ce9225cbf4ab7e2198ee3b4a6fdcd1aebfbd957e71536 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:50:56 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-71f8a163e6b2dd2a9d2ce9225cbf4ab7e2198ee3b4a6fdcd1aebfbd957e71536-userdata-shm.mount: Deactivated successfully.
Jan 23 04:50:56 np0005593234 systemd[1]: var-lib-containers-storage-overlay-c7ca396c721ffbd395929abd0d07121d9cc0a0c755b49ef90cbd361fee3d7c00-merged.mount: Deactivated successfully.
Jan 23 04:50:56 np0005593234 podman[258692]: 2026-01-23 09:50:56.890005515 +0000 UTC m=+0.098846588 container cleanup 71f8a163e6b2dd2a9d2ce9225cbf4ab7e2198ee3b4a6fdcd1aebfbd957e71536 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.894 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:56 np0005593234 systemd[1]: libpod-conmon-71f8a163e6b2dd2a9d2ce9225cbf4ab7e2198ee3b4a6fdcd1aebfbd957e71536.scope: Deactivated successfully.
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.899 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.907 227766 INFO nova.virt.libvirt.driver [-] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Instance destroyed successfully.#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.913 227766 INFO nova.virt.libvirt.driver [-] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Instance destroyed successfully.#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.915 227766 DEBUG nova.virt.libvirt.vif [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:50:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-680599882',display_name='tempest-ServerDiskConfigTestJSON-server-680599882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-680599882',id=69,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:50:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-wqu3xxlv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:50:41Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=0409b666-6d7a-4831-9ba3-08afe2d0c46b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.915 227766 DEBUG nova.network.os_vif_util [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.916 227766 DEBUG nova.network.os_vif_util [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:53:fd,bridge_name='br-int',has_traffic_filtering=True,id=0b1311fa-410f-4d76-a118-cd5f14a68f51,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b1311fa-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.917 227766 DEBUG os_vif [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:53:fd,bridge_name='br-int',has_traffic_filtering=True,id=0b1311fa-410f-4d76-a118-cd5f14a68f51,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b1311fa-41') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.919 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.919 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b1311fa-41, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.921 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.922 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.924 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.927 227766 INFO os_vif [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:53:fd,bridge_name='br-int',has_traffic_filtering=True,id=0b1311fa-410f-4d76-a118-cd5f14a68f51,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b1311fa-41')#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.950 227766 DEBUG nova.compute.manager [req-a67ef421-463f-490e-88b1-9a8aaaff96ad req-0007a525-b056-4a7d-9312-9b226cc8084a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received event network-vif-unplugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.951 227766 DEBUG oslo_concurrency.lockutils [req-a67ef421-463f-490e-88b1-9a8aaaff96ad req-0007a525-b056-4a7d-9312-9b226cc8084a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.952 227766 DEBUG oslo_concurrency.lockutils [req-a67ef421-463f-490e-88b1-9a8aaaff96ad req-0007a525-b056-4a7d-9312-9b226cc8084a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.952 227766 DEBUG oslo_concurrency.lockutils [req-a67ef421-463f-490e-88b1-9a8aaaff96ad req-0007a525-b056-4a7d-9312-9b226cc8084a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.953 227766 DEBUG nova.compute.manager [req-a67ef421-463f-490e-88b1-9a8aaaff96ad req-0007a525-b056-4a7d-9312-9b226cc8084a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] No waiting events found dispatching network-vif-unplugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.953 227766 WARNING nova.compute.manager [req-a67ef421-463f-490e-88b1-9a8aaaff96ad req-0007a525-b056-4a7d-9312-9b226cc8084a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received unexpected event network-vif-unplugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 23 04:50:56 np0005593234 podman[258727]: 2026-01-23 09:50:56.973985968 +0000 UTC m=+0.058987083 container remove 71f8a163e6b2dd2a9d2ce9225cbf4ab7e2198ee3b4a6fdcd1aebfbd957e71536 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 04:50:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:56.980 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[237da4ca-e092-43d7-8068-bbf411a3a7c9]: (4, ('Fri Jan 23 09:50:56 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 (71f8a163e6b2dd2a9d2ce9225cbf4ab7e2198ee3b4a6fdcd1aebfbd957e71536)\n71f8a163e6b2dd2a9d2ce9225cbf4ab7e2198ee3b4a6fdcd1aebfbd957e71536\nFri Jan 23 09:50:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 (71f8a163e6b2dd2a9d2ce9225cbf4ab7e2198ee3b4a6fdcd1aebfbd957e71536)\n71f8a163e6b2dd2a9d2ce9225cbf4ab7e2198ee3b4a6fdcd1aebfbd957e71536\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:56.983 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[816d837f-a64d-499f-b266-ba4a272b5808]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:56.984 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d2cdc4c-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.986 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:56 np0005593234 kernel: tap6d2cdc4c-40: left promiscuous mode
Jan 23 04:50:56 np0005593234 nova_compute[227762]: 2026-01-23 09:50:56.998 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:57.002 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1c445130-5ea3-4792-8352-be0c3277c0f7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:57.023 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[887433c1-7bf3-4735-b20d-7f953946ded4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:57.024 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c6185563-d118-41c8-8c3b-8ba9a6b584dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:57.040 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0b6e8af0-9fc3-44e5-b3fc-64225c10a189]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 572957, 'reachable_time': 44589, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 258763, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:57 np0005593234 systemd[1]: run-netns-ovnmeta\x2d6d2cdc4c\x2d47a0\x2d475b\x2d8e71\x2d39465d365de3.mount: Deactivated successfully.
Jan 23 04:50:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:57.043 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:50:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:50:57.044 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[1970eb18-22d6-4a50-b253-5818be1fdcba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:50:57 np0005593234 nova_compute[227762]: 2026-01-23 09:50:57.396 227766 INFO nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Deleting instance files /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b_del#033[00m
Jan 23 04:50:57 np0005593234 nova_compute[227762]: 2026-01-23 09:50:57.397 227766 INFO nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Deletion of /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b_del complete#033[00m
Jan 23 04:50:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:57.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:57 np0005593234 nova_compute[227762]: 2026-01-23 09:50:57.564 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:50:57 np0005593234 nova_compute[227762]: 2026-01-23 09:50:57.564 227766 INFO nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Creating image(s)#033[00m
Jan 23 04:50:57 np0005593234 nova_compute[227762]: 2026-01-23 09:50:57.588 227766 DEBUG nova.storage.rbd_utils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:57.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:57 np0005593234 nova_compute[227762]: 2026-01-23 09:50:57.614 227766 DEBUG nova.storage.rbd_utils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:50:57 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4255452350' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:50:57 np0005593234 nova_compute[227762]: 2026-01-23 09:50:57.647 227766 DEBUG nova.storage.rbd_utils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:57 np0005593234 nova_compute[227762]: 2026-01-23 09:50:57.651 227766 DEBUG oslo_concurrency.processutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:57 np0005593234 nova_compute[227762]: 2026-01-23 09:50:57.713 227766 DEBUG oslo_concurrency.processutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:57 np0005593234 nova_compute[227762]: 2026-01-23 09:50:57.714 227766 DEBUG oslo_concurrency.lockutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:57 np0005593234 nova_compute[227762]: 2026-01-23 09:50:57.715 227766 DEBUG oslo_concurrency.lockutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:57 np0005593234 nova_compute[227762]: 2026-01-23 09:50:57.715 227766 DEBUG oslo_concurrency.lockutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:57 np0005593234 nova_compute[227762]: 2026-01-23 09:50:57.741 227766 DEBUG nova.storage.rbd_utils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:57 np0005593234 nova_compute[227762]: 2026-01-23 09:50:57.745 227766 DEBUG oslo_concurrency.processutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.136 227766 DEBUG oslo_concurrency.processutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.392s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.212 227766 DEBUG nova.storage.rbd_utils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] resizing rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.311 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.312 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Ensure instance console log exists: /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.312 227766 DEBUG oslo_concurrency.lockutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.312 227766 DEBUG oslo_concurrency.lockutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.313 227766 DEBUG oslo_concurrency.lockutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.315 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Start _get_guest_xml network_info=[{"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.318 227766 WARNING nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.324 227766 DEBUG nova.virt.libvirt.host [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.324 227766 DEBUG nova.virt.libvirt.host [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.327 227766 DEBUG nova.virt.libvirt.host [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.327 227766 DEBUG nova.virt.libvirt.host [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.328 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.329 227766 DEBUG nova.virt.hardware [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.329 227766 DEBUG nova.virt.hardware [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.329 227766 DEBUG nova.virt.hardware [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.329 227766 DEBUG nova.virt.hardware [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.329 227766 DEBUG nova.virt.hardware [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.329 227766 DEBUG nova.virt.hardware [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.330 227766 DEBUG nova.virt.hardware [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.330 227766 DEBUG nova.virt.hardware [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.330 227766 DEBUG nova.virt.hardware [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.330 227766 DEBUG nova.virt.hardware [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.330 227766 DEBUG nova.virt.hardware [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.331 227766 DEBUG nova.objects.instance [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 0409b666-6d7a-4831-9ba3-08afe2d0c46b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.366 227766 DEBUG oslo_concurrency.processutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:50:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2276049398' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.782 227766 DEBUG oslo_concurrency.processutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.808 227766 DEBUG nova.storage.rbd_utils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:58 np0005593234 nova_compute[227762]: 2026-01-23 09:50:58.812 227766 DEBUG oslo_concurrency.processutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.211 227766 DEBUG nova.compute.manager [req-ef5250ab-9abf-415a-b49a-94c9f328b6a7 req-3400c19c-d3e8-48be-8e9f-cc240e59b62c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received event network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.212 227766 DEBUG oslo_concurrency.lockutils [req-ef5250ab-9abf-415a-b49a-94c9f328b6a7 req-3400c19c-d3e8-48be-8e9f-cc240e59b62c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.212 227766 DEBUG oslo_concurrency.lockutils [req-ef5250ab-9abf-415a-b49a-94c9f328b6a7 req-3400c19c-d3e8-48be-8e9f-cc240e59b62c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.213 227766 DEBUG oslo_concurrency.lockutils [req-ef5250ab-9abf-415a-b49a-94c9f328b6a7 req-3400c19c-d3e8-48be-8e9f-cc240e59b62c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.213 227766 DEBUG nova.compute.manager [req-ef5250ab-9abf-415a-b49a-94c9f328b6a7 req-3400c19c-d3e8-48be-8e9f-cc240e59b62c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] No waiting events found dispatching network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.213 227766 WARNING nova.compute.manager [req-ef5250ab-9abf-415a-b49a-94c9f328b6a7 req-3400c19c-d3e8-48be-8e9f-cc240e59b62c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received unexpected event network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 23 04:50:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:50:59 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/719033522' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.264 227766 DEBUG oslo_concurrency.processutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.265 227766 DEBUG nova.virt.libvirt.vif [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:50:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-680599882',display_name='tempest-ServerDiskConfigTestJSON-server-680599882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-680599882',id=69,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:50:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-wqu3xxlv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-S
erverDiskConfigTestJSON-211417238-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:50:57Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=0409b666-6d7a-4831-9ba3-08afe2d0c46b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.266 227766 DEBUG nova.network.os_vif_util [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.266 227766 DEBUG nova.network.os_vif_util [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:53:fd,bridge_name='br-int',has_traffic_filtering=True,id=0b1311fa-410f-4d76-a118-cd5f14a68f51,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b1311fa-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.269 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  <uuid>0409b666-6d7a-4831-9ba3-08afe2d0c46b</uuid>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  <name>instance-00000045</name>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-680599882</nova:name>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:50:58</nova:creationTime>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <nova:user uuid="0cfac2191989448ead77e75ca3910ac4">tempest-ServerDiskConfigTestJSON-211417238-project-member</nova:user>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <nova:project uuid="86d938c8e2bb41a79012befd500d1088">tempest-ServerDiskConfigTestJSON-211417238</nova:project>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="ae1f9e37-418c-462f-81d1-3599a6d89de9"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <nova:port uuid="0b1311fa-410f-4d76-a118-cd5f14a68f51">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <entry name="serial">0409b666-6d7a-4831-9ba3-08afe2d0c46b</entry>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <entry name="uuid">0409b666-6d7a-4831-9ba3-08afe2d0c46b</entry>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk.config">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:63:53:fd"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <target dev="tap0b1311fa-41"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/console.log" append="off"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:50:59 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:50:59 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:50:59 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:50:59 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.270 227766 DEBUG nova.compute.manager [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Preparing to wait for external event network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.271 227766 DEBUG oslo_concurrency.lockutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.271 227766 DEBUG oslo_concurrency.lockutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.271 227766 DEBUG oslo_concurrency.lockutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.272 227766 DEBUG nova.virt.libvirt.vif [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:50:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-680599882',display_name='tempest-ServerDiskConfigTestJSON-server-680599882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-680599882',id=69,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:50:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-wqu3xxlv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:50:57Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=0409b666-6d7a-4831-9ba3-08afe2d0c46b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.272 227766 DEBUG nova.network.os_vif_util [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.273 227766 DEBUG nova.network.os_vif_util [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:53:fd,bridge_name='br-int',has_traffic_filtering=True,id=0b1311fa-410f-4d76-a118-cd5f14a68f51,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b1311fa-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.273 227766 DEBUG os_vif [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:53:fd,bridge_name='br-int',has_traffic_filtering=True,id=0b1311fa-410f-4d76-a118-cd5f14a68f51,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b1311fa-41') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.274 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.274 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.274 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.277 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.277 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0b1311fa-41, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.277 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0b1311fa-41, col_values=(('external_ids', {'iface-id': '0b1311fa-410f-4d76-a118-cd5f14a68f51', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:63:53:fd', 'vm-uuid': '0409b666-6d7a-4831-9ba3-08afe2d0c46b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.279 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:59 np0005593234 NetworkManager[48942]: <info>  [1769161859.2798] manager: (tap0b1311fa-41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/119)
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.281 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.285 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.286 227766 INFO os_vif [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:53:fd,bridge_name='br-int',has_traffic_filtering=True,id=0b1311fa-410f-4d76-a118-cd5f14a68f51,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b1311fa-41')#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.360 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.361 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.361 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No VIF found with MAC fa:16:3e:63:53:fd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.361 227766 INFO nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Using config drive#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.384 227766 DEBUG nova.storage.rbd_utils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.420 227766 DEBUG nova.objects.instance [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 0409b666-6d7a-4831-9ba3-08afe2d0c46b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:59 np0005593234 nova_compute[227762]: 2026-01-23 09:50:59.475 227766 DEBUG nova.objects.instance [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'keypairs' on Instance uuid 0409b666-6d7a-4831-9ba3-08afe2d0c46b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:50:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:50:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:50:59.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:50:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:50:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:50:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:50:59.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:51:00 np0005593234 nova_compute[227762]: 2026-01-23 09:51:00.118 227766 INFO nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Creating config drive at /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/disk.config#033[00m
Jan 23 04:51:00 np0005593234 nova_compute[227762]: 2026-01-23 09:51:00.123 227766 DEBUG oslo_concurrency.processutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj5epnnad execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:00 np0005593234 nova_compute[227762]: 2026-01-23 09:51:00.251 227766 DEBUG oslo_concurrency.processutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj5epnnad" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:51:00 np0005593234 nova_compute[227762]: 2026-01-23 09:51:00.276 227766 DEBUG nova.storage.rbd_utils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:51:00 np0005593234 nova_compute[227762]: 2026-01-23 09:51:00.280 227766 DEBUG oslo_concurrency.processutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/disk.config 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:51:00 np0005593234 nova_compute[227762]: 2026-01-23 09:51:00.300 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:00 np0005593234 nova_compute[227762]: 2026-01-23 09:51:00.544 227766 DEBUG oslo_concurrency.processutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/disk.config 0409b666-6d7a-4831-9ba3-08afe2d0c46b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.265s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:51:00 np0005593234 nova_compute[227762]: 2026-01-23 09:51:00.545 227766 INFO nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Deleting local config drive /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b/disk.config because it was imported into RBD.
Jan 23 04:51:00 np0005593234 kernel: tap0b1311fa-41: entered promiscuous mode
Jan 23 04:51:00 np0005593234 NetworkManager[48942]: <info>  [1769161860.5953] manager: (tap0b1311fa-41): new Tun device (/org/freedesktop/NetworkManager/Devices/120)
Jan 23 04:51:00 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:00Z|00226|binding|INFO|Claiming lport 0b1311fa-410f-4d76-a118-cd5f14a68f51 for this chassis.
Jan 23 04:51:00 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:00Z|00227|binding|INFO|0b1311fa-410f-4d76-a118-cd5f14a68f51: Claiming fa:16:3e:63:53:fd 10.100.0.4
Jan 23 04:51:00 np0005593234 nova_compute[227762]: 2026-01-23 09:51:00.595 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:00 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:00Z|00228|binding|INFO|Setting lport 0b1311fa-410f-4d76-a118-cd5f14a68f51 ovn-installed in OVS
Jan 23 04:51:00 np0005593234 nova_compute[227762]: 2026-01-23 09:51:00.611 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:00 np0005593234 nova_compute[227762]: 2026-01-23 09:51:00.613 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:00 np0005593234 systemd-machined[195626]: New machine qemu-29-instance-00000045.
Jan 23 04:51:00 np0005593234 systemd-udevd[259119]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:51:00 np0005593234 NetworkManager[48942]: <info>  [1769161860.6375] device (tap0b1311fa-41): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:51:00 np0005593234 systemd[1]: Started Virtual Machine qemu-29-instance-00000045.
Jan 23 04:51:00 np0005593234 NetworkManager[48942]: <info>  [1769161860.6381] device (tap0b1311fa-41): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:51:00 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:00Z|00229|binding|INFO|Setting lport 0b1311fa-410f-4d76-a118-cd5f14a68f51 up in Southbound
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.699 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:53:fd 10.100.0.4'], port_security=['fa:16:3e:63:53:fd 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0409b666-6d7a-4831-9ba3-08afe2d0c46b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=0b1311fa-410f-4d76-a118-cd5f14a68f51) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.701 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 0b1311fa-410f-4d76-a118-cd5f14a68f51 in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 bound to our chassis
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.702 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.714 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b8961781-b670-47c2-8ebd-67b9b6f76c63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.715 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6d2cdc4c-41 in ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.716 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6d2cdc4c-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.717 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[55acad73-d671-4c67-9fe5-d14d837fd7e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.718 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[62bdf029-745a-4b6b-8a23-1562e417ef78]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.733 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[f6126c10-5757-463c-829a-04495266e746]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.746 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e72b66-4e45-4108-949f-51da5c33c598]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.779 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[02b521f2-1e9b-4c13-8c56-0a546b97c753]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.785 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d4b8581c-6dd0-45f3-8351-28ba66a98809]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:00 np0005593234 NetworkManager[48942]: <info>  [1769161860.7860] manager: (tap6d2cdc4c-40): new Veth device (/org/freedesktop/NetworkManager/Devices/121)
Jan 23 04:51:00 np0005593234 systemd-udevd[259121]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.823 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[fd13cafb-72c4-4299-bbf7-03bcdc08266e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.826 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[16edc6a1-f2c6-4d77-8834-9f2498cdbb15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:00 np0005593234 NetworkManager[48942]: <info>  [1769161860.8533] device (tap6d2cdc4c-40): carrier: link connected
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.862 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b77768bb-7620-4da0-88df-fb8d40b94359]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.883 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c201a004-3393-4d2c-8f3b-cdbc56839d17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d2cdc4c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:5a:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575581, 'reachable_time': 34043, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259152, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.900 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[451db594-f69c-4e3b-8202-d1a21612107a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:5a26'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 575581, 'tstamp': 575581}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259153, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.916 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e2dcab3d-f096-43d9-a755-b844f6c2b31f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d2cdc4c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:5a:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575581, 'reachable_time': 34043, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259154, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:00.950 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[69ea1237-7aed-444f-9fdf-17f87ba29970]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:01.010 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2783093a-9eac-474b-b9ce-83ca860f367f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:01.011 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d2cdc4c-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:01.011 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:01.012 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d2cdc4c-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:51:01 np0005593234 kernel: tap6d2cdc4c-40: entered promiscuous mode
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.014 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:01 np0005593234 NetworkManager[48942]: <info>  [1769161861.0169] manager: (tap6d2cdc4c-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:01.022 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6d2cdc4c-40, col_values=(('external_ids', {'iface-id': '04f6c0b6-99ee-4958-bc01-68fa310042f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.024 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:01 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:01Z|00230|binding|INFO|Releasing lport 04f6c0b6-99ee-4958-bc01-68fa310042f0 from this chassis (sb_readonly=0)
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.037 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:01.038 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:01.039 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[84d5d1e6-de5f-4e01-aee8-dddf7350cea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:01.040 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 04:51:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:01.040 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'env', 'PROCESS_TAG=haproxy-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6d2cdc4c-47a0-475b-8e71-39465d365de3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.340 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for 0409b666-6d7a-4831-9ba3-08afe2d0c46b due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.340 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161861.3390303, 0409b666-6d7a-4831-9ba3-08afe2d0c46b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.341 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] VM Started (Lifecycle Event)
Jan 23 04:51:01 np0005593234 podman[259229]: 2026-01-23 09:51:01.409163606 +0000 UTC m=+0.043026405 container create d1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 04:51:01 np0005593234 systemd[1]: Started libpod-conmon-d1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92.scope.
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.456 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.461 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161861.339881, 0409b666-6d7a-4831-9ba3-08afe2d0c46b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.462 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] VM Paused (Lifecycle Event)
Jan 23 04:51:01 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:51:01 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac8e3bb555b3d7e2e0f353fa2569c82a42e6fc019d71373f1a8c4c3a99839f79/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:01 np0005593234 podman[259229]: 2026-01-23 09:51:01.387031845 +0000 UTC m=+0.020894654 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:51:01 np0005593234 podman[259229]: 2026-01-23 09:51:01.490067373 +0000 UTC m=+0.123930172 container init d1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 04:51:01 np0005593234 podman[259229]: 2026-01-23 09:51:01.496404752 +0000 UTC m=+0.130267551 container start d1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:51:01 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[259244]: [NOTICE]   (259248) : New worker (259250) forked
Jan 23 04:51:01 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[259244]: [NOTICE]   (259248) : Loading success.
Jan 23 04:51:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:51:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:01.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:51:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:51:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:01.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.642 227766 DEBUG nova.compute.manager [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received event network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.642 227766 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.642 227766 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.643 227766 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.643 227766 DEBUG nova.compute.manager [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Processing event network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.643 227766 DEBUG nova.compute.manager [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received event network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.643 227766 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.643 227766 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.644 227766 DEBUG oslo_concurrency.lockutils [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.644 227766 DEBUG nova.compute.manager [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] No waiting events found dispatching network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.644 227766 WARNING nova.compute.manager [req-9a6acc0c-5bff-4365-8bd1-4b8bbd759da1 req-42ea4257-0beb-452e-8db0-1fb4360fb7a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received unexpected event network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.644 227766 DEBUG nova.compute.manager [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.650 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.653 227766 INFO nova.virt.libvirt.driver [-] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Instance spawned successfully.#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.654 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.877 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.882 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161861.6488318, 0409b666-6d7a-4831-9ba3-08afe2d0c46b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.883 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.914 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.915 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.915 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.915 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.916 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.916 227766 DEBUG nova.virt.libvirt.driver [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.921 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.925 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:51:01 np0005593234 nova_compute[227762]: 2026-01-23 09:51:01.965 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 04:51:02 np0005593234 nova_compute[227762]: 2026-01-23 09:51:02.153 227766 DEBUG nova.compute.manager [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:51:02 np0005593234 nova_compute[227762]: 2026-01-23 09:51:02.315 227766 DEBUG oslo_concurrency.lockutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:02 np0005593234 nova_compute[227762]: 2026-01-23 09:51:02.315 227766 DEBUG oslo_concurrency.lockutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:02 np0005593234 nova_compute[227762]: 2026-01-23 09:51:02.315 227766 DEBUG nova.objects.instance [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 04:51:03 np0005593234 nova_compute[227762]: 2026-01-23 09:51:03.521 227766 DEBUG oslo_concurrency.lockutils [None req-c001d1ce-9900-4291-b600-c67a07fccb6f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 1.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:03.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:51:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:03.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:51:04 np0005593234 nova_compute[227762]: 2026-01-23 09:51:04.280 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:05 np0005593234 nova_compute[227762]: 2026-01-23 09:51:05.297 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:05.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:05.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:05 np0005593234 nova_compute[227762]: 2026-01-23 09:51:05.984 227766 DEBUG oslo_concurrency.lockutils [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:05 np0005593234 nova_compute[227762]: 2026-01-23 09:51:05.984 227766 DEBUG oslo_concurrency.lockutils [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:05 np0005593234 nova_compute[227762]: 2026-01-23 09:51:05.985 227766 DEBUG oslo_concurrency.lockutils [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:05 np0005593234 nova_compute[227762]: 2026-01-23 09:51:05.985 227766 DEBUG oslo_concurrency.lockutils [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:05 np0005593234 nova_compute[227762]: 2026-01-23 09:51:05.985 227766 DEBUG oslo_concurrency.lockutils [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:05 np0005593234 nova_compute[227762]: 2026-01-23 09:51:05.986 227766 INFO nova.compute.manager [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Terminating instance#033[00m
Jan 23 04:51:05 np0005593234 nova_compute[227762]: 2026-01-23 09:51:05.987 227766 DEBUG nova.compute.manager [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:51:06 np0005593234 kernel: tap0b1311fa-41 (unregistering): left promiscuous mode
Jan 23 04:51:06 np0005593234 NetworkManager[48942]: <info>  [1769161866.0228] device (tap0b1311fa-41): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:51:06 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:06Z|00231|binding|INFO|Releasing lport 0b1311fa-410f-4d76-a118-cd5f14a68f51 from this chassis (sb_readonly=0)
Jan 23 04:51:06 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:06Z|00232|binding|INFO|Setting lport 0b1311fa-410f-4d76-a118-cd5f14a68f51 down in Southbound
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.031 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:06 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:06Z|00233|binding|INFO|Removing iface tap0b1311fa-41 ovn-installed in OVS
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.033 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:06.044 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:63:53:fd 10.100.0.4'], port_security=['fa:16:3e:63:53:fd 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0409b666-6d7a-4831-9ba3-08afe2d0c46b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=0b1311fa-410f-4d76-a118-cd5f14a68f51) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:51:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:06.046 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 0b1311fa-410f-4d76-a118-cd5f14a68f51 in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 unbound from our chassis#033[00m
Jan 23 04:51:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:06.048 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d2cdc4c-47a0-475b-8e71-39465d365de3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:51:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:06.049 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a332e40e-409c-4de3-bc1f-a786849a1979]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:06.049 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 namespace which is not needed anymore#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.051 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:06 np0005593234 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000045.scope: Deactivated successfully.
Jan 23 04:51:06 np0005593234 systemd[1]: machine-qemu\x2d29\x2dinstance\x2d00000045.scope: Consumed 5.193s CPU time.
Jan 23 04:51:06 np0005593234 systemd-machined[195626]: Machine qemu-29-instance-00000045 terminated.
Jan 23 04:51:06 np0005593234 podman[259263]: 2026-01-23 09:51:06.127029244 +0000 UTC m=+0.080525945 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.209 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.214 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.227 227766 INFO nova.virt.libvirt.driver [-] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Instance destroyed successfully.#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.228 227766 DEBUG nova.objects.instance [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'resources' on Instance uuid 0409b666-6d7a-4831-9ba3-08afe2d0c46b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.250 227766 DEBUG nova.virt.libvirt.vif [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=True,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:50:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-680599882',display_name='tempest-ServerDiskConfigTestJSON-server-680599882',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-680599882',id=69,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:51:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-wqu3xxlv',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:51:02Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=0409b666-6d7a-4831-9ba3-08afe2d0c46b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.251 227766 DEBUG nova.network.os_vif_util [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "address": "fa:16:3e:63:53:fd", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0b1311fa-41", "ovs_interfaceid": "0b1311fa-410f-4d76-a118-cd5f14a68f51", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.252 227766 DEBUG nova.network.os_vif_util [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:63:53:fd,bridge_name='br-int',has_traffic_filtering=True,id=0b1311fa-410f-4d76-a118-cd5f14a68f51,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b1311fa-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.252 227766 DEBUG os_vif [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:53:fd,bridge_name='br-int',has_traffic_filtering=True,id=0b1311fa-410f-4d76-a118-cd5f14a68f51,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b1311fa-41') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:51:06 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[259244]: [NOTICE]   (259248) : haproxy version is 2.8.14-c23fe91
Jan 23 04:51:06 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[259244]: [NOTICE]   (259248) : path to executable is /usr/sbin/haproxy
Jan 23 04:51:06 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[259244]: [WARNING]  (259248) : Exiting Master process...
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.254 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.254 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0b1311fa-41, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.255 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:06 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[259244]: [ALERT]    (259248) : Current worker (259250) exited with code 143 (Terminated)
Jan 23 04:51:06 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[259244]: [WARNING]  (259248) : All workers exited. Exiting... (0)
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.257 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:06 np0005593234 systemd[1]: libpod-d1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92.scope: Deactivated successfully.
Jan 23 04:51:06 np0005593234 conmon[259244]: conmon d1a922d7ac83ba4b885d <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92.scope/container/memory.events
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.259 227766 INFO os_vif [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:63:53:fd,bridge_name='br-int',has_traffic_filtering=True,id=0b1311fa-410f-4d76-a118-cd5f14a68f51,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0b1311fa-41')#033[00m
Jan 23 04:51:06 np0005593234 podman[259303]: 2026-01-23 09:51:06.265409376 +0000 UTC m=+0.133920424 container died d1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:51:06 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92-userdata-shm.mount: Deactivated successfully.
Jan 23 04:51:06 np0005593234 systemd[1]: var-lib-containers-storage-overlay-ac8e3bb555b3d7e2e0f353fa2569c82a42e6fc019d71373f1a8c4c3a99839f79-merged.mount: Deactivated successfully.
Jan 23 04:51:06 np0005593234 podman[259303]: 2026-01-23 09:51:06.385201358 +0000 UTC m=+0.253712396 container cleanup d1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 04:51:06 np0005593234 systemd[1]: libpod-conmon-d1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92.scope: Deactivated successfully.
Jan 23 04:51:06 np0005593234 podman[259363]: 2026-01-23 09:51:06.443416637 +0000 UTC m=+0.036894634 container remove d1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:51:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:06.448 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[eee64cf8-c93c-46d1-b210-869a388cd8ba]: (4, ('Fri Jan 23 09:51:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 (d1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92)\nd1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92\nFri Jan 23 09:51:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 (d1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92)\nd1a922d7ac83ba4b885d21a5cecedd7ceab4316014dae9354a71a0165e5bad92\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:06.450 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4cc0ef3f-8730-4bc1-8f47-a6472b009861]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:06.451 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d2cdc4c-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.453 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:06 np0005593234 kernel: tap6d2cdc4c-40: left promiscuous mode
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.455 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:06.458 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[65f99cbd-255c-417c-9ed6-32c53bf6f2c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.469 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:06.471 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[19f5390f-2c33-47a4-93a1-41a4163fc7ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:06.472 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[429fb3d7-344a-40d9-a6e2-8c775062c63c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:06.486 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b91a66a3-c097-4459-87c4-9979ffcbc880]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 575573, 'reachable_time': 27648, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259378, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:06 np0005593234 systemd[1]: run-netns-ovnmeta\x2d6d2cdc4c\x2d47a0\x2d475b\x2d8e71\x2d39465d365de3.mount: Deactivated successfully.
Jan 23 04:51:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:06.489 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:51:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:06.489 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[7817331d-612e-4f1e-a34d-27d4085f6109]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.728 227766 INFO nova.virt.libvirt.driver [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Deleting instance files /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b_del#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.729 227766 INFO nova.virt.libvirt.driver [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Deletion of /var/lib/nova/instances/0409b666-6d7a-4831-9ba3-08afe2d0c46b_del complete#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.989 227766 INFO nova.compute.manager [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Took 1.00 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.990 227766 DEBUG oslo.service.loopingcall [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.990 227766 DEBUG nova.compute.manager [-] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:51:06 np0005593234 nova_compute[227762]: 2026-01-23 09:51:06.991 227766 DEBUG nova.network.neutron [-] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:51:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:07.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:51:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:07.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:51:08 np0005593234 nova_compute[227762]: 2026-01-23 09:51:08.058 227766 DEBUG nova.network.neutron [-] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:51:08 np0005593234 nova_compute[227762]: 2026-01-23 09:51:08.073 227766 DEBUG nova.compute.manager [req-cbd327bf-2a83-4b0e-adc8-d0819d816b79 req-1ef1e3c4-2e4b-49f8-a483-991956ece94b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received event network-vif-unplugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:08 np0005593234 nova_compute[227762]: 2026-01-23 09:51:08.073 227766 DEBUG oslo_concurrency.lockutils [req-cbd327bf-2a83-4b0e-adc8-d0819d816b79 req-1ef1e3c4-2e4b-49f8-a483-991956ece94b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:08 np0005593234 nova_compute[227762]: 2026-01-23 09:51:08.074 227766 DEBUG oslo_concurrency.lockutils [req-cbd327bf-2a83-4b0e-adc8-d0819d816b79 req-1ef1e3c4-2e4b-49f8-a483-991956ece94b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:08 np0005593234 nova_compute[227762]: 2026-01-23 09:51:08.074 227766 DEBUG oslo_concurrency.lockutils [req-cbd327bf-2a83-4b0e-adc8-d0819d816b79 req-1ef1e3c4-2e4b-49f8-a483-991956ece94b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:08 np0005593234 nova_compute[227762]: 2026-01-23 09:51:08.074 227766 DEBUG nova.compute.manager [req-cbd327bf-2a83-4b0e-adc8-d0819d816b79 req-1ef1e3c4-2e4b-49f8-a483-991956ece94b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] No waiting events found dispatching network-vif-unplugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:08 np0005593234 nova_compute[227762]: 2026-01-23 09:51:08.075 227766 DEBUG nova.compute.manager [req-cbd327bf-2a83-4b0e-adc8-d0819d816b79 req-1ef1e3c4-2e4b-49f8-a483-991956ece94b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received event network-vif-unplugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:51:08 np0005593234 nova_compute[227762]: 2026-01-23 09:51:08.083 227766 INFO nova.compute.manager [-] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Took 1.09 seconds to deallocate network for instance.#033[00m
Jan 23 04:51:08 np0005593234 nova_compute[227762]: 2026-01-23 09:51:08.532 227766 DEBUG oslo_concurrency.lockutils [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:08 np0005593234 nova_compute[227762]: 2026-01-23 09:51:08.533 227766 DEBUG oslo_concurrency.lockutils [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:08 np0005593234 nova_compute[227762]: 2026-01-23 09:51:08.682 227766 DEBUG nova.compute.manager [req-6ef5eb25-680d-4805-b426-ee0d762bce49 req-ff23bb2c-f4e7-47fa-872e-c44ceb068fe2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received event network-vif-deleted-0b1311fa-410f-4d76-a118-cd5f14a68f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:08 np0005593234 nova_compute[227762]: 2026-01-23 09:51:08.899 227766 DEBUG oslo_concurrency.processutils [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:51:09 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2033481364' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:51:09 np0005593234 nova_compute[227762]: 2026-01-23 09:51:09.387 227766 DEBUG oslo_concurrency.processutils [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:09 np0005593234 nova_compute[227762]: 2026-01-23 09:51:09.394 227766 DEBUG nova.compute.provider_tree [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:51:09 np0005593234 nova_compute[227762]: 2026-01-23 09:51:09.417 227766 DEBUG nova.scheduler.client.report [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:51:09 np0005593234 nova_compute[227762]: 2026-01-23 09:51:09.450 227766 DEBUG oslo_concurrency.lockutils [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.917s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:09 np0005593234 nova_compute[227762]: 2026-01-23 09:51:09.496 227766 INFO nova.scheduler.client.report [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Deleted allocations for instance 0409b666-6d7a-4831-9ba3-08afe2d0c46b#033[00m
Jan 23 04:51:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:51:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:09.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:51:09 np0005593234 nova_compute[227762]: 2026-01-23 09:51:09.604 227766 DEBUG oslo_concurrency.lockutils [None req-b0c4caf5-211c-4630-b7b8-7aeeb20da32c 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:51:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:09.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.190 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "27805b05-1c12-4131-9b61-c4fabd93f60d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.191 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.207 227766 DEBUG nova.compute.manager [req-ba249264-d7ce-4dc6-9d7d-425a89fd4bbd req-6b125244-d971-47ed-a0a3-2b13e94bdf22 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received event network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.208 227766 DEBUG oslo_concurrency.lockutils [req-ba249264-d7ce-4dc6-9d7d-425a89fd4bbd req-6b125244-d971-47ed-a0a3-2b13e94bdf22 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.208 227766 DEBUG oslo_concurrency.lockutils [req-ba249264-d7ce-4dc6-9d7d-425a89fd4bbd req-6b125244-d971-47ed-a0a3-2b13e94bdf22 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.208 227766 DEBUG oslo_concurrency.lockutils [req-ba249264-d7ce-4dc6-9d7d-425a89fd4bbd req-6b125244-d971-47ed-a0a3-2b13e94bdf22 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0409b666-6d7a-4831-9ba3-08afe2d0c46b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.208 227766 DEBUG nova.compute.manager [req-ba249264-d7ce-4dc6-9d7d-425a89fd4bbd req-6b125244-d971-47ed-a0a3-2b13e94bdf22 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] No waiting events found dispatching network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.209 227766 WARNING nova.compute.manager [req-ba249264-d7ce-4dc6-9d7d-425a89fd4bbd req-6b125244-d971-47ed-a0a3-2b13e94bdf22 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Received unexpected event network-vif-plugged-0b1311fa-410f-4d76-a118-cd5f14a68f51 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.210 227766 DEBUG nova.compute.manager [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:51:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.312 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.312 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.341 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.349 227766 DEBUG nova.virt.hardware [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.349 227766 INFO nova.compute.claims [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Claim successful on node compute-2.ctlplane.example.com
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.501 227766 DEBUG oslo_concurrency.processutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:51:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:51:10 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/614367486' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.967 227766 DEBUG oslo_concurrency.processutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:51:10 np0005593234 nova_compute[227762]: 2026-01-23 09:51:10.975 227766 DEBUG nova.compute.provider_tree [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:51:11 np0005593234 nova_compute[227762]: 2026-01-23 09:51:11.210 227766 DEBUG nova.scheduler.client.report [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:51:11 np0005593234 nova_compute[227762]: 2026-01-23 09:51:11.257 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:11 np0005593234 nova_compute[227762]: 2026-01-23 09:51:11.499 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:51:11 np0005593234 nova_compute[227762]: 2026-01-23 09:51:11.501 227766 DEBUG nova.compute.manager [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 04:51:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:11.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:11.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:11 np0005593234 nova_compute[227762]: 2026-01-23 09:51:11.897 227766 DEBUG nova.compute.manager [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 04:51:11 np0005593234 nova_compute[227762]: 2026-01-23 09:51:11.898 227766 DEBUG nova.network.neutron [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.317 227766 INFO nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.352 227766 DEBUG nova.compute.manager [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.360 227766 DEBUG nova.policy [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0cfac2191989448ead77e75ca3910ac4', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '86d938c8e2bb41a79012befd500d1088', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.508 227766 DEBUG nova.compute.manager [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.509 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.509 227766 INFO nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Creating image(s)
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.535 227766 DEBUG nova.storage.rbd_utils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.561 227766 DEBUG nova.storage.rbd_utils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.585 227766 DEBUG nova.storage.rbd_utils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.588 227766 DEBUG oslo_concurrency.processutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.647 227766 DEBUG oslo_concurrency.processutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.648 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.649 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.649 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.676 227766 DEBUG nova.storage.rbd_utils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.680 227766 DEBUG oslo_concurrency.processutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 27805b05-1c12-4131-9b61-c4fabd93f60d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:51:12 np0005593234 nova_compute[227762]: 2026-01-23 09:51:12.988 227766 DEBUG oslo_concurrency.processutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 27805b05-1c12-4131-9b61-c4fabd93f60d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.309s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:51:13 np0005593234 nova_compute[227762]: 2026-01-23 09:51:13.050 227766 DEBUG nova.storage.rbd_utils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] resizing rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 04:51:13 np0005593234 nova_compute[227762]: 2026-01-23 09:51:13.158 227766 DEBUG nova.objects.instance [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'migration_context' on Instance uuid 27805b05-1c12-4131-9b61-c4fabd93f60d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:51:13 np0005593234 nova_compute[227762]: 2026-01-23 09:51:13.183 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 04:51:13 np0005593234 nova_compute[227762]: 2026-01-23 09:51:13.183 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Ensure instance console log exists: /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 04:51:13 np0005593234 nova_compute[227762]: 2026-01-23 09:51:13.184 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:51:13 np0005593234 nova_compute[227762]: 2026-01-23 09:51:13.184 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:51:13 np0005593234 nova_compute[227762]: 2026-01-23 09:51:13.185 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:51:13 np0005593234 nova_compute[227762]: 2026-01-23 09:51:13.206 227766 DEBUG nova.network.neutron [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Successfully created port: c0bfdf23-cc4e-4663-8b3c-29e15588af59 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 04:51:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:13.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:13.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:15 np0005593234 nova_compute[227762]: 2026-01-23 09:51:15.134 227766 DEBUG nova.network.neutron [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Successfully updated port: c0bfdf23-cc4e-4663-8b3c-29e15588af59 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 04:51:15 np0005593234 nova_compute[227762]: 2026-01-23 09:51:15.176 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "refresh_cache-27805b05-1c12-4131-9b61-c4fabd93f60d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:51:15 np0005593234 nova_compute[227762]: 2026-01-23 09:51:15.176 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquired lock "refresh_cache-27805b05-1c12-4131-9b61-c4fabd93f60d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:51:15 np0005593234 nova_compute[227762]: 2026-01-23 09:51:15.176 227766 DEBUG nova.network.neutron [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 04:51:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:15 np0005593234 nova_compute[227762]: 2026-01-23 09:51:15.276 227766 DEBUG nova.compute.manager [req-07b2b05a-1c8d-4d8b-af8f-4e67ed1fa937 req-e4950a6a-2d3c-471f-8869-428a62181b73 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Received event network-changed-c0bfdf23-cc4e-4663-8b3c-29e15588af59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:51:15 np0005593234 nova_compute[227762]: 2026-01-23 09:51:15.276 227766 DEBUG nova.compute.manager [req-07b2b05a-1c8d-4d8b-af8f-4e67ed1fa937 req-e4950a6a-2d3c-471f-8869-428a62181b73 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Refreshing instance network info cache due to event network-changed-c0bfdf23-cc4e-4663-8b3c-29e15588af59. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 04:51:15 np0005593234 nova_compute[227762]: 2026-01-23 09:51:15.276 227766 DEBUG oslo_concurrency.lockutils [req-07b2b05a-1c8d-4d8b-af8f-4e67ed1fa937 req-e4950a6a-2d3c-471f-8869-428a62181b73 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-27805b05-1c12-4131-9b61-c4fabd93f60d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:51:15 np0005593234 nova_compute[227762]: 2026-01-23 09:51:15.383 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:15 np0005593234 nova_compute[227762]: 2026-01-23 09:51:15.391 227766 DEBUG nova.network.neutron [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:51:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:51:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:15.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:51:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:15.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:16 np0005593234 nova_compute[227762]: 2026-01-23 09:51:16.260 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:51:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:17.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:51:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:17.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:17 np0005593234 nova_compute[227762]: 2026-01-23 09:51:17.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.401 227766 DEBUG nova.network.neutron [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Updating instance_info_cache with network_info: [{"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.446 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Releasing lock "refresh_cache-27805b05-1c12-4131-9b61-c4fabd93f60d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.446 227766 DEBUG nova.compute.manager [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Instance network_info: |[{"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.447 227766 DEBUG oslo_concurrency.lockutils [req-07b2b05a-1c8d-4d8b-af8f-4e67ed1fa937 req-e4950a6a-2d3c-471f-8869-428a62181b73 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-27805b05-1c12-4131-9b61-c4fabd93f60d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.447 227766 DEBUG nova.network.neutron [req-07b2b05a-1c8d-4d8b-af8f-4e67ed1fa937 req-e4950a6a-2d3c-471f-8869-428a62181b73 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Refreshing network info cache for port c0bfdf23-cc4e-4663-8b3c-29e15588af59 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.450 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Start _get_guest_xml network_info=[{"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.455 227766 WARNING nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.461 227766 DEBUG nova.virt.libvirt.host [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.461 227766 DEBUG nova.virt.libvirt.host [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.468 227766 DEBUG nova.virt.libvirt.host [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.469 227766 DEBUG nova.virt.libvirt.host [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.471 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.471 227766 DEBUG nova.virt.hardware [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.472 227766 DEBUG nova.virt.hardware [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.472 227766 DEBUG nova.virt.hardware [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.473 227766 DEBUG nova.virt.hardware [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.473 227766 DEBUG nova.virt.hardware [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.473 227766 DEBUG nova.virt.hardware [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.474 227766 DEBUG nova.virt.hardware [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.474 227766 DEBUG nova.virt.hardware [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.474 227766 DEBUG nova.virt.hardware [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.474 227766 DEBUG nova.virt.hardware [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.475 227766 DEBUG nova.virt.hardware [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.479 227766 DEBUG oslo_concurrency.processutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:51:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3295826826' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.896 227766 DEBUG oslo_concurrency.processutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.921 227766 DEBUG nova.storage.rbd_utils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:18 np0005593234 nova_compute[227762]: 2026-01-23 09:51:18.925 227766 DEBUG oslo_concurrency.processutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:51:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/495406220' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.367 227766 DEBUG oslo_concurrency.processutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.368 227766 DEBUG nova.virt.libvirt.vif [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:51:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1367169890',display_name='tempest-ServerDiskConfigTestJSON-server-1367169890',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1367169890',id=74,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-f3p1v1c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDi
skConfigTestJSON-211417238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:51:12Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=27805b05-1c12-4131-9b61-c4fabd93f60d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.369 227766 DEBUG nova.network.os_vif_util [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.369 227766 DEBUG nova.network.os_vif_util [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:5c:6d,bridge_name='br-int',has_traffic_filtering=True,id=c0bfdf23-cc4e-4663-8b3c-29e15588af59,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0bfdf23-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.370 227766 DEBUG nova.objects.instance [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'pci_devices' on Instance uuid 27805b05-1c12-4131-9b61-c4fabd93f60d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.401 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  <uuid>27805b05-1c12-4131-9b61-c4fabd93f60d</uuid>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  <name>instance-0000004a</name>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1367169890</nova:name>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:51:18</nova:creationTime>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <nova:user uuid="0cfac2191989448ead77e75ca3910ac4">tempest-ServerDiskConfigTestJSON-211417238-project-member</nova:user>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <nova:project uuid="86d938c8e2bb41a79012befd500d1088">tempest-ServerDiskConfigTestJSON-211417238</nova:project>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <nova:port uuid="c0bfdf23-cc4e-4663-8b3c-29e15588af59">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <entry name="serial">27805b05-1c12-4131-9b61-c4fabd93f60d</entry>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <entry name="uuid">27805b05-1c12-4131-9b61-c4fabd93f60d</entry>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/27805b05-1c12-4131-9b61-c4fabd93f60d_disk">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/27805b05-1c12-4131-9b61-c4fabd93f60d_disk.config">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:7c:5c:6d"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <target dev="tapc0bfdf23-cc"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/console.log" append="off"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:51:19 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:51:19 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:51:19 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:51:19 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.402 227766 DEBUG nova.compute.manager [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Preparing to wait for external event network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.402 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.403 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.403 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.404 227766 DEBUG nova.virt.libvirt.vif [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:51:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1367169890',display_name='tempest-ServerDiskConfigTestJSON-server-1367169890',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1367169890',id=74,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-f3p1v1c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempes
t-ServerDiskConfigTestJSON-211417238-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:51:12Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=27805b05-1c12-4131-9b61-c4fabd93f60d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.404 227766 DEBUG nova.network.os_vif_util [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.405 227766 DEBUG nova.network.os_vif_util [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7c:5c:6d,bridge_name='br-int',has_traffic_filtering=True,id=c0bfdf23-cc4e-4663-8b3c-29e15588af59,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0bfdf23-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.405 227766 DEBUG os_vif [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:5c:6d,bridge_name='br-int',has_traffic_filtering=True,id=c0bfdf23-cc4e-4663-8b3c-29e15588af59,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0bfdf23-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.405 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.406 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.406 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.411 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.411 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc0bfdf23-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.411 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc0bfdf23-cc, col_values=(('external_ids', {'iface-id': 'c0bfdf23-cc4e-4663-8b3c-29e15588af59', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:5c:6d', 'vm-uuid': '27805b05-1c12-4131-9b61-c4fabd93f60d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.413 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:19 np0005593234 NetworkManager[48942]: <info>  [1769161879.4138] manager: (tapc0bfdf23-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/123)
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.415 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.418 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.418 227766 INFO os_vif [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7c:5c:6d,bridge_name='br-int',has_traffic_filtering=True,id=c0bfdf23-cc4e-4663-8b3c-29e15588af59,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0bfdf23-cc')#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.480 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.481 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.481 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No VIF found with MAC fa:16:3e:7c:5c:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.482 227766 INFO nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Using config drive#033[00m
Jan 23 04:51:19 np0005593234 nova_compute[227762]: 2026-01-23 09:51:19.504 227766 DEBUG nova.storage.rbd_utils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:51:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:19.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:51:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:19.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:20 np0005593234 nova_compute[227762]: 2026-01-23 09:51:20.427 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:20 np0005593234 nova_compute[227762]: 2026-01-23 09:51:20.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:20 np0005593234 nova_compute[227762]: 2026-01-23 09:51:20.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:51:20 np0005593234 nova_compute[227762]: 2026-01-23 09:51:20.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:20 np0005593234 nova_compute[227762]: 2026-01-23 09:51:20.783 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:20 np0005593234 nova_compute[227762]: 2026-01-23 09:51:20.783 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:20 np0005593234 nova_compute[227762]: 2026-01-23 09:51:20.784 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:20 np0005593234 nova_compute[227762]: 2026-01-23 09:51:20.784 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:51:20 np0005593234 nova_compute[227762]: 2026-01-23 09:51:20.784 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:20 np0005593234 podman[259730]: 2026-01-23 09:51:20.786334195 +0000 UTC m=+0.077958817 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 23 04:51:20 np0005593234 nova_compute[227762]: 2026-01-23 09:51:20.825 227766 INFO nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Creating config drive at /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/disk.config#033[00m
Jan 23 04:51:20 np0005593234 nova_compute[227762]: 2026-01-23 09:51:20.831 227766 DEBUG oslo_concurrency.processutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4p6lqu9i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:20 np0005593234 nova_compute[227762]: 2026-01-23 09:51:20.963 227766 DEBUG oslo_concurrency.processutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4p6lqu9i" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:20.997 227766 DEBUG nova.storage.rbd_utils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.002 227766 DEBUG oslo_concurrency.processutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/disk.config 27805b05-1c12-4131-9b61-c4fabd93f60d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.152 227766 DEBUG oslo_concurrency.processutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/disk.config 27805b05-1c12-4131-9b61-c4fabd93f60d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.150s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.154 227766 INFO nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Deleting local config drive /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/disk.config because it was imported into RBD.#033[00m
Jan 23 04:51:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:51:21 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1805213832' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:51:21 np0005593234 kernel: tapc0bfdf23-cc: entered promiscuous mode
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.199 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:21 np0005593234 NetworkManager[48942]: <info>  [1769161881.1999] manager: (tapc0bfdf23-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/124)
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.200 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:21 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:21Z|00234|binding|INFO|Claiming lport c0bfdf23-cc4e-4663-8b3c-29e15588af59 for this chassis.
Jan 23 04:51:21 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:21Z|00235|binding|INFO|c0bfdf23-cc4e-4663-8b3c-29e15588af59: Claiming fa:16:3e:7c:5c:6d 10.100.0.14
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.208 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:5c:6d 10.100.0.14'], port_security=['fa:16:3e:7c:5c:6d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '27805b05-1c12-4131-9b61-c4fabd93f60d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=c0bfdf23-cc4e-4663-8b3c-29e15588af59) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.210 144381 INFO neutron.agent.ovn.metadata.agent [-] Port c0bfdf23-cc4e-4663-8b3c-29e15588af59 in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 bound to our chassis#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.211 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6d2cdc4c-47a0-475b-8e71-39465d365de3#033[00m
Jan 23 04:51:21 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:21Z|00236|binding|INFO|Setting lport c0bfdf23-cc4e-4663-8b3c-29e15588af59 ovn-installed in OVS
Jan 23 04:51:21 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:21Z|00237|binding|INFO|Setting lport c0bfdf23-cc4e-4663-8b3c-29e15588af59 up in Southbound
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.217 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.221 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.221 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b8ba5a37-5800-45dc-96f5-a5f9c953a863]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.222 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6d2cdc4c-41 in ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.224 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6d2cdc4c-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.224 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b67b32-6ad4-40a1-b948-beb52805c21c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.226 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[774c3e40-0997-450d-8041-f683d3ddb198]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.227 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161866.2254875, 0409b666-6d7a-4831-9ba3-08afe2d0c46b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.228 227766 INFO nova.compute.manager [-] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.237 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[76ee1551-8dbf-4819-bb69-a55e75bf66f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 systemd-machined[195626]: New machine qemu-30-instance-0000004a.
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.256 227766 DEBUG nova.compute.manager [None req-199bec19-4ab4-47c9-bea2-eec65fa8e2d0 - - - - - -] [instance: 0409b666-6d7a-4831-9ba3-08afe2d0c46b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:51:21 np0005593234 systemd[1]: Started Virtual Machine qemu-30-instance-0000004a.
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.261 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a0f39dd1-b727-4c95-8900-b524cabce73b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 systemd-udevd[259835]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:51:21 np0005593234 NetworkManager[48942]: <info>  [1769161881.2856] device (tapc0bfdf23-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:51:21 np0005593234 NetworkManager[48942]: <info>  [1769161881.2866] device (tapc0bfdf23-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.297 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9b1ab8d0-153e-467c-9334-f2688065f295]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.301 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[08224d1f-b38d-4ab3-830d-551b9070f65b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 NetworkManager[48942]: <info>  [1769161881.3023] manager: (tap6d2cdc4c-40): new Veth device (/org/freedesktop/NetworkManager/Devices/125)
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.327 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[3d628aa8-6a8b-4228-92af-f0b1cb266572]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.330 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[0260b5cc-60b5-4b95-84e0-89099b165bb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.349 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000004a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.349 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000004a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:51:21 np0005593234 NetworkManager[48942]: <info>  [1769161881.3504] device (tap6d2cdc4c-40): carrier: link connected
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.356 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[8055de21-6df2-495e-80d4-21be7189855b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.373 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[95f2fdc6-0c25-469f-b2b6-a45c174cd0f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d2cdc4c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:5a:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577631, 'reachable_time': 25874, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 259864, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.389 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b751e4b9-afa0-40c2-819a-74fd5df02b2c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:5a26'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 577631, 'tstamp': 577631}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 259865, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.408 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0ce9c631-aa70-411c-bf19-4bf17d21811f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d2cdc4c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:5a:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577631, 'reachable_time': 25874, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 259866, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.442 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0a965cb4-6dc6-4dc5-87a6-7749c68145b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.493 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.494 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4565MB free_disk=20.892486572265625GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.494 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.494 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.502 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bb4b3c64-3c13-441d-8216-66aa38b89cfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.503 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d2cdc4c-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.503 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.503 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d2cdc4c-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:21 np0005593234 kernel: tap6d2cdc4c-40: entered promiscuous mode
Jan 23 04:51:21 np0005593234 NetworkManager[48942]: <info>  [1769161881.5361] manager: (tap6d2cdc4c-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.535 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.537 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.538 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6d2cdc4c-40, col_values=(('external_ids', {'iface-id': '04f6c0b6-99ee-4958-bc01-68fa310042f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:21 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:21Z|00238|binding|INFO|Releasing lport 04f6c0b6-99ee-4958-bc01-68fa310042f0 from this chassis (sb_readonly=0)
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.540 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.553 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.553 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.554 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f98432cd-a4eb-40d5-b9ee-b20ef6dce478]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.555 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:51:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:21.555 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'env', 'PROCESS_TAG=haproxy-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6d2cdc4c-47a0-475b-8e71-39465d365de3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.574 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 27805b05-1c12-4131-9b61-c4fabd93f60d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.574 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.574 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:51:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:21.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:21.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.659 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.684 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161881.6601167, 27805b05-1c12-4131-9b61-c4fabd93f60d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.684 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] VM Started (Lifecycle Event)#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.712 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.718 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161881.6602411, 27805b05-1c12-4131-9b61-c4fabd93f60d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.718 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.745 227766 DEBUG nova.network.neutron [req-07b2b05a-1c8d-4d8b-af8f-4e67ed1fa937 req-e4950a6a-2d3c-471f-8869-428a62181b73 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Updated VIF entry in instance network info cache for port c0bfdf23-cc4e-4663-8b3c-29e15588af59. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.746 227766 DEBUG nova.network.neutron [req-07b2b05a-1c8d-4d8b-af8f-4e67ed1fa937 req-e4950a6a-2d3c-471f-8869-428a62181b73 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Updating instance_info_cache with network_info: [{"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.750 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.755 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.789 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:51:21 np0005593234 nova_compute[227762]: 2026-01-23 09:51:21.792 227766 DEBUG oslo_concurrency.lockutils [req-07b2b05a-1c8d-4d8b-af8f-4e67ed1fa937 req-e4950a6a-2d3c-471f-8869-428a62181b73 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-27805b05-1c12-4131-9b61-c4fabd93f60d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:51:21 np0005593234 podman[259959]: 2026-01-23 09:51:21.895323541 +0000 UTC m=+0.049596760 container create 3d97c2b316e341db65c0e04d405d7560861c39777216558107e209a153465cc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 04:51:21 np0005593234 systemd[1]: Started libpod-conmon-3d97c2b316e341db65c0e04d405d7560861c39777216558107e209a153465cc1.scope.
Jan 23 04:51:21 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:51:21 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5f126dd752b124f0406467708936df6dc0df20108f633c4cfd6541f97170c9f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:21 np0005593234 podman[259959]: 2026-01-23 09:51:21.867941616 +0000 UTC m=+0.022214855 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:51:21 np0005593234 podman[259959]: 2026-01-23 09:51:21.965370619 +0000 UTC m=+0.119643838 container init 3d97c2b316e341db65c0e04d405d7560861c39777216558107e209a153465cc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 04:51:21 np0005593234 podman[259959]: 2026-01-23 09:51:21.970460518 +0000 UTC m=+0.124733727 container start 3d97c2b316e341db65c0e04d405d7560861c39777216558107e209a153465cc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 04:51:21 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[259974]: [NOTICE]   (259978) : New worker (259980) forked
Jan 23 04:51:21 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[259974]: [NOTICE]   (259978) : Loading success.
Jan 23 04:51:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:51:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3565579020' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.111 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.117 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.144 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.201 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.201 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.340 227766 DEBUG nova.compute.manager [req-2a7dccff-9907-4665-8461-c30467daf062 req-8cef80dc-6db1-4cd9-83ab-8a9b7bd7bb36 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Received event network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.340 227766 DEBUG oslo_concurrency.lockutils [req-2a7dccff-9907-4665-8461-c30467daf062 req-8cef80dc-6db1-4cd9-83ab-8a9b7bd7bb36 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.341 227766 DEBUG oslo_concurrency.lockutils [req-2a7dccff-9907-4665-8461-c30467daf062 req-8cef80dc-6db1-4cd9-83ab-8a9b7bd7bb36 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.341 227766 DEBUG oslo_concurrency.lockutils [req-2a7dccff-9907-4665-8461-c30467daf062 req-8cef80dc-6db1-4cd9-83ab-8a9b7bd7bb36 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.341 227766 DEBUG nova.compute.manager [req-2a7dccff-9907-4665-8461-c30467daf062 req-8cef80dc-6db1-4cd9-83ab-8a9b7bd7bb36 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Processing event network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.342 227766 DEBUG nova.compute.manager [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.347 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.348 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161882.346818, 27805b05-1c12-4131-9b61-c4fabd93f60d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.349 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.354 227766 INFO nova.virt.libvirt.driver [-] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Instance spawned successfully.#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.355 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.389 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.393 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.393 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.394 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.394 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.395 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.395 227766 DEBUG nova.virt.libvirt.driver [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.400 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.441 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.474 227766 INFO nova.compute.manager [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Took 9.97 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.475 227766 DEBUG nova.compute.manager [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.563 227766 INFO nova.compute.manager [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Took 12.28 seconds to build instance.#033[00m
Jan 23 04:51:22 np0005593234 nova_compute[227762]: 2026-01-23 09:51:22.582 227766 DEBUG oslo_concurrency.lockutils [None req-da058d53-186f-4e0c-9321-5dcc235e81e8 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:23.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:23.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:24 np0005593234 nova_compute[227762]: 2026-01-23 09:51:24.196 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:24 np0005593234 nova_compute[227762]: 2026-01-23 09:51:24.232 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:24 np0005593234 nova_compute[227762]: 2026-01-23 09:51:24.233 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:24 np0005593234 nova_compute[227762]: 2026-01-23 09:51:24.415 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:24 np0005593234 nova_compute[227762]: 2026-01-23 09:51:24.816 227766 DEBUG nova.compute.manager [req-4e5ae9f0-5eb5-4e80-ae94-c90791698ca1 req-64dd6b49-9ee8-442d-a2a5-1719d2fae626 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Received event network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:24 np0005593234 nova_compute[227762]: 2026-01-23 09:51:24.816 227766 DEBUG oslo_concurrency.lockutils [req-4e5ae9f0-5eb5-4e80-ae94-c90791698ca1 req-64dd6b49-9ee8-442d-a2a5-1719d2fae626 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:24 np0005593234 nova_compute[227762]: 2026-01-23 09:51:24.817 227766 DEBUG oslo_concurrency.lockutils [req-4e5ae9f0-5eb5-4e80-ae94-c90791698ca1 req-64dd6b49-9ee8-442d-a2a5-1719d2fae626 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:24 np0005593234 nova_compute[227762]: 2026-01-23 09:51:24.817 227766 DEBUG oslo_concurrency.lockutils [req-4e5ae9f0-5eb5-4e80-ae94-c90791698ca1 req-64dd6b49-9ee8-442d-a2a5-1719d2fae626 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:24 np0005593234 nova_compute[227762]: 2026-01-23 09:51:24.817 227766 DEBUG nova.compute.manager [req-4e5ae9f0-5eb5-4e80-ae94-c90791698ca1 req-64dd6b49-9ee8-442d-a2a5-1719d2fae626 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] No waiting events found dispatching network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:24 np0005593234 nova_compute[227762]: 2026-01-23 09:51:24.818 227766 WARNING nova.compute.manager [req-4e5ae9f0-5eb5-4e80-ae94-c90791698ca1 req-64dd6b49-9ee8-442d-a2a5-1719d2fae626 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Received unexpected event network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:51:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:25 np0005593234 nova_compute[227762]: 2026-01-23 09:51:25.429 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:51:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:25.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:51:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:25.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:25 np0005593234 nova_compute[227762]: 2026-01-23 09:51:25.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:25 np0005593234 nova_compute[227762]: 2026-01-23 09:51:25.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:51:25 np0005593234 nova_compute[227762]: 2026-01-23 09:51:25.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:51:26 np0005593234 nova_compute[227762]: 2026-01-23 09:51:26.701 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-27805b05-1c12-4131-9b61-c4fabd93f60d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:51:26 np0005593234 nova_compute[227762]: 2026-01-23 09:51:26.701 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-27805b05-1c12-4131-9b61-c4fabd93f60d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:51:26 np0005593234 nova_compute[227762]: 2026-01-23 09:51:26.702 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:51:26 np0005593234 nova_compute[227762]: 2026-01-23 09:51:26.702 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 27805b05-1c12-4131-9b61-c4fabd93f60d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:27.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:27.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:29 np0005593234 nova_compute[227762]: 2026-01-23 09:51:29.419 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:51:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:29.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:51:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:51:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:29.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:51:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:30 np0005593234 nova_compute[227762]: 2026-01-23 09:51:30.430 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:31.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:31.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:32 np0005593234 nova_compute[227762]: 2026-01-23 09:51:32.220 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Updating instance_info_cache with network_info: [{"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:51:32 np0005593234 nova_compute[227762]: 2026-01-23 09:51:32.247 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-27805b05-1c12-4131-9b61-c4fabd93f60d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:51:32 np0005593234 nova_compute[227762]: 2026-01-23 09:51:32.248 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:51:32 np0005593234 nova_compute[227762]: 2026-01-23 09:51:32.248 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:32 np0005593234 nova_compute[227762]: 2026-01-23 09:51:32.249 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:32 np0005593234 nova_compute[227762]: 2026-01-23 09:51:32.604 227766 INFO nova.compute.manager [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Rebuilding instance#033[00m
Jan 23 04:51:33 np0005593234 nova_compute[227762]: 2026-01-23 09:51:33.000 227766 DEBUG nova.objects.instance [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 27805b05-1c12-4131-9b61-c4fabd93f60d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:33 np0005593234 nova_compute[227762]: 2026-01-23 09:51:33.120 227766 DEBUG nova.compute.manager [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:51:33 np0005593234 nova_compute[227762]: 2026-01-23 09:51:33.234 227766 DEBUG nova.objects.instance [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'pci_requests' on Instance uuid 27805b05-1c12-4131-9b61-c4fabd93f60d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:33 np0005593234 nova_compute[227762]: 2026-01-23 09:51:33.267 227766 DEBUG nova.objects.instance [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'pci_devices' on Instance uuid 27805b05-1c12-4131-9b61-c4fabd93f60d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:33 np0005593234 nova_compute[227762]: 2026-01-23 09:51:33.294 227766 DEBUG nova.objects.instance [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'resources' on Instance uuid 27805b05-1c12-4131-9b61-c4fabd93f60d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:33 np0005593234 nova_compute[227762]: 2026-01-23 09:51:33.326 227766 DEBUG nova.objects.instance [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'migration_context' on Instance uuid 27805b05-1c12-4131-9b61-c4fabd93f60d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:33 np0005593234 nova_compute[227762]: 2026-01-23 09:51:33.342 227766 DEBUG nova.objects.instance [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 04:51:33 np0005593234 nova_compute[227762]: 2026-01-23 09:51:33.345 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 04:51:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:33.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:33.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:34 np0005593234 nova_compute[227762]: 2026-01-23 09:51:34.421 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:34 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:34Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7c:5c:6d 10.100.0.14
Jan 23 04:51:34 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:34Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7c:5c:6d 10.100.0.14
Jan 23 04:51:35 np0005593234 nova_compute[227762]: 2026-01-23 09:51:35.243 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:51:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:35 np0005593234 nova_compute[227762]: 2026-01-23 09:51:35.465 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:35.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:51:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:35.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:51:35 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:35Z|00239|binding|INFO|Releasing lport 04f6c0b6-99ee-4958-bc01-68fa310042f0 from this chassis (sb_readonly=0)
Jan 23 04:51:35 np0005593234 nova_compute[227762]: 2026-01-23 09:51:35.982 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:36 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:36Z|00240|binding|INFO|Releasing lport 04f6c0b6-99ee-4958-bc01-68fa310042f0 from this chassis (sb_readonly=0)
Jan 23 04:51:36 np0005593234 nova_compute[227762]: 2026-01-23 09:51:36.159 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:36 np0005593234 podman[259999]: 2026-01-23 09:51:36.766719254 +0000 UTC m=+0.064125434 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true)
Jan 23 04:51:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:37.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:37.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:39 np0005593234 nova_compute[227762]: 2026-01-23 09:51:39.423 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:39.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:39.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:40 np0005593234 nova_compute[227762]: 2026-01-23 09:51:40.514 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:41.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:51:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:41.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:51:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:42.827 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:42.828 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:42.828 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:43 np0005593234 nova_compute[227762]: 2026-01-23 09:51:43.386 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 04:51:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:43.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:43.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:44 np0005593234 nova_compute[227762]: 2026-01-23 09:51:44.426 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:51:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1490985443' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:51:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:51:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1490985443' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:51:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:45 np0005593234 nova_compute[227762]: 2026-01-23 09:51:45.517 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:51:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:45.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:51:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:45.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:45 np0005593234 kernel: tapc0bfdf23-cc (unregistering): left promiscuous mode
Jan 23 04:51:45 np0005593234 NetworkManager[48942]: <info>  [1769161905.7212] device (tapc0bfdf23-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:51:45 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:45Z|00241|binding|INFO|Releasing lport c0bfdf23-cc4e-4663-8b3c-29e15588af59 from this chassis (sb_readonly=0)
Jan 23 04:51:45 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:45Z|00242|binding|INFO|Setting lport c0bfdf23-cc4e-4663-8b3c-29e15588af59 down in Southbound
Jan 23 04:51:45 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:45Z|00243|binding|INFO|Removing iface tapc0bfdf23-cc ovn-installed in OVS
Jan 23 04:51:45 np0005593234 nova_compute[227762]: 2026-01-23 09:51:45.732 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:45 np0005593234 nova_compute[227762]: 2026-01-23 09:51:45.734 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:45 np0005593234 nova_compute[227762]: 2026-01-23 09:51:45.750 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:45.770 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:5c:6d 10.100.0.14'], port_security=['fa:16:3e:7c:5c:6d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '27805b05-1c12-4131-9b61-c4fabd93f60d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=c0bfdf23-cc4e-4663-8b3c-29e15588af59) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:51:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:45.771 144381 INFO neutron.agent.ovn.metadata.agent [-] Port c0bfdf23-cc4e-4663-8b3c-29e15588af59 in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 unbound from our chassis#033[00m
Jan 23 04:51:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:45.773 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d2cdc4c-47a0-475b-8e71-39465d365de3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:51:45 np0005593234 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000004a.scope: Deactivated successfully.
Jan 23 04:51:45 np0005593234 systemd[1]: machine-qemu\x2d30\x2dinstance\x2d0000004a.scope: Consumed 13.128s CPU time.
Jan 23 04:51:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:45.775 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[13da8d26-ba99-4a4d-9c4d-9e5bcad0a2f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:45.776 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 namespace which is not needed anymore#033[00m
Jan 23 04:51:45 np0005593234 systemd-machined[195626]: Machine qemu-30-instance-0000004a terminated.
Jan 23 04:51:45 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[259974]: [NOTICE]   (259978) : haproxy version is 2.8.14-c23fe91
Jan 23 04:51:45 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[259974]: [NOTICE]   (259978) : path to executable is /usr/sbin/haproxy
Jan 23 04:51:45 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[259974]: [WARNING]  (259978) : Exiting Master process...
Jan 23 04:51:45 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[259974]: [ALERT]    (259978) : Current worker (259980) exited with code 143 (Terminated)
Jan 23 04:51:45 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[259974]: [WARNING]  (259978) : All workers exited. Exiting... (0)
Jan 23 04:51:45 np0005593234 systemd[1]: libpod-3d97c2b316e341db65c0e04d405d7560861c39777216558107e209a153465cc1.scope: Deactivated successfully.
Jan 23 04:51:45 np0005593234 podman[260097]: 2026-01-23 09:51:45.89733951 +0000 UTC m=+0.044292294 container died 3d97c2b316e341db65c0e04d405d7560861c39777216558107e209a153465cc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:51:45 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3d97c2b316e341db65c0e04d405d7560861c39777216558107e209a153465cc1-userdata-shm.mount: Deactivated successfully.
Jan 23 04:51:45 np0005593234 systemd[1]: var-lib-containers-storage-overlay-f5f126dd752b124f0406467708936df6dc0df20108f633c4cfd6541f97170c9f-merged.mount: Deactivated successfully.
Jan 23 04:51:45 np0005593234 podman[260097]: 2026-01-23 09:51:45.926920263 +0000 UTC m=+0.073873037 container cleanup 3d97c2b316e341db65c0e04d405d7560861c39777216558107e209a153465cc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:51:45 np0005593234 systemd[1]: libpod-conmon-3d97c2b316e341db65c0e04d405d7560861c39777216558107e209a153465cc1.scope: Deactivated successfully.
Jan 23 04:51:45 np0005593234 podman[260126]: 2026-01-23 09:51:45.998226021 +0000 UTC m=+0.050311272 container remove 3d97c2b316e341db65c0e04d405d7560861c39777216558107e209a153465cc1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 04:51:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:46.005 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2d35e392-fd8b-49ac-a00b-1e0000b5acf4]: (4, ('Fri Jan 23 09:51:45 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 (3d97c2b316e341db65c0e04d405d7560861c39777216558107e209a153465cc1)\n3d97c2b316e341db65c0e04d405d7560861c39777216558107e209a153465cc1\nFri Jan 23 09:51:45 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 (3d97c2b316e341db65c0e04d405d7560861c39777216558107e209a153465cc1)\n3d97c2b316e341db65c0e04d405d7560861c39777216558107e209a153465cc1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:46.008 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[574d87e8-535b-4b43-9bfe-9c82eda9918f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:46.009 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d2cdc4c-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.011 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:46 np0005593234 kernel: tap6d2cdc4c-40: left promiscuous mode
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.064 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:46.066 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[abcdae52-287e-4a62-a1e8-226656bc521c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:46.077 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c5efe65a-7404-4d76-8235-e46d410557b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:46.078 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[957360f8-abf2-4acf-af7d-c22da1686bfd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:46.094 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1856ce04-01b4-42c3-a5ee-7453acfbd5f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 577625, 'reachable_time': 40605, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260154, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:46.098 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:51:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:46.098 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[f87423c6-1643-4ed9-8b7b-d58e0281bb23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:46 np0005593234 systemd[1]: run-netns-ovnmeta\x2d6d2cdc4c\x2d47a0\x2d475b\x2d8e71\x2d39465d365de3.mount: Deactivated successfully.
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.147 227766 DEBUG nova.compute.manager [req-48f41986-7ec0-4f6a-aa4b-b0f8453d291d req-449f3e54-9cdd-4d8b-b5aa-139edc712a5f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Received event network-vif-unplugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.148 227766 DEBUG oslo_concurrency.lockutils [req-48f41986-7ec0-4f6a-aa4b-b0f8453d291d req-449f3e54-9cdd-4d8b-b5aa-139edc712a5f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.148 227766 DEBUG oslo_concurrency.lockutils [req-48f41986-7ec0-4f6a-aa4b-b0f8453d291d req-449f3e54-9cdd-4d8b-b5aa-139edc712a5f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.148 227766 DEBUG oslo_concurrency.lockutils [req-48f41986-7ec0-4f6a-aa4b-b0f8453d291d req-449f3e54-9cdd-4d8b-b5aa-139edc712a5f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.148 227766 DEBUG nova.compute.manager [req-48f41986-7ec0-4f6a-aa4b-b0f8453d291d req-449f3e54-9cdd-4d8b-b5aa-139edc712a5f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] No waiting events found dispatching network-vif-unplugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.148 227766 WARNING nova.compute.manager [req-48f41986-7ec0-4f6a-aa4b-b0f8453d291d req-449f3e54-9cdd-4d8b-b5aa-139edc712a5f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Received unexpected event network-vif-unplugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.396 227766 INFO nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Instance shutdown successfully after 13 seconds.#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.400 227766 INFO nova.virt.libvirt.driver [-] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Instance destroyed successfully.#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.404 227766 INFO nova.virt.libvirt.driver [-] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Instance destroyed successfully.#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.405 227766 DEBUG nova.virt.libvirt.vif [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:51:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1367169890',display_name='tempest-ServerDiskConfigTestJSON-server-1367169890',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1367169890',id=74,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:51:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-f3p1v1c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-
member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:51:31Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=27805b05-1c12-4131-9b61-c4fabd93f60d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.405 227766 DEBUG nova.network.os_vif_util [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.406 227766 DEBUG nova.network.os_vif_util [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:5c:6d,bridge_name='br-int',has_traffic_filtering=True,id=c0bfdf23-cc4e-4663-8b3c-29e15588af59,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0bfdf23-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.406 227766 DEBUG os_vif [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:5c:6d,bridge_name='br-int',has_traffic_filtering=True,id=c0bfdf23-cc4e-4663-8b3c-29e15588af59,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0bfdf23-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.407 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.408 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc0bfdf23-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.409 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.410 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.413 227766 INFO os_vif [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:5c:6d,bridge_name='br-int',has_traffic_filtering=True,id=c0bfdf23-cc4e-4663-8b3c-29e15588af59,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0bfdf23-cc')#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.858 227766 INFO nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Deleting instance files /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d_del#033[00m
Jan 23 04:51:46 np0005593234 nova_compute[227762]: 2026-01-23 09:51:46.859 227766 INFO nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Deletion of /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d_del complete#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.273 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.273 227766 INFO nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Creating image(s)#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.299 227766 DEBUG nova.storage.rbd_utils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.329 227766 DEBUG nova.storage.rbd_utils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.357 227766 DEBUG nova.storage.rbd_utils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.360 227766 DEBUG oslo_concurrency.processutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.422 227766 DEBUG oslo_concurrency.processutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.423 227766 DEBUG oslo_concurrency.lockutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.424 227766 DEBUG oslo_concurrency.lockutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.424 227766 DEBUG oslo_concurrency.lockutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.451 227766 DEBUG nova.storage.rbd_utils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.454 227766 DEBUG oslo_concurrency.processutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 27805b05-1c12-4131-9b61-c4fabd93f60d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:47.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:47.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.757 227766 DEBUG oslo_concurrency.processutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 27805b05-1c12-4131-9b61-c4fabd93f60d_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.302s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.833 227766 DEBUG nova.storage.rbd_utils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] resizing rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.939 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.940 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Ensure instance console log exists: /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.940 227766 DEBUG oslo_concurrency.lockutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.940 227766 DEBUG oslo_concurrency.lockutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.941 227766 DEBUG oslo_concurrency.lockutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.943 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Start _get_guest_xml network_info=[{"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.947 227766 WARNING nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.954 227766 DEBUG nova.virt.libvirt.host [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.955 227766 DEBUG nova.virt.libvirt.host [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.958 227766 DEBUG nova.virt.libvirt.host [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.959 227766 DEBUG nova.virt.libvirt.host [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.960 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.960 227766 DEBUG nova.virt.hardware [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.961 227766 DEBUG nova.virt.hardware [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.961 227766 DEBUG nova.virt.hardware [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.961 227766 DEBUG nova.virt.hardware [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.961 227766 DEBUG nova.virt.hardware [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.962 227766 DEBUG nova.virt.hardware [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.962 227766 DEBUG nova.virt.hardware [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.962 227766 DEBUG nova.virt.hardware [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.963 227766 DEBUG nova.virt.hardware [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.963 227766 DEBUG nova.virt.hardware [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.963 227766 DEBUG nova.virt.hardware [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.963 227766 DEBUG nova.objects.instance [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 27805b05-1c12-4131-9b61-c4fabd93f60d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:47 np0005593234 nova_compute[227762]: 2026-01-23 09:51:47.989 227766 DEBUG oslo_concurrency.processutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:51:48 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3247052048' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.440 227766 DEBUG oslo_concurrency.processutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.466 227766 DEBUG nova.storage.rbd_utils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.470 227766 DEBUG oslo_concurrency.processutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:51:48 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2946235917' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.892 227766 DEBUG oslo_concurrency.processutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.894 227766 DEBUG nova.virt.libvirt.vif [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:51:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1367169890',display_name='tempest-ServerDiskConfigTestJSON-server-1367169890',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1367169890',id=74,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:51:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-f3p1v1c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:51:47Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=27805b05-1c12-4131-9b61-c4fabd93f60d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.895 227766 DEBUG nova.network.os_vif_util [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.896 227766 DEBUG nova.network.os_vif_util [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:5c:6d,bridge_name='br-int',has_traffic_filtering=True,id=c0bfdf23-cc4e-4663-8b3c-29e15588af59,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0bfdf23-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.899 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  <uuid>27805b05-1c12-4131-9b61-c4fabd93f60d</uuid>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  <name>instance-0000004a</name>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-1367169890</nova:name>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:51:47</nova:creationTime>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <nova:user uuid="0cfac2191989448ead77e75ca3910ac4">tempest-ServerDiskConfigTestJSON-211417238-project-member</nova:user>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <nova:project uuid="86d938c8e2bb41a79012befd500d1088">tempest-ServerDiskConfigTestJSON-211417238</nova:project>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="ae1f9e37-418c-462f-81d1-3599a6d89de9"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <nova:port uuid="c0bfdf23-cc4e-4663-8b3c-29e15588af59">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <entry name="serial">27805b05-1c12-4131-9b61-c4fabd93f60d</entry>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <entry name="uuid">27805b05-1c12-4131-9b61-c4fabd93f60d</entry>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/27805b05-1c12-4131-9b61-c4fabd93f60d_disk">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/27805b05-1c12-4131-9b61-c4fabd93f60d_disk.config">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:7c:5c:6d"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <target dev="tapc0bfdf23-cc"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/console.log" append="off"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:51:48 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:51:48 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:51:48 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:51:48 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.900 227766 DEBUG nova.virt.libvirt.vif [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:51:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1367169890',display_name='tempest-ServerDiskConfigTestJSON-server-1367169890',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1367169890',id=74,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:51:22Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-f3p1v1c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:51:47Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=27805b05-1c12-4131-9b61-c4fabd93f60d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.901 227766 DEBUG nova.network.os_vif_util [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.901 227766 DEBUG nova.network.os_vif_util [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:5c:6d,bridge_name='br-int',has_traffic_filtering=True,id=c0bfdf23-cc4e-4663-8b3c-29e15588af59,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0bfdf23-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.902 227766 DEBUG os_vif [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:5c:6d,bridge_name='br-int',has_traffic_filtering=True,id=c0bfdf23-cc4e-4663-8b3c-29e15588af59,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0bfdf23-cc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.902 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.903 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.903 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.905 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.905 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc0bfdf23-cc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.906 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc0bfdf23-cc, col_values=(('external_ids', {'iface-id': 'c0bfdf23-cc4e-4663-8b3c-29e15588af59', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7c:5c:6d', 'vm-uuid': '27805b05-1c12-4131-9b61-c4fabd93f60d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.907 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:48 np0005593234 NetworkManager[48942]: <info>  [1769161908.9085] manager: (tapc0bfdf23-cc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/127)
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.910 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.912 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.913 227766 INFO os_vif [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:5c:6d,bridge_name='br-int',has_traffic_filtering=True,id=c0bfdf23-cc4e-4663-8b3c-29e15588af59,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0bfdf23-cc')#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.981 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.982 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.982 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No VIF found with MAC fa:16:3e:7c:5c:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:51:48 np0005593234 nova_compute[227762]: 2026-01-23 09:51:48.983 227766 INFO nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Using config drive#033[00m
Jan 23 04:51:49 np0005593234 nova_compute[227762]: 2026-01-23 09:51:49.005 227766 DEBUG nova.storage.rbd_utils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:49 np0005593234 nova_compute[227762]: 2026-01-23 09:51:49.233 227766 DEBUG nova.objects.instance [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 27805b05-1c12-4131-9b61-c4fabd93f60d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:49 np0005593234 nova_compute[227762]: 2026-01-23 09:51:49.331 227766 DEBUG nova.objects.instance [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'keypairs' on Instance uuid 27805b05-1c12-4131-9b61-c4fabd93f60d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:49 np0005593234 nova_compute[227762]: 2026-01-23 09:51:49.452 227766 DEBUG nova.compute.manager [req-d2bfaa7d-9137-4254-8501-02408d728ca1 req-73ed8010-454c-4af9-9ff4-60da091670bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Received event network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:49 np0005593234 nova_compute[227762]: 2026-01-23 09:51:49.453 227766 DEBUG oslo_concurrency.lockutils [req-d2bfaa7d-9137-4254-8501-02408d728ca1 req-73ed8010-454c-4af9-9ff4-60da091670bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:49 np0005593234 nova_compute[227762]: 2026-01-23 09:51:49.453 227766 DEBUG oslo_concurrency.lockutils [req-d2bfaa7d-9137-4254-8501-02408d728ca1 req-73ed8010-454c-4af9-9ff4-60da091670bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:49 np0005593234 nova_compute[227762]: 2026-01-23 09:51:49.453 227766 DEBUG oslo_concurrency.lockutils [req-d2bfaa7d-9137-4254-8501-02408d728ca1 req-73ed8010-454c-4af9-9ff4-60da091670bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:49 np0005593234 nova_compute[227762]: 2026-01-23 09:51:49.453 227766 DEBUG nova.compute.manager [req-d2bfaa7d-9137-4254-8501-02408d728ca1 req-73ed8010-454c-4af9-9ff4-60da091670bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] No waiting events found dispatching network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:49 np0005593234 nova_compute[227762]: 2026-01-23 09:51:49.454 227766 WARNING nova.compute.manager [req-d2bfaa7d-9137-4254-8501-02408d728ca1 req-73ed8010-454c-4af9-9ff4-60da091670bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Received unexpected event network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 23 04:51:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:51:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:49.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:51:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:49.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:49 np0005593234 nova_compute[227762]: 2026-01-23 09:51:49.904 227766 INFO nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Creating config drive at /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/disk.config#033[00m
Jan 23 04:51:49 np0005593234 nova_compute[227762]: 2026-01-23 09:51:49.909 227766 DEBUG oslo_concurrency.processutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwu06dwky execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.038 227766 DEBUG oslo_concurrency.processutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwu06dwky" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.068 227766 DEBUG nova.storage.rbd_utils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] rbd image 27805b05-1c12-4131-9b61-c4fabd93f60d_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.072 227766 DEBUG oslo_concurrency.processutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/disk.config 27805b05-1c12-4131-9b61-c4fabd93f60d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.217 227766 DEBUG oslo_concurrency.processutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/disk.config 27805b05-1c12-4131-9b61-c4fabd93f60d_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.218 227766 INFO nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Deleting local config drive /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d/disk.config because it was imported into RBD.#033[00m
Jan 23 04:51:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:50 np0005593234 kernel: tapc0bfdf23-cc: entered promiscuous mode
Jan 23 04:51:50 np0005593234 NetworkManager[48942]: <info>  [1769161910.2710] manager: (tapc0bfdf23-cc): new Tun device (/org/freedesktop/NetworkManager/Devices/128)
Jan 23 04:51:50 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:50Z|00244|binding|INFO|Claiming lport c0bfdf23-cc4e-4663-8b3c-29e15588af59 for this chassis.
Jan 23 04:51:50 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:50Z|00245|binding|INFO|c0bfdf23-cc4e-4663-8b3c-29e15588af59: Claiming fa:16:3e:7c:5c:6d 10.100.0.14
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.271 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:50 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:50Z|00246|binding|INFO|Setting lport c0bfdf23-cc4e-4663-8b3c-29e15588af59 ovn-installed in OVS
Jan 23 04:51:50 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:50Z|00247|binding|INFO|Setting lport c0bfdf23-cc4e-4663-8b3c-29e15588af59 up in Southbound
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.287 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.287 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:5c:6d 10.100.0.14'], port_security=['fa:16:3e:7c:5c:6d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '27805b05-1c12-4131-9b61-c4fabd93f60d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '5', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=c0bfdf23-cc4e-4663-8b3c-29e15588af59) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.289 144381 INFO neutron.agent.ovn.metadata.agent [-] Port c0bfdf23-cc4e-4663-8b3c-29e15588af59 in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 bound to our chassis
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.290 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.291 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:50 np0005593234 systemd-udevd[260475]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.301 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4f1b561c-7be9-4d1b-a327-59febaa74f47]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.302 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6d2cdc4c-41 in ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.303 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6d2cdc4c-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.304 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[84561808-dcbc-408e-973f-f3670a344ded]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.305 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[43297af7-b08b-4469-8dfd-5dd10f2f454a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 NetworkManager[48942]: <info>  [1769161910.3067] device (tapc0bfdf23-cc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:51:50 np0005593234 NetworkManager[48942]: <info>  [1769161910.3072] device (tapc0bfdf23-cc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:51:50 np0005593234 systemd-machined[195626]: New machine qemu-31-instance-0000004a.
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.316 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[2ed9f210-bc90-4b75-98e1-3349c1ef3dda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 systemd[1]: Started Virtual Machine qemu-31-instance-0000004a.
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.329 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d6b439a2-998e-4f6a-9970-302091f636a7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.358 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f8531e50-ab38-446e-8413-cc4a2d1f4a84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 NetworkManager[48942]: <info>  [1769161910.3660] manager: (tap6d2cdc4c-40): new Veth device (/org/freedesktop/NetworkManager/Devices/129)
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.365 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0b6851f9-639d-42bb-a230-3ca243734890]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.398 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[6c61aa68-90a6-4656-a744-860e94f63935]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.401 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[47c75410-7c85-43a4-9edb-18878bdfaacf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 NetworkManager[48942]: <info>  [1769161910.4236] device (tap6d2cdc4c-40): carrier: link connected
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.430 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[2ed09deb-8fd7-4d91-a649-353460797579]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.447 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[984d2390-4170-4c5d-9ce9-e74d71b00a8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d2cdc4c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:5a:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580538, 'reachable_time': 31521, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260512, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.464 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1ad37c4f-8845-43ed-b858-d1e89d468da8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:5a26'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 580538, 'tstamp': 580538}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 260513, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.483 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a80e01c8-6893-4b61-be7e-fadd2f4e19bf]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d2cdc4c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:5a:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580538, 'reachable_time': 31521, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 260514, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.512 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3aba05bc-38ca-4742-8537-ce4ccf410290]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.518 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.573 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4d4da7fc-de69-447b-9555-155e416e4ff1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.574 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d2cdc4c-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.575 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.575 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d2cdc4c-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.577 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:50 np0005593234 NetworkManager[48942]: <info>  [1769161910.5781] manager: (tap6d2cdc4c-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Jan 23 04:51:50 np0005593234 kernel: tap6d2cdc4c-40: entered promiscuous mode
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.579 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.581 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6d2cdc4c-40, col_values=(('external_ids', {'iface-id': '04f6c0b6-99ee-4958-bc01-68fa310042f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.582 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:50 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:50Z|00248|binding|INFO|Releasing lport 04f6c0b6-99ee-4958-bc01-68fa310042f0 from this chassis (sb_readonly=0)
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.597 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.598 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.599 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2d713475-eb2b-4fb6-9d89-cdd6872db60f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.599 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 04:51:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:50.600 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'env', 'PROCESS_TAG=haproxy-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6d2cdc4c-47a0-475b-8e71-39465d365de3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.741 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for 27805b05-1c12-4131-9b61-c4fabd93f60d due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.742 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161910.741286, 27805b05-1c12-4131-9b61-c4fabd93f60d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.742 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] VM Resumed (Lifecycle Event)
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.745 227766 DEBUG nova.compute.manager [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.745 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.748 227766 INFO nova.virt.libvirt.driver [-] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Instance spawned successfully.
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.749 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.789 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.796 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.799 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.800 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.801 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.801 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.802 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.802 227766 DEBUG nova.virt.libvirt.driver [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.838 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.838 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161910.7420888, 27805b05-1c12-4131-9b61-c4fabd93f60d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.839 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] VM Started (Lifecycle Event)
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.887 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.891 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.933 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.
Jan 23 04:51:50 np0005593234 nova_compute[227762]: 2026-01-23 09:51:50.946 227766 DEBUG nova.compute.manager [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:51:50 np0005593234 podman[260588]: 2026-01-23 09:51:50.976441491 +0000 UTC m=+0.047899297 container create 6b44e14cef6357d1368b3cd9af3279e53dce6edbdd62221628de039402c6f7ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 04:51:51 np0005593234 nova_compute[227762]: 2026-01-23 09:51:51.008 227766 DEBUG oslo_concurrency.lockutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:51 np0005593234 nova_compute[227762]: 2026-01-23 09:51:51.009 227766 DEBUG oslo_concurrency.lockutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:51 np0005593234 nova_compute[227762]: 2026-01-23 09:51:51.010 227766 DEBUG nova.objects.instance [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 04:51:51 np0005593234 systemd[1]: Started libpod-conmon-6b44e14cef6357d1368b3cd9af3279e53dce6edbdd62221628de039402c6f7ef.scope.
Jan 23 04:51:51 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:51:51 np0005593234 podman[260588]: 2026-01-23 09:51:50.951322256 +0000 UTC m=+0.022780082 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:51:51 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b3fd254458a5ab00a50560fb9c15d6ae33106c18138ae9a717179101511ac6b4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:51:51 np0005593234 podman[260588]: 2026-01-23 09:51:51.060070643 +0000 UTC m=+0.131528499 container init 6b44e14cef6357d1368b3cd9af3279e53dce6edbdd62221628de039402c6f7ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Jan 23 04:51:51 np0005593234 podman[260588]: 2026-01-23 09:51:51.064843352 +0000 UTC m=+0.136301158 container start 6b44e14cef6357d1368b3cd9af3279e53dce6edbdd62221628de039402c6f7ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:51:51 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[260605]: [NOTICE]   (260628) : New worker (260634) forked
Jan 23 04:51:51 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[260605]: [NOTICE]   (260628) : Loading success.
Jan 23 04:51:51 np0005593234 podman[260602]: 2026-01-23 09:51:51.103425006 +0000 UTC m=+0.086943526 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:51:51 np0005593234 nova_compute[227762]: 2026-01-23 09:51:51.108 227766 DEBUG oslo_concurrency.lockutils [None req-2874465b-522d-419b-ae96-91390366d6a5 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:51.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:51.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:53 np0005593234 nova_compute[227762]: 2026-01-23 09:51:53.188 227766 DEBUG nova.compute.manager [req-bec8c63e-d24f-4c13-9947-369f91b3f03f req-425db2c1-8182-4459-b7cb-66bce30d8361 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Received event network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:53 np0005593234 nova_compute[227762]: 2026-01-23 09:51:53.188 227766 DEBUG oslo_concurrency.lockutils [req-bec8c63e-d24f-4c13-9947-369f91b3f03f req-425db2c1-8182-4459-b7cb-66bce30d8361 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:53 np0005593234 nova_compute[227762]: 2026-01-23 09:51:53.188 227766 DEBUG oslo_concurrency.lockutils [req-bec8c63e-d24f-4c13-9947-369f91b3f03f req-425db2c1-8182-4459-b7cb-66bce30d8361 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:53 np0005593234 nova_compute[227762]: 2026-01-23 09:51:53.189 227766 DEBUG oslo_concurrency.lockutils [req-bec8c63e-d24f-4c13-9947-369f91b3f03f req-425db2c1-8182-4459-b7cb-66bce30d8361 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:53 np0005593234 nova_compute[227762]: 2026-01-23 09:51:53.189 227766 DEBUG nova.compute.manager [req-bec8c63e-d24f-4c13-9947-369f91b3f03f req-425db2c1-8182-4459-b7cb-66bce30d8361 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] No waiting events found dispatching network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:53 np0005593234 nova_compute[227762]: 2026-01-23 09:51:53.189 227766 WARNING nova.compute.manager [req-bec8c63e-d24f-4c13-9947-369f91b3f03f req-425db2c1-8182-4459-b7cb-66bce30d8361 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Received unexpected event network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:51:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:53.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:53.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:53 np0005593234 nova_compute[227762]: 2026-01-23 09:51:53.960 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.346 227766 DEBUG oslo_concurrency.lockutils [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "27805b05-1c12-4131-9b61-c4fabd93f60d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.348 227766 DEBUG oslo_concurrency.lockutils [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.348 227766 DEBUG oslo_concurrency.lockutils [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.349 227766 DEBUG oslo_concurrency.lockutils [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.349 227766 DEBUG oslo_concurrency.lockutils [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.351 227766 INFO nova.compute.manager [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Terminating instance#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.352 227766 DEBUG nova.compute.manager [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:51:54 np0005593234 kernel: tapc0bfdf23-cc (unregistering): left promiscuous mode
Jan 23 04:51:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:51:54 np0005593234 NetworkManager[48942]: <info>  [1769161914.5191] device (tapc0bfdf23-cc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:51:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3000.0 total, 600.0 interval#012Cumulative writes: 8335 writes, 42K keys, 8335 commit groups, 1.0 writes per commit group, ingest: 0.08 GB, 0.03 MB/s#012Cumulative WAL: 8335 writes, 8335 syncs, 1.00 writes per sync, written: 0.08 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1706 writes, 8510 keys, 1706 commit groups, 1.0 writes per commit group, ingest: 16.87 MB, 0.03 MB/s#012Interval WAL: 1707 writes, 1707 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     52.5      0.99              0.15        24    0.041       0      0       0.0       0.0#012  L6      1/0    8.60 MB   0.0      0.2     0.1      0.2       0.2      0.0       0.0   3.9    104.6     86.3      2.37              0.78        23    0.103    125K    13K       0.0       0.0#012 Sum      1/0    8.60 MB   0.0      0.2     0.1      0.2       0.3      0.1       0.0   4.9     73.8     76.3      3.36              0.92        47    0.072    125K    13K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.5     86.0     84.4      0.80              0.17        12    0.066     40K   3070       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.2     0.1      0.2       0.2      0.0       0.0   0.0    104.6     86.3      2.37              0.78        23    0.103    125K    13K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     52.6      0.99              0.15        23    0.043       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3000.0 total, 600.0 interval#012Flush(GB): cumulative 0.051, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.25 GB write, 0.09 MB/s write, 0.24 GB read, 0.08 MB/s read, 3.4 seconds#012Interval compaction: 0.07 GB write, 0.11 MB/s write, 0.07 GB read, 0.11 MB/s read, 0.8 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f10fc171f0#2 capacity: 304.00 MB usage: 27.69 MB table_size: 0 occupancy: 18446744073709551615 collections: 6 last_copies: 0 last_secs: 0.000269 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(1603,26.71 MB,8.78608%) FilterBlock(47,359.36 KB,0.11544%) IndexBlock(47,640.44 KB,0.205733%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 04:51:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:54Z|00249|binding|INFO|Releasing lport c0bfdf23-cc4e-4663-8b3c-29e15588af59 from this chassis (sb_readonly=0)
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.528 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:54Z|00250|binding|INFO|Setting lport c0bfdf23-cc4e-4663-8b3c-29e15588af59 down in Southbound
Jan 23 04:51:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:51:54Z|00251|binding|INFO|Removing iface tapc0bfdf23-cc ovn-installed in OVS
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.530 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.546 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:54.589 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7c:5c:6d 10.100.0.14'], port_security=['fa:16:3e:7c:5c:6d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '27805b05-1c12-4131-9b61-c4fabd93f60d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=c0bfdf23-cc4e-4663-8b3c-29e15588af59) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:51:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:54.591 144381 INFO neutron.agent.ovn.metadata.agent [-] Port c0bfdf23-cc4e-4663-8b3c-29e15588af59 in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 unbound from our chassis#033[00m
Jan 23 04:51:54 np0005593234 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000004a.scope: Deactivated successfully.
Jan 23 04:51:54 np0005593234 systemd[1]: machine-qemu\x2d31\x2dinstance\x2d0000004a.scope: Consumed 4.084s CPU time.
Jan 23 04:51:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:54.592 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d2cdc4c-47a0-475b-8e71-39465d365de3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:51:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:54.593 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b6b7e07e-693a-453b-b8d7-93c3d855335c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:54.594 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 namespace which is not needed anymore#033[00m
Jan 23 04:51:54 np0005593234 systemd-machined[195626]: Machine qemu-31-instance-0000004a terminated.
Jan 23 04:51:54 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[260605]: [NOTICE]   (260628) : haproxy version is 2.8.14-c23fe91
Jan 23 04:51:54 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[260605]: [NOTICE]   (260628) : path to executable is /usr/sbin/haproxy
Jan 23 04:51:54 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[260605]: [WARNING]  (260628) : Exiting Master process...
Jan 23 04:51:54 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[260605]: [ALERT]    (260628) : Current worker (260634) exited with code 143 (Terminated)
Jan 23 04:51:54 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[260605]: [WARNING]  (260628) : All workers exited. Exiting... (0)
Jan 23 04:51:54 np0005593234 systemd[1]: libpod-6b44e14cef6357d1368b3cd9af3279e53dce6edbdd62221628de039402c6f7ef.scope: Deactivated successfully.
Jan 23 04:51:54 np0005593234 podman[260802]: 2026-01-23 09:51:54.725867049 +0000 UTC m=+0.043393576 container died 6b44e14cef6357d1368b3cd9af3279e53dce6edbdd62221628de039402c6f7ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 04:51:54 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6b44e14cef6357d1368b3cd9af3279e53dce6edbdd62221628de039402c6f7ef-userdata-shm.mount: Deactivated successfully.
Jan 23 04:51:54 np0005593234 systemd[1]: var-lib-containers-storage-overlay-b3fd254458a5ab00a50560fb9c15d6ae33106c18138ae9a717179101511ac6b4-merged.mount: Deactivated successfully.
Jan 23 04:51:54 np0005593234 podman[260802]: 2026-01-23 09:51:54.779643319 +0000 UTC m=+0.097169826 container cleanup 6b44e14cef6357d1368b3cd9af3279e53dce6edbdd62221628de039402c6f7ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:51:54 np0005593234 systemd[1]: libpod-conmon-6b44e14cef6357d1368b3cd9af3279e53dce6edbdd62221628de039402c6f7ef.scope: Deactivated successfully.
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.790 227766 INFO nova.virt.libvirt.driver [-] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Instance destroyed successfully.#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.791 227766 DEBUG nova.objects.instance [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'resources' on Instance uuid 27805b05-1c12-4131-9b61-c4fabd93f60d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.815 227766 DEBUG nova.virt.libvirt.vif [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T09:51:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-1367169890',display_name='tempest-ServerDiskConfigTestJSON-server-1367169890',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-1367169890',id=74,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:51:50Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-f3p1v1c0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:51:51Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=27805b05-1c12-4131-9b61-c4fabd93f60d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.816 227766 DEBUG nova.network.os_vif_util [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "address": "fa:16:3e:7c:5c:6d", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc0bfdf23-cc", "ovs_interfaceid": "c0bfdf23-cc4e-4663-8b3c-29e15588af59", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.816 227766 DEBUG nova.network.os_vif_util [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7c:5c:6d,bridge_name='br-int',has_traffic_filtering=True,id=c0bfdf23-cc4e-4663-8b3c-29e15588af59,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0bfdf23-cc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.817 227766 DEBUG os_vif [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:5c:6d,bridge_name='br-int',has_traffic_filtering=True,id=c0bfdf23-cc4e-4663-8b3c-29e15588af59,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0bfdf23-cc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.818 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.818 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc0bfdf23-cc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.822 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.825 227766 INFO os_vif [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7c:5c:6d,bridge_name='br-int',has_traffic_filtering=True,id=c0bfdf23-cc4e-4663-8b3c-29e15588af59,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc0bfdf23-cc')#033[00m
Jan 23 04:51:54 np0005593234 podman[260839]: 2026-01-23 09:51:54.840763308 +0000 UTC m=+0.040448954 container remove 6b44e14cef6357d1368b3cd9af3279e53dce6edbdd62221628de039402c6f7ef (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:51:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:54.846 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dcfbde09-9512-4fc1-a72c-847b4436f672]: (4, ('Fri Jan 23 09:51:54 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 (6b44e14cef6357d1368b3cd9af3279e53dce6edbdd62221628de039402c6f7ef)\n6b44e14cef6357d1368b3cd9af3279e53dce6edbdd62221628de039402c6f7ef\nFri Jan 23 09:51:54 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 (6b44e14cef6357d1368b3cd9af3279e53dce6edbdd62221628de039402c6f7ef)\n6b44e14cef6357d1368b3cd9af3279e53dce6edbdd62221628de039402c6f7ef\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:54.847 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[74fa77f1-ed4c-4c40-97ff-5804d76ff693]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:54.848 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d2cdc4c-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.850 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:54 np0005593234 kernel: tap6d2cdc4c-40: left promiscuous mode
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.851 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:54.855 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2f025fd8-db86-4cb0-8190-bf1906e6b818]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593234 nova_compute[227762]: 2026-01-23 09:51:54.865 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:54.868 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1654319a-8a57-4184-a25d-1eef5ca9cf52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:54.869 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ca4b5380-76e4-4220-a410-174f809df823]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:54.884 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2b383fc2-1272-4866-8e94-0a7bf0fd8e7d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 580531, 'reachable_time': 31981, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 260874, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:54 np0005593234 systemd[1]: run-netns-ovnmeta\x2d6d2cdc4c\x2d47a0\x2d475b\x2d8e71\x2d39465d365de3.mount: Deactivated successfully.
Jan 23 04:51:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:54.888 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:51:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:54.888 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[9a6ae4a4-996a-4b4e-9f7b-1489945c58d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:51:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:51:55 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:51:55 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:51:55 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:51:55 np0005593234 nova_compute[227762]: 2026-01-23 09:51:55.520 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:55.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:55.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:56 np0005593234 nova_compute[227762]: 2026-01-23 09:51:56.038 227766 INFO nova.virt.libvirt.driver [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Deleting instance files /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d_del#033[00m
Jan 23 04:51:56 np0005593234 nova_compute[227762]: 2026-01-23 09:51:56.039 227766 INFO nova.virt.libvirt.driver [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Deletion of /var/lib/nova/instances/27805b05-1c12-4131-9b61-c4fabd93f60d_del complete#033[00m
Jan 23 04:51:56 np0005593234 nova_compute[227762]: 2026-01-23 09:51:56.162 227766 INFO nova.compute.manager [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Took 1.81 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:51:56 np0005593234 nova_compute[227762]: 2026-01-23 09:51:56.162 227766 DEBUG oslo.service.loopingcall [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:51:56 np0005593234 nova_compute[227762]: 2026-01-23 09:51:56.162 227766 DEBUG nova.compute.manager [-] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:51:56 np0005593234 nova_compute[227762]: 2026-01-23 09:51:56.163 227766 DEBUG nova.network.neutron [-] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:51:56 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:51:56 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:51:56 np0005593234 nova_compute[227762]: 2026-01-23 09:51:56.931 227766 DEBUG nova.compute.manager [req-94363faa-79cc-4e9b-9db9-6c192dadc3ae req-8259b847-e146-47ed-8013-0305f7d211ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Received event network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:51:56 np0005593234 nova_compute[227762]: 2026-01-23 09:51:56.931 227766 DEBUG oslo_concurrency.lockutils [req-94363faa-79cc-4e9b-9db9-6c192dadc3ae req-8259b847-e146-47ed-8013-0305f7d211ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:51:56 np0005593234 nova_compute[227762]: 2026-01-23 09:51:56.932 227766 DEBUG oslo_concurrency.lockutils [req-94363faa-79cc-4e9b-9db9-6c192dadc3ae req-8259b847-e146-47ed-8013-0305f7d211ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:51:56 np0005593234 nova_compute[227762]: 2026-01-23 09:51:56.932 227766 DEBUG oslo_concurrency.lockutils [req-94363faa-79cc-4e9b-9db9-6c192dadc3ae req-8259b847-e146-47ed-8013-0305f7d211ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:51:56 np0005593234 nova_compute[227762]: 2026-01-23 09:51:56.932 227766 DEBUG nova.compute.manager [req-94363faa-79cc-4e9b-9db9-6c192dadc3ae req-8259b847-e146-47ed-8013-0305f7d211ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] No waiting events found dispatching network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:51:56 np0005593234 nova_compute[227762]: 2026-01-23 09:51:56.932 227766 WARNING nova.compute.manager [req-94363faa-79cc-4e9b-9db9-6c192dadc3ae req-8259b847-e146-47ed-8013-0305f7d211ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Received unexpected event network-vif-plugged-c0bfdf23-cc4e-4663-8b3c-29e15588af59 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:51:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:57.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:57.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:58.773 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=27, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=26) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:51:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:51:58.774 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:51:58 np0005593234 nova_compute[227762]: 2026-01-23 09:51:58.775 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:51:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:51:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:51:59.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:51:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:51:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:51:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:51:59.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:51:59 np0005593234 nova_compute[227762]: 2026-01-23 09:51:59.865 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:00 np0005593234 nova_compute[227762]: 2026-01-23 09:52:00.521 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:01 np0005593234 nova_compute[227762]: 2026-01-23 09:52:01.405 227766 DEBUG nova.network.neutron [-] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:52:01 np0005593234 nova_compute[227762]: 2026-01-23 09:52:01.487 227766 INFO nova.compute.manager [-] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Took 5.32 seconds to deallocate network for instance.#033[00m
Jan 23 04:52:01 np0005593234 nova_compute[227762]: 2026-01-23 09:52:01.564 227766 DEBUG nova.compute.manager [req-44accc2d-eb96-4caa-8f2a-a0675b939176 req-193513df-1245-41d3-a8c9-4e0127396ff9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Received event network-vif-deleted-c0bfdf23-cc4e-4663-8b3c-29e15588af59 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:52:01 np0005593234 nova_compute[227762]: 2026-01-23 09:52:01.574 227766 DEBUG oslo_concurrency.lockutils [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:01 np0005593234 nova_compute[227762]: 2026-01-23 09:52:01.574 227766 DEBUG oslo_concurrency.lockutils [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:01.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:01.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:01 np0005593234 nova_compute[227762]: 2026-01-23 09:52:01.677 227766 DEBUG oslo_concurrency.processutils [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:52:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:52:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:52:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:52:02 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2478158389' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:52:02 np0005593234 nova_compute[227762]: 2026-01-23 09:52:02.192 227766 DEBUG oslo_concurrency.processutils [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.515s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:52:02 np0005593234 nova_compute[227762]: 2026-01-23 09:52:02.198 227766 DEBUG nova.compute.provider_tree [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:52:02 np0005593234 nova_compute[227762]: 2026-01-23 09:52:02.314 227766 DEBUG nova.scheduler.client.report [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:52:02 np0005593234 nova_compute[227762]: 2026-01-23 09:52:02.360 227766 DEBUG oslo_concurrency.lockutils [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.786s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:02 np0005593234 nova_compute[227762]: 2026-01-23 09:52:02.452 227766 INFO nova.scheduler.client.report [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Deleted allocations for instance 27805b05-1c12-4131-9b61-c4fabd93f60d#033[00m
Jan 23 04:52:02 np0005593234 nova_compute[227762]: 2026-01-23 09:52:02.607 227766 DEBUG oslo_concurrency.lockutils [None req-59946dcf-782a-4969-b8dc-940efadfb786 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "27805b05-1c12-4131-9b61-c4fabd93f60d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.259s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:52:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:03.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:52:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:03.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:04 np0005593234 nova_compute[227762]: 2026-01-23 09:52:04.869 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:05 np0005593234 nova_compute[227762]: 2026-01-23 09:52:05.522 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:52:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:05.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:52:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:05.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:52:06.776 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '27'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:52:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:07.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:07.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:07 np0005593234 podman[261005]: 2026-01-23 09:52:07.753711972 +0000 UTC m=+0.049886799 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 04:52:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:09.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:09.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:09 np0005593234 nova_compute[227762]: 2026-01-23 09:52:09.789 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769161914.788408, 27805b05-1c12-4131-9b61-c4fabd93f60d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:52:09 np0005593234 nova_compute[227762]: 2026-01-23 09:52:09.789 227766 INFO nova.compute.manager [-] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:52:09 np0005593234 nova_compute[227762]: 2026-01-23 09:52:09.854 227766 DEBUG nova.compute.manager [None req-533bdc47-c4aa-439e-96ca-5619a2621706 - - - - - -] [instance: 27805b05-1c12-4131-9b61-c4fabd93f60d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:52:09 np0005593234 nova_compute[227762]: 2026-01-23 09:52:09.871 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:10 np0005593234 nova_compute[227762]: 2026-01-23 09:52:10.525 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:52:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:11.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:52:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:11.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:52:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:13.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:52:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:13.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 23 04:52:14 np0005593234 nova_compute[227762]: 2026-01-23 09:52:14.874 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e247 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:15 np0005593234 nova_compute[227762]: 2026-01-23 09:52:15.527 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:15.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:52:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:15.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:52:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:17.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:17.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:17 np0005593234 nova_compute[227762]: 2026-01-23 09:52:17.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:52:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:19.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:52:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:19.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:19 np0005593234 nova_compute[227762]: 2026-01-23 09:52:19.913 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e248 e248: 3 total, 3 up, 3 in
Jan 23 04:52:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:20 np0005593234 nova_compute[227762]: 2026-01-23 09:52:20.528 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:20 np0005593234 nova_compute[227762]: 2026-01-23 09:52:20.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:20 np0005593234 nova_compute[227762]: 2026-01-23 09:52:20.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:52:20 np0005593234 nova_compute[227762]: 2026-01-23 09:52:20.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:20 np0005593234 nova_compute[227762]: 2026-01-23 09:52:20.980 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:20 np0005593234 nova_compute[227762]: 2026-01-23 09:52:20.981 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:20 np0005593234 nova_compute[227762]: 2026-01-23 09:52:20.981 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:20 np0005593234 nova_compute[227762]: 2026-01-23 09:52:20.981 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:52:20 np0005593234 nova_compute[227762]: 2026-01-23 09:52:20.981 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:52:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:52:21 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3481030889' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:52:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:52:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:21.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:52:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:21.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:21 np0005593234 nova_compute[227762]: 2026-01-23 09:52:21.704 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.722s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:52:21 np0005593234 podman[261103]: 2026-01-23 09:52:21.785612834 +0000 UTC m=+0.087490954 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 04:52:21 np0005593234 nova_compute[227762]: 2026-01-23 09:52:21.868 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:52:21 np0005593234 nova_compute[227762]: 2026-01-23 09:52:21.869 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4575MB free_disk=20.830829620361328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:52:21 np0005593234 nova_compute[227762]: 2026-01-23 09:52:21.869 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:52:21 np0005593234 nova_compute[227762]: 2026-01-23 09:52:21.869 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:52:21 np0005593234 nova_compute[227762]: 2026-01-23 09:52:21.965 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:52:21 np0005593234 nova_compute[227762]: 2026-01-23 09:52:21.966 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:52:21 np0005593234 nova_compute[227762]: 2026-01-23 09:52:21.996 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:52:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:52:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1532756678' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:52:22 np0005593234 nova_compute[227762]: 2026-01-23 09:52:22.426 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:52:22 np0005593234 nova_compute[227762]: 2026-01-23 09:52:22.433 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:52:22 np0005593234 nova_compute[227762]: 2026-01-23 09:52:22.524 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:52:22 np0005593234 nova_compute[227762]: 2026-01-23 09:52:22.778 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:52:22 np0005593234 nova_compute[227762]: 2026-01-23 09:52:22.778 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.909s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:52:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:52:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:23.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:52:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:23.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e249 e249: 3 total, 3 up, 3 in
Jan 23 04:52:24 np0005593234 nova_compute[227762]: 2026-01-23 09:52:24.917 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e249 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e250 e250: 3 total, 3 up, 3 in
Jan 23 04:52:25 np0005593234 nova_compute[227762]: 2026-01-23 09:52:25.530 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:52:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:52:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:25.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:52:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:25.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:25 np0005593234 nova_compute[227762]: 2026-01-23 09:52:25.779 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:25 np0005593234 nova_compute[227762]: 2026-01-23 09:52:25.780 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:26 np0005593234 nova_compute[227762]: 2026-01-23 09:52:26.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:26 np0005593234 nova_compute[227762]: 2026-01-23 09:52:26.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:52:26 np0005593234 nova_compute[227762]: 2026-01-23 09:52:26.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:52:26 np0005593234 nova_compute[227762]: 2026-01-23 09:52:26.783 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:52:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:27.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:27.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:27 np0005593234 nova_compute[227762]: 2026-01-23 09:52:27.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:28 np0005593234 nova_compute[227762]: 2026-01-23 09:52:28.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:52:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 04:52:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 3000.5 total, 600.0 interval
Cumulative writes: 31K writes, 127K keys, 31K commit groups, 1.0 writes per commit group, ingest: 0.13 GB, 0.04 MB/s
Cumulative WAL: 31K writes, 10K syncs, 2.91 writes per sync, written: 0.13 GB, 0.04 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 10K writes, 41K keys, 10K commit groups, 1.0 writes per commit group, ingest: 44.38 MB, 0.07 MB/s
Interval WAL: 10K writes, 4130 syncs, 2.57 writes per sync, written: 0.04 GB, 0.07 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 04:52:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:29.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:29.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:29 np0005593234 nova_compute[227762]: 2026-01-23 09:52:29.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:52:29 np0005593234 nova_compute[227762]: 2026-01-23 09:52:29.919 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:52:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e251 e251: 3 total, 3 up, 3 in
Jan 23 04:52:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:30 np0005593234 nova_compute[227762]: 2026-01-23 09:52:30.533 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:52:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:31.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:31.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:33.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:33.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:34 np0005593234 nova_compute[227762]: 2026-01-23 09:52:34.920 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.297238) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161955297322, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 1525, "num_deletes": 256, "total_data_size": 3323543, "memory_usage": 3377072, "flush_reason": "Manual Compaction"}
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161955412733, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 2171668, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 42121, "largest_seqno": 43641, "table_properties": {"data_size": 2165256, "index_size": 3547, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1797, "raw_key_size": 14311, "raw_average_key_size": 20, "raw_value_size": 2151977, "raw_average_value_size": 3022, "num_data_blocks": 156, "num_entries": 712, "num_filter_entries": 712, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161840, "oldest_key_time": 1769161840, "file_creation_time": 1769161955, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 115537 microseconds, and 5154 cpu microseconds.
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.412794) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 2171668 bytes OK
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.412812) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.546151) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.546196) EVENT_LOG_v1 {"time_micros": 1769161955546187, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.546219) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 3316320, prev total WAL file size 3316320, number of live WAL files 2.
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.547273) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031323534' seq:72057594037927935, type:22 .. '6C6F676D0031353035' seq:0, type:0; will stop at (end)
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(2120KB)], [81(8809KB)]
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161955547403, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 11192905, "oldest_snapshot_seqno": -1}
Jan 23 04:52:35 np0005593234 nova_compute[227762]: 2026-01-23 09:52:35.550 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 6688 keys, 11052285 bytes, temperature: kUnknown
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161955691164, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 11052285, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11006714, "index_size": 27731, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16773, "raw_key_size": 171984, "raw_average_key_size": 25, "raw_value_size": 10886059, "raw_average_value_size": 1627, "num_data_blocks": 1110, "num_entries": 6688, "num_filter_entries": 6688, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769161955, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.691416) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 11052285 bytes
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.693320) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 77.8 rd, 76.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 8.6 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(10.2) write-amplify(5.1) OK, records in: 7219, records dropped: 531 output_compression: NoCompression
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.693337) EVENT_LOG_v1 {"time_micros": 1769161955693330, "job": 50, "event": "compaction_finished", "compaction_time_micros": 143838, "compaction_time_cpu_micros": 28205, "output_level": 6, "num_output_files": 1, "total_output_size": 11052285, "num_input_records": 7219, "num_output_records": 6688, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161955693829, "job": 50, "event": "table_file_deletion", "file_number": 83}
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161955695300, "job": 50, "event": "table_file_deletion", "file_number": 81}
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.547140) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.695329) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.695333) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.695334) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.695336) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:52:35 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:52:35.695337) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:52:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:35.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:52:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:35.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:52:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:37.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:52:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:37.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:52:38 np0005593234 podman[261161]: 2026-01-23 09:52:38.751273979 +0000 UTC m=+0.045038548 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 23 04:52:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:39.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:39.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:39 np0005593234 nova_compute[227762]: 2026-01-23 09:52:39.923 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:52:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:40 np0005593234 nova_compute[227762]: 2026-01-23 09:52:40.551 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:52:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:41.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:41.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e252 e252: 3 total, 3 up, 3 in
Jan 23 04:52:42 np0005593234 nova_compute[227762]: 2026-01-23 09:52:42.796 227766 DEBUG nova.compute.manager [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560
Jan 23 04:52:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:52:42.828 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:52:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:52:42.829 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:52:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:52:42.829 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:52:42 np0005593234 nova_compute[227762]: 2026-01-23 09:52:42.920 227766 DEBUG oslo_concurrency.lockutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:52:42 np0005593234 nova_compute[227762]: 2026-01-23 09:52:42.920 227766 DEBUG oslo_concurrency.lockutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:52:42 np0005593234 nova_compute[227762]: 2026-01-23 09:52:42.986 227766 DEBUG nova.objects.instance [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'pci_requests' on Instance uuid a3c08e79-4f2b-42f2-bcac-21cbcfbc5247 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:52:43 np0005593234 nova_compute[227762]: 2026-01-23 09:52:43.019 227766 DEBUG nova.virt.hardware [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 04:52:43 np0005593234 nova_compute[227762]: 2026-01-23 09:52:43.019 227766 INFO nova.compute.claims [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Claim successful on node compute-2.ctlplane.example.com
Jan 23 04:52:43 np0005593234 nova_compute[227762]: 2026-01-23 09:52:43.020 227766 DEBUG nova.objects.instance [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'resources' on Instance uuid a3c08e79-4f2b-42f2-bcac-21cbcfbc5247 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:52:43 np0005593234 nova_compute[227762]: 2026-01-23 09:52:43.036 227766 DEBUG nova.objects.instance [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'pci_devices' on Instance uuid a3c08e79-4f2b-42f2-bcac-21cbcfbc5247 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:52:43 np0005593234 nova_compute[227762]: 2026-01-23 09:52:43.173 227766 INFO nova.compute.resource_tracker [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Updating resource usage from migration 33811d13-3bff-46fc-9c7b-2a2a36548dcf
Jan 23 04:52:43 np0005593234 nova_compute[227762]: 2026-01-23 09:52:43.173 227766 DEBUG nova.compute.resource_tracker [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Starting to track incoming migration 33811d13-3bff-46fc-9c7b-2a2a36548dcf with flavor eebea5f8-9b11-45ad-873d-c4ea90d3de87 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431
Jan 23 04:52:43 np0005593234 nova_compute[227762]: 2026-01-23 09:52:43.350 227766 DEBUG oslo_concurrency.processutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:52:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:43.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:43.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:52:43 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2702526399' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:52:43 np0005593234 nova_compute[227762]: 2026-01-23 09:52:43.795 227766 DEBUG oslo_concurrency.processutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:52:43 np0005593234 nova_compute[227762]: 2026-01-23 09:52:43.801 227766 DEBUG nova.compute.provider_tree [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:52:44 np0005593234 nova_compute[227762]: 2026-01-23 09:52:44.131 227766 DEBUG nova.scheduler.client.report [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:52:44 np0005593234 nova_compute[227762]: 2026-01-23 09:52:44.182 227766 DEBUG oslo_concurrency.lockutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 1.261s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:52:44 np0005593234 nova_compute[227762]: 2026-01-23 09:52:44.182 227766 INFO nova.compute.manager [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Migrating
Jan 23 04:52:44 np0005593234 nova_compute[227762]: 2026-01-23 09:52:44.926 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:52:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:45 np0005593234 nova_compute[227762]: 2026-01-23 09:52:45.552 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:52:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:45.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:52:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:45.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:52:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:47.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:47.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:48 np0005593234 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 04:52:48 np0005593234 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 04:52:48 np0005593234 systemd-logind[794]: New session 60 of user nova.
Jan 23 04:52:48 np0005593234 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 04:52:48 np0005593234 systemd[1]: Starting User Manager for UID 42436...
Jan 23 04:52:48 np0005593234 systemd[261263]: Queued start job for default target Main User Target.
Jan 23 04:52:48 np0005593234 systemd[261263]: Created slice User Application Slice.
Jan 23 04:52:48 np0005593234 systemd[261263]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:52:48 np0005593234 systemd[261263]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 04:52:48 np0005593234 systemd[261263]: Reached target Paths.
Jan 23 04:52:48 np0005593234 systemd[261263]: Reached target Timers.
Jan 23 04:52:48 np0005593234 systemd[261263]: Starting D-Bus User Message Bus Socket...
Jan 23 04:52:48 np0005593234 systemd[261263]: Starting Create User's Volatile Files and Directories...
Jan 23 04:52:48 np0005593234 systemd[261263]: Listening on D-Bus User Message Bus Socket.
Jan 23 04:52:48 np0005593234 systemd[261263]: Reached target Sockets.
Jan 23 04:52:48 np0005593234 systemd[261263]: Finished Create User's Volatile Files and Directories.
Jan 23 04:52:48 np0005593234 systemd[261263]: Reached target Basic System.
Jan 23 04:52:48 np0005593234 systemd[261263]: Reached target Main User Target.
Jan 23 04:52:48 np0005593234 systemd[261263]: Startup finished in 127ms.
Jan 23 04:52:48 np0005593234 systemd[1]: Started User Manager for UID 42436.
Jan 23 04:52:48 np0005593234 systemd[1]: Started Session 60 of User nova.
Jan 23 04:52:48 np0005593234 systemd[1]: session-60.scope: Deactivated successfully.
Jan 23 04:52:48 np0005593234 systemd-logind[794]: Session 60 logged out. Waiting for processes to exit.
Jan 23 04:52:48 np0005593234 systemd-logind[794]: Removed session 60.
Jan 23 04:52:48 np0005593234 systemd-logind[794]: New session 62 of user nova.
Jan 23 04:52:48 np0005593234 systemd[1]: Started Session 62 of User nova.
Jan 23 04:52:48 np0005593234 systemd[1]: session-62.scope: Deactivated successfully.
Jan 23 04:52:48 np0005593234 systemd-logind[794]: Session 62 logged out. Waiting for processes to exit.
Jan 23 04:52:48 np0005593234 systemd-logind[794]: Removed session 62.
Jan 23 04:52:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:49.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:49.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:49 np0005593234 nova_compute[227762]: 2026-01-23 09:52:49.929 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:52:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e253 e253: 3 total, 3 up, 3 in
Jan 23 04:52:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:50 np0005593234 nova_compute[227762]: 2026-01-23 09:52:50.554 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:52:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:51.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:51.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:52 np0005593234 podman[261287]: 2026-01-23 09:52:52.789275751 +0000 UTC m=+0.079963888 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 23 04:52:52 np0005593234 nova_compute[227762]: 2026-01-23 09:52:52.873 227766 DEBUG nova.compute.manager [req-53c8aa00-6ad4-4959-89e2-9a099a8e368d req-d150c9ae-7788-4e08-bd35-5bcbe7ee2398 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Received event network-vif-unplugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:52:52 np0005593234 nova_compute[227762]: 2026-01-23 09:52:52.873 227766 DEBUG oslo_concurrency.lockutils [req-53c8aa00-6ad4-4959-89e2-9a099a8e368d req-d150c9ae-7788-4e08-bd35-5bcbe7ee2398 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:52:52 np0005593234 nova_compute[227762]: 2026-01-23 09:52:52.874 227766 DEBUG oslo_concurrency.lockutils [req-53c8aa00-6ad4-4959-89e2-9a099a8e368d req-d150c9ae-7788-4e08-bd35-5bcbe7ee2398 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:52:52 np0005593234 nova_compute[227762]: 2026-01-23 09:52:52.874 227766 DEBUG oslo_concurrency.lockutils [req-53c8aa00-6ad4-4959-89e2-9a099a8e368d req-d150c9ae-7788-4e08-bd35-5bcbe7ee2398 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:52:52 np0005593234 nova_compute[227762]: 2026-01-23 09:52:52.874 227766 DEBUG nova.compute.manager [req-53c8aa00-6ad4-4959-89e2-9a099a8e368d req-d150c9ae-7788-4e08-bd35-5bcbe7ee2398 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] No waiting events found dispatching network-vif-unplugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:52:52 np0005593234 nova_compute[227762]: 2026-01-23 09:52:52.874 227766 WARNING nova.compute.manager [req-53c8aa00-6ad4-4959-89e2-9a099a8e368d req-d150c9ae-7788-4e08-bd35-5bcbe7ee2398 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Received unexpected event network-vif-unplugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 for instance with vm_state active and task_state resize_migrating.
Jan 23 04:52:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:53.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:53.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:54 np0005593234 nova_compute[227762]: 2026-01-23 09:52:54.546 227766 INFO nova.network.neutron [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Updating port d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}
Jan 23 04:52:54 np0005593234 nova_compute[227762]: 2026-01-23 09:52:54.932 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:52:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e253 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:52:55 np0005593234 nova_compute[227762]: 2026-01-23 09:52:55.495 227766 DEBUG nova.compute.manager [req-93d7a026-3cff-484c-af18-70deb8b07989 req-389fd91c-09b1-47c9-a2b6-ea2a344adc62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Received event network-vif-plugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:52:55 np0005593234 nova_compute[227762]: 2026-01-23 09:52:55.495 227766 DEBUG oslo_concurrency.lockutils [req-93d7a026-3cff-484c-af18-70deb8b07989 req-389fd91c-09b1-47c9-a2b6-ea2a344adc62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:52:55 np0005593234 nova_compute[227762]: 2026-01-23 09:52:55.496 227766 DEBUG oslo_concurrency.lockutils [req-93d7a026-3cff-484c-af18-70deb8b07989 req-389fd91c-09b1-47c9-a2b6-ea2a344adc62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:52:55 np0005593234 nova_compute[227762]: 2026-01-23 09:52:55.496 227766 DEBUG oslo_concurrency.lockutils [req-93d7a026-3cff-484c-af18-70deb8b07989 req-389fd91c-09b1-47c9-a2b6-ea2a344adc62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:52:55 np0005593234 nova_compute[227762]: 2026-01-23 09:52:55.496 227766 DEBUG nova.compute.manager [req-93d7a026-3cff-484c-af18-70deb8b07989 req-389fd91c-09b1-47c9-a2b6-ea2a344adc62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] No waiting events found dispatching network-vif-plugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:52:55 np0005593234 nova_compute[227762]: 2026-01-23 09:52:55.496 227766 WARNING nova.compute.manager [req-93d7a026-3cff-484c-af18-70deb8b07989 req-389fd91c-09b1-47c9-a2b6-ea2a344adc62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Received unexpected event network-vif-plugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 for instance with vm_state active and task_state resize_migrated.
Jan 23 04:52:55 np0005593234 nova_compute[227762]: 2026-01-23 09:52:55.556 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:52:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:55.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:55.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:56 np0005593234 nova_compute[227762]: 2026-01-23 09:52:56.265 227766 DEBUG oslo_concurrency.lockutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "refresh_cache-a3c08e79-4f2b-42f2-bcac-21cbcfbc5247" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:52:56 np0005593234 nova_compute[227762]: 2026-01-23 09:52:56.266 227766 DEBUG oslo_concurrency.lockutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquired lock "refresh_cache-a3c08e79-4f2b-42f2-bcac-21cbcfbc5247" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:52:56 np0005593234 nova_compute[227762]: 2026-01-23 09:52:56.266 227766 DEBUG nova.network.neutron [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 04:52:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:57.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:57.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:57 np0005593234 nova_compute[227762]: 2026-01-23 09:52:57.752 227766 DEBUG nova.compute.manager [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Received event network-changed-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:52:57 np0005593234 nova_compute[227762]: 2026-01-23 09:52:57.752 227766 DEBUG nova.compute.manager [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Refreshing instance network info cache due to event network-changed-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 04:52:57 np0005593234 nova_compute[227762]: 2026-01-23 09:52:57.752 227766 DEBUG oslo_concurrency.lockutils [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a3c08e79-4f2b-42f2-bcac-21cbcfbc5247" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:52:58 np0005593234 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 04:52:58 np0005593234 systemd[261263]: Activating special unit Exit the Session...
Jan 23 04:52:58 np0005593234 systemd[261263]: Stopped target Main User Target.
Jan 23 04:52:58 np0005593234 systemd[261263]: Stopped target Basic System.
Jan 23 04:52:58 np0005593234 systemd[261263]: Stopped target Paths.
Jan 23 04:52:58 np0005593234 systemd[261263]: Stopped target Sockets.
Jan 23 04:52:58 np0005593234 systemd[261263]: Stopped target Timers.
Jan 23 04:52:58 np0005593234 systemd[261263]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 04:52:58 np0005593234 systemd[261263]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 04:52:58 np0005593234 systemd[261263]: Closed D-Bus User Message Bus Socket.
Jan 23 04:52:58 np0005593234 systemd[261263]: Stopped Create User's Volatile Files and Directories.
Jan 23 04:52:58 np0005593234 systemd[261263]: Removed slice User Application Slice.
Jan 23 04:52:58 np0005593234 systemd[261263]: Reached target Shutdown.
Jan 23 04:52:58 np0005593234 systemd[261263]: Finished Exit the Session.
Jan 23 04:52:58 np0005593234 systemd[261263]: Reached target Exit the Session.
Jan 23 04:52:58 np0005593234 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 04:52:58 np0005593234 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 04:52:58 np0005593234 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 04:52:58 np0005593234 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 04:52:58 np0005593234 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 04:52:58 np0005593234 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 04:52:58 np0005593234 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 04:52:59 np0005593234 nova_compute[227762]: 2026-01-23 09:52:59.131 227766 DEBUG nova.network.neutron [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Updating instance_info_cache with network_info: [{"id": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "address": "fa:16:3e:49:23:e7", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d188d3-bf", "ovs_interfaceid": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:52:59 np0005593234 nova_compute[227762]: 2026-01-23 09:52:59.174 227766 DEBUG oslo_concurrency.lockutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Releasing lock "refresh_cache-a3c08e79-4f2b-42f2-bcac-21cbcfbc5247" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:52:59 np0005593234 nova_compute[227762]: 2026-01-23 09:52:59.178 227766 DEBUG oslo_concurrency.lockutils [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a3c08e79-4f2b-42f2-bcac-21cbcfbc5247" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:52:59 np0005593234 nova_compute[227762]: 2026-01-23 09:52:59.178 227766 DEBUG nova.network.neutron [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Refreshing network info cache for port d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 04:52:59 np0005593234 nova_compute[227762]: 2026-01-23 09:52:59.334 227766 DEBUG nova.virt.libvirt.driver [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698
Jan 23 04:52:59 np0005593234 nova_compute[227762]: 2026-01-23 09:52:59.336 227766 DEBUG nova.virt.libvirt.driver [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719
Jan 23 04:52:59 np0005593234 nova_compute[227762]: 2026-01-23 09:52:59.336 227766 INFO nova.virt.libvirt.driver [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Creating image(s)
Jan 23 04:52:59 np0005593234 nova_compute[227762]: 2026-01-23 09:52:59.375 227766 DEBUG nova.storage.rbd_utils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] creating snapshot(nova-resize) on rbd image(a3c08e79-4f2b-42f2-bcac-21cbcfbc5247_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 23 04:52:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:52:59.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:52:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:52:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:52:59.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:52:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e254 e254: 3 total, 3 up, 3 in
Jan 23 04:52:59 np0005593234 nova_compute[227762]: 2026-01-23 09:52:59.844 227766 DEBUG nova.objects.instance [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'trusted_certs' on Instance uuid a3c08e79-4f2b-42f2-bcac-21cbcfbc5247 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:52:59 np0005593234 nova_compute[227762]: 2026-01-23 09:52:59.985 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.065 227766 DEBUG nova.virt.libvirt.driver [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.066 227766 DEBUG nova.virt.libvirt.driver [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Ensure instance console log exists: /var/lib/nova/instances/a3c08e79-4f2b-42f2-bcac-21cbcfbc5247/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.067 227766 DEBUG oslo_concurrency.lockutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.067 227766 DEBUG oslo_concurrency.lockutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.067 227766 DEBUG oslo_concurrency.lockutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.070 227766 DEBUG nova.virt.libvirt.driver [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Start _get_guest_xml network_info=[{"id": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "address": "fa:16:3e:49:23:e7", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "vif_mac": "fa:16:3e:49:23:e7"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d188d3-bf", "ovs_interfaceid": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.074 227766 WARNING nova.virt.libvirt.driver [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.084 227766 DEBUG nova.virt.libvirt.host [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.084 227766 DEBUG nova.virt.libvirt.host [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.088 227766 DEBUG nova.virt.libvirt.host [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.088 227766 DEBUG nova.virt.libvirt.host [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.089 227766 DEBUG nova.virt.libvirt.driver [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.090 227766 DEBUG nova.virt.hardware [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eebea5f8-9b11-45ad-873d-c4ea90d3de87',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.090 227766 DEBUG nova.virt.hardware [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.090 227766 DEBUG nova.virt.hardware [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.090 227766 DEBUG nova.virt.hardware [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.091 227766 DEBUG nova.virt.hardware [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.091 227766 DEBUG nova.virt.hardware [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.091 227766 DEBUG nova.virt.hardware [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.091 227766 DEBUG nova.virt.hardware [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.091 227766 DEBUG nova.virt.hardware [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.092 227766 DEBUG nova.virt.hardware [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.092 227766 DEBUG nova.virt.hardware [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.092 227766 DEBUG nova.objects.instance [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'vcpu_model' on Instance uuid a3c08e79-4f2b-42f2-bcac-21cbcfbc5247 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.115 227766 DEBUG oslo_concurrency.processutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:00.222 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=28, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=27) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.223 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:00.224 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:53:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:53:00 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2914516695' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.549 227766 DEBUG oslo_concurrency.processutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.580 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:00 np0005593234 nova_compute[227762]: 2026-01-23 09:53:00.585 227766 DEBUG oslo_concurrency.processutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:53:01 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2482348422' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.094 227766 DEBUG oslo_concurrency.processutils [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.096 227766 DEBUG nova.virt.libvirt.vif [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:52:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-231426966',display_name='tempest-ServerDiskConfigTestJSON-server-231426966',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-231426966',id=77,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:52:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-w5bvxhsu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',i
mage_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:52:53Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=a3c08e79-4f2b-42f2-bcac-21cbcfbc5247,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "address": "fa:16:3e:49:23:e7", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "vif_mac": "fa:16:3e:49:23:e7"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d188d3-bf", "ovs_interfaceid": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.096 227766 DEBUG nova.network.os_vif_util [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "address": "fa:16:3e:49:23:e7", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "vif_mac": "fa:16:3e:49:23:e7"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d188d3-bf", "ovs_interfaceid": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.097 227766 DEBUG nova.network.os_vif_util [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:23:e7,bridge_name='br-int',has_traffic_filtering=True,id=d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d188d3-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.100 227766 DEBUG nova.virt.libvirt.driver [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  <uuid>a3c08e79-4f2b-42f2-bcac-21cbcfbc5247</uuid>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  <name>instance-0000004d</name>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  <memory>196608</memory>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerDiskConfigTestJSON-server-231426966</nova:name>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:53:00</nova:creationTime>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.micro">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <nova:memory>192</nova:memory>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <nova:user uuid="0cfac2191989448ead77e75ca3910ac4">tempest-ServerDiskConfigTestJSON-211417238-project-member</nova:user>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <nova:project uuid="86d938c8e2bb41a79012befd500d1088">tempest-ServerDiskConfigTestJSON-211417238</nova:project>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <nova:port uuid="d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <entry name="serial">a3c08e79-4f2b-42f2-bcac-21cbcfbc5247</entry>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <entry name="uuid">a3c08e79-4f2b-42f2-bcac-21cbcfbc5247</entry>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a3c08e79-4f2b-42f2-bcac-21cbcfbc5247_disk">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a3c08e79-4f2b-42f2-bcac-21cbcfbc5247_disk.config">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:49:23:e7"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <target dev="tapd7d188d3-bf"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/a3c08e79-4f2b-42f2-bcac-21cbcfbc5247/console.log" append="off"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:53:01 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:53:01 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:53:01 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:53:01 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.102 227766 DEBUG nova.virt.libvirt.vif [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:52:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-231426966',display_name='tempest-ServerDiskConfigTestJSON-server-231426966',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-231426966',id=77,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:52:24Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-w5bvxhsu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',i
mage_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:52:53Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=a3c08e79-4f2b-42f2-bcac-21cbcfbc5247,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "address": "fa:16:3e:49:23:e7", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "vif_mac": "fa:16:3e:49:23:e7"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d188d3-bf", "ovs_interfaceid": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.102 227766 DEBUG nova.network.os_vif_util [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "address": "fa:16:3e:49:23:e7", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "vif_mac": "fa:16:3e:49:23:e7"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d188d3-bf", "ovs_interfaceid": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.103 227766 DEBUG nova.network.os_vif_util [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:23:e7,bridge_name='br-int',has_traffic_filtering=True,id=d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d188d3-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.103 227766 DEBUG os_vif [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:23:e7,bridge_name='br-int',has_traffic_filtering=True,id=d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d188d3-bf') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.104 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.104 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.104 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.107 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.107 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d188d3-bf, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.108 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd7d188d3-bf, col_values=(('external_ids', {'iface-id': 'd7d188d3-bf5e-4df5-9ce1-6ba00d3f6728', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:49:23:e7', 'vm-uuid': 'a3c08e79-4f2b-42f2-bcac-21cbcfbc5247'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.157 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:01 np0005593234 NetworkManager[48942]: <info>  [1769161981.1579] manager: (tapd7d188d3-bf): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.160 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.163 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.164 227766 INFO os_vif [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:23:e7,bridge_name='br-int',has_traffic_filtering=True,id=d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d188d3-bf')#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.276 227766 DEBUG nova.virt.libvirt.driver [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.277 227766 DEBUG nova.virt.libvirt.driver [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.277 227766 DEBUG nova.virt.libvirt.driver [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] No VIF found with MAC fa:16:3e:49:23:e7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.277 227766 INFO nova.virt.libvirt.driver [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Using config drive#033[00m
Jan 23 04:53:01 np0005593234 kernel: tapd7d188d3-bf: entered promiscuous mode
Jan 23 04:53:01 np0005593234 NetworkManager[48942]: <info>  [1769161981.3617] manager: (tapd7d188d3-bf): new Tun device (/org/freedesktop/NetworkManager/Devices/132)
Jan 23 04:53:01 np0005593234 ovn_controller[134547]: 2026-01-23T09:53:01Z|00252|binding|INFO|Claiming lport d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 for this chassis.
Jan 23 04:53:01 np0005593234 ovn_controller[134547]: 2026-01-23T09:53:01Z|00253|binding|INFO|d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728: Claiming fa:16:3e:49:23:e7 10.100.0.11
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.362 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:01 np0005593234 ovn_controller[134547]: 2026-01-23T09:53:01Z|00254|binding|INFO|Setting lport d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 ovn-installed in OVS
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.380 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:01 np0005593234 ovn_controller[134547]: 2026-01-23T09:53:01Z|00255|binding|INFO|Setting lport d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 up in Southbound
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.389 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:23:e7 10.100.0.11'], port_security=['fa:16:3e:49:23:e7 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a3c08e79-4f2b-42f2-bcac-21cbcfbc5247', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '6', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.390 144381 INFO neutron.agent.ovn.metadata.agent [-] Port d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 bound to our chassis#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.392 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6d2cdc4c-47a0-475b-8e71-39465d365de3#033[00m
Jan 23 04:53:01 np0005593234 systemd-udevd[261535]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:53:01 np0005593234 systemd-machined[195626]: New machine qemu-32-instance-0000004d.
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.403 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3b1304a1-1d85-4cfe-952e-d48bb11ca50d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.404 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6d2cdc4c-41 in ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.406 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6d2cdc4c-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.406 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e27779c4-373f-4ece-a00a-f25e8d4d6679]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.407 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1c741d29-cc6d-471e-ad10-53b289f7b688]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 NetworkManager[48942]: <info>  [1769161981.4107] device (tapd7d188d3-bf): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:53:01 np0005593234 NetworkManager[48942]: <info>  [1769161981.4112] device (tapd7d188d3-bf): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:53:01 np0005593234 systemd[1]: Started Virtual Machine qemu-32-instance-0000004d.
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.419 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[84a84779-6834-453c-85b3-8e3cee86c132]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.443 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[78437909-713c-46fd-87c0-259e66967727]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.472 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[fd53e307-55af-4158-953a-430736eff10c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.477 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[11c35d75-4391-4acc-85dd-41e8c4572e91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 NetworkManager[48942]: <info>  [1769161981.4783] manager: (tap6d2cdc4c-40): new Veth device (/org/freedesktop/NetworkManager/Devices/133)
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.506 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[57be5c92-b539-4d4b-a7b2-514f1ad95dc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.509 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[420f3a28-b9c7-4830-8266-50069bb551a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 NetworkManager[48942]: <info>  [1769161981.5320] device (tap6d2cdc4c-40): carrier: link connected
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.537 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f7e0924d-6742-4938-8458-625a436d7589]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.552 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b6c7854b-7c07-4fdc-ad40-049c3fff097c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d2cdc4c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:5a:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 587649, 'reachable_time': 23981, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 261569, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.568 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7fb2e8ef-844a-46f3-9af2-b929b993360c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:5a26'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 587649, 'tstamp': 587649}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 261570, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.585 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a3c7bdfc-7d03-473e-87fa-20a1fbae8774]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6d2cdc4c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:5a:26'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 587649, 'reachable_time': 23981, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 261571, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.616 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e6ed4b10-6652-423a-a9e8-2b608dad747c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.673 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7cbbfa8d-d79b-4128-bbb7-9b01e59f719c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.675 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d2cdc4c-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.675 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.676 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6d2cdc4c-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.677 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:01 np0005593234 NetworkManager[48942]: <info>  [1769161981.6781] manager: (tap6d2cdc4c-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Jan 23 04:53:01 np0005593234 kernel: tap6d2cdc4c-40: entered promiscuous mode
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.679 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6d2cdc4c-40, col_values=(('external_ids', {'iface-id': '04f6c0b6-99ee-4958-bc01-68fa310042f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.680 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:01 np0005593234 ovn_controller[134547]: 2026-01-23T09:53:01Z|00256|binding|INFO|Releasing lport 04f6c0b6-99ee-4958-bc01-68fa310042f0 from this chassis (sb_readonly=0)
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.681 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.681 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.682 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8a4a474e-338f-478f-83f1-40fc2f0d9e71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.683 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/6d2cdc4c-47a0-475b-8e71-39465d365de3.pid.haproxy
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 6d2cdc4c-47a0-475b-8e71-39465d365de3
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:53:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:01.683 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'env', 'PROCESS_TAG=haproxy-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6d2cdc4c-47a0-475b-8e71-39465d365de3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.696 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:01.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:01.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.854 227766 DEBUG nova.compute.manager [req-5fc97b3c-954d-4d79-9f71-7a065ede6b91 req-51d72b32-fd44-44e9-852c-b2b66f7a4518 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Received event network-vif-plugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.854 227766 DEBUG oslo_concurrency.lockutils [req-5fc97b3c-954d-4d79-9f71-7a065ede6b91 req-51d72b32-fd44-44e9-852c-b2b66f7a4518 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.855 227766 DEBUG oslo_concurrency.lockutils [req-5fc97b3c-954d-4d79-9f71-7a065ede6b91 req-51d72b32-fd44-44e9-852c-b2b66f7a4518 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.855 227766 DEBUG oslo_concurrency.lockutils [req-5fc97b3c-954d-4d79-9f71-7a065ede6b91 req-51d72b32-fd44-44e9-852c-b2b66f7a4518 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.855 227766 DEBUG nova.compute.manager [req-5fc97b3c-954d-4d79-9f71-7a065ede6b91 req-51d72b32-fd44-44e9-852c-b2b66f7a4518 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] No waiting events found dispatching network-vif-plugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:53:01 np0005593234 nova_compute[227762]: 2026-01-23 09:53:01.856 227766 WARNING nova.compute.manager [req-5fc97b3c-954d-4d79-9f71-7a065ede6b91 req-51d72b32-fd44-44e9-852c-b2b66f7a4518 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Received unexpected event network-vif-plugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 23 04:53:02 np0005593234 podman[261711]: 2026-01-23 09:53:02.017502016 +0000 UTC m=+0.024354151 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:53:02 np0005593234 podman[261711]: 2026-01-23 09:53:02.128926146 +0000 UTC m=+0.135778261 container create e3b031eef5164af405f4a76d5bb85a9f14fc96d374cfe9b87b6208ee14d9ca7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:53:02 np0005593234 systemd[1]: Started libpod-conmon-e3b031eef5164af405f4a76d5bb85a9f14fc96d374cfe9b87b6208ee14d9ca7d.scope.
Jan 23 04:53:02 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:53:02 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa2266a2445a79149d540e5808dbb542c70728a592fa2c833aa8809747950e35/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:53:02 np0005593234 podman[261711]: 2026-01-23 09:53:02.335483137 +0000 UTC m=+0.342335252 container init e3b031eef5164af405f4a76d5bb85a9f14fc96d374cfe9b87b6208ee14d9ca7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 04:53:02 np0005593234 podman[261711]: 2026-01-23 09:53:02.345036746 +0000 UTC m=+0.351888861 container start e3b031eef5164af405f4a76d5bb85a9f14fc96d374cfe9b87b6208ee14d9ca7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 04:53:02 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[261764]: [NOTICE]   (261784) : New worker (261788) forked
Jan 23 04:53:02 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[261764]: [NOTICE]   (261784) : Loading success.
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.439 227766 DEBUG nova.network.neutron [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Updated VIF entry in instance network info cache for port d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.440 227766 DEBUG nova.network.neutron [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Updating instance_info_cache with network_info: [{"id": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "address": "fa:16:3e:49:23:e7", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d188d3-bf", "ovs_interfaceid": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.468 227766 DEBUG oslo_concurrency.lockutils [req-12f00ac8-dee6-4038-96ba-af1b6ddee68a req-950e0e59-085b-42d6-b3ba-0004af9f16ed 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a3c08e79-4f2b-42f2-bcac-21cbcfbc5247" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.505 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161982.5049608, a3c08e79-4f2b-42f2-bcac-21cbcfbc5247 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.506 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.507 227766 DEBUG nova.compute.manager [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.510 227766 INFO nova.virt.libvirt.driver [-] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Instance running successfully.#033[00m
Jan 23 04:53:02 np0005593234 virtqemud[227483]: argument unsupported: QEMU guest agent is not configured
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.513 227766 DEBUG nova.virt.libvirt.guest [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.514 227766 DEBUG nova.virt.libvirt.driver [None req-b638ed8e-9e60-4a7f-9c7c-bc8fb72a8e6e 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.534 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.537 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.590 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.591 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769161982.5058656, a3c08e79-4f2b-42f2-bcac-21cbcfbc5247 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.591 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] VM Started (Lifecycle Event)#033[00m
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.626 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:53:02 np0005593234 nova_compute[227762]: 2026-01-23 09:53:02.630 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:53:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:03.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:53:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:03.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #85. Immutable memtables: 0.
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.832933) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 51] Flushing memtable with next log file: 85
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161983832967, "job": 51, "event": "flush_started", "num_memtables": 1, "num_entries": 579, "num_deletes": 252, "total_data_size": 852037, "memory_usage": 864296, "flush_reason": "Manual Compaction"}
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 51] Level-0 flush table #86: started
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161983838246, "cf_name": "default", "job": 51, "event": "table_file_creation", "file_number": 86, "file_size": 561940, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 43646, "largest_seqno": 44220, "table_properties": {"data_size": 558918, "index_size": 994, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7458, "raw_average_key_size": 19, "raw_value_size": 552672, "raw_average_value_size": 1458, "num_data_blocks": 43, "num_entries": 379, "num_filter_entries": 379, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161956, "oldest_key_time": 1769161956, "file_creation_time": 1769161983, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 86, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 51] Flush lasted 5421 microseconds, and 2332 cpu microseconds.
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.838351) [db/flush_job.cc:967] [default] [JOB 51] Level-0 flush table #86: 561940 bytes OK
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.838402) [db/memtable_list.cc:519] [default] Level-0 commit table #86 started
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.839844) [db/memtable_list.cc:722] [default] Level-0 commit table #86: memtable #1 done
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.839860) EVENT_LOG_v1 {"time_micros": 1769161983839854, "job": 51, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.839876) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 51] Try to delete WAL files size 848692, prev total WAL file size 848692, number of live WAL files 2.
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000082.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.840762) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033353134' seq:72057594037927935, type:22 .. '7061786F730033373636' seq:0, type:0; will stop at (end)
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 52] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 51 Base level 0, inputs: [86(548KB)], [84(10MB)]
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161983840821, "job": 52, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [86], "files_L6": [84], "score": -1, "input_data_size": 11614225, "oldest_snapshot_seqno": -1}
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 52] Generated table #87: 6547 keys, 9739272 bytes, temperature: kUnknown
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161983901405, "cf_name": "default", "job": 52, "event": "table_file_creation", "file_number": 87, "file_size": 9739272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9695841, "index_size": 25975, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16389, "raw_key_size": 169802, "raw_average_key_size": 25, "raw_value_size": 9578660, "raw_average_value_size": 1463, "num_data_blocks": 1028, "num_entries": 6547, "num_filter_entries": 6547, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769161983, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 87, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.901701) [db/compaction/compaction_job.cc:1663] [default] [JOB 52] Compacted 1@0 + 1@6 files to L6 => 9739272 bytes
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.903038) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 191.4 rd, 160.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 10.5 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(38.0) write-amplify(17.3) OK, records in: 7067, records dropped: 520 output_compression: NoCompression
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.903122) EVENT_LOG_v1 {"time_micros": 1769161983903052, "job": 52, "event": "compaction_finished", "compaction_time_micros": 60667, "compaction_time_cpu_micros": 21125, "output_level": 6, "num_output_files": 1, "total_output_size": 9739272, "num_input_records": 7067, "num_output_records": 6547, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000086.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161983903303, "job": 52, "event": "table_file_deletion", "file_number": 86}
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000084.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161983905059, "job": 52, "event": "table_file_deletion", "file_number": 84}
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.840697) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.905085) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.905089) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.905090) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.905091) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:53:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:53:03.905093) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:53:04 np0005593234 nova_compute[227762]: 2026-01-23 09:53:04.104 227766 DEBUG nova.compute.manager [req-a3cc047b-8146-473e-a28a-eecdd47b5dc4 req-ef7b250c-7ec8-4bef-a406-2c319943f2d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Received event network-vif-plugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:53:04 np0005593234 nova_compute[227762]: 2026-01-23 09:53:04.105 227766 DEBUG oslo_concurrency.lockutils [req-a3cc047b-8146-473e-a28a-eecdd47b5dc4 req-ef7b250c-7ec8-4bef-a406-2c319943f2d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:04 np0005593234 nova_compute[227762]: 2026-01-23 09:53:04.106 227766 DEBUG oslo_concurrency.lockutils [req-a3cc047b-8146-473e-a28a-eecdd47b5dc4 req-ef7b250c-7ec8-4bef-a406-2c319943f2d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:04 np0005593234 nova_compute[227762]: 2026-01-23 09:53:04.106 227766 DEBUG oslo_concurrency.lockutils [req-a3cc047b-8146-473e-a28a-eecdd47b5dc4 req-ef7b250c-7ec8-4bef-a406-2c319943f2d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:04 np0005593234 nova_compute[227762]: 2026-01-23 09:53:04.106 227766 DEBUG nova.compute.manager [req-a3cc047b-8146-473e-a28a-eecdd47b5dc4 req-ef7b250c-7ec8-4bef-a406-2c319943f2d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] No waiting events found dispatching network-vif-plugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:53:04 np0005593234 nova_compute[227762]: 2026-01-23 09:53:04.107 227766 WARNING nova.compute.manager [req-a3cc047b-8146-473e-a28a-eecdd47b5dc4 req-ef7b250c-7ec8-4bef-a406-2c319943f2d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Received unexpected event network-vif-plugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 for instance with vm_state resized and task_state None.#033[00m
Jan 23 04:53:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:53:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:53:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:05.227 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '28'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:05 np0005593234 nova_compute[227762]: 2026-01-23 09:53:05.562 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:05.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:05.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
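Annotation: the anonymous `HEAD /` pairs from 192.168.122.100 and .102 repeating every two seconds are almost certainly load-balancer health probes against radosgw. If you need to filter or count them, the beast access-log lines can be parsed with a regex; the field layout here (client, user, timestamp, request, status, bytes) is inferred from the samples above, not from RGW documentation:

```python
import re

# Field layout inferred from the beast lines in this log (an assumption).
BEAST = re.compile(
    r'beast: \S+: (?P<client>\S+) - (?P<user>\S+) '
    r'\[(?P<ts>[^\]]+)\] "(?P<req>[^"]+)" (?P<status>\d+) (?P<bytes>\d+)'
)

line = ('beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous '
        '[23/Jan/2026:09:53:05.739 +0000] "HEAD / HTTP/1.0" 200 0 '
        '- - - latency=0.000000000s')

m = BEAST.search(line)
print(m["client"], m["status"], m["req"])
```

A filter like `if m and m["req"].startswith("HEAD / ")` would drop these probes when summarizing real S3 traffic.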
Jan 23 04:53:06 np0005593234 nova_compute[227762]: 2026-01-23 09:53:06.158 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:07.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:07.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:09.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:09.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:09 np0005593234 podman[261807]: 2026-01-23 09:53:09.759311624 +0000 UTC m=+0.054398661 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 04:53:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:10 np0005593234 nova_compute[227762]: 2026-01-23 09:53:10.609 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:11 np0005593234 nova_compute[227762]: 2026-01-23 09:53:11.160 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:53:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:53:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:53:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:11.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:53:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:11.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e255 e255: 3 total, 3 up, 3 in
Jan 23 04:53:12 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 04:53:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 23 04:53:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 23 04:53:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:13.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:13.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:14 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000013 to be held by another RGW process; skipping for now
Jan 23 04:53:14 np0005593234 ovn_controller[134547]: 2026-01-23T09:53:14Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:49:23:e7 10.100.0.11
Jan 23 04:53:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:15 np0005593234 nova_compute[227762]: 2026-01-23 09:53:15.611 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:15.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:53:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:15.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:53:16 np0005593234 nova_compute[227762]: 2026-01-23 09:53:16.162 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:53:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:17.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:53:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:17.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:18 np0005593234 nova_compute[227762]: 2026-01-23 09:53:18.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:19.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:53:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:19.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:53:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e255 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:20 np0005593234 nova_compute[227762]: 2026-01-23 09:53:20.612 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e256 e256: 3 total, 3 up, 3 in
Jan 23 04:53:21 np0005593234 nova_compute[227762]: 2026-01-23 09:53:21.163 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:21.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:53:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:21.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:53:22 np0005593234 nova_compute[227762]: 2026-01-23 09:53:22.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:22 np0005593234 nova_compute[227762]: 2026-01-23 09:53:22.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:53:22 np0005593234 nova_compute[227762]: 2026-01-23 09:53:22.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:22 np0005593234 nova_compute[227762]: 2026-01-23 09:53:22.794 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:22 np0005593234 nova_compute[227762]: 2026-01-23 09:53:22.795 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:22 np0005593234 nova_compute[227762]: 2026-01-23 09:53:22.795 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:22 np0005593234 nova_compute[227762]: 2026-01-23 09:53:22.796 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:53:22 np0005593234 nova_compute[227762]: 2026-01-23 09:53:22.796 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:53:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/814601609' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:53:23 np0005593234 nova_compute[227762]: 2026-01-23 09:53:23.234 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
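Annotation: the resource audit shells out to `ceph df --format=json` (visible above, and echoed by ceph-mon's audit channel) to size the RBD-backed disk pool; the `free_disk=20.92...GB` figure later in this log is derived from that JSON. A minimal sketch of the parse, using embedded sample data; the `stats.total_avail_bytes` field name matches recent Ceph releases but treat it as an assumption:

```python
import json

# Trimmed sample of `ceph df --format=json` output (values illustrative).
sample = json.loads("""
{"stats": {"total_bytes": 64424509440,
           "total_avail_bytes": 22463119360,
           "total_used_raw_bytes": 41961390080}}
""")

def free_capacity_gib(df_json):
    """Derive free capacity in GiB from the cluster-wide stats block."""
    return df_json["stats"]["total_avail_bytes"] / (1024 ** 3)

print(round(free_capacity_gib(sample), 2))  # -> 20.92
```
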
Jan 23 04:53:23 np0005593234 podman[261956]: 2026-01-23 09:53:23.355094654 +0000 UTC m=+0.073637551 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 04:53:23 np0005593234 nova_compute[227762]: 2026-01-23 09:53:23.422 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000004d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:53:23 np0005593234 nova_compute[227762]: 2026-01-23 09:53:23.423 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000004d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:53:23 np0005593234 nova_compute[227762]: 2026-01-23 09:53:23.572 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:53:23 np0005593234 nova_compute[227762]: 2026-01-23 09:53:23.573 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4401MB free_disk=20.921844482421875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:53:23 np0005593234 nova_compute[227762]: 2026-01-23 09:53:23.573 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:23 np0005593234 nova_compute[227762]: 2026-01-23 09:53:23.574 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:23.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:23.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:23 np0005593234 nova_compute[227762]: 2026-01-23 09:53:23.937 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance a3c08e79-4f2b-42f2-bcac-21cbcfbc5247 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:53:23 np0005593234 nova_compute[227762]: 2026-01-23 09:53:23.938 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:53:23 np0005593234 nova_compute[227762]: 2026-01-23 09:53:23.938 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:53:24 np0005593234 nova_compute[227762]: 2026-01-23 09:53:24.181 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:53:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4129964939' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:53:24 np0005593234 nova_compute[227762]: 2026-01-23 09:53:24.603 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:24 np0005593234 nova_compute[227762]: 2026-01-23 09:53:24.608 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:53:25 np0005593234 nova_compute[227762]: 2026-01-23 09:53:25.084 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
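Annotation: the inventory dict above is what Nova reports to Placement. Placement's schedulable capacity for each resource class follows `capacity = (total - reserved) * allocation_ratio`, which is why this 8-vCPU host can accept up to 32 VCPU allocations. A minimal sketch over the logged values:

```python
# Inventory fields copied from the log line above.
inventory = {
    "VCPU":      {"total": 8,    "reserved": 0,   "allocation_ratio": 4.0},
    "MEMORY_MB": {"total": 7679, "reserved": 512, "allocation_ratio": 1.0},
    "DISK_GB":   {"total": 20,   "reserved": 1,   "allocation_ratio": 0.9},
}

def capacity(inv):
    """Schedulable units per resource class, per Placement's formula."""
    return {rc: (v["total"] - v["reserved"]) * v["allocation_ratio"]
            for rc, v in inv.items()}

print(capacity(inventory)["VCPU"])  # -> 32.0, despite 8 physical vCPUs
```

Note the DISK_GB ratio below 1.0 deliberately undercommits disk (about 17 GB schedulable of 20), while VCPU is overcommitted 4x.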
Jan 23 04:53:25 np0005593234 nova_compute[227762]: 2026-01-23 09:53:25.437 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:53:25 np0005593234 nova_compute[227762]: 2026-01-23 09:53:25.437 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.864s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:25 np0005593234 nova_compute[227762]: 2026-01-23 09:53:25.614 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:25.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:25.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:25.999 227766 DEBUG oslo_concurrency.lockutils [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.000 227766 DEBUG oslo_concurrency.lockutils [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.000 227766 DEBUG oslo_concurrency.lockutils [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.001 227766 DEBUG oslo_concurrency.lockutils [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.001 227766 DEBUG oslo_concurrency.lockutils [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.003 227766 INFO nova.compute.manager [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Terminating instance#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.004 227766 DEBUG nova.compute.manager [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.165 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:26 np0005593234 kernel: tapd7d188d3-bf (unregistering): left promiscuous mode
Jan 23 04:53:26 np0005593234 NetworkManager[48942]: <info>  [1769162006.2418] device (tapd7d188d3-bf): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:53:26 np0005593234 ovn_controller[134547]: 2026-01-23T09:53:26Z|00257|binding|INFO|Releasing lport d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 from this chassis (sb_readonly=0)
Jan 23 04:53:26 np0005593234 ovn_controller[134547]: 2026-01-23T09:53:26Z|00258|binding|INFO|Setting lport d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 down in Southbound
Jan 23 04:53:26 np0005593234 ovn_controller[134547]: 2026-01-23T09:53:26Z|00259|binding|INFO|Removing iface tapd7d188d3-bf ovn-installed in OVS
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.253 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:26.260 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:23:e7 10.100.0.11'], port_security=['fa:16:3e:49:23:e7 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a3c08e79-4f2b-42f2-bcac-21cbcfbc5247', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '86d938c8e2bb41a79012befd500d1088', 'neutron:revision_number': '8', 'neutron:security_group_ids': '7a7b70d2-dc13-4ace-b4e0-b2bcfa748347', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=99c61616-3f86-4228-bb78-0dc84e2b2157, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:53:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:26.262 144381 INFO neutron.agent.ovn.metadata.agent [-] Port d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 in datapath 6d2cdc4c-47a0-475b-8e71-39465d365de3 unbound from our chassis#033[00m
Jan 23 04:53:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:26.263 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6d2cdc4c-47a0-475b-8e71-39465d365de3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:53:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:26.265 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[eea862c8-f9bc-46be-a98b-2d396b24f95a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:26.266 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 namespace which is not needed anymore#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.272 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:26 np0005593234 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004d.scope: Deactivated successfully.
Jan 23 04:53:26 np0005593234 systemd[1]: machine-qemu\x2d32\x2dinstance\x2d0000004d.scope: Consumed 14.367s CPU time.
Jan 23 04:53:26 np0005593234 systemd-machined[195626]: Machine qemu-32-instance-0000004d terminated.
Jan 23 04:53:26 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[261764]: [NOTICE]   (261784) : haproxy version is 2.8.14-c23fe91
Jan 23 04:53:26 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[261764]: [NOTICE]   (261784) : path to executable is /usr/sbin/haproxy
Jan 23 04:53:26 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[261764]: [WARNING]  (261784) : Exiting Master process...
Jan 23 04:53:26 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[261764]: [WARNING]  (261784) : Exiting Master process...
Jan 23 04:53:26 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[261764]: [ALERT]    (261784) : Current worker (261788) exited with code 143 (Terminated)
Jan 23 04:53:26 np0005593234 neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3[261764]: [WARNING]  (261784) : All workers exited. Exiting... (0)
Jan 23 04:53:26 np0005593234 systemd[1]: libpod-e3b031eef5164af405f4a76d5bb85a9f14fc96d374cfe9b87b6208ee14d9ca7d.scope: Deactivated successfully.
Jan 23 04:53:26 np0005593234 podman[262032]: 2026-01-23 09:53:26.409740603 +0000 UTC m=+0.050110386 container died e3b031eef5164af405f4a76d5bb85a9f14fc96d374cfe9b87b6208ee14d9ca7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.437 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.439 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:26 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e3b031eef5164af405f4a76d5bb85a9f14fc96d374cfe9b87b6208ee14d9ca7d-userdata-shm.mount: Deactivated successfully.
Jan 23 04:53:26 np0005593234 systemd[1]: var-lib-containers-storage-overlay-aa2266a2445a79149d540e5808dbb542c70728a592fa2c833aa8809747950e35-merged.mount: Deactivated successfully.
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.446 227766 INFO nova.virt.libvirt.driver [-] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Instance destroyed successfully.#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.447 227766 DEBUG nova.objects.instance [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lazy-loading 'resources' on Instance uuid a3c08e79-4f2b-42f2-bcac-21cbcfbc5247 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:53:26 np0005593234 podman[262032]: 2026-01-23 09:53:26.44871671 +0000 UTC m=+0.089086483 container cleanup e3b031eef5164af405f4a76d5bb85a9f14fc96d374cfe9b87b6208ee14d9ca7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 04:53:26 np0005593234 systemd[1]: libpod-conmon-e3b031eef5164af405f4a76d5bb85a9f14fc96d374cfe9b87b6208ee14d9ca7d.scope: Deactivated successfully.
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.478 227766 DEBUG nova.virt.libvirt.vif [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:52:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerDiskConfigTestJSON-server-231426966',display_name='tempest-ServerDiskConfigTestJSON-server-231426966',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverdiskconfigtestjson-server-231426966',id=77,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:53:02Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='86d938c8e2bb41a79012befd500d1088',ramdisk_id='',reservation_id='r-w5bvxhsu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',
image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerDiskConfigTestJSON-211417238',owner_user_name='tempest-ServerDiskConfigTestJSON-211417238-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:53:14Z,user_data=None,user_id='0cfac2191989448ead77e75ca3910ac4',uuid=a3c08e79-4f2b-42f2-bcac-21cbcfbc5247,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "address": "fa:16:3e:49:23:e7", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d188d3-bf", "ovs_interfaceid": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.479 227766 DEBUG nova.network.os_vif_util [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converting VIF {"id": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "address": "fa:16:3e:49:23:e7", "network": {"id": "6d2cdc4c-47a0-475b-8e71-39465d365de3", "bridge": "br-int", "label": "tempest-ServerDiskConfigTestJSON-1859353210-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "86d938c8e2bb41a79012befd500d1088", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd7d188d3-bf", "ovs_interfaceid": "d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.480 227766 DEBUG nova.network.os_vif_util [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:49:23:e7,bridge_name='br-int',has_traffic_filtering=True,id=d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d188d3-bf') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.481 227766 DEBUG os_vif [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:49:23:e7,bridge_name='br-int',has_traffic_filtering=True,id=d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d188d3-bf') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.483 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.483 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d188d3-bf, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.485 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.486 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.489 227766 INFO os_vif [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:49:23:e7,bridge_name='br-int',has_traffic_filtering=True,id=d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728,network=Network(6d2cdc4c-47a0-475b-8e71-39465d365de3),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd7d188d3-bf')#033[00m
Jan 23 04:53:26 np0005593234 podman[262074]: 2026-01-23 09:53:26.527200451 +0000 UTC m=+0.052449699 container remove e3b031eef5164af405f4a76d5bb85a9f14fc96d374cfe9b87b6208ee14d9ca7d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 04:53:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:26.533 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ab866249-1688-438d-b8fe-1ba856e91109]: (4, ('Fri Jan 23 09:53:26 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 (e3b031eef5164af405f4a76d5bb85a9f14fc96d374cfe9b87b6208ee14d9ca7d)\ne3b031eef5164af405f4a76d5bb85a9f14fc96d374cfe9b87b6208ee14d9ca7d\nFri Jan 23 09:53:26 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 (e3b031eef5164af405f4a76d5bb85a9f14fc96d374cfe9b87b6208ee14d9ca7d)\ne3b031eef5164af405f4a76d5bb85a9f14fc96d374cfe9b87b6208ee14d9ca7d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:26.534 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[388786da-3e6b-4001-b0b6-076a00b1271a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:26.535 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6d2cdc4c-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.537 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:26 np0005593234 kernel: tap6d2cdc4c-40: left promiscuous mode
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.539 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:26.541 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[366a7571-f90a-47f2-9c4c-c11decb4652c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.551 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:26.562 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4e506f0c-f3c2-463e-bdd7-9c7a6d380a2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:26.564 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb5f7ef-6fb5-46cf-81b2-196cf310af60]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:26.579 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c7e939-dc6d-4933-b3f2-e7e8c30ba680]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 587642, 'reachable_time': 44671, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262107, 'error': None, 'target': 'ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:26 np0005593234 systemd[1]: run-netns-ovnmeta\x2d6d2cdc4c\x2d47a0\x2d475b\x2d8e71\x2d39465d365de3.mount: Deactivated successfully.
Jan 23 04:53:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:26.583 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6d2cdc4c-47a0-475b-8e71-39465d365de3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:53:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:26.583 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[9feec2e9-f2cb-462e-91a3-9582211bb528]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:53:26 np0005593234 nova_compute[227762]: 2026-01-23 09:53:26.740 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:53:27 np0005593234 nova_compute[227762]: 2026-01-23 09:53:27.045 227766 DEBUG nova.compute.manager [req-634cde0a-a1e3-4f5f-9fad-bcadc96dbb00 req-1aa69400-ee3b-4d4e-8960-a75fe0289034 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Received event network-vif-unplugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:53:27 np0005593234 nova_compute[227762]: 2026-01-23 09:53:27.046 227766 DEBUG oslo_concurrency.lockutils [req-634cde0a-a1e3-4f5f-9fad-bcadc96dbb00 req-1aa69400-ee3b-4d4e-8960-a75fe0289034 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:27 np0005593234 nova_compute[227762]: 2026-01-23 09:53:27.046 227766 DEBUG oslo_concurrency.lockutils [req-634cde0a-a1e3-4f5f-9fad-bcadc96dbb00 req-1aa69400-ee3b-4d4e-8960-a75fe0289034 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:27 np0005593234 nova_compute[227762]: 2026-01-23 09:53:27.046 227766 DEBUG oslo_concurrency.lockutils [req-634cde0a-a1e3-4f5f-9fad-bcadc96dbb00 req-1aa69400-ee3b-4d4e-8960-a75fe0289034 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:27 np0005593234 nova_compute[227762]: 2026-01-23 09:53:27.046 227766 DEBUG nova.compute.manager [req-634cde0a-a1e3-4f5f-9fad-bcadc96dbb00 req-1aa69400-ee3b-4d4e-8960-a75fe0289034 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] No waiting events found dispatching network-vif-unplugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:53:27 np0005593234 nova_compute[227762]: 2026-01-23 09:53:27.047 227766 DEBUG nova.compute.manager [req-634cde0a-a1e3-4f5f-9fad-bcadc96dbb00 req-1aa69400-ee3b-4d4e-8960-a75fe0289034 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Received event network-vif-unplugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:53:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:27.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:27.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:28 np0005593234 nova_compute[227762]: 2026-01-23 09:53:28.583 227766 INFO nova.virt.libvirt.driver [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Deleting instance files /var/lib/nova/instances/a3c08e79-4f2b-42f2-bcac-21cbcfbc5247_del
Jan 23 04:53:28 np0005593234 nova_compute[227762]: 2026-01-23 09:53:28.584 227766 INFO nova.virt.libvirt.driver [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Deletion of /var/lib/nova/instances/a3c08e79-4f2b-42f2-bcac-21cbcfbc5247_del complete
Jan 23 04:53:28 np0005593234 nova_compute[227762]: 2026-01-23 09:53:28.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:53:28 np0005593234 nova_compute[227762]: 2026-01-23 09:53:28.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 04:53:28 np0005593234 nova_compute[227762]: 2026-01-23 09:53:28.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 04:53:29 np0005593234 nova_compute[227762]: 2026-01-23 09:53:29.425 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875
Jan 23 04:53:29 np0005593234 nova_compute[227762]: 2026-01-23 09:53:29.425 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 04:53:29 np0005593234 nova_compute[227762]: 2026-01-23 09:53:29.484 227766 INFO nova.compute.manager [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Took 3.48 seconds to destroy the instance on the hypervisor.
Jan 23 04:53:29 np0005593234 nova_compute[227762]: 2026-01-23 09:53:29.485 227766 DEBUG oslo.service.loopingcall [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 04:53:29 np0005593234 nova_compute[227762]: 2026-01-23 09:53:29.486 227766 DEBUG nova.compute.manager [-] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 04:53:29 np0005593234 nova_compute[227762]: 2026-01-23 09:53:29.486 227766 DEBUG nova.network.neutron [-] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 04:53:29 np0005593234 nova_compute[227762]: 2026-01-23 09:53:29.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:53:29 np0005593234 nova_compute[227762]: 2026-01-23 09:53:29.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:53:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:53:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:29.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:53:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:53:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:29.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:53:30 np0005593234 nova_compute[227762]: 2026-01-23 09:53:30.041 227766 DEBUG nova.compute.manager [req-0d989ca6-a824-467a-988c-32a950547c5d req-4aa05279-0469-418d-abc7-dfc4ff7380e1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Received event network-vif-plugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:53:30 np0005593234 nova_compute[227762]: 2026-01-23 09:53:30.042 227766 DEBUG oslo_concurrency.lockutils [req-0d989ca6-a824-467a-988c-32a950547c5d req-4aa05279-0469-418d-abc7-dfc4ff7380e1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:53:30 np0005593234 nova_compute[227762]: 2026-01-23 09:53:30.042 227766 DEBUG oslo_concurrency.lockutils [req-0d989ca6-a824-467a-988c-32a950547c5d req-4aa05279-0469-418d-abc7-dfc4ff7380e1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:53:30 np0005593234 nova_compute[227762]: 2026-01-23 09:53:30.043 227766 DEBUG oslo_concurrency.lockutils [req-0d989ca6-a824-467a-988c-32a950547c5d req-4aa05279-0469-418d-abc7-dfc4ff7380e1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:53:30 np0005593234 nova_compute[227762]: 2026-01-23 09:53:30.043 227766 DEBUG nova.compute.manager [req-0d989ca6-a824-467a-988c-32a950547c5d req-4aa05279-0469-418d-abc7-dfc4ff7380e1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] No waiting events found dispatching network-vif-plugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:53:30 np0005593234 nova_compute[227762]: 2026-01-23 09:53:30.043 227766 WARNING nova.compute.manager [req-0d989ca6-a824-467a-988c-32a950547c5d req-4aa05279-0469-418d-abc7-dfc4ff7380e1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Received unexpected event network-vif-plugged-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 for instance with vm_state active and task_state deleting.
Jan 23 04:53:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:30 np0005593234 nova_compute[227762]: 2026-01-23 09:53:30.615 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:53:31 np0005593234 nova_compute[227762]: 2026-01-23 09:53:31.486 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:53:31 np0005593234 nova_compute[227762]: 2026-01-23 09:53:31.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:53:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:31.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:53:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:31.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:53:31 np0005593234 nova_compute[227762]: 2026-01-23 09:53:31.853 227766 DEBUG nova.network.neutron [-] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:53:31 np0005593234 nova_compute[227762]: 2026-01-23 09:53:31.890 227766 INFO nova.compute.manager [-] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Took 2.40 seconds to deallocate network for instance.
Jan 23 04:53:31 np0005593234 nova_compute[227762]: 2026-01-23 09:53:31.962 227766 DEBUG oslo_concurrency.lockutils [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:53:31 np0005593234 nova_compute[227762]: 2026-01-23 09:53:31.962 227766 DEBUG oslo_concurrency.lockutils [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:53:32 np0005593234 nova_compute[227762]: 2026-01-23 09:53:32.091 227766 DEBUG oslo_concurrency.processutils [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:53:32 np0005593234 nova_compute[227762]: 2026-01-23 09:53:32.133 227766 DEBUG nova.compute.manager [req-b0b8d409-8c5e-4275-adc7-29606a8ac4f2 req-0a31fefe-f2a2-4b34-bd78-afeeefa24691 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Received event network-vif-deleted-d7d188d3-bf5e-4df5-9ce1-6ba00d3f6728 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:53:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:53:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3148099859' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:53:32 np0005593234 nova_compute[227762]: 2026-01-23 09:53:32.505 227766 DEBUG oslo_concurrency.processutils [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:53:32 np0005593234 nova_compute[227762]: 2026-01-23 09:53:32.511 227766 DEBUG nova.compute.provider_tree [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:53:32 np0005593234 nova_compute[227762]: 2026-01-23 09:53:32.539 227766 DEBUG nova.scheduler.client.report [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:53:32 np0005593234 nova_compute[227762]: 2026-01-23 09:53:32.588 227766 DEBUG oslo_concurrency.lockutils [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.626s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:53:32 np0005593234 nova_compute[227762]: 2026-01-23 09:53:32.645 227766 INFO nova.scheduler.client.report [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Deleted allocations for instance a3c08e79-4f2b-42f2-bcac-21cbcfbc5247
Jan 23 04:53:32 np0005593234 nova_compute[227762]: 2026-01-23 09:53:32.775 227766 DEBUG oslo_concurrency.lockutils [None req-4b3ea659-cede-4234-86bb-7c5e84ca329f 0cfac2191989448ead77e75ca3910ac4 86d938c8e2bb41a79012befd500d1088 - - default default] Lock "a3c08e79-4f2b-42f2-bcac-21cbcfbc5247" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.775s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:53:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:33.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:33.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:34 np0005593234 nova_compute[227762]: 2026-01-23 09:53:34.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:53:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:35 np0005593234 nova_compute[227762]: 2026-01-23 09:53:35.618 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:53:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:35.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:35.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:35 np0005593234 nova_compute[227762]: 2026-01-23 09:53:35.801 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:53:35 np0005593234 nova_compute[227762]: 2026-01-23 09:53:35.801 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 04:53:35 np0005593234 nova_compute[227762]: 2026-01-23 09:53:35.849 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 04:53:36 np0005593234 nova_compute[227762]: 2026-01-23 09:53:36.489 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:53:36 np0005593234 nova_compute[227762]: 2026-01-23 09:53:36.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:53:36 np0005593234 nova_compute[227762]: 2026-01-23 09:53:36.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 04:53:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:37.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:37.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:39.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:39.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:40 np0005593234 podman[262162]: 2026-01-23 09:53:40.380710323 +0000 UTC m=+0.057269301 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:53:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:40 np0005593234 nova_compute[227762]: 2026-01-23 09:53:40.619 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:53:41 np0005593234 nova_compute[227762]: 2026-01-23 09:53:41.440 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162006.4384027, a3c08e79-4f2b-42f2-bcac-21cbcfbc5247 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:53:41 np0005593234 nova_compute[227762]: 2026-01-23 09:53:41.440 227766 INFO nova.compute.manager [-] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] VM Stopped (Lifecycle Event)
Jan 23 04:53:41 np0005593234 nova_compute[227762]: 2026-01-23 09:53:41.492 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:53:41 np0005593234 nova_compute[227762]: 2026-01-23 09:53:41.496 227766 DEBUG nova.compute.manager [None req-38949680-6c2d-49a7-869f-6ad331a79a04 - - - - - -] [instance: a3c08e79-4f2b-42f2-bcac-21cbcfbc5247] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:53:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:53:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:41.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:53:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:41.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:42.829 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:53:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:42.830 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:53:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:42.830 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:53:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:43.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:43.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:45 np0005593234 nova_compute[227762]: 2026-01-23 09:53:45.621 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:53:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:45.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:53:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:45.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:53:46 np0005593234 nova_compute[227762]: 2026-01-23 09:53:46.494 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:53:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:53:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:47.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:53:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:47.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:49.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:49.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:50 np0005593234 nova_compute[227762]: 2026-01-23 09:53:50.623 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:53:51 np0005593234 nova_compute[227762]: 2026-01-23 09:53:51.049 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:53:51 np0005593234 nova_compute[227762]: 2026-01-23 09:53:51.539 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:53:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:53:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:51.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:53:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:51.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:52 np0005593234 nova_compute[227762]: 2026-01-23 09:53:52.802 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "4dff175a-ea83-41a7-b707-9a974155229b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:52 np0005593234 nova_compute[227762]: 2026-01-23 09:53:52.802 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:52 np0005593234 nova_compute[227762]: 2026-01-23 09:53:52.851 227766 DEBUG nova.compute.manager [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:53:53 np0005593234 nova_compute[227762]: 2026-01-23 09:53:53.042 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:53 np0005593234 nova_compute[227762]: 2026-01-23 09:53:53.043 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:53 np0005593234 nova_compute[227762]: 2026-01-23 09:53:53.049 227766 DEBUG nova.virt.hardware [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:53:53 np0005593234 nova_compute[227762]: 2026-01-23 09:53:53.050 227766 INFO nova.compute.claims [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:53:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:53.213 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=29, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=28) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:53:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:53:53.213 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:53:53 np0005593234 nova_compute[227762]: 2026-01-23 09:53:53.214 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:53 np0005593234 nova_compute[227762]: 2026-01-23 09:53:53.432 227766 DEBUG oslo_concurrency.processutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:53.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:53.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:53 np0005593234 podman[262231]: 2026-01-23 09:53:53.846739431 +0000 UTC m=+0.138478987 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 04:53:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:53:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3219697590' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:53:53 np0005593234 nova_compute[227762]: 2026-01-23 09:53:53.900 227766 DEBUG oslo_concurrency.processutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:53 np0005593234 nova_compute[227762]: 2026-01-23 09:53:53.906 227766 DEBUG nova.compute.provider_tree [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:53:53 np0005593234 nova_compute[227762]: 2026-01-23 09:53:53.962 227766 DEBUG nova.scheduler.client.report [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:53:53 np0005593234 nova_compute[227762]: 2026-01-23 09:53:53.998 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.955s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:53 np0005593234 nova_compute[227762]: 2026-01-23 09:53:53.999 227766 DEBUG nova.compute.manager [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.116 227766 DEBUG nova.compute.manager [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.117 227766 DEBUG nova.network.neutron [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.184 227766 INFO nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.210 227766 DEBUG nova.compute.manager [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.399 227766 DEBUG nova.compute.manager [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.400 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.400 227766 INFO nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Creating image(s)#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.424 227766 DEBUG nova.storage.rbd_utils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 4dff175a-ea83-41a7-b707-9a974155229b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.448 227766 DEBUG nova.storage.rbd_utils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 4dff175a-ea83-41a7-b707-9a974155229b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.472 227766 DEBUG nova.storage.rbd_utils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 4dff175a-ea83-41a7-b707-9a974155229b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.476 227766 DEBUG oslo_concurrency.processutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.543 227766 DEBUG oslo_concurrency.processutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.545 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.545 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.546 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.571 227766 DEBUG nova.storage.rbd_utils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 4dff175a-ea83-41a7-b707-9a974155229b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.575 227766 DEBUG oslo_concurrency.processutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 4dff175a-ea83-41a7-b707-9a974155229b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:53:54 np0005593234 nova_compute[227762]: 2026-01-23 09:53:54.602 227766 DEBUG nova.policy [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '28a7a778c8ab486fb586e81bb84113be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61df91981c55482fa5c9a64686c79f9e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:53:55 np0005593234 nova_compute[227762]: 2026-01-23 09:53:55.394 227766 DEBUG oslo_concurrency.processutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 4dff175a-ea83-41a7-b707-9a974155229b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.819s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:53:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:53:55 np0005593234 nova_compute[227762]: 2026-01-23 09:53:55.471 227766 DEBUG nova.storage.rbd_utils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] resizing rbd image 4dff175a-ea83-41a7-b707-9a974155229b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:53:55 np0005593234 nova_compute[227762]: 2026-01-23 09:53:55.628 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:53:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:55.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:53:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:55.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:55 np0005593234 nova_compute[227762]: 2026-01-23 09:53:55.854 227766 DEBUG nova.objects.instance [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'migration_context' on Instance uuid 4dff175a-ea83-41a7-b707-9a974155229b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:53:55 np0005593234 nova_compute[227762]: 2026-01-23 09:53:55.877 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:53:55 np0005593234 nova_compute[227762]: 2026-01-23 09:53:55.877 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Ensure instance console log exists: /var/lib/nova/instances/4dff175a-ea83-41a7-b707-9a974155229b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:53:55 np0005593234 nova_compute[227762]: 2026-01-23 09:53:55.878 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:53:55 np0005593234 nova_compute[227762]: 2026-01-23 09:53:55.878 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:53:55 np0005593234 nova_compute[227762]: 2026-01-23 09:53:55.878 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:53:56 np0005593234 nova_compute[227762]: 2026-01-23 09:53:56.541 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:53:56 np0005593234 nova_compute[227762]: 2026-01-23 09:53:56.957 227766 DEBUG nova.network.neutron [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Successfully created port: f7cddac6-b950-4d79-8e72-aca83650b1cd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:53:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:57.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:57.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:53:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:53:59.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:53:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:53:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:53:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:59.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:00 np0005593234 nova_compute[227762]: 2026-01-23 09:54:00.626 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:54:00 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1008650236' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:54:00 np0005593234 nova_compute[227762]: 2026-01-23 09:54:00.686 227766 DEBUG nova.network.neutron [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Successfully updated port: f7cddac6-b950-4d79-8e72-aca83650b1cd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:54:00 np0005593234 nova_compute[227762]: 2026-01-23 09:54:00.719 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "refresh_cache-4dff175a-ea83-41a7-b707-9a974155229b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:00 np0005593234 nova_compute[227762]: 2026-01-23 09:54:00.719 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquired lock "refresh_cache-4dff175a-ea83-41a7-b707-9a974155229b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:00 np0005593234 nova_compute[227762]: 2026-01-23 09:54:00.719 227766 DEBUG nova.network.neutron [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:54:00 np0005593234 nova_compute[227762]: 2026-01-23 09:54:00.927 227766 DEBUG nova.compute.manager [req-9c0a54a7-24e5-4157-afc6-8b606a557665 req-d5c4b667-6121-4dc2-b615-5e11a9285220 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received event network-changed-f7cddac6-b950-4d79-8e72-aca83650b1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:00 np0005593234 nova_compute[227762]: 2026-01-23 09:54:00.927 227766 DEBUG nova.compute.manager [req-9c0a54a7-24e5-4157-afc6-8b606a557665 req-d5c4b667-6121-4dc2-b615-5e11a9285220 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Refreshing instance network info cache due to event network-changed-f7cddac6-b950-4d79-8e72-aca83650b1cd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:54:00 np0005593234 nova_compute[227762]: 2026-01-23 09:54:00.928 227766 DEBUG oslo_concurrency.lockutils [req-9c0a54a7-24e5-4157-afc6-8b606a557665 req-d5c4b667-6121-4dc2-b615-5e11a9285220 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-4dff175a-ea83-41a7-b707-9a974155229b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:01.215 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '29'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:01 np0005593234 nova_compute[227762]: 2026-01-23 09:54:01.292 227766 DEBUG nova.network.neutron [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:54:01 np0005593234 nova_compute[227762]: 2026-01-23 09:54:01.543 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:01.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:01.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:03.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:54:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/169981653' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:54:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:03.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.235 227766 DEBUG nova.network.neutron [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Updating instance_info_cache with network_info: [{"id": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "address": "fa:16:3e:03:8e:97", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7cddac6-b9", "ovs_interfaceid": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.271 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Releasing lock "refresh_cache-4dff175a-ea83-41a7-b707-9a974155229b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.272 227766 DEBUG nova.compute.manager [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Instance network_info: |[{"id": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "address": "fa:16:3e:03:8e:97", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7cddac6-b9", "ovs_interfaceid": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.272 227766 DEBUG oslo_concurrency.lockutils [req-9c0a54a7-24e5-4157-afc6-8b606a557665 req-d5c4b667-6121-4dc2-b615-5e11a9285220 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-4dff175a-ea83-41a7-b707-9a974155229b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.272 227766 DEBUG nova.network.neutron [req-9c0a54a7-24e5-4157-afc6-8b606a557665 req-d5c4b667-6121-4dc2-b615-5e11a9285220 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Refreshing network info cache for port f7cddac6-b950-4d79-8e72-aca83650b1cd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.275 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Start _get_guest_xml network_info=[{"id": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "address": "fa:16:3e:03:8e:97", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7cddac6-b9", "ovs_interfaceid": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.279 227766 WARNING nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.285 227766 DEBUG nova.virt.libvirt.host [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.285 227766 DEBUG nova.virt.libvirt.host [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.288 227766 DEBUG nova.virt.libvirt.host [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.289 227766 DEBUG nova.virt.libvirt.host [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.290 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.290 227766 DEBUG nova.virt.hardware [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.291 227766 DEBUG nova.virt.hardware [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.291 227766 DEBUG nova.virt.hardware [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.291 227766 DEBUG nova.virt.hardware [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.291 227766 DEBUG nova.virt.hardware [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.291 227766 DEBUG nova.virt.hardware [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.292 227766 DEBUG nova.virt.hardware [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.292 227766 DEBUG nova.virt.hardware [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.292 227766 DEBUG nova.virt.hardware [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.292 227766 DEBUG nova.virt.hardware [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.293 227766 DEBUG nova.virt.hardware [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.295 227766 DEBUG oslo_concurrency.processutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:54:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3622285005' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.719 227766 DEBUG oslo_concurrency.processutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.741 227766 DEBUG nova.storage.rbd_utils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 4dff175a-ea83-41a7-b707-9a974155229b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:54:04 np0005593234 nova_compute[227762]: 2026-01-23 09:54:04.745 227766 DEBUG oslo_concurrency.processutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:54:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/731173930' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.242 227766 DEBUG oslo_concurrency.processutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.244 227766 DEBUG nova.virt.libvirt.vif [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:53:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-183505836',display_name='tempest-DeleteServersTestJSON-server-183505836',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-183505836',id=80,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-mz31r5t0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:54Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=4dff175a-ea83-41a7-b707-9a974155229b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "address": "fa:16:3e:03:8e:97", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7cddac6-b9", "ovs_interfaceid": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.244 227766 DEBUG nova.network.os_vif_util [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "address": "fa:16:3e:03:8e:97", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7cddac6-b9", "ovs_interfaceid": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.245 227766 DEBUG nova.network.os_vif_util [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:8e:97,bridge_name='br-int',has_traffic_filtering=True,id=f7cddac6-b950-4d79-8e72-aca83650b1cd,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7cddac6-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.247 227766 DEBUG nova.objects.instance [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'pci_devices' on Instance uuid 4dff175a-ea83-41a7-b707-9a974155229b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.311 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  <uuid>4dff175a-ea83-41a7-b707-9a974155229b</uuid>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  <name>instance-00000050</name>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <nova:name>tempest-DeleteServersTestJSON-server-183505836</nova:name>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:54:04</nova:creationTime>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <nova:user uuid="28a7a778c8ab486fb586e81bb84113be">tempest-DeleteServersTestJSON-944070453-project-member</nova:user>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <nova:project uuid="61df91981c55482fa5c9a64686c79f9e">tempest-DeleteServersTestJSON-944070453</nova:project>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <nova:port uuid="f7cddac6-b950-4d79-8e72-aca83650b1cd">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <entry name="serial">4dff175a-ea83-41a7-b707-9a974155229b</entry>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <entry name="uuid">4dff175a-ea83-41a7-b707-9a974155229b</entry>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/4dff175a-ea83-41a7-b707-9a974155229b_disk">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/4dff175a-ea83-41a7-b707-9a974155229b_disk.config">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:03:8e:97"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <target dev="tapf7cddac6-b9"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/4dff175a-ea83-41a7-b707-9a974155229b/console.log" append="off"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:54:05 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:54:05 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:54:05 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:54:05 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.313 227766 DEBUG nova.compute.manager [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Preparing to wait for external event network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.313 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "4dff175a-ea83-41a7-b707-9a974155229b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.314 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.314 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.315 227766 DEBUG nova.virt.libvirt.vif [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:53:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-183505836',display_name='tempest-DeleteServersTestJSON-server-183505836',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-183505836',id=80,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-mz31r5t0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServers
TestJSON-944070453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:53:54Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=4dff175a-ea83-41a7-b707-9a974155229b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "address": "fa:16:3e:03:8e:97", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7cddac6-b9", "ovs_interfaceid": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.315 227766 DEBUG nova.network.os_vif_util [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "address": "fa:16:3e:03:8e:97", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7cddac6-b9", "ovs_interfaceid": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.316 227766 DEBUG nova.network.os_vif_util [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:8e:97,bridge_name='br-int',has_traffic_filtering=True,id=f7cddac6-b950-4d79-8e72-aca83650b1cd,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7cddac6-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.316 227766 DEBUG os_vif [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:8e:97,bridge_name='br-int',has_traffic_filtering=True,id=f7cddac6-b950-4d79-8e72-aca83650b1cd,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7cddac6-b9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.317 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.317 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.318 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.322 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.322 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7cddac6-b9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.323 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf7cddac6-b9, col_values=(('external_ids', {'iface-id': 'f7cddac6-b950-4d79-8e72-aca83650b1cd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:03:8e:97', 'vm-uuid': '4dff175a-ea83-41a7-b707-9a974155229b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:05 np0005593234 NetworkManager[48942]: <info>  [1769162045.3658] manager: (tapf7cddac6-b9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/135)
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.365 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.368 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.372 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.373 227766 INFO os_vif [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:8e:97,bridge_name='br-int',has_traffic_filtering=True,id=f7cddac6-b950-4d79-8e72-aca83650b1cd,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7cddac6-b9')#033[00m
Jan 23 04:54:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.547 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.548 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.548 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No VIF found with MAC fa:16:3e:03:8e:97, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.549 227766 INFO nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Using config drive#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.580 227766 DEBUG nova.storage.rbd_utils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 4dff175a-ea83-41a7-b707-9a974155229b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:54:05 np0005593234 nova_compute[227762]: 2026-01-23 09:54:05.629 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:05.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:54:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:05.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:54:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:54:06 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4085021664' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:54:07 np0005593234 nova_compute[227762]: 2026-01-23 09:54:07.295 227766 INFO nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Creating config drive at /var/lib/nova/instances/4dff175a-ea83-41a7-b707-9a974155229b/disk.config#033[00m
Jan 23 04:54:07 np0005593234 nova_compute[227762]: 2026-01-23 09:54:07.300 227766 DEBUG oslo_concurrency.processutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4dff175a-ea83-41a7-b707-9a974155229b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpenu9tei2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:07 np0005593234 nova_compute[227762]: 2026-01-23 09:54:07.427 227766 DEBUG oslo_concurrency.processutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4dff175a-ea83-41a7-b707-9a974155229b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpenu9tei2" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:07 np0005593234 nova_compute[227762]: 2026-01-23 09:54:07.450 227766 DEBUG nova.storage.rbd_utils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image 4dff175a-ea83-41a7-b707-9a974155229b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:54:07 np0005593234 nova_compute[227762]: 2026-01-23 09:54:07.453 227766 DEBUG oslo_concurrency.processutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4dff175a-ea83-41a7-b707-9a974155229b/disk.config 4dff175a-ea83-41a7-b707-9a974155229b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:07 np0005593234 nova_compute[227762]: 2026-01-23 09:54:07.770 227766 DEBUG oslo_concurrency.processutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4dff175a-ea83-41a7-b707-9a974155229b/disk.config 4dff175a-ea83-41a7-b707-9a974155229b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.317s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:07 np0005593234 nova_compute[227762]: 2026-01-23 09:54:07.771 227766 INFO nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Deleting local config drive /var/lib/nova/instances/4dff175a-ea83-41a7-b707-9a974155229b/disk.config because it was imported into RBD.#033[00m
Jan 23 04:54:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:07.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:07 np0005593234 kernel: tapf7cddac6-b9: entered promiscuous mode
Jan 23 04:54:07 np0005593234 NetworkManager[48942]: <info>  [1769162047.8187] manager: (tapf7cddac6-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/136)
Jan 23 04:54:07 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:07Z|00260|binding|INFO|Claiming lport f7cddac6-b950-4d79-8e72-aca83650b1cd for this chassis.
Jan 23 04:54:07 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:07Z|00261|binding|INFO|f7cddac6-b950-4d79-8e72-aca83650b1cd: Claiming fa:16:3e:03:8e:97 10.100.0.11
Jan 23 04:54:07 np0005593234 nova_compute[227762]: 2026-01-23 09:54:07.820 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:07 np0005593234 nova_compute[227762]: 2026-01-23 09:54:07.822 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:07.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.839 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:8e:97 10.100.0.11'], port_security=['fa:16:3e:03:8e:97 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4dff175a-ea83-41a7-b707-9a974155229b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=f7cddac6-b950-4d79-8e72-aca83650b1cd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.841 144381 INFO neutron.agent.ovn.metadata.agent [-] Port f7cddac6-b950-4d79-8e72-aca83650b1cd in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 bound to our chassis#033[00m
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.843 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a3788149-efcd-4940-8a8f-e21af0a56a06#033[00m
Jan 23 04:54:07 np0005593234 systemd-udevd[262620]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.855 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[803e0a28-54ac-4497-91c7-9fe16b32df52]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.856 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa3788149-e1 in ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.858 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa3788149-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.858 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d7cf6a9b-94c0-4964-a821-a6c68d24a7f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:07 np0005593234 systemd-machined[195626]: New machine qemu-33-instance-00000050.
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.859 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8f33e1e6-818c-4796-a040-f6eff805a69d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:07 np0005593234 NetworkManager[48942]: <info>  [1769162047.8642] device (tapf7cddac6-b9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:54:07 np0005593234 NetworkManager[48942]: <info>  [1769162047.8648] device (tapf7cddac6-b9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.870 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[77b97c54-c757-4c43-b388-ed6ff5c8a455]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:07 np0005593234 nova_compute[227762]: 2026-01-23 09:54:07.881 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.881 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a4d311b4-f39c-43b1-9064-ab23f19cf73b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:07 np0005593234 systemd[1]: Started Virtual Machine qemu-33-instance-00000050.
Jan 23 04:54:07 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:07Z|00262|binding|INFO|Setting lport f7cddac6-b950-4d79-8e72-aca83650b1cd ovn-installed in OVS
Jan 23 04:54:07 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:07Z|00263|binding|INFO|Setting lport f7cddac6-b950-4d79-8e72-aca83650b1cd up in Southbound
Jan 23 04:54:07 np0005593234 nova_compute[227762]: 2026-01-23 09:54:07.890 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.914 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[453ce28e-aa7e-46cf-a9a9-d4adb7c6166e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:07 np0005593234 systemd-udevd[262624]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.919 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[97844f55-2bcc-4684-a219-7f98dcf8403d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:07 np0005593234 NetworkManager[48942]: <info>  [1769162047.9203] manager: (tapa3788149-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/137)
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.949 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d2b105df-2cf8-4d87-927e-72007d901961]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.952 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[5dbdf08f-36fa-48a5-9520-e2cecd2846c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:07 np0005593234 NetworkManager[48942]: <info>  [1769162047.9731] device (tapa3788149-e0): carrier: link connected
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.978 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[bb50fcd4-a377-4fad-999f-f3beb2ca5d38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:07.996 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[832ebf00-27ee-4b1a-9b66-2a20127c7bac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 594293, 'reachable_time': 40182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 262653, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:08.011 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[121b9339-14e4-444d-ba88-55548b232274]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:ddff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 594293, 'tstamp': 594293}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 262654, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:08.026 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aadfa581-aa71-4c5b-aafc-922eef03acec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 594293, 'reachable_time': 40182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 262655, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:08.052 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[561227b8-64d4-474d-9343-96aef7539b03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:08.104 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d2bcde00-e959-4361-ad1a-3944c802f286]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:08.108 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:08.109 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:08.109 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3788149-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:08 np0005593234 kernel: tapa3788149-e0: entered promiscuous mode
Jan 23 04:54:08 np0005593234 nova_compute[227762]: 2026-01-23 09:54:08.111 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:08 np0005593234 nova_compute[227762]: 2026-01-23 09:54:08.113 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:08 np0005593234 NetworkManager[48942]: <info>  [1769162048.1140] manager: (tapa3788149-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:08.114 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa3788149-e0, col_values=(('external_ids', {'iface-id': 'd6ce7fd1-128d-488f-94e6-68332f7a8a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:08 np0005593234 nova_compute[227762]: 2026-01-23 09:54:08.116 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:08Z|00264|binding|INFO|Releasing lport d6ce7fd1-128d-488f-94e6-68332f7a8a6b from this chassis (sb_readonly=0)
Jan 23 04:54:08 np0005593234 nova_compute[227762]: 2026-01-23 09:54:08.128 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:08.129 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:08.130 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5aab8848-ab74-400d-8b60-95ad177f1612]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:08.130 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:54:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:08.131 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'env', 'PROCESS_TAG=haproxy-a3788149-efcd-4940-8a8f-e21af0a56a06', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a3788149-efcd-4940-8a8f-e21af0a56a06.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:54:08 np0005593234 podman[262705]: 2026-01-23 09:54:08.470505108 +0000 UTC m=+0.049026631 container create 9c1683c89099fc07899db5b525f733521b520ea300dc95605d35aaa72b94d43d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:54:08 np0005593234 systemd[1]: Started libpod-conmon-9c1683c89099fc07899db5b525f733521b520ea300dc95605d35aaa72b94d43d.scope.
Jan 23 04:54:08 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:54:08 np0005593234 podman[262705]: 2026-01-23 09:54:08.44464028 +0000 UTC m=+0.023161833 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:54:08 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/307475ba68338b296f5bbe602ced8c215a8e6eedecb54f2f4c74e52beaf4c36b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:08 np0005593234 podman[262705]: 2026-01-23 09:54:08.55570433 +0000 UTC m=+0.134225883 container init 9c1683c89099fc07899db5b525f733521b520ea300dc95605d35aaa72b94d43d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:54:08 np0005593234 podman[262705]: 2026-01-23 09:54:08.560536141 +0000 UTC m=+0.139057664 container start 9c1683c89099fc07899db5b525f733521b520ea300dc95605d35aaa72b94d43d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:54:08 np0005593234 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[262738]: [NOTICE]   (262747) : New worker (262749) forked
Jan 23 04:54:08 np0005593234 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[262738]: [NOTICE]   (262747) : Loading success.
Jan 23 04:54:08 np0005593234 nova_compute[227762]: 2026-01-23 09:54:08.631 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162048.6312892, 4dff175a-ea83-41a7-b707-9a974155229b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:54:08 np0005593234 nova_compute[227762]: 2026-01-23 09:54:08.632 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] VM Started (Lifecycle Event)#033[00m
Jan 23 04:54:08 np0005593234 nova_compute[227762]: 2026-01-23 09:54:08.666 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:54:08 np0005593234 nova_compute[227762]: 2026-01-23 09:54:08.672 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162048.6315062, 4dff175a-ea83-41a7-b707-9a974155229b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:54:08 np0005593234 nova_compute[227762]: 2026-01-23 09:54:08.672 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:54:08 np0005593234 nova_compute[227762]: 2026-01-23 09:54:08.696 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:54:08 np0005593234 nova_compute[227762]: 2026-01-23 09:54:08.701 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:54:08 np0005593234 nova_compute[227762]: 2026-01-23 09:54:08.731 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:54:09 np0005593234 nova_compute[227762]: 2026-01-23 09:54:09.659 227766 DEBUG nova.network.neutron [req-9c0a54a7-24e5-4157-afc6-8b606a557665 req-d5c4b667-6121-4dc2-b615-5e11a9285220 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Updated VIF entry in instance network info cache for port f7cddac6-b950-4d79-8e72-aca83650b1cd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:54:09 np0005593234 nova_compute[227762]: 2026-01-23 09:54:09.659 227766 DEBUG nova.network.neutron [req-9c0a54a7-24e5-4157-afc6-8b606a557665 req-d5c4b667-6121-4dc2-b615-5e11a9285220 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Updating instance_info_cache with network_info: [{"id": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "address": "fa:16:3e:03:8e:97", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7cddac6-b9", "ovs_interfaceid": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:09 np0005593234 nova_compute[227762]: 2026-01-23 09:54:09.693 227766 DEBUG oslo_concurrency.lockutils [req-9c0a54a7-24e5-4157-afc6-8b606a557665 req-d5c4b667-6121-4dc2-b615-5e11a9285220 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-4dff175a-ea83-41a7-b707-9a974155229b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:54:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:09.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:09.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:09 np0005593234 nova_compute[227762]: 2026-01-23 09:54:09.992 227766 DEBUG nova.compute.manager [req-d34c4c29-0b5d-4a62-b71b-2e0f08665526 req-9d1389cb-1d12-47ca-b866-1da2171cfebf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received event network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:09 np0005593234 nova_compute[227762]: 2026-01-23 09:54:09.993 227766 DEBUG oslo_concurrency.lockutils [req-d34c4c29-0b5d-4a62-b71b-2e0f08665526 req-9d1389cb-1d12-47ca-b866-1da2171cfebf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4dff175a-ea83-41a7-b707-9a974155229b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:09 np0005593234 nova_compute[227762]: 2026-01-23 09:54:09.993 227766 DEBUG oslo_concurrency.lockutils [req-d34c4c29-0b5d-4a62-b71b-2e0f08665526 req-9d1389cb-1d12-47ca-b866-1da2171cfebf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:09 np0005593234 nova_compute[227762]: 2026-01-23 09:54:09.993 227766 DEBUG oslo_concurrency.lockutils [req-d34c4c29-0b5d-4a62-b71b-2e0f08665526 req-9d1389cb-1d12-47ca-b866-1da2171cfebf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:09 np0005593234 nova_compute[227762]: 2026-01-23 09:54:09.993 227766 DEBUG nova.compute.manager [req-d34c4c29-0b5d-4a62-b71b-2e0f08665526 req-9d1389cb-1d12-47ca-b866-1da2171cfebf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Processing event network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:54:09 np0005593234 nova_compute[227762]: 2026-01-23 09:54:09.994 227766 DEBUG nova.compute.manager [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:54:09 np0005593234 nova_compute[227762]: 2026-01-23 09:54:09.997 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162049.997651, 4dff175a-ea83-41a7-b707-9a974155229b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:54:09 np0005593234 nova_compute[227762]: 2026-01-23 09:54:09.998 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:54:09 np0005593234 nova_compute[227762]: 2026-01-23 09:54:09.999 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.005 227766 INFO nova.virt.libvirt.driver [-] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Instance spawned successfully.#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.005 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.025 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.032 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.035 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.036 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.036 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.036 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.037 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.037 227766 DEBUG nova.virt.libvirt.driver [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.067 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.185 227766 INFO nova.compute.manager [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Took 15.79 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.185 227766 DEBUG nova.compute.manager [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.277 227766 INFO nova.compute.manager [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Took 17.33 seconds to build instance.#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.368 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.408 227766 DEBUG oslo_concurrency.lockutils [None req-55787351-1b24-4b92-a7ce-b527777c0427 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:10 np0005593234 nova_compute[227762]: 2026-01-23 09:54:10.631 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:10 np0005593234 podman[262760]: 2026-01-23 09:54:10.767521404 +0000 UTC m=+0.054745302 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 04:54:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:11.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:11.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:12 np0005593234 nova_compute[227762]: 2026-01-23 09:54:12.261 227766 DEBUG nova.compute.manager [req-c91206e5-6498-4307-8592-7d38f486a588 req-8cdcf615-0d8c-4a75-9f04-e97181befffc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received event network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:12 np0005593234 nova_compute[227762]: 2026-01-23 09:54:12.262 227766 DEBUG oslo_concurrency.lockutils [req-c91206e5-6498-4307-8592-7d38f486a588 req-8cdcf615-0d8c-4a75-9f04-e97181befffc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4dff175a-ea83-41a7-b707-9a974155229b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:12 np0005593234 nova_compute[227762]: 2026-01-23 09:54:12.263 227766 DEBUG oslo_concurrency.lockutils [req-c91206e5-6498-4307-8592-7d38f486a588 req-8cdcf615-0d8c-4a75-9f04-e97181befffc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:12 np0005593234 nova_compute[227762]: 2026-01-23 09:54:12.263 227766 DEBUG oslo_concurrency.lockutils [req-c91206e5-6498-4307-8592-7d38f486a588 req-8cdcf615-0d8c-4a75-9f04-e97181befffc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:12 np0005593234 nova_compute[227762]: 2026-01-23 09:54:12.263 227766 DEBUG nova.compute.manager [req-c91206e5-6498-4307-8592-7d38f486a588 req-8cdcf615-0d8c-4a75-9f04-e97181befffc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] No waiting events found dispatching network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:54:12 np0005593234 nova_compute[227762]: 2026-01-23 09:54:12.263 227766 WARNING nova.compute.manager [req-c91206e5-6498-4307-8592-7d38f486a588 req-8cdcf615-0d8c-4a75-9f04-e97181befffc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received unexpected event network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd for instance with vm_state active and task_state None.#033[00m
Jan 23 04:54:12 np0005593234 nova_compute[227762]: 2026-01-23 09:54:12.664 227766 DEBUG nova.objects.instance [None req-e789f015-650a-46c0-984d-e0a8bbca0b33 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'pci_devices' on Instance uuid 4dff175a-ea83-41a7-b707-9a974155229b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:12 np0005593234 nova_compute[227762]: 2026-01-23 09:54:12.698 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162052.6983058, 4dff175a-ea83-41a7-b707-9a974155229b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:54:12 np0005593234 nova_compute[227762]: 2026-01-23 09:54:12.699 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:54:12 np0005593234 nova_compute[227762]: 2026-01-23 09:54:12.730 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:54:12 np0005593234 nova_compute[227762]: 2026-01-23 09:54:12.734 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:54:12 np0005593234 nova_compute[227762]: 2026-01-23 09:54:12.773 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 23 04:54:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:54:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 04:54:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:54:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 04:54:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:54:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:54:13 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1272632667' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:54:13 np0005593234 kernel: tapf7cddac6-b9 (unregistering): left promiscuous mode
Jan 23 04:54:13 np0005593234 NetworkManager[48942]: <info>  [1769162053.2935] device (tapf7cddac6-b9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:54:13 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:13Z|00265|binding|INFO|Releasing lport f7cddac6-b950-4d79-8e72-aca83650b1cd from this chassis (sb_readonly=0)
Jan 23 04:54:13 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:13Z|00266|binding|INFO|Setting lport f7cddac6-b950-4d79-8e72-aca83650b1cd down in Southbound
Jan 23 04:54:13 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:13Z|00267|binding|INFO|Removing iface tapf7cddac6-b9 ovn-installed in OVS
Jan 23 04:54:13 np0005593234 nova_compute[227762]: 2026-01-23 09:54:13.302 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:13 np0005593234 nova_compute[227762]: 2026-01-23 09:54:13.303 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:13 np0005593234 nova_compute[227762]: 2026-01-23 09:54:13.319 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.338 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:8e:97 10.100.0.11'], port_security=['fa:16:3e:03:8e:97 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4dff175a-ea83-41a7-b707-9a974155229b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=f7cddac6-b950-4d79-8e72-aca83650b1cd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.339 144381 INFO neutron.agent.ovn.metadata.agent [-] Port f7cddac6-b950-4d79-8e72-aca83650b1cd in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 unbound from our chassis#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.341 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a3788149-efcd-4940-8a8f-e21af0a56a06, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.342 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ba19e13b-22af-4a3c-9950-94ab7ccc7c94]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.342 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace which is not needed anymore#033[00m
Jan 23 04:54:13 np0005593234 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000050.scope: Deactivated successfully.
Jan 23 04:54:13 np0005593234 systemd[1]: machine-qemu\x2d33\x2dinstance\x2d00000050.scope: Consumed 3.515s CPU time.
Jan 23 04:54:13 np0005593234 systemd-machined[195626]: Machine qemu-33-instance-00000050 terminated.
Jan 23 04:54:13 np0005593234 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[262738]: [NOTICE]   (262747) : haproxy version is 2.8.14-c23fe91
Jan 23 04:54:13 np0005593234 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[262738]: [NOTICE]   (262747) : path to executable is /usr/sbin/haproxy
Jan 23 04:54:13 np0005593234 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[262738]: [WARNING]  (262747) : Exiting Master process...
Jan 23 04:54:13 np0005593234 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[262738]: [ALERT]    (262747) : Current worker (262749) exited with code 143 (Terminated)
Jan 23 04:54:13 np0005593234 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[262738]: [WARNING]  (262747) : All workers exited. Exiting... (0)
Jan 23 04:54:13 np0005593234 systemd[1]: libpod-9c1683c89099fc07899db5b525f733521b520ea300dc95605d35aaa72b94d43d.scope: Deactivated successfully.
Jan 23 04:54:13 np0005593234 podman[263058]: 2026-01-23 09:54:13.477378204 +0000 UTC m=+0.046883877 container died 9c1683c89099fc07899db5b525f733521b520ea300dc95605d35aaa72b94d43d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 04:54:13 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9c1683c89099fc07899db5b525f733521b520ea300dc95605d35aaa72b94d43d-userdata-shm.mount: Deactivated successfully.
Jan 23 04:54:13 np0005593234 systemd[1]: var-lib-containers-storage-overlay-307475ba68338b296f5bbe602ced8c215a8e6eedecb54f2f4c74e52beaf4c36b-merged.mount: Deactivated successfully.
Jan 23 04:54:13 np0005593234 podman[263058]: 2026-01-23 09:54:13.53396509 +0000 UTC m=+0.103470743 container cleanup 9c1683c89099fc07899db5b525f733521b520ea300dc95605d35aaa72b94d43d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:54:13 np0005593234 systemd[1]: libpod-conmon-9c1683c89099fc07899db5b525f733521b520ea300dc95605d35aaa72b94d43d.scope: Deactivated successfully.
Jan 23 04:54:13 np0005593234 kernel: tapf7cddac6-b9: entered promiscuous mode
Jan 23 04:54:13 np0005593234 kernel: tapf7cddac6-b9 (unregistering): left promiscuous mode
Jan 23 04:54:13 np0005593234 NetworkManager[48942]: <info>  [1769162053.5462] manager: (tapf7cddac6-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/139)
Jan 23 04:54:13 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:13Z|00268|binding|INFO|Claiming lport f7cddac6-b950-4d79-8e72-aca83650b1cd for this chassis.
Jan 23 04:54:13 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:13Z|00269|binding|INFO|f7cddac6-b950-4d79-8e72-aca83650b1cd: Claiming fa:16:3e:03:8e:97 10.100.0.11
Jan 23 04:54:13 np0005593234 nova_compute[227762]: 2026-01-23 09:54:13.585 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.601 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:8e:97 10.100.0.11'], port_security=['fa:16:3e:03:8e:97 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4dff175a-ea83-41a7-b707-9a974155229b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=f7cddac6-b950-4d79-8e72-aca83650b1cd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:54:13 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:13Z|00270|binding|INFO|Setting lport f7cddac6-b950-4d79-8e72-aca83650b1cd ovn-installed in OVS
Jan 23 04:54:13 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:13Z|00271|binding|INFO|Setting lport f7cddac6-b950-4d79-8e72-aca83650b1cd up in Southbound
Jan 23 04:54:13 np0005593234 nova_compute[227762]: 2026-01-23 09:54:13.604 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:13 np0005593234 nova_compute[227762]: 2026-01-23 09:54:13.606 227766 DEBUG nova.compute.manager [None req-e789f015-650a-46c0-984d-e0a8bbca0b33 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:54:13 np0005593234 podman[263089]: 2026-01-23 09:54:13.639094434 +0000 UTC m=+0.043858100 container remove 9c1683c89099fc07899db5b525f733521b520ea300dc95605d35aaa72b94d43d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.644 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[924cb60d-85b4-4aee-8c9a-fbf5a6b4647d]: (4, ('Fri Jan 23 09:54:13 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (9c1683c89099fc07899db5b525f733521b520ea300dc95605d35aaa72b94d43d)\n9c1683c89099fc07899db5b525f733521b520ea300dc95605d35aaa72b94d43d\nFri Jan 23 09:54:13 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (9c1683c89099fc07899db5b525f733521b520ea300dc95605d35aaa72b94d43d)\n9c1683c89099fc07899db5b525f733521b520ea300dc95605d35aaa72b94d43d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.646 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[73b1de14-271a-4b86-b46a-0fcba31ea79a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.647 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:13 np0005593234 kernel: tapa3788149-e0: left promiscuous mode
Jan 23 04:54:13 np0005593234 nova_compute[227762]: 2026-01-23 09:54:13.650 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:13 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:13Z|00272|binding|INFO|Releasing lport f7cddac6-b950-4d79-8e72-aca83650b1cd from this chassis (sb_readonly=0)
Jan 23 04:54:13 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:13Z|00273|binding|INFO|Setting lport f7cddac6-b950-4d79-8e72-aca83650b1cd down in Southbound
Jan 23 04:54:13 np0005593234 nova_compute[227762]: 2026-01-23 09:54:13.664 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:13 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:13Z|00274|binding|INFO|Removing iface tapf7cddac6-b9 ovn-installed in OVS
Jan 23 04:54:13 np0005593234 nova_compute[227762]: 2026-01-23 09:54:13.667 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.668 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[02584d19-04f0-4ac4-981d-7d31a7988552]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.675 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:03:8e:97 10.100.0.11'], port_security=['fa:16:3e:03:8e:97 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '4dff175a-ea83-41a7-b707-9a974155229b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=f7cddac6-b950-4d79-8e72-aca83650b1cd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:54:13 np0005593234 nova_compute[227762]: 2026-01-23 09:54:13.682 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.686 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9c08c331-d928-416c-8401-6e908bb4e0c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.687 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8b37de88-d93d-4f41-b9f3-167cea871135]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.702 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[19b11c98-d7b6-4b47-ae64-e63e492e8a03]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 594286, 'reachable_time': 25195, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263115, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.704 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.704 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[fc77ccb1-7fe5-478b-8bfa-2bc9002c0b95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.705 144381 INFO neutron.agent.ovn.metadata.agent [-] Port f7cddac6-b950-4d79-8e72-aca83650b1cd in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 unbound from our chassis#033[00m
Jan 23 04:54:13 np0005593234 systemd[1]: run-netns-ovnmeta\x2da3788149\x2defcd\x2d4940\x2d8a8f\x2de21af0a56a06.mount: Deactivated successfully.
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.706 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a3788149-efcd-4940-8a8f-e21af0a56a06, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.707 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[eeb441f5-501e-4ceb-8e8f-f6d8e8816adc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.707 144381 INFO neutron.agent.ovn.metadata.agent [-] Port f7cddac6-b950-4d79-8e72-aca83650b1cd in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 unbound from our chassis#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.709 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a3788149-efcd-4940-8a8f-e21af0a56a06, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:54:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:13.709 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a8e6ef29-a96a-463f-b786-ea86b5be69fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:13.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:13.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:14 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:54:14 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:54:14 np0005593234 nova_compute[227762]: 2026-01-23 09:54:14.559 227766 DEBUG nova.compute.manager [req-eb60266d-9edb-4ed2-ab97-9c7088941d2b req-503957d1-de72-4b62-8b27-5bc715d4673b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received event network-vif-unplugged-f7cddac6-b950-4d79-8e72-aca83650b1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:14 np0005593234 nova_compute[227762]: 2026-01-23 09:54:14.559 227766 DEBUG oslo_concurrency.lockutils [req-eb60266d-9edb-4ed2-ab97-9c7088941d2b req-503957d1-de72-4b62-8b27-5bc715d4673b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4dff175a-ea83-41a7-b707-9a974155229b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:14 np0005593234 nova_compute[227762]: 2026-01-23 09:54:14.559 227766 DEBUG oslo_concurrency.lockutils [req-eb60266d-9edb-4ed2-ab97-9c7088941d2b req-503957d1-de72-4b62-8b27-5bc715d4673b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:14 np0005593234 nova_compute[227762]: 2026-01-23 09:54:14.560 227766 DEBUG oslo_concurrency.lockutils [req-eb60266d-9edb-4ed2-ab97-9c7088941d2b req-503957d1-de72-4b62-8b27-5bc715d4673b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:14 np0005593234 nova_compute[227762]: 2026-01-23 09:54:14.560 227766 DEBUG nova.compute.manager [req-eb60266d-9edb-4ed2-ab97-9c7088941d2b req-503957d1-de72-4b62-8b27-5bc715d4673b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] No waiting events found dispatching network-vif-unplugged-f7cddac6-b950-4d79-8e72-aca83650b1cd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:54:14 np0005593234 nova_compute[227762]: 2026-01-23 09:54:14.560 227766 WARNING nova.compute.manager [req-eb60266d-9edb-4ed2-ab97-9c7088941d2b req-503957d1-de72-4b62-8b27-5bc715d4673b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received unexpected event network-vif-unplugged-f7cddac6-b950-4d79-8e72-aca83650b1cd for instance with vm_state suspended and task_state None.#033[00m
Jan 23 04:54:14 np0005593234 nova_compute[227762]: 2026-01-23 09:54:14.560 227766 DEBUG nova.compute.manager [req-eb60266d-9edb-4ed2-ab97-9c7088941d2b req-503957d1-de72-4b62-8b27-5bc715d4673b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received event network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:14 np0005593234 nova_compute[227762]: 2026-01-23 09:54:14.560 227766 DEBUG oslo_concurrency.lockutils [req-eb60266d-9edb-4ed2-ab97-9c7088941d2b req-503957d1-de72-4b62-8b27-5bc715d4673b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4dff175a-ea83-41a7-b707-9a974155229b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:14 np0005593234 nova_compute[227762]: 2026-01-23 09:54:14.560 227766 DEBUG oslo_concurrency.lockutils [req-eb60266d-9edb-4ed2-ab97-9c7088941d2b req-503957d1-de72-4b62-8b27-5bc715d4673b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:14 np0005593234 nova_compute[227762]: 2026-01-23 09:54:14.562 227766 DEBUG oslo_concurrency.lockutils [req-eb60266d-9edb-4ed2-ab97-9c7088941d2b req-503957d1-de72-4b62-8b27-5bc715d4673b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:14 np0005593234 nova_compute[227762]: 2026-01-23 09:54:14.562 227766 DEBUG nova.compute.manager [req-eb60266d-9edb-4ed2-ab97-9c7088941d2b req-503957d1-de72-4b62-8b27-5bc715d4673b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] No waiting events found dispatching network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:54:14 np0005593234 nova_compute[227762]: 2026-01-23 09:54:14.562 227766 WARNING nova.compute.manager [req-eb60266d-9edb-4ed2-ab97-9c7088941d2b req-503957d1-de72-4b62-8b27-5bc715d4673b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received unexpected event network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd for instance with vm_state suspended and task_state None.#033[00m
Jan 23 04:54:15 np0005593234 nova_compute[227762]: 2026-01-23 09:54:15.370 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:15 np0005593234 nova_compute[227762]: 2026-01-23 09:54:15.633 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:15.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:15.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.206 227766 DEBUG nova.compute.manager [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received event network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.206 227766 DEBUG oslo_concurrency.lockutils [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4dff175a-ea83-41a7-b707-9a974155229b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.206 227766 DEBUG oslo_concurrency.lockutils [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.207 227766 DEBUG oslo_concurrency.lockutils [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.207 227766 DEBUG nova.compute.manager [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] No waiting events found dispatching network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.207 227766 WARNING nova.compute.manager [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received unexpected event network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd for instance with vm_state suspended and task_state None.#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.207 227766 DEBUG nova.compute.manager [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received event network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.207 227766 DEBUG oslo_concurrency.lockutils [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4dff175a-ea83-41a7-b707-9a974155229b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.208 227766 DEBUG oslo_concurrency.lockutils [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.208 227766 DEBUG oslo_concurrency.lockutils [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.208 227766 DEBUG nova.compute.manager [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] No waiting events found dispatching network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.208 227766 WARNING nova.compute.manager [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received unexpected event network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd for instance with vm_state suspended and task_state None.#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.208 227766 DEBUG nova.compute.manager [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received event network-vif-unplugged-f7cddac6-b950-4d79-8e72-aca83650b1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.209 227766 DEBUG oslo_concurrency.lockutils [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4dff175a-ea83-41a7-b707-9a974155229b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.209 227766 DEBUG oslo_concurrency.lockutils [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.209 227766 DEBUG oslo_concurrency.lockutils [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.209 227766 DEBUG nova.compute.manager [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] No waiting events found dispatching network-vif-unplugged-f7cddac6-b950-4d79-8e72-aca83650b1cd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.209 227766 WARNING nova.compute.manager [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received unexpected event network-vif-unplugged-f7cddac6-b950-4d79-8e72-aca83650b1cd for instance with vm_state suspended and task_state None.#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.209 227766 DEBUG nova.compute.manager [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received event network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.210 227766 DEBUG oslo_concurrency.lockutils [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4dff175a-ea83-41a7-b707-9a974155229b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.210 227766 DEBUG oslo_concurrency.lockutils [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.210 227766 DEBUG oslo_concurrency.lockutils [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.210 227766 DEBUG nova.compute.manager [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] No waiting events found dispatching network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.210 227766 WARNING nova.compute.manager [req-50236fb4-0c4b-431b-bc58-22c4f838c525 req-4e983fb6-7f52-44d2-8702-8dac58a9183d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received unexpected event network-vif-plugged-f7cddac6-b950-4d79-8e72-aca83650b1cd for instance with vm_state suspended and task_state None.#033[00m
Jan 23 04:54:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:17.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.816 227766 DEBUG oslo_concurrency.lockutils [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "4dff175a-ea83-41a7-b707-9a974155229b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.816 227766 DEBUG oslo_concurrency.lockutils [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.816 227766 DEBUG oslo_concurrency.lockutils [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "4dff175a-ea83-41a7-b707-9a974155229b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.817 227766 DEBUG oslo_concurrency.lockutils [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.817 227766 DEBUG oslo_concurrency.lockutils [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.818 227766 INFO nova.compute.manager [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Terminating instance#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.819 227766 DEBUG nova.compute.manager [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.825 227766 INFO nova.virt.libvirt.driver [-] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Instance destroyed successfully.#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.826 227766 DEBUG nova.objects.instance [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'resources' on Instance uuid 4dff175a-ea83-41a7-b707-9a974155229b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:17.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.850 227766 DEBUG nova.virt.libvirt.vif [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:53:50Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-183505836',display_name='tempest-DeleteServersTestJSON-server-183505836',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-183505836',id=80,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:54:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-mz31r5t0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min
_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:54:13Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=4dff175a-ea83-41a7-b707-9a974155229b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "address": "fa:16:3e:03:8e:97", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7cddac6-b9", "ovs_interfaceid": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.851 227766 DEBUG nova.network.os_vif_util [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "address": "fa:16:3e:03:8e:97", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7cddac6-b9", "ovs_interfaceid": "f7cddac6-b950-4d79-8e72-aca83650b1cd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.852 227766 DEBUG nova.network.os_vif_util [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:03:8e:97,bridge_name='br-int',has_traffic_filtering=True,id=f7cddac6-b950-4d79-8e72-aca83650b1cd,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7cddac6-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.852 227766 DEBUG os_vif [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:8e:97,bridge_name='br-int',has_traffic_filtering=True,id=f7cddac6-b950-4d79-8e72-aca83650b1cd,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7cddac6-b9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.853 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.854 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7cddac6-b9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.855 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.857 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:54:17 np0005593234 nova_compute[227762]: 2026-01-23 09:54:17.859 227766 INFO os_vif [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:03:8e:97,bridge_name='br-int',has_traffic_filtering=True,id=f7cddac6-b950-4d79-8e72-aca83650b1cd,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7cddac6-b9')#033[00m
Jan 23 04:54:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:19.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:19.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:20 np0005593234 nova_compute[227762]: 2026-01-23 09:54:20.424 227766 INFO nova.virt.libvirt.driver [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Deleting instance files /var/lib/nova/instances/4dff175a-ea83-41a7-b707-9a974155229b_del#033[00m
Jan 23 04:54:20 np0005593234 nova_compute[227762]: 2026-01-23 09:54:20.425 227766 INFO nova.virt.libvirt.driver [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Deletion of /var/lib/nova/instances/4dff175a-ea83-41a7-b707-9a974155229b_del complete#033[00m
Jan 23 04:54:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:20 np0005593234 nova_compute[227762]: 2026-01-23 09:54:20.564 227766 INFO nova.compute.manager [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Took 2.74 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:54:20 np0005593234 nova_compute[227762]: 2026-01-23 09:54:20.564 227766 DEBUG oslo.service.loopingcall [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:54:20 np0005593234 nova_compute[227762]: 2026-01-23 09:54:20.565 227766 DEBUG nova.compute.manager [-] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:54:20 np0005593234 nova_compute[227762]: 2026-01-23 09:54:20.565 227766 DEBUG nova.network.neutron [-] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:54:20 np0005593234 nova_compute[227762]: 2026-01-23 09:54:20.634 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:20 np0005593234 nova_compute[227762]: 2026-01-23 09:54:20.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:21.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:21.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:22 np0005593234 nova_compute[227762]: 2026-01-23 09:54:22.857 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:23 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:54:23 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:54:23 np0005593234 nova_compute[227762]: 2026-01-23 09:54:23.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:23.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:23.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:24 np0005593234 nova_compute[227762]: 2026-01-23 09:54:24.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:24 np0005593234 nova_compute[227762]: 2026-01-23 09:54:24.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:54:24 np0005593234 nova_compute[227762]: 2026-01-23 09:54:24.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:24 np0005593234 podman[263241]: 2026-01-23 09:54:24.779598677 +0000 UTC m=+0.075777908 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 04:54:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e256 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:25 np0005593234 nova_compute[227762]: 2026-01-23 09:54:25.635 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:25 np0005593234 nova_compute[227762]: 2026-01-23 09:54:25.686 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:25 np0005593234 nova_compute[227762]: 2026-01-23 09:54:25.687 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:25 np0005593234 nova_compute[227762]: 2026-01-23 09:54:25.687 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:25 np0005593234 nova_compute[227762]: 2026-01-23 09:54:25.687 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:54:25 np0005593234 nova_compute[227762]: 2026-01-23 09:54:25.687 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:25.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:25.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:54:26 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3060700011' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:54:26 np0005593234 nova_compute[227762]: 2026-01-23 09:54:26.134 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:26 np0005593234 nova_compute[227762]: 2026-01-23 09:54:26.276 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:54:26 np0005593234 nova_compute[227762]: 2026-01-23 09:54:26.277 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4606MB free_disk=20.922027587890625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:54:26 np0005593234 nova_compute[227762]: 2026-01-23 09:54:26.277 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:26 np0005593234 nova_compute[227762]: 2026-01-23 09:54:26.277 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:26 np0005593234 nova_compute[227762]: 2026-01-23 09:54:26.423 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 4dff175a-ea83-41a7-b707-9a974155229b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:54:26 np0005593234 nova_compute[227762]: 2026-01-23 09:54:26.424 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:54:26 np0005593234 nova_compute[227762]: 2026-01-23 09:54:26.424 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:54:26 np0005593234 nova_compute[227762]: 2026-01-23 09:54:26.487 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:54:26 np0005593234 nova_compute[227762]: 2026-01-23 09:54:26.536 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:54:26 np0005593234 nova_compute[227762]: 2026-01-23 09:54:26.537 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:54:26 np0005593234 nova_compute[227762]: 2026-01-23 09:54:26.579 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:54:26 np0005593234 nova_compute[227762]: 2026-01-23 09:54:26.631 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:54:26 np0005593234 nova_compute[227762]: 2026-01-23 09:54:26.749 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:54:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2053903848' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:54:27 np0005593234 nova_compute[227762]: 2026-01-23 09:54:27.200 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:27 np0005593234 nova_compute[227762]: 2026-01-23 09:54:27.206 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:54:27 np0005593234 nova_compute[227762]: 2026-01-23 09:54:27.670 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:54:27 np0005593234 nova_compute[227762]: 2026-01-23 09:54:27.675 227766 DEBUG nova.network.neutron [-] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:27 np0005593234 nova_compute[227762]: 2026-01-23 09:54:27.715 227766 INFO nova.compute.manager [-] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Took 7.15 seconds to deallocate network for instance.#033[00m
Jan 23 04:54:27 np0005593234 nova_compute[227762]: 2026-01-23 09:54:27.721 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:54:27 np0005593234 nova_compute[227762]: 2026-01-23 09:54:27.721 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.444s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:27 np0005593234 nova_compute[227762]: 2026-01-23 09:54:27.815 227766 DEBUG oslo_concurrency.lockutils [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:27 np0005593234 nova_compute[227762]: 2026-01-23 09:54:27.816 227766 DEBUG oslo_concurrency.lockutils [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:27.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:27.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:27 np0005593234 nova_compute[227762]: 2026-01-23 09:54:27.860 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:27 np0005593234 nova_compute[227762]: 2026-01-23 09:54:27.956 227766 DEBUG oslo_concurrency.processutils [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:28 np0005593234 nova_compute[227762]: 2026-01-23 09:54:28.169 227766 DEBUG nova.compute.manager [req-46430004-e1e8-42a4-96c2-ec25979834c9 req-e97eacb4-de66-45a2-8d3e-db06db8e38ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Received event network-vif-deleted-f7cddac6-b950-4d79-8e72-aca83650b1cd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:54:28 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4134407061' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:54:28 np0005593234 nova_compute[227762]: 2026-01-23 09:54:28.406 227766 DEBUG oslo_concurrency.processutils [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:28 np0005593234 nova_compute[227762]: 2026-01-23 09:54:28.411 227766 DEBUG nova.compute.provider_tree [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:54:28 np0005593234 nova_compute[227762]: 2026-01-23 09:54:28.478 227766 DEBUG nova.scheduler.client.report [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:54:28 np0005593234 nova_compute[227762]: 2026-01-23 09:54:28.538 227766 DEBUG oslo_concurrency.lockutils [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.722s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:28 np0005593234 nova_compute[227762]: 2026-01-23 09:54:28.587 227766 INFO nova.scheduler.client.report [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Deleted allocations for instance 4dff175a-ea83-41a7-b707-9a974155229b#033[00m
Jan 23 04:54:28 np0005593234 nova_compute[227762]: 2026-01-23 09:54:28.607 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162053.6062932, 4dff175a-ea83-41a7-b707-9a974155229b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:54:28 np0005593234 nova_compute[227762]: 2026-01-23 09:54:28.608 227766 INFO nova.compute.manager [-] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:54:28 np0005593234 nova_compute[227762]: 2026-01-23 09:54:28.654 227766 DEBUG nova.compute.manager [None req-b1ad2a2b-f372-4810-a5ab-81dec76203a4 - - - - - -] [instance: 4dff175a-ea83-41a7-b707-9a974155229b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:54:28 np0005593234 nova_compute[227762]: 2026-01-23 09:54:28.780 227766 DEBUG oslo_concurrency.lockutils [None req-1484a524-deb7-4fb7-b696-3f869c8fa65d 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "4dff175a-ea83-41a7-b707-9a974155229b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.963s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e257 e257: 3 total, 3 up, 3 in
Jan 23 04:54:29 np0005593234 nova_compute[227762]: 2026-01-23 09:54:29.722 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:29 np0005593234 nova_compute[227762]: 2026-01-23 09:54:29.723 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:54:29 np0005593234 nova_compute[227762]: 2026-01-23 09:54:29.723 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:54:29 np0005593234 nova_compute[227762]: 2026-01-23 09:54:29.749 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:54:29 np0005593234 nova_compute[227762]: 2026-01-23 09:54:29.750 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:29.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:29.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:30 np0005593234 nova_compute[227762]: 2026-01-23 09:54:30.673 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:30 np0005593234 nova_compute[227762]: 2026-01-23 09:54:30.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:30 np0005593234 nova_compute[227762]: 2026-01-23 09:54:30.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:31.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:31.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:32 np0005593234 nova_compute[227762]: 2026-01-23 09:54:32.868 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:33 np0005593234 nova_compute[227762]: 2026-01-23 09:54:33.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:33.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:33.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:34 np0005593234 nova_compute[227762]: 2026-01-23 09:54:34.960 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:54:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:35 np0005593234 nova_compute[227762]: 2026-01-23 09:54:35.677 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:35.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:35.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:35 np0005593234 nova_compute[227762]: 2026-01-23 09:54:35.899 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:35 np0005593234 nova_compute[227762]: 2026-01-23 09:54:35.900 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:35 np0005593234 nova_compute[227762]: 2026-01-23 09:54:35.931 227766 DEBUG nova.compute.manager [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:54:36 np0005593234 nova_compute[227762]: 2026-01-23 09:54:36.160 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:36 np0005593234 nova_compute[227762]: 2026-01-23 09:54:36.161 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:36 np0005593234 nova_compute[227762]: 2026-01-23 09:54:36.167 227766 DEBUG nova.virt.hardware [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:54:36 np0005593234 nova_compute[227762]: 2026-01-23 09:54:36.167 227766 INFO nova.compute.claims [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:54:36 np0005593234 nova_compute[227762]: 2026-01-23 09:54:36.346 227766 DEBUG oslo_concurrency.processutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:54:36 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4216302276' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:54:36 np0005593234 nova_compute[227762]: 2026-01-23 09:54:36.776 227766 DEBUG oslo_concurrency.processutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:36 np0005593234 nova_compute[227762]: 2026-01-23 09:54:36.782 227766 DEBUG nova.compute.provider_tree [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:54:36 np0005593234 nova_compute[227762]: 2026-01-23 09:54:36.806 227766 DEBUG nova.scheduler.client.report [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:54:36 np0005593234 nova_compute[227762]: 2026-01-23 09:54:36.830 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.669s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:36 np0005593234 nova_compute[227762]: 2026-01-23 09:54:36.831 227766 DEBUG nova.compute.manager [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:54:36 np0005593234 nova_compute[227762]: 2026-01-23 09:54:36.888 227766 DEBUG nova.compute.manager [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:54:36 np0005593234 nova_compute[227762]: 2026-01-23 09:54:36.888 227766 DEBUG nova.network.neutron [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:54:36 np0005593234 nova_compute[227762]: 2026-01-23 09:54:36.911 227766 INFO nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:54:36 np0005593234 nova_compute[227762]: 2026-01-23 09:54:36.936 227766 DEBUG nova.compute.manager [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.032 227766 DEBUG nova.compute.manager [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.033 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.034 227766 INFO nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Creating image(s)#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.061 227766 DEBUG nova.storage.rbd_utils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image dc5e2bb3-0d73-4538-a181-9380a1d67934_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.090 227766 DEBUG nova.storage.rbd_utils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image dc5e2bb3-0d73-4538-a181-9380a1d67934_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.119 227766 DEBUG nova.storage.rbd_utils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image dc5e2bb3-0d73-4538-a181-9380a1d67934_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.123 227766 DEBUG oslo_concurrency.processutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.183 227766 DEBUG oslo_concurrency.processutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.185 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.186 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.186 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.212 227766 DEBUG nova.storage.rbd_utils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image dc5e2bb3-0d73-4538-a181-9380a1d67934_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.216 227766 DEBUG oslo_concurrency.processutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 dc5e2bb3-0d73-4538-a181-9380a1d67934_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.238 227766 DEBUG nova.policy [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '28a7a778c8ab486fb586e81bb84113be', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '61df91981c55482fa5c9a64686c79f9e', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.505 227766 DEBUG oslo_concurrency.processutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 dc5e2bb3-0d73-4538-a181-9380a1d67934_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.289s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.591 227766 DEBUG nova.storage.rbd_utils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] resizing rbd image dc5e2bb3-0d73-4538-a181-9380a1d67934_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.707 227766 DEBUG nova.objects.instance [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'migration_context' on Instance uuid dc5e2bb3-0d73-4538-a181-9380a1d67934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.728 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.728 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Ensure instance console log exists: /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.729 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.729 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.729 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:37.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:37.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:37 np0005593234 nova_compute[227762]: 2026-01-23 09:54:37.871 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:39.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:39 np0005593234 nova_compute[227762]: 2026-01-23 09:54:39.869 227766 DEBUG nova.network.neutron [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Successfully created port: 21920b88-3779-4c29-b3a9-7591691e880a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:54:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:39.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:40 np0005593234 nova_compute[227762]: 2026-01-23 09:54:40.679 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:40 np0005593234 podman[263581]: 2026-01-23 09:54:40.926446657 +0000 UTC m=+0.056399823 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 04:54:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:41.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:41.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:42 np0005593234 nova_compute[227762]: 2026-01-23 09:54:42.645 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:42.646 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=30, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=29) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:54:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:42.647 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:54:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:42.830 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:42.831 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:42.831 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:42 np0005593234 nova_compute[227762]: 2026-01-23 09:54:42.991 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:43 np0005593234 nova_compute[227762]: 2026-01-23 09:54:43.378 227766 DEBUG nova.network.neutron [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Successfully updated port: 21920b88-3779-4c29-b3a9-7591691e880a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:54:43 np0005593234 nova_compute[227762]: 2026-01-23 09:54:43.405 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:43 np0005593234 nova_compute[227762]: 2026-01-23 09:54:43.406 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquired lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:43 np0005593234 nova_compute[227762]: 2026-01-23 09:54:43.406 227766 DEBUG nova.network.neutron [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:54:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:43.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:43.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:44 np0005593234 nova_compute[227762]: 2026-01-23 09:54:44.157 227766 DEBUG nova.network.neutron [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:54:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:54:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2430556718' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:54:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:54:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2430556718' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:54:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e258 e258: 3 total, 3 up, 3 in
Jan 23 04:54:45 np0005593234 nova_compute[227762]: 2026-01-23 09:54:45.251 227766 DEBUG nova.compute.manager [req-9d8b9277-1b59-4667-a42a-673db45f6b66 req-9fe474eb-514d-4b1d-97a8-72bc707993a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-changed-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:45 np0005593234 nova_compute[227762]: 2026-01-23 09:54:45.251 227766 DEBUG nova.compute.manager [req-9d8b9277-1b59-4667-a42a-673db45f6b66 req-9fe474eb-514d-4b1d-97a8-72bc707993a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Refreshing instance network info cache due to event network-changed-21920b88-3779-4c29-b3a9-7591691e880a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:54:45 np0005593234 nova_compute[227762]: 2026-01-23 09:54:45.252 227766 DEBUG oslo_concurrency.lockutils [req-9d8b9277-1b59-4667-a42a-673db45f6b66 req-9fe474eb-514d-4b1d-97a8-72bc707993a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:45 np0005593234 nova_compute[227762]: 2026-01-23 09:54:45.682 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:45.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:45.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.170 227766 DEBUG nova.network.neutron [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updating instance_info_cache with network_info: [{"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.436 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Releasing lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.436 227766 DEBUG nova.compute.manager [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Instance network_info: |[{"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.436 227766 DEBUG oslo_concurrency.lockutils [req-9d8b9277-1b59-4667-a42a-673db45f6b66 req-9fe474eb-514d-4b1d-97a8-72bc707993a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.437 227766 DEBUG nova.network.neutron [req-9d8b9277-1b59-4667-a42a-673db45f6b66 req-9fe474eb-514d-4b1d-97a8-72bc707993a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Refreshing network info cache for port 21920b88-3779-4c29-b3a9-7591691e880a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.439 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Start _get_guest_xml network_info=[{"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.443 227766 WARNING nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.454 227766 DEBUG nova.virt.libvirt.host [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.455 227766 DEBUG nova.virt.libvirt.host [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.466 227766 DEBUG nova.virt.libvirt.host [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.466 227766 DEBUG nova.virt.libvirt.host [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.467 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.467 227766 DEBUG nova.virt.hardware [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.468 227766 DEBUG nova.virt.hardware [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.468 227766 DEBUG nova.virt.hardware [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.468 227766 DEBUG nova.virt.hardware [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.468 227766 DEBUG nova.virt.hardware [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.468 227766 DEBUG nova.virt.hardware [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.468 227766 DEBUG nova.virt.hardware [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.469 227766 DEBUG nova.virt.hardware [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.469 227766 DEBUG nova.virt.hardware [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.469 227766 DEBUG nova.virt.hardware [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.469 227766 DEBUG nova.virt.hardware [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.471 227766 DEBUG oslo_concurrency.processutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:46.649 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '30'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:54:46 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2328642548' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.912 227766 DEBUG oslo_concurrency.processutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.934 227766 DEBUG nova.storage.rbd_utils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image dc5e2bb3-0d73-4538-a181-9380a1d67934_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:54:46 np0005593234 nova_compute[227762]: 2026-01-23 09:54:46.938 227766 DEBUG oslo_concurrency.processutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:54:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1364145267' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.410 227766 DEBUG oslo_concurrency.processutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.412 227766 DEBUG nova.virt.libvirt.vif [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:54:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1189076101',display_name='tempest-DeleteServersTestJSON-server-1189076101',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1189076101',id=83,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-t070zg10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:54:36Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=dc5e2bb3-0d73-4538-a181-9380a1d67934,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.412 227766 DEBUG nova.network.os_vif_util [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.413 227766 DEBUG nova.network.os_vif_util [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.414 227766 DEBUG nova.objects.instance [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'pci_devices' on Instance uuid dc5e2bb3-0d73-4538-a181-9380a1d67934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.495 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  <uuid>dc5e2bb3-0d73-4538-a181-9380a1d67934</uuid>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  <name>instance-00000053</name>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <nova:name>tempest-DeleteServersTestJSON-server-1189076101</nova:name>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:54:46</nova:creationTime>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <nova:user uuid="28a7a778c8ab486fb586e81bb84113be">tempest-DeleteServersTestJSON-944070453-project-member</nova:user>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <nova:project uuid="61df91981c55482fa5c9a64686c79f9e">tempest-DeleteServersTestJSON-944070453</nova:project>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <nova:port uuid="21920b88-3779-4c29-b3a9-7591691e880a">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <entry name="serial">dc5e2bb3-0d73-4538-a181-9380a1d67934</entry>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <entry name="uuid">dc5e2bb3-0d73-4538-a181-9380a1d67934</entry>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/dc5e2bb3-0d73-4538-a181-9380a1d67934_disk">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/dc5e2bb3-0d73-4538-a181-9380a1d67934_disk.config">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:46:2e:b4"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <target dev="tap21920b88-37"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934/console.log" append="off"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:54:47 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:54:47 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:54:47 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:54:47 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.496 227766 DEBUG nova.compute.manager [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Preparing to wait for external event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.496 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.496 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.496 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.497 227766 DEBUG nova.virt.libvirt.vif [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:54:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1189076101',display_name='tempest-DeleteServersTestJSON-server-1189076101',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1189076101',id=83,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-t070zg10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServ
ersTestJSON-944070453-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:54:36Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=dc5e2bb3-0d73-4538-a181-9380a1d67934,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.497 227766 DEBUG nova.network.os_vif_util [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.498 227766 DEBUG nova.network.os_vif_util [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.498 227766 DEBUG os_vif [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.499 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.499 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.499 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.503 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.503 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap21920b88-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.503 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap21920b88-37, col_values=(('external_ids', {'iface-id': '21920b88-3779-4c29-b3a9-7591691e880a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:2e:b4', 'vm-uuid': 'dc5e2bb3-0d73-4538-a181-9380a1d67934'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.505 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:47 np0005593234 NetworkManager[48942]: <info>  [1769162087.5059] manager: (tap21920b88-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/140)
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.508 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.511 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.512 227766 INFO os_vif [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37')#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.669 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.669 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.670 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] No VIF found with MAC fa:16:3e:46:2e:b4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.670 227766 INFO nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Using config drive#033[00m
Jan 23 04:54:47 np0005593234 nova_compute[227762]: 2026-01-23 09:54:47.698 227766 DEBUG nova.storage.rbd_utils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image dc5e2bb3-0d73-4538-a181-9380a1d67934_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:54:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:47.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:47.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:48 np0005593234 nova_compute[227762]: 2026-01-23 09:54:48.797 227766 INFO nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Creating config drive at /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934/disk.config#033[00m
Jan 23 04:54:48 np0005593234 nova_compute[227762]: 2026-01-23 09:54:48.803 227766 DEBUG oslo_concurrency.processutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp20delujl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:48 np0005593234 nova_compute[227762]: 2026-01-23 09:54:48.931 227766 DEBUG oslo_concurrency.processutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp20delujl" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:48 np0005593234 nova_compute[227762]: 2026-01-23 09:54:48.955 227766 DEBUG nova.storage.rbd_utils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] rbd image dc5e2bb3-0d73-4538-a181-9380a1d67934_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:54:48 np0005593234 nova_compute[227762]: 2026-01-23 09:54:48.958 227766 DEBUG oslo_concurrency.processutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934/disk.config dc5e2bb3-0d73-4538-a181-9380a1d67934_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.105 227766 DEBUG oslo_concurrency.processutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934/disk.config dc5e2bb3-0d73-4538-a181-9380a1d67934_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.106 227766 INFO nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Deleting local config drive /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934/disk.config because it was imported into RBD.#033[00m
Jan 23 04:54:49 np0005593234 kernel: tap21920b88-37: entered promiscuous mode
Jan 23 04:54:49 np0005593234 NetworkManager[48942]: <info>  [1769162089.1529] manager: (tap21920b88-37): new Tun device (/org/freedesktop/NetworkManager/Devices/141)
Jan 23 04:54:49 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:49Z|00275|binding|INFO|Claiming lport 21920b88-3779-4c29-b3a9-7591691e880a for this chassis.
Jan 23 04:54:49 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:49Z|00276|binding|INFO|21920b88-3779-4c29-b3a9-7591691e880a: Claiming fa:16:3e:46:2e:b4 10.100.0.14
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.155 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.163 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:2e:b4 10.100.0.14'], port_security=['fa:16:3e:46:2e:b4 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'dc5e2bb3-0d73-4538-a181-9380a1d67934', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=21920b88-3779-4c29-b3a9-7591691e880a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.164 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 21920b88-3779-4c29-b3a9-7591691e880a in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 bound to our chassis#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.166 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a3788149-efcd-4940-8a8f-e21af0a56a06#033[00m
Jan 23 04:54:49 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:49Z|00277|binding|INFO|Setting lport 21920b88-3779-4c29-b3a9-7591691e880a ovn-installed in OVS
Jan 23 04:54:49 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:49Z|00278|binding|INFO|Setting lport 21920b88-3779-4c29-b3a9-7591691e880a up in Southbound
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.172 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.174 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:49 np0005593234 systemd-udevd[263740]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.180 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c484a41d-e92b-49d2-9c04-2ec1cea77673]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.181 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa3788149-e1 in ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.182 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa3788149-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.183 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[16f42abf-0d0e-4145-a672-5f36954b8679]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.185 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[64783bcc-5b71-4c26-8bc3-743c324d4579]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 systemd-machined[195626]: New machine qemu-34-instance-00000053.
Jan 23 04:54:49 np0005593234 NetworkManager[48942]: <info>  [1769162089.1933] device (tap21920b88-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:54:49 np0005593234 NetworkManager[48942]: <info>  [1769162089.1941] device (tap21920b88-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.200 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[53525bb8-fe9c-4402-973a-6f52d3e7e917]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 systemd[1]: Started Virtual Machine qemu-34-instance-00000053.
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.226 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[24d6a8cb-8dcd-43dc-b8f2-7c8d50dfe3a2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.258 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[06d367f8-d818-49ae-b016-43be0823a101]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 NetworkManager[48942]: <info>  [1769162089.2648] manager: (tapa3788149-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/142)
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.264 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6fa556f8-05ce-4b8d-943c-08c416b62dd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.298 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ad9be026-7688-4362-bbdf-9a8ae32cb9f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.301 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[229317d0-83e5-44ec-bd17-92c16e4acc7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 NetworkManager[48942]: <info>  [1769162089.3200] device (tapa3788149-e0): carrier: link connected
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.323 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[dcafd9cc-9354-4444-93a8-9506c32d9043]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.341 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[06922daf-3eda-4110-b5f2-463a6039b4d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598427, 'reachable_time': 40909, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 263773, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.354 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aea31805-c1e9-42a3-8f9d-c331194ecb63]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:ddff'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 598427, 'tstamp': 598427}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 263774, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.366 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f6571831-a052-4885-9073-115b34a15a79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa3788149-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:dd:ff'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598427, 'reachable_time': 40909, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 263775, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.393 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0b5253d6-d8aa-4c82-99ce-79199f5ce61d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.442 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[51a68a9b-e34d-45cd-b63e-db3b5aa34c89]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.443 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.443 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.444 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa3788149-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:49 np0005593234 NetworkManager[48942]: <info>  [1769162089.4464] manager: (tapa3788149-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.445 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:49 np0005593234 kernel: tapa3788149-e0: entered promiscuous mode
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.448 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.449 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa3788149-e0, col_values=(('external_ids', {'iface-id': 'd6ce7fd1-128d-488f-94e6-68332f7a8a6b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:54:49 np0005593234 ovn_controller[134547]: 2026-01-23T09:54:49Z|00279|binding|INFO|Releasing lport d6ce7fd1-128d-488f-94e6-68332f7a8a6b from this chassis (sb_readonly=0)
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.451 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.465 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.466 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.466 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0845bf-8946-4b4d-b81a-1c9c1f3cfa8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.467 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/a3788149-efcd-4940-8a8f-e21af0a56a06.pid.haproxy
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID a3788149-efcd-4940-8a8f-e21af0a56a06
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:54:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:54:49.468 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'env', 'PROCESS_TAG=haproxy-a3788149-efcd-4940-8a8f-e21af0a56a06', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a3788149-efcd-4940-8a8f-e21af0a56a06.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.628 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162089.6276913, dc5e2bb3-0d73-4538-a181-9380a1d67934 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.628 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] VM Started (Lifecycle Event)#033[00m
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.669 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.674 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162089.6283984, dc5e2bb3-0d73-4538-a181-9380a1d67934 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.674 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.724 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.728 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.768 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:54:49 np0005593234 podman[263849]: 2026-01-23 09:54:49.833672105 +0000 UTC m=+0.050640373 container create 66a8f8f4c028d1cc3b2854ec24a979275926645edb8280e8ca01931bd5e805f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true)
Jan 23 04:54:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:49.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:49 np0005593234 systemd[1]: Started libpod-conmon-66a8f8f4c028d1cc3b2854ec24a979275926645edb8280e8ca01931bd5e805f9.scope.
Jan 23 04:54:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:49.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:49 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:54:49 np0005593234 podman[263849]: 2026-01-23 09:54:49.806395843 +0000 UTC m=+0.023364131 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:54:49 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03bf7ba5f433009b9e8c51c81544819e4d558173e10e4b1d71ec849a2c06487b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:54:49 np0005593234 podman[263849]: 2026-01-23 09:54:49.911831976 +0000 UTC m=+0.128800264 container init 66a8f8f4c028d1cc3b2854ec24a979275926645edb8280e8ca01931bd5e805f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:54:49 np0005593234 podman[263849]: 2026-01-23 09:54:49.917651008 +0000 UTC m=+0.134619276 container start 66a8f8f4c028d1cc3b2854ec24a979275926645edb8280e8ca01931bd5e805f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 04:54:49 np0005593234 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[263865]: [NOTICE]   (263870) : New worker (263872) forked
Jan 23 04:54:49 np0005593234 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[263865]: [NOTICE]   (263870) : Loading success.
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.965 227766 DEBUG nova.network.neutron [req-9d8b9277-1b59-4667-a42a-673db45f6b66 req-9fe474eb-514d-4b1d-97a8-72bc707993a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updated VIF entry in instance network info cache for port 21920b88-3779-4c29-b3a9-7591691e880a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.966 227766 DEBUG nova.network.neutron [req-9d8b9277-1b59-4667-a42a-673db45f6b66 req-9fe474eb-514d-4b1d-97a8-72bc707993a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updating instance_info_cache with network_info: [{"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:54:49 np0005593234 nova_compute[227762]: 2026-01-23 09:54:49.988 227766 DEBUG oslo_concurrency.lockutils [req-9d8b9277-1b59-4667-a42a-673db45f6b66 req-9fe474eb-514d-4b1d-97a8-72bc707993a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:54:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e258 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:50 np0005593234 nova_compute[227762]: 2026-01-23 09:54:50.683 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e259 e259: 3 total, 3 up, 3 in
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.194 227766 DEBUG nova.compute.manager [req-aba42d3c-345d-4bdb-9f0a-06ec49555232 req-6de70cbf-9aac-4d02-ad32-fb8d9f66fa15 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.195 227766 DEBUG oslo_concurrency.lockutils [req-aba42d3c-345d-4bdb-9f0a-06ec49555232 req-6de70cbf-9aac-4d02-ad32-fb8d9f66fa15 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.195 227766 DEBUG oslo_concurrency.lockutils [req-aba42d3c-345d-4bdb-9f0a-06ec49555232 req-6de70cbf-9aac-4d02-ad32-fb8d9f66fa15 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.195 227766 DEBUG oslo_concurrency.lockutils [req-aba42d3c-345d-4bdb-9f0a-06ec49555232 req-6de70cbf-9aac-4d02-ad32-fb8d9f66fa15 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.196 227766 DEBUG nova.compute.manager [req-aba42d3c-345d-4bdb-9f0a-06ec49555232 req-6de70cbf-9aac-4d02-ad32-fb8d9f66fa15 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Processing event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.196 227766 DEBUG nova.compute.manager [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.202 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.202 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162091.201621, dc5e2bb3-0d73-4538-a181-9380a1d67934 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.203 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.207 227766 INFO nova.virt.libvirt.driver [-] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Instance spawned successfully.#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.207 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.260 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.266 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.267 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.267 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.267 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.268 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.268 227766 DEBUG nova.virt.libvirt.driver [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.272 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.343 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.487 227766 INFO nova.compute.manager [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Took 14.45 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.488 227766 DEBUG nova.compute.manager [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.602 227766 INFO nova.compute.manager [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Took 15.51 seconds to build instance.#033[00m
Jan 23 04:54:51 np0005593234 nova_compute[227762]: 2026-01-23 09:54:51.642 227766 DEBUG oslo_concurrency.lockutils [None req-6e9d5a9e-6358-43b6-9277-43095c6e87ac 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.742s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:51.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:51.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:52 np0005593234 nova_compute[227762]: 2026-01-23 09:54:52.508 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:53 np0005593234 nova_compute[227762]: 2026-01-23 09:54:53.455 227766 DEBUG nova.compute.manager [req-81cb6ccd-cb90-4122-a7f4-fb9a31e56a66 req-8486b3da-4799-4cf2-85f8-f29ce1444e4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:54:53 np0005593234 nova_compute[227762]: 2026-01-23 09:54:53.456 227766 DEBUG oslo_concurrency.lockutils [req-81cb6ccd-cb90-4122-a7f4-fb9a31e56a66 req-8486b3da-4799-4cf2-85f8-f29ce1444e4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:54:53 np0005593234 nova_compute[227762]: 2026-01-23 09:54:53.456 227766 DEBUG oslo_concurrency.lockutils [req-81cb6ccd-cb90-4122-a7f4-fb9a31e56a66 req-8486b3da-4799-4cf2-85f8-f29ce1444e4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:54:53 np0005593234 nova_compute[227762]: 2026-01-23 09:54:53.456 227766 DEBUG oslo_concurrency.lockutils [req-81cb6ccd-cb90-4122-a7f4-fb9a31e56a66 req-8486b3da-4799-4cf2-85f8-f29ce1444e4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:54:53 np0005593234 nova_compute[227762]: 2026-01-23 09:54:53.457 227766 DEBUG nova.compute.manager [req-81cb6ccd-cb90-4122-a7f4-fb9a31e56a66 req-8486b3da-4799-4cf2-85f8-f29ce1444e4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] No waiting events found dispatching network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:54:53 np0005593234 nova_compute[227762]: 2026-01-23 09:54:53.457 227766 WARNING nova.compute.manager [req-81cb6ccd-cb90-4122-a7f4-fb9a31e56a66 req-8486b3da-4799-4cf2-85f8-f29ce1444e4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received unexpected event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a for instance with vm_state active and task_state None.#033[00m
Jan 23 04:54:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:53.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:53.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:54:55 np0005593234 nova_compute[227762]: 2026-01-23 09:54:55.685 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:55 np0005593234 podman[263883]: 2026-01-23 09:54:55.790859022 +0000 UTC m=+0.086522674 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 04:54:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:54:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:55.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:54:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:55.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:57 np0005593234 nova_compute[227762]: 2026-01-23 09:54:57.513 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:54:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:54:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:57.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:54:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:54:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:57.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:54:59 np0005593234 nova_compute[227762]: 2026-01-23 09:54:59.666 227766 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:54:59 np0005593234 nova_compute[227762]: 2026-01-23 09:54:59.667 227766 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquired lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:54:59 np0005593234 nova_compute[227762]: 2026-01-23 09:54:59.667 227766 DEBUG nova.network.neutron [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:54:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:59.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:54:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:54:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:54:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:59.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:00 np0005593234 nova_compute[227762]: 2026-01-23 09:55:00.687 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:55:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:01.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:55:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:01.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:02 np0005593234 nova_compute[227762]: 2026-01-23 09:55:02.516 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:03.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:03 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 04:55:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:03.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:04 np0005593234 nova_compute[227762]: 2026-01-23 09:55:04.186 227766 DEBUG nova.network.neutron [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updating instance_info_cache with network_info: [{"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:04 np0005593234 nova_compute[227762]: 2026-01-23 09:55:04.221 227766 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Releasing lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:55:04 np0005593234 nova_compute[227762]: 2026-01-23 09:55:04.740 227766 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 23 04:55:04 np0005593234 nova_compute[227762]: 2026-01-23 09:55:04.741 227766 DEBUG nova.virt.libvirt.volume.remotefs [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Creating file /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934/677c69e5f392490a8f07b4f723a4d623.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 23 04:55:04 np0005593234 nova_compute[227762]: 2026-01-23 09:55:04.741 227766 DEBUG oslo_concurrency.processutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934/677c69e5f392490a8f07b4f723a4d623.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:05 np0005593234 ovn_controller[134547]: 2026-01-23T09:55:05Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:46:2e:b4 10.100.0.14
Jan 23 04:55:05 np0005593234 ovn_controller[134547]: 2026-01-23T09:55:05Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:46:2e:b4 10.100.0.14
Jan 23 04:55:05 np0005593234 nova_compute[227762]: 2026-01-23 09:55:05.207 227766 DEBUG oslo_concurrency.processutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934/677c69e5f392490a8f07b4f723a4d623.tmp" returned: 1 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:05 np0005593234 nova_compute[227762]: 2026-01-23 09:55:05.208 227766 DEBUG oslo_concurrency.processutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934/677c69e5f392490a8f07b4f723a4d623.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 04:55:05 np0005593234 nova_compute[227762]: 2026-01-23 09:55:05.208 227766 DEBUG nova.virt.libvirt.volume.remotefs [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Creating directory /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 23 04:55:05 np0005593234 nova_compute[227762]: 2026-01-23 09:55:05.209 227766 DEBUG oslo_concurrency.processutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:05 np0005593234 nova_compute[227762]: 2026-01-23 09:55:05.435 227766 DEBUG oslo_concurrency.processutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/dc5e2bb3-0d73-4538-a181-9380a1d67934" returned: 0 in 0.226s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:05 np0005593234 nova_compute[227762]: 2026-01-23 09:55:05.440 227766 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 04:55:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:05 np0005593234 nova_compute[227762]: 2026-01-23 09:55:05.689 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:05.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:05.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:07 np0005593234 nova_compute[227762]: 2026-01-23 09:55:07.520 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:07 np0005593234 kernel: tap21920b88-37 (unregistering): left promiscuous mode
Jan 23 04:55:07 np0005593234 NetworkManager[48942]: <info>  [1769162107.6781] device (tap21920b88-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:55:07 np0005593234 ovn_controller[134547]: 2026-01-23T09:55:07Z|00280|binding|INFO|Releasing lport 21920b88-3779-4c29-b3a9-7591691e880a from this chassis (sb_readonly=0)
Jan 23 04:55:07 np0005593234 ovn_controller[134547]: 2026-01-23T09:55:07Z|00281|binding|INFO|Setting lport 21920b88-3779-4c29-b3a9-7591691e880a down in Southbound
Jan 23 04:55:07 np0005593234 ovn_controller[134547]: 2026-01-23T09:55:07Z|00282|binding|INFO|Removing iface tap21920b88-37 ovn-installed in OVS
Jan 23 04:55:07 np0005593234 nova_compute[227762]: 2026-01-23 09:55:07.695 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:07 np0005593234 nova_compute[227762]: 2026-01-23 09:55:07.697 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:07.709 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:2e:b4 10.100.0.14'], port_security=['fa:16:3e:46:2e:b4 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'dc5e2bb3-0d73-4538-a181-9380a1d67934', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a3788149-efcd-4940-8a8f-e21af0a56a06', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '61df91981c55482fa5c9a64686c79f9e', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c496be77-ece3-4368-8b38-35095cbe875d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7be5811b-44e1-4fd4-8769-fc25c57f044d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=21920b88-3779-4c29-b3a9-7591691e880a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:07.711 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 21920b88-3779-4c29-b3a9-7591691e880a in datapath a3788149-efcd-4940-8a8f-e21af0a56a06 unbound from our chassis#033[00m
Jan 23 04:55:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:07.713 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a3788149-efcd-4940-8a8f-e21af0a56a06, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:55:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:07.714 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f62e8776-e6c3-4056-bd4e-7d0aa46a51b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:07.715 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 namespace which is not needed anymore#033[00m
Jan 23 04:55:07 np0005593234 nova_compute[227762]: 2026-01-23 09:55:07.715 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:07 np0005593234 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000053.scope: Deactivated successfully.
Jan 23 04:55:07 np0005593234 systemd[1]: machine-qemu\x2d34\x2dinstance\x2d00000053.scope: Consumed 13.283s CPU time.
Jan 23 04:55:07 np0005593234 systemd-machined[195626]: Machine qemu-34-instance-00000053 terminated.
Jan 23 04:55:07 np0005593234 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[263865]: [NOTICE]   (263870) : haproxy version is 2.8.14-c23fe91
Jan 23 04:55:07 np0005593234 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[263865]: [NOTICE]   (263870) : path to executable is /usr/sbin/haproxy
Jan 23 04:55:07 np0005593234 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[263865]: [WARNING]  (263870) : Exiting Master process...
Jan 23 04:55:07 np0005593234 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[263865]: [ALERT]    (263870) : Current worker (263872) exited with code 143 (Terminated)
Jan 23 04:55:07 np0005593234 neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06[263865]: [WARNING]  (263870) : All workers exited. Exiting... (0)
Jan 23 04:55:07 np0005593234 systemd[1]: libpod-66a8f8f4c028d1cc3b2854ec24a979275926645edb8280e8ca01931bd5e805f9.scope: Deactivated successfully.
Jan 23 04:55:07 np0005593234 podman[263994]: 2026-01-23 09:55:07.842614177 +0000 UTC m=+0.043509391 container died 66a8f8f4c028d1cc3b2854ec24a979275926645edb8280e8ca01931bd5e805f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 04:55:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:07.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:07 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-66a8f8f4c028d1cc3b2854ec24a979275926645edb8280e8ca01931bd5e805f9-userdata-shm.mount: Deactivated successfully.
Jan 23 04:55:07 np0005593234 systemd[1]: var-lib-containers-storage-overlay-03bf7ba5f433009b9e8c51c81544819e4d558173e10e4b1d71ec849a2c06487b-merged.mount: Deactivated successfully.
Jan 23 04:55:07 np0005593234 podman[263994]: 2026-01-23 09:55:07.877375462 +0000 UTC m=+0.078270676 container cleanup 66a8f8f4c028d1cc3b2854ec24a979275926645edb8280e8ca01931bd5e805f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Jan 23 04:55:07 np0005593234 systemd[1]: libpod-conmon-66a8f8f4c028d1cc3b2854ec24a979275926645edb8280e8ca01931bd5e805f9.scope: Deactivated successfully.
Jan 23 04:55:07 np0005593234 nova_compute[227762]: 2026-01-23 09:55:07.909 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:07 np0005593234 nova_compute[227762]: 2026-01-23 09:55:07.914 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:07.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:07 np0005593234 podman[264024]: 2026-01-23 09:55:07.943113675 +0000 UTC m=+0.046309718 container remove 66a8f8f4c028d1cc3b2854ec24a979275926645edb8280e8ca01931bd5e805f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 04:55:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:07.948 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[913e231e-69be-4644-9a12-f94dcca88995]: (4, ('Fri Jan 23 09:55:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (66a8f8f4c028d1cc3b2854ec24a979275926645edb8280e8ca01931bd5e805f9)\n66a8f8f4c028d1cc3b2854ec24a979275926645edb8280e8ca01931bd5e805f9\nFri Jan 23 09:55:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 (66a8f8f4c028d1cc3b2854ec24a979275926645edb8280e8ca01931bd5e805f9)\n66a8f8f4c028d1cc3b2854ec24a979275926645edb8280e8ca01931bd5e805f9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:07.949 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac28556-8cd1-477c-8a6b-8e69aa7e1802]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:07.950 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa3788149-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:07 np0005593234 nova_compute[227762]: 2026-01-23 09:55:07.952 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:07 np0005593234 kernel: tapa3788149-e0: left promiscuous mode
Jan 23 04:55:07 np0005593234 nova_compute[227762]: 2026-01-23 09:55:07.968 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:07 np0005593234 nova_compute[227762]: 2026-01-23 09:55:07.969 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:07.971 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb6c84a-65d5-4153-86b7-b8589d3aa9f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:07.986 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[22c66ab3-0a72-42c2-abfc-2481f3e5023f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:07.987 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5aaf17dd-becf-4ec5-b147-ae47bc079c3b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:08.001 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3abedc9d-09f2-4522-911e-e44a0ec5be03]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 598421, 'reachable_time': 30650, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 264052, 'error': None, 'target': 'ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:08 np0005593234 systemd[1]: run-netns-ovnmeta\x2da3788149\x2defcd\x2d4940\x2d8a8f\x2de21af0a56a06.mount: Deactivated successfully.
Jan 23 04:55:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:08.005 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a3788149-efcd-4940-8a8f-e21af0a56a06 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:55:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:08.005 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[49fb2fcf-98bf-4810-9d77-40f1a2311859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:55:08 np0005593234 nova_compute[227762]: 2026-01-23 09:55:08.455 227766 INFO nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 04:55:08 np0005593234 nova_compute[227762]: 2026-01-23 09:55:08.460 227766 INFO nova.virt.libvirt.driver [-] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Instance destroyed successfully.#033[00m
Jan 23 04:55:08 np0005593234 nova_compute[227762]: 2026-01-23 09:55:08.461 227766 DEBUG nova.virt.libvirt.vif [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:54:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1189076101',display_name='tempest-DeleteServersTestJSON-server-1189076101',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1189076101',id=83,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:54:51Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-t070zg10',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_vid
eo_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:54:58Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=dc5e2bb3-0d73-4538-a181-9380a1d67934,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-570088325-network", "vif_mac": "fa:16:3e:46:2e:b4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:55:08 np0005593234 nova_compute[227762]: 2026-01-23 09:55:08.461 227766 DEBUG nova.network.os_vif_util [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-DeleteServersTestJSON-570088325-network", "vif_mac": "fa:16:3e:46:2e:b4"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:08 np0005593234 nova_compute[227762]: 2026-01-23 09:55:08.462 227766 DEBUG nova.network.os_vif_util [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:08 np0005593234 nova_compute[227762]: 2026-01-23 09:55:08.462 227766 DEBUG os_vif [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:55:08 np0005593234 nova_compute[227762]: 2026-01-23 09:55:08.464 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:08 np0005593234 nova_compute[227762]: 2026-01-23 09:55:08.464 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap21920b88-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:08 np0005593234 nova_compute[227762]: 2026-01-23 09:55:08.465 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:08 np0005593234 nova_compute[227762]: 2026-01-23 09:55:08.466 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:08 np0005593234 nova_compute[227762]: 2026-01-23 09:55:08.469 227766 INFO os_vif [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37')#033[00m
Jan 23 04:55:08 np0005593234 nova_compute[227762]: 2026-01-23 09:55:08.473 227766 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] skipping disk for instance-00000053 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:55:08 np0005593234 nova_compute[227762]: 2026-01-23 09:55:08.474 227766 DEBUG nova.virt.libvirt.driver [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] skipping disk for instance-00000053 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:55:09 np0005593234 nova_compute[227762]: 2026-01-23 09:55:09.184 227766 DEBUG neutronclient.v2_0.client [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 21920b88-3779-4c29-b3a9-7591691e880a for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 23 04:55:09 np0005593234 nova_compute[227762]: 2026-01-23 09:55:09.537 227766 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:09 np0005593234 nova_compute[227762]: 2026-01-23 09:55:09.538 227766 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:09 np0005593234 nova_compute[227762]: 2026-01-23 09:55:09.538 227766 DEBUG oslo_concurrency.lockutils [None req-55c9ad61-4d7d-4320-97da-f218aff3b97b 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:09.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:09.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:10 np0005593234 nova_compute[227762]: 2026-01-23 09:55:10.237 227766 DEBUG nova.compute.manager [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-vif-unplugged-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:10 np0005593234 nova_compute[227762]: 2026-01-23 09:55:10.237 227766 DEBUG oslo_concurrency.lockutils [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:10 np0005593234 nova_compute[227762]: 2026-01-23 09:55:10.238 227766 DEBUG oslo_concurrency.lockutils [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:10 np0005593234 nova_compute[227762]: 2026-01-23 09:55:10.238 227766 DEBUG oslo_concurrency.lockutils [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:10 np0005593234 nova_compute[227762]: 2026-01-23 09:55:10.238 227766 DEBUG nova.compute.manager [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] No waiting events found dispatching network-vif-unplugged-21920b88-3779-4c29-b3a9-7591691e880a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:10 np0005593234 nova_compute[227762]: 2026-01-23 09:55:10.238 227766 WARNING nova.compute.manager [req-ed038eee-ec34-4537-b7d9-eac39520705b req-f2e8d212-d91a-41a8-9d2c-3b81359ec441 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received unexpected event network-vif-unplugged-21920b88-3779-4c29-b3a9-7591691e880a for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 04:55:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:10 np0005593234 nova_compute[227762]: 2026-01-23 09:55:10.691 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:11 np0005593234 podman[264055]: 2026-01-23 09:55:11.776511488 +0000 UTC m=+0.049596480 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:55:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:11.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:11.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:13 np0005593234 nova_compute[227762]: 2026-01-23 09:55:13.465 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:13 np0005593234 nova_compute[227762]: 2026-01-23 09:55:13.634 227766 DEBUG nova.compute.manager [req-9615443f-75e1-449d-bc8b-f1ea6741a475 req-accc92da-5767-4bf9-a2b5-9e001779d576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:13 np0005593234 nova_compute[227762]: 2026-01-23 09:55:13.635 227766 DEBUG oslo_concurrency.lockutils [req-9615443f-75e1-449d-bc8b-f1ea6741a475 req-accc92da-5767-4bf9-a2b5-9e001779d576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:13 np0005593234 nova_compute[227762]: 2026-01-23 09:55:13.635 227766 DEBUG oslo_concurrency.lockutils [req-9615443f-75e1-449d-bc8b-f1ea6741a475 req-accc92da-5767-4bf9-a2b5-9e001779d576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:13 np0005593234 nova_compute[227762]: 2026-01-23 09:55:13.635 227766 DEBUG oslo_concurrency.lockutils [req-9615443f-75e1-449d-bc8b-f1ea6741a475 req-accc92da-5767-4bf9-a2b5-9e001779d576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:13 np0005593234 nova_compute[227762]: 2026-01-23 09:55:13.635 227766 DEBUG nova.compute.manager [req-9615443f-75e1-449d-bc8b-f1ea6741a475 req-accc92da-5767-4bf9-a2b5-9e001779d576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] No waiting events found dispatching network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:13 np0005593234 nova_compute[227762]: 2026-01-23 09:55:13.636 227766 WARNING nova.compute.manager [req-9615443f-75e1-449d-bc8b-f1ea6741a475 req-accc92da-5767-4bf9-a2b5-9e001779d576 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received unexpected event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 04:55:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:13.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:13.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e259 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:15 np0005593234 nova_compute[227762]: 2026-01-23 09:55:15.693 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:15.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:55:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:15.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:55:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e260 e260: 3 total, 3 up, 3 in
Jan 23 04:55:16 np0005593234 nova_compute[227762]: 2026-01-23 09:55:16.866 227766 DEBUG nova.compute.manager [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-changed-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:16 np0005593234 nova_compute[227762]: 2026-01-23 09:55:16.866 227766 DEBUG nova.compute.manager [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Refreshing instance network info cache due to event network-changed-21920b88-3779-4c29-b3a9-7591691e880a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:55:16 np0005593234 nova_compute[227762]: 2026-01-23 09:55:16.866 227766 DEBUG oslo_concurrency.lockutils [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:55:16 np0005593234 nova_compute[227762]: 2026-01-23 09:55:16.866 227766 DEBUG oslo_concurrency.lockutils [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:55:16 np0005593234 nova_compute[227762]: 2026-01-23 09:55:16.867 227766 DEBUG nova.network.neutron [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Refreshing network info cache for port 21920b88-3779-4c29-b3a9-7591691e880a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:55:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:17.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:55:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:17.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:55:18 np0005593234 nova_compute[227762]: 2026-01-23 09:55:18.466 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:19 np0005593234 nova_compute[227762]: 2026-01-23 09:55:19.829 227766 DEBUG nova.compute.manager [req-93ee128c-8b23-4a44-a476-1b41c53b497f req-55f4ec5a-f3c2-4bc9-a00f-554e2f9544a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:19 np0005593234 nova_compute[227762]: 2026-01-23 09:55:19.829 227766 DEBUG oslo_concurrency.lockutils [req-93ee128c-8b23-4a44-a476-1b41c53b497f req-55f4ec5a-f3c2-4bc9-a00f-554e2f9544a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:19 np0005593234 nova_compute[227762]: 2026-01-23 09:55:19.829 227766 DEBUG oslo_concurrency.lockutils [req-93ee128c-8b23-4a44-a476-1b41c53b497f req-55f4ec5a-f3c2-4bc9-a00f-554e2f9544a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:19 np0005593234 nova_compute[227762]: 2026-01-23 09:55:19.830 227766 DEBUG oslo_concurrency.lockutils [req-93ee128c-8b23-4a44-a476-1b41c53b497f req-55f4ec5a-f3c2-4bc9-a00f-554e2f9544a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:19 np0005593234 nova_compute[227762]: 2026-01-23 09:55:19.830 227766 DEBUG nova.compute.manager [req-93ee128c-8b23-4a44-a476-1b41c53b497f req-55f4ec5a-f3c2-4bc9-a00f-554e2f9544a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] No waiting events found dispatching network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:19 np0005593234 nova_compute[227762]: 2026-01-23 09:55:19.830 227766 WARNING nova.compute.manager [req-93ee128c-8b23-4a44-a476-1b41c53b497f req-55f4ec5a-f3c2-4bc9-a00f-554e2f9544a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received unexpected event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a for instance with vm_state resized and task_state None.#033[00m
Jan 23 04:55:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:19.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:19.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:20 np0005593234 nova_compute[227762]: 2026-01-23 09:55:20.694 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:20 np0005593234 nova_compute[227762]: 2026-01-23 09:55:20.770 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:21 np0005593234 nova_compute[227762]: 2026-01-23 09:55:21.799 227766 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:21 np0005593234 nova_compute[227762]: 2026-01-23 09:55:21.801 227766 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:21 np0005593234 nova_compute[227762]: 2026-01-23 09:55:21.801 227766 DEBUG nova.compute.manager [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Going to confirm migration 13 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Jan 23 04:55:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:21.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:21.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:22 np0005593234 nova_compute[227762]: 2026-01-23 09:55:22.003 227766 DEBUG nova.compute.manager [req-87bb8816-95a9-4f75-b8f2-7dfedfcc6e2c req-7a5b35fa-e596-4c0f-a3c7-0302de858171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:55:22 np0005593234 nova_compute[227762]: 2026-01-23 09:55:22.004 227766 DEBUG oslo_concurrency.lockutils [req-87bb8816-95a9-4f75-b8f2-7dfedfcc6e2c req-7a5b35fa-e596-4c0f-a3c7-0302de858171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:22 np0005593234 nova_compute[227762]: 2026-01-23 09:55:22.004 227766 DEBUG oslo_concurrency.lockutils [req-87bb8816-95a9-4f75-b8f2-7dfedfcc6e2c req-7a5b35fa-e596-4c0f-a3c7-0302de858171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:22 np0005593234 nova_compute[227762]: 2026-01-23 09:55:22.004 227766 DEBUG oslo_concurrency.lockutils [req-87bb8816-95a9-4f75-b8f2-7dfedfcc6e2c req-7a5b35fa-e596-4c0f-a3c7-0302de858171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:22 np0005593234 nova_compute[227762]: 2026-01-23 09:55:22.004 227766 DEBUG nova.compute.manager [req-87bb8816-95a9-4f75-b8f2-7dfedfcc6e2c req-7a5b35fa-e596-4c0f-a3c7-0302de858171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] No waiting events found dispatching network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:55:22 np0005593234 nova_compute[227762]: 2026-01-23 09:55:22.004 227766 WARNING nova.compute.manager [req-87bb8816-95a9-4f75-b8f2-7dfedfcc6e2c req-7a5b35fa-e596-4c0f-a3c7-0302de858171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Received unexpected event network-vif-plugged-21920b88-3779-4c29-b3a9-7591691e880a for instance with vm_state resized and task_state deleting.#033[00m
Jan 23 04:55:22 np0005593234 nova_compute[227762]: 2026-01-23 09:55:22.151 227766 DEBUG nova.network.neutron [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updated VIF entry in instance network info cache for port 21920b88-3779-4c29-b3a9-7591691e880a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:55:22 np0005593234 nova_compute[227762]: 2026-01-23 09:55:22.152 227766 DEBUG nova.network.neutron [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updating instance_info_cache with network_info: [{"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:22 np0005593234 nova_compute[227762]: 2026-01-23 09:55:22.177 227766 DEBUG oslo_concurrency.lockutils [req-6c4b8ad4-fefa-4e01-bd40-1cd2270460f0 req-dbd36627-8384-4950-9c74-0a3819815757 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:55:22 np0005593234 nova_compute[227762]: 2026-01-23 09:55:22.920 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162107.9194782, dc5e2bb3-0d73-4538-a181-9380a1d67934 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:55:22 np0005593234 nova_compute[227762]: 2026-01-23 09:55:22.921 227766 INFO nova.compute.manager [-] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:55:22 np0005593234 nova_compute[227762]: 2026-01-23 09:55:22.951 227766 DEBUG nova.compute.manager [None req-d0e54367-9930-440d-8d61-e7881cacaf6c - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:55:22 np0005593234 nova_compute[227762]: 2026-01-23 09:55:22.956 227766 DEBUG nova.compute.manager [None req-d0e54367-9930-440d-8d61-e7881cacaf6c - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: deleting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:55:22 np0005593234 nova_compute[227762]: 2026-01-23 09:55:22.985 227766 INFO nova.compute.manager [None req-d0e54367-9930-440d-8d61-e7881cacaf6c - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 23 04:55:23 np0005593234 nova_compute[227762]: 2026-01-23 09:55:23.467 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:23 np0005593234 nova_compute[227762]: 2026-01-23 09:55:23.714 227766 DEBUG neutronclient.v2_0.client [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 21920b88-3779-4c29-b3a9-7591691e880a for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 23 04:55:23 np0005593234 nova_compute[227762]: 2026-01-23 09:55:23.714 227766 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:55:23 np0005593234 nova_compute[227762]: 2026-01-23 09:55:23.714 227766 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquired lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:55:23 np0005593234 nova_compute[227762]: 2026-01-23 09:55:23.714 227766 DEBUG nova.network.neutron [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:55:23 np0005593234 nova_compute[227762]: 2026-01-23 09:55:23.715 227766 DEBUG nova.objects.instance [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'info_cache' on Instance uuid dc5e2bb3-0d73-4538-a181-9380a1d67934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:55:23 np0005593234 nova_compute[227762]: 2026-01-23 09:55:23.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:23.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:23.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 04:55:24 np0005593234 nova_compute[227762]: 2026-01-23 09:55:24.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:24 np0005593234 nova_compute[227762]: 2026-01-23 09:55:24.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:55:25 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:55:25 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:55:25 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:55:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:25.418 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=31, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=30) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:55:25 np0005593234 nova_compute[227762]: 2026-01-23 09:55:25.419 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:25.419 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:55:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:25 np0005593234 nova_compute[227762]: 2026-01-23 09:55:25.697 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:25.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:55:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:25.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:55:26 np0005593234 nova_compute[227762]: 2026-01-23 09:55:26.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:26 np0005593234 nova_compute[227762]: 2026-01-23 09:55:26.782 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:26 np0005593234 nova_compute[227762]: 2026-01-23 09:55:26.783 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:26 np0005593234 nova_compute[227762]: 2026-01-23 09:55:26.783 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:26 np0005593234 nova_compute[227762]: 2026-01-23 09:55:26.783 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:55:26 np0005593234 nova_compute[227762]: 2026-01-23 09:55:26.784 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:26 np0005593234 podman[264266]: 2026-01-23 09:55:26.791469084 +0000 UTC m=+0.085900223 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 23 04:55:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:55:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1893725373' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.241 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.320 227766 DEBUG nova.network.neutron [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updating instance_info_cache with network_info: [{"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.348 227766 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Releasing lock "refresh_cache-dc5e2bb3-0d73-4538-a181-9380a1d67934" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.348 227766 DEBUG nova.objects.instance [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lazy-loading 'migration_context' on Instance uuid dc5e2bb3-0d73-4538-a181-9380a1d67934 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.354 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000053 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.354 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000053 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.485 227766 DEBUG nova.storage.rbd_utils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] removing snapshot(nova-resize) on rbd image(dc5e2bb3-0d73-4538-a181-9380a1d67934_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.632 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.633 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4591MB free_disk=20.876117706298828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.633 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.633 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.686 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Migration for instance dc5e2bb3-0d73-4538-a181-9380a1d67934 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.712 227766 INFO nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Updating resource usage from migration e333aff9-bf2d-4dc5-ab0a-0044cabcbb7e#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.712 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: dc5e2bb3-0d73-4538-a181-9380a1d67934] Starting to track outgoing migration e333aff9-bf2d-4dc5-ab0a-0044cabcbb7e with flavor 68d42077-c749-4366-ba3e-07758debb02d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1444#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.750 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Migration e333aff9-bf2d-4dc5-ab0a-0044cabcbb7e is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.750 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.750 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:55:27 np0005593234 nova_compute[227762]: 2026-01-23 09:55:27.814 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:27.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:27.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:55:28 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/627071696' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.262 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.268 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.302 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.337 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.338 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.705s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.468 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e261 e261: 3 total, 3 up, 3 in
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.739 227766 DEBUG nova.virt.libvirt.vif [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:54:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-DeleteServersTestJSON-server-1189076101',display_name='tempest-DeleteServersTestJSON-server-1189076101',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-deleteserverstestjson-server-1189076101',id=83,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:55:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='61df91981c55482fa5c9a64686c79f9e',ramdisk_id='',reservation_id='r-t070zg10',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-DeleteServersTestJSON-944070453',owner_user_name='tempest-DeleteServersTestJSON-944070453-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:55:21Z,user_data=None,user_id='28a7a778c8ab486fb586e81bb84113be',uuid=dc5e2bb3-0d73-4538-a181-9380a1d67934,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.740 227766 DEBUG nova.network.os_vif_util [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converting VIF {"id": "21920b88-3779-4c29-b3a9-7591691e880a", "address": "fa:16:3e:46:2e:b4", "network": {"id": "a3788149-efcd-4940-8a8f-e21af0a56a06", "bridge": "br-int", "label": "tempest-DeleteServersTestJSON-570088325-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "61df91981c55482fa5c9a64686c79f9e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap21920b88-37", "ovs_interfaceid": "21920b88-3779-4c29-b3a9-7591691e880a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.741 227766 DEBUG nova.network.os_vif_util [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.741 227766 DEBUG os_vif [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.743 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.743 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap21920b88-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.744 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.745 227766 INFO os_vif [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:2e:b4,bridge_name='br-int',has_traffic_filtering=True,id=21920b88-3779-4c29-b3a9-7591691e880a,network=Network(a3788149-efcd-4940-8a8f-e21af0a56a06),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap21920b88-37')#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.746 227766 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.746 227766 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:28 np0005593234 nova_compute[227762]: 2026-01-23 09:55:28.829 227766 DEBUG oslo_concurrency.processutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:55:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:55:29 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3744703450' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:55:29 np0005593234 nova_compute[227762]: 2026-01-23 09:55:29.245 227766 DEBUG oslo_concurrency.processutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:55:29 np0005593234 nova_compute[227762]: 2026-01-23 09:55:29.251 227766 DEBUG nova.compute.provider_tree [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:55:29 np0005593234 nova_compute[227762]: 2026-01-23 09:55:29.270 227766 DEBUG nova.scheduler.client.report [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:55:29 np0005593234 nova_compute[227762]: 2026-01-23 09:55:29.348 227766 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:29 np0005593234 nova_compute[227762]: 2026-01-23 09:55:29.538 227766 INFO nova.scheduler.client.report [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Deleted allocation for migration e333aff9-bf2d-4dc5-ab0a-0044cabcbb7e#033[00m
Jan 23 04:55:29 np0005593234 nova_compute[227762]: 2026-01-23 09:55:29.638 227766 DEBUG oslo_concurrency.lockutils [None req-9a70cce8-e5d9-4ad0-a6e7-4afe24183830 28a7a778c8ab486fb586e81bb84113be 61df91981c55482fa5c9a64686c79f9e - - default default] Lock "dc5e2bb3-0d73-4538-a181-9380a1d67934" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 7.837s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:29.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:29.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:30 np0005593234 nova_compute[227762]: 2026-01-23 09:55:30.338 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:30 np0005593234 nova_compute[227762]: 2026-01-23 09:55:30.339 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:55:30 np0005593234 nova_compute[227762]: 2026-01-23 09:55:30.339 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:55:30 np0005593234 nova_compute[227762]: 2026-01-23 09:55:30.441 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:55:30 np0005593234 nova_compute[227762]: 2026-01-23 09:55:30.441 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:30 np0005593234 nova_compute[227762]: 2026-01-23 09:55:30.699 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:30 np0005593234 nova_compute[227762]: 2026-01-23 09:55:30.841 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:31.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:31.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:32 np0005593234 nova_compute[227762]: 2026-01-23 09:55:32.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:32 np0005593234 nova_compute[227762]: 2026-01-23 09:55:32.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:55:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:55:33 np0005593234 nova_compute[227762]: 2026-01-23 09:55:33.469 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:55:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:33.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:55:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:33.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:34.421 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '31'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:55:34 np0005593234 nova_compute[227762]: 2026-01-23 09:55:34.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:55:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e261 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:35 np0005593234 nova_compute[227762]: 2026-01-23 09:55:35.701 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:35.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:35.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 e262: 3 total, 3 up, 3 in
Jan 23 04:55:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:37.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:55:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:37.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:55:38 np0005593234 nova_compute[227762]: 2026-01-23 09:55:38.471 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:55:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3497638853' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:55:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:55:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3497638853' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:55:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:39.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:39.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:40 np0005593234 nova_compute[227762]: 2026-01-23 09:55:40.292 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:40 np0005593234 nova_compute[227762]: 2026-01-23 09:55:40.703 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:41.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:41.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:42 np0005593234 podman[264503]: 2026-01-23 09:55:42.75242786 +0000 UTC m=+0.051470019 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 04:55:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:42.831 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:55:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:42.832 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:55:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:55:42.832 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:55:43 np0005593234 nova_compute[227762]: 2026-01-23 09:55:43.472 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:43.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:43.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:55:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3916954589' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:55:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:55:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3916954589' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:55:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:45 np0005593234 nova_compute[227762]: 2026-01-23 09:55:45.705 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:45.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:45.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:47.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:47.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:48 np0005593234 nova_compute[227762]: 2026-01-23 09:55:48.474 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:55:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:49.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:55:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:49.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:50 np0005593234 nova_compute[227762]: 2026-01-23 09:55:50.710 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:51.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:51.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:53 np0005593234 nova_compute[227762]: 2026-01-23 09:55:53.475 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:53.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:53.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:55:55 np0005593234 nova_compute[227762]: 2026-01-23 09:55:55.712 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:55.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:55.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:57 np0005593234 podman[264529]: 2026-01-23 09:55:57.777341387 +0000 UTC m=+0.075696375 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:55:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:57.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:55:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:57.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:55:58 np0005593234 nova_compute[227762]: 2026-01-23 09:55:58.478 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:55:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:59.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:55:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:55:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:55:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:59.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:00 np0005593234 nova_compute[227762]: 2026-01-23 09:56:00.715 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:01.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:01.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:03 np0005593234 nova_compute[227762]: 2026-01-23 09:56:03.480 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:03.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:03.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:05 np0005593234 nova_compute[227762]: 2026-01-23 09:56:05.716 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:05.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:05.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:07.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:07.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:08 np0005593234 nova_compute[227762]: 2026-01-23 09:56:08.481 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:56:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:09.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:56:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:10.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:10 np0005593234 nova_compute[227762]: 2026-01-23 09:56:10.720 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:11.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:12.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:12 np0005593234 nova_compute[227762]: 2026-01-23 09:56:12.079 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:56:12.079 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=32, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=31) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:56:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:56:12.080 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:56:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:56:12.082 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '32'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:56:13 np0005593234 nova_compute[227762]: 2026-01-23 09:56:13.482 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:13 np0005593234 podman[264614]: 2026-01-23 09:56:13.748435708 +0000 UTC m=+0.046915276 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 04:56:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:13.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:14.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:15 np0005593234 nova_compute[227762]: 2026-01-23 09:56:15.721 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:15.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:16.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:17.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:18.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:18 np0005593234 nova_compute[227762]: 2026-01-23 09:56:18.484 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:56:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/602875715' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:56:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:56:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/602875715' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:56:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:19.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:20.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:20 np0005593234 nova_compute[227762]: 2026-01-23 09:56:20.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:20 np0005593234 nova_compute[227762]: 2026-01-23 09:56:20.754 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:56:21 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2175395627' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:56:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:56:21 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2175395627' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:56:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:21.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:22.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:23 np0005593234 nova_compute[227762]: 2026-01-23 09:56:23.485 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:23.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:24.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:24 np0005593234 nova_compute[227762]: 2026-01-23 09:56:24.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:25 np0005593234 nova_compute[227762]: 2026-01-23 09:56:25.757 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:25.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:26.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:26 np0005593234 nova_compute[227762]: 2026-01-23 09:56:26.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:26 np0005593234 nova_compute[227762]: 2026-01-23 09:56:26.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:56:26 np0005593234 nova_compute[227762]: 2026-01-23 09:56:26.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:26 np0005593234 nova_compute[227762]: 2026-01-23 09:56:26.780 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:56:26 np0005593234 nova_compute[227762]: 2026-01-23 09:56:26.780 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:56:26 np0005593234 nova_compute[227762]: 2026-01-23 09:56:26.780 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:26 np0005593234 nova_compute[227762]: 2026-01-23 09:56:26.781 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:56:26 np0005593234 nova_compute[227762]: 2026-01-23 09:56:26.781 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:56:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:56:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2481238359' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:56:27 np0005593234 nova_compute[227762]: 2026-01-23 09:56:27.236 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:56:27 np0005593234 nova_compute[227762]: 2026-01-23 09:56:27.410 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:56:27 np0005593234 nova_compute[227762]: 2026-01-23 09:56:27.411 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4646MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:56:27 np0005593234 nova_compute[227762]: 2026-01-23 09:56:27.411 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:56:27 np0005593234 nova_compute[227762]: 2026-01-23 09:56:27.411 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:56:27 np0005593234 nova_compute[227762]: 2026-01-23 09:56:27.781 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:56:27 np0005593234 nova_compute[227762]: 2026-01-23 09:56:27.781 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:56:27 np0005593234 nova_compute[227762]: 2026-01-23 09:56:27.804 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:56:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:27.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:56:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:28.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:56:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:56:28 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2362752819' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:56:28 np0005593234 nova_compute[227762]: 2026-01-23 09:56:28.261 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:56:28 np0005593234 nova_compute[227762]: 2026-01-23 09:56:28.266 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:56:28 np0005593234 nova_compute[227762]: 2026-01-23 09:56:28.486 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #49. Immutable memtables: 6.
Jan 23 04:56:28 np0005593234 nova_compute[227762]: 2026-01-23 09:56:28.770 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:56:28 np0005593234 podman[264737]: 2026-01-23 09:56:28.77508264 +0000 UTC m=+0.075533260 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:56:29 np0005593234 nova_compute[227762]: 2026-01-23 09:56:29.200 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:56:29 np0005593234 nova_compute[227762]: 2026-01-23 09:56:29.200 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.789s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #88. Immutable memtables: 0.
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:29.905950) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 53] Flushing memtable with next log file: 88
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162189906087, "job": 53, "event": "flush_started", "num_memtables": 1, "num_entries": 2411, "num_deletes": 254, "total_data_size": 5659196, "memory_usage": 5755128, "flush_reason": "Manual Compaction"}
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 53] Level-0 flush table #89: started
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162189933632, "cf_name": "default", "job": 53, "event": "table_file_creation", "file_number": 89, "file_size": 3712948, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 44225, "largest_seqno": 46631, "table_properties": {"data_size": 3703079, "index_size": 6235, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 20882, "raw_average_key_size": 20, "raw_value_size": 3683228, "raw_average_value_size": 3653, "num_data_blocks": 271, "num_entries": 1008, "num_filter_entries": 1008, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161984, "oldest_key_time": 1769161984, "file_creation_time": 1769162189, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 89, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 53] Flush lasted 27710 microseconds, and 7949 cpu microseconds.
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:56:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:29.933698) [db/flush_job.cc:967] [default] [JOB 53] Level-0 flush table #89: 3712948 bytes OK
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:29.933717) [db/memtable_list.cc:519] [default] Level-0 commit table #89 started
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:29.936979) [db/memtable_list.cc:722] [default] Level-0 commit table #89: memtable #1 done
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:29.936995) EVENT_LOG_v1 {"time_micros": 1769162189936990, "job": 53, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:29.937040) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 53] Try to delete WAL files size 5648425, prev total WAL file size 5648425, number of live WAL files 2.
Jan 23 04:56:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:29.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000085.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:29.938376) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033373635' seq:72057594037927935, type:22 .. '7061786F730034303137' seq:0, type:0; will stop at (end)
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 54] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 53 Base level 0, inputs: [89(3625KB)], [87(9511KB)]
Jan 23 04:56:29 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162189938484, "job": 54, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [89], "files_L6": [87], "score": -1, "input_data_size": 13452220, "oldest_snapshot_seqno": -1}
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 54] Generated table #90: 7030 keys, 11522268 bytes, temperature: kUnknown
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162190009852, "cf_name": "default", "job": 54, "event": "table_file_creation", "file_number": 90, "file_size": 11522268, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11474196, "index_size": 29400, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17605, "raw_key_size": 180818, "raw_average_key_size": 25, "raw_value_size": 11347245, "raw_average_value_size": 1614, "num_data_blocks": 1169, "num_entries": 7030, "num_filter_entries": 7030, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769162189, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 90, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:30.010090) [db/compaction/compaction_job.cc:1663] [default] [JOB 54] Compacted 1@0 + 1@6 files to L6 => 11522268 bytes
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:30.011400) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 188.3 rd, 161.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 9.3 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(6.7) write-amplify(3.1) OK, records in: 7555, records dropped: 525 output_compression: NoCompression
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:30.011419) EVENT_LOG_v1 {"time_micros": 1769162190011411, "job": 54, "event": "compaction_finished", "compaction_time_micros": 71439, "compaction_time_cpu_micros": 24852, "output_level": 6, "num_output_files": 1, "total_output_size": 11522268, "num_input_records": 7555, "num_output_records": 7030, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000089.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162190012224, "job": 54, "event": "table_file_deletion", "file_number": 89}
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000087.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162190013831, "job": 54, "event": "table_file_deletion", "file_number": 87}
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:29.938313) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:30.013885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:30.013889) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:30.013891) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:30.013892) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:56:30.013894) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:56:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:30.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:30 np0005593234 nova_compute[227762]: 2026-01-23 09:56:30.759 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:31.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:32.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:32 np0005593234 nova_compute[227762]: 2026-01-23 09:56:32.200 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:32 np0005593234 nova_compute[227762]: 2026-01-23 09:56:32.201 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:56:32 np0005593234 nova_compute[227762]: 2026-01-23 09:56:32.201 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:56:32 np0005593234 nova_compute[227762]: 2026-01-23 09:56:32.230 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:56:32 np0005593234 nova_compute[227762]: 2026-01-23 09:56:32.231 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:32 np0005593234 nova_compute[227762]: 2026-01-23 09:56:32.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:33 np0005593234 nova_compute[227762]: 2026-01-23 09:56:33.489 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:33.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:34.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:56:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:56:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:56:34 np0005593234 nova_compute[227762]: 2026-01-23 09:56:34.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:35 np0005593234 nova_compute[227762]: 2026-01-23 09:56:35.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:56:35 np0005593234 nova_compute[227762]: 2026-01-23 09:56:35.761 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:35.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:36.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:37.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:38.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:38 np0005593234 nova_compute[227762]: 2026-01-23 09:56:38.490 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:39.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:40.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:40 np0005593234 nova_compute[227762]: 2026-01-23 09:56:40.762 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:56:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:56:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:41.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:42.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:42 np0005593234 ovn_controller[134547]: 2026-01-23T09:56:42Z|00283|memory_trim|INFO|Detected inactivity (last active 30000 ms ago): trimming memory
Jan 23 04:56:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:56:42.833 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:56:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:56:42.833 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:56:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:56:42.833 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:56:43 np0005593234 nova_compute[227762]: 2026-01-23 09:56:43.493 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:43.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:56:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:44.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:56:44 np0005593234 podman[265006]: 2026-01-23 09:56:44.753459688 +0000 UTC m=+0.047385381 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 23 04:56:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:45 np0005593234 nova_compute[227762]: 2026-01-23 09:56:45.766 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:45.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:46.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:47.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:56:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:48.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:56:48 np0005593234 nova_compute[227762]: 2026-01-23 09:56:48.495 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:49.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:50.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:50 np0005593234 nova_compute[227762]: 2026-01-23 09:56:50.768 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:51.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:52.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:53 np0005593234 nova_compute[227762]: 2026-01-23 09:56:53.496 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:53.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:56:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:54.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:56:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:56:55 np0005593234 nova_compute[227762]: 2026-01-23 09:56:55.770 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:55.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:56.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:57.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:58.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:56:58 np0005593234 nova_compute[227762]: 2026-01-23 09:56:58.497 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:56:59 np0005593234 podman[265033]: 2026-01-23 09:56:59.782688682 +0000 UTC m=+0.074232770 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 23 04:56:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:56:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:56:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:59.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:00.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:00 np0005593234 nova_compute[227762]: 2026-01-23 09:57:00.772 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:01.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:02.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:03 np0005593234 nova_compute[227762]: 2026-01-23 09:57:03.498 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:03.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:04.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:05 np0005593234 nova_compute[227762]: 2026-01-23 09:57:05.774 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:05.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:57:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:06.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:57:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:07.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:57:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:08.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:57:08 np0005593234 nova_compute[227762]: 2026-01-23 09:57:08.499 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:08.803 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=33, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=32) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:57:08 np0005593234 nova_compute[227762]: 2026-01-23 09:57:08.803 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:08.804 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:57:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:57:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:09.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:57:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:10.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:10 np0005593234 nova_compute[227762]: 2026-01-23 09:57:10.775 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:11.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:12.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:13 np0005593234 nova_compute[227762]: 2026-01-23 09:57:13.501 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:13.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:57:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:14.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:57:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:15 np0005593234 podman[265118]: 2026-01-23 09:57:15.743320137 +0000 UTC m=+0.043470189 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 04:57:15 np0005593234 nova_compute[227762]: 2026-01-23 09:57:15.778 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:15.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:16.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:17.805 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '33'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:57:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:17.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:18.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:18 np0005593234 nova_compute[227762]: 2026-01-23 09:57:18.503 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:19.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:20.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:20 np0005593234 nova_compute[227762]: 2026-01-23 09:57:20.780 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:21 np0005593234 nova_compute[227762]: 2026-01-23 09:57:21.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:57:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:21.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:57:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:22.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:57:23 np0005593234 nova_compute[227762]: 2026-01-23 09:57:23.505 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:23.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:24.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:25 np0005593234 nova_compute[227762]: 2026-01-23 09:57:25.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:57:25 np0005593234 nova_compute[227762]: 2026-01-23 09:57:25.781 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:25.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:57:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:26.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:57:26 np0005593234 nova_compute[227762]: 2026-01-23 09:57:26.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:57:26 np0005593234 nova_compute[227762]: 2026-01-23 09:57:26.793 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:26 np0005593234 nova_compute[227762]: 2026-01-23 09:57:26.793 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:26 np0005593234 nova_compute[227762]: 2026-01-23 09:57:26.793 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:26 np0005593234 nova_compute[227762]: 2026-01-23 09:57:26.793 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:57:26 np0005593234 nova_compute[227762]: 2026-01-23 09:57:26.794 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:57:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:57:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3178299964' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:57:27 np0005593234 nova_compute[227762]: 2026-01-23 09:57:27.405 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.611s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:57:27 np0005593234 nova_compute[227762]: 2026-01-23 09:57:27.562 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:57:27 np0005593234 nova_compute[227762]: 2026-01-23 09:57:27.563 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4623MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:57:27 np0005593234 nova_compute[227762]: 2026-01-23 09:57:27.563 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:27 np0005593234 nova_compute[227762]: 2026-01-23 09:57:27.564 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:27 np0005593234 nova_compute[227762]: 2026-01-23 09:57:27.669 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:57:27 np0005593234 nova_compute[227762]: 2026-01-23 09:57:27.670 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:57:27 np0005593234 nova_compute[227762]: 2026-01-23 09:57:27.704 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:57:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:27.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:57:28 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/397753885' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:57:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:57:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:28.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:57:28 np0005593234 nova_compute[227762]: 2026-01-23 09:57:28.141 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:57:28 np0005593234 nova_compute[227762]: 2026-01-23 09:57:28.147 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:57:28 np0005593234 nova_compute[227762]: 2026-01-23 09:57:28.175 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:57:28 np0005593234 nova_compute[227762]: 2026-01-23 09:57:28.177 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 04:57:28 np0005593234 nova_compute[227762]: 2026-01-23 09:57:28.178 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.614s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:57:28 np0005593234 nova_compute[227762]: 2026-01-23 09:57:28.507 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:57:29 np0005593234 nova_compute[227762]: 2026-01-23 09:57:29.178 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:57:29 np0005593234 nova_compute[227762]: 2026-01-23 09:57:29.179 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 04:57:29 np0005593234 nova_compute[227762]: 2026-01-23 09:57:29.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:57:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:29.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:30.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:30 np0005593234 nova_compute[227762]: 2026-01-23 09:57:30.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:57:30 np0005593234 nova_compute[227762]: 2026-01-23 09:57:30.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 04:57:30 np0005593234 nova_compute[227762]: 2026-01-23 09:57:30.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 04:57:30 np0005593234 nova_compute[227762]: 2026-01-23 09:57:30.781 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 04:57:30 np0005593234 nova_compute[227762]: 2026-01-23 09:57:30.782 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:57:30 np0005593234 podman[265240]: 2026-01-23 09:57:30.809507003 +0000 UTC m=+0.107682614 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:57:31 np0005593234 nova_compute[227762]: 2026-01-23 09:57:31.775 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:57:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:31.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:32 np0005593234 nova_compute[227762]: 2026-01-23 09:57:32.028 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Acquiring lock "1e7fbf43-a3d7-4017-84bb-9787aa383363" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:57:32 np0005593234 nova_compute[227762]: 2026-01-23 09:57:32.028 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:57:32 np0005593234 nova_compute[227762]: 2026-01-23 09:57:32.065 227766 DEBUG nova.compute.manager [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 04:57:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:57:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:32.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:57:32 np0005593234 nova_compute[227762]: 2026-01-23 09:57:32.263 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:57:32 np0005593234 nova_compute[227762]: 2026-01-23 09:57:32.263 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:57:32 np0005593234 nova_compute[227762]: 2026-01-23 09:57:32.274 227766 DEBUG nova.virt.hardware [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 04:57:32 np0005593234 nova_compute[227762]: 2026-01-23 09:57:32.275 227766 INFO nova.compute.claims [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Claim successful on node compute-2.ctlplane.example.com
Jan 23 04:57:32 np0005593234 nova_compute[227762]: 2026-01-23 09:57:32.559 227766 DEBUG oslo_concurrency.processutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:57:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:57:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4006480262' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.006 227766 DEBUG oslo_concurrency.processutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.011 227766 DEBUG nova.compute.provider_tree [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.043 227766 DEBUG nova.scheduler.client.report [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.093 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.830s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.094 227766 DEBUG nova.compute.manager [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.237 227766 DEBUG nova.compute.manager [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.238 227766 DEBUG nova.network.neutron [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.284 227766 INFO nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.363 227766 DEBUG nova.compute.manager [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.510 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.632 227766 DEBUG nova.compute.manager [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.633 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.633 227766 INFO nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Creating image(s)
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.661 227766 DEBUG nova.storage.rbd_utils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] rbd image 1e7fbf43-a3d7-4017-84bb-9787aa383363_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.685 227766 DEBUG nova.storage.rbd_utils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] rbd image 1e7fbf43-a3d7-4017-84bb-9787aa383363_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.715 227766 DEBUG nova.storage.rbd_utils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] rbd image 1e7fbf43-a3d7-4017-84bb-9787aa383363_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.719 227766 DEBUG oslo_concurrency.processutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.745 227766 DEBUG nova.policy [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fb7d106814e948feb72555b92cb0bce7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '49fe499c3ed341249456b8cc11ae8483', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.749 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.785 227766 DEBUG oslo_concurrency.processutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.786 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.787 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.787 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.815 227766 DEBUG nova.storage.rbd_utils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] rbd image 1e7fbf43-a3d7-4017-84bb-9787aa383363_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:57:33 np0005593234 nova_compute[227762]: 2026-01-23 09:57:33.819 227766 DEBUG oslo_concurrency.processutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 1e7fbf43-a3d7-4017-84bb-9787aa383363_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:57:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:33.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:34.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:34 np0005593234 nova_compute[227762]: 2026-01-23 09:57:34.260 227766 DEBUG oslo_concurrency.processutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 1e7fbf43-a3d7-4017-84bb-9787aa383363_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:57:34 np0005593234 nova_compute[227762]: 2026-01-23 09:57:34.329 227766 DEBUG nova.storage.rbd_utils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] resizing rbd image 1e7fbf43-a3d7-4017-84bb-9787aa383363_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 04:57:34 np0005593234 nova_compute[227762]: 2026-01-23 09:57:34.707 227766 DEBUG nova.objects.instance [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lazy-loading 'migration_context' on Instance uuid 1e7fbf43-a3d7-4017-84bb-9787aa383363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:57:34 np0005593234 nova_compute[227762]: 2026-01-23 09:57:34.735 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 04:57:34 np0005593234 nova_compute[227762]: 2026-01-23 09:57:34.735 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Ensure instance console log exists: /var/lib/nova/instances/1e7fbf43-a3d7-4017-84bb-9787aa383363/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 04:57:34 np0005593234 nova_compute[227762]: 2026-01-23 09:57:34.736 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:57:34 np0005593234 nova_compute[227762]: 2026-01-23 09:57:34.736 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:57:34 np0005593234 nova_compute[227762]: 2026-01-23 09:57:34.736 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:57:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:35 np0005593234 nova_compute[227762]: 2026-01-23 09:57:35.678 227766 DEBUG nova.network.neutron [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Successfully created port: 086c8696-1f80-478d-a2ee-27821d7e0b7a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 04:57:35 np0005593234 nova_compute[227762]: 2026-01-23 09:57:35.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:57:35 np0005593234 nova_compute[227762]: 2026-01-23 09:57:35.784 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:57:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:35.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:36.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:36 np0005593234 nova_compute[227762]: 2026-01-23 09:57:36.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:57:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:38.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:38 np0005593234 nova_compute[227762]: 2026-01-23 09:57:38.044 227766 DEBUG nova.network.neutron [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Successfully updated port: 086c8696-1f80-478d-a2ee-27821d7e0b7a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:57:38 np0005593234 nova_compute[227762]: 2026-01-23 09:57:38.071 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Acquiring lock "refresh_cache-1e7fbf43-a3d7-4017-84bb-9787aa383363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:57:38 np0005593234 nova_compute[227762]: 2026-01-23 09:57:38.071 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Acquired lock "refresh_cache-1e7fbf43-a3d7-4017-84bb-9787aa383363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:57:38 np0005593234 nova_compute[227762]: 2026-01-23 09:57:38.071 227766 DEBUG nova.network.neutron [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:57:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:38.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:38 np0005593234 nova_compute[227762]: 2026-01-23 09:57:38.306 227766 DEBUG nova.compute.manager [req-c1482a62-47f5-4d3c-a147-1c12fb15b943 req-68d45dbc-cb3a-40f1-a462-7c3bd62b485b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Received event network-changed-086c8696-1f80-478d-a2ee-27821d7e0b7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:57:38 np0005593234 nova_compute[227762]: 2026-01-23 09:57:38.306 227766 DEBUG nova.compute.manager [req-c1482a62-47f5-4d3c-a147-1c12fb15b943 req-68d45dbc-cb3a-40f1-a462-7c3bd62b485b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Refreshing instance network info cache due to event network-changed-086c8696-1f80-478d-a2ee-27821d7e0b7a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:57:38 np0005593234 nova_compute[227762]: 2026-01-23 09:57:38.306 227766 DEBUG oslo_concurrency.lockutils [req-c1482a62-47f5-4d3c-a147-1c12fb15b943 req-68d45dbc-cb3a-40f1-a462-7c3bd62b485b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-1e7fbf43-a3d7-4017-84bb-9787aa383363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:57:38 np0005593234 nova_compute[227762]: 2026-01-23 09:57:38.512 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:38 np0005593234 nova_compute[227762]: 2026-01-23 09:57:38.614 227766 DEBUG nova.network.neutron [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:57:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:57:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:40.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:57:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:57:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:40.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:57:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:40 np0005593234 nova_compute[227762]: 2026-01-23 09:57:40.785 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.757 227766 DEBUG nova.network.neutron [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Updating instance_info_cache with network_info: [{"id": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "address": "fa:16:3e:5e:42:da", "network": {"id": "28b2d467-fd65-4cbe-ad42-32f0c6d389dc", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1979623964-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49fe499c3ed341249456b8cc11ae8483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086c8696-1f", "ovs_interfaceid": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.785 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Releasing lock "refresh_cache-1e7fbf43-a3d7-4017-84bb-9787aa383363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.785 227766 DEBUG nova.compute.manager [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Instance network_info: |[{"id": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "address": "fa:16:3e:5e:42:da", "network": {"id": "28b2d467-fd65-4cbe-ad42-32f0c6d389dc", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1979623964-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49fe499c3ed341249456b8cc11ae8483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086c8696-1f", "ovs_interfaceid": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.786 227766 DEBUG oslo_concurrency.lockutils [req-c1482a62-47f5-4d3c-a147-1c12fb15b943 req-68d45dbc-cb3a-40f1-a462-7c3bd62b485b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-1e7fbf43-a3d7-4017-84bb-9787aa383363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.786 227766 DEBUG nova.network.neutron [req-c1482a62-47f5-4d3c-a147-1c12fb15b943 req-68d45dbc-cb3a-40f1-a462-7c3bd62b485b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Refreshing network info cache for port 086c8696-1f80-478d-a2ee-27821d7e0b7a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.788 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Start _get_guest_xml network_info=[{"id": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "address": "fa:16:3e:5e:42:da", "network": {"id": "28b2d467-fd65-4cbe-ad42-32f0c6d389dc", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1979623964-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49fe499c3ed341249456b8cc11ae8483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086c8696-1f", "ovs_interfaceid": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.793 227766 WARNING nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.797 227766 DEBUG nova.virt.libvirt.host [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.798 227766 DEBUG nova.virt.libvirt.host [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.803 227766 DEBUG nova.virt.libvirt.host [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.803 227766 DEBUG nova.virt.libvirt.host [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.804 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.805 227766 DEBUG nova.virt.hardware [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.805 227766 DEBUG nova.virt.hardware [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.805 227766 DEBUG nova.virt.hardware [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.805 227766 DEBUG nova.virt.hardware [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.806 227766 DEBUG nova.virt.hardware [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.806 227766 DEBUG nova.virt.hardware [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.806 227766 DEBUG nova.virt.hardware [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.806 227766 DEBUG nova.virt.hardware [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.806 227766 DEBUG nova.virt.hardware [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.807 227766 DEBUG nova.virt.hardware [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.807 227766 DEBUG nova.virt.hardware [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:57:41 np0005593234 nova_compute[227762]: 2026-01-23 09:57:41.810 227766 DEBUG oslo_concurrency.processutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:57:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:42.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:57:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:42.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:57:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:57:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2737268635' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.263 227766 DEBUG oslo_concurrency.processutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.283 227766 DEBUG nova.storage.rbd_utils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] rbd image 1e7fbf43-a3d7-4017-84bb-9787aa383363_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.286 227766 DEBUG oslo_concurrency.processutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:57:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:57:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3519765521' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.716 227766 DEBUG oslo_concurrency.processutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.719 227766 DEBUG nova.virt.libvirt.vif [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:57:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1185215746',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1185215746',id=87,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='49fe499c3ed341249456b8cc11ae8483',ramdisk_id='',reservation_id='r-c7sop938',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-307060045',owner_user_name='tempest-InstanceActionsV221TestJSON-307060045-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:57:33Z,user_data=None,user_id='fb7d106814e948feb72555b92cb0bce7',uuid=1e7fbf43-a3d7-4017-84bb-9787aa383363,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "address": "fa:16:3e:5e:42:da", "network": {"id": "28b2d467-fd65-4cbe-ad42-32f0c6d389dc", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1979623964-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49fe499c3ed341249456b8cc11ae8483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086c8696-1f", "ovs_interfaceid": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.720 227766 DEBUG nova.network.os_vif_util [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Converting VIF {"id": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "address": "fa:16:3e:5e:42:da", "network": {"id": "28b2d467-fd65-4cbe-ad42-32f0c6d389dc", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1979623964-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49fe499c3ed341249456b8cc11ae8483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086c8696-1f", "ovs_interfaceid": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.721 227766 DEBUG nova.network.os_vif_util [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:42:da,bridge_name='br-int',has_traffic_filtering=True,id=086c8696-1f80-478d-a2ee-27821d7e0b7a,network=Network(28b2d467-fd65-4cbe-ad42-32f0c6d389dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086c8696-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.722 227766 DEBUG nova.objects.instance [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1e7fbf43-a3d7-4017-84bb-9787aa383363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.748 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  <uuid>1e7fbf43-a3d7-4017-84bb-9787aa383363</uuid>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  <name>instance-00000057</name>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <nova:name>tempest-InstanceActionsV221TestJSON-server-1185215746</nova:name>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:57:41</nova:creationTime>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <nova:user uuid="fb7d106814e948feb72555b92cb0bce7">tempest-InstanceActionsV221TestJSON-307060045-project-member</nova:user>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <nova:project uuid="49fe499c3ed341249456b8cc11ae8483">tempest-InstanceActionsV221TestJSON-307060045</nova:project>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <nova:port uuid="086c8696-1f80-478d-a2ee-27821d7e0b7a">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <entry name="serial">1e7fbf43-a3d7-4017-84bb-9787aa383363</entry>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <entry name="uuid">1e7fbf43-a3d7-4017-84bb-9787aa383363</entry>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1e7fbf43-a3d7-4017-84bb-9787aa383363_disk">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1e7fbf43-a3d7-4017-84bb-9787aa383363_disk.config">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:5e:42:da"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <target dev="tap086c8696-1f"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/1e7fbf43-a3d7-4017-84bb-9787aa383363/console.log" append="off"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:57:42 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:57:42 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:57:42 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:57:42 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.750 227766 DEBUG nova.compute.manager [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Preparing to wait for external event network-vif-plugged-086c8696-1f80-478d-a2ee-27821d7e0b7a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.750 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Acquiring lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.750 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.751 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.751 227766 DEBUG nova.virt.libvirt.vif [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:57:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1185215746',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1185215746',id=87,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='49fe499c3ed341249456b8cc11ae8483',ramdisk_id='',reservation_id='r-c7sop938',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-InstanceActionsV221TestJSON-307060045',owner_user_name='tempest-InstanceActionsV221TestJSON-307060045
-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:57:33Z,user_data=None,user_id='fb7d106814e948feb72555b92cb0bce7',uuid=1e7fbf43-a3d7-4017-84bb-9787aa383363,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "address": "fa:16:3e:5e:42:da", "network": {"id": "28b2d467-fd65-4cbe-ad42-32f0c6d389dc", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1979623964-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49fe499c3ed341249456b8cc11ae8483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086c8696-1f", "ovs_interfaceid": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.752 227766 DEBUG nova.network.os_vif_util [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Converting VIF {"id": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "address": "fa:16:3e:5e:42:da", "network": {"id": "28b2d467-fd65-4cbe-ad42-32f0c6d389dc", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1979623964-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49fe499c3ed341249456b8cc11ae8483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086c8696-1f", "ovs_interfaceid": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.752 227766 DEBUG nova.network.os_vif_util [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:42:da,bridge_name='br-int',has_traffic_filtering=True,id=086c8696-1f80-478d-a2ee-27821d7e0b7a,network=Network(28b2d467-fd65-4cbe-ad42-32f0c6d389dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086c8696-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.752 227766 DEBUG os_vif [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:42:da,bridge_name='br-int',has_traffic_filtering=True,id=086c8696-1f80-478d-a2ee-27821d7e0b7a,network=Network(28b2d467-fd65-4cbe-ad42-32f0c6d389dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086c8696-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.753 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.753 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.754 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.757 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.757 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap086c8696-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.757 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap086c8696-1f, col_values=(('external_ids', {'iface-id': '086c8696-1f80-478d-a2ee-27821d7e0b7a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5e:42:da', 'vm-uuid': '1e7fbf43-a3d7-4017-84bb-9787aa383363'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.759 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:42 np0005593234 NetworkManager[48942]: <info>  [1769162262.7598] manager: (tap086c8696-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/144)
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.761 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.765 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.766 227766 INFO os_vif [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:42:da,bridge_name='br-int',has_traffic_filtering=True,id=086c8696-1f80-478d-a2ee-27821d7e0b7a,network=Network(28b2d467-fd65-4cbe-ad42-32f0c6d389dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086c8696-1f')#033[00m
Jan 23 04:57:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:42.833 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:42.834 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:42.834 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.915 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.915 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.916 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] No VIF found with MAC fa:16:3e:5e:42:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.916 227766 INFO nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Using config drive#033[00m
Jan 23 04:57:42 np0005593234 nova_compute[227762]: 2026-01-23 09:57:42.939 227766 DEBUG nova.storage.rbd_utils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] rbd image 1e7fbf43-a3d7-4017-84bb-9787aa383363_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:57:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:57:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:57:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:57:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:57:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:57:43 np0005593234 nova_compute[227762]: 2026-01-23 09:57:43.729 227766 INFO nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Creating config drive at /var/lib/nova/instances/1e7fbf43-a3d7-4017-84bb-9787aa383363/disk.config#033[00m
Jan 23 04:57:43 np0005593234 nova_compute[227762]: 2026-01-23 09:57:43.734 227766 DEBUG oslo_concurrency.processutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1e7fbf43-a3d7-4017-84bb-9787aa383363/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvo9kf4ld execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:57:43 np0005593234 nova_compute[227762]: 2026-01-23 09:57:43.863 227766 DEBUG oslo_concurrency.processutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1e7fbf43-a3d7-4017-84bb-9787aa383363/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvo9kf4ld" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:57:43 np0005593234 nova_compute[227762]: 2026-01-23 09:57:43.890 227766 DEBUG nova.storage.rbd_utils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] rbd image 1e7fbf43-a3d7-4017-84bb-9787aa383363_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:57:43 np0005593234 nova_compute[227762]: 2026-01-23 09:57:43.894 227766 DEBUG oslo_concurrency.processutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1e7fbf43-a3d7-4017-84bb-9787aa383363/disk.config 1e7fbf43-a3d7-4017-84bb-9787aa383363_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:57:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:57:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:44.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.075 227766 DEBUG oslo_concurrency.processutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1e7fbf43-a3d7-4017-84bb-9787aa383363/disk.config 1e7fbf43-a3d7-4017-84bb-9787aa383363_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.181s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.076 227766 INFO nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Deleting local config drive /var/lib/nova/instances/1e7fbf43-a3d7-4017-84bb-9787aa383363/disk.config because it was imported into RBD.#033[00m
Jan 23 04:57:44 np0005593234 kernel: tap086c8696-1f: entered promiscuous mode
Jan 23 04:57:44 np0005593234 NetworkManager[48942]: <info>  [1769162264.1259] manager: (tap086c8696-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/145)
Jan 23 04:57:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:57:44Z|00284|binding|INFO|Claiming lport 086c8696-1f80-478d-a2ee-27821d7e0b7a for this chassis.
Jan 23 04:57:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:57:44Z|00285|binding|INFO|086c8696-1f80-478d-a2ee-27821d7e0b7a: Claiming fa:16:3e:5e:42:da 10.100.0.6
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.126 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.131 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.134 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.142 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:42:da 10.100.0.6'], port_security=['fa:16:3e:5e:42:da 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1e7fbf43-a3d7-4017-84bb-9787aa383363', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28b2d467-fd65-4cbe-ad42-32f0c6d389dc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '49fe499c3ed341249456b8cc11ae8483', 'neutron:revision_number': '2', 'neutron:security_group_ids': '650a23fb-f32e-4462-ac76-ec424df94eba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1714143d-be12-43f3-9519-7e1066d245a7, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=086c8696-1f80-478d-a2ee-27821d7e0b7a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.143 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 086c8696-1f80-478d-a2ee-27821d7e0b7a in datapath 28b2d467-fd65-4cbe-ad42-32f0c6d389dc bound to our chassis#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.145 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 28b2d467-fd65-4cbe-ad42-32f0c6d389dc#033[00m
Jan 23 04:57:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:44.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.158 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[88a4f63d-fdc2-4e84-8523-cdb57cec10ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.159 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap28b2d467-f1 in ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:57:44 np0005593234 systemd-machined[195626]: New machine qemu-35-instance-00000057.
Jan 23 04:57:44 np0005593234 systemd-udevd[265788]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.163 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap28b2d467-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.163 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e376390e-595c-48a4-9c81-0fe9a96018f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.164 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3b81aafe-2319-4600-81b0-996470de3ea9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 NetworkManager[48942]: <info>  [1769162264.1756] device (tap086c8696-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:57:44 np0005593234 NetworkManager[48942]: <info>  [1769162264.1763] device (tap086c8696-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:57:44 np0005593234 systemd[1]: Started Virtual Machine qemu-35-instance-00000057.
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.178 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[a19b3605-e26e-4009-9994-42e0f3794806]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.193 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.193 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f33c2e76-c53e-4ede-aa61-edaeaf5e952f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:57:44Z|00286|binding|INFO|Setting lport 086c8696-1f80-478d-a2ee-27821d7e0b7a ovn-installed in OVS
Jan 23 04:57:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:57:44Z|00287|binding|INFO|Setting lport 086c8696-1f80-478d-a2ee-27821d7e0b7a up in Southbound
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.202 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.223 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[3d6ca635-3dd4-460a-a196-336fa788467a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.229 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2a38b93e-7d38-4e1e-bf2f-e7c30746082c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 NetworkManager[48942]: <info>  [1769162264.2302] manager: (tap28b2d467-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/146)
Jan 23 04:57:44 np0005593234 systemd-udevd[265791]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.255 227766 DEBUG nova.network.neutron [req-c1482a62-47f5-4d3c-a147-1c12fb15b943 req-68d45dbc-cb3a-40f1-a462-7c3bd62b485b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Updated VIF entry in instance network info cache for port 086c8696-1f80-478d-a2ee-27821d7e0b7a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.256 227766 DEBUG nova.network.neutron [req-c1482a62-47f5-4d3c-a147-1c12fb15b943 req-68d45dbc-cb3a-40f1-a462-7c3bd62b485b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Updating instance_info_cache with network_info: [{"id": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "address": "fa:16:3e:5e:42:da", "network": {"id": "28b2d467-fd65-4cbe-ad42-32f0c6d389dc", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1979623964-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49fe499c3ed341249456b8cc11ae8483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086c8696-1f", "ovs_interfaceid": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.261 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ca00077c-68e0-4a3b-87d1-2e4b8dc28c80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.263 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[47e859ff-a616-4b1a-b0bf-49deb9774127]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.276 227766 DEBUG oslo_concurrency.lockutils [req-c1482a62-47f5-4d3c-a147-1c12fb15b943 req-68d45dbc-cb3a-40f1-a462-7c3bd62b485b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-1e7fbf43-a3d7-4017-84bb-9787aa383363" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:57:44 np0005593234 NetworkManager[48942]: <info>  [1769162264.2856] device (tap28b2d467-f0): carrier: link connected
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.290 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[964bc4f0-5c21-4a58-b9d0-b6e8c39df419]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.308 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c83c1d29-f431-4d73-b80b-95506bd3a492]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28b2d467-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:b6:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 615924, 'reachable_time': 26643, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 265820, 'error': None, 'target': 'ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.323 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[af83bacb-6682-4453-be66-3572248d2900]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe77:b611'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 615924, 'tstamp': 615924}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 265821, 'error': None, 'target': 'ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.341 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b1be90bb-dc31-4967-936b-9d98c0d35013]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap28b2d467-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:77:b6:11'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 89], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 615924, 'reachable_time': 26643, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 265822, 'error': None, 'target': 'ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.368 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1e1db0ff-9702-47af-a5df-97a466ce9acb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.429 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c638e6e3-9e5e-400e-9730-350c68bdda56]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.431 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28b2d467-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.431 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.431 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap28b2d467-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.433 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:44 np0005593234 NetworkManager[48942]: <info>  [1769162264.4336] manager: (tap28b2d467-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/147)
Jan 23 04:57:44 np0005593234 kernel: tap28b2d467-f0: entered promiscuous mode
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.437 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.437 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap28b2d467-f0, col_values=(('external_ids', {'iface-id': 'e21d88ec-d313-4ab0-bc88-7dc6bf3b2963'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:57:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:57:44Z|00288|binding|INFO|Releasing lport e21d88ec-d313-4ab0-bc88-7dc6bf3b2963 from this chassis (sb_readonly=0)
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.453 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.456 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/28b2d467-fd65-4cbe-ad42-32f0c6d389dc.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/28b2d467-fd65-4cbe-ad42-32f0c6d389dc.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:57:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:57:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1354347059' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.457 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e340709e-1d23-468f-9380-51e0631629a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.458 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-28b2d467-fd65-4cbe-ad42-32f0c6d389dc
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/28b2d467-fd65-4cbe-ad42-32f0c6d389dc.pid.haproxy
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 28b2d467-fd65-4cbe-ad42-32f0c6d389dc
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:57:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:44.459 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc', 'env', 'PROCESS_TAG=haproxy-28b2d467-fd65-4cbe-ad42-32f0c6d389dc', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/28b2d467-fd65-4cbe-ad42-32f0c6d389dc.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:57:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:57:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1354347059' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.560 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162264.5599585, 1e7fbf43-a3d7-4017-84bb-9787aa383363 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.561 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] VM Started (Lifecycle Event)#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.585 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.589 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162264.5610714, 1e7fbf43-a3d7-4017-84bb-9787aa383363 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.589 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.612 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.615 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.637 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.734 227766 DEBUG nova.compute.manager [req-63d6a82a-ca85-4585-b5bd-53ced9fe03fc req-01fbbb65-7d32-4e8c-8709-dcc0c7698459 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Received event network-vif-plugged-086c8696-1f80-478d-a2ee-27821d7e0b7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.735 227766 DEBUG oslo_concurrency.lockutils [req-63d6a82a-ca85-4585-b5bd-53ced9fe03fc req-01fbbb65-7d32-4e8c-8709-dcc0c7698459 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.736 227766 DEBUG oslo_concurrency.lockutils [req-63d6a82a-ca85-4585-b5bd-53ced9fe03fc req-01fbbb65-7d32-4e8c-8709-dcc0c7698459 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.736 227766 DEBUG oslo_concurrency.lockutils [req-63d6a82a-ca85-4585-b5bd-53ced9fe03fc req-01fbbb65-7d32-4e8c-8709-dcc0c7698459 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.736 227766 DEBUG nova.compute.manager [req-63d6a82a-ca85-4585-b5bd-53ced9fe03fc req-01fbbb65-7d32-4e8c-8709-dcc0c7698459 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Processing event network-vif-plugged-086c8696-1f80-478d-a2ee-27821d7e0b7a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.737 227766 DEBUG nova.compute.manager [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.740 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162264.7399054, 1e7fbf43-a3d7-4017-84bb-9787aa383363 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.740 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.742 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.745 227766 INFO nova.virt.libvirt.driver [-] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Instance spawned successfully.#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.746 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.787 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.787 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.788 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.788 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.789 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.789 227766 DEBUG nova.virt.libvirt.driver [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.794 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.797 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.837 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:57:44 np0005593234 podman[265896]: 2026-01-23 09:57:44.854068001 +0000 UTC m=+0.056955080 container create 704ad3f26ae70b0b0b076ee498ed228a08db7719a15893476cfe397829af74f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.903 227766 INFO nova.compute.manager [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Took 11.27 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:57:44 np0005593234 nova_compute[227762]: 2026-01-23 09:57:44.904 227766 DEBUG nova.compute.manager [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:57:44 np0005593234 podman[265896]: 2026-01-23 09:57:44.818888352 +0000 UTC m=+0.021775461 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:57:44 np0005593234 systemd[1]: Started libpod-conmon-704ad3f26ae70b0b0b076ee498ed228a08db7719a15893476cfe397829af74f4.scope.
Jan 23 04:57:44 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:57:44 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2fe6fe2814b09895ee0fdc93d772729bf98dc4e8865324990e2c99780de6b86b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:57:44 np0005593234 podman[265896]: 2026-01-23 09:57:44.954764036 +0000 UTC m=+0.157651135 container init 704ad3f26ae70b0b0b076ee498ed228a08db7719a15893476cfe397829af74f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 04:57:44 np0005593234 podman[265896]: 2026-01-23 09:57:44.960310889 +0000 UTC m=+0.163197968 container start 704ad3f26ae70b0b0b076ee498ed228a08db7719a15893476cfe397829af74f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 04:57:44 np0005593234 neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc[265911]: [NOTICE]   (265915) : New worker (265917) forked
Jan 23 04:57:44 np0005593234 neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc[265911]: [NOTICE]   (265915) : Loading success.
Jan 23 04:57:45 np0005593234 nova_compute[227762]: 2026-01-23 09:57:45.026 227766 INFO nova.compute.manager [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Took 12.82 seconds to build instance.#033[00m
Jan 23 04:57:45 np0005593234 nova_compute[227762]: 2026-01-23 09:57:45.050 227766 DEBUG oslo_concurrency.lockutils [None req-fe83ef19-2abe-41be-a2e7-5d688d0aacfc fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:45 np0005593234 nova_compute[227762]: 2026-01-23 09:57:45.787 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:46.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:46.095 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=34, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=33) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.096 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:46.097 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:57:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:46.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.695 227766 DEBUG oslo_concurrency.lockutils [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Acquiring lock "1e7fbf43-a3d7-4017-84bb-9787aa383363" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.695 227766 DEBUG oslo_concurrency.lockutils [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.695 227766 DEBUG oslo_concurrency.lockutils [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Acquiring lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.695 227766 DEBUG oslo_concurrency.lockutils [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.696 227766 DEBUG oslo_concurrency.lockutils [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.697 227766 INFO nova.compute.manager [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Terminating instance#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.698 227766 DEBUG nova.compute.manager [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:57:46 np0005593234 kernel: tap086c8696-1f (unregistering): left promiscuous mode
Jan 23 04:57:46 np0005593234 NetworkManager[48942]: <info>  [1769162266.7474] device (tap086c8696-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.756 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:46 np0005593234 ovn_controller[134547]: 2026-01-23T09:57:46Z|00289|binding|INFO|Releasing lport 086c8696-1f80-478d-a2ee-27821d7e0b7a from this chassis (sb_readonly=0)
Jan 23 04:57:46 np0005593234 ovn_controller[134547]: 2026-01-23T09:57:46Z|00290|binding|INFO|Setting lport 086c8696-1f80-478d-a2ee-27821d7e0b7a down in Southbound
Jan 23 04:57:46 np0005593234 ovn_controller[134547]: 2026-01-23T09:57:46Z|00291|binding|INFO|Removing iface tap086c8696-1f ovn-installed in OVS
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.760 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:46.766 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5e:42:da 10.100.0.6'], port_security=['fa:16:3e:5e:42:da 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '1e7fbf43-a3d7-4017-84bb-9787aa383363', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-28b2d467-fd65-4cbe-ad42-32f0c6d389dc', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '49fe499c3ed341249456b8cc11ae8483', 'neutron:revision_number': '4', 'neutron:security_group_ids': '650a23fb-f32e-4462-ac76-ec424df94eba', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1714143d-be12-43f3-9519-7e1066d245a7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=086c8696-1f80-478d-a2ee-27821d7e0b7a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:57:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:46.768 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 086c8696-1f80-478d-a2ee-27821d7e0b7a in datapath 28b2d467-fd65-4cbe-ad42-32f0c6d389dc unbound from our chassis#033[00m
Jan 23 04:57:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:46.769 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 28b2d467-fd65-4cbe-ad42-32f0c6d389dc, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:57:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:46.771 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f89e88e6-3059-4774-86a9-afd331ce4ee3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:46.771 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc namespace which is not needed anymore#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.774 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:46 np0005593234 podman[265927]: 2026-01-23 09:57:46.791426572 +0000 UTC m=+0.087630807 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:57:46 np0005593234 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000057.scope: Deactivated successfully.
Jan 23 04:57:46 np0005593234 systemd[1]: machine-qemu\x2d35\x2dinstance\x2d00000057.scope: Consumed 2.426s CPU time.
Jan 23 04:57:46 np0005593234 systemd-machined[195626]: Machine qemu-35-instance-00000057 terminated.
Jan 23 04:57:46 np0005593234 neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc[265911]: [NOTICE]   (265915) : haproxy version is 2.8.14-c23fe91
Jan 23 04:57:46 np0005593234 neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc[265911]: [NOTICE]   (265915) : path to executable is /usr/sbin/haproxy
Jan 23 04:57:46 np0005593234 neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc[265911]: [WARNING]  (265915) : Exiting Master process...
Jan 23 04:57:46 np0005593234 neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc[265911]: [ALERT]    (265915) : Current worker (265917) exited with code 143 (Terminated)
Jan 23 04:57:46 np0005593234 neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc[265911]: [WARNING]  (265915) : All workers exited. Exiting... (0)
Jan 23 04:57:46 np0005593234 systemd[1]: libpod-704ad3f26ae70b0b0b076ee498ed228a08db7719a15893476cfe397829af74f4.scope: Deactivated successfully.
Jan 23 04:57:46 np0005593234 podman[265968]: 2026-01-23 09:57:46.889673981 +0000 UTC m=+0.041421015 container died 704ad3f26ae70b0b0b076ee498ed228a08db7719a15893476cfe397829af74f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:57:46 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-704ad3f26ae70b0b0b076ee498ed228a08db7719a15893476cfe397829af74f4-userdata-shm.mount: Deactivated successfully.
Jan 23 04:57:46 np0005593234 systemd[1]: var-lib-containers-storage-overlay-2fe6fe2814b09895ee0fdc93d772729bf98dc4e8865324990e2c99780de6b86b-merged.mount: Deactivated successfully.
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.919 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:46 np0005593234 podman[265968]: 2026-01-23 09:57:46.927295096 +0000 UTC m=+0.079042130 container cleanup 704ad3f26ae70b0b0b076ee498ed228a08db7719a15893476cfe397829af74f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.929 227766 INFO nova.virt.libvirt.driver [-] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Instance destroyed successfully.#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.930 227766 DEBUG nova.objects.instance [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lazy-loading 'resources' on Instance uuid 1e7fbf43-a3d7-4017-84bb-9787aa383363 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:57:46 np0005593234 systemd[1]: libpod-conmon-704ad3f26ae70b0b0b076ee498ed228a08db7719a15893476cfe397829af74f4.scope: Deactivated successfully.
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.939 227766 DEBUG nova.compute.manager [req-0aaf4c37-3621-4e62-9842-8efe5b82beb5 req-de7c71a0-666d-4242-933a-3220c29cd417 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Received event network-vif-plugged-086c8696-1f80-478d-a2ee-27821d7e0b7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.939 227766 DEBUG oslo_concurrency.lockutils [req-0aaf4c37-3621-4e62-9842-8efe5b82beb5 req-de7c71a0-666d-4242-933a-3220c29cd417 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.939 227766 DEBUG oslo_concurrency.lockutils [req-0aaf4c37-3621-4e62-9842-8efe5b82beb5 req-de7c71a0-666d-4242-933a-3220c29cd417 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.939 227766 DEBUG oslo_concurrency.lockutils [req-0aaf4c37-3621-4e62-9842-8efe5b82beb5 req-de7c71a0-666d-4242-933a-3220c29cd417 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.940 227766 DEBUG nova.compute.manager [req-0aaf4c37-3621-4e62-9842-8efe5b82beb5 req-de7c71a0-666d-4242-933a-3220c29cd417 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] No waiting events found dispatching network-vif-plugged-086c8696-1f80-478d-a2ee-27821d7e0b7a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.940 227766 WARNING nova.compute.manager [req-0aaf4c37-3621-4e62-9842-8efe5b82beb5 req-de7c71a0-666d-4242-933a-3220c29cd417 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Received unexpected event network-vif-plugged-086c8696-1f80-478d-a2ee-27821d7e0b7a for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.965 227766 DEBUG nova.virt.libvirt.vif [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:57:28Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-InstanceActionsV221TestJSON-server-1185215746',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-instanceactionsv221testjson-server-1185215746',id=87,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:57:44Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='49fe499c3ed341249456b8cc11ae8483',ramdisk_id='',reservation_id='r-c7sop938',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-InstanceActionsV221TestJSON-307060045',owner_user_name='tempest-InstanceActionsV221TestJSON-307060045-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:57:44Z,user_data=None,user_id='fb7d106814e948feb72555b92cb0bce7',uuid=1e7fbf43-a3d7-4017-84bb-9787aa383363,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "address": "fa:16:3e:5e:42:da", "network": {"id": "28b2d467-fd65-4cbe-ad42-32f0c6d389dc", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1979623964-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49fe499c3ed341249456b8cc11ae8483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086c8696-1f", "ovs_interfaceid": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.965 227766 DEBUG nova.network.os_vif_util [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Converting VIF {"id": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "address": "fa:16:3e:5e:42:da", "network": {"id": "28b2d467-fd65-4cbe-ad42-32f0c6d389dc", "bridge": "br-int", "label": "tempest-InstanceActionsV221TestJSON-1979623964-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "49fe499c3ed341249456b8cc11ae8483", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap086c8696-1f", "ovs_interfaceid": "086c8696-1f80-478d-a2ee-27821d7e0b7a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.966 227766 DEBUG nova.network.os_vif_util [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5e:42:da,bridge_name='br-int',has_traffic_filtering=True,id=086c8696-1f80-478d-a2ee-27821d7e0b7a,network=Network(28b2d467-fd65-4cbe-ad42-32f0c6d389dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086c8696-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.966 227766 DEBUG os_vif [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:42:da,bridge_name='br-int',has_traffic_filtering=True,id=086c8696-1f80-478d-a2ee-27821d7e0b7a,network=Network(28b2d467-fd65-4cbe-ad42-32f0c6d389dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086c8696-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.968 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.968 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap086c8696-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.969 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.971 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.973 227766 INFO os_vif [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5e:42:da,bridge_name='br-int',has_traffic_filtering=True,id=086c8696-1f80-478d-a2ee-27821d7e0b7a,network=Network(28b2d467-fd65-4cbe-ad42-32f0c6d389dc),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap086c8696-1f')#033[00m
Jan 23 04:57:46 np0005593234 podman[266009]: 2026-01-23 09:57:46.986544657 +0000 UTC m=+0.039247267 container remove 704ad3f26ae70b0b0b076ee498ed228a08db7719a15893476cfe397829af74f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:57:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:46.992 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8cd2ee2b-2726-477f-8be6-6fbae4a851ac]: (4, ('Fri Jan 23 09:57:46 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc (704ad3f26ae70b0b0b076ee498ed228a08db7719a15893476cfe397829af74f4)\n704ad3f26ae70b0b0b076ee498ed228a08db7719a15893476cfe397829af74f4\nFri Jan 23 09:57:46 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc (704ad3f26ae70b0b0b076ee498ed228a08db7719a15893476cfe397829af74f4)\n704ad3f26ae70b0b0b076ee498ed228a08db7719a15893476cfe397829af74f4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:46.994 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[042ff5f6-4310-4b9b-8fc2-91a663c6f0f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:46.996 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap28b2d467-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:57:46 np0005593234 kernel: tap28b2d467-f0: left promiscuous mode
Jan 23 04:57:46 np0005593234 nova_compute[227762]: 2026-01-23 09:57:46.998 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:47.002 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aeefdfb7-8ad3-43f5-a843-58095f23411e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:47 np0005593234 nova_compute[227762]: 2026-01-23 09:57:47.014 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:47.017 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[621f719c-9ca7-4c0f-8bdd-5bca7f83d180]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:47.018 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7bcf7130-d0da-4578-8024-85d78529b66b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:47.032 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2e86bea4-39e0-4eb1-ac36-c68f59d0b9ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 615917, 'reachable_time': 25290, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 266042, 'error': None, 'target': 'ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:47.035 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-28b2d467-fd65-4cbe-ad42-32f0c6d389dc deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:57:47 np0005593234 systemd[1]: run-netns-ovnmeta\x2d28b2d467\x2dfd65\x2d4cbe\x2dad42\x2d32f0c6d389dc.mount: Deactivated successfully.
Jan 23 04:57:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:47.035 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[e353b7aa-e15c-45bf-b4c2-d9c2b66b4a6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:57:47 np0005593234 nova_compute[227762]: 2026-01-23 09:57:47.368 227766 INFO nova.virt.libvirt.driver [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Deleting instance files /var/lib/nova/instances/1e7fbf43-a3d7-4017-84bb-9787aa383363_del#033[00m
Jan 23 04:57:47 np0005593234 nova_compute[227762]: 2026-01-23 09:57:47.369 227766 INFO nova.virt.libvirt.driver [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Deletion of /var/lib/nova/instances/1e7fbf43-a3d7-4017-84bb-9787aa383363_del complete#033[00m
Jan 23 04:57:47 np0005593234 nova_compute[227762]: 2026-01-23 09:57:47.448 227766 INFO nova.compute.manager [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Took 0.75 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:57:47 np0005593234 nova_compute[227762]: 2026-01-23 09:57:47.449 227766 DEBUG oslo.service.loopingcall [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:57:47 np0005593234 nova_compute[227762]: 2026-01-23 09:57:47.449 227766 DEBUG nova.compute.manager [-] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:57:47 np0005593234 nova_compute[227762]: 2026-01-23 09:57:47.449 227766 DEBUG nova.network.neutron [-] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:57:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:48.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:57:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:48.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.188 227766 DEBUG nova.compute.manager [req-98b998e5-54c7-4c0f-8f70-908024572f92 req-9d3060de-1fcb-44f2-a492-a28ced9f40bd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Received event network-vif-unplugged-086c8696-1f80-478d-a2ee-27821d7e0b7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.188 227766 DEBUG oslo_concurrency.lockutils [req-98b998e5-54c7-4c0f-8f70-908024572f92 req-9d3060de-1fcb-44f2-a492-a28ced9f40bd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.188 227766 DEBUG oslo_concurrency.lockutils [req-98b998e5-54c7-4c0f-8f70-908024572f92 req-9d3060de-1fcb-44f2-a492-a28ced9f40bd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.189 227766 DEBUG oslo_concurrency.lockutils [req-98b998e5-54c7-4c0f-8f70-908024572f92 req-9d3060de-1fcb-44f2-a492-a28ced9f40bd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.189 227766 DEBUG nova.compute.manager [req-98b998e5-54c7-4c0f-8f70-908024572f92 req-9d3060de-1fcb-44f2-a492-a28ced9f40bd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] No waiting events found dispatching network-vif-unplugged-086c8696-1f80-478d-a2ee-27821d7e0b7a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.189 227766 DEBUG nova.compute.manager [req-98b998e5-54c7-4c0f-8f70-908024572f92 req-9d3060de-1fcb-44f2-a492-a28ced9f40bd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Received event network-vif-unplugged-086c8696-1f80-478d-a2ee-27821d7e0b7a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.189 227766 DEBUG nova.compute.manager [req-98b998e5-54c7-4c0f-8f70-908024572f92 req-9d3060de-1fcb-44f2-a492-a28ced9f40bd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Received event network-vif-plugged-086c8696-1f80-478d-a2ee-27821d7e0b7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.189 227766 DEBUG oslo_concurrency.lockutils [req-98b998e5-54c7-4c0f-8f70-908024572f92 req-9d3060de-1fcb-44f2-a492-a28ced9f40bd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.190 227766 DEBUG oslo_concurrency.lockutils [req-98b998e5-54c7-4c0f-8f70-908024572f92 req-9d3060de-1fcb-44f2-a492-a28ced9f40bd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.190 227766 DEBUG oslo_concurrency.lockutils [req-98b998e5-54c7-4c0f-8f70-908024572f92 req-9d3060de-1fcb-44f2-a492-a28ced9f40bd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.190 227766 DEBUG nova.compute.manager [req-98b998e5-54c7-4c0f-8f70-908024572f92 req-9d3060de-1fcb-44f2-a492-a28ced9f40bd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] No waiting events found dispatching network-vif-plugged-086c8696-1f80-478d-a2ee-27821d7e0b7a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.190 227766 WARNING nova.compute.manager [req-98b998e5-54c7-4c0f-8f70-908024572f92 req-9d3060de-1fcb-44f2-a492-a28ced9f40bd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Received unexpected event network-vif-plugged-086c8696-1f80-478d-a2ee-27821d7e0b7a for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.626 227766 DEBUG nova.network.neutron [-] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.692 227766 INFO nova.compute.manager [-] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Took 2.24 seconds to deallocate network for instance.#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.779 227766 DEBUG nova.compute.manager [req-a7d067e5-a8cb-4c43-a9fe-a6e7487bbc74 req-41fdf278-b791-48b2-8d89-bf68ff378b41 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Received event network-vif-deleted-086c8696-1f80-478d-a2ee-27821d7e0b7a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.807 227766 DEBUG oslo_concurrency.lockutils [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.807 227766 DEBUG oslo_concurrency.lockutils [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:57:49 np0005593234 nova_compute[227762]: 2026-01-23 09:57:49.915 227766 DEBUG oslo_concurrency.processutils [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:57:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:50.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:57:50.099 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '34'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:57:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:50.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:57:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:57:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:57:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3988333716' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:57:50 np0005593234 nova_compute[227762]: 2026-01-23 09:57:50.356 227766 DEBUG oslo_concurrency.processutils [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:57:50 np0005593234 nova_compute[227762]: 2026-01-23 09:57:50.363 227766 DEBUG nova.compute.provider_tree [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:57:50 np0005593234 nova_compute[227762]: 2026-01-23 09:57:50.418 227766 DEBUG nova.scheduler.client.report [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:57:50 np0005593234 nova_compute[227762]: 2026-01-23 09:57:50.459 227766 DEBUG oslo_concurrency.lockutils [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:50 np0005593234 nova_compute[227762]: 2026-01-23 09:57:50.588 227766 INFO nova.scheduler.client.report [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Deleted allocations for instance 1e7fbf43-a3d7-4017-84bb-9787aa383363#033[00m
Jan 23 04:57:50 np0005593234 nova_compute[227762]: 2026-01-23 09:57:50.789 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:50 np0005593234 nova_compute[227762]: 2026-01-23 09:57:50.869 227766 DEBUG oslo_concurrency.lockutils [None req-3f0df076-994b-4714-80cf-4c35ab774341 fb7d106814e948feb72555b92cb0bce7 49fe499c3ed341249456b8cc11ae8483 - - default default] Lock "1e7fbf43-a3d7-4017-84bb-9787aa383363" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:57:51 np0005593234 nova_compute[227762]: 2026-01-23 09:57:51.970 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:52.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:57:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:52.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:57:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:54.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:54.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:57:55 np0005593234 nova_compute[227762]: 2026-01-23 09:57:55.790 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:56.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:56.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:56 np0005593234 nova_compute[227762]: 2026-01-23 09:57:56.973 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:57:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:58.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:57:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:57:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:57:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:58.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:00.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:00.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:00 np0005593234 nova_compute[227762]: 2026-01-23 09:58:00.835 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:01 np0005593234 podman[266135]: 2026-01-23 09:58:01.356222389 +0000 UTC m=+0.073767945 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:58:01 np0005593234 nova_compute[227762]: 2026-01-23 09:58:01.660 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:01 np0005593234 nova_compute[227762]: 2026-01-23 09:58:01.928 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162266.9271493, 1e7fbf43-a3d7-4017-84bb-9787aa383363 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:58:01 np0005593234 nova_compute[227762]: 2026-01-23 09:58:01.928 227766 INFO nova.compute.manager [-] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:58:01 np0005593234 nova_compute[227762]: 2026-01-23 09:58:01.958 227766 DEBUG nova.compute.manager [None req-7067f99c-a641-4cd5-98f9-aa3e2ab8e5d4 - - - - - -] [instance: 1e7fbf43-a3d7-4017-84bb-9787aa383363] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:58:01 np0005593234 nova_compute[227762]: 2026-01-23 09:58:01.974 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:02.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:02.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:04.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:04.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:05 np0005593234 nova_compute[227762]: 2026-01-23 09:58:05.837 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:06.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:06.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:06 np0005593234 nova_compute[227762]: 2026-01-23 09:58:06.976 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:08.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:08.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:58:08 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1013537627' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:58:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:10.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:10.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:10 np0005593234 nova_compute[227762]: 2026-01-23 09:58:10.840 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:11 np0005593234 nova_compute[227762]: 2026-01-23 09:58:11.980 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:12.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:58:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:12.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:58:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:14.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:14.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:15 np0005593234 nova_compute[227762]: 2026-01-23 09:58:15.841 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:16.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:58:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:16.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:58:16 np0005593234 nova_compute[227762]: 2026-01-23 09:58:16.982 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:17 np0005593234 podman[266233]: 2026-01-23 09:58:17.510279685 +0000 UTC m=+0.047570307 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 04:58:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:18.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:18.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:20.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:20.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:20 np0005593234 nova_compute[227762]: 2026-01-23 09:58:20.842 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:21 np0005593234 nova_compute[227762]: 2026-01-23 09:58:21.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:21 np0005593234 nova_compute[227762]: 2026-01-23 09:58:21.985 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:22.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:58:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:22.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:58:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:24.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:24.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:24 np0005593234 nova_compute[227762]: 2026-01-23 09:58:24.607 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:24 np0005593234 nova_compute[227762]: 2026-01-23 09:58:24.608 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:24 np0005593234 nova_compute[227762]: 2026-01-23 09:58:24.732 227766 DEBUG nova.compute.manager [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:58:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:25 np0005593234 nova_compute[227762]: 2026-01-23 09:58:25.601 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:25 np0005593234 nova_compute[227762]: 2026-01-23 09:58:25.602 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:25 np0005593234 nova_compute[227762]: 2026-01-23 09:58:25.615 227766 DEBUG nova.virt.hardware [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:58:25 np0005593234 nova_compute[227762]: 2026-01-23 09:58:25.615 227766 INFO nova.compute.claims [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:58:25 np0005593234 nova_compute[227762]: 2026-01-23 09:58:25.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:25 np0005593234 nova_compute[227762]: 2026-01-23 09:58:25.845 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:26.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:26 np0005593234 nova_compute[227762]: 2026-01-23 09:58:26.099 227766 DEBUG oslo_concurrency.processutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:26.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:26.476 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=35, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=34) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:58:26 np0005593234 nova_compute[227762]: 2026-01-23 09:58:26.476 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:26.478 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:58:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:58:26 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2601629312' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:58:26 np0005593234 nova_compute[227762]: 2026-01-23 09:58:26.547 227766 DEBUG oslo_concurrency.processutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:26 np0005593234 nova_compute[227762]: 2026-01-23 09:58:26.554 227766 DEBUG nova.compute.provider_tree [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:58:26 np0005593234 nova_compute[227762]: 2026-01-23 09:58:26.949 227766 DEBUG nova.scheduler.client.report [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:58:26 np0005593234 nova_compute[227762]: 2026-01-23 09:58:26.988 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.059 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.060 227766 DEBUG nova.compute.manager [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.181 227766 DEBUG nova.compute.manager [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.182 227766 DEBUG nova.network.neutron [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.233 227766 INFO nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.305 227766 DEBUG nova.compute.manager [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:58:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:27.479 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '35'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.584 227766 DEBUG nova.compute.manager [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.585 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.586 227766 INFO nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Creating image(s)#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.624 227766 DEBUG nova.storage.rbd_utils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] rbd image 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.651 227766 DEBUG nova.storage.rbd_utils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] rbd image 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.685 227766 DEBUG nova.storage.rbd_utils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] rbd image 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.692 227766 DEBUG oslo_concurrency.processutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.764 227766 DEBUG oslo_concurrency.processutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.765 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.765 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.766 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.792 227766 DEBUG nova.storage.rbd_utils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] rbd image 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.796 227766 DEBUG oslo_concurrency.processutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:27 np0005593234 nova_compute[227762]: 2026-01-23 09:58:27.824 227766 DEBUG nova.policy [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57e3c530deab46758172af6777c8c108', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd557095954714e01b800ed2898d27593', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:58:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:28.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:28 np0005593234 nova_compute[227762]: 2026-01-23 09:58:28.071 227766 DEBUG oslo_concurrency.processutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.275s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:28 np0005593234 nova_compute[227762]: 2026-01-23 09:58:28.149 227766 DEBUG nova.storage.rbd_utils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] resizing rbd image 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:58:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:28.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:28 np0005593234 nova_compute[227762]: 2026-01-23 09:58:28.251 227766 DEBUG nova.objects.instance [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lazy-loading 'migration_context' on Instance uuid 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:58:28 np0005593234 nova_compute[227762]: 2026-01-23 09:58:28.286 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:58:28 np0005593234 nova_compute[227762]: 2026-01-23 09:58:28.287 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Ensure instance console log exists: /var/lib/nova/instances/30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:58:28 np0005593234 nova_compute[227762]: 2026-01-23 09:58:28.287 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:28 np0005593234 nova_compute[227762]: 2026-01-23 09:58:28.288 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:28 np0005593234 nova_compute[227762]: 2026-01-23 09:58:28.289 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:28 np0005593234 nova_compute[227762]: 2026-01-23 09:58:28.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:28 np0005593234 nova_compute[227762]: 2026-01-23 09:58:28.785 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:28 np0005593234 nova_compute[227762]: 2026-01-23 09:58:28.785 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:28 np0005593234 nova_compute[227762]: 2026-01-23 09:58:28.785 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:28 np0005593234 nova_compute[227762]: 2026-01-23 09:58:28.786 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:58:28 np0005593234 nova_compute[227762]: 2026-01-23 09:58:28.786 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:58:29 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3871769358' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:58:29 np0005593234 nova_compute[227762]: 2026-01-23 09:58:29.211 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:29 np0005593234 nova_compute[227762]: 2026-01-23 09:58:29.379 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:58:29 np0005593234 nova_compute[227762]: 2026-01-23 09:58:29.380 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4622MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:58:29 np0005593234 nova_compute[227762]: 2026-01-23 09:58:29.380 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:29 np0005593234 nova_compute[227762]: 2026-01-23 09:58:29.381 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:29 np0005593234 nova_compute[227762]: 2026-01-23 09:58:29.480 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:58:29 np0005593234 nova_compute[227762]: 2026-01-23 09:58:29.481 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:58:29 np0005593234 nova_compute[227762]: 2026-01-23 09:58:29.481 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:58:29 np0005593234 nova_compute[227762]: 2026-01-23 09:58:29.574 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:29 np0005593234 nova_compute[227762]: 2026-01-23 09:58:29.744 227766 DEBUG nova.network.neutron [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Successfully created port: 3ae0badd-deb5-430c-8cef-bb36b7ca7eea _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:58:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:58:30 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3182916033' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:58:30 np0005593234 nova_compute[227762]: 2026-01-23 09:58:30.025 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:30 np0005593234 nova_compute[227762]: 2026-01-23 09:58:30.030 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:58:30 np0005593234 nova_compute[227762]: 2026-01-23 09:58:30.053 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:58:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:30.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:30 np0005593234 nova_compute[227762]: 2026-01-23 09:58:30.113 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:58:30 np0005593234 nova_compute[227762]: 2026-01-23 09:58:30.113 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:30.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:30 np0005593234 nova_compute[227762]: 2026-01-23 09:58:30.847 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:30 np0005593234 nova_compute[227762]: 2026-01-23 09:58:30.946 227766 DEBUG nova.network.neutron [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Successfully created port: a1541348-c83c-4165-8181-1f5dddb145af _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:58:31 np0005593234 nova_compute[227762]: 2026-01-23 09:58:31.114 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:31 np0005593234 nova_compute[227762]: 2026-01-23 09:58:31.114 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:58:31 np0005593234 nova_compute[227762]: 2026-01-23 09:58:31.115 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:58:31 np0005593234 nova_compute[227762]: 2026-01-23 09:58:31.156 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 04:58:31 np0005593234 nova_compute[227762]: 2026-01-23 09:58:31.156 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 04:58:31 np0005593234 nova_compute[227762]: 2026-01-23 09:58:31.157 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:31 np0005593234 podman[266552]: 2026-01-23 09:58:31.787469368 +0000 UTC m=+0.090365073 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:58:31 np0005593234 nova_compute[227762]: 2026-01-23 09:58:31.989 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:32.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:32.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:32 np0005593234 nova_compute[227762]: 2026-01-23 09:58:32.451 227766 DEBUG nova.network.neutron [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Successfully created port: 0e41c282-1666-4ce9-aa23-76ee3e40aed8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:58:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:34.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:58:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:34.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:58:34 np0005593234 nova_compute[227762]: 2026-01-23 09:58:34.230 227766 DEBUG nova.network.neutron [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Successfully updated port: 3ae0badd-deb5-430c-8cef-bb36b7ca7eea _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 04:58:34 np0005593234 nova_compute[227762]: 2026-01-23 09:58:34.554 227766 DEBUG nova.compute.manager [req-7a54a65c-915e-4b54-9026-23fde69340a4 req-542dfbb0-973e-4458-89ed-0941ce6c1767 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-changed-3ae0badd-deb5-430c-8cef-bb36b7ca7eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:58:34 np0005593234 nova_compute[227762]: 2026-01-23 09:58:34.555 227766 DEBUG nova.compute.manager [req-7a54a65c-915e-4b54-9026-23fde69340a4 req-542dfbb0-973e-4458-89ed-0941ce6c1767 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Refreshing instance network info cache due to event network-changed-3ae0badd-deb5-430c-8cef-bb36b7ca7eea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 04:58:34 np0005593234 nova_compute[227762]: 2026-01-23 09:58:34.555 227766 DEBUG oslo_concurrency.lockutils [req-7a54a65c-915e-4b54-9026-23fde69340a4 req-542dfbb0-973e-4458-89ed-0941ce6c1767 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:58:34 np0005593234 nova_compute[227762]: 2026-01-23 09:58:34.556 227766 DEBUG oslo_concurrency.lockutils [req-7a54a65c-915e-4b54-9026-23fde69340a4 req-542dfbb0-973e-4458-89ed-0941ce6c1767 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:58:34 np0005593234 nova_compute[227762]: 2026-01-23 09:58:34.556 227766 DEBUG nova.network.neutron [req-7a54a65c-915e-4b54-9026-23fde69340a4 req-542dfbb0-973e-4458-89ed-0941ce6c1767 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Refreshing network info cache for port 3ae0badd-deb5-430c-8cef-bb36b7ca7eea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 04:58:34 np0005593234 nova_compute[227762]: 2026-01-23 09:58:34.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:58:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:35 np0005593234 nova_compute[227762]: 2026-01-23 09:58:35.849 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:58:35 np0005593234 nova_compute[227762]: 2026-01-23 09:58:35.953 227766 DEBUG nova.network.neutron [req-7a54a65c-915e-4b54-9026-23fde69340a4 req-542dfbb0-973e-4458-89ed-0941ce6c1767 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:58:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:58:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:36.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:58:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:36.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:36 np0005593234 nova_compute[227762]: 2026-01-23 09:58:36.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:58:36 np0005593234 nova_compute[227762]: 2026-01-23 09:58:36.992 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:58:37 np0005593234 nova_compute[227762]: 2026-01-23 09:58:37.088 227766 DEBUG nova.network.neutron [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Successfully updated port: a1541348-c83c-4165-8181-1f5dddb145af _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 04:58:37 np0005593234 nova_compute[227762]: 2026-01-23 09:58:37.173 227766 DEBUG nova.network.neutron [req-7a54a65c-915e-4b54-9026-23fde69340a4 req-542dfbb0-973e-4458-89ed-0941ce6c1767 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:58:37 np0005593234 nova_compute[227762]: 2026-01-23 09:58:37.215 227766 DEBUG oslo_concurrency.lockutils [req-7a54a65c-915e-4b54-9026-23fde69340a4 req-542dfbb0-973e-4458-89ed-0941ce6c1767 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:58:37 np0005593234 nova_compute[227762]: 2026-01-23 09:58:37.315 227766 DEBUG nova.compute.manager [req-6eab2f7f-e3d6-4985-8011-55e772d8d67c req-74b09947-f125-45c0-a113-fed374fdc094 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-changed-a1541348-c83c-4165-8181-1f5dddb145af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:58:37 np0005593234 nova_compute[227762]: 2026-01-23 09:58:37.316 227766 DEBUG nova.compute.manager [req-6eab2f7f-e3d6-4985-8011-55e772d8d67c req-74b09947-f125-45c0-a113-fed374fdc094 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Refreshing instance network info cache due to event network-changed-a1541348-c83c-4165-8181-1f5dddb145af. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 04:58:37 np0005593234 nova_compute[227762]: 2026-01-23 09:58:37.316 227766 DEBUG oslo_concurrency.lockutils [req-6eab2f7f-e3d6-4985-8011-55e772d8d67c req-74b09947-f125-45c0-a113-fed374fdc094 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:58:37 np0005593234 nova_compute[227762]: 2026-01-23 09:58:37.316 227766 DEBUG oslo_concurrency.lockutils [req-6eab2f7f-e3d6-4985-8011-55e772d8d67c req-74b09947-f125-45c0-a113-fed374fdc094 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:58:37 np0005593234 nova_compute[227762]: 2026-01-23 09:58:37.316 227766 DEBUG nova.network.neutron [req-6eab2f7f-e3d6-4985-8011-55e772d8d67c req-74b09947-f125-45c0-a113-fed374fdc094 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Refreshing network info cache for port a1541348-c83c-4165-8181-1f5dddb145af _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 04:58:37 np0005593234 nova_compute[227762]: 2026-01-23 09:58:37.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 04:58:37 np0005593234 nova_compute[227762]: 2026-01-23 09:58:37.832 227766 DEBUG nova.network.neutron [req-6eab2f7f-e3d6-4985-8011-55e772d8d67c req-74b09947-f125-45c0-a113-fed374fdc094 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:58:37 np0005593234 nova_compute[227762]: 2026-01-23 09:58:37.995 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:58:37 np0005593234 nova_compute[227762]: 2026-01-23 09:58:37.996 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:58:38 np0005593234 nova_compute[227762]: 2026-01-23 09:58:38.040 227766 DEBUG nova.compute.manager [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 04:58:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:38.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:58:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:38.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:58:38 np0005593234 nova_compute[227762]: 2026-01-23 09:58:38.242 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:58:38 np0005593234 nova_compute[227762]: 2026-01-23 09:58:38.242 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:58:38 np0005593234 nova_compute[227762]: 2026-01-23 09:58:38.251 227766 DEBUG nova.virt.hardware [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 04:58:38 np0005593234 nova_compute[227762]: 2026-01-23 09:58:38.251 227766 INFO nova.compute.claims [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Claim successful on node compute-2.ctlplane.example.com
Jan 23 04:58:38 np0005593234 nova_compute[227762]: 2026-01-23 09:58:38.571 227766 DEBUG oslo_concurrency.processutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:58:38 np0005593234 nova_compute[227762]: 2026-01-23 09:58:38.903 227766 DEBUG nova.network.neutron [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Successfully updated port: 0e41c282-1666-4ce9-aa23-76ee3e40aed8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 04:58:38 np0005593234 nova_compute[227762]: 2026-01-23 09:58:38.943 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "refresh_cache-30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:58:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:58:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2043793573' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.048 227766 DEBUG oslo_concurrency.processutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.054 227766 DEBUG nova.compute.provider_tree [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.077 227766 DEBUG nova.scheduler.client.report [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.129 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.887s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.130 227766 DEBUG nova.compute.manager [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.239 227766 DEBUG nova.compute.manager [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.239 227766 DEBUG nova.network.neutron [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.413 227766 INFO nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.707 227766 DEBUG nova.compute.manager [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.820 227766 DEBUG nova.network.neutron [req-6eab2f7f-e3d6-4985-8011-55e772d8d67c req-74b09947-f125-45c0-a113-fed374fdc094 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.853 227766 DEBUG oslo_concurrency.lockutils [req-6eab2f7f-e3d6-4985-8011-55e772d8d67c req-74b09947-f125-45c0-a113-fed374fdc094 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.856 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquired lock "refresh_cache-30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.856 227766 DEBUG nova.network.neutron [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.966 227766 DEBUG nova.compute.manager [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.968 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.968 227766 INFO nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Creating image(s)
Jan 23 04:58:39 np0005593234 nova_compute[227762]: 2026-01-23 09:58:39.994 227766 DEBUG nova.storage.rbd_utils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 11d58e6c-38fd-4f34-9d0f-102df6aee42b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.022 227766 DEBUG nova.storage.rbd_utils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 11d58e6c-38fd-4f34-9d0f-102df6aee42b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.047 227766 DEBUG nova.storage.rbd_utils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 11d58e6c-38fd-4f34-9d0f-102df6aee42b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.051 227766 DEBUG oslo_concurrency.processutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.074 227766 DEBUG nova.compute.manager [req-1dccc04a-b9c8-4258-bf37-721bebb95328 req-edbdea64-0b23-4b1a-9195-83884b6f8da4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-changed-0e41c282-1666-4ce9-aa23-76ee3e40aed8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.074 227766 DEBUG nova.compute.manager [req-1dccc04a-b9c8-4258-bf37-721bebb95328 req-edbdea64-0b23-4b1a-9195-83884b6f8da4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Refreshing instance network info cache due to event network-changed-0e41c282-1666-4ce9-aa23-76ee3e40aed8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.075 227766 DEBUG oslo_concurrency.lockutils [req-1dccc04a-b9c8-4258-bf37-721bebb95328 req-edbdea64-0b23-4b1a-9195-83884b6f8da4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 04:58:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:40.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.115 227766 DEBUG oslo_concurrency.processutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.116 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.116 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.117 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.142 227766 DEBUG nova.storage.rbd_utils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 11d58e6c-38fd-4f34-9d0f-102df6aee42b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.146 227766 DEBUG oslo_concurrency.processutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 11d58e6c-38fd-4f34-9d0f-102df6aee42b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.171 227766 DEBUG nova.policy [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd83df80213fd40f99fdc68c146fe9a2a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c288779980de4f03be20b7eed343b775', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 04:58:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:40.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.458 227766 DEBUG oslo_concurrency.processutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 11d58e6c-38fd-4f34-9d0f-102df6aee42b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.312s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:58:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.528 227766 DEBUG nova.storage.rbd_utils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] resizing rbd image 11d58e6c-38fd-4f34-9d0f-102df6aee42b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.636 227766 DEBUG nova.objects.instance [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lazy-loading 'migration_context' on Instance uuid 11d58e6c-38fd-4f34-9d0f-102df6aee42b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.672 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.672 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Ensure instance console log exists: /var/lib/nova/instances/11d58e6c-38fd-4f34-9d0f-102df6aee42b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.673 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.673 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.673 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.851 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:58:40 np0005593234 nova_compute[227762]: 2026-01-23 09:58:40.854 227766 DEBUG nova.network.neutron [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 04:58:41 np0005593234 nova_compute[227762]: 2026-01-23 09:58:41.836 227766 DEBUG nova.network.neutron [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Successfully created port: 0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 04:58:41 np0005593234 nova_compute[227762]: 2026-01-23 09:58:41.995 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:58:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:42.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:42.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:42 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:42Z|00292|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 04:58:42 np0005593234 nova_compute[227762]: 2026-01-23 09:58:42.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:42 np0005593234 nova_compute[227762]: 2026-01-23 09:58:42.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 04:58:42 np0005593234 nova_compute[227762]: 2026-01-23 09:58:42.780 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 04:58:42 np0005593234 nova_compute[227762]: 2026-01-23 09:58:42.781 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:42 np0005593234 nova_compute[227762]: 2026-01-23 09:58:42.781 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 04:58:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:42.835 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:42.835 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:42.836 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:44.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:44.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:44 np0005593234 nova_compute[227762]: 2026-01-23 09:58:44.369 227766 DEBUG nova.network.neutron [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Successfully updated port: 0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:58:44 np0005593234 nova_compute[227762]: 2026-01-23 09:58:44.396 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "refresh_cache-11d58e6c-38fd-4f34-9d0f-102df6aee42b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:58:44 np0005593234 nova_compute[227762]: 2026-01-23 09:58:44.397 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquired lock "refresh_cache-11d58e6c-38fd-4f34-9d0f-102df6aee42b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:58:44 np0005593234 nova_compute[227762]: 2026-01-23 09:58:44.397 227766 DEBUG nova.network.neutron [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:58:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 04:58:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2562171172' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 04:58:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 04:58:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2562171172' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 04:58:44 np0005593234 nova_compute[227762]: 2026-01-23 09:58:44.976 227766 DEBUG nova.compute.manager [req-29c6c98f-7853-487c-bc01-d0408061c7a0 req-9f4cd4cb-f1cc-478e-b07d-0f470cec0a38 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Received event network-changed-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:58:44 np0005593234 nova_compute[227762]: 2026-01-23 09:58:44.976 227766 DEBUG nova.compute.manager [req-29c6c98f-7853-487c-bc01-d0408061c7a0 req-9f4cd4cb-f1cc-478e-b07d-0f470cec0a38 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Refreshing instance network info cache due to event network-changed-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:58:44 np0005593234 nova_compute[227762]: 2026-01-23 09:58:44.976 227766 DEBUG oslo_concurrency.lockutils [req-29c6c98f-7853-487c-bc01-d0408061c7a0 req-9f4cd4cb-f1cc-478e-b07d-0f470cec0a38 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-11d58e6c-38fd-4f34-9d0f-102df6aee42b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:58:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:45 np0005593234 nova_compute[227762]: 2026-01-23 09:58:45.853 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:45 np0005593234 nova_compute[227762]: 2026-01-23 09:58:45.868 227766 DEBUG nova.network.neutron [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:58:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:46.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:46.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:46 np0005593234 nova_compute[227762]: 2026-01-23 09:58:46.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:58:46 np0005593234 nova_compute[227762]: 2026-01-23 09:58:46.997 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:47 np0005593234 podman[266836]: 2026-01-23 09:58:47.755474313 +0000 UTC m=+0.051315423 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 04:58:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:48.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:48.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:50.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.160 227766 DEBUG nova.network.neutron [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Updating instance_info_cache with network_info: [{"id": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "address": "fa:16:3e:1b:80:50", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0260ea0f-e0", "ovs_interfaceid": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:58:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:50.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.332 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Releasing lock "refresh_cache-11d58e6c-38fd-4f34-9d0f-102df6aee42b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.333 227766 DEBUG nova.compute.manager [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Instance network_info: |[{"id": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "address": "fa:16:3e:1b:80:50", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0260ea0f-e0", "ovs_interfaceid": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.333 227766 DEBUG oslo_concurrency.lockutils [req-29c6c98f-7853-487c-bc01-d0408061c7a0 req-9f4cd4cb-f1cc-478e-b07d-0f470cec0a38 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-11d58e6c-38fd-4f34-9d0f-102df6aee42b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.333 227766 DEBUG nova.network.neutron [req-29c6c98f-7853-487c-bc01-d0408061c7a0 req-9f4cd4cb-f1cc-478e-b07d-0f470cec0a38 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Refreshing network info cache for port 0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.336 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Start _get_guest_xml network_info=[{"id": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "address": "fa:16:3e:1b:80:50", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0260ea0f-e0", "ovs_interfaceid": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.341 227766 WARNING nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.358 227766 DEBUG nova.virt.libvirt.host [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.359 227766 DEBUG nova.virt.libvirt.host [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.366 227766 DEBUG nova.virt.libvirt.host [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.367 227766 DEBUG nova.virt.libvirt.host [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.368 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.368 227766 DEBUG nova.virt.hardware [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.368 227766 DEBUG nova.virt.hardware [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.369 227766 DEBUG nova.virt.hardware [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.369 227766 DEBUG nova.virt.hardware [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.369 227766 DEBUG nova.virt.hardware [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.369 227766 DEBUG nova.virt.hardware [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.369 227766 DEBUG nova.virt.hardware [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.370 227766 DEBUG nova.virt.hardware [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.370 227766 DEBUG nova.virt.hardware [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.370 227766 DEBUG nova.virt.hardware [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.370 227766 DEBUG nova.virt.hardware [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.373 227766 DEBUG oslo_concurrency.processutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.855 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:58:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/404318620' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.898 227766 DEBUG oslo_concurrency.processutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.924 227766 DEBUG nova.storage.rbd_utils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 11d58e6c-38fd-4f34-9d0f-102df6aee42b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:58:50 np0005593234 nova_compute[227762]: 2026-01-23 09:58:50.927 227766 DEBUG oslo_concurrency.processutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:58:51 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1231432554' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:58:51 np0005593234 nova_compute[227762]: 2026-01-23 09:58:51.832 227766 DEBUG oslo_concurrency.processutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.904s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:51 np0005593234 nova_compute[227762]: 2026-01-23 09:58:51.834 227766 DEBUG nova.virt.libvirt.vif [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:58:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-123958626',display_name='tempest-tempest.common.compute-instance-123958626-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-123958626-2',id=92,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c288779980de4f03be20b7eed343b775',ramdisk_id='',reservation_id='r-4ujtl3ky',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-351408189',owner_user_name='tempest-MultipleCreateTestJSON-351408189-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:39Z,user_data=None,user_id='d83df80213fd40f99fdc68c146fe9a2a',uuid=11d58e6c-38fd-4f34-9d0f-102df6aee42b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "address": "fa:16:3e:1b:80:50", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0260ea0f-e0", "ovs_interfaceid": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:58:51 np0005593234 nova_compute[227762]: 2026-01-23 09:58:51.834 227766 DEBUG nova.network.os_vif_util [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converting VIF {"id": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "address": "fa:16:3e:1b:80:50", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0260ea0f-e0", "ovs_interfaceid": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:58:51 np0005593234 nova_compute[227762]: 2026-01-23 09:58:51.835 227766 DEBUG nova.network.os_vif_util [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:80:50,bridge_name='br-int',has_traffic_filtering=True,id=0260ea0f-e0a8-4506-8b70-c7b52d5a7b48,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0260ea0f-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:58:51 np0005593234 nova_compute[227762]: 2026-01-23 09:58:51.837 227766 DEBUG nova.objects.instance [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lazy-loading 'pci_devices' on Instance uuid 11d58e6c-38fd-4f34-9d0f-102df6aee42b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:58:51 np0005593234 nova_compute[227762]: 2026-01-23 09:58:51.999 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.084 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  <uuid>11d58e6c-38fd-4f34-9d0f-102df6aee42b</uuid>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  <name>instance-0000005c</name>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <nova:name>tempest-tempest.common.compute-instance-123958626-2</nova:name>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:58:50</nova:creationTime>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <nova:user uuid="d83df80213fd40f99fdc68c146fe9a2a">tempest-MultipleCreateTestJSON-351408189-project-member</nova:user>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <nova:project uuid="c288779980de4f03be20b7eed343b775">tempest-MultipleCreateTestJSON-351408189</nova:project>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <nova:port uuid="0260ea0f-e0a8-4506-8b70-c7b52d5a7b48">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <entry name="serial">11d58e6c-38fd-4f34-9d0f-102df6aee42b</entry>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <entry name="uuid">11d58e6c-38fd-4f34-9d0f-102df6aee42b</entry>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/11d58e6c-38fd-4f34-9d0f-102df6aee42b_disk">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/11d58e6c-38fd-4f34-9d0f-102df6aee42b_disk.config">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:1b:80:50"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <target dev="tap0260ea0f-e0"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/11d58e6c-38fd-4f34-9d0f-102df6aee42b/console.log" append="off"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:58:52 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:58:52 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:58:52 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:58:52 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.086 227766 DEBUG nova.compute.manager [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Preparing to wait for external event network-vif-plugged-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.087 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.088 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.089 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:52.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.091 227766 DEBUG nova.virt.libvirt.vif [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:58:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-123958626',display_name='tempest-tempest.common.compute-instance-123958626-2',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-123958626-2',id=92,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=1,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c288779980de4f03be20b7eed343b775',ramdisk_id='',reservation_id='r-4ujtl3ky',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-351408189',owner_user_name='tempest-MultipleCreateTestJSON-351408189-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:39Z,user_data=None,user_id='d83df80213fd40f99fdc68c146fe9a2a',uuid=11d58e6c-38fd-4f34-9d0f-102df6aee42b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "address": "fa:16:3e:1b:80:50", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0260ea0f-e0", "ovs_interfaceid": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.091 227766 DEBUG nova.network.os_vif_util [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converting VIF {"id": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "address": "fa:16:3e:1b:80:50", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0260ea0f-e0", "ovs_interfaceid": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.093 227766 DEBUG nova.network.os_vif_util [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:80:50,bridge_name='br-int',has_traffic_filtering=True,id=0260ea0f-e0a8-4506-8b70-c7b52d5a7b48,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0260ea0f-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.094 227766 DEBUG os_vif [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:80:50,bridge_name='br-int',has_traffic_filtering=True,id=0260ea0f-e0a8-4506-8b70-c7b52d5a7b48,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0260ea0f-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.096 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.097 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.098 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.104 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.104 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0260ea0f-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.105 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0260ea0f-e0, col_values=(('external_ids', {'iface-id': '0260ea0f-e0a8-4506-8b70-c7b52d5a7b48', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1b:80:50', 'vm-uuid': '11d58e6c-38fd-4f34-9d0f-102df6aee42b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.106 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:52 np0005593234 NetworkManager[48942]: <info>  [1769162332.1083] manager: (tap0260ea0f-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.108 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.114 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.115 227766 INFO os_vif [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:80:50,bridge_name='br-int',has_traffic_filtering=True,id=0260ea0f-e0a8-4506-8b70-c7b52d5a7b48,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0260ea0f-e0')#033[00m
Jan 23 04:58:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:52.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.314 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.314 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.315 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] No VIF found with MAC fa:16:3e:1b:80:50, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.315 227766 INFO nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Using config drive#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.341 227766 DEBUG nova.storage.rbd_utils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 11d58e6c-38fd-4f34-9d0f-102df6aee42b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:58:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:58:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:58:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 04:58:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:58:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.903 227766 DEBUG nova.network.neutron [req-29c6c98f-7853-487c-bc01-d0408061c7a0 req-9f4cd4cb-f1cc-478e-b07d-0f470cec0a38 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Updated VIF entry in instance network info cache for port 0260ea0f-e0a8-4506-8b70-c7b52d5a7b48. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.904 227766 DEBUG nova.network.neutron [req-29c6c98f-7853-487c-bc01-d0408061c7a0 req-9f4cd4cb-f1cc-478e-b07d-0f470cec0a38 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Updating instance_info_cache with network_info: [{"id": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "address": "fa:16:3e:1b:80:50", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0260ea0f-e0", "ovs_interfaceid": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:58:52 np0005593234 nova_compute[227762]: 2026-01-23 09:58:52.942 227766 DEBUG oslo_concurrency.lockutils [req-29c6c98f-7853-487c-bc01-d0408061c7a0 req-9f4cd4cb-f1cc-478e-b07d-0f470cec0a38 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-11d58e6c-38fd-4f34-9d0f-102df6aee42b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:58:53 np0005593234 nova_compute[227762]: 2026-01-23 09:58:53.027 227766 INFO nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Creating config drive at /var/lib/nova/instances/11d58e6c-38fd-4f34-9d0f-102df6aee42b/disk.config#033[00m
Jan 23 04:58:53 np0005593234 nova_compute[227762]: 2026-01-23 09:58:53.032 227766 DEBUG oslo_concurrency.processutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/11d58e6c-38fd-4f34-9d0f-102df6aee42b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc8nhvu0u execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:53 np0005593234 nova_compute[227762]: 2026-01-23 09:58:53.160 227766 DEBUG oslo_concurrency.processutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/11d58e6c-38fd-4f34-9d0f-102df6aee42b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc8nhvu0u" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:53 np0005593234 nova_compute[227762]: 2026-01-23 09:58:53.196 227766 DEBUG nova.storage.rbd_utils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 11d58e6c-38fd-4f34-9d0f-102df6aee42b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:58:53 np0005593234 nova_compute[227762]: 2026-01-23 09:58:53.200 227766 DEBUG oslo_concurrency.processutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/11d58e6c-38fd-4f34-9d0f-102df6aee42b/disk.config 11d58e6c-38fd-4f34-9d0f-102df6aee42b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:54.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:54.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:54 np0005593234 nova_compute[227762]: 2026-01-23 09:58:54.580 227766 DEBUG oslo_concurrency.processutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/11d58e6c-38fd-4f34-9d0f-102df6aee42b/disk.config 11d58e6c-38fd-4f34-9d0f-102df6aee42b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.380s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:54 np0005593234 nova_compute[227762]: 2026-01-23 09:58:54.581 227766 INFO nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Deleting local config drive /var/lib/nova/instances/11d58e6c-38fd-4f34-9d0f-102df6aee42b/disk.config because it was imported into RBD.#033[00m
Jan 23 04:58:54 np0005593234 kernel: tap0260ea0f-e0: entered promiscuous mode
Jan 23 04:58:54 np0005593234 NetworkManager[48942]: <info>  [1769162334.6336] manager: (tap0260ea0f-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/149)
Jan 23 04:58:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:54Z|00293|binding|INFO|Claiming lport 0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 for this chassis.
Jan 23 04:58:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:54Z|00294|binding|INFO|0260ea0f-e0a8-4506-8b70-c7b52d5a7b48: Claiming fa:16:3e:1b:80:50 10.100.0.6
Jan 23 04:58:54 np0005593234 nova_compute[227762]: 2026-01-23 09:58:54.636 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:54 np0005593234 nova_compute[227762]: 2026-01-23 09:58:54.640 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:54 np0005593234 nova_compute[227762]: 2026-01-23 09:58:54.641 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:54 np0005593234 systemd-udevd[267131]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:58:54 np0005593234 systemd-machined[195626]: New machine qemu-36-instance-0000005c.
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.669 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:80:50 10.100.0.6'], port_security=['fa:16:3e:1b:80:50 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '11d58e6c-38fd-4f34-9d0f-102df6aee42b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c5732b3-3484-43db-a231-53d04de40d61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c288779980de4f03be20b7eed343b775', 'neutron:revision_number': '2', 'neutron:security_group_ids': '288ecf98-3e6e-478c-8e27-86a4106b4ef8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2529943-1c00-4757-827e-798919a83756, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=0260ea0f-e0a8-4506-8b70-c7b52d5a7b48) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.671 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 in datapath 6c5732b3-3484-43db-a231-53d04de40d61 bound to our chassis#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.673 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6c5732b3-3484-43db-a231-53d04de40d61#033[00m
Jan 23 04:58:54 np0005593234 NetworkManager[48942]: <info>  [1769162334.6750] device (tap0260ea0f-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:58:54 np0005593234 NetworkManager[48942]: <info>  [1769162334.6755] device (tap0260ea0f-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:58:54 np0005593234 systemd[1]: Started Virtual Machine qemu-36-instance-0000005c.
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.687 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[528b943b-6e9c-46c6-a5b1-2aad1f979eea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.689 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6c5732b3-31 in ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.691 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6c5732b3-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.691 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[588a81e1-d2a4-4110-b9e3-cb0299d737b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.692 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ecb88f36-64e1-493d-bbcc-79988fb3853b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 nova_compute[227762]: 2026-01-23 09:58:54.704 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.703 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[f0b7cb7e-a38d-4cbc-b0b2-65c37fe0ecb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:54Z|00295|binding|INFO|Setting lport 0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 ovn-installed in OVS
Jan 23 04:58:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:54Z|00296|binding|INFO|Setting lport 0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 up in Southbound
Jan 23 04:58:54 np0005593234 nova_compute[227762]: 2026-01-23 09:58:54.707 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.719 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f53660ff-6e59-4beb-b26f-e35c67e54bc3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.748 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[33329120-4ea7-4cb6-a8bd-0c9ebf29ef6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 NetworkManager[48942]: <info>  [1769162334.7542] manager: (tap6c5732b3-30): new Veth device (/org/freedesktop/NetworkManager/Devices/150)
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.753 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f0dc2862-81bd-41fd-a13e-96af4383262c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.798 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[1058a989-7c3c-41fa-a7cc-b75215879ce6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.802 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[0229be23-79c8-492b-8d1d-eae3bd8be340]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 NetworkManager[48942]: <info>  [1769162334.8240] device (tap6c5732b3-30): carrier: link connected
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.830 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[10f62968-eda3-4920-b274-3446c1b997db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.845 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[da9df1bc-5f0a-46c8-88ba-29dc7087bb8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c5732b3-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:ad:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 622978, 'reachable_time': 18419, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267165, 'error': None, 'target': 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.861 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[610a3244-6bdf-4de7-af9f-f73768bd2f3a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:adb9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 622978, 'tstamp': 622978}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267166, 'error': None, 'target': 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.879 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3acc76eb-0c5d-4066-8472-37c7b1052827]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c5732b3-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:ad:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 92], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 622978, 'reachable_time': 18419, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 267167, 'error': None, 'target': 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.909 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d21440ea-28fb-4336-8ef9-61def37c302f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.969 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0b3feec0-3d7d-4b14-959c-bd9d48683117]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.970 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c5732b3-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.970 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.971 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c5732b3-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:54 np0005593234 nova_compute[227762]: 2026-01-23 09:58:54.972 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:54 np0005593234 kernel: tap6c5732b3-30: entered promiscuous mode
Jan 23 04:58:54 np0005593234 NetworkManager[48942]: <info>  [1769162334.9733] manager: (tap6c5732b3-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Jan 23 04:58:54 np0005593234 nova_compute[227762]: 2026-01-23 09:58:54.975 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.976 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6c5732b3-30, col_values=(('external_ids', {'iface-id': '4f372140-9451-4bb5-99b3-fc5570b8346b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:54 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:54Z|00297|binding|INFO|Releasing lport 4f372140-9451-4bb5-99b3-fc5570b8346b from this chassis (sb_readonly=0)
Jan 23 04:58:54 np0005593234 nova_compute[227762]: 2026-01-23 09:58:54.977 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:54 np0005593234 nova_compute[227762]: 2026-01-23 09:58:54.991 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.992 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6c5732b3-3484-43db-a231-53d04de40d61.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6c5732b3-3484-43db-a231-53d04de40d61.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:58:54 np0005593234 nova_compute[227762]: 2026-01-23 09:58:54.993 227766 DEBUG nova.network.neutron [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Updating instance_info_cache with network_info: [{"id": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "address": "fa:16:3e:37:24:09", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ae0badd-de", "ovs_interfaceid": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a1541348-c83c-4165-8181-1f5dddb145af", "address": "fa:16:3e:17:8c:ae", "network": {"id": "1b009808-c7c2-4bc8-995b-b11e0fa9f5b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1217344448", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1541348-c8", "ovs_interfaceid": "a1541348-c83c-4165-8181-1f5dddb145af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "address": "fa:16:3e:71:2a:14", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e41c282-16", "ovs_interfaceid": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.995 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8a278aba-bf98-4bf8-b8fa-eaa801a54796]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.996 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-6c5732b3-3484-43db-a231-53d04de40d61
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/6c5732b3-3484-43db-a231-53d04de40d61.pid.haproxy
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 6c5732b3-3484-43db-a231-53d04de40d61
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:58:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:54.997 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'env', 'PROCESS_TAG=haproxy-6c5732b3-3484-43db-a231-53d04de40d61', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6c5732b3-3484-43db-a231-53d04de40d61.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.024 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Releasing lock "refresh_cache-30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.024 227766 DEBUG nova.compute.manager [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Instance network_info: |[{"id": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "address": "fa:16:3e:37:24:09", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ae0badd-de", "ovs_interfaceid": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a1541348-c83c-4165-8181-1f5dddb145af", "address": "fa:16:3e:17:8c:ae", "network": {"id": "1b009808-c7c2-4bc8-995b-b11e0fa9f5b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1217344448", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, 
"type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1541348-c8", "ovs_interfaceid": "a1541348-c83c-4165-8181-1f5dddb145af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "address": "fa:16:3e:71:2a:14", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e41c282-16", "ovs_interfaceid": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.025 227766 DEBUG oslo_concurrency.lockutils [req-1dccc04a-b9c8-4258-bf37-721bebb95328 req-edbdea64-0b23-4b1a-9195-83884b6f8da4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.025 227766 DEBUG nova.network.neutron [req-1dccc04a-b9c8-4258-bf37-721bebb95328 req-edbdea64-0b23-4b1a-9195-83884b6f8da4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Refreshing network info cache for port 0e41c282-1666-4ce9-aa23-76ee3e40aed8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.029 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Start _get_guest_xml network_info=[{"id": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "address": "fa:16:3e:37:24:09", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ae0badd-de", "ovs_interfaceid": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a1541348-c83c-4165-8181-1f5dddb145af", "address": "fa:16:3e:17:8c:ae", "network": {"id": "1b009808-c7c2-4bc8-995b-b11e0fa9f5b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1217344448", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, 
"tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1541348-c8", "ovs_interfaceid": "a1541348-c83c-4165-8181-1f5dddb145af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "address": "fa:16:3e:71:2a:14", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e41c282-16", "ovs_interfaceid": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.033 227766 WARNING nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.038 227766 DEBUG nova.virt.libvirt.host [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.039 227766 DEBUG nova.virt.libvirt.host [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.049 227766 DEBUG nova.virt.libvirt.host [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.050 227766 DEBUG nova.virt.libvirt.host [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.051 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.051 227766 DEBUG nova.virt.hardware [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.052 227766 DEBUG nova.virt.hardware [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.052 227766 DEBUG nova.virt.hardware [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.052 227766 DEBUG nova.virt.hardware [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.053 227766 DEBUG nova.virt.hardware [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.053 227766 DEBUG nova.virt.hardware [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.053 227766 DEBUG nova.virt.hardware [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.053 227766 DEBUG nova.virt.hardware [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.054 227766 DEBUG nova.virt.hardware [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.054 227766 DEBUG nova.virt.hardware [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.054 227766 DEBUG nova.virt.hardware [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.057 227766 DEBUG oslo_concurrency.processutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.200 227766 DEBUG nova.compute.manager [req-34e39548-4bc3-4b05-96ff-b36286e2a8dd req-bebb2336-6027-4539-b709-5829bcb3046e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Received event network-vif-plugged-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.200 227766 DEBUG oslo_concurrency.lockutils [req-34e39548-4bc3-4b05-96ff-b36286e2a8dd req-bebb2336-6027-4539-b709-5829bcb3046e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.201 227766 DEBUG oslo_concurrency.lockutils [req-34e39548-4bc3-4b05-96ff-b36286e2a8dd req-bebb2336-6027-4539-b709-5829bcb3046e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.201 227766 DEBUG oslo_concurrency.lockutils [req-34e39548-4bc3-4b05-96ff-b36286e2a8dd req-bebb2336-6027-4539-b709-5829bcb3046e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.201 227766 DEBUG nova.compute.manager [req-34e39548-4bc3-4b05-96ff-b36286e2a8dd req-bebb2336-6027-4539-b709-5829bcb3046e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Processing event network-vif-plugged-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:58:55 np0005593234 podman[267217]: 2026-01-23 09:58:55.340975749 +0000 UTC m=+0.026277842 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:58:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:58:55 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/793611343' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:58:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:58:55 np0005593234 podman[267217]: 2026-01-23 09:58:55.553764626 +0000 UTC m=+0.239066699 container create 5820f9527011f8a2c8e0678a730f9077ec470e0abbf7790875203b55463bcd44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.560 227766 DEBUG oslo_concurrency.processutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.591 227766 DEBUG nova.storage.rbd_utils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] rbd image 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:58:55 np0005593234 systemd[1]: Started libpod-conmon-5820f9527011f8a2c8e0678a730f9077ec470e0abbf7790875203b55463bcd44.scope.
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.597 227766 DEBUG oslo_concurrency.processutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:55 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:58:55 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e60ab3257327bebe14d4db635abf4de8bcfba80b68594f914ea5a449743daadb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:58:55 np0005593234 podman[267217]: 2026-01-23 09:58:55.638936145 +0000 UTC m=+0.324238208 container init 5820f9527011f8a2c8e0678a730f9077ec470e0abbf7790875203b55463bcd44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 04:58:55 np0005593234 podman[267217]: 2026-01-23 09:58:55.645444869 +0000 UTC m=+0.330746932 container start 5820f9527011f8a2c8e0678a730f9077ec470e0abbf7790875203b55463bcd44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:58:55 np0005593234 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[267285]: [NOTICE]   (267298) : New worker (267300) forked
Jan 23 04:58:55 np0005593234 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[267285]: [NOTICE]   (267298) : Loading success.
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.722 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162335.7222412, 11d58e6c-38fd-4f34-9d0f-102df6aee42b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.723 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] VM Started (Lifecycle Event)#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.727 227766 DEBUG nova.compute.manager [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.731 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.735 227766 INFO nova.virt.libvirt.driver [-] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Instance spawned successfully.#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.736 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.769 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.771 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.772 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.772 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.773 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.773 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.774 227766 DEBUG nova.virt.libvirt.driver [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.787 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.852 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.853 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162335.7262275, 11d58e6c-38fd-4f34-9d0f-102df6aee42b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.853 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.858 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.902 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.905 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162335.7302015, 11d58e6c-38fd-4f34-9d0f-102df6aee42b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.905 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.919 227766 INFO nova.compute.manager [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Took 15.95 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.920 227766 DEBUG nova.compute.manager [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.941 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.944 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:58:55 np0005593234 nova_compute[227762]: 2026-01-23 09:58:55.985 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:58:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:58:56 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2839001502' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.046 227766 INFO nova.compute.manager [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Took 17.87 seconds to build instance.#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.063 227766 DEBUG oslo_concurrency.processutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.066 227766 DEBUG nova.virt.libvirt.vif [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-598627440',display_name='tempest-ServersTestMultiNic-server-598627440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-598627440',id=90,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-cctpnia6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-546513917-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:27Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "address": "fa:16:3e:37:24:09", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ae0badd-de", "ovs_interfaceid": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.067 227766 DEBUG nova.network.os_vif_util [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "address": "fa:16:3e:37:24:09", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ae0badd-de", "ovs_interfaceid": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.068 227766 DEBUG nova.network.os_vif_util [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:24:09,bridge_name='br-int',has_traffic_filtering=True,id=3ae0badd-deb5-430c-8cef-bb36b7ca7eea,network=Network(ef004289-2bc3-4dae-bfd1-9d2d36a65be8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ae0badd-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.069 227766 DEBUG nova.virt.libvirt.vif [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-598627440',display_name='tempest-ServersTestMultiNic-server-598627440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-598627440',id=90,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-cctpnia6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-546513917-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:27Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a1541348-c83c-4165-8181-1f5dddb145af", "address": "fa:16:3e:17:8c:ae", "network": {"id": "1b009808-c7c2-4bc8-995b-b11e0fa9f5b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1217344448", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1541348-c8", "ovs_interfaceid": "a1541348-c83c-4165-8181-1f5dddb145af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.069 227766 DEBUG nova.network.os_vif_util [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "a1541348-c83c-4165-8181-1f5dddb145af", "address": "fa:16:3e:17:8c:ae", "network": {"id": "1b009808-c7c2-4bc8-995b-b11e0fa9f5b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1217344448", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1541348-c8", "ovs_interfaceid": "a1541348-c83c-4165-8181-1f5dddb145af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.070 227766 DEBUG nova.network.os_vif_util [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:8c:ae,bridge_name='br-int',has_traffic_filtering=True,id=a1541348-c83c-4165-8181-1f5dddb145af,network=Network(1b009808-c7c2-4bc8-995b-b11e0fa9f5b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1541348-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.071 227766 DEBUG nova.virt.libvirt.vif [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-598627440',display_name='tempest-ServersTestMultiNic-server-598627440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-598627440',id=90,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-cctpnia6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-546513917-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:27Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "address": "fa:16:3e:71:2a:14", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e41c282-16", "ovs_interfaceid": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.071 227766 DEBUG nova.network.os_vif_util [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "address": "fa:16:3e:71:2a:14", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e41c282-16", "ovs_interfaceid": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.071 227766 DEBUG nova.network.os_vif_util [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:2a:14,bridge_name='br-int',has_traffic_filtering=True,id=0e41c282-1666-4ce9-aa23-76ee3e40aed8,network=Network(ef004289-2bc3-4dae-bfd1-9d2d36a65be8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e41c282-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.073 227766 DEBUG nova.objects.instance [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lazy-loading 'pci_devices' on Instance uuid 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.078 227766 DEBUG oslo_concurrency.lockutils [None req-a86f2d7d-549a-46be-8d76-9e5f7e2dc573 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.082s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:58:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:56.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.107 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  <uuid>30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c</uuid>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  <name>instance-0000005a</name>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServersTestMultiNic-server-598627440</nova:name>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:58:55</nova:creationTime>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <nova:user uuid="57e3c530deab46758172af6777c8c108">tempest-ServersTestMultiNic-546513917-project-member</nova:user>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <nova:project uuid="d557095954714e01b800ed2898d27593">tempest-ServersTestMultiNic-546513917</nova:project>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <nova:port uuid="3ae0badd-deb5-430c-8cef-bb36b7ca7eea">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.45" ipVersion="4"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <nova:port uuid="a1541348-c83c-4165-8181-1f5dddb145af">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.1.157" ipVersion="4"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <nova:port uuid="0e41c282-1666-4ce9-aa23-76ee3e40aed8">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.208" ipVersion="4"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <entry name="serial">30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c</entry>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <entry name="uuid">30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c</entry>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk.config">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:37:24:09"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <target dev="tap3ae0badd-de"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:17:8c:ae"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <target dev="tapa1541348-c8"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:71:2a:14"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <target dev="tap0e41c282-16"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c/console.log" append="off"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:58:56 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:58:56 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:58:56 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:58:56 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.107 227766 DEBUG nova.compute.manager [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Preparing to wait for external event network-vif-plugged-3ae0badd-deb5-430c-8cef-bb36b7ca7eea prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.108 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.108 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.108 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.108 227766 DEBUG nova.compute.manager [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Preparing to wait for external event network-vif-plugged-a1541348-c83c-4165-8181-1f5dddb145af prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.108 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.108 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.109 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.109 227766 DEBUG nova.compute.manager [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Preparing to wait for external event network-vif-plugged-0e41c282-1666-4ce9-aa23-76ee3e40aed8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.109 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.109 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.109 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.110 227766 DEBUG nova.virt.libvirt.vif [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-598627440',display_name='tempest-ServersTestMultiNic-server-598627440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-598627440',id=90,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-cctpnia6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-546513917-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:27Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "address": "fa:16:3e:37:24:09", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ae0badd-de", "ovs_interfaceid": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.110 227766 DEBUG nova.network.os_vif_util [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "address": "fa:16:3e:37:24:09", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ae0badd-de", "ovs_interfaceid": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.110 227766 DEBUG nova.network.os_vif_util [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:24:09,bridge_name='br-int',has_traffic_filtering=True,id=3ae0badd-deb5-430c-8cef-bb36b7ca7eea,network=Network(ef004289-2bc3-4dae-bfd1-9d2d36a65be8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ae0badd-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.111 227766 DEBUG os_vif [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:24:09,bridge_name='br-int',has_traffic_filtering=True,id=3ae0badd-deb5-430c-8cef-bb36b7ca7eea,network=Network(ef004289-2bc3-4dae-bfd1-9d2d36a65be8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ae0badd-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.111 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.112 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.112 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.115 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.115 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ae0badd-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.115 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3ae0badd-de, col_values=(('external_ids', {'iface-id': '3ae0badd-deb5-430c-8cef-bb36b7ca7eea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:37:24:09', 'vm-uuid': '30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:56 np0005593234 NetworkManager[48942]: <info>  [1769162336.1180] manager: (tap3ae0badd-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/152)
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.119 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.123 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.124 227766 INFO os_vif [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:24:09,bridge_name='br-int',has_traffic_filtering=True,id=3ae0badd-deb5-430c-8cef-bb36b7ca7eea,network=Network(ef004289-2bc3-4dae-bfd1-9d2d36a65be8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ae0badd-de')#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.124 227766 DEBUG nova.virt.libvirt.vif [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-598627440',display_name='tempest-ServersTestMultiNic-server-598627440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-598627440',id=90,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-cctpnia6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-546513917-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:27Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a1541348-c83c-4165-8181-1f5dddb145af", "address": "fa:16:3e:17:8c:ae", "network": {"id": "1b009808-c7c2-4bc8-995b-b11e0fa9f5b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1217344448", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1541348-c8", "ovs_interfaceid": "a1541348-c83c-4165-8181-1f5dddb145af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.125 227766 DEBUG nova.network.os_vif_util [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "a1541348-c83c-4165-8181-1f5dddb145af", "address": "fa:16:3e:17:8c:ae", "network": {"id": "1b009808-c7c2-4bc8-995b-b11e0fa9f5b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1217344448", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1541348-c8", "ovs_interfaceid": "a1541348-c83c-4165-8181-1f5dddb145af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.125 227766 DEBUG nova.network.os_vif_util [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:8c:ae,bridge_name='br-int',has_traffic_filtering=True,id=a1541348-c83c-4165-8181-1f5dddb145af,network=Network(1b009808-c7c2-4bc8-995b-b11e0fa9f5b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1541348-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.125 227766 DEBUG os_vif [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:8c:ae,bridge_name='br-int',has_traffic_filtering=True,id=a1541348-c83c-4165-8181-1f5dddb145af,network=Network(1b009808-c7c2-4bc8-995b-b11e0fa9f5b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1541348-c8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.126 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.126 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.126 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.128 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.128 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa1541348-c8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.129 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa1541348-c8, col_values=(('external_ids', {'iface-id': 'a1541348-c83c-4165-8181-1f5dddb145af', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:17:8c:ae', 'vm-uuid': '30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:56 np0005593234 NetworkManager[48942]: <info>  [1769162336.1309] manager: (tapa1541348-c8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.133 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.135 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.136 227766 INFO os_vif [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:8c:ae,bridge_name='br-int',has_traffic_filtering=True,id=a1541348-c83c-4165-8181-1f5dddb145af,network=Network(1b009808-c7c2-4bc8-995b-b11e0fa9f5b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1541348-c8')#033[00m
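The plug sequence above (Converting VIF → Converted object → Plugging → Successfully plugged) shows the tap device name `tapa1541348-c8` being derived from the Neutron port ID `a1541348-c83c-...`. A minimal sketch of that derivation — illustrative only, not nova's actual implementation:

```python
# Linux interface names are limited to IFNAMSIZ (16 bytes, 15 usable chars);
# nova truncates "tap" + <port UUID> well under that limit, which is why the
# log shows "tapa1541348-c8" rather than the full UUID.
NIC_NAME_LEN = 14  # assumed constant for this sketch

def tap_name(port_id: str) -> str:
    """Derive the tap device name for a Neutron port, e.g. 'tapa1541348-c8'."""
    return ("tap" + port_id)[:NIC_NAME_LEN]

print(tap_name("a1541348-c83c-4165-8181-1f5dddb145af"))  # tapa1541348-c8
```

This is also why the NetworkManager and kernel lines later in the log refer to the same `tapa1541348-c8` / `tap0e41c282-16` devices: all components derive the name from the same port ID prefix.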
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.137 227766 DEBUG nova.virt.libvirt.vif [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-598627440',display_name='tempest-ServersTestMultiNic-server-598627440',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-598627440',id=90,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-cctpnia6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-5
46513917-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:58:27Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "address": "fa:16:3e:71:2a:14", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e41c282-16", "ovs_interfaceid": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.137 227766 DEBUG nova.network.os_vif_util [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "address": "fa:16:3e:71:2a:14", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e41c282-16", "ovs_interfaceid": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.138 227766 DEBUG nova.network.os_vif_util [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:2a:14,bridge_name='br-int',has_traffic_filtering=True,id=0e41c282-1666-4ce9-aa23-76ee3e40aed8,network=Network(ef004289-2bc3-4dae-bfd1-9d2d36a65be8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e41c282-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.138 227766 DEBUG os_vif [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:2a:14,bridge_name='br-int',has_traffic_filtering=True,id=0e41c282-1666-4ce9-aa23-76ee3e40aed8,network=Network(ef004289-2bc3-4dae-bfd1-9d2d36a65be8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e41c282-16') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.138 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.139 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.139 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.141 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.141 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0e41c282-16, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.141 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0e41c282-16, col_values=(('external_ids', {'iface-id': '0e41c282-1666-4ce9-aa23-76ee3e40aed8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:71:2a:14', 'vm-uuid': '30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
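The two ovsdbapp commands in one transaction above (`AddPortCommand` at idx=0, `DbSetCommand` on the Interface's `external_ids` at idx=1) are equivalent to a single atomic `ovs-vsctl` invocation joined with `--`. A hedged sketch that only builds the argv (values copied from the log; the helper name is ours):

```python
def build_plug_argv(bridge: str, port: str, external_ids: dict) -> list:
    """Build the ovs-vsctl argv equivalent to the AddPortCommand + DbSetCommand
    pair in the log. Chaining sub-commands with '--' makes ovs-vsctl commit
    them in one OVSDB transaction, mirroring ovsdbapp's single-txn commit."""
    argv = ["ovs-vsctl",
            "--", "--may-exist", "add-port", bridge, port,
            "--", "set", "Interface", port]
    argv += [f"external_ids:{k}={v}" for k, v in external_ids.items()]
    return argv

cmd = build_plug_argv(
    "br-int", "tap0e41c282-16",
    {"iface-id": "0e41c282-1666-4ce9-aa23-76ee3e40aed8",
     "iface-status": "active",
     "attached-mac": "fa:16:3e:71:2a:14",
     "vm-uuid": "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c"})
```

The `iface-id` written into `external_ids` is what ovn-controller later matches against the Port_Binding's `logical_port`, producing the "Claiming lport ..." lines further down.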
Jan 23 04:58:56 np0005593234 NetworkManager[48942]: <info>  [1769162336.1433] manager: (tap0e41c282-16): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/154)
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.145 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.152 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.153 227766 INFO os_vif [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:2a:14,bridge_name='br-int',has_traffic_filtering=True,id=0e41c282-1666-4ce9-aa23-76ee3e40aed8,network=Network(ef004289-2bc3-4dae-bfd1-9d2d36a65be8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e41c282-16')#033[00m
Jan 23 04:58:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:56.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.357 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.357 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.358 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] No VIF found with MAC fa:16:3e:37:24:09, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.358 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] No VIF found with MAC fa:16:3e:17:8c:ae, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.358 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] No VIF found with MAC fa:16:3e:71:2a:14, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.359 227766 INFO nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Using config drive#033[00m
Jan 23 04:58:56 np0005593234 nova_compute[227762]: 2026-01-23 09:58:56.388 227766 DEBUG nova.storage.rbd_utils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] rbd image 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:58:57 np0005593234 nova_compute[227762]: 2026-01-23 09:58:57.635 227766 INFO nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Creating config drive at /var/lib/nova/instances/30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c/disk.config#033[00m
Jan 23 04:58:57 np0005593234 nova_compute[227762]: 2026-01-23 09:58:57.640 227766 DEBUG oslo_concurrency.processutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmsagjxbi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:57 np0005593234 nova_compute[227762]: 2026-01-23 09:58:57.768 227766 DEBUG oslo_concurrency.processutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmsagjxbi" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:57 np0005593234 nova_compute[227762]: 2026-01-23 09:58:57.903 227766 DEBUG nova.storage.rbd_utils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] rbd image 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:58:57 np0005593234 nova_compute[227762]: 2026-01-23 09:58:57.908 227766 DEBUG oslo_concurrency.processutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c/disk.config 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:58:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:58.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:58:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:58:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:58.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.279 227766 DEBUG nova.compute.manager [req-3e183e0f-2a65-4885-80d2-291428d372de req-07a22958-c6d7-407e-9c3e-35a3ee8410ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Received event network-vif-plugged-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.279 227766 DEBUG oslo_concurrency.lockutils [req-3e183e0f-2a65-4885-80d2-291428d372de req-07a22958-c6d7-407e-9c3e-35a3ee8410ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.279 227766 DEBUG oslo_concurrency.lockutils [req-3e183e0f-2a65-4885-80d2-291428d372de req-07a22958-c6d7-407e-9c3e-35a3ee8410ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.280 227766 DEBUG oslo_concurrency.lockutils [req-3e183e0f-2a65-4885-80d2-291428d372de req-07a22958-c6d7-407e-9c3e-35a3ee8410ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.280 227766 DEBUG nova.compute.manager [req-3e183e0f-2a65-4885-80d2-291428d372de req-07a22958-c6d7-407e-9c3e-35a3ee8410ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] No waiting events found dispatching network-vif-plugged-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.280 227766 WARNING nova.compute.manager [req-3e183e0f-2a65-4885-80d2-291428d372de req-07a22958-c6d7-407e-9c3e-35a3ee8410ec 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Received unexpected event network-vif-plugged-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 for instance with vm_state active and task_state None.#033[00m
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.620 227766 DEBUG oslo_concurrency.processutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c/disk.config 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.712s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.621 227766 INFO nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Deleting local config drive /var/lib/nova/instances/30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c/disk.config because it was imported into RBD.#033[00m
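Because this deployment uses the RBD image backend, the locally built ISO is then uploaded into the Ceph `vms` pool and the local copy deleted, as the two lines above show. A sketch of the import step's argv (helper name is ours; this mirrors the logged command, it is not nova's code):

```python
def rbd_import_argv(pool: str, local_path: str, image_name: str,
                    client_id: str = "openstack",
                    conf: str = "/etc/ceph/ceph.conf") -> list:
    """Argv matching the log's 'rbd import': upload a local file into the
    given Ceph pool as a format-2 RBD image (format 2 enables cloning,
    snapshots, and the other features nova's RBD backend relies on)."""
    return ["rbd", "import", "--pool", pool, local_path, image_name,
            "--image-format=2", "--id", client_id, "--conf", conf]

argv = rbd_import_argv(
    "vms",
    "/var/lib/nova/instances/30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c/disk.config",
    "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_disk.config")
```

After a successful import the instance's config drive lives entirely in Ceph, which is why the earlier `rbd_utils` probes for `..._disk.config does not exist` stop once this step completes.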
Jan 23 04:58:58 np0005593234 kernel: tap3ae0badd-de: entered promiscuous mode
Jan 23 04:58:58 np0005593234 NetworkManager[48942]: <info>  [1769162338.6690] manager: (tap3ae0badd-de): new Tun device (/org/freedesktop/NetworkManager/Devices/155)
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.677 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:58 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:58Z|00298|binding|INFO|Claiming lport 3ae0badd-deb5-430c-8cef-bb36b7ca7eea for this chassis.
Jan 23 04:58:58 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:58Z|00299|binding|INFO|3ae0badd-deb5-430c-8cef-bb36b7ca7eea: Claiming fa:16:3e:37:24:09 10.100.0.45
Jan 23 04:58:58 np0005593234 NetworkManager[48942]: <info>  [1769162338.6910] manager: (tapa1541348-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/156)
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.690 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:24:09 10.100.0.45'], port_security=['fa:16:3e:37:24:09 10.100.0.45'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.45/24', 'neutron:device_id': '30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd557095954714e01b800ed2898d27593', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd5947ff6-b7cd-4a55-bcbe-de55519d051c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d31f115-0250-45c9-a1b4-d3823a4c1297, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=3ae0badd-deb5-430c-8cef-bb36b7ca7eea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.692 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 3ae0badd-deb5-430c-8cef-bb36b7ca7eea in datapath ef004289-2bc3-4dae-bfd1-9d2d36a65be8 bound to our chassis#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.694 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ef004289-2bc3-4dae-bfd1-9d2d36a65be8#033[00m
Jan 23 04:58:58 np0005593234 NetworkManager[48942]: <info>  [1769162338.7046] manager: (tap0e41c282-16): new Tun device (/org/freedesktop/NetworkManager/Devices/157)
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.706 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6e781bc6-7db6-41c8-938e-9058c8f0bab5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.707 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapef004289-21 in ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:58:58 np0005593234 systemd-udevd[267420]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:58:58 np0005593234 systemd-udevd[267421]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:58:58 np0005593234 systemd-udevd[267419]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.711 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapef004289-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.711 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[71cd66eb-03bd-49ba-a23f-87893488bc09]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.713 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[34bdaade-c77a-4099-8406-b45d81629411]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 kernel: tapa1541348-c8: entered promiscuous mode
Jan 23 04:58:58 np0005593234 kernel: tap0e41c282-16: entered promiscuous mode
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.723 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:58 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:58Z|00300|binding|INFO|Claiming lport a1541348-c83c-4165-8181-1f5dddb145af for this chassis.
Jan 23 04:58:58 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:58Z|00301|binding|INFO|a1541348-c83c-4165-8181-1f5dddb145af: Claiming fa:16:3e:17:8c:ae 10.100.1.157
Jan 23 04:58:58 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:58Z|00302|binding|INFO|Claiming lport 0e41c282-1666-4ce9-aa23-76ee3e40aed8 for this chassis.
Jan 23 04:58:58 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:58Z|00303|binding|INFO|0e41c282-1666-4ce9-aa23-76ee3e40aed8: Claiming fa:16:3e:71:2a:14 10.100.0.208
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.727 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[95a22d92-2992-427e-b22a-f4f092391bf2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 NetworkManager[48942]: <info>  [1769162338.7319] device (tapa1541348-c8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:58:58 np0005593234 NetworkManager[48942]: <info>  [1769162338.7330] device (tap3ae0badd-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:58:58 np0005593234 NetworkManager[48942]: <info>  [1769162338.7356] device (tapa1541348-c8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.730 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.735 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:8c:ae 10.100.1.157'], port_security=['fa:16:3e:17:8c:ae 10.100.1.157'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.157/24', 'neutron:device_id': '30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd557095954714e01b800ed2898d27593', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd5947ff6-b7cd-4a55-bcbe-de55519d051c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a30405c0-470a-4f97-ad9f-b7aa6ff409cb, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=a1541348-c83c-4165-8181-1f5dddb145af) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:58:58 np0005593234 NetworkManager[48942]: <info>  [1769162338.7366] device (tap3ae0badd-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.737 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:2a:14 10.100.0.208'], port_security=['fa:16:3e:71:2a:14 10.100.0.208'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.208/24', 'neutron:device_id': '30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd557095954714e01b800ed2898d27593', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd5947ff6-b7cd-4a55-bcbe-de55519d051c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d31f115-0250-45c9-a1b4-d3823a4c1297, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=0e41c282-1666-4ce9-aa23-76ee3e40aed8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:58:58 np0005593234 NetworkManager[48942]: <info>  [1769162338.7380] device (tap0e41c282-16): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:58:58 np0005593234 NetworkManager[48942]: <info>  [1769162338.7392] device (tap0e41c282-16): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:58:58 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:58Z|00304|binding|INFO|Setting lport 3ae0badd-deb5-430c-8cef-bb36b7ca7eea ovn-installed in OVS
Jan 23 04:58:58 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:58Z|00305|binding|INFO|Setting lport 3ae0badd-deb5-430c-8cef-bb36b7ca7eea up in Southbound
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.740 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:58 np0005593234 systemd-machined[195626]: New machine qemu-37-instance-0000005a.
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.753 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1206a225-6116-43c5-a0cc-93d29bdcc847]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 systemd[1]: Started Virtual Machine qemu-37-instance-0000005a.
Jan 23 04:58:58 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:58Z|00306|binding|INFO|Setting lport 0e41c282-1666-4ce9-aa23-76ee3e40aed8 ovn-installed in OVS
Jan 23 04:58:58 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:58Z|00307|binding|INFO|Setting lport 0e41c282-1666-4ce9-aa23-76ee3e40aed8 up in Southbound
Jan 23 04:58:58 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:58Z|00308|binding|INFO|Setting lport a1541348-c83c-4165-8181-1f5dddb145af ovn-installed in OVS
Jan 23 04:58:58 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:58Z|00309|binding|INFO|Setting lport a1541348-c83c-4165-8181-1f5dddb145af up in Southbound
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.786 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.786 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[87584e72-bcf2-400d-9a1d-cc51ca1c546d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.795 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[67e3d8bc-311c-414d-9a81-43387705cbcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 NetworkManager[48942]: <info>  [1769162338.7970] manager: (tapef004289-20): new Veth device (/org/freedesktop/NetworkManager/Devices/158)
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.823 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[8af151e2-f64a-4093-874c-7e259c226308]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.827 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[bac151b9-4cd9-4b31-afa1-cb25b8bcdbda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 NetworkManager[48942]: <info>  [1769162338.8502] device (tapef004289-20): carrier: link connected
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.854 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[2a66e7b7-cdc5-41a2-95f1-f9ef929d2d0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.876 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c6dd54-8238-45aa-bcff-36f9c2528565]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef004289-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:3e:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623380, 'reachable_time': 40355, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267457, 'error': None, 'target': 'ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.893 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8bc07249-32f9-4d16-bfe5-8aa8e2303a30]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe48:3e8f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623380, 'tstamp': 623380}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267458, 'error': None, 'target': 'ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.909 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fe1a1a1b-e16f-43df-9e51-0de869eefa59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef004289-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:3e:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623380, 'reachable_time': 40355, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 267459, 'error': None, 'target': 'ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.937 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3d19ae82-c4ca-4e31-a9ca-9e7e78d9449e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.996 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[68e90944-cfe5-4069-a22f-12d2e5526468]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.997 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef004289-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.997 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.996 227766 DEBUG nova.network.neutron [req-1dccc04a-b9c8-4258-bf37-721bebb95328 req-edbdea64-0b23-4b1a-9195-83884b6f8da4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Updated VIF entry in instance network info cache for port 0e41c282-1666-4ce9-aa23-76ee3e40aed8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:58.997 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef004289-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:58 np0005593234 nova_compute[227762]: 2026-01-23 09:58:58.998 227766 DEBUG nova.network.neutron [req-1dccc04a-b9c8-4258-bf37-721bebb95328 req-edbdea64-0b23-4b1a-9195-83884b6f8da4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Updating instance_info_cache with network_info: [{"id": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "address": "fa:16:3e:37:24:09", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ae0badd-de", "ovs_interfaceid": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "a1541348-c83c-4165-8181-1f5dddb145af", "address": "fa:16:3e:17:8c:ae", "network": {"id": "1b009808-c7c2-4bc8-995b-b11e0fa9f5b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1217344448", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1541348-c8", "ovs_interfaceid": "a1541348-c83c-4165-8181-1f5dddb145af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "address": "fa:16:3e:71:2a:14", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e41c282-16", "ovs_interfaceid": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:58:59 np0005593234 NetworkManager[48942]: <info>  [1769162339.0000] manager: (tapef004289-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/159)
Jan 23 04:58:59 np0005593234 kernel: tapef004289-20: entered promiscuous mode
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.002 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:59 np0005593234 ovn_controller[134547]: 2026-01-23T09:58:59Z|00310|binding|INFO|Releasing lport 7f620c47-7119-4933-a36c-a82d159d6fc0 from this chassis (sb_readonly=0)
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:59.003 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapef004289-20, col_values=(('external_ids', {'iface-id': '7f620c47-7119-4933-a36c-a82d159d6fc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.032 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:59.034 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ef004289-2bc3-4dae-bfd1-9d2d36a65be8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ef004289-2bc3-4dae-bfd1-9d2d36a65be8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:59.035 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7ae5c7b2-8068-478d-8395-1494f5a2987a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:59.036 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-ef004289-2bc3-4dae-bfd1-9d2d36a65be8
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/ef004289-2bc3-4dae-bfd1-9d2d36a65be8.pid.haproxy
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID ef004289-2bc3-4dae-bfd1-9d2d36a65be8
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:58:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:58:59.037 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'env', 'PROCESS_TAG=haproxy-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ef004289-2bc3-4dae-bfd1-9d2d36a65be8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.045 227766 DEBUG oslo_concurrency.lockutils [req-1dccc04a-b9c8-4258-bf37-721bebb95328 req-edbdea64-0b23-4b1a-9195-83884b6f8da4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.267 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162339.2667966, 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.268 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] VM Started (Lifecycle Event)#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.341 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.345 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162339.2671156, 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.346 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.406 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.410 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.446 227766 DEBUG nova.compute.manager [req-aebe2adc-d505-4c2d-933f-b77fb13f111d req-f5bc2f58-faa7-40e3-801c-355a6e8b73c9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-plugged-0e41c282-1666-4ce9-aa23-76ee3e40aed8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.447 227766 DEBUG oslo_concurrency.lockutils [req-aebe2adc-d505-4c2d-933f-b77fb13f111d req-f5bc2f58-faa7-40e3-801c-355a6e8b73c9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.447 227766 DEBUG oslo_concurrency.lockutils [req-aebe2adc-d505-4c2d-933f-b77fb13f111d req-f5bc2f58-faa7-40e3-801c-355a6e8b73c9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.447 227766 DEBUG oslo_concurrency.lockutils [req-aebe2adc-d505-4c2d-933f-b77fb13f111d req-f5bc2f58-faa7-40e3-801c-355a6e8b73c9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.448 227766 DEBUG nova.compute.manager [req-aebe2adc-d505-4c2d-933f-b77fb13f111d req-f5bc2f58-faa7-40e3-801c-355a6e8b73c9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Processing event network-vif-plugged-0e41c282-1666-4ce9-aa23-76ee3e40aed8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.449 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:58:59 np0005593234 podman[267535]: 2026-01-23 09:58:59.40773034 +0000 UTC m=+0.023697961 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.748 227766 DEBUG oslo_concurrency.lockutils [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.750 227766 DEBUG oslo_concurrency.lockutils [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.750 227766 DEBUG oslo_concurrency.lockutils [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.750 227766 DEBUG oslo_concurrency.lockutils [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.751 227766 DEBUG oslo_concurrency.lockutils [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.752 227766 INFO nova.compute.manager [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Terminating instance#033[00m
Jan 23 04:58:59 np0005593234 nova_compute[227762]: 2026-01-23 09:58:59.754 227766 DEBUG nova.compute.manager [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:59:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:00.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:00 np0005593234 podman[267535]: 2026-01-23 09:59:00.154405481 +0000 UTC m=+0.770373072 container create 3a8fcde36daaa3f60381907a1a42779bde9e5ebd2942a8e9160f135a6241c2e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:59:00 np0005593234 kernel: tap0260ea0f-e0 (unregistering): left promiscuous mode
Jan 23 04:59:00 np0005593234 NetworkManager[48942]: <info>  [1769162340.1997] device (tap0260ea0f-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:59:00 np0005593234 systemd[1]: Started libpod-conmon-3a8fcde36daaa3f60381907a1a42779bde9e5ebd2942a8e9160f135a6241c2e5.scope.
Jan 23 04:59:00 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:00Z|00311|binding|INFO|Releasing lport 0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 from this chassis (sb_readonly=0)
Jan 23 04:59:00 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:00Z|00312|binding|INFO|Setting lport 0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 down in Southbound
Jan 23 04:59:00 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:00Z|00313|binding|INFO|Removing iface tap0260ea0f-e0 ovn-installed in OVS
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.211 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.212 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.219 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:80:50 10.100.0.6'], port_security=['fa:16:3e:1b:80:50 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '11d58e6c-38fd-4f34-9d0f-102df6aee42b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c5732b3-3484-43db-a231-53d04de40d61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c288779980de4f03be20b7eed343b775', 'neutron:revision_number': '4', 'neutron:security_group_ids': '288ecf98-3e6e-478c-8e27-86a4106b4ef8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2529943-1c00-4757-827e-798919a83756, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=0260ea0f-e0a8-4506-8b70-c7b52d5a7b48) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.225 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:00 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:59:00 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5c316692c141a408fcd86f8e17006774e3e93c8b7b3465450a6108c31c25fa2f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:00 np0005593234 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000005c.scope: Deactivated successfully.
Jan 23 04:59:00 np0005593234 systemd[1]: machine-qemu\x2d36\x2dinstance\x2d0000005c.scope: Consumed 4.887s CPU time.
Jan 23 04:59:00 np0005593234 systemd-machined[195626]: Machine qemu-36-instance-0000005c terminated.
Jan 23 04:59:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:00.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:00 np0005593234 podman[267535]: 2026-01-23 09:59:00.353053876 +0000 UTC m=+0.969021477 container init 3a8fcde36daaa3f60381907a1a42779bde9e5ebd2942a8e9160f135a6241c2e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:59:00 np0005593234 podman[267535]: 2026-01-23 09:59:00.358333581 +0000 UTC m=+0.974301172 container start 3a8fcde36daaa3f60381907a1a42779bde9e5ebd2942a8e9160f135a6241c2e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 04:59:00 np0005593234 neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8[267556]: [NOTICE]   (267561) : New worker (267570) forked
Jan 23 04:59:00 np0005593234 neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8[267556]: [NOTICE]   (267561) : Loading success.
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.391 227766 INFO nova.virt.libvirt.driver [-] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Instance destroyed successfully.#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.391 227766 DEBUG nova.objects.instance [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lazy-loading 'resources' on Instance uuid 11d58e6c-38fd-4f34-9d0f-102df6aee42b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.412 227766 DEBUG nova.virt.libvirt.vif [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:58:35Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-123958626',display_name='tempest-tempest.common.compute-instance-123958626-2',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-123958626-2',id=92,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=1,launched_at=2026-01-23T09:58:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c288779980de4f03be20b7eed343b775',ramdisk_id='',reservation_id='r-4ujtl3ky',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virt
io',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-351408189',owner_user_name='tempest-MultipleCreateTestJSON-351408189-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:58:56Z,user_data=None,user_id='d83df80213fd40f99fdc68c146fe9a2a',uuid=11d58e6c-38fd-4f34-9d0f-102df6aee42b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "address": "fa:16:3e:1b:80:50", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0260ea0f-e0", "ovs_interfaceid": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.413 227766 DEBUG nova.network.os_vif_util [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converting VIF {"id": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "address": "fa:16:3e:1b:80:50", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0260ea0f-e0", "ovs_interfaceid": "0260ea0f-e0a8-4506-8b70-c7b52d5a7b48", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.413 227766 DEBUG nova.network.os_vif_util [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:80:50,bridge_name='br-int',has_traffic_filtering=True,id=0260ea0f-e0a8-4506-8b70-c7b52d5a7b48,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0260ea0f-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.414 227766 DEBUG os_vif [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:80:50,bridge_name='br-int',has_traffic_filtering=True,id=0260ea0f-e0a8-4506-8b70-c7b52d5a7b48,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0260ea0f-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.416 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.417 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0260ea0f-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.418 144381 INFO neutron.agent.ovn.metadata.agent [-] Port a1541348-c83c-4165-8181-1f5dddb145af in datapath 1b009808-c7c2-4bc8-995b-b11e0fa9f5b1 unbound from our chassis#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.419 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.420 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1b009808-c7c2-4bc8-995b-b11e0fa9f5b1#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.421 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.424 227766 INFO os_vif [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:80:50,bridge_name='br-int',has_traffic_filtering=True,id=0260ea0f-e0a8-4506-8b70-c7b52d5a7b48,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0260ea0f-e0')#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.432 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cb4d8978-586a-4d5c-949e-b9c4b6102d82]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.433 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1b009808-c1 in ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.435 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1b009808-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.435 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d78cee66-8283-418f-a1aa-c07bf0ead9d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.437 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8fe2e83e-bb4c-4b39-acae-9d3caeac4fbb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.450 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[db3186b6-2c98-4994-b37f-513199dc12e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.464 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4d7e625e-9969-4bbb-8195-e7a78578f795]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.499 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[bf053014-f05d-4615-98fb-a541f9ccd929]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 NetworkManager[48942]: <info>  [1769162340.5063] manager: (tap1b009808-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/160)
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.504 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e2693377-e893-4221-95fe-f46af23153bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.540 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[50b70034-4c10-4654-ac3a-62ef8c15e3df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.543 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[05713524-3445-4f63-86a8-1d7aab67ed21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 NetworkManager[48942]: <info>  [1769162340.5681] device (tap1b009808-c0): carrier: link connected
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.573 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[be9ad9e3-fae4-4a16-9399-29add3b6fb0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.592 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[25f91a0a-7d6e-4d38-aca0-a8d2aa05787c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1b009808-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:89:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623552, 'reachable_time': 23438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267642, 'error': None, 'target': 'ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.608 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[226a20f7-d3a5-4587-9c7b-7f40f16e4e6b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe0e:895d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623552, 'tstamp': 623552}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267658, 'error': None, 'target': 'ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.626 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[28dc6ada-339e-4840-b361-2d3d7c27237c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1b009808-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:0e:89:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 98], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623552, 'reachable_time': 23438, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 267661, 'error': None, 'target': 'ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.654 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ab4a4515-ffb1-4b3e-89ee-28a59e017406]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.706 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6a1d2542-e545-4b51-b6a9-8a5a0065b723]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.708 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b009808-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.708 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.708 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b009808-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:00 np0005593234 NetworkManager[48942]: <info>  [1769162340.7110] manager: (tap1b009808-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/161)
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.710 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.713 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:00 np0005593234 kernel: tap1b009808-c0: entered promiscuous mode
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.714 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1b009808-c0, col_values=(('external_ids', {'iface-id': 'eeaa24ca-1bd9-43d3-bb74-640cd40da1d9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.715 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:00 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:00Z|00314|binding|INFO|Releasing lport eeaa24ca-1bd9-43d3-bb74-640cd40da1d9 from this chassis (sb_readonly=0)
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.730 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.731 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1b009808-c7c2-4bc8-995b-b11e0fa9f5b1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1b009808-c7c2-4bc8-995b-b11e0fa9f5b1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.732 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[58d100d2-487c-4579-b863-0c2545f2411b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.733 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/1b009808-c7c2-4bc8-995b-b11e0fa9f5b1.pid.haproxy
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 1b009808-c7c2-4bc8-995b-b11e0fa9f5b1
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:59:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:00.733 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1', 'env', 'PROCESS_TAG=haproxy-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1b009808-c7c2-4bc8-995b-b11e0fa9f5b1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:59:00 np0005593234 nova_compute[227762]: 2026-01-23 09:59:00.859 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 04:59:01 np0005593234 podman[267695]: 2026-01-23 09:59:01.183866326 +0000 UTC m=+0.085867654 container create 32ad24ccda13891d34dc7c77a1c380166a40bf7d912374a9eac3bd746ba8f85d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 04:59:01 np0005593234 podman[267695]: 2026-01-23 09:59:01.126599547 +0000 UTC m=+0.028600885 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:59:01 np0005593234 systemd[1]: Started libpod-conmon-32ad24ccda13891d34dc7c77a1c380166a40bf7d912374a9eac3bd746ba8f85d.scope.
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #91. Immutable memtables: 0.
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.245361) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 55] Flushing memtable with next log file: 91
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341245469, "job": 55, "event": "flush_started", "num_memtables": 1, "num_entries": 1866, "num_deletes": 256, "total_data_size": 4154541, "memory_usage": 4222080, "flush_reason": "Manual Compaction"}
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 55] Level-0 flush table #92: started
Jan 23 04:59:01 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341261156, "cf_name": "default", "job": 55, "event": "table_file_creation", "file_number": 92, "file_size": 2708623, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 46637, "largest_seqno": 48497, "table_properties": {"data_size": 2701150, "index_size": 4352, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16110, "raw_average_key_size": 19, "raw_value_size": 2685897, "raw_average_value_size": 3320, "num_data_blocks": 191, "num_entries": 809, "num_filter_entries": 809, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162190, "oldest_key_time": 1769162190, "file_creation_time": 1769162341, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 92, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 55] Flush lasted 15945 microseconds, and 6552 cpu microseconds.
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.261331) [db/flush_job.cc:967] [default] [JOB 55] Level-0 flush table #92: 2708623 bytes OK
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.261378) [db/memtable_list.cc:519] [default] Level-0 commit table #92 started
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.263358) [db/memtable_list.cc:722] [default] Level-0 commit table #92: memtable #1 done
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.263372) EVENT_LOG_v1 {"time_micros": 1769162341263367, "job": 55, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.263388) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 55] Try to delete WAL files size 4146047, prev total WAL file size 4154825, number of live WAL files 2.
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000088.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.264641) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031353034' seq:72057594037927935, type:22 .. '6C6F676D0031373536' seq:0, type:0; will stop at (end)
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 56] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 55 Base level 0, inputs: [92(2645KB)], [90(10MB)]
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341264767, "job": 56, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [92], "files_L6": [90], "score": -1, "input_data_size": 14230891, "oldest_snapshot_seqno": -1}
Jan 23 04:59:01 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a86087d4c3110b499b142fc4d8bd23d13f060ebed4b956d992a9930d69d90970/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:01 np0005593234 podman[267695]: 2026-01-23 09:59:01.280203885 +0000 UTC m=+0.182205183 container init 32ad24ccda13891d34dc7c77a1c380166a40bf7d912374a9eac3bd746ba8f85d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:59:01 np0005593234 podman[267695]: 2026-01-23 09:59:01.28583623 +0000 UTC m=+0.187837518 container start 32ad24ccda13891d34dc7c77a1c380166a40bf7d912374a9eac3bd746ba8f85d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 04:59:01 np0005593234 neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1[267711]: [NOTICE]   (267715) : New worker (267717) forked
Jan 23 04:59:01 np0005593234 neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1[267711]: [NOTICE]   (267715) : Loading success.
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.337 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 0e41c282-1666-4ce9-aa23-76ee3e40aed8 in datapath ef004289-2bc3-4dae-bfd1-9d2d36a65be8 unbound from our chassis#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.340 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ef004289-2bc3-4dae-bfd1-9d2d36a65be8#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.353 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[00355e82-6dc4-4d9c-ae1e-a78f7f9a2a2c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.385 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[db0f57cd-7097-47c1-ae52-29d8f9451fd7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.389 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e52f5316-decb-47e8-bc0d-9e6c9dafdd3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.417 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[3e30f2c0-fe4e-421d-96b7-bf3a39df5286]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.433 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[acd4b31c-0b13-4fec-9f85-f6645d65f9ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef004289-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:3e:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 5, 'rx_bytes': 532, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623380, 'reachable_time': 40355, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267731, 'error': None, 'target': 'ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.454 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dc005efc-97da-45cb-bde1-9bcd278ffe01]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tapef004289-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623391, 'tstamp': 623391}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267732, 'error': None, 'target': 'ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapef004289-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623394, 'tstamp': 623394}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267732, 'error': None, 'target': 'ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.456 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef004289-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.458 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.459 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.459 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef004289-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.459 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.459 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapef004289-20, col_values=(('external_ids', {'iface-id': '7f620c47-7119-4933-a36c-a82d159d6fc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.460 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.461 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 in datapath 6c5732b3-3484-43db-a231-53d04de40d61 unbound from our chassis#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.462 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6c5732b3-3484-43db-a231-53d04de40d61, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.463 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d7521a9d-0e11-4a9a-848f-76bc5756578d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.463 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 namespace which is not needed anymore#033[00m
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 56] Generated table #93: 7312 keys, 14086276 bytes, temperature: kUnknown
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341568459, "cf_name": "default", "job": 56, "event": "table_file_creation", "file_number": 93, "file_size": 14086276, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14033738, "index_size": 33162, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18309, "raw_key_size": 187641, "raw_average_key_size": 25, "raw_value_size": 13899415, "raw_average_value_size": 1900, "num_data_blocks": 1328, "num_entries": 7312, "num_filter_entries": 7312, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769162341, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 93, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.617 227766 DEBUG nova.compute.manager [req-b35963c7-8df3-4854-8aab-149853d961e4 req-30db7d7b-3b91-4450-a90d-e1ef4b71bfde 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-plugged-0e41c282-1666-4ce9-aa23-76ee3e40aed8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.618 227766 DEBUG oslo_concurrency.lockutils [req-b35963c7-8df3-4854-8aab-149853d961e4 req-30db7d7b-3b91-4450-a90d-e1ef4b71bfde 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.618 227766 DEBUG oslo_concurrency.lockutils [req-b35963c7-8df3-4854-8aab-149853d961e4 req-30db7d7b-3b91-4450-a90d-e1ef4b71bfde 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.618 227766 DEBUG oslo_concurrency.lockutils [req-b35963c7-8df3-4854-8aab-149853d961e4 req-30db7d7b-3b91-4450-a90d-e1ef4b71bfde 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.619 227766 DEBUG nova.compute.manager [req-b35963c7-8df3-4854-8aab-149853d961e4 req-30db7d7b-3b91-4450-a90d-e1ef4b71bfde 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] No event matching network-vif-plugged-0e41c282-1666-4ce9-aa23-76ee3e40aed8 in dict_keys([('network-vif-plugged', '3ae0badd-deb5-430c-8cef-bb36b7ca7eea'), ('network-vif-plugged', 'a1541348-c83c-4165-8181-1f5dddb145af')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.619 227766 WARNING nova.compute.manager [req-b35963c7-8df3-4854-8aab-149853d961e4 req-30db7d7b-3b91-4450-a90d-e1ef4b71bfde 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received unexpected event network-vif-plugged-0e41c282-1666-4ce9-aa23-76ee3e40aed8 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.568777) [db/compaction/compaction_job.cc:1663] [default] [JOB 56] Compacted 1@0 + 1@6 files to L6 => 14086276 bytes
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.627721) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 46.8 rd, 46.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 11.0 +0.0 blob) out(13.4 +0.0 blob), read-write-amplify(10.5) write-amplify(5.2) OK, records in: 7839, records dropped: 527 output_compression: NoCompression
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.627760) EVENT_LOG_v1 {"time_micros": 1769162341627746, "job": 56, "event": "compaction_finished", "compaction_time_micros": 303840, "compaction_time_cpu_micros": 30209, "output_level": 6, "num_output_files": 1, "total_output_size": 14086276, "num_input_records": 7839, "num_output_records": 7312, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000092.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341628446, "job": 56, "event": "table_file_deletion", "file_number": 92}
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000090.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341630686, "job": 56, "event": "table_file_deletion", "file_number": 90}
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.264539) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.630812) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.630817) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.630819) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.630820) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.630822) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #94. Immutable memtables: 0.
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.631284) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 57] Flushing memtable with next log file: 94
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341631312, "job": 57, "event": "flush_started", "num_memtables": 1, "num_entries": 257, "num_deletes": 251, "total_data_size": 23108, "memory_usage": 28944, "flush_reason": "Manual Compaction"}
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 57] Level-0 flush table #95: started
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341684843, "cf_name": "default", "job": 57, "event": "table_file_creation", "file_number": 95, "file_size": 13893, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 48499, "largest_seqno": 48754, "table_properties": {"data_size": 12141, "index_size": 49, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 645, "raw_key_size": 5144, "raw_average_key_size": 20, "raw_value_size": 8731, "raw_average_value_size": 34, "num_data_blocks": 2, "num_entries": 256, "num_filter_entries": 256, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162341, "oldest_key_time": 1769162341, "file_creation_time": 1769162341, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 95, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 57] Flush lasted 53635 microseconds, and 871 cpu microseconds.
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.684917) [db/flush_job.cc:967] [default] [JOB 57] Level-0 flush table #95: 13893 bytes OK
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.684939) [db/memtable_list.cc:519] [default] Level-0 commit table #95 started
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.688503) [db/memtable_list.cc:722] [default] Level-0 commit table #95: memtable #1 done
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.688541) EVENT_LOG_v1 {"time_micros": 1769162341688531, "job": 57, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.688576) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 57] Try to delete WAL files size 21082, prev total WAL file size 21082, number of live WAL files 2.
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000091.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.689110) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031353035' seq:72057594037927935, type:22 .. '6D6772737461740031373537' seq:0, type:0; will stop at (end)
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 58] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 57 Base level 0, inputs: [95(13KB)], [93(13MB)]
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341689169, "job": 58, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [95], "files_L6": [93], "score": -1, "input_data_size": 14100169, "oldest_snapshot_seqno": -1}
Jan 23 04:59:01 np0005593234 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[267285]: [NOTICE]   (267298) : haproxy version is 2.8.14-c23fe91
Jan 23 04:59:01 np0005593234 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[267285]: [NOTICE]   (267298) : path to executable is /usr/sbin/haproxy
Jan 23 04:59:01 np0005593234 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[267285]: [WARNING]  (267298) : Exiting Master process...
Jan 23 04:59:01 np0005593234 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[267285]: [ALERT]    (267298) : Current worker (267300) exited with code 143 (Terminated)
Jan 23 04:59:01 np0005593234 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[267285]: [WARNING]  (267298) : All workers exited. Exiting... (0)
Jan 23 04:59:01 np0005593234 systemd[1]: libpod-5820f9527011f8a2c8e0678a730f9077ec470e0abbf7790875203b55463bcd44.scope: Deactivated successfully.
Jan 23 04:59:01 np0005593234 podman[267749]: 2026-01-23 09:59:01.701452933 +0000 UTC m=+0.153160806 container died 5820f9527011f8a2c8e0678a730f9077ec470e0abbf7790875203b55463bcd44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 58] Generated table #96: 7062 keys, 10259202 bytes, temperature: kUnknown
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341742798, "cf_name": "default", "job": 58, "event": "table_file_creation", "file_number": 96, "file_size": 10259202, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10213279, "index_size": 27187, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17669, "raw_key_size": 182639, "raw_average_key_size": 25, "raw_value_size": 10088208, "raw_average_value_size": 1428, "num_data_blocks": 1078, "num_entries": 7062, "num_filter_entries": 7062, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769162341, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 96, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.743129) [db/compaction/compaction_job.cc:1663] [default] [JOB 58] Compacted 1@0 + 1@6 files to L6 => 10259202 bytes
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.745377) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 262.2 rd, 190.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.0, 13.4 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(1753.4) write-amplify(738.4) OK, records in: 7568, records dropped: 506 output_compression: NoCompression
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.745407) EVENT_LOG_v1 {"time_micros": 1769162341745394, "job": 58, "event": "compaction_finished", "compaction_time_micros": 53775, "compaction_time_cpu_micros": 26635, "output_level": 6, "num_output_files": 1, "total_output_size": 10259202, "num_input_records": 7568, "num_output_records": 7062, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000095.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341745526, "job": 58, "event": "table_file_deletion", "file_number": 95}
Jan 23 04:59:01 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5820f9527011f8a2c8e0678a730f9077ec470e0abbf7790875203b55463bcd44-userdata-shm.mount: Deactivated successfully.
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000093.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162341747633, "job": 58, "event": "table_file_deletion", "file_number": 93}
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.688965) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.747812) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.747816) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.747818) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.747819) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:01.747821) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:01 np0005593234 systemd[1]: var-lib-containers-storage-overlay-e60ab3257327bebe14d4db635abf4de8bcfba80b68594f914ea5a449743daadb-merged.mount: Deactivated successfully.
Jan 23 04:59:01 np0005593234 podman[267749]: 2026-01-23 09:59:01.761533428 +0000 UTC m=+0.213241301 container cleanup 5820f9527011f8a2c8e0678a730f9077ec470e0abbf7790875203b55463bcd44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:59:01 np0005593234 systemd[1]: libpod-conmon-5820f9527011f8a2c8e0678a730f9077ec470e0abbf7790875203b55463bcd44.scope: Deactivated successfully.
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.817 227766 INFO nova.virt.libvirt.driver [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Deleting instance files /var/lib/nova/instances/11d58e6c-38fd-4f34-9d0f-102df6aee42b_del#033[00m
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.818 227766 INFO nova.virt.libvirt.driver [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Deletion of /var/lib/nova/instances/11d58e6c-38fd-4f34-9d0f-102df6aee42b_del complete#033[00m
Jan 23 04:59:01 np0005593234 podman[267780]: 2026-01-23 09:59:01.821713829 +0000 UTC m=+0.039261048 container remove 5820f9527011f8a2c8e0678a730f9077ec470e0abbf7790875203b55463bcd44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.827 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fd273fb2-79c6-43e4-8ea9-3667f4de1779]: (4, ('Fri Jan 23 09:59:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 (5820f9527011f8a2c8e0678a730f9077ec470e0abbf7790875203b55463bcd44)\n5820f9527011f8a2c8e0678a730f9077ec470e0abbf7790875203b55463bcd44\nFri Jan 23 09:59:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 (5820f9527011f8a2c8e0678a730f9077ec470e0abbf7790875203b55463bcd44)\n5820f9527011f8a2c8e0678a730f9077ec470e0abbf7790875203b55463bcd44\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.828 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[44fd24f7-76ca-4ced-804f-c8b485a3360a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.829 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c5732b3-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.831 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:01 np0005593234 kernel: tap6c5732b3-30: left promiscuous mode
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.845 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.848 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f24983c6-39d2-443d-ac54-3c0d51b5fb2c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.864 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5d508dc7-8736-4deb-a293-2829301accb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.865 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[db2a3b2b-5eea-49fa-9ff7-01e5e077cb6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.883 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[65bd6b36-ca43-4008-a370-3933742d1676]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 622970, 'reachable_time': 17559, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267795, 'error': None, 'target': 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:01 np0005593234 systemd[1]: run-netns-ovnmeta\x2d6c5732b3\x2d3484\x2d43db\x2da231\x2d53d04de40d61.mount: Deactivated successfully.
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.887 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:59:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:01.887 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[18c24029-e98d-4573-b382-9c0f5eb0b6d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.955 227766 INFO nova.compute.manager [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Took 2.20 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.955 227766 DEBUG oslo.service.loopingcall [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.955 227766 DEBUG nova.compute.manager [-] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:59:01 np0005593234 nova_compute[227762]: 2026-01-23 09:59:01.956 227766 DEBUG nova.network.neutron [-] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:59:01 np0005593234 podman[267794]: 2026-01-23 09:59:01.95653066 +0000 UTC m=+0.076540362 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 04:59:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:02.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:02 np0005593234 nova_compute[227762]: 2026-01-23 09:59:02.245 227766 DEBUG nova.compute.manager [req-93a4185c-03b3-40f8-a9a3-cc87c1bad413 req-b810ce35-ac39-4241-9f21-4e259386c843 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-plugged-3ae0badd-deb5-430c-8cef-bb36b7ca7eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:59:02 np0005593234 nova_compute[227762]: 2026-01-23 09:59:02.246 227766 DEBUG oslo_concurrency.lockutils [req-93a4185c-03b3-40f8-a9a3-cc87c1bad413 req-b810ce35-ac39-4241-9f21-4e259386c843 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:59:02 np0005593234 nova_compute[227762]: 2026-01-23 09:59:02.246 227766 DEBUG oslo_concurrency.lockutils [req-93a4185c-03b3-40f8-a9a3-cc87c1bad413 req-b810ce35-ac39-4241-9f21-4e259386c843 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:59:02 np0005593234 nova_compute[227762]: 2026-01-23 09:59:02.246 227766 DEBUG oslo_concurrency.lockutils [req-93a4185c-03b3-40f8-a9a3-cc87c1bad413 req-b810ce35-ac39-4241-9f21-4e259386c843 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:59:02 np0005593234 nova_compute[227762]: 2026-01-23 09:59:02.246 227766 DEBUG nova.compute.manager [req-93a4185c-03b3-40f8-a9a3-cc87c1bad413 req-b810ce35-ac39-4241-9f21-4e259386c843 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Processing event network-vif-plugged-3ae0badd-deb5-430c-8cef-bb36b7ca7eea _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 04:59:02 np0005593234 nova_compute[227762]: 2026-01-23 09:59:02.246 227766 DEBUG nova.compute.manager [req-93a4185c-03b3-40f8-a9a3-cc87c1bad413 req-b810ce35-ac39-4241-9f21-4e259386c843 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-plugged-3ae0badd-deb5-430c-8cef-bb36b7ca7eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:59:02 np0005593234 nova_compute[227762]: 2026-01-23 09:59:02.247 227766 DEBUG oslo_concurrency.lockutils [req-93a4185c-03b3-40f8-a9a3-cc87c1bad413 req-b810ce35-ac39-4241-9f21-4e259386c843 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:59:02 np0005593234 nova_compute[227762]: 2026-01-23 09:59:02.247 227766 DEBUG oslo_concurrency.lockutils [req-93a4185c-03b3-40f8-a9a3-cc87c1bad413 req-b810ce35-ac39-4241-9f21-4e259386c843 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:59:02 np0005593234 nova_compute[227762]: 2026-01-23 09:59:02.247 227766 DEBUG oslo_concurrency.lockutils [req-93a4185c-03b3-40f8-a9a3-cc87c1bad413 req-b810ce35-ac39-4241-9f21-4e259386c843 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:59:02 np0005593234 nova_compute[227762]: 2026-01-23 09:59:02.247 227766 DEBUG nova.compute.manager [req-93a4185c-03b3-40f8-a9a3-cc87c1bad413 req-b810ce35-ac39-4241-9f21-4e259386c843 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] No event matching network-vif-plugged-3ae0badd-deb5-430c-8cef-bb36b7ca7eea in dict_keys([('network-vif-plugged', 'a1541348-c83c-4165-8181-1f5dddb145af')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Jan 23 04:59:02 np0005593234 nova_compute[227762]: 2026-01-23 09:59:02.247 227766 WARNING nova.compute.manager [req-93a4185c-03b3-40f8-a9a3-cc87c1bad413 req-b810ce35-ac39-4241-9f21-4e259386c843 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received unexpected event network-vif-plugged-3ae0badd-deb5-430c-8cef-bb36b7ca7eea for instance with vm_state building and task_state spawning.
Jan 23 04:59:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:59:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:02.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:59:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:04.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:59:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:04.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.418 227766 DEBUG nova.compute.manager [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-plugged-a1541348-c83c-4165-8181-1f5dddb145af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.419 227766 DEBUG oslo_concurrency.lockutils [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.419 227766 DEBUG oslo_concurrency.lockutils [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.419 227766 DEBUG oslo_concurrency.lockutils [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.419 227766 DEBUG nova.compute.manager [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Processing event network-vif-plugged-a1541348-c83c-4165-8181-1f5dddb145af _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.420 227766 DEBUG nova.compute.manager [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-plugged-a1541348-c83c-4165-8181-1f5dddb145af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.420 227766 DEBUG oslo_concurrency.lockutils [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.420 227766 DEBUG oslo_concurrency.lockutils [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.420 227766 DEBUG oslo_concurrency.lockutils [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.420 227766 DEBUG nova.compute.manager [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] No waiting events found dispatching network-vif-plugged-a1541348-c83c-4165-8181-1f5dddb145af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.420 227766 WARNING nova.compute.manager [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received unexpected event network-vif-plugged-a1541348-c83c-4165-8181-1f5dddb145af for instance with vm_state building and task_state spawning.
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.420 227766 DEBUG nova.compute.manager [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Received event network-vif-unplugged-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.421 227766 DEBUG oslo_concurrency.lockutils [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.421 227766 DEBUG oslo_concurrency.lockutils [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.421 227766 DEBUG oslo_concurrency.lockutils [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.421 227766 DEBUG nova.compute.manager [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] No waiting events found dispatching network-vif-unplugged-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.421 227766 DEBUG nova.compute.manager [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Received event network-vif-unplugged-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.421 227766 DEBUG nova.compute.manager [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Received event network-vif-plugged-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.421 227766 DEBUG oslo_concurrency.lockutils [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.422 227766 DEBUG oslo_concurrency.lockutils [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.422 227766 DEBUG oslo_concurrency.lockutils [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.422 227766 DEBUG nova.compute.manager [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] No waiting events found dispatching network-vif-plugged-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.422 227766 WARNING nova.compute.manager [req-f9e3b235-abcc-4d85-ac46-a1b53324a7a6 req-d9179902-b1d9-4bcc-b85e-a8067f741fc1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Received unexpected event network-vif-plugged-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 for instance with vm_state active and task_state deleting.
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.423 227766 DEBUG nova.compute.manager [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Instance event wait completed in 5 seconds for network-vif-plugged,network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.429 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162344.4293709, 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.430 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] VM Resumed (Lifecycle Event)
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.432 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.435 227766 INFO nova.virt.libvirt.driver [-] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Instance spawned successfully.
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.436 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.457 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.459 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.473 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.473 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.474 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.474 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.474 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.475 227766 DEBUG nova.virt.libvirt.driver [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.520 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.597 227766 DEBUG nova.network.neutron [-] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.684 227766 INFO nova.compute.manager [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Took 37.10 seconds to spawn the instance on the hypervisor.
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.684 227766 DEBUG nova.compute.manager [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.688 227766 INFO nova.compute.manager [-] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Took 2.73 seconds to deallocate network for instance.
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.808 227766 DEBUG oslo_concurrency.lockutils [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.809 227766 DEBUG oslo_concurrency.lockutils [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.828 227766 INFO nova.compute.manager [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Took 39.91 seconds to build instance.
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.834 227766 DEBUG nova.compute.manager [req-376c009a-fd25-426a-85e8-b7e81ad8ade4 req-4806dcca-a90d-4ace-81cc-a92b527d4ec8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Received event network-vif-deleted-0260ea0f-e0a8-4506-8b70-c7b52d5a7b48 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.876 227766 DEBUG oslo_concurrency.lockutils [None req-aa000f0d-c8b4-481d-a4e2-e9ebd3322c46 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 40.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:59:04 np0005593234 nova_compute[227762]: 2026-01-23 09:59:04.939 227766 DEBUG oslo_concurrency.processutils [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 04:59:05 np0005593234 nova_compute[227762]: 2026-01-23 09:59:05.420 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:59:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:59:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/827405525' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:59:05 np0005593234 nova_compute[227762]: 2026-01-23 09:59:05.678 227766 DEBUG oslo_concurrency.processutils [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.739s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 04:59:05 np0005593234 nova_compute[227762]: 2026-01-23 09:59:05.687 227766 DEBUG nova.compute.provider_tree [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 04:59:05 np0005593234 nova_compute[227762]: 2026-01-23 09:59:05.718 227766 DEBUG nova.scheduler.client.report [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 04:59:05 np0005593234 nova_compute[227762]: 2026-01-23 09:59:05.765 227766 DEBUG oslo_concurrency.lockutils [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.956s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:59:05 np0005593234 nova_compute[227762]: 2026-01-23 09:59:05.824 227766 INFO nova.scheduler.client.report [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Deleted allocations for instance 11d58e6c-38fd-4f34-9d0f-102df6aee42b
Jan 23 04:59:05 np0005593234 nova_compute[227762]: 2026-01-23 09:59:05.861 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 04:59:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:06.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:06.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:06 np0005593234 nova_compute[227762]: 2026-01-23 09:59:06.393 227766 DEBUG oslo_concurrency.lockutils [None req-d8f90c9c-92c2-4eb4-9a3e-9e449540b58f d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "11d58e6c-38fd-4f34-9d0f-102df6aee42b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 04:59:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:08.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:59:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:08.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.516 227766 DEBUG oslo_concurrency.lockutils [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.517 227766 DEBUG oslo_concurrency.lockutils [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.517 227766 DEBUG oslo_concurrency.lockutils [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.518 227766 DEBUG oslo_concurrency.lockutils [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.518 227766 DEBUG oslo_concurrency.lockutils [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.520 227766 INFO nova.compute.manager [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Terminating instance#033[00m
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.521 227766 DEBUG nova.compute.manager [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:59:08 np0005593234 kernel: tap3ae0badd-de (unregistering): left promiscuous mode
Jan 23 04:59:08 np0005593234 NetworkManager[48942]: <info>  [1769162348.6620] device (tap3ae0badd-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:59:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:08Z|00315|binding|INFO|Releasing lport 3ae0badd-deb5-430c-8cef-bb36b7ca7eea from this chassis (sb_readonly=0)
Jan 23 04:59:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:08Z|00316|binding|INFO|Setting lport 3ae0badd-deb5-430c-8cef-bb36b7ca7eea down in Southbound
Jan 23 04:59:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:08Z|00317|binding|INFO|Removing iface tap3ae0badd-de ovn-installed in OVS
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.673 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.675 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.681 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:37:24:09 10.100.0.45'], port_security=['fa:16:3e:37:24:09 10.100.0.45'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.45/24', 'neutron:device_id': '30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd557095954714e01b800ed2898d27593', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd5947ff6-b7cd-4a55-bcbe-de55519d051c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d31f115-0250-45c9-a1b4-d3823a4c1297, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=3ae0badd-deb5-430c-8cef-bb36b7ca7eea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.682 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 3ae0badd-deb5-430c-8cef-bb36b7ca7eea in datapath ef004289-2bc3-4dae-bfd1-9d2d36a65be8 unbound from our chassis#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.684 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ef004289-2bc3-4dae-bfd1-9d2d36a65be8#033[00m
Jan 23 04:59:08 np0005593234 kernel: tapa1541348-c8 (unregistering): left promiscuous mode
Jan 23 04:59:08 np0005593234 NetworkManager[48942]: <info>  [1769162348.6938] device (tapa1541348-c8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.696 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:08Z|00318|binding|INFO|Releasing lport a1541348-c83c-4165-8181-1f5dddb145af from this chassis (sb_readonly=0)
Jan 23 04:59:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:08Z|00319|binding|INFO|Setting lport a1541348-c83c-4165-8181-1f5dddb145af down in Southbound
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.703 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.703 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9abe3e75-5d73-4ebe-8dee-ce4c98c51ceb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:08Z|00320|binding|INFO|Removing iface tapa1541348-c8 ovn-installed in OVS
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.705 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.712 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:17:8c:ae 10.100.1.157'], port_security=['fa:16:3e:17:8c:ae 10.100.1.157'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.157/24', 'neutron:device_id': '30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd557095954714e01b800ed2898d27593', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd5947ff6-b7cd-4a55-bcbe-de55519d051c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a30405c0-470a-4f97-ad9f-b7aa6ff409cb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=a1541348-c83c-4165-8181-1f5dddb145af) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.718 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:08 np0005593234 kernel: tap0e41c282-16 (unregistering): left promiscuous mode
Jan 23 04:59:08 np0005593234 NetworkManager[48942]: <info>  [1769162348.7255] device (tap0e41c282-16): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.731 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.740 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7135586b-c7dd-4fe2-ae70-fd1cd66b7f06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.745 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7a3edee1-fc3d-4a4c-aea8-411934e170a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:08Z|00321|binding|INFO|Releasing lport 0e41c282-1666-4ce9-aa23-76ee3e40aed8 from this chassis (sb_readonly=0)
Jan 23 04:59:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:08Z|00322|binding|INFO|Setting lport 0e41c282-1666-4ce9-aa23-76ee3e40aed8 down in Southbound
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.746 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:08 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:08Z|00323|binding|INFO|Removing iface tap0e41c282-16 ovn-installed in OVS
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.749 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.763 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.770 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:71:2a:14 10.100.0.208'], port_security=['fa:16:3e:71:2a:14 10.100.0.208'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.208/24', 'neutron:device_id': '30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd557095954714e01b800ed2898d27593', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd5947ff6-b7cd-4a55-bcbe-de55519d051c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6d31f115-0250-45c9-a1b4-d3823a4c1297, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=0e41c282-1666-4ce9-aa23-76ee3e40aed8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.776 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[415a3b4f-b98c-45db-925c-94020e4b7478]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:08 np0005593234 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000005a.scope: Deactivated successfully.
Jan 23 04:59:08 np0005593234 systemd[1]: machine-qemu\x2d37\x2dinstance\x2d0000005a.scope: Consumed 4.486s CPU time.
Jan 23 04:59:08 np0005593234 systemd-machined[195626]: Machine qemu-37-instance-0000005a terminated.
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.799 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0f883375-0ad7-4249-8831-38a76546b494]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef004289-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:48:3e:8f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 6, 'tx_packets': 7, 'rx_bytes': 532, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 96], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623380, 'reachable_time': 40355, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 267922, 'error': None, 'target': 'ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.816 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c8577d80-e7aa-4a70-bbcb-03f089857370]: (4, ({'family': 2, 'prefixlen': 24, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.255'], ['IFA_LABEL', 'tapef004289-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623391, 'tstamp': 623391}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267923, 'error': None, 'target': 'ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapef004289-21'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 623394, 'tstamp': 623394}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 267923, 'error': None, 'target': 'ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.818 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef004289-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.819 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.827 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.827 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef004289-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.828 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.828 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapef004289-20, col_values=(('external_ids', {'iface-id': '7f620c47-7119-4933-a36c-a82d159d6fc0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.829 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.830 144381 INFO neutron.agent.ovn.metadata.agent [-] Port a1541348-c83c-4165-8181-1f5dddb145af in datapath 1b009808-c7c2-4bc8-995b-b11e0fa9f5b1 unbound from our chassis#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.833 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1b009808-c7c2-4bc8-995b-b11e0fa9f5b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.834 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6c45bd65-71ad-4027-a294-3057ea02fd34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:08.835 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1 namespace which is not needed anymore#033[00m
Jan 23 04:59:08 np0005593234 NetworkManager[48942]: <info>  [1769162348.9403] manager: (tap3ae0badd-de): new Tun device (/org/freedesktop/NetworkManager/Devices/162)
Jan 23 04:59:08 np0005593234 NetworkManager[48942]: <info>  [1769162348.9532] manager: (tapa1541348-c8): new Tun device (/org/freedesktop/NetworkManager/Devices/163)
Jan 23 04:59:08 np0005593234 NetworkManager[48942]: <info>  [1769162348.9668] manager: (tap0e41c282-16): new Tun device (/org/freedesktop/NetworkManager/Devices/164)
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.986 227766 INFO nova.virt.libvirt.driver [-] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Instance destroyed successfully.#033[00m
Jan 23 04:59:08 np0005593234 nova_compute[227762]: 2026-01-23 09:59:08.988 227766 DEBUG nova.objects.instance [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lazy-loading 'resources' on Instance uuid 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.018 227766 DEBUG nova.virt.libvirt.vif [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-598627440',display_name='tempest-ServersTestMultiNic-server-598627440',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-598627440',id=90,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:59:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-cctpnia6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-546513917-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:59:04Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "address": "fa:16:3e:37:24:09", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ae0badd-de", "ovs_interfaceid": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.018 227766 DEBUG nova.network.os_vif_util [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "address": "fa:16:3e:37:24:09", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.45", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3ae0badd-de", "ovs_interfaceid": "3ae0badd-deb5-430c-8cef-bb36b7ca7eea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.019 227766 DEBUG nova.network.os_vif_util [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:37:24:09,bridge_name='br-int',has_traffic_filtering=True,id=3ae0badd-deb5-430c-8cef-bb36b7ca7eea,network=Network(ef004289-2bc3-4dae-bfd1-9d2d36a65be8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ae0badd-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.020 227766 DEBUG os_vif [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:24:09,bridge_name='br-int',has_traffic_filtering=True,id=3ae0badd-deb5-430c-8cef-bb36b7ca7eea,network=Network(ef004289-2bc3-4dae-bfd1-9d2d36a65be8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ae0badd-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.021 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.021 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ae0badd-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.023 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.025 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.030 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.033 227766 INFO os_vif [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:37:24:09,bridge_name='br-int',has_traffic_filtering=True,id=3ae0badd-deb5-430c-8cef-bb36b7ca7eea,network=Network(ef004289-2bc3-4dae-bfd1-9d2d36a65be8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3ae0badd-de')#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.034 227766 DEBUG nova.virt.libvirt.vif [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-598627440',display_name='tempest-ServersTestMultiNic-server-598627440',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-598627440',id=90,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:59:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-cctpnia6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-546513917-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:59:04Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a1541348-c83c-4165-8181-1f5dddb145af", "address": "fa:16:3e:17:8c:ae", "network": {"id": "1b009808-c7c2-4bc8-995b-b11e0fa9f5b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1217344448", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1541348-c8", "ovs_interfaceid": "a1541348-c83c-4165-8181-1f5dddb145af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.034 227766 DEBUG nova.network.os_vif_util [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "a1541348-c83c-4165-8181-1f5dddb145af", "address": "fa:16:3e:17:8c:ae", "network": {"id": "1b009808-c7c2-4bc8-995b-b11e0fa9f5b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1217344448", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1541348-c8", "ovs_interfaceid": "a1541348-c83c-4165-8181-1f5dddb145af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.035 227766 DEBUG nova.network.os_vif_util [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:17:8c:ae,bridge_name='br-int',has_traffic_filtering=True,id=a1541348-c83c-4165-8181-1f5dddb145af,network=Network(1b009808-c7c2-4bc8-995b-b11e0fa9f5b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1541348-c8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.035 227766 DEBUG os_vif [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:8c:ae,bridge_name='br-int',has_traffic_filtering=True,id=a1541348-c83c-4165-8181-1f5dddb145af,network=Network(1b009808-c7c2-4bc8-995b-b11e0fa9f5b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1541348-c8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.037 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.037 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa1541348-c8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.038 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.060 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.064 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.066 227766 INFO os_vif [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:17:8c:ae,bridge_name='br-int',has_traffic_filtering=True,id=a1541348-c83c-4165-8181-1f5dddb145af,network=Network(1b009808-c7c2-4bc8-995b-b11e0fa9f5b1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa1541348-c8')#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.067 227766 DEBUG nova.virt.libvirt.vif [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:58:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-598627440',display_name='tempest-ServersTestMultiNic-server-598627440',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-598627440',id=90,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:59:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-cctpnia6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk=
'1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-546513917-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:59:04Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "address": "fa:16:3e:71:2a:14", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e41c282-16", "ovs_interfaceid": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.068 227766 DEBUG nova.network.os_vif_util [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "address": "fa:16:3e:71:2a:14", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e41c282-16", "ovs_interfaceid": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.068 227766 DEBUG nova.network.os_vif_util [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:71:2a:14,bridge_name='br-int',has_traffic_filtering=True,id=0e41c282-1666-4ce9-aa23-76ee3e40aed8,network=Network(ef004289-2bc3-4dae-bfd1-9d2d36a65be8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e41c282-16') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.069 227766 DEBUG os_vif [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:2a:14,bridge_name='br-int',has_traffic_filtering=True,id=0e41c282-1666-4ce9-aa23-76ee3e40aed8,network=Network(ef004289-2bc3-4dae-bfd1-9d2d36a65be8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e41c282-16') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.070 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.070 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0e41c282-16, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.071 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.072 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.073 227766 INFO os_vif [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:71:2a:14,bridge_name='br-int',has_traffic_filtering=True,id=0e41c282-1666-4ce9-aa23-76ee3e40aed8,network=Network(ef004289-2bc3-4dae-bfd1-9d2d36a65be8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0e41c282-16')#033[00m
Jan 23 04:59:09 np0005593234 neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1[267711]: [NOTICE]   (267715) : haproxy version is 2.8.14-c23fe91
Jan 23 04:59:09 np0005593234 neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1[267711]: [NOTICE]   (267715) : path to executable is /usr/sbin/haproxy
Jan 23 04:59:09 np0005593234 neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1[267711]: [WARNING]  (267715) : Exiting Master process...
Jan 23 04:59:09 np0005593234 neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1[267711]: [ALERT]    (267715) : Current worker (267717) exited with code 143 (Terminated)
Jan 23 04:59:09 np0005593234 neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1[267711]: [WARNING]  (267715) : All workers exited. Exiting... (0)
Jan 23 04:59:09 np0005593234 systemd[1]: libpod-32ad24ccda13891d34dc7c77a1c380166a40bf7d912374a9eac3bd746ba8f85d.scope: Deactivated successfully.
Jan 23 04:59:09 np0005593234 podman[267942]: 2026-01-23 09:59:09.106975437 +0000 UTC m=+0.195319183 container died 32ad24ccda13891d34dc7c77a1c380166a40bf7d912374a9eac3bd746ba8f85d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 04:59:09 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-32ad24ccda13891d34dc7c77a1c380166a40bf7d912374a9eac3bd746ba8f85d-userdata-shm.mount: Deactivated successfully.
Jan 23 04:59:09 np0005593234 systemd[1]: var-lib-containers-storage-overlay-a86087d4c3110b499b142fc4d8bd23d13f060ebed4b956d992a9930d69d90970-merged.mount: Deactivated successfully.
Jan 23 04:59:09 np0005593234 podman[267942]: 2026-01-23 09:59:09.21049692 +0000 UTC m=+0.298840666 container cleanup 32ad24ccda13891d34dc7c77a1c380166a40bf7d912374a9eac3bd746ba8f85d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:59:09 np0005593234 systemd[1]: libpod-conmon-32ad24ccda13891d34dc7c77a1c380166a40bf7d912374a9eac3bd746ba8f85d.scope: Deactivated successfully.
Jan 23 04:59:09 np0005593234 podman[268029]: 2026-01-23 09:59:09.313145236 +0000 UTC m=+0.081330852 container remove 32ad24ccda13891d34dc7c77a1c380166a40bf7d912374a9eac3bd746ba8f85d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:59:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:09.320 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5b34af45-dcb1-401b-a15d-e21bec0ac83c]: (4, ('Fri Jan 23 09:59:08 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1 (32ad24ccda13891d34dc7c77a1c380166a40bf7d912374a9eac3bd746ba8f85d)\n32ad24ccda13891d34dc7c77a1c380166a40bf7d912374a9eac3bd746ba8f85d\nFri Jan 23 09:59:09 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1 (32ad24ccda13891d34dc7c77a1c380166a40bf7d912374a9eac3bd746ba8f85d)\n32ad24ccda13891d34dc7c77a1c380166a40bf7d912374a9eac3bd746ba8f85d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:09.323 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0314f09a-95ac-4388-bb87-760b7e518de4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:09.324 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b009808-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.348 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:09 np0005593234 kernel: tap1b009808-c0: left promiscuous mode
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.351 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:09.354 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2e093cad-b821-4ec3-bf61-71faa9b3a42f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.364 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:09.371 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4e5eade1-1607-46f4-a4e1-382deb966dfc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:09.372 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a6eae215-b65b-4c36-a30a-706de4474aa4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:09.390 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e417fe-fe19-4525-add8-019ab0f72077]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623545, 'reachable_time': 43507, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268044, 'error': None, 'target': 'ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:09 np0005593234 systemd[1]: run-netns-ovnmeta\x2d1b009808\x2dc7c2\x2d4bc8\x2d995b\x2db11e0fa9f5b1.mount: Deactivated successfully.
Jan 23 04:59:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:09.395 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1b009808-c7c2-4bc8-995b-b11e0fa9f5b1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:59:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:09.395 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[c6290329-eeb4-437c-adac-848b1f1802a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:09.395 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 0e41c282-1666-4ce9-aa23-76ee3e40aed8 in datapath ef004289-2bc3-4dae-bfd1-9d2d36a65be8 unbound from our chassis#033[00m
Jan 23 04:59:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:09.397 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ef004289-2bc3-4dae-bfd1-9d2d36a65be8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:59:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:09.398 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f299f825-5306-4bc5-bfb5-71c2649ea0b4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:09.398 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8 namespace which is not needed anymore#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.409 227766 DEBUG nova.compute.manager [req-4e66d437-ceda-4bc0-abfd-cf26718f527a req-4c9f6be9-17a8-4a5a-b3dc-75f30b28c650 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-unplugged-0e41c282-1666-4ce9-aa23-76ee3e40aed8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.409 227766 DEBUG oslo_concurrency.lockutils [req-4e66d437-ceda-4bc0-abfd-cf26718f527a req-4c9f6be9-17a8-4a5a-b3dc-75f30b28c650 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.410 227766 DEBUG oslo_concurrency.lockutils [req-4e66d437-ceda-4bc0-abfd-cf26718f527a req-4c9f6be9-17a8-4a5a-b3dc-75f30b28c650 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.410 227766 DEBUG oslo_concurrency.lockutils [req-4e66d437-ceda-4bc0-abfd-cf26718f527a req-4c9f6be9-17a8-4a5a-b3dc-75f30b28c650 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.411 227766 DEBUG nova.compute.manager [req-4e66d437-ceda-4bc0-abfd-cf26718f527a req-4c9f6be9-17a8-4a5a-b3dc-75f30b28c650 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] No waiting events found dispatching network-vif-unplugged-0e41c282-1666-4ce9-aa23-76ee3e40aed8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:59:09 np0005593234 nova_compute[227762]: 2026-01-23 09:59:09.411 227766 DEBUG nova.compute.manager [req-4e66d437-ceda-4bc0-abfd-cf26718f527a req-4c9f6be9-17a8-4a5a-b3dc-75f30b28c650 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-unplugged-0e41c282-1666-4ce9-aa23-76ee3e40aed8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:59:09 np0005593234 neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8[267556]: [NOTICE]   (267561) : haproxy version is 2.8.14-c23fe91
Jan 23 04:59:09 np0005593234 neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8[267556]: [NOTICE]   (267561) : path to executable is /usr/sbin/haproxy
Jan 23 04:59:09 np0005593234 neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8[267556]: [WARNING]  (267561) : Exiting Master process...
Jan 23 04:59:09 np0005593234 neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8[267556]: [ALERT]    (267561) : Current worker (267570) exited with code 143 (Terminated)
Jan 23 04:59:09 np0005593234 neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8[267556]: [WARNING]  (267561) : All workers exited. Exiting... (0)
Jan 23 04:59:09 np0005593234 systemd[1]: libpod-3a8fcde36daaa3f60381907a1a42779bde9e5ebd2942a8e9160f135a6241c2e5.scope: Deactivated successfully.
Jan 23 04:59:09 np0005593234 podman[268062]: 2026-01-23 09:59:09.809382376 +0000 UTC m=+0.322845646 container died 3a8fcde36daaa3f60381907a1a42779bde9e5ebd2942a8e9160f135a6241c2e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 04:59:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:10.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:10.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:10 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3a8fcde36daaa3f60381907a1a42779bde9e5ebd2942a8e9160f135a6241c2e5-userdata-shm.mount: Deactivated successfully.
Jan 23 04:59:10 np0005593234 systemd[1]: var-lib-containers-storage-overlay-5c316692c141a408fcd86f8e17006774e3e93c8b7b3465450a6108c31c25fa2f-merged.mount: Deactivated successfully.
Jan 23 04:59:10 np0005593234 nova_compute[227762]: 2026-01-23 09:59:10.714 227766 DEBUG nova.compute.manager [req-204aee84-538a-4011-a7b4-c08b6684805c req-291d7c52-a4ca-40d2-b28e-ff46efa3b849 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-unplugged-3ae0badd-deb5-430c-8cef-bb36b7ca7eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:10 np0005593234 nova_compute[227762]: 2026-01-23 09:59:10.715 227766 DEBUG oslo_concurrency.lockutils [req-204aee84-538a-4011-a7b4-c08b6684805c req-291d7c52-a4ca-40d2-b28e-ff46efa3b849 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:10 np0005593234 nova_compute[227762]: 2026-01-23 09:59:10.715 227766 DEBUG oslo_concurrency.lockutils [req-204aee84-538a-4011-a7b4-c08b6684805c req-291d7c52-a4ca-40d2-b28e-ff46efa3b849 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:10 np0005593234 nova_compute[227762]: 2026-01-23 09:59:10.716 227766 DEBUG oslo_concurrency.lockutils [req-204aee84-538a-4011-a7b4-c08b6684805c req-291d7c52-a4ca-40d2-b28e-ff46efa3b849 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:10 np0005593234 nova_compute[227762]: 2026-01-23 09:59:10.716 227766 DEBUG nova.compute.manager [req-204aee84-538a-4011-a7b4-c08b6684805c req-291d7c52-a4ca-40d2-b28e-ff46efa3b849 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] No waiting events found dispatching network-vif-unplugged-3ae0badd-deb5-430c-8cef-bb36b7ca7eea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:59:10 np0005593234 nova_compute[227762]: 2026-01-23 09:59:10.716 227766 DEBUG nova.compute.manager [req-204aee84-538a-4011-a7b4-c08b6684805c req-291d7c52-a4ca-40d2-b28e-ff46efa3b849 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-unplugged-3ae0badd-deb5-430c-8cef-bb36b7ca7eea for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:59:10 np0005593234 nova_compute[227762]: 2026-01-23 09:59:10.862 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:10 np0005593234 podman[268062]: 2026-01-23 09:59:10.935642683 +0000 UTC m=+1.449105963 container cleanup 3a8fcde36daaa3f60381907a1a42779bde9e5ebd2942a8e9160f135a6241c2e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 04:59:10 np0005593234 systemd[1]: libpod-conmon-3a8fcde36daaa3f60381907a1a42779bde9e5ebd2942a8e9160f135a6241c2e5.scope: Deactivated successfully.
Jan 23 04:59:11 np0005593234 podman[268095]: 2026-01-23 09:59:11.050108378 +0000 UTC m=+0.090181127 container remove 3a8fcde36daaa3f60381907a1a42779bde9e5ebd2942a8e9160f135a6241c2e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 23 04:59:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:11.056 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e86c687a-f03f-47f0-9878-abaad69a5f70]: (4, ('Fri Jan 23 09:59:09 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8 (3a8fcde36daaa3f60381907a1a42779bde9e5ebd2942a8e9160f135a6241c2e5)\n3a8fcde36daaa3f60381907a1a42779bde9e5ebd2942a8e9160f135a6241c2e5\nFri Jan 23 09:59:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8 (3a8fcde36daaa3f60381907a1a42779bde9e5ebd2942a8e9160f135a6241c2e5)\n3a8fcde36daaa3f60381907a1a42779bde9e5ebd2942a8e9160f135a6241c2e5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:11.057 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[88094edf-7721-4e4d-8733-47ab50149ff0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:11.058 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef004289-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:11 np0005593234 nova_compute[227762]: 2026-01-23 09:59:11.059 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:11 np0005593234 kernel: tapef004289-20: left promiscuous mode
Jan 23 04:59:11 np0005593234 nova_compute[227762]: 2026-01-23 09:59:11.072 227766 INFO nova.virt.libvirt.driver [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Deleting instance files /var/lib/nova/instances/30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_del#033[00m
Jan 23 04:59:11 np0005593234 nova_compute[227762]: 2026-01-23 09:59:11.073 227766 INFO nova.virt.libvirt.driver [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Deletion of /var/lib/nova/instances/30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c_del complete#033[00m
Jan 23 04:59:11 np0005593234 nova_compute[227762]: 2026-01-23 09:59:11.075 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:11.077 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2240ae2b-32b4-4d4e-9f41-d1a2e35e8160]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:11.093 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5c03ec76-9ae7-4089-9979-0467a27e5ad2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:11.094 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fc9cce55-63ba-4ef9-8b6d-5fdcf3eba617]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:11.111 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7b38e596-ba53-464d-9852-e1d171b7aebf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 623373, 'reachable_time': 15065, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268111, 'error': None, 'target': 'ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:11.113 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ef004289-2bc3-4dae-bfd1-9d2d36a65be8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:59:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:11.113 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a8d4e3-fcd8-4255-8338-33a7b39d121d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:11 np0005593234 systemd[1]: run-netns-ovnmeta\x2def004289\x2d2bc3\x2d4dae\x2dbfd1\x2d9d2d36a65be8.mount: Deactivated successfully.
Jan 23 04:59:11 np0005593234 nova_compute[227762]: 2026-01-23 09:59:11.223 227766 INFO nova.compute.manager [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Took 2.70 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:59:11 np0005593234 nova_compute[227762]: 2026-01-23 09:59:11.224 227766 DEBUG oslo.service.loopingcall [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:59:11 np0005593234 nova_compute[227762]: 2026-01-23 09:59:11.224 227766 DEBUG nova.compute.manager [-] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:59:11 np0005593234 nova_compute[227762]: 2026-01-23 09:59:11.224 227766 DEBUG nova.network.neutron [-] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:59:11 np0005593234 nova_compute[227762]: 2026-01-23 09:59:11.639 227766 DEBUG nova.compute.manager [req-6fd01e47-14cf-4ff7-be7e-6310e6148bc2 req-15f7a624-daba-4b5c-be52-8d261809ed6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-plugged-0e41c282-1666-4ce9-aa23-76ee3e40aed8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:11 np0005593234 nova_compute[227762]: 2026-01-23 09:59:11.640 227766 DEBUG oslo_concurrency.lockutils [req-6fd01e47-14cf-4ff7-be7e-6310e6148bc2 req-15f7a624-daba-4b5c-be52-8d261809ed6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:11 np0005593234 nova_compute[227762]: 2026-01-23 09:59:11.640 227766 DEBUG oslo_concurrency.lockutils [req-6fd01e47-14cf-4ff7-be7e-6310e6148bc2 req-15f7a624-daba-4b5c-be52-8d261809ed6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:11 np0005593234 nova_compute[227762]: 2026-01-23 09:59:11.641 227766 DEBUG oslo_concurrency.lockutils [req-6fd01e47-14cf-4ff7-be7e-6310e6148bc2 req-15f7a624-daba-4b5c-be52-8d261809ed6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:11 np0005593234 nova_compute[227762]: 2026-01-23 09:59:11.641 227766 DEBUG nova.compute.manager [req-6fd01e47-14cf-4ff7-be7e-6310e6148bc2 req-15f7a624-daba-4b5c-be52-8d261809ed6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] No waiting events found dispatching network-vif-plugged-0e41c282-1666-4ce9-aa23-76ee3e40aed8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:59:11 np0005593234 nova_compute[227762]: 2026-01-23 09:59:11.641 227766 WARNING nova.compute.manager [req-6fd01e47-14cf-4ff7-be7e-6310e6148bc2 req-15f7a624-daba-4b5c-be52-8d261809ed6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received unexpected event network-vif-plugged-0e41c282-1666-4ce9-aa23-76ee3e40aed8 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:59:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:12.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:12.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.710 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "2e578010-2c00-4db3-9cee-27a10eedc975" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.711 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.744 227766 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.867 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.868 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.876 227766 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.877 227766 INFO nova.compute.claims [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.906 227766 DEBUG nova.compute.manager [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-plugged-3ae0badd-deb5-430c-8cef-bb36b7ca7eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.907 227766 DEBUG oslo_concurrency.lockutils [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.907 227766 DEBUG oslo_concurrency.lockutils [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.907 227766 DEBUG oslo_concurrency.lockutils [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.907 227766 DEBUG nova.compute.manager [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] No waiting events found dispatching network-vif-plugged-3ae0badd-deb5-430c-8cef-bb36b7ca7eea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.908 227766 WARNING nova.compute.manager [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received unexpected event network-vif-plugged-3ae0badd-deb5-430c-8cef-bb36b7ca7eea for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.908 227766 DEBUG nova.compute.manager [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-unplugged-a1541348-c83c-4165-8181-1f5dddb145af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.908 227766 DEBUG oslo_concurrency.lockutils [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.908 227766 DEBUG oslo_concurrency.lockutils [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.909 227766 DEBUG oslo_concurrency.lockutils [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.909 227766 DEBUG nova.compute.manager [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] No waiting events found dispatching network-vif-unplugged-a1541348-c83c-4165-8181-1f5dddb145af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.909 227766 DEBUG nova.compute.manager [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-unplugged-a1541348-c83c-4165-8181-1f5dddb145af for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.909 227766 DEBUG nova.compute.manager [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-plugged-a1541348-c83c-4165-8181-1f5dddb145af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.910 227766 DEBUG oslo_concurrency.lockutils [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.910 227766 DEBUG oslo_concurrency.lockutils [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.910 227766 DEBUG oslo_concurrency.lockutils [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.910 227766 DEBUG nova.compute.manager [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] No waiting events found dispatching network-vif-plugged-a1541348-c83c-4165-8181-1f5dddb145af pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:59:12 np0005593234 nova_compute[227762]: 2026-01-23 09:59:12.910 227766 WARNING nova.compute.manager [req-d12dad72-7b12-4467-9b66-b0f5434b4548 req-8a69f38e-a038-44ce-a3b3-03aa1de2ec17 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received unexpected event network-vif-plugged-a1541348-c83c-4165-8181-1f5dddb145af for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:59:13 np0005593234 nova_compute[227762]: 2026-01-23 09:59:13.280 227766 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:13 np0005593234 nova_compute[227762]: 2026-01-23 09:59:13.870 227766 DEBUG nova.compute.manager [req-dfd6e1a2-8a32-43d6-8704-cbc9cc987e23 req-7997d633-2e5b-491f-ae8b-84cd116741b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-deleted-3ae0badd-deb5-430c-8cef-bb36b7ca7eea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:13 np0005593234 nova_compute[227762]: 2026-01-23 09:59:13.871 227766 INFO nova.compute.manager [req-dfd6e1a2-8a32-43d6-8704-cbc9cc987e23 req-7997d633-2e5b-491f-ae8b-84cd116741b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Neutron deleted interface 3ae0badd-deb5-430c-8cef-bb36b7ca7eea; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 04:59:13 np0005593234 nova_compute[227762]: 2026-01-23 09:59:13.872 227766 DEBUG nova.network.neutron [req-dfd6e1a2-8a32-43d6-8704-cbc9cc987e23 req-7997d633-2e5b-491f-ae8b-84cd116741b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Updating instance_info_cache with network_info: [{"id": "a1541348-c83c-4165-8181-1f5dddb145af", "address": "fa:16:3e:17:8c:ae", "network": {"id": "1b009808-c7c2-4bc8-995b-b11e0fa9f5b1", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1217344448", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.157", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa1541348-c8", "ovs_interfaceid": "a1541348-c83c-4165-8181-1f5dddb145af", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "address": "fa:16:3e:71:2a:14", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e41c282-16", "ovs_interfaceid": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:59:13 np0005593234 nova_compute[227762]: 2026-01-23 09:59:13.947 227766 DEBUG nova.compute.manager [req-dfd6e1a2-8a32-43d6-8704-cbc9cc987e23 req-7997d633-2e5b-491f-ae8b-84cd116741b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Detach interface failed, port_id=3ae0badd-deb5-430c-8cef-bb36b7ca7eea, reason: Instance 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 04:59:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:59:14 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4076051625' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.025 227766 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.745s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.031 227766 DEBUG nova.compute.provider_tree [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.071 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.077 227766 DEBUG nova.scheduler.client.report [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:59:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:14.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.119 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.251s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.120 227766 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.204 227766 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.205 227766 DEBUG nova.network.neutron [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.252 227766 INFO nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:59:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:59:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:14.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.288 227766 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.531 227766 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.532 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.532 227766 INFO nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Creating image(s)#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.556 227766 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 2e578010-2c00-4db3-9cee-27a10eedc975_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.583 227766 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 2e578010-2c00-4db3-9cee-27a10eedc975_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.612 227766 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 2e578010-2c00-4db3-9cee-27a10eedc975_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.615 227766 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.675 227766 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.676 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.676 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.677 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.710 227766 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 2e578010-2c00-4db3-9cee-27a10eedc975_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.714 227766 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 2e578010-2c00-4db3-9cee-27a10eedc975_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:14 np0005593234 nova_compute[227762]: 2026-01-23 09:59:14.738 227766 DEBUG nova.policy [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'd83df80213fd40f99fdc68c146fe9a2a', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c288779980de4f03be20b7eed343b775', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:59:15 np0005593234 nova_compute[227762]: 2026-01-23 09:59:15.390 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162340.3887985, 11d58e6c-38fd-4f34-9d0f-102df6aee42b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:59:15 np0005593234 nova_compute[227762]: 2026-01-23 09:59:15.390 227766 INFO nova.compute.manager [-] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:59:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:15 np0005593234 nova_compute[227762]: 2026-01-23 09:59:15.865 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:16 np0005593234 nova_compute[227762]: 2026-01-23 09:59:16.099 227766 DEBUG nova.compute.manager [None req-1e43937a-1a06-4441-9305-f6a572617fcf - - - - - -] [instance: 11d58e6c-38fd-4f34-9d0f-102df6aee42b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:16.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:16 np0005593234 nova_compute[227762]: 2026-01-23 09:59:16.189 227766 DEBUG nova.compute.manager [req-e36d13fa-c8b0-40e1-8d1f-f16bbb9d93dc req-fe4c4c52-2f77-465d-a29f-fec3505b3f4e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-deleted-a1541348-c83c-4165-8181-1f5dddb145af external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:16 np0005593234 nova_compute[227762]: 2026-01-23 09:59:16.189 227766 INFO nova.compute.manager [req-e36d13fa-c8b0-40e1-8d1f-f16bbb9d93dc req-fe4c4c52-2f77-465d-a29f-fec3505b3f4e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Neutron deleted interface a1541348-c83c-4165-8181-1f5dddb145af; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 04:59:16 np0005593234 nova_compute[227762]: 2026-01-23 09:59:16.189 227766 DEBUG nova.network.neutron [req-e36d13fa-c8b0-40e1-8d1f-f16bbb9d93dc req-fe4c4c52-2f77-465d-a29f-fec3505b3f4e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Updating instance_info_cache with network_info: [{"id": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "address": "fa:16:3e:71:2a:14", "network": {"id": "ef004289-2bc3-4dae-bfd1-9d2d36a65be8", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-2095648035", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.208", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0e41c282-16", "ovs_interfaceid": "0e41c282-1666-4ce9-aa23-76ee3e40aed8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:59:16 np0005593234 nova_compute[227762]: 2026-01-23 09:59:16.252 227766 DEBUG nova.compute.manager [req-e36d13fa-c8b0-40e1-8d1f-f16bbb9d93dc req-fe4c4c52-2f77-465d-a29f-fec3505b3f4e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Detach interface failed, port_id=a1541348-c83c-4165-8181-1f5dddb145af, reason: Instance 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 04:59:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:16.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:17 np0005593234 nova_compute[227762]: 2026-01-23 09:59:17.324 227766 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 2e578010-2c00-4db3-9cee-27a10eedc975_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.610s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:17 np0005593234 nova_compute[227762]: 2026-01-23 09:59:17.397 227766 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] resizing rbd image 2e578010-2c00-4db3-9cee-27a10eedc975_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 04:59:17 np0005593234 nova_compute[227762]: 2026-01-23 09:59:17.598 227766 DEBUG nova.objects.instance [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lazy-loading 'migration_context' on Instance uuid 2e578010-2c00-4db3-9cee-27a10eedc975 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:59:17 np0005593234 nova_compute[227762]: 2026-01-23 09:59:17.629 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 04:59:17 np0005593234 nova_compute[227762]: 2026-01-23 09:59:17.629 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Ensure instance console log exists: /var/lib/nova/instances/2e578010-2c00-4db3-9cee-27a10eedc975/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 04:59:17 np0005593234 nova_compute[227762]: 2026-01-23 09:59:17.630 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:17 np0005593234 nova_compute[227762]: 2026-01-23 09:59:17.630 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:17 np0005593234 nova_compute[227762]: 2026-01-23 09:59:17.630 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:18.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:18.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:18 np0005593234 nova_compute[227762]: 2026-01-23 09:59:18.728 227766 DEBUG nova.network.neutron [-] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:59:18 np0005593234 nova_compute[227762]: 2026-01-23 09:59:18.769 227766 INFO nova.compute.manager [-] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Took 7.54 seconds to deallocate network for instance.#033[00m
Jan 23 04:59:18 np0005593234 podman[268305]: 2026-01-23 09:59:18.78823195 +0000 UTC m=+0.078909745 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 04:59:18 np0005593234 nova_compute[227762]: 2026-01-23 09:59:18.866 227766 DEBUG oslo_concurrency.lockutils [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:18 np0005593234 nova_compute[227762]: 2026-01-23 09:59:18.867 227766 DEBUG oslo_concurrency.lockutils [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:19 np0005593234 nova_compute[227762]: 2026-01-23 09:59:19.063 227766 DEBUG oslo_concurrency.processutils [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:19 np0005593234 nova_compute[227762]: 2026-01-23 09:59:19.084 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:19.423 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=36, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=35) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:19.424 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 04:59:19 np0005593234 nova_compute[227762]: 2026-01-23 09:59:19.423 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:59:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4227549933' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:59:19 np0005593234 nova_compute[227762]: 2026-01-23 09:59:19.521 227766 DEBUG oslo_concurrency.processutils [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:19 np0005593234 nova_compute[227762]: 2026-01-23 09:59:19.527 227766 DEBUG nova.compute.provider_tree [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:59:19 np0005593234 nova_compute[227762]: 2026-01-23 09:59:19.563 227766 DEBUG nova.scheduler.client.report [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:59:19 np0005593234 nova_compute[227762]: 2026-01-23 09:59:19.599 227766 DEBUG oslo_concurrency.lockutils [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.732s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:19 np0005593234 nova_compute[227762]: 2026-01-23 09:59:19.673 227766 DEBUG nova.compute.manager [req-ae838796-d537-401f-9ba2-2c6ce71c899d req-aab44346-73c7-45c6-b830-01c926c584dd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Received event network-vif-deleted-0e41c282-1666-4ce9-aa23-76ee3e40aed8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:19 np0005593234 nova_compute[227762]: 2026-01-23 09:59:19.718 227766 INFO nova.scheduler.client.report [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Deleted allocations for instance 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c#033[00m
Jan 23 04:59:19 np0005593234 nova_compute[227762]: 2026-01-23 09:59:19.886 227766 DEBUG oslo_concurrency.lockutils [None req-d09a3538-41d6-48ed-b8e8-347a71c44134 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 11.370s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:19 np0005593234 nova_compute[227762]: 2026-01-23 09:59:19.985 227766 DEBUG nova.network.neutron [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Successfully created port: 2da067d2-38b0-41de-8c24-58b2fc5bef8b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 04:59:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:20.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:20.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:20 np0005593234 nova_compute[227762]: 2026-01-23 09:59:20.917 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:22.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:22.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:22.425 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '36'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:22 np0005593234 nova_compute[227762]: 2026-01-23 09:59:22.809 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:22 np0005593234 nova_compute[227762]: 2026-01-23 09:59:22.949 227766 DEBUG nova.network.neutron [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Successfully updated port: 2da067d2-38b0-41de-8c24-58b2fc5bef8b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 04:59:22 np0005593234 nova_compute[227762]: 2026-01-23 09:59:22.981 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "refresh_cache-2e578010-2c00-4db3-9cee-27a10eedc975" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:59:22 np0005593234 nova_compute[227762]: 2026-01-23 09:59:22.981 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquired lock "refresh_cache-2e578010-2c00-4db3-9cee-27a10eedc975" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:59:22 np0005593234 nova_compute[227762]: 2026-01-23 09:59:22.981 227766 DEBUG nova.network.neutron [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 04:59:23 np0005593234 nova_compute[227762]: 2026-01-23 09:59:23.368 227766 DEBUG nova.network.neutron [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 04:59:23 np0005593234 nova_compute[227762]: 2026-01-23 09:59:23.985 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162348.9832878, 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:59:23 np0005593234 nova_compute[227762]: 2026-01-23 09:59:23.985 227766 INFO nova.compute.manager [-] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:59:24 np0005593234 nova_compute[227762]: 2026-01-23 09:59:24.086 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:24.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:24.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:25 np0005593234 nova_compute[227762]: 2026-01-23 09:59:25.919 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:25 np0005593234 nova_compute[227762]: 2026-01-23 09:59:25.936 227766 DEBUG nova.compute.manager [None req-5980e545-4960-4860-9d37-754e34ccb4b2 - - - - - -] [instance: 30d0e6a8-5f07-4bc6-b8a5-8b5cca7c2f9c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:26.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:26.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:26 np0005593234 nova_compute[227762]: 2026-01-23 09:59:26.450 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:26 np0005593234 nova_compute[227762]: 2026-01-23 09:59:26.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:27 np0005593234 nova_compute[227762]: 2026-01-23 09:59:27.001 227766 DEBUG nova.compute.manager [req-8614461a-5475-4ad5-93ce-9753677a8e91 req-ab8a3e22-da67-4ce6-b395-7898ad7cc971 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Received event network-changed-2da067d2-38b0-41de-8c24-58b2fc5bef8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:27 np0005593234 nova_compute[227762]: 2026-01-23 09:59:27.002 227766 DEBUG nova.compute.manager [req-8614461a-5475-4ad5-93ce-9753677a8e91 req-ab8a3e22-da67-4ce6-b395-7898ad7cc971 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Refreshing instance network info cache due to event network-changed-2da067d2-38b0-41de-8c24-58b2fc5bef8b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 04:59:27 np0005593234 nova_compute[227762]: 2026-01-23 09:59:27.002 227766 DEBUG oslo_concurrency.lockutils [req-8614461a-5475-4ad5-93ce-9753677a8e91 req-ab8a3e22-da67-4ce6-b395-7898ad7cc971 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-2e578010-2c00-4db3-9cee-27a10eedc975" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:59:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:28.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.205 227766 DEBUG nova.network.neutron [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Updating instance_info_cache with network_info: [{"id": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "address": "fa:16:3e:fe:b7:82", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da067d2-38", "ovs_interfaceid": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:59:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:28.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.423 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Releasing lock "refresh_cache-2e578010-2c00-4db3-9cee-27a10eedc975" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.424 227766 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Instance network_info: |[{"id": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "address": "fa:16:3e:fe:b7:82", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da067d2-38", "ovs_interfaceid": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.424 227766 DEBUG oslo_concurrency.lockutils [req-8614461a-5475-4ad5-93ce-9753677a8e91 req-ab8a3e22-da67-4ce6-b395-7898ad7cc971 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-2e578010-2c00-4db3-9cee-27a10eedc975" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.424 227766 DEBUG nova.network.neutron [req-8614461a-5475-4ad5-93ce-9753677a8e91 req-ab8a3e22-da67-4ce6-b395-7898ad7cc971 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Refreshing network info cache for port 2da067d2-38b0-41de-8c24-58b2fc5bef8b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.426 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Start _get_guest_xml network_info=[{"id": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "address": "fa:16:3e:fe:b7:82", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da067d2-38", "ovs_interfaceid": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.433 227766 WARNING nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.439 227766 DEBUG nova.virt.libvirt.host [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.440 227766 DEBUG nova.virt.libvirt.host [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.447 227766 DEBUG nova.virt.libvirt.host [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.448 227766 DEBUG nova.virt.libvirt.host [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.449 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.449 227766 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.449 227766 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.450 227766 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.450 227766 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.450 227766 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.450 227766 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.451 227766 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.451 227766 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.451 227766 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.451 227766 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.452 227766 DEBUG nova.virt.hardware [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.455 227766 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:59:28 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4107573966' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.875 227766 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.899 227766 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 2e578010-2c00-4db3-9cee-27a10eedc975_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:28 np0005593234 nova_compute[227762]: 2026-01-23 09:59:28.902 227766 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.088 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 04:59:29 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4145084801' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.362 227766 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.364 227766 DEBUG nova.virt.libvirt.vif [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:59:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-2087045140',display_name='tempest-MultipleCreateTestJSON-server-2087045140-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-2087045140-1',id=94,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c288779980de4f03be20b7eed343b775',ramdisk_id='',reservation_id='r-2nwggk6q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-351408189',owner_user_name='tempest-MultipleCrea
teTestJSON-351408189-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:59:14Z,user_data=None,user_id='d83df80213fd40f99fdc68c146fe9a2a',uuid=2e578010-2c00-4db3-9cee-27a10eedc975,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "address": "fa:16:3e:fe:b7:82", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da067d2-38", "ovs_interfaceid": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.364 227766 DEBUG nova.network.os_vif_util [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converting VIF {"id": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "address": "fa:16:3e:fe:b7:82", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da067d2-38", "ovs_interfaceid": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.365 227766 DEBUG nova.network.os_vif_util [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:b7:82,bridge_name='br-int',has_traffic_filtering=True,id=2da067d2-38b0-41de-8c24-58b2fc5bef8b,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2da067d2-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.366 227766 DEBUG nova.objects.instance [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lazy-loading 'pci_devices' on Instance uuid 2e578010-2c00-4db3-9cee-27a10eedc975 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.587 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] End _get_guest_xml xml=<domain type="kvm">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  <uuid>2e578010-2c00-4db3-9cee-27a10eedc975</uuid>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  <name>instance-0000005e</name>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <nova:name>tempest-MultipleCreateTestJSON-server-2087045140-1</nova:name>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 09:59:28</nova:creationTime>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <nova:user uuid="d83df80213fd40f99fdc68c146fe9a2a">tempest-MultipleCreateTestJSON-351408189-project-member</nova:user>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <nova:project uuid="c288779980de4f03be20b7eed343b775">tempest-MultipleCreateTestJSON-351408189</nova:project>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <nova:port uuid="2da067d2-38b0-41de-8c24-58b2fc5bef8b">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <system>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <entry name="serial">2e578010-2c00-4db3-9cee-27a10eedc975</entry>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <entry name="uuid">2e578010-2c00-4db3-9cee-27a10eedc975</entry>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    </system>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  <os>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  </os>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  <features>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  </features>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  </clock>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  <devices>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/2e578010-2c00-4db3-9cee-27a10eedc975_disk">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/2e578010-2c00-4db3-9cee-27a10eedc975_disk.config">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      </source>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      </auth>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    </disk>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:fe:b7:82"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <target dev="tap2da067d2-38"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    </interface>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/2e578010-2c00-4db3-9cee-27a10eedc975/console.log" append="off"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    </serial>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <video>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    </video>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    </rng>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 04:59:29 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 04:59:29 np0005593234 nova_compute[227762]:  </devices>
Jan 23 04:59:29 np0005593234 nova_compute[227762]: </domain>
Jan 23 04:59:29 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.588 227766 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Preparing to wait for external event network-vif-plugged-2da067d2-38b0-41de-8c24-58b2fc5bef8b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.588 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.589 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.589 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.590 227766 DEBUG nova.virt.libvirt.vif [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:59:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-2087045140',display_name='tempest-MultipleCreateTestJSON-server-2087045140-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-2087045140-1',id=94,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c288779980de4f03be20b7eed343b775',ramdisk_id='',reservation_id='r-2nwggk6q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-MultipleCreateTestJSON-351408189',owner_user_name='tempest-Mu
ltipleCreateTestJSON-351408189-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:59:14Z,user_data=None,user_id='d83df80213fd40f99fdc68c146fe9a2a',uuid=2e578010-2c00-4db3-9cee-27a10eedc975,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "address": "fa:16:3e:fe:b7:82", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da067d2-38", "ovs_interfaceid": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.590 227766 DEBUG nova.network.os_vif_util [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converting VIF {"id": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "address": "fa:16:3e:fe:b7:82", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da067d2-38", "ovs_interfaceid": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.591 227766 DEBUG nova.network.os_vif_util [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:b7:82,bridge_name='br-int',has_traffic_filtering=True,id=2da067d2-38b0-41de-8c24-58b2fc5bef8b,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2da067d2-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.591 227766 DEBUG os_vif [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:b7:82,bridge_name='br-int',has_traffic_filtering=True,id=2da067d2-38b0-41de-8c24-58b2fc5bef8b,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2da067d2-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.592 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.592 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.593 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.595 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.595 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2da067d2-38, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.596 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2da067d2-38, col_values=(('external_ids', {'iface-id': '2da067d2-38b0-41de-8c24-58b2fc5bef8b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fe:b7:82', 'vm-uuid': '2e578010-2c00-4db3-9cee-27a10eedc975'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.597 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:29 np0005593234 NetworkManager[48942]: <info>  [1769162369.5984] manager: (tap2da067d2-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.600 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.603 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.604 227766 INFO os_vif [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:b7:82,bridge_name='br-int',has_traffic_filtering=True,id=2da067d2-38b0-41de-8c24-58b2fc5bef8b,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2da067d2-38')#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.919 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.920 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.920 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.920 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.921 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.958 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.959 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.959 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] No VIF found with MAC fa:16:3e:fe:b7:82, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 04:59:29 np0005593234 nova_compute[227762]: 2026-01-23 09:59:29.960 227766 INFO nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Using config drive#033[00m
Jan 23 04:59:30 np0005593234 nova_compute[227762]: 2026-01-23 09:59:30.066 227766 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 2e578010-2c00-4db3-9cee-27a10eedc975_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:30.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:30.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:59:30 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3473366394' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:59:30 np0005593234 nova_compute[227762]: 2026-01-23 09:59:30.652 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.732s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:30 np0005593234 nova_compute[227762]: 2026-01-23 09:59:30.921 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.419 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.419 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000005e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.461 227766 INFO nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Creating config drive at /var/lib/nova/instances/2e578010-2c00-4db3-9cee-27a10eedc975/disk.config#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.466 227766 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/2e578010-2c00-4db3-9cee-27a10eedc975/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph1c3pobt execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.590 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.591 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4581MB free_disk=20.92584228515625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.591 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.592 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.599 227766 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/2e578010-2c00-4db3-9cee-27a10eedc975/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph1c3pobt" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.622 227766 DEBUG nova.storage.rbd_utils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] rbd image 2e578010-2c00-4db3-9cee-27a10eedc975_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.625 227766 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/2e578010-2c00-4db3-9cee-27a10eedc975/disk.config 2e578010-2c00-4db3-9cee-27a10eedc975_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.754 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 2e578010-2c00-4db3-9cee-27a10eedc975 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.755 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.755 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.803 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.819 227766 DEBUG oslo_concurrency.processutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/2e578010-2c00-4db3-9cee-27a10eedc975/disk.config 2e578010-2c00-4db3-9cee-27a10eedc975_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.194s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.820 227766 INFO nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Deleting local config drive /var/lib/nova/instances/2e578010-2c00-4db3-9cee-27a10eedc975/disk.config because it was imported into RBD.#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.846 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.846 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 04:59:31 np0005593234 kernel: tap2da067d2-38: entered promiscuous mode
Jan 23 04:59:31 np0005593234 NetworkManager[48942]: <info>  [1769162371.8746] manager: (tap2da067d2-38): new Tun device (/org/freedesktop/NetworkManager/Devices/166)
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.874 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:31 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:31Z|00324|binding|INFO|Claiming lport 2da067d2-38b0-41de-8c24-58b2fc5bef8b for this chassis.
Jan 23 04:59:31 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:31Z|00325|binding|INFO|2da067d2-38b0-41de-8c24-58b2fc5bef8b: Claiming fa:16:3e:fe:b7:82 10.100.0.7
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.878 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.888 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 04:59:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:31.887 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:b7:82 10.100.0.7'], port_security=['fa:16:3e:fe:b7:82 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2e578010-2c00-4db3-9cee-27a10eedc975', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c5732b3-3484-43db-a231-53d04de40d61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c288779980de4f03be20b7eed343b775', 'neutron:revision_number': '2', 'neutron:security_group_ids': '288ecf98-3e6e-478c-8e27-86a4106b4ef8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2529943-1c00-4757-827e-798919a83756, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=2da067d2-38b0-41de-8c24-58b2fc5bef8b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:31.888 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 2da067d2-38b0-41de-8c24-58b2fc5bef8b in datapath 6c5732b3-3484-43db-a231-53d04de40d61 bound to our chassis#033[00m
Jan 23 04:59:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:31.890 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6c5732b3-3484-43db-a231-53d04de40d61#033[00m
Jan 23 04:59:31 np0005593234 systemd-udevd[268561]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 04:59:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:31.901 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[db8e6764-1ec4-4f0e-9ba0-6942d69af2ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:31.901 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6c5732b3-31 in ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 04:59:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:31.903 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6c5732b3-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 04:59:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:31.903 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[eef5506b-f644-45ce-aa0e-da01f7f4adaa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:31.904 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4cebd956-9729-4eaa-8916-9954b7db902f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:31 np0005593234 systemd-machined[195626]: New machine qemu-38-instance-0000005e.
Jan 23 04:59:31 np0005593234 NetworkManager[48942]: <info>  [1769162371.9135] device (tap2da067d2-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 04:59:31 np0005593234 NetworkManager[48942]: <info>  [1769162371.9140] device (tap2da067d2-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 04:59:31 np0005593234 systemd[1]: Started Virtual Machine qemu-38-instance-0000005e.
Jan 23 04:59:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:31.917 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[3c8ff6c3-f435-4867-90e1-c038816506b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.923 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.941 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:31.942 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[220cadec-ee01-443f-ab93-5f1b834d5deb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:31 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:31Z|00326|binding|INFO|Setting lport 2da067d2-38b0-41de-8c24-58b2fc5bef8b ovn-installed in OVS
Jan 23 04:59:31 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:31Z|00327|binding|INFO|Setting lport 2da067d2-38b0-41de-8c24-58b2fc5bef8b up in Southbound
Jan 23 04:59:31 np0005593234 nova_compute[227762]: 2026-01-23 09:59:31.948 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:31.972 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[38367c8c-a604-4d44-88d9-40093d8c36b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:31.977 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cc896f46-c5c3-4766-b325-3b9a8a3ad4d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:31 np0005593234 NetworkManager[48942]: <info>  [1769162371.9782] manager: (tap6c5732b3-30): new Veth device (/org/freedesktop/NetworkManager/Devices/167)
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.007 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4149e4a1-f115-4172-9a99-4332357f7598]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.010 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e9adc3ad-cd19-4102-aedc-fe64bcc0821f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:32 np0005593234 NetworkManager[48942]: <info>  [1769162372.0303] device (tap6c5732b3-30): carrier: link connected
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.036 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[6eca8c85-64d9-454d-9ce1-86e150c28fb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.054 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3d92a365-e589-40c0-b2d8-a4fee020cd59]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c5732b3-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:ad:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 626699, 'reachable_time': 33770, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268606, 'error': None, 'target': 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.060 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.071 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2d0e98a8-8d7d-407b-9191-f2d4da1fdd62]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe04:adb9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 626699, 'tstamp': 626699}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 268612, 'error': None, 'target': 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.092 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c47f1596-1836-4bf4-b520-e0d2326f8bae]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c5732b3-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:04:ad:b9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 103], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 626699, 'reachable_time': 33770, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 268617, 'error': None, 'target': 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:32 np0005593234 podman[268578]: 2026-01-23 09:59:32.109769745 +0000 UTC m=+0.092363766 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 04:59:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:32.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.130 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[37d2236e-47d4-4628-aa1a-55405f187639]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.187 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[92464016-e1c2-4321-a350-a47af39cf1ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.188 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c5732b3-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.189 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.189 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c5732b3-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:32 np0005593234 NetworkManager[48942]: <info>  [1769162372.1913] manager: (tap6c5732b3-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Jan 23 04:59:32 np0005593234 kernel: tap6c5732b3-30: entered promiscuous mode
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.190 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.193 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6c5732b3-30, col_values=(('external_ids', {'iface-id': '4f372140-9451-4bb5-99b3-fc5570b8346b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:32 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:32Z|00328|binding|INFO|Releasing lport 4f372140-9451-4bb5-99b3-fc5570b8346b from this chassis (sb_readonly=0)
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.207 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.209 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6c5732b3-3484-43db-a231-53d04de40d61.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6c5732b3-3484-43db-a231-53d04de40d61.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.210 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f323a268-97c7-4736-bfee-a8022c3706d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.210 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-6c5732b3-3484-43db-a231-53d04de40d61
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/6c5732b3-3484-43db-a231-53d04de40d61.pid.haproxy
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 6c5732b3-3484-43db-a231-53d04de40d61
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 04:59:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:32.212 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'env', 'PROCESS_TAG=haproxy-6c5732b3-3484-43db-a231-53d04de40d61', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6c5732b3-3484-43db-a231-53d04de40d61.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 04:59:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:32.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:59:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2352095814' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.534 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162372.5333652, 2e578010-2c00-4db3-9cee-27a10eedc975 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.534 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] VM Started (Lifecycle Event)#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.537 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.542 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.559 227766 DEBUG nova.network.neutron [req-8614461a-5475-4ad5-93ce-9753677a8e91 req-ab8a3e22-da67-4ce6-b395-7898ad7cc971 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Updated VIF entry in instance network info cache for port 2da067d2-38b0-41de-8c24-58b2fc5bef8b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.560 227766 DEBUG nova.network.neutron [req-8614461a-5475-4ad5-93ce-9753677a8e91 req-ab8a3e22-da67-4ce6-b395-7898ad7cc971 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Updating instance_info_cache with network_info: [{"id": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "address": "fa:16:3e:fe:b7:82", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da067d2-38", "ovs_interfaceid": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.603 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.604 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.611 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162372.5336561, 2e578010-2c00-4db3-9cee-27a10eedc975 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.612 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] VM Paused (Lifecycle Event)#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.618 227766 DEBUG oslo_concurrency.lockutils [req-8614461a-5475-4ad5-93ce-9753677a8e91 req-ab8a3e22-da67-4ce6-b395-7898ad7cc971 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-2e578010-2c00-4db3-9cee-27a10eedc975" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:59:32 np0005593234 podman[268715]: 2026-01-23 09:59:32.549469458 +0000 UTC m=+0.029112820 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.645 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.648 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.661 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.662 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.679 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.691 227766 DEBUG nova.compute.manager [req-54e8d0d8-b10e-467b-b3c0-0a85057fd7f3 req-a7c9f6a6-2880-4b59-b293-21c6212c4c7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Received event network-vif-plugged-2da067d2-38b0-41de-8c24-58b2fc5bef8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.691 227766 DEBUG oslo_concurrency.lockutils [req-54e8d0d8-b10e-467b-b3c0-0a85057fd7f3 req-a7c9f6a6-2880-4b59-b293-21c6212c4c7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.692 227766 DEBUG oslo_concurrency.lockutils [req-54e8d0d8-b10e-467b-b3c0-0a85057fd7f3 req-a7c9f6a6-2880-4b59-b293-21c6212c4c7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.692 227766 DEBUG oslo_concurrency.lockutils [req-54e8d0d8-b10e-467b-b3c0-0a85057fd7f3 req-a7c9f6a6-2880-4b59-b293-21c6212c4c7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.692 227766 DEBUG nova.compute.manager [req-54e8d0d8-b10e-467b-b3c0-0a85057fd7f3 req-a7c9f6a6-2880-4b59-b293-21c6212c4c7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Processing event network-vif-plugged-2da067d2-38b0-41de-8c24-58b2fc5bef8b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.693 227766 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.698 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162372.6983945, 2e578010-2c00-4db3-9cee-27a10eedc975 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.699 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] VM Resumed (Lifecycle Event)#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.700 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.703 227766 INFO nova.virt.libvirt.driver [-] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Instance spawned successfully.#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.704 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.735 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.738 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.739 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.739 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.740 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.740 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.740 227766 DEBUG nova.virt.libvirt.driver [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.745 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.797 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.902 227766 INFO nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Took 18.37 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 04:59:32 np0005593234 nova_compute[227762]: 2026-01-23 09:59:32.903 227766 DEBUG nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:33 np0005593234 podman[268715]: 2026-01-23 09:59:33.124923342 +0000 UTC m=+0.604566684 container create 9df708544431772ff78997fb6b29cd6570a7045877d2049b4b7e7c0ef9c72c50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 04:59:33 np0005593234 systemd[1]: Started libpod-conmon-9df708544431772ff78997fb6b29cd6570a7045877d2049b4b7e7c0ef9c72c50.scope.
Jan 23 04:59:33 np0005593234 systemd[1]: Started libcrun container.
Jan 23 04:59:33 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7ebd2faf6dccece934cf7c95e8a02b0e25455df7388bd36f9da95f341760d4d0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 04:59:33 np0005593234 nova_compute[227762]: 2026-01-23 09:59:33.394 227766 INFO nova.compute.manager [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Took 20.58 seconds to build instance.#033[00m
Jan 23 04:59:33 np0005593234 podman[268715]: 2026-01-23 09:59:33.397856857 +0000 UTC m=+0.877500229 container init 9df708544431772ff78997fb6b29cd6570a7045877d2049b4b7e7c0ef9c72c50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 04:59:33 np0005593234 podman[268715]: 2026-01-23 09:59:33.403907756 +0000 UTC m=+0.883551098 container start 9df708544431772ff78997fb6b29cd6570a7045877d2049b4b7e7c0ef9c72c50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 23 04:59:33 np0005593234 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[268732]: [NOTICE]   (268736) : New worker (268738) forked
Jan 23 04:59:33 np0005593234 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[268732]: [NOTICE]   (268736) : Loading success.
Jan 23 04:59:33 np0005593234 nova_compute[227762]: 2026-01-23 09:59:33.517 227766 DEBUG oslo_concurrency.lockutils [None req-c73f7654-c7c3-4f20-ba77-dddf3e1d8179 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.807s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:33 np0005593234 nova_compute[227762]: 2026-01-23 09:59:33.662 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:33 np0005593234 nova_compute[227762]: 2026-01-23 09:59:33.663 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 04:59:33 np0005593234 nova_compute[227762]: 2026-01-23 09:59:33.663 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 04:59:33 np0005593234 nova_compute[227762]: 2026-01-23 09:59:33.920 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-2e578010-2c00-4db3-9cee-27a10eedc975" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 04:59:33 np0005593234 nova_compute[227762]: 2026-01-23 09:59:33.921 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-2e578010-2c00-4db3-9cee-27a10eedc975" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 04:59:33 np0005593234 nova_compute[227762]: 2026-01-23 09:59:33.921 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 04:59:33 np0005593234 nova_compute[227762]: 2026-01-23 09:59:33.922 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 2e578010-2c00-4db3-9cee-27a10eedc975 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:59:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:34.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:34.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:34 np0005593234 nova_compute[227762]: 2026-01-23 09:59:34.598 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:34 np0005593234 nova_compute[227762]: 2026-01-23 09:59:34.890 227766 DEBUG nova.compute.manager [req-14b22c87-30d9-4378-83c2-59f18601c0be req-3207c0c1-21ea-4bf7-b68a-b1288c613d49 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Received event network-vif-plugged-2da067d2-38b0-41de-8c24-58b2fc5bef8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:34 np0005593234 nova_compute[227762]: 2026-01-23 09:59:34.890 227766 DEBUG oslo_concurrency.lockutils [req-14b22c87-30d9-4378-83c2-59f18601c0be req-3207c0c1-21ea-4bf7-b68a-b1288c613d49 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:34 np0005593234 nova_compute[227762]: 2026-01-23 09:59:34.891 227766 DEBUG oslo_concurrency.lockutils [req-14b22c87-30d9-4378-83c2-59f18601c0be req-3207c0c1-21ea-4bf7-b68a-b1288c613d49 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:34 np0005593234 nova_compute[227762]: 2026-01-23 09:59:34.891 227766 DEBUG oslo_concurrency.lockutils [req-14b22c87-30d9-4378-83c2-59f18601c0be req-3207c0c1-21ea-4bf7-b68a-b1288c613d49 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:34 np0005593234 nova_compute[227762]: 2026-01-23 09:59:34.891 227766 DEBUG nova.compute.manager [req-14b22c87-30d9-4378-83c2-59f18601c0be req-3207c0c1-21ea-4bf7-b68a-b1288c613d49 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] No waiting events found dispatching network-vif-plugged-2da067d2-38b0-41de-8c24-58b2fc5bef8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:59:34 np0005593234 nova_compute[227762]: 2026-01-23 09:59:34.891 227766 WARNING nova.compute.manager [req-14b22c87-30d9-4378-83c2-59f18601c0be req-3207c0c1-21ea-4bf7-b68a-b1288c613d49 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Received unexpected event network-vif-plugged-2da067d2-38b0-41de-8c24-58b2fc5bef8b for instance with vm_state active and task_state None.#033[00m
Jan 23 04:59:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:35 np0005593234 nova_compute[227762]: 2026-01-23 09:59:35.926 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:36.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:36.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:36 np0005593234 nova_compute[227762]: 2026-01-23 09:59:36.676 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Updating instance_info_cache with network_info: [{"id": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "address": "fa:16:3e:fe:b7:82", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da067d2-38", "ovs_interfaceid": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:59:36 np0005593234 nova_compute[227762]: 2026-01-23 09:59:36.775 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-2e578010-2c00-4db3-9cee-27a10eedc975" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 04:59:36 np0005593234 nova_compute[227762]: 2026-01-23 09:59:36.776 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 04:59:36 np0005593234 nova_compute[227762]: 2026-01-23 09:59:36.776 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:36 np0005593234 nova_compute[227762]: 2026-01-23 09:59:36.776 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:36 np0005593234 nova_compute[227762]: 2026-01-23 09:59:36.852 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:38.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:38.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:38 np0005593234 nova_compute[227762]: 2026-01-23 09:59:38.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:39 np0005593234 nova_compute[227762]: 2026-01-23 09:59:39.600 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:39 np0005593234 nova_compute[227762]: 2026-01-23 09:59:39.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 04:59:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:40.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:40.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:40 np0005593234 nova_compute[227762]: 2026-01-23 09:59:40.927 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:42.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:42.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:42.836 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:42.839 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:42.840 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:43 np0005593234 nova_compute[227762]: 2026-01-23 09:59:43.782 227766 DEBUG oslo_concurrency.lockutils [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "2e578010-2c00-4db3-9cee-27a10eedc975" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:43 np0005593234 nova_compute[227762]: 2026-01-23 09:59:43.783 227766 DEBUG oslo_concurrency.lockutils [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:43 np0005593234 nova_compute[227762]: 2026-01-23 09:59:43.783 227766 DEBUG oslo_concurrency.lockutils [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:43 np0005593234 nova_compute[227762]: 2026-01-23 09:59:43.783 227766 DEBUG oslo_concurrency.lockutils [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:43 np0005593234 nova_compute[227762]: 2026-01-23 09:59:43.783 227766 DEBUG oslo_concurrency.lockutils [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:43 np0005593234 nova_compute[227762]: 2026-01-23 09:59:43.785 227766 INFO nova.compute.manager [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Terminating instance#033[00m
Jan 23 04:59:43 np0005593234 nova_compute[227762]: 2026-01-23 09:59:43.786 227766 DEBUG nova.compute.manager [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 04:59:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:44.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:44.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:44 np0005593234 kernel: tap2da067d2-38 (unregistering): left promiscuous mode
Jan 23 04:59:44 np0005593234 NetworkManager[48942]: <info>  [1769162384.4466] device (tap2da067d2-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 04:59:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:44Z|00329|binding|INFO|Releasing lport 2da067d2-38b0-41de-8c24-58b2fc5bef8b from this chassis (sb_readonly=0)
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.464 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:44Z|00330|binding|INFO|Setting lport 2da067d2-38b0-41de-8c24-58b2fc5bef8b down in Southbound
Jan 23 04:59:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:44Z|00331|binding|INFO|Removing iface tap2da067d2-38 ovn-installed in OVS
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.471 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.488 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:44 np0005593234 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000005e.scope: Deactivated successfully.
Jan 23 04:59:44 np0005593234 systemd[1]: machine-qemu\x2d38\x2dinstance\x2d0000005e.scope: Consumed 11.721s CPU time.
Jan 23 04:59:44 np0005593234 systemd-machined[195626]: Machine qemu-38-instance-0000005e terminated.
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.602 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:44 np0005593234 kernel: tap2da067d2-38: entered promiscuous mode
Jan 23 04:59:44 np0005593234 kernel: tap2da067d2-38 (unregistering): left promiscuous mode
Jan 23 04:59:44 np0005593234 NetworkManager[48942]: <info>  [1769162384.6165] manager: (tap2da067d2-38): new Tun device (/org/freedesktop/NetworkManager/Devices/169)
Jan 23 04:59:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:44Z|00332|if_status|INFO|Not updating pb chassis for 2da067d2-38b0-41de-8c24-58b2fc5bef8b now as sb is readonly
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.620 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.630 227766 INFO nova.virt.libvirt.driver [-] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Instance destroyed successfully.#033[00m
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.631 227766 DEBUG nova.objects.instance [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lazy-loading 'resources' on Instance uuid 2e578010-2c00-4db3-9cee-27a10eedc975 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 04:59:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:44Z|00333|binding|INFO|Releasing lport 2da067d2-38b0-41de-8c24-58b2fc5bef8b from this chassis (sb_readonly=1)
Jan 23 04:59:44 np0005593234 ovn_controller[134547]: 2026-01-23T09:59:44Z|00334|if_status|INFO|Not setting lport 2da067d2-38b0-41de-8c24-58b2fc5bef8b down as sb is readonly
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.639 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.654 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:44.910 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:b7:82 10.100.0.7'], port_security=['fa:16:3e:fe:b7:82 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '2e578010-2c00-4db3-9cee-27a10eedc975', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c5732b3-3484-43db-a231-53d04de40d61', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c288779980de4f03be20b7eed343b775', 'neutron:revision_number': '4', 'neutron:security_group_ids': '288ecf98-3e6e-478c-8e27-86a4106b4ef8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2529943-1c00-4757-827e-798919a83756, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=2da067d2-38b0-41de-8c24-58b2fc5bef8b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 04:59:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:44.912 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 2da067d2-38b0-41de-8c24-58b2fc5bef8b in datapath 6c5732b3-3484-43db-a231-53d04de40d61 unbound from our chassis#033[00m
Jan 23 04:59:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:44.915 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6c5732b3-3484-43db-a231-53d04de40d61, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 04:59:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:44.918 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5fd74a56-2549-4902-a6b8-26d37c8699a4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:44.919 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 namespace which is not needed anymore#033[00m
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.935 227766 DEBUG nova.virt.libvirt.vif [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:59:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-MultipleCreateTestJSON-server-2087045140',display_name='tempest-MultipleCreateTestJSON-server-2087045140-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-multiplecreatetestjson-server-2087045140-1',id=94,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:59:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c288779980de4f03be20b7eed343b775',ramdisk_id='',reservation_id='r-2nwggk6q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-MultipleCreateTestJSON-351408189',owner_user_name='tempest-MultipleCreateTestJSON-351408189-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T09:59:33Z,user_data=None,user_id='d83df80213fd40f99fdc68c146fe9a2a',uuid=2e578010-2c00-4db3-9cee-27a10eedc975,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "address": "fa:16:3e:fe:b7:82", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da067d2-38", "ovs_interfaceid": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.936 227766 DEBUG nova.network.os_vif_util [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converting VIF {"id": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "address": "fa:16:3e:fe:b7:82", "network": {"id": "6c5732b3-3484-43db-a231-53d04de40d61", "bridge": "br-int", "label": "tempest-MultipleCreateTestJSON-989500160-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c288779980de4f03be20b7eed343b775", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2da067d2-38", "ovs_interfaceid": "2da067d2-38b0-41de-8c24-58b2fc5bef8b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.937 227766 DEBUG nova.network.os_vif_util [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:fe:b7:82,bridge_name='br-int',has_traffic_filtering=True,id=2da067d2-38b0-41de-8c24-58b2fc5bef8b,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2da067d2-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.938 227766 DEBUG os_vif [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:fe:b7:82,bridge_name='br-int',has_traffic_filtering=True,id=2da067d2-38b0-41de-8c24-58b2fc5bef8b,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2da067d2-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.940 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.941 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2da067d2-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.943 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.946 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 04:59:44 np0005593234 nova_compute[227762]: 2026-01-23 09:59:44.949 227766 INFO os_vif [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:fe:b7:82,bridge_name='br-int',has_traffic_filtering=True,id=2da067d2-38b0-41de-8c24-58b2fc5bef8b,network=Network(6c5732b3-3484-43db-a231-53d04de40d61),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2da067d2-38')#033[00m
Jan 23 04:59:45 np0005593234 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[268732]: [NOTICE]   (268736) : haproxy version is 2.8.14-c23fe91
Jan 23 04:59:45 np0005593234 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[268732]: [NOTICE]   (268736) : path to executable is /usr/sbin/haproxy
Jan 23 04:59:45 np0005593234 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[268732]: [WARNING]  (268736) : Exiting Master process...
Jan 23 04:59:45 np0005593234 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[268732]: [ALERT]    (268736) : Current worker (268738) exited with code 143 (Terminated)
Jan 23 04:59:45 np0005593234 neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61[268732]: [WARNING]  (268736) : All workers exited. Exiting... (0)
Jan 23 04:59:45 np0005593234 systemd[1]: libpod-9df708544431772ff78997fb6b29cd6570a7045877d2049b4b7e7c0ef9c72c50.scope: Deactivated successfully.
Jan 23 04:59:45 np0005593234 podman[268843]: 2026-01-23 09:59:45.154914556 +0000 UTC m=+0.123577441 container died 9df708544431772ff78997fb6b29cd6570a7045877d2049b4b7e7c0ef9c72c50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 04:59:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:45 np0005593234 nova_compute[227762]: 2026-01-23 09:59:45.929 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:46 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9df708544431772ff78997fb6b29cd6570a7045877d2049b4b7e7c0ef9c72c50-userdata-shm.mount: Deactivated successfully.
Jan 23 04:59:46 np0005593234 systemd[1]: var-lib-containers-storage-overlay-7ebd2faf6dccece934cf7c95e8a02b0e25455df7388bd36f9da95f341760d4d0-merged.mount: Deactivated successfully.
Jan 23 04:59:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:59:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:46.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:59:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:46.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:46 np0005593234 podman[268843]: 2026-01-23 09:59:46.332737245 +0000 UTC m=+1.301400130 container cleanup 9df708544431772ff78997fb6b29cd6570a7045877d2049b4b7e7c0ef9c72c50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 04:59:46 np0005593234 systemd[1]: libpod-conmon-9df708544431772ff78997fb6b29cd6570a7045877d2049b4b7e7c0ef9c72c50.scope: Deactivated successfully.
Jan 23 04:59:46 np0005593234 podman[268886]: 2026-01-23 09:59:46.608838239 +0000 UTC m=+0.245605253 container remove 9df708544431772ff78997fb6b29cd6570a7045877d2049b4b7e7c0ef9c72c50 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 04:59:46 np0005593234 nova_compute[227762]: 2026-01-23 09:59:46.612 227766 DEBUG nova.compute.manager [req-43db395f-d35d-47b4-854b-1649b9f85ba1 req-aa79cc08-6b71-4b48-8465-6dfc7918daa0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Received event network-vif-unplugged-2da067d2-38b0-41de-8c24-58b2fc5bef8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:46 np0005593234 nova_compute[227762]: 2026-01-23 09:59:46.612 227766 DEBUG oslo_concurrency.lockutils [req-43db395f-d35d-47b4-854b-1649b9f85ba1 req-aa79cc08-6b71-4b48-8465-6dfc7918daa0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:46 np0005593234 nova_compute[227762]: 2026-01-23 09:59:46.612 227766 DEBUG oslo_concurrency.lockutils [req-43db395f-d35d-47b4-854b-1649b9f85ba1 req-aa79cc08-6b71-4b48-8465-6dfc7918daa0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:46 np0005593234 nova_compute[227762]: 2026-01-23 09:59:46.613 227766 DEBUG oslo_concurrency.lockutils [req-43db395f-d35d-47b4-854b-1649b9f85ba1 req-aa79cc08-6b71-4b48-8465-6dfc7918daa0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:46 np0005593234 nova_compute[227762]: 2026-01-23 09:59:46.613 227766 DEBUG nova.compute.manager [req-43db395f-d35d-47b4-854b-1649b9f85ba1 req-aa79cc08-6b71-4b48-8465-6dfc7918daa0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] No waiting events found dispatching network-vif-unplugged-2da067d2-38b0-41de-8c24-58b2fc5bef8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:59:46 np0005593234 nova_compute[227762]: 2026-01-23 09:59:46.613 227766 DEBUG nova.compute.manager [req-43db395f-d35d-47b4-854b-1649b9f85ba1 req-aa79cc08-6b71-4b48-8465-6dfc7918daa0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Received event network-vif-unplugged-2da067d2-38b0-41de-8c24-58b2fc5bef8b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 04:59:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:46.615 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e688e311-0669-47e5-956a-70506425ba54]: (4, ('Fri Jan 23 09:59:45 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 (9df708544431772ff78997fb6b29cd6570a7045877d2049b4b7e7c0ef9c72c50)\n9df708544431772ff78997fb6b29cd6570a7045877d2049b4b7e7c0ef9c72c50\nFri Jan 23 09:59:46 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 (9df708544431772ff78997fb6b29cd6570a7045877d2049b4b7e7c0ef9c72c50)\n9df708544431772ff78997fb6b29cd6570a7045877d2049b4b7e7c0ef9c72c50\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:46.616 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d3b828f5-2b09-4907-98f2-954d33790a5f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:46.617 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c5732b3-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 04:59:46 np0005593234 kernel: tap6c5732b3-30: left promiscuous mode
Jan 23 04:59:46 np0005593234 nova_compute[227762]: 2026-01-23 09:59:46.619 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:46 np0005593234 nova_compute[227762]: 2026-01-23 09:59:46.622 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:46.624 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ecea2db2-5dec-43b1-862b-cbedd402875e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:46 np0005593234 nova_compute[227762]: 2026-01-23 09:59:46.633 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:46.640 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d95a736c-302f-42e2-93e8-218fd40d64fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:46.641 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[63ed0023-297f-4548-88b1-d795fbc8929f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:46.657 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ad281442-309e-4d34-af8f-cf13de415f9c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 626692, 'reachable_time': 18100, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 268902, 'error': None, 'target': 'ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:46.660 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6c5732b3-3484-43db-a231-53d04de40d61 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 04:59:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 09:59:46.660 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[21ca4d21-b569-4b48-9ff5-5aaa40980431]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 04:59:46 np0005593234 systemd[1]: run-netns-ovnmeta\x2d6c5732b3\x2d3484\x2d43db\x2da231\x2d53d04de40d61.mount: Deactivated successfully.
Jan 23 04:59:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:48.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:48.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:48 np0005593234 nova_compute[227762]: 2026-01-23 09:59:48.786 227766 DEBUG nova.compute.manager [req-da5e5604-eb28-4ffe-a731-4b06a5c31891 req-f184098c-a198-40cd-ae75-0c5665d90cd6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Received event network-vif-plugged-2da067d2-38b0-41de-8c24-58b2fc5bef8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:48 np0005593234 nova_compute[227762]: 2026-01-23 09:59:48.786 227766 DEBUG oslo_concurrency.lockutils [req-da5e5604-eb28-4ffe-a731-4b06a5c31891 req-f184098c-a198-40cd-ae75-0c5665d90cd6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:48 np0005593234 nova_compute[227762]: 2026-01-23 09:59:48.786 227766 DEBUG oslo_concurrency.lockutils [req-da5e5604-eb28-4ffe-a731-4b06a5c31891 req-f184098c-a198-40cd-ae75-0c5665d90cd6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:48 np0005593234 nova_compute[227762]: 2026-01-23 09:59:48.787 227766 DEBUG oslo_concurrency.lockutils [req-da5e5604-eb28-4ffe-a731-4b06a5c31891 req-f184098c-a198-40cd-ae75-0c5665d90cd6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:48 np0005593234 nova_compute[227762]: 2026-01-23 09:59:48.787 227766 DEBUG nova.compute.manager [req-da5e5604-eb28-4ffe-a731-4b06a5c31891 req-f184098c-a198-40cd-ae75-0c5665d90cd6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] No waiting events found dispatching network-vif-plugged-2da067d2-38b0-41de-8c24-58b2fc5bef8b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 04:59:48 np0005593234 nova_compute[227762]: 2026-01-23 09:59:48.787 227766 WARNING nova.compute.manager [req-da5e5604-eb28-4ffe-a731-4b06a5c31891 req-f184098c-a198-40cd-ae75-0c5665d90cd6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Received unexpected event network-vif-plugged-2da067d2-38b0-41de-8c24-58b2fc5bef8b for instance with vm_state active and task_state deleting.#033[00m
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #97. Immutable memtables: 0.
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.004151) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 59] Flushing memtable with next log file: 97
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162389004281, "job": 59, "event": "flush_started", "num_memtables": 1, "num_entries": 722, "num_deletes": 251, "total_data_size": 1218474, "memory_usage": 1243496, "flush_reason": "Manual Compaction"}
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 59] Level-0 flush table #98: started
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162389014657, "cf_name": "default", "job": 59, "event": "table_file_creation", "file_number": 98, "file_size": 803810, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 48759, "largest_seqno": 49476, "table_properties": {"data_size": 800371, "index_size": 1283, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8049, "raw_average_key_size": 19, "raw_value_size": 793505, "raw_average_value_size": 1902, "num_data_blocks": 57, "num_entries": 417, "num_filter_entries": 417, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162342, "oldest_key_time": 1769162342, "file_creation_time": 1769162389, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 98, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 59] Flush lasted 10571 microseconds, and 3583 cpu microseconds.
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.014722) [db/flush_job.cc:967] [default] [JOB 59] Level-0 flush table #98: 803810 bytes OK
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.014747) [db/memtable_list.cc:519] [default] Level-0 commit table #98 started
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.017762) [db/memtable_list.cc:722] [default] Level-0 commit table #98: memtable #1 done
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.017800) EVENT_LOG_v1 {"time_micros": 1769162389017790, "job": 59, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.017823) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 59] Try to delete WAL files size 1214585, prev total WAL file size 1214585, number of live WAL files 2.
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000094.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.018505) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034303136' seq:72057594037927935, type:22 .. '7061786F730034323638' seq:0, type:0; will stop at (end)
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 60] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 59 Base level 0, inputs: [98(784KB)], [96(10018KB)]
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162389018619, "job": 60, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [98], "files_L6": [96], "score": -1, "input_data_size": 11063012, "oldest_snapshot_seqno": -1}
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 60] Generated table #99: 6969 keys, 9139433 bytes, temperature: kUnknown
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162389081253, "cf_name": "default", "job": 60, "event": "table_file_creation", "file_number": 99, "file_size": 9139433, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9095047, "index_size": 25842, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17477, "raw_key_size": 181434, "raw_average_key_size": 26, "raw_value_size": 8972493, "raw_average_value_size": 1287, "num_data_blocks": 1014, "num_entries": 6969, "num_filter_entries": 6969, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769162389, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 99, "seqno_to_time_mapping": "N/A"}}
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.081559) [db/compaction/compaction_job.cc:1663] [default] [JOB 60] Compacted 1@0 + 1@6 files to L6 => 9139433 bytes
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.083484) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 176.3 rd, 145.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 9.8 +0.0 blob) out(8.7 +0.0 blob), read-write-amplify(25.1) write-amplify(11.4) OK, records in: 7479, records dropped: 510 output_compression: NoCompression
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.083512) EVENT_LOG_v1 {"time_micros": 1769162389083500, "job": 60, "event": "compaction_finished", "compaction_time_micros": 62745, "compaction_time_cpu_micros": 24724, "output_level": 6, "num_output_files": 1, "total_output_size": 9139433, "num_input_records": 7479, "num_output_records": 6969, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000098.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162389083789, "job": 60, "event": "table_file_deletion", "file_number": 98}
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000096.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162389085734, "job": 60, "event": "table_file_deletion", "file_number": 96}
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.018455) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.085780) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.085784) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.085786) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.085788) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-09:59:49.085790) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 04:59:49 np0005593234 nova_compute[227762]: 2026-01-23 09:59:49.173 227766 INFO nova.virt.libvirt.driver [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Deleting instance files /var/lib/nova/instances/2e578010-2c00-4db3-9cee-27a10eedc975_del#033[00m
Jan 23 04:59:49 np0005593234 nova_compute[227762]: 2026-01-23 09:59:49.174 227766 INFO nova.virt.libvirt.driver [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Deletion of /var/lib/nova/instances/2e578010-2c00-4db3-9cee-27a10eedc975_del complete#033[00m
Jan 23 04:59:49 np0005593234 nova_compute[227762]: 2026-01-23 09:59:49.319 227766 INFO nova.compute.manager [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Took 5.53 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 04:59:49 np0005593234 nova_compute[227762]: 2026-01-23 09:59:49.319 227766 DEBUG oslo.service.loopingcall [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 04:59:49 np0005593234 nova_compute[227762]: 2026-01-23 09:59:49.320 227766 DEBUG nova.compute.manager [-] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 04:59:49 np0005593234 nova_compute[227762]: 2026-01-23 09:59:49.320 227766 DEBUG nova.network.neutron [-] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 04:59:49 np0005593234 podman[268905]: 2026-01-23 09:59:49.766321499 +0000 UTC m=+0.054794402 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 04:59:49 np0005593234 nova_compute[227762]: 2026-01-23 09:59:49.945 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:50.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 04:59:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:50.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 04:59:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:50 np0005593234 nova_compute[227762]: 2026-01-23 09:59:50.931 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:51 np0005593234 nova_compute[227762]: 2026-01-23 09:59:51.778 227766 DEBUG nova.network.neutron [-] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 04:59:51 np0005593234 nova_compute[227762]: 2026-01-23 09:59:51.828 227766 INFO nova.compute.manager [-] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Took 2.51 seconds to deallocate network for instance.#033[00m
Jan 23 04:59:51 np0005593234 nova_compute[227762]: 2026-01-23 09:59:51.941 227766 DEBUG oslo_concurrency.lockutils [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:51 np0005593234 nova_compute[227762]: 2026-01-23 09:59:51.942 227766 DEBUG oslo_concurrency.lockutils [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:52 np0005593234 nova_compute[227762]: 2026-01-23 09:59:52.024 227766 DEBUG oslo_concurrency.processutils [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:52 np0005593234 nova_compute[227762]: 2026-01-23 09:59:52.050 227766 DEBUG nova.compute.manager [req-f881a88b-b6d5-41fd-bcb1-38b8c438401b req-a0b99f58-f6f7-4029-8757-7a0e185a9927 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Received event network-vif-deleted-2da067d2-38b0-41de-8c24-58b2fc5bef8b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 04:59:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:52.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:52.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:59:52 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/217819776' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:59:52 np0005593234 nova_compute[227762]: 2026-01-23 09:59:52.919 227766 DEBUG oslo_concurrency.processutils [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.895s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:52 np0005593234 nova_compute[227762]: 2026-01-23 09:59:52.925 227766 DEBUG nova.compute.provider_tree [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:59:52 np0005593234 nova_compute[227762]: 2026-01-23 09:59:52.967 227766 DEBUG nova.scheduler.client.report [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:59:53 np0005593234 nova_compute[227762]: 2026-01-23 09:59:53.011 227766 DEBUG oslo_concurrency.lockutils [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.069s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:53 np0005593234 nova_compute[227762]: 2026-01-23 09:59:53.043 227766 INFO nova.scheduler.client.report [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Deleted allocations for instance 2e578010-2c00-4db3-9cee-27a10eedc975#033[00m
Jan 23 04:59:53 np0005593234 nova_compute[227762]: 2026-01-23 09:59:53.192 227766 DEBUG oslo_concurrency.lockutils [None req-33f44261-5660-4cc4-8d68-d6e47ccfa495 d83df80213fd40f99fdc68c146fe9a2a c288779980de4f03be20b7eed343b775 - - default default] Lock "2e578010-2c00-4db3-9cee-27a10eedc975" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 9.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:54.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:54.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:54 np0005593234 nova_compute[227762]: 2026-01-23 09:59:54.947 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:55 np0005593234 nova_compute[227762]: 2026-01-23 09:59:55.038 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "4b94f03d-d738-409e-a0ac-b23304c3be02" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:55 np0005593234 nova_compute[227762]: 2026-01-23 09:59:55.038 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:55 np0005593234 nova_compute[227762]: 2026-01-23 09:59:55.062 227766 DEBUG nova.compute.manager [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 04:59:55 np0005593234 nova_compute[227762]: 2026-01-23 09:59:55.172 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 04:59:55 np0005593234 nova_compute[227762]: 2026-01-23 09:59:55.173 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 04:59:55 np0005593234 nova_compute[227762]: 2026-01-23 09:59:55.180 227766 DEBUG nova.virt.hardware [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 04:59:55 np0005593234 nova_compute[227762]: 2026-01-23 09:59:55.181 227766 INFO nova.compute.claims [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 04:59:55 np0005593234 nova_compute[227762]: 2026-01-23 09:59:55.528 227766 DEBUG oslo_concurrency.processutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 04:59:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 04:59:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 04:59:55 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/741120230' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 04:59:55 np0005593234 nova_compute[227762]: 2026-01-23 09:59:55.932 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 04:59:55 np0005593234 nova_compute[227762]: 2026-01-23 09:59:55.948 227766 DEBUG oslo_concurrency.processutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 04:59:55 np0005593234 nova_compute[227762]: 2026-01-23 09:59:55.953 227766 DEBUG nova.compute.provider_tree [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 04:59:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:56.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:56.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:56 np0005593234 nova_compute[227762]: 2026-01-23 09:59:56.592 227766 DEBUG nova.scheduler.client.report [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 04:59:58 np0005593234 nova_compute[227762]: 2026-01-23 09:59:58.102 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.929s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 04:59:58 np0005593234 nova_compute[227762]: 2026-01-23 09:59:58.103 227766 DEBUG nova.compute.manager [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 04:59:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 04:59:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:58.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 04:59:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 04:59:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 04:59:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:58.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 04:59:58 np0005593234 nova_compute[227762]: 2026-01-23 09:59:58.452 227766 DEBUG nova.compute.manager [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 04:59:58 np0005593234 nova_compute[227762]: 2026-01-23 09:59:58.452 227766 DEBUG nova.network.neutron [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 04:59:58 np0005593234 nova_compute[227762]: 2026-01-23 09:59:58.711 227766 INFO nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 04:59:58 np0005593234 nova_compute[227762]: 2026-01-23 09:59:58.954 227766 DEBUG nova.policy [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '57e3c530deab46758172af6777c8c108', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd557095954714e01b800ed2898d27593', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 04:59:59 np0005593234 nova_compute[227762]: 2026-01-23 09:59:59.249 227766 DEBUG nova.compute.manager [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 04:59:59 np0005593234 nova_compute[227762]: 2026-01-23 09:59:59.628 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162384.6279898, 2e578010-2c00-4db3-9cee-27a10eedc975 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 04:59:59 np0005593234 nova_compute[227762]: 2026-01-23 09:59:59.629 227766 INFO nova.compute.manager [-] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] VM Stopped (Lifecycle Event)#033[00m
Jan 23 04:59:59 np0005593234 nova_compute[227762]: 2026-01-23 09:59:59.785 227766 DEBUG nova.compute.manager [None req-28d5ce8b-d330-4401-b911-2aa1eb3f5d0b - - - - - -] [instance: 2e578010-2c00-4db3-9cee-27a10eedc975] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 04:59:59 np0005593234 nova_compute[227762]: 2026-01-23 09:59:59.992 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:00 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 05:00:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:00.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:00.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:00.710 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=37, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=36) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.710 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:00.711 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.782 227766 DEBUG nova.compute.manager [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.783 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.784 227766 INFO nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Creating image(s)#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.810 227766 DEBUG nova.storage.rbd_utils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] rbd image 4b94f03d-d738-409e-a0ac-b23304c3be02_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.837 227766 DEBUG nova.storage.rbd_utils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] rbd image 4b94f03d-d738-409e-a0ac-b23304c3be02_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.864 227766 DEBUG nova.storage.rbd_utils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] rbd image 4b94f03d-d738-409e-a0ac-b23304c3be02_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.868 227766 DEBUG oslo_concurrency.processutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.933 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.936 227766 DEBUG oslo_concurrency.processutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.936 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.937 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.937 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.962 227766 DEBUG nova.storage.rbd_utils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] rbd image 4b94f03d-d738-409e-a0ac-b23304c3be02_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:00:00 np0005593234 nova_compute[227762]: 2026-01-23 10:00:00.966 227766 DEBUG oslo_concurrency.processutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 4b94f03d-d738-409e-a0ac-b23304c3be02_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:00:01 np0005593234 nova_compute[227762]: 2026-01-23 10:00:01.694 227766 DEBUG oslo_concurrency.processutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 4b94f03d-d738-409e-a0ac-b23304c3be02_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.729s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:00:01 np0005593234 nova_compute[227762]: 2026-01-23 10:00:01.765 227766 DEBUG nova.storage.rbd_utils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] resizing rbd image 4b94f03d-d738-409e-a0ac-b23304c3be02_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:00:01 np0005593234 nova_compute[227762]: 2026-01-23 10:00:01.888 227766 DEBUG nova.objects.instance [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lazy-loading 'migration_context' on Instance uuid 4b94f03d-d738-409e-a0ac-b23304c3be02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:00:01 np0005593234 nova_compute[227762]: 2026-01-23 10:00:01.920 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:00:01 np0005593234 nova_compute[227762]: 2026-01-23 10:00:01.921 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Ensure instance console log exists: /var/lib/nova/instances/4b94f03d-d738-409e-a0ac-b23304c3be02/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:00:01 np0005593234 nova_compute[227762]: 2026-01-23 10:00:01.921 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:01 np0005593234 nova_compute[227762]: 2026-01-23 10:00:01.922 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:01 np0005593234 nova_compute[227762]: 2026-01-23 10:00:01.922 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:02.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:02.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:00:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:00:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:00:02 np0005593234 podman[269274]: 2026-01-23 10:00:02.829778703 +0000 UTC m=+0.119611186 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:00:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:04.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:04.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:04 np0005593234 nova_compute[227762]: 2026-01-23 10:00:04.576 227766 DEBUG nova.network.neutron [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Successfully created port: 4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:00:04 np0005593234 nova_compute[227762]: 2026-01-23 10:00:04.994 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:05 np0005593234 nova_compute[227762]: 2026-01-23 10:00:05.935 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:06.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:00:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:06.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:00:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:08.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:08.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:00:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:00:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:08.712 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '37'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:09 np0005593234 nova_compute[227762]: 2026-01-23 10:00:09.978 227766 DEBUG nova.network.neutron [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Successfully created port: 1cb18651-911d-44cd-a90d-80ff618579a8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:00:09 np0005593234 nova_compute[227762]: 2026-01-23 10:00:09.997 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:10.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:10.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:10 np0005593234 nova_compute[227762]: 2026-01-23 10:00:10.938 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:12.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:12.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:13 np0005593234 nova_compute[227762]: 2026-01-23 10:00:13.224 227766 DEBUG nova.network.neutron [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Successfully updated port: 4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:00:13 np0005593234 nova_compute[227762]: 2026-01-23 10:00:13.494 227766 DEBUG nova.compute.manager [req-043d5f80-37bd-40fc-91fb-784ce50681c7 req-8145d187-8b0b-44c1-9a8b-10f3ca0988d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received event network-changed-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:13 np0005593234 nova_compute[227762]: 2026-01-23 10:00:13.494 227766 DEBUG nova.compute.manager [req-043d5f80-37bd-40fc-91fb-784ce50681c7 req-8145d187-8b0b-44c1-9a8b-10f3ca0988d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Refreshing instance network info cache due to event network-changed-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:00:13 np0005593234 nova_compute[227762]: 2026-01-23 10:00:13.495 227766 DEBUG oslo_concurrency.lockutils [req-043d5f80-37bd-40fc-91fb-784ce50681c7 req-8145d187-8b0b-44c1-9a8b-10f3ca0988d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-4b94f03d-d738-409e-a0ac-b23304c3be02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:00:13 np0005593234 nova_compute[227762]: 2026-01-23 10:00:13.495 227766 DEBUG oslo_concurrency.lockutils [req-043d5f80-37bd-40fc-91fb-784ce50681c7 req-8145d187-8b0b-44c1-9a8b-10f3ca0988d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-4b94f03d-d738-409e-a0ac-b23304c3be02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:00:13 np0005593234 nova_compute[227762]: 2026-01-23 10:00:13.495 227766 DEBUG nova.network.neutron [req-043d5f80-37bd-40fc-91fb-784ce50681c7 req-8145d187-8b0b-44c1-9a8b-10f3ca0988d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Refreshing network info cache for port 4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:00:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:14.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:00:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:14.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:00:14 np0005593234 nova_compute[227762]: 2026-01-23 10:00:14.686 227766 DEBUG nova.network.neutron [req-043d5f80-37bd-40fc-91fb-784ce50681c7 req-8145d187-8b0b-44c1-9a8b-10f3ca0988d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:00:15 np0005593234 nova_compute[227762]: 2026-01-23 10:00:15.000 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:15 np0005593234 nova_compute[227762]: 2026-01-23 10:00:15.881 227766 DEBUG nova.network.neutron [req-043d5f80-37bd-40fc-91fb-784ce50681c7 req-8145d187-8b0b-44c1-9a8b-10f3ca0988d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:00:15 np0005593234 nova_compute[227762]: 2026-01-23 10:00:15.941 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:16.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:16.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:16 np0005593234 nova_compute[227762]: 2026-01-23 10:00:16.503 227766 DEBUG oslo_concurrency.lockutils [req-043d5f80-37bd-40fc-91fb-784ce50681c7 req-8145d187-8b0b-44c1-9a8b-10f3ca0988d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-4b94f03d-d738-409e-a0ac-b23304c3be02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:00:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:00:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:18.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:00:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:18.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:19 np0005593234 nova_compute[227762]: 2026-01-23 10:00:19.631 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:19 np0005593234 nova_compute[227762]: 2026-01-23 10:00:19.803 227766 DEBUG nova.network.neutron [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Successfully updated port: 1cb18651-911d-44cd-a90d-80ff618579a8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:00:19 np0005593234 nova_compute[227762]: 2026-01-23 10:00:19.831 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "refresh_cache-4b94f03d-d738-409e-a0ac-b23304c3be02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:00:19 np0005593234 nova_compute[227762]: 2026-01-23 10:00:19.832 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquired lock "refresh_cache-4b94f03d-d738-409e-a0ac-b23304c3be02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:00:19 np0005593234 nova_compute[227762]: 2026-01-23 10:00:19.832 227766 DEBUG nova.network.neutron [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:00:20 np0005593234 nova_compute[227762]: 2026-01-23 10:00:20.001 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:20.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:20 np0005593234 nova_compute[227762]: 2026-01-23 10:00:20.256 227766 DEBUG nova.network.neutron [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:00:20 np0005593234 nova_compute[227762]: 2026-01-23 10:00:20.277 227766 DEBUG nova.compute.manager [req-d4014a40-3c46-4f08-87d6-4f507eada4c9 req-da0c553c-12d7-45e4-952e-de1d6f4c518a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received event network-changed-1cb18651-911d-44cd-a90d-80ff618579a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:20 np0005593234 nova_compute[227762]: 2026-01-23 10:00:20.277 227766 DEBUG nova.compute.manager [req-d4014a40-3c46-4f08-87d6-4f507eada4c9 req-da0c553c-12d7-45e4-952e-de1d6f4c518a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Refreshing instance network info cache due to event network-changed-1cb18651-911d-44cd-a90d-80ff618579a8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:00:20 np0005593234 nova_compute[227762]: 2026-01-23 10:00:20.277 227766 DEBUG oslo_concurrency.lockutils [req-d4014a40-3c46-4f08-87d6-4f507eada4c9 req-da0c553c-12d7-45e4-952e-de1d6f4c518a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-4b94f03d-d738-409e-a0ac-b23304c3be02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:00:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:20.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:20 np0005593234 podman[269409]: 2026-01-23 10:00:20.755850597 +0000 UTC m=+0.055038831 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 23 05:00:20 np0005593234 nova_compute[227762]: 2026-01-23 10:00:20.943 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:22.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:22.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:22 np0005593234 nova_compute[227762]: 2026-01-23 10:00:22.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:24.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:24.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.421 227766 DEBUG nova.network.neutron [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Updating instance_info_cache with network_info: [{"id": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "address": "fa:16:3e:27:8b:47", "network": {"id": "70cf4ed9-f261-4264-9650-7d4e3f77ec45", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-368069413", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b07fe06-9d", "ovs_interfaceid": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1cb18651-911d-44cd-a90d-80ff618579a8", "address": "fa:16:3e:73:05:64", "network": {"id": "d2cd39c5-4f2d-4c93-b09b-4332da1257d0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1778987856", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb18651-91", "ovs_interfaceid": "1cb18651-911d-44cd-a90d-80ff618579a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.489 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Releasing lock "refresh_cache-4b94f03d-d738-409e-a0ac-b23304c3be02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.489 227766 DEBUG nova.compute.manager [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Instance network_info: |[{"id": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "address": "fa:16:3e:27:8b:47", "network": {"id": "70cf4ed9-f261-4264-9650-7d4e3f77ec45", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-368069413", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b07fe06-9d", "ovs_interfaceid": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1cb18651-911d-44cd-a90d-80ff618579a8", "address": "fa:16:3e:73:05:64", "network": {"id": "d2cd39c5-4f2d-4c93-b09b-4332da1257d0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1778987856", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb18651-91", "ovs_interfaceid": "1cb18651-911d-44cd-a90d-80ff618579a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.489 227766 DEBUG oslo_concurrency.lockutils [req-d4014a40-3c46-4f08-87d6-4f507eada4c9 req-da0c553c-12d7-45e4-952e-de1d6f4c518a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-4b94f03d-d738-409e-a0ac-b23304c3be02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.490 227766 DEBUG nova.network.neutron [req-d4014a40-3c46-4f08-87d6-4f507eada4c9 req-da0c553c-12d7-45e4-952e-de1d6f4c518a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Refreshing network info cache for port 1cb18651-911d-44cd-a90d-80ff618579a8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.493 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Start _get_guest_xml network_info=[{"id": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "address": "fa:16:3e:27:8b:47", "network": {"id": "70cf4ed9-f261-4264-9650-7d4e3f77ec45", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-368069413", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b07fe06-9d", "ovs_interfaceid": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1cb18651-911d-44cd-a90d-80ff618579a8", "address": "fa:16:3e:73:05:64", "network": {"id": "d2cd39c5-4f2d-4c93-b09b-4332da1257d0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1778987856", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb18651-91", "ovs_interfaceid": "1cb18651-911d-44cd-a90d-80ff618579a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.498 227766 WARNING nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.535 227766 DEBUG nova.virt.libvirt.host [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.536 227766 DEBUG nova.virt.libvirt.host [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.542 227766 DEBUG nova.virt.libvirt.host [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.543 227766 DEBUG nova.virt.libvirt.host [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.545 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.545 227766 DEBUG nova.virt.hardware [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.545 227766 DEBUG nova.virt.hardware [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.546 227766 DEBUG nova.virt.hardware [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.546 227766 DEBUG nova.virt.hardware [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.546 227766 DEBUG nova.virt.hardware [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.546 227766 DEBUG nova.virt.hardware [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.547 227766 DEBUG nova.virt.hardware [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.547 227766 DEBUG nova.virt.hardware [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.547 227766 DEBUG nova.virt.hardware [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.548 227766 DEBUG nova.virt.hardware [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.548 227766 DEBUG nova.virt.hardware [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:00:24 np0005593234 nova_compute[227762]: 2026-01-23 10:00:24.552 227766 DEBUG oslo_concurrency.processutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.003 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:00:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3145492259' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.054 227766 DEBUG oslo_concurrency.processutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.102 227766 DEBUG nova.storage.rbd_utils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] rbd image 4b94f03d-d738-409e-a0ac-b23304c3be02_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.106 227766 DEBUG oslo_concurrency.processutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:00:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:00:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3200014349' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.529 227766 DEBUG oslo_concurrency.processutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.531 227766 DEBUG nova.virt.libvirt.vif [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:59:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1351727518',display_name='tempest-ServersTestMultiNic-server-1351727518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1351727518',id=96,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-1yrvua34',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-546513917-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:59:59Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=4b94f03d-d738-409e-a0ac-b23304c3be02,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "address": "fa:16:3e:27:8b:47", "network": {"id": "70cf4ed9-f261-4264-9650-7d4e3f77ec45", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-368069413", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b07fe06-9d", "ovs_interfaceid": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.531 227766 DEBUG nova.network.os_vif_util [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "address": "fa:16:3e:27:8b:47", "network": {"id": "70cf4ed9-f261-4264-9650-7d4e3f77ec45", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-368069413", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b07fe06-9d", "ovs_interfaceid": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.533 227766 DEBUG nova.network.os_vif_util [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:8b:47,bridge_name='br-int',has_traffic_filtering=True,id=4b07fe06-9de4-4bfa-9a19-6aeceed74ea1,network=Network(70cf4ed9-f261-4264-9650-7d4e3f77ec45),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b07fe06-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.534 227766 DEBUG nova.virt.libvirt.vif [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:59:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1351727518',display_name='tempest-ServersTestMultiNic-server-1351727518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1351727518',id=96,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-1yrvua34',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-546513917-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:59:59Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=4b94f03d-d738-409e-a0ac-b23304c3be02,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1cb18651-911d-44cd-a90d-80ff618579a8", "address": "fa:16:3e:73:05:64", "network": {"id": "d2cd39c5-4f2d-4c93-b09b-4332da1257d0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1778987856", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb18651-91", "ovs_interfaceid": "1cb18651-911d-44cd-a90d-80ff618579a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.534 227766 DEBUG nova.network.os_vif_util [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "1cb18651-911d-44cd-a90d-80ff618579a8", "address": "fa:16:3e:73:05:64", "network": {"id": "d2cd39c5-4f2d-4c93-b09b-4332da1257d0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1778987856", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb18651-91", "ovs_interfaceid": "1cb18651-911d-44cd-a90d-80ff618579a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.534 227766 DEBUG nova.network.os_vif_util [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:05:64,bridge_name='br-int',has_traffic_filtering=True,id=1cb18651-911d-44cd-a90d-80ff618579a8,network=Network(d2cd39c5-4f2d-4c93-b09b-4332da1257d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb18651-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.536 227766 DEBUG nova.objects.instance [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4b94f03d-d738-409e-a0ac-b23304c3be02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:00:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.660 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  <uuid>4b94f03d-d738-409e-a0ac-b23304c3be02</uuid>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  <name>instance-00000060</name>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServersTestMultiNic-server-1351727518</nova:name>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:00:24</nova:creationTime>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <nova:user uuid="57e3c530deab46758172af6777c8c108">tempest-ServersTestMultiNic-546513917-project-member</nova:user>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <nova:project uuid="d557095954714e01b800ed2898d27593">tempest-ServersTestMultiNic-546513917</nova:project>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <nova:port uuid="4b07fe06-9de4-4bfa-9a19-6aeceed74ea1">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.212" ipVersion="4"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <nova:port uuid="1cb18651-911d-44cd-a90d-80ff618579a8">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.1.177" ipVersion="4"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <entry name="serial">4b94f03d-d738-409e-a0ac-b23304c3be02</entry>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <entry name="uuid">4b94f03d-d738-409e-a0ac-b23304c3be02</entry>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/4b94f03d-d738-409e-a0ac-b23304c3be02_disk">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/4b94f03d-d738-409e-a0ac-b23304c3be02_disk.config">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:27:8b:47"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <target dev="tap4b07fe06-9d"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:73:05:64"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <target dev="tap1cb18651-91"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/4b94f03d-d738-409e-a0ac-b23304c3be02/console.log" append="off"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:00:25 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:00:25 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:00:25 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:00:25 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.662 227766 DEBUG nova.compute.manager [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Preparing to wait for external event network-vif-plugged-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.662 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.663 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.663 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.663 227766 DEBUG nova.compute.manager [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Preparing to wait for external event network-vif-plugged-1cb18651-911d-44cd-a90d-80ff618579a8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.664 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.664 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.664 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.665 227766 DEBUG nova.virt.libvirt.vif [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:59:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1351727518',display_name='tempest-ServersTestMultiNic-server-1351727518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1351727518',id=96,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-1yrvua34',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-546513917-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:59:59Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=4b94f03d-d738-409e-a0ac-b23304c3be02,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "address": "fa:16:3e:27:8b:47", "network": {"id": "70cf4ed9-f261-4264-9650-7d4e3f77ec45", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-368069413", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b07fe06-9d", "ovs_interfaceid": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.666 227766 DEBUG nova.network.os_vif_util [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "address": "fa:16:3e:27:8b:47", "network": {"id": "70cf4ed9-f261-4264-9650-7d4e3f77ec45", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-368069413", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b07fe06-9d", "ovs_interfaceid": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.666 227766 DEBUG nova.network.os_vif_util [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:8b:47,bridge_name='br-int',has_traffic_filtering=True,id=4b07fe06-9de4-4bfa-9a19-6aeceed74ea1,network=Network(70cf4ed9-f261-4264-9650-7d4e3f77ec45),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b07fe06-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.667 227766 DEBUG os_vif [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:8b:47,bridge_name='br-int',has_traffic_filtering=True,id=4b07fe06-9de4-4bfa-9a19-6aeceed74ea1,network=Network(70cf4ed9-f261-4264-9650-7d4e3f77ec45),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b07fe06-9d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.668 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.668 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.669 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.672 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.672 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4b07fe06-9d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.673 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4b07fe06-9d, col_values=(('external_ids', {'iface-id': '4b07fe06-9de4-4bfa-9a19-6aeceed74ea1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:27:8b:47', 'vm-uuid': '4b94f03d-d738-409e-a0ac-b23304c3be02'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.675 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:25 np0005593234 NetworkManager[48942]: <info>  [1769162425.6756] manager: (tap4b07fe06-9d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/170)
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.677 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.681 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.682 227766 INFO os_vif [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:8b:47,bridge_name='br-int',has_traffic_filtering=True,id=4b07fe06-9de4-4bfa-9a19-6aeceed74ea1,network=Network(70cf4ed9-f261-4264-9650-7d4e3f77ec45),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b07fe06-9d')#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.683 227766 DEBUG nova.virt.libvirt.vif [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T09:59:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1351727518',display_name='tempest-ServersTestMultiNic-server-1351727518',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1351727518',id=96,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-1yrvua34',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-546513917-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T09:59:59Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=4b94f03d-d738-409e-a0ac-b23304c3be02,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1cb18651-911d-44cd-a90d-80ff618579a8", "address": "fa:16:3e:73:05:64", "network": {"id": "d2cd39c5-4f2d-4c93-b09b-4332da1257d0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1778987856", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb18651-91", "ovs_interfaceid": "1cb18651-911d-44cd-a90d-80ff618579a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.683 227766 DEBUG nova.network.os_vif_util [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "1cb18651-911d-44cd-a90d-80ff618579a8", "address": "fa:16:3e:73:05:64", "network": {"id": "d2cd39c5-4f2d-4c93-b09b-4332da1257d0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1778987856", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb18651-91", "ovs_interfaceid": "1cb18651-911d-44cd-a90d-80ff618579a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.684 227766 DEBUG nova.network.os_vif_util [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:05:64,bridge_name='br-int',has_traffic_filtering=True,id=1cb18651-911d-44cd-a90d-80ff618579a8,network=Network(d2cd39c5-4f2d-4c93-b09b-4332da1257d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb18651-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.684 227766 DEBUG os_vif [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:05:64,bridge_name='br-int',has_traffic_filtering=True,id=1cb18651-911d-44cd-a90d-80ff618579a8,network=Network(d2cd39c5-4f2d-4c93-b09b-4332da1257d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb18651-91') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.685 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.685 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.685 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.688 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.688 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1cb18651-91, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.688 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1cb18651-91, col_values=(('external_ids', {'iface-id': '1cb18651-911d-44cd-a90d-80ff618579a8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:73:05:64', 'vm-uuid': '4b94f03d-d738-409e-a0ac-b23304c3be02'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.689 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:25 np0005593234 NetworkManager[48942]: <info>  [1769162425.6903] manager: (tap1cb18651-91): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/171)
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.691 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.696 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.696 227766 INFO os_vif [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:05:64,bridge_name='br-int',has_traffic_filtering=True,id=1cb18651-911d-44cd-a90d-80ff618579a8,network=Network(d2cd39c5-4f2d-4c93-b09b-4332da1257d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb18651-91')#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.822 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.823 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.823 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] No VIF found with MAC fa:16:3e:27:8b:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.823 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] No VIF found with MAC fa:16:3e:73:05:64, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.824 227766 INFO nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Using config drive#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.850 227766 DEBUG nova.storage.rbd_utils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] rbd image 4b94f03d-d738-409e-a0ac-b23304c3be02_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:00:25 np0005593234 nova_compute[227762]: 2026-01-23 10:00:25.944 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:26.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:26.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:27 np0005593234 nova_compute[227762]: 2026-01-23 10:00:27.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:27 np0005593234 nova_compute[227762]: 2026-01-23 10:00:27.986 227766 INFO nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Creating config drive at /var/lib/nova/instances/4b94f03d-d738-409e-a0ac-b23304c3be02/disk.config#033[00m
Jan 23 05:00:27 np0005593234 nova_compute[227762]: 2026-01-23 10:00:27.991 227766 DEBUG oslo_concurrency.processutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4b94f03d-d738-409e-a0ac-b23304c3be02/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpctsmknpz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.122 227766 DEBUG oslo_concurrency.processutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4b94f03d-d738-409e-a0ac-b23304c3be02/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpctsmknpz" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.150 227766 DEBUG nova.storage.rbd_utils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] rbd image 4b94f03d-d738-409e-a0ac-b23304c3be02_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.154 227766 DEBUG oslo_concurrency.processutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4b94f03d-d738-409e-a0ac-b23304c3be02/disk.config 4b94f03d-d738-409e-a0ac-b23304c3be02_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:00:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:28.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.315 227766 DEBUG oslo_concurrency.processutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4b94f03d-d738-409e-a0ac-b23304c3be02/disk.config 4b94f03d-d738-409e-a0ac-b23304c3be02_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.161s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.316 227766 INFO nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Deleting local config drive /var/lib/nova/instances/4b94f03d-d738-409e-a0ac-b23304c3be02/disk.config because it was imported into RBD.#033[00m
Jan 23 05:00:28 np0005593234 kernel: tap4b07fe06-9d: entered promiscuous mode
Jan 23 05:00:28 np0005593234 NetworkManager[48942]: <info>  [1769162428.3671] manager: (tap4b07fe06-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/172)
Jan 23 05:00:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:28Z|00335|binding|INFO|Claiming lport 4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 for this chassis.
Jan 23 05:00:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:28Z|00336|binding|INFO|4b07fe06-9de4-4bfa-9a19-6aeceed74ea1: Claiming fa:16:3e:27:8b:47 10.100.0.212
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.369 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.371 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:28 np0005593234 NetworkManager[48942]: <info>  [1769162428.3804] manager: (tap1cb18651-91): new Tun device (/org/freedesktop/NetworkManager/Devices/173)
Jan 23 05:00:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:28.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:28 np0005593234 systemd-udevd[269624]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:00:28 np0005593234 systemd-udevd[269623]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.410 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:8b:47 10.100.0.212'], port_security=['fa:16:3e:27:8b:47 10.100.0.212'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.212/24', 'neutron:device_id': '4b94f03d-d738-409e-a0ac-b23304c3be02', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-70cf4ed9-f261-4264-9650-7d4e3f77ec45', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd557095954714e01b800ed2898d27593', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd5947ff6-b7cd-4a55-bcbe-de55519d051c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41f1c577-33c8-446d-b30d-8d6307c411c1, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=4b07fe06-9de4-4bfa-9a19-6aeceed74ea1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.412 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 in datapath 70cf4ed9-f261-4264-9650-7d4e3f77ec45 bound to our chassis#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.414 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 70cf4ed9-f261-4264-9650-7d4e3f77ec45#033[00m
Jan 23 05:00:28 np0005593234 kernel: tap1cb18651-91: entered promiscuous mode
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.418 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:28Z|00337|binding|INFO|Claiming lport 1cb18651-911d-44cd-a90d-80ff618579a8 for this chassis.
Jan 23 05:00:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:28Z|00338|binding|INFO|1cb18651-911d-44cd-a90d-80ff618579a8: Claiming fa:16:3e:73:05:64 10.100.1.177
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.422 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:28Z|00339|binding|INFO|Setting lport 4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 ovn-installed in OVS
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.424 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:28 np0005593234 NetworkManager[48942]: <info>  [1769162428.4283] device (tap1cb18651-91): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:00:28 np0005593234 NetworkManager[48942]: <info>  [1769162428.4296] device (tap1cb18651-91): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:00:28 np0005593234 NetworkManager[48942]: <info>  [1769162428.4303] device (tap4b07fe06-9d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:00:28 np0005593234 NetworkManager[48942]: <info>  [1769162428.4312] device (tap4b07fe06-9d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.432 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ef72aa7b-ddaf-4e55-b6b4-63c3bd3e28f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.433 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap70cf4ed9-f1 in ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:00:28 np0005593234 systemd-machined[195626]: New machine qemu-39-instance-00000060.
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.435 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap70cf4ed9-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.435 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1d57874f-ae97-43f3-88b0-948ad194557c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.435 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d1b86178-cf7c-472e-94c1-8bd5e6207299]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 systemd[1]: Started Virtual Machine qemu-39-instance-00000060.
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.448 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[d89b2dd5-13f6-4370-8e54-9859f3f5489f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:28Z|00340|binding|INFO|Setting lport 1cb18651-911d-44cd-a90d-80ff618579a8 ovn-installed in OVS
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.460 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7c0eb271-99ef-4e65-9946-309b4b9db4fa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.460 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:28Z|00341|binding|INFO|Setting lport 4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 up in Southbound
Jan 23 05:00:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:28Z|00342|binding|INFO|Setting lport 1cb18651-911d-44cd-a90d-80ff618579a8 up in Southbound
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.464 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:05:64 10.100.1.177'], port_security=['fa:16:3e:73:05:64 10.100.1.177'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.177/24', 'neutron:device_id': '4b94f03d-d738-409e-a0ac-b23304c3be02', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2cd39c5-4f2d-4c93-b09b-4332da1257d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd557095954714e01b800ed2898d27593', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd5947ff6-b7cd-4a55-bcbe-de55519d051c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=05d174b2-887d-40e4-80e0-f89b59d24436, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=1cb18651-911d-44cd-a90d-80ff618579a8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.493 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[df8d7f4c-5b9c-49bb-97b6-67e5b5235695]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 NetworkManager[48942]: <info>  [1769162428.4993] manager: (tap70cf4ed9-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/174)
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.499 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[39a32072-f597-4321-a2c4-134240a8fd80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.538 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b16c2e38-b3c3-43f4-976d-4894656a4e4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.542 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[2767632e-2bdc-4471-886f-8191f24dfd66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 NetworkManager[48942]: <info>  [1769162428.5691] device (tap70cf4ed9-f0): carrier: link connected
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.574 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[0f31a49a-4a33-4cc8-9004-847f72a3c5cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.593 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1d051760-0cc6-4278-8162-f7867244f7c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap70cf4ed9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:60:96:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632352, 'reachable_time': 27077, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 269659, 'error': None, 'target': 'ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.604 227766 DEBUG nova.network.neutron [req-d4014a40-3c46-4f08-87d6-4f507eada4c9 req-da0c553c-12d7-45e4-952e-de1d6f4c518a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Updated VIF entry in instance network info cache for port 1cb18651-911d-44cd-a90d-80ff618579a8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.604 227766 DEBUG nova.network.neutron [req-d4014a40-3c46-4f08-87d6-4f507eada4c9 req-da0c553c-12d7-45e4-952e-de1d6f4c518a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Updating instance_info_cache with network_info: [{"id": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "address": "fa:16:3e:27:8b:47", "network": {"id": "70cf4ed9-f261-4264-9650-7d4e3f77ec45", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-368069413", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b07fe06-9d", "ovs_interfaceid": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "1cb18651-911d-44cd-a90d-80ff618579a8", "address": "fa:16:3e:73:05:64", "network": {"id": "d2cd39c5-4f2d-4c93-b09b-4332da1257d0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1778987856", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb18651-91", "ovs_interfaceid": "1cb18651-911d-44cd-a90d-80ff618579a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.609 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ca35ddfe-dbaa-4068-901b-ea66438fb6a4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe60:96e7'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 632352, 'tstamp': 632352}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 269660, 'error': None, 'target': 'ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.626 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2ac4cca2-1afb-447c-99b6-7e71328e0d18]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap70cf4ed9-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:60:96:e7'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 107], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632352, 'reachable_time': 27077, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 269661, 'error': None, 'target': 'ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.640 227766 DEBUG oslo_concurrency.lockutils [req-d4014a40-3c46-4f08-87d6-4f507eada4c9 req-da0c553c-12d7-45e4-952e-de1d6f4c518a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-4b94f03d-d738-409e-a0ac-b23304c3be02" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.660 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9df9d31b-e583-4b48-a10c-86f276a3b958]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.715 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3d658340-4426-46b7-b154-f1f84c68226f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.716 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70cf4ed9-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.716 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.717 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap70cf4ed9-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:28 np0005593234 NetworkManager[48942]: <info>  [1769162428.7194] manager: (tap70cf4ed9-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/175)
Jan 23 05:00:28 np0005593234 kernel: tap70cf4ed9-f0: entered promiscuous mode
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.719 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.721 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap70cf4ed9-f0, col_values=(('external_ids', {'iface-id': '868a1e94-ec7c-4294-a3ff-16f8594e04ad'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.722 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:28Z|00343|binding|INFO|Releasing lport 868a1e94-ec7c-4294-a3ff-16f8594e04ad from this chassis (sb_readonly=0)
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.736 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.738 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/70cf4ed9-f261-4264-9650-7d4e3f77ec45.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/70cf4ed9-f261-4264-9650-7d4e3f77ec45.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.739 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4032a707-5306-4305-9b66-4bbceb510967]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.739 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-70cf4ed9-f261-4264-9650-7d4e3f77ec45
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/70cf4ed9-f261-4264-9650-7d4e3f77ec45.pid.haproxy
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 70cf4ed9-f261-4264-9650-7d4e3f77ec45
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:00:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:28.741 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45', 'env', 'PROCESS_TAG=haproxy-70cf4ed9-f261-4264-9650-7d4e3f77ec45', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/70cf4ed9-f261-4264-9650-7d4e3f77ec45.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.906 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162428.9058332, 4b94f03d-d738-409e-a0ac-b23304c3be02 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.907 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] VM Started (Lifecycle Event)#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.962 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.966 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162428.9062753, 4b94f03d-d738-409e-a0ac-b23304c3be02 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:00:28 np0005593234 nova_compute[227762]: 2026-01-23 10:00:28.966 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.009 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.012 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.038 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:00:29 np0005593234 podman[269736]: 2026-01-23 10:00:29.202196179 +0000 UTC m=+0.115172728 container create 126824e74e5e16b2345d57a2f0ddb8f21a740be61760d1300c53fc9c5636c8aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:00:29 np0005593234 podman[269736]: 2026-01-23 10:00:29.108373208 +0000 UTC m=+0.021349777 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.210 227766 DEBUG nova.compute.manager [req-f9a54993-3b86-499c-9a47-35049959c056 req-8402aa23-89d1-4735-9e61-80994b996300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received event network-vif-plugged-1cb18651-911d-44cd-a90d-80ff618579a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.210 227766 DEBUG oslo_concurrency.lockutils [req-f9a54993-3b86-499c-9a47-35049959c056 req-8402aa23-89d1-4735-9e61-80994b996300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.211 227766 DEBUG oslo_concurrency.lockutils [req-f9a54993-3b86-499c-9a47-35049959c056 req-8402aa23-89d1-4735-9e61-80994b996300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.211 227766 DEBUG oslo_concurrency.lockutils [req-f9a54993-3b86-499c-9a47-35049959c056 req-8402aa23-89d1-4735-9e61-80994b996300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.211 227766 DEBUG nova.compute.manager [req-f9a54993-3b86-499c-9a47-35049959c056 req-8402aa23-89d1-4735-9e61-80994b996300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Processing event network-vif-plugged-1cb18651-911d-44cd-a90d-80ff618579a8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.386 227766 DEBUG nova.compute.manager [req-f8a1bbdf-ec83-4a5b-b56d-8796d0447f28 req-77b5970d-63ef-447b-b8a9-639328551de1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received event network-vif-plugged-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.386 227766 DEBUG oslo_concurrency.lockutils [req-f8a1bbdf-ec83-4a5b-b56d-8796d0447f28 req-77b5970d-63ef-447b-b8a9-639328551de1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.386 227766 DEBUG oslo_concurrency.lockutils [req-f8a1bbdf-ec83-4a5b-b56d-8796d0447f28 req-77b5970d-63ef-447b-b8a9-639328551de1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.387 227766 DEBUG oslo_concurrency.lockutils [req-f8a1bbdf-ec83-4a5b-b56d-8796d0447f28 req-77b5970d-63ef-447b-b8a9-639328551de1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.387 227766 DEBUG nova.compute.manager [req-f8a1bbdf-ec83-4a5b-b56d-8796d0447f28 req-77b5970d-63ef-447b-b8a9-639328551de1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Processing event network-vif-plugged-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.387 227766 DEBUG nova.compute.manager [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.392 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162429.3918393, 4b94f03d-d738-409e-a0ac-b23304c3be02 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.392 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.393 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.397 227766 INFO nova.virt.libvirt.driver [-] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Instance spawned successfully.#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.397 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.437 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.443 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.446 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.447 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.447 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.447 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.448 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.448 227766 DEBUG nova.virt.libvirt.driver [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.488 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.552 227766 INFO nova.compute.manager [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Took 28.77 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.553 227766 DEBUG nova.compute.manager [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:00:29 np0005593234 systemd[1]: Started libpod-conmon-126824e74e5e16b2345d57a2f0ddb8f21a740be61760d1300c53fc9c5636c8aa.scope.
Jan 23 05:00:29 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:00:29 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/427edfc308db846e738b26fc1d3c0d582329de8d5bf6723e51d45b34a586ad61/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.666 227766 INFO nova.compute.manager [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Took 34.53 seconds to build instance.#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.721 227766 DEBUG oslo_concurrency.lockutils [None req-8e7bf3b9-be42-4e5b-a75f-dd0f5b4b3e83 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 34.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.785 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.785 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.786 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.786 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:00:29 np0005593234 nova_compute[227762]: 2026-01-23 10:00:29.786 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:00:29 np0005593234 podman[269736]: 2026-01-23 10:00:29.87328884 +0000 UTC m=+0.786265389 container init 126824e74e5e16b2345d57a2f0ddb8f21a740be61760d1300c53fc9c5636c8aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 05:00:29 np0005593234 podman[269736]: 2026-01-23 10:00:29.879748592 +0000 UTC m=+0.792725141 container start 126824e74e5e16b2345d57a2f0ddb8f21a740be61760d1300c53fc9c5636c8aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:00:29 np0005593234 neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45[269751]: [NOTICE]   (269757) : New worker (269761) forked
Jan 23 05:00:29 np0005593234 neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45[269751]: [NOTICE]   (269757) : Loading success.
Jan 23 05:00:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:29.972 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 1cb18651-911d-44cd-a90d-80ff618579a8 in datapath d2cd39c5-4f2d-4c93-b09b-4332da1257d0 unbound from our chassis#033[00m
Jan 23 05:00:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:29.974 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d2cd39c5-4f2d-4c93-b09b-4332da1257d0#033[00m
Jan 23 05:00:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:29.984 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8b48f9d6-fd19-4dc9-b7e0-6347273f4722]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:29.985 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd2cd39c5-41 in ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:00:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:29.986 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd2cd39c5-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:00:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:29.987 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8f60873e-a4df-4be3-b4b4-3c58f037a792]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:29.988 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[222ccc2f-38f3-4036-a2bc-cfdd16a9a2d6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:29.999 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[69d73035-6588-4c09-8d6a-c8788bbb57b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.013 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4159cb14-83e6-497a-b705-2be9205a1276]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.045 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[95e6b004-f584-4a7c-95a3-70e284c5cb96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:30 np0005593234 NetworkManager[48942]: <info>  [1769162430.0525] manager: (tapd2cd39c5-40): new Veth device (/org/freedesktop/NetworkManager/Devices/176)
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.053 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fe794b57-e90a-47b4-bcff-a62aa70dc4e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.091 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[2888bc85-701d-448d-9ffe-efde0bc78f4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.096 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4272ebc4-55cb-4e7a-b7e1-24cafdcba35a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:30 np0005593234 NetworkManager[48942]: <info>  [1769162430.1237] device (tapd2cd39c5-40): carrier: link connected
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.130 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[23ce49a9-5493-47fe-b05c-ede3d3e3426a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.147 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[63048aa5-a76f-4710-8417-9d48ecb5b438]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2cd39c5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b6:1e:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632508, 'reachable_time': 26264, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 269797, 'error': None, 'target': 'ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.164 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f850580a-0367-4593-80cf-c214cbee090d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb6:1ee3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 632508, 'tstamp': 632508}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 269798, 'error': None, 'target': 'ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.184 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[100d4052-db97-41b1-9dfa-1b7ed71ba2e7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd2cd39c5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b6:1e:e3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 108], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632508, 'reachable_time': 26264, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 269799, 'error': None, 'target': 'ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:30.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.216 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3fd0d754-060a-4389-9f33-0f27befee37f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:00:30 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/622370306' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.263 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.280 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[40e1411e-c809-4f4c-9990-073a8b333b00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.283 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2cd39c5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.283 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.284 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd2cd39c5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:30 np0005593234 kernel: tapd2cd39c5-40: entered promiscuous mode
Jan 23 05:00:30 np0005593234 NetworkManager[48942]: <info>  [1769162430.2864] manager: (tapd2cd39c5-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/177)
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.288 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.292 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd2cd39c5-40, col_values=(('external_ids', {'iface-id': '24f5cee8-a34f-44d9-882b-a2949cbd5db0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:30 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:30Z|00344|binding|INFO|Releasing lport 24f5cee8-a34f-44d9-882b-a2949cbd5db0 from this chassis (sb_readonly=0)
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.311 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.313 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d2cd39c5-4f2d-4c93-b09b-4332da1257d0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d2cd39c5-4f2d-4c93-b09b-4332da1257d0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.314 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e1853ea8-3647-4698-b2d9-502197f2cd48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.314 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-d2cd39c5-4f2d-4c93-b09b-4332da1257d0
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/d2cd39c5-4f2d-4c93-b09b-4332da1257d0.pid.haproxy
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID d2cd39c5-4f2d-4c93-b09b-4332da1257d0
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:00:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:30.315 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0', 'env', 'PROCESS_TAG=haproxy-d2cd39c5-4f2d-4c93-b09b-4332da1257d0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d2cd39c5-4f2d-4c93-b09b-4332da1257d0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:00:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:00:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:30.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.410 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000060 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.410 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000060 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:00:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.593 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.595 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4484MB free_disk=20.921916961669922GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.595 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.596 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.689 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:30 np0005593234 podman[269835]: 2026-01-23 10:00:30.704726429 +0000 UTC m=+0.046425741 container create a8a78fadfa91a81bc72689747a20d9aae02279ad32338dbbc32ac7393cd5376b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.705 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 4b94f03d-d738-409e-a0ac-b23304c3be02 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.709 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.710 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:00:30 np0005593234 systemd[1]: Started libpod-conmon-a8a78fadfa91a81bc72689747a20d9aae02279ad32338dbbc32ac7393cd5376b.scope.
Jan 23 05:00:30 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:00:30 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3c7bf5de8f685e77d529d4a76404ec7c1479a4bacce380b75548cb762e347f02/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:00:30 np0005593234 podman[269835]: 2026-01-23 10:00:30.775344755 +0000 UTC m=+0.117044087 container init a8a78fadfa91a81bc72689747a20d9aae02279ad32338dbbc32ac7393cd5376b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 05:00:30 np0005593234 podman[269835]: 2026-01-23 10:00:30.679240423 +0000 UTC m=+0.020939765 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:00:30 np0005593234 podman[269835]: 2026-01-23 10:00:30.781116005 +0000 UTC m=+0.122815327 container start a8a78fadfa91a81bc72689747a20d9aae02279ad32338dbbc32ac7393cd5376b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.791 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:00:30 np0005593234 neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0[269848]: [NOTICE]   (269852) : New worker (269855) forked
Jan 23 05:00:30 np0005593234 neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0[269848]: [NOTICE]   (269852) : Loading success.
Jan 23 05:00:30 np0005593234 nova_compute[227762]: 2026-01-23 10:00:30.946 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:00:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1772872227' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.234 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.240 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.523 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.592 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.593 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.997s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.702 227766 DEBUG nova.compute.manager [req-be711b69-0c78-4ed8-83cd-fd9620388b35 req-a0c79d2f-7c62-4e4a-97bd-ac0b5a68d1b6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received event network-vif-plugged-1cb18651-911d-44cd-a90d-80ff618579a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.702 227766 DEBUG oslo_concurrency.lockutils [req-be711b69-0c78-4ed8-83cd-fd9620388b35 req-a0c79d2f-7c62-4e4a-97bd-ac0b5a68d1b6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.703 227766 DEBUG oslo_concurrency.lockutils [req-be711b69-0c78-4ed8-83cd-fd9620388b35 req-a0c79d2f-7c62-4e4a-97bd-ac0b5a68d1b6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.703 227766 DEBUG oslo_concurrency.lockutils [req-be711b69-0c78-4ed8-83cd-fd9620388b35 req-a0c79d2f-7c62-4e4a-97bd-ac0b5a68d1b6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.703 227766 DEBUG nova.compute.manager [req-be711b69-0c78-4ed8-83cd-fd9620388b35 req-a0c79d2f-7c62-4e4a-97bd-ac0b5a68d1b6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] No waiting events found dispatching network-vif-plugged-1cb18651-911d-44cd-a90d-80ff618579a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.704 227766 WARNING nova.compute.manager [req-be711b69-0c78-4ed8-83cd-fd9620388b35 req-a0c79d2f-7c62-4e4a-97bd-ac0b5a68d1b6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received unexpected event network-vif-plugged-1cb18651-911d-44cd-a90d-80ff618579a8 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.789 227766 DEBUG nova.compute.manager [req-055bfe30-9014-4d03-b68b-422ddd722b77 req-108818b8-9b0f-40df-a2b7-b84437c553fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received event network-vif-plugged-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.789 227766 DEBUG oslo_concurrency.lockutils [req-055bfe30-9014-4d03-b68b-422ddd722b77 req-108818b8-9b0f-40df-a2b7-b84437c553fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.790 227766 DEBUG oslo_concurrency.lockutils [req-055bfe30-9014-4d03-b68b-422ddd722b77 req-108818b8-9b0f-40df-a2b7-b84437c553fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.790 227766 DEBUG oslo_concurrency.lockutils [req-055bfe30-9014-4d03-b68b-422ddd722b77 req-108818b8-9b0f-40df-a2b7-b84437c553fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.790 227766 DEBUG nova.compute.manager [req-055bfe30-9014-4d03-b68b-422ddd722b77 req-108818b8-9b0f-40df-a2b7-b84437c553fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] No waiting events found dispatching network-vif-plugged-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:31 np0005593234 nova_compute[227762]: 2026-01-23 10:00:31.791 227766 WARNING nova.compute.manager [req-055bfe30-9014-4d03-b68b-422ddd722b77 req-108818b8-9b0f-40df-a2b7-b84437c553fa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received unexpected event network-vif-plugged-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.144 227766 DEBUG oslo_concurrency.lockutils [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "4b94f03d-d738-409e-a0ac-b23304c3be02" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.145 227766 DEBUG oslo_concurrency.lockutils [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.145 227766 DEBUG oslo_concurrency.lockutils [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.145 227766 DEBUG oslo_concurrency.lockutils [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.145 227766 DEBUG oslo_concurrency.lockutils [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.147 227766 INFO nova.compute.manager [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Terminating instance#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.148 227766 DEBUG nova.compute.manager [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:00:32 np0005593234 kernel: tap4b07fe06-9d (unregistering): left promiscuous mode
Jan 23 05:00:32 np0005593234 NetworkManager[48942]: <info>  [1769162432.1855] device (tap4b07fe06-9d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:00:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:00:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:32.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:00:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:32Z|00345|binding|INFO|Releasing lport 4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 from this chassis (sb_readonly=0)
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.196 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:32Z|00346|binding|INFO|Setting lport 4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 down in Southbound
Jan 23 05:00:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:32Z|00347|binding|INFO|Removing iface tap4b07fe06-9d ovn-installed in OVS
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.199 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.207 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:27:8b:47 10.100.0.212'], port_security=['fa:16:3e:27:8b:47 10.100.0.212'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.212/24', 'neutron:device_id': '4b94f03d-d738-409e-a0ac-b23304c3be02', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-70cf4ed9-f261-4264-9650-7d4e3f77ec45', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd557095954714e01b800ed2898d27593', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd5947ff6-b7cd-4a55-bcbe-de55519d051c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=41f1c577-33c8-446d-b30d-8d6307c411c1, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=4b07fe06-9de4-4bfa-9a19-6aeceed74ea1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.208 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 in datapath 70cf4ed9-f261-4264-9650-7d4e3f77ec45 unbound from our chassis#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.209 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 70cf4ed9-f261-4264-9650-7d4e3f77ec45, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.211 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.210 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[edabc808-694a-4300-895e-3e0fda2040b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.212 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45 namespace which is not needed anymore#033[00m
Jan 23 05:00:32 np0005593234 kernel: tap1cb18651-91 (unregistering): left promiscuous mode
Jan 23 05:00:32 np0005593234 NetworkManager[48942]: <info>  [1769162432.2196] device (tap1cb18651-91): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.221 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.231 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:32Z|00348|binding|INFO|Releasing lport 1cb18651-911d-44cd-a90d-80ff618579a8 from this chassis (sb_readonly=0)
Jan 23 05:00:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:32Z|00349|binding|INFO|Setting lport 1cb18651-911d-44cd-a90d-80ff618579a8 down in Southbound
Jan 23 05:00:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:00:32Z|00350|binding|INFO|Removing iface tap1cb18651-91 ovn-installed in OVS
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.234 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.245 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.252 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:73:05:64 10.100.1.177'], port_security=['fa:16:3e:73:05:64 10.100.1.177'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.1.177/24', 'neutron:device_id': '4b94f03d-d738-409e-a0ac-b23304c3be02', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d2cd39c5-4f2d-4c93-b09b-4332da1257d0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd557095954714e01b800ed2898d27593', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd5947ff6-b7cd-4a55-bcbe-de55519d051c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=05d174b2-887d-40e4-80e0-f89b59d24436, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=1cb18651-911d-44cd-a90d-80ff618579a8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:00:32 np0005593234 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000060.scope: Deactivated successfully.
Jan 23 05:00:32 np0005593234 systemd[1]: machine-qemu\x2d39\x2dinstance\x2d00000060.scope: Consumed 3.212s CPU time.
Jan 23 05:00:32 np0005593234 systemd-machined[195626]: Machine qemu-39-instance-00000060 terminated.
Jan 23 05:00:32 np0005593234 NetworkManager[48942]: <info>  [1769162432.3618] manager: (tap4b07fe06-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/178)
Jan 23 05:00:32 np0005593234 NetworkManager[48942]: <info>  [1769162432.3781] manager: (tap1cb18651-91): new Tun device (/org/freedesktop/NetworkManager/Devices/179)
Jan 23 05:00:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:32.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.398 227766 INFO nova.virt.libvirt.driver [-] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Instance destroyed successfully.#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.399 227766 DEBUG nova.objects.instance [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lazy-loading 'resources' on Instance uuid 4b94f03d-d738-409e-a0ac-b23304c3be02 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.416 227766 DEBUG nova.virt.libvirt.vif [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:59:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1351727518',display_name='tempest-ServersTestMultiNic-server-1351727518',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1351727518',id=96,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:00:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-1yrvua34',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-546513917-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:00:29Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=4b94f03d-d738-409e-a0ac-b23304c3be02,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "address": "fa:16:3e:27:8b:47", "network": {"id": "70cf4ed9-f261-4264-9650-7d4e3f77ec45", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-368069413", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b07fe06-9d", "ovs_interfaceid": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.416 227766 DEBUG nova.network.os_vif_util [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "address": "fa:16:3e:27:8b:47", "network": {"id": "70cf4ed9-f261-4264-9650-7d4e3f77ec45", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-368069413", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b07fe06-9d", "ovs_interfaceid": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.417 227766 DEBUG nova.network.os_vif_util [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:27:8b:47,bridge_name='br-int',has_traffic_filtering=True,id=4b07fe06-9de4-4bfa-9a19-6aeceed74ea1,network=Network(70cf4ed9-f261-4264-9650-7d4e3f77ec45),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b07fe06-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.417 227766 DEBUG os_vif [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:8b:47,bridge_name='br-int',has_traffic_filtering=True,id=4b07fe06-9de4-4bfa-9a19-6aeceed74ea1,network=Network(70cf4ed9-f261-4264-9650-7d4e3f77ec45),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b07fe06-9d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.419 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.419 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4b07fe06-9d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.421 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.422 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.425 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.427 227766 INFO os_vif [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:27:8b:47,bridge_name='br-int',has_traffic_filtering=True,id=4b07fe06-9de4-4bfa-9a19-6aeceed74ea1,network=Network(70cf4ed9-f261-4264-9650-7d4e3f77ec45),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4b07fe06-9d')#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.428 227766 DEBUG nova.virt.libvirt.vif [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:59:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServersTestMultiNic-server-1351727518',display_name='tempest-ServersTestMultiNic-server-1351727518',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstestmultinic-server-1351727518',id=96,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:00:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d557095954714e01b800ed2898d27593',ramdisk_id='',reservation_id='r-1yrvua34',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServersTestMultiNic-546513917',owner_user_name='tempest-ServersTestMultiNic-546513917-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:00:29Z,user_data=None,user_id='57e3c530deab46758172af6777c8c108',uuid=4b94f03d-d738-409e-a0ac-b23304c3be02,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1cb18651-911d-44cd-a90d-80ff618579a8", "address": "fa:16:3e:73:05:64", "network": {"id": "d2cd39c5-4f2d-4c93-b09b-4332da1257d0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1778987856", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb18651-91", "ovs_interfaceid": "1cb18651-911d-44cd-a90d-80ff618579a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.428 227766 DEBUG nova.network.os_vif_util [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converting VIF {"id": "1cb18651-911d-44cd-a90d-80ff618579a8", "address": "fa:16:3e:73:05:64", "network": {"id": "d2cd39c5-4f2d-4c93-b09b-4332da1257d0", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-1778987856", "subnets": [{"cidr": "10.100.1.0/24", "dns": [], "gateway": {"address": "10.100.1.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.1.177", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1cb18651-91", "ovs_interfaceid": "1cb18651-911d-44cd-a90d-80ff618579a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.429 227766 DEBUG nova.network.os_vif_util [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:73:05:64,bridge_name='br-int',has_traffic_filtering=True,id=1cb18651-911d-44cd-a90d-80ff618579a8,network=Network(d2cd39c5-4f2d-4c93-b09b-4332da1257d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb18651-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.429 227766 DEBUG os_vif [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:05:64,bridge_name='br-int',has_traffic_filtering=True,id=1cb18651-911d-44cd-a90d-80ff618579a8,network=Network(d2cd39c5-4f2d-4c93-b09b-4332da1257d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb18651-91') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.430 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.430 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1cb18651-91, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.431 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.433 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.434 227766 INFO os_vif [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:73:05:64,bridge_name='br-int',has_traffic_filtering=True,id=1cb18651-911d-44cd-a90d-80ff618579a8,network=Network(d2cd39c5-4f2d-4c93-b09b-4332da1257d0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1cb18651-91')#033[00m
Jan 23 05:00:32 np0005593234 neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45[269751]: [NOTICE]   (269757) : haproxy version is 2.8.14-c23fe91
Jan 23 05:00:32 np0005593234 neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45[269751]: [NOTICE]   (269757) : path to executable is /usr/sbin/haproxy
Jan 23 05:00:32 np0005593234 neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45[269751]: [WARNING]  (269757) : Exiting Master process...
Jan 23 05:00:32 np0005593234 neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45[269751]: [ALERT]    (269757) : Current worker (269761) exited with code 143 (Terminated)
Jan 23 05:00:32 np0005593234 neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45[269751]: [WARNING]  (269757) : All workers exited. Exiting... (0)
Jan 23 05:00:32 np0005593234 systemd[1]: libpod-126824e74e5e16b2345d57a2f0ddb8f21a740be61760d1300c53fc9c5636c8aa.scope: Deactivated successfully.
Jan 23 05:00:32 np0005593234 podman[269914]: 2026-01-23 10:00:32.495206482 +0000 UTC m=+0.200877314 container died 126824e74e5e16b2345d57a2f0ddb8f21a740be61760d1300c53fc9c5636c8aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:00:32 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-126824e74e5e16b2345d57a2f0ddb8f21a740be61760d1300c53fc9c5636c8aa-userdata-shm.mount: Deactivated successfully.
Jan 23 05:00:32 np0005593234 systemd[1]: var-lib-containers-storage-overlay-427edfc308db846e738b26fc1d3c0d582329de8d5bf6723e51d45b34a586ad61-merged.mount: Deactivated successfully.
Jan 23 05:00:32 np0005593234 podman[269914]: 2026-01-23 10:00:32.545896706 +0000 UTC m=+0.251567538 container cleanup 126824e74e5e16b2345d57a2f0ddb8f21a740be61760d1300c53fc9c5636c8aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:00:32 np0005593234 systemd[1]: libpod-conmon-126824e74e5e16b2345d57a2f0ddb8f21a740be61760d1300c53fc9c5636c8aa.scope: Deactivated successfully.
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.593 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.594 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.594 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:00:32 np0005593234 podman[269989]: 2026-01-23 10:00:32.606599902 +0000 UTC m=+0.040129414 container remove 126824e74e5e16b2345d57a2f0ddb8f21a740be61760d1300c53fc9c5636c8aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.612 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[478111f3-6a01-4fa7-af84-0bfe60f13fa0]: (4, ('Fri Jan 23 10:00:32 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45 (126824e74e5e16b2345d57a2f0ddb8f21a740be61760d1300c53fc9c5636c8aa)\n126824e74e5e16b2345d57a2f0ddb8f21a740be61760d1300c53fc9c5636c8aa\nFri Jan 23 10:00:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45 (126824e74e5e16b2345d57a2f0ddb8f21a740be61760d1300c53fc9c5636c8aa)\n126824e74e5e16b2345d57a2f0ddb8f21a740be61760d1300c53fc9c5636c8aa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.614 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dac9594c-3316-4500-b777-868876f71cf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.615 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70cf4ed9-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.617 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 kernel: tap70cf4ed9-f0: left promiscuous mode
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.629 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.630 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.630 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.631 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.631 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.632 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.634 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[19067f1d-b200-4b31-90d8-439f4a27b256]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.655 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c25789b4-e13b-4ee9-bdb1-236420577ada]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.656 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[16c805dc-07d2-4d6b-8c1f-0270481a149b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.673 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[95d80eb3-0a6d-4791-a2a2-3f6bb4749415]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632344, 'reachable_time': 21518, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270004, 'error': None, 'target': 'ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 systemd[1]: run-netns-ovnmeta\x2d70cf4ed9\x2df261\x2d4264\x2d9650\x2d7d4e3f77ec45.mount: Deactivated successfully.
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.676 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-70cf4ed9-f261-4264-9650-7d4e3f77ec45 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.677 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[afd0cbf6-b83a-4474-b32b-e3b49b4ae859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.678 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 1cb18651-911d-44cd-a90d-80ff618579a8 in datapath d2cd39c5-4f2d-4c93-b09b-4332da1257d0 unbound from our chassis#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.679 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d2cd39c5-4f2d-4c93-b09b-4332da1257d0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.680 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1eaf9cc3-cfa5-40c5-97d2-2c30f65ab220]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.681 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0 namespace which is not needed anymore#033[00m
Jan 23 05:00:32 np0005593234 neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0[269848]: [NOTICE]   (269852) : haproxy version is 2.8.14-c23fe91
Jan 23 05:00:32 np0005593234 neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0[269848]: [NOTICE]   (269852) : path to executable is /usr/sbin/haproxy
Jan 23 05:00:32 np0005593234 neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0[269848]: [WARNING]  (269852) : Exiting Master process...
Jan 23 05:00:32 np0005593234 neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0[269848]: [ALERT]    (269852) : Current worker (269855) exited with code 143 (Terminated)
Jan 23 05:00:32 np0005593234 neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0[269848]: [WARNING]  (269852) : All workers exited. Exiting... (0)
Jan 23 05:00:32 np0005593234 systemd[1]: libpod-a8a78fadfa91a81bc72689747a20d9aae02279ad32338dbbc32ac7393cd5376b.scope: Deactivated successfully.
Jan 23 05:00:32 np0005593234 podman[270023]: 2026-01-23 10:00:32.804609157 +0000 UTC m=+0.042900492 container died a8a78fadfa91a81bc72689747a20d9aae02279ad32338dbbc32ac7393cd5376b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 23 05:00:32 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a8a78fadfa91a81bc72689747a20d9aae02279ad32338dbbc32ac7393cd5376b-userdata-shm.mount: Deactivated successfully.
Jan 23 05:00:32 np0005593234 systemd[1]: var-lib-containers-storage-overlay-3c7bf5de8f685e77d529d4a76404ec7c1479a4bacce380b75548cb762e347f02-merged.mount: Deactivated successfully.
Jan 23 05:00:32 np0005593234 podman[270023]: 2026-01-23 10:00:32.837955088 +0000 UTC m=+0.076246393 container cleanup a8a78fadfa91a81bc72689747a20d9aae02279ad32338dbbc32ac7393cd5376b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:00:32 np0005593234 systemd[1]: libpod-conmon-a8a78fadfa91a81bc72689747a20d9aae02279ad32338dbbc32ac7393cd5376b.scope: Deactivated successfully.
Jan 23 05:00:32 np0005593234 podman[270056]: 2026-01-23 10:00:32.896330081 +0000 UTC m=+0.039290838 container remove a8a78fadfa91a81bc72689747a20d9aae02279ad32338dbbc32ac7393cd5376b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.902 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6433193f-e1e8-4e2b-9520-c69f27a462e5]: (4, ('Fri Jan 23 10:00:32 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0 (a8a78fadfa91a81bc72689747a20d9aae02279ad32338dbbc32ac7393cd5376b)\na8a78fadfa91a81bc72689747a20d9aae02279ad32338dbbc32ac7393cd5376b\nFri Jan 23 10:00:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0 (a8a78fadfa91a81bc72689747a20d9aae02279ad32338dbbc32ac7393cd5376b)\na8a78fadfa91a81bc72689747a20d9aae02279ad32338dbbc32ac7393cd5376b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.904 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d7c21335-1c72-4ca4-b7b0-1fa602549c68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.905 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd2cd39c5-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.907 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 kernel: tapd2cd39c5-40: left promiscuous mode
Jan 23 05:00:32 np0005593234 nova_compute[227762]: 2026-01-23 10:00:32.923 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.926 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[332dd87a-e8aa-4dd2-9e08-2a28280285ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.953 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a86ef0cb-1182-4e28-a0ad-d8005fbc49d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.954 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f3723f6a-50e2-46a0-a5e6-b32dba1c04c0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.972 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e764c0c7-24a8-4177-a6ec-2ca96471dec8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 632500, 'reachable_time': 37592, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 270093, 'error': None, 'target': 'ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.974 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d2cd39c5-4f2d-4c93-b09b-4332da1257d0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:00:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:32.974 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[35421caf-3edb-4059-b7bc-5c226c01cef7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:00:33 np0005593234 podman[270057]: 2026-01-23 10:00:33.002210338 +0000 UTC m=+0.133118468 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 23 05:00:33 np0005593234 nova_compute[227762]: 2026-01-23 10:00:33.175 227766 INFO nova.virt.libvirt.driver [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Deleting instance files /var/lib/nova/instances/4b94f03d-d738-409e-a0ac-b23304c3be02_del#033[00m
Jan 23 05:00:33 np0005593234 nova_compute[227762]: 2026-01-23 10:00:33.176 227766 INFO nova.virt.libvirt.driver [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Deletion of /var/lib/nova/instances/4b94f03d-d738-409e-a0ac-b23304c3be02_del complete#033[00m
Jan 23 05:00:33 np0005593234 nova_compute[227762]: 2026-01-23 10:00:33.295 227766 INFO nova.compute.manager [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Took 1.15 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:00:33 np0005593234 nova_compute[227762]: 2026-01-23 10:00:33.296 227766 DEBUG oslo.service.loopingcall [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:00:33 np0005593234 nova_compute[227762]: 2026-01-23 10:00:33.296 227766 DEBUG nova.compute.manager [-] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:00:33 np0005593234 nova_compute[227762]: 2026-01-23 10:00:33.296 227766 DEBUG nova.network.neutron [-] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:00:33 np0005593234 systemd[1]: run-netns-ovnmeta\x2dd2cd39c5\x2d4f2d\x2d4c93\x2db09b\x2d4332da1257d0.mount: Deactivated successfully.
Jan 23 05:00:33 np0005593234 nova_compute[227762]: 2026-01-23 10:00:33.943 227766 DEBUG nova.compute.manager [req-0e00d3d2-e3ed-46b9-af02-ad3ccd75ff58 req-d749afe5-3f7b-4232-a31a-6e5b8ae83f3f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received event network-vif-unplugged-1cb18651-911d-44cd-a90d-80ff618579a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:33 np0005593234 nova_compute[227762]: 2026-01-23 10:00:33.944 227766 DEBUG oslo_concurrency.lockutils [req-0e00d3d2-e3ed-46b9-af02-ad3ccd75ff58 req-d749afe5-3f7b-4232-a31a-6e5b8ae83f3f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:33 np0005593234 nova_compute[227762]: 2026-01-23 10:00:33.945 227766 DEBUG oslo_concurrency.lockutils [req-0e00d3d2-e3ed-46b9-af02-ad3ccd75ff58 req-d749afe5-3f7b-4232-a31a-6e5b8ae83f3f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:33 np0005593234 nova_compute[227762]: 2026-01-23 10:00:33.945 227766 DEBUG oslo_concurrency.lockutils [req-0e00d3d2-e3ed-46b9-af02-ad3ccd75ff58 req-d749afe5-3f7b-4232-a31a-6e5b8ae83f3f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:33 np0005593234 nova_compute[227762]: 2026-01-23 10:00:33.945 227766 DEBUG nova.compute.manager [req-0e00d3d2-e3ed-46b9-af02-ad3ccd75ff58 req-d749afe5-3f7b-4232-a31a-6e5b8ae83f3f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] No waiting events found dispatching network-vif-unplugged-1cb18651-911d-44cd-a90d-80ff618579a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:33 np0005593234 nova_compute[227762]: 2026-01-23 10:00:33.946 227766 DEBUG nova.compute.manager [req-0e00d3d2-e3ed-46b9-af02-ad3ccd75ff58 req-d749afe5-3f7b-4232-a31a-6e5b8ae83f3f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received event network-vif-unplugged-1cb18651-911d-44cd-a90d-80ff618579a8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:00:34 np0005593234 nova_compute[227762]: 2026-01-23 10:00:34.098 227766 DEBUG nova.compute.manager [req-a700e24e-953f-4e93-b434-96e1d924e93d req-258db01e-dfac-483a-8932-63eef9837288 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received event network-vif-unplugged-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:34 np0005593234 nova_compute[227762]: 2026-01-23 10:00:34.098 227766 DEBUG oslo_concurrency.lockutils [req-a700e24e-953f-4e93-b434-96e1d924e93d req-258db01e-dfac-483a-8932-63eef9837288 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:34 np0005593234 nova_compute[227762]: 2026-01-23 10:00:34.099 227766 DEBUG oslo_concurrency.lockutils [req-a700e24e-953f-4e93-b434-96e1d924e93d req-258db01e-dfac-483a-8932-63eef9837288 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:34 np0005593234 nova_compute[227762]: 2026-01-23 10:00:34.099 227766 DEBUG oslo_concurrency.lockutils [req-a700e24e-953f-4e93-b434-96e1d924e93d req-258db01e-dfac-483a-8932-63eef9837288 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:34 np0005593234 nova_compute[227762]: 2026-01-23 10:00:34.100 227766 DEBUG nova.compute.manager [req-a700e24e-953f-4e93-b434-96e1d924e93d req-258db01e-dfac-483a-8932-63eef9837288 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] No waiting events found dispatching network-vif-unplugged-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:34 np0005593234 nova_compute[227762]: 2026-01-23 10:00:34.100 227766 DEBUG nova.compute.manager [req-a700e24e-953f-4e93-b434-96e1d924e93d req-258db01e-dfac-483a-8932-63eef9837288 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received event network-vif-unplugged-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:00:34 np0005593234 nova_compute[227762]: 2026-01-23 10:00:34.100 227766 DEBUG nova.compute.manager [req-a700e24e-953f-4e93-b434-96e1d924e93d req-258db01e-dfac-483a-8932-63eef9837288 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received event network-vif-plugged-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:34 np0005593234 nova_compute[227762]: 2026-01-23 10:00:34.100 227766 DEBUG oslo_concurrency.lockutils [req-a700e24e-953f-4e93-b434-96e1d924e93d req-258db01e-dfac-483a-8932-63eef9837288 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:34 np0005593234 nova_compute[227762]: 2026-01-23 10:00:34.100 227766 DEBUG oslo_concurrency.lockutils [req-a700e24e-953f-4e93-b434-96e1d924e93d req-258db01e-dfac-483a-8932-63eef9837288 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:34 np0005593234 nova_compute[227762]: 2026-01-23 10:00:34.101 227766 DEBUG oslo_concurrency.lockutils [req-a700e24e-953f-4e93-b434-96e1d924e93d req-258db01e-dfac-483a-8932-63eef9837288 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:34 np0005593234 nova_compute[227762]: 2026-01-23 10:00:34.101 227766 DEBUG nova.compute.manager [req-a700e24e-953f-4e93-b434-96e1d924e93d req-258db01e-dfac-483a-8932-63eef9837288 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] No waiting events found dispatching network-vif-plugged-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:34 np0005593234 nova_compute[227762]: 2026-01-23 10:00:34.101 227766 WARNING nova.compute.manager [req-a700e24e-953f-4e93-b434-96e1d924e93d req-258db01e-dfac-483a-8932-63eef9837288 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received unexpected event network-vif-plugged-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:00:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:34.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:34.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:35 np0005593234 nova_compute[227762]: 2026-01-23 10:00:35.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:35 np0005593234 nova_compute[227762]: 2026-01-23 10:00:35.949 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:00:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:36.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:00:36 np0005593234 nova_compute[227762]: 2026-01-23 10:00:36.267 227766 DEBUG nova.compute.manager [req-936589b7-571d-4c15-b677-68ac1d4e719c req-61be0fd3-0f36-439c-b773-3b37cd4611d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received event network-vif-plugged-1cb18651-911d-44cd-a90d-80ff618579a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:36 np0005593234 nova_compute[227762]: 2026-01-23 10:00:36.267 227766 DEBUG oslo_concurrency.lockutils [req-936589b7-571d-4c15-b677-68ac1d4e719c req-61be0fd3-0f36-439c-b773-3b37cd4611d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:36 np0005593234 nova_compute[227762]: 2026-01-23 10:00:36.267 227766 DEBUG oslo_concurrency.lockutils [req-936589b7-571d-4c15-b677-68ac1d4e719c req-61be0fd3-0f36-439c-b773-3b37cd4611d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:36 np0005593234 nova_compute[227762]: 2026-01-23 10:00:36.268 227766 DEBUG oslo_concurrency.lockutils [req-936589b7-571d-4c15-b677-68ac1d4e719c req-61be0fd3-0f36-439c-b773-3b37cd4611d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:36 np0005593234 nova_compute[227762]: 2026-01-23 10:00:36.268 227766 DEBUG nova.compute.manager [req-936589b7-571d-4c15-b677-68ac1d4e719c req-61be0fd3-0f36-439c-b773-3b37cd4611d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] No waiting events found dispatching network-vif-plugged-1cb18651-911d-44cd-a90d-80ff618579a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:00:36 np0005593234 nova_compute[227762]: 2026-01-23 10:00:36.268 227766 WARNING nova.compute.manager [req-936589b7-571d-4c15-b677-68ac1d4e719c req-61be0fd3-0f36-439c-b773-3b37cd4611d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received unexpected event network-vif-plugged-1cb18651-911d-44cd-a90d-80ff618579a8 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:00:36 np0005593234 nova_compute[227762]: 2026-01-23 10:00:36.268 227766 DEBUG nova.compute.manager [req-936589b7-571d-4c15-b677-68ac1d4e719c req-61be0fd3-0f36-439c-b773-3b37cd4611d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received event network-vif-deleted-1cb18651-911d-44cd-a90d-80ff618579a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:36 np0005593234 nova_compute[227762]: 2026-01-23 10:00:36.269 227766 INFO nova.compute.manager [req-936589b7-571d-4c15-b677-68ac1d4e719c req-61be0fd3-0f36-439c-b773-3b37cd4611d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Neutron deleted interface 1cb18651-911d-44cd-a90d-80ff618579a8; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:00:36 np0005593234 nova_compute[227762]: 2026-01-23 10:00:36.269 227766 DEBUG nova.network.neutron [req-936589b7-571d-4c15-b677-68ac1d4e719c req-61be0fd3-0f36-439c-b773-3b37cd4611d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Updating instance_info_cache with network_info: [{"id": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "address": "fa:16:3e:27:8b:47", "network": {"id": "70cf4ed9-f261-4264-9650-7d4e3f77ec45", "bridge": "br-int", "label": "tempest-ServersTestMultiNic-368069413", "subnets": [{"cidr": "10.100.0.0/24", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.212", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d557095954714e01b800ed2898d27593", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4b07fe06-9d", "ovs_interfaceid": "4b07fe06-9de4-4bfa-9a19-6aeceed74ea1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:00:36 np0005593234 nova_compute[227762]: 2026-01-23 10:00:36.310 227766 DEBUG nova.compute.manager [req-936589b7-571d-4c15-b677-68ac1d4e719c req-61be0fd3-0f36-439c-b773-3b37cd4611d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Detach interface failed, port_id=1cb18651-911d-44cd-a90d-80ff618579a8, reason: Instance 4b94f03d-d738-409e-a0ac-b23304c3be02 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 05:00:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:36.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:37 np0005593234 nova_compute[227762]: 2026-01-23 10:00:37.433 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:38.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:38.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:39 np0005593234 nova_compute[227762]: 2026-01-23 10:00:39.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:40.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:40.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:40 np0005593234 nova_compute[227762]: 2026-01-23 10:00:40.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:00:40 np0005593234 nova_compute[227762]: 2026-01-23 10:00:40.952 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:41 np0005593234 nova_compute[227762]: 2026-01-23 10:00:41.408 227766 DEBUG nova.network.neutron [-] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:00:41 np0005593234 nova_compute[227762]: 2026-01-23 10:00:41.430 227766 INFO nova.compute.manager [-] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Took 8.13 seconds to deallocate network for instance.#033[00m
Jan 23 05:00:41 np0005593234 nova_compute[227762]: 2026-01-23 10:00:41.507 227766 DEBUG oslo_concurrency.lockutils [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:41 np0005593234 nova_compute[227762]: 2026-01-23 10:00:41.508 227766 DEBUG oslo_concurrency.lockutils [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:41 np0005593234 nova_compute[227762]: 2026-01-23 10:00:41.593 227766 DEBUG oslo_concurrency.processutils [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:00:41 np0005593234 nova_compute[227762]: 2026-01-23 10:00:41.656 227766 DEBUG nova.compute.manager [req-21df1477-aaec-4d30-9d17-70e5c80a5b40 req-a93fbb1b-d2a4-4bae-b8d0-a6ede0dc2c83 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Received event network-vif-deleted-4b07fe06-9de4-4bfa-9a19-6aeceed74ea1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:00:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:00:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2379554699' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:00:42 np0005593234 nova_compute[227762]: 2026-01-23 10:00:42.086 227766 DEBUG oslo_concurrency.processutils [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:00:42 np0005593234 nova_compute[227762]: 2026-01-23 10:00:42.093 227766 DEBUG nova.compute.provider_tree [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:00:42 np0005593234 nova_compute[227762]: 2026-01-23 10:00:42.131 227766 DEBUG nova.scheduler.client.report [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:00:42 np0005593234 nova_compute[227762]: 2026-01-23 10:00:42.166 227766 DEBUG oslo_concurrency.lockutils [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:00:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:42.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:00:42 np0005593234 nova_compute[227762]: 2026-01-23 10:00:42.213 227766 INFO nova.scheduler.client.report [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Deleted allocations for instance 4b94f03d-d738-409e-a0ac-b23304c3be02#033[00m
Jan 23 05:00:42 np0005593234 nova_compute[227762]: 2026-01-23 10:00:42.351 227766 DEBUG oslo_concurrency.lockutils [None req-29c605b8-db5c-4766-a21c-d4f923b4758d 57e3c530deab46758172af6777c8c108 d557095954714e01b800ed2898d27593 - - default default] Lock "4b94f03d-d738-409e-a0ac-b23304c3be02" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:42.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:42 np0005593234 nova_compute[227762]: 2026-01-23 10:00:42.435 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:42.837 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:00:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:42.838 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:00:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:00:42.838 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:00:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:00:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:44.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:00:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:00:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:44.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:00:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:45 np0005593234 nova_compute[227762]: 2026-01-23 10:00:45.953 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:46 np0005593234 nova_compute[227762]: 2026-01-23 10:00:46.188 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:46.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:46.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:47 np0005593234 nova_compute[227762]: 2026-01-23 10:00:47.397 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162432.3961265, 4b94f03d-d738-409e-a0ac-b23304c3be02 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:00:47 np0005593234 nova_compute[227762]: 2026-01-23 10:00:47.398 227766 INFO nova.compute.manager [-] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:00:47 np0005593234 nova_compute[227762]: 2026-01-23 10:00:47.436 227766 DEBUG nova.compute.manager [None req-e66ef640-ba9f-4bf8-b961-fe01e9805d9f - - - - - -] [instance: 4b94f03d-d738-409e-a0ac-b23304c3be02] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:00:47 np0005593234 nova_compute[227762]: 2026-01-23 10:00:47.438 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:48.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:48.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:50.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:00:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:50.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:00:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:50 np0005593234 nova_compute[227762]: 2026-01-23 10:00:50.955 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:51 np0005593234 podman[270181]: 2026-01-23 10:00:51.767649257 +0000 UTC m=+0.055091821 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 23 05:00:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:52.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:52.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:52 np0005593234 nova_compute[227762]: 2026-01-23 10:00:52.440 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:54.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:54.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:00:55 np0005593234 nova_compute[227762]: 2026-01-23 10:00:55.956 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:56.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:00:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:56.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:00:57 np0005593234 nova_compute[227762]: 2026-01-23 10:00:57.443 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:00:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:58.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:00:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:00:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:58.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:00:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:00:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1283242855' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:00:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:00:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1283242855' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:01:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:00.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:00.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:00 np0005593234 nova_compute[227762]: 2026-01-23 10:01:00.958 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:01:01 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/752801133' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:01:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:01:01 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/752801133' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:01:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:02.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:01:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:02.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:01:02 np0005593234 nova_compute[227762]: 2026-01-23 10:01:02.446 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:03 np0005593234 podman[270242]: 2026-01-23 10:01:03.580382227 +0000 UTC m=+0.081082723 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:01:03 np0005593234 nova_compute[227762]: 2026-01-23 10:01:03.899 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:03 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:03.900 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=38, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=37) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:01:03 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:03.901 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:01:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:04.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:04.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:05 np0005593234 nova_compute[227762]: 2026-01-23 10:01:05.959 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:06.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:06.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:06.903 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '38'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:01:07 np0005593234 nova_compute[227762]: 2026-01-23 10:01:07.448 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:08.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:08.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:08 np0005593234 podman[270465]: 2026-01-23 10:01:08.862531829 +0000 UTC m=+0.081041412 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 23 05:01:08 np0005593234 podman[270465]: 2026-01-23 10:01:08.99220997 +0000 UTC m=+0.210719523 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2)
Jan 23 05:01:09 np0005593234 podman[270615]: 2026-01-23 10:01:09.540851596 +0000 UTC m=+0.053139640 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 05:01:09 np0005593234 podman[270615]: 2026-01-23 10:01:09.552883192 +0000 UTC m=+0.065171216 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 05:01:10 np0005593234 podman[270676]: 2026-01-23 10:01:10.068613981 +0000 UTC m=+0.372674542 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, release=1793, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2023-02-22T09:23:20, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, distribution-scope=public)
Jan 23 05:01:10 np0005593234 podman[270676]: 2026-01-23 10:01:10.079870562 +0000 UTC m=+0.383931103 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, io.openshift.expose-services=, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.28.2, vcs-type=git, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, distribution-scope=public, release=1793, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, name=keepalived)
Jan 23 05:01:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:10.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:10.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:10 np0005593234 nova_compute[227762]: 2026-01-23 10:01:10.961 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:01:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:01:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:12.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:12 np0005593234 nova_compute[227762]: 2026-01-23 10:01:12.451 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:12.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:14.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:14.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:15 np0005593234 nova_compute[227762]: 2026-01-23 10:01:15.963 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:16.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:16.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:17 np0005593234 nova_compute[227762]: 2026-01-23 10:01:17.453 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:18.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:18.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:20.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:01:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:20.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:20 np0005593234 nova_compute[227762]: 2026-01-23 10:01:20.965 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:22.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:22 np0005593234 nova_compute[227762]: 2026-01-23 10:01:22.455 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:22.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:22 np0005593234 podman[270897]: 2026-01-23 10:01:22.761280704 +0000 UTC m=+0.050469747 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:01:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:24.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:24.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:24 np0005593234 nova_compute[227762]: 2026-01-23 10:01:24.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:26 np0005593234 nova_compute[227762]: 2026-01-23 10:01:26.020 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:26.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:26.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:27 np0005593234 nova_compute[227762]: 2026-01-23 10:01:27.457 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:28.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:28.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:29 np0005593234 nova_compute[227762]: 2026-01-23 10:01:29.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:30.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:01:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:30.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:01:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:30 np0005593234 nova_compute[227762]: 2026-01-23 10:01:30.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:30 np0005593234 nova_compute[227762]: 2026-01-23 10:01:30.910 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:30 np0005593234 nova_compute[227762]: 2026-01-23 10:01:30.910 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:30 np0005593234 nova_compute[227762]: 2026-01-23 10:01:30.910 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:30 np0005593234 nova_compute[227762]: 2026-01-23 10:01:30.910 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:01:30 np0005593234 nova_compute[227762]: 2026-01-23 10:01:30.911 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:31 np0005593234 nova_compute[227762]: 2026-01-23 10:01:31.022 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:01:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1429210547' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:01:31 np0005593234 nova_compute[227762]: 2026-01-23 10:01:31.364 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:31 np0005593234 nova_compute[227762]: 2026-01-23 10:01:31.535 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:01:31 np0005593234 nova_compute[227762]: 2026-01-23 10:01:31.537 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4604MB free_disk=20.921844482421875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:01:31 np0005593234 nova_compute[227762]: 2026-01-23 10:01:31.537 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:31 np0005593234 nova_compute[227762]: 2026-01-23 10:01:31.537 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:31 np0005593234 nova_compute[227762]: 2026-01-23 10:01:31.686 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:01:31 np0005593234 nova_compute[227762]: 2026-01-23 10:01:31.686 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:01:31 np0005593234 nova_compute[227762]: 2026-01-23 10:01:31.715 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:01:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/658634027' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:01:32 np0005593234 nova_compute[227762]: 2026-01-23 10:01:32.174 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:32 np0005593234 nova_compute[227762]: 2026-01-23 10:01:32.179 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:01:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:32.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:32 np0005593234 nova_compute[227762]: 2026-01-23 10:01:32.331 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:01:32 np0005593234 nova_compute[227762]: 2026-01-23 10:01:32.472 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:32.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:32 np0005593234 nova_compute[227762]: 2026-01-23 10:01:32.756 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:01:32 np0005593234 nova_compute[227762]: 2026-01-23 10:01:32.757 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.220s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:33 np0005593234 nova_compute[227762]: 2026-01-23 10:01:33.757 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:33 np0005593234 nova_compute[227762]: 2026-01-23 10:01:33.758 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:01:33 np0005593234 nova_compute[227762]: 2026-01-23 10:01:33.758 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:01:33 np0005593234 podman[271019]: 2026-01-23 10:01:33.804321342 +0000 UTC m=+0.096173885 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:01:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:34.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:34 np0005593234 nova_compute[227762]: 2026-01-23 10:01:34.471 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:01:34 np0005593234 nova_compute[227762]: 2026-01-23 10:01:34.472 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:34 np0005593234 nova_compute[227762]: 2026-01-23 10:01:34.472 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:34 np0005593234 nova_compute[227762]: 2026-01-23 10:01:34.472 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:01:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:34.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:36 np0005593234 nova_compute[227762]: 2026-01-23 10:01:36.056 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:36.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:36.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:37 np0005593234 nova_compute[227762]: 2026-01-23 10:01:37.476 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:37 np0005593234 nova_compute[227762]: 2026-01-23 10:01:37.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:37 np0005593234 nova_compute[227762]: 2026-01-23 10:01:37.808 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:38.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:38.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:39 np0005593234 nova_compute[227762]: 2026-01-23 10:01:39.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:01:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:40.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:01:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:40.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:40 np0005593234 nova_compute[227762]: 2026-01-23 10:01:40.697 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "61f18fb1-66aa-4089-b98f-50b8a49800ff" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:40 np0005593234 nova_compute[227762]: 2026-01-23 10:01:40.697 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:40 np0005593234 nova_compute[227762]: 2026-01-23 10:01:40.741 227766 DEBUG nova.compute.manager [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:01:41 np0005593234 nova_compute[227762]: 2026-01-23 10:01:41.058 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:41 np0005593234 nova_compute[227762]: 2026-01-23 10:01:41.505 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:41 np0005593234 nova_compute[227762]: 2026-01-23 10:01:41.506 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:41 np0005593234 nova_compute[227762]: 2026-01-23 10:01:41.514 227766 DEBUG nova.virt.hardware [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:01:41 np0005593234 nova_compute[227762]: 2026-01-23 10:01:41.514 227766 INFO nova.compute.claims [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:01:41 np0005593234 nova_compute[227762]: 2026-01-23 10:01:41.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:01:41 np0005593234 nova_compute[227762]: 2026-01-23 10:01:41.791 227766 DEBUG oslo_concurrency.processutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:01:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3554528661' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:01:42 np0005593234 nova_compute[227762]: 2026-01-23 10:01:42.212 227766 DEBUG oslo_concurrency.processutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:42 np0005593234 nova_compute[227762]: 2026-01-23 10:01:42.218 227766 DEBUG nova.compute.provider_tree [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:01:42 np0005593234 nova_compute[227762]: 2026-01-23 10:01:42.258 227766 DEBUG nova.scheduler.client.report [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:01:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:42.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:42 np0005593234 nova_compute[227762]: 2026-01-23 10:01:42.479 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:01:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:42.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:01:42 np0005593234 nova_compute[227762]: 2026-01-23 10:01:42.518 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:42 np0005593234 nova_compute[227762]: 2026-01-23 10:01:42.519 227766 DEBUG nova.compute.manager [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:01:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:42.838 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:42.839 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:42.839 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:42 np0005593234 nova_compute[227762]: 2026-01-23 10:01:42.919 227766 DEBUG nova.compute.manager [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:01:42 np0005593234 nova_compute[227762]: 2026-01-23 10:01:42.919 227766 DEBUG nova.network.neutron [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:01:43 np0005593234 nova_compute[227762]: 2026-01-23 10:01:43.074 227766 INFO nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:01:43 np0005593234 nova_compute[227762]: 2026-01-23 10:01:43.651 227766 DEBUG nova.compute.manager [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:01:43 np0005593234 nova_compute[227762]: 2026-01-23 10:01:43.718 227766 DEBUG nova.policy [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c09e682996b940dc97c866f9e4f1e74e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0f5ca0233c1a490aa2d596b88a0ec503', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:01:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:44.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:01:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:44.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:01:44 np0005593234 nova_compute[227762]: 2026-01-23 10:01:44.878 227766 DEBUG nova.compute.manager [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:01:44 np0005593234 nova_compute[227762]: 2026-01-23 10:01:44.879 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:01:44 np0005593234 nova_compute[227762]: 2026-01-23 10:01:44.879 227766 INFO nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Creating image(s)#033[00m
Jan 23 05:01:44 np0005593234 nova_compute[227762]: 2026-01-23 10:01:44.900 227766 DEBUG nova.storage.rbd_utils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] rbd image 61f18fb1-66aa-4089-b98f-50b8a49800ff_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:44 np0005593234 nova_compute[227762]: 2026-01-23 10:01:44.919 227766 DEBUG nova.storage.rbd_utils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] rbd image 61f18fb1-66aa-4089-b98f-50b8a49800ff_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:44 np0005593234 nova_compute[227762]: 2026-01-23 10:01:44.942 227766 DEBUG nova.storage.rbd_utils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] rbd image 61f18fb1-66aa-4089-b98f-50b8a49800ff_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:44 np0005593234 nova_compute[227762]: 2026-01-23 10:01:44.947 227766 DEBUG oslo_concurrency.processutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:45 np0005593234 nova_compute[227762]: 2026-01-23 10:01:45.009 227766 DEBUG oslo_concurrency.processutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:45 np0005593234 nova_compute[227762]: 2026-01-23 10:01:45.010 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:45 np0005593234 nova_compute[227762]: 2026-01-23 10:01:45.011 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:45 np0005593234 nova_compute[227762]: 2026-01-23 10:01:45.011 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:45 np0005593234 nova_compute[227762]: 2026-01-23 10:01:45.032 227766 DEBUG nova.storage.rbd_utils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] rbd image 61f18fb1-66aa-4089-b98f-50b8a49800ff_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:45 np0005593234 nova_compute[227762]: 2026-01-23 10:01:45.036 227766 DEBUG oslo_concurrency.processutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 61f18fb1-66aa-4089-b98f-50b8a49800ff_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:45 np0005593234 nova_compute[227762]: 2026-01-23 10:01:45.373 227766 DEBUG oslo_concurrency.processutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 61f18fb1-66aa-4089-b98f-50b8a49800ff_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.337s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:45 np0005593234 nova_compute[227762]: 2026-01-23 10:01:45.458 227766 DEBUG nova.storage.rbd_utils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] resizing rbd image 61f18fb1-66aa-4089-b98f-50b8a49800ff_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:01:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:45 np0005593234 nova_compute[227762]: 2026-01-23 10:01:45.690 227766 DEBUG nova.objects.instance [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lazy-loading 'migration_context' on Instance uuid 61f18fb1-66aa-4089-b98f-50b8a49800ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:01:46 np0005593234 nova_compute[227762]: 2026-01-23 10:01:46.002 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:01:46 np0005593234 nova_compute[227762]: 2026-01-23 10:01:46.002 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Ensure instance console log exists: /var/lib/nova/instances/61f18fb1-66aa-4089-b98f-50b8a49800ff/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:01:46 np0005593234 nova_compute[227762]: 2026-01-23 10:01:46.003 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:46 np0005593234 nova_compute[227762]: 2026-01-23 10:01:46.003 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:46 np0005593234 nova_compute[227762]: 2026-01-23 10:01:46.003 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:46 np0005593234 nova_compute[227762]: 2026-01-23 10:01:46.059 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:46.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:01:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:46.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:01:47 np0005593234 nova_compute[227762]: 2026-01-23 10:01:47.482 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:47 np0005593234 nova_compute[227762]: 2026-01-23 10:01:47.720 227766 DEBUG nova.network.neutron [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Successfully created port: c9c463b9-3793-44a9-9773-69cc1638096d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:01:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:48.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:48.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:01:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:50.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:01:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:01:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:50.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:01:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:51 np0005593234 nova_compute[227762]: 2026-01-23 10:01:51.062 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:51 np0005593234 nova_compute[227762]: 2026-01-23 10:01:51.934 227766 DEBUG nova.network.neutron [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Successfully updated port: c9c463b9-3793-44a9-9773-69cc1638096d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:01:51 np0005593234 nova_compute[227762]: 2026-01-23 10:01:51.974 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "refresh_cache-61f18fb1-66aa-4089-b98f-50b8a49800ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:01:51 np0005593234 nova_compute[227762]: 2026-01-23 10:01:51.975 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquired lock "refresh_cache-61f18fb1-66aa-4089-b98f-50b8a49800ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:01:51 np0005593234 nova_compute[227762]: 2026-01-23 10:01:51.975 227766 DEBUG nova.network.neutron [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:01:52 np0005593234 nova_compute[227762]: 2026-01-23 10:01:52.077 227766 DEBUG nova.compute.manager [req-8e249af7-a321-4c09-b3e6-d723f1f0fc4a req-0cb5e0ff-f3a7-4f15-b2db-3cd15302c40e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received event network-changed-c9c463b9-3793-44a9-9773-69cc1638096d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:01:52 np0005593234 nova_compute[227762]: 2026-01-23 10:01:52.077 227766 DEBUG nova.compute.manager [req-8e249af7-a321-4c09-b3e6-d723f1f0fc4a req-0cb5e0ff-f3a7-4f15-b2db-3cd15302c40e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Refreshing instance network info cache due to event network-changed-c9c463b9-3793-44a9-9773-69cc1638096d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:01:52 np0005593234 nova_compute[227762]: 2026-01-23 10:01:52.078 227766 DEBUG oslo_concurrency.lockutils [req-8e249af7-a321-4c09-b3e6-d723f1f0fc4a req-0cb5e0ff-f3a7-4f15-b2db-3cd15302c40e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-61f18fb1-66aa-4089-b98f-50b8a49800ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:01:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:01:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:52.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:01:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:52 np0005593234 nova_compute[227762]: 2026-01-23 10:01:52.524 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:52.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:52 np0005593234 nova_compute[227762]: 2026-01-23 10:01:52.826 227766 DEBUG nova.network.neutron [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:01:53 np0005593234 podman[271295]: 2026-01-23 10:01:53.744028108 +0000 UTC m=+0.042327723 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:01:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:54.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:01:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.0 total, 600.0 interval#012Cumulative writes: 9906 writes, 50K keys, 9906 commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.03 MB/s#012Cumulative WAL: 9906 writes, 9906 syncs, 1.00 writes per sync, written: 0.10 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1571 writes, 7865 keys, 1571 commit groups, 1.0 writes per commit group, ingest: 15.78 MB, 0.03 MB/s#012Interval WAL: 1571 writes, 1571 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     50.4      1.22              0.17        30    0.041       0      0       0.0       0.0#012  L6      1/0    8.72 MB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   4.3    104.4     87.2      3.07              0.93        29    0.106    170K    16K       0.0       0.0#012 Sum      1/0    8.72 MB   0.0      0.3     0.1      0.3       0.3      0.1       0.0   5.3     74.7     76.7      4.29              1.11        59    0.073    170K    16K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.6     78.0     78.1      0.93              0.18        12    0.077     44K   3119       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) 
Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   0.0    104.4     87.2      3.07              0.93        29    0.106    170K    16K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     50.6      1.22              0.17        29    0.042       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 3600.0 total, 600.0 interval#012Flush(GB): cumulative 0.060, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.32 GB write, 0.09 MB/s write, 0.31 GB read, 0.09 MB/s read, 4.3 seconds#012Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.9 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f10fc171f0#2 capacity: 304.00 MB usage: 35.54 MB table_size: 0 occupancy: 18446744073709551615 collections: 7 last_copies: 0 last_secs: 0.000253 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(2055,34.23 MB,11.2603%) FilterBlock(59,489.55 KB,0.157261%) IndexBlock(59,846.81 KB,0.272028%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:01:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:01:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:54.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.634 227766 DEBUG nova.network.neutron [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Updating instance_info_cache with network_info: [{"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.841 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Releasing lock "refresh_cache-61f18fb1-66aa-4089-b98f-50b8a49800ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.842 227766 DEBUG nova.compute.manager [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Instance network_info: |[{"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.842 227766 DEBUG oslo_concurrency.lockutils [req-8e249af7-a321-4c09-b3e6-d723f1f0fc4a req-0cb5e0ff-f3a7-4f15-b2db-3cd15302c40e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-61f18fb1-66aa-4089-b98f-50b8a49800ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.842 227766 DEBUG nova.network.neutron [req-8e249af7-a321-4c09-b3e6-d723f1f0fc4a req-0cb5e0ff-f3a7-4f15-b2db-3cd15302c40e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Refreshing network info cache for port c9c463b9-3793-44a9-9773-69cc1638096d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.845 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Start _get_guest_xml network_info=[{"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.849 227766 WARNING nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.858 227766 DEBUG nova.virt.libvirt.host [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.858 227766 DEBUG nova.virt.libvirt.host [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.862 227766 DEBUG nova.virt.libvirt.host [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.863 227766 DEBUG nova.virt.libvirt.host [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.864 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.864 227766 DEBUG nova.virt.hardware [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.864 227766 DEBUG nova.virt.hardware [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.865 227766 DEBUG nova.virt.hardware [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.865 227766 DEBUG nova.virt.hardware [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.865 227766 DEBUG nova.virt.hardware [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.865 227766 DEBUG nova.virt.hardware [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.865 227766 DEBUG nova.virt.hardware [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.866 227766 DEBUG nova.virt.hardware [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.866 227766 DEBUG nova.virt.hardware [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.866 227766 DEBUG nova.virt.hardware [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.866 227766 DEBUG nova.virt.hardware [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:01:54 np0005593234 nova_compute[227762]: 2026-01-23 10:01:54.869 227766 DEBUG oslo_concurrency.processutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:01:55 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/989655409' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.305 227766 DEBUG oslo_concurrency.processutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.328 227766 DEBUG nova.storage.rbd_utils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] rbd image 61f18fb1-66aa-4089-b98f-50b8a49800ff_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.333 227766 DEBUG oslo_concurrency.processutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:01:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:01:55 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1643997811' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.800 227766 DEBUG oslo_concurrency.processutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.801 227766 DEBUG nova.virt.libvirt.vif [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:01:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1972148084',display_name='tempest-ListServerFiltersTestJSON-instance-1972148084',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1972148084',id=98,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0f5ca0233c1a490aa2d596b88a0ec503',ramdisk_id='',reservation_id='r-l9i0gz1m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1524131674',owner_user_name='tempe
st-ListServerFiltersTestJSON-1524131674-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:01:44Z,user_data=None,user_id='c09e682996b940dc97c866f9e4f1e74e',uuid=61f18fb1-66aa-4089-b98f-50b8a49800ff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.802 227766 DEBUG nova.network.os_vif_util [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converting VIF {"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.803 227766 DEBUG nova.network.os_vif_util [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:e7,bridge_name='br-int',has_traffic_filtering=True,id=c9c463b9-3793-44a9-9773-69cc1638096d,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9c463b9-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.804 227766 DEBUG nova.objects.instance [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lazy-loading 'pci_devices' on Instance uuid 61f18fb1-66aa-4089-b98f-50b8a49800ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.822 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  <uuid>61f18fb1-66aa-4089-b98f-50b8a49800ff</uuid>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  <name>instance-00000062</name>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1972148084</nova:name>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:01:54</nova:creationTime>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <nova:user uuid="c09e682996b940dc97c866f9e4f1e74e">tempest-ListServerFiltersTestJSON-1524131674-project-member</nova:user>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <nova:project uuid="0f5ca0233c1a490aa2d596b88a0ec503">tempest-ListServerFiltersTestJSON-1524131674</nova:project>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <nova:port uuid="c9c463b9-3793-44a9-9773-69cc1638096d">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <entry name="serial">61f18fb1-66aa-4089-b98f-50b8a49800ff</entry>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <entry name="uuid">61f18fb1-66aa-4089-b98f-50b8a49800ff</entry>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/61f18fb1-66aa-4089-b98f-50b8a49800ff_disk">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/61f18fb1-66aa-4089-b98f-50b8a49800ff_disk.config">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:2d:54:e7"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <target dev="tapc9c463b9-37"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/61f18fb1-66aa-4089-b98f-50b8a49800ff/console.log" append="off"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:01:55 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:01:55 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:01:55 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:01:55 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.824 227766 DEBUG nova.compute.manager [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Preparing to wait for external event network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.824 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.824 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.824 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.825 227766 DEBUG nova.virt.libvirt.vif [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:01:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1972148084',display_name='tempest-ListServerFiltersTestJSON-instance-1972148084',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1972148084',id=98,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0f5ca0233c1a490aa2d596b88a0ec503',ramdisk_id='',reservation_id='r-l9i0gz1m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ListServerFiltersTestJSON-1524131674',owner_user_name='tempest-ListServerFiltersTestJSON-1524131674-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:01:44Z,user_data=None,user_id='c09e682996b940dc97c866f9e4f1e74e',uuid=61f18fb1-66aa-4089-b98f-50b8a49800ff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.825 227766 DEBUG nova.network.os_vif_util [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converting VIF {"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.826 227766 DEBUG nova.network.os_vif_util [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:e7,bridge_name='br-int',has_traffic_filtering=True,id=c9c463b9-3793-44a9-9773-69cc1638096d,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9c463b9-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.826 227766 DEBUG os_vif [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:e7,bridge_name='br-int',has_traffic_filtering=True,id=c9c463b9-3793-44a9-9773-69cc1638096d,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9c463b9-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.827 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.827 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.827 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.830 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.831 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc9c463b9-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.831 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc9c463b9-37, col_values=(('external_ids', {'iface-id': 'c9c463b9-3793-44a9-9773-69cc1638096d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:54:e7', 'vm-uuid': '61f18fb1-66aa-4089-b98f-50b8a49800ff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.832 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:55 np0005593234 NetworkManager[48942]: <info>  [1769162515.8334] manager: (tapc9c463b9-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/180)
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.836 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.839 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.840 227766 INFO os_vif [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:e7,bridge_name='br-int',has_traffic_filtering=True,id=c9c463b9-3793-44a9-9773-69cc1638096d,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9c463b9-37')#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.906 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.907 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.907 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] No VIF found with MAC fa:16:3e:2d:54:e7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.907 227766 INFO nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Using config drive#033[00m
Jan 23 05:01:55 np0005593234 nova_compute[227762]: 2026-01-23 10:01:55.932 227766 DEBUG nova.storage.rbd_utils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] rbd image 61f18fb1-66aa-4089-b98f-50b8a49800ff_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:56 np0005593234 nova_compute[227762]: 2026-01-23 10:01:56.108 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:56.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:56.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:58 np0005593234 nova_compute[227762]: 2026-01-23 10:01:58.124 227766 INFO nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Creating config drive at /var/lib/nova/instances/61f18fb1-66aa-4089-b98f-50b8a49800ff/disk.config#033[00m
Jan 23 05:01:58 np0005593234 nova_compute[227762]: 2026-01-23 10:01:58.129 227766 DEBUG oslo_concurrency.processutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/61f18fb1-66aa-4089-b98f-50b8a49800ff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppanmkgi3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:58 np0005593234 nova_compute[227762]: 2026-01-23 10:01:58.256 227766 DEBUG oslo_concurrency.processutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/61f18fb1-66aa-4089-b98f-50b8a49800ff/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppanmkgi3" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:58.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:58 np0005593234 nova_compute[227762]: 2026-01-23 10:01:58.284 227766 DEBUG nova.storage.rbd_utils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] rbd image 61f18fb1-66aa-4089-b98f-50b8a49800ff_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:01:58 np0005593234 nova_compute[227762]: 2026-01-23 10:01:58.287 227766 DEBUG oslo_concurrency.processutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/61f18fb1-66aa-4089-b98f-50b8a49800ff/disk.config 61f18fb1-66aa-4089-b98f-50b8a49800ff_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:01:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:01:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:01:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:58.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:01:58 np0005593234 nova_compute[227762]: 2026-01-23 10:01:58.867 227766 DEBUG oslo_concurrency.processutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/61f18fb1-66aa-4089-b98f-50b8a49800ff/disk.config 61f18fb1-66aa-4089-b98f-50b8a49800ff_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.580s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:01:58 np0005593234 nova_compute[227762]: 2026-01-23 10:01:58.869 227766 INFO nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Deleting local config drive /var/lib/nova/instances/61f18fb1-66aa-4089-b98f-50b8a49800ff/disk.config because it was imported into RBD.#033[00m
Jan 23 05:01:58 np0005593234 kernel: tapc9c463b9-37: entered promiscuous mode
Jan 23 05:01:58 np0005593234 ovn_controller[134547]: 2026-01-23T10:01:58Z|00351|binding|INFO|Claiming lport c9c463b9-3793-44a9-9773-69cc1638096d for this chassis.
Jan 23 05:01:58 np0005593234 ovn_controller[134547]: 2026-01-23T10:01:58Z|00352|binding|INFO|c9c463b9-3793-44a9-9773-69cc1638096d: Claiming fa:16:3e:2d:54:e7 10.100.0.12
Jan 23 05:01:58 np0005593234 NetworkManager[48942]: <info>  [1769162518.9205] manager: (tapc9c463b9-37): new Tun device (/org/freedesktop/NetworkManager/Devices/181)
Jan 23 05:01:58 np0005593234 nova_compute[227762]: 2026-01-23 10:01:58.920 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:58.931 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:54:e7 10.100.0.12'], port_security=['fa:16:3e:2d:54:e7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '61f18fb1-66aa-4089-b98f-50b8a49800ff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0f5ca0233c1a490aa2d596b88a0ec503', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ad8a7362-692a-4044-8393-1c10014f8bab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=83406af9-ea42-4cda-96ee-b8c04ab0651a, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=c9c463b9-3793-44a9-9773-69cc1638096d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:01:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:58.933 144381 INFO neutron.agent.ovn.metadata.agent [-] Port c9c463b9-3793-44a9-9773-69cc1638096d in datapath 969bd83a-7542-46e3-90f0-1a81f26ba6b8 bound to our chassis#033[00m
Jan 23 05:01:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:58.934 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 969bd83a-7542-46e3-90f0-1a81f26ba6b8#033[00m
Jan 23 05:01:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:58.946 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[669470a9-3741-4888-9b53-22aa69b8ba63]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:58.947 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap969bd83a-71 in ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:01:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:58.949 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap969bd83a-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:01:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:58.949 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b0dd4355-1ff7-41ed-a0ef-342ccafc1342]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:58.949 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1e6da172-13f6-4684-8c6b-67cc27bb2860]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:58 np0005593234 systemd-udevd[271454]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:01:58 np0005593234 systemd-machined[195626]: New machine qemu-40-instance-00000062.
Jan 23 05:01:58 np0005593234 NetworkManager[48942]: <info>  [1769162518.9636] device (tapc9c463b9-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:01:58 np0005593234 NetworkManager[48942]: <info>  [1769162518.9643] device (tapc9c463b9-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:01:58 np0005593234 nova_compute[227762]: 2026-01-23 10:01:58.963 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:58 np0005593234 systemd[1]: Started Virtual Machine qemu-40-instance-00000062.
Jan 23 05:01:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:58.963 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[c70541ae-6fbc-4adf-9be3-51dd8c54beb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:58 np0005593234 nova_compute[227762]: 2026-01-23 10:01:58.969 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:58 np0005593234 ovn_controller[134547]: 2026-01-23T10:01:58Z|00353|binding|INFO|Setting lport c9c463b9-3793-44a9-9773-69cc1638096d ovn-installed in OVS
Jan 23 05:01:58 np0005593234 ovn_controller[134547]: 2026-01-23T10:01:58Z|00354|binding|INFO|Setting lport c9c463b9-3793-44a9-9773-69cc1638096d up in Southbound
Jan 23 05:01:58 np0005593234 nova_compute[227762]: 2026-01-23 10:01:58.972 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:58.989 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5a140c29-2e3c-4393-8fc6-40d9f647df54]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.021 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[c9192adf-ca14-4c00-86a5-5f8d416b9c9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:59 np0005593234 systemd-udevd[271457]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:01:59 np0005593234 NetworkManager[48942]: <info>  [1769162519.0276] manager: (tap969bd83a-70): new Veth device (/org/freedesktop/NetworkManager/Devices/182)
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.026 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ee967e02-7f14-4468-85be-93ff67322c47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.059 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[81e79788-5d94-4e4f-a09b-ea631e9ecf03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.062 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[677b85b2-2566-4fad-b024-7523d498202d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:59 np0005593234 NetworkManager[48942]: <info>  [1769162519.0842] device (tap969bd83a-70): carrier: link connected
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.089 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ad442f91-b03d-483f-8f46-f13f8da8b17e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.108 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[95382515-f3d0-43a7-b1f8-30337fef2cf5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap969bd83a-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:fe:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641404, 'reachable_time': 30658, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 271486, 'error': None, 'target': 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.124 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[af00564c-0ef4-4292-b3f5-176f2e8b3497]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:fef5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 641404, 'tstamp': 641404}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 271487, 'error': None, 'target': 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.139 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b44a496f-6d88-4ac7-9528-0b39aa724a1c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap969bd83a-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:fe:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 112], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641404, 'reachable_time': 30658, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 271488, 'error': None, 'target': 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.166 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[59c6af8c-7091-4795-b0ef-9f1ec7f89dcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.221 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cee0ee07-87ec-45c2-8d66-01200801d33f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.223 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap969bd83a-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.223 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.224 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap969bd83a-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:01:59 np0005593234 kernel: tap969bd83a-70: entered promiscuous mode
Jan 23 05:01:59 np0005593234 nova_compute[227762]: 2026-01-23 10:01:59.225 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:59 np0005593234 NetworkManager[48942]: <info>  [1769162519.2264] manager: (tap969bd83a-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/183)
Jan 23 05:01:59 np0005593234 nova_compute[227762]: 2026-01-23 10:01:59.227 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.229 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap969bd83a-70, col_values=(('external_ids', {'iface-id': '9ee89271-3ee7-4672-8800-56bb900c4dd0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:01:59 np0005593234 nova_compute[227762]: 2026-01-23 10:01:59.231 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:59 np0005593234 ovn_controller[134547]: 2026-01-23T10:01:59Z|00355|binding|INFO|Releasing lport 9ee89271-3ee7-4672-8800-56bb900c4dd0 from this chassis (sb_readonly=0)
Jan 23 05:01:59 np0005593234 nova_compute[227762]: 2026-01-23 10:01:59.232 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.233 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/969bd83a-7542-46e3-90f0-1a81f26ba6b8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/969bd83a-7542-46e3-90f0-1a81f26ba6b8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.234 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2683a2e7-f9d3-46c6-968a-2ee44a911b1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.235 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-969bd83a-7542-46e3-90f0-1a81f26ba6b8
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/969bd83a-7542-46e3-90f0-1a81f26ba6b8.pid.haproxy
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 969bd83a-7542-46e3-90f0-1a81f26ba6b8
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:01:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:01:59.236 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'env', 'PROCESS_TAG=haproxy-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/969bd83a-7542-46e3-90f0-1a81f26ba6b8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:01:59 np0005593234 nova_compute[227762]: 2026-01-23 10:01:59.244 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:01:59 np0005593234 podman[271520]: 2026-01-23 10:01:59.607097695 +0000 UTC m=+0.050609422 container create 40376edbe1928d9826307689371013c8eb89933ae0668da75f5d4b43a94855b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:01:59 np0005593234 systemd[1]: Started libpod-conmon-40376edbe1928d9826307689371013c8eb89933ae0668da75f5d4b43a94855b4.scope.
Jan 23 05:01:59 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:01:59 np0005593234 podman[271520]: 2026-01-23 10:01:59.580950369 +0000 UTC m=+0.024462126 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:01:59 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0ad228d559a70f5aacc206e93bc348d270b05fac9b23ab9c24a2429eb5a23cd3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:01:59 np0005593234 podman[271520]: 2026-01-23 10:01:59.698002104 +0000 UTC m=+0.141513851 container init 40376edbe1928d9826307689371013c8eb89933ae0668da75f5d4b43a94855b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 23 05:01:59 np0005593234 podman[271520]: 2026-01-23 10:01:59.704171937 +0000 UTC m=+0.147683664 container start 40376edbe1928d9826307689371013c8eb89933ae0668da75f5d4b43a94855b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 05:01:59 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[271551]: [NOTICE]   (271575) : New worker (271578) forked
Jan 23 05:01:59 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[271551]: [NOTICE]   (271575) : Loading success.
Jan 23 05:01:59 np0005593234 nova_compute[227762]: 2026-01-23 10:01:59.825 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162519.824687, 61f18fb1-66aa-4089-b98f-50b8a49800ff => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:01:59 np0005593234 nova_compute[227762]: 2026-01-23 10:01:59.825 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] VM Started (Lifecycle Event)#033[00m
Jan 23 05:01:59 np0005593234 nova_compute[227762]: 2026-01-23 10:01:59.856 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:01:59 np0005593234 nova_compute[227762]: 2026-01-23 10:01:59.861 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162519.82525, 61f18fb1-66aa-4089-b98f-50b8a49800ff => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:01:59 np0005593234 nova_compute[227762]: 2026-01-23 10:01:59.861 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:01:59 np0005593234 nova_compute[227762]: 2026-01-23 10:01:59.890 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:01:59 np0005593234 nova_compute[227762]: 2026-01-23 10:01:59.893 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:01:59 np0005593234 nova_compute[227762]: 2026-01-23 10:01:59.931 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.114 227766 DEBUG nova.network.neutron [req-8e249af7-a321-4c09-b3e6-d723f1f0fc4a req-0cb5e0ff-f3a7-4f15-b2db-3cd15302c40e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Updated VIF entry in instance network info cache for port c9c463b9-3793-44a9-9773-69cc1638096d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.115 227766 DEBUG nova.network.neutron [req-8e249af7-a321-4c09-b3e6-d723f1f0fc4a req-0cb5e0ff-f3a7-4f15-b2db-3cd15302c40e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Updating instance_info_cache with network_info: [{"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.148 227766 DEBUG oslo_concurrency.lockutils [req-8e249af7-a321-4c09-b3e6-d723f1f0fc4a req-0cb5e0ff-f3a7-4f15-b2db-3cd15302c40e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-61f18fb1-66aa-4089-b98f-50b8a49800ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:02:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:00.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:00.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.564 227766 DEBUG nova.compute.manager [req-a94c4366-738b-4c6b-844c-ba86e1f98de3 req-28f0d2e2-92a2-4df5-aceb-d04f450ed553 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received event network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.565 227766 DEBUG oslo_concurrency.lockutils [req-a94c4366-738b-4c6b-844c-ba86e1f98de3 req-28f0d2e2-92a2-4df5-aceb-d04f450ed553 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.565 227766 DEBUG oslo_concurrency.lockutils [req-a94c4366-738b-4c6b-844c-ba86e1f98de3 req-28f0d2e2-92a2-4df5-aceb-d04f450ed553 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.566 227766 DEBUG oslo_concurrency.lockutils [req-a94c4366-738b-4c6b-844c-ba86e1f98de3 req-28f0d2e2-92a2-4df5-aceb-d04f450ed553 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.566 227766 DEBUG nova.compute.manager [req-a94c4366-738b-4c6b-844c-ba86e1f98de3 req-28f0d2e2-92a2-4df5-aceb-d04f450ed553 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Processing event network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.567 227766 DEBUG nova.compute.manager [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.572 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162520.5719602, 61f18fb1-66aa-4089-b98f-50b8a49800ff => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.572 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.574 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.578 227766 INFO nova.virt.libvirt.driver [-] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Instance spawned successfully.#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.578 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.606 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.611 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.615 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.616 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.617 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.617 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.618 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.618 227766 DEBUG nova.virt.libvirt.driver [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.666 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.745 227766 INFO nova.compute.manager [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Took 15.87 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.746 227766 DEBUG nova.compute.manager [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.833 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.867 227766 INFO nova.compute.manager [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Took 19.42 seconds to build instance.#033[00m
Jan 23 05:02:00 np0005593234 nova_compute[227762]: 2026-01-23 10:02:00.901 227766 DEBUG oslo_concurrency.lockutils [None req-aecbdd07-f7a6-4dcc-9e4b-71f89a73c4ac c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.203s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:01 np0005593234 nova_compute[227762]: 2026-01-23 10:02:01.110 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:01 np0005593234 nova_compute[227762]: 2026-01-23 10:02:01.803 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:01.805 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=39, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=38) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:02:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:01.807 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:02:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:01.808 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '39'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:02.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:02.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:02 np0005593234 nova_compute[227762]: 2026-01-23 10:02:02.747 227766 DEBUG nova.compute.manager [req-52a29137-62c9-47c9-8591-8b918578106f req-2c2ae42b-5db4-45bd-b807-e75249febb00 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received event network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:02 np0005593234 nova_compute[227762]: 2026-01-23 10:02:02.747 227766 DEBUG oslo_concurrency.lockutils [req-52a29137-62c9-47c9-8591-8b918578106f req-2c2ae42b-5db4-45bd-b807-e75249febb00 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:02 np0005593234 nova_compute[227762]: 2026-01-23 10:02:02.747 227766 DEBUG oslo_concurrency.lockutils [req-52a29137-62c9-47c9-8591-8b918578106f req-2c2ae42b-5db4-45bd-b807-e75249febb00 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:02 np0005593234 nova_compute[227762]: 2026-01-23 10:02:02.748 227766 DEBUG oslo_concurrency.lockutils [req-52a29137-62c9-47c9-8591-8b918578106f req-2c2ae42b-5db4-45bd-b807-e75249febb00 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:02 np0005593234 nova_compute[227762]: 2026-01-23 10:02:02.748 227766 DEBUG nova.compute.manager [req-52a29137-62c9-47c9-8591-8b918578106f req-2c2ae42b-5db4-45bd-b807-e75249febb00 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] No waiting events found dispatching network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:02:02 np0005593234 nova_compute[227762]: 2026-01-23 10:02:02.748 227766 WARNING nova.compute.manager [req-52a29137-62c9-47c9-8591-8b918578106f req-2c2ae42b-5db4-45bd-b807-e75249febb00 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received unexpected event network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d for instance with vm_state active and task_state None.#033[00m
Jan 23 05:02:04 np0005593234 podman[271619]: 2026-01-23 10:02:04.023337682 +0000 UTC m=+0.087625458 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:02:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:04.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:04.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:05 np0005593234 nova_compute[227762]: 2026-01-23 10:02:05.837 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:06 np0005593234 nova_compute[227762]: 2026-01-23 10:02:06.113 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:06.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:06.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:08.291 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:08.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:10.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:10.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:10 np0005593234 nova_compute[227762]: 2026-01-23 10:02:10.840 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:11 np0005593234 nova_compute[227762]: 2026-01-23 10:02:11.113 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:02:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:12.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:02:12 np0005593234 nova_compute[227762]: 2026-01-23 10:02:12.487 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:12 np0005593234 nova_compute[227762]: 2026-01-23 10:02:12.487 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:12 np0005593234 nova_compute[227762]: 2026-01-23 10:02:12.527 227766 DEBUG nova.compute.manager [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:02:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:02:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:12.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:02:12 np0005593234 nova_compute[227762]: 2026-01-23 10:02:12.677 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:12 np0005593234 nova_compute[227762]: 2026-01-23 10:02:12.677 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:12 np0005593234 nova_compute[227762]: 2026-01-23 10:02:12.686 227766 DEBUG nova.virt.hardware [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:02:12 np0005593234 nova_compute[227762]: 2026-01-23 10:02:12.687 227766 INFO nova.compute.claims [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:02:12 np0005593234 nova_compute[227762]: 2026-01-23 10:02:12.927 227766 DEBUG oslo_concurrency.processutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.193 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.194 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.232 227766 DEBUG nova.compute.manager [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:02:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:02:13 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1048278726' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.421 227766 DEBUG oslo_concurrency.processutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.426 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.429 227766 DEBUG nova.compute.provider_tree [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.451 227766 DEBUG nova.scheduler.client.report [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.528 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.851s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.529 227766 DEBUG nova.compute.manager [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.531 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.540 227766 DEBUG nova.virt.hardware [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.541 227766 INFO nova.compute.claims [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Claim successful on node compute-2.ctlplane.example.com
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.661 227766 DEBUG nova.compute.manager [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.661 227766 DEBUG nova.network.neutron [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.728 227766 INFO nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.781 227766 DEBUG nova.compute.manager [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 05:02:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:13Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:2d:54:e7 10.100.0.12
Jan 23 05:02:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:13Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:54:e7 10.100.0.12
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.970 227766 DEBUG oslo_concurrency.processutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.994 227766 DEBUG nova.compute.manager [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.996 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:02:13 np0005593234 nova_compute[227762]: 2026-01-23 10:02:13.996 227766 INFO nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Creating image(s)
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.029 227766 DEBUG nova.storage.rbd_utils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.057 227766 DEBUG nova.storage.rbd_utils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.082 227766 DEBUG nova.storage.rbd_utils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.086 227766 DEBUG oslo_concurrency.processutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.153 227766 DEBUG oslo_concurrency.processutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.154 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.155 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.155 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.185 227766 DEBUG nova.storage.rbd_utils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.190 227766 DEBUG oslo_concurrency.processutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 f4412d8b-963a-4c3a-accf-68f8cf82c864_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.215 227766 DEBUG nova.policy [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '29710db389c842df836944048225740f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8c16cd713fa74a88b43e4edf01c273bd', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 05:02:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:14.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:02:14 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2938247568' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.416 227766 DEBUG oslo_concurrency.processutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.427 227766 DEBUG nova.compute.provider_tree [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.491 227766 DEBUG nova.scheduler.client.report [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.538 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.538 227766 DEBUG nova.compute.manager [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.553 227766 DEBUG oslo_concurrency.processutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 f4412d8b-963a-4c3a-accf-68f8cf82c864_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.364s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:02:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:14.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.631 227766 DEBUG nova.storage.rbd_utils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] resizing rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.698 227766 DEBUG nova.compute.manager [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.698 227766 DEBUG nova.network.neutron [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.739 227766 DEBUG nova.objects.instance [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'migration_context' on Instance uuid f4412d8b-963a-4c3a-accf-68f8cf82c864 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.766 227766 INFO nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.769 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.769 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Ensure instance console log exists: /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.769 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.770 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.770 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.810 227766 DEBUG nova.compute.manager [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 05:02:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.956 227766 DEBUG nova.compute.manager [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.957 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.958 227766 INFO nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Creating image(s)
Jan 23 05:02:14 np0005593234 nova_compute[227762]: 2026-01-23 10:02:14.980 227766 DEBUG nova.storage.rbd_utils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.009 227766 DEBUG nova.storage.rbd_utils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.036 227766 DEBUG nova.storage.rbd_utils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.041 227766 DEBUG oslo_concurrency.processutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.101 227766 DEBUG oslo_concurrency.processutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.102 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.103 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.103 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.127 227766 DEBUG nova.storage.rbd_utils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.130 227766 DEBUG oslo_concurrency.processutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.156 227766 DEBUG nova.policy [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9d4a5c201efa4992a9ef57d8abdc1675', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.420 227766 DEBUG oslo_concurrency.processutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.291s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.506 227766 DEBUG nova.storage.rbd_utils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] resizing rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 05:02:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.604 227766 DEBUG nova.objects.instance [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'migration_context' on Instance uuid b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.623 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.623 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Ensure instance console log exists: /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.624 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.624 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.625 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.733 227766 DEBUG nova.network.neutron [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Successfully created port: e14a3386-c770-46be-bafc-0418fa8274a7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 05:02:15 np0005593234 nova_compute[227762]: 2026-01-23 10:02:15.844 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:02:16 np0005593234 nova_compute[227762]: 2026-01-23 10:02:16.132 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:02:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:16.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:16.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:16 np0005593234 nova_compute[227762]: 2026-01-23 10:02:16.922 227766 DEBUG nova.network.neutron [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Successfully created port: 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 05:02:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:18.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:18.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.204 227766 DEBUG nova.network.neutron [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Successfully updated port: e14a3386-c770-46be-bafc-0418fa8274a7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.228 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "refresh_cache-f4412d8b-963a-4c3a-accf-68f8cf82c864" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.228 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquired lock "refresh_cache-f4412d8b-963a-4c3a-accf-68f8cf82c864" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.228 227766 DEBUG nova.network.neutron [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.393 227766 DEBUG nova.compute.manager [req-eb511374-b6d8-46d4-9136-6978c8a88857 req-1eca9bd5-698b-449e-8284-45f59c04f835 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received event network-changed-e14a3386-c770-46be-bafc-0418fa8274a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.394 227766 DEBUG nova.compute.manager [req-eb511374-b6d8-46d4-9136-6978c8a88857 req-1eca9bd5-698b-449e-8284-45f59c04f835 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Refreshing instance network info cache due to event network-changed-e14a3386-c770-46be-bafc-0418fa8274a7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.394 227766 DEBUG oslo_concurrency.lockutils [req-eb511374-b6d8-46d4-9136-6978c8a88857 req-1eca9bd5-698b-449e-8284-45f59c04f835 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-f4412d8b-963a-4c3a-accf-68f8cf82c864" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.611 227766 DEBUG nova.network.neutron [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.757 227766 DEBUG nova.network.neutron [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Successfully updated port: 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.795 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "refresh_cache-b0b5a1b2-04bd-48e0-a0f7-0c679d784e04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.796 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquired lock "refresh_cache-b0b5a1b2-04bd-48e0-a0f7-0c679d784e04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.796 227766 DEBUG nova.network.neutron [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.967 227766 DEBUG nova.compute.manager [req-da18b09f-048a-4380-83ec-2ce2debad68e req-2cefeaeb-abc0-4572-94fc-b48758acbbe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received event network-changed-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.967 227766 DEBUG nova.compute.manager [req-da18b09f-048a-4380-83ec-2ce2debad68e req-2cefeaeb-abc0-4572-94fc-b48758acbbe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Refreshing instance network info cache due to event network-changed-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:02:19 np0005593234 nova_compute[227762]: 2026-01-23 10:02:19.968 227766 DEBUG oslo_concurrency.lockutils [req-da18b09f-048a-4380-83ec-2ce2debad68e req-2cefeaeb-abc0-4572-94fc-b48758acbbe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-b0b5a1b2-04bd-48e0-a0f7-0c679d784e04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:02:20 np0005593234 nova_compute[227762]: 2026-01-23 10:02:20.159 227766 DEBUG nova.network.neutron [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:02:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:20.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:20.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:20 np0005593234 nova_compute[227762]: 2026-01-23 10:02:20.849 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:21 np0005593234 nova_compute[227762]: 2026-01-23 10:02:21.133 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:22.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.374 227766 DEBUG nova.network.neutron [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Updating instance_info_cache with network_info: [{"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.492 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Releasing lock "refresh_cache-f4412d8b-963a-4c3a-accf-68f8cf82c864" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.493 227766 DEBUG nova.compute.manager [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Instance network_info: |[{"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.493 227766 DEBUG oslo_concurrency.lockutils [req-eb511374-b6d8-46d4-9136-6978c8a88857 req-1eca9bd5-698b-449e-8284-45f59c04f835 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-f4412d8b-963a-4c3a-accf-68f8cf82c864" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.494 227766 DEBUG nova.network.neutron [req-eb511374-b6d8-46d4-9136-6978c8a88857 req-1eca9bd5-698b-449e-8284-45f59c04f835 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Refreshing network info cache for port e14a3386-c770-46be-bafc-0418fa8274a7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.496 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Start _get_guest_xml network_info=[{"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.500 227766 WARNING nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.513 227766 DEBUG nova.virt.libvirt.host [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.514 227766 DEBUG nova.virt.libvirt.host [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.522 227766 DEBUG nova.virt.libvirt.host [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.523 227766 DEBUG nova.virt.libvirt.host [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.525 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.526 227766 DEBUG nova.virt.hardware [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.526 227766 DEBUG nova.virt.hardware [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.526 227766 DEBUG nova.virt.hardware [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.526 227766 DEBUG nova.virt.hardware [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.527 227766 DEBUG nova.virt.hardware [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.527 227766 DEBUG nova.virt.hardware [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.527 227766 DEBUG nova.virt.hardware [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.527 227766 DEBUG nova.virt.hardware [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.528 227766 DEBUG nova.virt.hardware [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.528 227766 DEBUG nova.virt.hardware [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.528 227766 DEBUG nova.virt.hardware [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.533 227766 DEBUG oslo_concurrency.processutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:22.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:02:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1928101532' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.969 227766 DEBUG oslo_concurrency.processutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:22 np0005593234 nova_compute[227762]: 2026-01-23 10:02:22.996 227766 DEBUG nova.storage.rbd_utils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.000 227766 DEBUG oslo_concurrency.processutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.132 227766 DEBUG nova.network.neutron [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Updating instance_info_cache with network_info: [{"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.187 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Releasing lock "refresh_cache-b0b5a1b2-04bd-48e0-a0f7-0c679d784e04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.188 227766 DEBUG nova.compute.manager [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Instance network_info: |[{"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.188 227766 DEBUG oslo_concurrency.lockutils [req-da18b09f-048a-4380-83ec-2ce2debad68e req-2cefeaeb-abc0-4572-94fc-b48758acbbe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-b0b5a1b2-04bd-48e0-a0f7-0c679d784e04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.189 227766 DEBUG nova.network.neutron [req-da18b09f-048a-4380-83ec-2ce2debad68e req-2cefeaeb-abc0-4572-94fc-b48758acbbe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Refreshing network info cache for port 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.192 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Start _get_guest_xml network_info=[{"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.196 227766 WARNING nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.203 227766 DEBUG nova.virt.libvirt.host [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.203 227766 DEBUG nova.virt.libvirt.host [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.209 227766 DEBUG nova.virt.libvirt.host [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.209 227766 DEBUG nova.virt.libvirt.host [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.210 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.210 227766 DEBUG nova.virt.hardware [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.211 227766 DEBUG nova.virt.hardware [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.211 227766 DEBUG nova.virt.hardware [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.211 227766 DEBUG nova.virt.hardware [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.211 227766 DEBUG nova.virt.hardware [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.212 227766 DEBUG nova.virt.hardware [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.212 227766 DEBUG nova.virt.hardware [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.212 227766 DEBUG nova.virt.hardware [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.212 227766 DEBUG nova.virt.hardware [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.212 227766 DEBUG nova.virt.hardware [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.213 227766 DEBUG nova.virt.hardware [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.216 227766 DEBUG oslo_concurrency.processutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:02:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3459594829' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.465 227766 DEBUG oslo_concurrency.processutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.466 227766 DEBUG nova.virt.libvirt.vif [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:02:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-191757669',display_name='tempest-tempest.common.compute-instance-191757669',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-191757669',id=101,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8c16cd713fa74a88b43e4edf01c273bd',ramdisk_id='',reservation_id='r-lj8pfndu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-882763067',owner_user_name='tempest-ServerActionsTestOtherA-882763067-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:02:13Z,user_data=None,user_id='29710db389c842df836944048225740f',uuid=f4412d8b-963a-4c3a-accf-68f8cf82c864,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.467 227766 DEBUG nova.network.os_vif_util [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converting VIF {"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.468 227766 DEBUG nova.network.os_vif_util [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:95:7c,bridge_name='br-int',has_traffic_filtering=True,id=e14a3386-c770-46be-bafc-0418fa8274a7,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14a3386-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.469 227766 DEBUG nova.objects.instance [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'pci_devices' on Instance uuid f4412d8b-963a-4c3a-accf-68f8cf82c864 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.530 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  <uuid>f4412d8b-963a-4c3a-accf-68f8cf82c864</uuid>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  <name>instance-00000065</name>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <nova:name>tempest-tempest.common.compute-instance-191757669</nova:name>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:02:22</nova:creationTime>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <nova:user uuid="29710db389c842df836944048225740f">tempest-ServerActionsTestOtherA-882763067-project-member</nova:user>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <nova:project uuid="8c16cd713fa74a88b43e4edf01c273bd">tempest-ServerActionsTestOtherA-882763067</nova:project>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <nova:port uuid="e14a3386-c770-46be-bafc-0418fa8274a7">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <entry name="serial">f4412d8b-963a-4c3a-accf-68f8cf82c864</entry>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <entry name="uuid">f4412d8b-963a-4c3a-accf-68f8cf82c864</entry>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/f4412d8b-963a-4c3a-accf-68f8cf82c864_disk">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/f4412d8b-963a-4c3a-accf-68f8cf82c864_disk.config">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:b2:95:7c"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <target dev="tape14a3386-c7"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/console.log" append="off"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:02:23 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:02:23 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:02:23 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:02:23 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.532 227766 DEBUG nova.compute.manager [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Preparing to wait for external event network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.533 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.533 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.533 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.534 227766 DEBUG nova.virt.libvirt.vif [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:02:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-191757669',display_name='tempest-tempest.common.compute-instance-191757669',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-191757669',id=101,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8c16cd713fa74a88b43e4edf01c273bd',ramdisk_id='',reservation_id='r-lj8pfndu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherA-882763067',owner_user_name='tempest-ServerActionsTestOtherA-882763067-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:02:13Z,user_data=None,user_id='29710db389c842df836944048225740f',uuid=f4412d8b-963a-4c3a-accf-68f8cf82c864,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.534 227766 DEBUG nova.network.os_vif_util [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converting VIF {"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.535 227766 DEBUG nova.network.os_vif_util [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:95:7c,bridge_name='br-int',has_traffic_filtering=True,id=e14a3386-c770-46be-bafc-0418fa8274a7,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14a3386-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.535 227766 DEBUG os_vif [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:95:7c,bridge_name='br-int',has_traffic_filtering=True,id=e14a3386-c770-46be-bafc-0418fa8274a7,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14a3386-c7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.536 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.537 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.537 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.541 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.541 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape14a3386-c7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.542 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape14a3386-c7, col_values=(('external_ids', {'iface-id': 'e14a3386-c770-46be-bafc-0418fa8274a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b2:95:7c', 'vm-uuid': 'f4412d8b-963a-4c3a-accf-68f8cf82c864'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.544 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:23 np0005593234 NetworkManager[48942]: <info>  [1769162543.5446] manager: (tape14a3386-c7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/184)
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.546 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.550 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.552 227766 INFO os_vif [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:95:7c,bridge_name='br-int',has_traffic_filtering=True,id=e14a3386-c770-46be-bafc-0418fa8274a7,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14a3386-c7')#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.633 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.633 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.634 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] No VIF found with MAC fa:16:3e:b2:95:7c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.634 227766 INFO nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Using config drive#033[00m
Jan 23 05:02:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:02:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1337420446' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.656 227766 DEBUG nova.storage.rbd_utils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.662 227766 DEBUG oslo_concurrency.processutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.688 227766 DEBUG nova.storage.rbd_utils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:23 np0005593234 nova_compute[227762]: 2026-01-23 10:02:23.691 227766 DEBUG oslo_concurrency.processutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:02:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4046423486' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.131 227766 DEBUG oslo_concurrency.processutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.132 227766 DEBUG nova.virt.libvirt.vif [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:02:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-998112874',display_name='tempest-tempest.common.compute-instance-998112874',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-998112874',id=102,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-fjoe9sqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerAction
sTestJSON-1619235720-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:02:14Z,user_data=None,user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=b0b5a1b2-04bd-48e0-a0f7-0c679d784e04,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.132 227766 DEBUG nova.network.os_vif_util [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.133 227766 DEBUG nova.network.os_vif_util [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ca:17,bridge_name='br-int',has_traffic_filtering=True,id=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca9e3cb-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.134 227766 DEBUG nova.objects.instance [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'pci_devices' on Instance uuid b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:24 np0005593234 podman[272472]: 2026-01-23 10:02:24.148945865 +0000 UTC m=+0.064175176 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.160 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  <uuid>b0b5a1b2-04bd-48e0-a0f7-0c679d784e04</uuid>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  <name>instance-00000066</name>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <nova:name>tempest-tempest.common.compute-instance-998112874</nova:name>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:02:23</nova:creationTime>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <nova:user uuid="9d4a5c201efa4992a9ef57d8abdc1675">tempest-ServerActionsTestJSON-1619235720-project-member</nova:user>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <nova:project uuid="74c5c1d0762242f29a5d26033efd9f6d">tempest-ServerActionsTestJSON-1619235720</nova:project>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <nova:port uuid="4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <entry name="serial">b0b5a1b2-04bd-48e0-a0f7-0c679d784e04</entry>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <entry name="uuid">b0b5a1b2-04bd-48e0-a0f7-0c679d784e04</entry>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk.config">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:1e:ca:17"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <target dev="tap4ca9e3cb-f7"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/console.log" append="off"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:02:24 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:02:24 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:02:24 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:02:24 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.161 227766 DEBUG nova.compute.manager [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Preparing to wait for external event network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.161 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.161 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.161 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.162 227766 DEBUG nova.virt.libvirt.vif [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:02:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-998112874',display_name='tempest-tempest.common.compute-instance-998112874',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-998112874',id=102,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-fjoe9sqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-Se
rverActionsTestJSON-1619235720-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:02:14Z,user_data=None,user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=b0b5a1b2-04bd-48e0-a0f7-0c679d784e04,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.162 227766 DEBUG nova.network.os_vif_util [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.162 227766 DEBUG nova.network.os_vif_util [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ca:17,bridge_name='br-int',has_traffic_filtering=True,id=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca9e3cb-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.163 227766 DEBUG os_vif [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ca:17,bridge_name='br-int',has_traffic_filtering=True,id=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca9e3cb-f7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.163 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.164 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.164 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.166 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.166 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ca9e3cb-f7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.167 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4ca9e3cb-f7, col_values=(('external_ids', {'iface-id': '4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:ca:17', 'vm-uuid': 'b0b5a1b2-04bd-48e0-a0f7-0c679d784e04'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.168 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:24 np0005593234 NetworkManager[48942]: <info>  [1769162544.1693] manager: (tap4ca9e3cb-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/185)
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.170 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.176 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.177 227766 INFO os_vif [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ca:17,bridge_name='br-int',has_traffic_filtering=True,id=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca9e3cb-f7')#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.303 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.304 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.304 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No VIF found with MAC fa:16:3e:1e:ca:17, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.304 227766 INFO nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Using config drive#033[00m
Jan 23 05:02:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:24.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:24 np0005593234 nova_compute[227762]: 2026-01-23 10:02:24.331 227766 DEBUG nova.storage.rbd_utils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:02:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:02:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:24.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:25 np0005593234 nova_compute[227762]: 2026-01-23 10:02:25.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.137 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:26.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.380 227766 INFO nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Creating config drive at /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/disk.config#033[00m
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.385 227766 DEBUG oslo_concurrency.processutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp42qp9g8y execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.410 227766 INFO nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Creating config drive at /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/disk.config#033[00m
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.415 227766 DEBUG oslo_concurrency.processutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp30dwp9d0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.514 227766 DEBUG oslo_concurrency.processutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp42qp9g8y" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.539 227766 DEBUG nova.storage.rbd_utils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.542 227766 DEBUG oslo_concurrency.processutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/disk.config f4412d8b-963a-4c3a-accf-68f8cf82c864_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.563 227766 DEBUG oslo_concurrency.processutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp30dwp9d0" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:26.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.593 227766 DEBUG nova.storage.rbd_utils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.597 227766 DEBUG oslo_concurrency.processutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/disk.config b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.710 227766 DEBUG oslo_concurrency.processutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/disk.config f4412d8b-963a-4c3a-accf-68f8cf82c864_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.168s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.712 227766 INFO nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Deleting local config drive /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/disk.config because it was imported into RBD.#033[00m
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.737 227766 DEBUG oslo_concurrency.processutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/disk.config b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.738 227766 INFO nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Deleting local config drive /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/disk.config because it was imported into RBD.#033[00m
Jan 23 05:02:26 np0005593234 kernel: tape14a3386-c7: entered promiscuous mode
Jan 23 05:02:26 np0005593234 NetworkManager[48942]: <info>  [1769162546.7620] manager: (tape14a3386-c7): new Tun device (/org/freedesktop/NetworkManager/Devices/186)
Jan 23 05:02:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:26Z|00356|binding|INFO|Claiming lport e14a3386-c770-46be-bafc-0418fa8274a7 for this chassis.
Jan 23 05:02:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:26Z|00357|binding|INFO|e14a3386-c770-46be-bafc-0418fa8274a7: Claiming fa:16:3e:b2:95:7c 10.100.0.7
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.768 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:26 np0005593234 NetworkManager[48942]: <info>  [1769162546.7830] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/187)
Jan 23 05:02:26 np0005593234 NetworkManager[48942]: <info>  [1769162546.7834] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/188)
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.782 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:26 np0005593234 NetworkManager[48942]: <info>  [1769162546.7955] manager: (tap4ca9e3cb-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/189)
Jan 23 05:02:26 np0005593234 systemd-udevd[272639]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:02:26 np0005593234 systemd-udevd[272640]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:02:26 np0005593234 NetworkManager[48942]: <info>  [1769162546.8079] device (tape14a3386-c7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:02:26 np0005593234 NetworkManager[48942]: <info>  [1769162546.8087] device (tape14a3386-c7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:02:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:26.902 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:95:7c 10.100.0.7'], port_security=['fa:16:3e:b2:95:7c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f4412d8b-963a-4c3a-accf-68f8cf82c864', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c16cd713fa74a88b43e4edf01c273bd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1b3b2b26-a9c9-438c-b14e-9fddf18d8ea5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c3aed5f-30b8-4c57-808e-87764ab67fc8, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=e14a3386-c770-46be-bafc-0418fa8274a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:02:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:26.904 144381 INFO neutron.agent.ovn.metadata.agent [-] Port e14a3386-c770-46be-bafc-0418fa8274a7 in datapath 8575e824-4be0-4206-873e-2f9a3d1ded0b bound to our chassis#033[00m
Jan 23 05:02:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:26.906 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8575e824-4be0-4206-873e-2f9a3d1ded0b#033[00m
Jan 23 05:02:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:26.921 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c29e8812-b9a5-4c8f-9724-93052e84bb5b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:26.922 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8575e824-41 in ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:02:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:26.924 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8575e824-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:02:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:26.924 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dc3966e6-b983-43c7-9d37-28c8cdd0a5e4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:26.925 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[75cd1a0d-1d5b-44ec-b987-1f098aeda507]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:26 np0005593234 systemd-machined[195626]: New machine qemu-41-instance-00000065.
Jan 23 05:02:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:26.937 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[cb523e85-580f-4ccb-a878-382c670778f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:26 np0005593234 systemd[1]: Started Virtual Machine qemu-41-instance-00000065.
Jan 23 05:02:26 np0005593234 systemd-machined[195626]: New machine qemu-42-instance-00000066.
Jan 23 05:02:26 np0005593234 systemd[1]: Started Virtual Machine qemu-42-instance-00000066.
Jan 23 05:02:26 np0005593234 kernel: tap4ca9e3cb-f7: entered promiscuous mode
Jan 23 05:02:26 np0005593234 NetworkManager[48942]: <info>  [1769162546.9573] device (tap4ca9e3cb-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.957 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:26 np0005593234 NetworkManager[48942]: <info>  [1769162546.9587] device (tap4ca9e3cb-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.962 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:26.964 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4c72ad87-3c51-465a-82d3-e33872dc3d32]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:26Z|00358|binding|INFO|Releasing lport 9ee89271-3ee7-4672-8800-56bb900c4dd0 from this chassis (sb_readonly=0)
Jan 23 05:02:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:26Z|00359|binding|INFO|Claiming lport 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 for this chassis.
Jan 23 05:02:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:26Z|00360|binding|INFO|4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7: Claiming fa:16:3e:1e:ca:17 10.100.0.9
Jan 23 05:02:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:26Z|00361|binding|INFO|Setting lport e14a3386-c770-46be-bafc-0418fa8274a7 ovn-installed in OVS
Jan 23 05:02:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:26Z|00362|binding|INFO|Setting lport e14a3386-c770-46be-bafc-0418fa8274a7 up in Southbound
Jan 23 05:02:26 np0005593234 nova_compute[227762]: 2026-01-23 10:02:26.991 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:26.993 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:ca:17 10.100.0.9'], port_security=['fa:16:3e:1e:ca:17 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b0b5a1b2-04bd-48e0-a0f7-0c679d784e04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '323fc591-4197-401d-b3c4-392a8ca4598f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.000 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[c415d293-4cd5-40f4-be13-43779d117e9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:27 np0005593234 NetworkManager[48942]: <info>  [1769162547.0079] manager: (tap8575e824-40): new Veth device (/org/freedesktop/NetworkManager/Devices/190)
Jan 23 05:02:27 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:27Z|00363|binding|INFO|Setting lport 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 ovn-installed in OVS
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.009 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.007 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0a844432-b6e2-46bb-b7f2-e5d20c2f16a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:27 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:27Z|00364|binding|INFO|Setting lport 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 up in Southbound
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.040 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[0eeeea1a-ad53-4c36-9040-cb9856a03a20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.043 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b1eb2b01-c1e5-4ec2-83ea-a4168b712f72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:27 np0005593234 NetworkManager[48942]: <info>  [1769162547.0635] device (tap8575e824-40): carrier: link connected
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.069 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[5e01f965-8c2f-4849-82f1-105062c5f7b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.087 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f3387cad-ed34-4c04-b9c3-a5f34f896d56]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8575e824-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f2:16:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 644202, 'reachable_time': 26350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272686, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.104 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bd9128cf-7bfd-49ae-8902-9974938d0cdb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef2:16ca'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 644202, 'tstamp': 644202}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272687, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.121 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8b501f54-7200-410c-a6e3-3915ab02782a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8575e824-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f2:16:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 115], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 644202, 'reachable_time': 26350, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 272688, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.150 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[24117a05-973c-4ab7-b45e-42e19ef1f521]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.182 227766 DEBUG nova.network.neutron [req-da18b09f-048a-4380-83ec-2ce2debad68e req-2cefeaeb-abc0-4572-94fc-b48758acbbe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Updated VIF entry in instance network info cache for port 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.182 227766 DEBUG nova.network.neutron [req-da18b09f-048a-4380-83ec-2ce2debad68e req-2cefeaeb-abc0-4572-94fc-b48758acbbe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Updating instance_info_cache with network_info: [{"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.209 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dcd3d5ff-5d66-4483-8ac5-4bd2cb7b6832]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.210 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8575e824-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.211 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.211 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8575e824-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.212 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:27 np0005593234 NetworkManager[48942]: <info>  [1769162547.2134] manager: (tap8575e824-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/191)
Jan 23 05:02:27 np0005593234 kernel: tap8575e824-40: entered promiscuous mode
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.214 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.218 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8575e824-40, col_values=(('external_ids', {'iface-id': 'f7023d86-3158-4cc4-b690-f57bb76e92b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.219 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:27 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:27Z|00365|binding|INFO|Releasing lport f7023d86-3158-4cc4-b690-f57bb76e92b5 from this chassis (sb_readonly=0)
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.220 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.223 227766 DEBUG oslo_concurrency.lockutils [req-da18b09f-048a-4380-83ec-2ce2debad68e req-2cefeaeb-abc0-4572-94fc-b48758acbbe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-b0b5a1b2-04bd-48e0-a0f7-0c679d784e04" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.225 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8575e824-4be0-4206-873e-2f9a3d1ded0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8575e824-4be0-4206-873e-2f9a3d1ded0b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.225 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b935a18d-a0cb-4a39-a193-6ea15972c058]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.226 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-8575e824-4be0-4206-873e-2f9a3d1ded0b
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/8575e824-4be0-4206-873e-2f9a3d1ded0b.pid.haproxy
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 8575e824-4be0-4206-873e-2f9a3d1ded0b
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.227 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'env', 'PROCESS_TAG=haproxy-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8575e824-4be0-4206-873e-2f9a3d1ded0b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.233 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.474 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162547.4738636, f4412d8b-963a-4c3a-accf-68f8cf82c864 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.474 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] VM Started (Lifecycle Event)#033[00m
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.533 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.536 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162547.4739933, f4412d8b-963a-4c3a-accf-68f8cf82c864 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.536 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.572 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.575 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:02:27 np0005593234 nova_compute[227762]: 2026-01-23 10:02:27.605 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:02:27 np0005593234 podman[272760]: 2026-01-23 10:02:27.609405319 +0000 UTC m=+0.048931460 container create 3f0f2ce3679d50782ae1e3231f3862228cbc8e4df7772790fa49a4769e128399 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 05:02:27 np0005593234 systemd[1]: Started libpod-conmon-3f0f2ce3679d50782ae1e3231f3862228cbc8e4df7772790fa49a4769e128399.scope.
Jan 23 05:02:27 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:02:27 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/96c16d6f60a757b8178febb43fecedbb8057a56c9c64c0c5d7b2cdab7b854224/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:27 np0005593234 podman[272760]: 2026-01-23 10:02:27.585892924 +0000 UTC m=+0.025419085 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:02:27 np0005593234 podman[272760]: 2026-01-23 10:02:27.718352261 +0000 UTC m=+0.157878432 container init 3f0f2ce3679d50782ae1e3231f3862228cbc8e4df7772790fa49a4769e128399 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:02:27 np0005593234 podman[272760]: 2026-01-23 10:02:27.724553485 +0000 UTC m=+0.164079626 container start 3f0f2ce3679d50782ae1e3231f3862228cbc8e4df7772790fa49a4769e128399 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:02:27 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[272775]: [NOTICE]   (272779) : New worker (272781) forked
Jan 23 05:02:27 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[272775]: [NOTICE]   (272779) : Loading success.
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.957 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c unbound from our chassis#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.959 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee03d7c9-e107-41bf-95cc-5508578ad66c#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.968 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a2eaa6a8-086d-42eb-a370-fca23eda4ead]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.969 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee03d7c9-e1 in ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.971 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee03d7c9-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.971 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[977d7093-818a-4660-a7f0-9f8c7ac222c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.972 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7b960ed6-5dd5-43a2-9e6a-645ef0301838]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:27.986 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[99606053-0528-4a07-bae3-ebab4495b383]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.004 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e169f07d-edc2-442a-886c-504879112e7a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.037 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[0f378422-4034-4fc5-b0c8-729f625e41b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:28 np0005593234 NetworkManager[48942]: <info>  [1769162548.0457] manager: (tapee03d7c9-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/192)
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.047 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9bfb6dbd-4e31-43a3-8c0a-dafa63425d60]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:28 np0005593234 systemd-udevd[272673]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.085 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[86a3998f-b3d8-4988-9de2-0c008e079b0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.088 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a035e312-ccdb-4b0c-a355-6e87adf29901]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:28 np0005593234 NetworkManager[48942]: <info>  [1769162548.1116] device (tapee03d7c9-e0): carrier: link connected
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.118 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[2ccac95b-2725-44a0-af41-bf1bd1a464ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.119 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162548.118784, b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.119 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] VM Started (Lifecycle Event)#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.134 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9e72d21b-b8e5-424c-8e08-d890bda8cddb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 644307, 'reachable_time': 19562, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 272845, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.151 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c20c53fc-0d45-4085-8083-099395758271]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:6530'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 644307, 'tstamp': 644307}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 272846, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.159 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.162 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162548.119193, b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.163 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.168 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[83d30971-5e7b-4fa3-912a-322e9ef58b68]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 116], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 644307, 'reachable_time': 19562, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 272847, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.191 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.194 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.199 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3b6636dd-f663-4cf6-a5d7-bd3cb454bf73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.230 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.251 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e4a466db-7fd5-4a2d-85e0-76335a733971]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.253 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.253 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.253 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee03d7c9-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:28 np0005593234 NetworkManager[48942]: <info>  [1769162548.2564] manager: (tapee03d7c9-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/193)
Jan 23 05:02:28 np0005593234 kernel: tapee03d7c9-e0: entered promiscuous mode
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.256 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:28 np0005593234 ceph-mgr[77448]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.258 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.259 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee03d7c9-e0, col_values=(('external_ids', {'iface-id': '702d4523-a665-42f5-9a36-57d187c0698a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.260 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:28Z|00366|binding|INFO|Releasing lport 702d4523-a665-42f5-9a36-57d187c0698a from this chassis (sb_readonly=0)
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.274 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.275 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.276 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ab5b8c18-41a7-498e-a5b3-6113851384b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.277 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:02:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:28.278 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'env', 'PROCESS_TAG=haproxy-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee03d7c9-e107-41bf-95cc-5508578ad66c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:02:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:28.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:02:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:28.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:02:28 np0005593234 podman[272880]: 2026-01-23 10:02:28.612970123 +0000 UTC m=+0.046118331 container create efe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:02:28 np0005593234 systemd[1]: Started libpod-conmon-efe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e.scope.
Jan 23 05:02:28 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:02:28 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3652240f2f964440a934dcc5425fdca080a5c36fffc91adca2e8b33289443689/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:28 np0005593234 podman[272880]: 2026-01-23 10:02:28.681940348 +0000 UTC m=+0.115088556 container init efe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 05:02:28 np0005593234 podman[272880]: 2026-01-23 10:02:28.588370565 +0000 UTC m=+0.021518793 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:02:28 np0005593234 podman[272880]: 2026-01-23 10:02:28.687148961 +0000 UTC m=+0.120297169 container start efe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:02:28 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[272895]: [NOTICE]   (272899) : New worker (272901) forked
Jan 23 05:02:28 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[272895]: [NOTICE]   (272899) : Loading success.
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.716 227766 DEBUG oslo_concurrency.lockutils [None req-d36decd7-9f39-4090-b40f-52173c92d948 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "61f18fb1-66aa-4089-b98f-50b8a49800ff" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.717 227766 DEBUG oslo_concurrency.lockutils [None req-d36decd7-9f39-4090-b40f-52173c92d948 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.717 227766 DEBUG nova.compute.manager [None req-d36decd7-9f39-4090-b40f-52173c92d948 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.722 227766 DEBUG nova.compute.manager [None req-d36decd7-9f39-4090-b40f-52173c92d948 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.723 227766 DEBUG nova.objects.instance [None req-d36decd7-9f39-4090-b40f-52173c92d948 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lazy-loading 'flavor' on Instance uuid 61f18fb1-66aa-4089-b98f-50b8a49800ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:28 np0005593234 nova_compute[227762]: 2026-01-23 10:02:28.779 227766 DEBUG nova.virt.libvirt.driver [None req-d36decd7-9f39-4090-b40f-52173c92d948 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:02:29 np0005593234 nova_compute[227762]: 2026-01-23 10:02:29.168 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:02:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 3600.5 total, 600.0 interval#012Cumulative writes: 37K writes, 151K keys, 37K commit groups, 1.0 writes per commit group, ingest: 0.15 GB, 0.04 MB/s#012Cumulative WAL: 37K writes, 13K syncs, 2.85 writes per sync, written: 0.15 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6473 writes, 24K keys, 6473 commit groups, 1.0 writes per commit group, ingest: 25.70 MB, 0.04 MB/s#012Interval WAL: 6473 writes, 2520 syncs, 2.57 writes per sync, written: 0.03 GB, 0.04 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:02:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:30.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:02:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:02:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:30.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.778 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.778 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.778 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.807 227766 DEBUG nova.network.neutron [req-eb511374-b6d8-46d4-9136-6978c8a88857 req-1eca9bd5-698b-449e-8284-45f59c04f835 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Updated VIF entry in instance network info cache for port e14a3386-c770-46be-bafc-0418fa8274a7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.807 227766 DEBUG nova.network.neutron [req-eb511374-b6d8-46d4-9136-6978c8a88857 req-1eca9bd5-698b-449e-8284-45f59c04f835 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Updating instance_info_cache with network_info: [{"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.833 227766 DEBUG oslo_concurrency.lockutils [req-eb511374-b6d8-46d4-9136-6978c8a88857 req-1eca9bd5-698b-449e-8284-45f59c04f835 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-f4412d8b-963a-4c3a-accf-68f8cf82c864" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.982 227766 DEBUG nova.compute.manager [req-73f08d81-f091-4bb4-8ee6-4c7dda2fe39b req-4a0c8e70-4a6d-4c45-b3fb-646711a902ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received event network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.983 227766 DEBUG oslo_concurrency.lockutils [req-73f08d81-f091-4bb4-8ee6-4c7dda2fe39b req-4a0c8e70-4a6d-4c45-b3fb-646711a902ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.983 227766 DEBUG oslo_concurrency.lockutils [req-73f08d81-f091-4bb4-8ee6-4c7dda2fe39b req-4a0c8e70-4a6d-4c45-b3fb-646711a902ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.983 227766 DEBUG oslo_concurrency.lockutils [req-73f08d81-f091-4bb4-8ee6-4c7dda2fe39b req-4a0c8e70-4a6d-4c45-b3fb-646711a902ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.983 227766 DEBUG nova.compute.manager [req-73f08d81-f091-4bb4-8ee6-4c7dda2fe39b req-4a0c8e70-4a6d-4c45-b3fb-646711a902ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Processing event network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.984 227766 DEBUG nova.compute.manager [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.988 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162550.9878538, f4412d8b-963a-4c3a-accf-68f8cf82c864 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.988 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.990 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.992 227766 INFO nova.virt.libvirt.driver [-] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Instance spawned successfully.#033[00m
Jan 23 05:02:30 np0005593234 nova_compute[227762]: 2026-01-23 10:02:30.992 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.059 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.063 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.068 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.068 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.068 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.069 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.069 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.069 227766 DEBUG nova.virt.libvirt.driver [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.123 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.139 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:02:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2237026009' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.261 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:31 np0005593234 kernel: tapc9c463b9-37 (unregistering): left promiscuous mode
Jan 23 05:02:31 np0005593234 NetworkManager[48942]: <info>  [1769162551.3451] device (tapc9c463b9-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:02:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:31Z|00367|binding|INFO|Releasing lport c9c463b9-3793-44a9-9773-69cc1638096d from this chassis (sb_readonly=0)
Jan 23 05:02:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:31Z|00368|binding|INFO|Setting lport c9c463b9-3793-44a9-9773-69cc1638096d down in Southbound
Jan 23 05:02:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:31Z|00369|binding|INFO|Removing iface tapc9c463b9-37 ovn-installed in OVS
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.359 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.377 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:31 np0005593234 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000062.scope: Deactivated successfully.
Jan 23 05:02:31 np0005593234 systemd[1]: machine-qemu\x2d40\x2dinstance\x2d00000062.scope: Consumed 15.094s CPU time.
Jan 23 05:02:31 np0005593234 systemd-machined[195626]: Machine qemu-40-instance-00000062 terminated.
Jan 23 05:02:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:31.444 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:54:e7 10.100.0.12'], port_security=['fa:16:3e:2d:54:e7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '61f18fb1-66aa-4089-b98f-50b8a49800ff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0f5ca0233c1a490aa2d596b88a0ec503', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ad8a7362-692a-4044-8393-1c10014f8bab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=83406af9-ea42-4cda-96ee-b8c04ab0651a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=c9c463b9-3793-44a9-9773-69cc1638096d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:02:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:31.447 144381 INFO neutron.agent.ovn.metadata.agent [-] Port c9c463b9-3793-44a9-9773-69cc1638096d in datapath 969bd83a-7542-46e3-90f0-1a81f26ba6b8 unbound from our chassis#033[00m
Jan 23 05:02:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:31.448 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 969bd83a-7542-46e3-90f0-1a81f26ba6b8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:02:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:31.449 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d46ea9aa-5f31-4add-bfb1-6e1bb8404afe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:31.450 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 namespace which is not needed anymore#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.483 227766 INFO nova.compute.manager [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Took 17.49 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.483 227766 DEBUG nova.compute.manager [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:31 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[271551]: [NOTICE]   (271575) : haproxy version is 2.8.14-c23fe91
Jan 23 05:02:31 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[271551]: [NOTICE]   (271575) : path to executable is /usr/sbin/haproxy
Jan 23 05:02:31 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[271551]: [WARNING]  (271575) : Exiting Master process...
Jan 23 05:02:31 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[271551]: [WARNING]  (271575) : Exiting Master process...
Jan 23 05:02:31 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[271551]: [ALERT]    (271575) : Current worker (271578) exited with code 143 (Terminated)
Jan 23 05:02:31 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[271551]: [WARNING]  (271575) : All workers exited. Exiting... (0)
Jan 23 05:02:31 np0005593234 systemd[1]: libpod-40376edbe1928d9826307689371013c8eb89933ae0668da75f5d4b43a94855b4.scope: Deactivated successfully.
Jan 23 05:02:31 np0005593234 podman[273008]: 2026-01-23 10:02:31.585597021 +0000 UTC m=+0.044849432 container died 40376edbe1928d9826307689371013c8eb89933ae0668da75f5d4b43a94855b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:02:31 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-40376edbe1928d9826307689371013c8eb89933ae0668da75f5d4b43a94855b4-userdata-shm.mount: Deactivated successfully.
Jan 23 05:02:31 np0005593234 systemd[1]: var-lib-containers-storage-overlay-0ad228d559a70f5aacc206e93bc348d270b05fac9b23ab9c24a2429eb5a23cd3-merged.mount: Deactivated successfully.
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.622 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000062 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.622 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000062 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.627 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000066 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.627 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000066 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:02:31 np0005593234 podman[273008]: 2026-01-23 10:02:31.631669049 +0000 UTC m=+0.090921480 container cleanup 40376edbe1928d9826307689371013c8eb89933ae0668da75f5d4b43a94855b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.635 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000065 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.635 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000065 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.636 227766 INFO nova.compute.manager [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Took 19.01 seconds to build instance.#033[00m
Jan 23 05:02:31 np0005593234 systemd[1]: libpod-conmon-40376edbe1928d9826307689371013c8eb89933ae0668da75f5d4b43a94855b4.scope: Deactivated successfully.
Jan 23 05:02:31 np0005593234 podman[273045]: 2026-01-23 10:02:31.713784984 +0000 UTC m=+0.052696686 container remove 40376edbe1928d9826307689371013c8eb89933ae0668da75f5d4b43a94855b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.720 227766 DEBUG oslo_concurrency.lockutils [None req-6d74f869-6e00-4cef-864c-fe58993b307b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:31.723 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5e4eca0e-6dfc-432f-bffc-565c300c2eee]: (4, ('Fri Jan 23 10:02:31 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 (40376edbe1928d9826307689371013c8eb89933ae0668da75f5d4b43a94855b4)\n40376edbe1928d9826307689371013c8eb89933ae0668da75f5d4b43a94855b4\nFri Jan 23 10:02:31 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 (40376edbe1928d9826307689371013c8eb89933ae0668da75f5d4b43a94855b4)\n40376edbe1928d9826307689371013c8eb89933ae0668da75f5d4b43a94855b4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:31.725 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[22207be7-427d-4d6a-a0f3-635088cb50ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:31.727 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap969bd83a-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.729 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:31 np0005593234 kernel: tap969bd83a-70: left promiscuous mode
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.747 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:31.749 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4c865970-fa47-4dcf-822c-3b093b35a2d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:31.765 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b5b07372-dae2-4b1a-b068-661f9bcd0281]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:31.766 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fafce15a-cebd-49e3-8acc-c5a995398df0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:31.783 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[170958c9-75fb-4b3b-9e8a-bd60d526a77a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 641397, 'reachable_time': 18027, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273064, 'error': None, 'target': 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:31 np0005593234 systemd[1]: run-netns-ovnmeta\x2d969bd83a\x2d7542\x2d46e3\x2d90f0\x2d1a81f26ba6b8.mount: Deactivated successfully.
Jan 23 05:02:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:31.786 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:02:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:31.786 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[530e0441-c841-4cff-860c-476ea2e2187f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.815 227766 INFO nova.virt.libvirt.driver [None req-d36decd7-9f39-4090-b40f-52173c92d948 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.820 227766 INFO nova.virt.libvirt.driver [-] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Instance destroyed successfully.#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.821 227766 DEBUG nova.objects.instance [None req-d36decd7-9f39-4090-b40f-52173c92d948 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lazy-loading 'numa_topology' on Instance uuid 61f18fb1-66aa-4089-b98f-50b8a49800ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.871 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.872 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4224MB free_disk=20.656776428222656GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.872 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:31 np0005593234 nova_compute[227762]: 2026-01-23 10:02:31.872 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.080 227766 DEBUG nova.compute.manager [None req-d36decd7-9f39-4090-b40f-52173c92d948 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.194 227766 DEBUG oslo_concurrency.lockutils [None req-d36decd7-9f39-4090-b40f-52173c92d948 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.477s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.271 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 61f18fb1-66aa-4089-b98f-50b8a49800ff actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.272 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance f4412d8b-963a-4c3a-accf-68f8cf82c864 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.272 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.272 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.273 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:02:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:32.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.458 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.488 227766 DEBUG nova.compute.manager [req-1495e9ab-4c81-48f3-8b31-f3a5bac24033 req-d763bd5c-8b2e-487a-a325-61f989b1d084 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received event network-vif-unplugged-c9c463b9-3793-44a9-9773-69cc1638096d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.489 227766 DEBUG oslo_concurrency.lockutils [req-1495e9ab-4c81-48f3-8b31-f3a5bac24033 req-d763bd5c-8b2e-487a-a325-61f989b1d084 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.489 227766 DEBUG oslo_concurrency.lockutils [req-1495e9ab-4c81-48f3-8b31-f3a5bac24033 req-d763bd5c-8b2e-487a-a325-61f989b1d084 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.490 227766 DEBUG oslo_concurrency.lockutils [req-1495e9ab-4c81-48f3-8b31-f3a5bac24033 req-d763bd5c-8b2e-487a-a325-61f989b1d084 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.490 227766 DEBUG nova.compute.manager [req-1495e9ab-4c81-48f3-8b31-f3a5bac24033 req-d763bd5c-8b2e-487a-a325-61f989b1d084 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] No waiting events found dispatching network-vif-unplugged-c9c463b9-3793-44a9-9773-69cc1638096d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.490 227766 WARNING nova.compute.manager [req-1495e9ab-4c81-48f3-8b31-f3a5bac24033 req-d763bd5c-8b2e-487a-a325-61f989b1d084 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received unexpected event network-vif-unplugged-c9c463b9-3793-44a9-9773-69cc1638096d for instance with vm_state stopped and task_state None.#033[00m
Jan 23 05:02:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:32.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:02:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/442446989' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.897 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.902 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.941 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.993 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:02:32 np0005593234 nova_compute[227762]: 2026-01-23 10:02:32.993 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.372 227766 DEBUG nova.compute.manager [req-92699d96-d3a2-4652-a857-932c8cbb9060 req-b38685c2-24d6-4d09-96a2-5bc10eb033d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received event network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.373 227766 DEBUG oslo_concurrency.lockutils [req-92699d96-d3a2-4652-a857-932c8cbb9060 req-b38685c2-24d6-4d09-96a2-5bc10eb033d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.374 227766 DEBUG oslo_concurrency.lockutils [req-92699d96-d3a2-4652-a857-932c8cbb9060 req-b38685c2-24d6-4d09-96a2-5bc10eb033d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.374 227766 DEBUG oslo_concurrency.lockutils [req-92699d96-d3a2-4652-a857-932c8cbb9060 req-b38685c2-24d6-4d09-96a2-5bc10eb033d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.374 227766 DEBUG nova.compute.manager [req-92699d96-d3a2-4652-a857-932c8cbb9060 req-b38685c2-24d6-4d09-96a2-5bc10eb033d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] No waiting events found dispatching network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.375 227766 WARNING nova.compute.manager [req-92699d96-d3a2-4652-a857-932c8cbb9060 req-b38685c2-24d6-4d09-96a2-5bc10eb033d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received unexpected event network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.376 227766 DEBUG nova.compute.manager [req-92699d96-d3a2-4652-a857-932c8cbb9060 req-b38685c2-24d6-4d09-96a2-5bc10eb033d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received event network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.376 227766 DEBUG oslo_concurrency.lockutils [req-92699d96-d3a2-4652-a857-932c8cbb9060 req-b38685c2-24d6-4d09-96a2-5bc10eb033d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.377 227766 DEBUG oslo_concurrency.lockutils [req-92699d96-d3a2-4652-a857-932c8cbb9060 req-b38685c2-24d6-4d09-96a2-5bc10eb033d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.377 227766 DEBUG oslo_concurrency.lockutils [req-92699d96-d3a2-4652-a857-932c8cbb9060 req-b38685c2-24d6-4d09-96a2-5bc10eb033d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.377 227766 DEBUG nova.compute.manager [req-92699d96-d3a2-4652-a857-932c8cbb9060 req-b38685c2-24d6-4d09-96a2-5bc10eb033d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Processing event network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.378 227766 DEBUG nova.compute.manager [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.382 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162553.3818543, b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.382 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.384 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.387 227766 INFO nova.virt.libvirt.driver [-] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Instance spawned successfully.#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.388 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.614 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.615 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.615 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.616 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.616 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.617 227766 DEBUG nova.virt.libvirt.driver [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.649 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:33 np0005593234 nova_compute[227762]: 2026-01-23 10:02:33.653 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:02:34 np0005593234 nova_compute[227762]: 2026-01-23 10:02:34.062 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:02:34 np0005593234 nova_compute[227762]: 2026-01-23 10:02:34.139 227766 INFO nova.compute.manager [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Took 19.18 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:02:34 np0005593234 nova_compute[227762]: 2026-01-23 10:02:34.139 227766 DEBUG nova.compute.manager [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:34 np0005593234 nova_compute[227762]: 2026-01-23 10:02:34.170 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:34 np0005593234 nova_compute[227762]: 2026-01-23 10:02:34.276 227766 INFO nova.compute.manager [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Took 20.93 seconds to build instance.#033[00m
Jan 23 05:02:34 np0005593234 nova_compute[227762]: 2026-01-23 10:02:34.303 227766 DEBUG oslo_concurrency.lockutils [None req-9864fc30-97a7-46de-96af-2042c145d913 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:34.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:34.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:34 np0005593234 podman[273089]: 2026-01-23 10:02:34.897828644 +0000 UTC m=+0.187378323 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:02:34 np0005593234 nova_compute[227762]: 2026-01-23 10:02:34.993 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:34 np0005593234 nova_compute[227762]: 2026-01-23 10:02:34.994 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:02:34 np0005593234 nova_compute[227762]: 2026-01-23 10:02:34.994 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.223 227766 DEBUG oslo_concurrency.lockutils [None req-8f08cb91-3952-45f1-b479-23f9508ba6be 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.224 227766 DEBUG oslo_concurrency.lockutils [None req-8f08cb91-3952-45f1-b479-23f9508ba6be 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.224 227766 DEBUG nova.compute.manager [None req-8f08cb91-3952-45f1-b479-23f9508ba6be 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.228 227766 DEBUG nova.compute.manager [None req-8f08cb91-3952-45f1-b479-23f9508ba6be 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.228 227766 DEBUG nova.objects.instance [None req-8f08cb91-3952-45f1-b479-23f9508ba6be 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'flavor' on Instance uuid f4412d8b-963a-4c3a-accf-68f8cf82c864 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.327 227766 DEBUG nova.virt.libvirt.driver [None req-8f08cb91-3952-45f1-b479-23f9508ba6be 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.374 227766 DEBUG nova.compute.manager [req-0958609b-60b5-435b-924a-70b765d818eb req-c7d00bc3-a3cc-4e26-9846-1b7d70e9241f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received event network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.374 227766 DEBUG oslo_concurrency.lockutils [req-0958609b-60b5-435b-924a-70b765d818eb req-c7d00bc3-a3cc-4e26-9846-1b7d70e9241f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.374 227766 DEBUG oslo_concurrency.lockutils [req-0958609b-60b5-435b-924a-70b765d818eb req-c7d00bc3-a3cc-4e26-9846-1b7d70e9241f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.375 227766 DEBUG oslo_concurrency.lockutils [req-0958609b-60b5-435b-924a-70b765d818eb req-c7d00bc3-a3cc-4e26-9846-1b7d70e9241f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.375 227766 DEBUG nova.compute.manager [req-0958609b-60b5-435b-924a-70b765d818eb req-c7d00bc3-a3cc-4e26-9846-1b7d70e9241f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] No waiting events found dispatching network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.375 227766 WARNING nova.compute.manager [req-0958609b-60b5-435b-924a-70b765d818eb req-c7d00bc3-a3cc-4e26-9846-1b7d70e9241f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received unexpected event network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d for instance with vm_state stopped and task_state None.#033[00m
Jan 23 05:02:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.608 227766 DEBUG nova.compute.manager [req-6a181dc4-8a5c-4ad0-b286-e81b19abbb83 req-8487b1e9-18c8-4bf4-a4e0-37067c0dec58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received event network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.609 227766 DEBUG oslo_concurrency.lockutils [req-6a181dc4-8a5c-4ad0-b286-e81b19abbb83 req-8487b1e9-18c8-4bf4-a4e0-37067c0dec58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.609 227766 DEBUG oslo_concurrency.lockutils [req-6a181dc4-8a5c-4ad0-b286-e81b19abbb83 req-8487b1e9-18c8-4bf4-a4e0-37067c0dec58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.609 227766 DEBUG oslo_concurrency.lockutils [req-6a181dc4-8a5c-4ad0-b286-e81b19abbb83 req-8487b1e9-18c8-4bf4-a4e0-37067c0dec58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.610 227766 DEBUG nova.compute.manager [req-6a181dc4-8a5c-4ad0-b286-e81b19abbb83 req-8487b1e9-18c8-4bf4-a4e0-37067c0dec58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] No waiting events found dispatching network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.610 227766 WARNING nova.compute.manager [req-6a181dc4-8a5c-4ad0-b286-e81b19abbb83 req-8487b1e9-18c8-4bf4-a4e0-37067c0dec58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received unexpected event network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.636 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-61f18fb1-66aa-4089-b98f-50b8a49800ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.637 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-61f18fb1-66aa-4089-b98f-50b8a49800ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.637 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.637 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 61f18fb1-66aa-4089-b98f-50b8a49800ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.841 227766 DEBUG nova.objects.instance [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lazy-loading 'flavor' on Instance uuid 61f18fb1-66aa-4089-b98f-50b8a49800ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:35 np0005593234 nova_compute[227762]: 2026-01-23 10:02:35.875 227766 DEBUG oslo_concurrency.lockutils [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "refresh_cache-61f18fb1-66aa-4089-b98f-50b8a49800ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:02:36 np0005593234 nova_compute[227762]: 2026-01-23 10:02:36.142 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:36.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:36.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:38.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:38.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:39 np0005593234 nova_compute[227762]: 2026-01-23 10:02:39.172 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:40.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:02:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:40.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:02:41 np0005593234 nova_compute[227762]: 2026-01-23 10:02:41.175 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:41 np0005593234 nova_compute[227762]: 2026-01-23 10:02:41.332 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Updating instance_info_cache with network_info: [{"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:02:41 np0005593234 nova_compute[227762]: 2026-01-23 10:02:41.367 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-61f18fb1-66aa-4089-b98f-50b8a49800ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:02:41 np0005593234 nova_compute[227762]: 2026-01-23 10:02:41.368 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:02:41 np0005593234 nova_compute[227762]: 2026-01-23 10:02:41.369 227766 DEBUG oslo_concurrency.lockutils [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquired lock "refresh_cache-61f18fb1-66aa-4089-b98f-50b8a49800ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:02:41 np0005593234 nova_compute[227762]: 2026-01-23 10:02:41.369 227766 DEBUG nova.network.neutron [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:02:41 np0005593234 nova_compute[227762]: 2026-01-23 10:02:41.369 227766 DEBUG nova.objects.instance [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lazy-loading 'info_cache' on Instance uuid 61f18fb1-66aa-4089-b98f-50b8a49800ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:41 np0005593234 nova_compute[227762]: 2026-01-23 10:02:41.371 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:41 np0005593234 nova_compute[227762]: 2026-01-23 10:02:41.371 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:41 np0005593234 nova_compute[227762]: 2026-01-23 10:02:41.372 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:41 np0005593234 nova_compute[227762]: 2026-01-23 10:02:41.372 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:02:41 np0005593234 nova_compute[227762]: 2026-01-23 10:02:41.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:41 np0005593234 nova_compute[227762]: 2026-01-23 10:02:41.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:42.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:42.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:42.839 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:42.840 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:42.841 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:44 np0005593234 nova_compute[227762]: 2026-01-23 10:02:44.174 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:44.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:02:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3469476833' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:02:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:02:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3469476833' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:02:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:44.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:45 np0005593234 nova_compute[227762]: 2026-01-23 10:02:45.375 227766 DEBUG nova.virt.libvirt.driver [None req-8f08cb91-3952-45f1-b479-23f9508ba6be 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 05:02:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:46 np0005593234 nova_compute[227762]: 2026-01-23 10:02:46.221 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:46.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:46 np0005593234 nova_compute[227762]: 2026-01-23 10:02:46.540 227766 INFO nova.compute.manager [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Rebuilding instance#033[00m
Jan 23 05:02:46 np0005593234 nova_compute[227762]: 2026-01-23 10:02:46.588 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162551.5873203, 61f18fb1-66aa-4089-b98f-50b8a49800ff => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:46 np0005593234 nova_compute[227762]: 2026-01-23 10:02:46.588 227766 INFO nova.compute.manager [-] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:02:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:46.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:46 np0005593234 nova_compute[227762]: 2026-01-23 10:02:46.614 227766 DEBUG nova.compute.manager [None req-3c73f98c-dac3-4e02-81dc-23e238818b7a - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:46 np0005593234 nova_compute[227762]: 2026-01-23 10:02:46.617 227766 DEBUG nova.compute.manager [None req-3c73f98c-dac3-4e02-81dc-23e238818b7a - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:02:46 np0005593234 nova_compute[227762]: 2026-01-23 10:02:46.945 227766 INFO nova.compute.manager [None req-3c73f98c-dac3-4e02-81dc-23e238818b7a - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 23 05:02:47 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:47Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1e:ca:17 10.100.0.9
Jan 23 05:02:47 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:47Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1e:ca:17 10.100.0.9
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.267 227766 DEBUG nova.network.neutron [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Updating instance_info_cache with network_info: [{"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.532 227766 DEBUG oslo_concurrency.lockutils [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Releasing lock "refresh_cache-61f18fb1-66aa-4089-b98f-50b8a49800ff" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.670 227766 INFO nova.virt.libvirt.driver [-] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Instance destroyed successfully.#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.671 227766 DEBUG nova.objects.instance [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lazy-loading 'numa_topology' on Instance uuid 61f18fb1-66aa-4089-b98f-50b8a49800ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:47 np0005593234 kernel: tape14a3386-c7 (unregistering): left promiscuous mode
Jan 23 05:02:47 np0005593234 NetworkManager[48942]: <info>  [1769162567.6797] device (tape14a3386-c7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.691 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:47 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:47Z|00370|binding|INFO|Releasing lport e14a3386-c770-46be-bafc-0418fa8274a7 from this chassis (sb_readonly=0)
Jan 23 05:02:47 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:47Z|00371|binding|INFO|Setting lport e14a3386-c770-46be-bafc-0418fa8274a7 down in Southbound
Jan 23 05:02:47 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:47Z|00372|binding|INFO|Removing iface tape14a3386-c7 ovn-installed in OVS
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.693 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.708 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.740 227766 DEBUG nova.objects.instance [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lazy-loading 'resources' on Instance uuid 61f18fb1-66aa-4089-b98f-50b8a49800ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:47 np0005593234 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000065.scope: Deactivated successfully.
Jan 23 05:02:47 np0005593234 systemd[1]: machine-qemu\x2d41\x2dinstance\x2d00000065.scope: Consumed 13.680s CPU time.
Jan 23 05:02:47 np0005593234 systemd-machined[195626]: Machine qemu-41-instance-00000065 terminated.
Jan 23 05:02:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:47.847 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:95:7c 10.100.0.7'], port_security=['fa:16:3e:b2:95:7c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f4412d8b-963a-4c3a-accf-68f8cf82c864', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c16cd713fa74a88b43e4edf01c273bd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1b3b2b26-a9c9-438c-b14e-9fddf18d8ea5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c3aed5f-30b8-4c57-808e-87764ab67fc8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=e14a3386-c770-46be-bafc-0418fa8274a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:02:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:47.848 144381 INFO neutron.agent.ovn.metadata.agent [-] Port e14a3386-c770-46be-bafc-0418fa8274a7 in datapath 8575e824-4be0-4206-873e-2f9a3d1ded0b unbound from our chassis#033[00m
Jan 23 05:02:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:47.849 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8575e824-4be0-4206-873e-2f9a3d1ded0b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:02:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:47.852 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b9311271-6f92-4ca7-b31c-ccc20ec25bb5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:47.853 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b namespace which is not needed anymore#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.870 227766 DEBUG nova.objects.instance [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'trusted_certs' on Instance uuid b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.929 227766 DEBUG nova.virt.libvirt.vif [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:01:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1972148084',display_name='tempest-ListServerFiltersTestJSON-instance-1972148084',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1972148084',id=98,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:02:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='0f5ca0233c1a490aa2d596b88a0ec503',ramdisk_id='',reservation_id='r-l9i0gz1m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1524131674',owner_user_name='tempest-ListServerFiltersTestJSON-1524131674-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:02:32Z,user_data=None,user_id='c09e682996b940dc97c866f9e4f1e74e',uuid=61f18fb1-66aa-4089-b98f-50b8a49800ff,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.931 227766 DEBUG nova.network.os_vif_util [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converting VIF {"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.932 227766 DEBUG nova.network.os_vif_util [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:e7,bridge_name='br-int',has_traffic_filtering=True,id=c9c463b9-3793-44a9-9773-69cc1638096d,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9c463b9-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.934 227766 DEBUG os_vif [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:e7,bridge_name='br-int',has_traffic_filtering=True,id=c9c463b9-3793-44a9-9773-69cc1638096d,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9c463b9-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.936 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.937 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc9c463b9-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.942 227766 DEBUG nova.compute.manager [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.946 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.949 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.953 227766 INFO os_vif [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:e7,bridge_name='br-int',has_traffic_filtering=True,id=c9c463b9-3793-44a9-9773-69cc1638096d,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9c463b9-37')#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.961 227766 DEBUG nova.virt.libvirt.driver [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Start _get_guest_xml network_info=[{"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None 
block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.965 227766 WARNING nova.virt.libvirt.driver [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.975 227766 DEBUG nova.virt.libvirt.host [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.976 227766 DEBUG nova.virt.libvirt.host [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.983 227766 DEBUG nova.virt.libvirt.host [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.985 227766 DEBUG nova.virt.libvirt.host [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.987 227766 DEBUG nova.virt.libvirt.driver [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.987 227766 DEBUG nova.virt.hardware [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.988 227766 DEBUG nova.virt.hardware [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.988 227766 DEBUG nova.virt.hardware [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.988 227766 DEBUG nova.virt.hardware [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.988 227766 DEBUG nova.virt.hardware [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.989 227766 DEBUG nova.virt.hardware [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.989 227766 DEBUG nova.virt.hardware [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.989 227766 DEBUG nova.virt.hardware [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.990 227766 DEBUG nova.virt.hardware [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.990 227766 DEBUG nova.virt.hardware [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.990 227766 DEBUG nova.virt.hardware [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:02:47 np0005593234 nova_compute[227762]: 2026-01-23 10:02:47.991 227766 DEBUG nova.objects.instance [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 61f18fb1-66aa-4089-b98f-50b8a49800ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.027 227766 DEBUG oslo_concurrency.processutils [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.108 227766 DEBUG nova.objects.instance [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'pci_requests' on Instance uuid b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:48 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[272775]: [NOTICE]   (272779) : haproxy version is 2.8.14-c23fe91
Jan 23 05:02:48 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[272775]: [NOTICE]   (272779) : path to executable is /usr/sbin/haproxy
Jan 23 05:02:48 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[272775]: [WARNING]  (272779) : Exiting Master process...
Jan 23 05:02:48 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[272775]: [ALERT]    (272779) : Current worker (272781) exited with code 143 (Terminated)
Jan 23 05:02:48 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[272775]: [WARNING]  (272779) : All workers exited. Exiting... (0)
Jan 23 05:02:48 np0005593234 systemd[1]: libpod-3f0f2ce3679d50782ae1e3231f3862228cbc8e4df7772790fa49a4769e128399.scope: Deactivated successfully.
Jan 23 05:02:48 np0005593234 podman[273213]: 2026-01-23 10:02:48.197184807 +0000 UTC m=+0.224685680 container died 3f0f2ce3679d50782ae1e3231f3862228cbc8e4df7772790fa49a4769e128399 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.207 227766 DEBUG nova.objects.instance [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'pci_devices' on Instance uuid b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:48.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.380 227766 DEBUG nova.objects.instance [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'resources' on Instance uuid b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.385 227766 INFO nova.virt.libvirt.driver [None req-8f08cb91-3952-45f1-b479-23f9508ba6be 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Instance shutdown successfully after 13 seconds.#033[00m
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.390 227766 INFO nova.virt.libvirt.driver [-] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Instance destroyed successfully.#033[00m
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.390 227766 DEBUG nova.objects.instance [None req-8f08cb91-3952-45f1-b479-23f9508ba6be 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'numa_topology' on Instance uuid f4412d8b-963a-4c3a-accf-68f8cf82c864 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.416 227766 DEBUG nova.objects.instance [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'migration_context' on Instance uuid b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.442 227766 DEBUG nova.compute.manager [None req-8f08cb91-3952-45f1-b479-23f9508ba6be 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.453 227766 DEBUG nova.objects.instance [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.455 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:02:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:02:48 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2840457785' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.479 227766 DEBUG oslo_concurrency.processutils [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.518 227766 DEBUG oslo_concurrency.processutils [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:48 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3f0f2ce3679d50782ae1e3231f3862228cbc8e4df7772790fa49a4769e128399-userdata-shm.mount: Deactivated successfully.
Jan 23 05:02:48 np0005593234 systemd[1]: var-lib-containers-storage-overlay-96c16d6f60a757b8178febb43fecedbb8057a56c9c64c0c5d7b2cdab7b854224-merged.mount: Deactivated successfully.
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.563 227766 DEBUG oslo_concurrency.lockutils [None req-8f08cb91-3952-45f1-b479-23f9508ba6be 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 13.340s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:48.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:48 np0005593234 podman[273213]: 2026-01-23 10:02:48.624966238 +0000 UTC m=+0.652467091 container cleanup 3f0f2ce3679d50782ae1e3231f3862228cbc8e4df7772790fa49a4769e128399 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:02:48 np0005593234 systemd[1]: libpod-conmon-3f0f2ce3679d50782ae1e3231f3862228cbc8e4df7772790fa49a4769e128399.scope: Deactivated successfully.
Jan 23 05:02:48 np0005593234 podman[273283]: 2026-01-23 10:02:48.910432534 +0000 UTC m=+0.261490518 container remove 3f0f2ce3679d50782ae1e3231f3862228cbc8e4df7772790fa49a4769e128399 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:02:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:48.917 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[82c7f082-08a9-4a74-9d55-09aef73cc5f7]: (4, ('Fri Jan 23 10:02:47 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b (3f0f2ce3679d50782ae1e3231f3862228cbc8e4df7772790fa49a4769e128399)\n3f0f2ce3679d50782ae1e3231f3862228cbc8e4df7772790fa49a4769e128399\nFri Jan 23 10:02:48 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b (3f0f2ce3679d50782ae1e3231f3862228cbc8e4df7772790fa49a4769e128399)\n3f0f2ce3679d50782ae1e3231f3862228cbc8e4df7772790fa49a4769e128399\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:48.919 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fe6eedc4-1683-40c9-91df-508d9f00a5f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:48.920 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8575e824-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.922 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:48 np0005593234 kernel: tap8575e824-40: left promiscuous mode
Jan 23 05:02:48 np0005593234 nova_compute[227762]: 2026-01-23 10:02:48.941 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:48.943 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[47c3498e-c09e-4bd0-895e-6962f0a7c95a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:02:48 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2115337883' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:02:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:48.964 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2ac609a2-bf7d-48bb-be14-104887672907]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:48.965 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c5495247-edaf-4dde-a5c3-c9d7421cbed4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:48.980 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d994bf17-5aba-4699-ac1f-6916d4f858f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 644195, 'reachable_time': 22645, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273319, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:48 np0005593234 systemd[1]: run-netns-ovnmeta\x2d8575e824\x2d4be0\x2d4206\x2d873e\x2d2f9a3d1ded0b.mount: Deactivated successfully.
Jan 23 05:02:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:48.984 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:02:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:48.984 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[4e6f451b-4b54-4a52-96b0-aca5835bd0c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.344 227766 DEBUG oslo_concurrency.processutils [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.827s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.346 227766 DEBUG nova.virt.libvirt.vif [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:01:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1972148084',display_name='tempest-ListServerFiltersTestJSON-instance-1972148084',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1972148084',id=98,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:02:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='0f5ca0233c1a490aa2d596b88a0ec503',ramdisk_id='',reservation_id='r-l9i0gz1m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',im
age_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1524131674',owner_user_name='tempest-ListServerFiltersTestJSON-1524131674-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:02:32Z,user_data=None,user_id='c09e682996b940dc97c866f9e4f1e74e',uuid=61f18fb1-66aa-4089-b98f-50b8a49800ff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.346 227766 DEBUG nova.network.os_vif_util [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converting VIF {"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.347 227766 DEBUG nova.network.os_vif_util [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:e7,bridge_name='br-int',has_traffic_filtering=True,id=c9c463b9-3793-44a9-9773-69cc1638096d,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9c463b9-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.348 227766 DEBUG nova.objects.instance [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lazy-loading 'pci_devices' on Instance uuid 61f18fb1-66aa-4089-b98f-50b8a49800ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.374 227766 DEBUG nova.virt.libvirt.driver [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  <uuid>61f18fb1-66aa-4089-b98f-50b8a49800ff</uuid>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  <name>instance-00000062</name>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <nova:name>tempest-ListServerFiltersTestJSON-instance-1972148084</nova:name>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:02:47</nova:creationTime>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <nova:user uuid="c09e682996b940dc97c866f9e4f1e74e">tempest-ListServerFiltersTestJSON-1524131674-project-member</nova:user>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <nova:project uuid="0f5ca0233c1a490aa2d596b88a0ec503">tempest-ListServerFiltersTestJSON-1524131674</nova:project>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <nova:port uuid="c9c463b9-3793-44a9-9773-69cc1638096d">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <entry name="serial">61f18fb1-66aa-4089-b98f-50b8a49800ff</entry>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <entry name="uuid">61f18fb1-66aa-4089-b98f-50b8a49800ff</entry>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/61f18fb1-66aa-4089-b98f-50b8a49800ff_disk">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/61f18fb1-66aa-4089-b98f-50b8a49800ff_disk.config">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:2d:54:e7"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <target dev="tapc9c463b9-37"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/61f18fb1-66aa-4089-b98f-50b8a49800ff/console.log" append="off"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <input type="keyboard" bus="usb"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:02:49 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:02:49 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:02:49 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:02:49 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.375 227766 DEBUG nova.virt.libvirt.driver [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] skipping disk for instance-00000062 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.375 227766 DEBUG nova.virt.libvirt.driver [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] skipping disk for instance-00000062 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.376 227766 DEBUG nova.virt.libvirt.vif [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:01:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1972148084',display_name='tempest-ListServerFiltersTestJSON-instance-1972148084',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1972148084',id=98,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:02:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='0f5ca0233c1a490aa2d596b88a0ec503',ramdisk_id='',reservation_id='r-l9i0gz1m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='
virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1524131674',owner_user_name='tempest-ListServerFiltersTestJSON-1524131674-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:02:32Z,user_data=None,user_id='c09e682996b940dc97c866f9e4f1e74e',uuid=61f18fb1-66aa-4089-b98f-50b8a49800ff,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.376 227766 DEBUG nova.network.os_vif_util [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converting VIF {"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.377 227766 DEBUG nova.network.os_vif_util [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:e7,bridge_name='br-int',has_traffic_filtering=True,id=c9c463b9-3793-44a9-9773-69cc1638096d,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9c463b9-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.377 227766 DEBUG os_vif [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:e7,bridge_name='br-int',has_traffic_filtering=True,id=c9c463b9-3793-44a9-9773-69cc1638096d,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9c463b9-37') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.378 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.378 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.378 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.380 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.381 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc9c463b9-37, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.381 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc9c463b9-37, col_values=(('external_ids', {'iface-id': 'c9c463b9-3793-44a9-9773-69cc1638096d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:2d:54:e7', 'vm-uuid': '61f18fb1-66aa-4089-b98f-50b8a49800ff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:49 np0005593234 NetworkManager[48942]: <info>  [1769162569.3835] manager: (tapc9c463b9-37): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/194)
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.384 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.391 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.392 227766 INFO os_vif [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:e7,bridge_name='br-int',has_traffic_filtering=True,id=c9c463b9-3793-44a9-9773-69cc1638096d,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9c463b9-37')#033[00m
Jan 23 05:02:49 np0005593234 kernel: tapc9c463b9-37: entered promiscuous mode
Jan 23 05:02:49 np0005593234 systemd-udevd[273180]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:02:49 np0005593234 NetworkManager[48942]: <info>  [1769162569.4978] manager: (tapc9c463b9-37): new Tun device (/org/freedesktop/NetworkManager/Devices/195)
Jan 23 05:02:49 np0005593234 NetworkManager[48942]: <info>  [1769162569.5351] device (tapc9c463b9-37): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:02:49 np0005593234 NetworkManager[48942]: <info>  [1769162569.5357] device (tapc9c463b9-37): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:02:49 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:49Z|00373|binding|INFO|Claiming lport c9c463b9-3793-44a9-9773-69cc1638096d for this chassis.
Jan 23 05:02:49 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:49Z|00374|binding|INFO|c9c463b9-3793-44a9-9773-69cc1638096d: Claiming fa:16:3e:2d:54:e7 10.100.0.12
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.535 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.543 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:54:e7 10.100.0.12'], port_security=['fa:16:3e:2d:54:e7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '61f18fb1-66aa-4089-b98f-50b8a49800ff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0f5ca0233c1a490aa2d596b88a0ec503', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ad8a7362-692a-4044-8393-1c10014f8bab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=83406af9-ea42-4cda-96ee-b8c04ab0651a, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=c9c463b9-3793-44a9-9773-69cc1638096d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.544 144381 INFO neutron.agent.ovn.metadata.agent [-] Port c9c463b9-3793-44a9-9773-69cc1638096d in datapath 969bd83a-7542-46e3-90f0-1a81f26ba6b8 bound to our chassis#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.545 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 969bd83a-7542-46e3-90f0-1a81f26ba6b8#033[00m
Jan 23 05:02:49 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:49Z|00375|binding|INFO|Setting lport c9c463b9-3793-44a9-9773-69cc1638096d ovn-installed in OVS
Jan 23 05:02:49 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:49Z|00376|binding|INFO|Setting lport c9c463b9-3793-44a9-9773-69cc1638096d up in Southbound
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.555 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6ab43cfa-5e4c-4398-b696-030d1a8d198c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.556 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap969bd83a-71 in ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.557 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.558 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap969bd83a-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.558 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[95924fdc-b934-4b09-854c-6b15e560188a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.559 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[67adac01-f137-4f46-9dc9-a34fd13271b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.559 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:49 np0005593234 systemd-machined[195626]: New machine qemu-43-instance-00000062.
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.571 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[81fb7300-1bba-4075-be55-db3290b015d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 systemd[1]: Started Virtual Machine qemu-43-instance-00000062.
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.599 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[36ba8947-d408-43a7-ba8d-d6ef2d3b81d6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.629 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[1ca91bab-7447-4d27-8ba1-5768f835d92d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.635 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[62de6881-2d80-4bef-bc23-02be85115cc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 NetworkManager[48942]: <info>  [1769162569.6362] manager: (tap969bd83a-70): new Veth device (/org/freedesktop/NetworkManager/Devices/196)
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.666 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[25471b73-5ea0-46f7-80bb-a89cf360712b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.669 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c965d2-b39b-46eb-afb3-644459b0e8ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 NetworkManager[48942]: <info>  [1769162569.6912] device (tap969bd83a-70): carrier: link connected
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.703 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d49eb8c7-68fa-467b-80b8-dee7b644576a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.720 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d98e5b72-1db3-4597-b682-7973e220e6e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap969bd83a-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:fe:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 646465, 'reachable_time': 24091, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273373, 'error': None, 'target': 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.735 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1173690e-79e6-4cf8-a74d-85e20e40f8b6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:fef5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 646465, 'tstamp': 646465}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 273374, 'error': None, 'target': 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.753 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[57a38eab-26c9-4814-8890-0f1950669022]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap969bd83a-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:fe:f5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 120], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 646465, 'reachable_time': 24091, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 273375, 'error': None, 'target': 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.780 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[86548a28-ca5b-4744-b39f-e93aa25b2314]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.830 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd0144a-e010-42c1-b5de-4b007b54c2a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.831 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap969bd83a-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.831 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.832 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap969bd83a-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:49 np0005593234 NetworkManager[48942]: <info>  [1769162569.8340] manager: (tap969bd83a-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/197)
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.834 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:49 np0005593234 kernel: tap969bd83a-70: entered promiscuous mode
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.837 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.838 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap969bd83a-70, col_values=(('external_ids', {'iface-id': '9ee89271-3ee7-4672-8800-56bb900c4dd0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.839 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:49 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:49Z|00377|binding|INFO|Releasing lport 9ee89271-3ee7-4672-8800-56bb900c4dd0 from this chassis (sb_readonly=0)
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.855 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:49 np0005593234 nova_compute[227762]: 2026-01-23 10:02:49.859 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.861 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/969bd83a-7542-46e3-90f0-1a81f26ba6b8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/969bd83a-7542-46e3-90f0-1a81f26ba6b8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.862 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ef66a1e4-50c7-493c-99ac-130302dd7e1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.863 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-969bd83a-7542-46e3-90f0-1a81f26ba6b8
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/969bd83a-7542-46e3-90f0-1a81f26ba6b8.pid.haproxy
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 969bd83a-7542-46e3-90f0-1a81f26ba6b8
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:02:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:49.863 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'env', 'PROCESS_TAG=haproxy-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/969bd83a-7542-46e3-90f0-1a81f26ba6b8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.260 227766 DEBUG nova.compute.manager [req-3e6a5b3d-ad0d-4692-b84d-f5587ec7cf53 req-fcba8483-dde0-4f68-8b2e-7d166cae1770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received event network-vif-unplugged-e14a3386-c770-46be-bafc-0418fa8274a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.261 227766 DEBUG oslo_concurrency.lockutils [req-3e6a5b3d-ad0d-4692-b84d-f5587ec7cf53 req-fcba8483-dde0-4f68-8b2e-7d166cae1770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.262 227766 DEBUG oslo_concurrency.lockutils [req-3e6a5b3d-ad0d-4692-b84d-f5587ec7cf53 req-fcba8483-dde0-4f68-8b2e-7d166cae1770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.262 227766 DEBUG oslo_concurrency.lockutils [req-3e6a5b3d-ad0d-4692-b84d-f5587ec7cf53 req-fcba8483-dde0-4f68-8b2e-7d166cae1770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.262 227766 DEBUG nova.compute.manager [req-3e6a5b3d-ad0d-4692-b84d-f5587ec7cf53 req-fcba8483-dde0-4f68-8b2e-7d166cae1770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] No waiting events found dispatching network-vif-unplugged-e14a3386-c770-46be-bafc-0418fa8274a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.262 227766 WARNING nova.compute.manager [req-3e6a5b3d-ad0d-4692-b84d-f5587ec7cf53 req-fcba8483-dde0-4f68-8b2e-7d166cae1770 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received unexpected event network-vif-unplugged-e14a3386-c770-46be-bafc-0418fa8274a7 for instance with vm_state stopped and task_state None.#033[00m
Jan 23 05:02:50 np0005593234 podman[273436]: 2026-01-23 10:02:50.189228305 +0000 UTC m=+0.024338540 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:02:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:50.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.436 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162570.4363027, 61f18fb1-66aa-4089-b98f-50b8a49800ff => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.437 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.439 227766 DEBUG nova.compute.manager [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.442 227766 INFO nova.virt.libvirt.driver [-] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Instance rebooted successfully.#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.442 227766 DEBUG nova.compute.manager [None req-a1aaf40e-7102-461c-b094-4f4a3a199a18 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.459 227766 DEBUG nova.compute.manager [req-1259b03b-7bbe-4679-ab66-096cf5acccc9 req-2cc08814-383c-46d9-812c-76c3d5e53f34 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received event network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.459 227766 DEBUG oslo_concurrency.lockutils [req-1259b03b-7bbe-4679-ab66-096cf5acccc9 req-2cc08814-383c-46d9-812c-76c3d5e53f34 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.459 227766 DEBUG oslo_concurrency.lockutils [req-1259b03b-7bbe-4679-ab66-096cf5acccc9 req-2cc08814-383c-46d9-812c-76c3d5e53f34 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.460 227766 DEBUG oslo_concurrency.lockutils [req-1259b03b-7bbe-4679-ab66-096cf5acccc9 req-2cc08814-383c-46d9-812c-76c3d5e53f34 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.460 227766 DEBUG nova.compute.manager [req-1259b03b-7bbe-4679-ab66-096cf5acccc9 req-2cc08814-383c-46d9-812c-76c3d5e53f34 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] No waiting events found dispatching network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.460 227766 WARNING nova.compute.manager [req-1259b03b-7bbe-4679-ab66-096cf5acccc9 req-2cc08814-383c-46d9-812c-76c3d5e53f34 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received unexpected event network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.460 227766 DEBUG nova.compute.manager [req-1259b03b-7bbe-4679-ab66-096cf5acccc9 req-2cc08814-383c-46d9-812c-76c3d5e53f34 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received event network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.460 227766 DEBUG oslo_concurrency.lockutils [req-1259b03b-7bbe-4679-ab66-096cf5acccc9 req-2cc08814-383c-46d9-812c-76c3d5e53f34 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.460 227766 DEBUG oslo_concurrency.lockutils [req-1259b03b-7bbe-4679-ab66-096cf5acccc9 req-2cc08814-383c-46d9-812c-76c3d5e53f34 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.461 227766 DEBUG oslo_concurrency.lockutils [req-1259b03b-7bbe-4679-ab66-096cf5acccc9 req-2cc08814-383c-46d9-812c-76c3d5e53f34 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.461 227766 DEBUG nova.compute.manager [req-1259b03b-7bbe-4679-ab66-096cf5acccc9 req-2cc08814-383c-46d9-812c-76c3d5e53f34 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] No waiting events found dispatching network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.461 227766 WARNING nova.compute.manager [req-1259b03b-7bbe-4679-ab66-096cf5acccc9 req-2cc08814-383c-46d9-812c-76c3d5e53f34 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received unexpected event network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.501 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.504 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.582 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.583 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162570.4390802, 61f18fb1-66aa-4089-b98f-50b8a49800ff => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.583 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] VM Started (Lifecycle Event)#033[00m
Jan 23 05:02:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:02:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:50.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.617 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:50 np0005593234 nova_compute[227762]: 2026-01-23 10:02:50.621 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:02:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:51 np0005593234 podman[273436]: 2026-01-23 10:02:51.021968606 +0000 UTC m=+0.857078821 container create 8eb487a49a2e5dfe2d1a612d05d46dbf8e6837284e175b861cb14e18ae1c9b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:02:51 np0005593234 systemd[1]: Started libpod-conmon-8eb487a49a2e5dfe2d1a612d05d46dbf8e6837284e175b861cb14e18ae1c9b2b.scope.
Jan 23 05:02:51 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:02:51 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/46e3a1bc2f3618bad9c838c505c71c387248f73f4a4644cd13c180e4cb992d97/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:02:51 np0005593234 podman[273436]: 2026-01-23 10:02:51.15275459 +0000 UTC m=+0.987864825 container init 8eb487a49a2e5dfe2d1a612d05d46dbf8e6837284e175b861cb14e18ae1c9b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 05:02:51 np0005593234 podman[273436]: 2026-01-23 10:02:51.160119271 +0000 UTC m=+0.995229486 container start 8eb487a49a2e5dfe2d1a612d05d46dbf8e6837284e175b861cb14e18ae1c9b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 05:02:51 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[273470]: [NOTICE]   (273474) : New worker (273476) forked
Jan 23 05:02:51 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[273470]: [NOTICE]   (273474) : Loading success.
Jan 23 05:02:51 np0005593234 nova_compute[227762]: 2026-01-23 10:02:51.223 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:52.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:52 np0005593234 nova_compute[227762]: 2026-01-23 10:02:52.583 227766 DEBUG nova.compute.manager [req-04377cd8-41b7-4a33-a698-adfb2d93e564 req-4f220a44-09f2-4cb0-9b54-22d10f41e30a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received event network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:52 np0005593234 nova_compute[227762]: 2026-01-23 10:02:52.583 227766 DEBUG oslo_concurrency.lockutils [req-04377cd8-41b7-4a33-a698-adfb2d93e564 req-4f220a44-09f2-4cb0-9b54-22d10f41e30a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:52 np0005593234 nova_compute[227762]: 2026-01-23 10:02:52.584 227766 DEBUG oslo_concurrency.lockutils [req-04377cd8-41b7-4a33-a698-adfb2d93e564 req-4f220a44-09f2-4cb0-9b54-22d10f41e30a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:52 np0005593234 nova_compute[227762]: 2026-01-23 10:02:52.584 227766 DEBUG oslo_concurrency.lockutils [req-04377cd8-41b7-4a33-a698-adfb2d93e564 req-4f220a44-09f2-4cb0-9b54-22d10f41e30a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:52 np0005593234 kernel: tap4ca9e3cb-f7 (unregistering): left promiscuous mode
Jan 23 05:02:52 np0005593234 nova_compute[227762]: 2026-01-23 10:02:52.584 227766 DEBUG nova.compute.manager [req-04377cd8-41b7-4a33-a698-adfb2d93e564 req-4f220a44-09f2-4cb0-9b54-22d10f41e30a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] No waiting events found dispatching network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:02:52 np0005593234 nova_compute[227762]: 2026-01-23 10:02:52.585 227766 WARNING nova.compute.manager [req-04377cd8-41b7-4a33-a698-adfb2d93e564 req-4f220a44-09f2-4cb0-9b54-22d10f41e30a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received unexpected event network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 for instance with vm_state stopped and task_state rebuilding.#033[00m
Jan 23 05:02:52 np0005593234 NetworkManager[48942]: <info>  [1769162572.5917] device (tap4ca9e3cb-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:02:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:52Z|00378|binding|INFO|Releasing lport 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 from this chassis (sb_readonly=0)
Jan 23 05:02:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:52Z|00379|binding|INFO|Setting lport 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 down in Southbound
Jan 23 05:02:52 np0005593234 nova_compute[227762]: 2026-01-23 10:02:52.600 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:02:52Z|00380|binding|INFO|Removing iface tap4ca9e3cb-f7 ovn-installed in OVS
Jan 23 05:02:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:52.610 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:ca:17 10.100.0.9'], port_security=['fa:16:3e:1e:ca:17 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b0b5a1b2-04bd-48e0-a0f7-0c679d784e04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '323fc591-4197-401d-b3c4-392a8ca4598f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:02:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:52.611 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c unbound from our chassis#033[00m
Jan 23 05:02:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:52.613 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee03d7c9-e107-41bf-95cc-5508578ad66c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:02:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:52.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:52.613 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e7647686-7363-41ae-9084-3bdcd37b16f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:52.616 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace which is not needed anymore#033[00m
Jan 23 05:02:52 np0005593234 nova_compute[227762]: 2026-01-23 10:02:52.617 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:52 np0005593234 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000066.scope: Deactivated successfully.
Jan 23 05:02:52 np0005593234 systemd[1]: machine-qemu\x2d42\x2dinstance\x2d00000066.scope: Consumed 14.442s CPU time.
Jan 23 05:02:52 np0005593234 systemd-machined[195626]: Machine qemu-42-instance-00000066 terminated.
Jan 23 05:02:52 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[272895]: [NOTICE]   (272899) : haproxy version is 2.8.14-c23fe91
Jan 23 05:02:52 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[272895]: [NOTICE]   (272899) : path to executable is /usr/sbin/haproxy
Jan 23 05:02:52 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[272895]: [WARNING]  (272899) : Exiting Master process...
Jan 23 05:02:52 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[272895]: [WARNING]  (272899) : Exiting Master process...
Jan 23 05:02:52 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[272895]: [ALERT]    (272899) : Current worker (272901) exited with code 143 (Terminated)
Jan 23 05:02:52 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[272895]: [WARNING]  (272899) : All workers exited. Exiting... (0)
Jan 23 05:02:52 np0005593234 systemd[1]: libpod-efe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e.scope: Deactivated successfully.
Jan 23 05:02:52 np0005593234 conmon[272895]: conmon efe0e5a337105308c3c3 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-efe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e.scope/container/memory.events
Jan 23 05:02:52 np0005593234 podman[273511]: 2026-01-23 10:02:52.742792104 +0000 UTC m=+0.044765690 container died efe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:02:52 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-efe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e-userdata-shm.mount: Deactivated successfully.
Jan 23 05:02:52 np0005593234 systemd[1]: var-lib-containers-storage-overlay-3652240f2f964440a934dcc5425fdca080a5c36fffc91adca2e8b33289443689-merged.mount: Deactivated successfully.
Jan 23 05:02:52 np0005593234 podman[273511]: 2026-01-23 10:02:52.778059986 +0000 UTC m=+0.080033582 container cleanup efe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:02:52 np0005593234 systemd[1]: libpod-conmon-efe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e.scope: Deactivated successfully.
Jan 23 05:02:52 np0005593234 podman[273541]: 2026-01-23 10:02:52.860319844 +0000 UTC m=+0.052273793 container remove efe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:02:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:52.868 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[02cee02b-04b4-4abb-a95a-bf5f147376f4]: (4, ('Fri Jan 23 10:02:52 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (efe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e)\nefe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e\nFri Jan 23 10:02:52 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (efe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e)\nefe0e5a337105308c3c3920b5b380f2c319813430f6e13420b23fb416df3102e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:52.870 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6555b2fa-2454-4cba-b6ba-e6c89cb67453]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:52.871 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:52 np0005593234 kernel: tapee03d7c9-e0: left promiscuous mode
Jan 23 05:02:52 np0005593234 nova_compute[227762]: 2026-01-23 10:02:52.873 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:52 np0005593234 nova_compute[227762]: 2026-01-23 10:02:52.890 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:52.892 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cf959a0e-98d1-4cd3-8545-11c0f0522826]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:52.907 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4edfd710-f5ae-49cf-920f-96a204f5427b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:52.908 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c4a92f-ee71-4ee6-bf26-ac5d5a607d6c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:52.921 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2e1f27dd-a2c5-4856-9da0-dbfa42a3df7e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 644299, 'reachable_time': 35774, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 273574, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:52 np0005593234 systemd[1]: run-netns-ovnmeta\x2dee03d7c9\x2de107\x2d41bf\x2d95cc\x2d5508578ad66c.mount: Deactivated successfully.
Jan 23 05:02:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:52.926 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:02:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:02:52.926 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[0873d92c-2880-42e8-9697-0f5a508daf43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.084 227766 INFO nova.compute.manager [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Rebuilding instance#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.478 227766 INFO nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Instance shutdown successfully after 5 seconds.#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.483 227766 INFO nova.virt.libvirt.driver [-] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Instance destroyed successfully.#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.487 227766 INFO nova.virt.libvirt.driver [-] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Instance destroyed successfully.#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.489 227766 DEBUG nova.virt.libvirt.vif [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:02:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-998112874',display_name='tempest-ServerActionsTestJSON-server-832385645',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-998112874',id=102,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:02:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-fjoe9sqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:02:45Z,user_data=None,user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=b0b5a1b2-04bd-48e0-a0f7-0c679d784e04,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.489 227766 DEBUG nova.network.os_vif_util [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.490 227766 DEBUG nova.network.os_vif_util [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ca:17,bridge_name='br-int',has_traffic_filtering=True,id=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca9e3cb-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.490 227766 DEBUG os_vif [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ca:17,bridge_name='br-int',has_traffic_filtering=True,id=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca9e3cb-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.492 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.493 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ca9e3cb-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.494 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.496 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.498 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.500 227766 INFO os_vif [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ca:17,bridge_name='br-int',has_traffic_filtering=True,id=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca9e3cb-f7')#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.840 227766 DEBUG nova.objects.instance [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'trusted_certs' on Instance uuid f4412d8b-963a-4c3a-accf-68f8cf82c864 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:53 np0005593234 nova_compute[227762]: 2026-01-23 10:02:53.874 227766 DEBUG nova.compute.manager [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.079 227766 DEBUG nova.objects.instance [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'pci_requests' on Instance uuid f4412d8b-963a-4c3a-accf-68f8cf82c864 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.098 227766 DEBUG nova.objects.instance [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'pci_devices' on Instance uuid f4412d8b-963a-4c3a-accf-68f8cf82c864 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.120 227766 DEBUG nova.objects.instance [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'resources' on Instance uuid f4412d8b-963a-4c3a-accf-68f8cf82c864 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.146 227766 DEBUG nova.objects.instance [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'migration_context' on Instance uuid f4412d8b-963a-4c3a-accf-68f8cf82c864 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.174 227766 DEBUG nova.objects.instance [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.177 227766 INFO nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Instance already shutdown.#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.182 227766 INFO nova.virt.libvirt.driver [-] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Instance destroyed successfully.#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.187 227766 INFO nova.virt.libvirt.driver [-] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Instance destroyed successfully.#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.188 227766 DEBUG nova.virt.libvirt.vif [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:02:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-191757669',display_name='tempest-tempest.common.compute-instance-191757669',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-191757669',id=101,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:02:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='8c16cd713fa74a88b43e4edf01c273bd',ramdisk_id='',reservation_id='r-lj8pfndu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-882763067',owner_user_name='tempest-ServerActionsTestOtherA-882763067-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:02:51Z,user_data=None,user_id='29710db389c842df836944048225740f',uuid=f4412d8b-963a-4c3a-accf-68f8cf82c864,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.188 227766 DEBUG nova.network.os_vif_util [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converting VIF {"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.189 227766 DEBUG nova.network.os_vif_util [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:95:7c,bridge_name='br-int',has_traffic_filtering=True,id=e14a3386-c770-46be-bafc-0418fa8274a7,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14a3386-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.189 227766 DEBUG os_vif [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:95:7c,bridge_name='br-int',has_traffic_filtering=True,id=e14a3386-c770-46be-bafc-0418fa8274a7,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14a3386-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.191 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.191 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape14a3386-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.193 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.194 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.196 227766 INFO os_vif [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:95:7c,bridge_name='br-int',has_traffic_filtering=True,id=e14a3386-c770-46be-bafc-0418fa8274a7,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14a3386-c7')#033[00m
Jan 23 05:02:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:54.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:02:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:54.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.754 227766 DEBUG nova.compute.manager [req-6bfd0942-b124-4e9c-b638-4d80933a8827 req-210d4a42-95d6-4199-b6a6-c322c046f7bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received event network-vif-unplugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.755 227766 DEBUG oslo_concurrency.lockutils [req-6bfd0942-b124-4e9c-b638-4d80933a8827 req-210d4a42-95d6-4199-b6a6-c322c046f7bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.755 227766 DEBUG oslo_concurrency.lockutils [req-6bfd0942-b124-4e9c-b638-4d80933a8827 req-210d4a42-95d6-4199-b6a6-c322c046f7bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.755 227766 DEBUG oslo_concurrency.lockutils [req-6bfd0942-b124-4e9c-b638-4d80933a8827 req-210d4a42-95d6-4199-b6a6-c322c046f7bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.755 227766 DEBUG nova.compute.manager [req-6bfd0942-b124-4e9c-b638-4d80933a8827 req-210d4a42-95d6-4199-b6a6-c322c046f7bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] No waiting events found dispatching network-vif-unplugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.755 227766 WARNING nova.compute.manager [req-6bfd0942-b124-4e9c-b638-4d80933a8827 req-210d4a42-95d6-4199-b6a6-c322c046f7bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received unexpected event network-vif-unplugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.756 227766 DEBUG nova.compute.manager [req-6bfd0942-b124-4e9c-b638-4d80933a8827 req-210d4a42-95d6-4199-b6a6-c322c046f7bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received event network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.756 227766 DEBUG oslo_concurrency.lockutils [req-6bfd0942-b124-4e9c-b638-4d80933a8827 req-210d4a42-95d6-4199-b6a6-c322c046f7bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.756 227766 DEBUG oslo_concurrency.lockutils [req-6bfd0942-b124-4e9c-b638-4d80933a8827 req-210d4a42-95d6-4199-b6a6-c322c046f7bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.756 227766 DEBUG oslo_concurrency.lockutils [req-6bfd0942-b124-4e9c-b638-4d80933a8827 req-210d4a42-95d6-4199-b6a6-c322c046f7bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.756 227766 DEBUG nova.compute.manager [req-6bfd0942-b124-4e9c-b638-4d80933a8827 req-210d4a42-95d6-4199-b6a6-c322c046f7bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] No waiting events found dispatching network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:02:54 np0005593234 nova_compute[227762]: 2026-01-23 10:02:54.756 227766 WARNING nova.compute.manager [req-6bfd0942-b124-4e9c-b638-4d80933a8827 req-210d4a42-95d6-4199-b6a6-c322c046f7bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received unexpected event network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 23 05:02:54 np0005593234 podman[273614]: 2026-01-23 10:02:54.786454555 +0000 UTC m=+0.074149836 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:02:55 np0005593234 nova_compute[227762]: 2026-01-23 10:02:55.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:02:55 np0005593234 nova_compute[227762]: 2026-01-23 10:02:55.746 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:55 np0005593234 nova_compute[227762]: 2026-01-23 10:02:55.747 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:55 np0005593234 nova_compute[227762]: 2026-01-23 10:02:55.747 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:55 np0005593234 nova_compute[227762]: 2026-01-23 10:02:55.748 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:55 np0005593234 nova_compute[227762]: 2026-01-23 10:02:55.748 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:55 np0005593234 nova_compute[227762]: 2026-01-23 10:02:55.749 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:02:55 np0005593234 nova_compute[227762]: 2026-01-23 10:02:55.806 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 23 05:02:55 np0005593234 nova_compute[227762]: 2026-01-23 10:02:55.807 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Image id ae1f9e37-418c-462f-81d1-3599a6d89de9 yields fingerprint 8edc4c18d7d1964a485fb1b305c460bdc5a45b20 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 23 05:02:55 np0005593234 nova_compute[227762]: 2026-01-23 10:02:55.807 227766 INFO nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] image ae1f9e37-418c-462f-81d1-3599a6d89de9 at (/var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20): checking
Jan 23 05:02:55 np0005593234 nova_compute[227762]: 2026-01-23 10:02:55.807 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] image ae1f9e37-418c-462f-81d1-3599a6d89de9 at (/var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279
Jan 23 05:02:55 np0005593234 nova_compute[227762]: 2026-01-23 10:02:55.809 227766 INFO oslo.privsep.daemon [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpnpom2x_6/privsep.sock']
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.124 227766 INFO nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Deleting instance files /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_del
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.125 227766 INFO nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Deletion of /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_del complete
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.225 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:02:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:02:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:56.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.373 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.374 227766 INFO nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Creating image(s)
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.400 227766 DEBUG nova.storage.rbd_utils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.431 227766 DEBUG nova.storage.rbd_utils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.467 227766 DEBUG nova.storage.rbd_utils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.478 227766 DEBUG oslo_concurrency.processutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.542 227766 DEBUG oslo_concurrency.processutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.543 227766 DEBUG oslo_concurrency.lockutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.544 227766 DEBUG oslo_concurrency.lockutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.544 227766 DEBUG oslo_concurrency.lockutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.574 227766 DEBUG nova.storage.rbd_utils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.577 227766 DEBUG oslo_concurrency.processutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.599 227766 INFO oslo.privsep.daemon [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Spawned new privsep daemon via rootwrap
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.430 273674 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.449 273674 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.452 273674 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.452 273674 INFO oslo.privsep.daemon [-] privsep daemon running as pid 273674
Jan 23 05:02:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:56.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.693 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.694 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Image id 84c0ef19-7f67-4bd3-95d8-507c3e0942ed yields fingerprint a6f655456a04e1d13ef2e44ed4544c38917863a2 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.694 227766 INFO nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] image 84c0ef19-7f67-4bd3-95d8-507c3e0942ed at (/var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2): checking
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.694 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] image 84c0ef19-7f67-4bd3-95d8-507c3e0942ed at (/var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.695 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] 61f18fb1-66aa-4089-b98f-50b8a49800ff is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.696 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] f4412d8b-963a-4c3a-accf-68f8cf82c864 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.696 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.696 227766 WARNING nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.696 227766 INFO nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Active base files: /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.696 227766 INFO nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Removable base files: /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.697 227766 INFO nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.697 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.697 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299
Jan 23 05:02:56 np0005593234 nova_compute[227762]: 2026-01-23 10:02:56.697 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284
Jan 23 05:02:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:02:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:58.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.578 227766 DEBUG oslo_concurrency.processutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.001s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:02:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:02:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:02:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:58.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.648 227766 DEBUG nova.storage.rbd_utils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] resizing rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.694 227766 INFO nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Deleting instance files /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864_del
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.695 227766 INFO nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Deletion of /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864_del complete
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.776 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.777 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Ensure instance console log exists: /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.778 227766 DEBUG oslo_concurrency.lockutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.779 227766 DEBUG oslo_concurrency.lockutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.779 227766 DEBUG oslo_concurrency.lockutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.783 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Start _get_guest_xml network_info=[{"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.793 227766 WARNING nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.804 227766 DEBUG nova.virt.libvirt.host [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.805 227766 DEBUG nova.virt.libvirt.host [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.817 227766 DEBUG nova.virt.libvirt.host [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.818 227766 DEBUG nova.virt.libvirt.host [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.821 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.822 227766 DEBUG nova.virt.hardware [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.823 227766 DEBUG nova.virt.hardware [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.823 227766 DEBUG nova.virt.hardware [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.823 227766 DEBUG nova.virt.hardware [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.823 227766 DEBUG nova.virt.hardware [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.824 227766 DEBUG nova.virt.hardware [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.824 227766 DEBUG nova.virt.hardware [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.824 227766 DEBUG nova.virt.hardware [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.824 227766 DEBUG nova.virt.hardware [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.825 227766 DEBUG nova.virt.hardware [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.825 227766 DEBUG nova.virt.hardware [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.825 227766 DEBUG nova.objects.instance [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'vcpu_model' on Instance uuid b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:02:58 np0005593234 nova_compute[227762]: 2026-01-23 10:02:58.859 227766 DEBUG oslo_concurrency.processutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.036 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.037 227766 INFO nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Creating image(s)
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.069 227766 DEBUG nova.storage.rbd_utils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.103 227766 DEBUG nova.storage.rbd_utils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.137 227766 DEBUG nova.storage.rbd_utils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.142 227766 DEBUG oslo_concurrency.processutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.193 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.209 227766 DEBUG oslo_concurrency.processutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.210 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.211 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.212 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.256 227766 DEBUG nova.storage.rbd_utils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.261 227766 DEBUG oslo_concurrency.processutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 f4412d8b-963a-4c3a-accf-68f8cf82c864_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:02:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:02:59 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3617192504' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.324 227766 DEBUG oslo_concurrency.processutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.367 227766 DEBUG nova.storage.rbd_utils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.371 227766 DEBUG oslo_concurrency.processutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.578 227766 DEBUG oslo_concurrency.processutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 f4412d8b-963a-4c3a-accf-68f8cf82c864_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.316s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.650 227766 DEBUG nova.storage.rbd_utils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] resizing rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.835 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.836 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Ensure instance console log exists: /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.837 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.837 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.837 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.839 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Start _get_guest_xml network_info=[{"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.843 227766 WARNING nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.849 227766 DEBUG nova.virt.libvirt.host [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.849 227766 DEBUG nova.virt.libvirt.host [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.857 227766 DEBUG nova.virt.libvirt.host [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.857 227766 DEBUG nova.virt.libvirt.host [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.859 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.859 227766 DEBUG nova.virt.hardware [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.859 227766 DEBUG nova.virt.hardware [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.859 227766 DEBUG nova.virt.hardware [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.859 227766 DEBUG nova.virt.hardware [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.860 227766 DEBUG nova.virt.hardware [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.860 227766 DEBUG nova.virt.hardware [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.860 227766 DEBUG nova.virt.hardware [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.860 227766 DEBUG nova.virt.hardware [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.860 227766 DEBUG nova.virt.hardware [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.860 227766 DEBUG nova.virt.hardware [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.861 227766 DEBUG nova.virt.hardware [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.861 227766 DEBUG nova.objects.instance [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'vcpu_model' on Instance uuid f4412d8b-963a-4c3a-accf-68f8cf82c864 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:02:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:02:59 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/596655548' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.884 227766 DEBUG oslo_concurrency.processutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.911 227766 DEBUG oslo_concurrency.processutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.540s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.913 227766 DEBUG nova.virt.libvirt.vif [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:02:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-998112874',display_name='tempest-ServerActionsTestJSON-server-832385645',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-998112874',id=102,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:02:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-fjoe9sqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:02:56Z,user_data=None,user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=b0b5a1b2-04bd-48e0-a0f7-0c679d784e04,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.914 227766 DEBUG nova.network.os_vif_util [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.915 227766 DEBUG nova.network.os_vif_util [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ca:17,bridge_name='br-int',has_traffic_filtering=True,id=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca9e3cb-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.918 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  <uuid>b0b5a1b2-04bd-48e0-a0f7-0c679d784e04</uuid>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  <name>instance-00000066</name>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerActionsTestJSON-server-832385645</nova:name>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:02:58</nova:creationTime>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <nova:user uuid="9d4a5c201efa4992a9ef57d8abdc1675">tempest-ServerActionsTestJSON-1619235720-project-member</nova:user>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <nova:project uuid="74c5c1d0762242f29a5d26033efd9f6d">tempest-ServerActionsTestJSON-1619235720</nova:project>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="ae1f9e37-418c-462f-81d1-3599a6d89de9"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <nova:port uuid="4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <entry name="serial">b0b5a1b2-04bd-48e0-a0f7-0c679d784e04</entry>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <entry name="uuid">b0b5a1b2-04bd-48e0-a0f7-0c679d784e04</entry>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk.config">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:1e:ca:17"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <target dev="tap4ca9e3cb-f7"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/console.log" append="off"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:02:59 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:02:59 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:02:59 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:02:59 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.919 227766 DEBUG nova.compute.manager [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Preparing to wait for external event network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.919 227766 DEBUG oslo_concurrency.lockutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.919 227766 DEBUG oslo_concurrency.lockutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.919 227766 DEBUG oslo_concurrency.lockutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.920 227766 DEBUG nova.virt.libvirt.vif [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:02:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-998112874',display_name='tempest-ServerActionsTestJSON-server-832385645',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-998112874',id=102,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:02:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-fjoe9sqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:02:56Z,user_data=None,user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=b0b5a1b2-04bd-48e0-a0f7-0c679d784e04,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.920 227766 DEBUG nova.network.os_vif_util [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.921 227766 DEBUG nova.network.os_vif_util [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ca:17,bridge_name='br-int',has_traffic_filtering=True,id=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca9e3cb-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.921 227766 DEBUG os_vif [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ca:17,bridge_name='br-int',has_traffic_filtering=True,id=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca9e3cb-f7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.922 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.922 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.922 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.925 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.925 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4ca9e3cb-f7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.926 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4ca9e3cb-f7, col_values=(('external_ids', {'iface-id': '4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1e:ca:17', 'vm-uuid': 'b0b5a1b2-04bd-48e0-a0f7-0c679d784e04'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:02:59 np0005593234 NetworkManager[48942]: <info>  [1769162579.9283] manager: (tap4ca9e3cb-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/198)
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.930 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.933 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:02:59 np0005593234 nova_compute[227762]: 2026-01-23 10:02:59.933 227766 INFO os_vif [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ca:17,bridge_name='br-int',has_traffic_filtering=True,id=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca9e3cb-f7')#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.010 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.010 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.011 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No VIF found with MAC fa:16:3e:1e:ca:17, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.011 227766 INFO nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Using config drive#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.044 227766 DEBUG nova.storage.rbd_utils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.068 227766 DEBUG nova.objects.instance [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'ec2_ids' on Instance uuid b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.106 227766 DEBUG nova.objects.instance [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'keypairs' on Instance uuid b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:03:00 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1931325190' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.343 227766 DEBUG oslo_concurrency.processutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:00.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.366 227766 DEBUG nova.storage.rbd_utils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.369 227766 DEBUG oslo_concurrency.processutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:00.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:03:00 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/310018844' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:03:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.821 227766 DEBUG oslo_concurrency.processutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.822 227766 DEBUG nova.virt.libvirt.vif [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:02:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-191757669',display_name='tempest-tempest.common.compute-instance-191757669',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-191757669',id=101,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:02:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='8c16cd713fa74a88b43e4edf01c273bd',ramdisk_id='',reservation_id='r-lj8pfndu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-882763067',owner_user_name='tempest-ServerActionsTestOtherA-882763067-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:02:58Z,user_data=None,user_id='29710db389c842df836944048225740f',uuid=f4412d8b-963a-4c3a-accf-68f8cf82c864,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.823 227766 DEBUG nova.network.os_vif_util [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converting VIF {"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.823 227766 DEBUG nova.network.os_vif_util [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:95:7c,bridge_name='br-int',has_traffic_filtering=True,id=e14a3386-c770-46be-bafc-0418fa8274a7,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14a3386-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.828 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  <uuid>f4412d8b-963a-4c3a-accf-68f8cf82c864</uuid>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  <name>instance-00000065</name>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <nova:name>tempest-tempest.common.compute-instance-191757669</nova:name>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:02:59</nova:creationTime>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <nova:user uuid="29710db389c842df836944048225740f">tempest-ServerActionsTestOtherA-882763067-project-member</nova:user>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <nova:project uuid="8c16cd713fa74a88b43e4edf01c273bd">tempest-ServerActionsTestOtherA-882763067</nova:project>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="ae1f9e37-418c-462f-81d1-3599a6d89de9"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <nova:port uuid="e14a3386-c770-46be-bafc-0418fa8274a7">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <entry name="serial">f4412d8b-963a-4c3a-accf-68f8cf82c864</entry>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <entry name="uuid">f4412d8b-963a-4c3a-accf-68f8cf82c864</entry>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/f4412d8b-963a-4c3a-accf-68f8cf82c864_disk">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/f4412d8b-963a-4c3a-accf-68f8cf82c864_disk.config">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:b2:95:7c"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <target dev="tape14a3386-c7"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/console.log" append="off"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:03:00 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:03:00 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:03:00 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:03:00 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.829 227766 DEBUG nova.compute.manager [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Preparing to wait for external event network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.829 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.830 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.830 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.831 227766 DEBUG nova.virt.libvirt.vif [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:02:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-191757669',display_name='tempest-tempest.common.compute-instance-191757669',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-191757669',id=101,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:02:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='8c16cd713fa74a88b43e4edf01c273bd',ramdisk_id='',reservation_id='r-lj8pfndu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-882763067',owner_user_name='tempest-ServerActionsTestOtherA-882763067-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:02:58Z,user_data=None,user_id='29710db389c842df836944048225740f',uuid=f4412d8b-963a-4c3a-accf-68f8cf82c864,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.831 227766 DEBUG nova.network.os_vif_util [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converting VIF {"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.832 227766 DEBUG nova.network.os_vif_util [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:95:7c,bridge_name='br-int',has_traffic_filtering=True,id=e14a3386-c770-46be-bafc-0418fa8274a7,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14a3386-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.832 227766 DEBUG os_vif [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:95:7c,bridge_name='br-int',has_traffic_filtering=True,id=e14a3386-c770-46be-bafc-0418fa8274a7,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14a3386-c7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.833 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.833 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.834 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.836 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.837 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape14a3386-c7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.837 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape14a3386-c7, col_values=(('external_ids', {'iface-id': 'e14a3386-c770-46be-bafc-0418fa8274a7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b2:95:7c', 'vm-uuid': 'f4412d8b-963a-4c3a-accf-68f8cf82c864'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:00 np0005593234 NetworkManager[48942]: <info>  [1769162580.8408] manager: (tape14a3386-c7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/199)
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.843 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.847 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.848 227766 INFO os_vif [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:95:7c,bridge_name='br-int',has_traffic_filtering=True,id=e14a3386-c770-46be-bafc-0418fa8274a7,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14a3386-c7')#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.862 227766 INFO nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Creating config drive at /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/disk.config#033[00m
Jan 23 05:03:00 np0005593234 nova_compute[227762]: 2026-01-23 10:03:00.868 227766 DEBUG oslo_concurrency.processutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa5s4apt2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.015 227766 DEBUG oslo_concurrency.processutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa5s4apt2" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.045 227766 DEBUG nova.storage.rbd_utils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.051 227766 DEBUG oslo_concurrency.processutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/disk.config b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:01 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #50. Immutable memtables: 7.
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.225 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.226 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.226 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] No VIF found with MAC fa:16:3e:b2:95:7c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.226 227766 INFO nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Using config drive#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.252 227766 DEBUG nova.storage.rbd_utils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.256 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.258 227766 DEBUG oslo_concurrency.processutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/disk.config b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.208s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.259 227766 INFO nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Deleting local config drive /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04/disk.config because it was imported into RBD.#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.281 227766 DEBUG nova.objects.instance [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'ec2_ids' on Instance uuid f4412d8b-963a-4c3a-accf-68f8cf82c864 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:01 np0005593234 kernel: tap4ca9e3cb-f7: entered promiscuous mode
Jan 23 05:03:01 np0005593234 NetworkManager[48942]: <info>  [1769162581.3017] manager: (tap4ca9e3cb-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/200)
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.307 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:01 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:01Z|00381|binding|INFO|Claiming lport 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 for this chassis.
Jan 23 05:03:01 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:01Z|00382|binding|INFO|4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7: Claiming fa:16:3e:1e:ca:17 10.100.0.9
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.317 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:ca:17 10.100.0.9'], port_security=['fa:16:3e:1e:ca:17 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b0b5a1b2-04bd-48e0-a0f7-0c679d784e04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '5', 'neutron:security_group_ids': '323fc591-4197-401d-b3c4-392a8ca4598f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.318 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c bound to our chassis#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.320 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee03d7c9-e107-41bf-95cc-5508578ad66c#033[00m
Jan 23 05:03:01 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:01Z|00383|binding|INFO|Setting lport 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 ovn-installed in OVS
Jan 23 05:03:01 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:01Z|00384|binding|INFO|Setting lport 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 up in Southbound
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.332 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c677bd7a-472e-4945-989d-a8e05054292d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.333 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee03d7c9-e1 in ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.333 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.335 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee03d7c9-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.336 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[337b159e-0758-4aff-a0ae-56fc6b9ec4dc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.336 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.337 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4d6dc7bd-b05e-4e4e-9ad6-41bc1f335918]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 systemd-udevd[274197]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.350 227766 DEBUG nova.objects.instance [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'keypairs' on Instance uuid f4412d8b-963a-4c3a-accf-68f8cf82c864 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.353 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[831c26d5-1368-4ad0-9939-e28a9ab92e7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 NetworkManager[48942]: <info>  [1769162581.3580] device (tap4ca9e3cb-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:03:01 np0005593234 NetworkManager[48942]: <info>  [1769162581.3589] device (tap4ca9e3cb-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:03:01 np0005593234 systemd-machined[195626]: New machine qemu-44-instance-00000066.
Jan 23 05:03:01 np0005593234 systemd[1]: Started Virtual Machine qemu-44-instance-00000066.
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.388 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[68ae8050-b611-4b35-9933-ff4ca3cb35bd]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.419 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f76e237a-e206-40e5-afc0-f24e3507f905]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 NetworkManager[48942]: <info>  [1769162581.4266] manager: (tapee03d7c9-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/201)
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.425 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b1604da3-dbbd-4b08-b124-724e8558f42b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 systemd-udevd[274200]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.460 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[47d802b3-e673-4922-bec7-9d5ab47531ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.463 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a3bb4b-cdbc-4682-b9bc-ffe5cd107a5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 NetworkManager[48942]: <info>  [1769162581.4850] device (tapee03d7c9-e0): carrier: link connected
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.491 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4bb132c5-a1b7-4d66-89d2-bb0d77836aaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.510 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b91d64ee-ae23-4a94-9160-33bde01888c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 123], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 647644, 'reachable_time': 28254, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274232, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.526 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[85f406a7-1ddf-4fe0-a8a1-5ef90b7e18af]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:6530'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 647644, 'tstamp': 647644}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274233, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.546 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e81c0d-a5c9-443b-8a27-6b652df8c4b5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 123], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 647644, 'reachable_time': 28254, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 274234, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.575 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[660b6173-f952-42fe-bca9-a1aaf669829c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.631 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[40e0579a-5216-47eb-a109-270b2b9564ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.633 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.633 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.633 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee03d7c9-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.634 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:01 np0005593234 NetworkManager[48942]: <info>  [1769162581.6356] manager: (tapee03d7c9-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/202)
Jan 23 05:03:01 np0005593234 kernel: tapee03d7c9-e0: entered promiscuous mode
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.641 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.642 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee03d7c9-e0, col_values=(('external_ids', {'iface-id': '702d4523-a665-42f5-9a36-57d187c0698a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:01 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:01Z|00385|binding|INFO|Releasing lport 702d4523-a665-42f5-9a36-57d187c0698a from this chassis (sb_readonly=0)
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.660 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.662 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.662 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[af344cba-6c63-4b75-a244-6f62a3556c84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.663 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:03:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:01.664 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'env', 'PROCESS_TAG=haproxy-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee03d7c9-e107-41bf-95cc-5508578ad66c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.806 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.807 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162581.806358, b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.807 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] VM Started (Lifecycle Event)#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.837 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.844 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162581.806529, b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.845 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.865 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.869 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:03:01 np0005593234 nova_compute[227762]: 2026-01-23 10:03:01.893 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 05:03:02 np0005593234 podman[274316]: 2026-01-23 10:03:02.05369913 +0000 UTC m=+0.050798636 container create fd560a754f7b9b85f292f361d2677e902049c199f348037fd0341b6b7e255d69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:03:02 np0005593234 systemd[1]: Started libpod-conmon-fd560a754f7b9b85f292f361d2677e902049c199f348037fd0341b6b7e255d69.scope.
Jan 23 05:03:02 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:03:02 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ad1c1e0192d8fdfe3fee851575a846074d260a6b2dd53ac3017c17559b64d3ab/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:03:02 np0005593234 podman[274316]: 2026-01-23 10:03:02.025505471 +0000 UTC m=+0.022604977 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:03:02 np0005593234 podman[274316]: 2026-01-23 10:03:02.124820282 +0000 UTC m=+0.121919808 container init fd560a754f7b9b85f292f361d2677e902049c199f348037fd0341b6b7e255d69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:03:02 np0005593234 podman[274316]: 2026-01-23 10:03:02.129597851 +0000 UTC m=+0.126697357 container start fd560a754f7b9b85f292f361d2677e902049c199f348037fd0341b6b7e255d69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.140 227766 DEBUG nova.compute.manager [req-4b3b6523-1451-47bf-b334-3585511aa498 req-2e7256dd-ddf7-4aba-a427-9d90ba504885 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received event network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.140 227766 DEBUG oslo_concurrency.lockutils [req-4b3b6523-1451-47bf-b334-3585511aa498 req-2e7256dd-ddf7-4aba-a427-9d90ba504885 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.141 227766 DEBUG oslo_concurrency.lockutils [req-4b3b6523-1451-47bf-b334-3585511aa498 req-2e7256dd-ddf7-4aba-a427-9d90ba504885 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.141 227766 DEBUG oslo_concurrency.lockutils [req-4b3b6523-1451-47bf-b334-3585511aa498 req-2e7256dd-ddf7-4aba-a427-9d90ba504885 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.141 227766 DEBUG nova.compute.manager [req-4b3b6523-1451-47bf-b334-3585511aa498 req-2e7256dd-ddf7-4aba-a427-9d90ba504885 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Processing event network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.142 227766 DEBUG nova.compute.manager [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.147 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.151 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162582.1509166, b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.151 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:03:02 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[274331]: [NOTICE]   (274335) : New worker (274337) forked
Jan 23 05:03:02 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[274331]: [NOTICE]   (274335) : Loading success.
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.157 227766 INFO nova.virt.libvirt.driver [-] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Instance spawned successfully.#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.157 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.188 227766 INFO nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Creating config drive at /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/disk.config#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.194 227766 DEBUG oslo_concurrency.processutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps_x41f7p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.223 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.230 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.236 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.237 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.237 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.238 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.239 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.239 227766 DEBUG nova.virt.libvirt.driver [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.295 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.327 227766 DEBUG oslo_concurrency.processutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmps_x41f7p" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:02.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.356 227766 DEBUG nova.storage.rbd_utils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] rbd image f4412d8b-963a-4c3a-accf-68f8cf82c864_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.360 227766 DEBUG oslo_concurrency.processutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/disk.config f4412d8b-963a-4c3a-accf-68f8cf82c864_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.386 227766 DEBUG nova.compute.manager [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.529 227766 DEBUG oslo_concurrency.lockutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.530 227766 DEBUG oslo_concurrency.lockutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.530 227766 DEBUG nova.objects.instance [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.629 227766 DEBUG oslo_concurrency.lockutils [None req-70734f9d-35cf-48b4-8b11-78a66e5ca77f 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:02.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.925 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162567.9240997, f4412d8b-963a-4c3a-accf-68f8cf82c864 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.925 227766 INFO nova.compute.manager [-] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.961 227766 DEBUG nova.compute.manager [None req-c1e321dd-0b93-49b4-b461-81c4660f62f4 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.964 227766 DEBUG nova.compute.manager [None req-c1e321dd-0b93-49b4-b461-81c4660f62f4 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:03:02 np0005593234 nova_compute[227762]: 2026-01-23 10:03:02.994 227766 INFO nova.compute.manager [None req-c1e321dd-0b93-49b4-b461-81c4660f62f4 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 05:03:03 np0005593234 nova_compute[227762]: 2026-01-23 10:03:03.886 227766 DEBUG oslo_concurrency.processutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/disk.config f4412d8b-963a-4c3a-accf-68f8cf82c864_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.526s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:03 np0005593234 nova_compute[227762]: 2026-01-23 10:03:03.886 227766 INFO nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Deleting local config drive /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864/disk.config because it was imported into RBD.#033[00m
Jan 23 05:03:03 np0005593234 kernel: tape14a3386-c7: entered promiscuous mode
Jan 23 05:03:03 np0005593234 NetworkManager[48942]: <info>  [1769162583.9351] manager: (tape14a3386-c7): new Tun device (/org/freedesktop/NetworkManager/Devices/203)
Jan 23 05:03:03 np0005593234 systemd-udevd[274227]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:03:03 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:03Z|00386|binding|INFO|Claiming lport e14a3386-c770-46be-bafc-0418fa8274a7 for this chassis.
Jan 23 05:03:03 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:03Z|00387|binding|INFO|e14a3386-c770-46be-bafc-0418fa8274a7: Claiming fa:16:3e:b2:95:7c 10.100.0.7
Jan 23 05:03:03 np0005593234 nova_compute[227762]: 2026-01-23 10:03:03.939 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:03 np0005593234 NetworkManager[48942]: <info>  [1769162583.9539] device (tape14a3386-c7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:03:03 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:03.953 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:95:7c 10.100.0.7'], port_security=['fa:16:3e:b2:95:7c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f4412d8b-963a-4c3a-accf-68f8cf82c864', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c16cd713fa74a88b43e4edf01c273bd', 'neutron:revision_number': '5', 'neutron:security_group_ids': '1b3b2b26-a9c9-438c-b14e-9fddf18d8ea5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c3aed5f-30b8-4c57-808e-87764ab67fc8, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=e14a3386-c770-46be-bafc-0418fa8274a7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:03:03 np0005593234 NetworkManager[48942]: <info>  [1769162583.9555] device (tape14a3386-c7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:03:03 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:03.955 144381 INFO neutron.agent.ovn.metadata.agent [-] Port e14a3386-c770-46be-bafc-0418fa8274a7 in datapath 8575e824-4be0-4206-873e-2f9a3d1ded0b bound to our chassis#033[00m
Jan 23 05:03:03 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:03Z|00388|binding|INFO|Setting lport e14a3386-c770-46be-bafc-0418fa8274a7 ovn-installed in OVS
Jan 23 05:03:03 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:03Z|00389|binding|INFO|Setting lport e14a3386-c770-46be-bafc-0418fa8274a7 up in Southbound
Jan 23 05:03:03 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:03.956 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8575e824-4be0-4206-873e-2f9a3d1ded0b#033[00m
Jan 23 05:03:03 np0005593234 nova_compute[227762]: 2026-01-23 10:03:03.957 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:03 np0005593234 nova_compute[227762]: 2026-01-23 10:03:03.961 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:03 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:03.973 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8bbf09e3-b61c-4c4c-bd4b-547dbe0e49df]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:03 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:03.975 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8575e824-41 in ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:03:03 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:03.977 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8575e824-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:03:03 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:03.977 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7331c1b0-226b-493a-b910-edc60a3c35e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:03 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:03.979 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[34a7d4d4-015b-4777-bbbc-f968f7afad80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:03 np0005593234 systemd-machined[195626]: New machine qemu-45-instance-00000065.
Jan 23 05:03:03 np0005593234 systemd[1]: Started Virtual Machine qemu-45-instance-00000065.
Jan 23 05:03:03 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:03.997 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[d1bec3fb-00fa-4c48-8a53-3e09a54776fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.016 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[18f75697-7efa-40fd-8fb7-4fcc7b7850a5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.051 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[476cd7b9-ef90-4924-b296-625400a2d1eb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593234 NetworkManager[48942]: <info>  [1769162584.0574] manager: (tap8575e824-40): new Veth device (/org/freedesktop/NetworkManager/Devices/204)
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.058 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[62f94fcd-6cc3-45ea-9bf8-42ea7a8d536b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.093 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[5c68d2b3-8101-46ee-ad63-ce5b074e268c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.096 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[c2a00b6c-5021-4cbd-808f-a760c6a81112]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593234 NetworkManager[48942]: <info>  [1769162584.1161] device (tap8575e824-40): carrier: link connected
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.120 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[69c511ae-d28d-49f7-86c9-5134ab691cf3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.135 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[29efe624-d388-4bf8-a824-06c054b7f92f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8575e824-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f2:16:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 647907, 'reachable_time': 24660, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274418, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.149 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9acd69b9-4392-478a-9685-99604f7b7bb6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef2:16ca'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 647907, 'tstamp': 647907}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 274419, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.162 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[74d74e9c-fdf7-4e1b-8c32-829c31fd6031]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8575e824-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f2:16:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 125], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 647907, 'reachable_time': 24660, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 274420, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.186 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dc4bad13-2d01-4c60-bef8-b4c3d42974b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.249 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[62dd2dd5-41dd-4df0-b8e3-03a35f0bdf9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.250 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8575e824-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.250 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.251 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8575e824-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.263 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:04 np0005593234 NetworkManager[48942]: <info>  [1769162584.2641] manager: (tap8575e824-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/205)
Jan 23 05:03:04 np0005593234 kernel: tap8575e824-40: entered promiscuous mode
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.270 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.271 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8575e824-40, col_values=(('external_ids', {'iface-id': 'f7023d86-3158-4cc4-b690-f57bb76e92b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:04 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:04Z|00390|binding|INFO|Releasing lport f7023d86-3158-4cc4-b690-f57bb76e92b5 from this chassis (sb_readonly=0)
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.297 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8575e824-4be0-4206-873e-2f9a3d1ded0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8575e824-4be0-4206-873e-2f9a3d1ded0b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.297 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[89373883-f4cb-446a-aab2-f2f780970bc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.299 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-8575e824-4be0-4206-873e-2f9a3d1ded0b
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/8575e824-4be0-4206-873e-2f9a3d1ded0b.pid.haproxy
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 8575e824-4be0-4206-873e-2f9a3d1ded0b
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:03:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:04.299 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'env', 'PROCESS_TAG=haproxy-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8575e824-4be0-4206-873e-2f9a3d1ded0b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.301 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:04.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.386 227766 DEBUG nova.compute.manager [req-d8237942-322e-422f-8372-b2bea1b2dc25 req-3a39fd7b-b0cf-4372-9906-d7941ff2568a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received event network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.387 227766 DEBUG oslo_concurrency.lockutils [req-d8237942-322e-422f-8372-b2bea1b2dc25 req-3a39fd7b-b0cf-4372-9906-d7941ff2568a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.388 227766 DEBUG oslo_concurrency.lockutils [req-d8237942-322e-422f-8372-b2bea1b2dc25 req-3a39fd7b-b0cf-4372-9906-d7941ff2568a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.388 227766 DEBUG oslo_concurrency.lockutils [req-d8237942-322e-422f-8372-b2bea1b2dc25 req-3a39fd7b-b0cf-4372-9906-d7941ff2568a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.388 227766 DEBUG nova.compute.manager [req-d8237942-322e-422f-8372-b2bea1b2dc25 req-3a39fd7b-b0cf-4372-9906-d7941ff2568a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] No waiting events found dispatching network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.389 227766 WARNING nova.compute.manager [req-d8237942-322e-422f-8372-b2bea1b2dc25 req-3a39fd7b-b0cf-4372-9906-d7941ff2568a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received unexpected event network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:03:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:04.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.673 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162584.6726024, f4412d8b-963a-4c3a-accf-68f8cf82c864 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.673 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] VM Started (Lifecycle Event)#033[00m
Jan 23 05:03:04 np0005593234 podman[274545]: 2026-01-23 10:03:04.732954384 +0000 UTC m=+0.046159252 container create 3b30a56a39b499432daf64bb499493c84510d7d4de19ee445e6bbbc9f1036b5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:03:04 np0005593234 systemd[1]: Started libpod-conmon-3b30a56a39b499432daf64bb499493c84510d7d4de19ee445e6bbbc9f1036b5d.scope.
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.782 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.786 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162584.6746092, f4412d8b-963a-4c3a-accf-68f8cf82c864 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.786 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:03:04 np0005593234 podman[274545]: 2026-01-23 10:03:04.711703911 +0000 UTC m=+0.024908819 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:03:04 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.813 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:04 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53bb5cb24d345ad66f093f674953f5e050c1d89672ebb8fdb510e1849c7c9a5e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.820 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:03:04 np0005593234 podman[274545]: 2026-01-23 10:03:04.827783257 +0000 UTC m=+0.140988155 container init 3b30a56a39b499432daf64bb499493c84510d7d4de19ee445e6bbbc9f1036b5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:03:04 np0005593234 podman[274545]: 2026-01-23 10:03:04.833263608 +0000 UTC m=+0.146468486 container start 3b30a56a39b499432daf64bb499493c84510d7d4de19ee445e6bbbc9f1036b5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 05:03:04 np0005593234 nova_compute[227762]: 2026-01-23 10:03:04.841 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 05:03:04 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[274561]: [NOTICE]   (274565) : New worker (274567) forked
Jan 23 05:03:04 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[274561]: [NOTICE]   (274565) : Loading success.
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #100. Immutable memtables: 0.
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.767431) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 61] Flushing memtable with next log file: 100
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162585767514, "job": 61, "event": "flush_started", "num_memtables": 1, "num_entries": 2367, "num_deletes": 251, "total_data_size": 5627081, "memory_usage": 5705816, "flush_reason": "Manual Compaction"}
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 61] Level-0 flush table #101: started
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162585802882, "cf_name": "default", "job": 61, "event": "table_file_creation", "file_number": 101, "file_size": 3639383, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 49481, "largest_seqno": 51843, "table_properties": {"data_size": 3629919, "index_size": 5894, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20166, "raw_average_key_size": 20, "raw_value_size": 3610814, "raw_average_value_size": 3665, "num_data_blocks": 257, "num_entries": 985, "num_filter_entries": 985, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162390, "oldest_key_time": 1769162390, "file_creation_time": 1769162585, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 101, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 61] Flush lasted 35496 microseconds, and 14257 cpu microseconds.
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:03:05 np0005593234 podman[274576]: 2026-01-23 10:03:05.801547881 +0000 UTC m=+0.090928331 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.802944) [db/flush_job.cc:967] [default] [JOB 61] Level-0 flush table #101: 3639383 bytes OK
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.802966) [db/memtable_list.cc:519] [default] Level-0 commit table #101 started
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.806180) [db/memtable_list.cc:722] [default] Level-0 commit table #101: memtable #1 done
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.806228) EVENT_LOG_v1 {"time_micros": 1769162585806218, "job": 61, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.806250) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 61] Try to delete WAL files size 5616525, prev total WAL file size 5616525, number of live WAL files 2.
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000097.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.807754) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034323637' seq:72057594037927935, type:22 .. '7061786F730034353139' seq:0, type:0; will stop at (end)
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 62] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 61 Base level 0, inputs: [101(3554KB)], [99(8925KB)]
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162585807854, "job": 62, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [101], "files_L6": [99], "score": -1, "input_data_size": 12778816, "oldest_snapshot_seqno": -1}
Jan 23 05:03:05 np0005593234 nova_compute[227762]: 2026-01-23 10:03:05.840 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 62] Generated table #102: 7435 keys, 10853771 bytes, temperature: kUnknown
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162585892987, "cf_name": "default", "job": 62, "event": "table_file_creation", "file_number": 102, "file_size": 10853771, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10805082, "index_size": 28993, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 18629, "raw_key_size": 191922, "raw_average_key_size": 25, "raw_value_size": 10673240, "raw_average_value_size": 1435, "num_data_blocks": 1144, "num_entries": 7435, "num_filter_entries": 7435, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769162585, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 102, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.893436) [db/compaction/compaction_job.cc:1663] [default] [JOB 62] Compacted 1@0 + 1@6 files to L6 => 10853771 bytes
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.894837) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 149.6 rd, 127.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 8.7 +0.0 blob) out(10.4 +0.0 blob), read-write-amplify(6.5) write-amplify(3.0) OK, records in: 7954, records dropped: 519 output_compression: NoCompression
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.894860) EVENT_LOG_v1 {"time_micros": 1769162585894850, "job": 62, "event": "compaction_finished", "compaction_time_micros": 85412, "compaction_time_cpu_micros": 35223, "output_level": 6, "num_output_files": 1, "total_output_size": 10853771, "num_input_records": 7954, "num_output_records": 7435, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000101.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162585896160, "job": 62, "event": "table_file_deletion", "file_number": 101}
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000099.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162585898674, "job": 62, "event": "table_file_deletion", "file_number": 99}
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.807637) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.898810) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.898815) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.898817) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.898819) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:03:05 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:03:05.898821) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:03:06 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:06Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:2d:54:e7 10.100.0.12
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.231 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:06.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.589 227766 DEBUG nova.compute.manager [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received event network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.590 227766 DEBUG oslo_concurrency.lockutils [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.590 227766 DEBUG oslo_concurrency.lockutils [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.591 227766 DEBUG oslo_concurrency.lockutils [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.591 227766 DEBUG nova.compute.manager [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Processing event network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.591 227766 DEBUG nova.compute.manager [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received event network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.592 227766 DEBUG oslo_concurrency.lockutils [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.592 227766 DEBUG oslo_concurrency.lockutils [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.592 227766 DEBUG oslo_concurrency.lockutils [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.592 227766 DEBUG nova.compute.manager [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] No waiting events found dispatching network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.593 227766 WARNING nova.compute.manager [req-809b1e64-39a4-424f-8d0b-d4014dfd757d req-2450acdd-272c-4c3a-a330-f36f29c1f02a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received unexpected event network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 for instance with vm_state stopped and task_state rebuild_spawning.#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.594 227766 DEBUG nova.compute.manager [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.599 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.602 227766 INFO nova.virt.libvirt.driver [-] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Instance spawned successfully.#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.603 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162586.6033466, f4412d8b-963a-4c3a-accf-68f8cf82c864 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.604 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.605 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.637 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:06.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.642 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.643 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.643 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.644 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.645 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.645 227766 DEBUG nova.virt.libvirt.driver [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.650 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: rebuild_spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.706 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.767 227766 DEBUG nova.compute.manager [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.768 227766 DEBUG oslo_concurrency.lockutils [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.768 227766 DEBUG oslo_concurrency.lockutils [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.769 227766 DEBUG oslo_concurrency.lockutils [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.771 227766 DEBUG oslo_concurrency.lockutils [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.773 227766 DEBUG oslo_concurrency.lockutils [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.775 227766 INFO nova.compute.manager [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Terminating instance#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.779 227766 DEBUG nova.compute.manager [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.892 227766 INFO nova.compute.manager [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] bringing vm to original state: 'stopped'#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.981 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.981 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:06.983 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=40, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=39) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:03:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:06.984 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.985 227766 DEBUG nova.compute.manager [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.986 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:06 np0005593234 nova_compute[227762]: 2026-01-23 10:03:06.991 227766 DEBUG nova.compute.manager [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 23 05:03:07 np0005593234 kernel: tap4ca9e3cb-f7 (unregistering): left promiscuous mode
Jan 23 05:03:07 np0005593234 NetworkManager[48942]: <info>  [1769162587.0294] device (tap4ca9e3cb-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:03:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:07Z|00391|binding|INFO|Releasing lport 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 from this chassis (sb_readonly=0)
Jan 23 05:03:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:07Z|00392|binding|INFO|Setting lport 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 down in Southbound
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.039 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:07Z|00393|binding|INFO|Removing iface tap4ca9e3cb-f7 ovn-installed in OVS
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.041 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.047 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1e:ca:17 10.100.0.9'], port_security=['fa:16:3e:1e:ca:17 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'b0b5a1b2-04bd-48e0-a0f7-0c679d784e04', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '6', 'neutron:security_group_ids': '323fc591-4197-401d-b3c4-392a8ca4598f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.048 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c unbound from our chassis#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.049 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee03d7c9-e107-41bf-95cc-5508578ad66c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.050 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[76b260a2-cca6-4524-98f3-0c17a9498e71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.050 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace which is not needed anymore#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.055 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000066.scope: Deactivated successfully.
Jan 23 05:03:07 np0005593234 systemd[1]: machine-qemu\x2d44\x2dinstance\x2d00000066.scope: Consumed 5.141s CPU time.
Jan 23 05:03:07 np0005593234 systemd-machined[195626]: Machine qemu-44-instance-00000066 terminated.
Jan 23 05:03:07 np0005593234 kernel: tape14a3386-c7 (unregistering): left promiscuous mode
Jan 23 05:03:07 np0005593234 NetworkManager[48942]: <info>  [1769162587.1268] device (tape14a3386-c7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.131 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:07Z|00394|binding|INFO|Releasing lport e14a3386-c770-46be-bafc-0418fa8274a7 from this chassis (sb_readonly=0)
Jan 23 05:03:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:07Z|00395|binding|INFO|Setting lport e14a3386-c770-46be-bafc-0418fa8274a7 down in Southbound
Jan 23 05:03:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:07Z|00396|binding|INFO|Removing iface tape14a3386-c7 ovn-installed in OVS
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.134 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.139 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b2:95:7c 10.100.0.7'], port_security=['fa:16:3e:b2:95:7c 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'f4412d8b-963a-4c3a-accf-68f8cf82c864', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c16cd713fa74a88b43e4edf01c273bd', 'neutron:revision_number': '6', 'neutron:security_group_ids': '1b3b2b26-a9c9-438c-b14e-9fddf18d8ea5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c3aed5f-30b8-4c57-808e-87764ab67fc8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=e14a3386-c770-46be-bafc-0418fa8274a7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.154 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 systemd[1]: machine-qemu\x2d45\x2dinstance\x2d00000065.scope: Deactivated successfully.
Jan 23 05:03:07 np0005593234 systemd-machined[195626]: Machine qemu-45-instance-00000065 terminated.
Jan 23 05:03:07 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[274331]: [NOTICE]   (274335) : haproxy version is 2.8.14-c23fe91
Jan 23 05:03:07 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[274331]: [NOTICE]   (274335) : path to executable is /usr/sbin/haproxy
Jan 23 05:03:07 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[274331]: [WARNING]  (274335) : Exiting Master process...
Jan 23 05:03:07 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[274331]: [WARNING]  (274335) : Exiting Master process...
Jan 23 05:03:07 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[274331]: [ALERT]    (274335) : Current worker (274337) exited with code 143 (Terminated)
Jan 23 05:03:07 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[274331]: [WARNING]  (274335) : All workers exited. Exiting... (0)
Jan 23 05:03:07 np0005593234 systemd[1]: libpod-fd560a754f7b9b85f292f361d2677e902049c199f348037fd0341b6b7e255d69.scope: Deactivated successfully.
Jan 23 05:03:07 np0005593234 podman[274625]: 2026-01-23 10:03:07.181834613 +0000 UTC m=+0.046138472 container died fd560a754f7b9b85f292f361d2677e902049c199f348037fd0341b6b7e255d69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:03:07 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd560a754f7b9b85f292f361d2677e902049c199f348037fd0341b6b7e255d69-userdata-shm.mount: Deactivated successfully.
Jan 23 05:03:07 np0005593234 systemd[1]: var-lib-containers-storage-overlay-ad1c1e0192d8fdfe3fee851575a846074d260a6b2dd53ac3017c17559b64d3ab-merged.mount: Deactivated successfully.
Jan 23 05:03:07 np0005593234 podman[274625]: 2026-01-23 10:03:07.216225857 +0000 UTC m=+0.080529716 container cleanup fd560a754f7b9b85f292f361d2677e902049c199f348037fd0341b6b7e255d69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.218 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.225 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.232 227766 INFO nova.virt.libvirt.driver [-] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Instance destroyed successfully.#033[00m
Jan 23 05:03:07 np0005593234 systemd[1]: libpod-conmon-fd560a754f7b9b85f292f361d2677e902049c199f348037fd0341b6b7e255d69.scope: Deactivated successfully.
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.233 227766 DEBUG nova.compute.manager [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.240 227766 INFO nova.virt.libvirt.driver [-] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Instance destroyed successfully.#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.241 227766 DEBUG nova.objects.instance [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'resources' on Instance uuid b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.270 227766 DEBUG nova.virt.libvirt.vif [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:02:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-998112874',display_name='tempest-ServerActionsTestJSON-server-832385645',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-998112874',id=102,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:03:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-fjoe9sqo',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:03:02Z,user_data=None,user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=b0b5a1b2-04bd-48e0-a0f7-0c679d784e04,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.270 227766 DEBUG nova.network.os_vif_util [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "address": "fa:16:3e:1e:ca:17", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4ca9e3cb-f7", "ovs_interfaceid": "4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.271 227766 DEBUG nova.network.os_vif_util [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ca:17,bridge_name='br-int',has_traffic_filtering=True,id=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca9e3cb-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.271 227766 DEBUG os_vif [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ca:17,bridge_name='br-int',has_traffic_filtering=True,id=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca9e3cb-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.273 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.273 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4ca9e3cb-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.275 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.277 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.280 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 podman[274672]: 2026-01-23 10:03:07.282147836 +0000 UTC m=+0.040266619 container remove fd560a754f7b9b85f292f361d2677e902049c199f348037fd0341b6b7e255d69 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.282 227766 INFO os_vif [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1e:ca:17,bridge_name='br-int',has_traffic_filtering=True,id=4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4ca9e3cb-f7')#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.288 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[da5b4876-e477-468c-85db-4e5b0eb9fee3]: (4, ('Fri Jan 23 10:03:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (fd560a754f7b9b85f292f361d2677e902049c199f348037fd0341b6b7e255d69)\nfd560a754f7b9b85f292f361d2677e902049c199f348037fd0341b6b7e255d69\nFri Jan 23 10:03:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (fd560a754f7b9b85f292f361d2677e902049c199f348037fd0341b6b7e255d69)\nfd560a754f7b9b85f292f361d2677e902049c199f348037fd0341b6b7e255d69\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.290 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[23136d94-c24b-45cb-b394-290ee0f3322e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.290 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:07 np0005593234 kernel: tapee03d7c9-e0: left promiscuous mode
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.300 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.306 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 0.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.309 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.310 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.313 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0e68d85b-9f43-4407-bdf4-6e4e59c85aeb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.331 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b2b38e7c-6adb-45db-a1ba-a6aa58e5d168]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.333 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[146819f9-cc5a-46df-a9ab-b8bea3e9a77f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.343 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.344 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.344 227766 DEBUG nova.objects.instance [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.349 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a81567db-74b7-4d51-b329-105903ba6e9d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 647637, 'reachable_time': 20259, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274720, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.352 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.352 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[f93705f2-b52d-405d-8a9f-4503edac7df4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.353 144381 INFO neutron.agent.ovn.metadata.agent [-] Port e14a3386-c770-46be-bafc-0418fa8274a7 in datapath 8575e824-4be0-4206-873e-2f9a3d1ded0b unbound from our chassis#033[00m
Jan 23 05:03:07 np0005593234 systemd[1]: run-netns-ovnmeta\x2dee03d7c9\x2de107\x2d41bf\x2d95cc\x2d5508578ad66c.mount: Deactivated successfully.
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.354 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8575e824-4be0-4206-873e-2f9a3d1ded0b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.355 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[12045127-b3cd-4bca-96fc-7667e9837dbb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.356 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b namespace which is not needed anymore#033[00m
Jan 23 05:03:07 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[274561]: [NOTICE]   (274565) : haproxy version is 2.8.14-c23fe91
Jan 23 05:03:07 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[274561]: [NOTICE]   (274565) : path to executable is /usr/sbin/haproxy
Jan 23 05:03:07 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[274561]: [WARNING]  (274565) : Exiting Master process...
Jan 23 05:03:07 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[274561]: [ALERT]    (274565) : Current worker (274567) exited with code 143 (Terminated)
Jan 23 05:03:07 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[274561]: [WARNING]  (274565) : All workers exited. Exiting... (0)
Jan 23 05:03:07 np0005593234 systemd[1]: libpod-3b30a56a39b499432daf64bb499493c84510d7d4de19ee445e6bbbc9f1036b5d.scope: Deactivated successfully.
Jan 23 05:03:07 np0005593234 podman[274738]: 2026-01-23 10:03:07.474895416 +0000 UTC m=+0.042682994 container died 3b30a56a39b499432daf64bb499493c84510d7d4de19ee445e6bbbc9f1036b5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:03:07 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3b30a56a39b499432daf64bb499493c84510d7d4de19ee445e6bbbc9f1036b5d-userdata-shm.mount: Deactivated successfully.
Jan 23 05:03:07 np0005593234 systemd[1]: var-lib-containers-storage-overlay-53bb5cb24d345ad66f093f674953f5e050c1d89672ebb8fdb510e1849c7c9a5e-merged.mount: Deactivated successfully.
Jan 23 05:03:07 np0005593234 podman[274738]: 2026-01-23 10:03:07.50736673 +0000 UTC m=+0.075154298 container cleanup 3b30a56a39b499432daf64bb499493c84510d7d4de19ee445e6bbbc9f1036b5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 05:03:07 np0005593234 systemd[1]: libpod-conmon-3b30a56a39b499432daf64bb499493c84510d7d4de19ee445e6bbbc9f1036b5d.scope: Deactivated successfully.
Jan 23 05:03:07 np0005593234 podman[274765]: 2026-01-23 10:03:07.564607678 +0000 UTC m=+0.038668958 container remove 3b30a56a39b499432daf64bb499493c84510d7d4de19ee445e6bbbc9f1036b5d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.570 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[45d95707-c8e1-41e1-8993-a30e50d432ae]: (4, ('Fri Jan 23 10:03:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b (3b30a56a39b499432daf64bb499493c84510d7d4de19ee445e6bbbc9f1036b5d)\n3b30a56a39b499432daf64bb499493c84510d7d4de19ee445e6bbbc9f1036b5d\nFri Jan 23 10:03:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b (3b30a56a39b499432daf64bb499493c84510d7d4de19ee445e6bbbc9f1036b5d)\n3b30a56a39b499432daf64bb499493c84510d7d4de19ee445e6bbbc9f1036b5d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.572 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6efff1ff-5fb0-497c-8094-6c342212f4da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.573 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8575e824-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.601 227766 DEBUG oslo_concurrency.lockutils [None req-c7013a02-addf-4dd6-8242-0acc7dde8e0b 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.612 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 kernel: tap8575e824-40: left promiscuous mode
Jan 23 05:03:07 np0005593234 nova_compute[227762]: 2026-01-23 10:03:07.632 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.634 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[db5347ad-90f9-4810-a50d-79c960d25093]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.650 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[755624b5-c4b2-4100-be42-0ffa398063d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.652 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fa144fc4-6afb-4b53-b62d-ab4f167a1e07]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.668 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[61e28ea6-9bbb-42ba-a358-d3a047c798fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 647900, 'reachable_time': 18887, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274782, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.670 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:03:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:07.670 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[0595e2cd-5347-40ed-9dfc-d19931b54413]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:08 np0005593234 systemd[1]: run-netns-ovnmeta\x2d8575e824\x2d4be0\x2d4206\x2d873e\x2d2f9a3d1ded0b.mount: Deactivated successfully.
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.341 227766 INFO nova.virt.libvirt.driver [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Deleting instance files /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_del#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.342 227766 INFO nova.virt.libvirt.driver [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Deletion of /var/lib/nova/instances/b0b5a1b2-04bd-48e0-a0f7-0c679d784e04_del complete#033[00m
Jan 23 05:03:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:08.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.441 227766 INFO nova.compute.manager [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Took 1.66 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.442 227766 DEBUG oslo.service.loopingcall [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.442 227766 DEBUG nova.compute.manager [-] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.442 227766 DEBUG nova.network.neutron [-] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:03:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:08.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.806 227766 DEBUG nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received event network-vif-unplugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.806 227766 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.807 227766 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.807 227766 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.807 227766 DEBUG nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] No waiting events found dispatching network-vif-unplugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.807 227766 DEBUG nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received event network-vif-unplugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.807 227766 DEBUG nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received event network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.807 227766 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.808 227766 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.808 227766 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.808 227766 DEBUG nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] No waiting events found dispatching network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.808 227766 WARNING nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received unexpected event network-vif-plugged-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.808 227766 DEBUG nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received event network-vif-unplugged-e14a3386-c770-46be-bafc-0418fa8274a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.808 227766 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.808 227766 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.809 227766 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.809 227766 DEBUG nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] No waiting events found dispatching network-vif-unplugged-e14a3386-c770-46be-bafc-0418fa8274a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.809 227766 WARNING nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received unexpected event network-vif-unplugged-e14a3386-c770-46be-bafc-0418fa8274a7 for instance with vm_state stopped and task_state None.#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.809 227766 DEBUG nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received event network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.809 227766 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.809 227766 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.809 227766 DEBUG oslo_concurrency.lockutils [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.809 227766 DEBUG nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] No waiting events found dispatching network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:08 np0005593234 nova_compute[227762]: 2026-01-23 10:03:08.810 227766 WARNING nova.compute.manager [req-c56bc48d-5066-4582-ae0c-728d97ae150b req-8133339e-c738-4d52-a9a1-5b7d818cc831 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received unexpected event network-vif-plugged-e14a3386-c770-46be-bafc-0418fa8274a7 for instance with vm_state stopped and task_state None.#033[00m
Jan 23 05:03:09 np0005593234 nova_compute[227762]: 2026-01-23 10:03:09.772 227766 DEBUG nova.network.neutron [-] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:03:09 np0005593234 nova_compute[227762]: 2026-01-23 10:03:09.810 227766 INFO nova.compute.manager [-] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Took 1.37 seconds to deallocate network for instance.#033[00m
Jan 23 05:03:09 np0005593234 nova_compute[227762]: 2026-01-23 10:03:09.869 227766 DEBUG oslo_concurrency.lockutils [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:09 np0005593234 nova_compute[227762]: 2026-01-23 10:03:09.870 227766 DEBUG oslo_concurrency.lockutils [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:09 np0005593234 nova_compute[227762]: 2026-01-23 10:03:09.959 227766 DEBUG oslo_concurrency.processutils [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:03:10 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/659800559' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:03:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:10.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.367 227766 DEBUG oslo_concurrency.processutils [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.374 227766 DEBUG nova.compute.provider_tree [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.400 227766 DEBUG nova.scheduler.client.report [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.436 227766 DEBUG oslo_concurrency.lockutils [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.566s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.468 227766 INFO nova.scheduler.client.report [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Deleted allocations for instance b0b5a1b2-04bd-48e0-a0f7-0c679d784e04#033[00m
Jan 23 05:03:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:03:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:10.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:03:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.844 227766 DEBUG oslo_concurrency.lockutils [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.844 227766 DEBUG oslo_concurrency.lockutils [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.845 227766 DEBUG oslo_concurrency.lockutils [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.845 227766 DEBUG oslo_concurrency.lockutils [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.845 227766 DEBUG oslo_concurrency.lockutils [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.846 227766 INFO nova.compute.manager [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Terminating instance#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.847 227766 DEBUG nova.compute.manager [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.853 227766 INFO nova.virt.libvirt.driver [-] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Instance destroyed successfully.#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.853 227766 DEBUG nova.objects.instance [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'resources' on Instance uuid f4412d8b-963a-4c3a-accf-68f8cf82c864 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.872 227766 DEBUG oslo_concurrency.lockutils [None req-75c83d5f-97a2-41a3-851b-a5157c0d026a 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "b0b5a1b2-04bd-48e0-a0f7-0c679d784e04" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.874 227766 DEBUG nova.virt.libvirt.vif [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:02:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-tempest.common.compute-instance-191757669',display_name='tempest-tempest.common.compute-instance-191757669',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-tempest-common-compute-instance-191757669',id=101,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:03:06Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='8c16cd713fa74a88b43e4edf01c273bd',ramdisk_id='',reservation_id='r-lj8pfndu',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherA-882763067',owner_user_name='tempest-ServerActionsTestOtherA-882763067-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:03:07Z,user_data=None,user_id='29710db389c842df836944048225740f',uuid=f4412d8b-963a-4c3a-accf-68f8cf82c864,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.874 227766 DEBUG nova.network.os_vif_util [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converting VIF {"id": "e14a3386-c770-46be-bafc-0418fa8274a7", "address": "fa:16:3e:b2:95:7c", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape14a3386-c7", "ovs_interfaceid": "e14a3386-c770-46be-bafc-0418fa8274a7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.875 227766 DEBUG nova.network.os_vif_util [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b2:95:7c,bridge_name='br-int',has_traffic_filtering=True,id=e14a3386-c770-46be-bafc-0418fa8274a7,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14a3386-c7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.875 227766 DEBUG os_vif [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:95:7c,bridge_name='br-int',has_traffic_filtering=True,id=e14a3386-c770-46be-bafc-0418fa8274a7,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14a3386-c7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.877 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.877 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape14a3386-c7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.879 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.882 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.883 227766 INFO os_vif [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b2:95:7c,bridge_name='br-int',has_traffic_filtering=True,id=e14a3386-c770-46be-bafc-0418fa8274a7,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape14a3386-c7')#033[00m
Jan 23 05:03:10 np0005593234 nova_compute[227762]: 2026-01-23 10:03:10.943 227766 DEBUG nova.compute.manager [req-7f4813d2-6e0f-4f37-93ff-9770175bd666 req-52b7b84b-d5d3-4e2d-88a4-a7fb2fdede50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Received event network-vif-deleted-4ca9e3cb-f79f-4a10-b2d8-179f491a0eb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:11 np0005593234 nova_compute[227762]: 2026-01-23 10:03:11.236 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:11 np0005593234 nova_compute[227762]: 2026-01-23 10:03:11.331 227766 INFO nova.virt.libvirt.driver [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Deleting instance files /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864_del#033[00m
Jan 23 05:03:11 np0005593234 nova_compute[227762]: 2026-01-23 10:03:11.332 227766 INFO nova.virt.libvirt.driver [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Deletion of /var/lib/nova/instances/f4412d8b-963a-4c3a-accf-68f8cf82c864_del complete#033[00m
Jan 23 05:03:11 np0005593234 nova_compute[227762]: 2026-01-23 10:03:11.420 227766 INFO nova.compute.manager [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Took 0.57 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:03:11 np0005593234 nova_compute[227762]: 2026-01-23 10:03:11.420 227766 DEBUG oslo.service.loopingcall [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:03:11 np0005593234 nova_compute[227762]: 2026-01-23 10:03:11.420 227766 DEBUG nova.compute.manager [-] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:03:11 np0005593234 nova_compute[227762]: 2026-01-23 10:03:11.421 227766 DEBUG nova.network.neutron [-] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:03:12 np0005593234 nova_compute[227762]: 2026-01-23 10:03:12.074 227766 DEBUG nova.network.neutron [-] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:03:12 np0005593234 nova_compute[227762]: 2026-01-23 10:03:12.111 227766 INFO nova.compute.manager [-] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Took 0.69 seconds to deallocate network for instance.#033[00m
Jan 23 05:03:12 np0005593234 nova_compute[227762]: 2026-01-23 10:03:12.203 227766 DEBUG oslo_concurrency.lockutils [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:12 np0005593234 nova_compute[227762]: 2026-01-23 10:03:12.203 227766 DEBUG oslo_concurrency.lockutils [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:12 np0005593234 nova_compute[227762]: 2026-01-23 10:03:12.281 227766 DEBUG oslo_concurrency.processutils [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:03:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:12.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:03:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:12.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:03:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/718161078' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:03:12 np0005593234 nova_compute[227762]: 2026-01-23 10:03:12.710 227766 DEBUG oslo_concurrency.processutils [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:12 np0005593234 nova_compute[227762]: 2026-01-23 10:03:12.717 227766 DEBUG nova.compute.provider_tree [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:03:12 np0005593234 nova_compute[227762]: 2026-01-23 10:03:12.939 227766 DEBUG nova.scheduler.client.report [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:03:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 05:03:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 23 05:03:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000005 to be held by another RGW process; skipping for now
Jan 23 05:03:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000007 to be held by another RGW process; skipping for now
Jan 23 05:03:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 23 05:03:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 23 05:03:13 np0005593234 nova_compute[227762]: 2026-01-23 10:03:13.707 227766 DEBUG oslo_concurrency.lockutils [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.504s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:13 np0005593234 nova_compute[227762]: 2026-01-23 10:03:13.719 227766 DEBUG nova.compute.manager [req-e02a1e5a-b571-42fe-9401-f07a26314202 req-5ec1dd59-7604-40bd-bffb-b032b4b64a81 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Received event network-vif-deleted-e14a3386-c770-46be-bafc-0418fa8274a7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:13 np0005593234 nova_compute[227762]: 2026-01-23 10:03:13.894 227766 DEBUG oslo_concurrency.lockutils [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "61f18fb1-66aa-4089-b98f-50b8a49800ff" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:13 np0005593234 nova_compute[227762]: 2026-01-23 10:03:13.894 227766 DEBUG oslo_concurrency.lockutils [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:13 np0005593234 nova_compute[227762]: 2026-01-23 10:03:13.895 227766 DEBUG oslo_concurrency.lockutils [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:13 np0005593234 nova_compute[227762]: 2026-01-23 10:03:13.895 227766 DEBUG oslo_concurrency.lockutils [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:13 np0005593234 nova_compute[227762]: 2026-01-23 10:03:13.895 227766 DEBUG oslo_concurrency.lockutils [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:13 np0005593234 nova_compute[227762]: 2026-01-23 10:03:13.896 227766 INFO nova.compute.manager [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Terminating instance#033[00m
Jan 23 05:03:13 np0005593234 nova_compute[227762]: 2026-01-23 10:03:13.897 227766 DEBUG nova.compute.manager [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:03:13 np0005593234 nova_compute[227762]: 2026-01-23 10:03:13.914 227766 INFO nova.scheduler.client.report [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Deleted allocations for instance f4412d8b-963a-4c3a-accf-68f8cf82c864#033[00m
Jan 23 05:03:13 np0005593234 kernel: tapc9c463b9-37 (unregistering): left promiscuous mode
Jan 23 05:03:13 np0005593234 NetworkManager[48942]: <info>  [1769162593.9483] device (tapc9c463b9-37): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:03:13 np0005593234 nova_compute[227762]: 2026-01-23 10:03:13.955 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:13Z|00397|binding|INFO|Releasing lport c9c463b9-3793-44a9-9773-69cc1638096d from this chassis (sb_readonly=0)
Jan 23 05:03:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:13Z|00398|binding|INFO|Setting lport c9c463b9-3793-44a9-9773-69cc1638096d down in Southbound
Jan 23 05:03:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:13Z|00399|binding|INFO|Removing iface tapc9c463b9-37 ovn-installed in OVS
Jan 23 05:03:13 np0005593234 nova_compute[227762]: 2026-01-23 10:03:13.957 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:13 np0005593234 nova_compute[227762]: 2026-01-23 10:03:13.976 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:13.985 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '40'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:14 np0005593234 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000062.scope: Deactivated successfully.
Jan 23 05:03:14 np0005593234 systemd[1]: machine-qemu\x2d43\x2dinstance\x2d00000062.scope: Consumed 14.141s CPU time.
Jan 23 05:03:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:14.003 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:54:e7 10.100.0.12'], port_security=['fa:16:3e:2d:54:e7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '61f18fb1-66aa-4089-b98f-50b8a49800ff', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0f5ca0233c1a490aa2d596b88a0ec503', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ad8a7362-692a-4044-8393-1c10014f8bab', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=83406af9-ea42-4cda-96ee-b8c04ab0651a, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=c9c463b9-3793-44a9-9773-69cc1638096d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:03:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:14.004 144381 INFO neutron.agent.ovn.metadata.agent [-] Port c9c463b9-3793-44a9-9773-69cc1638096d in datapath 969bd83a-7542-46e3-90f0-1a81f26ba6b8 unbound from our chassis#033[00m
Jan 23 05:03:14 np0005593234 systemd-machined[195626]: Machine qemu-43-instance-00000062 terminated.
Jan 23 05:03:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:14.014 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 969bd83a-7542-46e3-90f0-1a81f26ba6b8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:03:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:14.015 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[399caebf-7d9e-47a2-892e-dd35fcb93748]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:14.015 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 namespace which is not needed anymore#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.117 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.123 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:14 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[273470]: [NOTICE]   (273474) : haproxy version is 2.8.14-c23fe91
Jan 23 05:03:14 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[273470]: [NOTICE]   (273474) : path to executable is /usr/sbin/haproxy
Jan 23 05:03:14 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[273470]: [WARNING]  (273474) : Exiting Master process...
Jan 23 05:03:14 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[273470]: [ALERT]    (273474) : Current worker (273476) exited with code 143 (Terminated)
Jan 23 05:03:14 np0005593234 neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8[273470]: [WARNING]  (273474) : All workers exited. Exiting... (0)
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.132 227766 INFO nova.virt.libvirt.driver [-] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Instance destroyed successfully.#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.132 227766 DEBUG nova.objects.instance [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lazy-loading 'resources' on Instance uuid 61f18fb1-66aa-4089-b98f-50b8a49800ff obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:14 np0005593234 systemd[1]: libpod-8eb487a49a2e5dfe2d1a612d05d46dbf8e6837284e175b861cb14e18ae1c9b2b.scope: Deactivated successfully.
Jan 23 05:03:14 np0005593234 podman[274880]: 2026-01-23 10:03:14.140466639 +0000 UTC m=+0.043816839 container died 8eb487a49a2e5dfe2d1a612d05d46dbf8e6837284e175b861cb14e18ae1c9b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.153 227766 DEBUG oslo_concurrency.lockutils [None req-a9ae82c1-faa3-4c49-a4dd-e36e2676f3b6 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "f4412d8b-963a-4c3a-accf-68f8cf82c864" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.164 227766 DEBUG nova.virt.libvirt.vif [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:01:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ListServerFiltersTestJSON-instance-1972148084',display_name='tempest-ListServerFiltersTestJSON-instance-1972148084',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-listserverfilterstestjson-instance-1972148084',id=98,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:02:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0f5ca0233c1a490aa2d596b88a0ec503',ramdisk_id='',reservation_id='r-l9i0gz1m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ListServerFiltersTestJSON-1524131674',owner_user_name='tempest-ListServerFiltersTestJSON-1524131674-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:02:50Z,user_data=None,user_id='c09e682996b940dc97c866f9e4f1e74e',uuid=61f18fb1-66aa-4089-b98f-50b8a49800ff,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.164 227766 DEBUG nova.network.os_vif_util [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converting VIF {"id": "c9c463b9-3793-44a9-9773-69cc1638096d", "address": "fa:16:3e:2d:54:e7", "network": {"id": "969bd83a-7542-46e3-90f0-1a81f26ba6b8", "bridge": "br-int", "label": "tempest-ListServerFiltersTestJSON-1578393838-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0f5ca0233c1a490aa2d596b88a0ec503", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc9c463b9-37", "ovs_interfaceid": "c9c463b9-3793-44a9-9773-69cc1638096d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.165 227766 DEBUG nova.network.os_vif_util [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:e7,bridge_name='br-int',has_traffic_filtering=True,id=c9c463b9-3793-44a9-9773-69cc1638096d,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9c463b9-37') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.165 227766 DEBUG os_vif [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:e7,bridge_name='br-int',has_traffic_filtering=True,id=c9c463b9-3793-44a9-9773-69cc1638096d,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9c463b9-37') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.167 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.167 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc9c463b9-37, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.169 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.171 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.173 227766 INFO os_vif [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:2d:54:e7,bridge_name='br-int',has_traffic_filtering=True,id=c9c463b9-3793-44a9-9773-69cc1638096d,network=Network(969bd83a-7542-46e3-90f0-1a81f26ba6b8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc9c463b9-37')#033[00m
Jan 23 05:03:14 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8eb487a49a2e5dfe2d1a612d05d46dbf8e6837284e175b861cb14e18ae1c9b2b-userdata-shm.mount: Deactivated successfully.
Jan 23 05:03:14 np0005593234 systemd[1]: var-lib-containers-storage-overlay-46e3a1bc2f3618bad9c838c505c71c387248f73f4a4644cd13c180e4cb992d97-merged.mount: Deactivated successfully.
Jan 23 05:03:14 np0005593234 podman[274880]: 2026-01-23 10:03:14.185238337 +0000 UTC m=+0.088588557 container cleanup 8eb487a49a2e5dfe2d1a612d05d46dbf8e6837284e175b861cb14e18ae1c9b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 05:03:14 np0005593234 systemd[1]: libpod-conmon-8eb487a49a2e5dfe2d1a612d05d46dbf8e6837284e175b861cb14e18ae1c9b2b.scope: Deactivated successfully.
Jan 23 05:03:14 np0005593234 podman[274934]: 2026-01-23 10:03:14.251201578 +0000 UTC m=+0.041494838 container remove 8eb487a49a2e5dfe2d1a612d05d46dbf8e6837284e175b861cb14e18ae1c9b2b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:03:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:14.257 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[805da4ae-42b3-476a-9ead-f3fee7251116]: (4, ('Fri Jan 23 10:03:14 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 (8eb487a49a2e5dfe2d1a612d05d46dbf8e6837284e175b861cb14e18ae1c9b2b)\n8eb487a49a2e5dfe2d1a612d05d46dbf8e6837284e175b861cb14e18ae1c9b2b\nFri Jan 23 10:03:14 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 (8eb487a49a2e5dfe2d1a612d05d46dbf8e6837284e175b861cb14e18ae1c9b2b)\n8eb487a49a2e5dfe2d1a612d05d46dbf8e6837284e175b861cb14e18ae1c9b2b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:14.259 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[eb32fcb4-fea1-4291-882d-aca0032a7f09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:14.260 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap969bd83a-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:14 np0005593234 kernel: tap969bd83a-70: left promiscuous mode
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.261 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.276 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:14.278 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[626dcc38-b748-4d72-bdce-149e0316a4e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:14.297 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[07e166b9-f435-40d2-bf94-65d0f811e2c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:14.298 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[901ddbfb-cfc8-4643-acef-b8ac3948f61b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:14.314 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[80fb2233-469d-4759-acaa-730bfce31fa4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 646458, 'reachable_time': 30206, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 274950, 'error': None, 'target': 'ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:14.316 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-969bd83a-7542-46e3-90f0-1a81f26ba6b8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:03:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:14.316 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[1ed6192d-a15a-4773-9c99-3c0991cac971]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:14 np0005593234 systemd[1]: run-netns-ovnmeta\x2d969bd83a\x2d7542\x2d46e3\x2d90f0\x2d1a81f26ba6b8.mount: Deactivated successfully.
Jan 23 05:03:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:14.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.451 227766 DEBUG nova.compute.manager [req-f398add8-f2e7-4395-ad03-cf0fb62ea152 req-13cb35ba-a3a5-4317-a517-9d4eae6743c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received event network-vif-unplugged-c9c463b9-3793-44a9-9773-69cc1638096d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.452 227766 DEBUG oslo_concurrency.lockutils [req-f398add8-f2e7-4395-ad03-cf0fb62ea152 req-13cb35ba-a3a5-4317-a517-9d4eae6743c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.452 227766 DEBUG oslo_concurrency.lockutils [req-f398add8-f2e7-4395-ad03-cf0fb62ea152 req-13cb35ba-a3a5-4317-a517-9d4eae6743c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.453 227766 DEBUG oslo_concurrency.lockutils [req-f398add8-f2e7-4395-ad03-cf0fb62ea152 req-13cb35ba-a3a5-4317-a517-9d4eae6743c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.453 227766 DEBUG nova.compute.manager [req-f398add8-f2e7-4395-ad03-cf0fb62ea152 req-13cb35ba-a3a5-4317-a517-9d4eae6743c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] No waiting events found dispatching network-vif-unplugged-c9c463b9-3793-44a9-9773-69cc1638096d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.453 227766 DEBUG nova.compute.manager [req-f398add8-f2e7-4395-ad03-cf0fb62ea152 req-13cb35ba-a3a5-4317-a517-9d4eae6743c2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received event network-vif-unplugged-c9c463b9-3793-44a9-9773-69cc1638096d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.622 227766 INFO nova.virt.libvirt.driver [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Deleting instance files /var/lib/nova/instances/61f18fb1-66aa-4089-b98f-50b8a49800ff_del#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.624 227766 INFO nova.virt.libvirt.driver [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Deletion of /var/lib/nova/instances/61f18fb1-66aa-4089-b98f-50b8a49800ff_del complete#033[00m
Jan 23 05:03:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:14.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.915 227766 INFO nova.compute.manager [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Took 1.02 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.916 227766 DEBUG oslo.service.loopingcall [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.916 227766 DEBUG nova.compute.manager [-] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:03:14 np0005593234 nova_compute[227762]: 2026-01-23 10:03:14.917 227766 DEBUG nova.network.neutron [-] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:03:15 np0005593234 nova_compute[227762]: 2026-01-23 10:03:15.160 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:15 np0005593234 nova_compute[227762]: 2026-01-23 10:03:15.367 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:16 np0005593234 nova_compute[227762]: 2026-01-23 10:03:16.237 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:16.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:16.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:17 np0005593234 nova_compute[227762]: 2026-01-23 10:03:17.027 227766 DEBUG nova.compute.manager [req-a8376f5c-69e8-4cc4-bffc-71133622976e req-8fbe0a43-7d79-4779-836d-539cc096d6c3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received event network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:17 np0005593234 nova_compute[227762]: 2026-01-23 10:03:17.027 227766 DEBUG oslo_concurrency.lockutils [req-a8376f5c-69e8-4cc4-bffc-71133622976e req-8fbe0a43-7d79-4779-836d-539cc096d6c3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:17 np0005593234 nova_compute[227762]: 2026-01-23 10:03:17.027 227766 DEBUG oslo_concurrency.lockutils [req-a8376f5c-69e8-4cc4-bffc-71133622976e req-8fbe0a43-7d79-4779-836d-539cc096d6c3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:17 np0005593234 nova_compute[227762]: 2026-01-23 10:03:17.027 227766 DEBUG oslo_concurrency.lockutils [req-a8376f5c-69e8-4cc4-bffc-71133622976e req-8fbe0a43-7d79-4779-836d-539cc096d6c3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:17 np0005593234 nova_compute[227762]: 2026-01-23 10:03:17.028 227766 DEBUG nova.compute.manager [req-a8376f5c-69e8-4cc4-bffc-71133622976e req-8fbe0a43-7d79-4779-836d-539cc096d6c3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] No waiting events found dispatching network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:17 np0005593234 nova_compute[227762]: 2026-01-23 10:03:17.028 227766 WARNING nova.compute.manager [req-a8376f5c-69e8-4cc4-bffc-71133622976e req-8fbe0a43-7d79-4779-836d-539cc096d6c3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received unexpected event network-vif-plugged-c9c463b9-3793-44a9-9773-69cc1638096d for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:03:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:18.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:18.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:19 np0005593234 nova_compute[227762]: 2026-01-23 10:03:19.169 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:19 np0005593234 nova_compute[227762]: 2026-01-23 10:03:19.720 227766 DEBUG nova.network.neutron [-] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:03:19 np0005593234 nova_compute[227762]: 2026-01-23 10:03:19.745 227766 INFO nova.compute.manager [-] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Took 4.83 seconds to deallocate network for instance.#033[00m
Jan 23 05:03:19 np0005593234 nova_compute[227762]: 2026-01-23 10:03:19.798 227766 DEBUG oslo_concurrency.lockutils [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:19 np0005593234 nova_compute[227762]: 2026-01-23 10:03:19.799 227766 DEBUG oslo_concurrency.lockutils [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:19 np0005593234 nova_compute[227762]: 2026-01-23 10:03:19.839 227766 DEBUG nova.compute.manager [req-b96310bd-acd5-445a-b45c-6961371c8214 req-0fed47bb-9afc-4612-ad9e-2efd02d8306d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Received event network-vif-deleted-c9c463b9-3793-44a9-9773-69cc1638096d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:19 np0005593234 nova_compute[227762]: 2026-01-23 10:03:19.855 227766 DEBUG oslo_concurrency.processutils [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:03:20 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1162929138' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:03:20 np0005593234 nova_compute[227762]: 2026-01-23 10:03:20.322 227766 DEBUG oslo_concurrency.processutils [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:20 np0005593234 nova_compute[227762]: 2026-01-23 10:03:20.327 227766 DEBUG nova.compute.provider_tree [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:03:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:20.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:20 np0005593234 nova_compute[227762]: 2026-01-23 10:03:20.376 227766 DEBUG nova.scheduler.client.report [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:03:20 np0005593234 nova_compute[227762]: 2026-01-23 10:03:20.419 227766 DEBUG oslo_concurrency.lockutils [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.620s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:20 np0005593234 nova_compute[227762]: 2026-01-23 10:03:20.457 227766 INFO nova.scheduler.client.report [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Deleted allocations for instance 61f18fb1-66aa-4089-b98f-50b8a49800ff#033[00m
Jan 23 05:03:20 np0005593234 nova_compute[227762]: 2026-01-23 10:03:20.531 227766 DEBUG oslo_concurrency.lockutils [None req-7da455f1-2dc8-4a2f-ba73-9f73aa32bfc5 c09e682996b940dc97c866f9e4f1e74e 0f5ca0233c1a490aa2d596b88a0ec503 - - default default] Lock "61f18fb1-66aa-4089-b98f-50b8a49800ff" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:20.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:21 np0005593234 nova_compute[227762]: 2026-01-23 10:03:21.238 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:21 np0005593234 nova_compute[227762]: 2026-01-23 10:03:21.251 227766 DEBUG nova.compute.manager [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 23 05:03:21 np0005593234 nova_compute[227762]: 2026-01-23 10:03:21.378 227766 DEBUG oslo_concurrency.lockutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:21 np0005593234 nova_compute[227762]: 2026-01-23 10:03:21.379 227766 DEBUG oslo_concurrency.lockutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:21 np0005593234 nova_compute[227762]: 2026-01-23 10:03:21.406 227766 DEBUG nova.objects.instance [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'pci_requests' on Instance uuid 1bdbf4d2-447b-47d0-8b3f-878ee65905a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:21 np0005593234 nova_compute[227762]: 2026-01-23 10:03:21.449 227766 DEBUG nova.virt.hardware [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:03:21 np0005593234 nova_compute[227762]: 2026-01-23 10:03:21.450 227766 INFO nova.compute.claims [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:03:21 np0005593234 nova_compute[227762]: 2026-01-23 10:03:21.450 227766 DEBUG nova.objects.instance [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'resources' on Instance uuid 1bdbf4d2-447b-47d0-8b3f-878ee65905a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:21 np0005593234 nova_compute[227762]: 2026-01-23 10:03:21.469 227766 DEBUG nova.objects.instance [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'pci_devices' on Instance uuid 1bdbf4d2-447b-47d0-8b3f-878ee65905a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:21 np0005593234 nova_compute[227762]: 2026-01-23 10:03:21.519 227766 INFO nova.compute.resource_tracker [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Updating resource usage from migration 568d5ba9-cc8d-4d7d-83d3-6c9eb255b306#033[00m
Jan 23 05:03:21 np0005593234 nova_compute[227762]: 2026-01-23 10:03:21.520 227766 DEBUG nova.compute.resource_tracker [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Starting to track incoming migration 568d5ba9-cc8d-4d7d-83d3-6c9eb255b306 with flavor eebea5f8-9b11-45ad-873d-c4ea90d3de87 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 23 05:03:21 np0005593234 nova_compute[227762]: 2026-01-23 10:03:21.602 227766 DEBUG oslo_concurrency.processutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:03:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1866588629' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:03:22 np0005593234 nova_compute[227762]: 2026-01-23 10:03:22.021 227766 DEBUG oslo_concurrency.processutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:22 np0005593234 nova_compute[227762]: 2026-01-23 10:03:22.028 227766 DEBUG nova.compute.provider_tree [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:03:22 np0005593234 nova_compute[227762]: 2026-01-23 10:03:22.048 227766 DEBUG nova.scheduler.client.report [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:03:22 np0005593234 nova_compute[227762]: 2026-01-23 10:03:22.079 227766 DEBUG oslo_concurrency.lockutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.700s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:22 np0005593234 nova_compute[227762]: 2026-01-23 10:03:22.079 227766 INFO nova.compute.manager [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Migrating#033[00m
Jan 23 05:03:22 np0005593234 nova_compute[227762]: 2026-01-23 10:03:22.231 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162587.2300396, f4412d8b-963a-4c3a-accf-68f8cf82c864 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:22 np0005593234 nova_compute[227762]: 2026-01-23 10:03:22.231 227766 INFO nova.compute.manager [-] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:03:22 np0005593234 nova_compute[227762]: 2026-01-23 10:03:22.240 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162587.2384548, b0b5a1b2-04bd-48e0-a0f7-0c679d784e04 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:22 np0005593234 nova_compute[227762]: 2026-01-23 10:03:22.240 227766 INFO nova.compute.manager [-] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:03:22 np0005593234 nova_compute[227762]: 2026-01-23 10:03:22.263 227766 DEBUG nova.compute.manager [None req-dd9c8217-b42c-40a2-9232-e66a493f573f - - - - - -] [instance: f4412d8b-963a-4c3a-accf-68f8cf82c864] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:22 np0005593234 nova_compute[227762]: 2026-01-23 10:03:22.269 227766 DEBUG nova.compute.manager [None req-b91486af-6163-41f7-9d3a-90d40faa1332 - - - - - -] [instance: b0b5a1b2-04bd-48e0-a0f7-0c679d784e04] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.003000095s ======
Jan 23 05:03:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:22.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000095s
Jan 23 05:03:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:22.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:24 np0005593234 nova_compute[227762]: 2026-01-23 10:03:24.170 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:24.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:24 np0005593234 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 05:03:24 np0005593234 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 05:03:24 np0005593234 systemd-logind[794]: New session 63 of user nova.
Jan 23 05:03:24 np0005593234 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 05:03:24 np0005593234 systemd[1]: Starting User Manager for UID 42436...
Jan 23 05:03:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:24.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:24 np0005593234 systemd[275056]: Queued start job for default target Main User Target.
Jan 23 05:03:24 np0005593234 systemd[275056]: Created slice User Application Slice.
Jan 23 05:03:24 np0005593234 systemd[275056]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 05:03:24 np0005593234 systemd[275056]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 05:03:24 np0005593234 systemd[275056]: Reached target Paths.
Jan 23 05:03:24 np0005593234 systemd[275056]: Reached target Timers.
Jan 23 05:03:24 np0005593234 systemd[275056]: Starting D-Bus User Message Bus Socket...
Jan 23 05:03:24 np0005593234 systemd[275056]: Starting Create User's Volatile Files and Directories...
Jan 23 05:03:24 np0005593234 systemd[275056]: Listening on D-Bus User Message Bus Socket.
Jan 23 05:03:24 np0005593234 systemd[275056]: Reached target Sockets.
Jan 23 05:03:24 np0005593234 systemd[275056]: Finished Create User's Volatile Files and Directories.
Jan 23 05:03:24 np0005593234 systemd[275056]: Reached target Basic System.
Jan 23 05:03:24 np0005593234 systemd[275056]: Reached target Main User Target.
Jan 23 05:03:24 np0005593234 systemd[275056]: Startup finished in 139ms.
Jan 23 05:03:24 np0005593234 systemd[1]: Started User Manager for UID 42436.
Jan 23 05:03:24 np0005593234 systemd[1]: Started Session 63 of User nova.
Jan 23 05:03:24 np0005593234 podman[275071]: 2026-01-23 10:03:24.878319696 +0000 UTC m=+0.063064272 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:03:24 np0005593234 systemd[1]: session-63.scope: Deactivated successfully.
Jan 23 05:03:24 np0005593234 systemd-logind[794]: Session 63 logged out. Waiting for processes to exit.
Jan 23 05:03:24 np0005593234 systemd-logind[794]: Removed session 63.
Jan 23 05:03:25 np0005593234 systemd-logind[794]: New session 65 of user nova.
Jan 23 05:03:25 np0005593234 systemd[1]: Started Session 65 of User nova.
Jan 23 05:03:25 np0005593234 systemd[1]: session-65.scope: Deactivated successfully.
Jan 23 05:03:25 np0005593234 systemd-logind[794]: Session 65 logged out. Waiting for processes to exit.
Jan 23 05:03:25 np0005593234 systemd-logind[794]: Removed session 65.
Jan 23 05:03:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:26 np0005593234 nova_compute[227762]: 2026-01-23 10:03:26.240 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:26.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:26.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:27 np0005593234 nova_compute[227762]: 2026-01-23 10:03:27.863 227766 DEBUG nova.compute.manager [req-735a8fb7-99cc-4b80-adca-87fa6d702271 req-280c2595-4d5e-4102-8570-1d0a90a1a49e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Received event network-vif-unplugged-35f84523-a0b5-4102-ba04-cc5da6075d54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:27 np0005593234 nova_compute[227762]: 2026-01-23 10:03:27.863 227766 DEBUG oslo_concurrency.lockutils [req-735a8fb7-99cc-4b80-adca-87fa6d702271 req-280c2595-4d5e-4102-8570-1d0a90a1a49e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:27 np0005593234 nova_compute[227762]: 2026-01-23 10:03:27.863 227766 DEBUG oslo_concurrency.lockutils [req-735a8fb7-99cc-4b80-adca-87fa6d702271 req-280c2595-4d5e-4102-8570-1d0a90a1a49e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:27 np0005593234 nova_compute[227762]: 2026-01-23 10:03:27.863 227766 DEBUG oslo_concurrency.lockutils [req-735a8fb7-99cc-4b80-adca-87fa6d702271 req-280c2595-4d5e-4102-8570-1d0a90a1a49e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:27 np0005593234 nova_compute[227762]: 2026-01-23 10:03:27.863 227766 DEBUG nova.compute.manager [req-735a8fb7-99cc-4b80-adca-87fa6d702271 req-280c2595-4d5e-4102-8570-1d0a90a1a49e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] No waiting events found dispatching network-vif-unplugged-35f84523-a0b5-4102-ba04-cc5da6075d54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:27 np0005593234 nova_compute[227762]: 2026-01-23 10:03:27.864 227766 WARNING nova.compute.manager [req-735a8fb7-99cc-4b80-adca-87fa6d702271 req-280c2595-4d5e-4102-8570-1d0a90a1a49e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Received unexpected event network-vif-unplugged-35f84523-a0b5-4102-ba04-cc5da6075d54 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 23 05:03:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:28.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:28.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:28 np0005593234 nova_compute[227762]: 2026-01-23 10:03:28.697 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:29 np0005593234 nova_compute[227762]: 2026-01-23 10:03:29.130 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162594.1284251, 61f18fb1-66aa-4089-b98f-50b8a49800ff => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:29 np0005593234 nova_compute[227762]: 2026-01-23 10:03:29.130 227766 INFO nova.compute.manager [-] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:03:29 np0005593234 nova_compute[227762]: 2026-01-23 10:03:29.155 227766 DEBUG nova.compute.manager [None req-95fd4516-1687-4e74-a1ba-1e1d22f36a4d - - - - - -] [instance: 61f18fb1-66aa-4089-b98f-50b8a49800ff] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:29 np0005593234 nova_compute[227762]: 2026-01-23 10:03:29.172 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:29 np0005593234 nova_compute[227762]: 2026-01-23 10:03:29.347 227766 INFO nova.network.neutron [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Updating port 35f84523-a0b5-4102-ba04-cc5da6075d54 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.031 227766 DEBUG nova.compute.manager [req-dd01ca6a-b817-4cbb-9ae4-afaf80dad27f req-8e6cfb1d-091b-4f55-88a4-c96bb8c0325f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Received event network-vif-plugged-35f84523-a0b5-4102-ba04-cc5da6075d54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.031 227766 DEBUG oslo_concurrency.lockutils [req-dd01ca6a-b817-4cbb-9ae4-afaf80dad27f req-8e6cfb1d-091b-4f55-88a4-c96bb8c0325f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.031 227766 DEBUG oslo_concurrency.lockutils [req-dd01ca6a-b817-4cbb-9ae4-afaf80dad27f req-8e6cfb1d-091b-4f55-88a4-c96bb8c0325f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.031 227766 DEBUG oslo_concurrency.lockutils [req-dd01ca6a-b817-4cbb-9ae4-afaf80dad27f req-8e6cfb1d-091b-4f55-88a4-c96bb8c0325f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.032 227766 DEBUG nova.compute.manager [req-dd01ca6a-b817-4cbb-9ae4-afaf80dad27f req-8e6cfb1d-091b-4f55-88a4-c96bb8c0325f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] No waiting events found dispatching network-vif-plugged-35f84523-a0b5-4102-ba04-cc5da6075d54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.032 227766 WARNING nova.compute.manager [req-dd01ca6a-b817-4cbb-9ae4-afaf80dad27f req-8e6cfb1d-091b-4f55-88a4-c96bb8c0325f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Received unexpected event network-vif-plugged-35f84523-a0b5-4102-ba04-cc5da6075d54 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 05:03:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:30.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.429 227766 DEBUG oslo_concurrency.lockutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "refresh_cache-1bdbf4d2-447b-47d0-8b3f-878ee65905a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.429 227766 DEBUG oslo_concurrency.lockutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquired lock "refresh_cache-1bdbf4d2-447b-47d0-8b3f-878ee65905a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.429 227766 DEBUG nova.network.neutron [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.654 227766 DEBUG nova.compute.manager [req-036dd269-5d50-4301-93ca-855fa8f851db req-0f8b00cb-d70f-4e7f-9913-502ec932c8a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Received event network-changed-35f84523-a0b5-4102-ba04-cc5da6075d54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.655 227766 DEBUG nova.compute.manager [req-036dd269-5d50-4301-93ca-855fa8f851db req-0f8b00cb-d70f-4e7f-9913-502ec932c8a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Refreshing instance network info cache due to event network-changed-35f84523-a0b5-4102-ba04-cc5da6075d54. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.655 227766 DEBUG oslo_concurrency.lockutils [req-036dd269-5d50-4301-93ca-855fa8f851db req-0f8b00cb-d70f-4e7f-9913-502ec932c8a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-1bdbf4d2-447b-47d0-8b3f-878ee65905a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:03:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:30.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.770 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.770 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.771 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.771 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:03:30 np0005593234 nova_compute[227762]: 2026-01-23 10:03:30.771 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e262 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:03:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3799992887' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:03:31 np0005593234 nova_compute[227762]: 2026-01-23 10:03:31.217 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:31 np0005593234 nova_compute[227762]: 2026-01-23 10:03:31.243 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:31 np0005593234 nova_compute[227762]: 2026-01-23 10:03:31.404 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:03:31 np0005593234 nova_compute[227762]: 2026-01-23 10:03:31.405 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4480MB free_disk=20.896987915039062GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:03:31 np0005593234 nova_compute[227762]: 2026-01-23 10:03:31.406 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:31 np0005593234 nova_compute[227762]: 2026-01-23 10:03:31.406 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:31 np0005593234 nova_compute[227762]: 2026-01-23 10:03:31.516 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Applying migration context for instance 1bdbf4d2-447b-47d0-8b3f-878ee65905a7 as it has an incoming, in-progress migration 568d5ba9-cc8d-4d7d-83d3-6c9eb255b306. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 23 05:03:31 np0005593234 nova_compute[227762]: 2026-01-23 10:03:31.517 227766 INFO nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Updating resource usage from migration 568d5ba9-cc8d-4d7d-83d3-6c9eb255b306#033[00m
Jan 23 05:03:31 np0005593234 nova_compute[227762]: 2026-01-23 10:03:31.725 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 1bdbf4d2-447b-47d0-8b3f-878ee65905a7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 192, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:03:31 np0005593234 nova_compute[227762]: 2026-01-23 10:03:31.725 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:03:31 np0005593234 nova_compute[227762]: 2026-01-23 10:03:31.726 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:03:31 np0005593234 podman[275390]: 2026-01-23 10:03:31.931429242 +0000 UTC m=+0.035926213 container create c489a18a61af94cbb1b74c4a6408bf98ab240597f8d10bd89dc4de39b6c0f921 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=friendly_napier, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True)
Jan 23 05:03:31 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:03:31 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:03:31 np0005593234 systemd[1]: Started libpod-conmon-c489a18a61af94cbb1b74c4a6408bf98ab240597f8d10bd89dc4de39b6c0f921.scope.
Jan 23 05:03:32 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:03:32 np0005593234 podman[275390]: 2026-01-23 10:03:31.915697521 +0000 UTC m=+0.020194512 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 05:03:32 np0005593234 podman[275390]: 2026-01-23 10:03:32.016325564 +0000 UTC m=+0.120822555 container init c489a18a61af94cbb1b74c4a6408bf98ab240597f8d10bd89dc4de39b6c0f921 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=friendly_napier, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507)
Jan 23 05:03:32 np0005593234 podman[275390]: 2026-01-23 10:03:32.022930981 +0000 UTC m=+0.127427952 container start c489a18a61af94cbb1b74c4a6408bf98ab240597f8d10bd89dc4de39b6c0f921 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=friendly_napier, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250507, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 05:03:32 np0005593234 podman[275390]: 2026-01-23 10:03:32.025690017 +0000 UTC m=+0.130187058 container attach c489a18a61af94cbb1b74c4a6408bf98ab240597f8d10bd89dc4de39b6c0f921 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=friendly_napier, CEPH_REF=reef, io.buildah.version=1.39.3, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 05:03:32 np0005593234 friendly_napier[275406]: 167 167
Jan 23 05:03:32 np0005593234 systemd[1]: libpod-c489a18a61af94cbb1b74c4a6408bf98ab240597f8d10bd89dc4de39b6c0f921.scope: Deactivated successfully.
Jan 23 05:03:32 np0005593234 podman[275390]: 2026-01-23 10:03:32.029264318 +0000 UTC m=+0.133761299 container died c489a18a61af94cbb1b74c4a6408bf98ab240597f8d10bd89dc4de39b6c0f921 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=friendly_napier, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 05:03:32 np0005593234 nova_compute[227762]: 2026-01-23 10:03:32.040 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:32 np0005593234 systemd[1]: var-lib-containers-storage-overlay-4007337dfbc35d66eb11335b53909b1db9aa48108d881a81952f1c75a314eb0e-merged.mount: Deactivated successfully.
Jan 23 05:03:32 np0005593234 podman[275390]: 2026-01-23 10:03:32.065794989 +0000 UTC m=+0.170291960 container remove c489a18a61af94cbb1b74c4a6408bf98ab240597f8d10bd89dc4de39b6c0f921 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=friendly_napier, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.schema-version=1.0)
Jan 23 05:03:32 np0005593234 systemd[1]: libpod-conmon-c489a18a61af94cbb1b74c4a6408bf98ab240597f8d10bd89dc4de39b6c0f921.scope: Deactivated successfully.
Jan 23 05:03:32 np0005593234 podman[275443]: 2026-01-23 10:03:32.225711214 +0000 UTC m=+0.048604320 container create d29eda1e860cd960533379c9c3348f4d6d6e1a8ad2f449f4c346d439fdca13cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_leakey, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef)
Jan 23 05:03:32 np0005593234 systemd[1]: Started libpod-conmon-d29eda1e860cd960533379c9c3348f4d6d6e1a8ad2f449f4c346d439fdca13cd.scope.
Jan 23 05:03:32 np0005593234 podman[275443]: 2026-01-23 10:03:32.199747312 +0000 UTC m=+0.022640448 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 05:03:32 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:03:32 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d32bc4744fa9f65d1cbdae36f4d9c5c80823bb7ec206fe6653639040bdcb35b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 05:03:32 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d32bc4744fa9f65d1cbdae36f4d9c5c80823bb7ec206fe6653639040bdcb35b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 05:03:32 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d32bc4744fa9f65d1cbdae36f4d9c5c80823bb7ec206fe6653639040bdcb35b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:03:32 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0d32bc4744fa9f65d1cbdae36f4d9c5c80823bb7ec206fe6653639040bdcb35b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 05:03:32 np0005593234 podman[275443]: 2026-01-23 10:03:32.337195886 +0000 UTC m=+0.160089012 container init d29eda1e860cd960533379c9c3348f4d6d6e1a8ad2f449f4c346d439fdca13cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_leakey, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=reef, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3)
Jan 23 05:03:32 np0005593234 podman[275443]: 2026-01-23 10:03:32.343839543 +0000 UTC m=+0.166732649 container start d29eda1e860cd960533379c9c3348f4d6d6e1a8ad2f449f4c346d439fdca13cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_leakey, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.39.3)
Jan 23 05:03:32 np0005593234 podman[275443]: 2026-01-23 10:03:32.347847778 +0000 UTC m=+0.170740904 container attach d29eda1e860cd960533379c9c3348f4d6d6e1a8ad2f449f4c346d439fdca13cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_leakey, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 05:03:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:32.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:03:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2300751788' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:03:32 np0005593234 nova_compute[227762]: 2026-01-23 10:03:32.518 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:32 np0005593234 nova_compute[227762]: 2026-01-23 10:03:32.525 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:03:32 np0005593234 nova_compute[227762]: 2026-01-23 10:03:32.544 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:03:32 np0005593234 nova_compute[227762]: 2026-01-23 10:03:32.565 227766 DEBUG nova.network.neutron [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Updating instance_info_cache with network_info: [{"id": "35f84523-a0b5-4102-ba04-cc5da6075d54", "address": "fa:16:3e:b8:d6:dc", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35f84523-a0", "ovs_interfaceid": "35f84523-a0b5-4102-ba04-cc5da6075d54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:03:32 np0005593234 nova_compute[227762]: 2026-01-23 10:03:32.590 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:03:32 np0005593234 nova_compute[227762]: 2026-01-23 10:03:32.591 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:32 np0005593234 nova_compute[227762]: 2026-01-23 10:03:32.594 227766 DEBUG oslo_concurrency.lockutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Releasing lock "refresh_cache-1bdbf4d2-447b-47d0-8b3f-878ee65905a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:03:32 np0005593234 nova_compute[227762]: 2026-01-23 10:03:32.598 227766 DEBUG oslo_concurrency.lockutils [req-036dd269-5d50-4301-93ca-855fa8f851db req-0f8b00cb-d70f-4e7f-9913-502ec932c8a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-1bdbf4d2-447b-47d0-8b3f-878ee65905a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:03:32 np0005593234 nova_compute[227762]: 2026-01-23 10:03:32.598 227766 DEBUG nova.network.neutron [req-036dd269-5d50-4301-93ca-855fa8f851db req-0f8b00cb-d70f-4e7f-9913-502ec932c8a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Refreshing network info cache for port 35f84523-a0b5-4102-ba04-cc5da6075d54 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:03:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:32.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:32 np0005593234 nova_compute[227762]: 2026-01-23 10:03:32.708 227766 DEBUG nova.virt.libvirt.driver [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 23 05:03:32 np0005593234 nova_compute[227762]: 2026-01-23 10:03:32.710 227766 DEBUG nova.virt.libvirt.driver [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 05:03:32 np0005593234 nova_compute[227762]: 2026-01-23 10:03:32.711 227766 INFO nova.virt.libvirt.driver [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Creating image(s)#033[00m
Jan 23 05:03:32 np0005593234 nova_compute[227762]: 2026-01-23 10:03:32.748 227766 DEBUG nova.storage.rbd_utils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] creating snapshot(nova-resize) on rbd image(1bdbf4d2-447b-47d0-8b3f-878ee65905a7_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:03:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e263 e263: 3 total, 3 up, 3 in
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.059 227766 DEBUG nova.objects.instance [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1bdbf4d2-447b-47d0-8b3f-878ee65905a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.183 227766 DEBUG nova.virt.libvirt.driver [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.184 227766 DEBUG nova.virt.libvirt.driver [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Ensure instance console log exists: /var/lib/nova/instances/1bdbf4d2-447b-47d0-8b3f-878ee65905a7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.185 227766 DEBUG oslo_concurrency.lockutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.185 227766 DEBUG oslo_concurrency.lockutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.185 227766 DEBUG oslo_concurrency.lockutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.189 227766 DEBUG nova.virt.libvirt.driver [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Start _get_guest_xml network_info=[{"id": "35f84523-a0b5-4102-ba04-cc5da6075d54", "address": "fa:16:3e:b8:d6:dc", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-267124880-network", "vif_mac": "fa:16:3e:b8:d6:dc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35f84523-a0", "ovs_interfaceid": "35f84523-a0b5-4102-ba04-cc5da6075d54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.194 227766 WARNING nova.virt.libvirt.driver [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.201 227766 DEBUG nova.virt.libvirt.host [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.202 227766 DEBUG nova.virt.libvirt.host [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.206 227766 DEBUG nova.virt.libvirt.host [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.206 227766 DEBUG nova.virt.libvirt.host [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.208 227766 DEBUG nova.virt.libvirt.driver [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.208 227766 DEBUG nova.virt.hardware [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eebea5f8-9b11-45ad-873d-c4ea90d3de87',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.208 227766 DEBUG nova.virt.hardware [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.209 227766 DEBUG nova.virt.hardware [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.209 227766 DEBUG nova.virt.hardware [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.209 227766 DEBUG nova.virt.hardware [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.209 227766 DEBUG nova.virt.hardware [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.209 227766 DEBUG nova.virt.hardware [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.210 227766 DEBUG nova.virt.hardware [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.210 227766 DEBUG nova.virt.hardware [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.210 227766 DEBUG nova.virt.hardware [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.210 227766 DEBUG nova.virt.hardware [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.211 227766 DEBUG nova.objects.instance [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'vcpu_model' on Instance uuid 1bdbf4d2-447b-47d0-8b3f-878ee65905a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.241 227766 DEBUG oslo_concurrency.processutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:33 np0005593234 nice_leakey[275468]: [
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:    {
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:        "available": false,
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:        "ceph_device": false,
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:        "lsm_data": {},
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:        "lvs": [],
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:        "path": "/dev/sr0",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:        "rejected_reasons": [
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "Has a FileSystem",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "Insufficient space (<5GB)"
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:        ],
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:        "sys_api": {
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "actuators": null,
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "device_nodes": "sr0",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "devname": "sr0",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "human_readable_size": "482.00 KB",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "id_bus": "ata",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "model": "QEMU DVD-ROM",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "nr_requests": "2",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "parent": "/dev/sr0",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "partitions": {},
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "path": "/dev/sr0",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "removable": "1",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "rev": "2.5+",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "ro": "0",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "rotational": "1",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "sas_address": "",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "sas_device_handle": "",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "scheduler_mode": "mq-deadline",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "sectors": 0,
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "sectorsize": "2048",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "size": 493568.0,
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "support_discard": "2048",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "type": "disk",
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:            "vendor": "QEMU"
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:        }
Jan 23 05:03:33 np0005593234 nice_leakey[275468]:    }
Jan 23 05:03:33 np0005593234 nice_leakey[275468]: ]
Jan 23 05:03:33 np0005593234 systemd[1]: libpod-d29eda1e860cd960533379c9c3348f4d6d6e1a8ad2f449f4c346d439fdca13cd.scope: Deactivated successfully.
Jan 23 05:03:33 np0005593234 podman[275443]: 2026-01-23 10:03:33.565758798 +0000 UTC m=+1.388651904 container died d29eda1e860cd960533379c9c3348f4d6d6e1a8ad2f449f4c346d439fdca13cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_leakey, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2)
Jan 23 05:03:33 np0005593234 systemd[1]: libpod-d29eda1e860cd960533379c9c3348f4d6d6e1a8ad2f449f4c346d439fdca13cd.scope: Consumed 1.206s CPU time.
Jan 23 05:03:33 np0005593234 systemd[1]: var-lib-containers-storage-overlay-0d32bc4744fa9f65d1cbdae36f4d9c5c80823bb7ec206fe6653639040bdcb35b-merged.mount: Deactivated successfully.
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.595 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.597 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:03:33 np0005593234 podman[275443]: 2026-01-23 10:03:33.621395287 +0000 UTC m=+1.444288393 container remove d29eda1e860cd960533379c9c3348f4d6d6e1a8ad2f449f4c346d439fdca13cd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=nice_leakey, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 05:03:33 np0005593234 systemd[1]: libpod-conmon-d29eda1e860cd960533379c9c3348f4d6d6e1a8ad2f449f4c346d439fdca13cd.scope: Deactivated successfully.
Jan 23 05:03:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:03:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/84045615' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.673 227766 DEBUG oslo_concurrency.processutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.705 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.705 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:33 np0005593234 nova_compute[227762]: 2026-01-23 10:03:33.708 227766 DEBUG oslo_concurrency.processutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:03:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:03:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:03:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:03:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:03:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:03:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1360198651' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.159 227766 DEBUG oslo_concurrency.processutils [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.161 227766 DEBUG nova.virt.libvirt.vif [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:59:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1908507141',display_name='tempest-ServerActionsTestJSON-server-1908507141',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1908507141',id=93,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:59:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-ii3s65d1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:03:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=1bdbf4d2-447b-47d0-8b3f-878ee65905a7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35f84523-a0b5-4102-ba04-cc5da6075d54", "address": "fa:16:3e:b8:d6:dc", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-267124880-network", "vif_mac": "fa:16:3e:b8:d6:dc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35f84523-a0", "ovs_interfaceid": "35f84523-a0b5-4102-ba04-cc5da6075d54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.161 227766 DEBUG nova.network.os_vif_util [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "35f84523-a0b5-4102-ba04-cc5da6075d54", "address": "fa:16:3e:b8:d6:dc", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-267124880-network", "vif_mac": "fa:16:3e:b8:d6:dc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35f84523-a0", "ovs_interfaceid": "35f84523-a0b5-4102-ba04-cc5da6075d54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.162 227766 DEBUG nova.network.os_vif_util [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:d6:dc,bridge_name='br-int',has_traffic_filtering=True,id=35f84523-a0b5-4102-ba04-cc5da6075d54,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35f84523-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.165 227766 DEBUG nova.virt.libvirt.driver [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  <uuid>1bdbf4d2-447b-47d0-8b3f-878ee65905a7</uuid>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  <name>instance-0000005d</name>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  <memory>196608</memory>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerActionsTestJSON-server-1908507141</nova:name>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:03:33</nova:creationTime>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.micro">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <nova:memory>192</nova:memory>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <nova:user uuid="9d4a5c201efa4992a9ef57d8abdc1675">tempest-ServerActionsTestJSON-1619235720-project-member</nova:user>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <nova:project uuid="74c5c1d0762242f29a5d26033efd9f6d">tempest-ServerActionsTestJSON-1619235720</nova:project>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <nova:port uuid="35f84523-a0b5-4102-ba04-cc5da6075d54">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <entry name="serial">1bdbf4d2-447b-47d0-8b3f-878ee65905a7</entry>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <entry name="uuid">1bdbf4d2-447b-47d0-8b3f-878ee65905a7</entry>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1bdbf4d2-447b-47d0-8b3f-878ee65905a7_disk">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1bdbf4d2-447b-47d0-8b3f-878ee65905a7_disk.config">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:b8:d6:dc"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <target dev="tap35f84523-a0"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/1bdbf4d2-447b-47d0-8b3f-878ee65905a7/console.log" append="off"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:03:34 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:03:34 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:03:34 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:03:34 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.167 227766 DEBUG nova.virt.libvirt.vif [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:59:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1908507141',display_name='tempest-ServerActionsTestJSON-server-1908507141',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1908507141',id=93,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T09:59:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-ii3s65d1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:03:29Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=1bdbf4d2-447b-47d0-8b3f-878ee65905a7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35f84523-a0b5-4102-ba04-cc5da6075d54", "address": "fa:16:3e:b8:d6:dc", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-267124880-network", "vif_mac": "fa:16:3e:b8:d6:dc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35f84523-a0", "ovs_interfaceid": "35f84523-a0b5-4102-ba04-cc5da6075d54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.167 227766 DEBUG nova.network.os_vif_util [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "35f84523-a0b5-4102-ba04-cc5da6075d54", "address": "fa:16:3e:b8:d6:dc", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-267124880-network", "vif_mac": "fa:16:3e:b8:d6:dc"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35f84523-a0", "ovs_interfaceid": "35f84523-a0b5-4102-ba04-cc5da6075d54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.168 227766 DEBUG nova.network.os_vif_util [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b8:d6:dc,bridge_name='br-int',has_traffic_filtering=True,id=35f84523-a0b5-4102-ba04-cc5da6075d54,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35f84523-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.168 227766 DEBUG os_vif [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:d6:dc,bridge_name='br-int',has_traffic_filtering=True,id=35f84523-a0b5-4102-ba04-cc5da6075d54,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35f84523-a0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.169 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.169 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.170 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.172 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.172 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap35f84523-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.173 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap35f84523-a0, col_values=(('external_ids', {'iface-id': '35f84523-a0b5-4102-ba04-cc5da6075d54', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b8:d6:dc', 'vm-uuid': '1bdbf4d2-447b-47d0-8b3f-878ee65905a7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.174 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:34 np0005593234 NetworkManager[48942]: <info>  [1769162614.1753] manager: (tap35f84523-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/206)
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.176 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.181 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.181 227766 INFO os_vif [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b8:d6:dc,bridge_name='br-int',has_traffic_filtering=True,id=35f84523-a0b5-4102-ba04-cc5da6075d54,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35f84523-a0')#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.258 227766 DEBUG nova.virt.libvirt.driver [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.259 227766 DEBUG nova.virt.libvirt.driver [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.260 227766 DEBUG nova.virt.libvirt.driver [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No VIF found with MAC fa:16:3e:b8:d6:dc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.261 227766 INFO nova.virt.libvirt.driver [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Using config drive#033[00m
Jan 23 05:03:34 np0005593234 kernel: tap35f84523-a0: entered promiscuous mode
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.338 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:34Z|00400|binding|INFO|Claiming lport 35f84523-a0b5-4102-ba04-cc5da6075d54 for this chassis.
Jan 23 05:03:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:34Z|00401|binding|INFO|35f84523-a0b5-4102-ba04-cc5da6075d54: Claiming fa:16:3e:b8:d6:dc 10.100.0.5
Jan 23 05:03:34 np0005593234 NetworkManager[48942]: <info>  [1769162614.3398] manager: (tap35f84523-a0): new Tun device (/org/freedesktop/NetworkManager/Devices/207)
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.346 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.348 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.348 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:34 np0005593234 NetworkManager[48942]: <info>  [1769162614.3498] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/208)
Jan 23 05:03:34 np0005593234 NetworkManager[48942]: <info>  [1769162614.3502] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/209)
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.357 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:d6:dc 10.100.0.5'], port_security=['fa:16:3e:b8:d6:dc 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '1bdbf4d2-447b-47d0-8b3f-878ee65905a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '12', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=35f84523-a0b5-4102-ba04-cc5da6075d54) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.359 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 35f84523-a0b5-4102-ba04-cc5da6075d54 in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c bound to our chassis#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.360 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee03d7c9-e107-41bf-95cc-5508578ad66c#033[00m
Jan 23 05:03:34 np0005593234 systemd-udevd[276827]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:03:34 np0005593234 systemd-machined[195626]: New machine qemu-46-instance-0000005d.
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.374 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[14ceec38-2039-4dbb-9cb5-6565024e7f7f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.375 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee03d7c9-e1 in ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.377 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee03d7c9-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.377 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a27d6e36-32f0-42cc-ab8a-414322272db5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.378 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[214def2e-1c91-4e3a-9788-460433e76de5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 NetworkManager[48942]: <info>  [1769162614.3837] device (tap35f84523-a0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:03:34 np0005593234 NetworkManager[48942]: <info>  [1769162614.3844] device (tap35f84523-a0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:03:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:34.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.391 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[1fa4fc6a-59f7-4602-ba19-73ec2e11d768]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 systemd[1]: Started Virtual Machine qemu-46-instance-0000005d.
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.416 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4373c83d-435d-4c9b-9d67-2194c1cb5ce7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.447 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[c9c3ed33-5130-494f-9e84-bad9a751befe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.469 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[314346f7-3125-4d93-bc41-2a7b22e9bd71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 systemd-udevd[276830]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:03:34 np0005593234 NetworkManager[48942]: <info>  [1769162614.4700] manager: (tapee03d7c9-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/210)
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.494 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.497 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.505 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[693f615f-65ac-42b3-9585-1bcbcb71c8df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.508 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ea5b3c04-f7eb-448a-b1cf-3c6fd94697f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.511 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:34Z|00402|binding|INFO|Setting lport 35f84523-a0b5-4102-ba04-cc5da6075d54 ovn-installed in OVS
Jan 23 05:03:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:34Z|00403|binding|INFO|Setting lport 35f84523-a0b5-4102-ba04-cc5da6075d54 up in Southbound
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.525 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:34 np0005593234 NetworkManager[48942]: <info>  [1769162614.5314] device (tapee03d7c9-e0): carrier: link connected
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.538 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9d1e092b-b645-4017-b85c-e496b7069bcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.553 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e6a99867-81b8-40a0-a85c-e3dd80d1709c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 650949, 'reachable_time': 18359, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 276859, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.568 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fef03941-2471-4653-99ed-37a75aeac84c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:6530'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 650949, 'tstamp': 650949}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 276868, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.584 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[390684fb-918a-4f92-bb5a-90268cb7bdfe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 130], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 650949, 'reachable_time': 18359, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 276878, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.618 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6b2908a6-75af-4aec-a268-b538a3218563]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.676 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[31a6e0a9-04b2-4889-821f-1160281bf29f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.678 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.678 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.678 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee03d7c9-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.680 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:34 np0005593234 kernel: tapee03d7c9-e0: entered promiscuous mode
Jan 23 05:03:34 np0005593234 NetworkManager[48942]: <info>  [1769162614.6806] manager: (tapee03d7c9-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/211)
Jan 23 05:03:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:34.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.682 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee03d7c9-e0, col_values=(('external_ids', {'iface-id': '702d4523-a665-42f5-9a36-57d187c0698a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:34Z|00404|binding|INFO|Releasing lport 702d4523-a665-42f5-9a36-57d187c0698a from this chassis (sb_readonly=0)
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.697 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.698 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.698 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8ef22f1c-e63f-4a89-9c2f-db60cc5f6a32]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.699 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:03:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:34.700 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'env', 'PROCESS_TAG=haproxy-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee03d7c9-e107-41bf-95cc-5508578ad66c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.738 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162614.737778, 1bdbf4d2-447b-47d0-8b3f-878ee65905a7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.738 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.740 227766 DEBUG nova.compute.manager [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.746 227766 INFO nova.virt.libvirt.driver [-] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Instance running successfully.#033[00m
Jan 23 05:03:34 np0005593234 virtqemud[227483]: argument unsupported: QEMU guest agent is not configured
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.749 227766 DEBUG nova.virt.libvirt.guest [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.750 227766 DEBUG nova.virt.libvirt.driver [None req-1fddec7d-2ba4-43c7-8f78-574f9a080d64 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.767 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.932 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.936 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.995 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.996 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162614.740055, 1bdbf4d2-447b-47d0-8b3f-878ee65905a7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:03:34 np0005593234 nova_compute[227762]: 2026-01-23 10:03:34.996 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] VM Started (Lifecycle Event)#033[00m
Jan 23 05:03:35 np0005593234 nova_compute[227762]: 2026-01-23 10:03:35.025 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:03:35 np0005593234 nova_compute[227762]: 2026-01-23 10:03:35.031 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:03:35 np0005593234 podman[276935]: 2026-01-23 10:03:35.099143333 +0000 UTC m=+0.052623305 container create a5cc482ccc7e006f7d5ceeb9889f88e8258517a3e67a15eb22ce7dd40aeabf12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:03:35 np0005593234 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 05:03:35 np0005593234 systemd[275056]: Activating special unit Exit the Session...
Jan 23 05:03:35 np0005593234 systemd[275056]: Stopped target Main User Target.
Jan 23 05:03:35 np0005593234 systemd[275056]: Stopped target Basic System.
Jan 23 05:03:35 np0005593234 systemd[275056]: Stopped target Paths.
Jan 23 05:03:35 np0005593234 systemd[275056]: Stopped target Sockets.
Jan 23 05:03:35 np0005593234 systemd[275056]: Stopped target Timers.
Jan 23 05:03:35 np0005593234 systemd[275056]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 05:03:35 np0005593234 systemd[275056]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 05:03:35 np0005593234 systemd[275056]: Closed D-Bus User Message Bus Socket.
Jan 23 05:03:35 np0005593234 systemd[275056]: Stopped Create User's Volatile Files and Directories.
Jan 23 05:03:35 np0005593234 systemd[275056]: Removed slice User Application Slice.
Jan 23 05:03:35 np0005593234 systemd[275056]: Reached target Shutdown.
Jan 23 05:03:35 np0005593234 systemd[275056]: Finished Exit the Session.
Jan 23 05:03:35 np0005593234 systemd[275056]: Reached target Exit the Session.
Jan 23 05:03:35 np0005593234 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 05:03:35 np0005593234 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 05:03:35 np0005593234 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 05:03:35 np0005593234 systemd[1]: Started libpod-conmon-a5cc482ccc7e006f7d5ceeb9889f88e8258517a3e67a15eb22ce7dd40aeabf12.scope.
Jan 23 05:03:35 np0005593234 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 05:03:35 np0005593234 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 05:03:35 np0005593234 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 05:03:35 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:03:35 np0005593234 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 05:03:35 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/700cdc8ae4b97d24d5135e1b19199d3adf7c847a8acb2946796732198c826756/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:03:35 np0005593234 podman[276935]: 2026-01-23 10:03:35.067468613 +0000 UTC m=+0.020948605 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:03:35 np0005593234 podman[276935]: 2026-01-23 10:03:35.177388827 +0000 UTC m=+0.130868829 container init a5cc482ccc7e006f7d5ceeb9889f88e8258517a3e67a15eb22ce7dd40aeabf12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:03:35 np0005593234 podman[276935]: 2026-01-23 10:03:35.182979801 +0000 UTC m=+0.136459763 container start a5cc482ccc7e006f7d5ceeb9889f88e8258517a3e67a15eb22ce7dd40aeabf12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 05:03:35 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[276951]: [NOTICE]   (276955) : New worker (276957) forked
Jan 23 05:03:35 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[276951]: [NOTICE]   (276955) : Loading success.
Jan 23 05:03:35 np0005593234 nova_compute[227762]: 2026-01-23 10:03:35.490 227766 DEBUG nova.compute.manager [req-2a66a981-26ed-4db3-9cae-3e594e263d43 req-b88c73c9-aec8-49a4-b162-52abc89f903d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Received event network-vif-plugged-35f84523-a0b5-4102-ba04-cc5da6075d54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:35 np0005593234 nova_compute[227762]: 2026-01-23 10:03:35.490 227766 DEBUG oslo_concurrency.lockutils [req-2a66a981-26ed-4db3-9cae-3e594e263d43 req-b88c73c9-aec8-49a4-b162-52abc89f903d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:35 np0005593234 nova_compute[227762]: 2026-01-23 10:03:35.491 227766 DEBUG oslo_concurrency.lockutils [req-2a66a981-26ed-4db3-9cae-3e594e263d43 req-b88c73c9-aec8-49a4-b162-52abc89f903d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:35 np0005593234 nova_compute[227762]: 2026-01-23 10:03:35.491 227766 DEBUG oslo_concurrency.lockutils [req-2a66a981-26ed-4db3-9cae-3e594e263d43 req-b88c73c9-aec8-49a4-b162-52abc89f903d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:35 np0005593234 nova_compute[227762]: 2026-01-23 10:03:35.491 227766 DEBUG nova.compute.manager [req-2a66a981-26ed-4db3-9cae-3e594e263d43 req-b88c73c9-aec8-49a4-b162-52abc89f903d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] No waiting events found dispatching network-vif-plugged-35f84523-a0b5-4102-ba04-cc5da6075d54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:35 np0005593234 nova_compute[227762]: 2026-01-23 10:03:35.491 227766 WARNING nova.compute.manager [req-2a66a981-26ed-4db3-9cae-3e594e263d43 req-b88c73c9-aec8-49a4-b162-52abc89f903d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Received unexpected event network-vif-plugged-35f84523-a0b5-4102-ba04-cc5da6075d54 for instance with vm_state resized and task_state None.#033[00m
Jan 23 05:03:35 np0005593234 nova_compute[227762]: 2026-01-23 10:03:35.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:35 np0005593234 nova_compute[227762]: 2026-01-23 10:03:35.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:03:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e263 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:36 np0005593234 nova_compute[227762]: 2026-01-23 10:03:36.245 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:36.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:36 np0005593234 nova_compute[227762]: 2026-01-23 10:03:36.600 227766 DEBUG nova.network.neutron [req-036dd269-5d50-4301-93ca-855fa8f851db req-0f8b00cb-d70f-4e7f-9913-502ec932c8a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Updated VIF entry in instance network info cache for port 35f84523-a0b5-4102-ba04-cc5da6075d54. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:03:36 np0005593234 nova_compute[227762]: 2026-01-23 10:03:36.601 227766 DEBUG nova.network.neutron [req-036dd269-5d50-4301-93ca-855fa8f851db req-0f8b00cb-d70f-4e7f-9913-502ec932c8a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Updating instance_info_cache with network_info: [{"id": "35f84523-a0b5-4102-ba04-cc5da6075d54", "address": "fa:16:3e:b8:d6:dc", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35f84523-a0", "ovs_interfaceid": "35f84523-a0b5-4102-ba04-cc5da6075d54", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:03:36 np0005593234 nova_compute[227762]: 2026-01-23 10:03:36.623 227766 DEBUG oslo_concurrency.lockutils [req-036dd269-5d50-4301-93ca-855fa8f851db req-0f8b00cb-d70f-4e7f-9913-502ec932c8a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-1bdbf4d2-447b-47d0-8b3f-878ee65905a7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:03:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:03:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:36.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:03:36 np0005593234 nova_compute[227762]: 2026-01-23 10:03:36.687 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:36 np0005593234 podman[276967]: 2026-01-23 10:03:36.79616013 +0000 UTC m=+0.086031541 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202)
Jan 23 05:03:37 np0005593234 nova_compute[227762]: 2026-01-23 10:03:37.661 227766 DEBUG nova.compute.manager [req-95c440bd-f2bd-4a19-bedf-809e6261ad0d req-5469e3ab-aa08-4b18-b080-3e2ea02779db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Received event network-vif-plugged-35f84523-a0b5-4102-ba04-cc5da6075d54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:37 np0005593234 nova_compute[227762]: 2026-01-23 10:03:37.663 227766 DEBUG oslo_concurrency.lockutils [req-95c440bd-f2bd-4a19-bedf-809e6261ad0d req-5469e3ab-aa08-4b18-b080-3e2ea02779db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:37 np0005593234 nova_compute[227762]: 2026-01-23 10:03:37.663 227766 DEBUG oslo_concurrency.lockutils [req-95c440bd-f2bd-4a19-bedf-809e6261ad0d req-5469e3ab-aa08-4b18-b080-3e2ea02779db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:37 np0005593234 nova_compute[227762]: 2026-01-23 10:03:37.664 227766 DEBUG oslo_concurrency.lockutils [req-95c440bd-f2bd-4a19-bedf-809e6261ad0d req-5469e3ab-aa08-4b18-b080-3e2ea02779db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:37 np0005593234 nova_compute[227762]: 2026-01-23 10:03:37.664 227766 DEBUG nova.compute.manager [req-95c440bd-f2bd-4a19-bedf-809e6261ad0d req-5469e3ab-aa08-4b18-b080-3e2ea02779db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] No waiting events found dispatching network-vif-plugged-35f84523-a0b5-4102-ba04-cc5da6075d54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:37 np0005593234 nova_compute[227762]: 2026-01-23 10:03:37.665 227766 WARNING nova.compute.manager [req-95c440bd-f2bd-4a19-bedf-809e6261ad0d req-5469e3ab-aa08-4b18-b080-3e2ea02779db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Received unexpected event network-vif-plugged-35f84523-a0b5-4102-ba04-cc5da6075d54 for instance with vm_state resized and task_state None.#033[00m
Jan 23 05:03:37 np0005593234 nova_compute[227762]: 2026-01-23 10:03:37.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:38.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:38.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:39 np0005593234 nova_compute[227762]: 2026-01-23 10:03:39.174 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:03:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:03:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:40.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:40.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:40 np0005593234 nova_compute[227762]: 2026-01-23 10:03:40.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e263 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e264 e264: 3 total, 3 up, 3 in
Jan 23 05:03:41 np0005593234 nova_compute[227762]: 2026-01-23 10:03:41.287 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:41 np0005593234 nova_compute[227762]: 2026-01-23 10:03:41.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:42.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:42.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:42 np0005593234 nova_compute[227762]: 2026-01-23 10:03:42.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:42.840 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:42.841 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:42.842 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:43 np0005593234 nova_compute[227762]: 2026-01-23 10:03:43.535 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:44 np0005593234 nova_compute[227762]: 2026-01-23 10:03:44.176 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:44.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:44.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.497 227766 DEBUG oslo_concurrency.lockutils [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.498 227766 DEBUG oslo_concurrency.lockutils [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.498 227766 DEBUG oslo_concurrency.lockutils [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.498 227766 DEBUG oslo_concurrency.lockutils [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.499 227766 DEBUG oslo_concurrency.lockutils [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.500 227766 INFO nova.compute.manager [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Terminating instance#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.501 227766 DEBUG nova.compute.manager [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:03:45 np0005593234 kernel: tap35f84523-a0 (unregistering): left promiscuous mode
Jan 23 05:03:45 np0005593234 NetworkManager[48942]: <info>  [1769162625.5411] device (tap35f84523-a0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:03:45 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:45Z|00405|binding|INFO|Releasing lport 35f84523-a0b5-4102-ba04-cc5da6075d54 from this chassis (sb_readonly=0)
Jan 23 05:03:45 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:45Z|00406|binding|INFO|Setting lport 35f84523-a0b5-4102-ba04-cc5da6075d54 down in Southbound
Jan 23 05:03:45 np0005593234 ovn_controller[134547]: 2026-01-23T10:03:45Z|00407|binding|INFO|Removing iface tap35f84523-a0 ovn-installed in OVS
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.557 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:45.563 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b8:d6:dc 10.100.0.5'], port_security=['fa:16:3e:b8:d6:dc 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '1bdbf4d2-447b-47d0-8b3f-878ee65905a7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '14', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=35f84523-a0b5-4102-ba04-cc5da6075d54) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:03:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:45.565 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 35f84523-a0b5-4102-ba04-cc5da6075d54 in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c unbound from our chassis#033[00m
Jan 23 05:03:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:45.566 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee03d7c9-e107-41bf-95cc-5508578ad66c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:03:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:45.567 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b0c0152c-f5ca-4f67-85d8-7991f1ac89a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:45.568 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace which is not needed anymore#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.584 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:45 np0005593234 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000005d.scope: Deactivated successfully.
Jan 23 05:03:45 np0005593234 systemd[1]: machine-qemu\x2d46\x2dinstance\x2d0000005d.scope: Consumed 11.197s CPU time.
Jan 23 05:03:45 np0005593234 systemd-machined[195626]: Machine qemu-46-instance-0000005d terminated.
Jan 23 05:03:45 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[276951]: [NOTICE]   (276955) : haproxy version is 2.8.14-c23fe91
Jan 23 05:03:45 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[276951]: [NOTICE]   (276955) : path to executable is /usr/sbin/haproxy
Jan 23 05:03:45 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[276951]: [WARNING]  (276955) : Exiting Master process...
Jan 23 05:03:45 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[276951]: [ALERT]    (276955) : Current worker (276957) exited with code 143 (Terminated)
Jan 23 05:03:45 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[276951]: [WARNING]  (276955) : All workers exited. Exiting... (0)
Jan 23 05:03:45 np0005593234 systemd[1]: libpod-a5cc482ccc7e006f7d5ceeb9889f88e8258517a3e67a15eb22ce7dd40aeabf12.scope: Deactivated successfully.
Jan 23 05:03:45 np0005593234 podman[277122]: 2026-01-23 10:03:45.728292232 +0000 UTC m=+0.058481660 container died a5cc482ccc7e006f7d5ceeb9889f88e8258517a3e67a15eb22ce7dd40aeabf12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.735 227766 INFO nova.virt.libvirt.driver [-] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Instance destroyed successfully.#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.736 227766 DEBUG nova.objects.instance [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'resources' on Instance uuid 1bdbf4d2-447b-47d0-8b3f-878ee65905a7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.752 227766 DEBUG nova.virt.libvirt.vif [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T09:59:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1908507141',display_name='tempest-ServerActionsTestJSON-server-1908507141',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1908507141',id=93,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:03:34Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-ii3s65d1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:03:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=1bdbf4d2-447b-47d0-8b3f-878ee65905a7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35f84523-a0b5-4102-ba04-cc5da6075d54", "address": "fa:16:3e:b8:d6:dc", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35f84523-a0", "ovs_interfaceid": "35f84523-a0b5-4102-ba04-cc5da6075d54", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.754 227766 DEBUG nova.network.os_vif_util [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "35f84523-a0b5-4102-ba04-cc5da6075d54", "address": "fa:16:3e:b8:d6:dc", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35f84523-a0", "ovs_interfaceid": "35f84523-a0b5-4102-ba04-cc5da6075d54", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.755 227766 DEBUG nova.network.os_vif_util [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b8:d6:dc,bridge_name='br-int',has_traffic_filtering=True,id=35f84523-a0b5-4102-ba04-cc5da6075d54,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35f84523-a0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.755 227766 DEBUG os_vif [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:d6:dc,bridge_name='br-int',has_traffic_filtering=True,id=35f84523-a0b5-4102-ba04-cc5da6075d54,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35f84523-a0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:03:45 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a5cc482ccc7e006f7d5ceeb9889f88e8258517a3e67a15eb22ce7dd40aeabf12-userdata-shm.mount: Deactivated successfully.
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.757 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.758 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap35f84523-a0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.807 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:45 np0005593234 systemd[1]: var-lib-containers-storage-overlay-700cdc8ae4b97d24d5135e1b19199d3adf7c847a8acb2946796732198c826756-merged.mount: Deactivated successfully.
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.809 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.812 227766 INFO os_vif [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b8:d6:dc,bridge_name='br-int',has_traffic_filtering=True,id=35f84523-a0b5-4102-ba04-cc5da6075d54,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35f84523-a0')#033[00m
Jan 23 05:03:45 np0005593234 podman[277122]: 2026-01-23 10:03:45.814114125 +0000 UTC m=+0.144303543 container cleanup a5cc482ccc7e006f7d5ceeb9889f88e8258517a3e67a15eb22ce7dd40aeabf12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:03:45 np0005593234 systemd[1]: libpod-conmon-a5cc482ccc7e006f7d5ceeb9889f88e8258517a3e67a15eb22ce7dd40aeabf12.scope: Deactivated successfully.
Jan 23 05:03:45 np0005593234 podman[277169]: 2026-01-23 10:03:45.872180621 +0000 UTC m=+0.035341016 container remove a5cc482ccc7e006f7d5ceeb9889f88e8258517a3e67a15eb22ce7dd40aeabf12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:03:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:45.877 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[88e01b1e-4f06-447a-8eab-c6e35ad5e93b]: (4, ('Fri Jan 23 10:03:45 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (a5cc482ccc7e006f7d5ceeb9889f88e8258517a3e67a15eb22ce7dd40aeabf12)\na5cc482ccc7e006f7d5ceeb9889f88e8258517a3e67a15eb22ce7dd40aeabf12\nFri Jan 23 10:03:45 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (a5cc482ccc7e006f7d5ceeb9889f88e8258517a3e67a15eb22ce7dd40aeabf12)\na5cc482ccc7e006f7d5ceeb9889f88e8258517a3e67a15eb22ce7dd40aeabf12\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:45.878 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0060386f-ba6f-45ea-992a-239074caf82d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:45.880 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.881 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:45 np0005593234 kernel: tapee03d7c9-e0: left promiscuous mode
Jan 23 05:03:45 np0005593234 nova_compute[227762]: 2026-01-23 10:03:45.897 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:45.899 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4692f4f4-c3ed-4a99-ac28-d69c98a29698]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:45.913 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cfea7f24-c3c2-4db7-8ed7-16bf1a78ebc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:45.915 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[60fd073c-6152-4aac-96ac-6f7c0cb30486]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:45.930 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fc3b0fd5-312a-4afb-82ad-b95d070d3649]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 650940, 'reachable_time': 35586, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277195, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:45.932 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:03:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:03:45.933 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[0c127cf5-513d-4916-b99a-67c7dbaaa329]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:03:45 np0005593234 systemd[1]: run-netns-ovnmeta\x2dee03d7c9\x2de107\x2d41bf\x2d95cc\x2d5508578ad66c.mount: Deactivated successfully.
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.226 227766 INFO nova.virt.libvirt.driver [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Deleting instance files /var/lib/nova/instances/1bdbf4d2-447b-47d0-8b3f-878ee65905a7_del#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.227 227766 INFO nova.virt.libvirt.driver [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Deletion of /var/lib/nova/instances/1bdbf4d2-447b-47d0-8b3f-878ee65905a7_del complete#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.288 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.307 227766 INFO nova.compute.manager [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Took 0.81 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.308 227766 DEBUG oslo.service.loopingcall [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.308 227766 DEBUG nova.compute.manager [-] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.309 227766 DEBUG nova.network.neutron [-] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:03:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:46.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.677 227766 DEBUG nova.compute.manager [req-2a7bd6c3-cad7-4f72-b526-a13da1f92e71 req-3b05f514-7dd7-41ba-b3d8-a2581dd1d300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Received event network-vif-unplugged-35f84523-a0b5-4102-ba04-cc5da6075d54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.678 227766 DEBUG oslo_concurrency.lockutils [req-2a7bd6c3-cad7-4f72-b526-a13da1f92e71 req-3b05f514-7dd7-41ba-b3d8-a2581dd1d300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.678 227766 DEBUG oslo_concurrency.lockutils [req-2a7bd6c3-cad7-4f72-b526-a13da1f92e71 req-3b05f514-7dd7-41ba-b3d8-a2581dd1d300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.678 227766 DEBUG oslo_concurrency.lockutils [req-2a7bd6c3-cad7-4f72-b526-a13da1f92e71 req-3b05f514-7dd7-41ba-b3d8-a2581dd1d300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.679 227766 DEBUG nova.compute.manager [req-2a7bd6c3-cad7-4f72-b526-a13da1f92e71 req-3b05f514-7dd7-41ba-b3d8-a2581dd1d300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] No waiting events found dispatching network-vif-unplugged-35f84523-a0b5-4102-ba04-cc5da6075d54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.679 227766 DEBUG nova.compute.manager [req-2a7bd6c3-cad7-4f72-b526-a13da1f92e71 req-3b05f514-7dd7-41ba-b3d8-a2581dd1d300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Received event network-vif-unplugged-35f84523-a0b5-4102-ba04-cc5da6075d54 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.679 227766 DEBUG nova.compute.manager [req-2a7bd6c3-cad7-4f72-b526-a13da1f92e71 req-3b05f514-7dd7-41ba-b3d8-a2581dd1d300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Received event network-vif-plugged-35f84523-a0b5-4102-ba04-cc5da6075d54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.679 227766 DEBUG oslo_concurrency.lockutils [req-2a7bd6c3-cad7-4f72-b526-a13da1f92e71 req-3b05f514-7dd7-41ba-b3d8-a2581dd1d300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.680 227766 DEBUG oslo_concurrency.lockutils [req-2a7bd6c3-cad7-4f72-b526-a13da1f92e71 req-3b05f514-7dd7-41ba-b3d8-a2581dd1d300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.680 227766 DEBUG oslo_concurrency.lockutils [req-2a7bd6c3-cad7-4f72-b526-a13da1f92e71 req-3b05f514-7dd7-41ba-b3d8-a2581dd1d300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.680 227766 DEBUG nova.compute.manager [req-2a7bd6c3-cad7-4f72-b526-a13da1f92e71 req-3b05f514-7dd7-41ba-b3d8-a2581dd1d300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] No waiting events found dispatching network-vif-plugged-35f84523-a0b5-4102-ba04-cc5da6075d54 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:03:46 np0005593234 nova_compute[227762]: 2026-01-23 10:03:46.680 227766 WARNING nova.compute.manager [req-2a7bd6c3-cad7-4f72-b526-a13da1f92e71 req-3b05f514-7dd7-41ba-b3d8-a2581dd1d300 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Received unexpected event network-vif-plugged-35f84523-a0b5-4102-ba04-cc5da6075d54 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:03:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:47.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:47 np0005593234 nova_compute[227762]: 2026-01-23 10:03:47.609 227766 DEBUG nova.network.neutron [-] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:03:47 np0005593234 nova_compute[227762]: 2026-01-23 10:03:47.631 227766 INFO nova.compute.manager [-] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Took 1.32 seconds to deallocate network for instance.#033[00m
Jan 23 05:03:47 np0005593234 nova_compute[227762]: 2026-01-23 10:03:47.690 227766 DEBUG oslo_concurrency.lockutils [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:47 np0005593234 nova_compute[227762]: 2026-01-23 10:03:47.691 227766 DEBUG oslo_concurrency.lockutils [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:47 np0005593234 nova_compute[227762]: 2026-01-23 10:03:47.754 227766 DEBUG oslo_concurrency.processutils [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:03:48 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3660536398' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:03:48 np0005593234 nova_compute[227762]: 2026-01-23 10:03:48.230 227766 DEBUG oslo_concurrency.processutils [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:48 np0005593234 nova_compute[227762]: 2026-01-23 10:03:48.236 227766 DEBUG nova.compute.provider_tree [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:03:48 np0005593234 nova_compute[227762]: 2026-01-23 10:03:48.258 227766 DEBUG nova.scheduler.client.report [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:03:48 np0005593234 nova_compute[227762]: 2026-01-23 10:03:48.295 227766 DEBUG oslo_concurrency.lockutils [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.604s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:48 np0005593234 nova_compute[227762]: 2026-01-23 10:03:48.345 227766 INFO nova.scheduler.client.report [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Deleted allocations for instance 1bdbf4d2-447b-47d0-8b3f-878ee65905a7#033[00m
Jan 23 05:03:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:48.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:48 np0005593234 nova_compute[227762]: 2026-01-23 10:03:48.473 227766 DEBUG oslo_concurrency.lockutils [None req-81227f08-c9ac-4285-a8b8-4c992b907b13 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "1bdbf4d2-447b-47d0-8b3f-878ee65905a7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.975s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:48 np0005593234 nova_compute[227762]: 2026-01-23 10:03:48.800 227766 DEBUG nova.compute.manager [req-7acc06e1-04b3-4c58-a8f5-4af0dc47bbdb req-b6f13a09-14f1-4054-9050-5894a052f4cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Received event network-vif-deleted-35f84523-a0b5-4102-ba04-cc5da6075d54 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:03:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:49.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:50.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:50 np0005593234 nova_compute[227762]: 2026-01-23 10:03:50.487 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:50 np0005593234 nova_compute[227762]: 2026-01-23 10:03:50.808 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e264 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:51.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:51 np0005593234 nova_compute[227762]: 2026-01-23 10:03:51.291 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e265 e265: 3 total, 3 up, 3 in
Jan 23 05:03:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:52.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:52 np0005593234 nova_compute[227762]: 2026-01-23 10:03:52.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:52 np0005593234 nova_compute[227762]: 2026-01-23 10:03:52.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:03:52 np0005593234 nova_compute[227762]: 2026-01-23 10:03:52.761 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:03:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:03:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:53.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:03:53 np0005593234 nova_compute[227762]: 2026-01-23 10:03:53.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:03:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:54.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:03:54 np0005593234 nova_compute[227762]: 2026-01-23 10:03:54.682 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:54 np0005593234 nova_compute[227762]: 2026-01-23 10:03:54.682 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:54 np0005593234 nova_compute[227762]: 2026-01-23 10:03:54.704 227766 DEBUG nova.compute.manager [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:03:54 np0005593234 nova_compute[227762]: 2026-01-23 10:03:54.818 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:54 np0005593234 nova_compute[227762]: 2026-01-23 10:03:54.819 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:54 np0005593234 nova_compute[227762]: 2026-01-23 10:03:54.824 227766 DEBUG nova.virt.hardware [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:03:54 np0005593234 nova_compute[227762]: 2026-01-23 10:03:54.824 227766 INFO nova.compute.claims [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.015 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.027 227766 DEBUG oslo_concurrency.processutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:55.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:03:55 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3025412293' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.450 227766 DEBUG oslo_concurrency.processutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.455 227766 DEBUG nova.compute.provider_tree [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.476 227766 DEBUG nova.scheduler.client.report [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.500 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.501 227766 DEBUG nova.compute.manager [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.562 227766 DEBUG nova.compute.manager [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.562 227766 DEBUG nova.network.neutron [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.608 227766 INFO nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.639 227766 DEBUG nova.compute.manager [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.732 227766 DEBUG nova.compute.manager [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.733 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.733 227766 INFO nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Creating image(s)#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.756 227766 DEBUG nova.storage.rbd_utils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image ae2a211d-e923-498b-9ceb-97274a2fd725_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:55 np0005593234 podman[277246]: 2026-01-23 10:03:55.761167582 +0000 UTC m=+0.054935699 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.781 227766 DEBUG nova.storage.rbd_utils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image ae2a211d-e923-498b-9ceb-97274a2fd725_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.802 227766 DEBUG nova.storage.rbd_utils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image ae2a211d-e923-498b-9ceb-97274a2fd725_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.806 227766 DEBUG oslo_concurrency.processutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.826 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.829 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.829 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.866 227766 DEBUG oslo_concurrency.processutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.867 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.868 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.868 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.889 227766 DEBUG nova.storage.rbd_utils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image ae2a211d-e923-498b-9ceb-97274a2fd725_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:03:55 np0005593234 nova_compute[227762]: 2026-01-23 10:03:55.892 227766 DEBUG oslo_concurrency.processutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 ae2a211d-e923-498b-9ceb-97274a2fd725_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:03:56 np0005593234 nova_compute[227762]: 2026-01-23 10:03:56.179 227766 DEBUG oslo_concurrency.processutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 ae2a211d-e923-498b-9ceb-97274a2fd725_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.287s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:03:56 np0005593234 nova_compute[227762]: 2026-01-23 10:03:56.212 227766 DEBUG nova.policy [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9d4a5c201efa4992a9ef57d8abdc1675', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:03:56 np0005593234 nova_compute[227762]: 2026-01-23 10:03:56.256 227766 DEBUG nova.storage.rbd_utils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] resizing rbd image ae2a211d-e923-498b-9ceb-97274a2fd725_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:03:56 np0005593234 nova_compute[227762]: 2026-01-23 10:03:56.293 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:03:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:56.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:56 np0005593234 nova_compute[227762]: 2026-01-23 10:03:56.591 227766 DEBUG nova.objects.instance [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'migration_context' on Instance uuid ae2a211d-e923-498b-9ceb-97274a2fd725 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:03:56 np0005593234 nova_compute[227762]: 2026-01-23 10:03:56.731 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:03:56 np0005593234 nova_compute[227762]: 2026-01-23 10:03:56.732 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Ensure instance console log exists: /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:03:56 np0005593234 nova_compute[227762]: 2026-01-23 10:03:56.732 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:03:56 np0005593234 nova_compute[227762]: 2026-01-23 10:03:56.733 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:03:56 np0005593234 nova_compute[227762]: 2026-01-23 10:03:56.733 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:03:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:57.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:58.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:03:58 np0005593234 nova_compute[227762]: 2026-01-23 10:03:58.717 227766 DEBUG nova.network.neutron [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Successfully created port: 115f68c4-4489-4fc8-bb90-3c2d3011db2d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:03:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:03:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:03:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:59.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:00.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.462 227766 DEBUG nova.network.neutron [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Successfully updated port: 115f68c4-4489-4fc8-bb90-3c2d3011db2d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.482 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.482 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquired lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.482 227766 DEBUG nova.network.neutron [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.727 227766 DEBUG nova.network.neutron [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.734 227766 DEBUG nova.compute.manager [req-a5441d20-8f82-4254-a255-d46a4e3b4ef9 req-c5dfc2c3-a029-48a3-9052-0649ae1a2744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-changed-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.734 227766 DEBUG nova.compute.manager [req-a5441d20-8f82-4254-a255-d46a4e3b4ef9 req-c5dfc2c3-a029-48a3-9052-0649ae1a2744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Refreshing instance network info cache due to event network-changed-115f68c4-4489-4fc8-bb90-3c2d3011db2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.735 227766 DEBUG oslo_concurrency.lockutils [req-a5441d20-8f82-4254-a255-d46a4e3b4ef9 req-c5dfc2c3-a029-48a3-9052-0649ae1a2744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.735 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162625.7330358, 1bdbf4d2-447b-47d0-8b3f-878ee65905a7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.735 227766 INFO nova.compute.manager [-] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.758 227766 DEBUG nova.compute.manager [None req-20d405b9-cfe4-469e-853e-9f2f8bffa854 - - - - - -] [instance: 1bdbf4d2-447b-47d0-8b3f-878ee65905a7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:04:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.828 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Acquiring lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.829 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.844 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:00 np0005593234 nova_compute[227762]: 2026-01-23 10:04:00.878 227766 DEBUG nova.compute.manager [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:04:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:01.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.071 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.071 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.077 227766 DEBUG nova.virt.hardware [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.077 227766 INFO nova.compute.claims [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.225 227766 DEBUG oslo_concurrency.processutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.296 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:04:01 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3923078140' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.661 227766 DEBUG oslo_concurrency.processutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.666 227766 DEBUG nova.compute.provider_tree [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.686 227766 DEBUG nova.scheduler.client.report [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.716 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.717 227766 DEBUG nova.compute.manager [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.766 227766 DEBUG nova.compute.manager [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.767 227766 DEBUG nova.network.neutron [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.949 227766 INFO nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:04:01 np0005593234 nova_compute[227762]: 2026-01-23 10:04:01.979 227766 DEBUG nova.compute.manager [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.083 227766 DEBUG nova.network.neutron [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating instance_info_cache with network_info: [{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.095 227766 DEBUG nova.compute.manager [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.097 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.098 227766 INFO nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Creating image(s)#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.125 227766 DEBUG nova.storage.rbd_utils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] rbd image aaea9e32-85e4-4a86-b997-e3c7d7ef6335_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.150 227766 DEBUG nova.storage.rbd_utils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] rbd image aaea9e32-85e4-4a86-b997-e3c7d7ef6335_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.172 227766 DEBUG nova.storage.rbd_utils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] rbd image aaea9e32-85e4-4a86-b997-e3c7d7ef6335_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.175 227766 DEBUG oslo_concurrency.processutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.196 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Releasing lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.196 227766 DEBUG nova.compute.manager [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Instance network_info: |[{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.197 227766 DEBUG oslo_concurrency.lockutils [req-a5441d20-8f82-4254-a255-d46a4e3b4ef9 req-c5dfc2c3-a029-48a3-9052-0649ae1a2744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.197 227766 DEBUG nova.network.neutron [req-a5441d20-8f82-4254-a255-d46a4e3b4ef9 req-c5dfc2c3-a029-48a3-9052-0649ae1a2744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Refreshing network info cache for port 115f68c4-4489-4fc8-bb90-3c2d3011db2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.200 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Start _get_guest_xml network_info=[{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.205 227766 WARNING nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.211 227766 DEBUG nova.virt.libvirt.host [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.211 227766 DEBUG nova.virt.libvirt.host [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.215 227766 DEBUG nova.virt.libvirt.host [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.215 227766 DEBUG nova.virt.libvirt.host [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.217 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.217 227766 DEBUG nova.virt.hardware [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.218 227766 DEBUG nova.virt.hardware [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.219 227766 DEBUG nova.virt.hardware [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.219 227766 DEBUG nova.virt.hardware [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.219 227766 DEBUG nova.virt.hardware [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.219 227766 DEBUG nova.virt.hardware [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.220 227766 DEBUG nova.virt.hardware [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.220 227766 DEBUG nova.virt.hardware [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.220 227766 DEBUG nova.virt.hardware [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.221 227766 DEBUG nova.virt.hardware [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.221 227766 DEBUG nova.virt.hardware [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.225 227766 DEBUG oslo_concurrency.processutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.254 227766 DEBUG oslo_concurrency.processutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.255 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.256 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.256 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.280 227766 DEBUG nova.storage.rbd_utils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] rbd image aaea9e32-85e4-4a86-b997-e3c7d7ef6335_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.283 227766 DEBUG oslo_concurrency.processutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 aaea9e32-85e4-4a86-b997-e3c7d7ef6335_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.308 227766 DEBUG nova.policy [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'faad005151bd403e905a16eb7b539f14', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '115a816b885b44c3956744176af911f2', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:04:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:02.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:04:02 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1463150176' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.660 227766 DEBUG oslo_concurrency.processutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.684 227766 DEBUG nova.storage.rbd_utils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image ae2a211d-e923-498b-9ceb-97274a2fd725_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:04:02 np0005593234 nova_compute[227762]: 2026-01-23 10:04:02.689 227766 DEBUG oslo_concurrency.processutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:03.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:04:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2350758596' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.142 227766 DEBUG oslo_concurrency.processutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.144 227766 DEBUG nova.virt.libvirt.vif [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-782058218',display_name='tempest-ServerActionsTestJSON-server-782058218',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-782058218',id=109,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-05cy3qfb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:03:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=ae2a211d-e923-498b-9ceb-97274a2fd725,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.145 227766 DEBUG nova.network.os_vif_util [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.146 227766 DEBUG nova.network.os_vif_util [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.147 227766 DEBUG nova.objects.instance [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'pci_devices' on Instance uuid ae2a211d-e923-498b-9ceb-97274a2fd725 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.171 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  <uuid>ae2a211d-e923-498b-9ceb-97274a2fd725</uuid>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  <name>instance-0000006d</name>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerActionsTestJSON-server-782058218</nova:name>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:04:02</nova:creationTime>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <nova:user uuid="9d4a5c201efa4992a9ef57d8abdc1675">tempest-ServerActionsTestJSON-1619235720-project-member</nova:user>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <nova:project uuid="74c5c1d0762242f29a5d26033efd9f6d">tempest-ServerActionsTestJSON-1619235720</nova:project>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <nova:port uuid="115f68c4-4489-4fc8-bb90-3c2d3011db2d">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <entry name="serial">ae2a211d-e923-498b-9ceb-97274a2fd725</entry>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <entry name="uuid">ae2a211d-e923-498b-9ceb-97274a2fd725</entry>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/ae2a211d-e923-498b-9ceb-97274a2fd725_disk">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/ae2a211d-e923-498b-9ceb-97274a2fd725_disk.config">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:e2:de:d3"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <target dev="tap115f68c4-44"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/console.log" append="off"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:04:03 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:04:03 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:04:03 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:04:03 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.173 227766 DEBUG nova.compute.manager [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Preparing to wait for external event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.173 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.174 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.174 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.175 227766 DEBUG nova.virt.libvirt.vif [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-782058218',display_name='tempest-ServerActionsTestJSON-server-782058218',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-782058218',id=109,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-05cy3qfb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:03:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=ae2a211d-e923-498b-9ceb-97274a2fd725,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.175 227766 DEBUG nova.network.os_vif_util [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.176 227766 DEBUG nova.network.os_vif_util [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.177 227766 DEBUG os_vif [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.177 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.178 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.178 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.182 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.182 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap115f68c4-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.183 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap115f68c4-44, col_values=(('external_ids', {'iface-id': '115f68c4-4489-4fc8-bb90-3c2d3011db2d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:de:d3', 'vm-uuid': 'ae2a211d-e923-498b-9ceb-97274a2fd725'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.208 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:03 np0005593234 NetworkManager[48942]: <info>  [1769162643.2093] manager: (tap115f68c4-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/212)
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.212 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.214 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.215 227766 INFO os_vif [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44')#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.284 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.285 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.285 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No VIF found with MAC fa:16:3e:e2:de:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.286 227766 INFO nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Using config drive#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.313 227766 DEBUG nova.storage.rbd_utils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image ae2a211d-e923-498b-9ceb-97274a2fd725_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.582 227766 DEBUG oslo_concurrency.processutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 aaea9e32-85e4-4a86-b997-e3c7d7ef6335_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.299s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.644 227766 DEBUG nova.network.neutron [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Successfully created port: e9ecb184-3eff-49b4-8c20-37b3270c5d4b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.658 227766 DEBUG nova.storage.rbd_utils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] resizing rbd image aaea9e32-85e4-4a86-b997-e3c7d7ef6335_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.845 227766 DEBUG nova.objects.instance [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lazy-loading 'migration_context' on Instance uuid aaea9e32-85e4-4a86-b997-e3c7d7ef6335 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.873 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.874 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Ensure instance console log exists: /var/lib/nova/instances/aaea9e32-85e4-4a86-b997-e3c7d7ef6335/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.874 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.875 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:03 np0005593234 nova_compute[227762]: 2026-01-23 10:04:03.875 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.272 227766 INFO nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Creating config drive at /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/disk.config#033[00m
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.281 227766 DEBUG oslo_concurrency.processutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6jtqniwb execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.416 227766 DEBUG oslo_concurrency.processutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6jtqniwb" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:04.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.446 227766 DEBUG nova.storage.rbd_utils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image ae2a211d-e923-498b-9ceb-97274a2fd725_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.449 227766 DEBUG oslo_concurrency.processutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/disk.config ae2a211d-e923-498b-9ceb-97274a2fd725_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.593 227766 DEBUG oslo_concurrency.processutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/disk.config ae2a211d-e923-498b-9ceb-97274a2fd725_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.594 227766 INFO nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Deleting local config drive /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/disk.config because it was imported into RBD.#033[00m
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.605 227766 DEBUG nova.network.neutron [req-a5441d20-8f82-4254-a255-d46a4e3b4ef9 req-c5dfc2c3-a029-48a3-9052-0649ae1a2744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updated VIF entry in instance network info cache for port 115f68c4-4489-4fc8-bb90-3c2d3011db2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.607 227766 DEBUG nova.network.neutron [req-a5441d20-8f82-4254-a255-d46a4e3b4ef9 req-c5dfc2c3-a029-48a3-9052-0649ae1a2744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating instance_info_cache with network_info: [{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:04:04 np0005593234 kernel: tap115f68c4-44: entered promiscuous mode
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.645 227766 DEBUG oslo_concurrency.lockutils [req-a5441d20-8f82-4254-a255-d46a4e3b4ef9 req-c5dfc2c3-a029-48a3-9052-0649ae1a2744 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:04:04 np0005593234 NetworkManager[48942]: <info>  [1769162644.6471] manager: (tap115f68c4-44): new Tun device (/org/freedesktop/NetworkManager/Devices/213)
Jan 23 05:04:04 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:04Z|00408|binding|INFO|Claiming lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d for this chassis.
Jan 23 05:04:04 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:04Z|00409|binding|INFO|115f68c4-4489-4fc8-bb90-3c2d3011db2d: Claiming fa:16:3e:e2:de:d3 10.100.0.7
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.649 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.668 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:de:d3 10.100.0.7'], port_security=['fa:16:3e:e2:de:d3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ae2a211d-e923-498b-9ceb-97274a2fd725', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=115f68c4-4489-4fc8-bb90-3c2d3011db2d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.670 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 115f68c4-4489-4fc8-bb90-3c2d3011db2d in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c bound to our chassis#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.672 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee03d7c9-e107-41bf-95cc-5508578ad66c#033[00m
Jan 23 05:04:04 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:04Z|00410|binding|INFO|Setting lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d ovn-installed in OVS
Jan 23 05:04:04 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:04Z|00411|binding|INFO|Setting lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d up in Southbound
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.682 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.686 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:04 np0005593234 systemd-machined[195626]: New machine qemu-47-instance-0000006d.
Jan 23 05:04:04 np0005593234 systemd-udevd[277762]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.686 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e6a5ed97-46e5-4fb5-886e-2cc8e055c711]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.687 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee03d7c9-e1 in ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.689 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee03d7c9-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.689 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1d9981f3-91f8-4a9e-a498-96fb89da87d4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.690 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c409d08d-b6c7-4acf-b548-5003dd6943d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 systemd[1]: Started Virtual Machine qemu-47-instance-0000006d.
Jan 23 05:04:04 np0005593234 NetworkManager[48942]: <info>  [1769162644.6980] device (tap115f68c4-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:04:04 np0005593234 NetworkManager[48942]: <info>  [1769162644.6989] device (tap115f68c4-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.701 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[1fd9e288-c65d-4c65-90b5-c5a29edae060]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.725 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4864a9b9-73ac-4da4-9d01-7be02e0112dc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.749 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9e14c097-f14c-4e09-8ece-8da0ab8b42db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 systemd-udevd[277765]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:04:04 np0005593234 NetworkManager[48942]: <info>  [1769162644.7561] manager: (tapee03d7c9-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/214)
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.756 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a9d0dbdf-81a3-417d-848c-9a42ad5f62f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.788 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[5ca7d4cd-fd5d-4184-87a9-00f42b704573]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.793 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ddc5af-b703-4526-b49d-083cca16547b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 NetworkManager[48942]: <info>  [1769162644.8155] device (tapee03d7c9-e0): carrier: link connected
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.822 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[925f50c8-181e-4ac4-82bf-1a681f9c0fd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.840 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[035d9757-5b6e-48a0-94a6-1b635758641c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 653977, 'reachable_time': 24790, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 277819, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.858 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c266113d-d04a-4fec-8ce2-81c765a82af4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:6530'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 653977, 'tstamp': 653977}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 277837, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.875 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c65edd3e-bc58-4a77-a875-a6d3e61f145a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 133], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 653977, 'reachable_time': 24790, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 277845, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.907 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c1fef6aa-9dcd-4885-80ee-1c9f17bac7cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.963 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3db6d2e9-1957-4bee-8c29-84d986d0bc97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.965 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.965 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.966 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee03d7c9-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.968 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:04 np0005593234 kernel: tapee03d7c9-e0: entered promiscuous mode
Jan 23 05:04:04 np0005593234 NetworkManager[48942]: <info>  [1769162644.9683] manager: (tapee03d7c9-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/215)
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.970 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.971 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee03d7c9-e0, col_values=(('external_ids', {'iface-id': '702d4523-a665-42f5-9a36-57d187c0698a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:04 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:04Z|00412|binding|INFO|Releasing lport 702d4523-a665-42f5-9a36-57d187c0698a from this chassis (sb_readonly=0)
Jan 23 05:04:04 np0005593234 nova_compute[227762]: 2026-01-23 10:04:04.990 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.991 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.992 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5ea626ca-d916-488c-92c9-8e6268db1dcd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.992 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:04:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:04.993 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'env', 'PROCESS_TAG=haproxy-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee03d7c9-e107-41bf-95cc-5508578ad66c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.025 227766 DEBUG nova.network.neutron [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Successfully updated port: e9ecb184-3eff-49b4-8c20-37b3270c5d4b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.041 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Acquiring lock "refresh_cache-aaea9e32-85e4-4a86-b997-e3c7d7ef6335" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.042 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Acquired lock "refresh_cache-aaea9e32-85e4-4a86-b997-e3c7d7ef6335" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.042 227766 DEBUG nova.network.neutron [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:04:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:05.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.144 227766 DEBUG nova.compute.manager [req-0a14389a-810e-47fc-a304-0e92f1c30ecb req-f647a0d6-a13e-46c9-9900-e55a7a8a916b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Received event network-changed-e9ecb184-3eff-49b4-8c20-37b3270c5d4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.144 227766 DEBUG nova.compute.manager [req-0a14389a-810e-47fc-a304-0e92f1c30ecb req-f647a0d6-a13e-46c9-9900-e55a7a8a916b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Refreshing instance network info cache due to event network-changed-e9ecb184-3eff-49b4-8c20-37b3270c5d4b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.145 227766 DEBUG oslo_concurrency.lockutils [req-0a14389a-810e-47fc-a304-0e92f1c30ecb req-f647a0d6-a13e-46c9-9900-e55a7a8a916b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-aaea9e32-85e4-4a86-b997-e3c7d7ef6335" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.251 227766 DEBUG nova.network.neutron [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:04:05 np0005593234 podman[277917]: 2026-01-23 10:04:05.338533229 +0000 UTC m=+0.055312281 container create 535089d5940d61787f9f1f0c226b551725d81e2342a5508c27b7c438a7fdc856 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.352 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162645.351883, ae2a211d-e923-498b-9ceb-97274a2fd725 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.353 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] VM Started (Lifecycle Event)#033[00m
Jan 23 05:04:05 np0005593234 systemd[1]: Started libpod-conmon-535089d5940d61787f9f1f0c226b551725d81e2342a5508c27b7c438a7fdc856.scope.
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.383 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.387 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162645.353017, ae2a211d-e923-498b-9ceb-97274a2fd725 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.388 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:04:05 np0005593234 podman[277917]: 2026-01-23 10:04:05.30433517 +0000 UTC m=+0.021114252 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:04:05 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:04:05 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a7f5b49db158623d9efe05cca9d020906ed1480abd489c30793ee2bb660dc4e5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:04:05 np0005593234 podman[277917]: 2026-01-23 10:04:05.422600328 +0000 UTC m=+0.139379400 container init 535089d5940d61787f9f1f0c226b551725d81e2342a5508c27b7c438a7fdc856 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.425 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.428 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:04:05 np0005593234 podman[277917]: 2026-01-23 10:04:05.428834533 +0000 UTC m=+0.145613595 container start 535089d5940d61787f9f1f0c226b551725d81e2342a5508c27b7c438a7fdc856 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:04:05 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[277933]: [NOTICE]   (277937) : New worker (277939) forked
Jan 23 05:04:05 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[277933]: [NOTICE]   (277937) : Loading success.
Jan 23 05:04:05 np0005593234 nova_compute[227762]: 2026-01-23 10:04:05.459 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:04:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.297 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.327 227766 DEBUG nova.network.neutron [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Updating instance_info_cache with network_info: [{"id": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "address": "fa:16:3e:5b:4b:d5", "network": {"id": "d13015e1-22b2-432c-983c-fc4995e88988", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2108569587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "115a816b885b44c3956744176af911f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ecb184-3e", "ovs_interfaceid": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.347 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Releasing lock "refresh_cache-aaea9e32-85e4-4a86-b997-e3c7d7ef6335" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.348 227766 DEBUG nova.compute.manager [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Instance network_info: |[{"id": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "address": "fa:16:3e:5b:4b:d5", "network": {"id": "d13015e1-22b2-432c-983c-fc4995e88988", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2108569587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "115a816b885b44c3956744176af911f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ecb184-3e", "ovs_interfaceid": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.348 227766 DEBUG oslo_concurrency.lockutils [req-0a14389a-810e-47fc-a304-0e92f1c30ecb req-f647a0d6-a13e-46c9-9900-e55a7a8a916b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-aaea9e32-85e4-4a86-b997-e3c7d7ef6335" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.349 227766 DEBUG nova.network.neutron [req-0a14389a-810e-47fc-a304-0e92f1c30ecb req-f647a0d6-a13e-46c9-9900-e55a7a8a916b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Refreshing network info cache for port e9ecb184-3eff-49b4-8c20-37b3270c5d4b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.351 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Start _get_guest_xml network_info=[{"id": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "address": "fa:16:3e:5b:4b:d5", "network": {"id": "d13015e1-22b2-432c-983c-fc4995e88988", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2108569587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "115a816b885b44c3956744176af911f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ecb184-3e", "ovs_interfaceid": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.355 227766 WARNING nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.361 227766 DEBUG nova.virt.libvirt.host [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.361 227766 DEBUG nova.virt.libvirt.host [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.364 227766 DEBUG nova.virt.libvirt.host [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.365 227766 DEBUG nova.virt.libvirt.host [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.366 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.366 227766 DEBUG nova.virt.hardware [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.367 227766 DEBUG nova.virt.hardware [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.367 227766 DEBUG nova.virt.hardware [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.367 227766 DEBUG nova.virt.hardware [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.367 227766 DEBUG nova.virt.hardware [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.368 227766 DEBUG nova.virt.hardware [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.368 227766 DEBUG nova.virt.hardware [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.368 227766 DEBUG nova.virt.hardware [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.369 227766 DEBUG nova.virt.hardware [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.369 227766 DEBUG nova.virt.hardware [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.369 227766 DEBUG nova.virt.hardware [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.372 227766 DEBUG oslo_concurrency.processutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:06.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:04:06 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1745277861' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.799 227766 DEBUG oslo_concurrency.processutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.822 227766 DEBUG nova.storage.rbd_utils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] rbd image aaea9e32-85e4-4a86-b997-e3c7d7ef6335_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:04:06 np0005593234 nova_compute[227762]: 2026-01-23 10:04:06.826 227766 DEBUG oslo_concurrency.processutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:04:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:07.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.156 227766 DEBUG nova.compute.manager [req-ca4b2a8b-e450-4cd0-9ccc-d185d6970d56 req-4de86a1d-2c81-4197-90d8-9c8a37562ad6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.157 227766 DEBUG oslo_concurrency.lockutils [req-ca4b2a8b-e450-4cd0-9ccc-d185d6970d56 req-4de86a1d-2c81-4197-90d8-9c8a37562ad6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.158 227766 DEBUG oslo_concurrency.lockutils [req-ca4b2a8b-e450-4cd0-9ccc-d185d6970d56 req-4de86a1d-2c81-4197-90d8-9c8a37562ad6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.158 227766 DEBUG oslo_concurrency.lockutils [req-ca4b2a8b-e450-4cd0-9ccc-d185d6970d56 req-4de86a1d-2c81-4197-90d8-9c8a37562ad6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.159 227766 DEBUG nova.compute.manager [req-ca4b2a8b-e450-4cd0-9ccc-d185d6970d56 req-4de86a1d-2c81-4197-90d8-9c8a37562ad6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Processing event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.159 227766 DEBUG nova.compute.manager [req-ca4b2a8b-e450-4cd0-9ccc-d185d6970d56 req-4de86a1d-2c81-4197-90d8-9c8a37562ad6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.160 227766 DEBUG oslo_concurrency.lockutils [req-ca4b2a8b-e450-4cd0-9ccc-d185d6970d56 req-4de86a1d-2c81-4197-90d8-9c8a37562ad6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.160 227766 DEBUG oslo_concurrency.lockutils [req-ca4b2a8b-e450-4cd0-9ccc-d185d6970d56 req-4de86a1d-2c81-4197-90d8-9c8a37562ad6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.160 227766 DEBUG oslo_concurrency.lockutils [req-ca4b2a8b-e450-4cd0-9ccc-d185d6970d56 req-4de86a1d-2c81-4197-90d8-9c8a37562ad6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.161 227766 DEBUG nova.compute.manager [req-ca4b2a8b-e450-4cd0-9ccc-d185d6970d56 req-4de86a1d-2c81-4197-90d8-9c8a37562ad6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.161 227766 WARNING nova.compute.manager [req-ca4b2a8b-e450-4cd0-9ccc-d185d6970d56 req-4de86a1d-2c81-4197-90d8-9c8a37562ad6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state building and task_state spawning.#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.163 227766 DEBUG nova.compute.manager [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.169 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162647.1691785, ae2a211d-e923-498b-9ceb-97274a2fd725 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.170 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.174 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.179 227766 INFO nova.virt.libvirt.driver [-] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Instance spawned successfully.#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.180 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.202 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.210 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.220 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.220 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.221 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.221 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.222 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.222 227766 DEBUG nova.virt.libvirt.driver [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.263 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:04:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:04:07 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2623041868' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.283 227766 DEBUG oslo_concurrency.processutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.284 227766 DEBUG nova.virt.libvirt.vif [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:03:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1461939973',display_name='tempest-ServerAddressesTestJSON-server-1461939973',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1461939973',id=110,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='115a816b885b44c3956744176af911f2',ramdisk_id='',reservation_id='r-7ipsj6to',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-854918154',owner_user_name='tempest-ServerAddressesTestJSON-854918154-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:04:02Z,user_data=None,user_id='faad005151bd403e905a16eb7b539f14',uuid=aaea9e32-85e4-4a86-b997-e3c7d7ef6335,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "address": "fa:16:3e:5b:4b:d5", "network": {"id": "d13015e1-22b2-432c-983c-fc4995e88988", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2108569587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "115a816b885b44c3956744176af911f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ecb184-3e", "ovs_interfaceid": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.284 227766 DEBUG nova.network.os_vif_util [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Converting VIF {"id": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "address": "fa:16:3e:5b:4b:d5", "network": {"id": "d13015e1-22b2-432c-983c-fc4995e88988", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2108569587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "115a816b885b44c3956744176af911f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ecb184-3e", "ovs_interfaceid": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.285 227766 DEBUG nova.network.os_vif_util [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:4b:d5,bridge_name='br-int',has_traffic_filtering=True,id=e9ecb184-3eff-49b4-8c20-37b3270c5d4b,network=Network(d13015e1-22b2-432c-983c-fc4995e88988),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ecb184-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.287 227766 DEBUG nova.objects.instance [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lazy-loading 'pci_devices' on Instance uuid aaea9e32-85e4-4a86-b997-e3c7d7ef6335 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.311 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  <uuid>aaea9e32-85e4-4a86-b997-e3c7d7ef6335</uuid>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  <name>instance-0000006e</name>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerAddressesTestJSON-server-1461939973</nova:name>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:04:06</nova:creationTime>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <nova:user uuid="faad005151bd403e905a16eb7b539f14">tempest-ServerAddressesTestJSON-854918154-project-member</nova:user>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <nova:project uuid="115a816b885b44c3956744176af911f2">tempest-ServerAddressesTestJSON-854918154</nova:project>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <nova:port uuid="e9ecb184-3eff-49b4-8c20-37b3270c5d4b">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <entry name="serial">aaea9e32-85e4-4a86-b997-e3c7d7ef6335</entry>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <entry name="uuid">aaea9e32-85e4-4a86-b997-e3c7d7ef6335</entry>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/aaea9e32-85e4-4a86-b997-e3c7d7ef6335_disk">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/aaea9e32-85e4-4a86-b997-e3c7d7ef6335_disk.config">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:5b:4b:d5"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <target dev="tape9ecb184-3e"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/aaea9e32-85e4-4a86-b997-e3c7d7ef6335/console.log" append="off"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:04:07 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:04:07 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:04:07 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:04:07 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.313 227766 DEBUG nova.compute.manager [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Preparing to wait for external event network-vif-plugged-e9ecb184-3eff-49b4-8c20-37b3270c5d4b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.313 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Acquiring lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.314 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.314 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.315 227766 DEBUG nova.virt.libvirt.vif [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:03:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1461939973',display_name='tempest-ServerAddressesTestJSON-server-1461939973',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1461939973',id=110,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='115a816b885b44c3956744176af911f2',ramdisk_id='',reservation_id='r-7ipsj6to',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesTestJSON-854918154',owner_user_name='tempest-ServerAddressesTestJSON-854918154-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:04:02Z,user_data=None,user_id='faad005151bd403e905a16eb7b539f14',uuid=aaea9e32-85e4-4a86-b997-e3c7d7ef6335,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "address": "fa:16:3e:5b:4b:d5", "network": {"id": "d13015e1-22b2-432c-983c-fc4995e88988", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2108569587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "115a816b885b44c3956744176af911f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ecb184-3e", "ovs_interfaceid": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.315 227766 DEBUG nova.network.os_vif_util [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Converting VIF {"id": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "address": "fa:16:3e:5b:4b:d5", "network": {"id": "d13015e1-22b2-432c-983c-fc4995e88988", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2108569587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "115a816b885b44c3956744176af911f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ecb184-3e", "ovs_interfaceid": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.316 227766 DEBUG nova.network.os_vif_util [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:4b:d5,bridge_name='br-int',has_traffic_filtering=True,id=e9ecb184-3eff-49b4-8c20-37b3270c5d4b,network=Network(d13015e1-22b2-432c-983c-fc4995e88988),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ecb184-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.316 227766 DEBUG os_vif [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:4b:d5,bridge_name='br-int',has_traffic_filtering=True,id=e9ecb184-3eff-49b4-8c20-37b3270c5d4b,network=Network(d13015e1-22b2-432c-983c-fc4995e88988),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ecb184-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.317 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.318 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.318 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.319 227766 INFO nova.compute.manager [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Took 11.59 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.320 227766 DEBUG nova.compute.manager [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.322 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.322 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape9ecb184-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.323 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape9ecb184-3e, col_values=(('external_ids', {'iface-id': 'e9ecb184-3eff-49b4-8c20-37b3270c5d4b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:4b:d5', 'vm-uuid': 'aaea9e32-85e4-4a86-b997-e3c7d7ef6335'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.324 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:07 np0005593234 NetworkManager[48942]: <info>  [1769162647.3255] manager: (tape9ecb184-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/216)
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.326 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.331 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.331 227766 INFO os_vif [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:4b:d5,bridge_name='br-int',has_traffic_filtering=True,id=e9ecb184-3eff-49b4-8c20-37b3270c5d4b,network=Network(d13015e1-22b2-432c-983c-fc4995e88988),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ecb184-3e')#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.412 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.412 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.412 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] No VIF found with MAC fa:16:3e:5b:4b:d5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.413 227766 INFO nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Using config drive#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.456 227766 DEBUG nova.storage.rbd_utils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] rbd image aaea9e32-85e4-4a86-b997-e3c7d7ef6335_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:04:07 np0005593234 podman[278014]: 2026-01-23 10:04:07.465442049 +0000 UTC m=+0.106546173 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.471 227766 INFO nova.compute.manager [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Took 12.70 seconds to build instance.#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.500 227766 DEBUG oslo_concurrency.lockutils [None req-94ddd2e2-5fc0-4c20-8b9e-090fff8f31ec 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.814 227766 INFO nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Creating config drive at /var/lib/nova/instances/aaea9e32-85e4-4a86-b997-e3c7d7ef6335/disk.config#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.820 227766 DEBUG oslo_concurrency.processutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aaea9e32-85e4-4a86-b997-e3c7d7ef6335/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgo8kh43t execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.952 227766 DEBUG oslo_concurrency.processutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aaea9e32-85e4-4a86-b997-e3c7d7ef6335/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgo8kh43t" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.992 227766 DEBUG nova.storage.rbd_utils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] rbd image aaea9e32-85e4-4a86-b997-e3c7d7ef6335_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:04:07 np0005593234 nova_compute[227762]: 2026-01-23 10:04:07.997 227766 DEBUG oslo_concurrency.processutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/aaea9e32-85e4-4a86-b997-e3c7d7ef6335/disk.config aaea9e32-85e4-4a86-b997-e3c7d7ef6335_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.157 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.158 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=41, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=40) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.160 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.179 227766 DEBUG oslo_concurrency.processutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/aaea9e32-85e4-4a86-b997-e3c7d7ef6335/disk.config aaea9e32-85e4-4a86-b997-e3c7d7ef6335_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.182s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.179 227766 INFO nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Deleting local config drive /var/lib/nova/instances/aaea9e32-85e4-4a86-b997-e3c7d7ef6335/disk.config because it was imported into RBD.#033[00m
Jan 23 05:04:08 np0005593234 kernel: tape9ecb184-3e: entered promiscuous mode
Jan 23 05:04:08 np0005593234 NetworkManager[48942]: <info>  [1769162648.2232] manager: (tape9ecb184-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/217)
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.227 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:08 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:08Z|00413|binding|INFO|Claiming lport e9ecb184-3eff-49b4-8c20-37b3270c5d4b for this chassis.
Jan 23 05:04:08 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:08Z|00414|binding|INFO|e9ecb184-3eff-49b4-8c20-37b3270c5d4b: Claiming fa:16:3e:5b:4b:d5 10.100.0.10
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.237 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:4b:d5 10.100.0.10'], port_security=['fa:16:3e:5b:4b:d5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'aaea9e32-85e4-4a86-b997-e3c7d7ef6335', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d13015e1-22b2-432c-983c-fc4995e88988', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '115a816b885b44c3956744176af911f2', 'neutron:revision_number': '2', 'neutron:security_group_ids': '558defe0-a126-4ff9-9056-65c3ceef8f86', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54a9ff3b-17d3-4c79-9db9-bfe573600737, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=e9ecb184-3eff-49b4-8c20-37b3270c5d4b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.240 144381 INFO neutron.agent.ovn.metadata.agent [-] Port e9ecb184-3eff-49b4-8c20-37b3270c5d4b in datapath d13015e1-22b2-432c-983c-fc4995e88988 bound to our chassis#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.242 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d13015e1-22b2-432c-983c-fc4995e88988#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.254 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7c7b180f-04ba-4669-a37f-9497ce43c6f6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.255 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd13015e1-21 in ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:04:08 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:08Z|00415|binding|INFO|Setting lport e9ecb184-3eff-49b4-8c20-37b3270c5d4b ovn-installed in OVS
Jan 23 05:04:08 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:08Z|00416|binding|INFO|Setting lport e9ecb184-3eff-49b4-8c20-37b3270c5d4b up in Southbound
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.257 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.260 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd13015e1-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.267 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[47e21d28-f88d-4f7d-9251-ed64cc383873]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.271 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[628975ae-b9db-496d-96e3-6a3a16028912]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 systemd-udevd[278113]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:04:08 np0005593234 systemd-machined[195626]: New machine qemu-48-instance-0000006e.
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.285 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[ab4fd1b2-ac6f-42de-97bf-8521599ce616]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 systemd[1]: Started Virtual Machine qemu-48-instance-0000006e.
Jan 23 05:04:08 np0005593234 NetworkManager[48942]: <info>  [1769162648.2915] device (tape9ecb184-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:04:08 np0005593234 NetworkManager[48942]: <info>  [1769162648.2921] device (tape9ecb184-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.302 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[de98fa72-4a10-41bd-abce-39c0698a1350]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.334 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[71e26354-35ce-4120-bc16-883e178b4ca0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 systemd-udevd[278118]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.338 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bd1d45bc-703d-4198-8695-2042352d2b0a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 NetworkManager[48942]: <info>  [1769162648.3422] manager: (tapd13015e1-20): new Veth device (/org/freedesktop/NetworkManager/Devices/218)
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.369 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[801dfc55-6f66-498a-a7fe-9c0f31e031d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.373 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[0e0cac6a-c9cd-414d-8096-0436828aeffd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 NetworkManager[48942]: <info>  [1769162648.3956] device (tapd13015e1-20): carrier: link connected
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.402 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[748ebf76-9d94-40cb-a941-669afcba0d08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.418 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[07dc8e92-783c-4bc1-bf12-2e3c81a164f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd13015e1-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:67:bf:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 654335, 'reachable_time': 23962, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278146, 'error': None, 'target': 'ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:08.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.441 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7c371d97-5879-4a25-b907-87cd4032154d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe67:bfeb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 654335, 'tstamp': 654335}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 278147, 'error': None, 'target': 'ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.458 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[db523981-9f8e-4f3b-b53c-c3a485ab2b36]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd13015e1-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:67:bf:eb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 135], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 654335, 'reachable_time': 23962, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 278148, 'error': None, 'target': 'ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.484 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d20c11be-049f-4830-adcd-66bce3429b4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.525 227766 DEBUG nova.network.neutron [req-0a14389a-810e-47fc-a304-0e92f1c30ecb req-f647a0d6-a13e-46c9-9900-e55a7a8a916b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Updated VIF entry in instance network info cache for port e9ecb184-3eff-49b4-8c20-37b3270c5d4b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.526 227766 DEBUG nova.network.neutron [req-0a14389a-810e-47fc-a304-0e92f1c30ecb req-f647a0d6-a13e-46c9-9900-e55a7a8a916b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Updating instance_info_cache with network_info: [{"id": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "address": "fa:16:3e:5b:4b:d5", "network": {"id": "d13015e1-22b2-432c-983c-fc4995e88988", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2108569587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "115a816b885b44c3956744176af911f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ecb184-3e", "ovs_interfaceid": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.535 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[df5a0728-e147-47ff-b6e4-a0296106203b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.537 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd13015e1-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.537 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.538 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd13015e1-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:08 np0005593234 kernel: tapd13015e1-20: entered promiscuous mode
Jan 23 05:04:08 np0005593234 NetworkManager[48942]: <info>  [1769162648.5417] manager: (tapd13015e1-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/219)
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.544 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd13015e1-20, col_values=(('external_ids', {'iface-id': 'de3e163a-d382-45fe-967f-0e6286ac8b0e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:08 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:08Z|00417|binding|INFO|Releasing lport de3e163a-d382-45fe-967f-0e6286ac8b0e from this chassis (sb_readonly=0)
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.540 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.560 227766 DEBUG oslo_concurrency.lockutils [req-0a14389a-810e-47fc-a304-0e92f1c30ecb req-f647a0d6-a13e-46c9-9900-e55a7a8a916b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-aaea9e32-85e4-4a86-b997-e3c7d7ef6335" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.562 227766 DEBUG nova.compute.manager [req-c849af91-d80d-48ce-bd2e-08b6dd2cf638 req-763bd660-b3eb-4c76-9d63-9fe85aabeaaf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Received event network-vif-plugged-e9ecb184-3eff-49b4-8c20-37b3270c5d4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.563 227766 DEBUG oslo_concurrency.lockutils [req-c849af91-d80d-48ce-bd2e-08b6dd2cf638 req-763bd660-b3eb-4c76-9d63-9fe85aabeaaf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.563 227766 DEBUG oslo_concurrency.lockutils [req-c849af91-d80d-48ce-bd2e-08b6dd2cf638 req-763bd660-b3eb-4c76-9d63-9fe85aabeaaf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.563 227766 DEBUG oslo_concurrency.lockutils [req-c849af91-d80d-48ce-bd2e-08b6dd2cf638 req-763bd660-b3eb-4c76-9d63-9fe85aabeaaf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.563 227766 DEBUG nova.compute.manager [req-c849af91-d80d-48ce-bd2e-08b6dd2cf638 req-763bd660-b3eb-4c76-9d63-9fe85aabeaaf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Processing event network-vif-plugged-e9ecb184-3eff-49b4-8c20-37b3270c5d4b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:04:08 np0005593234 nova_compute[227762]: 2026-01-23 10:04:08.564 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.565 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d13015e1-22b2-432c-983c-fc4995e88988.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d13015e1-22b2-432c-983c-fc4995e88988.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.567 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4724879b-f2f2-47e5-b20e-bf4379b000be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.568 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-d13015e1-22b2-432c-983c-fc4995e88988
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/d13015e1-22b2-432c-983c-fc4995e88988.pid.haproxy
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID d13015e1-22b2-432c-983c-fc4995e88988
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:04:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:08.569 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988', 'env', 'PROCESS_TAG=haproxy-d13015e1-22b2-432c-983c-fc4995e88988', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d13015e1-22b2-432c-983c-fc4995e88988.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:04:08 np0005593234 podman[278198]: 2026-01-23 10:04:08.924630157 +0000 UTC m=+0.043223572 container create 487681d8d6ef275a9f38725e5da251f164fb891ba83d9d02cc7ee1e143f28e79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 23 05:04:08 np0005593234 systemd[1]: Started libpod-conmon-487681d8d6ef275a9f38725e5da251f164fb891ba83d9d02cc7ee1e143f28e79.scope.
Jan 23 05:04:08 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:04:08 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/aa9e123fec4bf12d3aa4926aa9883d869e6f9f2b9db5337103ad9dd85f1093a1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:04:08 np0005593234 podman[278198]: 2026-01-23 10:04:08.976724987 +0000 UTC m=+0.095318422 container init 487681d8d6ef275a9f38725e5da251f164fb891ba83d9d02cc7ee1e143f28e79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:04:08 np0005593234 podman[278198]: 2026-01-23 10:04:08.983645293 +0000 UTC m=+0.102238708 container start 487681d8d6ef275a9f38725e5da251f164fb891ba83d9d02cc7ee1e143f28e79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:04:08 np0005593234 podman[278198]: 2026-01-23 10:04:08.904772796 +0000 UTC m=+0.023366231 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:04:09 np0005593234 neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988[278236]: [NOTICE]   (278240) : New worker (278243) forked
Jan 23 05:04:09 np0005593234 neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988[278236]: [NOTICE]   (278240) : Loading success.
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.035 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162649.0347455, aaea9e32-85e4-4a86-b997-e3c7d7ef6335 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.035 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] VM Started (Lifecycle Event)#033[00m
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.037 227766 DEBUG nova.compute.manager [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.042 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.046 227766 INFO nova.virt.libvirt.driver [-] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Instance spawned successfully.
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.046 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.064 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.071 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.071 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.072 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.072 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.073 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.073 227766 DEBUG nova.virt.libvirt.driver [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:04:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:09.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.080 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.111 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.112 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162649.0356367, aaea9e32-85e4-4a86-b997-e3c7d7ef6335 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.113 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] VM Paused (Lifecycle Event)
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.140 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.144 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162649.040972, aaea9e32-85e4-4a86-b997-e3c7d7ef6335 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.144 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] VM Resumed (Lifecycle Event)
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.158 227766 INFO nova.compute.manager [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Took 7.06 seconds to spawn the instance on the hypervisor.
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.158 227766 DEBUG nova.compute.manager [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.188 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.191 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.230 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.255 227766 INFO nova.compute.manager [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Took 8.22 seconds to build instance.
Jan 23 05:04:09 np0005593234 nova_compute[227762]: 2026-01-23 10:04:09.282 227766 DEBUG oslo_concurrency.lockutils [None req-caf9ba0a-7c15-4a6c-9085-2e93ce3215d9 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.453s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:04:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:10.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:10 np0005593234 nova_compute[227762]: 2026-01-23 10:04:10.652 227766 DEBUG nova.compute.manager [req-7ac9a107-8e45-4c20-974f-95dbb82db028 req-8cde9da8-235c-4267-9a41-f8e9e11dce68 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Received event network-vif-plugged-e9ecb184-3eff-49b4-8c20-37b3270c5d4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:04:10 np0005593234 nova_compute[227762]: 2026-01-23 10:04:10.653 227766 DEBUG oslo_concurrency.lockutils [req-7ac9a107-8e45-4c20-974f-95dbb82db028 req-8cde9da8-235c-4267-9a41-f8e9e11dce68 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:04:10 np0005593234 nova_compute[227762]: 2026-01-23 10:04:10.654 227766 DEBUG oslo_concurrency.lockutils [req-7ac9a107-8e45-4c20-974f-95dbb82db028 req-8cde9da8-235c-4267-9a41-f8e9e11dce68 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:04:10 np0005593234 nova_compute[227762]: 2026-01-23 10:04:10.654 227766 DEBUG oslo_concurrency.lockutils [req-7ac9a107-8e45-4c20-974f-95dbb82db028 req-8cde9da8-235c-4267-9a41-f8e9e11dce68 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:04:10 np0005593234 nova_compute[227762]: 2026-01-23 10:04:10.654 227766 DEBUG nova.compute.manager [req-7ac9a107-8e45-4c20-974f-95dbb82db028 req-8cde9da8-235c-4267-9a41-f8e9e11dce68 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] No waiting events found dispatching network-vif-plugged-e9ecb184-3eff-49b4-8c20-37b3270c5d4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:04:10 np0005593234 nova_compute[227762]: 2026-01-23 10:04:10.655 227766 WARNING nova.compute.manager [req-7ac9a107-8e45-4c20-974f-95dbb82db028 req-8cde9da8-235c-4267-9a41-f8e9e11dce68 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Received unexpected event network-vif-plugged-e9ecb184-3eff-49b4-8c20-37b3270c5d4b for instance with vm_state active and task_state None.
Jan 23 05:04:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:11.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:11 np0005593234 nova_compute[227762]: 2026-01-23 10:04:11.301 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:12 np0005593234 nova_compute[227762]: 2026-01-23 10:04:12.325 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:12.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:12 np0005593234 nova_compute[227762]: 2026-01-23 10:04:12.856 227766 DEBUG nova.compute.manager [req-37c44273-5711-4561-81b5-881d17e286a4 req-879097c4-a4df-45cd-ae61-f4ee764c4bd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-changed-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:04:12 np0005593234 nova_compute[227762]: 2026-01-23 10:04:12.857 227766 DEBUG nova.compute.manager [req-37c44273-5711-4561-81b5-881d17e286a4 req-879097c4-a4df-45cd-ae61-f4ee764c4bd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Refreshing instance network info cache due to event network-changed-115f68c4-4489-4fc8-bb90-3c2d3011db2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:04:12 np0005593234 nova_compute[227762]: 2026-01-23 10:04:12.857 227766 DEBUG oslo_concurrency.lockutils [req-37c44273-5711-4561-81b5-881d17e286a4 req-879097c4-a4df-45cd-ae61-f4ee764c4bd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:04:12 np0005593234 nova_compute[227762]: 2026-01-23 10:04:12.858 227766 DEBUG oslo_concurrency.lockutils [req-37c44273-5711-4561-81b5-881d17e286a4 req-879097c4-a4df-45cd-ae61-f4ee764c4bd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:04:12 np0005593234 nova_compute[227762]: 2026-01-23 10:04:12.858 227766 DEBUG nova.network.neutron [req-37c44273-5711-4561-81b5-881d17e286a4 req-879097c4-a4df-45cd-ae61-f4ee764c4bd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Refreshing network info cache for port 115f68c4-4489-4fc8-bb90-3c2d3011db2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:04:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.003000095s ======
Jan 23 05:04:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:13.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000095s
Jan 23 05:04:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:13Z|00418|binding|INFO|Releasing lport 702d4523-a665-42f5-9a36-57d187c0698a from this chassis (sb_readonly=0)
Jan 23 05:04:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:13Z|00419|binding|INFO|Releasing lport de3e163a-d382-45fe-967f-0e6286ac8b0e from this chassis (sb_readonly=0)
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.271 227766 DEBUG oslo_concurrency.lockutils [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Acquiring lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.271 227766 DEBUG oslo_concurrency.lockutils [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.271 227766 DEBUG oslo_concurrency.lockutils [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Acquiring lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.272 227766 DEBUG oslo_concurrency.lockutils [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.272 227766 DEBUG oslo_concurrency.lockutils [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.273 227766 INFO nova.compute.manager [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Terminating instance
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.274 227766 DEBUG nova.compute.manager [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.344 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:13 np0005593234 kernel: tape9ecb184-3e (unregistering): left promiscuous mode
Jan 23 05:04:13 np0005593234 NetworkManager[48942]: <info>  [1769162653.3813] device (tape9ecb184-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:04:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:13Z|00420|binding|INFO|Releasing lport e9ecb184-3eff-49b4-8c20-37b3270c5d4b from this chassis (sb_readonly=0)
Jan 23 05:04:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:13Z|00421|binding|INFO|Setting lport e9ecb184-3eff-49b4-8c20-37b3270c5d4b down in Southbound
Jan 23 05:04:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:13Z|00422|binding|INFO|Removing iface tape9ecb184-3e ovn-installed in OVS
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.400 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.403 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:13.408 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:4b:d5 10.100.0.10'], port_security=['fa:16:3e:5b:4b:d5 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'aaea9e32-85e4-4a86-b997-e3c7d7ef6335', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d13015e1-22b2-432c-983c-fc4995e88988', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '115a816b885b44c3956744176af911f2', 'neutron:revision_number': '4', 'neutron:security_group_ids': '558defe0-a126-4ff9-9056-65c3ceef8f86', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=54a9ff3b-17d3-4c79-9db9-bfe573600737, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=e9ecb184-3eff-49b4-8c20-37b3270c5d4b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:04:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:13.410 144381 INFO neutron.agent.ovn.metadata.agent [-] Port e9ecb184-3eff-49b4-8c20-37b3270c5d4b in datapath d13015e1-22b2-432c-983c-fc4995e88988 unbound from our chassis
Jan 23 05:04:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:13.414 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d13015e1-22b2-432c-983c-fc4995e88988, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 05:04:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:13.420 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ba6d010e-2d26-4eac-ac5b-270d7f630db6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:04:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:13.422 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988 namespace which is not needed anymore
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.443 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:13 np0005593234 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000006e.scope: Deactivated successfully.
Jan 23 05:04:13 np0005593234 systemd[1]: machine-qemu\x2d48\x2dinstance\x2d0000006e.scope: Consumed 5.104s CPU time.
Jan 23 05:04:13 np0005593234 systemd-machined[195626]: Machine qemu-48-instance-0000006e terminated.
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.502 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.508 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.516 227766 INFO nova.virt.libvirt.driver [-] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Instance destroyed successfully.
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.517 227766 DEBUG nova.objects.instance [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lazy-loading 'resources' on Instance uuid aaea9e32-85e4-4a86-b997-e3c7d7ef6335 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:04:13 np0005593234 neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988[278236]: [NOTICE]   (278240) : haproxy version is 2.8.14-c23fe91
Jan 23 05:04:13 np0005593234 neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988[278236]: [NOTICE]   (278240) : path to executable is /usr/sbin/haproxy
Jan 23 05:04:13 np0005593234 neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988[278236]: [WARNING]  (278240) : Exiting Master process...
Jan 23 05:04:13 np0005593234 neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988[278236]: [ALERT]    (278240) : Current worker (278243) exited with code 143 (Terminated)
Jan 23 05:04:13 np0005593234 neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988[278236]: [WARNING]  (278240) : All workers exited. Exiting... (0)
Jan 23 05:04:13 np0005593234 systemd[1]: libpod-487681d8d6ef275a9f38725e5da251f164fb891ba83d9d02cc7ee1e143f28e79.scope: Deactivated successfully.
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.586 227766 DEBUG nova.virt.libvirt.vif [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:03:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesTestJSON-server-1461939973',display_name='tempest-ServerAddressesTestJSON-server-1461939973',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressestestjson-server-1461939973',id=110,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:04:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='115a816b885b44c3956744176af911f2',ramdisk_id='',reservation_id='r-7ipsj6to',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio'
,image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesTestJSON-854918154',owner_user_name='tempest-ServerAddressesTestJSON-854918154-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:04:09Z,user_data=None,user_id='faad005151bd403e905a16eb7b539f14',uuid=aaea9e32-85e4-4a86-b997-e3c7d7ef6335,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "address": "fa:16:3e:5b:4b:d5", "network": {"id": "d13015e1-22b2-432c-983c-fc4995e88988", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2108569587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "115a816b885b44c3956744176af911f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ecb184-3e", "ovs_interfaceid": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.587 227766 DEBUG nova.network.os_vif_util [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Converting VIF {"id": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "address": "fa:16:3e:5b:4b:d5", "network": {"id": "d13015e1-22b2-432c-983c-fc4995e88988", "bridge": "br-int", "label": "tempest-ServerAddressesTestJSON-2108569587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "115a816b885b44c3956744176af911f2", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape9ecb184-3e", "ovs_interfaceid": "e9ecb184-3eff-49b4-8c20-37b3270c5d4b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.587 227766 DEBUG nova.network.os_vif_util [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:4b:d5,bridge_name='br-int',has_traffic_filtering=True,id=e9ecb184-3eff-49b4-8c20-37b3270c5d4b,network=Network(d13015e1-22b2-432c-983c-fc4995e88988),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ecb184-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.588 227766 DEBUG os_vif [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:4b:d5,bridge_name='br-int',has_traffic_filtering=True,id=e9ecb184-3eff-49b4-8c20-37b3270c5d4b,network=Network(d13015e1-22b2-432c-983c-fc4995e88988),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ecb184-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 05:04:13 np0005593234 podman[278285]: 2026-01-23 10:04:13.589769458 +0000 UTC m=+0.044372208 container died 487681d8d6ef275a9f38725e5da251f164fb891ba83d9d02cc7ee1e143f28e79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.590 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.590 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9ecb184-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.592 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.596 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.599 227766 INFO os_vif [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:4b:d5,bridge_name='br-int',has_traffic_filtering=True,id=e9ecb184-3eff-49b4-8c20-37b3270c5d4b,network=Network(d13015e1-22b2-432c-983c-fc4995e88988),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape9ecb184-3e')#033[00m
Jan 23 05:04:13 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-487681d8d6ef275a9f38725e5da251f164fb891ba83d9d02cc7ee1e143f28e79-userdata-shm.mount: Deactivated successfully.
Jan 23 05:04:13 np0005593234 systemd[1]: var-lib-containers-storage-overlay-aa9e123fec4bf12d3aa4926aa9883d869e6f9f2b9db5337103ad9dd85f1093a1-merged.mount: Deactivated successfully.
Jan 23 05:04:13 np0005593234 podman[278285]: 2026-01-23 10:04:13.634547129 +0000 UTC m=+0.089149879 container cleanup 487681d8d6ef275a9f38725e5da251f164fb891ba83d9d02cc7ee1e143f28e79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:04:13 np0005593234 systemd[1]: libpod-conmon-487681d8d6ef275a9f38725e5da251f164fb891ba83d9d02cc7ee1e143f28e79.scope: Deactivated successfully.
Jan 23 05:04:13 np0005593234 podman[278326]: 2026-01-23 10:04:13.691217751 +0000 UTC m=+0.037380650 container remove 487681d8d6ef275a9f38725e5da251f164fb891ba83d9d02cc7ee1e143f28e79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:04:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:13.696 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c354475a-cb90-40d8-91ce-6294872026c3]: (4, ('Fri Jan 23 10:04:13 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988 (487681d8d6ef275a9f38725e5da251f164fb891ba83d9d02cc7ee1e143f28e79)\n487681d8d6ef275a9f38725e5da251f164fb891ba83d9d02cc7ee1e143f28e79\nFri Jan 23 10:04:13 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988 (487681d8d6ef275a9f38725e5da251f164fb891ba83d9d02cc7ee1e143f28e79)\n487681d8d6ef275a9f38725e5da251f164fb891ba83d9d02cc7ee1e143f28e79\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:13.698 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f9a6ad7c-1b46-44a5-98c6-9b13ed6dd450]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:13.699 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd13015e1-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.701 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:13 np0005593234 kernel: tapd13015e1-20: left promiscuous mode
Jan 23 05:04:13 np0005593234 nova_compute[227762]: 2026-01-23 10:04:13.719 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:13.723 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b3eb9bf9-c28f-47bb-89fd-26f467694726]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:13.744 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4de5a5df-fdae-4f01-a586-1804c6c455f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:13.746 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[48accfb4-5bed-4fa9-8de2-81e75cfe9d58]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:13.761 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[16bc3d77-6e10-4333-92f8-0f28db8ccf90]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 654328, 'reachable_time': 19679, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278341, 'error': None, 'target': 'ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:13 np0005593234 systemd[1]: run-netns-ovnmeta\x2dd13015e1\x2d22b2\x2d432c\x2d983c\x2dfc4995e88988.mount: Deactivated successfully.
Jan 23 05:04:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:13.764 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d13015e1-22b2-432c-983c-fc4995e88988 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:04:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:13.765 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[57fc8ab2-3095-4614-950f-01fdd33b4477]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:14.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:14 np0005593234 nova_compute[227762]: 2026-01-23 10:04:14.955 227766 DEBUG nova.compute.manager [req-5735354e-7cc1-46c5-9e25-30ba76b13a62 req-ab1efbe2-7c22-490f-ab2d-0bc07a320052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Received event network-vif-unplugged-e9ecb184-3eff-49b4-8c20-37b3270c5d4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:14 np0005593234 nova_compute[227762]: 2026-01-23 10:04:14.955 227766 DEBUG oslo_concurrency.lockutils [req-5735354e-7cc1-46c5-9e25-30ba76b13a62 req-ab1efbe2-7c22-490f-ab2d-0bc07a320052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:14 np0005593234 nova_compute[227762]: 2026-01-23 10:04:14.956 227766 DEBUG oslo_concurrency.lockutils [req-5735354e-7cc1-46c5-9e25-30ba76b13a62 req-ab1efbe2-7c22-490f-ab2d-0bc07a320052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:14 np0005593234 nova_compute[227762]: 2026-01-23 10:04:14.956 227766 DEBUG oslo_concurrency.lockutils [req-5735354e-7cc1-46c5-9e25-30ba76b13a62 req-ab1efbe2-7c22-490f-ab2d-0bc07a320052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:14 np0005593234 nova_compute[227762]: 2026-01-23 10:04:14.956 227766 DEBUG nova.compute.manager [req-5735354e-7cc1-46c5-9e25-30ba76b13a62 req-ab1efbe2-7c22-490f-ab2d-0bc07a320052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] No waiting events found dispatching network-vif-unplugged-e9ecb184-3eff-49b4-8c20-37b3270c5d4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:14 np0005593234 nova_compute[227762]: 2026-01-23 10:04:14.956 227766 DEBUG nova.compute.manager [req-5735354e-7cc1-46c5-9e25-30ba76b13a62 req-ab1efbe2-7c22-490f-ab2d-0bc07a320052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Received event network-vif-unplugged-e9ecb184-3eff-49b4-8c20-37b3270c5d4b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:04:14 np0005593234 nova_compute[227762]: 2026-01-23 10:04:14.956 227766 DEBUG nova.compute.manager [req-5735354e-7cc1-46c5-9e25-30ba76b13a62 req-ab1efbe2-7c22-490f-ab2d-0bc07a320052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Received event network-vif-plugged-e9ecb184-3eff-49b4-8c20-37b3270c5d4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:14 np0005593234 nova_compute[227762]: 2026-01-23 10:04:14.957 227766 DEBUG oslo_concurrency.lockutils [req-5735354e-7cc1-46c5-9e25-30ba76b13a62 req-ab1efbe2-7c22-490f-ab2d-0bc07a320052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:14 np0005593234 nova_compute[227762]: 2026-01-23 10:04:14.957 227766 DEBUG oslo_concurrency.lockutils [req-5735354e-7cc1-46c5-9e25-30ba76b13a62 req-ab1efbe2-7c22-490f-ab2d-0bc07a320052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:14 np0005593234 nova_compute[227762]: 2026-01-23 10:04:14.957 227766 DEBUG oslo_concurrency.lockutils [req-5735354e-7cc1-46c5-9e25-30ba76b13a62 req-ab1efbe2-7c22-490f-ab2d-0bc07a320052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:14 np0005593234 nova_compute[227762]: 2026-01-23 10:04:14.957 227766 DEBUG nova.compute.manager [req-5735354e-7cc1-46c5-9e25-30ba76b13a62 req-ab1efbe2-7c22-490f-ab2d-0bc07a320052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] No waiting events found dispatching network-vif-plugged-e9ecb184-3eff-49b4-8c20-37b3270c5d4b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:14 np0005593234 nova_compute[227762]: 2026-01-23 10:04:14.957 227766 WARNING nova.compute.manager [req-5735354e-7cc1-46c5-9e25-30ba76b13a62 req-ab1efbe2-7c22-490f-ab2d-0bc07a320052 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Received unexpected event network-vif-plugged-e9ecb184-3eff-49b4-8c20-37b3270c5d4b for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:04:14 np0005593234 nova_compute[227762]: 2026-01-23 10:04:14.983 227766 DEBUG nova.network.neutron [req-37c44273-5711-4561-81b5-881d17e286a4 req-879097c4-a4df-45cd-ae61-f4ee764c4bd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updated VIF entry in instance network info cache for port 115f68c4-4489-4fc8-bb90-3c2d3011db2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:04:14 np0005593234 nova_compute[227762]: 2026-01-23 10:04:14.984 227766 DEBUG nova.network.neutron [req-37c44273-5711-4561-81b5-881d17e286a4 req-879097c4-a4df-45cd-ae61-f4ee764c4bd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating instance_info_cache with network_info: [{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:04:15 np0005593234 nova_compute[227762]: 2026-01-23 10:04:15.012 227766 DEBUG oslo_concurrency.lockutils [req-37c44273-5711-4561-81b5-881d17e286a4 req-879097c4-a4df-45cd-ae61-f4ee764c4bd5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:04:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:15.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:15 np0005593234 nova_compute[227762]: 2026-01-23 10:04:15.709 227766 INFO nova.virt.libvirt.driver [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Deleting instance files /var/lib/nova/instances/aaea9e32-85e4-4a86-b997-e3c7d7ef6335_del#033[00m
Jan 23 05:04:15 np0005593234 nova_compute[227762]: 2026-01-23 10:04:15.710 227766 INFO nova.virt.libvirt.driver [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Deletion of /var/lib/nova/instances/aaea9e32-85e4-4a86-b997-e3c7d7ef6335_del complete#033[00m
Jan 23 05:04:15 np0005593234 nova_compute[227762]: 2026-01-23 10:04:15.794 227766 INFO nova.compute.manager [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Took 2.52 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:04:15 np0005593234 nova_compute[227762]: 2026-01-23 10:04:15.794 227766 DEBUG oslo.service.loopingcall [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:04:15 np0005593234 nova_compute[227762]: 2026-01-23 10:04:15.795 227766 DEBUG nova.compute.manager [-] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:04:15 np0005593234 nova_compute[227762]: 2026-01-23 10:04:15.795 227766 DEBUG nova.network.neutron [-] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:04:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:16 np0005593234 nova_compute[227762]: 2026-01-23 10:04:16.302 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:04:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:16.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:04:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:04:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3830183895' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:04:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:17.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:17 np0005593234 nova_compute[227762]: 2026-01-23 10:04:17.197 227766 DEBUG nova.network.neutron [-] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:04:17 np0005593234 nova_compute[227762]: 2026-01-23 10:04:17.216 227766 INFO nova.compute.manager [-] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Took 1.42 seconds to deallocate network for instance.#033[00m
Jan 23 05:04:17 np0005593234 nova_compute[227762]: 2026-01-23 10:04:17.272 227766 DEBUG oslo_concurrency.lockutils [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:17 np0005593234 nova_compute[227762]: 2026-01-23 10:04:17.272 227766 DEBUG oslo_concurrency.lockutils [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:17 np0005593234 nova_compute[227762]: 2026-01-23 10:04:17.358 227766 DEBUG oslo_concurrency.processutils [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:17 np0005593234 nova_compute[227762]: 2026-01-23 10:04:17.393 227766 DEBUG nova.compute.manager [req-bc7d12bc-8106-4a44-b2b9-9d5edacfbf81 req-f170b5b7-3cc4-4fd5-b8a4-a1a5eddb0213 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Received event network-vif-deleted-e9ecb184-3eff-49b4-8c20-37b3270c5d4b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:04:17 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1707263600' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:04:17 np0005593234 nova_compute[227762]: 2026-01-23 10:04:17.810 227766 DEBUG oslo_concurrency.processutils [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:17 np0005593234 nova_compute[227762]: 2026-01-23 10:04:17.816 227766 DEBUG nova.compute.provider_tree [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:04:18 np0005593234 nova_compute[227762]: 2026-01-23 10:04:18.069 227766 DEBUG nova.scheduler.client.report [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:04:18 np0005593234 nova_compute[227762]: 2026-01-23 10:04:18.116 227766 DEBUG oslo_concurrency.lockutils [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.844s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:18.162 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '41'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:18 np0005593234 nova_compute[227762]: 2026-01-23 10:04:18.164 227766 INFO nova.scheduler.client.report [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Deleted allocations for instance aaea9e32-85e4-4a86-b997-e3c7d7ef6335#033[00m
Jan 23 05:04:18 np0005593234 nova_compute[227762]: 2026-01-23 10:04:18.278 227766 DEBUG oslo_concurrency.lockutils [None req-361bc044-8ae8-4f59-ab95-1e0bdef1d561 faad005151bd403e905a16eb7b539f14 115a816b885b44c3956744176af911f2 - - default default] Lock "aaea9e32-85e4-4a86-b997-e3c7d7ef6335" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:18.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:18 np0005593234 nova_compute[227762]: 2026-01-23 10:04:18.594 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:19.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:19 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:19Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e2:de:d3 10.100.0.7
Jan 23 05:04:19 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:19Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e2:de:d3 10.100.0.7
Jan 23 05:04:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:20.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:21.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:21 np0005593234 nova_compute[227762]: 2026-01-23 10:04:21.303 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:04:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:22.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:04:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:04:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:23.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:04:23 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:23Z|00423|binding|INFO|Releasing lport 702d4523-a665-42f5-9a36-57d187c0698a from this chassis (sb_readonly=0)
Jan 23 05:04:23 np0005593234 nova_compute[227762]: 2026-01-23 10:04:23.417 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:23 np0005593234 nova_compute[227762]: 2026-01-23 10:04:23.597 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:23 np0005593234 nova_compute[227762]: 2026-01-23 10:04:23.965 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:04:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:24.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:04:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:25.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:26 np0005593234 nova_compute[227762]: 2026-01-23 10:04:26.306 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:26.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:26 np0005593234 podman[278423]: 2026-01-23 10:04:26.762364802 +0000 UTC m=+0.050700217 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 05:04:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:27.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:27 np0005593234 nova_compute[227762]: 2026-01-23 10:04:27.774 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:28.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:28 np0005593234 nova_compute[227762]: 2026-01-23 10:04:28.460 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:28 np0005593234 nova_compute[227762]: 2026-01-23 10:04:28.514 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162653.5126889, aaea9e32-85e4-4a86-b997-e3c7d7ef6335 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:04:28 np0005593234 nova_compute[227762]: 2026-01-23 10:04:28.514 227766 INFO nova.compute.manager [-] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:04:28 np0005593234 nova_compute[227762]: 2026-01-23 10:04:28.546 227766 DEBUG nova.compute.manager [None req-9eb14ea5-1e38-4fdb-8cf9-b05ee93de2b8 - - - - - -] [instance: aaea9e32-85e4-4a86-b997-e3c7d7ef6335] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:04:28 np0005593234 nova_compute[227762]: 2026-01-23 10:04:28.598 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:29.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:30.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:30 np0005593234 nova_compute[227762]: 2026-01-23 10:04:30.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:30 np0005593234 nova_compute[227762]: 2026-01-23 10:04:30.797 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:30 np0005593234 nova_compute[227762]: 2026-01-23 10:04:30.797 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:30 np0005593234 nova_compute[227762]: 2026-01-23 10:04:30.797 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:30 np0005593234 nova_compute[227762]: 2026-01-23 10:04:30.797 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:04:30 np0005593234 nova_compute[227762]: 2026-01-23 10:04:30.798 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 05:04:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:31.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 05:04:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:04:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1814704474' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:04:31 np0005593234 nova_compute[227762]: 2026-01-23 10:04:31.232 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:31 np0005593234 nova_compute[227762]: 2026-01-23 10:04:31.307 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:31 np0005593234 nova_compute[227762]: 2026-01-23 10:04:31.324 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000006d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:04:31 np0005593234 nova_compute[227762]: 2026-01-23 10:04:31.324 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000006d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:04:31 np0005593234 nova_compute[227762]: 2026-01-23 10:04:31.467 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:04:31 np0005593234 nova_compute[227762]: 2026-01-23 10:04:31.469 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4321MB free_disk=20.87621307373047GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:04:31 np0005593234 nova_compute[227762]: 2026-01-23 10:04:31.469 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:31 np0005593234 nova_compute[227762]: 2026-01-23 10:04:31.469 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:31 np0005593234 nova_compute[227762]: 2026-01-23 10:04:31.535 227766 INFO nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating resource usage from migration cb27169a-f251-4da9-9cb2-2425cc564251#033[00m
Jan 23 05:04:31 np0005593234 nova_compute[227762]: 2026-01-23 10:04:31.570 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Migration cb27169a-f251-4da9-9cb2-2425cc564251 is active on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Jan 23 05:04:31 np0005593234 nova_compute[227762]: 2026-01-23 10:04:31.571 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:04:31 np0005593234 nova_compute[227762]: 2026-01-23 10:04:31.571 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:04:31 np0005593234 nova_compute[227762]: 2026-01-23 10:04:31.632 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:04:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1687107721' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:04:32 np0005593234 nova_compute[227762]: 2026-01-23 10:04:32.069 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:32 np0005593234 nova_compute[227762]: 2026-01-23 10:04:32.076 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:04:32 np0005593234 nova_compute[227762]: 2026-01-23 10:04:32.285 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:04:32 np0005593234 nova_compute[227762]: 2026-01-23 10:04:32.323 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:04:32 np0005593234 nova_compute[227762]: 2026-01-23 10:04:32.324 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.854s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:32.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:32 np0005593234 nova_compute[227762]: 2026-01-23 10:04:32.563 227766 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:04:32 np0005593234 nova_compute[227762]: 2026-01-23 10:04:32.564 227766 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquired lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:04:32 np0005593234 nova_compute[227762]: 2026-01-23 10:04:32.564 227766 DEBUG nova.network.neutron [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:04:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:33.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:33 np0005593234 nova_compute[227762]: 2026-01-23 10:04:33.601 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:34 np0005593234 nova_compute[227762]: 2026-01-23 10:04:34.324 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:34.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:34 np0005593234 nova_compute[227762]: 2026-01-23 10:04:34.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:34 np0005593234 nova_compute[227762]: 2026-01-23 10:04:34.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:04:34 np0005593234 nova_compute[227762]: 2026-01-23 10:04:34.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:04:35 np0005593234 nova_compute[227762]: 2026-01-23 10:04:35.044 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:04:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:35.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:35 np0005593234 nova_compute[227762]: 2026-01-23 10:04:35.445 227766 DEBUG nova.network.neutron [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating instance_info_cache with network_info: [{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:04:35 np0005593234 nova_compute[227762]: 2026-01-23 10:04:35.463 227766 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Releasing lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:04:35 np0005593234 nova_compute[227762]: 2026-01-23 10:04:35.465 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:04:35 np0005593234 nova_compute[227762]: 2026-01-23 10:04:35.465 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:04:35 np0005593234 nova_compute[227762]: 2026-01-23 10:04:35.466 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ae2a211d-e923-498b-9ceb-97274a2fd725 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:04:35 np0005593234 nova_compute[227762]: 2026-01-23 10:04:35.559 227766 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 23 05:04:35 np0005593234 nova_compute[227762]: 2026-01-23 10:04:35.559 227766 DEBUG nova.virt.libvirt.volume.remotefs [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Creating file /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/a0bf95dccd314f74a2bb16f56d743e83.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 23 05:04:35 np0005593234 nova_compute[227762]: 2026-01-23 10:04:35.560 227766 DEBUG oslo_concurrency.processutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/a0bf95dccd314f74a2bb16f56d743e83.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:36 np0005593234 nova_compute[227762]: 2026-01-23 10:04:36.059 227766 DEBUG oslo_concurrency.processutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/a0bf95dccd314f74a2bb16f56d743e83.tmp" returned: 1 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:36 np0005593234 nova_compute[227762]: 2026-01-23 10:04:36.060 227766 DEBUG oslo_concurrency.processutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/a0bf95dccd314f74a2bb16f56d743e83.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 05:04:36 np0005593234 nova_compute[227762]: 2026-01-23 10:04:36.060 227766 DEBUG nova.virt.libvirt.volume.remotefs [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Creating directory /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 23 05:04:36 np0005593234 nova_compute[227762]: 2026-01-23 10:04:36.061 227766 DEBUG oslo_concurrency.processutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:04:36 np0005593234 nova_compute[227762]: 2026-01-23 10:04:36.289 227766 DEBUG oslo_concurrency.processutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725" returned: 0 in 0.228s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:04:36 np0005593234 nova_compute[227762]: 2026-01-23 10:04:36.292 227766 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:04:36 np0005593234 nova_compute[227762]: 2026-01-23 10:04:36.309 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:36.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:36 np0005593234 nova_compute[227762]: 2026-01-23 10:04:36.921 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:37.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:37 np0005593234 nova_compute[227762]: 2026-01-23 10:04:37.220 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating instance_info_cache with network_info: [{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:04:37 np0005593234 nova_compute[227762]: 2026-01-23 10:04:37.248 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:04:37 np0005593234 nova_compute[227762]: 2026-01-23 10:04:37.249 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:04:37 np0005593234 nova_compute[227762]: 2026-01-23 10:04:37.249 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:37 np0005593234 nova_compute[227762]: 2026-01-23 10:04:37.250 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:37 np0005593234 nova_compute[227762]: 2026-01-23 10:04:37.276 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Triggering sync for uuid ae2a211d-e923-498b-9ceb-97274a2fd725 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:04:37 np0005593234 nova_compute[227762]: 2026-01-23 10:04:37.277 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:37 np0005593234 nova_compute[227762]: 2026-01-23 10:04:37.277 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:37 np0005593234 nova_compute[227762]: 2026-01-23 10:04:37.277 227766 INFO nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] During sync_power_state the instance has a pending task (resize_migrating). Skip.#033[00m
Jan 23 05:04:37 np0005593234 nova_compute[227762]: 2026-01-23 10:04:37.277 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:37 np0005593234 nova_compute[227762]: 2026-01-23 10:04:37.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:37 np0005593234 nova_compute[227762]: 2026-01-23 10:04:37.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:04:37 np0005593234 podman[278494]: 2026-01-23 10:04:37.772502051 +0000 UTC m=+0.070990931 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:04:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:38.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:38 np0005593234 kernel: tap115f68c4-44 (unregistering): left promiscuous mode
Jan 23 05:04:38 np0005593234 NetworkManager[48942]: <info>  [1769162678.5511] device (tap115f68c4-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:04:38 np0005593234 nova_compute[227762]: 2026-01-23 10:04:38.565 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:38Z|00424|binding|INFO|Releasing lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d from this chassis (sb_readonly=0)
Jan 23 05:04:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:38Z|00425|binding|INFO|Setting lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d down in Southbound
Jan 23 05:04:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:04:38Z|00426|binding|INFO|Removing iface tap115f68c4-44 ovn-installed in OVS
Jan 23 05:04:38 np0005593234 nova_compute[227762]: 2026-01-23 10:04:38.568 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:38.573 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:de:d3 10.100.0.7'], port_security=['fa:16:3e:e2:de:d3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ae2a211d-e923-498b-9ceb-97274a2fd725', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=115f68c4-4489-4fc8-bb90-3c2d3011db2d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:04:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:38.574 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 115f68c4-4489-4fc8-bb90-3c2d3011db2d in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c unbound from our chassis#033[00m
Jan 23 05:04:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:38.575 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee03d7c9-e107-41bf-95cc-5508578ad66c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:04:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:38.576 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b4047b34-4e69-4d83-9029-e5bfbce06f32]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:38.577 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace which is not needed anymore#033[00m
Jan 23 05:04:38 np0005593234 nova_compute[227762]: 2026-01-23 10:04:38.585 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:38 np0005593234 nova_compute[227762]: 2026-01-23 10:04:38.603 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:38 np0005593234 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000006d.scope: Deactivated successfully.
Jan 23 05:04:38 np0005593234 systemd[1]: machine-qemu\x2d47\x2dinstance\x2d0000006d.scope: Consumed 14.424s CPU time.
Jan 23 05:04:38 np0005593234 systemd-machined[195626]: Machine qemu-47-instance-0000006d terminated.
Jan 23 05:04:38 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[277933]: [NOTICE]   (277937) : haproxy version is 2.8.14-c23fe91
Jan 23 05:04:38 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[277933]: [NOTICE]   (277937) : path to executable is /usr/sbin/haproxy
Jan 23 05:04:38 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[277933]: [WARNING]  (277937) : Exiting Master process...
Jan 23 05:04:38 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[277933]: [ALERT]    (277937) : Current worker (277939) exited with code 143 (Terminated)
Jan 23 05:04:38 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[277933]: [WARNING]  (277937) : All workers exited. Exiting... (0)
Jan 23 05:04:38 np0005593234 systemd[1]: libpod-535089d5940d61787f9f1f0c226b551725d81e2342a5508c27b7c438a7fdc856.scope: Deactivated successfully.
Jan 23 05:04:38 np0005593234 podman[278547]: 2026-01-23 10:04:38.7129732 +0000 UTC m=+0.051213243 container died 535089d5940d61787f9f1f0c226b551725d81e2342a5508c27b7c438a7fdc856 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:04:38 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-535089d5940d61787f9f1f0c226b551725d81e2342a5508c27b7c438a7fdc856-userdata-shm.mount: Deactivated successfully.
Jan 23 05:04:38 np0005593234 systemd[1]: var-lib-containers-storage-overlay-a7f5b49db158623d9efe05cca9d020906ed1480abd489c30793ee2bb660dc4e5-merged.mount: Deactivated successfully.
Jan 23 05:04:38 np0005593234 podman[278547]: 2026-01-23 10:04:38.747909693 +0000 UTC m=+0.086149746 container cleanup 535089d5940d61787f9f1f0c226b551725d81e2342a5508c27b7c438a7fdc856 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 05:04:38 np0005593234 systemd[1]: libpod-conmon-535089d5940d61787f9f1f0c226b551725d81e2342a5508c27b7c438a7fdc856.scope: Deactivated successfully.
Jan 23 05:04:38 np0005593234 podman[278578]: 2026-01-23 10:04:38.811631765 +0000 UTC m=+0.043105249 container remove 535089d5940d61787f9f1f0c226b551725d81e2342a5508c27b7c438a7fdc856 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 05:04:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:38.817 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6c2aace2-2ce0-46a6-b715-d43b6788aa1d]: (4, ('Fri Jan 23 10:04:38 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (535089d5940d61787f9f1f0c226b551725d81e2342a5508c27b7c438a7fdc856)\n535089d5940d61787f9f1f0c226b551725d81e2342a5508c27b7c438a7fdc856\nFri Jan 23 10:04:38 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (535089d5940d61787f9f1f0c226b551725d81e2342a5508c27b7c438a7fdc856)\n535089d5940d61787f9f1f0c226b551725d81e2342a5508c27b7c438a7fdc856\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:38.820 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e071e0cb-b879-41bb-bc24-a216b550e6a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:38.821 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:38 np0005593234 nova_compute[227762]: 2026-01-23 10:04:38.822 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:38 np0005593234 kernel: tapee03d7c9-e0: left promiscuous mode
Jan 23 05:04:38 np0005593234 nova_compute[227762]: 2026-01-23 10:04:38.840 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:38.843 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[36b937ae-af8e-4369-901a-3feb7a61288f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:38.856 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2cf3e8c8-48c3-4ad4-82fc-b273ab472d8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:38.857 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c9f063e3-35b8-46c5-8f84-fec54dc9448b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:38.870 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aba1e075-5c8e-43fc-a0f3-4b3563de95dc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 653970, 'reachable_time': 22283, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 278607, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:38.873 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:04:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:38.873 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[bf970fab-fb8f-411f-82a2-65d68dfc6414]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:04:38 np0005593234 systemd[1]: run-netns-ovnmeta\x2dee03d7c9\x2de107\x2d41bf\x2d95cc\x2d5508578ad66c.mount: Deactivated successfully.
Jan 23 05:04:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:39.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.309 227766 INFO nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.315 227766 INFO nova.virt.libvirt.driver [-] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Instance destroyed successfully.#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.315 227766 DEBUG nova.virt.libvirt.vif [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-782058218',display_name='tempest-ServerActionsTestJSON-server-782058218',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-782058218',id=109,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:04:07Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-05cy3qfb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:04:31Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=ae2a211d-e923-498b-9ceb-97274a2fd725,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-267124880-network", "vif_mac": "fa:16:3e:e2:de:d3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.316 227766 DEBUG nova.network.os_vif_util [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestJSON-267124880-network", "vif_mac": "fa:16:3e:e2:de:d3"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.316 227766 DEBUG nova.network.os_vif_util [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.317 227766 DEBUG os_vif [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.318 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.318 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap115f68c4-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.320 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.322 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.324 227766 INFO os_vif [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44')#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.327 227766 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] skipping disk for instance-0000006d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.327 227766 DEBUG nova.virt.libvirt.driver [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] skipping disk for instance-0000006d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.754 227766 DEBUG neutronclient.v2_0.client [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 115f68c4-4489-4fc8-bb90-3c2d3011db2d for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.843 227766 DEBUG nova.compute.manager [req-2d0b9471-2db6-43c0-aefc-020afb5b8429 req-0c754aeb-b2ec-4809-8964-326658b58d46 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.844 227766 DEBUG oslo_concurrency.lockutils [req-2d0b9471-2db6-43c0-aefc-020afb5b8429 req-0c754aeb-b2ec-4809-8964-326658b58d46 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.844 227766 DEBUG oslo_concurrency.lockutils [req-2d0b9471-2db6-43c0-aefc-020afb5b8429 req-0c754aeb-b2ec-4809-8964-326658b58d46 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.844 227766 DEBUG oslo_concurrency.lockutils [req-2d0b9471-2db6-43c0-aefc-020afb5b8429 req-0c754aeb-b2ec-4809-8964-326658b58d46 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.844 227766 DEBUG nova.compute.manager [req-2d0b9471-2db6-43c0-aefc-020afb5b8429 req-0c754aeb-b2ec-4809-8964-326658b58d46 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.844 227766 WARNING nova.compute.manager [req-2d0b9471-2db6-43c0-aefc-020afb5b8429 req-0c754aeb-b2ec-4809-8964-326658b58d46 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.986 227766 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.986 227766 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:39 np0005593234 nova_compute[227762]: 2026-01-23 10:04:39.986 227766 DEBUG oslo_concurrency.lockutils [None req-be9cb11a-fe19-40d2-8c38-6e703fc3ea59 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:40.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:40 np0005593234 nova_compute[227762]: 2026-01-23 10:04:40.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:41.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:41 np0005593234 nova_compute[227762]: 2026-01-23 10:04:41.310 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:04:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:04:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:04:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:04:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:04:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:04:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:04:41 np0005593234 nova_compute[227762]: 2026-01-23 10:04:41.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:42.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:42 np0005593234 nova_compute[227762]: 2026-01-23 10:04:42.687 227766 DEBUG nova.compute.manager [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:42 np0005593234 nova_compute[227762]: 2026-01-23 10:04:42.688 227766 DEBUG oslo_concurrency.lockutils [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:42 np0005593234 nova_compute[227762]: 2026-01-23 10:04:42.688 227766 DEBUG oslo_concurrency.lockutils [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:42 np0005593234 nova_compute[227762]: 2026-01-23 10:04:42.688 227766 DEBUG oslo_concurrency.lockutils [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:42 np0005593234 nova_compute[227762]: 2026-01-23 10:04:42.688 227766 DEBUG nova.compute.manager [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:42 np0005593234 nova_compute[227762]: 2026-01-23 10:04:42.688 227766 WARNING nova.compute.manager [req-4ee08716-68c4-47f7-b88f-b6bc9b0c5074 req-dd6292c0-7f39-4177-b80d-81bbd26905d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 05:04:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:42.841 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:42.842 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:04:42.842 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:04:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:43.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:04:43 np0005593234 nova_compute[227762]: 2026-01-23 10:04:43.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:04:44 np0005593234 nova_compute[227762]: 2026-01-23 10:04:44.321 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:44 np0005593234 nova_compute[227762]: 2026-01-23 10:04:44.377 227766 DEBUG nova.compute.manager [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-changed-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:44 np0005593234 nova_compute[227762]: 2026-01-23 10:04:44.377 227766 DEBUG nova.compute.manager [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Refreshing instance network info cache due to event network-changed-115f68c4-4489-4fc8-bb90-3c2d3011db2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:04:44 np0005593234 nova_compute[227762]: 2026-01-23 10:04:44.377 227766 DEBUG oslo_concurrency.lockutils [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:04:44 np0005593234 nova_compute[227762]: 2026-01-23 10:04:44.377 227766 DEBUG oslo_concurrency.lockutils [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:04:44 np0005593234 nova_compute[227762]: 2026-01-23 10:04:44.377 227766 DEBUG nova.network.neutron [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Refreshing network info cache for port 115f68c4-4489-4fc8-bb90-3c2d3011db2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:04:44 np0005593234 nova_compute[227762]: 2026-01-23 10:04:44.438 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:44.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:45.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e265 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:46 np0005593234 nova_compute[227762]: 2026-01-23 10:04:46.312 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:46.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:04:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:47.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:04:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:04:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:04:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:48.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:49.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:49 np0005593234 nova_compute[227762]: 2026-01-23 10:04:49.323 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e266 e266: 3 total, 3 up, 3 in
Jan 23 05:04:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:50.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:51.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:51 np0005593234 nova_compute[227762]: 2026-01-23 10:04:51.315 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:51 np0005593234 nova_compute[227762]: 2026-01-23 10:04:51.567 227766 DEBUG nova.network.neutron [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updated VIF entry in instance network info cache for port 115f68c4-4489-4fc8-bb90-3c2d3011db2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:04:51 np0005593234 nova_compute[227762]: 2026-01-23 10:04:51.568 227766 DEBUG nova.network.neutron [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating instance_info_cache with network_info: [{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:04:51 np0005593234 nova_compute[227762]: 2026-01-23 10:04:51.606 227766 DEBUG oslo_concurrency.lockutils [req-b8a9185b-692b-46fb-be97-d13523ef0afd req-0c6d9d71-6f0f-48da-a14d-38f055c94832 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:04:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:52.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:52 np0005593234 nova_compute[227762]: 2026-01-23 10:04:52.701 227766 DEBUG nova.compute.manager [req-1a8dcd39-2d64-4507-a807-1fe709006329 req-b5405643-c9f7-405c-bc73-f2a5375c0f32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:52 np0005593234 nova_compute[227762]: 2026-01-23 10:04:52.702 227766 DEBUG oslo_concurrency.lockutils [req-1a8dcd39-2d64-4507-a807-1fe709006329 req-b5405643-c9f7-405c-bc73-f2a5375c0f32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:52 np0005593234 nova_compute[227762]: 2026-01-23 10:04:52.702 227766 DEBUG oslo_concurrency.lockutils [req-1a8dcd39-2d64-4507-a807-1fe709006329 req-b5405643-c9f7-405c-bc73-f2a5375c0f32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:52 np0005593234 nova_compute[227762]: 2026-01-23 10:04:52.702 227766 DEBUG oslo_concurrency.lockutils [req-1a8dcd39-2d64-4507-a807-1fe709006329 req-b5405643-c9f7-405c-bc73-f2a5375c0f32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:52 np0005593234 nova_compute[227762]: 2026-01-23 10:04:52.702 227766 DEBUG nova.compute.manager [req-1a8dcd39-2d64-4507-a807-1fe709006329 req-b5405643-c9f7-405c-bc73-f2a5375c0f32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:52 np0005593234 nova_compute[227762]: 2026-01-23 10:04:52.702 227766 WARNING nova.compute.manager [req-1a8dcd39-2d64-4507-a807-1fe709006329 req-b5405643-c9f7-405c-bc73-f2a5375c0f32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state resized and task_state None.#033[00m
Jan 23 05:04:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:53.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:53 np0005593234 nova_compute[227762]: 2026-01-23 10:04:53.808 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162678.8074932, ae2a211d-e923-498b-9ceb-97274a2fd725 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:04:53 np0005593234 nova_compute[227762]: 2026-01-23 10:04:53.808 227766 INFO nova.compute.manager [-] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:04:53 np0005593234 nova_compute[227762]: 2026-01-23 10:04:53.831 227766 DEBUG nova.compute.manager [None req-f2788070-0c75-4bf3-ac3e-8e62c5d21d4f - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:04:53 np0005593234 nova_compute[227762]: 2026-01-23 10:04:53.834 227766 DEBUG nova.compute.manager [None req-f2788070-0c75-4bf3-ac3e-8e62c5d21d4f - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:04:53 np0005593234 nova_compute[227762]: 2026-01-23 10:04:53.876 227766 INFO nova.compute.manager [None req-f2788070-0c75-4bf3-ac3e-8e62c5d21d4f - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 23 05:04:54 np0005593234 nova_compute[227762]: 2026-01-23 10:04:54.324 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:54.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:54 np0005593234 nova_compute[227762]: 2026-01-23 10:04:54.893 227766 DEBUG nova.compute.manager [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:54 np0005593234 nova_compute[227762]: 2026-01-23 10:04:54.893 227766 DEBUG oslo_concurrency.lockutils [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:54 np0005593234 nova_compute[227762]: 2026-01-23 10:04:54.894 227766 DEBUG oslo_concurrency.lockutils [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:54 np0005593234 nova_compute[227762]: 2026-01-23 10:04:54.894 227766 DEBUG oslo_concurrency.lockutils [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:54 np0005593234 nova_compute[227762]: 2026-01-23 10:04:54.894 227766 DEBUG nova.compute.manager [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:54 np0005593234 nova_compute[227762]: 2026-01-23 10:04:54.894 227766 WARNING nova.compute.manager [req-b62ffb32-9606-46b4-a525-045f2192468c req-62763de6-a1c5-47d2-9c73-6d291c3d26b9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 23 05:04:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:55.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:04:56 np0005593234 nova_compute[227762]: 2026-01-23 10:04:56.316 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:56.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:57.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:57 np0005593234 podman[278967]: 2026-01-23 10:04:57.791514065 +0000 UTC m=+0.067689288 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 05:04:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:58.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:59 np0005593234 nova_compute[227762]: 2026-01-23 10:04:59.069 227766 DEBUG nova.compute.manager [req-1954d11c-a023-4b87-878f-aa6a1d488c11 req-8a8b67c8-5c0b-4f82-b5d0-7696673e20e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:04:59 np0005593234 nova_compute[227762]: 2026-01-23 10:04:59.070 227766 DEBUG oslo_concurrency.lockutils [req-1954d11c-a023-4b87-878f-aa6a1d488c11 req-8a8b67c8-5c0b-4f82-b5d0-7696673e20e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:04:59 np0005593234 nova_compute[227762]: 2026-01-23 10:04:59.070 227766 DEBUG oslo_concurrency.lockutils [req-1954d11c-a023-4b87-878f-aa6a1d488c11 req-8a8b67c8-5c0b-4f82-b5d0-7696673e20e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:04:59 np0005593234 nova_compute[227762]: 2026-01-23 10:04:59.070 227766 DEBUG oslo_concurrency.lockutils [req-1954d11c-a023-4b87-878f-aa6a1d488c11 req-8a8b67c8-5c0b-4f82-b5d0-7696673e20e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:04:59 np0005593234 nova_compute[227762]: 2026-01-23 10:04:59.070 227766 DEBUG nova.compute.manager [req-1954d11c-a023-4b87-878f-aa6a1d488c11 req-8a8b67c8-5c0b-4f82-b5d0-7696673e20e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:04:59 np0005593234 nova_compute[227762]: 2026-01-23 10:04:59.070 227766 WARNING nova.compute.manager [req-1954d11c-a023-4b87-878f-aa6a1d488c11 req-8a8b67c8-5c0b-4f82-b5d0-7696673e20e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 23 05:04:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:04:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:04:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:59.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:04:59 np0005593234 nova_compute[227762]: 2026-01-23 10:04:59.327 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:04:59 np0005593234 nova_compute[227762]: 2026-01-23 10:04:59.659 227766 INFO nova.compute.manager [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Swapping old allocation on dict_keys(['89873210-bee9-46e9-9f9d-0cd7a156c3a8']) held by migration cb27169a-f251-4da9-9cb2-2425cc564251 for instance#033[00m
Jan 23 05:04:59 np0005593234 nova_compute[227762]: 2026-01-23 10:04:59.706 227766 DEBUG nova.scheduler.client.report [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Overwriting current allocation {'allocations': {'929812a2-38ca-4ee7-9f24-090d633cb42b': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}, 'generation': 60}}, 'project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'user_id': '9d4a5c201efa4992a9ef57d8abdc1675', 'consumer_generation': 1} on consumer ae2a211d-e923-498b-9ceb-97274a2fd725 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Jan 23 05:05:00 np0005593234 nova_compute[227762]: 2026-01-23 10:05:00.129 227766 INFO nova.network.neutron [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating port 115f68c4-4489-4fc8-bb90-3c2d3011db2d with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 23 05:05:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:00.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:01.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:01 np0005593234 nova_compute[227762]: 2026-01-23 10:05:01.300 227766 DEBUG nova.compute.manager [req-01d72504-e038-4473-9659-65d5a6459dd0 req-8d96cb6e-a34a-40c8-a3b1-aced264f7e65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:01 np0005593234 nova_compute[227762]: 2026-01-23 10:05:01.301 227766 DEBUG oslo_concurrency.lockutils [req-01d72504-e038-4473-9659-65d5a6459dd0 req-8d96cb6e-a34a-40c8-a3b1-aced264f7e65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:01 np0005593234 nova_compute[227762]: 2026-01-23 10:05:01.301 227766 DEBUG oslo_concurrency.lockutils [req-01d72504-e038-4473-9659-65d5a6459dd0 req-8d96cb6e-a34a-40c8-a3b1-aced264f7e65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:01 np0005593234 nova_compute[227762]: 2026-01-23 10:05:01.301 227766 DEBUG oslo_concurrency.lockutils [req-01d72504-e038-4473-9659-65d5a6459dd0 req-8d96cb6e-a34a-40c8-a3b1-aced264f7e65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:01 np0005593234 nova_compute[227762]: 2026-01-23 10:05:01.302 227766 DEBUG nova.compute.manager [req-01d72504-e038-4473-9659-65d5a6459dd0 req-8d96cb6e-a34a-40c8-a3b1-aced264f7e65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:05:01 np0005593234 nova_compute[227762]: 2026-01-23 10:05:01.302 227766 WARNING nova.compute.manager [req-01d72504-e038-4473-9659-65d5a6459dd0 req-8d96cb6e-a34a-40c8-a3b1-aced264f7e65 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 23 05:05:01 np0005593234 nova_compute[227762]: 2026-01-23 10:05:01.317 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:02 np0005593234 nova_compute[227762]: 2026-01-23 10:05:02.157 227766 DEBUG oslo_concurrency.lockutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:05:02 np0005593234 nova_compute[227762]: 2026-01-23 10:05:02.158 227766 DEBUG oslo_concurrency.lockutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquired lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:05:02 np0005593234 nova_compute[227762]: 2026-01-23 10:05:02.158 227766 DEBUG nova.network.neutron [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:05:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:02.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:02 np0005593234 nova_compute[227762]: 2026-01-23 10:05:02.749 227766 DEBUG nova.compute.manager [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-changed-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:02 np0005593234 nova_compute[227762]: 2026-01-23 10:05:02.750 227766 DEBUG nova.compute.manager [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Refreshing instance network info cache due to event network-changed-115f68c4-4489-4fc8-bb90-3c2d3011db2d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:05:02 np0005593234 nova_compute[227762]: 2026-01-23 10:05:02.750 227766 DEBUG oslo_concurrency.lockutils [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:05:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:03.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:04 np0005593234 nova_compute[227762]: 2026-01-23 10:05:04.331 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:05:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:04.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:05:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:05.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e266 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:06 np0005593234 nova_compute[227762]: 2026-01-23 10:05:06.318 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:06 np0005593234 nova_compute[227762]: 2026-01-23 10:05:06.425 227766 DEBUG nova.network.neutron [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating instance_info_cache with network_info: [{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:05:06 np0005593234 nova_compute[227762]: 2026-01-23 10:05:06.481 227766 DEBUG oslo_concurrency.lockutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Releasing lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:05:06 np0005593234 nova_compute[227762]: 2026-01-23 10:05:06.482 227766 DEBUG nova.virt.libvirt.driver [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Jan 23 05:05:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:05:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:06.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:05:06 np0005593234 nova_compute[227762]: 2026-01-23 10:05:06.511 227766 DEBUG oslo_concurrency.lockutils [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:05:06 np0005593234 nova_compute[227762]: 2026-01-23 10:05:06.512 227766 DEBUG nova.network.neutron [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Refreshing network info cache for port 115f68c4-4489-4fc8-bb90-3c2d3011db2d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:05:06 np0005593234 nova_compute[227762]: 2026-01-23 10:05:06.550 227766 DEBUG nova.storage.rbd_utils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rolling back rbd image(ae2a211d-e923-498b-9ceb-97274a2fd725_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505#033[00m
Jan 23 05:05:06 np0005593234 nova_compute[227762]: 2026-01-23 10:05:06.804 227766 DEBUG nova.storage.rbd_utils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] removing snapshot(nova-resize) on rbd image(ae2a211d-e923-498b-9ceb-97274a2fd725_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 05:05:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e267 e267: 3 total, 3 up, 3 in
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.135 227766 DEBUG nova.virt.libvirt.driver [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Start _get_guest_xml network_info=[{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.139 227766 WARNING nova.virt.libvirt.driver [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.143 227766 DEBUG nova.virt.libvirt.host [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.143 227766 DEBUG nova.virt.libvirt.host [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.146 227766 DEBUG nova.virt.libvirt.host [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.147 227766 DEBUG nova.virt.libvirt.host [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.148 227766 DEBUG nova.virt.libvirt.driver [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.148 227766 DEBUG nova.virt.hardware [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.149 227766 DEBUG nova.virt.hardware [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.149 227766 DEBUG nova.virt.hardware [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.149 227766 DEBUG nova.virt.hardware [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.149 227766 DEBUG nova.virt.hardware [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.149 227766 DEBUG nova.virt.hardware [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.150 227766 DEBUG nova.virt.hardware [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.150 227766 DEBUG nova.virt.hardware [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.150 227766 DEBUG nova.virt.hardware [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.150 227766 DEBUG nova.virt.hardware [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.150 227766 DEBUG nova.virt.hardware [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.151 227766 DEBUG nova.objects.instance [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'vcpu_model' on Instance uuid ae2a211d-e923-498b-9ceb-97274a2fd725 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.171 227766 DEBUG oslo_concurrency.processutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:07.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:05:07 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1087546212' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.619 227766 DEBUG oslo_concurrency.processutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:07 np0005593234 nova_compute[227762]: 2026-01-23 10:05:07.664 227766 DEBUG oslo_concurrency.processutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:05:08 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3731758975' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.092 227766 DEBUG oslo_concurrency.processutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.094 227766 DEBUG nova.virt.libvirt.vif [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-782058218',display_name='tempest-ServerActionsTestJSON-server-782058218',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-782058218',id=109,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:04:52Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-05cy3qfb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:04:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=ae2a211d-e923-498b-9ceb-97274a2fd725,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.094 227766 DEBUG nova.network.os_vif_util [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.095 227766 DEBUG nova.network.os_vif_util [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.098 227766 DEBUG nova.virt.libvirt.driver [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  <uuid>ae2a211d-e923-498b-9ceb-97274a2fd725</uuid>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  <name>instance-0000006d</name>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerActionsTestJSON-server-782058218</nova:name>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:05:07</nova:creationTime>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <nova:user uuid="9d4a5c201efa4992a9ef57d8abdc1675">tempest-ServerActionsTestJSON-1619235720-project-member</nova:user>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <nova:project uuid="74c5c1d0762242f29a5d26033efd9f6d">tempest-ServerActionsTestJSON-1619235720</nova:project>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <nova:port uuid="115f68c4-4489-4fc8-bb90-3c2d3011db2d">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <entry name="serial">ae2a211d-e923-498b-9ceb-97274a2fd725</entry>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <entry name="uuid">ae2a211d-e923-498b-9ceb-97274a2fd725</entry>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/ae2a211d-e923-498b-9ceb-97274a2fd725_disk">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/ae2a211d-e923-498b-9ceb-97274a2fd725_disk.config">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:e2:de:d3"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <target dev="tap115f68c4-44"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725/console.log" append="off"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <input type="keyboard" bus="usb"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:05:08 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:05:08 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:05:08 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:05:08 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.100 227766 DEBUG nova.compute.manager [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Preparing to wait for external event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.100 227766 DEBUG oslo_concurrency.lockutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.100 227766 DEBUG oslo_concurrency.lockutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.101 227766 DEBUG oslo_concurrency.lockutils [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.101 227766 DEBUG nova.virt.libvirt.vif [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-782058218',display_name='tempest-ServerActionsTestJSON-server-782058218',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-782058218',id=109,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:04:52Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-05cy3qfb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:04:54Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=ae2a211d-e923-498b-9ceb-97274a2fd725,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": 
"tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.102 227766 DEBUG nova.network.os_vif_util [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.102 227766 DEBUG nova.network.os_vif_util [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.102 227766 DEBUG os_vif [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.103 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.103 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.104 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.106 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.107 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap115f68c4-44, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.107 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap115f68c4-44, col_values=(('external_ids', {'iface-id': '115f68c4-4489-4fc8-bb90-3c2d3011db2d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e2:de:d3', 'vm-uuid': 'ae2a211d-e923-498b-9ceb-97274a2fd725'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.109 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:08 np0005593234 NetworkManager[48942]: <info>  [1769162708.1103] manager: (tap115f68c4-44): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/220)
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.111 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.115 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.116 227766 INFO os_vif [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44')#033[00m
Jan 23 05:05:08 np0005593234 kernel: tap115f68c4-44: entered promiscuous mode
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.179 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:08 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:08Z|00427|binding|INFO|Claiming lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d for this chassis.
Jan 23 05:05:08 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:08Z|00428|binding|INFO|115f68c4-4489-4fc8-bb90-3c2d3011db2d: Claiming fa:16:3e:e2:de:d3 10.100.0.7
Jan 23 05:05:08 np0005593234 NetworkManager[48942]: <info>  [1769162708.1817] manager: (tap115f68c4-44): new Tun device (/org/freedesktop/NetworkManager/Devices/221)
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.195 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:de:d3 10.100.0.7'], port_security=['fa:16:3e:e2:de:d3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ae2a211d-e923-498b-9ceb-97274a2fd725', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '10', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=115f68c4-4489-4fc8-bb90-3c2d3011db2d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.196 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 115f68c4-4489-4fc8-bb90-3c2d3011db2d in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c bound to our chassis#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.198 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee03d7c9-e107-41bf-95cc-5508578ad66c#033[00m
Jan 23 05:05:08 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:08Z|00429|binding|INFO|Setting lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d ovn-installed in OVS
Jan 23 05:05:08 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:08Z|00430|binding|INFO|Setting lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d up in Southbound
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.204 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.208 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.211 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8816bd21-b0a1-40ae-bea6-2b7370be5466]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.212 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee03d7c9-e1 in ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.213 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee03d7c9-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.213 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2c0fae95-f92e-4ed9-913b-0ca9e4cebc2b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 systemd-udevd[279190]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.214 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[763361a5-21fd-44fe-975c-b4f76aa98b94]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 systemd-machined[195626]: New machine qemu-49-instance-0000006d.
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.225 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[8620743d-f003-444d-a6ab-2ddba72d9fc0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 NetworkManager[48942]: <info>  [1769162708.2293] device (tap115f68c4-44): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:05:08 np0005593234 NetworkManager[48942]: <info>  [1769162708.2304] device (tap115f68c4-44): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:05:08 np0005593234 systemd[1]: Started Virtual Machine qemu-49-instance-0000006d.
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.251 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[44c401ad-9eca-4702-8386-af2e8a3184ad]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 podman[279161]: 2026-01-23 10:05:08.262905019 +0000 UTC m=+0.116251057 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack 
Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.284 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d1c25075-5b37-4cf1-b133-bd5c89d7e40f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.289 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[86886cf4-f413-4b30-941a-d687ddf52b6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 NetworkManager[48942]: <info>  [1769162708.2903] manager: (tapee03d7c9-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/222)
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.319 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[147750f5-4f64-47f6-994f-fe5a0bd3797a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.321 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b88e9354-b588-420f-a14d-9a23d87605f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 NetworkManager[48942]: <info>  [1769162708.3438] device (tapee03d7c9-e0): carrier: link connected
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.350 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4e2791be-b598-48d0-ac74-e33d2c6f7365]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.368 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[02eae5fd-cfdd-42dd-8c40-521bce7deb5a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660330, 'reachable_time': 27769, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279232, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.386 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a4ee4e-2d73-443e-a2bb-c7c70c32995d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:6530'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 660330, 'tstamp': 660330}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 279233, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.403 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[60e9e94b-8e55-4d4f-ab18-fff448a466e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 139], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660330, 'reachable_time': 27769, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 279234, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.438 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[96a4e66c-4908-42cb-8a6e-0e2a32052911]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.490 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c0df7495-4b0b-4580-be3b-68511b4ee429]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.492 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:08.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.492 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.493 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee03d7c9-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.496 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:08 np0005593234 NetworkManager[48942]: <info>  [1769162708.4969] manager: (tapee03d7c9-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/223)
Jan 23 05:05:08 np0005593234 kernel: tapee03d7c9-e0: entered promiscuous mode
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.499 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee03d7c9-e0, col_values=(('external_ids', {'iface-id': '702d4523-a665-42f5-9a36-57d187c0698a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.500 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:08 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:08Z|00431|binding|INFO|Releasing lport 702d4523-a665-42f5-9a36-57d187c0698a from this chassis (sb_readonly=0)
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.517 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.519 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.520 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d7a718b1-2336-4118-bba2-1ebef7b31e3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.520 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 05:05:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:08.522 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'env', 'PROCESS_TAG=haproxy-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee03d7c9-e107-41bf-95cc-5508578ad66c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.636 227766 DEBUG nova.compute.manager [req-c6a7ea4b-730a-4a57-bbcc-64ce6e4b2864 req-e7fca34f-5baa-4884-8d4b-572ea6798a59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.636 227766 DEBUG oslo_concurrency.lockutils [req-c6a7ea4b-730a-4a57-bbcc-64ce6e4b2864 req-e7fca34f-5baa-4884-8d4b-572ea6798a59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.637 227766 DEBUG oslo_concurrency.lockutils [req-c6a7ea4b-730a-4a57-bbcc-64ce6e4b2864 req-e7fca34f-5baa-4884-8d4b-572ea6798a59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.637 227766 DEBUG oslo_concurrency.lockutils [req-c6a7ea4b-730a-4a57-bbcc-64ce6e4b2864 req-e7fca34f-5baa-4884-8d4b-572ea6798a59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:05:08 np0005593234 nova_compute[227762]: 2026-01-23 10:05:08.637 227766 DEBUG nova.compute.manager [req-c6a7ea4b-730a-4a57-bbcc-64ce6e4b2864 req-e7fca34f-5baa-4884-8d4b-572ea6798a59 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Processing event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 05:05:08 np0005593234 podman[279281]: 2026-01-23 10:05:08.886172769 +0000 UTC m=+0.055673052 container create 355e0e3e199fc80bb22ff6e11482c599f02c5a7e64af889bbd75c28fb106a902 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:05:08 np0005593234 systemd[1]: Started libpod-conmon-355e0e3e199fc80bb22ff6e11482c599f02c5a7e64af889bbd75c28fb106a902.scope.
Jan 23 05:05:08 np0005593234 podman[279281]: 2026-01-23 10:05:08.854438477 +0000 UTC m=+0.023938800 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:05:08 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:05:08 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e977b40561091c0fa56ab4f673f0009f99d733eca2a934aa39700db1126618ff/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:05:08 np0005593234 podman[279281]: 2026-01-23 10:05:08.976426111 +0000 UTC m=+0.145926424 container init 355e0e3e199fc80bb22ff6e11482c599f02c5a7e64af889bbd75c28fb106a902 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:05:08 np0005593234 podman[279281]: 2026-01-23 10:05:08.984280637 +0000 UTC m=+0.153780920 container start 355e0e3e199fc80bb22ff6e11482c599f02c5a7e64af889bbd75c28fb106a902 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:05:09 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[279317]: [NOTICE]   (279326) : New worker (279328) forked
Jan 23 05:05:09 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[279317]: [NOTICE]   (279326) : Loading success.
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.055 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162709.0526884, ae2a211d-e923-498b-9ceb-97274a2fd725 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.055 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] VM Started (Lifecycle Event)
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.058 227766 DEBUG nova.compute.manager [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.064 227766 INFO nova.virt.libvirt.driver [-] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Instance running successfully.
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.065 227766 DEBUG nova.virt.libvirt.driver [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.110 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.113 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:05:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:09.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.272 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.273 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162709.0558288, ae2a211d-e923-498b-9ceb-97274a2fd725 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.273 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] VM Paused (Lifecycle Event)
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.393 227766 INFO nova.compute.manager [None req-c3bb811a-31ad-41c6-a68d-638942f72736 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating instance to original state: 'active'
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.396 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.400 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162709.0607862, ae2a211d-e923-498b-9ceb-97274a2fd725 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.400 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] VM Resumed (Lifecycle Event)
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.449 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.452 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:05:09 np0005593234 nova_compute[227762]: 2026-01-23 10:05:09.482 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] During sync_power_state the instance has a pending task (resize_reverting). Skip.
Jan 23 05:05:10 np0005593234 nova_compute[227762]: 2026-01-23 10:05:10.464 227766 DEBUG nova.network.neutron [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updated VIF entry in instance network info cache for port 115f68c4-4489-4fc8-bb90-3c2d3011db2d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 05:05:10 np0005593234 nova_compute[227762]: 2026-01-23 10:05:10.465 227766 DEBUG nova.network.neutron [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating instance_info_cache with network_info: [{"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:05:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:10.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:10 np0005593234 nova_compute[227762]: 2026-01-23 10:05:10.497 227766 DEBUG oslo_concurrency.lockutils [req-bc5d93b0-c8be-4e8e-bbfb-137f99eea198 req-9ab22828-58e2-459f-afc4-bc1bc4dc50e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ae2a211d-e923-498b-9ceb-97274a2fd725" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:05:10 np0005593234 nova_compute[227762]: 2026-01-23 10:05:10.715 227766 DEBUG nova.compute.manager [req-b3fcbc8d-4b5a-4b19-a93d-5462b4c50d20 req-ed681d7b-bbde-4d9c-9d1a-3af2970309ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:05:10 np0005593234 nova_compute[227762]: 2026-01-23 10:05:10.715 227766 DEBUG oslo_concurrency.lockutils [req-b3fcbc8d-4b5a-4b19-a93d-5462b4c50d20 req-ed681d7b-bbde-4d9c-9d1a-3af2970309ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:05:10 np0005593234 nova_compute[227762]: 2026-01-23 10:05:10.715 227766 DEBUG oslo_concurrency.lockutils [req-b3fcbc8d-4b5a-4b19-a93d-5462b4c50d20 req-ed681d7b-bbde-4d9c-9d1a-3af2970309ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:05:10 np0005593234 nova_compute[227762]: 2026-01-23 10:05:10.716 227766 DEBUG oslo_concurrency.lockutils [req-b3fcbc8d-4b5a-4b19-a93d-5462b4c50d20 req-ed681d7b-bbde-4d9c-9d1a-3af2970309ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:05:10 np0005593234 nova_compute[227762]: 2026-01-23 10:05:10.716 227766 DEBUG nova.compute.manager [req-b3fcbc8d-4b5a-4b19-a93d-5462b4c50d20 req-ed681d7b-bbde-4d9c-9d1a-3af2970309ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:05:10 np0005593234 nova_compute[227762]: 2026-01-23 10:05:10.716 227766 WARNING nova.compute.manager [req-b3fcbc8d-4b5a-4b19-a93d-5462b4c50d20 req-ed681d7b-bbde-4d9c-9d1a-3af2970309ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state active and task_state None.
Jan 23 05:05:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:11.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:11 np0005593234 nova_compute[227762]: 2026-01-23 10:05:11.322 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:12.451 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=42, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=41) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:05:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:12.453 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:05:12 np0005593234 nova_compute[227762]: 2026-01-23 10:05:12.453 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:12.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:12 np0005593234 nova_compute[227762]: 2026-01-23 10:05:12.802 227766 DEBUG oslo_concurrency.lockutils [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:05:12 np0005593234 nova_compute[227762]: 2026-01-23 10:05:12.803 227766 DEBUG oslo_concurrency.lockutils [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:05:12 np0005593234 nova_compute[227762]: 2026-01-23 10:05:12.803 227766 DEBUG oslo_concurrency.lockutils [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:05:12 np0005593234 nova_compute[227762]: 2026-01-23 10:05:12.803 227766 DEBUG oslo_concurrency.lockutils [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:05:12 np0005593234 nova_compute[227762]: 2026-01-23 10:05:12.804 227766 DEBUG oslo_concurrency.lockutils [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:05:12 np0005593234 nova_compute[227762]: 2026-01-23 10:05:12.805 227766 INFO nova.compute.manager [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Terminating instance
Jan 23 05:05:12 np0005593234 nova_compute[227762]: 2026-01-23 10:05:12.806 227766 DEBUG nova.compute.manager [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 05:05:12 np0005593234 kernel: tap115f68c4-44 (unregistering): left promiscuous mode
Jan 23 05:05:12 np0005593234 NetworkManager[48942]: <info>  [1769162712.9611] device (tap115f68c4-44): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:05:12 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:12Z|00432|binding|INFO|Releasing lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d from this chassis (sb_readonly=0)
Jan 23 05:05:12 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:12Z|00433|binding|INFO|Setting lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d down in Southbound
Jan 23 05:05:12 np0005593234 nova_compute[227762]: 2026-01-23 10:05:12.970 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:12 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:12Z|00434|binding|INFO|Removing iface tap115f68c4-44 ovn-installed in OVS
Jan 23 05:05:12 np0005593234 nova_compute[227762]: 2026-01-23 10:05:12.972 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:12 np0005593234 nova_compute[227762]: 2026-01-23 10:05:12.989 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:13 np0005593234 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000006d.scope: Deactivated successfully.
Jan 23 05:05:13 np0005593234 systemd[1]: machine-qemu\x2d49\x2dinstance\x2d0000006d.scope: Consumed 4.749s CPU time.
Jan 23 05:05:13 np0005593234 systemd-machined[195626]: Machine qemu-49-instance-0000006d terminated.
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.088 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:de:d3 10.100.0.7'], port_security=['fa:16:3e:e2:de:d3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ae2a211d-e923-498b-9ceb-97274a2fd725', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '12', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=115f68c4-4489-4fc8-bb90-3c2d3011db2d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.089 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 115f68c4-4489-4fc8-bb90-3c2d3011db2d in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c unbound from our chassis
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.091 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee03d7c9-e107-41bf-95cc-5508578ad66c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.092 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6a6c7f08-300b-4eaf-af5f-1531b597a6ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.092 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace which is not needed anymore
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.109 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:05:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:13.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:13 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[279317]: [NOTICE]   (279326) : haproxy version is 2.8.14-c23fe91
Jan 23 05:05:13 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[279317]: [NOTICE]   (279326) : path to executable is /usr/sbin/haproxy
Jan 23 05:05:13 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[279317]: [WARNING]  (279326) : Exiting Master process...
Jan 23 05:05:13 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[279317]: [ALERT]    (279326) : Current worker (279328) exited with code 143 (Terminated)
Jan 23 05:05:13 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[279317]: [WARNING]  (279326) : All workers exited. Exiting... (0)
Jan 23 05:05:13 np0005593234 systemd[1]: libpod-355e0e3e199fc80bb22ff6e11482c599f02c5a7e64af889bbd75c28fb106a902.scope: Deactivated successfully.
Jan 23 05:05:13 np0005593234 podman[279364]: 2026-01-23 10:05:13.208979295 +0000 UTC m=+0.043798311 container died 355e0e3e199fc80bb22ff6e11482c599f02c5a7e64af889bbd75c28fb106a902 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:05:13 np0005593234 kernel: tap115f68c4-44: entered promiscuous mode
Jan 23 05:05:13 np0005593234 systemd-udevd[279343]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:05:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:13Z|00435|binding|INFO|Claiming lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d for this chassis.
Jan 23 05:05:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:13Z|00436|binding|INFO|115f68c4-4489-4fc8-bb90-3c2d3011db2d: Claiming fa:16:3e:e2:de:d3 10.100.0.7
Jan 23 05:05:13 np0005593234 NetworkManager[48942]: <info>  [1769162713.2231] manager: (tap115f68c4-44): new Tun device (/org/freedesktop/NetworkManager/Devices/224)
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.222 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:13 np0005593234 kernel: tap115f68c4-44 (unregistering): left promiscuous mode
Jan 23 05:05:13 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-355e0e3e199fc80bb22ff6e11482c599f02c5a7e64af889bbd75c28fb106a902-userdata-shm.mount: Deactivated successfully.
Jan 23 05:05:13 np0005593234 systemd[1]: var-lib-containers-storage-overlay-e977b40561091c0fa56ab4f673f0009f99d733eca2a934aa39700db1126618ff-merged.mount: Deactivated successfully.
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.245 227766 INFO nova.virt.libvirt.driver [-] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Instance destroyed successfully.#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.246 227766 DEBUG nova.objects.instance [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'resources' on Instance uuid ae2a211d-e923-498b-9ceb-97274a2fd725 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:13Z|00437|binding|INFO|Setting lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d ovn-installed in OVS
Jan 23 05:05:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:13Z|00438|if_status|INFO|Dropped 4 log messages in last 329 seconds (most recently, 329 seconds ago) due to excessive rate
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.247 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:13Z|00439|if_status|INFO|Not setting lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d down as sb is readonly
Jan 23 05:05:13 np0005593234 podman[279364]: 2026-01-23 10:05:13.252941589 +0000 UTC m=+0.087760585 container cleanup 355e0e3e199fc80bb22ff6e11482c599f02c5a7e64af889bbd75c28fb106a902 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:05:13 np0005593234 systemd[1]: libpod-conmon-355e0e3e199fc80bb22ff6e11482c599f02c5a7e64af889bbd75c28fb106a902.scope: Deactivated successfully.
Jan 23 05:05:13 np0005593234 podman[279398]: 2026-01-23 10:05:13.311238983 +0000 UTC m=+0.039090714 container remove 355e0e3e199fc80bb22ff6e11482c599f02c5a7e64af889bbd75c28fb106a902 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.316 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fe4dfa3b-1a47-49c3-aaec-270240ab57f8]: (4, ('Fri Jan 23 10:05:13 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (355e0e3e199fc80bb22ff6e11482c599f02c5a7e64af889bbd75c28fb106a902)\n355e0e3e199fc80bb22ff6e11482c599f02c5a7e64af889bbd75c28fb106a902\nFri Jan 23 10:05:13 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (355e0e3e199fc80bb22ff6e11482c599f02c5a7e64af889bbd75c28fb106a902)\n355e0e3e199fc80bb22ff6e11482c599f02c5a7e64af889bbd75c28fb106a902\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.318 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2d1d6911-7861-472a-9fa3-e0ecf752ecfe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.319 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.321 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:13 np0005593234 kernel: tapee03d7c9-e0: left promiscuous mode
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.337 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.340 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d41fd315-4fb5-40a6-8e1a-1c6b64ce1419]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.352 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[43a5b7ee-2ed4-45f9-82e7-2c3ba52c11ad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.353 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[65fe3853-2a35-4efd-b971-776c08e968bd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.368 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[96a23a7a-61b7-4c64-91c4-c11a77a86f69]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660323, 'reachable_time': 27323, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279416, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.370 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.370 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[c66c0da1-0083-44a9-acf6-92f5517260d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 systemd[1]: run-netns-ovnmeta\x2dee03d7c9\x2de107\x2d41bf\x2d95cc\x2d5508578ad66c.mount: Deactivated successfully.
Jan 23 05:05:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:13Z|00440|binding|INFO|Releasing lport 115f68c4-4489-4fc8-bb90-3c2d3011db2d from this chassis (sb_readonly=0)
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.376 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:de:d3 10.100.0.7'], port_security=['fa:16:3e:e2:de:d3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ae2a211d-e923-498b-9ceb-97274a2fd725', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '12', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=115f68c4-4489-4fc8-bb90-3c2d3011db2d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.377 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 115f68c4-4489-4fc8-bb90-3c2d3011db2d in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c bound to our chassis#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.378 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee03d7c9-e107-41bf-95cc-5508578ad66c#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.389 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.391 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f9251171-0887-4226-ab19-6457844aaf0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.392 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee03d7c9-e1 in ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.394 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e2:de:d3 10.100.0.7'], port_security=['fa:16:3e:e2:de:d3 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ae2a211d-e923-498b-9ceb-97274a2fd725', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '12', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=115f68c4-4489-4fc8-bb90-3c2d3011db2d) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.393 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee03d7c9-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.395 227766 DEBUG nova.virt.libvirt.vif [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:03:53Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-782058218',display_name='tempest-ServerActionsTestJSON-server-782058218',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-782058218',id=109,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:05:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-05cy3qfb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:05:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=ae2a211d-e923-498b-9ceb-97274a2fd725,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.394 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fcf84b7c-72b5-4296-8b1f-607692a39c86]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.396 227766 DEBUG nova.network.os_vif_util [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "address": "fa:16:3e:e2:de:d3", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap115f68c4-44", "ovs_interfaceid": "115f68c4-4489-4fc8-bb90-3c2d3011db2d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.396 227766 DEBUG nova.network.os_vif_util [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.396 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d018329a-8788-4876-8392-a05560a928ce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.397 227766 DEBUG os_vif [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.398 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.398 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap115f68c4-44, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.399 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.400 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.403 227766 INFO os_vif [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e2:de:d3,bridge_name='br-int',has_traffic_filtering=True,id=115f68c4-4489-4fc8-bb90-3c2d3011db2d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap115f68c4-44')#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.407 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[04ebef33-a254-4878-a433-14b023847a57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.420 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cd5ab02d-908f-4278-97d9-6453b89f6e51]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.449 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[2a9ff931-b88d-4d57-be17-d0c8c9a518ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 NetworkManager[48942]: <info>  [1769162713.4582] manager: (tapee03d7c9-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/225)
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.456 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[de81cafb-c7be-4ccf-af78-d0f9a96cbba9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.493 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[5074cdca-9e05-4b60-93fa-e8e66a0236cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.497 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[444e1fbe-9977-48c3-b20e-fd8df434e497]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 NetworkManager[48942]: <info>  [1769162713.5275] device (tapee03d7c9-e0): carrier: link connected
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.534 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9d6a53af-8728-4f5f-a860-6a473f12543a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.553 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[77e7917c-011f-4f75-b324-ce08d207fd74]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660848, 'reachable_time': 33939, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279459, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.573 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bb9bca91-8655-41f0-93b6-3141add626bb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:6530'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 660848, 'tstamp': 660848}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 279460, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.600 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d986aaba-3ee5-4eb9-ad2c-78b226bb148f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 141], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660848, 'reachable_time': 33939, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 279461, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.639 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fd841a51-4a86-4af3-87f3-dbf886b16b1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.699 227766 DEBUG nova.compute.manager [req-bedb82c9-18b2-4c04-a1c8-bc5c83fb87b4 req-a70c4692-8835-4d2e-b6c2-e952fb80a9a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.700 227766 DEBUG oslo_concurrency.lockutils [req-bedb82c9-18b2-4c04-a1c8-bc5c83fb87b4 req-a70c4692-8835-4d2e-b6c2-e952fb80a9a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.700 227766 DEBUG oslo_concurrency.lockutils [req-bedb82c9-18b2-4c04-a1c8-bc5c83fb87b4 req-a70c4692-8835-4d2e-b6c2-e952fb80a9a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.700 227766 DEBUG oslo_concurrency.lockutils [req-bedb82c9-18b2-4c04-a1c8-bc5c83fb87b4 req-a70c4692-8835-4d2e-b6c2-e952fb80a9a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.700 227766 DEBUG nova.compute.manager [req-bedb82c9-18b2-4c04-a1c8-bc5c83fb87b4 req-a70c4692-8835-4d2e-b6c2-e952fb80a9a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.701 227766 DEBUG nova.compute.manager [req-bedb82c9-18b2-4c04-a1c8-bc5c83fb87b4 req-a70c4692-8835-4d2e-b6c2-e952fb80a9a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-unplugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.715 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b06b11d7-ca9c-40cf-a88e-c403963edaeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.717 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.717 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.717 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee03d7c9-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.719 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:13 np0005593234 NetworkManager[48942]: <info>  [1769162713.7199] manager: (tapee03d7c9-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/226)
Jan 23 05:05:13 np0005593234 kernel: tapee03d7c9-e0: entered promiscuous mode
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.720 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.721 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee03d7c9-e0, col_values=(('external_ids', {'iface-id': '702d4523-a665-42f5-9a36-57d187c0698a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.722 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:13Z|00441|binding|INFO|Releasing lport 702d4523-a665-42f5-9a36-57d187c0698a from this chassis (sb_readonly=0)
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.735 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.736 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.737 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f9d19375-dd39-43ab-966e-7665dac3eacd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.738 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:05:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:13.739 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'env', 'PROCESS_TAG=haproxy-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee03d7c9-e107-41bf-95cc-5508578ad66c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.878 227766 INFO nova.virt.libvirt.driver [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Deleting instance files /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725_del#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.879 227766 INFO nova.virt.libvirt.driver [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Deletion of /var/lib/nova/instances/ae2a211d-e923-498b-9ceb-97274a2fd725_del complete#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.940 227766 INFO nova.compute.manager [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Took 1.13 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.941 227766 DEBUG oslo.service.loopingcall [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.941 227766 DEBUG nova.compute.manager [-] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:05:13 np0005593234 nova_compute[227762]: 2026-01-23 10:05:13.941 227766 DEBUG nova.network.neutron [-] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:05:14 np0005593234 podman[279494]: 2026-01-23 10:05:14.113265412 +0000 UTC m=+0.051504361 container create cfefaf09f5fa60a8896b5cb17cb3f157e239ed6c2cc234b8235017bee23dccf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Jan 23 05:05:14 np0005593234 systemd[1]: Started libpod-conmon-cfefaf09f5fa60a8896b5cb17cb3f157e239ed6c2cc234b8235017bee23dccf2.scope.
Jan 23 05:05:14 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:05:14 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/58ed8dfa7b5401ded7f5e9135be557b0670c77f372ca720805ab692999b67d3d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:05:14 np0005593234 podman[279494]: 2026-01-23 10:05:14.086171655 +0000 UTC m=+0.024410624 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:05:14 np0005593234 podman[279494]: 2026-01-23 10:05:14.18673883 +0000 UTC m=+0.124977799 container init cfefaf09f5fa60a8896b5cb17cb3f157e239ed6c2cc234b8235017bee23dccf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 05:05:14 np0005593234 podman[279494]: 2026-01-23 10:05:14.191938212 +0000 UTC m=+0.130177161 container start cfefaf09f5fa60a8896b5cb17cb3f157e239ed6c2cc234b8235017bee23dccf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 05:05:14 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[279509]: [NOTICE]   (279513) : New worker (279515) forked
Jan 23 05:05:14 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[279509]: [NOTICE]   (279513) : Loading success.
Jan 23 05:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:14.251 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 115f68c4-4489-4fc8-bb90-3c2d3011db2d in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c unbound from our chassis#033[00m
Jan 23 05:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:14.254 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee03d7c9-e107-41bf-95cc-5508578ad66c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:14.255 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3d0b99d1-747b-42dc-873f-a1a26e7b178b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:14.255 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace which is not needed anymore#033[00m
Jan 23 05:05:14 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[279509]: [NOTICE]   (279513) : haproxy version is 2.8.14-c23fe91
Jan 23 05:05:14 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[279509]: [NOTICE]   (279513) : path to executable is /usr/sbin/haproxy
Jan 23 05:05:14 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[279509]: [WARNING]  (279513) : Exiting Master process...
Jan 23 05:05:14 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[279509]: [ALERT]    (279513) : Current worker (279515) exited with code 143 (Terminated)
Jan 23 05:05:14 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[279509]: [WARNING]  (279513) : All workers exited. Exiting... (0)
Jan 23 05:05:14 np0005593234 systemd[1]: libpod-cfefaf09f5fa60a8896b5cb17cb3f157e239ed6c2cc234b8235017bee23dccf2.scope: Deactivated successfully.
Jan 23 05:05:14 np0005593234 podman[279541]: 2026-01-23 10:05:14.372105747 +0000 UTC m=+0.042249823 container died cfefaf09f5fa60a8896b5cb17cb3f157e239ed6c2cc234b8235017bee23dccf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 05:05:14 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cfefaf09f5fa60a8896b5cb17cb3f157e239ed6c2cc234b8235017bee23dccf2-userdata-shm.mount: Deactivated successfully.
Jan 23 05:05:14 np0005593234 systemd[1]: var-lib-containers-storage-overlay-58ed8dfa7b5401ded7f5e9135be557b0670c77f372ca720805ab692999b67d3d-merged.mount: Deactivated successfully.
Jan 23 05:05:14 np0005593234 podman[279541]: 2026-01-23 10:05:14.406392608 +0000 UTC m=+0.076536684 container cleanup cfefaf09f5fa60a8896b5cb17cb3f157e239ed6c2cc234b8235017bee23dccf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 05:05:14 np0005593234 systemd[1]: libpod-conmon-cfefaf09f5fa60a8896b5cb17cb3f157e239ed6c2cc234b8235017bee23dccf2.scope: Deactivated successfully.
Jan 23 05:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:14.454 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '42'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:14 np0005593234 podman[279569]: 2026-01-23 10:05:14.459674554 +0000 UTC m=+0.035447629 container remove cfefaf09f5fa60a8896b5cb17cb3f157e239ed6c2cc234b8235017bee23dccf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:14.464 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7da1b89d-b727-4e26-94c3-3d32ca6d88b3]: (4, ('Fri Jan 23 10:05:14 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (cfefaf09f5fa60a8896b5cb17cb3f157e239ed6c2cc234b8235017bee23dccf2)\ncfefaf09f5fa60a8896b5cb17cb3f157e239ed6c2cc234b8235017bee23dccf2\nFri Jan 23 10:05:14 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (cfefaf09f5fa60a8896b5cb17cb3f157e239ed6c2cc234b8235017bee23dccf2)\ncfefaf09f5fa60a8896b5cb17cb3f157e239ed6c2cc234b8235017bee23dccf2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:14.466 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2e4f513e-8589-43bb-b040-0a861e8452f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:14.466 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:14 np0005593234 nova_compute[227762]: 2026-01-23 10:05:14.468 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:14 np0005593234 kernel: tapee03d7c9-e0: left promiscuous mode
Jan 23 05:05:14 np0005593234 nova_compute[227762]: 2026-01-23 10:05:14.481 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:14.485 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e02c447f-269b-43f1-b631-480cc24aea82]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:14.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:14.500 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0b15dec8-7e1c-4fbc-9907-861514b8ea8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:14.500 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[406d194a-31c5-4eb3-b939-ab488b2a9f57]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:14.516 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[eccdb28e-c8f5-4c66-b579-8302840db23b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 660840, 'reachable_time': 33018, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 279584, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:14 np0005593234 systemd[1]: run-netns-ovnmeta\x2dee03d7c9\x2de107\x2d41bf\x2d95cc\x2d5508578ad66c.mount: Deactivated successfully.
Jan 23 05:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:14.519 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:14.519 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[c453a7f1-4cf2-479d-9783-51e7a0cdeaca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:15.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.281 227766 DEBUG nova.network.neutron [-] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.315 227766 INFO nova.compute.manager [-] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Took 1.37 seconds to deallocate network for instance.#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.545 227766 DEBUG oslo_concurrency.lockutils [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.546 227766 DEBUG oslo_concurrency.lockutils [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.584 227766 DEBUG nova.scheduler.client.report [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.608 227766 DEBUG nova.scheduler.client.report [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.608 227766 DEBUG nova.compute.provider_tree [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.634 227766 DEBUG nova.scheduler.client.report [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.638 227766 DEBUG nova.compute.manager [req-8cd3b032-c2e2-4d16-a31f-9144e1fc3008 req-cd25d207-e76e-4a95-93ee-f4df81acf02d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-deleted-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.662 227766 DEBUG nova.scheduler.client.report [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.727 227766 DEBUG oslo_concurrency.processutils [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.836 227766 DEBUG nova.compute.manager [req-083637dc-4f3e-475d-8458-3ecf3752cbe4 req-681e4b94-57f8-496a-9214-61ead656bd5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.837 227766 DEBUG oslo_concurrency.lockutils [req-083637dc-4f3e-475d-8458-3ecf3752cbe4 req-681e4b94-57f8-496a-9214-61ead656bd5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.838 227766 DEBUG oslo_concurrency.lockutils [req-083637dc-4f3e-475d-8458-3ecf3752cbe4 req-681e4b94-57f8-496a-9214-61ead656bd5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.838 227766 DEBUG oslo_concurrency.lockutils [req-083637dc-4f3e-475d-8458-3ecf3752cbe4 req-681e4b94-57f8-496a-9214-61ead656bd5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.838 227766 DEBUG nova.compute.manager [req-083637dc-4f3e-475d-8458-3ecf3752cbe4 req-681e4b94-57f8-496a-9214-61ead656bd5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] No waiting events found dispatching network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:05:15 np0005593234 nova_compute[227762]: 2026-01-23 10:05:15.838 227766 WARNING nova.compute.manager [req-083637dc-4f3e-475d-8458-3ecf3752cbe4 req-681e4b94-57f8-496a-9214-61ead656bd5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Received unexpected event network-vif-plugged-115f68c4-4489-4fc8-bb90-3c2d3011db2d for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:05:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1895138751' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:05:16 np0005593234 nova_compute[227762]: 2026-01-23 10:05:16.171 227766 DEBUG oslo_concurrency.processutils [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:16 np0005593234 nova_compute[227762]: 2026-01-23 10:05:16.176 227766 DEBUG nova.compute.provider_tree [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:05:16 np0005593234 nova_compute[227762]: 2026-01-23 10:05:16.192 227766 DEBUG nova.scheduler.client.report [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:05:16 np0005593234 nova_compute[227762]: 2026-01-23 10:05:16.215 227766 DEBUG oslo_concurrency.lockutils [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.670s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:16 np0005593234 nova_compute[227762]: 2026-01-23 10:05:16.254 227766 INFO nova.scheduler.client.report [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Deleted allocations for instance ae2a211d-e923-498b-9ceb-97274a2fd725#033[00m
Jan 23 05:05:16 np0005593234 nova_compute[227762]: 2026-01-23 10:05:16.324 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:16 np0005593234 nova_compute[227762]: 2026-01-23 10:05:16.330 227766 DEBUG oslo_concurrency.lockutils [None req-49b26c86-b9bc-4a1f-ad36-cfbeaee35fe5 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "ae2a211d-e923-498b-9ceb-97274a2fd725" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.527s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:16.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 e268: 3 total, 3 up, 3 in
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #103. Immutable memtables: 0.
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.653437) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 63] Flushing memtable with next log file: 103
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162716653513, "job": 63, "event": "flush_started", "num_memtables": 1, "num_entries": 1756, "num_deletes": 258, "total_data_size": 3833118, "memory_usage": 3880288, "flush_reason": "Manual Compaction"}
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 63] Level-0 flush table #104: started
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162716666935, "cf_name": "default", "job": 63, "event": "table_file_creation", "file_number": 104, "file_size": 2506998, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 51848, "largest_seqno": 53599, "table_properties": {"data_size": 2499629, "index_size": 4248, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16261, "raw_average_key_size": 20, "raw_value_size": 2484539, "raw_average_value_size": 3113, "num_data_blocks": 184, "num_entries": 798, "num_filter_entries": 798, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162586, "oldest_key_time": 1769162586, "file_creation_time": 1769162716, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 104, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 63] Flush lasted 13545 microseconds, and 5818 cpu microseconds.
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.666992) [db/flush_job.cc:967] [default] [JOB 63] Level-0 flush table #104: 2506998 bytes OK
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.667011) [db/memtable_list.cc:519] [default] Level-0 commit table #104 started
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.668842) [db/memtable_list.cc:722] [default] Level-0 commit table #104: memtable #1 done
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.668860) EVENT_LOG_v1 {"time_micros": 1769162716668854, "job": 63, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.668880) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 63] Try to delete WAL files size 3824948, prev total WAL file size 3824948, number of live WAL files 2.
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000100.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.670077) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031373535' seq:72057594037927935, type:22 .. '6C6F676D0032303037' seq:0, type:0; will stop at (end)
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 64] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 63 Base level 0, inputs: [104(2448KB)], [102(10MB)]
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162716670223, "job": 64, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [104], "files_L6": [102], "score": -1, "input_data_size": 13360769, "oldest_snapshot_seqno": -1}
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 64] Generated table #105: 7697 keys, 13208600 bytes, temperature: kUnknown
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162716760461, "cf_name": "default", "job": 64, "event": "table_file_creation", "file_number": 105, "file_size": 13208600, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13155636, "index_size": 32628, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19269, "raw_key_size": 198539, "raw_average_key_size": 25, "raw_value_size": 13016816, "raw_average_value_size": 1691, "num_data_blocks": 1296, "num_entries": 7697, "num_filter_entries": 7697, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769162716, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 105, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.760882) [db/compaction/compaction_job.cc:1663] [default] [JOB 64] Compacted 1@0 + 1@6 files to L6 => 13208600 bytes
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.762343) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 147.6 rd, 146.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 10.4 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(10.6) write-amplify(5.3) OK, records in: 8233, records dropped: 536 output_compression: NoCompression
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.762359) EVENT_LOG_v1 {"time_micros": 1769162716762352, "job": 64, "event": "compaction_finished", "compaction_time_micros": 90500, "compaction_time_cpu_micros": 39423, "output_level": 6, "num_output_files": 1, "total_output_size": 13208600, "num_input_records": 8233, "num_output_records": 7697, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000104.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162716763273, "job": 64, "event": "table_file_deletion", "file_number": 104}
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000102.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162716765135, "job": 64, "event": "table_file_deletion", "file_number": 102}
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.670011) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.765303) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.765308) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.765310) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.765313) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:05:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:05:16.765315) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:05:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:05:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:17.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:05:18 np0005593234 nova_compute[227762]: 2026-01-23 10:05:18.399 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:18.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:19.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:20.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:21.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:21 np0005593234 nova_compute[227762]: 2026-01-23 10:05:21.325 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:21 np0005593234 nova_compute[227762]: 2026-01-23 10:05:21.831 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:21 np0005593234 nova_compute[227762]: 2026-01-23 10:05:21.831 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:21 np0005593234 nova_compute[227762]: 2026-01-23 10:05:21.854 227766 DEBUG nova.compute.manager [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:05:21 np0005593234 nova_compute[227762]: 2026-01-23 10:05:21.931 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:21 np0005593234 nova_compute[227762]: 2026-01-23 10:05:21.932 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:21 np0005593234 nova_compute[227762]: 2026-01-23 10:05:21.940 227766 DEBUG nova.virt.hardware [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:05:21 np0005593234 nova_compute[227762]: 2026-01-23 10:05:21.941 227766 INFO nova.compute.claims [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:05:22 np0005593234 nova_compute[227762]: 2026-01-23 10:05:22.065 227766 DEBUG oslo_concurrency.processutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:05:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1375493304' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:05:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:22.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:22 np0005593234 nova_compute[227762]: 2026-01-23 10:05:22.516 227766 DEBUG oslo_concurrency.processutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:22 np0005593234 nova_compute[227762]: 2026-01-23 10:05:22.523 227766 DEBUG nova.compute.provider_tree [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.013 227766 DEBUG nova.scheduler.client.report [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.164 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.232s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.166 227766 DEBUG nova.compute.manager [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:05:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:23.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.426 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.467 227766 DEBUG nova.compute.manager [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.468 227766 DEBUG nova.network.neutron [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.502 227766 INFO nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.554 227766 DEBUG nova.compute.manager [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.688 227766 DEBUG nova.compute.manager [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.690 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.690 227766 INFO nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Creating image(s)#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.717 227766 DEBUG nova.storage.rbd_utils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.740 227766 DEBUG nova.storage.rbd_utils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.767 227766 DEBUG nova.storage.rbd_utils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.770 227766 DEBUG oslo_concurrency.processutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.828 227766 DEBUG oslo_concurrency.processutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.829 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.830 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.830 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.854 227766 DEBUG nova.storage.rbd_utils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.857 227766 DEBUG oslo_concurrency.processutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:23 np0005593234 nova_compute[227762]: 2026-01-23 10:05:23.939 227766 DEBUG nova.policy [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9d4a5c201efa4992a9ef57d8abdc1675', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:05:24 np0005593234 nova_compute[227762]: 2026-01-23 10:05:24.175 227766 DEBUG oslo_concurrency.processutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.318s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:24 np0005593234 nova_compute[227762]: 2026-01-23 10:05:24.259 227766 DEBUG nova.storage.rbd_utils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] resizing rbd image 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:05:24 np0005593234 nova_compute[227762]: 2026-01-23 10:05:24.366 227766 DEBUG nova.objects.instance [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'migration_context' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:24 np0005593234 nova_compute[227762]: 2026-01-23 10:05:24.384 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:05:24 np0005593234 nova_compute[227762]: 2026-01-23 10:05:24.384 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Ensure instance console log exists: /var/lib/nova/instances/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:05:24 np0005593234 nova_compute[227762]: 2026-01-23 10:05:24.385 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:24 np0005593234 nova_compute[227762]: 2026-01-23 10:05:24.385 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:24 np0005593234 nova_compute[227762]: 2026-01-23 10:05:24.385 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:24 np0005593234 nova_compute[227762]: 2026-01-23 10:05:24.419 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:24.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:24 np0005593234 nova_compute[227762]: 2026-01-23 10:05:24.669 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:25.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:26 np0005593234 nova_compute[227762]: 2026-01-23 10:05:26.213 227766 DEBUG nova.network.neutron [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Successfully created port: 5b468015-6b03-496b-acb0-201ef16d849d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:05:26 np0005593234 nova_compute[227762]: 2026-01-23 10:05:26.328 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:26.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:05:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:27.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:05:27 np0005593234 nova_compute[227762]: 2026-01-23 10:05:27.652 227766 DEBUG nova.network.neutron [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Successfully updated port: 5b468015-6b03-496b-acb0-201ef16d849d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:05:27 np0005593234 nova_compute[227762]: 2026-01-23 10:05:27.672 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:05:27 np0005593234 nova_compute[227762]: 2026-01-23 10:05:27.673 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquired lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:05:27 np0005593234 nova_compute[227762]: 2026-01-23 10:05:27.673 227766 DEBUG nova.network.neutron [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:05:27 np0005593234 nova_compute[227762]: 2026-01-23 10:05:27.930 227766 DEBUG nova.network.neutron [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:05:28 np0005593234 nova_compute[227762]: 2026-01-23 10:05:28.242 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162713.2367258, ae2a211d-e923-498b-9ceb-97274a2fd725 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:05:28 np0005593234 nova_compute[227762]: 2026-01-23 10:05:28.243 227766 INFO nova.compute.manager [-] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:05:28 np0005593234 nova_compute[227762]: 2026-01-23 10:05:28.262 227766 DEBUG nova.compute.manager [None req-b3da7675-3766-4412-bfea-0321faa14529 - - - - - -] [instance: ae2a211d-e923-498b-9ceb-97274a2fd725] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:05:28 np0005593234 nova_compute[227762]: 2026-01-23 10:05:28.298 227766 DEBUG nova.compute.manager [req-b78430e6-107c-42fb-ba06-11a37c099d8d req-89681765-d38a-43df-a440-f710331e4cfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-changed-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:28 np0005593234 nova_compute[227762]: 2026-01-23 10:05:28.298 227766 DEBUG nova.compute.manager [req-b78430e6-107c-42fb-ba06-11a37c099d8d req-89681765-d38a-43df-a440-f710331e4cfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Refreshing instance network info cache due to event network-changed-5b468015-6b03-496b-acb0-201ef16d849d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:05:28 np0005593234 nova_compute[227762]: 2026-01-23 10:05:28.298 227766 DEBUG oslo_concurrency.lockutils [req-b78430e6-107c-42fb-ba06-11a37c099d8d req-89681765-d38a-43df-a440-f710331e4cfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:05:28 np0005593234 nova_compute[227762]: 2026-01-23 10:05:28.430 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:28.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:28 np0005593234 nova_compute[227762]: 2026-01-23 10:05:28.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:28 np0005593234 podman[279853]: 2026-01-23 10:05:28.770032446 +0000 UTC m=+0.053632558 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 05:05:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:29.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.329 227766 DEBUG nova.network.neutron [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Updating instance_info_cache with network_info: [{"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.353 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Releasing lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.354 227766 DEBUG nova.compute.manager [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Instance network_info: |[{"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.354 227766 DEBUG oslo_concurrency.lockutils [req-b78430e6-107c-42fb-ba06-11a37c099d8d req-89681765-d38a-43df-a440-f710331e4cfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.354 227766 DEBUG nova.network.neutron [req-b78430e6-107c-42fb-ba06-11a37c099d8d req-89681765-d38a-43df-a440-f710331e4cfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Refreshing network info cache for port 5b468015-6b03-496b-acb0-201ef16d849d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.356 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Start _get_guest_xml network_info=[{"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.361 227766 WARNING nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.367 227766 DEBUG nova.virt.libvirt.host [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.367 227766 DEBUG nova.virt.libvirt.host [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.373 227766 DEBUG nova.virt.libvirt.host [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.374 227766 DEBUG nova.virt.libvirt.host [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.375 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.375 227766 DEBUG nova.virt.hardware [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.375 227766 DEBUG nova.virt.hardware [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.376 227766 DEBUG nova.virt.hardware [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.376 227766 DEBUG nova.virt.hardware [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.376 227766 DEBUG nova.virt.hardware [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.376 227766 DEBUG nova.virt.hardware [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.376 227766 DEBUG nova.virt.hardware [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.377 227766 DEBUG nova.virt.hardware [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.377 227766 DEBUG nova.virt.hardware [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.377 227766 DEBUG nova.virt.hardware [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.377 227766 DEBUG nova.virt.hardware [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.380 227766 DEBUG oslo_concurrency.processutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:05:29 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3745592969' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.812 227766 DEBUG oslo_concurrency.processutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.835 227766 DEBUG nova.storage.rbd_utils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:05:29 np0005593234 nova_compute[227762]: 2026-01-23 10:05:29.838 227766 DEBUG oslo_concurrency.processutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:05:30 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4154589514' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.269 227766 DEBUG oslo_concurrency.processutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.272 227766 DEBUG nova.virt.libvirt.vif [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:05:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1240996482',display_name='tempest-ServerActionsTestJSON-server-1240996482',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1240996482',id=114,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-wcdb1ybb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:05:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=06ab5530-6f75-4f7d-80cd-48cf4c63cfd9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.272 227766 DEBUG nova.network.os_vif_util [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.273 227766 DEBUG nova.network.os_vif_util [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.274 227766 DEBUG nova.objects.instance [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'pci_devices' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.312 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  <uuid>06ab5530-6f75-4f7d-80cd-48cf4c63cfd9</uuid>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  <name>instance-00000072</name>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerActionsTestJSON-server-1240996482</nova:name>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:05:29</nova:creationTime>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <nova:user uuid="9d4a5c201efa4992a9ef57d8abdc1675">tempest-ServerActionsTestJSON-1619235720-project-member</nova:user>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <nova:project uuid="74c5c1d0762242f29a5d26033efd9f6d">tempest-ServerActionsTestJSON-1619235720</nova:project>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <nova:port uuid="5b468015-6b03-496b-acb0-201ef16d849d">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <entry name="serial">06ab5530-6f75-4f7d-80cd-48cf4c63cfd9</entry>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <entry name="uuid">06ab5530-6f75-4f7d-80cd-48cf4c63cfd9</entry>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk.config">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:87:17:9b"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <target dev="tap5b468015-6b"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9/console.log" append="off"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:05:30 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:05:30 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:05:30 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:05:30 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.314 227766 DEBUG nova.compute.manager [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Preparing to wait for external event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.314 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.314 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.315 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.315 227766 DEBUG nova.virt.libvirt.vif [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:05:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1240996482',display_name='tempest-ServerActionsTestJSON-server-1240996482',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1240996482',id=114,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-wcdb1ybb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:05:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=06ab5530-6f75-4f7d-80cd-48cf4c63cfd9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.316 227766 DEBUG nova.network.os_vif_util [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.316 227766 DEBUG nova.network.os_vif_util [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.316 227766 DEBUG os_vif [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.317 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.318 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.318 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.322 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.322 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5b468015-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.322 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5b468015-6b, col_values=(('external_ids', {'iface-id': '5b468015-6b03-496b-acb0-201ef16d849d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:17:9b', 'vm-uuid': '06ab5530-6f75-4f7d-80cd-48cf4c63cfd9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:30 np0005593234 NetworkManager[48942]: <info>  [1769162730.3247] manager: (tap5b468015-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/227)
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.324 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.326 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.330 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.331 227766 INFO os_vif [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b')#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.435 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.436 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.436 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] No VIF found with MAC fa:16:3e:87:17:9b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.437 227766 INFO nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Using config drive#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.459 227766 DEBUG nova.storage.rbd_utils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:05:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:30.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.992 227766 INFO nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Creating config drive at /var/lib/nova/instances/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9/disk.config#033[00m
Jan 23 05:05:30 np0005593234 nova_compute[227762]: 2026-01-23 10:05:30.998 227766 DEBUG oslo_concurrency.processutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo6d32g1_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.132 227766 DEBUG nova.network.neutron [req-b78430e6-107c-42fb-ba06-11a37c099d8d req-89681765-d38a-43df-a440-f710331e4cfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Updated VIF entry in instance network info cache for port 5b468015-6b03-496b-acb0-201ef16d849d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.133 227766 DEBUG nova.network.neutron [req-b78430e6-107c-42fb-ba06-11a37c099d8d req-89681765-d38a-43df-a440-f710331e4cfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Updating instance_info_cache with network_info: [{"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.138 227766 DEBUG oslo_concurrency.processutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpo6d32g1_" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.166 227766 DEBUG nova.storage.rbd_utils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] rbd image 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.169 227766 DEBUG oslo_concurrency.processutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9/disk.config 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.191 227766 DEBUG oslo_concurrency.lockutils [req-b78430e6-107c-42fb-ba06-11a37c099d8d req-89681765-d38a-43df-a440-f710331e4cfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:05:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:05:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:31.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.316 227766 DEBUG oslo_concurrency.processutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9/disk.config 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.147s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.317 227766 INFO nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Deleting local config drive /var/lib/nova/instances/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9/disk.config because it was imported into RBD.#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.330 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:31 np0005593234 kernel: tap5b468015-6b: entered promiscuous mode
Jan 23 05:05:31 np0005593234 NetworkManager[48942]: <info>  [1769162731.3692] manager: (tap5b468015-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/228)
Jan 23 05:05:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:31Z|00442|binding|INFO|Claiming lport 5b468015-6b03-496b-acb0-201ef16d849d for this chassis.
Jan 23 05:05:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:31Z|00443|binding|INFO|5b468015-6b03-496b-acb0-201ef16d849d: Claiming fa:16:3e:87:17:9b 10.100.0.6
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.369 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.374 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:31 np0005593234 systemd-udevd[280008]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:05:31 np0005593234 systemd-machined[195626]: New machine qemu-50-instance-00000072.
Jan 23 05:05:31 np0005593234 NetworkManager[48942]: <info>  [1769162731.4104] device (tap5b468015-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:05:31 np0005593234 NetworkManager[48942]: <info>  [1769162731.4114] device (tap5b468015-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:05:31 np0005593234 systemd[1]: Started Virtual Machine qemu-50-instance-00000072.
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.439 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:31Z|00444|binding|INFO|Setting lport 5b468015-6b03-496b-acb0-201ef16d849d ovn-installed in OVS
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.447 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:31Z|00445|binding|INFO|Setting lport 5b468015-6b03-496b-acb0-201ef16d849d up in Southbound
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.632 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:17:9b 10.100.0.6'], port_security=['fa:16:3e:87:17:9b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '06ab5530-6f75-4f7d-80cd-48cf4c63cfd9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=5b468015-6b03-496b-acb0-201ef16d849d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.634 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 5b468015-6b03-496b-acb0-201ef16d849d in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c bound to our chassis#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.635 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee03d7c9-e107-41bf-95cc-5508578ad66c#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.645 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[016760f2-eb0c-4f48-8849-93595f254aad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.647 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee03d7c9-e1 in ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.648 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee03d7c9-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.649 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[95964148-9aa3-4061-aa05-2e84298442b4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.649 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[08bd0e21-0507-4844-8883-b82924a73473]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.665 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[a7f9298d-cf80-448c-abc2-91bddf4b2649]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.689 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e1b7ddb6-90eb-4523-bb35-69bb2f290262]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.719 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[6ae12aca-b831-4946-a468-6ba9c8a80039]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 NetworkManager[48942]: <info>  [1769162731.7253] manager: (tapee03d7c9-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/229)
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.724 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6d88b9e1-f219-48c0-b5b2-b52f97383396]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.753 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[708087bc-4cb3-43b4-a3bc-876008c530fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.756 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4344a801-bec7-43e1-a626-a1a9020acca7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 NetworkManager[48942]: <info>  [1769162731.7754] device (tapee03d7c9-e0): carrier: link connected
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.782 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a5b958-3afc-4aff-a756-2e7d68b8a3b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.785 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.786 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.786 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.786 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.786 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.802 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b2fd008f-2fcc-4a0a-b3f4-e0dfb157b6e6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662673, 'reachable_time': 28697, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280042, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.818 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b830769b-f5b7-4c8a-8359-a0fdc92f1bdf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:6530'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 662673, 'tstamp': 662673}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280044, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.833 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[da037e17-0a5a-4698-afe0-41587463bed0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 143], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662673, 'reachable_time': 28697, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 280045, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.855 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[de4861de-59fb-42e1-9fd7-12b7ae8a11f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.898 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[69220568-dc9c-4c51-9cf7-ff1c2060b331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.899 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.899 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.900 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee03d7c9-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:31 np0005593234 kernel: tapee03d7c9-e0: entered promiscuous mode
Jan 23 05:05:31 np0005593234 NetworkManager[48942]: <info>  [1769162731.9020] manager: (tapee03d7c9-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/230)
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.901 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.904 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee03d7c9-e0, col_values=(('external_ids', {'iface-id': '702d4523-a665-42f5-9a36-57d187c0698a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:31Z|00446|binding|INFO|Releasing lport 702d4523-a665-42f5-9a36-57d187c0698a from this chassis (sb_readonly=0)
Jan 23 05:05:31 np0005593234 nova_compute[227762]: 2026-01-23 10:05:31.920 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.921 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.922 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f926ec28-01b5-4776-b25b-cda32c0b8590]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.922 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:05:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:31.923 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'env', 'PROCESS_TAG=haproxy-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee03d7c9-e107-41bf-95cc-5508578ad66c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.173 227766 DEBUG nova.compute.manager [req-f1db0246-0e6c-405d-87c8-318a3d24a849 req-9a24f504-2c30-4f35-81ad-fee7503ca895 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.175 227766 DEBUG oslo_concurrency.lockutils [req-f1db0246-0e6c-405d-87c8-318a3d24a849 req-9a24f504-2c30-4f35-81ad-fee7503ca895 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.175 227766 DEBUG oslo_concurrency.lockutils [req-f1db0246-0e6c-405d-87c8-318a3d24a849 req-9a24f504-2c30-4f35-81ad-fee7503ca895 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.176 227766 DEBUG oslo_concurrency.lockutils [req-f1db0246-0e6c-405d-87c8-318a3d24a849 req-9a24f504-2c30-4f35-81ad-fee7503ca895 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.176 227766 DEBUG nova.compute.manager [req-f1db0246-0e6c-405d-87c8-318a3d24a849 req-9a24f504-2c30-4f35-81ad-fee7503ca895 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Processing event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:05:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:05:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4275090164' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.245 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.302 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162732.3017743, 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.303 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] VM Started (Lifecycle Event)#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.305 227766 DEBUG nova.compute.manager [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.312 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.322 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.323 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.323 227766 INFO nova.virt.libvirt.driver [-] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Instance spawned successfully.#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.324 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:05:32 np0005593234 podman[280140]: 2026-01-23 10:05:32.326094065 +0000 UTC m=+0.058692666 container create 2867fdb58ccae20553df42b7a15e4afeb2719d16206ac857833db0795f52bebb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.333 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.339 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.365 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.366 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162732.3020768, 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.366 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.371 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.371 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.372 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.372 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.373 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.373 227766 DEBUG nova.virt.libvirt.driver [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:05:32 np0005593234 systemd[1]: Started libpod-conmon-2867fdb58ccae20553df42b7a15e4afeb2719d16206ac857833db0795f52bebb.scope.
Jan 23 05:05:32 np0005593234 podman[280140]: 2026-01-23 10:05:32.290786141 +0000 UTC m=+0.023384762 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:05:32 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.407 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:05:32 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/102d859c716e4e924e7ac039a891098cbc9f54e060f787957e2abf98f56f6b0f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.411 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162732.310002, 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.412 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:05:32 np0005593234 podman[280140]: 2026-01-23 10:05:32.421791927 +0000 UTC m=+0.154390548 container init 2867fdb58ccae20553df42b7a15e4afeb2719d16206ac857833db0795f52bebb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:05:32 np0005593234 podman[280140]: 2026-01-23 10:05:32.428545779 +0000 UTC m=+0.161144380 container start 2867fdb58ccae20553df42b7a15e4afeb2719d16206ac857833db0795f52bebb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.450 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:05:32 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[280157]: [NOTICE]   (280161) : New worker (280163) forked
Jan 23 05:05:32 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[280157]: [NOTICE]   (280161) : Loading success.
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.456 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.492 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.503 227766 INFO nova.compute.manager [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Took 8.81 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.504 227766 DEBUG nova.compute.manager [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:05:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:32.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.555 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.557 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4382MB free_disk=20.81018829345703GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.557 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.557 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.592 227766 INFO nova.compute.manager [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Took 10.69 seconds to build instance.#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.616 227766 DEBUG oslo_concurrency.lockutils [None req-93a2acb1-02c8-4288-8f15-9ffa72a2ea75 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.784s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.658 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.659 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.659 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:05:32 np0005593234 nova_compute[227762]: 2026-01-23 10:05:32.723 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:05:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1856771807' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:05:33 np0005593234 nova_compute[227762]: 2026-01-23 10:05:33.168 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:33 np0005593234 nova_compute[227762]: 2026-01-23 10:05:33.173 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:05:33 np0005593234 nova_compute[227762]: 2026-01-23 10:05:33.196 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:05:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:05:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:33.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:05:33 np0005593234 nova_compute[227762]: 2026-01-23 10:05:33.234 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:05:33 np0005593234 nova_compute[227762]: 2026-01-23 10:05:33.235 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.678s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:05:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1548050063' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:05:34 np0005593234 nova_compute[227762]: 2026-01-23 10:05:34.235 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:34 np0005593234 nova_compute[227762]: 2026-01-23 10:05:34.299 227766 DEBUG nova.compute.manager [req-625f2dd0-e601-45c2-8025-f05240bed767 req-8e5af96f-dfdd-4811-8469-130261207f4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:34 np0005593234 nova_compute[227762]: 2026-01-23 10:05:34.299 227766 DEBUG oslo_concurrency.lockutils [req-625f2dd0-e601-45c2-8025-f05240bed767 req-8e5af96f-dfdd-4811-8469-130261207f4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:34 np0005593234 nova_compute[227762]: 2026-01-23 10:05:34.300 227766 DEBUG oslo_concurrency.lockutils [req-625f2dd0-e601-45c2-8025-f05240bed767 req-8e5af96f-dfdd-4811-8469-130261207f4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:34 np0005593234 nova_compute[227762]: 2026-01-23 10:05:34.300 227766 DEBUG oslo_concurrency.lockutils [req-625f2dd0-e601-45c2-8025-f05240bed767 req-8e5af96f-dfdd-4811-8469-130261207f4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:34 np0005593234 nova_compute[227762]: 2026-01-23 10:05:34.300 227766 DEBUG nova.compute.manager [req-625f2dd0-e601-45c2-8025-f05240bed767 req-8e5af96f-dfdd-4811-8469-130261207f4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] No waiting events found dispatching network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:05:34 np0005593234 nova_compute[227762]: 2026-01-23 10:05:34.301 227766 WARNING nova.compute.manager [req-625f2dd0-e601-45c2-8025-f05240bed767 req-8e5af96f-dfdd-4811-8469-130261207f4b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received unexpected event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d for instance with vm_state active and task_state None.#033[00m
Jan 23 05:05:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:34.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:35.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:35 np0005593234 nova_compute[227762]: 2026-01-23 10:05:35.325 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:36 np0005593234 nova_compute[227762]: 2026-01-23 10:05:36.332 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:36.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:36 np0005593234 nova_compute[227762]: 2026-01-23 10:05:36.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:36 np0005593234 nova_compute[227762]: 2026-01-23 10:05:36.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:05:36 np0005593234 nova_compute[227762]: 2026-01-23 10:05:36.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:05:37 np0005593234 nova_compute[227762]: 2026-01-23 10:05:37.096 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:05:37 np0005593234 nova_compute[227762]: 2026-01-23 10:05:37.096 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:05:37 np0005593234 nova_compute[227762]: 2026-01-23 10:05:37.097 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:05:37 np0005593234 nova_compute[227762]: 2026-01-23 10:05:37.097 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:37.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:38 np0005593234 nova_compute[227762]: 2026-01-23 10:05:38.492 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:38 np0005593234 NetworkManager[48942]: <info>  [1769162738.4930] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/231)
Jan 23 05:05:38 np0005593234 NetworkManager[48942]: <info>  [1769162738.4938] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/232)
Jan 23 05:05:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:05:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:38.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:05:38 np0005593234 nova_compute[227762]: 2026-01-23 10:05:38.622 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:38Z|00447|binding|INFO|Releasing lport 702d4523-a665-42f5-9a36-57d187c0698a from this chassis (sb_readonly=0)
Jan 23 05:05:38 np0005593234 nova_compute[227762]: 2026-01-23 10:05:38.634 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:38 np0005593234 podman[280198]: 2026-01-23 10:05:38.790127218 +0000 UTC m=+0.084011828 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, 
org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:05:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:39.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:39 np0005593234 nova_compute[227762]: 2026-01-23 10:05:39.394 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Updating instance_info_cache with network_info: [{"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:05:39 np0005593234 nova_compute[227762]: 2026-01-23 10:05:39.420 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:05:39 np0005593234 nova_compute[227762]: 2026-01-23 10:05:39.420 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:05:39 np0005593234 nova_compute[227762]: 2026-01-23 10:05:39.421 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:39 np0005593234 nova_compute[227762]: 2026-01-23 10:05:39.421 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:39 np0005593234 nova_compute[227762]: 2026-01-23 10:05:39.421 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:05:40 np0005593234 nova_compute[227762]: 2026-01-23 10:05:40.327 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:40.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:40 np0005593234 nova_compute[227762]: 2026-01-23 10:05:40.696 227766 DEBUG nova.compute.manager [req-9387808d-4828-43e8-beb1-4366b7724208 req-4278436e-4dea-4888-aa4f-9fae83292185 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-changed-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:05:40 np0005593234 nova_compute[227762]: 2026-01-23 10:05:40.697 227766 DEBUG nova.compute.manager [req-9387808d-4828-43e8-beb1-4366b7724208 req-4278436e-4dea-4888-aa4f-9fae83292185 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Refreshing instance network info cache due to event network-changed-5b468015-6b03-496b-acb0-201ef16d849d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:05:40 np0005593234 nova_compute[227762]: 2026-01-23 10:05:40.697 227766 DEBUG oslo_concurrency.lockutils [req-9387808d-4828-43e8-beb1-4366b7724208 req-4278436e-4dea-4888-aa4f-9fae83292185 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:05:40 np0005593234 nova_compute[227762]: 2026-01-23 10:05:40.698 227766 DEBUG oslo_concurrency.lockutils [req-9387808d-4828-43e8-beb1-4366b7724208 req-4278436e-4dea-4888-aa4f-9fae83292185 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:05:40 np0005593234 nova_compute[227762]: 2026-01-23 10:05:40.698 227766 DEBUG nova.network.neutron [req-9387808d-4828-43e8-beb1-4366b7724208 req-4278436e-4dea-4888-aa4f-9fae83292185 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Refreshing network info cache for port 5b468015-6b03-496b-acb0-201ef16d849d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:05:40 np0005593234 nova_compute[227762]: 2026-01-23 10:05:40.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:40 np0005593234 nova_compute[227762]: 2026-01-23 10:05:40.772 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:41.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:41 np0005593234 nova_compute[227762]: 2026-01-23 10:05:41.333 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:42.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:42 np0005593234 nova_compute[227762]: 2026-01-23 10:05:42.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:42.842 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:42.843 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:42.844 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:43.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:43 np0005593234 nova_compute[227762]: 2026-01-23 10:05:43.320 227766 DEBUG nova.network.neutron [req-9387808d-4828-43e8-beb1-4366b7724208 req-4278436e-4dea-4888-aa4f-9fae83292185 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Updated VIF entry in instance network info cache for port 5b468015-6b03-496b-acb0-201ef16d849d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:05:43 np0005593234 nova_compute[227762]: 2026-01-23 10:05:43.321 227766 DEBUG nova.network.neutron [req-9387808d-4828-43e8-beb1-4366b7724208 req-4278436e-4dea-4888-aa4f-9fae83292185 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Updating instance_info_cache with network_info: [{"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:05:43 np0005593234 nova_compute[227762]: 2026-01-23 10:05:43.347 227766 DEBUG oslo_concurrency.lockutils [req-9387808d-4828-43e8-beb1-4366b7724208 req-4278436e-4dea-4888-aa4f-9fae83292185 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:05:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:44.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:45.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:45 np0005593234 nova_compute[227762]: 2026-01-23 10:05:45.330 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:45 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:45Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:87:17:9b 10.100.0.6
Jan 23 05:05:45 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:45Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:87:17:9b 10.100.0.6
Jan 23 05:05:45 np0005593234 nova_compute[227762]: 2026-01-23 10:05:45.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:05:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:46 np0005593234 nova_compute[227762]: 2026-01-23 10:05:46.336 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:46.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:47.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:05:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:05:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:05:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:05:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:48.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:49.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:49 np0005593234 nova_compute[227762]: 2026-01-23 10:05:49.887 227766 DEBUG nova.compute.manager [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 23 05:05:49 np0005593234 nova_compute[227762]: 2026-01-23 10:05:49.981 227766 DEBUG oslo_concurrency.lockutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:49 np0005593234 nova_compute[227762]: 2026-01-23 10:05:49.982 227766 DEBUG oslo_concurrency.lockutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:50 np0005593234 nova_compute[227762]: 2026-01-23 10:05:50.004 227766 DEBUG nova.objects.instance [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'pci_requests' on Instance uuid 483afeac-561b-48ff-89d6-d02d1b615fc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:50 np0005593234 nova_compute[227762]: 2026-01-23 10:05:50.021 227766 DEBUG nova.virt.hardware [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:05:50 np0005593234 nova_compute[227762]: 2026-01-23 10:05:50.021 227766 INFO nova.compute.claims [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:05:50 np0005593234 nova_compute[227762]: 2026-01-23 10:05:50.022 227766 DEBUG nova.objects.instance [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'resources' on Instance uuid 483afeac-561b-48ff-89d6-d02d1b615fc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:50 np0005593234 nova_compute[227762]: 2026-01-23 10:05:50.062 227766 DEBUG nova.objects.instance [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'pci_devices' on Instance uuid 483afeac-561b-48ff-89d6-d02d1b615fc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:50 np0005593234 nova_compute[227762]: 2026-01-23 10:05:50.119 227766 INFO nova.compute.resource_tracker [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Updating resource usage from migration df920c0b-dafc-41b8-b8ba-e843582c7bd4#033[00m
Jan 23 05:05:50 np0005593234 nova_compute[227762]: 2026-01-23 10:05:50.120 227766 DEBUG nova.compute.resource_tracker [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Starting to track incoming migration df920c0b-dafc-41b8-b8ba-e843582c7bd4 with flavor eebea5f8-9b11-45ad-873d-c4ea90d3de87 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 23 05:05:50 np0005593234 nova_compute[227762]: 2026-01-23 10:05:50.213 227766 DEBUG oslo_concurrency.processutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:05:50 np0005593234 nova_compute[227762]: 2026-01-23 10:05:50.332 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:50.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:05:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/73735374' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:05:50 np0005593234 nova_compute[227762]: 2026-01-23 10:05:50.681 227766 DEBUG oslo_concurrency.processutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:05:50 np0005593234 nova_compute[227762]: 2026-01-23 10:05:50.687 227766 DEBUG nova.compute.provider_tree [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:05:50 np0005593234 nova_compute[227762]: 2026-01-23 10:05:50.710 227766 DEBUG nova.scheduler.client.report [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:05:50 np0005593234 nova_compute[227762]: 2026-01-23 10:05:50.730 227766 DEBUG oslo_concurrency.lockutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.748s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:50 np0005593234 nova_compute[227762]: 2026-01-23 10:05:50.730 227766 INFO nova.compute.manager [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Migrating#033[00m
Jan 23 05:05:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:05:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:51.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:05:51 np0005593234 nova_compute[227762]: 2026-01-23 10:05:51.364 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:52.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:05:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:53.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:05:53 np0005593234 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 05:05:53 np0005593234 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 05:05:53 np0005593234 systemd-logind[794]: New session 66 of user nova.
Jan 23 05:05:53 np0005593234 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 05:05:53 np0005593234 systemd[1]: Starting User Manager for UID 42436...
Jan 23 05:05:53 np0005593234 systemd[280440]: Queued start job for default target Main User Target.
Jan 23 05:05:53 np0005593234 systemd[280440]: Created slice User Application Slice.
Jan 23 05:05:53 np0005593234 systemd[280440]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 05:05:53 np0005593234 systemd[280440]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 05:05:53 np0005593234 systemd[280440]: Reached target Paths.
Jan 23 05:05:53 np0005593234 systemd[280440]: Reached target Timers.
Jan 23 05:05:53 np0005593234 systemd[280440]: Starting D-Bus User Message Bus Socket...
Jan 23 05:05:53 np0005593234 systemd[280440]: Starting Create User's Volatile Files and Directories...
Jan 23 05:05:53 np0005593234 systemd[280440]: Finished Create User's Volatile Files and Directories.
Jan 23 05:05:53 np0005593234 systemd[280440]: Listening on D-Bus User Message Bus Socket.
Jan 23 05:05:53 np0005593234 systemd[280440]: Reached target Sockets.
Jan 23 05:05:53 np0005593234 systemd[280440]: Reached target Basic System.
Jan 23 05:05:53 np0005593234 systemd[280440]: Reached target Main User Target.
Jan 23 05:05:53 np0005593234 systemd[280440]: Startup finished in 138ms.
Jan 23 05:05:53 np0005593234 systemd[1]: Started User Manager for UID 42436.
Jan 23 05:05:53 np0005593234 systemd[1]: Started Session 66 of User nova.
Jan 23 05:05:53 np0005593234 systemd[1]: session-66.scope: Deactivated successfully.
Jan 23 05:05:53 np0005593234 systemd-logind[794]: Session 66 logged out. Waiting for processes to exit.
Jan 23 05:05:53 np0005593234 systemd-logind[794]: Removed session 66.
Jan 23 05:05:53 np0005593234 systemd-logind[794]: New session 68 of user nova.
Jan 23 05:05:53 np0005593234 systemd[1]: Started Session 68 of User nova.
Jan 23 05:05:53 np0005593234 systemd[1]: session-68.scope: Deactivated successfully.
Jan 23 05:05:53 np0005593234 systemd-logind[794]: Session 68 logged out. Waiting for processes to exit.
Jan 23 05:05:53 np0005593234 systemd-logind[794]: Removed session 68.
Jan 23 05:05:54 np0005593234 nova_compute[227762]: 2026-01-23 10:05:54.345 227766 DEBUG oslo_concurrency.lockutils [None req-779b0006-7d51-4b88-b7dd-3fcf1b468412 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:05:54 np0005593234 nova_compute[227762]: 2026-01-23 10:05:54.346 227766 DEBUG oslo_concurrency.lockutils [None req-779b0006-7d51-4b88-b7dd-3fcf1b468412 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:05:54 np0005593234 nova_compute[227762]: 2026-01-23 10:05:54.346 227766 DEBUG nova.compute.manager [None req-779b0006-7d51-4b88-b7dd-3fcf1b468412 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:05:54 np0005593234 nova_compute[227762]: 2026-01-23 10:05:54.351 227766 DEBUG nova.compute.manager [None req-779b0006-7d51-4b88-b7dd-3fcf1b468412 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 23 05:05:54 np0005593234 nova_compute[227762]: 2026-01-23 10:05:54.352 227766 DEBUG nova.objects.instance [None req-779b0006-7d51-4b88-b7dd-3fcf1b468412 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'flavor' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:54 np0005593234 nova_compute[227762]: 2026-01-23 10:05:54.380 227766 DEBUG nova.virt.libvirt.driver [None req-779b0006-7d51-4b88-b7dd-3fcf1b468412 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:05:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:54.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:55.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:55 np0005593234 nova_compute[227762]: 2026-01-23 10:05:55.337 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:05:56 np0005593234 nova_compute[227762]: 2026-01-23 10:05:56.367 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:56.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:56 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:05:56 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:05:56 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:56Z|00448|binding|INFO|Releasing lport 702d4523-a665-42f5-9a36-57d187c0698a from this chassis (sb_readonly=0)
Jan 23 05:05:57 np0005593234 nova_compute[227762]: 2026-01-23 10:05:57.013 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:57.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:57 np0005593234 kernel: tap5b468015-6b (unregistering): left promiscuous mode
Jan 23 05:05:57 np0005593234 NetworkManager[48942]: <info>  [1769162757.7243] device (tap5b468015-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:05:57 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:57Z|00449|binding|INFO|Releasing lport 5b468015-6b03-496b-acb0-201ef16d849d from this chassis (sb_readonly=0)
Jan 23 05:05:57 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:57Z|00450|binding|INFO|Setting lport 5b468015-6b03-496b-acb0-201ef16d849d down in Southbound
Jan 23 05:05:57 np0005593234 nova_compute[227762]: 2026-01-23 10:05:57.731 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:57 np0005593234 ovn_controller[134547]: 2026-01-23T10:05:57Z|00451|binding|INFO|Removing iface tap5b468015-6b ovn-installed in OVS
Jan 23 05:05:57 np0005593234 nova_compute[227762]: 2026-01-23 10:05:57.733 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:57.739 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:17:9b 10.100.0.6'], port_security=['fa:16:3e:87:17:9b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '06ab5530-6f75-4f7d-80cd-48cf4c63cfd9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=5b468015-6b03-496b-acb0-201ef16d849d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:05:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:57.741 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 5b468015-6b03-496b-acb0-201ef16d849d in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c unbound from our chassis#033[00m
Jan 23 05:05:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:57.742 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee03d7c9-e107-41bf-95cc-5508578ad66c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:05:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:57.743 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c876c222-5ac5-454c-8cb4-9f0d24b3a15d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:57.744 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace which is not needed anymore#033[00m
Jan 23 05:05:57 np0005593234 nova_compute[227762]: 2026-01-23 10:05:57.751 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:57 np0005593234 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000072.scope: Deactivated successfully.
Jan 23 05:05:57 np0005593234 systemd[1]: machine-qemu\x2d50\x2dinstance\x2d00000072.scope: Consumed 14.200s CPU time.
Jan 23 05:05:57 np0005593234 systemd-machined[195626]: Machine qemu-50-instance-00000072 terminated.
Jan 23 05:05:57 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[280157]: [NOTICE]   (280161) : haproxy version is 2.8.14-c23fe91
Jan 23 05:05:57 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[280157]: [NOTICE]   (280161) : path to executable is /usr/sbin/haproxy
Jan 23 05:05:57 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[280157]: [WARNING]  (280161) : Exiting Master process...
Jan 23 05:05:57 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[280157]: [ALERT]    (280161) : Current worker (280163) exited with code 143 (Terminated)
Jan 23 05:05:57 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[280157]: [WARNING]  (280161) : All workers exited. Exiting... (0)
Jan 23 05:05:57 np0005593234 systemd[1]: libpod-2867fdb58ccae20553df42b7a15e4afeb2719d16206ac857833db0795f52bebb.scope: Deactivated successfully.
Jan 23 05:05:57 np0005593234 podman[280539]: 2026-01-23 10:05:57.883428555 +0000 UTC m=+0.047850048 container died 2867fdb58ccae20553df42b7a15e4afeb2719d16206ac857833db0795f52bebb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:05:57 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2867fdb58ccae20553df42b7a15e4afeb2719d16206ac857833db0795f52bebb-userdata-shm.mount: Deactivated successfully.
Jan 23 05:05:57 np0005593234 systemd[1]: var-lib-containers-storage-overlay-102d859c716e4e924e7ac039a891098cbc9f54e060f787957e2abf98f56f6b0f-merged.mount: Deactivated successfully.
Jan 23 05:05:57 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:05:57 np0005593234 podman[280539]: 2026-01-23 10:05:57.931516638 +0000 UTC m=+0.095938131 container cleanup 2867fdb58ccae20553df42b7a15e4afeb2719d16206ac857833db0795f52bebb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:05:57 np0005593234 systemd[1]: libpod-conmon-2867fdb58ccae20553df42b7a15e4afeb2719d16206ac857833db0795f52bebb.scope: Deactivated successfully.
Jan 23 05:05:57 np0005593234 nova_compute[227762]: 2026-01-23 10:05:57.953 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:57 np0005593234 nova_compute[227762]: 2026-01-23 10:05:57.957 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:57 np0005593234 podman[280570]: 2026-01-23 10:05:57.991038609 +0000 UTC m=+0.039091633 container remove 2867fdb58ccae20553df42b7a15e4afeb2719d16206ac857833db0795f52bebb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:05:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:57.997 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[882de34f-3024-48af-b1dd-0a98484b3f13]: (4, ('Fri Jan 23 10:05:57 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (2867fdb58ccae20553df42b7a15e4afeb2719d16206ac857833db0795f52bebb)\n2867fdb58ccae20553df42b7a15e4afeb2719d16206ac857833db0795f52bebb\nFri Jan 23 10:05:57 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (2867fdb58ccae20553df42b7a15e4afeb2719d16206ac857833db0795f52bebb)\n2867fdb58ccae20553df42b7a15e4afeb2719d16206ac857833db0795f52bebb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:57.998 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d60630e4-754f-40e8-af96-1e3fe6cb2c8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:57.999 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:05:58 np0005593234 nova_compute[227762]: 2026-01-23 10:05:58.001 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:58 np0005593234 kernel: tapee03d7c9-e0: left promiscuous mode
Jan 23 05:05:58 np0005593234 nova_compute[227762]: 2026-01-23 10:05:58.017 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:05:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:58.021 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[082d779c-622d-44b5-b629-74599686bdd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:58.037 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a8abb0d1-2abb-4670-9ae2-a0f12d2992a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:58.038 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6ea97d6e-ac66-4e1c-845b-a710bb22ba32]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:58.053 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a7f79c0c-eb8a-4783-b0cd-37d225d1e5a6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 662667, 'reachable_time': 17650, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280600, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:58.055 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:05:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:05:58.055 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[6bf9b49d-0685-44fa-8ba9-9005b343b045]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:05:58 np0005593234 systemd[1]: run-netns-ovnmeta\x2dee03d7c9\x2de107\x2d41bf\x2d95cc\x2d5508578ad66c.mount: Deactivated successfully.
Jan 23 05:05:58 np0005593234 nova_compute[227762]: 2026-01-23 10:05:58.401 227766 INFO nova.virt.libvirt.driver [None req-779b0006-7d51-4b88-b7dd-3fcf1b468412 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Instance shutdown successfully after 4 seconds.#033[00m
Jan 23 05:05:58 np0005593234 nova_compute[227762]: 2026-01-23 10:05:58.406 227766 INFO nova.virt.libvirt.driver [-] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Instance destroyed successfully.#033[00m
Jan 23 05:05:58 np0005593234 nova_compute[227762]: 2026-01-23 10:05:58.407 227766 DEBUG nova.objects.instance [None req-779b0006-7d51-4b88-b7dd-3fcf1b468412 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'numa_topology' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:05:58 np0005593234 nova_compute[227762]: 2026-01-23 10:05:58.454 227766 DEBUG nova.compute.manager [None req-779b0006-7d51-4b88-b7dd-3fcf1b468412 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:05:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:58.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:58 np0005593234 nova_compute[227762]: 2026-01-23 10:05:58.552 227766 DEBUG oslo_concurrency.lockutils [None req-779b0006-7d51-4b88-b7dd-3fcf1b468412 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 4.206s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:05:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:05:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:05:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:59.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:05:59 np0005593234 podman[280601]: 2026-01-23 10:05:59.79178999 +0000 UTC m=+0.086591389 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 05:06:00 np0005593234 nova_compute[227762]: 2026-01-23 10:06:00.331 227766 DEBUG nova.compute.manager [req-0d0eecc6-8aa4-497b-afbd-c4da4d82038c req-153d6050-e742-47e9-bb7e-4d745b83a5ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-vif-unplugged-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:00 np0005593234 nova_compute[227762]: 2026-01-23 10:06:00.332 227766 DEBUG oslo_concurrency.lockutils [req-0d0eecc6-8aa4-497b-afbd-c4da4d82038c req-153d6050-e742-47e9-bb7e-4d745b83a5ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:00 np0005593234 nova_compute[227762]: 2026-01-23 10:06:00.332 227766 DEBUG oslo_concurrency.lockutils [req-0d0eecc6-8aa4-497b-afbd-c4da4d82038c req-153d6050-e742-47e9-bb7e-4d745b83a5ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:00 np0005593234 nova_compute[227762]: 2026-01-23 10:06:00.332 227766 DEBUG oslo_concurrency.lockutils [req-0d0eecc6-8aa4-497b-afbd-c4da4d82038c req-153d6050-e742-47e9-bb7e-4d745b83a5ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:00 np0005593234 nova_compute[227762]: 2026-01-23 10:06:00.332 227766 DEBUG nova.compute.manager [req-0d0eecc6-8aa4-497b-afbd-c4da4d82038c req-153d6050-e742-47e9-bb7e-4d745b83a5ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] No waiting events found dispatching network-vif-unplugged-5b468015-6b03-496b-acb0-201ef16d849d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:00 np0005593234 nova_compute[227762]: 2026-01-23 10:06:00.332 227766 WARNING nova.compute.manager [req-0d0eecc6-8aa4-497b-afbd-c4da4d82038c req-153d6050-e742-47e9-bb7e-4d745b83a5ce 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received unexpected event network-vif-unplugged-5b468015-6b03-496b-acb0-201ef16d849d for instance with vm_state stopped and task_state None.#033[00m
Jan 23 05:06:00 np0005593234 nova_compute[227762]: 2026-01-23 10:06:00.340 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:00.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:01.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:01 np0005593234 nova_compute[227762]: 2026-01-23 10:06:01.369 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:02.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:02 np0005593234 nova_compute[227762]: 2026-01-23 10:06:02.697 227766 DEBUG nova.compute.manager [req-e78be887-b229-473a-8279-db19168ffa2b req-8cea8224-151a-46e9-94a5-a2edff870f7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:02 np0005593234 nova_compute[227762]: 2026-01-23 10:06:02.697 227766 DEBUG oslo_concurrency.lockutils [req-e78be887-b229-473a-8279-db19168ffa2b req-8cea8224-151a-46e9-94a5-a2edff870f7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:02 np0005593234 nova_compute[227762]: 2026-01-23 10:06:02.697 227766 DEBUG oslo_concurrency.lockutils [req-e78be887-b229-473a-8279-db19168ffa2b req-8cea8224-151a-46e9-94a5-a2edff870f7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:02 np0005593234 nova_compute[227762]: 2026-01-23 10:06:02.698 227766 DEBUG oslo_concurrency.lockutils [req-e78be887-b229-473a-8279-db19168ffa2b req-8cea8224-151a-46e9-94a5-a2edff870f7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:02 np0005593234 nova_compute[227762]: 2026-01-23 10:06:02.698 227766 DEBUG nova.compute.manager [req-e78be887-b229-473a-8279-db19168ffa2b req-8cea8224-151a-46e9-94a5-a2edff870f7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] No waiting events found dispatching network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:02 np0005593234 nova_compute[227762]: 2026-01-23 10:06:02.698 227766 WARNING nova.compute.manager [req-e78be887-b229-473a-8279-db19168ffa2b req-8cea8224-151a-46e9-94a5-a2edff870f7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received unexpected event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d for instance with vm_state stopped and task_state None.#033[00m
Jan 23 05:06:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:03.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:03 np0005593234 nova_compute[227762]: 2026-01-23 10:06:03.826 227766 DEBUG nova.objects.instance [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'flavor' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:03 np0005593234 nova_compute[227762]: 2026-01-23 10:06:03.895 227766 DEBUG oslo_concurrency.lockutils [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:06:03 np0005593234 nova_compute[227762]: 2026-01-23 10:06:03.896 227766 DEBUG oslo_concurrency.lockutils [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquired lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:06:03 np0005593234 nova_compute[227762]: 2026-01-23 10:06:03.896 227766 DEBUG nova.network.neutron [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:06:03 np0005593234 nova_compute[227762]: 2026-01-23 10:06:03.896 227766 DEBUG nova.objects.instance [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'info_cache' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:03 np0005593234 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 05:06:03 np0005593234 systemd[280440]: Activating special unit Exit the Session...
Jan 23 05:06:03 np0005593234 systemd[280440]: Stopped target Main User Target.
Jan 23 05:06:03 np0005593234 systemd[280440]: Stopped target Basic System.
Jan 23 05:06:03 np0005593234 systemd[280440]: Stopped target Paths.
Jan 23 05:06:03 np0005593234 systemd[280440]: Stopped target Sockets.
Jan 23 05:06:03 np0005593234 systemd[280440]: Stopped target Timers.
Jan 23 05:06:03 np0005593234 systemd[280440]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 05:06:03 np0005593234 systemd[280440]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 05:06:03 np0005593234 systemd[280440]: Closed D-Bus User Message Bus Socket.
Jan 23 05:06:03 np0005593234 systemd[280440]: Stopped Create User's Volatile Files and Directories.
Jan 23 05:06:03 np0005593234 systemd[280440]: Removed slice User Application Slice.
Jan 23 05:06:03 np0005593234 systemd[280440]: Reached target Shutdown.
Jan 23 05:06:03 np0005593234 systemd[280440]: Finished Exit the Session.
Jan 23 05:06:03 np0005593234 systemd[280440]: Reached target Exit the Session.
Jan 23 05:06:03 np0005593234 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 05:06:03 np0005593234 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 05:06:03 np0005593234 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 05:06:03 np0005593234 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 05:06:03 np0005593234 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 05:06:03 np0005593234 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 05:06:03 np0005593234 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 05:06:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:04.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:05.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:05 np0005593234 nova_compute[227762]: 2026-01-23 10:06:05.343 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:06 np0005593234 nova_compute[227762]: 2026-01-23 10:06:06.372 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:06.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:07.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:08.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:08 np0005593234 nova_compute[227762]: 2026-01-23 10:06:08.820 227766 DEBUG nova.compute.manager [req-61da747e-304e-4733-8868-9cf5dca7e300 req-4c452f90-2fc5-4ca0-b9c9-5171e6c5e3db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Received event network-vif-unplugged-f35157ad-0f62-41af-962e-a3afcd66400e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:08 np0005593234 nova_compute[227762]: 2026-01-23 10:06:08.820 227766 DEBUG oslo_concurrency.lockutils [req-61da747e-304e-4733-8868-9cf5dca7e300 req-4c452f90-2fc5-4ca0-b9c9-5171e6c5e3db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:08 np0005593234 nova_compute[227762]: 2026-01-23 10:06:08.820 227766 DEBUG oslo_concurrency.lockutils [req-61da747e-304e-4733-8868-9cf5dca7e300 req-4c452f90-2fc5-4ca0-b9c9-5171e6c5e3db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:08 np0005593234 nova_compute[227762]: 2026-01-23 10:06:08.820 227766 DEBUG oslo_concurrency.lockutils [req-61da747e-304e-4733-8868-9cf5dca7e300 req-4c452f90-2fc5-4ca0-b9c9-5171e6c5e3db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:08 np0005593234 nova_compute[227762]: 2026-01-23 10:06:08.820 227766 DEBUG nova.compute.manager [req-61da747e-304e-4733-8868-9cf5dca7e300 req-4c452f90-2fc5-4ca0-b9c9-5171e6c5e3db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] No waiting events found dispatching network-vif-unplugged-f35157ad-0f62-41af-962e-a3afcd66400e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:08 np0005593234 nova_compute[227762]: 2026-01-23 10:06:08.821 227766 WARNING nova.compute.manager [req-61da747e-304e-4733-8868-9cf5dca7e300 req-4c452f90-2fc5-4ca0-b9c9-5171e6c5e3db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Received unexpected event network-vif-unplugged-f35157ad-0f62-41af-962e-a3afcd66400e for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 23 05:06:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:09.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:09 np0005593234 podman[280680]: 2026-01-23 10:06:09.786300091 +0000 UTC m=+0.078885208 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 05:06:10 np0005593234 nova_compute[227762]: 2026-01-23 10:06:10.347 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:10.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:11.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:11 np0005593234 nova_compute[227762]: 2026-01-23 10:06:11.373 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:12 np0005593234 nova_compute[227762]: 2026-01-23 10:06:12.419 227766 DEBUG nova.compute.manager [req-8b786cf4-ff56-4032-9152-efdd85e93c14 req-f1b79472-5cb5-4207-8da8-cbdb56d67fc7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Received event network-vif-plugged-f35157ad-0f62-41af-962e-a3afcd66400e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:12 np0005593234 nova_compute[227762]: 2026-01-23 10:06:12.420 227766 DEBUG oslo_concurrency.lockutils [req-8b786cf4-ff56-4032-9152-efdd85e93c14 req-f1b79472-5cb5-4207-8da8-cbdb56d67fc7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:12 np0005593234 nova_compute[227762]: 2026-01-23 10:06:12.420 227766 DEBUG oslo_concurrency.lockutils [req-8b786cf4-ff56-4032-9152-efdd85e93c14 req-f1b79472-5cb5-4207-8da8-cbdb56d67fc7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:12 np0005593234 nova_compute[227762]: 2026-01-23 10:06:12.420 227766 DEBUG oslo_concurrency.lockutils [req-8b786cf4-ff56-4032-9152-efdd85e93c14 req-f1b79472-5cb5-4207-8da8-cbdb56d67fc7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:12 np0005593234 nova_compute[227762]: 2026-01-23 10:06:12.420 227766 DEBUG nova.compute.manager [req-8b786cf4-ff56-4032-9152-efdd85e93c14 req-f1b79472-5cb5-4207-8da8-cbdb56d67fc7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] No waiting events found dispatching network-vif-plugged-f35157ad-0f62-41af-962e-a3afcd66400e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:12 np0005593234 nova_compute[227762]: 2026-01-23 10:06:12.421 227766 WARNING nova.compute.manager [req-8b786cf4-ff56-4032-9152-efdd85e93c14 req-f1b79472-5cb5-4207-8da8-cbdb56d67fc7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Received unexpected event network-vif-plugged-f35157ad-0f62-41af-962e-a3afcd66400e for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 23 05:06:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:12.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:12 np0005593234 nova_compute[227762]: 2026-01-23 10:06:12.969 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162757.967141, 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:06:12 np0005593234 nova_compute[227762]: 2026-01-23 10:06:12.969 227766 INFO nova.compute.manager [-] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:06:13 np0005593234 nova_compute[227762]: 2026-01-23 10:06:13.069 227766 DEBUG nova.compute.manager [None req-18c2fb07-ffb6-46ed-82dc-437ca4146adc - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:13 np0005593234 nova_compute[227762]: 2026-01-23 10:06:13.073 227766 DEBUG nova.compute.manager [None req-18c2fb07-ffb6-46ed-82dc-437ca4146adc - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:06:13 np0005593234 nova_compute[227762]: 2026-01-23 10:06:13.223 227766 INFO nova.compute.manager [None req-18c2fb07-ffb6-46ed-82dc-437ca4146adc - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 23 05:06:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:13.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:14.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:14 np0005593234 nova_compute[227762]: 2026-01-23 10:06:14.974 227766 DEBUG nova.network.neutron [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Updating instance_info_cache with network_info: [{"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.109 227766 DEBUG oslo_concurrency.lockutils [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Releasing lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.170 227766 INFO nova.virt.libvirt.driver [-] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Instance destroyed successfully.#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.171 227766 DEBUG nova.objects.instance [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'numa_topology' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.254 227766 DEBUG nova.objects.instance [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'resources' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.295 227766 DEBUG nova.virt.libvirt.vif [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:05:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1240996482',display_name='tempest-ServerActionsTestJSON-server-1240996482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1240996482',id=114,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:05:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-wcdb1ybb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:05:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=06ab5530-6f75-4f7d-80cd-48cf4c63cfd9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.296 227766 DEBUG nova.network.os_vif_util [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.296 227766 DEBUG nova.network.os_vif_util [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.297 227766 DEBUG os_vif [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.299 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.300 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b468015-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.302 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.305 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.309 227766 INFO os_vif [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b')#033[00m
Jan 23 05:06:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:15.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.319 227766 DEBUG nova.virt.libvirt.driver [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Start _get_guest_xml network_info=[{"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.324 227766 WARNING nova.virt.libvirt.driver [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.338 227766 DEBUG nova.virt.libvirt.host [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.339 227766 DEBUG nova.virt.libvirt.host [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.392 227766 DEBUG nova.virt.libvirt.host [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.393 227766 DEBUG nova.virt.libvirt.host [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.394 227766 DEBUG nova.virt.libvirt.driver [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.394 227766 DEBUG nova.virt.hardware [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.394 227766 DEBUG nova.virt.hardware [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.395 227766 DEBUG nova.virt.hardware [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.395 227766 DEBUG nova.virt.hardware [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.395 227766 DEBUG nova.virt.hardware [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.395 227766 DEBUG nova.virt.hardware [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.395 227766 DEBUG nova.virt.hardware [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.396 227766 DEBUG nova.virt.hardware [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.396 227766 DEBUG nova.virt.hardware [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.396 227766 DEBUG nova.virt.hardware [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.396 227766 DEBUG nova.virt.hardware [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.396 227766 DEBUG nova.objects.instance [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'vcpu_model' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.440 227766 DEBUG oslo_concurrency.processutils [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:06:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/197633371' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.904 227766 DEBUG oslo_concurrency.processutils [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:15 np0005593234 nova_compute[227762]: 2026-01-23 10:06:15.938 227766 DEBUG oslo_concurrency.processutils [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:06:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1711887372' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.367 227766 DEBUG oslo_concurrency.processutils [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.369 227766 DEBUG nova.virt.libvirt.vif [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:05:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1240996482',display_name='tempest-ServerActionsTestJSON-server-1240996482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1240996482',id=114,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:05:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-wcdb1ybb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:05:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=06ab5530-6f75-4f7d-80cd-48cf4c63cfd9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.370 227766 DEBUG nova.network.os_vif_util [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.371 227766 DEBUG nova.network.os_vif_util [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.373 227766 DEBUG nova.objects.instance [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'pci_devices' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.375 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.466 227766 DEBUG nova.virt.libvirt.driver [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  <uuid>06ab5530-6f75-4f7d-80cd-48cf4c63cfd9</uuid>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  <name>instance-00000072</name>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerActionsTestJSON-server-1240996482</nova:name>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:06:15</nova:creationTime>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <nova:user uuid="9d4a5c201efa4992a9ef57d8abdc1675">tempest-ServerActionsTestJSON-1619235720-project-member</nova:user>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <nova:project uuid="74c5c1d0762242f29a5d26033efd9f6d">tempest-ServerActionsTestJSON-1619235720</nova:project>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <nova:port uuid="5b468015-6b03-496b-acb0-201ef16d849d">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <entry name="serial">06ab5530-6f75-4f7d-80cd-48cf4c63cfd9</entry>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <entry name="uuid">06ab5530-6f75-4f7d-80cd-48cf4c63cfd9</entry>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_disk.config">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:87:17:9b"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <target dev="tap5b468015-6b"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9/console.log" append="off"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <input type="keyboard" bus="usb"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:06:16 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:06:16 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:06:16 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:06:16 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.468 227766 DEBUG nova.virt.libvirt.driver [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.468 227766 DEBUG nova.virt.libvirt.driver [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.469 227766 DEBUG nova.virt.libvirt.vif [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:05:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1240996482',display_name='tempest-ServerActionsTestJSON-server-1240996482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1240996482',id=114,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:05:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-wcdb1ybb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:05:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=06ab5530-6f75-4f7d-80cd-48cf4c63cfd9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.469 227766 DEBUG nova.network.os_vif_util [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.470 227766 DEBUG nova.network.os_vif_util [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.470 227766 DEBUG os_vif [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.471 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.471 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.471 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.474 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.475 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5b468015-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.475 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5b468015-6b, col_values=(('external_ids', {'iface-id': '5b468015-6b03-496b-acb0-201ef16d849d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:17:9b', 'vm-uuid': '06ab5530-6f75-4f7d-80cd-48cf4c63cfd9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.476 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:16 np0005593234 NetworkManager[48942]: <info>  [1769162776.4778] manager: (tap5b468015-6b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/233)
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.479 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.485 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.486 227766 INFO os_vif [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b')#033[00m
Jan 23 05:06:16 np0005593234 kernel: tap5b468015-6b: entered promiscuous mode
Jan 23 05:06:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:16.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:16 np0005593234 NetworkManager[48942]: <info>  [1769162776.5665] manager: (tap5b468015-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/234)
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.567 227766 INFO nova.network.neutron [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Updating port f35157ad-0f62-41af-962e-a3afcd66400e with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 23 05:06:16 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:16Z|00452|binding|INFO|Claiming lport 5b468015-6b03-496b-acb0-201ef16d849d for this chassis.
Jan 23 05:06:16 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:16Z|00453|binding|INFO|5b468015-6b03-496b-acb0-201ef16d849d: Claiming fa:16:3e:87:17:9b 10.100.0.6
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.571 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:16 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:16Z|00454|binding|INFO|Setting lport 5b468015-6b03-496b-acb0-201ef16d849d ovn-installed in OVS
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.587 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.590 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:16 np0005593234 systemd-machined[195626]: New machine qemu-51-instance-00000072.
Jan 23 05:06:16 np0005593234 systemd-udevd[280787]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:06:16 np0005593234 systemd[1]: Started Virtual Machine qemu-51-instance-00000072.
Jan 23 05:06:16 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:16Z|00455|binding|INFO|Setting lport 5b468015-6b03-496b-acb0-201ef16d849d up in Southbound
Jan 23 05:06:16 np0005593234 NetworkManager[48942]: <info>  [1769162776.6166] device (tap5b468015-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:06:16 np0005593234 NetworkManager[48942]: <info>  [1769162776.6173] device (tap5b468015-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.615 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:17:9b 10.100.0.6'], port_security=['fa:16:3e:87:17:9b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '06ab5530-6f75-4f7d-80cd-48cf4c63cfd9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '5', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=5b468015-6b03-496b-acb0-201ef16d849d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.616 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 5b468015-6b03-496b-acb0-201ef16d849d in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c bound to our chassis#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.618 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee03d7c9-e107-41bf-95cc-5508578ad66c#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.629 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b0fd4ad2-6ca6-4cd2-9737-8e3a7631d83e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.630 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee03d7c9-e1 in ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.632 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee03d7c9-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.632 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5098a136-30e5-4f3d-a987-5d8ef30106a8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.633 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[175a5f9b-4755-4422-b590-787211721c80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.644 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[801d26c6-eba4-4bb7-a56d-aa1ce689bc5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.658 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6fc46b91-10a7-4c56-a24a-977eedb3fc18]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.683 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ada8a8b8-45ed-4879-a79d-b18fc5598b10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.687 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a5a993de-069f-4a6b-b34d-ac1b619e5ae4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 NetworkManager[48942]: <info>  [1769162776.6887] manager: (tapee03d7c9-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/235)
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.715 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d8cde9af-a359-487f-b154-5e00bdc87e16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.718 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[2bec0438-8a57-4ae8-9471-9679ecc64325]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 NetworkManager[48942]: <info>  [1769162776.7409] device (tapee03d7c9-e0): carrier: link connected
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.746 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[98d8498a-fc90-4718-9484-3ecf4a487ed8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.762 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9812cebc-8d74-49f3-8ce3-09171c0e42bc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 146], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 667170, 'reachable_time': 35304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 280820, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.778 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fde19a48-0726-4ec8-8b1b-49be067fa772]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:6530'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 667170, 'tstamp': 667170}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 280821, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.793 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9b702323-f3c7-493b-87e2-974e4b5e1287]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 146], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 667170, 'reachable_time': 35304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 280822, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.821 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c13a0804-fc16-48ef-9e04-77144fc17ce2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.878 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f5f3c19d-8d36-4530-9a55-8172e6e004f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.880 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.880 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.880 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee03d7c9-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:16 np0005593234 NetworkManager[48942]: <info>  [1769162776.8826] manager: (tapee03d7c9-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/236)
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.882 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:16 np0005593234 kernel: tapee03d7c9-e0: entered promiscuous mode
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.885 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.886 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee03d7c9-e0, col_values=(('external_ids', {'iface-id': '702d4523-a665-42f5-9a36-57d187c0698a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.887 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:16 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:16Z|00456|binding|INFO|Releasing lport 702d4523-a665-42f5-9a36-57d187c0698a from this chassis (sb_readonly=1)
Jan 23 05:06:16 np0005593234 nova_compute[227762]: 2026-01-23 10:06:16.899 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.900 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.901 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6d5e65a3-a010-4500-b594-3bec358968df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.901 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:06:16 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:16.902 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'env', 'PROCESS_TAG=haproxy-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee03d7c9-e107-41bf-95cc-5508578ad66c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:06:17 np0005593234 nova_compute[227762]: 2026-01-23 10:06:17.026 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162777.026147, 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:06:17 np0005593234 nova_compute[227762]: 2026-01-23 10:06:17.027 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:06:17 np0005593234 nova_compute[227762]: 2026-01-23 10:06:17.029 227766 DEBUG nova.compute.manager [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:06:17 np0005593234 nova_compute[227762]: 2026-01-23 10:06:17.032 227766 INFO nova.virt.libvirt.driver [-] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Instance rebooted successfully.#033[00m
Jan 23 05:06:17 np0005593234 nova_compute[227762]: 2026-01-23 10:06:17.033 227766 DEBUG nova.compute.manager [None req-01b2a12f-ef7d-44de-8831-a5e0ae48a2a3 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:17.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:17 np0005593234 podman[280894]: 2026-01-23 10:06:17.25498172 +0000 UTC m=+0.028203243 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:06:17 np0005593234 podman[280894]: 2026-01-23 10:06:17.376392027 +0000 UTC m=+0.149613530 container create df0eea9ad9d1b64fbcb9a2b6360ba9f8deec5c5a77c6dfe8047b92467257246d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:06:17 np0005593234 systemd[1]: Started libpod-conmon-df0eea9ad9d1b64fbcb9a2b6360ba9f8deec5c5a77c6dfe8047b92467257246d.scope.
Jan 23 05:06:17 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:06:17 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8ee1f035f4d67072dd0f66a331a13de8c3934448e7468ac66e43ebc07e4c38e5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:06:17 np0005593234 podman[280894]: 2026-01-23 10:06:17.497549905 +0000 UTC m=+0.270771428 container init df0eea9ad9d1b64fbcb9a2b6360ba9f8deec5c5a77c6dfe8047b92467257246d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:06:17 np0005593234 podman[280894]: 2026-01-23 10:06:17.504443191 +0000 UTC m=+0.277664694 container start df0eea9ad9d1b64fbcb9a2b6360ba9f8deec5c5a77c6dfe8047b92467257246d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:06:17 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[280910]: [NOTICE]   (280914) : New worker (280916) forked
Jan 23 05:06:17 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[280910]: [NOTICE]   (280914) : Loading success.
Jan 23 05:06:17 np0005593234 nova_compute[227762]: 2026-01-23 10:06:17.891 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:17 np0005593234 nova_compute[227762]: 2026-01-23 10:06:17.896 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:06:18 np0005593234 nova_compute[227762]: 2026-01-23 10:06:18.008 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 23 05:06:18 np0005593234 nova_compute[227762]: 2026-01-23 10:06:18.009 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162777.0269904, 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:06:18 np0005593234 nova_compute[227762]: 2026-01-23 10:06:18.009 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] VM Started (Lifecycle Event)#033[00m
Jan 23 05:06:18 np0005593234 nova_compute[227762]: 2026-01-23 10:06:18.129 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:18 np0005593234 nova_compute[227762]: 2026-01-23 10:06:18.133 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:06:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:18.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:18 np0005593234 nova_compute[227762]: 2026-01-23 10:06:18.817 227766 DEBUG nova.compute.manager [req-f86df214-6390-4961-a8f1-8176ec354ea6 req-74374529-2660-41e6-9858-ff5cded84979 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:18 np0005593234 nova_compute[227762]: 2026-01-23 10:06:18.817 227766 DEBUG oslo_concurrency.lockutils [req-f86df214-6390-4961-a8f1-8176ec354ea6 req-74374529-2660-41e6-9858-ff5cded84979 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:18 np0005593234 nova_compute[227762]: 2026-01-23 10:06:18.818 227766 DEBUG oslo_concurrency.lockutils [req-f86df214-6390-4961-a8f1-8176ec354ea6 req-74374529-2660-41e6-9858-ff5cded84979 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:18 np0005593234 nova_compute[227762]: 2026-01-23 10:06:18.818 227766 DEBUG oslo_concurrency.lockutils [req-f86df214-6390-4961-a8f1-8176ec354ea6 req-74374529-2660-41e6-9858-ff5cded84979 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:18 np0005593234 nova_compute[227762]: 2026-01-23 10:06:18.818 227766 DEBUG nova.compute.manager [req-f86df214-6390-4961-a8f1-8176ec354ea6 req-74374529-2660-41e6-9858-ff5cded84979 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] No waiting events found dispatching network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:18 np0005593234 nova_compute[227762]: 2026-01-23 10:06:18.818 227766 WARNING nova.compute.manager [req-f86df214-6390-4961-a8f1-8176ec354ea6 req-74374529-2660-41e6-9858-ff5cded84979 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received unexpected event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d for instance with vm_state active and task_state None.#033[00m
Jan 23 05:06:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:19.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:19 np0005593234 nova_compute[227762]: 2026-01-23 10:06:19.400 227766 DEBUG oslo_concurrency.lockutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "refresh_cache-483afeac-561b-48ff-89d6-d02d1b615fc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:06:19 np0005593234 nova_compute[227762]: 2026-01-23 10:06:19.400 227766 DEBUG oslo_concurrency.lockutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquired lock "refresh_cache-483afeac-561b-48ff-89d6-d02d1b615fc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:06:19 np0005593234 nova_compute[227762]: 2026-01-23 10:06:19.401 227766 DEBUG nova.network.neutron [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:06:19 np0005593234 nova_compute[227762]: 2026-01-23 10:06:19.452 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:19.452 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=43, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=42) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:06:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:19.454 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:06:19 np0005593234 nova_compute[227762]: 2026-01-23 10:06:19.853 227766 DEBUG nova.compute.manager [req-9be21528-3741-4bcd-8c70-cef1125531f1 req-acc1caeb-d312-45d5-a999-980f8c89c69f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Received event network-changed-f35157ad-0f62-41af-962e-a3afcd66400e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:19 np0005593234 nova_compute[227762]: 2026-01-23 10:06:19.854 227766 DEBUG nova.compute.manager [req-9be21528-3741-4bcd-8c70-cef1125531f1 req-acc1caeb-d312-45d5-a999-980f8c89c69f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Refreshing instance network info cache due to event network-changed-f35157ad-0f62-41af-962e-a3afcd66400e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:06:19 np0005593234 nova_compute[227762]: 2026-01-23 10:06:19.854 227766 DEBUG oslo_concurrency.lockutils [req-9be21528-3741-4bcd-8c70-cef1125531f1 req-acc1caeb-d312-45d5-a999-980f8c89c69f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-483afeac-561b-48ff-89d6-d02d1b615fc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #106. Immutable memtables: 0.
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.522587) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 65] Flushing memtable with next log file: 106
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162780522691, "job": 65, "event": "flush_started", "num_memtables": 1, "num_entries": 906, "num_deletes": 251, "total_data_size": 1699916, "memory_usage": 1732880, "flush_reason": "Manual Compaction"}
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 65] Level-0 flush table #107: started
Jan 23 05:06:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:20.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162780574074, "cf_name": "default", "job": 65, "event": "table_file_creation", "file_number": 107, "file_size": 1121837, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 53604, "largest_seqno": 54505, "table_properties": {"data_size": 1117637, "index_size": 1852, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 9647, "raw_average_key_size": 19, "raw_value_size": 1109191, "raw_average_value_size": 2277, "num_data_blocks": 81, "num_entries": 487, "num_filter_entries": 487, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162717, "oldest_key_time": 1769162717, "file_creation_time": 1769162780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 107, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 65] Flush lasted 51542 microseconds, and 3976 cpu microseconds.
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.574143) [db/flush_job.cc:967] [default] [JOB 65] Level-0 flush table #107: 1121837 bytes OK
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.574163) [db/memtable_list.cc:519] [default] Level-0 commit table #107 started
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.626166) [db/memtable_list.cc:722] [default] Level-0 commit table #107: memtable #1 done
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.626210) EVENT_LOG_v1 {"time_micros": 1769162780626201, "job": 65, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.626231) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 65] Try to delete WAL files size 1695267, prev total WAL file size 1695267, number of live WAL files 2.
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000103.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.626987) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034353138' seq:72057594037927935, type:22 .. '7061786F730034373730' seq:0, type:0; will stop at (end)
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 66] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 65 Base level 0, inputs: [107(1095KB)], [105(12MB)]
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162780627105, "job": 66, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [107], "files_L6": [105], "score": -1, "input_data_size": 14330437, "oldest_snapshot_seqno": -1}
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 66] Generated table #108: 7670 keys, 12511656 bytes, temperature: kUnknown
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162780754879, "cf_name": "default", "job": 66, "event": "table_file_creation", "file_number": 108, "file_size": 12511656, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12459453, "index_size": 31931, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19205, "raw_key_size": 198748, "raw_average_key_size": 25, "raw_value_size": 12321598, "raw_average_value_size": 1606, "num_data_blocks": 1261, "num_entries": 7670, "num_filter_entries": 7670, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769162780, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 108, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.755139) [db/compaction/compaction_job.cc:1663] [default] [JOB 66] Compacted 1@0 + 1@6 files to L6 => 12511656 bytes
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.773380) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 112.1 rd, 97.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 12.6 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(23.9) write-amplify(11.2) OK, records in: 8184, records dropped: 514 output_compression: NoCompression
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.773424) EVENT_LOG_v1 {"time_micros": 1769162780773406, "job": 66, "event": "compaction_finished", "compaction_time_micros": 127851, "compaction_time_cpu_micros": 27147, "output_level": 6, "num_output_files": 1, "total_output_size": 12511656, "num_input_records": 8184, "num_output_records": 7670, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000107.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162780773903, "job": 66, "event": "table_file_deletion", "file_number": 107}
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000105.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162780776298, "job": 66, "event": "table_file_deletion", "file_number": 105}
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.626885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.776403) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.776408) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.776409) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.776410) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:06:20.776412) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:06:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:21.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:21 np0005593234 nova_compute[227762]: 2026-01-23 10:06:21.378 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:21 np0005593234 nova_compute[227762]: 2026-01-23 10:06:21.477 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:06:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:22.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:06:23 np0005593234 nova_compute[227762]: 2026-01-23 10:06:23.285 227766 DEBUG nova.compute.manager [req-16f26542-78b2-4e11-a074-9e1ebfd40dd8 req-f1b3cd0a-76a5-43b5-ab5c-caa992b60889 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:23 np0005593234 nova_compute[227762]: 2026-01-23 10:06:23.286 227766 DEBUG oslo_concurrency.lockutils [req-16f26542-78b2-4e11-a074-9e1ebfd40dd8 req-f1b3cd0a-76a5-43b5-ab5c-caa992b60889 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:23 np0005593234 nova_compute[227762]: 2026-01-23 10:06:23.286 227766 DEBUG oslo_concurrency.lockutils [req-16f26542-78b2-4e11-a074-9e1ebfd40dd8 req-f1b3cd0a-76a5-43b5-ab5c-caa992b60889 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:23 np0005593234 nova_compute[227762]: 2026-01-23 10:06:23.287 227766 DEBUG oslo_concurrency.lockutils [req-16f26542-78b2-4e11-a074-9e1ebfd40dd8 req-f1b3cd0a-76a5-43b5-ab5c-caa992b60889 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:23 np0005593234 nova_compute[227762]: 2026-01-23 10:06:23.287 227766 DEBUG nova.compute.manager [req-16f26542-78b2-4e11-a074-9e1ebfd40dd8 req-f1b3cd0a-76a5-43b5-ab5c-caa992b60889 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] No waiting events found dispatching network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:23 np0005593234 nova_compute[227762]: 2026-01-23 10:06:23.287 227766 WARNING nova.compute.manager [req-16f26542-78b2-4e11-a074-9e1ebfd40dd8 req-f1b3cd0a-76a5-43b5-ab5c-caa992b60889 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received unexpected event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d for instance with vm_state active and task_state None.#033[00m
Jan 23 05:06:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:06:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:23.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:06:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:24.456 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '43'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:24.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:25.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:26 np0005593234 nova_compute[227762]: 2026-01-23 10:06:26.380 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:26 np0005593234 nova_compute[227762]: 2026-01-23 10:06:26.478 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:26.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:27.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:27 np0005593234 nova_compute[227762]: 2026-01-23 10:06:27.467 227766 DEBUG oslo_concurrency.lockutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "735c1181-0c35-4816-96b4-886b145e0c0c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:27 np0005593234 nova_compute[227762]: 2026-01-23 10:06:27.468 227766 DEBUG oslo_concurrency.lockutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "735c1181-0c35-4816-96b4-886b145e0c0c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:27 np0005593234 nova_compute[227762]: 2026-01-23 10:06:27.549 227766 DEBUG nova.compute.manager [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:06:27 np0005593234 nova_compute[227762]: 2026-01-23 10:06:27.670 227766 DEBUG oslo_concurrency.lockutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:27 np0005593234 nova_compute[227762]: 2026-01-23 10:06:27.671 227766 DEBUG oslo_concurrency.lockutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:27 np0005593234 nova_compute[227762]: 2026-01-23 10:06:27.677 227766 DEBUG nova.virt.hardware [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:06:27 np0005593234 nova_compute[227762]: 2026-01-23 10:06:27.678 227766 INFO nova.compute.claims [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:06:28 np0005593234 nova_compute[227762]: 2026-01-23 10:06:28.002 227766 DEBUG oslo_concurrency.processutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:28Z|00457|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 23 05:06:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:06:28 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2892758350' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:06:28 np0005593234 nova_compute[227762]: 2026-01-23 10:06:28.421 227766 DEBUG oslo_concurrency.processutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:28 np0005593234 nova_compute[227762]: 2026-01-23 10:06:28.428 227766 DEBUG nova.compute.provider_tree [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:06:28 np0005593234 nova_compute[227762]: 2026-01-23 10:06:28.480 227766 DEBUG nova.scheduler.client.report [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:06:28 np0005593234 nova_compute[227762]: 2026-01-23 10:06:28.530 227766 DEBUG oslo_concurrency.lockutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:28 np0005593234 nova_compute[227762]: 2026-01-23 10:06:28.531 227766 DEBUG nova.compute.manager [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:06:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:28.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:28 np0005593234 nova_compute[227762]: 2026-01-23 10:06:28.634 227766 DEBUG nova.compute.manager [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 23 05:06:28 np0005593234 nova_compute[227762]: 2026-01-23 10:06:28.674 227766 INFO nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:06:28 np0005593234 nova_compute[227762]: 2026-01-23 10:06:28.698 227766 DEBUG nova.compute.manager [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:06:28 np0005593234 nova_compute[227762]: 2026-01-23 10:06:28.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:28 np0005593234 nova_compute[227762]: 2026-01-23 10:06:28.970 227766 DEBUG nova.network.neutron [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Updating instance_info_cache with network_info: [{"id": "f35157ad-0f62-41af-962e-a3afcd66400e", "address": "fa:16:3e:15:67:98", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf35157ad-0f", "ovs_interfaceid": "f35157ad-0f62-41af-962e-a3afcd66400e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.164 227766 DEBUG oslo_concurrency.lockutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Releasing lock "refresh_cache-483afeac-561b-48ff-89d6-d02d1b615fc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.168 227766 DEBUG oslo_concurrency.lockutils [req-9be21528-3741-4bcd-8c70-cef1125531f1 req-acc1caeb-d312-45d5-a999-980f8c89c69f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-483afeac-561b-48ff-89d6-d02d1b615fc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.168 227766 DEBUG nova.network.neutron [req-9be21528-3741-4bcd-8c70-cef1125531f1 req-acc1caeb-d312-45d5-a999-980f8c89c69f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Refreshing network info cache for port f35157ad-0f62-41af-962e-a3afcd66400e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.232 227766 DEBUG nova.compute.manager [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.234 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.234 227766 INFO nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Creating image(s)#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.263 227766 DEBUG nova.storage.rbd_utils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.308 227766 DEBUG nova.storage.rbd_utils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:29.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.338 227766 DEBUG nova.storage.rbd_utils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.341 227766 DEBUG oslo_concurrency.processutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.367 227766 DEBUG os_brick.utils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.369 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.382 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.383 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[f3f58fb4-bd1c-4b7c-867e-60dceee355d8]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.385 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.397 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.397 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[cf1f6710-c033-4b07-abdd-1b2755a1a030]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.400 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.404 227766 DEBUG oslo_concurrency.processutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.405 227766 DEBUG oslo_concurrency.lockutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.406 227766 DEBUG oslo_concurrency.lockutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.406 227766 DEBUG oslo_concurrency.lockutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.432 227766 DEBUG nova.storage.rbd_utils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.436 227766 DEBUG oslo_concurrency.processutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 735c1181-0c35-4816-96b4-886b145e0c0c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.411 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.412 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[6f42709d-6bce-4d46-a673-e3c1480006f7]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.464 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[1d6dbfff-7c49-4df0-9ded-2d681ecc7155]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.465 227766 DEBUG oslo_concurrency.processutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.486 227766 DEBUG oslo_concurrency.processutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "nvme version" returned: 0 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.489 227766 DEBUG os_brick.initiator.connectors.lightos [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.489 227766 DEBUG os_brick.initiator.connectors.lightos [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.490 227766 DEBUG os_brick.initiator.connectors.lightos [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.490 227766 DEBUG os_brick.utils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] <== get_connector_properties: return (122ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:06:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:29Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:87:17:9b 10.100.0.6
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.925 227766 DEBUG oslo_concurrency.processutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 735c1181-0c35-4816-96b4-886b145e0c0c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:29 np0005593234 nova_compute[227762]: 2026-01-23 10:06:29.996 227766 DEBUG nova.storage.rbd_utils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] resizing rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.102 227766 DEBUG nova.objects.instance [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lazy-loading 'migration_context' on Instance uuid 735c1181-0c35-4816-96b4-886b145e0c0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.136 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.137 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Ensure instance console log exists: /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.137 227766 DEBUG oslo_concurrency.lockutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.138 227766 DEBUG oslo_concurrency.lockutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.138 227766 DEBUG oslo_concurrency.lockutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.139 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.144 227766 WARNING nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.151 227766 DEBUG nova.virt.libvirt.host [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.152 227766 DEBUG nova.virt.libvirt.host [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.157 227766 DEBUG nova.virt.libvirt.host [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.158 227766 DEBUG nova.virt.libvirt.host [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.159 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.159 227766 DEBUG nova.virt.hardware [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.159 227766 DEBUG nova.virt.hardware [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.159 227766 DEBUG nova.virt.hardware [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.160 227766 DEBUG nova.virt.hardware [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.160 227766 DEBUG nova.virt.hardware [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.160 227766 DEBUG nova.virt.hardware [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.160 227766 DEBUG nova.virt.hardware [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.160 227766 DEBUG nova.virt.hardware [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.161 227766 DEBUG nova.virt.hardware [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.161 227766 DEBUG nova.virt.hardware [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.161 227766 DEBUG nova.virt.hardware [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.163 227766 DEBUG oslo_concurrency.processutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:06:30 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/186568101' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:06:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:30.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.580 227766 DEBUG oslo_concurrency.processutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.606 227766 DEBUG nova.storage.rbd_utils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:30 np0005593234 nova_compute[227762]: 2026-01-23 10:06:30.610 227766 DEBUG oslo_concurrency.processutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:30 np0005593234 podman[281218]: 2026-01-23 10:06:30.778770295 +0000 UTC m=+0.075804522 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 23 05:06:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:06:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/740841838' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:06:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:06:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1488164210' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:06:31 np0005593234 nova_compute[227762]: 2026-01-23 10:06:31.034 227766 DEBUG oslo_concurrency.processutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:31 np0005593234 nova_compute[227762]: 2026-01-23 10:06:31.035 227766 DEBUG nova.objects.instance [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lazy-loading 'pci_devices' on Instance uuid 735c1181-0c35-4816-96b4-886b145e0c0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:31 np0005593234 nova_compute[227762]: 2026-01-23 10:06:31.069 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  <uuid>735c1181-0c35-4816-96b4-886b145e0c0c</uuid>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  <name>instance-00000076</name>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerShowV247Test-server-964731518</nova:name>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:06:30</nova:creationTime>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <nova:user uuid="e0fe7d252cd04174840bdf8dfefa3510">tempest-ServerShowV247Test-1613897366-project-member</nova:user>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <nova:project uuid="3ecb2c0cafc441fd9457198fe09cc97b">tempest-ServerShowV247Test-1613897366</nova:project>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <entry name="serial">735c1181-0c35-4816-96b4-886b145e0c0c</entry>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <entry name="uuid">735c1181-0c35-4816-96b4-886b145e0c0c</entry>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/735c1181-0c35-4816-96b4-886b145e0c0c_disk">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/735c1181-0c35-4816-96b4-886b145e0c0c_disk.config">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/console.log" append="off"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:06:31 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:06:31 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:06:31 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:06:31 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:06:31 np0005593234 nova_compute[227762]: 2026-01-23 10:06:31.269 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:06:31 np0005593234 nova_compute[227762]: 2026-01-23 10:06:31.269 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:06:31 np0005593234 nova_compute[227762]: 2026-01-23 10:06:31.269 227766 INFO nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Using config drive#033[00m
Jan 23 05:06:31 np0005593234 nova_compute[227762]: 2026-01-23 10:06:31.296 227766 DEBUG nova.storage.rbd_utils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:31.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:31 np0005593234 nova_compute[227762]: 2026-01-23 10:06:31.382 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:31 np0005593234 nova_compute[227762]: 2026-01-23 10:06:31.480 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:31 np0005593234 nova_compute[227762]: 2026-01-23 10:06:31.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:32.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:32 np0005593234 nova_compute[227762]: 2026-01-23 10:06:32.950 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:32 np0005593234 nova_compute[227762]: 2026-01-23 10:06:32.951 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:32 np0005593234 nova_compute[227762]: 2026-01-23 10:06:32.951 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:32 np0005593234 nova_compute[227762]: 2026-01-23 10:06:32.951 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:06:32 np0005593234 nova_compute[227762]: 2026-01-23 10:06:32.951 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:33.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:06:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/862517546' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.397 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.612 227766 INFO nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Creating config drive at /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/disk.config#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.618 227766 DEBUG oslo_concurrency.processutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu4hyz2pn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.723 227766 DEBUG nova.virt.libvirt.driver [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.724 227766 DEBUG nova.virt.libvirt.driver [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.725 227766 INFO nova.virt.libvirt.driver [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Creating image(s)#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.725 227766 DEBUG nova.virt.libvirt.driver [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.725 227766 DEBUG nova.virt.libvirt.driver [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Ensure instance console log exists: /var/lib/nova/instances/483afeac-561b-48ff-89d6-d02d1b615fc9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.726 227766 DEBUG oslo_concurrency.lockutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.726 227766 DEBUG oslo_concurrency.lockutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.726 227766 DEBUG oslo_concurrency.lockutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.729 227766 DEBUG nova.virt.libvirt.driver [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Start _get_guest_xml network_info=[{"id": "f35157ad-0f62-41af-962e-a3afcd66400e", "address": "fa:16:3e:15:67:98", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherA-2130726771-network", "vif_mac": "fa:16:3e:15:67:98"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf35157ad-0f", "ovs_interfaceid": "f35157ad-0f62-41af-962e-a3afcd66400e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'boot_index': 0, 'mount_device': '/dev/vda', 'delete_on_termination': True, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-1337690c-8061-4b7e-bb70-8cbfeecc77ac', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '1337690c-8061-4b7e-bb70-8cbfeecc77ac', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'attaching', 'instance': '483afeac-561b-48ff-89d6-d02d1b615fc9', 'attached_at': '2026-01-23T10:06:31.000000', 'detached_at': '', 'volume_id': '1337690c-8061-4b7e-bb70-8cbfeecc77ac', 'serial': '1337690c-8061-4b7e-bb70-8cbfeecc77ac'}, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '4b5e30a1-7a86-47bd-b514-be1adeaa07a1', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.734 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000076 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.735 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000076 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.737 227766 WARNING nova.virt.libvirt.driver [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.739 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.739 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000072 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.748 227766 DEBUG oslo_concurrency.processutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpu4hyz2pn" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.776 227766 DEBUG nova.storage.rbd_utils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.780 227766 DEBUG oslo_concurrency.processutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/disk.config 735c1181-0c35-4816-96b4-886b145e0c0c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.799 227766 DEBUG nova.virt.libvirt.host [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.801 227766 DEBUG nova.virt.libvirt.host [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.805 227766 DEBUG nova.virt.libvirt.host [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.805 227766 DEBUG nova.virt.libvirt.host [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.807 227766 DEBUG nova.virt.libvirt.driver [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.807 227766 DEBUG nova.virt.hardware [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eebea5f8-9b11-45ad-873d-c4ea90d3de87',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.808 227766 DEBUG nova.virt.hardware [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.808 227766 DEBUG nova.virt.hardware [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.808 227766 DEBUG nova.virt.hardware [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.808 227766 DEBUG nova.virt.hardware [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.808 227766 DEBUG nova.virt.hardware [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.809 227766 DEBUG nova.virt.hardware [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.809 227766 DEBUG nova.virt.hardware [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.809 227766 DEBUG nova.virt.hardware [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.809 227766 DEBUG nova.virt.hardware [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.809 227766 DEBUG nova.virt.hardware [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.810 227766 DEBUG nova.objects.instance [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'vcpu_model' on Instance uuid 483afeac-561b-48ff-89d6-d02d1b615fc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:33 np0005593234 nova_compute[227762]: 2026-01-23 10:06:33.911 227766 DEBUG oslo_concurrency.processutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.002 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.003 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4297MB free_disk=20.80972671508789GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.003 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.004 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.143 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Applying migration context for instance 483afeac-561b-48ff-89d6-d02d1b615fc9 as it has an incoming, in-progress migration df920c0b-dafc-41b8-b8ba-e843582c7bd4. Migration status is post-migrating _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:950#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.144 227766 INFO nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Updating resource usage from migration df920c0b-dafc-41b8-b8ba-e843582c7bd4#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.172 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.173 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 483afeac-561b-48ff-89d6-d02d1b615fc9 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 192, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.173 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 735c1181-0c35-4816-96b4-886b145e0c0c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.173 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.173 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=960MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.297 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:06:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3379460646' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.347 227766 DEBUG oslo_concurrency.processutils [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.370 227766 DEBUG nova.virt.libvirt.vif [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1175667419',display_name='tempest-ServerActionsTestOtherA-server-1175667419',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1175667419',id=116,image_ref='',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPDPMbCnqcp11s7OR05vsDdiZlZSU5ZbBJSLaqQpawTODCANj+91AmOb6Hdh0FgzlQPvmSu+VYXOLfZik0SA3L4m61/nruOol9dJ9Mz34f8cV2NJKksVR2Ar2t+W5r4M6w==',key_name='tempest-keypair-2078677939',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:05:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8c16cd713fa74a88b43e4edf01c273bd',ramdisk_id='',reservation_id='r-z67ffbwd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',old_vm_state='active',owner_project_name='tempest-ServerActionsTestOtherA-882763067',owner_user_name='tempest-ServerActionsTestOtherA-882763067-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:06:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='29710db389c842df836944048225740f',uuid=483afeac-561b-48ff-89d6-d02d1b615fc9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f35157ad-0f62-41af-962e-a3afcd66400e", "address": "fa:16:3e:15:67:98", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherA-2130726771-network", "vif_mac": "fa:16:3e:15:67:98"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf35157ad-0f", "ovs_interfaceid": "f35157ad-0f62-41af-962e-a3afcd66400e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.371 227766 DEBUG nova.network.os_vif_util [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converting VIF {"id": "f35157ad-0f62-41af-962e-a3afcd66400e", "address": "fa:16:3e:15:67:98", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherA-2130726771-network", "vif_mac": "fa:16:3e:15:67:98"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf35157ad-0f", "ovs_interfaceid": "f35157ad-0f62-41af-962e-a3afcd66400e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.371 227766 DEBUG nova.network.os_vif_util [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:67:98,bridge_name='br-int',has_traffic_filtering=True,id=f35157ad-0f62-41af-962e-a3afcd66400e,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf35157ad-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.373 227766 DEBUG nova.virt.libvirt.driver [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  <uuid>483afeac-561b-48ff-89d6-d02d1b615fc9</uuid>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  <name>instance-00000074</name>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  <memory>196608</memory>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerActionsTestOtherA-server-1175667419</nova:name>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:06:33</nova:creationTime>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.micro">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <nova:memory>192</nova:memory>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <nova:user uuid="29710db389c842df836944048225740f">tempest-ServerActionsTestOtherA-882763067-project-member</nova:user>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <nova:project uuid="8c16cd713fa74a88b43e4edf01c273bd">tempest-ServerActionsTestOtherA-882763067</nova:project>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <nova:port uuid="f35157ad-0f62-41af-962e-a3afcd66400e">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <entry name="serial">483afeac-561b-48ff-89d6-d02d1b615fc9</entry>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <entry name="uuid">483afeac-561b-48ff-89d6-d02d1b615fc9</entry>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/483afeac-561b-48ff-89d6-d02d1b615fc9_disk.config">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="volumes/volume-1337690c-8061-4b7e-bb70-8cbfeecc77ac">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <serial>1337690c-8061-4b7e-bb70-8cbfeecc77ac</serial>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:15:67:98"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <target dev="tapf35157ad-0f"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/483afeac-561b-48ff-89d6-d02d1b615fc9/console.log" append="off"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:06:34 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:06:34 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:06:34 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:06:34 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.374 227766 DEBUG nova.virt.libvirt.vif [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1175667419',display_name='tempest-ServerActionsTestOtherA-server-1175667419',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1175667419',id=116,image_ref='',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPDPMbCnqcp11s7OR05vsDdiZlZSU5ZbBJSLaqQpawTODCANj+91AmOb6Hdh0FgzlQPvmSu+VYXOLfZik0SA3L4m61/nruOol9dJ9Mz34f8cV2NJKksVR2Ar2t+W5r4M6w==',key_name='tempest-keypair-2078677939',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:05:42Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8c16cd713fa74a88b43e4edf01c273bd',ramdisk_id='',reservation_id='r-z67ffbwd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',old_vm_state='active',owner_project_name='tempest-ServerActionsTestOtherA-882763067',owner_user_name='tempest-ServerActionsTestOtherA-882763067-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:06:15Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='29710db389c842df836944048225740f',uuid=483afeac-561b-48ff-89d6-d02d1b615fc9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f35157ad-0f62-41af-962e-a3afcd66400e", "address": "fa:16:3e:15:67:98", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": 
[{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherA-2130726771-network", "vif_mac": "fa:16:3e:15:67:98"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf35157ad-0f", "ovs_interfaceid": "f35157ad-0f62-41af-962e-a3afcd66400e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.374 227766 DEBUG nova.network.os_vif_util [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converting VIF {"id": "f35157ad-0f62-41af-962e-a3afcd66400e", "address": "fa:16:3e:15:67:98", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherA-2130726771-network", "vif_mac": "fa:16:3e:15:67:98"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf35157ad-0f", "ovs_interfaceid": "f35157ad-0f62-41af-962e-a3afcd66400e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.375 227766 DEBUG nova.network.os_vif_util [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:15:67:98,bridge_name='br-int',has_traffic_filtering=True,id=f35157ad-0f62-41af-962e-a3afcd66400e,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf35157ad-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.375 227766 DEBUG os_vif [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:67:98,bridge_name='br-int',has_traffic_filtering=True,id=f35157ad-0f62-41af-962e-a3afcd66400e,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf35157ad-0f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.376 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.376 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.377 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.380 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.380 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf35157ad-0f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.380 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf35157ad-0f, col_values=(('external_ids', {'iface-id': 'f35157ad-0f62-41af-962e-a3afcd66400e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:15:67:98', 'vm-uuid': '483afeac-561b-48ff-89d6-d02d1b615fc9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.381 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:34 np0005593234 NetworkManager[48942]: <info>  [1769162794.3824] manager: (tapf35157ad-0f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/237)
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.383 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.389 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.389 227766 INFO os_vif [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:15:67:98,bridge_name='br-int',has_traffic_filtering=True,id=f35157ad-0f62-41af-962e-a3afcd66400e,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf35157ad-0f')#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.464 227766 DEBUG nova.virt.libvirt.driver [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.465 227766 DEBUG nova.virt.libvirt.driver [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.465 227766 DEBUG nova.virt.libvirt.driver [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] No VIF found with MAC fa:16:3e:15:67:98, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.465 227766 INFO nova.virt.libvirt.driver [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Using config drive#033[00m
Jan 23 05:06:34 np0005593234 kernel: tapf35157ad-0f: entered promiscuous mode
Jan 23 05:06:34 np0005593234 NetworkManager[48942]: <info>  [1769162794.5423] manager: (tapf35157ad-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/238)
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.542 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:34Z|00458|binding|INFO|Claiming lport f35157ad-0f62-41af-962e-a3afcd66400e for this chassis.
Jan 23 05:06:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:34Z|00459|binding|INFO|f35157ad-0f62-41af-962e-a3afcd66400e: Claiming fa:16:3e:15:67:98 10.100.0.5
Jan 23 05:06:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:34Z|00460|binding|INFO|Setting lport f35157ad-0f62-41af-962e-a3afcd66400e ovn-installed in OVS
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.560 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.562 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:34 np0005593234 systemd-udevd[281430]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:06:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:34Z|00461|binding|INFO|Setting lport f35157ad-0f62-41af-962e-a3afcd66400e up in Southbound
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.572 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:67:98 10.100.0.5'], port_security=['fa:16:3e:15:67:98 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '483afeac-561b-48ff-89d6-d02d1b615fc9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c16cd713fa74a88b43e4edf01c273bd', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'b9910180-8b38-41b2-8cb3-4e4af7eb2c2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.238'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c3aed5f-30b8-4c57-808e-87764ab67fc8, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=f35157ad-0f62-41af-962e-a3afcd66400e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.574 144381 INFO neutron.agent.ovn.metadata.agent [-] Port f35157ad-0f62-41af-962e-a3afcd66400e in datapath 8575e824-4be0-4206-873e-2f9a3d1ded0b bound to our chassis#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.575 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8575e824-4be0-4206-873e-2f9a3d1ded0b#033[00m
Jan 23 05:06:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:34.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:34 np0005593234 NetworkManager[48942]: <info>  [1769162794.5868] device (tapf35157ad-0f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:06:34 np0005593234 NetworkManager[48942]: <info>  [1769162794.5881] device (tapf35157ad-0f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:06:34 np0005593234 systemd-machined[195626]: New machine qemu-52-instance-00000074.
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.592 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f715f60b-b79e-42c4-b32e-6bf8cf1d7541]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.597 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8575e824-41 in ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.599 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8575e824-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.599 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8e5dd082-3453-4d47-abda-c64db4bde52e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.600 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0ec85aad-565a-47a0-8a1f-8346dc6f6f4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 systemd[1]: Started Virtual Machine qemu-52-instance-00000074.
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.612 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f0d962-3831-47de-aa75-9d81213f082c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.635 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[91cdb223-52cd-4e3d-b7f6-12a2cf7be71e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.671 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[59370961-187b-4a0d-81c0-8bc1d7306002]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.679 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e7723b60-7a63-41ed-88bb-daf10db0f33f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 NetworkManager[48942]: <info>  [1769162794.6809] manager: (tap8575e824-40): new Veth device (/org/freedesktop/NetworkManager/Devices/239)
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.709 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9baf1a43-af9a-4a59-9d30-3eae3b85e5be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.712 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[04cbe440-69b4-465f-af40-fd5f3d8a4fb3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:06:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1197543214' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:06:34 np0005593234 NetworkManager[48942]: <info>  [1769162794.7367] device (tap8575e824-40): carrier: link connected
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.743 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.747 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[951a97fa-69f4-4302-9051-b997c6f55c68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.750 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.765 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b848d288-1fe5-40e1-872b-905852f9c7d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8575e824-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f2:16:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 148], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 668969, 'reachable_time': 42378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281468, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.779 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5ec4fecb-233d-4ff1-b982-bd4ef0c7345f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef2:16ca'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 668969, 'tstamp': 668969}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281469, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.787 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.793 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7e15db69-270e-45da-a0a1-9efa4b6c95d0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8575e824-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f2:16:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 148], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 668969, 'reachable_time': 42378, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 281470, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.814 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.815 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.811s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.825 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b323a280-827f-4ff2-b82b-e84694f78239]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.882 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6b2fa339-3829-4799-a327-bf7c13c6a88f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.884 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8575e824-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.884 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.884 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8575e824-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.886 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:34 np0005593234 kernel: tap8575e824-40: entered promiscuous mode
Jan 23 05:06:34 np0005593234 NetworkManager[48942]: <info>  [1769162794.8884] manager: (tap8575e824-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/240)
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.889 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8575e824-40, col_values=(('external_ids', {'iface-id': 'f7023d86-3158-4cc4-b690-f57bb76e92b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:34Z|00462|binding|INFO|Releasing lport f7023d86-3158-4cc4-b690-f57bb76e92b5 from this chassis (sb_readonly=0)
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.890 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:34 np0005593234 nova_compute[227762]: 2026-01-23 10:06:34.904 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.905 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8575e824-4be0-4206-873e-2f9a3d1ded0b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8575e824-4be0-4206-873e-2f9a3d1ded0b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.906 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e36c6960-3556-4497-996c-eacce611983e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.907 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-8575e824-4be0-4206-873e-2f9a3d1ded0b
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/8575e824-4be0-4206-873e-2f9a3d1ded0b.pid.haproxy
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 8575e824-4be0-4206-873e-2f9a3d1ded0b
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:06:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:34.909 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'env', 'PROCESS_TAG=haproxy-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8575e824-4be0-4206-873e-2f9a3d1ded0b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.029 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162795.028633, 483afeac-561b-48ff-89d6-d02d1b615fc9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.029 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.037 227766 DEBUG nova.compute.manager [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.041 227766 DEBUG oslo_concurrency.processutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/disk.config 735c1181-0c35-4816-96b4-886b145e0c0c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.262s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.042 227766 INFO nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Deleting local config drive /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/disk.config because it was imported into RBD.#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.043 227766 INFO nova.virt.libvirt.driver [-] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Instance running successfully.#033[00m
Jan 23 05:06:35 np0005593234 virtqemud[227483]: argument unsupported: QEMU guest agent is not configured
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.047 227766 DEBUG nova.virt.libvirt.guest [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.047 227766 DEBUG nova.virt.libvirt.driver [None req-a83e9424-9f46-4ec6-baee-dff8a11a74ed 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.061 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.066 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.102 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.102 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162795.03581, 483afeac-561b-48ff-89d6-d02d1b615fc9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.103 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] VM Started (Lifecycle Event)#033[00m
Jan 23 05:06:35 np0005593234 systemd-machined[195626]: New machine qemu-53-instance-00000076.
Jan 23 05:06:35 np0005593234 systemd[1]: Started Virtual Machine qemu-53-instance-00000076.
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.132 227766 DEBUG nova.compute.manager [req-337f2ec7-f944-4dc9-b7fb-017dc608ac6d req-3e3ee73f-2662-409f-8103-80829efe1a04 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Received event network-vif-plugged-f35157ad-0f62-41af-962e-a3afcd66400e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.132 227766 DEBUG oslo_concurrency.lockutils [req-337f2ec7-f944-4dc9-b7fb-017dc608ac6d req-3e3ee73f-2662-409f-8103-80829efe1a04 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.132 227766 DEBUG oslo_concurrency.lockutils [req-337f2ec7-f944-4dc9-b7fb-017dc608ac6d req-3e3ee73f-2662-409f-8103-80829efe1a04 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.132 227766 DEBUG oslo_concurrency.lockutils [req-337f2ec7-f944-4dc9-b7fb-017dc608ac6d req-3e3ee73f-2662-409f-8103-80829efe1a04 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.132 227766 DEBUG nova.compute.manager [req-337f2ec7-f944-4dc9-b7fb-017dc608ac6d req-3e3ee73f-2662-409f-8103-80829efe1a04 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] No waiting events found dispatching network-vif-plugged-f35157ad-0f62-41af-962e-a3afcd66400e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.133 227766 WARNING nova.compute.manager [req-337f2ec7-f944-4dc9-b7fb-017dc608ac6d req-3e3ee73f-2662-409f-8103-80829efe1a04 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Received unexpected event network-vif-plugged-f35157ad-0f62-41af-962e-a3afcd66400e for instance with vm_state active and task_state resize_finish.#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.169 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.173 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.202 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 23 05:06:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:35.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:35 np0005593234 podman[281559]: 2026-01-23 10:06:35.259313204 +0000 UTC m=+0.022567276 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.504 227766 DEBUG nova.network.neutron [req-9be21528-3741-4bcd-8c70-cef1125531f1 req-acc1caeb-d312-45d5-a999-980f8c89c69f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Updated VIF entry in instance network info cache for port f35157ad-0f62-41af-962e-a3afcd66400e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.505 227766 DEBUG nova.network.neutron [req-9be21528-3741-4bcd-8c70-cef1125531f1 req-acc1caeb-d312-45d5-a999-980f8c89c69f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Updating instance_info_cache with network_info: [{"id": "f35157ad-0f62-41af-962e-a3afcd66400e", "address": "fa:16:3e:15:67:98", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf35157ad-0f", "ovs_interfaceid": "f35157ad-0f62-41af-962e-a3afcd66400e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:06:35 np0005593234 podman[281559]: 2026-01-23 10:06:35.509065861 +0000 UTC m=+0.272319933 container create 61874f92d7968fa8d12d9ea2999238e2adde3041261f94443f25605bff40b0b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.534 227766 DEBUG oslo_concurrency.lockutils [req-9be21528-3741-4bcd-8c70-cef1125531f1 req-acc1caeb-d312-45d5-a999-980f8c89c69f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-483afeac-561b-48ff-89d6-d02d1b615fc9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:06:35 np0005593234 systemd[1]: Started libpod-conmon-61874f92d7968fa8d12d9ea2999238e2adde3041261f94443f25605bff40b0b4.scope.
Jan 23 05:06:35 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:06:35 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/15a59a5920ef80587e29a2cfb6e655639ad1178d9e60c592b36adc78f6a0a673/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:06:35 np0005593234 podman[281559]: 2026-01-23 10:06:35.582084128 +0000 UTC m=+0.345338220 container init 61874f92d7968fa8d12d9ea2999238e2adde3041261f94443f25605bff40b0b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 05:06:35 np0005593234 podman[281559]: 2026-01-23 10:06:35.587682453 +0000 UTC m=+0.350936525 container start 61874f92d7968fa8d12d9ea2999238e2adde3041261f94443f25605bff40b0b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 05:06:35 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[281580]: [NOTICE]   (281584) : New worker (281586) forked
Jan 23 05:06:35 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[281580]: [NOTICE]   (281584) : Loading success.
Jan 23 05:06:35 np0005593234 nova_compute[227762]: 2026-01-23 10:06:35.815 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.066 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162796.066309, 735c1181-0c35-4816-96b4-886b145e0c0c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.066 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.069 227766 DEBUG nova.compute.manager [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.069 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.072 227766 INFO nova.virt.libvirt.driver [-] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Instance spawned successfully.#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.073 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.113 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.113 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.114 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.114 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.115 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.115 227766 DEBUG nova.virt.libvirt.driver [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.123 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.126 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.159 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.160 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162796.069206, 735c1181-0c35-4816-96b4-886b145e0c0c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.160 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] VM Started (Lifecycle Event)#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.207 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.210 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.254 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.268 227766 INFO nova.compute.manager [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Took 7.03 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.268 227766 DEBUG nova.compute.manager [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.345 227766 INFO nova.compute.manager [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Took 8.72 seconds to build instance.#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.373 227766 DEBUG oslo_concurrency.lockutils [None req-d11a5fbc-5fb7-40cd-8fba-7a6f49cb6ef0 e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "735c1181-0c35-4816-96b4-886b145e0c0c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.906s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.384 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:36.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.635 227766 DEBUG nova.objects.instance [None req-803bcae0-0e2b-463f-a3c0-c4a1d4190030 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'pci_devices' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.667 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162796.6673775, 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.668 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.720 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.724 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:36 np0005593234 nova_compute[227762]: 2026-01-23 10:06:36.819 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 23 05:06:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:37.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:37 np0005593234 kernel: tap5b468015-6b (unregistering): left promiscuous mode
Jan 23 05:06:37 np0005593234 NetworkManager[48942]: <info>  [1769162797.8536] device (tap5b468015-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:06:37 np0005593234 nova_compute[227762]: 2026-01-23 10:06:37.871 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:37 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:37Z|00463|binding|INFO|Releasing lport 5b468015-6b03-496b-acb0-201ef16d849d from this chassis (sb_readonly=0)
Jan 23 05:06:37 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:37Z|00464|binding|INFO|Setting lport 5b468015-6b03-496b-acb0-201ef16d849d down in Southbound
Jan 23 05:06:37 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:37Z|00465|binding|INFO|Removing iface tap5b468015-6b ovn-installed in OVS
Jan 23 05:06:37 np0005593234 nova_compute[227762]: 2026-01-23 10:06:37.874 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:37 np0005593234 nova_compute[227762]: 2026-01-23 10:06:37.886 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:37 np0005593234 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000072.scope: Deactivated successfully.
Jan 23 05:06:37 np0005593234 systemd[1]: machine-qemu\x2d51\x2dinstance\x2d00000072.scope: Consumed 13.301s CPU time.
Jan 23 05:06:37 np0005593234 systemd-machined[195626]: Machine qemu-51-instance-00000072 terminated.
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.046 227766 DEBUG nova.compute.manager [None req-803bcae0-0e2b-463f-a3c0-c4a1d4190030 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.125 227766 DEBUG nova.compute.manager [req-e1a1057e-e9ad-4b83-bd68-3e5ae3f10407 req-dee2f310-3fa7-426f-878b-7e76075f008f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Received event network-vif-plugged-f35157ad-0f62-41af-962e-a3afcd66400e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.126 227766 DEBUG oslo_concurrency.lockutils [req-e1a1057e-e9ad-4b83-bd68-3e5ae3f10407 req-dee2f310-3fa7-426f-878b-7e76075f008f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.126 227766 DEBUG oslo_concurrency.lockutils [req-e1a1057e-e9ad-4b83-bd68-3e5ae3f10407 req-dee2f310-3fa7-426f-878b-7e76075f008f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.126 227766 DEBUG oslo_concurrency.lockutils [req-e1a1057e-e9ad-4b83-bd68-3e5ae3f10407 req-dee2f310-3fa7-426f-878b-7e76075f008f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.126 227766 DEBUG nova.compute.manager [req-e1a1057e-e9ad-4b83-bd68-3e5ae3f10407 req-dee2f310-3fa7-426f-878b-7e76075f008f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] No waiting events found dispatching network-vif-plugged-f35157ad-0f62-41af-962e-a3afcd66400e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.127 227766 WARNING nova.compute.manager [req-e1a1057e-e9ad-4b83-bd68-3e5ae3f10407 req-dee2f310-3fa7-426f-878b-7e76075f008f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Received unexpected event network-vif-plugged-f35157ad-0f62-41af-962e-a3afcd66400e for instance with vm_state resized and task_state None.#033[00m
Jan 23 05:06:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:38.354 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:17:9b 10.100.0.6'], port_security=['fa:16:3e:87:17:9b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '06ab5530-6f75-4f7d-80cd-48cf4c63cfd9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '6', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=5b468015-6b03-496b-acb0-201ef16d849d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:06:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:38.355 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 5b468015-6b03-496b-acb0-201ef16d849d in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c unbound from our chassis#033[00m
Jan 23 05:06:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:38.357 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee03d7c9-e107-41bf-95cc-5508578ad66c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:06:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:38.359 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bb2ef38f-0362-4d98-9a3d-a933c309f400]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:38.360 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace which is not needed anymore#033[00m
Jan 23 05:06:38 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[280910]: [NOTICE]   (280914) : haproxy version is 2.8.14-c23fe91
Jan 23 05:06:38 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[280910]: [NOTICE]   (280914) : path to executable is /usr/sbin/haproxy
Jan 23 05:06:38 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[280910]: [WARNING]  (280914) : Exiting Master process...
Jan 23 05:06:38 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[280910]: [ALERT]    (280914) : Current worker (280916) exited with code 143 (Terminated)
Jan 23 05:06:38 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[280910]: [WARNING]  (280914) : All workers exited. Exiting... (0)
Jan 23 05:06:38 np0005593234 systemd[1]: libpod-df0eea9ad9d1b64fbcb9a2b6360ba9f8deec5c5a77c6dfe8047b92467257246d.scope: Deactivated successfully.
Jan 23 05:06:38 np0005593234 podman[281668]: 2026-01-23 10:06:38.519540853 +0000 UTC m=+0.048219328 container died df0eea9ad9d1b64fbcb9a2b6360ba9f8deec5c5a77c6dfe8047b92467257246d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 05:06:38 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-df0eea9ad9d1b64fbcb9a2b6360ba9f8deec5c5a77c6dfe8047b92467257246d-userdata-shm.mount: Deactivated successfully.
Jan 23 05:06:38 np0005593234 systemd[1]: var-lib-containers-storage-overlay-8ee1f035f4d67072dd0f66a331a13de8c3934448e7468ac66e43ebc07e4c38e5-merged.mount: Deactivated successfully.
Jan 23 05:06:38 np0005593234 podman[281668]: 2026-01-23 10:06:38.553747562 +0000 UTC m=+0.082426037 container cleanup df0eea9ad9d1b64fbcb9a2b6360ba9f8deec5c5a77c6dfe8047b92467257246d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:06:38 np0005593234 systemd[1]: libpod-conmon-df0eea9ad9d1b64fbcb9a2b6360ba9f8deec5c5a77c6dfe8047b92467257246d.scope: Deactivated successfully.
Jan 23 05:06:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:38.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:38 np0005593234 podman[281696]: 2026-01-23 10:06:38.617730723 +0000 UTC m=+0.042705566 container remove df0eea9ad9d1b64fbcb9a2b6360ba9f8deec5c5a77c6dfe8047b92467257246d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 05:06:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:38.622 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bf449e91-d508-4da6-b769-9621bc3b116c]: (4, ('Fri Jan 23 10:06:38 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (df0eea9ad9d1b64fbcb9a2b6360ba9f8deec5c5a77c6dfe8047b92467257246d)\ndf0eea9ad9d1b64fbcb9a2b6360ba9f8deec5c5a77c6dfe8047b92467257246d\nFri Jan 23 10:06:38 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (df0eea9ad9d1b64fbcb9a2b6360ba9f8deec5c5a77c6dfe8047b92467257246d)\ndf0eea9ad9d1b64fbcb9a2b6360ba9f8deec5c5a77c6dfe8047b92467257246d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:38.623 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[23d14994-5246-4fcc-89fe-d5ba7bfa816e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:38.625 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:38 np0005593234 kernel: tapee03d7c9-e0: left promiscuous mode
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.627 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.647 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.648 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:38.652 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b14caa76-59dd-430c-b09a-c457b723a973]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:38.668 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[19a4337f-7102-4c51-8621-d240a29e5c65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:38.670 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cc6ab415-0086-48cc-b4a9-5313338a8ca0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:38.685 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3d7e0923-905c-470b-870d-b39f61d21750]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 667163, 'reachable_time': 28486, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281713, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:38.687 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:06:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:38.687 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[aa5c4e62-78c8-44d9-b359-3cf1bf86e714]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:38 np0005593234 systemd[1]: run-netns-ovnmeta\x2dee03d7c9\x2de107\x2d41bf\x2d95cc\x2d5508578ad66c.mount: Deactivated successfully.
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.966 227766 DEBUG nova.compute.manager [req-22154679-fb17-484a-b073-f984cb423bd4 req-2b28f45b-6211-4ad6-a895-26018c03e87f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-vif-unplugged-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.967 227766 DEBUG oslo_concurrency.lockutils [req-22154679-fb17-484a-b073-f984cb423bd4 req-2b28f45b-6211-4ad6-a895-26018c03e87f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.968 227766 DEBUG oslo_concurrency.lockutils [req-22154679-fb17-484a-b073-f984cb423bd4 req-2b28f45b-6211-4ad6-a895-26018c03e87f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.969 227766 DEBUG oslo_concurrency.lockutils [req-22154679-fb17-484a-b073-f984cb423bd4 req-2b28f45b-6211-4ad6-a895-26018c03e87f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.969 227766 DEBUG nova.compute.manager [req-22154679-fb17-484a-b073-f984cb423bd4 req-2b28f45b-6211-4ad6-a895-26018c03e87f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] No waiting events found dispatching network-vif-unplugged-5b468015-6b03-496b-acb0-201ef16d849d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:38 np0005593234 nova_compute[227762]: 2026-01-23 10:06:38.970 227766 WARNING nova.compute.manager [req-22154679-fb17-484a-b073-f984cb423bd4 req-2b28f45b-6211-4ad6-a895-26018c03e87f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received unexpected event network-vif-unplugged-5b468015-6b03-496b-acb0-201ef16d849d for instance with vm_state suspended and task_state None.#033[00m
Jan 23 05:06:39 np0005593234 nova_compute[227762]: 2026-01-23 10:06:39.332 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:06:39 np0005593234 nova_compute[227762]: 2026-01-23 10:06:39.333 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:06:39 np0005593234 nova_compute[227762]: 2026-01-23 10:06:39.333 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:06:39 np0005593234 nova_compute[227762]: 2026-01-23 10:06:39.333 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:39.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:39 np0005593234 nova_compute[227762]: 2026-01-23 10:06:39.458 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:40 np0005593234 nova_compute[227762]: 2026-01-23 10:06:40.471 227766 INFO nova.compute.manager [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Rebuilding instance#033[00m
Jan 23 05:06:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:40.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:40 np0005593234 podman[281715]: 2026-01-23 10:06:40.783437948 +0000 UTC m=+0.081907052 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 23 05:06:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.077 227766 DEBUG nova.objects.instance [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lazy-loading 'trusted_certs' on Instance uuid 735c1181-0c35-4816-96b4-886b145e0c0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.149 227766 DEBUG nova.compute.manager [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.273 227766 INFO nova.compute.manager [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Resuming#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.274 227766 DEBUG nova.objects.instance [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'flavor' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.323 227766 DEBUG nova.objects.instance [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lazy-loading 'pci_requests' on Instance uuid 735c1181-0c35-4816-96b4-886b145e0c0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:41.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.374 227766 DEBUG oslo_concurrency.lockutils [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.378 227766 DEBUG nova.objects.instance [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lazy-loading 'pci_devices' on Instance uuid 735c1181-0c35-4816-96b4-886b145e0c0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.383 227766 DEBUG nova.compute.manager [req-e55c4bfa-b83d-4b3c-8c07-c67fdee0f435 req-f89493b0-09cf-4bc3-b558-c817ef622f02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.383 227766 DEBUG oslo_concurrency.lockutils [req-e55c4bfa-b83d-4b3c-8c07-c67fdee0f435 req-f89493b0-09cf-4bc3-b558-c817ef622f02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.383 227766 DEBUG oslo_concurrency.lockutils [req-e55c4bfa-b83d-4b3c-8c07-c67fdee0f435 req-f89493b0-09cf-4bc3-b558-c817ef622f02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.384 227766 DEBUG oslo_concurrency.lockutils [req-e55c4bfa-b83d-4b3c-8c07-c67fdee0f435 req-f89493b0-09cf-4bc3-b558-c817ef622f02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.384 227766 DEBUG nova.compute.manager [req-e55c4bfa-b83d-4b3c-8c07-c67fdee0f435 req-f89493b0-09cf-4bc3-b558-c817ef622f02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] No waiting events found dispatching network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.384 227766 WARNING nova.compute.manager [req-e55c4bfa-b83d-4b3c-8c07-c67fdee0f435 req-f89493b0-09cf-4bc3-b558-c817ef622f02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received unexpected event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d for instance with vm_state suspended and task_state resuming.#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.385 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.408 227766 DEBUG nova.objects.instance [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lazy-loading 'resources' on Instance uuid 735c1181-0c35-4816-96b4-886b145e0c0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.476 227766 DEBUG nova.objects.instance [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lazy-loading 'migration_context' on Instance uuid 735c1181-0c35-4816-96b4-886b145e0c0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.501 227766 DEBUG nova.objects.instance [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 05:06:41 np0005593234 nova_compute[227762]: 2026-01-23 10:06:41.505 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:06:42 np0005593234 nova_compute[227762]: 2026-01-23 10:06:42.160 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Updating instance_info_cache with network_info: [{"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:06:42 np0005593234 nova_compute[227762]: 2026-01-23 10:06:42.207 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:06:42 np0005593234 nova_compute[227762]: 2026-01-23 10:06:42.208 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:06:42 np0005593234 nova_compute[227762]: 2026-01-23 10:06:42.210 227766 DEBUG oslo_concurrency.lockutils [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquired lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:06:42 np0005593234 nova_compute[227762]: 2026-01-23 10:06:42.210 227766 DEBUG nova.network.neutron [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:06:42 np0005593234 nova_compute[227762]: 2026-01-23 10:06:42.212 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:42 np0005593234 nova_compute[227762]: 2026-01-23 10:06:42.213 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:42 np0005593234 nova_compute[227762]: 2026-01-23 10:06:42.214 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:06:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:42.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:42.843 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:42.843 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:42.844 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:43.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:44 np0005593234 nova_compute[227762]: 2026-01-23 10:06:44.460 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:44.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:44 np0005593234 nova_compute[227762]: 2026-01-23 10:06:44.747 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:45.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:46 np0005593234 nova_compute[227762]: 2026-01-23 10:06:46.388 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:46.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:46 np0005593234 nova_compute[227762]: 2026-01-23 10:06:46.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:06:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:47.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:47 np0005593234 nova_compute[227762]: 2026-01-23 10:06:47.976 227766 DEBUG nova.network.neutron [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Updating instance_info_cache with network_info: [{"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.025 227766 DEBUG oslo_concurrency.lockutils [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Releasing lock "refresh_cache-06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.032 227766 DEBUG nova.virt.libvirt.vif [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:05:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1240996482',display_name='tempest-ServerActionsTestJSON-server-1240996482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1240996482',id=114,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:05:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-wcdb1ybb',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:06:38Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=06ab5530-6f75-4f7d-80cd-48cf4c63cfd9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.034 227766 DEBUG nova.network.os_vif_util [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.035 227766 DEBUG nova.network.os_vif_util [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.035 227766 DEBUG os_vif [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.036 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.036 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.037 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.040 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.041 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5b468015-6b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.041 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5b468015-6b, col_values=(('external_ids', {'iface-id': '5b468015-6b03-496b-acb0-201ef16d849d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:87:17:9b', 'vm-uuid': '06ab5530-6f75-4f7d-80cd-48cf4c63cfd9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.042 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.043 227766 INFO os_vif [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b')#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.067 227766 DEBUG nova.objects.instance [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'numa_topology' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:48 np0005593234 kernel: tap5b468015-6b: entered promiscuous mode
Jan 23 05:06:48 np0005593234 NetworkManager[48942]: <info>  [1769162808.1432] manager: (tap5b468015-6b): new Tun device (/org/freedesktop/NetworkManager/Devices/241)
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.145 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:48Z|00466|binding|INFO|Claiming lport 5b468015-6b03-496b-acb0-201ef16d849d for this chassis.
Jan 23 05:06:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:48Z|00467|binding|INFO|5b468015-6b03-496b-acb0-201ef16d849d: Claiming fa:16:3e:87:17:9b 10.100.0.6
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.162 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:17:9b 10.100.0.6'], port_security=['fa:16:3e:87:17:9b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '06ab5530-6f75-4f7d-80cd-48cf4c63cfd9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '7', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=5b468015-6b03-496b-acb0-201ef16d849d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.163 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 5b468015-6b03-496b-acb0-201ef16d849d in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c bound to our chassis#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.164 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ee03d7c9-e107-41bf-95cc-5508578ad66c#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.168 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:48Z|00468|binding|INFO|Setting lport 5b468015-6b03-496b-acb0-201ef16d849d ovn-installed in OVS
Jan 23 05:06:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:48Z|00469|binding|INFO|Setting lport 5b468015-6b03-496b-acb0-201ef16d849d up in Southbound
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.171 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.175 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1f926e9e-fa62-4424-be1f-efc9fc01e348]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.176 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapee03d7c9-e1 in ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.178 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapee03d7c9-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.178 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[30d95ce2-f992-40f4-8487-408fdb2d785a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.180 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1431833a-afd4-421b-a6bc-8e07fd3de3bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 systemd-machined[195626]: New machine qemu-54-instance-00000072.
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.198 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[d9ec332d-7afb-4213-8657-75e7e315c35c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 systemd[1]: Started Virtual Machine qemu-54-instance-00000072.
Jan 23 05:06:48 np0005593234 systemd-udevd[281815]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.222 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5825ddda-d9d0-450b-be6a-3f40192ac579]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 NetworkManager[48942]: <info>  [1769162808.2306] device (tap5b468015-6b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:06:48 np0005593234 NetworkManager[48942]: <info>  [1769162808.2316] device (tap5b468015-6b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.261 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[41c11011-6dfb-4477-893c-d99db083dcee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 NetworkManager[48942]: <info>  [1769162808.2694] manager: (tapee03d7c9-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/242)
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.268 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f95e8340-253d-4634-94d4-a60201bbd493]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 systemd-udevd[281817]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.318 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[c778f152-a28a-4b5b-b51c-2aee7232b3e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.321 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a7e7efbd-591b-41f7-a2d2-6cf530789205]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 NetworkManager[48942]: <info>  [1769162808.3445] device (tapee03d7c9-e0): carrier: link connected
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.354 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4dd4c647-f8f6-40ab-ab8b-a511ee1ffd55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.374 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[746a7b01-1f33-40c7-bf7b-30b0f0d2943f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 670330, 'reachable_time': 28810, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 281845, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.397 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0b6eb6fb-f15e-48b5-8084-48ddf44f6efc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fecd:6530'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 670330, 'tstamp': 670330}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 281846, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.417 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d1f30b1f-4aaa-4226-b4fc-1700ed8b9d17]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapee03d7c9-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:cd:65:30'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 151], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 670330, 'reachable_time': 28810, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 281847, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.464 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c0b168f6-8e9f-45d5-a885-03891cf2e401]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.530 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f93586d9-bc44-4903-9e9b-dd3b45ca29af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.533 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.533 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.534 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapee03d7c9-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:48 np0005593234 kernel: tapee03d7c9-e0: entered promiscuous mode
Jan 23 05:06:48 np0005593234 NetworkManager[48942]: <info>  [1769162808.5414] manager: (tapee03d7c9-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/243)
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.542 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.550 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapee03d7c9-e0, col_values=(('external_ids', {'iface-id': '702d4523-a665-42f5-9a36-57d187c0698a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:48Z|00470|binding|INFO|Releasing lport 702d4523-a665-42f5-9a36-57d187c0698a from this chassis (sb_readonly=0)
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.552 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.555 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.556 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f029ed59-549e-4a58-b61e-c7fdf64d42f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.557 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/ee03d7c9-e107-41bf-95cc-5508578ad66c.pid.haproxy
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID ee03d7c9-e107-41bf-95cc-5508578ad66c
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:06:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:48.557 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'env', 'PROCESS_TAG=haproxy-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ee03d7c9-e107-41bf-95cc-5508578ad66c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.565 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:48.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:48Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:15:67:98 10.100.0.5
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.888 227766 DEBUG nova.compute.manager [req-0e16354e-26a3-4df0-b1e3-27669fbacfc0 req-dbe89c1f-bf87-4291-a2f3-b449e947c258 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.888 227766 DEBUG oslo_concurrency.lockutils [req-0e16354e-26a3-4df0-b1e3-27669fbacfc0 req-dbe89c1f-bf87-4291-a2f3-b449e947c258 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.889 227766 DEBUG oslo_concurrency.lockutils [req-0e16354e-26a3-4df0-b1e3-27669fbacfc0 req-dbe89c1f-bf87-4291-a2f3-b449e947c258 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.889 227766 DEBUG oslo_concurrency.lockutils [req-0e16354e-26a3-4df0-b1e3-27669fbacfc0 req-dbe89c1f-bf87-4291-a2f3-b449e947c258 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.889 227766 DEBUG nova.compute.manager [req-0e16354e-26a3-4df0-b1e3-27669fbacfc0 req-dbe89c1f-bf87-4291-a2f3-b449e947c258 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] No waiting events found dispatching network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:48 np0005593234 nova_compute[227762]: 2026-01-23 10:06:48.889 227766 WARNING nova.compute.manager [req-0e16354e-26a3-4df0-b1e3-27669fbacfc0 req-dbe89c1f-bf87-4291-a2f3-b449e947c258 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received unexpected event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d for instance with vm_state suspended and task_state resuming.#033[00m
Jan 23 05:06:48 np0005593234 podman[281894]: 2026-01-23 10:06:48.918469464 +0000 UTC m=+0.051333477 container create bf2393d8e51d4228a7400d90c31213c843b3f6b3f6c17fdb6760214441425002 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:06:48 np0005593234 systemd[1]: Started libpod-conmon-bf2393d8e51d4228a7400d90c31213c843b3f6b3f6c17fdb6760214441425002.scope.
Jan 23 05:06:48 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:06:48 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bb65fa7f442ca1f6aa7a804bc6e0caab07498a174eb6f2d6695f77f56688557b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:06:48 np0005593234 podman[281894]: 2026-01-23 10:06:48.980843344 +0000 UTC m=+0.113707387 container init bf2393d8e51d4228a7400d90c31213c843b3f6b3f6c17fdb6760214441425002 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 05:06:48 np0005593234 podman[281894]: 2026-01-23 10:06:48.892429669 +0000 UTC m=+0.025293682 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:06:48 np0005593234 podman[281894]: 2026-01-23 10:06:48.986409418 +0000 UTC m=+0.119273431 container start bf2393d8e51d4228a7400d90c31213c843b3f6b3f6c17fdb6760214441425002 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:06:49 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[281934]: [NOTICE]   (281939) : New worker (281941) forked
Jan 23 05:06:49 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[281934]: [NOTICE]   (281939) : Loading success.
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.263 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.263 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162809.2630217, 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.264 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] VM Started (Lifecycle Event)#033[00m
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.291 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.309 227766 DEBUG nova.compute.manager [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.310 227766 DEBUG nova.objects.instance [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'pci_devices' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.313 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.346 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.346 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162809.2659423, 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.347 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.349 227766 INFO nova.virt.libvirt.driver [-] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Instance running successfully.#033[00m
Jan 23 05:06:49 np0005593234 virtqemud[227483]: argument unsupported: QEMU guest agent is not configured
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.353 227766 DEBUG nova.virt.libvirt.guest [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.354 227766 DEBUG nova.compute.manager [None req-1a0b22e6-eaa9-4d80-ada2-1637e176fd0e 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:06:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:49.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.372 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.376 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.461 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:49 np0005593234 nova_compute[227762]: 2026-01-23 10:06:49.586 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 23 05:06:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:50.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:50 np0005593234 nova_compute[227762]: 2026-01-23 10:06:50.652 227766 INFO nova.compute.manager [None req-c624c097-60e1-4100-9229-7dea450a1428 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Get console output#033[00m
Jan 23 05:06:50 np0005593234 nova_compute[227762]: 2026-01-23 10:06:50.744 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:06:50 np0005593234 nova_compute[227762]: 2026-01-23 10:06:50.808 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:51 np0005593234 nova_compute[227762]: 2026-01-23 10:06:51.095 227766 DEBUG nova.compute.manager [req-476968f1-2184-43c3-9729-a0528c6d8989 req-f3a7faa1-0125-4bcc-acb8-ad4997ff0a18 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:51 np0005593234 nova_compute[227762]: 2026-01-23 10:06:51.095 227766 DEBUG oslo_concurrency.lockutils [req-476968f1-2184-43c3-9729-a0528c6d8989 req-f3a7faa1-0125-4bcc-acb8-ad4997ff0a18 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:51 np0005593234 nova_compute[227762]: 2026-01-23 10:06:51.096 227766 DEBUG oslo_concurrency.lockutils [req-476968f1-2184-43c3-9729-a0528c6d8989 req-f3a7faa1-0125-4bcc-acb8-ad4997ff0a18 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:51 np0005593234 nova_compute[227762]: 2026-01-23 10:06:51.096 227766 DEBUG oslo_concurrency.lockutils [req-476968f1-2184-43c3-9729-a0528c6d8989 req-f3a7faa1-0125-4bcc-acb8-ad4997ff0a18 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:51 np0005593234 nova_compute[227762]: 2026-01-23 10:06:51.096 227766 DEBUG nova.compute.manager [req-476968f1-2184-43c3-9729-a0528c6d8989 req-f3a7faa1-0125-4bcc-acb8-ad4997ff0a18 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] No waiting events found dispatching network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:51 np0005593234 nova_compute[227762]: 2026-01-23 10:06:51.096 227766 WARNING nova.compute.manager [req-476968f1-2184-43c3-9729-a0528c6d8989 req-f3a7faa1-0125-4bcc-acb8-ad4997ff0a18 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received unexpected event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d for instance with vm_state active and task_state None.#033[00m
Jan 23 05:06:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:51.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:51 np0005593234 nova_compute[227762]: 2026-01-23 10:06:51.391 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:51 np0005593234 nova_compute[227762]: 2026-01-23 10:06:51.557 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 05:06:51 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #51. Immutable memtables: 8.
Jan 23 05:06:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:52.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:53.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:53 np0005593234 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000076.scope: Deactivated successfully.
Jan 23 05:06:53 np0005593234 systemd[1]: machine-qemu\x2d53\x2dinstance\x2d00000076.scope: Consumed 13.688s CPU time.
Jan 23 05:06:53 np0005593234 systemd-machined[195626]: Machine qemu-53-instance-00000076 terminated.
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.463 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.570 227766 INFO nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Instance shutdown successfully after 13 seconds.#033[00m
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.574 227766 INFO nova.virt.libvirt.driver [-] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Instance destroyed successfully.#033[00m
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.578 227766 INFO nova.virt.libvirt.driver [-] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Instance destroyed successfully.#033[00m
Jan 23 05:06:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:54.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.762 227766 DEBUG oslo_concurrency.lockutils [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.762 227766 DEBUG oslo_concurrency.lockutils [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.762 227766 DEBUG oslo_concurrency.lockutils [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.763 227766 DEBUG oslo_concurrency.lockutils [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.763 227766 DEBUG oslo_concurrency.lockutils [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.764 227766 INFO nova.compute.manager [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Terminating instance#033[00m
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.765 227766 DEBUG nova.compute.manager [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:06:54 np0005593234 kernel: tap5b468015-6b (unregistering): left promiscuous mode
Jan 23 05:06:54 np0005593234 NetworkManager[48942]: <info>  [1769162814.8166] device (tap5b468015-6b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:06:54 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:54Z|00471|binding|INFO|Releasing lport 5b468015-6b03-496b-acb0-201ef16d849d from this chassis (sb_readonly=0)
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.823 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:54 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:54Z|00472|binding|INFO|Setting lport 5b468015-6b03-496b-acb0-201ef16d849d down in Southbound
Jan 23 05:06:54 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:54Z|00473|binding|INFO|Removing iface tap5b468015-6b ovn-installed in OVS
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.825 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:54.832 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:17:9b 10.100.0.6'], port_security=['fa:16:3e:87:17:9b 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': '06ab5530-6f75-4f7d-80cd-48cf4c63cfd9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '74c5c1d0762242f29a5d26033efd9f6d', 'neutron:revision_number': '8', 'neutron:security_group_ids': '53abfec9-e9a4-4b72-b0e0-38bea0069f7b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=26eced02-0507-4a33-9943-52faf3fc8cd2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=5b468015-6b03-496b-acb0-201ef16d849d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:06:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:54.834 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 5b468015-6b03-496b-acb0-201ef16d849d in datapath ee03d7c9-e107-41bf-95cc-5508578ad66c unbound from our chassis#033[00m
Jan 23 05:06:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:54.835 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ee03d7c9-e107-41bf-95cc-5508578ad66c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:06:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:54.836 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2e749b9e-43b6-4b46-83c9-95c65c118e10]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:54.837 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c namespace which is not needed anymore#033[00m
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.839 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:54 np0005593234 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000072.scope: Deactivated successfully.
Jan 23 05:06:54 np0005593234 systemd[1]: machine-qemu\x2d54\x2dinstance\x2d00000072.scope: Consumed 1.120s CPU time.
Jan 23 05:06:54 np0005593234 systemd-machined[195626]: Machine qemu-54-instance-00000072 terminated.
Jan 23 05:06:54 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[281934]: [NOTICE]   (281939) : haproxy version is 2.8.14-c23fe91
Jan 23 05:06:54 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[281934]: [NOTICE]   (281939) : path to executable is /usr/sbin/haproxy
Jan 23 05:06:54 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[281934]: [WARNING]  (281939) : Exiting Master process...
Jan 23 05:06:54 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[281934]: [WARNING]  (281939) : Exiting Master process...
Jan 23 05:06:54 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[281934]: [ALERT]    (281939) : Current worker (281941) exited with code 143 (Terminated)
Jan 23 05:06:54 np0005593234 neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c[281934]: [WARNING]  (281939) : All workers exited. Exiting... (0)
Jan 23 05:06:54 np0005593234 systemd[1]: libpod-bf2393d8e51d4228a7400d90c31213c843b3f6b3f6c17fdb6760214441425002.scope: Deactivated successfully.
Jan 23 05:06:54 np0005593234 podman[281997]: 2026-01-23 10:06:54.971866517 +0000 UTC m=+0.046414363 container died bf2393d8e51d4228a7400d90c31213c843b3f6b3f6c17fdb6760214441425002 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.984 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:54 np0005593234 nova_compute[227762]: 2026-01-23 10:06:54.989 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:55 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bf2393d8e51d4228a7400d90c31213c843b3f6b3f6c17fdb6760214441425002-userdata-shm.mount: Deactivated successfully.
Jan 23 05:06:55 np0005593234 systemd[1]: var-lib-containers-storage-overlay-bb65fa7f442ca1f6aa7a804bc6e0caab07498a174eb6f2d6695f77f56688557b-merged.mount: Deactivated successfully.
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.007 227766 INFO nova.virt.libvirt.driver [-] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Instance destroyed successfully.#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.008 227766 DEBUG nova.objects.instance [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lazy-loading 'resources' on Instance uuid 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:55 np0005593234 podman[281997]: 2026-01-23 10:06:55.008492902 +0000 UTC m=+0.083040738 container cleanup bf2393d8e51d4228a7400d90c31213c843b3f6b3f6c17fdb6760214441425002 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:06:55 np0005593234 systemd[1]: libpod-conmon-bf2393d8e51d4228a7400d90c31213c843b3f6b3f6c17fdb6760214441425002.scope: Deactivated successfully.
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.040 227766 DEBUG nova.virt.libvirt.vif [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:05:20Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestJSON-server-1240996482',display_name='tempest-ServerActionsTestJSON-server-1240996482',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestjson-server-1240996482',id=114,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBB0AiEKt9gHrsbueqjCG64VrzhP898xYsJXOd2/6uW3CZrw7c/2vnYXFOKeIp4qvJ25g/gz5/w2irrKH3R3Pyr6HiyEmMxGMtHTZ1L/l92xM4YiKXMLNL4VsFVwX3d+71g==',key_name='tempest-keypair-1055968095',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:05:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='74c5c1d0762242f29a5d26033efd9f6d',ramdisk_id='',reservation_id='r-wcdb1ybb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestJSON-1619235720',owner_user_name='tempest-ServerActionsTestJSON-1619235720-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:06:49Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='9d4a5c201efa4992a9ef57d8abdc1675',uuid=06ab5530-6f75-4f7d-80cd-48cf4c63cfd9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.041 227766 DEBUG nova.network.os_vif_util [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converting VIF {"id": "5b468015-6b03-496b-acb0-201ef16d849d", "address": "fa:16:3e:87:17:9b", "network": {"id": "ee03d7c9-e107-41bf-95cc-5508578ad66c", "bridge": "br-int", "label": "tempest-ServerActionsTestJSON-267124880-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "74c5c1d0762242f29a5d26033efd9f6d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5b468015-6b", "ovs_interfaceid": "5b468015-6b03-496b-acb0-201ef16d849d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.041 227766 DEBUG nova.network.os_vif_util [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.042 227766 DEBUG os_vif [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.044 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.044 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5b468015-6b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.046 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.047 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.049 227766 INFO os_vif [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:87:17:9b,bridge_name='br-int',has_traffic_filtering=True,id=5b468015-6b03-496b-acb0-201ef16d849d,network=Network(ee03d7c9-e107-41bf-95cc-5508578ad66c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5b468015-6b')#033[00m
Jan 23 05:06:55 np0005593234 podman[282038]: 2026-01-23 10:06:55.071679028 +0000 UTC m=+0.039528908 container remove bf2393d8e51d4228a7400d90c31213c843b3f6b3f6c17fdb6760214441425002 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:06:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:55.076 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1368f1bf-a1a0-4ea5-ba97-27294cc20970]: (4, ('Fri Jan 23 10:06:54 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (bf2393d8e51d4228a7400d90c31213c843b3f6b3f6c17fdb6760214441425002)\nbf2393d8e51d4228a7400d90c31213c843b3f6b3f6c17fdb6760214441425002\nFri Jan 23 10:06:55 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c (bf2393d8e51d4228a7400d90c31213c843b3f6b3f6c17fdb6760214441425002)\nbf2393d8e51d4228a7400d90c31213c843b3f6b3f6c17fdb6760214441425002\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:55.078 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f801c834-df71-4700-a12f-298b49acd523]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:55.079 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapee03d7c9-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.132 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:55 np0005593234 kernel: tapee03d7c9-e0: left promiscuous mode
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.145 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:55.148 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a3d68ce6-424b-4fd9-a7f7-3b703e7ef65a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:55.161 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8b8f1daf-a388-47ec-ac95-1a3bce38991e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:55.161 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2205f5b3-b619-4bd9-be04-2211894c77b6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:55.174 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a8c331df-f778-4b43-9ac5-abd37ea26c95]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 670321, 'reachable_time': 40131, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282071, 'error': None, 'target': 'ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:55.177 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ee03d7c9-e107-41bf-95cc-5508578ad66c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:06:55 np0005593234 systemd[1]: run-netns-ovnmeta\x2dee03d7c9\x2de107\x2d41bf\x2d95cc\x2d5508578ad66c.mount: Deactivated successfully.
Jan 23 05:06:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:55.177 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[226b15cc-d3bf-4440-b50a-a6f1e6706e8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:55.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.633 227766 DEBUG oslo_concurrency.lockutils [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "483afeac-561b-48ff-89d6-d02d1b615fc9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.634 227766 DEBUG oslo_concurrency.lockutils [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.634 227766 DEBUG oslo_concurrency.lockutils [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.634 227766 DEBUG oslo_concurrency.lockutils [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.634 227766 DEBUG oslo_concurrency.lockutils [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.635 227766 INFO nova.compute.manager [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Terminating instance#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.636 227766 DEBUG nova.compute.manager [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:06:55 np0005593234 kernel: tapf35157ad-0f (unregistering): left promiscuous mode
Jan 23 05:06:55 np0005593234 NetworkManager[48942]: <info>  [1769162815.6854] device (tapf35157ad-0f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.694 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:55Z|00474|binding|INFO|Releasing lport f35157ad-0f62-41af-962e-a3afcd66400e from this chassis (sb_readonly=0)
Jan 23 05:06:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:55Z|00475|binding|INFO|Setting lport f35157ad-0f62-41af-962e-a3afcd66400e down in Southbound
Jan 23 05:06:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:06:55Z|00476|binding|INFO|Removing iface tapf35157ad-0f ovn-installed in OVS
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.696 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.713 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:55 np0005593234 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000074.scope: Deactivated successfully.
Jan 23 05:06:55 np0005593234 systemd[1]: machine-qemu\x2d52\x2dinstance\x2d00000074.scope: Consumed 13.849s CPU time.
Jan 23 05:06:55 np0005593234 systemd-machined[195626]: Machine qemu-52-instance-00000074 terminated.
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.765 227766 INFO nova.virt.libvirt.driver [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Deleting instance files /var/lib/nova/instances/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_del#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.766 227766 INFO nova.virt.libvirt.driver [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Deletion of /var/lib/nova/instances/06ab5530-6f75-4f7d-80cd-48cf4c63cfd9_del complete#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.839 227766 INFO nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Deleting instance files /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c_del#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.841 227766 INFO nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Deletion of /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c_del complete#033[00m
Jan 23 05:06:55 np0005593234 NetworkManager[48942]: <info>  [1769162815.8531] manager: (tapf35157ad-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/244)
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.875 227766 INFO nova.virt.libvirt.driver [-] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Instance destroyed successfully.#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.876 227766 DEBUG nova.objects.instance [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lazy-loading 'resources' on Instance uuid 483afeac-561b-48ff-89d6-d02d1b615fc9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:06:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:55.923 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:15:67:98 10.100.0.5'], port_security=['fa:16:3e:15:67:98 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '483afeac-561b-48ff-89d6-d02d1b615fc9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8c16cd713fa74a88b43e4edf01c273bd', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'b9910180-8b38-41b2-8cb3-4e4af7eb2c2b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.238', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6c3aed5f-30b8-4c57-808e-87764ab67fc8, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=f35157ad-0f62-41af-962e-a3afcd66400e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:06:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:55.925 144381 INFO neutron.agent.ovn.metadata.agent [-] Port f35157ad-0f62-41af-962e-a3afcd66400e in datapath 8575e824-4be0-4206-873e-2f9a3d1ded0b unbound from our chassis#033[00m
Jan 23 05:06:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:55.926 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8575e824-4be0-4206-873e-2f9a3d1ded0b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:06:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:55.926 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9be13324-2ecb-440a-8ea0-45e210337e69]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:55.927 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b namespace which is not needed anymore#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.979 227766 DEBUG nova.virt.libvirt.vif [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:05:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherA-server-1175667419',display_name='tempest-ServerActionsTestOtherA-server-1175667419',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestothera-server-1175667419',id=116,image_ref='',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPDPMbCnqcp11s7OR05vsDdiZlZSU5ZbBJSLaqQpawTODCANj+91AmOb6Hdh0FgzlQPvmSu+VYXOLfZik0SA3L4m61/nruOol9dJ9Mz34f8cV2NJKksVR2Ar2t+W5r4M6w==',key_name='tempest-keypair-2078677939',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:06:35Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8c16cd713fa74a88b43e4edf01c273bd',ramdisk_id='',reservation_id='r-z67ffbwd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-ServerActionsTestOtherA-882763067',owner_user_name='tempest-ServerActionsTestOtherA-882763067-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:06:46Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='29710db389c842df836944048225740f',uuid=483afeac-561b-48ff-89d6-d02d1b615fc9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f35157ad-0f62-41af-962e-a3afcd66400e", "address": "fa:16:3e:15:67:98", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", 
"version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf35157ad-0f", "ovs_interfaceid": "f35157ad-0f62-41af-962e-a3afcd66400e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.980 227766 DEBUG nova.network.os_vif_util [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converting VIF {"id": "f35157ad-0f62-41af-962e-a3afcd66400e", "address": "fa:16:3e:15:67:98", "network": {"id": "8575e824-4be0-4206-873e-2f9a3d1ded0b", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherA-2130726771-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8c16cd713fa74a88b43e4edf01c273bd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf35157ad-0f", "ovs_interfaceid": "f35157ad-0f62-41af-962e-a3afcd66400e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.980 227766 DEBUG nova.network.os_vif_util [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:15:67:98,bridge_name='br-int',has_traffic_filtering=True,id=f35157ad-0f62-41af-962e-a3afcd66400e,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf35157ad-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.981 227766 DEBUG os_vif [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:67:98,bridge_name='br-int',has_traffic_filtering=True,id=f35157ad-0f62-41af-962e-a3afcd66400e,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf35157ad-0f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.982 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.982 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf35157ad-0f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.983 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.986 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:06:55 np0005593234 nova_compute[227762]: 2026-01-23 10:06:55.989 227766 INFO os_vif [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:15:67:98,bridge_name='br-int',has_traffic_filtering=True,id=f35157ad-0f62-41af-962e-a3afcd66400e,network=Network(8575e824-4be0-4206-873e-2f9a3d1ded0b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf35157ad-0f')#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.035 227766 INFO nova.compute.manager [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Took 1.27 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.036 227766 DEBUG oslo.service.loopingcall [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.036 227766 DEBUG nova.compute.manager [-] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.036 227766 DEBUG nova.network.neutron [-] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:06:56 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[281580]: [NOTICE]   (281584) : haproxy version is 2.8.14-c23fe91
Jan 23 05:06:56 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[281580]: [NOTICE]   (281584) : path to executable is /usr/sbin/haproxy
Jan 23 05:06:56 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[281580]: [WARNING]  (281584) : Exiting Master process...
Jan 23 05:06:56 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[281580]: [WARNING]  (281584) : Exiting Master process...
Jan 23 05:06:56 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[281580]: [ALERT]    (281584) : Current worker (281586) exited with code 143 (Terminated)
Jan 23 05:06:56 np0005593234 neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b[281580]: [WARNING]  (281584) : All workers exited. Exiting... (0)
Jan 23 05:06:56 np0005593234 systemd[1]: libpod-61874f92d7968fa8d12d9ea2999238e2adde3041261f94443f25605bff40b0b4.scope: Deactivated successfully.
Jan 23 05:06:56 np0005593234 podman[282105]: 2026-01-23 10:06:56.0544648 +0000 UTC m=+0.049333684 container died 61874f92d7968fa8d12d9ea2999238e2adde3041261f94443f25605bff40b0b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:06:56 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-61874f92d7968fa8d12d9ea2999238e2adde3041261f94443f25605bff40b0b4-userdata-shm.mount: Deactivated successfully.
Jan 23 05:06:56 np0005593234 systemd[1]: var-lib-containers-storage-overlay-15a59a5920ef80587e29a2cfb6e655639ad1178d9e60c592b36adc78f6a0a673-merged.mount: Deactivated successfully.
Jan 23 05:06:56 np0005593234 podman[282105]: 2026-01-23 10:06:56.087504232 +0000 UTC m=+0.082373116 container cleanup 61874f92d7968fa8d12d9ea2999238e2adde3041261f94443f25605bff40b0b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:06:56 np0005593234 systemd[1]: libpod-conmon-61874f92d7968fa8d12d9ea2999238e2adde3041261f94443f25605bff40b0b4.scope: Deactivated successfully.
Jan 23 05:06:56 np0005593234 podman[282152]: 2026-01-23 10:06:56.14434963 +0000 UTC m=+0.038192555 container remove 61874f92d7968fa8d12d9ea2999238e2adde3041261f94443f25605bff40b0b4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 05:06:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:56.150 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d9f4e7d4-516b-40d5-8d47-d0f823130e64]: (4, ('Fri Jan 23 10:06:55 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b (61874f92d7968fa8d12d9ea2999238e2adde3041261f94443f25605bff40b0b4)\n61874f92d7968fa8d12d9ea2999238e2adde3041261f94443f25605bff40b0b4\nFri Jan 23 10:06:56 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b (61874f92d7968fa8d12d9ea2999238e2adde3041261f94443f25605bff40b0b4)\n61874f92d7968fa8d12d9ea2999238e2adde3041261f94443f25605bff40b0b4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:56.151 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[da0a849c-c29d-42e9-929b-08ccec44898d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:56.152 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8575e824-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.154 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:56 np0005593234 kernel: tap8575e824-40: left promiscuous mode
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.157 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:56.161 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3d477c89-c10d-4308-a426-f5bf0828b8d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.216 227766 INFO nova.virt.libvirt.driver [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Deleting instance files /var/lib/nova/instances/483afeac-561b-48ff-89d6-d02d1b615fc9_del#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.216 227766 INFO nova.virt.libvirt.driver [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Deletion of /var/lib/nova/instances/483afeac-561b-48ff-89d6-d02d1b615fc9_del complete#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.221 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:56.225 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[135bf5b8-b437-4de3-9ede-ddf39a622543]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:56.226 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a3a7fb39-d4c7-48ea-b3a1-922a69834734]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:56.240 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d7f540f5-f4e9-4d29-9c4a-365176eae7c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 668962, 'reachable_time': 17102, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 282168, 'error': None, 'target': 'ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:56.242 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8575e824-4be0-4206-873e-2f9a3d1ded0b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:06:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:06:56.242 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[8f4f0c84-85c2-461d-aa27-8f9dbb5953a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:06:56 np0005593234 systemd[1]: run-netns-ovnmeta\x2d8575e824\x2d4be0\x2d4206\x2d873e\x2d2f9a3d1ded0b.mount: Deactivated successfully.
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.377 227766 INFO nova.compute.manager [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Took 0.74 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.378 227766 DEBUG oslo.service.loopingcall [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.379 227766 DEBUG nova.compute.manager [-] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.379 227766 DEBUG nova.network.neutron [-] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.395 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:56.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.622 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.623 227766 INFO nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Creating image(s)#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.656 227766 DEBUG nova.storage.rbd_utils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.683 227766 DEBUG nova.storage.rbd_utils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.709 227766 DEBUG nova.storage.rbd_utils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.713 227766 DEBUG oslo_concurrency.processutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.780 227766 DEBUG oslo_concurrency.processutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.781 227766 DEBUG oslo_concurrency.lockutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.781 227766 DEBUG oslo_concurrency.lockutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.782 227766 DEBUG oslo_concurrency.lockutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.805 227766 DEBUG nova.storage.rbd_utils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.809 227766 DEBUG oslo_concurrency.processutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 735c1181-0c35-4816-96b4-886b145e0c0c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.899 227766 DEBUG nova.compute.manager [req-2808631f-f8df-495a-ae11-756ebd33e8a6 req-5b8ec4d4-32db-4bed-89ae-e7382dc6085b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-vif-unplugged-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.900 227766 DEBUG oslo_concurrency.lockutils [req-2808631f-f8df-495a-ae11-756ebd33e8a6 req-5b8ec4d4-32db-4bed-89ae-e7382dc6085b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.900 227766 DEBUG oslo_concurrency.lockutils [req-2808631f-f8df-495a-ae11-756ebd33e8a6 req-5b8ec4d4-32db-4bed-89ae-e7382dc6085b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.900 227766 DEBUG oslo_concurrency.lockutils [req-2808631f-f8df-495a-ae11-756ebd33e8a6 req-5b8ec4d4-32db-4bed-89ae-e7382dc6085b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.900 227766 DEBUG nova.compute.manager [req-2808631f-f8df-495a-ae11-756ebd33e8a6 req-5b8ec4d4-32db-4bed-89ae-e7382dc6085b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] No waiting events found dispatching network-vif-unplugged-5b468015-6b03-496b-acb0-201ef16d849d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:56 np0005593234 nova_compute[227762]: 2026-01-23 10:06:56.901 227766 DEBUG nova.compute.manager [req-2808631f-f8df-495a-ae11-756ebd33e8a6 req-5b8ec4d4-32db-4bed-89ae-e7382dc6085b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-vif-unplugged-5b468015-6b03-496b-acb0-201ef16d849d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:06:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:06:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:57.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.496 227766 DEBUG nova.compute.manager [req-e9c34159-0fc6-40ce-969c-bb281634b66a req-734465b1-1d42-41af-b515-56b742d5bdd6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Received event network-vif-unplugged-f35157ad-0f62-41af-962e-a3afcd66400e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.496 227766 DEBUG oslo_concurrency.lockutils [req-e9c34159-0fc6-40ce-969c-bb281634b66a req-734465b1-1d42-41af-b515-56b742d5bdd6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.497 227766 DEBUG oslo_concurrency.lockutils [req-e9c34159-0fc6-40ce-969c-bb281634b66a req-734465b1-1d42-41af-b515-56b742d5bdd6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.497 227766 DEBUG oslo_concurrency.lockutils [req-e9c34159-0fc6-40ce-969c-bb281634b66a req-734465b1-1d42-41af-b515-56b742d5bdd6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.498 227766 DEBUG nova.compute.manager [req-e9c34159-0fc6-40ce-969c-bb281634b66a req-734465b1-1d42-41af-b515-56b742d5bdd6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] No waiting events found dispatching network-vif-unplugged-f35157ad-0f62-41af-962e-a3afcd66400e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.498 227766 DEBUG nova.compute.manager [req-e9c34159-0fc6-40ce-969c-bb281634b66a req-734465b1-1d42-41af-b515-56b742d5bdd6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Received event network-vif-unplugged-f35157ad-0f62-41af-962e-a3afcd66400e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.565 227766 DEBUG oslo_concurrency.processutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 735c1181-0c35-4816-96b4-886b145e0c0c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.757s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:57 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.646 227766 DEBUG nova.storage.rbd_utils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] resizing rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.772 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.773 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Ensure instance console log exists: /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.774 227766 DEBUG oslo_concurrency.lockutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.775 227766 DEBUG oslo_concurrency.lockutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.775 227766 DEBUG oslo_concurrency.lockutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.777 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.784 227766 WARNING nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.808 227766 DEBUG nova.virt.libvirt.host [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.808 227766 DEBUG nova.virt.libvirt.host [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.814 227766 DEBUG nova.virt.libvirt.host [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.815 227766 DEBUG nova.virt.libvirt.host [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.816 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.816 227766 DEBUG nova.virt.hardware [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.816 227766 DEBUG nova.virt.hardware [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.817 227766 DEBUG nova.virt.hardware [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.817 227766 DEBUG nova.virt.hardware [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.817 227766 DEBUG nova.virt.hardware [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.817 227766 DEBUG nova.virt.hardware [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.817 227766 DEBUG nova.virt.hardware [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.818 227766 DEBUG nova.virt.hardware [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.818 227766 DEBUG nova.virt.hardware [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.818 227766 DEBUG nova.virt.hardware [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.818 227766 DEBUG nova.virt.hardware [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:06:57 np0005593234 nova_compute[227762]: 2026-01-23 10:06:57.818 227766 DEBUG nova.objects.instance [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lazy-loading 'vcpu_model' on Instance uuid 735c1181-0c35-4816-96b4-886b145e0c0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:58 np0005593234 nova_compute[227762]: 2026-01-23 10:06:58.294 227766 DEBUG oslo_concurrency.processutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:58 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:06:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:58.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:58 np0005593234 nova_compute[227762]: 2026-01-23 10:06:58.675 227766 DEBUG nova.network.neutron [-] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:06:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:06:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3582035784' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:06:58 np0005593234 nova_compute[227762]: 2026-01-23 10:06:58.704 227766 INFO nova.compute.manager [-] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Took 2.67 seconds to deallocate network for instance.#033[00m
Jan 23 05:06:58 np0005593234 nova_compute[227762]: 2026-01-23 10:06:58.720 227766 DEBUG oslo_concurrency.processutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:58 np0005593234 nova_compute[227762]: 2026-01-23 10:06:58.747 227766 DEBUG nova.storage.rbd_utils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:58 np0005593234 nova_compute[227762]: 2026-01-23 10:06:58.751 227766 DEBUG oslo_concurrency.processutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:58 np0005593234 nova_compute[227762]: 2026-01-23 10:06:58.841 227766 DEBUG oslo_concurrency.lockutils [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:58 np0005593234 nova_compute[227762]: 2026-01-23 10:06:58.841 227766 DEBUG oslo_concurrency.lockutils [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.030 227766 DEBUG oslo_concurrency.processutils [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.050 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.142 227766 DEBUG nova.compute.manager [req-e7add97f-c1e0-4d6f-9d80-3de0f173f545 req-79b8d7e0-37d7-4bff-83f8-be437ab015b7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.142 227766 DEBUG oslo_concurrency.lockutils [req-e7add97f-c1e0-4d6f-9d80-3de0f173f545 req-79b8d7e0-37d7-4bff-83f8-be437ab015b7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.143 227766 DEBUG oslo_concurrency.lockutils [req-e7add97f-c1e0-4d6f-9d80-3de0f173f545 req-79b8d7e0-37d7-4bff-83f8-be437ab015b7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.143 227766 DEBUG oslo_concurrency.lockutils [req-e7add97f-c1e0-4d6f-9d80-3de0f173f545 req-79b8d7e0-37d7-4bff-83f8-be437ab015b7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.143 227766 DEBUG nova.compute.manager [req-e7add97f-c1e0-4d6f-9d80-3de0f173f545 req-79b8d7e0-37d7-4bff-83f8-be437ab015b7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] No waiting events found dispatching network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.143 227766 WARNING nova.compute.manager [req-e7add97f-c1e0-4d6f-9d80-3de0f173f545 req-79b8d7e0-37d7-4bff-83f8-be437ab015b7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received unexpected event network-vif-plugged-5b468015-6b03-496b-acb0-201ef16d849d for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.144 227766 DEBUG nova.compute.manager [req-e7add97f-c1e0-4d6f-9d80-3de0f173f545 req-79b8d7e0-37d7-4bff-83f8-be437ab015b7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Received event network-vif-deleted-5b468015-6b03-496b-acb0-201ef16d849d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:06:59 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3719943878' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.193 227766 DEBUG oslo_concurrency.processutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.195 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  <uuid>735c1181-0c35-4816-96b4-886b145e0c0c</uuid>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  <name>instance-00000076</name>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerShowV247Test-server-964731518</nova:name>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:06:57</nova:creationTime>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <nova:user uuid="e0fe7d252cd04174840bdf8dfefa3510">tempest-ServerShowV247Test-1613897366-project-member</nova:user>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <nova:project uuid="3ecb2c0cafc441fd9457198fe09cc97b">tempest-ServerShowV247Test-1613897366</nova:project>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="ae1f9e37-418c-462f-81d1-3599a6d89de9"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <entry name="serial">735c1181-0c35-4816-96b4-886b145e0c0c</entry>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <entry name="uuid">735c1181-0c35-4816-96b4-886b145e0c0c</entry>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/735c1181-0c35-4816-96b4-886b145e0c0c_disk">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/735c1181-0c35-4816-96b4-886b145e0c0c_disk.config">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/console.log" append="off"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:06:59 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:06:59 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:06:59 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:06:59 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.260 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.261 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.262 227766 INFO nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Using config drive#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.289 227766 DEBUG nova.storage.rbd_utils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.334 227766 DEBUG nova.objects.instance [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lazy-loading 'ec2_ids' on Instance uuid 735c1181-0c35-4816-96b4-886b145e0c0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.365 227766 DEBUG nova.objects.instance [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lazy-loading 'keypairs' on Instance uuid 735c1181-0c35-4816-96b4-886b145e0c0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:06:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:06:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:06:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:59.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:06:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:06:59 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2962554768' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.500 227766 DEBUG oslo_concurrency.processutils [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.505 227766 DEBUG nova.compute.provider_tree [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.527 227766 DEBUG nova.scheduler.client.report [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.580 227766 DEBUG oslo_concurrency.lockutils [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.738s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.663 227766 DEBUG nova.network.neutron [-] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.673 227766 INFO nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Creating config drive at /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/disk.config#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.678 227766 DEBUG oslo_concurrency.processutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkpn0bzx7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.701 227766 INFO nova.compute.manager [-] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Took 3.32 seconds to deallocate network for instance.#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.703 227766 DEBUG nova.compute.manager [req-83362b31-7751-47f3-9c0f-3a29ae8a12f5 req-bced69b2-0f87-47c0-a7b4-7b25f864ad55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Received event network-vif-plugged-f35157ad-0f62-41af-962e-a3afcd66400e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.704 227766 DEBUG oslo_concurrency.lockutils [req-83362b31-7751-47f3-9c0f-3a29ae8a12f5 req-bced69b2-0f87-47c0-a7b4-7b25f864ad55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.704 227766 DEBUG oslo_concurrency.lockutils [req-83362b31-7751-47f3-9c0f-3a29ae8a12f5 req-bced69b2-0f87-47c0-a7b4-7b25f864ad55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.704 227766 DEBUG oslo_concurrency.lockutils [req-83362b31-7751-47f3-9c0f-3a29ae8a12f5 req-bced69b2-0f87-47c0-a7b4-7b25f864ad55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.704 227766 DEBUG nova.compute.manager [req-83362b31-7751-47f3-9c0f-3a29ae8a12f5 req-bced69b2-0f87-47c0-a7b4-7b25f864ad55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] No waiting events found dispatching network-vif-plugged-f35157ad-0f62-41af-962e-a3afcd66400e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.705 227766 WARNING nova.compute.manager [req-83362b31-7751-47f3-9c0f-3a29ae8a12f5 req-bced69b2-0f87-47c0-a7b4-7b25f864ad55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Received unexpected event network-vif-plugged-f35157ad-0f62-41af-962e-a3afcd66400e for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.713 227766 INFO nova.scheduler.client.report [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Deleted allocations for instance 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.807 227766 DEBUG oslo_concurrency.processutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpkpn0bzx7" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.832 227766 DEBUG nova.storage.rbd_utils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] rbd image 735c1181-0c35-4816-96b4-886b145e0c0c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.835 227766 DEBUG oslo_concurrency.processutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/disk.config 735c1181-0c35-4816-96b4-886b145e0c0c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.875 227766 DEBUG oslo_concurrency.lockutils [None req-ad0842d1-e4cf-4933-97aa-9c210e5b2d77 9d4a5c201efa4992a9ef57d8abdc1675 74c5c1d0762242f29a5d26033efd9f6d - - default default] Lock "06ab5530-6f75-4f7d-80cd-48cf4c63cfd9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.113s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:06:59 np0005593234 nova_compute[227762]: 2026-01-23 10:06:59.927 227766 DEBUG nova.compute.manager [req-9102b8e0-3972-4916-bee5-fab34b0d0c6d req-eab9ac4b-4f67-4921-b0be-0da07dd9997c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Received event network-vif-deleted-f35157ad-0f62-41af-962e-a3afcd66400e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.003 227766 DEBUG oslo_concurrency.processutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/disk.config 735c1181-0c35-4816-96b4-886b145e0c0c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.168s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.004 227766 INFO nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Deleting local config drive /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c/disk.config because it was imported into RBD.#033[00m
Jan 23 05:07:00 np0005593234 systemd-machined[195626]: New machine qemu-55-instance-00000076.
Jan 23 05:07:00 np0005593234 systemd[1]: Started Virtual Machine qemu-55-instance-00000076.
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.305 227766 INFO nova.compute.manager [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Took 0.60 seconds to detach 1 volumes for instance.#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.307 227766 DEBUG nova.compute.manager [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Deleting volume: 1337690c-8061-4b7e-bb70-8cbfeecc77ac _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.495 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for 735c1181-0c35-4816-96b4-886b145e0c0c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.496 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162820.49537, 735c1181-0c35-4816-96b4-886b145e0c0c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.496 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.498 227766 DEBUG nova.compute.manager [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.498 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.501 227766 INFO nova.virt.libvirt.driver [-] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Instance spawned successfully.#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.501 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.540 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.541 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.541 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.541 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.542 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.542 227766 DEBUG nova.virt.libvirt.driver [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.546 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.549 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.598 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.598 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162820.497788, 735c1181-0c35-4816-96b4-886b145e0c0c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.599 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] VM Started (Lifecycle Event)#033[00m
Jan 23 05:07:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:00.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.629 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.633 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.656 227766 DEBUG nova.compute.manager [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.665 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.760 227766 DEBUG oslo_concurrency.lockutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.760 227766 DEBUG oslo_concurrency.lockutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.761 227766 DEBUG nova.objects.instance [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.869 227766 DEBUG oslo_concurrency.lockutils [None req-a9dda455-7492-46fb-b53f-b4713488e31d e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.930 227766 DEBUG oslo_concurrency.lockutils [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.931 227766 DEBUG oslo_concurrency.lockutils [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:00 np0005593234 nova_compute[227762]: 2026-01-23 10:07:00.985 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:01 np0005593234 nova_compute[227762]: 2026-01-23 10:07:01.084 227766 DEBUG oslo_concurrency.processutils [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:01.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:01 np0005593234 nova_compute[227762]: 2026-01-23 10:07:01.396 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:07:01 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/374796509' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:07:01 np0005593234 nova_compute[227762]: 2026-01-23 10:07:01.552 227766 DEBUG oslo_concurrency.processutils [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:01 np0005593234 nova_compute[227762]: 2026-01-23 10:07:01.557 227766 DEBUG nova.compute.provider_tree [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:07:01 np0005593234 nova_compute[227762]: 2026-01-23 10:07:01.585 227766 DEBUG nova.scheduler.client.report [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:07:01 np0005593234 nova_compute[227762]: 2026-01-23 10:07:01.632 227766 DEBUG oslo_concurrency.lockutils [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:01 np0005593234 nova_compute[227762]: 2026-01-23 10:07:01.737 227766 INFO nova.scheduler.client.report [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Deleted allocations for instance 483afeac-561b-48ff-89d6-d02d1b615fc9#033[00m
Jan 23 05:07:01 np0005593234 podman[282688]: 2026-01-23 10:07:01.759144567 +0000 UTC m=+0.056134666 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 05:07:01 np0005593234 nova_compute[227762]: 2026-01-23 10:07:01.871 227766 DEBUG oslo_concurrency.lockutils [None req-a304a3fd-a526-425f-b4e6-c5eeac806e53 29710db389c842df836944048225740f 8c16cd713fa74a88b43e4edf01c273bd - - default default] Lock "483afeac-561b-48ff-89d6-d02d1b615fc9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.238s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:02.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:03.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:03 np0005593234 nova_compute[227762]: 2026-01-23 10:07:03.902 227766 DEBUG oslo_concurrency.lockutils [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "735c1181-0c35-4816-96b4-886b145e0c0c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:03 np0005593234 nova_compute[227762]: 2026-01-23 10:07:03.903 227766 DEBUG oslo_concurrency.lockutils [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "735c1181-0c35-4816-96b4-886b145e0c0c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:03 np0005593234 nova_compute[227762]: 2026-01-23 10:07:03.903 227766 DEBUG oslo_concurrency.lockutils [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "735c1181-0c35-4816-96b4-886b145e0c0c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:03 np0005593234 nova_compute[227762]: 2026-01-23 10:07:03.903 227766 DEBUG oslo_concurrency.lockutils [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "735c1181-0c35-4816-96b4-886b145e0c0c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:03 np0005593234 nova_compute[227762]: 2026-01-23 10:07:03.904 227766 DEBUG oslo_concurrency.lockutils [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "735c1181-0c35-4816-96b4-886b145e0c0c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:03 np0005593234 nova_compute[227762]: 2026-01-23 10:07:03.905 227766 INFO nova.compute.manager [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Terminating instance#033[00m
Jan 23 05:07:03 np0005593234 nova_compute[227762]: 2026-01-23 10:07:03.906 227766 DEBUG oslo_concurrency.lockutils [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "refresh_cache-735c1181-0c35-4816-96b4-886b145e0c0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:07:03 np0005593234 nova_compute[227762]: 2026-01-23 10:07:03.906 227766 DEBUG oslo_concurrency.lockutils [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquired lock "refresh_cache-735c1181-0c35-4816-96b4-886b145e0c0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:07:03 np0005593234 nova_compute[227762]: 2026-01-23 10:07:03.906 227766 DEBUG nova.network.neutron [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:07:04 np0005593234 nova_compute[227762]: 2026-01-23 10:07:04.215 227766 DEBUG nova.network.neutron [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:07:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:04.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:04 np0005593234 nova_compute[227762]: 2026-01-23 10:07:04.776 227766 DEBUG nova.network.neutron [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:07:04 np0005593234 nova_compute[227762]: 2026-01-23 10:07:04.818 227766 DEBUG oslo_concurrency.lockutils [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Releasing lock "refresh_cache-735c1181-0c35-4816-96b4-886b145e0c0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:07:04 np0005593234 nova_compute[227762]: 2026-01-23 10:07:04.819 227766 DEBUG nova.compute.manager [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:07:04 np0005593234 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000076.scope: Deactivated successfully.
Jan 23 05:07:04 np0005593234 systemd[1]: machine-qemu\x2d55\x2dinstance\x2d00000076.scope: Consumed 4.807s CPU time.
Jan 23 05:07:04 np0005593234 systemd-machined[195626]: Machine qemu-55-instance-00000076 terminated.
Jan 23 05:07:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:07:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:07:05 np0005593234 nova_compute[227762]: 2026-01-23 10:07:05.041 227766 INFO nova.virt.libvirt.driver [-] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Instance destroyed successfully.#033[00m
Jan 23 05:07:05 np0005593234 nova_compute[227762]: 2026-01-23 10:07:05.042 227766 DEBUG nova.objects.instance [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lazy-loading 'resources' on Instance uuid 735c1181-0c35-4816-96b4-886b145e0c0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:07:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:05.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:05 np0005593234 nova_compute[227762]: 2026-01-23 10:07:05.988 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:06 np0005593234 nova_compute[227762]: 2026-01-23 10:07:06.398 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:06.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:07.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:08 np0005593234 nova_compute[227762]: 2026-01-23 10:07:08.131 227766 INFO nova.virt.libvirt.driver [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Deleting instance files /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c_del#033[00m
Jan 23 05:07:08 np0005593234 nova_compute[227762]: 2026-01-23 10:07:08.132 227766 INFO nova.virt.libvirt.driver [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Deletion of /var/lib/nova/instances/735c1181-0c35-4816-96b4-886b145e0c0c_del complete#033[00m
Jan 23 05:07:08 np0005593234 nova_compute[227762]: 2026-01-23 10:07:08.231 227766 INFO nova.compute.manager [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Took 3.41 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:07:08 np0005593234 nova_compute[227762]: 2026-01-23 10:07:08.231 227766 DEBUG oslo.service.loopingcall [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:07:08 np0005593234 nova_compute[227762]: 2026-01-23 10:07:08.232 227766 DEBUG nova.compute.manager [-] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:07:08 np0005593234 nova_compute[227762]: 2026-01-23 10:07:08.232 227766 DEBUG nova.network.neutron [-] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:07:08 np0005593234 nova_compute[227762]: 2026-01-23 10:07:08.518 227766 DEBUG nova.network.neutron [-] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:07:08 np0005593234 nova_compute[227762]: 2026-01-23 10:07:08.550 227766 DEBUG nova.network.neutron [-] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:07:08 np0005593234 nova_compute[227762]: 2026-01-23 10:07:08.574 227766 INFO nova.compute.manager [-] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Took 0.34 seconds to deallocate network for instance.#033[00m
Jan 23 05:07:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:08.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:08 np0005593234 nova_compute[227762]: 2026-01-23 10:07:08.650 227766 DEBUG oslo_concurrency.lockutils [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:08 np0005593234 nova_compute[227762]: 2026-01-23 10:07:08.651 227766 DEBUG oslo_concurrency.lockutils [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:08 np0005593234 nova_compute[227762]: 2026-01-23 10:07:08.729 227766 DEBUG oslo_concurrency.processutils [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:09 np0005593234 nova_compute[227762]: 2026-01-23 10:07:09.090 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:07:09 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/281355539' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:07:09 np0005593234 nova_compute[227762]: 2026-01-23 10:07:09.181 227766 DEBUG oslo_concurrency.processutils [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:09 np0005593234 nova_compute[227762]: 2026-01-23 10:07:09.188 227766 DEBUG nova.compute.provider_tree [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:07:09 np0005593234 nova_compute[227762]: 2026-01-23 10:07:09.214 227766 DEBUG nova.scheduler.client.report [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:07:09 np0005593234 nova_compute[227762]: 2026-01-23 10:07:09.245 227766 DEBUG oslo_concurrency.lockutils [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:09 np0005593234 nova_compute[227762]: 2026-01-23 10:07:09.267 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:09 np0005593234 nova_compute[227762]: 2026-01-23 10:07:09.299 227766 INFO nova.scheduler.client.report [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Deleted allocations for instance 735c1181-0c35-4816-96b4-886b145e0c0c#033[00m
Jan 23 05:07:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:07:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:09.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:07:09 np0005593234 nova_compute[227762]: 2026-01-23 10:07:09.498 227766 DEBUG oslo_concurrency.lockutils [None req-6d05e940-f380-4eb1-afba-0110810118bf e0fe7d252cd04174840bdf8dfefa3510 3ecb2c0cafc441fd9457198fe09cc97b - - default default] Lock "735c1181-0c35-4816-96b4-886b145e0c0c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:10 np0005593234 nova_compute[227762]: 2026-01-23 10:07:10.006 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162815.0051033, 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:07:10 np0005593234 nova_compute[227762]: 2026-01-23 10:07:10.007 227766 INFO nova.compute.manager [-] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:07:10 np0005593234 nova_compute[227762]: 2026-01-23 10:07:10.034 227766 DEBUG nova.compute.manager [None req-817dbdf1-ec24-42c6-91c8-0e6baf66ec3e - - - - - -] [instance: 06ab5530-6f75-4f7d-80cd-48cf4c63cfd9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:10.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:10 np0005593234 nova_compute[227762]: 2026-01-23 10:07:10.874 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162815.873285, 483afeac-561b-48ff-89d6-d02d1b615fc9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:07:10 np0005593234 nova_compute[227762]: 2026-01-23 10:07:10.875 227766 INFO nova.compute.manager [-] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:07:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:10 np0005593234 nova_compute[227762]: 2026-01-23 10:07:10.930 227766 DEBUG nova.compute.manager [None req-6cf74a10-40dd-4321-8836-1857657632eb - - - - - -] [instance: 483afeac-561b-48ff-89d6-d02d1b615fc9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:10 np0005593234 nova_compute[227762]: 2026-01-23 10:07:10.992 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:11.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:11 np0005593234 nova_compute[227762]: 2026-01-23 10:07:11.417 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:11 np0005593234 podman[282857]: 2026-01-23 10:07:11.787830278 +0000 UTC m=+0.082770109 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
container_name=ovn_controller, org.label-schema.schema-version=1.0)
Jan 23 05:07:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:07:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:12.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:07:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:13.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:14.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:15.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:15 np0005593234 nova_compute[227762]: 2026-01-23 10:07:15.995 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:16 np0005593234 nova_compute[227762]: 2026-01-23 10:07:16.429 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:16.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:17.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:18.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:19.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:20 np0005593234 nova_compute[227762]: 2026-01-23 10:07:20.039 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162825.0385423, 735c1181-0c35-4816-96b4-886b145e0c0c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:07:20 np0005593234 nova_compute[227762]: 2026-01-23 10:07:20.040 227766 INFO nova.compute.manager [-] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:07:20 np0005593234 nova_compute[227762]: 2026-01-23 10:07:20.072 227766 DEBUG nova.compute.manager [None req-0f20f7b3-930b-485e-9e90-523a0ba64ac0 - - - - - -] [instance: 735c1181-0c35-4816-96b4-886b145e0c0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:07:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:20.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:07:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:20 np0005593234 nova_compute[227762]: 2026-01-23 10:07:20.998 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:21.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:21 np0005593234 nova_compute[227762]: 2026-01-23 10:07:21.432 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #109. Immutable memtables: 0.
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.681907) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 67] Flushing memtable with next log file: 109
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162841681982, "job": 67, "event": "flush_started", "num_memtables": 1, "num_entries": 896, "num_deletes": 250, "total_data_size": 1664147, "memory_usage": 1685928, "flush_reason": "Manual Compaction"}
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 67] Level-0 flush table #110: started
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162841688443, "cf_name": "default", "job": 67, "event": "table_file_creation", "file_number": 110, "file_size": 715406, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 54510, "largest_seqno": 55401, "table_properties": {"data_size": 711904, "index_size": 1218, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 9673, "raw_average_key_size": 20, "raw_value_size": 704393, "raw_average_value_size": 1524, "num_data_blocks": 53, "num_entries": 462, "num_filter_entries": 462, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162781, "oldest_key_time": 1769162781, "file_creation_time": 1769162841, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 110, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 67] Flush lasted 6592 microseconds, and 2870 cpu microseconds.
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.688512) [db/flush_job.cc:967] [default] [JOB 67] Level-0 flush table #110: 715406 bytes OK
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.688532) [db/memtable_list.cc:519] [default] Level-0 commit table #110 started
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.690386) [db/memtable_list.cc:722] [default] Level-0 commit table #110: memtable #1 done
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.690399) EVENT_LOG_v1 {"time_micros": 1769162841690395, "job": 67, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.690415) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 67] Try to delete WAL files size 1659521, prev total WAL file size 1659521, number of live WAL files 2.
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000106.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.691066) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031373536' seq:72057594037927935, type:22 .. '6D6772737461740032303037' seq:0, type:0; will stop at (end)
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 68] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 67 Base level 0, inputs: [110(698KB)], [108(11MB)]
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162841691163, "job": 68, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [110], "files_L6": [108], "score": -1, "input_data_size": 13227062, "oldest_snapshot_seqno": -1}
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 68] Generated table #111: 7644 keys, 9760428 bytes, temperature: kUnknown
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162841754082, "cf_name": "default", "job": 68, "event": "table_file_creation", "file_number": 111, "file_size": 9760428, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9712479, "index_size": 27741, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19141, "raw_key_size": 198493, "raw_average_key_size": 25, "raw_value_size": 9579321, "raw_average_value_size": 1253, "num_data_blocks": 1087, "num_entries": 7644, "num_filter_entries": 7644, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769162841, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 111, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.754319) [db/compaction/compaction_job.cc:1663] [default] [JOB 68] Compacted 1@0 + 1@6 files to L6 => 9760428 bytes
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.755603) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 210.0 rd, 154.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 11.9 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(32.1) write-amplify(13.6) OK, records in: 8132, records dropped: 488 output_compression: NoCompression
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.755621) EVENT_LOG_v1 {"time_micros": 1769162841755613, "job": 68, "event": "compaction_finished", "compaction_time_micros": 62992, "compaction_time_cpu_micros": 24527, "output_level": 6, "num_output_files": 1, "total_output_size": 9760428, "num_input_records": 8132, "num_output_records": 7644, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000110.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162841755856, "job": 68, "event": "table_file_deletion", "file_number": 110}
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000108.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162841758366, "job": 68, "event": "table_file_deletion", "file_number": 108}
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.690984) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.758455) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.758460) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.758462) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.758463) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:21 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:07:21.758464) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:07:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:22.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:23.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:24.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:25.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:26 np0005593234 nova_compute[227762]: 2026-01-23 10:07:26.002 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:26 np0005593234 nova_compute[227762]: 2026-01-23 10:07:26.435 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:26.489 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=44, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=43) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:07:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:26.491 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:07:26 np0005593234 nova_compute[227762]: 2026-01-23 10:07:26.540 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:26.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:27.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:07:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:28.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:07:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:29.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:30.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:30 np0005593234 nova_compute[227762]: 2026-01-23 10:07:30.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:31 np0005593234 nova_compute[227762]: 2026-01-23 10:07:31.005 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:31.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:31 np0005593234 nova_compute[227762]: 2026-01-23 10:07:31.436 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:31 np0005593234 nova_compute[227762]: 2026-01-23 10:07:31.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:31 np0005593234 nova_compute[227762]: 2026-01-23 10:07:31.837 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:31 np0005593234 nova_compute[227762]: 2026-01-23 10:07:31.838 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:31 np0005593234 nova_compute[227762]: 2026-01-23 10:07:31.838 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:31 np0005593234 nova_compute[227762]: 2026-01-23 10:07:31.838 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:07:31 np0005593234 nova_compute[227762]: 2026-01-23 10:07:31.838 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:07:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2793488864' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:07:32 np0005593234 nova_compute[227762]: 2026-01-23 10:07:32.264 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:32 np0005593234 podman[282968]: 2026-01-23 10:07:32.370905031 +0000 UTC m=+0.067409929 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 05:07:32 np0005593234 nova_compute[227762]: 2026-01-23 10:07:32.454 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:07:32 np0005593234 nova_compute[227762]: 2026-01-23 10:07:32.456 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4507MB free_disk=20.967357635498047GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:07:32 np0005593234 nova_compute[227762]: 2026-01-23 10:07:32.456 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:32 np0005593234 nova_compute[227762]: 2026-01-23 10:07:32.456 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:32 np0005593234 nova_compute[227762]: 2026-01-23 10:07:32.561 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:07:32 np0005593234 nova_compute[227762]: 2026-01-23 10:07:32.561 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:07:32 np0005593234 nova_compute[227762]: 2026-01-23 10:07:32.596 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:32.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:07:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3251130463' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:07:33 np0005593234 nova_compute[227762]: 2026-01-23 10:07:33.057 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:33 np0005593234 nova_compute[227762]: 2026-01-23 10:07:33.063 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:07:33 np0005593234 nova_compute[227762]: 2026-01-23 10:07:33.105 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:07:33 np0005593234 nova_compute[227762]: 2026-01-23 10:07:33.132 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:07:33 np0005593234 nova_compute[227762]: 2026-01-23 10:07:33.132 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:33.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:34.493 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '44'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:34.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:35 np0005593234 nova_compute[227762]: 2026-01-23 10:07:35.132 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:35.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:36 np0005593234 nova_compute[227762]: 2026-01-23 10:07:36.007 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:36 np0005593234 nova_compute[227762]: 2026-01-23 10:07:36.438 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:36.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:36 np0005593234 nova_compute[227762]: 2026-01-23 10:07:36.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:37.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:38 np0005593234 nova_compute[227762]: 2026-01-23 10:07:38.610 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Acquiring lock "d87dd410-da49-4f11-b99a-130005041777" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:38 np0005593234 nova_compute[227762]: 2026-01-23 10:07:38.610 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:38 np0005593234 nova_compute[227762]: 2026-01-23 10:07:38.632 227766 DEBUG nova.compute.manager [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:07:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:38.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:38 np0005593234 nova_compute[227762]: 2026-01-23 10:07:38.727 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:38 np0005593234 nova_compute[227762]: 2026-01-23 10:07:38.728 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:38 np0005593234 nova_compute[227762]: 2026-01-23 10:07:38.744 227766 DEBUG nova.virt.hardware [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:07:38 np0005593234 nova_compute[227762]: 2026-01-23 10:07:38.745 227766 INFO nova.compute.claims [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:07:38 np0005593234 nova_compute[227762]: 2026-01-23 10:07:38.947 227766 DEBUG oslo_concurrency.processutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:07:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1177649285' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:07:39 np0005593234 nova_compute[227762]: 2026-01-23 10:07:39.400 227766 DEBUG oslo_concurrency.processutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:39 np0005593234 nova_compute[227762]: 2026-01-23 10:07:39.408 227766 DEBUG nova.compute.provider_tree [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:07:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:39.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:39 np0005593234 nova_compute[227762]: 2026-01-23 10:07:39.461 227766 DEBUG nova.scheduler.client.report [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:07:39 np0005593234 nova_compute[227762]: 2026-01-23 10:07:39.517 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.789s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:39 np0005593234 nova_compute[227762]: 2026-01-23 10:07:39.518 227766 DEBUG nova.compute.manager [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:07:39 np0005593234 nova_compute[227762]: 2026-01-23 10:07:39.572 227766 DEBUG nova.compute.manager [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:07:39 np0005593234 nova_compute[227762]: 2026-01-23 10:07:39.572 227766 DEBUG nova.network.neutron [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:07:39 np0005593234 nova_compute[227762]: 2026-01-23 10:07:39.603 227766 INFO nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:07:39 np0005593234 nova_compute[227762]: 2026-01-23 10:07:39.636 227766 DEBUG nova.compute.manager [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:07:39 np0005593234 nova_compute[227762]: 2026-01-23 10:07:39.805 227766 DEBUG nova.compute.manager [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:07:39 np0005593234 nova_compute[227762]: 2026-01-23 10:07:39.806 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:07:39 np0005593234 nova_compute[227762]: 2026-01-23 10:07:39.807 227766 INFO nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Creating image(s)#033[00m
Jan 23 05:07:39 np0005593234 nova_compute[227762]: 2026-01-23 10:07:39.963 227766 DEBUG nova.storage.rbd_utils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] rbd image d87dd410-da49-4f11-b99a-130005041777_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:07:39 np0005593234 nova_compute[227762]: 2026-01-23 10:07:39.992 227766 DEBUG nova.storage.rbd_utils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] rbd image d87dd410-da49-4f11-b99a-130005041777_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:07:40 np0005593234 nova_compute[227762]: 2026-01-23 10:07:40.020 227766 DEBUG nova.storage.rbd_utils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] rbd image d87dd410-da49-4f11-b99a-130005041777_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:07:40 np0005593234 nova_compute[227762]: 2026-01-23 10:07:40.023 227766 DEBUG oslo_concurrency.processutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:40 np0005593234 nova_compute[227762]: 2026-01-23 10:07:40.086 227766 DEBUG oslo_concurrency.processutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:40 np0005593234 nova_compute[227762]: 2026-01-23 10:07:40.088 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:40 np0005593234 nova_compute[227762]: 2026-01-23 10:07:40.088 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:40 np0005593234 nova_compute[227762]: 2026-01-23 10:07:40.089 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:40 np0005593234 nova_compute[227762]: 2026-01-23 10:07:40.118 227766 DEBUG nova.storage.rbd_utils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] rbd image d87dd410-da49-4f11-b99a-130005041777_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:07:40 np0005593234 nova_compute[227762]: 2026-01-23 10:07:40.123 227766 DEBUG oslo_concurrency.processutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 d87dd410-da49-4f11-b99a-130005041777_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:40.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:40 np0005593234 nova_compute[227762]: 2026-01-23 10:07:40.744 227766 DEBUG nova.policy [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0cae28d3746642598ee191045d7c2970', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '64e70435945f4186b0e2ab8143c69d80', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:07:40 np0005593234 nova_compute[227762]: 2026-01-23 10:07:40.747 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:40 np0005593234 nova_compute[227762]: 2026-01-23 10:07:40.748 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:07:40 np0005593234 nova_compute[227762]: 2026-01-23 10:07:40.869 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:07:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:41 np0005593234 nova_compute[227762]: 2026-01-23 10:07:41.011 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:41 np0005593234 nova_compute[227762]: 2026-01-23 10:07:41.236 227766 DEBUG oslo_concurrency.processutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 d87dd410-da49-4f11-b99a-130005041777_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.113s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:41 np0005593234 nova_compute[227762]: 2026-01-23 10:07:41.308 227766 DEBUG nova.storage.rbd_utils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] resizing rbd image d87dd410-da49-4f11-b99a-130005041777_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:07:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:41.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:41 np0005593234 nova_compute[227762]: 2026-01-23 10:07:41.439 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:41 np0005593234 nova_compute[227762]: 2026-01-23 10:07:41.658 227766 DEBUG nova.objects.instance [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lazy-loading 'migration_context' on Instance uuid d87dd410-da49-4f11-b99a-130005041777 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:07:41 np0005593234 nova_compute[227762]: 2026-01-23 10:07:41.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:41 np0005593234 nova_compute[227762]: 2026-01-23 10:07:41.790 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:07:41 np0005593234 nova_compute[227762]: 2026-01-23 10:07:41.791 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Ensure instance console log exists: /var/lib/nova/instances/d87dd410-da49-4f11-b99a-130005041777/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:07:41 np0005593234 nova_compute[227762]: 2026-01-23 10:07:41.791 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:41 np0005593234 nova_compute[227762]: 2026-01-23 10:07:41.792 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:41 np0005593234 nova_compute[227762]: 2026-01-23 10:07:41.792 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:41 np0005593234 nova_compute[227762]: 2026-01-23 10:07:41.818 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:41 np0005593234 nova_compute[227762]: 2026-01-23 10:07:41.821 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:41 np0005593234 nova_compute[227762]: 2026-01-23 10:07:41.822 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:07:42 np0005593234 nova_compute[227762]: 2026-01-23 10:07:42.447 227766 DEBUG nova.network.neutron [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Successfully created port: eaedac9f-972e-42ea-8823-d8239f00f66d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:07:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:42.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:42 np0005593234 podman[283201]: 2026-01-23 10:07:42.790081492 +0000 UTC m=+0.087637142 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:07:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:42.844 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:42.844 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:42.845 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:43.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:44.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:45.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:45 np0005593234 nova_compute[227762]: 2026-01-23 10:07:45.749 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:45 np0005593234 nova_compute[227762]: 2026-01-23 10:07:45.867 227766 DEBUG nova.network.neutron [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Successfully updated port: eaedac9f-972e-42ea-8823-d8239f00f66d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:07:45 np0005593234 nova_compute[227762]: 2026-01-23 10:07:45.889 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Acquiring lock "refresh_cache-d87dd410-da49-4f11-b99a-130005041777" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:07:45 np0005593234 nova_compute[227762]: 2026-01-23 10:07:45.889 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Acquired lock "refresh_cache-d87dd410-da49-4f11-b99a-130005041777" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:07:45 np0005593234 nova_compute[227762]: 2026-01-23 10:07:45.889 227766 DEBUG nova.network.neutron [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:07:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:46 np0005593234 nova_compute[227762]: 2026-01-23 10:07:46.014 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:46 np0005593234 nova_compute[227762]: 2026-01-23 10:07:46.069 227766 DEBUG nova.compute.manager [req-c087cf09-711b-48cc-a758-004b35f92e40 req-e3f53910-792e-43c5-a720-68a8f5bb9cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Received event network-changed-eaedac9f-972e-42ea-8823-d8239f00f66d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:07:46 np0005593234 nova_compute[227762]: 2026-01-23 10:07:46.069 227766 DEBUG nova.compute.manager [req-c087cf09-711b-48cc-a758-004b35f92e40 req-e3f53910-792e-43c5-a720-68a8f5bb9cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Refreshing instance network info cache due to event network-changed-eaedac9f-972e-42ea-8823-d8239f00f66d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:07:46 np0005593234 nova_compute[227762]: 2026-01-23 10:07:46.070 227766 DEBUG oslo_concurrency.lockutils [req-c087cf09-711b-48cc-a758-004b35f92e40 req-e3f53910-792e-43c5-a720-68a8f5bb9cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-d87dd410-da49-4f11-b99a-130005041777" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:07:46 np0005593234 nova_compute[227762]: 2026-01-23 10:07:46.441 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:46 np0005593234 nova_compute[227762]: 2026-01-23 10:07:46.636 227766 DEBUG nova.network.neutron [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:07:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:46.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:47.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:47 np0005593234 nova_compute[227762]: 2026-01-23 10:07:47.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.596 227766 DEBUG nova.network.neutron [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Updating instance_info_cache with network_info: [{"id": "eaedac9f-972e-42ea-8823-d8239f00f66d", "address": "fa:16:3e:4f:23:08", "network": {"id": "ce292c89-5b6f-435a-b5dd-18e7b02f207c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-894698614-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64e70435945f4186b0e2ab8143c69d80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaedac9f-97", "ovs_interfaceid": "eaedac9f-972e-42ea-8823-d8239f00f66d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.636 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Releasing lock "refresh_cache-d87dd410-da49-4f11-b99a-130005041777" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.637 227766 DEBUG nova.compute.manager [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Instance network_info: |[{"id": "eaedac9f-972e-42ea-8823-d8239f00f66d", "address": "fa:16:3e:4f:23:08", "network": {"id": "ce292c89-5b6f-435a-b5dd-18e7b02f207c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-894698614-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64e70435945f4186b0e2ab8143c69d80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaedac9f-97", "ovs_interfaceid": "eaedac9f-972e-42ea-8823-d8239f00f66d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.637 227766 DEBUG oslo_concurrency.lockutils [req-c087cf09-711b-48cc-a758-004b35f92e40 req-e3f53910-792e-43c5-a720-68a8f5bb9cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-d87dd410-da49-4f11-b99a-130005041777" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.637 227766 DEBUG nova.network.neutron [req-c087cf09-711b-48cc-a758-004b35f92e40 req-e3f53910-792e-43c5-a720-68a8f5bb9cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Refreshing network info cache for port eaedac9f-972e-42ea-8823-d8239f00f66d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.640 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Start _get_guest_xml network_info=[{"id": "eaedac9f-972e-42ea-8823-d8239f00f66d", "address": "fa:16:3e:4f:23:08", "network": {"id": "ce292c89-5b6f-435a-b5dd-18e7b02f207c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-894698614-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64e70435945f4186b0e2ab8143c69d80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaedac9f-97", "ovs_interfaceid": "eaedac9f-972e-42ea-8823-d8239f00f66d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.646 227766 WARNING nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.652 227766 DEBUG nova.virt.libvirt.host [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.652 227766 DEBUG nova.virt.libvirt.host [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.657 227766 DEBUG nova.virt.libvirt.host [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:07:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:48.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.658 227766 DEBUG nova.virt.libvirt.host [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.659 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.659 227766 DEBUG nova.virt.hardware [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.660 227766 DEBUG nova.virt.hardware [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.660 227766 DEBUG nova.virt.hardware [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.660 227766 DEBUG nova.virt.hardware [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.660 227766 DEBUG nova.virt.hardware [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.660 227766 DEBUG nova.virt.hardware [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.661 227766 DEBUG nova.virt.hardware [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.661 227766 DEBUG nova.virt.hardware [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.661 227766 DEBUG nova.virt.hardware [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.661 227766 DEBUG nova.virt.hardware [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.661 227766 DEBUG nova.virt.hardware [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:07:48 np0005593234 nova_compute[227762]: 2026-01-23 10:07:48.664 227766 DEBUG oslo_concurrency.processutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:07:49 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1780236225' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.115 227766 DEBUG oslo_concurrency.processutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.143 227766 DEBUG nova.storage.rbd_utils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] rbd image d87dd410-da49-4f11-b99a-130005041777_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.148 227766 DEBUG oslo_concurrency.processutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:49.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:07:49 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2432022098' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.596 227766 DEBUG oslo_concurrency.processutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.598 227766 DEBUG nova.virt.libvirt.vif [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:07:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1255797485',display_name='tempest-ServerAddressesNegativeTestJSON-server-1255797485',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1255797485',id=120,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='64e70435945f4186b0e2ab8143c69d80',ramdisk_id='',reservation_id='r-g2650g60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-654457736',owne
r_user_name='tempest-ServerAddressesNegativeTestJSON-654457736-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:07:39Z,user_data=None,user_id='0cae28d3746642598ee191045d7c2970',uuid=d87dd410-da49-4f11-b99a-130005041777,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eaedac9f-972e-42ea-8823-d8239f00f66d", "address": "fa:16:3e:4f:23:08", "network": {"id": "ce292c89-5b6f-435a-b5dd-18e7b02f207c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-894698614-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64e70435945f4186b0e2ab8143c69d80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaedac9f-97", "ovs_interfaceid": "eaedac9f-972e-42ea-8823-d8239f00f66d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.599 227766 DEBUG nova.network.os_vif_util [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Converting VIF {"id": "eaedac9f-972e-42ea-8823-d8239f00f66d", "address": "fa:16:3e:4f:23:08", "network": {"id": "ce292c89-5b6f-435a-b5dd-18e7b02f207c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-894698614-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64e70435945f4186b0e2ab8143c69d80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaedac9f-97", "ovs_interfaceid": "eaedac9f-972e-42ea-8823-d8239f00f66d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.600 227766 DEBUG nova.network.os_vif_util [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:23:08,bridge_name='br-int',has_traffic_filtering=True,id=eaedac9f-972e-42ea-8823-d8239f00f66d,network=Network(ce292c89-5b6f-435a-b5dd-18e7b02f207c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaedac9f-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.601 227766 DEBUG nova.objects.instance [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lazy-loading 'pci_devices' on Instance uuid d87dd410-da49-4f11-b99a-130005041777 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.621 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  <uuid>d87dd410-da49-4f11-b99a-130005041777</uuid>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  <name>instance-00000078</name>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerAddressesNegativeTestJSON-server-1255797485</nova:name>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:07:48</nova:creationTime>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <nova:user uuid="0cae28d3746642598ee191045d7c2970">tempest-ServerAddressesNegativeTestJSON-654457736-project-member</nova:user>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <nova:project uuid="64e70435945f4186b0e2ab8143c69d80">tempest-ServerAddressesNegativeTestJSON-654457736</nova:project>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <nova:port uuid="eaedac9f-972e-42ea-8823-d8239f00f66d">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <entry name="serial">d87dd410-da49-4f11-b99a-130005041777</entry>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <entry name="uuid">d87dd410-da49-4f11-b99a-130005041777</entry>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/d87dd410-da49-4f11-b99a-130005041777_disk">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/d87dd410-da49-4f11-b99a-130005041777_disk.config">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:4f:23:08"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <target dev="tapeaedac9f-97"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/d87dd410-da49-4f11-b99a-130005041777/console.log" append="off"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:07:49 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:07:49 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:07:49 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:07:49 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.622 227766 DEBUG nova.compute.manager [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Preparing to wait for external event network-vif-plugged-eaedac9f-972e-42ea-8823-d8239f00f66d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.622 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Acquiring lock "d87dd410-da49-4f11-b99a-130005041777-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.623 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.623 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.623 227766 DEBUG nova.virt.libvirt.vif [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:07:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1255797485',display_name='tempest-ServerAddressesNegativeTestJSON-server-1255797485',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1255797485',id=120,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='64e70435945f4186b0e2ab8143c69d80',ramdisk_id='',reservation_id='r-g2650g60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerAddressesNegativeTestJSON-65445
7736',owner_user_name='tempest-ServerAddressesNegativeTestJSON-654457736-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:07:39Z,user_data=None,user_id='0cae28d3746642598ee191045d7c2970',uuid=d87dd410-da49-4f11-b99a-130005041777,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "eaedac9f-972e-42ea-8823-d8239f00f66d", "address": "fa:16:3e:4f:23:08", "network": {"id": "ce292c89-5b6f-435a-b5dd-18e7b02f207c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-894698614-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64e70435945f4186b0e2ab8143c69d80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaedac9f-97", "ovs_interfaceid": "eaedac9f-972e-42ea-8823-d8239f00f66d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.624 227766 DEBUG nova.network.os_vif_util [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Converting VIF {"id": "eaedac9f-972e-42ea-8823-d8239f00f66d", "address": "fa:16:3e:4f:23:08", "network": {"id": "ce292c89-5b6f-435a-b5dd-18e7b02f207c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-894698614-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64e70435945f4186b0e2ab8143c69d80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaedac9f-97", "ovs_interfaceid": "eaedac9f-972e-42ea-8823-d8239f00f66d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.624 227766 DEBUG nova.network.os_vif_util [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:23:08,bridge_name='br-int',has_traffic_filtering=True,id=eaedac9f-972e-42ea-8823-d8239f00f66d,network=Network(ce292c89-5b6f-435a-b5dd-18e7b02f207c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaedac9f-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.625 227766 DEBUG os_vif [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:23:08,bridge_name='br-int',has_traffic_filtering=True,id=eaedac9f-972e-42ea-8823-d8239f00f66d,network=Network(ce292c89-5b6f-435a-b5dd-18e7b02f207c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaedac9f-97') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.626 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.626 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.626 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.630 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.630 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeaedac9f-97, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.631 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeaedac9f-97, col_values=(('external_ids', {'iface-id': 'eaedac9f-972e-42ea-8823-d8239f00f66d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:23:08', 'vm-uuid': 'd87dd410-da49-4f11-b99a-130005041777'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.632 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:49 np0005593234 NetworkManager[48942]: <info>  [1769162869.6332] manager: (tapeaedac9f-97): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/245)
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.635 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.638 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.639 227766 INFO os_vif [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:23:08,bridge_name='br-int',has_traffic_filtering=True,id=eaedac9f-972e-42ea-8823-d8239f00f66d,network=Network(ce292c89-5b6f-435a-b5dd-18e7b02f207c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaedac9f-97')#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.767 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.768 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.768 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] No VIF found with MAC fa:16:3e:4f:23:08, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.768 227766 INFO nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Using config drive#033[00m
Jan 23 05:07:49 np0005593234 nova_compute[227762]: 2026-01-23 10:07:49.795 227766 DEBUG nova.storage.rbd_utils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] rbd image d87dd410-da49-4f11-b99a-130005041777_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:07:50 np0005593234 nova_compute[227762]: 2026-01-23 10:07:50.486 227766 INFO nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Creating config drive at /var/lib/nova/instances/d87dd410-da49-4f11-b99a-130005041777/disk.config#033[00m
Jan 23 05:07:50 np0005593234 nova_compute[227762]: 2026-01-23 10:07:50.491 227766 DEBUG oslo_concurrency.processutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d87dd410-da49-4f11-b99a-130005041777/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3pxjmo5q execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:50 np0005593234 nova_compute[227762]: 2026-01-23 10:07:50.623 227766 DEBUG oslo_concurrency.processutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d87dd410-da49-4f11-b99a-130005041777/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3pxjmo5q" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:50 np0005593234 nova_compute[227762]: 2026-01-23 10:07:50.657 227766 DEBUG nova.storage.rbd_utils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] rbd image d87dd410-da49-4f11-b99a-130005041777_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:07:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:50.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:50 np0005593234 nova_compute[227762]: 2026-01-23 10:07:50.661 227766 DEBUG oslo_concurrency.processutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d87dd410-da49-4f11-b99a-130005041777/disk.config d87dd410-da49-4f11-b99a-130005041777_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:07:50 np0005593234 nova_compute[227762]: 2026-01-23 10:07:50.868 227766 DEBUG oslo_concurrency.processutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d87dd410-da49-4f11-b99a-130005041777/disk.config d87dd410-da49-4f11-b99a-130005041777_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.207s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:07:50 np0005593234 nova_compute[227762]: 2026-01-23 10:07:50.869 227766 INFO nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Deleting local config drive /var/lib/nova/instances/d87dd410-da49-4f11-b99a-130005041777/disk.config because it was imported into RBD.#033[00m
Jan 23 05:07:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:50 np0005593234 kernel: tapeaedac9f-97: entered promiscuous mode
Jan 23 05:07:50 np0005593234 NetworkManager[48942]: <info>  [1769162870.9248] manager: (tapeaedac9f-97): new Tun device (/org/freedesktop/NetworkManager/Devices/246)
Jan 23 05:07:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:07:50Z|00477|binding|INFO|Claiming lport eaedac9f-972e-42ea-8823-d8239f00f66d for this chassis.
Jan 23 05:07:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:07:50Z|00478|binding|INFO|eaedac9f-972e-42ea-8823-d8239f00f66d: Claiming fa:16:3e:4f:23:08 10.100.0.7
Jan 23 05:07:50 np0005593234 nova_compute[227762]: 2026-01-23 10:07:50.925 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:50 np0005593234 nova_compute[227762]: 2026-01-23 10:07:50.931 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:50.950 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:23:08 10.100.0.7'], port_security=['fa:16:3e:4f:23:08 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'd87dd410-da49-4f11-b99a-130005041777', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce292c89-5b6f-435a-b5dd-18e7b02f207c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '64e70435945f4186b0e2ab8143c69d80', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b7f099bf-0f5f-4211-a8e8-88471d592740', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9282aeaa-b9aa-49a7-9e59-fa3a634c8a40, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=eaedac9f-972e-42ea-8823-d8239f00f66d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:07:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:50.951 144381 INFO neutron.agent.ovn.metadata.agent [-] Port eaedac9f-972e-42ea-8823-d8239f00f66d in datapath ce292c89-5b6f-435a-b5dd-18e7b02f207c bound to our chassis#033[00m
Jan 23 05:07:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:50.953 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ce292c89-5b6f-435a-b5dd-18e7b02f207c#033[00m
Jan 23 05:07:50 np0005593234 systemd-udevd[283415]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:07:50 np0005593234 NetworkManager[48942]: <info>  [1769162870.9680] device (tapeaedac9f-97): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:07:50 np0005593234 NetworkManager[48942]: <info>  [1769162870.9685] device (tapeaedac9f-97): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:07:50 np0005593234 systemd-machined[195626]: New machine qemu-56-instance-00000078.
Jan 23 05:07:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:50.970 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d2919379-5674-4c68-9c79-eafd5e2ef7d8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:50.972 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapce292c89-51 in ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:07:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:50.975 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapce292c89-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:07:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:50.975 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2a7d8932-1512-42b0-b7f8-a1fc33ca998f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:50.976 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b5a12faa-f08a-49f6-b074-f46531a7a2c9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:50 np0005593234 systemd[1]: Started Virtual Machine qemu-56-instance-00000078.
Jan 23 05:07:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:50.991 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[23fe0008-2bce-469e-b5c6-d89a8d0c11dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:50 np0005593234 nova_compute[227762]: 2026-01-23 10:07:50.997 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:51 np0005593234 ovn_controller[134547]: 2026-01-23T10:07:51Z|00479|binding|INFO|Setting lport eaedac9f-972e-42ea-8823-d8239f00f66d ovn-installed in OVS
Jan 23 05:07:51 np0005593234 ovn_controller[134547]: 2026-01-23T10:07:51Z|00480|binding|INFO|Setting lport eaedac9f-972e-42ea-8823-d8239f00f66d up in Southbound
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.005 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.008 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[91bf25f1-555a-4bfc-a2fb-f5ec8cf0e199]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.040 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7429ab36-2794-4a13-ab99-c6c68b40273e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.046 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[002cf49a-3c59-405b-adeb-90986d22a7a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:51 np0005593234 NetworkManager[48942]: <info>  [1769162871.0473] manager: (tapce292c89-50): new Veth device (/org/freedesktop/NetworkManager/Devices/247)
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.078 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[08e11dff-6554-45e7-aa46-e83f07886fad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.081 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7e392cf6-c969-4dab-a657-46734a249f92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:51 np0005593234 NetworkManager[48942]: <info>  [1769162871.1023] device (tapce292c89-50): carrier: link connected
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.107 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[69d76f9f-9a0b-48ad-a6cf-ebfa363c4e1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.124 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bb1f4209-3a61-44bb-9507-35eb5c0e8aab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce292c89-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:a3:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 676606, 'reachable_time': 24336, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283451, 'error': None, 'target': 'ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.143 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e378afbc-e380-4d8e-b7fb-b5c2a7408c15]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe18:a35b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 676606, 'tstamp': 676606}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 283452, 'error': None, 'target': 'ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.160 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6d9fc47b-8816-4ce4-b078-dc636177cd22]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapce292c89-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:18:a3:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 155], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 676606, 'reachable_time': 24336, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 283453, 'error': None, 'target': 'ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.192 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fa9ac628-9edc-4974-bd07-c45b6e1e0094]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.246 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[de1aa77b-a91c-4e9d-ae23-29aa8675532d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.247 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce292c89-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.247 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.248 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapce292c89-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:51 np0005593234 NetworkManager[48942]: <info>  [1769162871.2501] manager: (tapce292c89-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/248)
Jan 23 05:07:51 np0005593234 kernel: tapce292c89-50: entered promiscuous mode
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.249 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.252 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapce292c89-50, col_values=(('external_ids', {'iface-id': '357e8a05-0ae3-4cc7-adba-e31c4640cbfd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:51 np0005593234 ovn_controller[134547]: 2026-01-23T10:07:51Z|00481|binding|INFO|Releasing lport 357e8a05-0ae3-4cc7-adba-e31c4640cbfd from this chassis (sb_readonly=0)
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.266 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.267 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ce292c89-5b6f-435a-b5dd-18e7b02f207c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ce292c89-5b6f-435a-b5dd-18e7b02f207c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.268 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a89053e9-a094-4ff7-a9b0-6d3e7742fe35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.269 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-ce292c89-5b6f-435a-b5dd-18e7b02f207c
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/ce292c89-5b6f-435a-b5dd-18e7b02f207c.pid.haproxy
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID ce292c89-5b6f-435a-b5dd-18e7b02f207c
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:07:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:51.270 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c', 'env', 'PROCESS_TAG=haproxy-ce292c89-5b6f-435a-b5dd-18e7b02f207c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ce292c89-5b6f-435a-b5dd-18e7b02f207c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.361 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162871.3604965, d87dd410-da49-4f11-b99a-130005041777 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.362 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d87dd410-da49-4f11-b99a-130005041777] VM Started (Lifecycle Event)#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.391 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d87dd410-da49-4f11-b99a-130005041777] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.402 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162871.3607984, d87dd410-da49-4f11-b99a-130005041777 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.402 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d87dd410-da49-4f11-b99a-130005041777] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:07:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:51.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.443 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.450 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d87dd410-da49-4f11-b99a-130005041777] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.453 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d87dd410-da49-4f11-b99a-130005041777] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.485 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d87dd410-da49-4f11-b99a-130005041777] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:07:51 np0005593234 podman[283527]: 2026-01-23 10:07:51.669374881 +0000 UTC m=+0.054328919 container create 5c67812720d69eff9fe240d2b3461e16d62ecee7d0c720113fed2290dd60979b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 05:07:51 np0005593234 systemd[1]: Started libpod-conmon-5c67812720d69eff9fe240d2b3461e16d62ecee7d0c720113fed2290dd60979b.scope.
Jan 23 05:07:51 np0005593234 podman[283527]: 2026-01-23 10:07:51.63830016 +0000 UTC m=+0.023254228 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:07:51 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:07:51 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/572a58eb717972b47506902fd14d35ed51b2e2ba05a219407b847f42f2f8971e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:07:51 np0005593234 podman[283527]: 2026-01-23 10:07:51.762950328 +0000 UTC m=+0.147904386 container init 5c67812720d69eff9fe240d2b3461e16d62ecee7d0c720113fed2290dd60979b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:07:51 np0005593234 podman[283527]: 2026-01-23 10:07:51.769472632 +0000 UTC m=+0.154426660 container start 5c67812720d69eff9fe240d2b3461e16d62ecee7d0c720113fed2290dd60979b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 05:07:51 np0005593234 neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c[283542]: [NOTICE]   (283547) : New worker (283549) forked
Jan 23 05:07:51 np0005593234 neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c[283542]: [NOTICE]   (283547) : Loading success.
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.840 227766 DEBUG nova.network.neutron [req-c087cf09-711b-48cc-a758-004b35f92e40 req-e3f53910-792e-43c5-a720-68a8f5bb9cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Updated VIF entry in instance network info cache for port eaedac9f-972e-42ea-8823-d8239f00f66d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.842 227766 DEBUG nova.network.neutron [req-c087cf09-711b-48cc-a758-004b35f92e40 req-e3f53910-792e-43c5-a720-68a8f5bb9cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Updating instance_info_cache with network_info: [{"id": "eaedac9f-972e-42ea-8823-d8239f00f66d", "address": "fa:16:3e:4f:23:08", "network": {"id": "ce292c89-5b6f-435a-b5dd-18e7b02f207c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-894698614-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64e70435945f4186b0e2ab8143c69d80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaedac9f-97", "ovs_interfaceid": "eaedac9f-972e-42ea-8823-d8239f00f66d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.871 227766 DEBUG oslo_concurrency.lockutils [req-c087cf09-711b-48cc-a758-004b35f92e40 req-e3f53910-792e-43c5-a720-68a8f5bb9cf4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-d87dd410-da49-4f11-b99a-130005041777" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.962 227766 DEBUG nova.compute.manager [req-88a862c4-d777-4c65-8d26-f7bc2d1be09f req-58558786-1fd1-49f9-8d26-944295db641a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Received event network-vif-plugged-eaedac9f-972e-42ea-8823-d8239f00f66d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.963 227766 DEBUG oslo_concurrency.lockutils [req-88a862c4-d777-4c65-8d26-f7bc2d1be09f req-58558786-1fd1-49f9-8d26-944295db641a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d87dd410-da49-4f11-b99a-130005041777-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.963 227766 DEBUG oslo_concurrency.lockutils [req-88a862c4-d777-4c65-8d26-f7bc2d1be09f req-58558786-1fd1-49f9-8d26-944295db641a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.963 227766 DEBUG oslo_concurrency.lockutils [req-88a862c4-d777-4c65-8d26-f7bc2d1be09f req-58558786-1fd1-49f9-8d26-944295db641a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.963 227766 DEBUG nova.compute.manager [req-88a862c4-d777-4c65-8d26-f7bc2d1be09f req-58558786-1fd1-49f9-8d26-944295db641a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Processing event network-vif-plugged-eaedac9f-972e-42ea-8823-d8239f00f66d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.964 227766 DEBUG nova.compute.manager [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.971 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162871.971069, d87dd410-da49-4f11-b99a-130005041777 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.971 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d87dd410-da49-4f11-b99a-130005041777] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.973 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.976 227766 INFO nova.virt.libvirt.driver [-] [instance: d87dd410-da49-4f11-b99a-130005041777] Instance spawned successfully.#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.977 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.992 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d87dd410-da49-4f11-b99a-130005041777] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:51 np0005593234 nova_compute[227762]: 2026-01-23 10:07:51.997 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d87dd410-da49-4f11-b99a-130005041777] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:07:52 np0005593234 nova_compute[227762]: 2026-01-23 10:07:52.010 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:52 np0005593234 nova_compute[227762]: 2026-01-23 10:07:52.010 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:52 np0005593234 nova_compute[227762]: 2026-01-23 10:07:52.011 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:52 np0005593234 nova_compute[227762]: 2026-01-23 10:07:52.011 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:52 np0005593234 nova_compute[227762]: 2026-01-23 10:07:52.011 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:52 np0005593234 nova_compute[227762]: 2026-01-23 10:07:52.011 227766 DEBUG nova.virt.libvirt.driver [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:07:52 np0005593234 nova_compute[227762]: 2026-01-23 10:07:52.038 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d87dd410-da49-4f11-b99a-130005041777] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:07:52 np0005593234 nova_compute[227762]: 2026-01-23 10:07:52.081 227766 INFO nova.compute.manager [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Took 12.28 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:07:52 np0005593234 nova_compute[227762]: 2026-01-23 10:07:52.081 227766 DEBUG nova.compute.manager [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:07:52 np0005593234 nova_compute[227762]: 2026-01-23 10:07:52.217 227766 INFO nova.compute.manager [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Took 13.53 seconds to build instance.#033[00m
Jan 23 05:07:52 np0005593234 nova_compute[227762]: 2026-01-23 10:07:52.239 227766 DEBUG oslo_concurrency.lockutils [None req-43c47554-5bc5-4b65-9363-c1fecdc67120 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:07:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:52.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:07:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:53.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:54 np0005593234 nova_compute[227762]: 2026-01-23 10:07:54.373 227766 DEBUG nova.compute.manager [req-5be5c2dc-651a-40d3-93f0-a3c3dd4ea560 req-e3d128b7-183e-4e02-b861-f4a92c87a3ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Received event network-vif-plugged-eaedac9f-972e-42ea-8823-d8239f00f66d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:07:54 np0005593234 nova_compute[227762]: 2026-01-23 10:07:54.375 227766 DEBUG oslo_concurrency.lockutils [req-5be5c2dc-651a-40d3-93f0-a3c3dd4ea560 req-e3d128b7-183e-4e02-b861-f4a92c87a3ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d87dd410-da49-4f11-b99a-130005041777-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:54 np0005593234 nova_compute[227762]: 2026-01-23 10:07:54.375 227766 DEBUG oslo_concurrency.lockutils [req-5be5c2dc-651a-40d3-93f0-a3c3dd4ea560 req-e3d128b7-183e-4e02-b861-f4a92c87a3ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:54 np0005593234 nova_compute[227762]: 2026-01-23 10:07:54.376 227766 DEBUG oslo_concurrency.lockutils [req-5be5c2dc-651a-40d3-93f0-a3c3dd4ea560 req-e3d128b7-183e-4e02-b861-f4a92c87a3ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:54 np0005593234 nova_compute[227762]: 2026-01-23 10:07:54.376 227766 DEBUG nova.compute.manager [req-5be5c2dc-651a-40d3-93f0-a3c3dd4ea560 req-e3d128b7-183e-4e02-b861-f4a92c87a3ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] No waiting events found dispatching network-vif-plugged-eaedac9f-972e-42ea-8823-d8239f00f66d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:07:54 np0005593234 nova_compute[227762]: 2026-01-23 10:07:54.376 227766 WARNING nova.compute.manager [req-5be5c2dc-651a-40d3-93f0-a3c3dd4ea560 req-e3d128b7-183e-4e02-b861-f4a92c87a3ac 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Received unexpected event network-vif-plugged-eaedac9f-972e-42ea-8823-d8239f00f66d for instance with vm_state active and task_state None.#033[00m
Jan 23 05:07:54 np0005593234 nova_compute[227762]: 2026-01-23 10:07:54.633 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:07:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:54.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.145 227766 DEBUG oslo_concurrency.lockutils [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Acquiring lock "d87dd410-da49-4f11-b99a-130005041777" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.145 227766 DEBUG oslo_concurrency.lockutils [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.146 227766 DEBUG oslo_concurrency.lockutils [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Acquiring lock "d87dd410-da49-4f11-b99a-130005041777-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.146 227766 DEBUG oslo_concurrency.lockutils [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.146 227766 DEBUG oslo_concurrency.lockutils [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.148 227766 INFO nova.compute.manager [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Terminating instance#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.149 227766 DEBUG nova.compute.manager [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:07:55 np0005593234 kernel: tapeaedac9f-97 (unregistering): left promiscuous mode
Jan 23 05:07:55 np0005593234 NetworkManager[48942]: <info>  [1769162875.1903] device (tapeaedac9f-97): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.198 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:07:55Z|00482|binding|INFO|Releasing lport eaedac9f-972e-42ea-8823-d8239f00f66d from this chassis (sb_readonly=0)
Jan 23 05:07:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:07:55Z|00483|binding|INFO|Setting lport eaedac9f-972e-42ea-8823-d8239f00f66d down in Southbound
Jan 23 05:07:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:07:55Z|00484|binding|INFO|Removing iface tapeaedac9f-97 ovn-installed in OVS
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.202 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:55.206 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:23:08 10.100.0.7'], port_security=['fa:16:3e:4f:23:08 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'd87dd410-da49-4f11-b99a-130005041777', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ce292c89-5b6f-435a-b5dd-18e7b02f207c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '64e70435945f4186b0e2ab8143c69d80', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b7f099bf-0f5f-4211-a8e8-88471d592740', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9282aeaa-b9aa-49a7-9e59-fa3a634c8a40, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=eaedac9f-972e-42ea-8823-d8239f00f66d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:55.207 144381 INFO neutron.agent.ovn.metadata.agent [-] Port eaedac9f-972e-42ea-8823-d8239f00f66d in datapath ce292c89-5b6f-435a-b5dd-18e7b02f207c unbound from our chassis#033[00m
Jan 23 05:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:55.208 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ce292c89-5b6f-435a-b5dd-18e7b02f207c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:55.210 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b6db3265-c838-4140-843a-b2e2d2f9efc6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:55.210 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c namespace which is not needed anymore#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.218 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:55 np0005593234 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000078.scope: Deactivated successfully.
Jan 23 05:07:55 np0005593234 systemd[1]: machine-qemu\x2d56\x2dinstance\x2d00000078.scope: Consumed 3.609s CPU time.
Jan 23 05:07:55 np0005593234 systemd-machined[195626]: Machine qemu-56-instance-00000078 terminated.
Jan 23 05:07:55 np0005593234 neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c[283542]: [NOTICE]   (283547) : haproxy version is 2.8.14-c23fe91
Jan 23 05:07:55 np0005593234 neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c[283542]: [NOTICE]   (283547) : path to executable is /usr/sbin/haproxy
Jan 23 05:07:55 np0005593234 neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c[283542]: [WARNING]  (283547) : Exiting Master process...
Jan 23 05:07:55 np0005593234 neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c[283542]: [ALERT]    (283547) : Current worker (283549) exited with code 143 (Terminated)
Jan 23 05:07:55 np0005593234 neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c[283542]: [WARNING]  (283547) : All workers exited. Exiting... (0)
Jan 23 05:07:55 np0005593234 systemd[1]: libpod-5c67812720d69eff9fe240d2b3461e16d62ecee7d0c720113fed2290dd60979b.scope: Deactivated successfully.
Jan 23 05:07:55 np0005593234 podman[283584]: 2026-01-23 10:07:55.350243694 +0000 UTC m=+0.050502570 container died 5c67812720d69eff9fe240d2b3461e16d62ecee7d0c720113fed2290dd60979b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:07:55 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5c67812720d69eff9fe240d2b3461e16d62ecee7d0c720113fed2290dd60979b-userdata-shm.mount: Deactivated successfully.
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.387 227766 INFO nova.virt.libvirt.driver [-] [instance: d87dd410-da49-4f11-b99a-130005041777] Instance destroyed successfully.#033[00m
Jan 23 05:07:55 np0005593234 systemd[1]: var-lib-containers-storage-overlay-572a58eb717972b47506902fd14d35ed51b2e2ba05a219407b847f42f2f8971e-merged.mount: Deactivated successfully.
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.429 227766 DEBUG nova.objects.instance [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lazy-loading 'resources' on Instance uuid d87dd410-da49-4f11-b99a-130005041777 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:07:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:55.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:55 np0005593234 podman[283584]: 2026-01-23 10:07:55.438175033 +0000 UTC m=+0.138433899 container cleanup 5c67812720d69eff9fe240d2b3461e16d62ecee7d0c720113fed2290dd60979b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:07:55 np0005593234 systemd[1]: libpod-conmon-5c67812720d69eff9fe240d2b3461e16d62ecee7d0c720113fed2290dd60979b.scope: Deactivated successfully.
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.448 227766 DEBUG nova.virt.libvirt.vif [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:07:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerAddressesNegativeTestJSON-server-1255797485',display_name='tempest-ServerAddressesNegativeTestJSON-server-1255797485',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveraddressesnegativetestjson-server-1255797485',id=120,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:07:52Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='64e70435945f4186b0e2ab8143c69d80',ramdisk_id='',reservation_id='r-g2650g60',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerAddressesNegativeTestJSON-654457736',owner_user_name='tempest-ServerAddressesNegativeTestJSON-654457736-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:07:52Z,user_data=None,user_id='0cae28d3746642598ee191045d7c2970',uuid=d87dd410-da49-4f11-b99a-130005041777,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eaedac9f-972e-42ea-8823-d8239f00f66d", "address": "fa:16:3e:4f:23:08", "network": {"id": "ce292c89-5b6f-435a-b5dd-18e7b02f207c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-894698614-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64e70435945f4186b0e2ab8143c69d80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaedac9f-97", "ovs_interfaceid": "eaedac9f-972e-42ea-8823-d8239f00f66d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.448 227766 DEBUG nova.network.os_vif_util [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Converting VIF {"id": "eaedac9f-972e-42ea-8823-d8239f00f66d", "address": "fa:16:3e:4f:23:08", "network": {"id": "ce292c89-5b6f-435a-b5dd-18e7b02f207c", "bridge": "br-int", "label": "tempest-ServerAddressesNegativeTestJSON-894698614-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "64e70435945f4186b0e2ab8143c69d80", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeaedac9f-97", "ovs_interfaceid": "eaedac9f-972e-42ea-8823-d8239f00f66d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.449 227766 DEBUG nova.network.os_vif_util [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:23:08,bridge_name='br-int',has_traffic_filtering=True,id=eaedac9f-972e-42ea-8823-d8239f00f66d,network=Network(ce292c89-5b6f-435a-b5dd-18e7b02f207c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaedac9f-97') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.450 227766 DEBUG os_vif [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:23:08,bridge_name='br-int',has_traffic_filtering=True,id=eaedac9f-972e-42ea-8823-d8239f00f66d,network=Network(ce292c89-5b6f-435a-b5dd-18e7b02f207c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaedac9f-97') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.452 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.452 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeaedac9f-97, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.453 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.456 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.458 227766 INFO os_vif [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:23:08,bridge_name='br-int',has_traffic_filtering=True,id=eaedac9f-972e-42ea-8823-d8239f00f66d,network=Network(ce292c89-5b6f-435a-b5dd-18e7b02f207c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeaedac9f-97')#033[00m
Jan 23 05:07:55 np0005593234 podman[283622]: 2026-01-23 10:07:55.503354312 +0000 UTC m=+0.041384845 container remove 5c67812720d69eff9fe240d2b3461e16d62ecee7d0c720113fed2290dd60979b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:55.509 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6b788d-c68a-46c8-a633-ebb4763722ff]: (4, ('Fri Jan 23 10:07:55 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c (5c67812720d69eff9fe240d2b3461e16d62ecee7d0c720113fed2290dd60979b)\n5c67812720d69eff9fe240d2b3461e16d62ecee7d0c720113fed2290dd60979b\nFri Jan 23 10:07:55 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c (5c67812720d69eff9fe240d2b3461e16d62ecee7d0c720113fed2290dd60979b)\n5c67812720d69eff9fe240d2b3461e16d62ecee7d0c720113fed2290dd60979b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:55.511 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[10e34f52-ecb7-4bfc-82ee-cb703605a196]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:55.512 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapce292c89-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:07:55 np0005593234 kernel: tapce292c89-50: left promiscuous mode
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.514 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.529 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:55.532 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[22fe6a0b-71a0-4142-a3ca-b3e48b07283a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:55.545 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[65f548a4-bb1c-4bbf-b79b-8cb094a93780]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:55.547 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9bad9a95-fca0-4fdb-96e5-1594d56ccb22]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:55.565 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7143a458-2940-4a8d-8e5d-96a11bdfc1aa]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 676599, 'reachable_time': 30480, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 283655, 'error': None, 'target': 'ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:07:55 np0005593234 systemd[1]: run-netns-ovnmeta\x2dce292c89\x2d5b6f\x2d435a\x2db5dd\x2d18e7b02f207c.mount: Deactivated successfully.
Jan 23 05:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:55.570 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ce292c89-5b6f-435a-b5dd-18e7b02f207c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 05:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:07:55.570 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[5ee26530-68ac-4a56-b0b6-bcf5285a8d0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.904 227766 INFO nova.virt.libvirt.driver [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Deleting instance files /var/lib/nova/instances/d87dd410-da49-4f11-b99a-130005041777_del
Jan 23 05:07:55 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.905 227766 INFO nova.virt.libvirt.driver [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Deletion of /var/lib/nova/instances/d87dd410-da49-4f11-b99a-130005041777_del complete
Jan 23 05:07:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:55.999 227766 INFO nova.compute.manager [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Took 0.85 seconds to destroy the instance on the hypervisor.
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.000 227766 DEBUG oslo.service.loopingcall [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.000 227766 DEBUG nova.compute.manager [-] [instance: d87dd410-da49-4f11-b99a-130005041777] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.000 227766 DEBUG nova.network.neutron [-] [instance: d87dd410-da49-4f11-b99a-130005041777] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.445 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.585 227766 DEBUG nova.compute.manager [req-b304dcda-9d0a-46e6-aa3f-f3cc6e8c6a4e req-6a668f2f-23ce-4924-9401-227cc07e6b0c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Received event network-vif-unplugged-eaedac9f-972e-42ea-8823-d8239f00f66d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.586 227766 DEBUG oslo_concurrency.lockutils [req-b304dcda-9d0a-46e6-aa3f-f3cc6e8c6a4e req-6a668f2f-23ce-4924-9401-227cc07e6b0c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d87dd410-da49-4f11-b99a-130005041777-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.586 227766 DEBUG oslo_concurrency.lockutils [req-b304dcda-9d0a-46e6-aa3f-f3cc6e8c6a4e req-6a668f2f-23ce-4924-9401-227cc07e6b0c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.586 227766 DEBUG oslo_concurrency.lockutils [req-b304dcda-9d0a-46e6-aa3f-f3cc6e8c6a4e req-6a668f2f-23ce-4924-9401-227cc07e6b0c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.586 227766 DEBUG nova.compute.manager [req-b304dcda-9d0a-46e6-aa3f-f3cc6e8c6a4e req-6a668f2f-23ce-4924-9401-227cc07e6b0c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] No waiting events found dispatching network-vif-unplugged-eaedac9f-972e-42ea-8823-d8239f00f66d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.586 227766 DEBUG nova.compute.manager [req-b304dcda-9d0a-46e6-aa3f-f3cc6e8c6a4e req-6a668f2f-23ce-4924-9401-227cc07e6b0c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Received event network-vif-unplugged-eaedac9f-972e-42ea-8823-d8239f00f66d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.587 227766 DEBUG nova.compute.manager [req-b304dcda-9d0a-46e6-aa3f-f3cc6e8c6a4e req-6a668f2f-23ce-4924-9401-227cc07e6b0c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Received event network-vif-plugged-eaedac9f-972e-42ea-8823-d8239f00f66d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.587 227766 DEBUG oslo_concurrency.lockutils [req-b304dcda-9d0a-46e6-aa3f-f3cc6e8c6a4e req-6a668f2f-23ce-4924-9401-227cc07e6b0c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d87dd410-da49-4f11-b99a-130005041777-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.587 227766 DEBUG oslo_concurrency.lockutils [req-b304dcda-9d0a-46e6-aa3f-f3cc6e8c6a4e req-6a668f2f-23ce-4924-9401-227cc07e6b0c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.587 227766 DEBUG oslo_concurrency.lockutils [req-b304dcda-9d0a-46e6-aa3f-f3cc6e8c6a4e req-6a668f2f-23ce-4924-9401-227cc07e6b0c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.587 227766 DEBUG nova.compute.manager [req-b304dcda-9d0a-46e6-aa3f-f3cc6e8c6a4e req-6a668f2f-23ce-4924-9401-227cc07e6b0c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] No waiting events found dispatching network-vif-plugged-eaedac9f-972e-42ea-8823-d8239f00f66d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:07:56 np0005593234 nova_compute[227762]: 2026-01-23 10:07:56.588 227766 WARNING nova.compute.manager [req-b304dcda-9d0a-46e6-aa3f-f3cc6e8c6a4e req-6a668f2f-23ce-4924-9401-227cc07e6b0c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Received unexpected event network-vif-plugged-eaedac9f-972e-42ea-8823-d8239f00f66d for instance with vm_state active and task_state deleting.
Jan 23 05:07:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:56.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:57.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:57 np0005593234 nova_compute[227762]: 2026-01-23 10:07:57.558 227766 DEBUG nova.network.neutron [-] [instance: d87dd410-da49-4f11-b99a-130005041777] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:07:57 np0005593234 nova_compute[227762]: 2026-01-23 10:07:57.588 227766 INFO nova.compute.manager [-] [instance: d87dd410-da49-4f11-b99a-130005041777] Took 1.59 seconds to deallocate network for instance.
Jan 23 05:07:57 np0005593234 nova_compute[227762]: 2026-01-23 10:07:57.661 227766 DEBUG oslo_concurrency.lockutils [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:07:57 np0005593234 nova_compute[227762]: 2026-01-23 10:07:57.661 227766 DEBUG oslo_concurrency.lockutils [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:07:57 np0005593234 nova_compute[227762]: 2026-01-23 10:07:57.680 227766 DEBUG nova.compute.manager [req-be61a438-ea41-4914-ad80-ee956ae14c26 req-e9cde4be-70f3-4320-8a6c-af103dd1c1a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d87dd410-da49-4f11-b99a-130005041777] Received event network-vif-deleted-eaedac9f-972e-42ea-8823-d8239f00f66d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:07:57 np0005593234 nova_compute[227762]: 2026-01-23 10:07:57.825 227766 DEBUG oslo_concurrency.processutils [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:07:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:07:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3673835596' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:07:58 np0005593234 nova_compute[227762]: 2026-01-23 10:07:58.296 227766 DEBUG oslo_concurrency.processutils [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:07:58 np0005593234 nova_compute[227762]: 2026-01-23 10:07:58.302 227766 DEBUG nova.compute.provider_tree [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:07:58 np0005593234 nova_compute[227762]: 2026-01-23 10:07:58.332 227766 DEBUG nova.scheduler.client.report [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:07:58 np0005593234 nova_compute[227762]: 2026-01-23 10:07:58.371 227766 DEBUG oslo_concurrency.lockutils [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:07:58 np0005593234 nova_compute[227762]: 2026-01-23 10:07:58.410 227766 INFO nova.scheduler.client.report [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Deleted allocations for instance d87dd410-da49-4f11-b99a-130005041777
Jan 23 05:07:58 np0005593234 nova_compute[227762]: 2026-01-23 10:07:58.570 227766 DEBUG oslo_concurrency.lockutils [None req-e6860986-51d3-47fc-a760-761bd8a2d3ce 0cae28d3746642598ee191045d7c2970 64e70435945f4186b0e2ab8143c69d80 - - default default] Lock "d87dd410-da49-4f11-b99a-130005041777" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.425s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:07:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:58.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:07:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:07:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:07:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:59.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:00 np0005593234 nova_compute[227762]: 2026-01-23 10:08:00.455 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:08:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:00.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:01.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:01 np0005593234 nova_compute[227762]: 2026-01-23 10:08:01.446 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:08:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:02.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:02 np0005593234 nova_compute[227762]: 2026-01-23 10:08:02.738 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:08:02 np0005593234 nova_compute[227762]: 2026-01-23 10:08:02.739 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:08:02 np0005593234 podman[283683]: 2026-01-23 10:08:02.762645263 +0000 UTC m=+0.056339303 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 23 05:08:02 np0005593234 nova_compute[227762]: 2026-01-23 10:08:02.770 227766 DEBUG nova.compute.manager [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 05:08:02 np0005593234 nova_compute[227762]: 2026-01-23 10:08:02.876 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:08:02 np0005593234 nova_compute[227762]: 2026-01-23 10:08:02.876 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:08:02 np0005593234 nova_compute[227762]: 2026-01-23 10:08:02.888 227766 DEBUG nova.virt.hardware [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 05:08:02 np0005593234 nova_compute[227762]: 2026-01-23 10:08:02.889 227766 INFO nova.compute.claims [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Claim successful on node compute-2.ctlplane.example.com
Jan 23 05:08:02 np0005593234 nova_compute[227762]: 2026-01-23 10:08:02.913 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:08:02 np0005593234 nova_compute[227762]: 2026-01-23 10:08:02.995 227766 DEBUG oslo_concurrency.processutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:08:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:08:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3325692018' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:08:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:03.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.455 227766 DEBUG oslo_concurrency.processutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.462 227766 DEBUG nova.compute.provider_tree [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.520 227766 DEBUG nova.scheduler.client.report [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.571 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.572 227766 DEBUG nova.compute.manager [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.647 227766 DEBUG nova.compute.manager [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.648 227766 DEBUG nova.network.neutron [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.676 227766 INFO nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.698 227766 DEBUG nova.compute.manager [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.835 227766 DEBUG nova.compute.manager [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.837 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.837 227766 INFO nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Creating image(s)
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.861 227766 DEBUG nova.storage.rbd_utils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] rbd image 44fc19d2-6a98-47dd-803d-d3d6a2cc2486_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.883 227766 DEBUG nova.storage.rbd_utils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] rbd image 44fc19d2-6a98-47dd-803d-d3d6a2cc2486_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.915 227766 DEBUG nova.storage.rbd_utils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] rbd image 44fc19d2-6a98-47dd-803d-d3d6a2cc2486_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.920 227766 DEBUG oslo_concurrency.processutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.944 227766 DEBUG nova.policy [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c99d09acd2e849a69846a6ccda1e0bc7', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '924f976bcbb74ec195730b68eebe1f2a', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.984 227766 DEBUG oslo_concurrency.processutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.985 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.985 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:03 np0005593234 nova_compute[227762]: 2026-01-23 10:08:03.986 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:04 np0005593234 nova_compute[227762]: 2026-01-23 10:08:04.011 227766 DEBUG nova.storage.rbd_utils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] rbd image 44fc19d2-6a98-47dd-803d-d3d6a2cc2486_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:08:04 np0005593234 nova_compute[227762]: 2026-01-23 10:08:04.015 227766 DEBUG oslo_concurrency.processutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 44fc19d2-6a98-47dd-803d-d3d6a2cc2486_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:04 np0005593234 nova_compute[227762]: 2026-01-23 10:08:04.339 227766 DEBUG oslo_concurrency.processutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 44fc19d2-6a98-47dd-803d-d3d6a2cc2486_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.324s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:04 np0005593234 nova_compute[227762]: 2026-01-23 10:08:04.422 227766 DEBUG nova.storage.rbd_utils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] resizing rbd image 44fc19d2-6a98-47dd-803d-d3d6a2cc2486_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:08:04 np0005593234 nova_compute[227762]: 2026-01-23 10:08:04.542 227766 DEBUG nova.objects.instance [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lazy-loading 'migration_context' on Instance uuid 44fc19d2-6a98-47dd-803d-d3d6a2cc2486 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:08:04 np0005593234 nova_compute[227762]: 2026-01-23 10:08:04.559 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:08:04 np0005593234 nova_compute[227762]: 2026-01-23 10:08:04.559 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Ensure instance console log exists: /var/lib/nova/instances/44fc19d2-6a98-47dd-803d-d3d6a2cc2486/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:08:04 np0005593234 nova_compute[227762]: 2026-01-23 10:08:04.560 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:04 np0005593234 nova_compute[227762]: 2026-01-23 10:08:04.560 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:04 np0005593234 nova_compute[227762]: 2026-01-23 10:08:04.561 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:04.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:04 np0005593234 nova_compute[227762]: 2026-01-23 10:08:04.991 227766 DEBUG nova.network.neutron [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Successfully created port: dd535142-468c-4e43-b21c-8262c1d7ed46 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:08:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:05.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:05 np0005593234 nova_compute[227762]: 2026-01-23 10:08:05.458 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:06 np0005593234 nova_compute[227762]: 2026-01-23 10:08:06.260 227766 DEBUG nova.network.neutron [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Successfully updated port: dd535142-468c-4e43-b21c-8262c1d7ed46 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:08:06 np0005593234 nova_compute[227762]: 2026-01-23 10:08:06.307 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "refresh_cache-44fc19d2-6a98-47dd-803d-d3d6a2cc2486" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:08:06 np0005593234 nova_compute[227762]: 2026-01-23 10:08:06.307 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquired lock "refresh_cache-44fc19d2-6a98-47dd-803d-d3d6a2cc2486" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:08:06 np0005593234 nova_compute[227762]: 2026-01-23 10:08:06.307 227766 DEBUG nova.network.neutron [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:08:06 np0005593234 nova_compute[227762]: 2026-01-23 10:08:06.447 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:06 np0005593234 nova_compute[227762]: 2026-01-23 10:08:06.483 227766 DEBUG nova.compute.manager [req-35f463da-178c-4d94-991b-33d564c90849 req-79004319-0e61-4374-9c4d-48ecb815f45d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Received event network-changed-dd535142-468c-4e43-b21c-8262c1d7ed46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:08:06 np0005593234 nova_compute[227762]: 2026-01-23 10:08:06.484 227766 DEBUG nova.compute.manager [req-35f463da-178c-4d94-991b-33d564c90849 req-79004319-0e61-4374-9c4d-48ecb815f45d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Refreshing instance network info cache due to event network-changed-dd535142-468c-4e43-b21c-8262c1d7ed46. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:08:06 np0005593234 nova_compute[227762]: 2026-01-23 10:08:06.484 227766 DEBUG oslo_concurrency.lockutils [req-35f463da-178c-4d94-991b-33d564c90849 req-79004319-0e61-4374-9c4d-48ecb815f45d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-44fc19d2-6a98-47dd-803d-d3d6a2cc2486" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:08:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:06.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:06 np0005593234 nova_compute[227762]: 2026-01-23 10:08:06.721 227766 DEBUG nova.network.neutron [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:08:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:08:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:08:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:08:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:07.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:08:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:08:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:08.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.387 227766 DEBUG nova.network.neutron [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Updating instance_info_cache with network_info: [{"id": "dd535142-468c-4e43-b21c-8262c1d7ed46", "address": "fa:16:3e:46:41:dd", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd535142-46", "ovs_interfaceid": "dd535142-468c-4e43-b21c-8262c1d7ed46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:08:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:09.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.449 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Releasing lock "refresh_cache-44fc19d2-6a98-47dd-803d-d3d6a2cc2486" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.449 227766 DEBUG nova.compute.manager [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Instance network_info: |[{"id": "dd535142-468c-4e43-b21c-8262c1d7ed46", "address": "fa:16:3e:46:41:dd", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd535142-46", "ovs_interfaceid": "dd535142-468c-4e43-b21c-8262c1d7ed46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.450 227766 DEBUG oslo_concurrency.lockutils [req-35f463da-178c-4d94-991b-33d564c90849 req-79004319-0e61-4374-9c4d-48ecb815f45d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-44fc19d2-6a98-47dd-803d-d3d6a2cc2486" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.450 227766 DEBUG nova.network.neutron [req-35f463da-178c-4d94-991b-33d564c90849 req-79004319-0e61-4374-9c4d-48ecb815f45d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Refreshing network info cache for port dd535142-468c-4e43-b21c-8262c1d7ed46 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.452 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Start _get_guest_xml network_info=[{"id": "dd535142-468c-4e43-b21c-8262c1d7ed46", "address": "fa:16:3e:46:41:dd", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd535142-46", "ovs_interfaceid": "dd535142-468c-4e43-b21c-8262c1d7ed46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.457 227766 WARNING nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.463 227766 DEBUG nova.virt.libvirt.host [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.464 227766 DEBUG nova.virt.libvirt.host [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.466 227766 DEBUG nova.virt.libvirt.host [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.467 227766 DEBUG nova.virt.libvirt.host [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.468 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.469 227766 DEBUG nova.virt.hardware [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.469 227766 DEBUG nova.virt.hardware [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.469 227766 DEBUG nova.virt.hardware [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.470 227766 DEBUG nova.virt.hardware [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.470 227766 DEBUG nova.virt.hardware [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.470 227766 DEBUG nova.virt.hardware [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.470 227766 DEBUG nova.virt.hardware [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.471 227766 DEBUG nova.virt.hardware [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.471 227766 DEBUG nova.virt.hardware [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.471 227766 DEBUG nova.virt.hardware [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.471 227766 DEBUG nova.virt.hardware [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.474 227766 DEBUG oslo_concurrency.processutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:08:09 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2893249017' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.909 227766 DEBUG oslo_concurrency.processutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.934 227766 DEBUG nova.storage.rbd_utils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] rbd image 44fc19d2-6a98-47dd-803d-d3d6a2cc2486_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:08:09 np0005593234 nova_compute[227762]: 2026-01-23 10:08:09.938 227766 DEBUG oslo_concurrency.processutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.382 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162875.3816447, d87dd410-da49-4f11-b99a-130005041777 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.383 227766 INFO nova.compute.manager [-] [instance: d87dd410-da49-4f11-b99a-130005041777] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.412 227766 DEBUG nova.compute.manager [None req-c575b92e-f372-4c20-8de9-9729c29c6509 - - - - - -] [instance: d87dd410-da49-4f11-b99a-130005041777] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:08:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:08:10 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2919745016' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.462 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.566 227766 DEBUG oslo_concurrency.processutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.627s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.567 227766 DEBUG nova.virt.libvirt.vif [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:08:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1729959968',display_name='tempest-AttachVolumeNegativeTest-server-1729959968',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1729959968',id=123,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHXa4OeH8DK3YeFO66XIr0uYOp9qQ0Pe9xS8kQR1WC8bD6iRszrhkHpZRj7CHrlEJe+PAgfdJlK8iQw4IKHGn1w4RvSiiqiFwp8IyluzUc7xoPCgtFKyqvu1kEnS0jpeTw==',key_name='tempest-keypair-1592454648',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='924f976bcbb74ec195730b68eebe1f2a',ramdisk_id='',reservation_id='r-2j3t8nfd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1470050886',owner_user_name='tempest-AttachVolumeNegativeTest-1470050886-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:08:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c99d09acd2e849a69846a6ccda1e0bc7',uuid=44fc19d2-6a98-47dd-803d-d3d6a2cc2486,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd535142-468c-4e43-b21c-8262c1d7ed46", "address": "fa:16:3e:46:41:dd", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd535142-46", "ovs_interfaceid": "dd535142-468c-4e43-b21c-8262c1d7ed46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.567 227766 DEBUG nova.network.os_vif_util [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Converting VIF {"id": "dd535142-468c-4e43-b21c-8262c1d7ed46", "address": "fa:16:3e:46:41:dd", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd535142-46", "ovs_interfaceid": "dd535142-468c-4e43-b21c-8262c1d7ed46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.568 227766 DEBUG nova.network.os_vif_util [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:41:dd,bridge_name='br-int',has_traffic_filtering=True,id=dd535142-468c-4e43-b21c-8262c1d7ed46,network=Network(93735878-f62d-4a5f-96df-bf97f85d787a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd535142-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.569 227766 DEBUG nova.objects.instance [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lazy-loading 'pci_devices' on Instance uuid 44fc19d2-6a98-47dd-803d-d3d6a2cc2486 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.594 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  <uuid>44fc19d2-6a98-47dd-803d-d3d6a2cc2486</uuid>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  <name>instance-0000007b</name>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <nova:name>tempest-AttachVolumeNegativeTest-server-1729959968</nova:name>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:08:09</nova:creationTime>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <nova:user uuid="c99d09acd2e849a69846a6ccda1e0bc7">tempest-AttachVolumeNegativeTest-1470050886-project-member</nova:user>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <nova:project uuid="924f976bcbb74ec195730b68eebe1f2a">tempest-AttachVolumeNegativeTest-1470050886</nova:project>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <nova:port uuid="dd535142-468c-4e43-b21c-8262c1d7ed46">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <entry name="serial">44fc19d2-6a98-47dd-803d-d3d6a2cc2486</entry>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <entry name="uuid">44fc19d2-6a98-47dd-803d-d3d6a2cc2486</entry>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/44fc19d2-6a98-47dd-803d-d3d6a2cc2486_disk">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/44fc19d2-6a98-47dd-803d-d3d6a2cc2486_disk.config">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:46:41:dd"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <target dev="tapdd535142-46"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/44fc19d2-6a98-47dd-803d-d3d6a2cc2486/console.log" append="off"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:08:10 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:08:10 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:08:10 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:08:10 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.596 227766 DEBUG nova.compute.manager [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Preparing to wait for external event network-vif-plugged-dd535142-468c-4e43-b21c-8262c1d7ed46 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.596 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.597 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.597 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.598 227766 DEBUG nova.virt.libvirt.vif [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:08:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1729959968',display_name='tempest-AttachVolumeNegativeTest-server-1729959968',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1729959968',id=123,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHXa4OeH8DK3YeFO66XIr0uYOp9qQ0Pe9xS8kQR1WC8bD6iRszrhkHpZRj7CHrlEJe+PAgfdJlK8iQw4IKHGn1w4RvSiiqiFwp8IyluzUc7xoPCgtFKyqvu1kEnS0jpeTw==',key_name='tempest-keypair-1592454648',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='924f976bcbb74ec195730b68eebe1f2a',ramdisk_id='',reservation_id='r-2j3t8nfd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeNegativeTest-1470050886',owner_user_name='tempest-AttachVolumeNegativeTest-1470050886-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:08:03Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c99d09acd2e849a69846a6ccda1e0bc7',uuid=44fc19d2-6a98-47dd-803d-d3d6a2cc2486,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dd535142-468c-4e43-b21c-8262c1d7ed46", "address": "fa:16:3e:46:41:dd", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd535142-46", "ovs_interfaceid": "dd535142-468c-4e43-b21c-8262c1d7ed46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.598 227766 DEBUG nova.network.os_vif_util [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Converting VIF {"id": "dd535142-468c-4e43-b21c-8262c1d7ed46", "address": "fa:16:3e:46:41:dd", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd535142-46", "ovs_interfaceid": "dd535142-468c-4e43-b21c-8262c1d7ed46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.599 227766 DEBUG nova.network.os_vif_util [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:46:41:dd,bridge_name='br-int',has_traffic_filtering=True,id=dd535142-468c-4e43-b21c-8262c1d7ed46,network=Network(93735878-f62d-4a5f-96df-bf97f85d787a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd535142-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.600 227766 DEBUG os_vif [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:41:dd,bridge_name='br-int',has_traffic_filtering=True,id=dd535142-468c-4e43-b21c-8262c1d7ed46,network=Network(93735878-f62d-4a5f-96df-bf97f85d787a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd535142-46') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.600 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.601 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.601 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.606 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.606 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdd535142-46, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.607 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdd535142-46, col_values=(('external_ids', {'iface-id': 'dd535142-468c-4e43-b21c-8262c1d7ed46', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:46:41:dd', 'vm-uuid': '44fc19d2-6a98-47dd-803d-d3d6a2cc2486'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.609 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:10 np0005593234 NetworkManager[48942]: <info>  [1769162890.6096] manager: (tapdd535142-46): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/249)
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.610 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.617 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.618 227766 INFO os_vif [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:46:41:dd,bridge_name='br-int',has_traffic_filtering=True,id=dd535142-468c-4e43-b21c-8262c1d7ed46,network=Network(93735878-f62d-4a5f-96df-bf97f85d787a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd535142-46')#033[00m
Jan 23 05:08:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:10.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.736 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.736 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.737 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] No VIF found with MAC fa:16:3e:46:41:dd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.737 227766 INFO nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Using config drive#033[00m
Jan 23 05:08:10 np0005593234 nova_compute[227762]: 2026-01-23 10:08:10.762 227766 DEBUG nova.storage.rbd_utils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] rbd image 44fc19d2-6a98-47dd-803d-d3d6a2cc2486_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:08:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:11 np0005593234 nova_compute[227762]: 2026-01-23 10:08:11.448 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:08:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:11.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:08:11 np0005593234 nova_compute[227762]: 2026-01-23 10:08:11.739 227766 DEBUG nova.network.neutron [req-35f463da-178c-4d94-991b-33d564c90849 req-79004319-0e61-4374-9c4d-48ecb815f45d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Updated VIF entry in instance network info cache for port dd535142-468c-4e43-b21c-8262c1d7ed46. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:08:11 np0005593234 nova_compute[227762]: 2026-01-23 10:08:11.740 227766 DEBUG nova.network.neutron [req-35f463da-178c-4d94-991b-33d564c90849 req-79004319-0e61-4374-9c4d-48ecb815f45d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Updating instance_info_cache with network_info: [{"id": "dd535142-468c-4e43-b21c-8262c1d7ed46", "address": "fa:16:3e:46:41:dd", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd535142-46", "ovs_interfaceid": "dd535142-468c-4e43-b21c-8262c1d7ed46", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:08:11 np0005593234 nova_compute[227762]: 2026-01-23 10:08:11.791 227766 DEBUG oslo_concurrency.lockutils [req-35f463da-178c-4d94-991b-33d564c90849 req-79004319-0e61-4374-9c4d-48ecb815f45d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-44fc19d2-6a98-47dd-803d-d3d6a2cc2486" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:08:12 np0005593234 nova_compute[227762]: 2026-01-23 10:08:12.574 227766 INFO nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Creating config drive at /var/lib/nova/instances/44fc19d2-6a98-47dd-803d-d3d6a2cc2486/disk.config#033[00m
Jan 23 05:08:12 np0005593234 nova_compute[227762]: 2026-01-23 10:08:12.581 227766 DEBUG oslo_concurrency.processutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/44fc19d2-6a98-47dd-803d-d3d6a2cc2486/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4legtwzd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:12.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:12 np0005593234 nova_compute[227762]: 2026-01-23 10:08:12.715 227766 DEBUG oslo_concurrency.processutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/44fc19d2-6a98-47dd-803d-d3d6a2cc2486/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4legtwzd" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:12 np0005593234 nova_compute[227762]: 2026-01-23 10:08:12.745 227766 DEBUG nova.storage.rbd_utils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] rbd image 44fc19d2-6a98-47dd-803d-d3d6a2cc2486_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:08:12 np0005593234 nova_compute[227762]: 2026-01-23 10:08:12.749 227766 DEBUG oslo_concurrency.processutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/44fc19d2-6a98-47dd-803d-d3d6a2cc2486/disk.config 44fc19d2-6a98-47dd-803d-d3d6a2cc2486_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:12 np0005593234 nova_compute[227762]: 2026-01-23 10:08:12.898 227766 DEBUG oslo_concurrency.processutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/44fc19d2-6a98-47dd-803d-d3d6a2cc2486/disk.config 44fc19d2-6a98-47dd-803d-d3d6a2cc2486_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:12 np0005593234 nova_compute[227762]: 2026-01-23 10:08:12.899 227766 INFO nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Deleting local config drive /var/lib/nova/instances/44fc19d2-6a98-47dd-803d-d3d6a2cc2486/disk.config because it was imported into RBD.#033[00m
Jan 23 05:08:12 np0005593234 kernel: tapdd535142-46: entered promiscuous mode
Jan 23 05:08:12 np0005593234 NetworkManager[48942]: <info>  [1769162892.9464] manager: (tapdd535142-46): new Tun device (/org/freedesktop/NetworkManager/Devices/250)
Jan 23 05:08:12 np0005593234 ovn_controller[134547]: 2026-01-23T10:08:12Z|00485|binding|INFO|Claiming lport dd535142-468c-4e43-b21c-8262c1d7ed46 for this chassis.
Jan 23 05:08:12 np0005593234 nova_compute[227762]: 2026-01-23 10:08:12.945 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:12 np0005593234 ovn_controller[134547]: 2026-01-23T10:08:12Z|00486|binding|INFO|dd535142-468c-4e43-b21c-8262c1d7ed46: Claiming fa:16:3e:46:41:dd 10.100.0.14
Jan 23 05:08:12 np0005593234 nova_compute[227762]: 2026-01-23 10:08:12.952 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:12 np0005593234 nova_compute[227762]: 2026-01-23 10:08:12.954 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:12 np0005593234 nova_compute[227762]: 2026-01-23 10:08:12.960 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:12 np0005593234 nova_compute[227762]: 2026-01-23 10:08:12.962 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:12 np0005593234 NetworkManager[48942]: <info>  [1769162892.9629] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/251)
Jan 23 05:08:12 np0005593234 NetworkManager[48942]: <info>  [1769162892.9637] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/252)
Jan 23 05:08:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:12.969 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:41:dd 10.100.0.14'], port_security=['fa:16:3e:46:41:dd 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '44fc19d2-6a98-47dd-803d-d3d6a2cc2486', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-93735878-f62d-4a5f-96df-bf97f85d787a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '924f976bcbb74ec195730b68eebe1f2a', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7f12f7be-3ed6-4af6-ac42-cfa64f874b41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1f72e5c-e22f-424b-b6ed-0c502ff13aa3, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=dd535142-468c-4e43-b21c-8262c1d7ed46) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:08:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:12.971 144381 INFO neutron.agent.ovn.metadata.agent [-] Port dd535142-468c-4e43-b21c-8262c1d7ed46 in datapath 93735878-f62d-4a5f-96df-bf97f85d787a bound to our chassis#033[00m
Jan 23 05:08:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:12.972 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 93735878-f62d-4a5f-96df-bf97f85d787a#033[00m
Jan 23 05:08:12 np0005593234 systemd-machined[195626]: New machine qemu-57-instance-0000007b.
Jan 23 05:08:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:12.983 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[faec3960-13e5-429c-9830-6b886d909fcd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:12.984 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap93735878-f1 in ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:08:12 np0005593234 systemd[1]: Started Virtual Machine qemu-57-instance-0000007b.
Jan 23 05:08:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:12.986 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap93735878-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:08:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:12.986 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[da9f745d-5295-4048-a695-5c090dcd2ef1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:12.987 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f97d67c3-46cd-4570-9227-a369ecf85157]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:13 np0005593234 systemd-udevd[284224]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:12.999 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[b88059be-6bab-43f3-a2ae-b42c9c14c70f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:13 np0005593234 NetworkManager[48942]: <info>  [1769162893.0130] device (tapdd535142-46): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:08:13 np0005593234 NetworkManager[48942]: <info>  [1769162893.0141] device (tapdd535142-46): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.027 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[debedf42-9d15-4a76-9678-be010d1ddded]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.059 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc61a0e-42ca-40ac-a55e-124a49d65003]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:13 np0005593234 systemd-udevd[284228]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.067 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7350fc82-e890-47d1-a8eb-52555448c76c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:13 np0005593234 NetworkManager[48942]: <info>  [1769162893.0682] manager: (tap93735878-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/253)
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.099 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a10e8228-066e-48cd-8f95-5a633f12f7a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.103 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[92976d74-1d39-4ece-bddd-7f8239ebe684]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:13 np0005593234 NetworkManager[48942]: <info>  [1769162893.1292] device (tap93735878-f0): carrier: link connected
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.135 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b5342211-9bf7-4a84-a6a0-12da21052ffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.153 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[451df0cb-c5dd-4d7b-b80d-59ef3854cf12]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap93735878-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:41:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678808, 'reachable_time': 24914, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284270, 'error': None, 'target': 'ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.156 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:13 np0005593234 podman[284208]: 2026-01-23 10:08:13.159793447 +0000 UTC m=+0.182721055 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.182 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.180 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd764e2-693e-4f33-ab14-681ac436be16]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec0:41c8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 678808, 'tstamp': 678808}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 284271, 'error': None, 'target': 'ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:08:13Z|00487|binding|INFO|Setting lport dd535142-468c-4e43-b21c-8262c1d7ed46 ovn-installed in OVS
Jan 23 05:08:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:08:13Z|00488|binding|INFO|Setting lport dd535142-468c-4e43-b21c-8262c1d7ed46 up in Southbound
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.195 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.202 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[be00657b-4a74-49fd-9b43-41905223f68e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap93735878-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c0:41:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 158], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678808, 'reachable_time': 24914, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 284272, 'error': None, 'target': 'ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.233 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[16d0dce4-98c3-4d49-af23-a95ef5db6347]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.302 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a249df-bb1b-4040-a7c3-8b5bac721ab8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.304 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93735878-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.304 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.305 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap93735878-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:13 np0005593234 NetworkManager[48942]: <info>  [1769162893.3068] manager: (tap93735878-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/254)
Jan 23 05:08:13 np0005593234 kernel: tap93735878-f0: entered promiscuous mode
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.306 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.309 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.310 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap93735878-f0, col_values=(('external_ids', {'iface-id': 'c75eef02-aabe-4477-9239-97f7fb86cd02'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.310 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:08:13Z|00489|binding|INFO|Releasing lport c75eef02-aabe-4477-9239-97f7fb86cd02 from this chassis (sb_readonly=0)
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.325 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.325 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/93735878-f62d-4a5f-96df-bf97f85d787a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/93735878-f62d-4a5f-96df-bf97f85d787a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.326 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b4232362-1551-42c9-b45f-080344aa397f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.327 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-93735878-f62d-4a5f-96df-bf97f85d787a
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/93735878-f62d-4a5f-96df-bf97f85d787a.pid.haproxy
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 93735878-f62d-4a5f-96df-bf97f85d787a
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:08:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:13.328 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a', 'env', 'PROCESS_TAG=haproxy-93735878-f62d-4a5f-96df-bf97f85d787a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/93735878-f62d-4a5f-96df-bf97f85d787a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.398 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162893.3966482, 44fc19d2-6a98-47dd-803d-d3d6a2cc2486 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.399 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] VM Started (Lifecycle Event)#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.437 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:13.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.457 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.464 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162893.396825, 44fc19d2-6a98-47dd-803d-d3d6a2cc2486 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.464 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.542 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.549 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.605 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:08:13 np0005593234 podman[284371]: 2026-01-23 10:08:13.693468696 +0000 UTC m=+0.052708260 container create b31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:08:13 np0005593234 systemd[1]: Started libpod-conmon-b31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df.scope.
Jan 23 05:08:13 np0005593234 podman[284371]: 2026-01-23 10:08:13.66449771 +0000 UTC m=+0.023737304 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:08:13 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:08:13 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8653691ea21f0ed64ed5993f2d754d90684a3570a4b7bfe2c7ba438c4e830806/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.783 227766 DEBUG nova.compute.manager [req-08bed78d-e01a-4890-9e7e-16cb01c532b6 req-c8a7196c-d13a-48f9-baf9-b48e49f13a0d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Received event network-vif-plugged-dd535142-468c-4e43-b21c-8262c1d7ed46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.784 227766 DEBUG oslo_concurrency.lockutils [req-08bed78d-e01a-4890-9e7e-16cb01c532b6 req-c8a7196c-d13a-48f9-baf9-b48e49f13a0d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.784 227766 DEBUG oslo_concurrency.lockutils [req-08bed78d-e01a-4890-9e7e-16cb01c532b6 req-c8a7196c-d13a-48f9-baf9-b48e49f13a0d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.785 227766 DEBUG oslo_concurrency.lockutils [req-08bed78d-e01a-4890-9e7e-16cb01c532b6 req-c8a7196c-d13a-48f9-baf9-b48e49f13a0d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.785 227766 DEBUG nova.compute.manager [req-08bed78d-e01a-4890-9e7e-16cb01c532b6 req-c8a7196c-d13a-48f9-baf9-b48e49f13a0d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Processing event network-vif-plugged-dd535142-468c-4e43-b21c-8262c1d7ed46 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.786 227766 DEBUG nova.compute.manager [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:08:13 np0005593234 podman[284371]: 2026-01-23 10:08:13.789904501 +0000 UTC m=+0.149144085 container init b31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.791 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162893.7913518, 44fc19d2-6a98-47dd-803d-d3d6a2cc2486 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.791 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.793 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:08:13 np0005593234 podman[284371]: 2026-01-23 10:08:13.796733895 +0000 UTC m=+0.155973449 container start b31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.796 227766 INFO nova.virt.libvirt.driver [-] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Instance spawned successfully.#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.797 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:08:13 np0005593234 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[284412]: [NOTICE]   (284416) : New worker (284418) forked
Jan 23 05:08:13 np0005593234 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[284412]: [NOTICE]   (284416) : Loading success.
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.850 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.857 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.860 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.860 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.861 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.861 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.861 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.862 227766 DEBUG nova.virt.libvirt.driver [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.899 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.971 227766 INFO nova.compute.manager [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Took 10.14 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:08:13 np0005593234 nova_compute[227762]: 2026-01-23 10:08:13.972 227766 DEBUG nova.compute.manager [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:08:14 np0005593234 nova_compute[227762]: 2026-01-23 10:08:14.057 227766 INFO nova.compute.manager [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Took 11.22 seconds to build instance.#033[00m
Jan 23 05:08:14 np0005593234 nova_compute[227762]: 2026-01-23 10:08:14.095 227766 DEBUG oslo_concurrency.lockutils [None req-3f959962-d37e-46bc-b2ea-64b28f1a75ea c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:14 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:08:14 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:08:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:14.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:15.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:15 np0005593234 nova_compute[227762]: 2026-01-23 10:08:15.611 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:16 np0005593234 nova_compute[227762]: 2026-01-23 10:08:16.451 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:16.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:16 np0005593234 nova_compute[227762]: 2026-01-23 10:08:16.962 227766 DEBUG nova.compute.manager [req-a144612b-5adf-4fde-9b9e-6265f41f5982 req-4d446561-848c-4ac3-83f4-08b36b4fb1f6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Received event network-vif-plugged-dd535142-468c-4e43-b21c-8262c1d7ed46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:08:16 np0005593234 nova_compute[227762]: 2026-01-23 10:08:16.963 227766 DEBUG oslo_concurrency.lockutils [req-a144612b-5adf-4fde-9b9e-6265f41f5982 req-4d446561-848c-4ac3-83f4-08b36b4fb1f6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:16 np0005593234 nova_compute[227762]: 2026-01-23 10:08:16.963 227766 DEBUG oslo_concurrency.lockutils [req-a144612b-5adf-4fde-9b9e-6265f41f5982 req-4d446561-848c-4ac3-83f4-08b36b4fb1f6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:16 np0005593234 nova_compute[227762]: 2026-01-23 10:08:16.963 227766 DEBUG oslo_concurrency.lockutils [req-a144612b-5adf-4fde-9b9e-6265f41f5982 req-4d446561-848c-4ac3-83f4-08b36b4fb1f6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:16 np0005593234 nova_compute[227762]: 2026-01-23 10:08:16.964 227766 DEBUG nova.compute.manager [req-a144612b-5adf-4fde-9b9e-6265f41f5982 req-4d446561-848c-4ac3-83f4-08b36b4fb1f6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] No waiting events found dispatching network-vif-plugged-dd535142-468c-4e43-b21c-8262c1d7ed46 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:08:16 np0005593234 nova_compute[227762]: 2026-01-23 10:08:16.964 227766 WARNING nova.compute.manager [req-a144612b-5adf-4fde-9b9e-6265f41f5982 req-4d446561-848c-4ac3-83f4-08b36b4fb1f6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Received unexpected event network-vif-plugged-dd535142-468c-4e43-b21c-8262c1d7ed46 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:08:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:17.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:18.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:19.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:20 np0005593234 nova_compute[227762]: 2026-01-23 10:08:20.614 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:20.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:21 np0005593234 nova_compute[227762]: 2026-01-23 10:08:21.455 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:21.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:22.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:23.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:24.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:25.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:25 np0005593234 nova_compute[227762]: 2026-01-23 10:08:25.617 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:26 np0005593234 nova_compute[227762]: 2026-01-23 10:08:26.459 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:26.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:27 np0005593234 nova_compute[227762]: 2026-01-23 10:08:27.331 227766 DEBUG nova.compute.manager [req-be6cb517-ab67-4ac5-9b03-5c19368aed2d req-36533d1d-58a8-4a4a-8eb9-bdf58f5cf96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Received event network-changed-dd535142-468c-4e43-b21c-8262c1d7ed46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:08:27 np0005593234 nova_compute[227762]: 2026-01-23 10:08:27.332 227766 DEBUG nova.compute.manager [req-be6cb517-ab67-4ac5-9b03-5c19368aed2d req-36533d1d-58a8-4a4a-8eb9-bdf58f5cf96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Refreshing instance network info cache due to event network-changed-dd535142-468c-4e43-b21c-8262c1d7ed46. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:08:27 np0005593234 nova_compute[227762]: 2026-01-23 10:08:27.333 227766 DEBUG oslo_concurrency.lockutils [req-be6cb517-ab67-4ac5-9b03-5c19368aed2d req-36533d1d-58a8-4a4a-8eb9-bdf58f5cf96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-44fc19d2-6a98-47dd-803d-d3d6a2cc2486" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:08:27 np0005593234 nova_compute[227762]: 2026-01-23 10:08:27.333 227766 DEBUG oslo_concurrency.lockutils [req-be6cb517-ab67-4ac5-9b03-5c19368aed2d req-36533d1d-58a8-4a4a-8eb9-bdf58f5cf96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-44fc19d2-6a98-47dd-803d-d3d6a2cc2486" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:08:27 np0005593234 nova_compute[227762]: 2026-01-23 10:08:27.333 227766 DEBUG nova.network.neutron [req-be6cb517-ab67-4ac5-9b03-5c19368aed2d req-36533d1d-58a8-4a4a-8eb9-bdf58f5cf96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Refreshing network info cache for port dd535142-468c-4e43-b21c-8262c1d7ed46 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:08:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:27.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:27 np0005593234 nova_compute[227762]: 2026-01-23 10:08:27.944 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:08:28Z|00050|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:46:41:dd 10.100.0.14
Jan 23 05:08:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:08:28Z|00051|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:46:41:dd 10.100.0.14
Jan 23 05:08:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:28.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:29.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:30 np0005593234 nova_compute[227762]: 2026-01-23 10:08:30.619 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:30.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:31 np0005593234 nova_compute[227762]: 2026-01-23 10:08:31.462 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:08:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:31.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:08:31 np0005593234 nova_compute[227762]: 2026-01-23 10:08:31.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:32.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:33 np0005593234 nova_compute[227762]: 2026-01-23 10:08:33.392 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:33 np0005593234 nova_compute[227762]: 2026-01-23 10:08:33.393 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:33 np0005593234 nova_compute[227762]: 2026-01-23 10:08:33.393 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:33 np0005593234 nova_compute[227762]: 2026-01-23 10:08:33.394 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:08:33 np0005593234 nova_compute[227762]: 2026-01-23 10:08:33.394 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:33.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:33 np0005593234 podman[284508]: 2026-01-23 10:08:33.770299149 +0000 UTC m=+0.063548438 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:08:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:08:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3514492820' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:08:33 np0005593234 nova_compute[227762]: 2026-01-23 10:08:33.908 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:33 np0005593234 nova_compute[227762]: 2026-01-23 10:08:33.992 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000007b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:08:33 np0005593234 nova_compute[227762]: 2026-01-23 10:08:33.992 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000007b as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:08:34 np0005593234 nova_compute[227762]: 2026-01-23 10:08:34.155 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:08:34 np0005593234 nova_compute[227762]: 2026-01-23 10:08:34.157 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4304MB free_disk=20.80620574951172GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:08:34 np0005593234 nova_compute[227762]: 2026-01-23 10:08:34.157 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:34 np0005593234 nova_compute[227762]: 2026-01-23 10:08:34.157 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:34 np0005593234 nova_compute[227762]: 2026-01-23 10:08:34.291 227766 DEBUG nova.network.neutron [req-be6cb517-ab67-4ac5-9b03-5c19368aed2d req-36533d1d-58a8-4a4a-8eb9-bdf58f5cf96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Updated VIF entry in instance network info cache for port dd535142-468c-4e43-b21c-8262c1d7ed46. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:08:34 np0005593234 nova_compute[227762]: 2026-01-23 10:08:34.291 227766 DEBUG nova.network.neutron [req-be6cb517-ab67-4ac5-9b03-5c19368aed2d req-36533d1d-58a8-4a4a-8eb9-bdf58f5cf96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Updating instance_info_cache with network_info: [{"id": "dd535142-468c-4e43-b21c-8262c1d7ed46", "address": "fa:16:3e:46:41:dd", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd535142-46", "ovs_interfaceid": "dd535142-468c-4e43-b21c-8262c1d7ed46", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:08:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:34.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:35.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:35 np0005593234 nova_compute[227762]: 2026-01-23 10:08:35.622 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:35 np0005593234 nova_compute[227762]: 2026-01-23 10:08:35.979 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 44fc19d2-6a98-47dd-803d-d3d6a2cc2486 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:08:35 np0005593234 nova_compute[227762]: 2026-01-23 10:08:35.980 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:08:35 np0005593234 nova_compute[227762]: 2026-01-23 10:08:35.980 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:08:35 np0005593234 nova_compute[227762]: 2026-01-23 10:08:35.984 227766 DEBUG oslo_concurrency.lockutils [req-be6cb517-ab67-4ac5-9b03-5c19368aed2d req-36533d1d-58a8-4a4a-8eb9-bdf58f5cf96e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-44fc19d2-6a98-47dd-803d-d3d6a2cc2486" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:08:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e268 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:36 np0005593234 nova_compute[227762]: 2026-01-23 10:08:36.256 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:36 np0005593234 nova_compute[227762]: 2026-01-23 10:08:36.516 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:08:36 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1762005904' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:08:36 np0005593234 nova_compute[227762]: 2026-01-23 10:08:36.706 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:36.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:36 np0005593234 nova_compute[227762]: 2026-01-23 10:08:36.713 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:08:36 np0005593234 nova_compute[227762]: 2026-01-23 10:08:36.742 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:08:36 np0005593234 nova_compute[227762]: 2026-01-23 10:08:36.810 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:08:36 np0005593234 nova_compute[227762]: 2026-01-23 10:08:36.810 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:08:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:37.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:08:37 np0005593234 nova_compute[227762]: 2026-01-23 10:08:37.810 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:37 np0005593234 nova_compute[227762]: 2026-01-23 10:08:37.811 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:37 np0005593234 nova_compute[227762]: 2026-01-23 10:08:37.811 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:38.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:38 np0005593234 nova_compute[227762]: 2026-01-23 10:08:38.975 227766 DEBUG oslo_concurrency.lockutils [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:38 np0005593234 nova_compute[227762]: 2026-01-23 10:08:38.976 227766 DEBUG oslo_concurrency.lockutils [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:38 np0005593234 nova_compute[227762]: 2026-01-23 10:08:38.976 227766 DEBUG oslo_concurrency.lockutils [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:38 np0005593234 nova_compute[227762]: 2026-01-23 10:08:38.976 227766 DEBUG oslo_concurrency.lockutils [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:38 np0005593234 nova_compute[227762]: 2026-01-23 10:08:38.976 227766 DEBUG oslo_concurrency.lockutils [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:38 np0005593234 nova_compute[227762]: 2026-01-23 10:08:38.978 227766 INFO nova.compute.manager [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Terminating instance#033[00m
Jan 23 05:08:38 np0005593234 nova_compute[227762]: 2026-01-23 10:08:38.979 227766 DEBUG nova.compute.manager [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:08:39 np0005593234 kernel: tapdd535142-46 (unregistering): left promiscuous mode
Jan 23 05:08:39 np0005593234 NetworkManager[48942]: <info>  [1769162919.0439] device (tapdd535142-46): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:08:39 np0005593234 ovn_controller[134547]: 2026-01-23T10:08:39Z|00490|binding|INFO|Releasing lport dd535142-468c-4e43-b21c-8262c1d7ed46 from this chassis (sb_readonly=0)
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.050 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:39 np0005593234 ovn_controller[134547]: 2026-01-23T10:08:39Z|00491|binding|INFO|Setting lport dd535142-468c-4e43-b21c-8262c1d7ed46 down in Southbound
Jan 23 05:08:39 np0005593234 ovn_controller[134547]: 2026-01-23T10:08:39Z|00492|binding|INFO|Removing iface tapdd535142-46 ovn-installed in OVS
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.056 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:39.066 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:46:41:dd 10.100.0.14'], port_security=['fa:16:3e:46:41:dd 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '44fc19d2-6a98-47dd-803d-d3d6a2cc2486', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-93735878-f62d-4a5f-96df-bf97f85d787a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '924f976bcbb74ec195730b68eebe1f2a', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7f12f7be-3ed6-4af6-ac42-cfa64f874b41', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.241'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e1f72e5c-e22f-424b-b6ed-0c502ff13aa3, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=dd535142-468c-4e43-b21c-8262c1d7ed46) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:39.068 144381 INFO neutron.agent.ovn.metadata.agent [-] Port dd535142-468c-4e43-b21c-8262c1d7ed46 in datapath 93735878-f62d-4a5f-96df-bf97f85d787a unbound from our chassis#033[00m
Jan 23 05:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:39.070 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 93735878-f62d-4a5f-96df-bf97f85d787a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:39.071 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ad834688-5d32-4871-a38d-1954e81e9c8d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.071 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:39.072 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a namespace which is not needed anymore#033[00m
Jan 23 05:08:39 np0005593234 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007b.scope: Deactivated successfully.
Jan 23 05:08:39 np0005593234 systemd[1]: machine-qemu\x2d57\x2dinstance\x2d0000007b.scope: Consumed 13.587s CPU time.
Jan 23 05:08:39 np0005593234 systemd-machined[195626]: Machine qemu-57-instance-0000007b terminated.
Jan 23 05:08:39 np0005593234 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[284412]: [NOTICE]   (284416) : haproxy version is 2.8.14-c23fe91
Jan 23 05:08:39 np0005593234 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[284412]: [NOTICE]   (284416) : path to executable is /usr/sbin/haproxy
Jan 23 05:08:39 np0005593234 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[284412]: [WARNING]  (284416) : Exiting Master process...
Jan 23 05:08:39 np0005593234 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[284412]: [ALERT]    (284416) : Current worker (284418) exited with code 143 (Terminated)
Jan 23 05:08:39 np0005593234 neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a[284412]: [WARNING]  (284416) : All workers exited. Exiting... (0)
Jan 23 05:08:39 np0005593234 systemd[1]: libpod-b31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df.scope: Deactivated successfully.
Jan 23 05:08:39 np0005593234 conmon[284412]: conmon b31b1bf0c4bcda1722f9 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df.scope/container/memory.events
Jan 23 05:08:39 np0005593234 podman[284581]: 2026-01-23 10:08:39.202835016 +0000 UTC m=+0.044836593 container died b31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.205 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.208 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.225 227766 INFO nova.virt.libvirt.driver [-] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Instance destroyed successfully.#033[00m
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.230 227766 DEBUG nova.objects.instance [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lazy-loading 'resources' on Instance uuid 44fc19d2-6a98-47dd-803d-d3d6a2cc2486 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:08:39 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df-userdata-shm.mount: Deactivated successfully.
Jan 23 05:08:39 np0005593234 systemd[1]: var-lib-containers-storage-overlay-8653691ea21f0ed64ed5993f2d754d90684a3570a4b7bfe2c7ba438c4e830806-merged.mount: Deactivated successfully.
Jan 23 05:08:39 np0005593234 podman[284581]: 2026-01-23 10:08:39.248242525 +0000 UTC m=+0.090244102 container cleanup b31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.252 227766 DEBUG nova.virt.libvirt.vif [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:08:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeNegativeTest-server-1729959968',display_name='tempest-AttachVolumeNegativeTest-server-1729959968',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumenegativetest-server-1729959968',id=123,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHXa4OeH8DK3YeFO66XIr0uYOp9qQ0Pe9xS8kQR1WC8bD6iRszrhkHpZRj7CHrlEJe+PAgfdJlK8iQw4IKHGn1w4RvSiiqiFwp8IyluzUc7xoPCgtFKyqvu1kEnS0jpeTw==',key_name='tempest-keypair-1592454648',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:08:13Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='924f976bcbb74ec195730b68eebe1f2a',ramdisk_id='',reservation_id='r-2j3t8nfd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeNegativeTest-1470050886',owner_user_name='tempest-AttachVolumeNegativeTest-1470050886-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:08:14Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='c99d09acd2e849a69846a6ccda1e0bc7',uuid=44fc19d2-6a98-47dd-803d-d3d6a2cc2486,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dd535142-468c-4e43-b21c-8262c1d7ed46", "address": "fa:16:3e:46:41:dd", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd535142-46", "ovs_interfaceid": "dd535142-468c-4e43-b21c-8262c1d7ed46", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.253 227766 DEBUG nova.network.os_vif_util [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Converting VIF {"id": "dd535142-468c-4e43-b21c-8262c1d7ed46", "address": "fa:16:3e:46:41:dd", "network": {"id": "93735878-f62d-4a5f-96df-bf97f85d787a", "bridge": "br-int", "label": "tempest-AttachVolumeNegativeTest-259751152-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "924f976bcbb74ec195730b68eebe1f2a", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdd535142-46", "ovs_interfaceid": "dd535142-468c-4e43-b21c-8262c1d7ed46", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.254 227766 DEBUG nova.network.os_vif_util [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:46:41:dd,bridge_name='br-int',has_traffic_filtering=True,id=dd535142-468c-4e43-b21c-8262c1d7ed46,network=Network(93735878-f62d-4a5f-96df-bf97f85d787a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd535142-46') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.254 227766 DEBUG os_vif [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:41:dd,bridge_name='br-int',has_traffic_filtering=True,id=dd535142-468c-4e43-b21c-8262c1d7ed46,network=Network(93735878-f62d-4a5f-96df-bf97f85d787a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd535142-46') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.257 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:39 np0005593234 systemd[1]: libpod-conmon-b31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df.scope: Deactivated successfully.
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.257 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdd535142-46, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.259 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.260 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.262 227766 INFO os_vif [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:46:41:dd,bridge_name='br-int',has_traffic_filtering=True,id=dd535142-468c-4e43-b21c-8262c1d7ed46,network=Network(93735878-f62d-4a5f-96df-bf97f85d787a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdd535142-46')#033[00m
Jan 23 05:08:39 np0005593234 podman[284616]: 2026-01-23 10:08:39.314366644 +0000 UTC m=+0.047043182 container remove b31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:39.321 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[176da5b3-6a40-4293-aaa9-7d5552db5381]: (4, ('Fri Jan 23 10:08:39 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a (b31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df)\nb31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df\nFri Jan 23 10:08:39 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a (b31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df)\nb31b1bf0c4bcda1722f948d5f913b93c43f3093982f9e8a42ed964376c7d93df\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:39.323 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[72d8325e-a1b5-4a5a-9835-365591d87450]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:39.324 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93735878-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:39 np0005593234 kernel: tap93735878-f0: left promiscuous mode
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.326 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.342 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:39 np0005593234 nova_compute[227762]: 2026-01-23 10:08:39.344 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:39.345 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dbc06e48-b5d9-4148-9d2e-e68d6f989112]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:39.364 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bfbb8b21-cce1-477a-8656-ff5da782f712]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:39.365 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[095b3b7f-73a3-424c-bb2a-d331172a5f98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:39.382 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b2cc0b97-9a25-4381-b7b1-41f0a274e9ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 678801, 'reachable_time': 23560, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 284650, 'error': None, 'target': 'ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:39.386 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-93735878-f62d-4a5f-96df-bf97f85d787a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:08:39 np0005593234 systemd[1]: run-netns-ovnmeta\x2d93735878\x2df62d\x2d4a5f\x2d96df\x2dbf97f85d787a.mount: Deactivated successfully.
Jan 23 05:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:39.386 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[cbcc59ff-5193-43a2-b301-ebc893ed6271]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:08:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:39.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:40 np0005593234 nova_compute[227762]: 2026-01-23 10:08:40.068 227766 INFO nova.virt.libvirt.driver [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Deleting instance files /var/lib/nova/instances/44fc19d2-6a98-47dd-803d-d3d6a2cc2486_del#033[00m
Jan 23 05:08:40 np0005593234 nova_compute[227762]: 2026-01-23 10:08:40.069 227766 INFO nova.virt.libvirt.driver [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Deletion of /var/lib/nova/instances/44fc19d2-6a98-47dd-803d-d3d6a2cc2486_del complete#033[00m
Jan 23 05:08:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e269 e269: 3 total, 3 up, 3 in
Jan 23 05:08:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:40.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:41 np0005593234 nova_compute[227762]: 2026-01-23 10:08:41.043 227766 DEBUG nova.compute.manager [req-19d6ee95-ce69-452a-b929-bac4bbdcdfb9 req-af4f30d3-3217-438b-9494-cda350f6ac16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Received event network-vif-unplugged-dd535142-468c-4e43-b21c-8262c1d7ed46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:08:41 np0005593234 nova_compute[227762]: 2026-01-23 10:08:41.043 227766 DEBUG oslo_concurrency.lockutils [req-19d6ee95-ce69-452a-b929-bac4bbdcdfb9 req-af4f30d3-3217-438b-9494-cda350f6ac16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:41 np0005593234 nova_compute[227762]: 2026-01-23 10:08:41.043 227766 DEBUG oslo_concurrency.lockutils [req-19d6ee95-ce69-452a-b929-bac4bbdcdfb9 req-af4f30d3-3217-438b-9494-cda350f6ac16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:41 np0005593234 nova_compute[227762]: 2026-01-23 10:08:41.043 227766 DEBUG oslo_concurrency.lockutils [req-19d6ee95-ce69-452a-b929-bac4bbdcdfb9 req-af4f30d3-3217-438b-9494-cda350f6ac16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:41 np0005593234 nova_compute[227762]: 2026-01-23 10:08:41.044 227766 DEBUG nova.compute.manager [req-19d6ee95-ce69-452a-b929-bac4bbdcdfb9 req-af4f30d3-3217-438b-9494-cda350f6ac16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] No waiting events found dispatching network-vif-unplugged-dd535142-468c-4e43-b21c-8262c1d7ed46 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:08:41 np0005593234 nova_compute[227762]: 2026-01-23 10:08:41.044 227766 DEBUG nova.compute.manager [req-19d6ee95-ce69-452a-b929-bac4bbdcdfb9 req-af4f30d3-3217-438b-9494-cda350f6ac16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Received event network-vif-unplugged-dd535142-468c-4e43-b21c-8262c1d7ed46 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:08:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:41.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:41 np0005593234 nova_compute[227762]: 2026-01-23 10:08:41.552 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e270 e270: 3 total, 3 up, 3 in
Jan 23 05:08:41 np0005593234 nova_compute[227762]: 2026-01-23 10:08:41.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:41 np0005593234 nova_compute[227762]: 2026-01-23 10:08:41.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:08:42 np0005593234 nova_compute[227762]: 2026-01-23 10:08:42.395 227766 INFO nova.compute.manager [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Took 3.42 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:08:42 np0005593234 nova_compute[227762]: 2026-01-23 10:08:42.395 227766 DEBUG oslo.service.loopingcall [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:08:42 np0005593234 nova_compute[227762]: 2026-01-23 10:08:42.396 227766 DEBUG nova.compute.manager [-] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:08:42 np0005593234 nova_compute[227762]: 2026-01-23 10:08:42.396 227766 DEBUG nova.network.neutron [-] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:08:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e271 e271: 3 total, 3 up, 3 in
Jan 23 05:08:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:42.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:42 np0005593234 nova_compute[227762]: 2026-01-23 10:08:42.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:42 np0005593234 nova_compute[227762]: 2026-01-23 10:08:42.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:08:42 np0005593234 nova_compute[227762]: 2026-01-23 10:08:42.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:08:42 np0005593234 nova_compute[227762]: 2026-01-23 10:08:42.768 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Jan 23 05:08:42 np0005593234 nova_compute[227762]: 2026-01-23 10:08:42.769 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:08:42 np0005593234 nova_compute[227762]: 2026-01-23 10:08:42.769 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:42.845 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:42.846 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:42.846 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:43 np0005593234 nova_compute[227762]: 2026-01-23 10:08:43.193 227766 DEBUG nova.compute.manager [req-5e5364dc-86ff-43e2-83bd-b7aef616a3b4 req-c39febdd-61ff-4a4e-a78a-cd8294f3176a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Received event network-vif-plugged-dd535142-468c-4e43-b21c-8262c1d7ed46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:08:43 np0005593234 nova_compute[227762]: 2026-01-23 10:08:43.194 227766 DEBUG oslo_concurrency.lockutils [req-5e5364dc-86ff-43e2-83bd-b7aef616a3b4 req-c39febdd-61ff-4a4e-a78a-cd8294f3176a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:43 np0005593234 nova_compute[227762]: 2026-01-23 10:08:43.194 227766 DEBUG oslo_concurrency.lockutils [req-5e5364dc-86ff-43e2-83bd-b7aef616a3b4 req-c39febdd-61ff-4a4e-a78a-cd8294f3176a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:43 np0005593234 nova_compute[227762]: 2026-01-23 10:08:43.194 227766 DEBUG oslo_concurrency.lockutils [req-5e5364dc-86ff-43e2-83bd-b7aef616a3b4 req-c39febdd-61ff-4a4e-a78a-cd8294f3176a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:43 np0005593234 nova_compute[227762]: 2026-01-23 10:08:43.194 227766 DEBUG nova.compute.manager [req-5e5364dc-86ff-43e2-83bd-b7aef616a3b4 req-c39febdd-61ff-4a4e-a78a-cd8294f3176a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] No waiting events found dispatching network-vif-plugged-dd535142-468c-4e43-b21c-8262c1d7ed46 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:08:43 np0005593234 nova_compute[227762]: 2026-01-23 10:08:43.195 227766 WARNING nova.compute.manager [req-5e5364dc-86ff-43e2-83bd-b7aef616a3b4 req-c39febdd-61ff-4a4e-a78a-cd8294f3176a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Received unexpected event network-vif-plugged-dd535142-468c-4e43-b21c-8262c1d7ed46 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:08:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:43.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:43 np0005593234 podman[284654]: 2026-01-23 10:08:43.784361782 +0000 UTC m=+0.082823340 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 23 05:08:44 np0005593234 nova_compute[227762]: 2026-01-23 10:08:44.259 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:44 np0005593234 nova_compute[227762]: 2026-01-23 10:08:44.391 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:44.391 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=45, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=44) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:08:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:44.392 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:08:44 np0005593234 nova_compute[227762]: 2026-01-23 10:08:44.429 227766 DEBUG nova.network.neutron [-] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:08:44 np0005593234 nova_compute[227762]: 2026-01-23 10:08:44.487 227766 INFO nova.compute.manager [-] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Took 2.09 seconds to deallocate network for instance.#033[00m
Jan 23 05:08:44 np0005593234 nova_compute[227762]: 2026-01-23 10:08:44.623 227766 DEBUG oslo_concurrency.lockutils [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:08:44 np0005593234 nova_compute[227762]: 2026-01-23 10:08:44.625 227766 DEBUG oslo_concurrency.lockutils [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:08:44 np0005593234 nova_compute[227762]: 2026-01-23 10:08:44.714 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:44.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:44 np0005593234 nova_compute[227762]: 2026-01-23 10:08:44.733 227766 DEBUG nova.compute.manager [req-84742a57-1107-4512-baae-44fb33317611 req-5feffcba-dd58-46b9-afb3-3f7c492c395f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Received event network-vif-deleted-dd535142-468c-4e43-b21c-8262c1d7ed46 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:08:44 np0005593234 nova_compute[227762]: 2026-01-23 10:08:44.863 227766 DEBUG oslo_concurrency.processutils [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:08:45 np0005593234 nova_compute[227762]: 2026-01-23 10:08:45.075 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:08:45.394 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '45'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:08:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:45.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:08:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4122320813' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:08:45 np0005593234 nova_compute[227762]: 2026-01-23 10:08:45.676 227766 DEBUG oslo_concurrency.processutils [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.813s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:08:45 np0005593234 nova_compute[227762]: 2026-01-23 10:08:45.683 227766 DEBUG nova.compute.provider_tree [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:08:45 np0005593234 nova_compute[227762]: 2026-01-23 10:08:45.703 227766 DEBUG nova.scheduler.client.report [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:08:45 np0005593234 nova_compute[227762]: 2026-01-23 10:08:45.733 227766 DEBUG oslo_concurrency.lockutils [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:45 np0005593234 nova_compute[227762]: 2026-01-23 10:08:45.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:45 np0005593234 nova_compute[227762]: 2026-01-23 10:08:45.785 227766 INFO nova.scheduler.client.report [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Deleted allocations for instance 44fc19d2-6a98-47dd-803d-d3d6a2cc2486#033[00m
Jan 23 05:08:45 np0005593234 nova_compute[227762]: 2026-01-23 10:08:45.898 227766 DEBUG oslo_concurrency.lockutils [None req-79c14e8a-801a-452a-837d-7eca71a3564c c99d09acd2e849a69846a6ccda1e0bc7 924f976bcbb74ec195730b68eebe1f2a - - default default] Lock "44fc19d2-6a98-47dd-803d-d3d6a2cc2486" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.922s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:08:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e271 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:46 np0005593234 nova_compute[227762]: 2026-01-23 10:08:46.553 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:46.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:47.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:48.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:48 np0005593234 nova_compute[227762]: 2026-01-23 10:08:48.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:49 np0005593234 nova_compute[227762]: 2026-01-23 10:08:49.264 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:49.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e272 e272: 3 total, 3 up, 3 in
Jan 23 05:08:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:50.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:51 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [P] New memtable created with log file: #52. Immutable memtables: 0.
Jan 23 05:08:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:51.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:51 np0005593234 nova_compute[227762]: 2026-01-23 10:08:51.555 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e273 e273: 3 total, 3 up, 3 in
Jan 23 05:08:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e274 e274: 3 total, 3 up, 3 in
Jan 23 05:08:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:52.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:53.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:54 np0005593234 nova_compute[227762]: 2026-01-23 10:08:54.218 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162919.2174997, 44fc19d2-6a98-47dd-803d-d3d6a2cc2486 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:08:54 np0005593234 nova_compute[227762]: 2026-01-23 10:08:54.219 227766 INFO nova.compute.manager [-] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:08:54 np0005593234 nova_compute[227762]: 2026-01-23 10:08:54.245 227766 DEBUG nova.compute.manager [None req-e6c7ff86-9cf6-42c6-a379-45954f63c990 - - - - - -] [instance: 44fc19d2-6a98-47dd-803d-d3d6a2cc2486] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:08:54 np0005593234 nova_compute[227762]: 2026-01-23 10:08:54.267 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:08:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:54.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:08:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:55.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:08:56 np0005593234 nova_compute[227762]: 2026-01-23 10:08:56.556 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:56.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:56 np0005593234 nova_compute[227762]: 2026-01-23 10:08:56.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:56 np0005593234 nova_compute[227762]: 2026-01-23 10:08:56.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:08:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:57.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:58.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:08:58 np0005593234 nova_compute[227762]: 2026-01-23 10:08:58.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:08:59 np0005593234 nova_compute[227762]: 2026-01-23 10:08:59.272 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:08:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:08:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:08:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:59.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:00.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e275 e275: 3 total, 3 up, 3 in
Jan 23 05:09:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:01.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:01 np0005593234 nova_compute[227762]: 2026-01-23 10:09:01.557 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e276 e276: 3 total, 3 up, 3 in
Jan 23 05:09:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:02.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e277 e277: 3 total, 3 up, 3 in
Jan 23 05:09:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.007000222s ======
Jan 23 05:09:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:03.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.007000222s
Jan 23 05:09:03 np0005593234 nova_compute[227762]: 2026-01-23 10:09:03.802 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:03 np0005593234 nova_compute[227762]: 2026-01-23 10:09:03.803 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:09:04 np0005593234 nova_compute[227762]: 2026-01-23 10:09:04.154 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:09:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e278 e278: 3 total, 3 up, 3 in
Jan 23 05:09:04 np0005593234 nova_compute[227762]: 2026-01-23 10:09:04.275 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:04.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:04 np0005593234 podman[284765]: 2026-01-23 10:09:04.753599329 +0000 UTC m=+0.049812818 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 05:09:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:05.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:06 np0005593234 nova_compute[227762]: 2026-01-23 10:09:06.558 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:06.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:07.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:08.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:09 np0005593234 nova_compute[227762]: 2026-01-23 10:09:09.279 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:09.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e279 e279: 3 total, 3 up, 3 in
Jan 23 05:09:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:10.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:11.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:11 np0005593234 nova_compute[227762]: 2026-01-23 10:09:11.559 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e280 e280: 3 total, 3 up, 3 in
Jan 23 05:09:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:12.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:12 np0005593234 nova_compute[227762]: 2026-01-23 10:09:12.832 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Acquiring lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:12 np0005593234 nova_compute[227762]: 2026-01-23 10:09:12.832 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:12 np0005593234 nova_compute[227762]: 2026-01-23 10:09:12.854 227766 DEBUG nova.compute.manager [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:09:12 np0005593234 nova_compute[227762]: 2026-01-23 10:09:12.954 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:12 np0005593234 nova_compute[227762]: 2026-01-23 10:09:12.955 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:12 np0005593234 nova_compute[227762]: 2026-01-23 10:09:12.966 227766 DEBUG nova.virt.hardware [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:09:12 np0005593234 nova_compute[227762]: 2026-01-23 10:09:12.966 227766 INFO nova.compute.claims [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:09:13 np0005593234 nova_compute[227762]: 2026-01-23 10:09:13.157 227766 DEBUG oslo_concurrency.processutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:13.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:09:13 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1605045781' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:09:13 np0005593234 nova_compute[227762]: 2026-01-23 10:09:13.646 227766 DEBUG oslo_concurrency.processutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:13 np0005593234 nova_compute[227762]: 2026-01-23 10:09:13.654 227766 DEBUG nova.compute.provider_tree [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:09:13 np0005593234 nova_compute[227762]: 2026-01-23 10:09:13.675 227766 DEBUG nova.scheduler.client.report [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:09:13 np0005593234 nova_compute[227762]: 2026-01-23 10:09:13.712 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.757s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:13 np0005593234 nova_compute[227762]: 2026-01-23 10:09:13.713 227766 DEBUG nova.compute.manager [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:09:13 np0005593234 nova_compute[227762]: 2026-01-23 10:09:13.917 227766 DEBUG nova.compute.manager [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:09:13 np0005593234 nova_compute[227762]: 2026-01-23 10:09:13.918 227766 DEBUG nova.network.neutron [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:09:13 np0005593234 podman[284911]: 2026-01-23 10:09:13.936042398 +0000 UTC m=+0.084489513 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, 
config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 23 05:09:13 np0005593234 nova_compute[227762]: 2026-01-23 10:09:13.950 227766 INFO nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:09:13 np0005593234 nova_compute[227762]: 2026-01-23 10:09:13.971 227766 DEBUG nova.compute.manager [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.103 227766 DEBUG nova.compute.manager [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.106 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.108 227766 INFO nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Creating image(s)#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.137 227766 DEBUG nova.storage.rbd_utils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] rbd image d247ce17-f43c-4f04-9eda-6dcd931a2f43_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.169 227766 DEBUG nova.storage.rbd_utils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] rbd image d247ce17-f43c-4f04-9eda-6dcd931a2f43_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.198 227766 DEBUG nova.storage.rbd_utils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] rbd image d247ce17-f43c-4f04-9eda-6dcd931a2f43_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.203 227766 DEBUG oslo_concurrency.processutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.279 227766 DEBUG oslo_concurrency.processutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.281 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.282 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.283 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.310 227766 DEBUG nova.storage.rbd_utils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] rbd image d247ce17-f43c-4f04-9eda-6dcd931a2f43_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.315 227766 DEBUG oslo_concurrency.processutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 d247ce17-f43c-4f04-9eda-6dcd931a2f43_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.338 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.342 227766 DEBUG nova.policy [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '11cb5a7c448c4cb5b509c29925463448', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '5b1dffee0b924f70840ddadf2a893b31', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.592 227766 DEBUG oslo_concurrency.processutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 d247ce17-f43c-4f04-9eda-6dcd931a2f43_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.277s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.695 227766 DEBUG nova.storage.rbd_utils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] resizing rbd image d247ce17-f43c-4f04-9eda-6dcd931a2f43_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:09:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:14.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e281 e281: 3 total, 3 up, 3 in
Jan 23 05:09:14 np0005593234 nova_compute[227762]: 2026-01-23 10:09:14.829 227766 DEBUG nova.objects.instance [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lazy-loading 'migration_context' on Instance uuid d247ce17-f43c-4f04-9eda-6dcd931a2f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:09:15 np0005593234 nova_compute[227762]: 2026-01-23 10:09:15.110 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:09:15 np0005593234 nova_compute[227762]: 2026-01-23 10:09:15.111 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Ensure instance console log exists: /var/lib/nova/instances/d247ce17-f43c-4f04-9eda-6dcd931a2f43/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:09:15 np0005593234 nova_compute[227762]: 2026-01-23 10:09:15.111 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:15 np0005593234 nova_compute[227762]: 2026-01-23 10:09:15.111 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:15 np0005593234 nova_compute[227762]: 2026-01-23 10:09:15.112 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:15.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:15 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:09:15 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:09:16 np0005593234 nova_compute[227762]: 2026-01-23 10:09:16.006 227766 DEBUG nova.network.neutron [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Successfully created port: fe0d3280-f78b-41fe-b883-bd8af39bb281 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:09:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:16 np0005593234 nova_compute[227762]: 2026-01-23 10:09:16.562 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:16.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e282 e282: 3 total, 3 up, 3 in
Jan 23 05:09:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:09:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:09:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:09:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:17.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:17 np0005593234 nova_compute[227762]: 2026-01-23 10:09:17.937 227766 DEBUG nova.network.neutron [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Successfully updated port: fe0d3280-f78b-41fe-b883-bd8af39bb281 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:09:17 np0005593234 nova_compute[227762]: 2026-01-23 10:09:17.977 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Acquiring lock "refresh_cache-d247ce17-f43c-4f04-9eda-6dcd931a2f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:09:17 np0005593234 nova_compute[227762]: 2026-01-23 10:09:17.977 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Acquired lock "refresh_cache-d247ce17-f43c-4f04-9eda-6dcd931a2f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:09:17 np0005593234 nova_compute[227762]: 2026-01-23 10:09:17.977 227766 DEBUG nova.network.neutron [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:09:18 np0005593234 nova_compute[227762]: 2026-01-23 10:09:18.492 227766 DEBUG nova.network.neutron [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:09:18 np0005593234 nova_compute[227762]: 2026-01-23 10:09:18.713 227766 DEBUG nova.compute.manager [req-acc619bc-3459-4519-9e61-3e4b517803bd req-952a5540-7afa-49ab-ba6b-400400b949ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Received event network-changed-fe0d3280-f78b-41fe-b883-bd8af39bb281 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:09:18 np0005593234 nova_compute[227762]: 2026-01-23 10:09:18.713 227766 DEBUG nova.compute.manager [req-acc619bc-3459-4519-9e61-3e4b517803bd req-952a5540-7afa-49ab-ba6b-400400b949ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Refreshing instance network info cache due to event network-changed-fe0d3280-f78b-41fe-b883-bd8af39bb281. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:09:18 np0005593234 nova_compute[227762]: 2026-01-23 10:09:18.713 227766 DEBUG oslo_concurrency.lockutils [req-acc619bc-3459-4519-9e61-3e4b517803bd req-952a5540-7afa-49ab-ba6b-400400b949ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-d247ce17-f43c-4f04-9eda-6dcd931a2f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:09:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:09:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:18.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:09:19 np0005593234 nova_compute[227762]: 2026-01-23 10:09:19.340 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:19.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:20.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:20 np0005593234 nova_compute[227762]: 2026-01-23 10:09:20.832 227766 DEBUG nova.network.neutron [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Updating instance_info_cache with network_info: [{"id": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "address": "fa:16:3e:5b:40:32", "network": {"id": "1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2076307979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b1dffee0b924f70840ddadf2a893b31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe0d3280-f7", "ovs_interfaceid": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:09:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e282 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:21.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.563 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e283 e283: 3 total, 3 up, 3 in
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.928 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Releasing lock "refresh_cache-d247ce17-f43c-4f04-9eda-6dcd931a2f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.928 227766 DEBUG nova.compute.manager [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Instance network_info: |[{"id": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "address": "fa:16:3e:5b:40:32", "network": {"id": "1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2076307979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b1dffee0b924f70840ddadf2a893b31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe0d3280-f7", "ovs_interfaceid": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.928 227766 DEBUG oslo_concurrency.lockutils [req-acc619bc-3459-4519-9e61-3e4b517803bd req-952a5540-7afa-49ab-ba6b-400400b949ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-d247ce17-f43c-4f04-9eda-6dcd931a2f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.929 227766 DEBUG nova.network.neutron [req-acc619bc-3459-4519-9e61-3e4b517803bd req-952a5540-7afa-49ab-ba6b-400400b949ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Refreshing network info cache for port fe0d3280-f78b-41fe-b883-bd8af39bb281 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.931 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Start _get_guest_xml network_info=[{"id": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "address": "fa:16:3e:5b:40:32", "network": {"id": "1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2076307979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b1dffee0b924f70840ddadf2a893b31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe0d3280-f7", "ovs_interfaceid": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.937 227766 WARNING nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.947 227766 DEBUG nova.virt.libvirt.host [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.948 227766 DEBUG nova.virt.libvirt.host [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.957 227766 DEBUG nova.virt.libvirt.host [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.957 227766 DEBUG nova.virt.libvirt.host [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.959 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.959 227766 DEBUG nova.virt.hardware [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.959 227766 DEBUG nova.virt.hardware [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.959 227766 DEBUG nova.virt.hardware [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.960 227766 DEBUG nova.virt.hardware [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.960 227766 DEBUG nova.virt.hardware [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.960 227766 DEBUG nova.virt.hardware [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.960 227766 DEBUG nova.virt.hardware [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.961 227766 DEBUG nova.virt.hardware [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.961 227766 DEBUG nova.virt.hardware [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.961 227766 DEBUG nova.virt.hardware [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.961 227766 DEBUG nova.virt.hardware [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:09:21 np0005593234 nova_compute[227762]: 2026-01-23 10:09:21.965 227766 DEBUG oslo_concurrency.processutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:09:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1476677644' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.413 227766 DEBUG oslo_concurrency.processutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.444 227766 DEBUG nova.storage.rbd_utils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] rbd image d247ce17-f43c-4f04-9eda-6dcd931a2f43_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.449 227766 DEBUG oslo_concurrency.processutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:22.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:09:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4266656527' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:09:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:09:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.893 227766 DEBUG oslo_concurrency.processutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.895 227766 DEBUG nova.virt.libvirt.vif [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:09:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-302700014',display_name='tempest-ServerMetadataNegativeTestJSON-server-302700014',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-302700014',id=125,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5b1dffee0b924f70840ddadf2a893b31',ramdisk_id='',reservation_id='r-r9vua8he',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-164591532',owner_user_name='tempest-ServerMetadataNegativeTestJSON-164591532-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:09:14Z,user_data=None,user_id='11cb5a7c448c4cb5b509c29925463448',uuid=d247ce17-f43c-4f04-9eda-6dcd931a2f43,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "address": "fa:16:3e:5b:40:32", "network": {"id": "1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2076307979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b1dffee0b924f70840ddadf2a893b31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe0d3280-f7", "ovs_interfaceid": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.895 227766 DEBUG nova.network.os_vif_util [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Converting VIF {"id": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "address": "fa:16:3e:5b:40:32", "network": {"id": "1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2076307979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b1dffee0b924f70840ddadf2a893b31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe0d3280-f7", "ovs_interfaceid": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.896 227766 DEBUG nova.network.os_vif_util [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:40:32,bridge_name='br-int',has_traffic_filtering=True,id=fe0d3280-f78b-41fe-b883-bd8af39bb281,network=Network(1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe0d3280-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.897 227766 DEBUG nova.objects.instance [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lazy-loading 'pci_devices' on Instance uuid d247ce17-f43c-4f04-9eda-6dcd931a2f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.918 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  <uuid>d247ce17-f43c-4f04-9eda-6dcd931a2f43</uuid>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  <name>instance-0000007d</name>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerMetadataNegativeTestJSON-server-302700014</nova:name>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:09:21</nova:creationTime>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <nova:user uuid="11cb5a7c448c4cb5b509c29925463448">tempest-ServerMetadataNegativeTestJSON-164591532-project-member</nova:user>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <nova:project uuid="5b1dffee0b924f70840ddadf2a893b31">tempest-ServerMetadataNegativeTestJSON-164591532</nova:project>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <nova:port uuid="fe0d3280-f78b-41fe-b883-bd8af39bb281">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <entry name="serial">d247ce17-f43c-4f04-9eda-6dcd931a2f43</entry>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <entry name="uuid">d247ce17-f43c-4f04-9eda-6dcd931a2f43</entry>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/d247ce17-f43c-4f04-9eda-6dcd931a2f43_disk">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/d247ce17-f43c-4f04-9eda-6dcd931a2f43_disk.config">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:5b:40:32"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <target dev="tapfe0d3280-f7"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/d247ce17-f43c-4f04-9eda-6dcd931a2f43/console.log" append="off"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:09:22 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:09:22 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:09:22 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:09:22 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.919 227766 DEBUG nova.compute.manager [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Preparing to wait for external event network-vif-plugged-fe0d3280-f78b-41fe-b883-bd8af39bb281 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.919 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Acquiring lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.919 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.920 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.920 227766 DEBUG nova.virt.libvirt.vif [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:09:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-302700014',display_name='tempest-ServerMetadataNegativeTestJSON-server-302700014',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-302700014',id=125,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='5b1dffee0b924f70840ddadf2a893b31',ramdisk_id='',reservation_id='r-r9vua8he',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerMetadataNegativeTestJSON-164591532',owner_user_name='tempest-ServerMetadataNegativeTestJSON-164591532-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:09:14Z,user_data=None,user_id='11cb5a7c448c4cb5b509c29925463448',uuid=d247ce17-f43c-4f04-9eda-6dcd931a2f43,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "address": "fa:16:3e:5b:40:32", "network": {"id": "1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2076307979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b1dffee0b924f70840ddadf2a893b31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe0d3280-f7", "ovs_interfaceid": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.920 227766 DEBUG nova.network.os_vif_util [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Converting VIF {"id": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "address": "fa:16:3e:5b:40:32", "network": {"id": "1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2076307979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b1dffee0b924f70840ddadf2a893b31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe0d3280-f7", "ovs_interfaceid": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.921 227766 DEBUG nova.network.os_vif_util [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:40:32,bridge_name='br-int',has_traffic_filtering=True,id=fe0d3280-f78b-41fe-b883-bd8af39bb281,network=Network(1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe0d3280-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.921 227766 DEBUG os_vif [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:40:32,bridge_name='br-int',has_traffic_filtering=True,id=fe0d3280-f78b-41fe-b883-bd8af39bb281,network=Network(1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe0d3280-f7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.922 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.922 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.923 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.926 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.927 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe0d3280-f7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.927 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfe0d3280-f7, col_values=(('external_ids', {'iface-id': 'fe0d3280-f78b-41fe-b883-bd8af39bb281', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5b:40:32', 'vm-uuid': 'd247ce17-f43c-4f04-9eda-6dcd931a2f43'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.929 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:22 np0005593234 NetworkManager[48942]: <info>  [1769162962.9296] manager: (tapfe0d3280-f7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/255)
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.931 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.935 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:22 np0005593234 nova_compute[227762]: 2026-01-23 10:09:22.936 227766 INFO os_vif [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:40:32,bridge_name='br-int',has_traffic_filtering=True,id=fe0d3280-f78b-41fe-b883-bd8af39bb281,network=Network(1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe0d3280-f7')#033[00m
Jan 23 05:09:23 np0005593234 nova_compute[227762]: 2026-01-23 10:09:23.180 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:09:23 np0005593234 nova_compute[227762]: 2026-01-23 10:09:23.181 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:09:23 np0005593234 nova_compute[227762]: 2026-01-23 10:09:23.181 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] No VIF found with MAC fa:16:3e:5b:40:32, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:09:23 np0005593234 nova_compute[227762]: 2026-01-23 10:09:23.182 227766 INFO nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Using config drive#033[00m
Jan 23 05:09:23 np0005593234 nova_compute[227762]: 2026-01-23 10:09:23.212 227766 DEBUG nova.storage.rbd_utils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] rbd image d247ce17-f43c-4f04-9eda-6dcd931a2f43_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:09:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:23.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:24.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:25.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:26 np0005593234 nova_compute[227762]: 2026-01-23 10:09:26.565 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:26.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e284 e284: 3 total, 3 up, 3 in
Jan 23 05:09:26 np0005593234 nova_compute[227762]: 2026-01-23 10:09:26.941 227766 INFO nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Creating config drive at /var/lib/nova/instances/d247ce17-f43c-4f04-9eda-6dcd931a2f43/disk.config#033[00m
Jan 23 05:09:26 np0005593234 nova_compute[227762]: 2026-01-23 10:09:26.946 227766 DEBUG oslo_concurrency.processutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d247ce17-f43c-4f04-9eda-6dcd931a2f43/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf8ox8obn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:27 np0005593234 nova_compute[227762]: 2026-01-23 10:09:27.075 227766 DEBUG oslo_concurrency.processutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d247ce17-f43c-4f04-9eda-6dcd931a2f43/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf8ox8obn" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:27 np0005593234 nova_compute[227762]: 2026-01-23 10:09:27.415 227766 DEBUG nova.storage.rbd_utils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] rbd image d247ce17-f43c-4f04-9eda-6dcd931a2f43_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:09:27 np0005593234 nova_compute[227762]: 2026-01-23 10:09:27.419 227766 DEBUG oslo_concurrency.processutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d247ce17-f43c-4f04-9eda-6dcd931a2f43/disk.config d247ce17-f43c-4f04-9eda-6dcd931a2f43_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:27.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:27 np0005593234 nova_compute[227762]: 2026-01-23 10:09:27.931 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:28 np0005593234 nova_compute[227762]: 2026-01-23 10:09:28.216 227766 DEBUG oslo_concurrency.processutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d247ce17-f43c-4f04-9eda-6dcd931a2f43/disk.config d247ce17-f43c-4f04-9eda-6dcd931a2f43_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.797s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:28 np0005593234 nova_compute[227762]: 2026-01-23 10:09:28.216 227766 INFO nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Deleting local config drive /var/lib/nova/instances/d247ce17-f43c-4f04-9eda-6dcd931a2f43/disk.config because it was imported into RBD.#033[00m
Jan 23 05:09:28 np0005593234 kernel: tapfe0d3280-f7: entered promiscuous mode
Jan 23 05:09:28 np0005593234 NetworkManager[48942]: <info>  [1769162968.2691] manager: (tapfe0d3280-f7): new Tun device (/org/freedesktop/NetworkManager/Devices/256)
Jan 23 05:09:28 np0005593234 nova_compute[227762]: 2026-01-23 10:09:28.268 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:28Z|00493|binding|INFO|Claiming lport fe0d3280-f78b-41fe-b883-bd8af39bb281 for this chassis.
Jan 23 05:09:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:28Z|00494|binding|INFO|fe0d3280-f78b-41fe-b883-bd8af39bb281: Claiming fa:16:3e:5b:40:32 10.100.0.10
Jan 23 05:09:28 np0005593234 nova_compute[227762]: 2026-01-23 10:09:28.273 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:28 np0005593234 nova_compute[227762]: 2026-01-23 10:09:28.278 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:28 np0005593234 systemd-udevd[285427]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:09:28 np0005593234 systemd-machined[195626]: New machine qemu-58-instance-0000007d.
Jan 23 05:09:28 np0005593234 NetworkManager[48942]: <info>  [1769162968.3112] device (tapfe0d3280-f7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:09:28 np0005593234 NetworkManager[48942]: <info>  [1769162968.3121] device (tapfe0d3280-f7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.317 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:40:32 10.100.0.10'], port_security=['fa:16:3e:5b:40:32 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd247ce17-f43c-4f04-9eda-6dcd931a2f43', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b1dffee0b924f70840ddadf2a893b31', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0c0dc881-49bc-4e0e-87e3-28984dff236d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b3d5d2a-18ab-4e59-bcd9-bf0c0b6cebaf, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=fe0d3280-f78b-41fe-b883-bd8af39bb281) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.319 144381 INFO neutron.agent.ovn.metadata.agent [-] Port fe0d3280-f78b-41fe-b883-bd8af39bb281 in datapath 1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c bound to our chassis#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.320 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c#033[00m
Jan 23 05:09:28 np0005593234 systemd[1]: Started Virtual Machine qemu-58-instance-0000007d.
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.331 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e875a9-f2ab-4ccb-a1c0-e39592150c27]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.332 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1b4b1a37-01 in ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.334 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1b4b1a37-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.334 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a6f6fed7-86c4-410b-814b-bf8c64c802a9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.335 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f079b78d-eec8-4bba-aa94-738bc205e60b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 nova_compute[227762]: 2026-01-23 10:09:28.342 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:28Z|00495|binding|INFO|Setting lport fe0d3280-f78b-41fe-b883-bd8af39bb281 ovn-installed in OVS
Jan 23 05:09:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:28Z|00496|binding|INFO|Setting lport fe0d3280-f78b-41fe-b883-bd8af39bb281 up in Southbound
Jan 23 05:09:28 np0005593234 nova_compute[227762]: 2026-01-23 10:09:28.347 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.347 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[94d36d09-cd48-4be3-9899-ad6959eec538]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.360 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[239297a3-ce73-449a-a2e2-85c275610ff5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.386 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ae974389-f9fb-43a3-9f3c-cb48902e2635]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.390 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[333c1375-57b6-4a45-89de-1e3047990661]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 systemd-udevd[285430]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:09:28 np0005593234 NetworkManager[48942]: <info>  [1769162968.3914] manager: (tap1b4b1a37-00): new Veth device (/org/freedesktop/NetworkManager/Devices/257)
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.420 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e92b00e6-c401-4007-b071-1d79c086edbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.423 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9efd9035-a3da-4c3f-80e6-58b30a014e68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 NetworkManager[48942]: <info>  [1769162968.4479] device (tap1b4b1a37-00): carrier: link connected
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.453 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[afb41479-f291-4d2a-aff2-1113cfd7bd35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.469 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fc88cf68-056f-469b-9552-c7d2263feb31]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1b4b1a37-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:c9:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 686340, 'reachable_time': 42185, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285461, 'error': None, 'target': 'ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.484 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7d9673c8-8e03-4029-aa72-99dcbcdf9dde]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed9:c9d6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 686340, 'tstamp': 686340}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 285462, 'error': None, 'target': 'ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.501 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c65edbc8-a176-4d8c-a21d-5b1d90dfeb4e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1b4b1a37-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d9:c9:d6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 161], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 686340, 'reachable_time': 42185, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 285463, 'error': None, 'target': 'ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.529 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4e88b9d2-c6b5-48a2-bab6-db63ba7860b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.580 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a2ee56b6-9aaf-4274-9101-aabda9930723]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.582 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b4b1a37-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.582 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.583 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1b4b1a37-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:28 np0005593234 nova_compute[227762]: 2026-01-23 10:09:28.584 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:28 np0005593234 NetworkManager[48942]: <info>  [1769162968.5854] manager: (tap1b4b1a37-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/258)
Jan 23 05:09:28 np0005593234 kernel: tap1b4b1a37-00: entered promiscuous mode
Jan 23 05:09:28 np0005593234 nova_compute[227762]: 2026-01-23 10:09:28.587 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.591 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1b4b1a37-00, col_values=(('external_ids', {'iface-id': 'bcfaeeaa-852c-4aac-b528-a17f15062ded'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:28 np0005593234 nova_compute[227762]: 2026-01-23 10:09:28.591 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:28Z|00497|binding|INFO|Releasing lport bcfaeeaa-852c-4aac-b528-a17f15062ded from this chassis (sb_readonly=0)
Jan 23 05:09:28 np0005593234 nova_compute[227762]: 2026-01-23 10:09:28.592 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.594 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.595 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b418a6a9-87a8-45c6-bbd3-8f2637c266a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.596 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c.pid.haproxy
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:09:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:28.596 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c', 'env', 'PROCESS_TAG=haproxy-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:09:28 np0005593234 nova_compute[227762]: 2026-01-23 10:09:28.604 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:28.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:29 np0005593234 podman[285494]: 2026-01-23 10:09:28.916149473 +0000 UTC m=+0.020548744 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:09:29 np0005593234 nova_compute[227762]: 2026-01-23 10:09:29.521 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162969.5210288, d247ce17-f43c-4f04-9eda-6dcd931a2f43 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:09:29 np0005593234 nova_compute[227762]: 2026-01-23 10:09:29.523 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] VM Started (Lifecycle Event)#033[00m
Jan 23 05:09:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:29.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:29 np0005593234 nova_compute[227762]: 2026-01-23 10:09:29.834 227766 DEBUG nova.network.neutron [req-acc619bc-3459-4519-9e61-3e4b517803bd req-952a5540-7afa-49ab-ba6b-400400b949ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Updated VIF entry in instance network info cache for port fe0d3280-f78b-41fe-b883-bd8af39bb281. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:09:29 np0005593234 nova_compute[227762]: 2026-01-23 10:09:29.835 227766 DEBUG nova.network.neutron [req-acc619bc-3459-4519-9e61-3e4b517803bd req-952a5540-7afa-49ab-ba6b-400400b949ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Updating instance_info_cache with network_info: [{"id": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "address": "fa:16:3e:5b:40:32", "network": {"id": "1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2076307979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b1dffee0b924f70840ddadf2a893b31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe0d3280-f7", "ovs_interfaceid": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:09:29 np0005593234 nova_compute[227762]: 2026-01-23 10:09:29.998 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.002 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162969.5218244, d247ce17-f43c-4f04-9eda-6dcd931a2f43 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.002 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.076 227766 DEBUG oslo_concurrency.lockutils [req-acc619bc-3459-4519-9e61-3e4b517803bd req-952a5540-7afa-49ab-ba6b-400400b949ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-d247ce17-f43c-4f04-9eda-6dcd931a2f43" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.077 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.081 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:09:30 np0005593234 podman[285494]: 2026-01-23 10:09:30.312140747 +0000 UTC m=+1.416539988 container create 31c3d9b57db8a241efe8dfaa3bb87ec44dad8412567111cf5732e653ce992c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:09:30 np0005593234 systemd[1]: Started libpod-conmon-31c3d9b57db8a241efe8dfaa3bb87ec44dad8412567111cf5732e653ce992c0b.scope.
Jan 23 05:09:30 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:09:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:30 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c79b91e34f672627b6dbe39320a30116d434a4e4c71b7e400d370a35c5784e39/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:09:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:30.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.839 227766 DEBUG nova.compute.manager [req-c23c2ca0-b186-4eaa-a404-7e06f4780d0e req-5de9c408-64ae-4f33-addf-684f19a0ce2b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Received event network-vif-plugged-fe0d3280-f78b-41fe-b883-bd8af39bb281 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.840 227766 DEBUG oslo_concurrency.lockutils [req-c23c2ca0-b186-4eaa-a404-7e06f4780d0e req-5de9c408-64ae-4f33-addf-684f19a0ce2b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.840 227766 DEBUG oslo_concurrency.lockutils [req-c23c2ca0-b186-4eaa-a404-7e06f4780d0e req-5de9c408-64ae-4f33-addf-684f19a0ce2b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.841 227766 DEBUG oslo_concurrency.lockutils [req-c23c2ca0-b186-4eaa-a404-7e06f4780d0e req-5de9c408-64ae-4f33-addf-684f19a0ce2b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.841 227766 DEBUG nova.compute.manager [req-c23c2ca0-b186-4eaa-a404-7e06f4780d0e req-5de9c408-64ae-4f33-addf-684f19a0ce2b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Processing event network-vif-plugged-fe0d3280-f78b-41fe-b883-bd8af39bb281 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.842 227766 DEBUG nova.compute.manager [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.844 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.846 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.846 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162970.8459928, d247ce17-f43c-4f04-9eda-6dcd931a2f43 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.846 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] VM Resumed (Lifecycle Event)
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.850 227766 INFO nova.virt.libvirt.driver [-] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Instance spawned successfully.
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.851 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.884 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.885 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.885 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.886 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.886 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.887 227766 DEBUG nova.virt.libvirt.driver [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.901 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:09:30 np0005593234 nova_compute[227762]: 2026-01-23 10:09:30.905 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:09:31 np0005593234 podman[285494]: 2026-01-23 10:09:31.041917097 +0000 UTC m=+2.146316368 container init 31c3d9b57db8a241efe8dfaa3bb87ec44dad8412567111cf5732e653ce992c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 05:09:31 np0005593234 podman[285494]: 2026-01-23 10:09:31.049111352 +0000 UTC m=+2.153510603 container start 31c3d9b57db8a241efe8dfaa3bb87ec44dad8412567111cf5732e653ce992c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 05:09:31 np0005593234 neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c[285552]: [NOTICE]   (285556) : New worker (285558) forked
Jan 23 05:09:31 np0005593234 neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c[285552]: [NOTICE]   (285556) : Loading success.
Jan 23 05:09:31 np0005593234 nova_compute[227762]: 2026-01-23 10:09:31.259 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:09:31 np0005593234 nova_compute[227762]: 2026-01-23 10:09:31.270 227766 INFO nova.compute.manager [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Took 17.17 seconds to spawn the instance on the hypervisor.
Jan 23 05:09:31 np0005593234 nova_compute[227762]: 2026-01-23 10:09:31.271 227766 DEBUG nova.compute.manager [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:09:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:31 np0005593234 nova_compute[227762]: 2026-01-23 10:09:31.364 227766 INFO nova.compute.manager [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Took 18.45 seconds to build instance.
Jan 23 05:09:31 np0005593234 nova_compute[227762]: 2026-01-23 10:09:31.534 227766 DEBUG oslo_concurrency.lockutils [None req-fd12f6fe-29bf-4f47-9dcb-cc4b2d38a494 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 18.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:09:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:31.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:31 np0005593234 nova_compute[227762]: 2026-01-23 10:09:31.566 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:09:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:09:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:32.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:09:32 np0005593234 nova_compute[227762]: 2026-01-23 10:09:32.935 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:09:32 np0005593234 nova_compute[227762]: 2026-01-23 10:09:32.978 227766 DEBUG nova.compute.manager [req-45957a1e-7a18-43fa-a101-5943de2dd0ac req-b31f9b73-b98b-491d-87d5-897667c9cae7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Received event network-vif-plugged-fe0d3280-f78b-41fe-b883-bd8af39bb281 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:09:32 np0005593234 nova_compute[227762]: 2026-01-23 10:09:32.979 227766 DEBUG oslo_concurrency.lockutils [req-45957a1e-7a18-43fa-a101-5943de2dd0ac req-b31f9b73-b98b-491d-87d5-897667c9cae7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:09:32 np0005593234 nova_compute[227762]: 2026-01-23 10:09:32.979 227766 DEBUG oslo_concurrency.lockutils [req-45957a1e-7a18-43fa-a101-5943de2dd0ac req-b31f9b73-b98b-491d-87d5-897667c9cae7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:09:32 np0005593234 nova_compute[227762]: 2026-01-23 10:09:32.979 227766 DEBUG oslo_concurrency.lockutils [req-45957a1e-7a18-43fa-a101-5943de2dd0ac req-b31f9b73-b98b-491d-87d5-897667c9cae7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:09:32 np0005593234 nova_compute[227762]: 2026-01-23 10:09:32.980 227766 DEBUG nova.compute.manager [req-45957a1e-7a18-43fa-a101-5943de2dd0ac req-b31f9b73-b98b-491d-87d5-897667c9cae7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] No waiting events found dispatching network-vif-plugged-fe0d3280-f78b-41fe-b883-bd8af39bb281 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:09:32 np0005593234 nova_compute[227762]: 2026-01-23 10:09:32.980 227766 WARNING nova.compute.manager [req-45957a1e-7a18-43fa-a101-5943de2dd0ac req-b31f9b73-b98b-491d-87d5-897667c9cae7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Received unexpected event network-vif-plugged-fe0d3280-f78b-41fe-b883-bd8af39bb281 for instance with vm_state active and task_state None.
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.095 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.096 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.128 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.129 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.129 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.129 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.130 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:09:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:33.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2704907952' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.589 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.671 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000007d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.671 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000007d as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.839 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.840 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4295MB free_disk=20.855682373046875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.841 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.841 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.934 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance d247ce17-f43c-4f04-9eda-6dcd931a2f43 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.934 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.935 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #112. Immutable memtables: 0.
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:33.946805) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 69] Flushing memtable with next log file: 112
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162973946865, "job": 69, "event": "flush_started", "num_memtables": 1, "num_entries": 1810, "num_deletes": 258, "total_data_size": 3977948, "memory_usage": 4036520, "flush_reason": "Manual Compaction"}
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 69] Level-0 flush table #113: started
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162973970134, "cf_name": "default", "job": 69, "event": "table_file_creation", "file_number": 113, "file_size": 2600978, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 55406, "largest_seqno": 57211, "table_properties": {"data_size": 2593315, "index_size": 4541, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16662, "raw_average_key_size": 20, "raw_value_size": 2577779, "raw_average_value_size": 3218, "num_data_blocks": 198, "num_entries": 801, "num_filter_entries": 801, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162842, "oldest_key_time": 1769162842, "file_creation_time": 1769162973, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 113, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 69] Flush lasted 23390 microseconds, and 6012 cpu microseconds.
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:33.970196) [db/flush_job.cc:967] [default] [JOB 69] Level-0 flush table #113: 2600978 bytes OK
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:33.970219) [db/memtable_list.cc:519] [default] Level-0 commit table #113 started
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:33.972142) [db/memtable_list.cc:722] [default] Level-0 commit table #113: memtable #1 done
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:33.972155) EVENT_LOG_v1 {"time_micros": 1769162973972151, "job": 69, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:33.972170) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 69] Try to delete WAL files size 3969568, prev total WAL file size 3969832, number of live WAL files 2.
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000109.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:33.973063) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730034373639' seq:72057594037927935, type:22 .. '7061786F730035303231' seq:0, type:0; will stop at (end)
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 70] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 69 Base level 0, inputs: [113(2540KB)], [111(9531KB)]
Jan 23 05:09:33 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162973973204, "job": 70, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [113], "files_L6": [111], "score": -1, "input_data_size": 12361406, "oldest_snapshot_seqno": -1}
Jan 23 05:09:33 np0005593234 nova_compute[227762]: 2026-01-23 10:09:33.994 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 70] Generated table #114: 7915 keys, 10458076 bytes, temperature: kUnknown
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162974060650, "cf_name": "default", "job": 70, "event": "table_file_creation", "file_number": 114, "file_size": 10458076, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10407432, "index_size": 29732, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 19845, "raw_key_size": 205054, "raw_average_key_size": 25, "raw_value_size": 10268612, "raw_average_value_size": 1297, "num_data_blocks": 1165, "num_entries": 7915, "num_filter_entries": 7915, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769162973, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 114, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:34.060970) [db/compaction/compaction_job.cc:1663] [default] [JOB 70] Compacted 1@0 + 1@6 files to L6 => 10458076 bytes
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:34.062691) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 141.2 rd, 119.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 9.3 +0.0 blob) out(10.0 +0.0 blob), read-write-amplify(8.8) write-amplify(4.0) OK, records in: 8445, records dropped: 530 output_compression: NoCompression
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:34.062711) EVENT_LOG_v1 {"time_micros": 1769162974062702, "job": 70, "event": "compaction_finished", "compaction_time_micros": 87535, "compaction_time_cpu_micros": 27693, "output_level": 6, "num_output_files": 1, "total_output_size": 10458076, "num_input_records": 8445, "num_output_records": 7915, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000113.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162974063279, "job": 70, "event": "table_file_deletion", "file_number": 113}
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000111.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162974065126, "job": 70, "event": "table_file_deletion", "file_number": 111}
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:33.972928) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:34.065169) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:34.065173) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:34.065175) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:34.065176) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:09:34.065177) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:09:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1211132685' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:09:34 np0005593234 nova_compute[227762]: 2026-01-23 10:09:34.499 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:34 np0005593234 nova_compute[227762]: 2026-01-23 10:09:34.505 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:09:34 np0005593234 nova_compute[227762]: 2026-01-23 10:09:34.532 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:09:34 np0005593234 nova_compute[227762]: 2026-01-23 10:09:34.551 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:09:34 np0005593234 nova_compute[227762]: 2026-01-23 10:09:34.552 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:34.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 05:09:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:35.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 05:09:35 np0005593234 nova_compute[227762]: 2026-01-23 10:09:35.577 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:35 np0005593234 nova_compute[227762]: 2026-01-23 10:09:35.578 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:35 np0005593234 nova_compute[227762]: 2026-01-23 10:09:35.600 227766 DEBUG nova.compute.manager [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:09:35 np0005593234 nova_compute[227762]: 2026-01-23 10:09:35.718 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:35 np0005593234 nova_compute[227762]: 2026-01-23 10:09:35.719 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:35 np0005593234 nova_compute[227762]: 2026-01-23 10:09:35.729 227766 DEBUG nova.virt.hardware [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:09:35 np0005593234 nova_compute[227762]: 2026-01-23 10:09:35.729 227766 INFO nova.compute.claims [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:09:35 np0005593234 podman[285614]: 2026-01-23 10:09:35.76563465 +0000 UTC m=+0.063801247 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:09:35 np0005593234 nova_compute[227762]: 2026-01-23 10:09:35.958 227766 DEBUG oslo_concurrency.processutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:09:36 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3538562038' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.446 227766 DEBUG oslo_concurrency.processutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.452 227766 DEBUG nova.compute.provider_tree [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.471 227766 DEBUG nova.scheduler.client.report [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.501 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.782s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.503 227766 DEBUG nova.compute.manager [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.556 227766 DEBUG nova.compute.manager [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.557 227766 DEBUG nova.network.neutron [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.569 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.583 227766 INFO nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.610 227766 DEBUG nova.compute.manager [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:09:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:36.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.817 227766 DEBUG nova.compute.manager [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.818 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.818 227766 INFO nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Creating image(s)#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.841 227766 DEBUG nova.storage.rbd_utils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.868 227766 DEBUG nova.storage.rbd_utils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.894 227766 DEBUG nova.storage.rbd_utils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.904 227766 DEBUG oslo_concurrency.processutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.930 227766 DEBUG nova.policy [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aca3cab576d641d3b89e7dddf155d467', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9dd869ce76e44fc8a82b8bbee1654d33', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.969 227766 DEBUG oslo_concurrency.processutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.970 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.970 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:36 np0005593234 nova_compute[227762]: 2026-01-23 10:09:36.971 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.009 227766 DEBUG nova.storage.rbd_utils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.012 227766 DEBUG oslo_concurrency.processutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.201 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.244 227766 DEBUG oslo_concurrency.lockutils [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Acquiring lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.245 227766 DEBUG oslo_concurrency.lockutils [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.245 227766 DEBUG oslo_concurrency.lockutils [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Acquiring lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.246 227766 DEBUG oslo_concurrency.lockutils [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.246 227766 DEBUG oslo_concurrency.lockutils [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.248 227766 INFO nova.compute.manager [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Terminating instance#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.249 227766 DEBUG nova.compute.manager [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:09:37 np0005593234 kernel: tapfe0d3280-f7 (unregistering): left promiscuous mode
Jan 23 05:09:37 np0005593234 NetworkManager[48942]: <info>  [1769162977.3827] device (tapfe0d3280-f7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:09:37 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:37Z|00498|binding|INFO|Releasing lport fe0d3280-f78b-41fe-b883-bd8af39bb281 from this chassis (sb_readonly=0)
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.392 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:37 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:37Z|00499|binding|INFO|Setting lport fe0d3280-f78b-41fe-b883-bd8af39bb281 down in Southbound
Jan 23 05:09:37 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:37Z|00500|binding|INFO|Removing iface tapfe0d3280-f7 ovn-installed in OVS
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.407 227766 DEBUG oslo_concurrency.processutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.395s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:37 np0005593234 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000007d.scope: Deactivated successfully.
Jan 23 05:09:37 np0005593234 systemd[1]: machine-qemu\x2d58\x2dinstance\x2d0000007d.scope: Consumed 7.808s CPU time.
Jan 23 05:09:37 np0005593234 systemd-machined[195626]: Machine qemu-58-instance-0000007d terminated.
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.450 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.500 227766 DEBUG nova.storage.rbd_utils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] resizing rbd image 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:09:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:37.524 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5b:40:32 10.100.0.10'], port_security=['fa:16:3e:5b:40:32 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd247ce17-f43c-4f04-9eda-6dcd931a2f43', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '5b1dffee0b924f70840ddadf2a893b31', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0c0dc881-49bc-4e0e-87e3-28984dff236d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1b3d5d2a-18ab-4e59-bcd9-bf0c0b6cebaf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=fe0d3280-f78b-41fe-b883-bd8af39bb281) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:09:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:37.526 144381 INFO neutron.agent.ovn.metadata.agent [-] Port fe0d3280-f78b-41fe-b883-bd8af39bb281 in datapath 1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c unbound from our chassis#033[00m
Jan 23 05:09:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:37.528 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:09:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:37.530 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5a14ddfb-3b7a-4e90-ad26-48e7e9f3b8f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:37.530 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c namespace which is not needed anymore#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.534 227766 INFO nova.virt.libvirt.driver [-] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Instance destroyed successfully.#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.535 227766 DEBUG nova.objects.instance [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lazy-loading 'resources' on Instance uuid d247ce17-f43c-4f04-9eda-6dcd931a2f43 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:09:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:09:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:37.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.604 227766 DEBUG nova.objects.instance [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'migration_context' on Instance uuid 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.651 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.652 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Ensure instance console log exists: /var/lib/nova/instances/11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.652 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:37 np0005593234 neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c[285552]: [NOTICE]   (285556) : haproxy version is 2.8.14-c23fe91
Jan 23 05:09:37 np0005593234 neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c[285552]: [NOTICE]   (285556) : path to executable is /usr/sbin/haproxy
Jan 23 05:09:37 np0005593234 neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c[285552]: [WARNING]  (285556) : Exiting Master process...
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.653 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.653 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:37 np0005593234 neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c[285552]: [ALERT]    (285556) : Current worker (285558) exited with code 143 (Terminated)
Jan 23 05:09:37 np0005593234 neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c[285552]: [WARNING]  (285556) : All workers exited. Exiting... (0)
Jan 23 05:09:37 np0005593234 systemd[1]: libpod-31c3d9b57db8a241efe8dfaa3bb87ec44dad8412567111cf5732e653ce992c0b.scope: Deactivated successfully.
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.659 227766 DEBUG nova.virt.libvirt.vif [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:09:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerMetadataNegativeTestJSON-server-302700014',display_name='tempest-ServerMetadataNegativeTestJSON-server-302700014',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-servermetadatanegativetestjson-server-302700014',id=125,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:09:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='5b1dffee0b924f70840ddadf2a893b31',ramdisk_id='',reservation_id='r-r9vua8he',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerMetadataNegativeTestJSON-164591532',owner_user_name='tempest-ServerMetadataNegativeTestJSON-164591532-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:09:31Z,user_data=None,user_id='11cb5a7c448c4cb5b509c29925463448',uuid=d247ce17-f43c-4f04-9eda-6dcd931a2f43,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "address": "fa:16:3e:5b:40:32", "network": {"id": "1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2076307979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b1dffee0b924f70840ddadf2a893b31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe0d3280-f7", "ovs_interfaceid": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.659 227766 DEBUG nova.network.os_vif_util [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Converting VIF {"id": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "address": "fa:16:3e:5b:40:32", "network": {"id": "1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c", "bridge": "br-int", "label": "tempest-ServerMetadataNegativeTestJSON-2076307979-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "5b1dffee0b924f70840ddadf2a893b31", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfe0d3280-f7", "ovs_interfaceid": "fe0d3280-f78b-41fe-b883-bd8af39bb281", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.660 227766 DEBUG nova.network.os_vif_util [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5b:40:32,bridge_name='br-int',has_traffic_filtering=True,id=fe0d3280-f78b-41fe-b883-bd8af39bb281,network=Network(1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe0d3280-f7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.661 227766 DEBUG os_vif [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:40:32,bridge_name='br-int',has_traffic_filtering=True,id=fe0d3280-f78b-41fe-b883-bd8af39bb281,network=Network(1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe0d3280-f7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.662 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.663 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe0d3280-f7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:37 np0005593234 podman[285857]: 2026-01-23 10:09:37.663619711 +0000 UTC m=+0.043924655 container died 31c3d9b57db8a241efe8dfaa3bb87ec44dad8412567111cf5732e653ce992c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.664 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.665 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.669 227766 INFO os_vif [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5b:40:32,bridge_name='br-int',has_traffic_filtering=True,id=fe0d3280-f78b-41fe-b883-bd8af39bb281,network=Network(1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfe0d3280-f7')#033[00m
Jan 23 05:09:37 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-31c3d9b57db8a241efe8dfaa3bb87ec44dad8412567111cf5732e653ce992c0b-userdata-shm.mount: Deactivated successfully.
Jan 23 05:09:37 np0005593234 systemd[1]: var-lib-containers-storage-overlay-c79b91e34f672627b6dbe39320a30116d434a4e4c71b7e400d370a35c5784e39-merged.mount: Deactivated successfully.
Jan 23 05:09:37 np0005593234 podman[285857]: 2026-01-23 10:09:37.704492538 +0000 UTC m=+0.084797452 container cleanup 31c3d9b57db8a241efe8dfaa3bb87ec44dad8412567111cf5732e653ce992c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:09:37 np0005593234 systemd[1]: libpod-conmon-31c3d9b57db8a241efe8dfaa3bb87ec44dad8412567111cf5732e653ce992c0b.scope: Deactivated successfully.
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:37 np0005593234 podman[285904]: 2026-01-23 10:09:37.762053288 +0000 UTC m=+0.037361869 container remove 31c3d9b57db8a241efe8dfaa3bb87ec44dad8412567111cf5732e653ce992c0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:09:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:37.767 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[250a6f63-7c69-4242-a0c3-2784d3b404c1]: (4, ('Fri Jan 23 10:09:37 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c (31c3d9b57db8a241efe8dfaa3bb87ec44dad8412567111cf5732e653ce992c0b)\n31c3d9b57db8a241efe8dfaa3bb87ec44dad8412567111cf5732e653ce992c0b\nFri Jan 23 10:09:37 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c (31c3d9b57db8a241efe8dfaa3bb87ec44dad8412567111cf5732e653ce992c0b)\n31c3d9b57db8a241efe8dfaa3bb87ec44dad8412567111cf5732e653ce992c0b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:37.769 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fe1a03f8-9977-41a0-914c-a309b82ef84b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:37.770 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1b4b1a37-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.771 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:37 np0005593234 kernel: tap1b4b1a37-00: left promiscuous mode
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.787 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:37.791 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[acf26be3-b4ae-43ca-b8d1-ca3561b6dcef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:37.807 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e12d0204-150b-440f-ae1b-5481ba3c2534]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:37.808 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bdd986fd-d5ec-4cc8-97df-a0046538331f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:37.827 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7311cbda-8043-4de6-b6a7-48b393ac4a5e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 686334, 'reachable_time': 31793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 285919, 'error': None, 'target': 'ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:37 np0005593234 systemd[1]: run-netns-ovnmeta\x2d1b4b1a37\x2d078c\x2d4c8f\x2db2a3\x2d030ebfbdc95c.mount: Deactivated successfully.
Jan 23 05:09:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:37.831 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1b4b1a37-078c-4c8f-b2a3-030ebfbdc95c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:09:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:37.831 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[b68d1ea3-d423-44ee-b866-50f006eccb75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:37 np0005593234 nova_compute[227762]: 2026-01-23 10:09:37.963 227766 DEBUG nova.network.neutron [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Successfully created port: 8be6de92-c581-49d7-a315-1d1b8c33153a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:09:38 np0005593234 nova_compute[227762]: 2026-01-23 10:09:38.008 227766 DEBUG nova.compute.manager [req-c1d5add6-eb5d-4563-af0d-644c8168ab54 req-c08caafa-21a6-4f69-ac68-49b39b075479 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Received event network-vif-unplugged-fe0d3280-f78b-41fe-b883-bd8af39bb281 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:09:38 np0005593234 nova_compute[227762]: 2026-01-23 10:09:38.009 227766 DEBUG oslo_concurrency.lockutils [req-c1d5add6-eb5d-4563-af0d-644c8168ab54 req-c08caafa-21a6-4f69-ac68-49b39b075479 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:38 np0005593234 nova_compute[227762]: 2026-01-23 10:09:38.009 227766 DEBUG oslo_concurrency.lockutils [req-c1d5add6-eb5d-4563-af0d-644c8168ab54 req-c08caafa-21a6-4f69-ac68-49b39b075479 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:38 np0005593234 nova_compute[227762]: 2026-01-23 10:09:38.009 227766 DEBUG oslo_concurrency.lockutils [req-c1d5add6-eb5d-4563-af0d-644c8168ab54 req-c08caafa-21a6-4f69-ac68-49b39b075479 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:38 np0005593234 nova_compute[227762]: 2026-01-23 10:09:38.009 227766 DEBUG nova.compute.manager [req-c1d5add6-eb5d-4563-af0d-644c8168ab54 req-c08caafa-21a6-4f69-ac68-49b39b075479 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] No waiting events found dispatching network-vif-unplugged-fe0d3280-f78b-41fe-b883-bd8af39bb281 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:09:38 np0005593234 nova_compute[227762]: 2026-01-23 10:09:38.010 227766 DEBUG nova.compute.manager [req-c1d5add6-eb5d-4563-af0d-644c8168ab54 req-c08caafa-21a6-4f69-ac68-49b39b075479 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Received event network-vif-unplugged-fe0d3280-f78b-41fe-b883-bd8af39bb281 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:09:38 np0005593234 nova_compute[227762]: 2026-01-23 10:09:38.093 227766 INFO nova.virt.libvirt.driver [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Deleting instance files /var/lib/nova/instances/d247ce17-f43c-4f04-9eda-6dcd931a2f43_del#033[00m
Jan 23 05:09:38 np0005593234 nova_compute[227762]: 2026-01-23 10:09:38.094 227766 INFO nova.virt.libvirt.driver [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Deletion of /var/lib/nova/instances/d247ce17-f43c-4f04-9eda-6dcd931a2f43_del complete#033[00m
Jan 23 05:09:38 np0005593234 nova_compute[227762]: 2026-01-23 10:09:38.225 227766 INFO nova.compute.manager [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Took 0.98 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:09:38 np0005593234 nova_compute[227762]: 2026-01-23 10:09:38.226 227766 DEBUG oslo.service.loopingcall [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:09:38 np0005593234 nova_compute[227762]: 2026-01-23 10:09:38.226 227766 DEBUG nova.compute.manager [-] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:09:38 np0005593234 nova_compute[227762]: 2026-01-23 10:09:38.226 227766 DEBUG nova.network.neutron [-] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:09:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:38.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:39 np0005593234 nova_compute[227762]: 2026-01-23 10:09:39.328 227766 DEBUG nova.network.neutron [-] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:09:39 np0005593234 nova_compute[227762]: 2026-01-23 10:09:39.371 227766 INFO nova.compute.manager [-] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Took 1.14 seconds to deallocate network for instance.#033[00m
Jan 23 05:09:39 np0005593234 nova_compute[227762]: 2026-01-23 10:09:39.499 227766 DEBUG nova.network.neutron [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Successfully updated port: 8be6de92-c581-49d7-a315-1d1b8c33153a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:09:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:39.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:39 np0005593234 nova_compute[227762]: 2026-01-23 10:09:39.831 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:09:39 np0005593234 nova_compute[227762]: 2026-01-23 10:09:39.831 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquired lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:09:39 np0005593234 nova_compute[227762]: 2026-01-23 10:09:39.831 227766 DEBUG nova.network.neutron [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:09:39 np0005593234 nova_compute[227762]: 2026-01-23 10:09:39.895 227766 DEBUG nova.compute.manager [req-efa6234f-a6f7-4388-99b0-981d33bdcdb4 req-30b38b5c-4829-4797-a3e6-306e5f525191 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Received event network-changed-8be6de92-c581-49d7-a315-1d1b8c33153a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:09:39 np0005593234 nova_compute[227762]: 2026-01-23 10:09:39.895 227766 DEBUG nova.compute.manager [req-efa6234f-a6f7-4388-99b0-981d33bdcdb4 req-30b38b5c-4829-4797-a3e6-306e5f525191 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Refreshing instance network info cache due to event network-changed-8be6de92-c581-49d7-a315-1d1b8c33153a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:09:39 np0005593234 nova_compute[227762]: 2026-01-23 10:09:39.895 227766 DEBUG oslo_concurrency.lockutils [req-efa6234f-a6f7-4388-99b0-981d33bdcdb4 req-30b38b5c-4829-4797-a3e6-306e5f525191 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:09:39 np0005593234 nova_compute[227762]: 2026-01-23 10:09:39.902 227766 DEBUG oslo_concurrency.lockutils [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:39 np0005593234 nova_compute[227762]: 2026-01-23 10:09:39.902 227766 DEBUG oslo_concurrency.lockutils [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:39 np0005593234 nova_compute[227762]: 2026-01-23 10:09:39.977 227766 DEBUG oslo_concurrency.processutils [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:40 np0005593234 nova_compute[227762]: 2026-01-23 10:09:40.099 227766 DEBUG nova.network.neutron [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:09:40 np0005593234 nova_compute[227762]: 2026-01-23 10:09:40.133 227766 DEBUG nova.compute.manager [req-b1e72bdd-8025-4510-88b0-1ad6538c4c82 req-d8352cea-4faf-4774-9934-d5d61e9bba10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Received event network-vif-plugged-fe0d3280-f78b-41fe-b883-bd8af39bb281 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:09:40 np0005593234 nova_compute[227762]: 2026-01-23 10:09:40.133 227766 DEBUG oslo_concurrency.lockutils [req-b1e72bdd-8025-4510-88b0-1ad6538c4c82 req-d8352cea-4faf-4774-9934-d5d61e9bba10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:40 np0005593234 nova_compute[227762]: 2026-01-23 10:09:40.134 227766 DEBUG oslo_concurrency.lockutils [req-b1e72bdd-8025-4510-88b0-1ad6538c4c82 req-d8352cea-4faf-4774-9934-d5d61e9bba10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:40 np0005593234 nova_compute[227762]: 2026-01-23 10:09:40.134 227766 DEBUG oslo_concurrency.lockutils [req-b1e72bdd-8025-4510-88b0-1ad6538c4c82 req-d8352cea-4faf-4774-9934-d5d61e9bba10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:40 np0005593234 nova_compute[227762]: 2026-01-23 10:09:40.134 227766 DEBUG nova.compute.manager [req-b1e72bdd-8025-4510-88b0-1ad6538c4c82 req-d8352cea-4faf-4774-9934-d5d61e9bba10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] No waiting events found dispatching network-vif-plugged-fe0d3280-f78b-41fe-b883-bd8af39bb281 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:09:40 np0005593234 nova_compute[227762]: 2026-01-23 10:09:40.134 227766 WARNING nova.compute.manager [req-b1e72bdd-8025-4510-88b0-1ad6538c4c82 req-d8352cea-4faf-4774-9934-d5d61e9bba10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Received unexpected event network-vif-plugged-fe0d3280-f78b-41fe-b883-bd8af39bb281 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:09:40 np0005593234 nova_compute[227762]: 2026-01-23 10:09:40.134 227766 DEBUG nova.compute.manager [req-b1e72bdd-8025-4510-88b0-1ad6538c4c82 req-d8352cea-4faf-4774-9934-d5d61e9bba10 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Received event network-vif-deleted-fe0d3280-f78b-41fe-b883-bd8af39bb281 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:09:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:09:40 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3287734203' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:09:40 np0005593234 nova_compute[227762]: 2026-01-23 10:09:40.610 227766 DEBUG oslo_concurrency.processutils [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.633s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:40 np0005593234 nova_compute[227762]: 2026-01-23 10:09:40.617 227766 DEBUG nova.compute.provider_tree [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:09:40 np0005593234 nova_compute[227762]: 2026-01-23 10:09:40.640 227766 DEBUG nova.scheduler.client.report [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:09:40 np0005593234 nova_compute[227762]: 2026-01-23 10:09:40.667 227766 DEBUG oslo_concurrency.lockutils [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.765s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:40 np0005593234 nova_compute[227762]: 2026-01-23 10:09:40.696 227766 INFO nova.scheduler.client.report [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Deleted allocations for instance d247ce17-f43c-4f04-9eda-6dcd931a2f43#033[00m
Jan 23 05:09:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:40.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:40 np0005593234 nova_compute[227762]: 2026-01-23 10:09:40.797 227766 DEBUG oslo_concurrency.lockutils [None req-4107bc0b-b753-4bcc-9572-dbced490aecf 11cb5a7c448c4cb5b509c29925463448 5b1dffee0b924f70840ddadf2a893b31 - - default default] Lock "d247ce17-f43c-4f04-9eda-6dcd931a2f43" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.552s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:41.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:41 np0005593234 nova_compute[227762]: 2026-01-23 10:09:41.617 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:41 np0005593234 nova_compute[227762]: 2026-01-23 10:09:41.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:41 np0005593234 nova_compute[227762]: 2026-01-23 10:09:41.993 227766 DEBUG nova.network.neutron [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Updating instance_info_cache with network_info: [{"id": "8be6de92-c581-49d7-a315-1d1b8c33153a", "address": "fa:16:3e:d6:07:56", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8be6de92-c5", "ovs_interfaceid": "8be6de92-c581-49d7-a315-1d1b8c33153a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.019 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Releasing lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.020 227766 DEBUG nova.compute.manager [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Instance network_info: |[{"id": "8be6de92-c581-49d7-a315-1d1b8c33153a", "address": "fa:16:3e:d6:07:56", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8be6de92-c5", "ovs_interfaceid": "8be6de92-c581-49d7-a315-1d1b8c33153a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.020 227766 DEBUG oslo_concurrency.lockutils [req-efa6234f-a6f7-4388-99b0-981d33bdcdb4 req-30b38b5c-4829-4797-a3e6-306e5f525191 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.020 227766 DEBUG nova.network.neutron [req-efa6234f-a6f7-4388-99b0-981d33bdcdb4 req-30b38b5c-4829-4797-a3e6-306e5f525191 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Refreshing network info cache for port 8be6de92-c581-49d7-a315-1d1b8c33153a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.023 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Start _get_guest_xml network_info=[{"id": "8be6de92-c581-49d7-a315-1d1b8c33153a", "address": "fa:16:3e:d6:07:56", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8be6de92-c5", "ovs_interfaceid": "8be6de92-c581-49d7-a315-1d1b8c33153a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.028 227766 WARNING nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.033 227766 DEBUG nova.virt.libvirt.host [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.034 227766 DEBUG nova.virt.libvirt.host [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.037 227766 DEBUG nova.virt.libvirt.host [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.037 227766 DEBUG nova.virt.libvirt.host [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.038 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.039 227766 DEBUG nova.virt.hardware [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.039 227766 DEBUG nova.virt.hardware [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.040 227766 DEBUG nova.virt.hardware [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.040 227766 DEBUG nova.virt.hardware [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.040 227766 DEBUG nova.virt.hardware [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.040 227766 DEBUG nova.virt.hardware [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.040 227766 DEBUG nova.virt.hardware [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.041 227766 DEBUG nova.virt.hardware [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.041 227766 DEBUG nova.virt.hardware [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.042 227766 DEBUG nova.virt.hardware [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.042 227766 DEBUG nova.virt.hardware [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.045 227766 DEBUG oslo_concurrency.processutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:09:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3605976079' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.502 227766 DEBUG oslo_concurrency.processutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.530 227766 DEBUG nova.storage.rbd_utils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.534 227766 DEBUG oslo_concurrency.processutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:42 np0005593234 nova_compute[227762]: 2026-01-23 10:09:42.722 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:42.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:42.846 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:42.847 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:42.847 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:09:43 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/893501835' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.029 227766 DEBUG oslo_concurrency.processutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.031 227766 DEBUG nova.virt.libvirt.vif [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:09:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1792998',display_name='tempest-ServerActionsTestOtherB-server-1792998',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1792998',id=127,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-3xuyr6l4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOt
herB-1052932467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:09:36Z,user_data=None,user_id='aca3cab576d641d3b89e7dddf155d467',uuid=11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8be6de92-c581-49d7-a315-1d1b8c33153a", "address": "fa:16:3e:d6:07:56", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8be6de92-c5", "ovs_interfaceid": "8be6de92-c581-49d7-a315-1d1b8c33153a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.032 227766 DEBUG nova.network.os_vif_util [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "8be6de92-c581-49d7-a315-1d1b8c33153a", "address": "fa:16:3e:d6:07:56", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8be6de92-c5", "ovs_interfaceid": "8be6de92-c581-49d7-a315-1d1b8c33153a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.033 227766 DEBUG nova.network.os_vif_util [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:07:56,bridge_name='br-int',has_traffic_filtering=True,id=8be6de92-c581-49d7-a315-1d1b8c33153a,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8be6de92-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
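[Annotation, not part of the log] The `devname` / `vif_name` in the conversion above, `tap8be6de92-c5`, is the `tap` prefix plus a truncated prefix of the Neutron port UUID `8be6de92-c581-…`. A minimal sketch of that derivation — the 14-character cap is inferred from the logged value and the Linux 15-character interface-name limit, not copied from Nova source:

```python
# Sketch: derive the tap device name used for a Neutron port.
# Linux interface names are limited to 15 visible characters, so the
# "tap" prefix plus a truncated port UUID must fit; the log shows an
# 11-character UUID prefix in use. The truncation length below is an
# assumption reconstructed from the logged value.

DEV_NAME_LEN = 14  # "tap" + 11 chars of UUID, as observed in the log

def tap_devname(port_id: str) -> str:
    """Build the tap device name for a Neutron port UUID."""
    return ("tap" + port_id)[:DEV_NAME_LEN]

port = "8be6de92-c581-49d7-a315-1d1b8c33153a"
assert tap_devname(port) == "tap8be6de92-c5"  # matches the logged devname
```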
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.034 227766 DEBUG nova.objects.instance [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'pci_devices' on Instance uuid 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.054 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  <uuid>11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae</uuid>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  <name>instance-0000007f</name>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerActionsTestOtherB-server-1792998</nova:name>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:09:42</nova:creationTime>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <nova:user uuid="aca3cab576d641d3b89e7dddf155d467">tempest-ServerActionsTestOtherB-1052932467-project-member</nova:user>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <nova:project uuid="9dd869ce76e44fc8a82b8bbee1654d33">tempest-ServerActionsTestOtherB-1052932467</nova:project>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <nova:port uuid="8be6de92-c581-49d7-a315-1d1b8c33153a">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <entry name="serial">11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae</entry>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <entry name="uuid">11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae</entry>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_disk">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_disk.config">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:d6:07:56"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <target dev="tap8be6de92-c5"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae/console.log" append="off"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:09:43 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:09:43 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:09:43 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:09:43 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.055 227766 DEBUG nova.compute.manager [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Preparing to wait for external event network-vif-plugged-8be6de92-c581-49d7-a315-1d1b8c33153a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.056 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.056 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.056 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.057 227766 DEBUG nova.virt.libvirt.vif [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:09:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1792998',display_name='tempest-ServerActionsTestOtherB-server-1792998',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1792998',id=127,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-3xuyr6l4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerAct
ionsTestOtherB-1052932467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:09:36Z,user_data=None,user_id='aca3cab576d641d3b89e7dddf155d467',uuid=11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "8be6de92-c581-49d7-a315-1d1b8c33153a", "address": "fa:16:3e:d6:07:56", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8be6de92-c5", "ovs_interfaceid": "8be6de92-c581-49d7-a315-1d1b8c33153a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.057 227766 DEBUG nova.network.os_vif_util [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "8be6de92-c581-49d7-a315-1d1b8c33153a", "address": "fa:16:3e:d6:07:56", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8be6de92-c5", "ovs_interfaceid": "8be6de92-c581-49d7-a315-1d1b8c33153a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.058 227766 DEBUG nova.network.os_vif_util [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:07:56,bridge_name='br-int',has_traffic_filtering=True,id=8be6de92-c581-49d7-a315-1d1b8c33153a,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8be6de92-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.058 227766 DEBUG os_vif [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:07:56,bridge_name='br-int',has_traffic_filtering=True,id=8be6de92-c581-49d7-a315-1d1b8c33153a,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8be6de92-c5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.059 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.059 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.060 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.064 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.064 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8be6de92-c5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.065 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap8be6de92-c5, col_values=(('external_ids', {'iface-id': '8be6de92-c581-49d7-a315-1d1b8c33153a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d6:07:56', 'vm-uuid': '11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:43 np0005593234 NetworkManager[48942]: <info>  [1769162983.0679] manager: (tap8be6de92-c5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/259)
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.067 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.071 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.074 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.075 227766 INFO os_vif [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:07:56,bridge_name='br-int',has_traffic_filtering=True,id=8be6de92-c581-49d7-a315-1d1b8c33153a,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8be6de92-c5')#033[00m
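[Annotation, not part of the log] The two ovsdbapp transactions above (`AddPortCommand` then `DbSetCommand`) attach the tap device to `br-int` and stamp the interface with the `external_ids` that let `ovn-controller` match it to the OVN logical port; the `network-vif-plugged` event Nova is waiting for fires once that binding is reported. A sketch of the same write expressed as the equivalent `ovs-vsctl` invocation (illustrative only, built but not executed):

```python
# Sketch of the Interface record os-vif wrote via ovsdbapp, expressed as
# the equivalent ovs-vsctl call. The 'iface-id' external_id is what lets
# ovn-controller match this interface to the Neutron/OVN logical port.

port_id = "8be6de92-c581-49d7-a315-1d1b8c33153a"
mac = "fa:16:3e:d6:07:56"
instance_uuid = "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae"
devname = "tap8be6de92-c5"

external_ids = {
    "iface-id": port_id,          # matched by ovn-controller
    "iface-status": "active",
    "attached-mac": mac,
    "vm-uuid": instance_uuid,
}

# Equivalent CLI form (not executed here):
cmd = ["ovs-vsctl", "--may-exist", "add-port", "br-int", devname,
       "--", "set", "Interface", devname]
cmd += [f"external_ids:{k}={v}" for k, v in external_ids.items()]
```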
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.140 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.141 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.141 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No VIF found with MAC fa:16:3e:d6:07:56, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.142 227766 INFO nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Using config drive#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.167 227766 DEBUG nova.storage.rbd_utils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:09:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:09:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:43.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.910 227766 INFO nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Creating config drive at /var/lib/nova/instances/11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae/disk.config#033[00m
Jan 23 05:09:43 np0005593234 nova_compute[227762]: 2026-01-23 10:09:43.917 227766 DEBUG oslo_concurrency.processutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsvqwcffo execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.051 227766 DEBUG oslo_concurrency.processutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsvqwcffo" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.084 227766 DEBUG nova.storage.rbd_utils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.089 227766 DEBUG oslo_concurrency.processutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae/disk.config 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.325 227766 DEBUG oslo_concurrency.processutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae/disk.config 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.236s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.327 227766 INFO nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Deleting local config drive /var/lib/nova/instances/11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae/disk.config because it was imported into RBD.#033[00m
Jan 23 05:09:44 np0005593234 kernel: tap8be6de92-c5: entered promiscuous mode
Jan 23 05:09:44 np0005593234 NetworkManager[48942]: <info>  [1769162984.3853] manager: (tap8be6de92-c5): new Tun device (/org/freedesktop/NetworkManager/Devices/260)
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.386 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:44 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:44Z|00501|binding|INFO|Claiming lport 8be6de92-c581-49d7-a315-1d1b8c33153a for this chassis.
Jan 23 05:09:44 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:44Z|00502|binding|INFO|8be6de92-c581-49d7-a315-1d1b8c33153a: Claiming fa:16:3e:d6:07:56 10.100.0.8
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.391 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.396 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.401 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.403 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:44 np0005593234 NetworkManager[48942]: <info>  [1769162984.4041] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/261)
Jan 23 05:09:44 np0005593234 NetworkManager[48942]: <info>  [1769162984.4048] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/262)
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.407 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:07:56 10.100.0.8'], port_security=['fa:16:3e:d6:07:56 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8d9599b4-8855-4310-af02-cdd058438f7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dd869ce76e44fc8a82b8bbee1654d33', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b5b72284-9167-4768-aa53-98b2ad243e70', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=875f4baa-cb85-49ca-8f02-78715d351fdb, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=8be6de92-c581-49d7-a315-1d1b8c33153a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.409 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 8be6de92-c581-49d7-a315-1d1b8c33153a in datapath 8d9599b4-8855-4310-af02-cdd058438f7d bound to our chassis#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.410 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8d9599b4-8855-4310-af02-cdd058438f7d#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.426 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5c7026b3-2209-45c3-bc16-d89c5218f47e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.428 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8d9599b4-81 in ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:09:44 np0005593234 systemd-machined[195626]: New machine qemu-59-instance-0000007f.
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.431 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8d9599b4-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.431 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d3d66353-f973-4a9f-8687-ecd79dc100ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.432 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2c40c803-e45a-4809-a0bb-bee4539e5547]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.449 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[10dce676-a26a-43e2-bd8c-1e24481705ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 systemd[1]: Started Virtual Machine qemu-59-instance-0000007f.
Jan 23 05:09:44 np0005593234 systemd-udevd[286107]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.476 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e0dfc06b-2bee-4d07-9e23-61574e3577d5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 NetworkManager[48942]: <info>  [1769162984.4894] device (tap8be6de92-c5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:09:44 np0005593234 NetworkManager[48942]: <info>  [1769162984.4901] device (tap8be6de92-c5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.505 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f465ec36-819b-4370-aebd-04a77282d088]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.612 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:44 np0005593234 podman[286080]: 2026-01-23 10:09:44.621038031 +0000 UTC m=+0.201489212 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.632 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.668 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[74b40583-59f1-4aab-b225-2ad60e9131a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 NetworkManager[48942]: <info>  [1769162984.6721] manager: (tap8d9599b4-80): new Veth device (/org/freedesktop/NetworkManager/Devices/263)
Jan 23 05:09:44 np0005593234 systemd-udevd[286116]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.704 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a7563a20-fc79-4ae7-a2dc-a410abea601c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.707 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[1c4f8eb9-c19c-4e71-be20-9275d1cad586]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:44Z|00503|binding|INFO|Setting lport 8be6de92-c581-49d7-a315-1d1b8c33153a ovn-installed in OVS
Jan 23 05:09:44 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:44Z|00504|binding|INFO|Setting lport 8be6de92-c581-49d7-a315-1d1b8c33153a up in Southbound
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.711 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:44 np0005593234 NetworkManager[48942]: <info>  [1769162984.7344] device (tap8d9599b4-80): carrier: link connected
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.739 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[c6741a06-fe4d-4143-be01-b139d0aca224]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.755 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[05ce8043-0fa4-45c7-8433-a342f3030e94]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8d9599b4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:a1:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687969, 'reachable_time': 24655, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 286161, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.775 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[35cab367-1a22-4f89-bf7d-383256a81d57]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe55:a12b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687969, 'tstamp': 687969}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 286177, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:44.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.795 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f70ffb85-e4e2-42b7-9c5a-411ea14f4c30]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8d9599b4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:a1:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687969, 'reachable_time': 24655, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 286181, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.836 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fe475770-ab8c-4849-8a23-3dd54501710f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.894 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f709c2d8-20db-4c2c-a7f9-2823fc05ea8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.895 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d9599b4-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.895 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.896 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d9599b4-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.897 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:44 np0005593234 kernel: tap8d9599b4-80: entered promiscuous mode
Jan 23 05:09:44 np0005593234 NetworkManager[48942]: <info>  [1769162984.8985] manager: (tap8d9599b4-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/264)
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.899 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.902 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8d9599b4-80, col_values=(('external_ids', {'iface-id': 'b57bd565-3bb1-4ecc-8df0-a7c439ac84a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.903 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:44 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:44Z|00505|binding|INFO|Releasing lport b57bd565-3bb1-4ecc-8df0-a7c439ac84a6 from this chassis (sb_readonly=0)
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.904 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.907 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162984.9068935, 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.907 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8d9599b4-8855-4310-af02-cdd058438f7d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8d9599b4-8855-4310-af02-cdd058438f7d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.907 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] VM Started (Lifecycle Event)#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.908 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[094e8ff3-1d5c-42ee-9447-f0519a647cf8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.909 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-8d9599b4-8855-4310-af02-cdd058438f7d
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/8d9599b4-8855-4310-af02-cdd058438f7d.pid.haproxy
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 8d9599b4-8855-4310-af02-cdd058438f7d
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:09:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:44.909 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'env', 'PROCESS_TAG=haproxy-8d9599b4-8855-4310-af02-cdd058438f7d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8d9599b4-8855-4310-af02-cdd058438f7d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.920 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.920 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:09:44 np0005593234 nova_compute[227762]: 2026-01-23 10:09:44.920 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.001 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.006 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162984.9070318, 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.006 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.039 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.043 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.065 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.101 227766 DEBUG nova.network.neutron [req-efa6234f-a6f7-4388-99b0-981d33bdcdb4 req-30b38b5c-4829-4797-a3e6-306e5f525191 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Updated VIF entry in instance network info cache for port 8be6de92-c581-49d7-a315-1d1b8c33153a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.101 227766 DEBUG nova.network.neutron [req-efa6234f-a6f7-4388-99b0-981d33bdcdb4 req-30b38b5c-4829-4797-a3e6-306e5f525191 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Updating instance_info_cache with network_info: [{"id": "8be6de92-c581-49d7-a315-1d1b8c33153a", "address": "fa:16:3e:d6:07:56", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8be6de92-c5", "ovs_interfaceid": "8be6de92-c581-49d7-a315-1d1b8c33153a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.121 227766 DEBUG oslo_concurrency.lockutils [req-efa6234f-a6f7-4388-99b0-981d33bdcdb4 req-30b38b5c-4829-4797-a3e6-306e5f525191 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:09:45 np0005593234 podman[286220]: 2026-01-23 10:09:45.297100542 +0000 UTC m=+0.049594011 container create d5cdaa854f806be32587ef3b74d7c4eac6df59bc023c39125f0f7ab7f630dfa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.315 227766 DEBUG nova.compute.manager [req-6cb8c185-0220-4b03-8e02-9f23210d040e req-de0b77ee-f350-4d89-9be8-0f22bbe70dee 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Received event network-vif-plugged-8be6de92-c581-49d7-a315-1d1b8c33153a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.315 227766 DEBUG oslo_concurrency.lockutils [req-6cb8c185-0220-4b03-8e02-9f23210d040e req-de0b77ee-f350-4d89-9be8-0f22bbe70dee 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.316 227766 DEBUG oslo_concurrency.lockutils [req-6cb8c185-0220-4b03-8e02-9f23210d040e req-de0b77ee-f350-4d89-9be8-0f22bbe70dee 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.316 227766 DEBUG oslo_concurrency.lockutils [req-6cb8c185-0220-4b03-8e02-9f23210d040e req-de0b77ee-f350-4d89-9be8-0f22bbe70dee 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.316 227766 DEBUG nova.compute.manager [req-6cb8c185-0220-4b03-8e02-9f23210d040e req-de0b77ee-f350-4d89-9be8-0f22bbe70dee 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Processing event network-vif-plugged-8be6de92-c581-49d7-a315-1d1b8c33153a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.317 227766 DEBUG nova.compute.manager [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.321 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769162985.321228, 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.322 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.324 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.328 227766 INFO nova.virt.libvirt.driver [-] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Instance spawned successfully.#033[00m
Jan 23 05:09:45 np0005593234 systemd[1]: Started libpod-conmon-d5cdaa854f806be32587ef3b74d7c4eac6df59bc023c39125f0f7ab7f630dfa0.scope.
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.329 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.351 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.357 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.357 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.358 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.358 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.358 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.359 227766 DEBUG nova.virt.libvirt.driver [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.362 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:09:45 np0005593234 podman[286220]: 2026-01-23 10:09:45.269588032 +0000 UTC m=+0.022081521 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:09:45 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:09:45 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e172f9831372312d99f1618df0ac2daa8d2ad83be2d9f75197e922d10e277efd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:09:45 np0005593234 podman[286220]: 2026-01-23 10:09:45.386551699 +0000 UTC m=+0.139045168 container init d5cdaa854f806be32587ef3b74d7c4eac6df59bc023c39125f0f7ab7f630dfa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 05:09:45 np0005593234 podman[286220]: 2026-01-23 10:09:45.39264119 +0000 UTC m=+0.145134659 container start d5cdaa854f806be32587ef3b74d7c4eac6df59bc023c39125f0f7ab7f630dfa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.397 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:09:45 np0005593234 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[286235]: [NOTICE]   (286239) : New worker (286241) forked
Jan 23 05:09:45 np0005593234 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[286235]: [NOTICE]   (286239) : Loading success.
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.422 227766 INFO nova.compute.manager [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Took 8.60 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.422 227766 DEBUG nova.compute.manager [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.496 227766 INFO nova.compute.manager [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Took 9.81 seconds to build instance.#033[00m
Jan 23 05:09:45 np0005593234 nova_compute[227762]: 2026-01-23 10:09:45.513 227766 DEBUG oslo_concurrency.lockutils [None req-2973f201-f6cb-4d12-b7fa-1d95223d4eee aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.935s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:45.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:46 np0005593234 nova_compute[227762]: 2026-01-23 10:09:46.656 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:46 np0005593234 nova_compute[227762]: 2026-01-23 10:09:46.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:46.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:47 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:47Z|00506|binding|INFO|Releasing lport b57bd565-3bb1-4ecc-8df0-a7c439ac84a6 from this chassis (sb_readonly=0)
Jan 23 05:09:47 np0005593234 nova_compute[227762]: 2026-01-23 10:09:47.390 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:47 np0005593234 nova_compute[227762]: 2026-01-23 10:09:47.446 227766 DEBUG nova.compute.manager [req-457acd82-c424-4a67-b4dc-e4346fe1816b req-57e377a8-4e07-44e7-94ef-e8808fc778a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Received event network-vif-plugged-8be6de92-c581-49d7-a315-1d1b8c33153a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:09:47 np0005593234 nova_compute[227762]: 2026-01-23 10:09:47.446 227766 DEBUG oslo_concurrency.lockutils [req-457acd82-c424-4a67-b4dc-e4346fe1816b req-57e377a8-4e07-44e7-94ef-e8808fc778a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:09:47 np0005593234 nova_compute[227762]: 2026-01-23 10:09:47.446 227766 DEBUG oslo_concurrency.lockutils [req-457acd82-c424-4a67-b4dc-e4346fe1816b req-57e377a8-4e07-44e7-94ef-e8808fc778a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:09:47 np0005593234 nova_compute[227762]: 2026-01-23 10:09:47.447 227766 DEBUG oslo_concurrency.lockutils [req-457acd82-c424-4a67-b4dc-e4346fe1816b req-57e377a8-4e07-44e7-94ef-e8808fc778a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:09:47 np0005593234 nova_compute[227762]: 2026-01-23 10:09:47.447 227766 DEBUG nova.compute.manager [req-457acd82-c424-4a67-b4dc-e4346fe1816b req-57e377a8-4e07-44e7-94ef-e8808fc778a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] No waiting events found dispatching network-vif-plugged-8be6de92-c581-49d7-a315-1d1b8c33153a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:09:47 np0005593234 nova_compute[227762]: 2026-01-23 10:09:47.447 227766 WARNING nova.compute.manager [req-457acd82-c424-4a67-b4dc-e4346fe1816b req-57e377a8-4e07-44e7-94ef-e8808fc778a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Received unexpected event network-vif-plugged-8be6de92-c581-49d7-a315-1d1b8c33153a for instance with vm_state active and task_state None.#033[00m
Jan 23 05:09:47 np0005593234 nova_compute[227762]: 2026-01-23 10:09:47.487 227766 INFO nova.compute.manager [None req-4f3cbb25-c78a-4df9-94aa-3ac27123e338 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Get console output#033[00m
Jan 23 05:09:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:47.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:48 np0005593234 nova_compute[227762]: 2026-01-23 10:09:48.068 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:48 np0005593234 nova_compute[227762]: 2026-01-23 10:09:48.125 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:48.132 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=46, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=45) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:09:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:48.133 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:09:48 np0005593234 nova_compute[227762]: 2026-01-23 10:09:48.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:09:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:48.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:49.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:50.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:09:51.135 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '46'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:09:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:51.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:51 np0005593234 nova_compute[227762]: 2026-01-23 10:09:51.658 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:52 np0005593234 nova_compute[227762]: 2026-01-23 10:09:52.494 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769162977.4840176, d247ce17-f43c-4f04-9eda-6dcd931a2f43 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:09:52 np0005593234 nova_compute[227762]: 2026-01-23 10:09:52.494 227766 INFO nova.compute.manager [-] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:09:52 np0005593234 nova_compute[227762]: 2026-01-23 10:09:52.518 227766 DEBUG nova.compute.manager [None req-5c20fd37-63b8-4595-9f36-8a0bc426d320 - - - - - -] [instance: d247ce17-f43c-4f04-9eda-6dcd931a2f43] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:09:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:52.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:53 np0005593234 nova_compute[227762]: 2026-01-23 10:09:53.071 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:53.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:09:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:54.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:09:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:55.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:09:56 np0005593234 nova_compute[227762]: 2026-01-23 10:09:56.660 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:56.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:57 np0005593234 nova_compute[227762]: 2026-01-23 10:09:57.096 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:09:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:57.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:09:58 np0005593234 nova_compute[227762]: 2026-01-23 10:09:58.074 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:09:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:58.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:09:59 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:59Z|00052|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d6:07:56 10.100.0.8
Jan 23 05:09:59 np0005593234 ovn_controller[134547]: 2026-01-23T10:09:59Z|00053|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d6:07:56 10.100.0.8
Jan 23 05:09:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:09:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:09:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:59.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:00 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 05:10:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:00.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:10:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:01.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:10:01 np0005593234 nova_compute[227762]: 2026-01-23 10:10:01.661 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:02.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:03 np0005593234 nova_compute[227762]: 2026-01-23 10:10:03.077 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:03.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:03 np0005593234 nova_compute[227762]: 2026-01-23 10:10:03.838 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:04.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:05.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:06 np0005593234 nova_compute[227762]: 2026-01-23 10:10:06.663 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:06 np0005593234 podman[286311]: 2026-01-23 10:10:06.761788903 +0000 UTC m=+0.048796957 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:10:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:06.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:07.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:08 np0005593234 nova_compute[227762]: 2026-01-23 10:10:08.079 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:08.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:09.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:10.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:11.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:11 np0005593234 nova_compute[227762]: 2026-01-23 10:10:11.664 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:12.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:13 np0005593234 nova_compute[227762]: 2026-01-23 10:10:13.082 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:13.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e285 e285: 3 total, 3 up, 3 in
Jan 23 05:10:14 np0005593234 podman[286385]: 2026-01-23 10:10:14.790759392 +0000 UTC m=+0.078131064 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 23 05:10:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:14.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:15.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e285 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:16 np0005593234 nova_compute[227762]: 2026-01-23 10:10:16.666 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:16.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:17.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:18 np0005593234 nova_compute[227762]: 2026-01-23 10:10:18.084 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:18.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:19.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:20.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e285 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:21.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:21 np0005593234 nova_compute[227762]: 2026-01-23 10:10:21.667 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:10:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:22.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:10:23 np0005593234 nova_compute[227762]: 2026-01-23 10:10:23.087 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e286 e286: 3 total, 3 up, 3 in
Jan 23 05:10:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:23.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:10:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:10:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:10:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:24.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e287 e287: 3 total, 3 up, 3 in
Jan 23 05:10:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:25.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e287 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:26 np0005593234 nova_compute[227762]: 2026-01-23 10:10:26.711 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:26.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e288 e288: 3 total, 3 up, 3 in
Jan 23 05:10:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:27.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e289 e289: 3 total, 3 up, 3 in
Jan 23 05:10:28 np0005593234 nova_compute[227762]: 2026-01-23 10:10:28.089 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:28.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:29.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:10:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:10:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:30.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:10:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:31.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:10:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e289 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:31 np0005593234 nova_compute[227762]: 2026-01-23 10:10:31.714 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e290 e290: 3 total, 3 up, 3 in
Jan 23 05:10:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:10:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/784611389' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:10:32 np0005593234 nova_compute[227762]: 2026-01-23 10:10:32.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:32.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:33 np0005593234 nova_compute[227762]: 2026-01-23 10:10:33.093 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:33.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:34 np0005593234 nova_compute[227762]: 2026-01-23 10:10:34.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:34.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:35 np0005593234 nova_compute[227762]: 2026-01-23 10:10:35.514 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:35 np0005593234 nova_compute[227762]: 2026-01-23 10:10:35.514 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:35 np0005593234 nova_compute[227762]: 2026-01-23 10:10:35.514 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:35 np0005593234 nova_compute[227762]: 2026-01-23 10:10:35.515 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:10:35 np0005593234 nova_compute[227762]: 2026-01-23 10:10:35.515 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:35.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:10:35 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1916074286' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:10:35 np0005593234 nova_compute[227762]: 2026-01-23 10:10:35.972 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e290 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:36 np0005593234 nova_compute[227762]: 2026-01-23 10:10:36.716 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e291 e291: 3 total, 3 up, 3 in
Jan 23 05:10:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:36.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:37 np0005593234 nova_compute[227762]: 2026-01-23 10:10:37.377 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000007f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:10:37 np0005593234 nova_compute[227762]: 2026-01-23 10:10:37.378 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000007f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:10:37 np0005593234 nova_compute[227762]: 2026-01-23 10:10:37.556 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:10:37 np0005593234 nova_compute[227762]: 2026-01-23 10:10:37.557 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4262MB free_disk=20.793682098388672GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:10:37 np0005593234 nova_compute[227762]: 2026-01-23 10:10:37.558 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:37 np0005593234 nova_compute[227762]: 2026-01-23 10:10:37.558 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:37.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:37 np0005593234 podman[286677]: 2026-01-23 10:10:37.751262284 +0000 UTC m=+0.046006949 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 05:10:38 np0005593234 nova_compute[227762]: 2026-01-23 10:10:38.095 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:38.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:39.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:40 np0005593234 nova_compute[227762]: 2026-01-23 10:10:40.061 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:10:40 np0005593234 nova_compute[227762]: 2026-01-23 10:10:40.062 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:10:40 np0005593234 nova_compute[227762]: 2026-01-23 10:10:40.062 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:10:40 np0005593234 nova_compute[227762]: 2026-01-23 10:10:40.080 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:10:40 np0005593234 nova_compute[227762]: 2026-01-23 10:10:40.095 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:10:40 np0005593234 nova_compute[227762]: 2026-01-23 10:10:40.096 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:10:40 np0005593234 nova_compute[227762]: 2026-01-23 10:10:40.108 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:10:40 np0005593234 nova_compute[227762]: 2026-01-23 10:10:40.129 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:10:40 np0005593234 nova_compute[227762]: 2026-01-23 10:10:40.183 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:10:40 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3113727065' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:10:40 np0005593234 nova_compute[227762]: 2026-01-23 10:10:40.713 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.530s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:40 np0005593234 nova_compute[227762]: 2026-01-23 10:10:40.720 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:10:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:40.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:41 np0005593234 nova_compute[227762]: 2026-01-23 10:10:41.034 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:10:41 np0005593234 nova_compute[227762]: 2026-01-23 10:10:41.083 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:10:41 np0005593234 nova_compute[227762]: 2026-01-23 10:10:41.083 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.525s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:41.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:41 np0005593234 nova_compute[227762]: 2026-01-23 10:10:41.718 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:42.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:10:42.847 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:10:42.848 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:10:42.848 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:43 np0005593234 nova_compute[227762]: 2026-01-23 10:10:43.099 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:43.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:44 np0005593234 nova_compute[227762]: 2026-01-23 10:10:44.083 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:44 np0005593234 nova_compute[227762]: 2026-01-23 10:10:44.084 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:44 np0005593234 nova_compute[227762]: 2026-01-23 10:10:44.084 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:10:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1606914261' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:10:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:10:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1606914261' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:10:44 np0005593234 nova_compute[227762]: 2026-01-23 10:10:44.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:44 np0005593234 nova_compute[227762]: 2026-01-23 10:10:44.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:10:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:44.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:45.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:45 np0005593234 podman[286720]: 2026-01-23 10:10:45.854443623 +0000 UTC m=+0.150515937 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 05:10:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:46 np0005593234 nova_compute[227762]: 2026-01-23 10:10:46.720 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:46 np0005593234 nova_compute[227762]: 2026-01-23 10:10:46.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:46 np0005593234 nova_compute[227762]: 2026-01-23 10:10:46.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:10:46 np0005593234 nova_compute[227762]: 2026-01-23 10:10:46.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:10:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:46.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:47 np0005593234 nova_compute[227762]: 2026-01-23 10:10:47.193 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:47 np0005593234 nova_compute[227762]: 2026-01-23 10:10:47.193 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:47 np0005593234 nova_compute[227762]: 2026-01-23 10:10:47.193 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:10:47 np0005593234 nova_compute[227762]: 2026-01-23 10:10:47.194 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:47.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:48 np0005593234 nova_compute[227762]: 2026-01-23 10:10:48.101 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:48.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:49.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:50.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:51 np0005593234 nova_compute[227762]: 2026-01-23 10:10:51.155 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Updating instance_info_cache with network_info: [{"id": "8be6de92-c581-49d7-a315-1d1b8c33153a", "address": "fa:16:3e:d6:07:56", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8be6de92-c5", "ovs_interfaceid": "8be6de92-c581-49d7-a315-1d1b8c33153a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:10:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:51.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:51 np0005593234 nova_compute[227762]: 2026-01-23 10:10:51.792 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:52 np0005593234 nova_compute[227762]: 2026-01-23 10:10:52.667 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:10:52 np0005593234 nova_compute[227762]: 2026-01-23 10:10:52.667 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:10:52 np0005593234 nova_compute[227762]: 2026-01-23 10:10:52.667 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:52.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:53 np0005593234 nova_compute[227762]: 2026-01-23 10:10:53.105 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:53.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:54 np0005593234 nova_compute[227762]: 2026-01-23 10:10:54.280 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:54 np0005593234 nova_compute[227762]: 2026-01-23 10:10:54.281 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:54 np0005593234 nova_compute[227762]: 2026-01-23 10:10:54.312 227766 DEBUG nova.compute.manager [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:10:54 np0005593234 nova_compute[227762]: 2026-01-23 10:10:54.405 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:54 np0005593234 nova_compute[227762]: 2026-01-23 10:10:54.406 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:54 np0005593234 nova_compute[227762]: 2026-01-23 10:10:54.437 227766 DEBUG nova.virt.hardware [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:10:54 np0005593234 nova_compute[227762]: 2026-01-23 10:10:54.438 227766 INFO nova.compute.claims [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:10:54 np0005593234 nova_compute[227762]: 2026-01-23 10:10:54.652 227766 DEBUG oslo_concurrency.processutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:54 np0005593234 nova_compute[227762]: 2026-01-23 10:10:54.673 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:10:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:54.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:10:55 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4030296179' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.104 227766 DEBUG oslo_concurrency.processutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.113 227766 DEBUG nova.compute.provider_tree [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.133 227766 DEBUG nova.scheduler.client.report [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.171 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.765s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.172 227766 DEBUG nova.compute.manager [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.250 227766 DEBUG nova.compute.manager [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.250 227766 DEBUG nova.network.neutron [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.285 227766 INFO nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.331 227766 DEBUG nova.compute.manager [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.385 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.385 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.411 227766 DEBUG nova.compute.manager [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.471 227766 DEBUG nova.compute.manager [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.472 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.473 227766 INFO nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Creating image(s)#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.499 227766 DEBUG nova.storage.rbd_utils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.524 227766 DEBUG nova.storage.rbd_utils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.550 227766 DEBUG nova.storage.rbd_utils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.554 227766 DEBUG oslo_concurrency.processutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.599 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.600 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.608 227766 DEBUG nova.virt.hardware [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.608 227766 INFO nova.compute.claims [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.622 227766 DEBUG oslo_concurrency.processutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.623 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.623 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.624 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:10:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:55.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.651 227766 DEBUG nova.storage.rbd_utils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.654 227766 DEBUG oslo_concurrency.processutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:55 np0005593234 nova_compute[227762]: 2026-01-23 10:10:55.806 227766 DEBUG oslo_concurrency.processutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:10:56 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/159899216' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.270 227766 DEBUG oslo_concurrency.processutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.282 227766 DEBUG nova.compute.provider_tree [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.311 227766 DEBUG nova.scheduler.client.report [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.336 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.736s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.336 227766 DEBUG nova.compute.manager [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.387 227766 DEBUG nova.compute.manager [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.388 227766 DEBUG nova.network.neutron [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.414 227766 INFO nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.484 227766 DEBUG nova.compute.manager [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.578 227766 DEBUG nova.compute.manager [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.579 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.579 227766 INFO nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Creating image(s)#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.693 227766 DEBUG nova.storage.rbd_utils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] rbd image da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.725 227766 DEBUG nova.storage.rbd_utils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] rbd image da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.754 227766 DEBUG nova.storage.rbd_utils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] rbd image da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.758 227766 DEBUG oslo_concurrency.processutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.784 227766 DEBUG nova.policy [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eb500aabc93044e380f4bc905205803d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'f00cc6e26e5c435b902306c6421e146d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.789 227766 DEBUG oslo_concurrency.processutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.827 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.831 227766 DEBUG nova.policy [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aca3cab576d641d3b89e7dddf155d467', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9dd869ce76e44fc8a82b8bbee1654d33', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.834 227766 DEBUG oslo_concurrency.processutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.835 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.836 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.836 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:56.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.860 227766 DEBUG nova.storage.rbd_utils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] rbd image da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.863 227766 DEBUG oslo_concurrency.processutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:10:56 np0005593234 nova_compute[227762]: 2026-01-23 10:10:56.930 227766 DEBUG nova.storage.rbd_utils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] resizing rbd image a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:10:57 np0005593234 nova_compute[227762]: 2026-01-23 10:10:57.254 227766 DEBUG nova.objects.instance [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'migration_context' on Instance uuid a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:57 np0005593234 nova_compute[227762]: 2026-01-23 10:10:57.549 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:10:57 np0005593234 nova_compute[227762]: 2026-01-23 10:10:57.550 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Ensure instance console log exists: /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:10:57 np0005593234 nova_compute[227762]: 2026-01-23 10:10:57.550 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:57 np0005593234 nova_compute[227762]: 2026-01-23 10:10:57.550 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:57 np0005593234 nova_compute[227762]: 2026-01-23 10:10:57.550 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:57.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:57 np0005593234 nova_compute[227762]: 2026-01-23 10:10:57.797 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:10:57.797 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=47, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=46) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:10:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:10:57.798 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:10:57 np0005593234 nova_compute[227762]: 2026-01-23 10:10:57.893 227766 DEBUG oslo_concurrency.processutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:10:57 np0005593234 nova_compute[227762]: 2026-01-23 10:10:57.952 227766 DEBUG nova.storage.rbd_utils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] resizing rbd image da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:10:58 np0005593234 nova_compute[227762]: 2026-01-23 10:10:58.001 227766 DEBUG nova.network.neutron [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Successfully created port: 475aff6e-e556-418f-8d36-87ae65b950ae _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:10:58 np0005593234 nova_compute[227762]: 2026-01-23 10:10:58.070 227766 DEBUG nova.objects.instance [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lazy-loading 'migration_context' on Instance uuid da6a1a46-4a6b-44a0-b5a2-35d2634865be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:10:58 np0005593234 nova_compute[227762]: 2026-01-23 10:10:58.107 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:10:58 np0005593234 nova_compute[227762]: 2026-01-23 10:10:58.145 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:10:58 np0005593234 nova_compute[227762]: 2026-01-23 10:10:58.145 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Ensure instance console log exists: /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:10:58 np0005593234 nova_compute[227762]: 2026-01-23 10:10:58.145 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:10:58 np0005593234 nova_compute[227762]: 2026-01-23 10:10:58.145 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:10:58 np0005593234 nova_compute[227762]: 2026-01-23 10:10:58.146 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:10:58 np0005593234 nova_compute[227762]: 2026-01-23 10:10:58.520 227766 DEBUG nova.network.neutron [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Successfully created port: 87b7656f-9fbc-466f-bfe3-06171df90096 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:10:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:58.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:59 np0005593234 nova_compute[227762]: 2026-01-23 10:10:59.578 227766 DEBUG nova.network.neutron [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Successfully updated port: 475aff6e-e556-418f-8d36-87ae65b950ae _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:10:59 np0005593234 nova_compute[227762]: 2026-01-23 10:10:59.593 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquiring lock "refresh_cache-da6a1a46-4a6b-44a0-b5a2-35d2634865be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:59 np0005593234 nova_compute[227762]: 2026-01-23 10:10:59.593 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquired lock "refresh_cache-da6a1a46-4a6b-44a0-b5a2-35d2634865be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:59 np0005593234 nova_compute[227762]: 2026-01-23 10:10:59.593 227766 DEBUG nova.network.neutron [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:10:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:10:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:10:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:59.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:10:59 np0005593234 nova_compute[227762]: 2026-01-23 10:10:59.794 227766 DEBUG nova.network.neutron [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Successfully updated port: 87b7656f-9fbc-466f-bfe3-06171df90096 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:10:59 np0005593234 nova_compute[227762]: 2026-01-23 10:10:59.810 227766 DEBUG nova.compute.manager [req-14ddefed-3348-4fa3-adbe-2228bb0f8123 req-17c1693c-98e2-4abf-b0be-440deab37fd8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-changed-475aff6e-e556-418f-8d36-87ae65b950ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:10:59 np0005593234 nova_compute[227762]: 2026-01-23 10:10:59.811 227766 DEBUG nova.compute.manager [req-14ddefed-3348-4fa3-adbe-2228bb0f8123 req-17c1693c-98e2-4abf-b0be-440deab37fd8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Refreshing instance network info cache due to event network-changed-475aff6e-e556-418f-8d36-87ae65b950ae. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:10:59 np0005593234 nova_compute[227762]: 2026-01-23 10:10:59.811 227766 DEBUG oslo_concurrency.lockutils [req-14ddefed-3348-4fa3-adbe-2228bb0f8123 req-17c1693c-98e2-4abf-b0be-440deab37fd8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-da6a1a46-4a6b-44a0-b5a2-35d2634865be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:59 np0005593234 nova_compute[227762]: 2026-01-23 10:10:59.828 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:10:59 np0005593234 nova_compute[227762]: 2026-01-23 10:10:59.828 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquired lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:10:59 np0005593234 nova_compute[227762]: 2026-01-23 10:10:59.828 227766 DEBUG nova.network.neutron [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:10:59 np0005593234 nova_compute[227762]: 2026-01-23 10:10:59.890 227766 DEBUG nova.network.neutron [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:11:00 np0005593234 nova_compute[227762]: 2026-01-23 10:10:59.999 227766 DEBUG nova.compute.manager [req-bd911b97-8f53-42a3-9b16-3bf0b0cca076 req-3eea72ac-d617-454a-8ba6-5fadd0fa5d8a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-changed-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:00 np0005593234 nova_compute[227762]: 2026-01-23 10:11:00.000 227766 DEBUG nova.compute.manager [req-bd911b97-8f53-42a3-9b16-3bf0b0cca076 req-3eea72ac-d617-454a-8ba6-5fadd0fa5d8a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Refreshing instance network info cache due to event network-changed-87b7656f-9fbc-466f-bfe3-06171df90096. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:11:00 np0005593234 nova_compute[227762]: 2026-01-23 10:11:00.000 227766 DEBUG oslo_concurrency.lockutils [req-bd911b97-8f53-42a3-9b16-3bf0b0cca076 req-3eea72ac-d617-454a-8ba6-5fadd0fa5d8a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:11:00 np0005593234 nova_compute[227762]: 2026-01-23 10:11:00.105 227766 DEBUG nova.network.neutron [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:11:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:11:00 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2074020461' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:11:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:00.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:01.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:01 np0005593234 nova_compute[227762]: 2026-01-23 10:11:01.850 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.064 227766 DEBUG nova.network.neutron [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Updating instance_info_cache with network_info: [{"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.066 227766 DEBUG nova.network.neutron [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updating instance_info_cache with network_info: [{"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.124 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Releasing lock "refresh_cache-da6a1a46-4a6b-44a0-b5a2-35d2634865be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.125 227766 DEBUG nova.compute.manager [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Instance network_info: |[{"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.125 227766 DEBUG oslo_concurrency.lockutils [req-14ddefed-3348-4fa3-adbe-2228bb0f8123 req-17c1693c-98e2-4abf-b0be-440deab37fd8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-da6a1a46-4a6b-44a0-b5a2-35d2634865be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.125 227766 DEBUG nova.network.neutron [req-14ddefed-3348-4fa3-adbe-2228bb0f8123 req-17c1693c-98e2-4abf-b0be-440deab37fd8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Refreshing network info cache for port 475aff6e-e556-418f-8d36-87ae65b950ae _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.128 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Start _get_guest_xml network_info=[{"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.130 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Releasing lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.130 227766 DEBUG nova.compute.manager [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Instance network_info: |[{"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.130 227766 DEBUG oslo_concurrency.lockutils [req-bd911b97-8f53-42a3-9b16-3bf0b0cca076 req-3eea72ac-d617-454a-8ba6-5fadd0fa5d8a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.130 227766 DEBUG nova.network.neutron [req-bd911b97-8f53-42a3-9b16-3bf0b0cca076 req-3eea72ac-d617-454a-8ba6-5fadd0fa5d8a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Refreshing network info cache for port 87b7656f-9fbc-466f-bfe3-06171df90096 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.132 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Start _get_guest_xml network_info=[{"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.137 227766 WARNING nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.139 227766 WARNING nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.157 227766 DEBUG nova.virt.libvirt.host [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.158 227766 DEBUG nova.virt.libvirt.host [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.158 227766 DEBUG nova.virt.libvirt.host [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.159 227766 DEBUG nova.virt.libvirt.host [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.161 227766 DEBUG nova.virt.libvirt.host [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.162 227766 DEBUG nova.virt.libvirt.host [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.163 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.163 227766 DEBUG nova.virt.hardware [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.164 227766 DEBUG nova.virt.hardware [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.164 227766 DEBUG nova.virt.hardware [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.164 227766 DEBUG nova.virt.hardware [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.164 227766 DEBUG nova.virt.hardware [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.164 227766 DEBUG nova.virt.hardware [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.165 227766 DEBUG nova.virt.hardware [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.165 227766 DEBUG nova.virt.hardware [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.165 227766 DEBUG nova.virt.hardware [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.165 227766 DEBUG nova.virt.hardware [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.165 227766 DEBUG nova.virt.hardware [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.168 227766 DEBUG oslo_concurrency.processutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.191 227766 DEBUG nova.virt.libvirt.host [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.192 227766 DEBUG nova.virt.libvirt.host [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.194 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.194 227766 DEBUG nova.virt.hardware [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.195 227766 DEBUG nova.virt.hardware [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.195 227766 DEBUG nova.virt.hardware [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.195 227766 DEBUG nova.virt.hardware [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.195 227766 DEBUG nova.virt.hardware [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.196 227766 DEBUG nova.virt.hardware [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.196 227766 DEBUG nova.virt.hardware [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.196 227766 DEBUG nova.virt.hardware [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.196 227766 DEBUG nova.virt.hardware [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.197 227766 DEBUG nova.virt.hardware [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.197 227766 DEBUG nova.virt.hardware [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:11:02 np0005593234 nova_compute[227762]: 2026-01-23 10:11:02.201 227766 DEBUG oslo_concurrency.processutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:02.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:03 np0005593234 nova_compute[227762]: 2026-01-23 10:11:03.109 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:03.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:11:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2112211815' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:11:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:11:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3923528694' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:11:04 np0005593234 nova_compute[227762]: 2026-01-23 10:11:04.494 227766 DEBUG nova.network.neutron [req-14ddefed-3348-4fa3-adbe-2228bb0f8123 req-17c1693c-98e2-4abf-b0be-440deab37fd8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Updated VIF entry in instance network info cache for port 475aff6e-e556-418f-8d36-87ae65b950ae. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:11:04 np0005593234 nova_compute[227762]: 2026-01-23 10:11:04.494 227766 DEBUG nova.network.neutron [req-14ddefed-3348-4fa3-adbe-2228bb0f8123 req-17c1693c-98e2-4abf-b0be-440deab37fd8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Updating instance_info_cache with network_info: [{"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:11:04 np0005593234 nova_compute[227762]: 2026-01-23 10:11:04.522 227766 DEBUG oslo_concurrency.lockutils [req-14ddefed-3348-4fa3-adbe-2228bb0f8123 req-17c1693c-98e2-4abf-b0be-440deab37fd8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-da6a1a46-4a6b-44a0-b5a2-35d2634865be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:11:04 np0005593234 nova_compute[227762]: 2026-01-23 10:11:04.690 227766 DEBUG oslo_concurrency.processutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:04 np0005593234 nova_compute[227762]: 2026-01-23 10:11:04.718 227766 DEBUG nova.storage.rbd_utils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] rbd image da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:11:04 np0005593234 nova_compute[227762]: 2026-01-23 10:11:04.724 227766 DEBUG oslo_concurrency.processutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:04 np0005593234 nova_compute[227762]: 2026-01-23 10:11:04.747 227766 DEBUG oslo_concurrency.processutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 2.579s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:04 np0005593234 nova_compute[227762]: 2026-01-23 10:11:04.774 227766 DEBUG nova.storage.rbd_utils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:11:04 np0005593234 nova_compute[227762]: 2026-01-23 10:11:04.779 227766 DEBUG oslo_concurrency.processutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:04 np0005593234 nova_compute[227762]: 2026-01-23 10:11:04.804 227766 DEBUG nova.network.neutron [req-bd911b97-8f53-42a3-9b16-3bf0b0cca076 req-3eea72ac-d617-454a-8ba6-5fadd0fa5d8a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updated VIF entry in instance network info cache for port 87b7656f-9fbc-466f-bfe3-06171df90096. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:11:04 np0005593234 nova_compute[227762]: 2026-01-23 10:11:04.805 227766 DEBUG nova.network.neutron [req-bd911b97-8f53-42a3-9b16-3bf0b0cca076 req-3eea72ac-d617-454a-8ba6-5fadd0fa5d8a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updating instance_info_cache with network_info: [{"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:11:04 np0005593234 nova_compute[227762]: 2026-01-23 10:11:04.836 227766 DEBUG oslo_concurrency.lockutils [req-bd911b97-8f53-42a3-9b16-3bf0b0cca076 req-3eea72ac-d617-454a-8ba6-5fadd0fa5d8a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:11:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:11:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:04.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:11:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:11:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1526683013' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:11:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:11:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2051248476' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:11:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:05.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:05.801 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '47'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.836 227766 DEBUG oslo_concurrency.processutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.837 227766 DEBUG nova.virt.libvirt.vif [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:10:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1307986454',display_name='tempest-ServerActionsTestOtherB-server-1307986454',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1307986454',id=130,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-7gk6dzv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:10:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.838 227766 DEBUG nova.network.os_vif_util [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.839 227766 DEBUG nova.network.os_vif_util [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:3b:96,bridge_name='br-int',has_traffic_filtering=True,id=87b7656f-9fbc-466f-bfe3-06171df90096,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b7656f-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.840 227766 DEBUG nova.objects.instance [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'pci_devices' on Instance uuid a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.841 227766 DEBUG oslo_concurrency.processutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.117s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.842 227766 DEBUG nova.virt.libvirt.vif [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:10:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1707842336',display_name='tempest-ServerRescueTestJSON-server-1707842336',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1707842336',id=131,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f00cc6e26e5c435b902306c6421e146d',ramdisk_id='',reservation_id='r-g6qwsh4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-837476510',owner_user_name='tempest-ServerRescueTestJSON-83
7476510-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:10:56Z,user_data=None,user_id='eb500aabc93044e380f4bc905205803d',uuid=da6a1a46-4a6b-44a0-b5a2-35d2634865be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.842 227766 DEBUG nova.network.os_vif_util [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Converting VIF {"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.843 227766 DEBUG nova.network.os_vif_util [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:d9:a5,bridge_name='br-int',has_traffic_filtering=True,id=475aff6e-e556-418f-8d36-87ae65b950ae,network=Network(88b571fd-69ad-4860-a596-3bd637fdb189),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap475aff6e-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.843 227766 DEBUG nova.objects.instance [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lazy-loading 'pci_devices' on Instance uuid da6a1a46-4a6b-44a0-b5a2-35d2634865be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.857 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <uuid>a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c</uuid>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <name>instance-00000082</name>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerActionsTestOtherB-server-1307986454</nova:name>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:11:02</nova:creationTime>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:user uuid="aca3cab576d641d3b89e7dddf155d467">tempest-ServerActionsTestOtherB-1052932467-project-member</nova:user>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:project uuid="9dd869ce76e44fc8a82b8bbee1654d33">tempest-ServerActionsTestOtherB-1052932467</nova:project>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:port uuid="87b7656f-9fbc-466f-bfe3-06171df90096">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <entry name="serial">a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c</entry>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <entry name="uuid">a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c</entry>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk.config">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:a4:3b:96"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <target dev="tap87b7656f-9f"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c/console.log" append="off"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:11:05 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:11:05 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.858 227766 DEBUG nova.compute.manager [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Preparing to wait for external event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.858 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.858 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.858 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.859 227766 DEBUG nova.virt.libvirt.vif [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:10:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1307986454',display_name='tempest-ServerActionsTestOtherB-server-1307986454',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1307986454',id=130,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-7gk6dzv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:10:55Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.859 227766 DEBUG nova.network.os_vif_util [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.860 227766 DEBUG nova.network.os_vif_util [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:3b:96,bridge_name='br-int',has_traffic_filtering=True,id=87b7656f-9fbc-466f-bfe3-06171df90096,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b7656f-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.860 227766 DEBUG os_vif [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:3b:96,bridge_name='br-int',has_traffic_filtering=True,id=87b7656f-9fbc-466f-bfe3-06171df90096,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b7656f-9f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.861 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.861 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.862 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.864 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <uuid>da6a1a46-4a6b-44a0-b5a2-35d2634865be</uuid>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <name>instance-00000083</name>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerRescueTestJSON-server-1707842336</nova:name>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:11:02</nova:creationTime>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:user uuid="eb500aabc93044e380f4bc905205803d">tempest-ServerRescueTestJSON-837476510-project-member</nova:user>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:project uuid="f00cc6e26e5c435b902306c6421e146d">tempest-ServerRescueTestJSON-837476510</nova:project>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <nova:port uuid="475aff6e-e556-418f-8d36-87ae65b950ae">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <entry name="serial">da6a1a46-4a6b-44a0-b5a2-35d2634865be</entry>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <entry name="uuid">da6a1a46-4a6b-44a0-b5a2-35d2634865be</entry>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.config">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:85:d9:a5"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <target dev="tap475aff6e-e5"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/console.log" append="off"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:11:05 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:11:05 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:11:05 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:11:05 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.864 227766 DEBUG nova.compute.manager [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Preparing to wait for external event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.864 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.865 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.865 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.865 227766 DEBUG nova.virt.libvirt.vif [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:10:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1707842336',display_name='tempest-ServerRescueTestJSON-server-1707842336',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1707842336',id=131,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f00cc6e26e5c435b902306c6421e146d',ramdisk_id='',reservation_id='r-g6qwsh4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueTestJSON-837476510',owner_user_name='tempest-ServerRescueTestJSON-837476510-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:10:56Z,user_data=None,user_id='eb500aabc93044e380f4bc905205803d',uuid=da6a1a46-4a6b-44a0-b5a2-35d2634865be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.866 227766 DEBUG nova.network.os_vif_util [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Converting VIF {"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.866 227766 DEBUG nova.network.os_vif_util [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:85:d9:a5,bridge_name='br-int',has_traffic_filtering=True,id=475aff6e-e556-418f-8d36-87ae65b950ae,network=Network(88b571fd-69ad-4860-a596-3bd637fdb189),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap475aff6e-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.866 227766 DEBUG os_vif [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:d9:a5,bridge_name='br-int',has_traffic_filtering=True,id=475aff6e-e556-418f-8d36-87ae65b950ae,network=Network(88b571fd-69ad-4860-a596-3bd637fdb189),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap475aff6e-e5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.867 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.867 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.867 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.868 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.868 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap87b7656f-9f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.869 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap87b7656f-9f, col_values=(('external_ids', {'iface-id': '87b7656f-9fbc-466f-bfe3-06171df90096', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a4:3b:96', 'vm-uuid': 'a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:05 np0005593234 NetworkManager[48942]: <info>  [1769163065.8717] manager: (tap87b7656f-9f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/265)
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.872 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.876 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.877 227766 INFO os_vif [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:3b:96,bridge_name='br-int',has_traffic_filtering=True,id=87b7656f-9fbc-466f-bfe3-06171df90096,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b7656f-9f')#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.879 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.879 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap475aff6e-e5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.880 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap475aff6e-e5, col_values=(('external_ids', {'iface-id': '475aff6e-e556-418f-8d36-87ae65b950ae', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:85:d9:a5', 'vm-uuid': 'da6a1a46-4a6b-44a0-b5a2-35d2634865be'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.881 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:05 np0005593234 NetworkManager[48942]: <info>  [1769163065.8833] manager: (tap475aff6e-e5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/266)
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.884 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.889 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:05 np0005593234 nova_compute[227762]: 2026-01-23 10:11:05.890 227766 INFO os_vif [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:85:d9:a5,bridge_name='br-int',has_traffic_filtering=True,id=475aff6e-e556-418f-8d36-87ae65b950ae,network=Network(88b571fd-69ad-4860-a596-3bd637fdb189),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap475aff6e-e5')#033[00m
Jan 23 05:11:06 np0005593234 nova_compute[227762]: 2026-01-23 10:11:06.190 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:06 np0005593234 nova_compute[227762]: 2026-01-23 10:11:06.190 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:06 np0005593234 nova_compute[227762]: 2026-01-23 10:11:06.190 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No VIF found with MAC fa:16:3e:a4:3b:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:11:06 np0005593234 nova_compute[227762]: 2026-01-23 10:11:06.191 227766 INFO nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Using config drive#033[00m
Jan 23 05:11:06 np0005593234 nova_compute[227762]: 2026-01-23 10:11:06.220 227766 DEBUG nova.storage.rbd_utils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:11:06 np0005593234 nova_compute[227762]: 2026-01-23 10:11:06.262 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:06 np0005593234 nova_compute[227762]: 2026-01-23 10:11:06.263 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:06 np0005593234 nova_compute[227762]: 2026-01-23 10:11:06.263 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] No VIF found with MAC fa:16:3e:85:d9:a5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:11:06 np0005593234 nova_compute[227762]: 2026-01-23 10:11:06.264 227766 INFO nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Using config drive#033[00m
Jan 23 05:11:06 np0005593234 nova_compute[227762]: 2026-01-23 10:11:06.284 227766 DEBUG nova.storage.rbd_utils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] rbd image da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:11:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:06 np0005593234 nova_compute[227762]: 2026-01-23 10:11:06.854 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:06.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:06 np0005593234 nova_compute[227762]: 2026-01-23 10:11:06.905 227766 INFO nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Creating config drive at /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c/disk.config#033[00m
Jan 23 05:11:06 np0005593234 nova_compute[227762]: 2026-01-23 10:11:06.912 227766 DEBUG oslo_concurrency.processutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc_t9cyrl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.051 227766 DEBUG oslo_concurrency.processutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc_t9cyrl" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.087 227766 DEBUG nova.storage.rbd_utils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.093 227766 DEBUG oslo_concurrency.processutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c/disk.config a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.121 227766 INFO nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Creating config drive at /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/disk.config
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.128 227766 DEBUG oslo_concurrency.processutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplht3qq8k execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.263 227766 DEBUG oslo_concurrency.processutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplht3qq8k" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.289 227766 DEBUG nova.storage.rbd_utils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] rbd image da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.294 227766 DEBUG oslo_concurrency.processutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/disk.config da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.321 227766 DEBUG oslo_concurrency.processutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c/disk.config a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.228s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.322 227766 INFO nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Deleting local config drive /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c/disk.config because it was imported into RBD.
Jan 23 05:11:07 np0005593234 NetworkManager[48942]: <info>  [1769163067.3758] manager: (tap87b7656f-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/267)
Jan 23 05:11:07 np0005593234 kernel: tap87b7656f-9f: entered promiscuous mode
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.383 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:07Z|00507|binding|INFO|Claiming lport 87b7656f-9fbc-466f-bfe3-06171df90096 for this chassis.
Jan 23 05:11:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:07Z|00508|binding|INFO|87b7656f-9fbc-466f-bfe3-06171df90096: Claiming fa:16:3e:a4:3b:96 10.100.0.12
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.391 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:3b:96 10.100.0.12'], port_security=['fa:16:3e:a4:3b:96 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8d9599b4-8855-4310-af02-cdd058438f7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dd869ce76e44fc8a82b8bbee1654d33', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cf3e0bf9-33c6-483b-a880-c8297a0be71f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=875f4baa-cb85-49ca-8f02-78715d351fdb, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=87b7656f-9fbc-466f-bfe3-06171df90096) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.392 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 87b7656f-9fbc-466f-bfe3-06171df90096 in datapath 8d9599b4-8855-4310-af02-cdd058438f7d bound to our chassis
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.395 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8d9599b4-8855-4310-af02-cdd058438f7d
Jan 23 05:11:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:07Z|00509|binding|INFO|Setting lport 87b7656f-9fbc-466f-bfe3-06171df90096 ovn-installed in OVS
Jan 23 05:11:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:07Z|00510|binding|INFO|Setting lport 87b7656f-9fbc-466f-bfe3-06171df90096 up in Southbound
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.404 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:07 np0005593234 systemd-machined[195626]: New machine qemu-60-instance-00000082.
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.413 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:07 np0005593234 systemd-udevd[287441]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.414 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[edcd64de-8a4b-447f-bb47-cfa640133982]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:11:07 np0005593234 systemd[1]: Started Virtual Machine qemu-60-instance-00000082.
Jan 23 05:11:07 np0005593234 NetworkManager[48942]: <info>  [1769163067.4283] device (tap87b7656f-9f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:11:07 np0005593234 NetworkManager[48942]: <info>  [1769163067.4294] device (tap87b7656f-9f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.451 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[54502de3-c294-461f-b56c-bd1261b6a4e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.456 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9b4c1202-ff7e-49a6-82ad-1311ab90d874]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.479 227766 DEBUG oslo_concurrency.processutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/disk.config da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.185s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.480 227766 INFO nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Deleting local config drive /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/disk.config because it was imported into RBD.
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.482 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[04a16913-ccc2-4ce5-b4e2-3f1a5ac9e545]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.500 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c47569b0-7b88-4c32-9b13-4178f93eca6a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8d9599b4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:a1:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687969, 'reachable_time': 24655, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 287456, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.516 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bd4ac026-191d-4f7a-970d-db552bf26d47]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8d9599b4-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687982, 'tstamp': 687982}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287462, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8d9599b4-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687985, 'tstamp': 687985}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 287462, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.518 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d9599b4-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.519 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.523 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.524 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d9599b4-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.524 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.524 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8d9599b4-80, col_values=(('external_ids', {'iface-id': 'b57bd565-3bb1-4ecc-8df0-a7c439ac84a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.525 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 05:11:07 np0005593234 kernel: tap475aff6e-e5: entered promiscuous mode
Jan 23 05:11:07 np0005593234 NetworkManager[48942]: <info>  [1769163067.5272] manager: (tap475aff6e-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/268)
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.527 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:07Z|00511|binding|INFO|Claiming lport 475aff6e-e556-418f-8d36-87ae65b950ae for this chassis.
Jan 23 05:11:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:07Z|00512|binding|INFO|475aff6e-e556-418f-8d36-87ae65b950ae: Claiming fa:16:3e:85:d9:a5 10.100.0.9
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.537 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:d9:a5 10.100.0.9'], port_security=['fa:16:3e:85:d9:a5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'da6a1a46-4a6b-44a0-b5a2-35d2634865be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88b571fd-69ad-4860-a596-3bd637fdb189', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f00cc6e26e5c435b902306c6421e146d', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a7b9167c-c78b-48f5-9e9d-ac8ada29e0a2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d050303a-8173-4865-aab2-724e0c0624de, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=475aff6e-e556-418f-8d36-87ae65b950ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.538 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 475aff6e-e556-418f-8d36-87ae65b950ae in datapath 88b571fd-69ad-4860-a596-3bd637fdb189 bound to our chassis
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.539 144381 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 88b571fd-69ad-4860-a596-3bd637fdb189 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 23 05:11:07 np0005593234 NetworkManager[48942]: <info>  [1769163067.5405] device (tap475aff6e-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:11:07 np0005593234 NetworkManager[48942]: <info>  [1769163067.5420] device (tap475aff6e-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:11:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:07.543 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b909981b-be07-4823-a6d8-3bc895315702]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:11:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:07Z|00513|binding|INFO|Setting lport 475aff6e-e556-418f-8d36-87ae65b950ae up in Southbound
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.546 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:07Z|00514|binding|INFO|Setting lport 475aff6e-e556-418f-8d36-87ae65b950ae ovn-installed in OVS
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.548 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.551 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:07 np0005593234 systemd-machined[195626]: New machine qemu-61-instance-00000083.
Jan 23 05:11:07 np0005593234 systemd[1]: Started Virtual Machine qemu-61-instance-00000083.
Jan 23 05:11:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:07.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.677 227766 DEBUG nova.compute.manager [req-65c123fd-d040-4fb0-b76e-a6cd2e3c1852 req-71a63734-d45f-4cfd-bd66-156dc0041a76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.678 227766 DEBUG oslo_concurrency.lockutils [req-65c123fd-d040-4fb0-b76e-a6cd2e3c1852 req-71a63734-d45f-4cfd-bd66-156dc0041a76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.678 227766 DEBUG oslo_concurrency.lockutils [req-65c123fd-d040-4fb0-b76e-a6cd2e3c1852 req-71a63734-d45f-4cfd-bd66-156dc0041a76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.678 227766 DEBUG oslo_concurrency.lockutils [req-65c123fd-d040-4fb0-b76e-a6cd2e3c1852 req-71a63734-d45f-4cfd-bd66-156dc0041a76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.678 227766 DEBUG nova.compute.manager [req-65c123fd-d040-4fb0-b76e-a6cd2e3c1852 req-71a63734-d45f-4cfd-bd66-156dc0041a76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Processing event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.851 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163067.8510945, a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.852 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] VM Started (Lifecycle Event)
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.855 227766 DEBUG nova.compute.manager [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.863 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.868 227766 INFO nova.virt.libvirt.driver [-] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Instance spawned successfully.
Jan 23 05:11:07 np0005593234 nova_compute[227762]: 2026-01-23 10:11:07.868 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.044 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.045 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.045 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.046 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.046 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.046 227766 DEBUG nova.virt.libvirt.driver [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.051 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.057 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:11:08 np0005593234 podman[287587]: 2026-01-23 10:11:08.103746205 +0000 UTC m=+0.060586836 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.105 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.106 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163067.8551884, a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.106 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.141 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.145 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163067.8588238, a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.146 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.157 227766 INFO nova.compute.manager [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Took 12.69 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.157 227766 DEBUG nova.compute.manager [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.172 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.175 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.648 227766 INFO nova.compute.manager [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Took 14.28 seconds to build instance.#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.664 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163067.9938138, da6a1a46-4a6b-44a0-b5a2-35d2634865be => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.665 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] VM Started (Lifecycle Event)#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.692 227766 DEBUG oslo_concurrency.lockutils [None req-8ebd4971-74b2-4c06-b70e-7f2cc6e1f17b aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.701 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.707 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163067.993915, da6a1a46-4a6b-44a0-b5a2-35d2634865be => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.708 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.740 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.743 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:11:08 np0005593234 nova_compute[227762]: 2026-01-23 10:11:08.772 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:11:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:08.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e292 e292: 3 total, 3 up, 3 in
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.451 227766 DEBUG nova.compute.manager [req-576bf97e-9971-40d7-8035-76d32235c647 req-eb51bc42-838f-46ee-8f1e-9a3067bc8a37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.452 227766 DEBUG oslo_concurrency.lockutils [req-576bf97e-9971-40d7-8035-76d32235c647 req-eb51bc42-838f-46ee-8f1e-9a3067bc8a37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.452 227766 DEBUG oslo_concurrency.lockutils [req-576bf97e-9971-40d7-8035-76d32235c647 req-eb51bc42-838f-46ee-8f1e-9a3067bc8a37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.452 227766 DEBUG oslo_concurrency.lockutils [req-576bf97e-9971-40d7-8035-76d32235c647 req-eb51bc42-838f-46ee-8f1e-9a3067bc8a37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.452 227766 DEBUG nova.compute.manager [req-576bf97e-9971-40d7-8035-76d32235c647 req-eb51bc42-838f-46ee-8f1e-9a3067bc8a37 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Processing event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.453 227766 DEBUG nova.compute.manager [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.457 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163069.457096, da6a1a46-4a6b-44a0-b5a2-35d2634865be => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.457 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.460 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.464 227766 INFO nova.virt.libvirt.driver [-] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Instance spawned successfully.#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.465 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.496 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.572 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.578 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.579 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.580 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.580 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.580 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.581 227766 DEBUG nova.virt.libvirt.driver [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:11:09 np0005593234 nova_compute[227762]: 2026-01-23 10:11:09.630 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:11:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:09.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:10 np0005593234 nova_compute[227762]: 2026-01-23 10:11:10.186 227766 INFO nova.compute.manager [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Took 13.61 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:11:10 np0005593234 nova_compute[227762]: 2026-01-23 10:11:10.187 227766 DEBUG nova.compute.manager [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:10 np0005593234 nova_compute[227762]: 2026-01-23 10:11:10.741 227766 DEBUG nova.compute.manager [req-02b9e37a-df95-4cd7-a510-ef4bc4e499f6 req-d345805e-95ea-4ff2-a277-7ebde656cc8f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:10 np0005593234 nova_compute[227762]: 2026-01-23 10:11:10.742 227766 DEBUG oslo_concurrency.lockutils [req-02b9e37a-df95-4cd7-a510-ef4bc4e499f6 req-d345805e-95ea-4ff2-a277-7ebde656cc8f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:10 np0005593234 nova_compute[227762]: 2026-01-23 10:11:10.742 227766 DEBUG oslo_concurrency.lockutils [req-02b9e37a-df95-4cd7-a510-ef4bc4e499f6 req-d345805e-95ea-4ff2-a277-7ebde656cc8f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:10 np0005593234 nova_compute[227762]: 2026-01-23 10:11:10.742 227766 DEBUG oslo_concurrency.lockutils [req-02b9e37a-df95-4cd7-a510-ef4bc4e499f6 req-d345805e-95ea-4ff2-a277-7ebde656cc8f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:10 np0005593234 nova_compute[227762]: 2026-01-23 10:11:10.742 227766 DEBUG nova.compute.manager [req-02b9e37a-df95-4cd7-a510-ef4bc4e499f6 req-d345805e-95ea-4ff2-a277-7ebde656cc8f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] No waiting events found dispatching network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:10 np0005593234 nova_compute[227762]: 2026-01-23 10:11:10.743 227766 WARNING nova.compute.manager [req-02b9e37a-df95-4cd7-a510-ef4bc4e499f6 req-d345805e-95ea-4ff2-a277-7ebde656cc8f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received unexpected event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:11:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:10.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:10 np0005593234 nova_compute[227762]: 2026-01-23 10:11:10.882 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:11 np0005593234 nova_compute[227762]: 2026-01-23 10:11:11.124 227766 INFO nova.compute.manager [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Took 15.55 seconds to build instance.#033[00m
Jan 23 05:11:11 np0005593234 nova_compute[227762]: 2026-01-23 10:11:11.652 227766 DEBUG oslo_concurrency.lockutils [None req-0d9f9b87-63ee-4e27-9c76-aa790094ee83 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:11:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:11.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:11:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e292 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:11 np0005593234 nova_compute[227762]: 2026-01-23 10:11:11.855 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:11 np0005593234 nova_compute[227762]: 2026-01-23 10:11:11.916 227766 DEBUG nova.compute.manager [req-0091cb2c-b8fe-456b-b79b-e511562acc45 req-44417744-0bc9-4cc8-974d-33e946a43b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:11 np0005593234 nova_compute[227762]: 2026-01-23 10:11:11.916 227766 DEBUG oslo_concurrency.lockutils [req-0091cb2c-b8fe-456b-b79b-e511562acc45 req-44417744-0bc9-4cc8-974d-33e946a43b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:11 np0005593234 nova_compute[227762]: 2026-01-23 10:11:11.917 227766 DEBUG oslo_concurrency.lockutils [req-0091cb2c-b8fe-456b-b79b-e511562acc45 req-44417744-0bc9-4cc8-974d-33e946a43b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:11 np0005593234 nova_compute[227762]: 2026-01-23 10:11:11.917 227766 DEBUG oslo_concurrency.lockutils [req-0091cb2c-b8fe-456b-b79b-e511562acc45 req-44417744-0bc9-4cc8-974d-33e946a43b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:11 np0005593234 nova_compute[227762]: 2026-01-23 10:11:11.917 227766 DEBUG nova.compute.manager [req-0091cb2c-b8fe-456b-b79b-e511562acc45 req-44417744-0bc9-4cc8-974d-33e946a43b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] No waiting events found dispatching network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:11 np0005593234 nova_compute[227762]: 2026-01-23 10:11:11.917 227766 WARNING nova.compute.manager [req-0091cb2c-b8fe-456b-b79b-e511562acc45 req-44417744-0bc9-4cc8-974d-33e946a43b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received unexpected event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae for instance with vm_state active and task_state None.#033[00m
Jan 23 05:11:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:12.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:13.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:14 np0005593234 nova_compute[227762]: 2026-01-23 10:11:14.709 227766 INFO nova.compute.manager [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Rescuing#033[00m
Jan 23 05:11:14 np0005593234 nova_compute[227762]: 2026-01-23 10:11:14.710 227766 DEBUG oslo_concurrency.lockutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquiring lock "refresh_cache-da6a1a46-4a6b-44a0-b5a2-35d2634865be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:11:14 np0005593234 nova_compute[227762]: 2026-01-23 10:11:14.710 227766 DEBUG oslo_concurrency.lockutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquired lock "refresh_cache-da6a1a46-4a6b-44a0-b5a2-35d2634865be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:11:14 np0005593234 nova_compute[227762]: 2026-01-23 10:11:14.710 227766 DEBUG nova.network.neutron [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:11:14 np0005593234 nova_compute[227762]: 2026-01-23 10:11:14.712 227766 DEBUG nova.compute.manager [req-1e7fba65-3793-4175-9dd8-11208052c5c6 req-d0cc6785-e977-4d26-9e4c-13916857f4df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-changed-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:14 np0005593234 nova_compute[227762]: 2026-01-23 10:11:14.712 227766 DEBUG nova.compute.manager [req-1e7fba65-3793-4175-9dd8-11208052c5c6 req-d0cc6785-e977-4d26-9e4c-13916857f4df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Refreshing instance network info cache due to event network-changed-87b7656f-9fbc-466f-bfe3-06171df90096. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:11:14 np0005593234 nova_compute[227762]: 2026-01-23 10:11:14.713 227766 DEBUG oslo_concurrency.lockutils [req-1e7fba65-3793-4175-9dd8-11208052c5c6 req-d0cc6785-e977-4d26-9e4c-13916857f4df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:11:14 np0005593234 nova_compute[227762]: 2026-01-23 10:11:14.713 227766 DEBUG oslo_concurrency.lockutils [req-1e7fba65-3793-4175-9dd8-11208052c5c6 req-d0cc6785-e977-4d26-9e4c-13916857f4df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:11:14 np0005593234 nova_compute[227762]: 2026-01-23 10:11:14.713 227766 DEBUG nova.network.neutron [req-1e7fba65-3793-4175-9dd8-11208052c5c6 req-d0cc6785-e977-4d26-9e4c-13916857f4df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Refreshing network info cache for port 87b7656f-9fbc-466f-bfe3-06171df90096 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:11:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:14.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:11:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:15.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:11:15 np0005593234 nova_compute[227762]: 2026-01-23 10:11:15.884 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:16 np0005593234 podman[287638]: 2026-01-23 10:11:16.790313677 +0000 UTC m=+0.082672476 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:11:16 np0005593234 nova_compute[227762]: 2026-01-23 10:11:16.857 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:16.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e292 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e293 e293: 3 total, 3 up, 3 in
Jan 23 05:11:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:17.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:18.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:19 np0005593234 nova_compute[227762]: 2026-01-23 10:11:19.626 227766 DEBUG nova.network.neutron [req-1e7fba65-3793-4175-9dd8-11208052c5c6 req-d0cc6785-e977-4d26-9e4c-13916857f4df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updated VIF entry in instance network info cache for port 87b7656f-9fbc-466f-bfe3-06171df90096. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:11:19 np0005593234 nova_compute[227762]: 2026-01-23 10:11:19.627 227766 DEBUG nova.network.neutron [req-1e7fba65-3793-4175-9dd8-11208052c5c6 req-d0cc6785-e977-4d26-9e4c-13916857f4df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updating instance_info_cache with network_info: [{"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:11:19 np0005593234 nova_compute[227762]: 2026-01-23 10:11:19.667 227766 DEBUG nova.network.neutron [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Updating instance_info_cache with network_info: [{"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:11:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:11:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:19.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:11:20 np0005593234 nova_compute[227762]: 2026-01-23 10:11:20.022 227766 DEBUG oslo_concurrency.lockutils [req-1e7fba65-3793-4175-9dd8-11208052c5c6 req-d0cc6785-e977-4d26-9e4c-13916857f4df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:11:20 np0005593234 nova_compute[227762]: 2026-01-23 10:11:20.055 227766 DEBUG oslo_concurrency.lockutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Releasing lock "refresh_cache-da6a1a46-4a6b-44a0-b5a2-35d2634865be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:11:20 np0005593234 nova_compute[227762]: 2026-01-23 10:11:20.731 227766 DEBUG nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:11:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:20.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:20 np0005593234 nova_compute[227762]: 2026-01-23 10:11:20.886 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:21.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:21 np0005593234 nova_compute[227762]: 2026-01-23 10:11:21.887 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:11:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3356897344' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:11:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:11:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3356897344' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:11:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:22Z|00054|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a4:3b:96 10.100.0.12
Jan 23 05:11:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:22Z|00055|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a4:3b:96 10.100.0.12
Jan 23 05:11:22 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #53. Immutable memtables: 9.
Jan 23 05:11:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:22.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:11:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:23.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:11:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:24.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:11:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3305305687' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:11:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:11:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3305305687' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:11:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:25.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:25 np0005593234 nova_compute[227762]: 2026-01-23 10:11:25.889 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:26.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:26 np0005593234 nova_compute[227762]: 2026-01-23 10:11:26.939 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:27.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:28.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:29.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:30 np0005593234 nova_compute[227762]: 2026-01-23 10:11:30.780 227766 DEBUG nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 05:11:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:30 np0005593234 nova_compute[227762]: 2026-01-23 10:11:30.892 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:30.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:30 np0005593234 podman[287892]: 2026-01-23 10:11:30.986930735 +0000 UTC m=+0.085885898 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:11:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:31Z|00515|binding|INFO|Releasing lport b57bd565-3bb1-4ecc-8df0-a7c439ac84a6 from this chassis (sb_readonly=0)
Jan 23 05:11:31 np0005593234 podman[287892]: 2026-01-23 10:11:31.094285221 +0000 UTC m=+0.193240374 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:11:31 np0005593234 nova_compute[227762]: 2026-01-23 10:11:31.117 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:31 np0005593234 podman[288045]: 2026-01-23 10:11:31.650921018 +0000 UTC m=+0.051925295 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 05:11:31 np0005593234 podman[288045]: 2026-01-23 10:11:31.662894472 +0000 UTC m=+0.063898729 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 05:11:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:31.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:31 np0005593234 podman[288112]: 2026-01-23 10:11:31.860587484 +0000 UTC m=+0.050692416 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, release=1793, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, io.buildah.version=1.28.2, architecture=x86_64, io.openshift.tags=Ceph keepalived, name=keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph, vendor=Red Hat, Inc., version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Jan 23 05:11:31 np0005593234 podman[288112]: 2026-01-23 10:11:31.875081418 +0000 UTC m=+0.065186350 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.buildah.version=1.28.2, description=keepalived for Ceph, com.redhat.component=keepalived-container, release=1793, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived)
Jan 23 05:11:31 np0005593234 nova_compute[227762]: 2026-01-23 10:11:31.941 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:32.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:11:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:11:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:11:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:11:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:11:33 np0005593234 kernel: tap475aff6e-e5 (unregistering): left promiscuous mode
Jan 23 05:11:33 np0005593234 NetworkManager[48942]: <info>  [1769163093.0702] device (tap475aff6e-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:11:33 np0005593234 nova_compute[227762]: 2026-01-23 10:11:33.080 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:33 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:33Z|00516|binding|INFO|Releasing lport 475aff6e-e556-418f-8d36-87ae65b950ae from this chassis (sb_readonly=0)
Jan 23 05:11:33 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:33Z|00517|binding|INFO|Setting lport 475aff6e-e556-418f-8d36-87ae65b950ae down in Southbound
Jan 23 05:11:33 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:33Z|00518|binding|INFO|Removing iface tap475aff6e-e5 ovn-installed in OVS
Jan 23 05:11:33 np0005593234 nova_compute[227762]: 2026-01-23 10:11:33.083 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:33 np0005593234 nova_compute[227762]: 2026-01-23 10:11:33.192 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:33.193 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:d9:a5 10.100.0.9'], port_security=['fa:16:3e:85:d9:a5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'da6a1a46-4a6b-44a0-b5a2-35d2634865be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88b571fd-69ad-4860-a596-3bd637fdb189', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f00cc6e26e5c435b902306c6421e146d', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a7b9167c-c78b-48f5-9e9d-ac8ada29e0a2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d050303a-8173-4865-aab2-724e0c0624de, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=475aff6e-e556-418f-8d36-87ae65b950ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:11:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:33.196 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 475aff6e-e556-418f-8d36-87ae65b950ae in datapath 88b571fd-69ad-4860-a596-3bd637fdb189 unbound from our chassis#033[00m
Jan 23 05:11:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:33.197 144381 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 88b571fd-69ad-4860-a596-3bd637fdb189 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 23 05:11:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:33.200 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fcb296ba-b273-4746-a78d-588489dc0b5a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:33 np0005593234 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000083.scope: Deactivated successfully.
Jan 23 05:11:33 np0005593234 systemd[1]: machine-qemu\x2d61\x2dinstance\x2d00000083.scope: Consumed 14.532s CPU time.
Jan 23 05:11:33 np0005593234 systemd-machined[195626]: Machine qemu-61-instance-00000083 terminated.
Jan 23 05:11:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:33.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:33 np0005593234 nova_compute[227762]: 2026-01-23 10:11:33.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:11:33 np0005593234 nova_compute[227762]: 2026-01-23 10:11:33.794 227766 INFO nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Instance shutdown successfully after 13 seconds.#033[00m
Jan 23 05:11:33 np0005593234 nova_compute[227762]: 2026-01-23 10:11:33.799 227766 INFO nova.virt.libvirt.driver [-] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Instance destroyed successfully.#033[00m
Jan 23 05:11:33 np0005593234 nova_compute[227762]: 2026-01-23 10:11:33.799 227766 DEBUG nova.objects.instance [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lazy-loading 'numa_topology' on Instance uuid da6a1a46-4a6b-44a0-b5a2-35d2634865be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:33 np0005593234 nova_compute[227762]: 2026-01-23 10:11:33.839 227766 INFO nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Attempting rescue#033[00m
Jan 23 05:11:33 np0005593234 nova_compute[227762]: 2026-01-23 10:11:33.840 227766 DEBUG nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 23 05:11:33 np0005593234 nova_compute[227762]: 2026-01-23 10:11:33.844 227766 DEBUG nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 05:11:33 np0005593234 nova_compute[227762]: 2026-01-23 10:11:33.845 227766 INFO nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Creating image(s)#033[00m
Jan 23 05:11:33 np0005593234 nova_compute[227762]: 2026-01-23 10:11:33.879 227766 DEBUG nova.storage.rbd_utils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] rbd image da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:11:33 np0005593234 nova_compute[227762]: 2026-01-23 10:11:33.882 227766 DEBUG nova.objects.instance [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lazy-loading 'trusted_certs' on Instance uuid da6a1a46-4a6b-44a0-b5a2-35d2634865be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:34 np0005593234 nova_compute[227762]: 2026-01-23 10:11:34.282 227766 DEBUG nova.storage.rbd_utils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] rbd image da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:11:34 np0005593234 nova_compute[227762]: 2026-01-23 10:11:34.307 227766 DEBUG nova.storage.rbd_utils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] rbd image da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:11:34 np0005593234 nova_compute[227762]: 2026-01-23 10:11:34.311 227766 DEBUG oslo_concurrency.processutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:11:34 np0005593234 nova_compute[227762]: 2026-01-23 10:11:34.397 227766 DEBUG oslo_concurrency.processutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:11:34 np0005593234 nova_compute[227762]: 2026-01-23 10:11:34.399 227766 DEBUG oslo_concurrency.lockutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:11:34 np0005593234 nova_compute[227762]: 2026-01-23 10:11:34.399 227766 DEBUG oslo_concurrency.lockutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:11:34 np0005593234 nova_compute[227762]: 2026-01-23 10:11:34.400 227766 DEBUG oslo_concurrency.lockutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:11:34 np0005593234 nova_compute[227762]: 2026-01-23 10:11:34.432 227766 DEBUG nova.storage.rbd_utils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] rbd image da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:11:34 np0005593234 nova_compute[227762]: 2026-01-23 10:11:34.436 227766 DEBUG oslo_concurrency.processutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:11:34 np0005593234 nova_compute[227762]: 2026-01-23 10:11:34.623 227766 DEBUG oslo_concurrency.lockutils [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:11:34 np0005593234 nova_compute[227762]: 2026-01-23 10:11:34.624 227766 DEBUG oslo_concurrency.lockutils [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:11:34 np0005593234 nova_compute[227762]: 2026-01-23 10:11:34.645 227766 DEBUG nova.objects.instance [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'flavor' on Instance uuid a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:11:34 np0005593234 nova_compute[227762]: 2026-01-23 10:11:34.712 227766 DEBUG oslo_concurrency.lockutils [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:11:34 np0005593234 nova_compute[227762]: 2026-01-23 10:11:34.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:11:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:34.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.003 227766 DEBUG oslo_concurrency.processutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.566s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.003 227766 DEBUG nova.objects.instance [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lazy-loading 'migration_context' on Instance uuid da6a1a46-4a6b-44a0-b5a2-35d2634865be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.163 227766 DEBUG nova.compute.manager [req-eaef0a22-3b48-4c6a-9ed5-b88a995f5c42 req-728e86cf-a70f-45c0-90ad-d0c209453d85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-vif-unplugged-475aff6e-e556-418f-8d36-87ae65b950ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.164 227766 DEBUG oslo_concurrency.lockutils [req-eaef0a22-3b48-4c6a-9ed5-b88a995f5c42 req-728e86cf-a70f-45c0-90ad-d0c209453d85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.164 227766 DEBUG oslo_concurrency.lockutils [req-eaef0a22-3b48-4c6a-9ed5-b88a995f5c42 req-728e86cf-a70f-45c0-90ad-d0c209453d85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.164 227766 DEBUG oslo_concurrency.lockutils [req-eaef0a22-3b48-4c6a-9ed5-b88a995f5c42 req-728e86cf-a70f-45c0-90ad-d0c209453d85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.164 227766 DEBUG nova.compute.manager [req-eaef0a22-3b48-4c6a-9ed5-b88a995f5c42 req-728e86cf-a70f-45c0-90ad-d0c209453d85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] No waiting events found dispatching network-vif-unplugged-475aff6e-e556-418f-8d36-87ae65b950ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.165 227766 WARNING nova.compute.manager [req-eaef0a22-3b48-4c6a-9ed5-b88a995f5c42 req-728e86cf-a70f-45c0-90ad-d0c209453d85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received unexpected event network-vif-unplugged-475aff6e-e556-418f-8d36-87ae65b950ae for instance with vm_state active and task_state rescuing.
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.183 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.183 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.184 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.184 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.184 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.212 227766 DEBUG nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.213 227766 DEBUG nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Start _get_guest_xml network_info=[{"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1616424882-network", "vif_mac": "fa:16:3e:85:d9:a5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.214 227766 DEBUG nova.objects.instance [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lazy-loading 'resources' on Instance uuid da6a1a46-4a6b-44a0-b5a2-35d2634865be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.232 227766 WARNING nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.239 227766 DEBUG nova.virt.libvirt.host [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.239 227766 DEBUG nova.virt.libvirt.host [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.253 227766 DEBUG nova.virt.libvirt.host [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.254 227766 DEBUG nova.virt.libvirt.host [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.255 227766 DEBUG nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.255 227766 DEBUG nova.virt.hardware [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.256 227766 DEBUG nova.virt.hardware [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.256 227766 DEBUG nova.virt.hardware [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.256 227766 DEBUG nova.virt.hardware [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.257 227766 DEBUG nova.virt.hardware [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.257 227766 DEBUG nova.virt.hardware [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.257 227766 DEBUG nova.virt.hardware [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.321 227766 DEBUG nova.virt.hardware [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.322 227766 DEBUG nova.virt.hardware [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.322 227766 DEBUG nova.virt.hardware [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.322 227766 DEBUG nova.virt.hardware [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.322 227766 DEBUG nova.objects.instance [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lazy-loading 'vcpu_model' on Instance uuid da6a1a46-4a6b-44a0-b5a2-35d2634865be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.325 227766 DEBUG oslo_concurrency.lockutils [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.326 227766 DEBUG oslo_concurrency.lockutils [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.326 227766 INFO nova.compute.manager [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Attaching volume 649f8ce8-126a-4838-b42c-047bd1f41e67 to /dev/vdb
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.355 227766 DEBUG oslo_concurrency.processutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.584 227766 DEBUG os_brick.utils [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.585 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.600 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.601 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[e7bbdd7f-b0cd-4820-b8da-03503c0308a5]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.602 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.611 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.611 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[dcd3a178-339b-495b-b6dc-589a806099d3]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.613 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.622 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.623 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[397b49aa-f555-4c09-b839-5759cecaef20]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.625 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[bfeb41c9-067a-44cb-9cf1-4ef2bb4e49cf]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.626 227766 DEBUG oslo_concurrency.processutils [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.654 227766 DEBUG oslo_concurrency.processutils [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "nvme version" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.657 227766 DEBUG os_brick.initiator.connectors.lightos [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.657 227766 DEBUG os_brick.initiator.connectors.lightos [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.657 227766 DEBUG os_brick.initiator.connectors.lightos [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.658 227766 DEBUG os_brick.utils [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] <== get_connector_properties: return (73ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.658 227766 DEBUG nova.virt.block_device [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updating existing volume attachment record: 639ff627-f484-487d-bfbf-d7cfb9061a31 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631
Jan 23 05:11:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:11:35 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/18361021' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:11:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:35.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.716 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.532s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:11:35 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2342130159' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.826 227766 DEBUG oslo_concurrency.processutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.827 227766 DEBUG oslo_concurrency.processutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:35 np0005593234 nova_compute[227762]: 2026-01-23 10:11:35.895 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:11:36 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4293169983' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.270 227766 DEBUG oslo_concurrency.processutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.271 227766 DEBUG oslo_concurrency.processutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.317 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000007f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.318 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000007f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.321 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000082 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.321 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000082 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.324 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000083 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.324 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000083 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.509 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.510 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3932MB free_disk=20.739856719970703GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.510 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.511 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:11:36 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3722494106' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.760 227766 DEBUG oslo_concurrency.processutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.489s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.761 227766 DEBUG nova.virt.libvirt.vif [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:10:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1707842336',display_name='tempest-ServerRescueTestJSON-server-1707842336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1707842336',id=131,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:11:10Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f00cc6e26e5c435b902306c6421e146d',ramdisk_id='',reservation_id='r-g6qwsh4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_m
in_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-837476510',owner_user_name='tempest-ServerRescueTestJSON-837476510-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:11:10Z,user_data=None,user_id='eb500aabc93044e380f4bc905205803d',uuid=da6a1a46-4a6b-44a0-b5a2-35d2634865be,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1616424882-network", "vif_mac": "fa:16:3e:85:d9:a5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.762 227766 DEBUG nova.network.os_vif_util [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Converting VIF {"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueTestJSON-1616424882-network", "vif_mac": "fa:16:3e:85:d9:a5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.762 227766 DEBUG nova.network.os_vif_util [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:85:d9:a5,bridge_name='br-int',has_traffic_filtering=True,id=475aff6e-e556-418f-8d36-87ae65b950ae,network=Network(88b571fd-69ad-4860-a596-3bd637fdb189),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap475aff6e-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.763 227766 DEBUG nova.objects.instance [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lazy-loading 'pci_devices' on Instance uuid da6a1a46-4a6b-44a0-b5a2-35d2634865be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.856 227766 DEBUG nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  <uuid>da6a1a46-4a6b-44a0-b5a2-35d2634865be</uuid>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  <name>instance-00000083</name>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerRescueTestJSON-server-1707842336</nova:name>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:11:35</nova:creationTime>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <nova:user uuid="eb500aabc93044e380f4bc905205803d">tempest-ServerRescueTestJSON-837476510-project-member</nova:user>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <nova:project uuid="f00cc6e26e5c435b902306c6421e146d">tempest-ServerRescueTestJSON-837476510</nova:project>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <nova:port uuid="475aff6e-e556-418f-8d36-87ae65b950ae">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <entry name="serial">da6a1a46-4a6b-44a0-b5a2-35d2634865be</entry>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <entry name="uuid">da6a1a46-4a6b-44a0-b5a2-35d2634865be</entry>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.rescue">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <target dev="vdb" bus="virtio"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.config.rescue">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:85:d9:a5"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <target dev="tap475aff6e-e5"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/console.log" append="off"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:11:36 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:11:36 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:11:36 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:11:36 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.862 227766 INFO nova.virt.libvirt.driver [-] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Instance destroyed successfully.#033[00m
Jan 23 05:11:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:11:36 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2214821569' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:11:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:11:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:36.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.951 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.952 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.952 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance da6a1a46-4a6b-44a0-b5a2-35d2634865be actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.952 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.952 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:11:36 np0005593234 nova_compute[227762]: 2026-01-23 10:11:36.975 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.047 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.163 227766 DEBUG nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.163 227766 DEBUG nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.163 227766 DEBUG nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.164 227766 DEBUG nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] No VIF found with MAC fa:16:3e:85:d9:a5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.164 227766 INFO nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Using config drive#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.193 227766 DEBUG nova.storage.rbd_utils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] rbd image da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.277 227766 DEBUG nova.objects.instance [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'flavor' on Instance uuid a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.514 227766 DEBUG nova.compute.manager [req-830ea27d-a6c6-498a-ba1c-8ed4c0e3702f req-b1d50001-0eab-451d-b169-36968ff9e637 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.514 227766 DEBUG oslo_concurrency.lockutils [req-830ea27d-a6c6-498a-ba1c-8ed4c0e3702f req-b1d50001-0eab-451d-b169-36968ff9e637 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.514 227766 DEBUG oslo_concurrency.lockutils [req-830ea27d-a6c6-498a-ba1c-8ed4c0e3702f req-b1d50001-0eab-451d-b169-36968ff9e637 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.517 227766 DEBUG oslo_concurrency.lockutils [req-830ea27d-a6c6-498a-ba1c-8ed4c0e3702f req-b1d50001-0eab-451d-b169-36968ff9e637 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.517 227766 DEBUG nova.compute.manager [req-830ea27d-a6c6-498a-ba1c-8ed4c0e3702f req-b1d50001-0eab-451d-b169-36968ff9e637 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] No waiting events found dispatching network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.518 227766 WARNING nova.compute.manager [req-830ea27d-a6c6-498a-ba1c-8ed4c0e3702f req-b1d50001-0eab-451d-b169-36968ff9e637 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received unexpected event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:11:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:11:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3399301070' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.525 227766 DEBUG nova.objects.instance [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lazy-loading 'ec2_ids' on Instance uuid da6a1a46-4a6b-44a0-b5a2-35d2634865be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.532 227766 DEBUG nova.virt.libvirt.driver [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Attempting to attach volume 649f8ce8-126a-4838-b42c-047bd1f41e67 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.534 227766 DEBUG nova.virt.libvirt.guest [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:11:37 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:11:37 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-649f8ce8-126a-4838-b42c-047bd1f41e67">
Jan 23 05:11:37 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:11:37 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:11:37 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:11:37 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:11:37 np0005593234 nova_compute[227762]:  <auth username="openstack">
Jan 23 05:11:37 np0005593234 nova_compute[227762]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:11:37 np0005593234 nova_compute[227762]:  </auth>
Jan 23 05:11:37 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:11:37 np0005593234 nova_compute[227762]:  <serial>649f8ce8-126a-4838-b42c-047bd1f41e67</serial>
Jan 23 05:11:37 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:11:37 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.537 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.544 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.555 227766 DEBUG nova.objects.instance [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lazy-loading 'keypairs' on Instance uuid da6a1a46-4a6b-44a0-b5a2-35d2634865be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.673 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:11:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:37.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.716 227766 DEBUG nova.virt.libvirt.driver [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.717 227766 DEBUG nova.virt.libvirt.driver [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.717 227766 DEBUG nova.virt.libvirt.driver [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.717 227766 DEBUG nova.virt.libvirt.driver [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No VIF found with MAC fa:16:3e:a4:3b:96, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.809 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:11:37 np0005593234 nova_compute[227762]: 2026-01-23 10:11:37.810 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:38 np0005593234 nova_compute[227762]: 2026-01-23 10:11:38.255 227766 DEBUG oslo_concurrency.lockutils [None req-deea3c18-1411-436d-9696-6e1afd1f978f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 2.929s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:38 np0005593234 nova_compute[227762]: 2026-01-23 10:11:38.533 227766 INFO nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Creating config drive at /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/disk.config.rescue#033[00m
Jan 23 05:11:38 np0005593234 nova_compute[227762]: 2026-01-23 10:11:38.537 227766 DEBUG oslo_concurrency.processutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpel5u2thu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:38 np0005593234 nova_compute[227762]: 2026-01-23 10:11:38.667 227766 DEBUG oslo_concurrency.processutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpel5u2thu" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:38 np0005593234 nova_compute[227762]: 2026-01-23 10:11:38.695 227766 DEBUG nova.storage.rbd_utils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] rbd image da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:11:38 np0005593234 nova_compute[227762]: 2026-01-23 10:11:38.700 227766 DEBUG oslo_concurrency.processutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/disk.config.rescue da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:38 np0005593234 podman[288567]: 2026-01-23 10:11:38.796498732 +0000 UTC m=+0.088475347 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 05:11:38 np0005593234 nova_compute[227762]: 2026-01-23 10:11:38.871 227766 DEBUG oslo_concurrency.processutils [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/disk.config.rescue da6a1a46-4a6b-44a0-b5a2-35d2634865be_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:38 np0005593234 nova_compute[227762]: 2026-01-23 10:11:38.872 227766 INFO nova.virt.libvirt.driver [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Deleting local config drive /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be/disk.config.rescue because it was imported into RBD.#033[00m
Jan 23 05:11:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:38.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:38 np0005593234 kernel: tap475aff6e-e5: entered promiscuous mode
Jan 23 05:11:38 np0005593234 NetworkManager[48942]: <info>  [1769163098.9195] manager: (tap475aff6e-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/269)
Jan 23 05:11:38 np0005593234 nova_compute[227762]: 2026-01-23 10:11:38.918 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:38Z|00519|binding|INFO|Claiming lport 475aff6e-e556-418f-8d36-87ae65b950ae for this chassis.
Jan 23 05:11:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:38Z|00520|binding|INFO|475aff6e-e556-418f-8d36-87ae65b950ae: Claiming fa:16:3e:85:d9:a5 10.100.0.9
Jan 23 05:11:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:38.967 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:d9:a5 10.100.0.9'], port_security=['fa:16:3e:85:d9:a5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'da6a1a46-4a6b-44a0-b5a2-35d2634865be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88b571fd-69ad-4860-a596-3bd637fdb189', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f00cc6e26e5c435b902306c6421e146d', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'a7b9167c-c78b-48f5-9e9d-ac8ada29e0a2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d050303a-8173-4865-aab2-724e0c0624de, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=475aff6e-e556-418f-8d36-87ae65b950ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:11:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:38.968 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 475aff6e-e556-418f-8d36-87ae65b950ae in datapath 88b571fd-69ad-4860-a596-3bd637fdb189 bound to our chassis#033[00m
Jan 23 05:11:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:38.969 144381 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 88b571fd-69ad-4860-a596-3bd637fdb189 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 23 05:11:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:38.970 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3549ab24-bba1-4ac3-8135-4db4cd709e7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:38Z|00521|binding|INFO|Setting lport 475aff6e-e556-418f-8d36-87ae65b950ae up in Southbound
Jan 23 05:11:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:38Z|00522|binding|INFO|Setting lport 475aff6e-e556-418f-8d36-87ae65b950ae ovn-installed in OVS
Jan 23 05:11:38 np0005593234 nova_compute[227762]: 2026-01-23 10:11:38.973 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:38 np0005593234 nova_compute[227762]: 2026-01-23 10:11:38.977 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:38 np0005593234 systemd-machined[195626]: New machine qemu-62-instance-00000083.
Jan 23 05:11:38 np0005593234 systemd-udevd[288622]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:11:39 np0005593234 systemd[1]: Started Virtual Machine qemu-62-instance-00000083.
Jan 23 05:11:39 np0005593234 NetworkManager[48942]: <info>  [1769163099.0109] device (tap475aff6e-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:11:39 np0005593234 NetworkManager[48942]: <info>  [1769163099.0120] device (tap475aff6e-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:11:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:11:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:11:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:39.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:39 np0005593234 nova_compute[227762]: 2026-01-23 10:11:39.808 227766 DEBUG nova.compute.manager [req-fb686127-7889-4915-9181-089dc6b2f388 req-8c243458-dd9e-4474-87fb-ce51898faff1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:39 np0005593234 nova_compute[227762]: 2026-01-23 10:11:39.808 227766 DEBUG oslo_concurrency.lockutils [req-fb686127-7889-4915-9181-089dc6b2f388 req-8c243458-dd9e-4474-87fb-ce51898faff1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:39 np0005593234 nova_compute[227762]: 2026-01-23 10:11:39.809 227766 DEBUG oslo_concurrency.lockutils [req-fb686127-7889-4915-9181-089dc6b2f388 req-8c243458-dd9e-4474-87fb-ce51898faff1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:39 np0005593234 nova_compute[227762]: 2026-01-23 10:11:39.809 227766 DEBUG oslo_concurrency.lockutils [req-fb686127-7889-4915-9181-089dc6b2f388 req-8c243458-dd9e-4474-87fb-ce51898faff1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:39 np0005593234 nova_compute[227762]: 2026-01-23 10:11:39.809 227766 DEBUG nova.compute.manager [req-fb686127-7889-4915-9181-089dc6b2f388 req-8c243458-dd9e-4474-87fb-ce51898faff1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] No waiting events found dispatching network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:39 np0005593234 nova_compute[227762]: 2026-01-23 10:11:39.809 227766 WARNING nova.compute.manager [req-fb686127-7889-4915-9181-089dc6b2f388 req-8c243458-dd9e-4474-87fb-ce51898faff1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received unexpected event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:11:40 np0005593234 nova_compute[227762]: 2026-01-23 10:11:40.246 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for da6a1a46-4a6b-44a0-b5a2-35d2634865be due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 05:11:40 np0005593234 nova_compute[227762]: 2026-01-23 10:11:40.246 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163100.245192, da6a1a46-4a6b-44a0-b5a2-35d2634865be => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:11:40 np0005593234 nova_compute[227762]: 2026-01-23 10:11:40.246 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] VM Resumed (Lifecycle Event)
Jan 23 05:11:40 np0005593234 nova_compute[227762]: 2026-01-23 10:11:40.252 227766 DEBUG nova.compute.manager [None req-8357369c-7957-42a8-af9e-94e3e247abdb eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:11:40 np0005593234 nova_compute[227762]: 2026-01-23 10:11:40.292 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:11:40 np0005593234 nova_compute[227762]: 2026-01-23 10:11:40.295 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:11:40 np0005593234 nova_compute[227762]: 2026-01-23 10:11:40.353 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] During sync_power_state the instance has a pending task (rescuing). Skip.
Jan 23 05:11:40 np0005593234 nova_compute[227762]: 2026-01-23 10:11:40.354 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163100.2464793, da6a1a46-4a6b-44a0-b5a2-35d2634865be => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:11:40 np0005593234 nova_compute[227762]: 2026-01-23 10:11:40.354 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] VM Started (Lifecycle Event)
Jan 23 05:11:40 np0005593234 nova_compute[227762]: 2026-01-23 10:11:40.390 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:11:40 np0005593234 nova_compute[227762]: 2026-01-23 10:11:40.393 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:11:40 np0005593234 nova_compute[227762]: 2026-01-23 10:11:40.899 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:40.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:41.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #115. Immutable memtables: 0.
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:41.883897) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 71] Flushing memtable with next log file: 115
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163101883959, "job": 71, "event": "flush_started", "num_memtables": 1, "num_entries": 1697, "num_deletes": 261, "total_data_size": 3686027, "memory_usage": 3751568, "flush_reason": "Manual Compaction"}
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 71] Level-0 flush table #116: started
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163101902954, "cf_name": "default", "job": 71, "event": "table_file_creation", "file_number": 116, "file_size": 2420708, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 57217, "largest_seqno": 58908, "table_properties": {"data_size": 2413520, "index_size": 4131, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 15848, "raw_average_key_size": 20, "raw_value_size": 2398786, "raw_average_value_size": 3087, "num_data_blocks": 180, "num_entries": 777, "num_filter_entries": 777, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162973, "oldest_key_time": 1769162973, "file_creation_time": 1769163101, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 116, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 71] Flush lasted 19126 microseconds, and 6048 cpu microseconds.
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:41.903025) [db/flush_job.cc:967] [default] [JOB 71] Level-0 flush table #116: 2420708 bytes OK
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:41.903047) [db/memtable_list.cc:519] [default] Level-0 commit table #116 started
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:41.905563) [db/memtable_list.cc:722] [default] Level-0 commit table #116: memtable #1 done
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:41.905597) EVENT_LOG_v1 {"time_micros": 1769163101905590, "job": 71, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:41.905616) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 71] Try to delete WAL files size 3678087, prev total WAL file size 3678087, number of live WAL files 2.
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000112.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:41.906766) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032303036' seq:72057594037927935, type:22 .. '6C6F676D0032323630' seq:0, type:0; will stop at (end)
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 72] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 71 Base level 0, inputs: [116(2363KB)], [114(10212KB)]
Jan 23 05:11:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163101906890, "job": 72, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [116], "files_L6": [114], "score": -1, "input_data_size": 12878784, "oldest_snapshot_seqno": -1}
Jan 23 05:11:41 np0005593234 nova_compute[227762]: 2026-01-23 10:11:41.977 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 72] Generated table #117: 8152 keys, 12736953 bytes, temperature: kUnknown
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163102005228, "cf_name": "default", "job": 72, "event": "table_file_creation", "file_number": 117, "file_size": 12736953, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12682182, "index_size": 33283, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20421, "raw_key_size": 211204, "raw_average_key_size": 25, "raw_value_size": 12536736, "raw_average_value_size": 1537, "num_data_blocks": 1314, "num_entries": 8152, "num_filter_entries": 8152, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769163101, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 117, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:42.005720) [db/compaction/compaction_job.cc:1663] [default] [JOB 72] Compacted 1@0 + 1@6 files to L6 => 12736953 bytes
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:42.007142) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 130.6 rd, 129.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 10.0 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(10.6) write-amplify(5.3) OK, records in: 8692, records dropped: 540 output_compression: NoCompression
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:42.007163) EVENT_LOG_v1 {"time_micros": 1769163102007154, "job": 72, "event": "compaction_finished", "compaction_time_micros": 98592, "compaction_time_cpu_micros": 28087, "output_level": 6, "num_output_files": 1, "total_output_size": 12736953, "num_input_records": 8692, "num_output_records": 8152, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000116.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163102007594, "job": 72, "event": "table_file_deletion", "file_number": 116}
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000114.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163102009283, "job": 72, "event": "table_file_deletion", "file_number": 114}
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:41.906627) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:42.009340) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:42.009344) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:42.009346) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:42.009347) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:11:42.009349) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:11:42 np0005593234 nova_compute[227762]: 2026-01-23 10:11:42.013 227766 DEBUG nova.compute.manager [req-64554780-3850-472a-a8ec-73ccdd724b27 req-1a3a8c09-306a-42f9-b4ac-6254ad8199a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:11:42 np0005593234 nova_compute[227762]: 2026-01-23 10:11:42.014 227766 DEBUG oslo_concurrency.lockutils [req-64554780-3850-472a-a8ec-73ccdd724b27 req-1a3a8c09-306a-42f9-b4ac-6254ad8199a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:11:42 np0005593234 nova_compute[227762]: 2026-01-23 10:11:42.014 227766 DEBUG oslo_concurrency.lockutils [req-64554780-3850-472a-a8ec-73ccdd724b27 req-1a3a8c09-306a-42f9-b4ac-6254ad8199a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:11:42 np0005593234 nova_compute[227762]: 2026-01-23 10:11:42.015 227766 DEBUG oslo_concurrency.lockutils [req-64554780-3850-472a-a8ec-73ccdd724b27 req-1a3a8c09-306a-42f9-b4ac-6254ad8199a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:11:42 np0005593234 nova_compute[227762]: 2026-01-23 10:11:42.015 227766 DEBUG nova.compute.manager [req-64554780-3850-472a-a8ec-73ccdd724b27 req-1a3a8c09-306a-42f9-b4ac-6254ad8199a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] No waiting events found dispatching network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:11:42 np0005593234 nova_compute[227762]: 2026-01-23 10:11:42.015 227766 WARNING nova.compute.manager [req-64554780-3850-472a-a8ec-73ccdd724b27 req-1a3a8c09-306a-42f9-b4ac-6254ad8199a8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received unexpected event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae for instance with vm_state rescued and task_state None.
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/401260143' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:11:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/401260143' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:11:42 np0005593234 nova_compute[227762]: 2026-01-23 10:11:42.668 227766 INFO nova.compute.manager [None req-6616c0da-e872-4823-b0f8-5792fad4c8d3 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Unrescuing
Jan 23 05:11:42 np0005593234 nova_compute[227762]: 2026-01-23 10:11:42.669 227766 DEBUG oslo_concurrency.lockutils [None req-6616c0da-e872-4823-b0f8-5792fad4c8d3 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquiring lock "refresh_cache-da6a1a46-4a6b-44a0-b5a2-35d2634865be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:11:42 np0005593234 nova_compute[227762]: 2026-01-23 10:11:42.669 227766 DEBUG oslo_concurrency.lockutils [None req-6616c0da-e872-4823-b0f8-5792fad4c8d3 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquired lock "refresh_cache-da6a1a46-4a6b-44a0-b5a2-35d2634865be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:11:42 np0005593234 nova_compute[227762]: 2026-01-23 10:11:42.669 227766 DEBUG nova.network.neutron [None req-6616c0da-e872-4823-b0f8-5792fad4c8d3 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 05:11:42 np0005593234 nova_compute[227762]: 2026-01-23 10:11:42.810 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:11:42 np0005593234 nova_compute[227762]: 2026-01-23 10:11:42.811 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:11:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:42.848 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:11:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:42.848 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:11:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:42.849 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:11:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:42.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:43 np0005593234 nova_compute[227762]: 2026-01-23 10:11:43.684 227766 DEBUG oslo_concurrency.lockutils [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:11:43 np0005593234 nova_compute[227762]: 2026-01-23 10:11:43.686 227766 DEBUG oslo_concurrency.lockutils [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquired lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:11:43 np0005593234 nova_compute[227762]: 2026-01-23 10:11:43.688 227766 DEBUG nova.network.neutron [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 05:11:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:43.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:43 np0005593234 nova_compute[227762]: 2026-01-23 10:11:43.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:11:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:11:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/412641226' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:11:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:11:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/412641226' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:11:44 np0005593234 nova_compute[227762]: 2026-01-23 10:11:44.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:11:44 np0005593234 nova_compute[227762]: 2026-01-23 10:11:44.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:11:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:11:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:44.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:11:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:45.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:45 np0005593234 nova_compute[227762]: 2026-01-23 10:11:45.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:11:45 np0005593234 nova_compute[227762]: 2026-01-23 10:11:45.902 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:46 np0005593234 nova_compute[227762]: 2026-01-23 10:11:46.132 227766 DEBUG nova.network.neutron [None req-6616c0da-e872-4823-b0f8-5792fad4c8d3 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Updating instance_info_cache with network_info: [{"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:11:46 np0005593234 nova_compute[227762]: 2026-01-23 10:11:46.163 227766 DEBUG oslo_concurrency.lockutils [None req-6616c0da-e872-4823-b0f8-5792fad4c8d3 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Releasing lock "refresh_cache-da6a1a46-4a6b-44a0-b5a2-35d2634865be" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:11:46 np0005593234 nova_compute[227762]: 2026-01-23 10:11:46.164 227766 DEBUG nova.objects.instance [None req-6616c0da-e872-4823-b0f8-5792fad4c8d3 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lazy-loading 'flavor' on Instance uuid da6a1a46-4a6b-44a0-b5a2-35d2634865be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:11:46 np0005593234 kernel: tap475aff6e-e5 (unregistering): left promiscuous mode
Jan 23 05:11:46 np0005593234 NetworkManager[48942]: <info>  [1769163106.2442] device (tap475aff6e-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:11:46 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:46Z|00523|binding|INFO|Releasing lport 475aff6e-e556-418f-8d36-87ae65b950ae from this chassis (sb_readonly=0)
Jan 23 05:11:46 np0005593234 nova_compute[227762]: 2026-01-23 10:11:46.252 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:46 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:46Z|00524|binding|INFO|Setting lport 475aff6e-e556-418f-8d36-87ae65b950ae down in Southbound
Jan 23 05:11:46 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:46Z|00525|binding|INFO|Removing iface tap475aff6e-e5 ovn-installed in OVS
Jan 23 05:11:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:46.265 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:d9:a5 10.100.0.9'], port_security=['fa:16:3e:85:d9:a5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'da6a1a46-4a6b-44a0-b5a2-35d2634865be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88b571fd-69ad-4860-a596-3bd637fdb189', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f00cc6e26e5c435b902306c6421e146d', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'a7b9167c-c78b-48f5-9e9d-ac8ada29e0a2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d050303a-8173-4865-aab2-724e0c0624de, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=475aff6e-e556-418f-8d36-87ae65b950ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:11:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:46.266 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 475aff6e-e556-418f-8d36-87ae65b950ae in datapath 88b571fd-69ad-4860-a596-3bd637fdb189 unbound from our chassis
Jan 23 05:11:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:46.267 144381 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 88b571fd-69ad-4860-a596-3bd637fdb189 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Jan 23 05:11:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:46.268 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[10f0135c-876d-4a51-8d48-605110c876cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:11:46 np0005593234 nova_compute[227762]: 2026-01-23 10:11:46.270 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:11:46 np0005593234 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000083.scope: Deactivated successfully.
Jan 23 05:11:46 np0005593234 systemd[1]: machine-qemu\x2d62\x2dinstance\x2d00000083.scope: Consumed 7.063s CPU time.
Jan 23 05:11:46 np0005593234 systemd-machined[195626]: Machine qemu-62-instance-00000083 terminated.
Jan 23 05:11:46 np0005593234 nova_compute[227762]: 2026-01-23 10:11:46.414 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:46 np0005593234 nova_compute[227762]: 2026-01-23 10:11:46.421 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:46 np0005593234 nova_compute[227762]: 2026-01-23 10:11:46.430 227766 INFO nova.virt.libvirt.driver [-] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Instance destroyed successfully.#033[00m
Jan 23 05:11:46 np0005593234 nova_compute[227762]: 2026-01-23 10:11:46.431 227766 DEBUG nova.objects.instance [None req-6616c0da-e872-4823-b0f8-5792fad4c8d3 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lazy-loading 'numa_topology' on Instance uuid da6a1a46-4a6b-44a0-b5a2-35d2634865be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:46 np0005593234 kernel: tap475aff6e-e5: entered promiscuous mode
Jan 23 05:11:46 np0005593234 systemd-udevd[288748]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:11:46 np0005593234 NetworkManager[48942]: <info>  [1769163106.5306] manager: (tap475aff6e-e5): new Tun device (/org/freedesktop/NetworkManager/Devices/270)
Jan 23 05:11:46 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:46Z|00526|binding|INFO|Claiming lport 475aff6e-e556-418f-8d36-87ae65b950ae for this chassis.
Jan 23 05:11:46 np0005593234 nova_compute[227762]: 2026-01-23 10:11:46.534 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:46 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:46Z|00527|binding|INFO|475aff6e-e556-418f-8d36-87ae65b950ae: Claiming fa:16:3e:85:d9:a5 10.100.0.9
Jan 23 05:11:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:46.542 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:d9:a5 10.100.0.9'], port_security=['fa:16:3e:85:d9:a5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'da6a1a46-4a6b-44a0-b5a2-35d2634865be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88b571fd-69ad-4860-a596-3bd637fdb189', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f00cc6e26e5c435b902306c6421e146d', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'a7b9167c-c78b-48f5-9e9d-ac8ada29e0a2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d050303a-8173-4865-aab2-724e0c0624de, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=475aff6e-e556-418f-8d36-87ae65b950ae) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:11:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:46.543 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 475aff6e-e556-418f-8d36-87ae65b950ae in datapath 88b571fd-69ad-4860-a596-3bd637fdb189 bound to our chassis#033[00m
Jan 23 05:11:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:46.544 144381 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 88b571fd-69ad-4860-a596-3bd637fdb189 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 23 05:11:46 np0005593234 NetworkManager[48942]: <info>  [1769163106.5466] device (tap475aff6e-e5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:11:46 np0005593234 NetworkManager[48942]: <info>  [1769163106.5473] device (tap475aff6e-e5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:11:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:46.545 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b4f75ce0-f1ce-4394-8413-77d8697ea7cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:46 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:46Z|00528|binding|INFO|Setting lport 475aff6e-e556-418f-8d36-87ae65b950ae up in Southbound
Jan 23 05:11:46 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:46Z|00529|binding|INFO|Setting lport 475aff6e-e556-418f-8d36-87ae65b950ae ovn-installed in OVS
Jan 23 05:11:46 np0005593234 nova_compute[227762]: 2026-01-23 10:11:46.553 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:46 np0005593234 nova_compute[227762]: 2026-01-23 10:11:46.559 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:46 np0005593234 systemd-machined[195626]: New machine qemu-63-instance-00000083.
Jan 23 05:11:46 np0005593234 systemd[1]: Started Virtual Machine qemu-63-instance-00000083.
Jan 23 05:11:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:46.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:46 np0005593234 nova_compute[227762]: 2026-01-23 10:11:46.978 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.094 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for da6a1a46-4a6b-44a0-b5a2-35d2634865be due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.095 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163107.0937512, da6a1a46-4a6b-44a0-b5a2-35d2634865be => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.095 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.132 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.136 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.168 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.170 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163107.094775, da6a1a46-4a6b-44a0-b5a2-35d2634865be => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.171 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] VM Started (Lifecycle Event)#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.198 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.202 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.231 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.329 227766 DEBUG nova.compute.manager [req-a30daff2-dc5d-4b51-bfd5-e98e25fe3202 req-639db863-c95e-405a-9f9b-5a6cc35c746d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-vif-unplugged-475aff6e-e556-418f-8d36-87ae65b950ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.329 227766 DEBUG oslo_concurrency.lockutils [req-a30daff2-dc5d-4b51-bfd5-e98e25fe3202 req-639db863-c95e-405a-9f9b-5a6cc35c746d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.330 227766 DEBUG oslo_concurrency.lockutils [req-a30daff2-dc5d-4b51-bfd5-e98e25fe3202 req-639db863-c95e-405a-9f9b-5a6cc35c746d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.330 227766 DEBUG oslo_concurrency.lockutils [req-a30daff2-dc5d-4b51-bfd5-e98e25fe3202 req-639db863-c95e-405a-9f9b-5a6cc35c746d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.330 227766 DEBUG nova.compute.manager [req-a30daff2-dc5d-4b51-bfd5-e98e25fe3202 req-639db863-c95e-405a-9f9b-5a6cc35c746d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] No waiting events found dispatching network-vif-unplugged-475aff6e-e556-418f-8d36-87ae65b950ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:47 np0005593234 nova_compute[227762]: 2026-01-23 10:11:47.330 227766 WARNING nova.compute.manager [req-a30daff2-dc5d-4b51-bfd5-e98e25fe3202 req-639db863-c95e-405a-9f9b-5a6cc35c746d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received unexpected event network-vif-unplugged-475aff6e-e556-418f-8d36-87ae65b950ae for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 23 05:11:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:11:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:47.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:11:47 np0005593234 podman[288839]: 2026-01-23 10:11:47.786510954 +0000 UTC m=+0.082865372 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:11:48 np0005593234 nova_compute[227762]: 2026-01-23 10:11:48.091 227766 DEBUG nova.network.neutron [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updating instance_info_cache with network_info: [{"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:11:48 np0005593234 nova_compute[227762]: 2026-01-23 10:11:48.105 227766 DEBUG nova.compute.manager [None req-6616c0da-e872-4823-b0f8-5792fad4c8d3 eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:11:48 np0005593234 nova_compute[227762]: 2026-01-23 10:11:48.158 227766 DEBUG oslo_concurrency.lockutils [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Releasing lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:11:48 np0005593234 nova_compute[227762]: 2026-01-23 10:11:48.334 227766 DEBUG nova.virt.libvirt.driver [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Jan 23 05:11:48 np0005593234 nova_compute[227762]: 2026-01-23 10:11:48.335 227766 DEBUG nova.virt.libvirt.volume.remotefs [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Creating file /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c/1a01220203ab47e489fddfb857e73ea3.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Jan 23 05:11:48 np0005593234 nova_compute[227762]: 2026-01-23 10:11:48.335 227766 DEBUG oslo_concurrency.processutils [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c/1a01220203ab47e489fddfb857e73ea3.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:48 np0005593234 nova_compute[227762]: 2026-01-23 10:11:48.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:11:48 np0005593234 nova_compute[227762]: 2026-01-23 10:11:48.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:11:48 np0005593234 nova_compute[227762]: 2026-01-23 10:11:48.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:11:48 np0005593234 nova_compute[227762]: 2026-01-23 10:11:48.828 227766 DEBUG oslo_concurrency.processutils [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c/1a01220203ab47e489fddfb857e73ea3.tmp" returned: 1 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:48 np0005593234 nova_compute[227762]: 2026-01-23 10:11:48.829 227766 DEBUG oslo_concurrency.processutils [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c/1a01220203ab47e489fddfb857e73ea3.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Jan 23 05:11:48 np0005593234 nova_compute[227762]: 2026-01-23 10:11:48.829 227766 DEBUG nova.virt.libvirt.volume.remotefs [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Creating directory /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Jan 23 05:11:48 np0005593234 nova_compute[227762]: 2026-01-23 10:11:48.830 227766 DEBUG oslo_concurrency.processutils [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:48.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.043 227766 DEBUG oslo_concurrency.processutils [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" returned: 0 in 0.213s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.048 227766 DEBUG nova.virt.libvirt.driver [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.104 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.105 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.105 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.105 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.417 227766 DEBUG nova.compute.manager [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.419 227766 DEBUG oslo_concurrency.lockutils [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.420 227766 DEBUG oslo_concurrency.lockutils [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.421 227766 DEBUG oslo_concurrency.lockutils [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.421 227766 DEBUG nova.compute.manager [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] No waiting events found dispatching network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.421 227766 WARNING nova.compute.manager [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received unexpected event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae for instance with vm_state active and task_state None.#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.421 227766 DEBUG nova.compute.manager [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.422 227766 DEBUG oslo_concurrency.lockutils [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.422 227766 DEBUG oslo_concurrency.lockutils [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.422 227766 DEBUG oslo_concurrency.lockutils [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.422 227766 DEBUG nova.compute.manager [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] No waiting events found dispatching network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.423 227766 WARNING nova.compute.manager [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received unexpected event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae for instance with vm_state active and task_state None.#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.423 227766 DEBUG nova.compute.manager [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.423 227766 DEBUG oslo_concurrency.lockutils [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.423 227766 DEBUG oslo_concurrency.lockutils [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.424 227766 DEBUG oslo_concurrency.lockutils [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.424 227766 DEBUG nova.compute.manager [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] No waiting events found dispatching network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:49 np0005593234 nova_compute[227762]: 2026-01-23 10:11:49.424 227766 WARNING nova.compute.manager [req-101ba9cd-1112-457f-86c4-08af06b36afe req-ddc38669-5040-441d-9b54-9ec98e1300a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received unexpected event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae for instance with vm_state active and task_state None.#033[00m
Jan 23 05:11:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:49.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.299 227766 DEBUG oslo_concurrency.lockutils [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.300 227766 DEBUG oslo_concurrency.lockutils [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.300 227766 DEBUG oslo_concurrency.lockutils [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.300 227766 DEBUG oslo_concurrency.lockutils [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.300 227766 DEBUG oslo_concurrency.lockutils [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.302 227766 INFO nova.compute.manager [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Terminating instance#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.303 227766 DEBUG nova.compute.manager [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:11:50 np0005593234 kernel: tap475aff6e-e5 (unregistering): left promiscuous mode
Jan 23 05:11:50 np0005593234 NetworkManager[48942]: <info>  [1769163110.3458] device (tap475aff6e-e5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:11:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:50Z|00530|binding|INFO|Releasing lport 475aff6e-e556-418f-8d36-87ae65b950ae from this chassis (sb_readonly=0)
Jan 23 05:11:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:50Z|00531|binding|INFO|Setting lport 475aff6e-e556-418f-8d36-87ae65b950ae down in Southbound
Jan 23 05:11:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:50Z|00532|binding|INFO|Removing iface tap475aff6e-e5 ovn-installed in OVS
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.355 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:50.365 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:85:d9:a5 10.100.0.9'], port_security=['fa:16:3e:85:d9:a5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'da6a1a46-4a6b-44a0-b5a2-35d2634865be', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88b571fd-69ad-4860-a596-3bd637fdb189', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f00cc6e26e5c435b902306c6421e146d', 'neutron:revision_number': '8', 'neutron:security_group_ids': 'a7b9167c-c78b-48f5-9e9d-ac8ada29e0a2', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d050303a-8173-4865-aab2-724e0c0624de, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=475aff6e-e556-418f-8d36-87ae65b950ae) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:11:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:50.366 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 475aff6e-e556-418f-8d36-87ae65b950ae in datapath 88b571fd-69ad-4860-a596-3bd637fdb189 unbound from our chassis#033[00m
Jan 23 05:11:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:50.367 144381 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 88b571fd-69ad-4860-a596-3bd637fdb189 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Jan 23 05:11:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:50.369 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b2f4b5c9-3c26-4187-8fc0-c52889acf694]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.372 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:50 np0005593234 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000083.scope: Deactivated successfully.
Jan 23 05:11:50 np0005593234 systemd[1]: machine-qemu\x2d63\x2dinstance\x2d00000083.scope: Consumed 3.757s CPU time.
Jan 23 05:11:50 np0005593234 systemd-machined[195626]: Machine qemu-63-instance-00000083 terminated.
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.543 227766 INFO nova.virt.libvirt.driver [-] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Instance destroyed successfully.#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.543 227766 DEBUG nova.objects.instance [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lazy-loading 'resources' on Instance uuid da6a1a46-4a6b-44a0-b5a2-35d2634865be obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.559 227766 DEBUG nova.virt.libvirt.vif [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:10:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueTestJSON-server-1707842336',display_name='tempest-ServerRescueTestJSON-server-1707842336',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuetestjson-server-1707842336',id=131,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:11:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f00cc6e26e5c435b902306c6421e146d',ramdisk_id='',reservation_id='r-g6qwsh4q',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueTestJSON-837476510',owner_user_name='tempest-ServerRescueTestJSON-837476510-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:11:48Z,user_data=None,user_id='eb500aabc93044e380f4bc905205803d',uuid=da6a1a46-4a6b-44a0-b5a2-35d2634865be,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.559 227766 DEBUG nova.network.os_vif_util [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Converting VIF {"id": "475aff6e-e556-418f-8d36-87ae65b950ae", "address": "fa:16:3e:85:d9:a5", "network": {"id": "88b571fd-69ad-4860-a596-3bd637fdb189", "bridge": "br-int", "label": "tempest-ServerRescueTestJSON-1616424882-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": false}}], "meta": {"injected": false, "tenant_id": "f00cc6e26e5c435b902306c6421e146d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap475aff6e-e5", "ovs_interfaceid": "475aff6e-e556-418f-8d36-87ae65b950ae", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.560 227766 DEBUG nova.network.os_vif_util [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:85:d9:a5,bridge_name='br-int',has_traffic_filtering=True,id=475aff6e-e556-418f-8d36-87ae65b950ae,network=Network(88b571fd-69ad-4860-a596-3bd637fdb189),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap475aff6e-e5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.560 227766 DEBUG os_vif [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:d9:a5,bridge_name='br-int',has_traffic_filtering=True,id=475aff6e-e556-418f-8d36-87ae65b950ae,network=Network(88b571fd-69ad-4860-a596-3bd637fdb189),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap475aff6e-e5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.563 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.563 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap475aff6e-e5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.564 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.566 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:50 np0005593234 nova_compute[227762]: 2026-01-23 10:11:50.568 227766 INFO os_vif [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:85:d9:a5,bridge_name='br-int',has_traffic_filtering=True,id=475aff6e-e556-418f-8d36-87ae65b950ae,network=Network(88b571fd-69ad-4860-a596-3bd637fdb189),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap475aff6e-e5')#033[00m
Jan 23 05:11:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:50.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.003 227766 INFO nova.virt.libvirt.driver [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Deleting instance files /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be_del#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.003 227766 INFO nova.virt.libvirt.driver [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Deletion of /var/lib/nova/instances/da6a1a46-4a6b-44a0-b5a2-35d2634865be_del complete#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.078 227766 INFO nova.compute.manager [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Took 0.78 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.078 227766 DEBUG oslo.service.loopingcall [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.079 227766 DEBUG nova.compute.manager [-] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.079 227766 DEBUG nova.network.neutron [-] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.512 227766 DEBUG nova.compute.manager [req-caa5366f-f9c6-4ee3-979b-f7b4106795fd req-b33999bc-be69-430e-af4c-267e58ebe7f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-vif-unplugged-475aff6e-e556-418f-8d36-87ae65b950ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.513 227766 DEBUG oslo_concurrency.lockutils [req-caa5366f-f9c6-4ee3-979b-f7b4106795fd req-b33999bc-be69-430e-af4c-267e58ebe7f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.513 227766 DEBUG oslo_concurrency.lockutils [req-caa5366f-f9c6-4ee3-979b-f7b4106795fd req-b33999bc-be69-430e-af4c-267e58ebe7f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.513 227766 DEBUG oslo_concurrency.lockutils [req-caa5366f-f9c6-4ee3-979b-f7b4106795fd req-b33999bc-be69-430e-af4c-267e58ebe7f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.513 227766 DEBUG nova.compute.manager [req-caa5366f-f9c6-4ee3-979b-f7b4106795fd req-b33999bc-be69-430e-af4c-267e58ebe7f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] No waiting events found dispatching network-vif-unplugged-475aff6e-e556-418f-8d36-87ae65b950ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.513 227766 DEBUG nova.compute.manager [req-caa5366f-f9c6-4ee3-979b-f7b4106795fd req-b33999bc-be69-430e-af4c-267e58ebe7f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-vif-unplugged-475aff6e-e556-418f-8d36-87ae65b950ae for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.514 227766 DEBUG nova.compute.manager [req-caa5366f-f9c6-4ee3-979b-f7b4106795fd req-b33999bc-be69-430e-af4c-267e58ebe7f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.514 227766 DEBUG oslo_concurrency.lockutils [req-caa5366f-f9c6-4ee3-979b-f7b4106795fd req-b33999bc-be69-430e-af4c-267e58ebe7f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.514 227766 DEBUG oslo_concurrency.lockutils [req-caa5366f-f9c6-4ee3-979b-f7b4106795fd req-b33999bc-be69-430e-af4c-267e58ebe7f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.514 227766 DEBUG oslo_concurrency.lockutils [req-caa5366f-f9c6-4ee3-979b-f7b4106795fd req-b33999bc-be69-430e-af4c-267e58ebe7f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.514 227766 DEBUG nova.compute.manager [req-caa5366f-f9c6-4ee3-979b-f7b4106795fd req-b33999bc-be69-430e-af4c-267e58ebe7f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] No waiting events found dispatching network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.514 227766 WARNING nova.compute.manager [req-caa5366f-f9c6-4ee3-979b-f7b4106795fd req-b33999bc-be69-430e-af4c-267e58ebe7f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received unexpected event network-vif-plugged-475aff6e-e556-418f-8d36-87ae65b950ae for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:11:51 np0005593234 kernel: tap87b7656f-9f (unregistering): left promiscuous mode
Jan 23 05:11:51 np0005593234 NetworkManager[48942]: <info>  [1769163111.6042] device (tap87b7656f-9f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.611 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:51 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:51Z|00533|binding|INFO|Releasing lport 87b7656f-9fbc-466f-bfe3-06171df90096 from this chassis (sb_readonly=0)
Jan 23 05:11:51 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:51Z|00534|binding|INFO|Setting lport 87b7656f-9fbc-466f-bfe3-06171df90096 down in Southbound
Jan 23 05:11:51 np0005593234 ovn_controller[134547]: 2026-01-23T10:11:51Z|00535|binding|INFO|Removing iface tap87b7656f-9f ovn-installed in OVS
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.612 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:51.618 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:3b:96 10.100.0.12'], port_security=['fa:16:3e:a4:3b:96 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8d9599b4-8855-4310-af02-cdd058438f7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dd869ce76e44fc8a82b8bbee1654d33', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cf3e0bf9-33c6-483b-a880-c8297a0be71f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.199'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=875f4baa-cb85-49ca-8f02-78715d351fdb, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=87b7656f-9fbc-466f-bfe3-06171df90096) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:11:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:51.619 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 87b7656f-9fbc-466f-bfe3-06171df90096 in datapath 8d9599b4-8855-4310-af02-cdd058438f7d unbound from our chassis#033[00m
Jan 23 05:11:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:51.620 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8d9599b4-8855-4310-af02-cdd058438f7d#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.629 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.634 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Updating instance_info_cache with network_info: [{"id": "8be6de92-c581-49d7-a315-1d1b8c33153a", "address": "fa:16:3e:d6:07:56", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8be6de92-c5", "ovs_interfaceid": "8be6de92-c581-49d7-a315-1d1b8c33153a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:11:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:51.656 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[070b94d3-38c2-4b52-a77c-d9c8c8887c64]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:51 np0005593234 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000082.scope: Deactivated successfully.
Jan 23 05:11:51 np0005593234 systemd[1]: machine-qemu\x2d60\x2dinstance\x2d00000082.scope: Consumed 15.534s CPU time.
Jan 23 05:11:51 np0005593234 systemd-machined[195626]: Machine qemu-60-instance-00000082 terminated.
Jan 23 05:11:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:51.688 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4d7f9e51-5000-4025-8753-0c56e6ce97f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.691 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.691 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.691 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:11:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:51.692 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[634c5f65-2b37-4ee4-8504-455f7cb96ede]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:51.719 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ee41a5-40a7-4019-a516-dc7f42ada35b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:51.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:51.737 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b1bf612e-e765-4bb1-92f2-370d2118f58d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8d9599b4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:a1:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687969, 'reachable_time': 24655, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 288966, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:51.754 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[50214fb9-8461-4aba-92d8-160695296e14]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8d9599b4-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687982, 'tstamp': 687982}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288967, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8d9599b4-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687985, 'tstamp': 687985}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 288967, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:11:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:51.756 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d9599b4-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.758 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.762 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:51.764 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d9599b4-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:51.765 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:11:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:51.766 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8d9599b4-80, col_values=(('external_ids', {'iface-id': 'b57bd565-3bb1-4ecc-8df0-a7c439ac84a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:11:51.767 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:11:51 np0005593234 kernel: tap87b7656f-9f: entered promiscuous mode
Jan 23 05:11:51 np0005593234 kernel: tap87b7656f-9f (unregistering): left promiscuous mode
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.833 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.975 227766 DEBUG nova.compute.manager [req-6963f36a-c3ba-412a-9e8e-ad8898896979 req-34106798-0446-4e01-b1c4-05d79a3b85f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-vif-unplugged-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.975 227766 DEBUG oslo_concurrency.lockutils [req-6963f36a-c3ba-412a-9e8e-ad8898896979 req-34106798-0446-4e01-b1c4-05d79a3b85f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.975 227766 DEBUG oslo_concurrency.lockutils [req-6963f36a-c3ba-412a-9e8e-ad8898896979 req-34106798-0446-4e01-b1c4-05d79a3b85f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.975 227766 DEBUG oslo_concurrency.lockutils [req-6963f36a-c3ba-412a-9e8e-ad8898896979 req-34106798-0446-4e01-b1c4-05d79a3b85f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.975 227766 DEBUG nova.compute.manager [req-6963f36a-c3ba-412a-9e8e-ad8898896979 req-34106798-0446-4e01-b1c4-05d79a3b85f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] No waiting events found dispatching network-vif-unplugged-87b7656f-9fbc-466f-bfe3-06171df90096 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.976 227766 WARNING nova.compute.manager [req-6963f36a-c3ba-412a-9e8e-ad8898896979 req-34106798-0446-4e01-b1c4-05d79a3b85f4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received unexpected event network-vif-unplugged-87b7656f-9fbc-466f-bfe3-06171df90096 for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.977 227766 DEBUG nova.network.neutron [-] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:11:51 np0005593234 nova_compute[227762]: 2026-01-23 10:11:51.980 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.002 227766 INFO nova.compute.manager [-] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Took 0.92 seconds to deallocate network for instance.#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.065 227766 INFO nova.virt.libvirt.driver [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.071 227766 INFO nova.virt.libvirt.driver [-] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Instance destroyed successfully.#033[00m
Jan 23 05:11:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.072 227766 DEBUG nova.virt.libvirt.vif [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:10:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1307986454',display_name='tempest-ServerActionsTestOtherB-server-1307986454',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1307986454',id=130,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:11:08Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-7gk6dzv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:11:42Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1325714374-network", "vif_mac": "fa:16:3e:a4:3b:96"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.073 227766 DEBUG nova.network.os_vif_util [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-ServerActionsTestOtherB-1325714374-network", "vif_mac": "fa:16:3e:a4:3b:96"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.073 227766 DEBUG nova.network.os_vif_util [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a4:3b:96,bridge_name='br-int',has_traffic_filtering=True,id=87b7656f-9fbc-466f-bfe3-06171df90096,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b7656f-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.074 227766 DEBUG os_vif [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a4:3b:96,bridge_name='br-int',has_traffic_filtering=True,id=87b7656f-9fbc-466f-bfe3-06171df90096,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b7656f-9f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.075 227766 DEBUG oslo_concurrency.lockutils [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.076 227766 DEBUG oslo_concurrency.lockutils [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.076 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.076 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap87b7656f-9f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.077 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.079 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.081 227766 INFO os_vif [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a4:3b:96,bridge_name='br-int',has_traffic_filtering=True,id=87b7656f-9fbc-466f-bfe3-06171df90096,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b7656f-9f')#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.092 227766 DEBUG nova.virt.libvirt.driver [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] skipping disk for instance-00000082 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.092 227766 DEBUG nova.virt.libvirt.driver [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] skipping disk for instance-00000082 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.092 227766 DEBUG nova.virt.libvirt.driver [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] skipping disk for instance-00000082 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.210 227766 DEBUG oslo_concurrency.processutils [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.685 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:11:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:52.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:11:52 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1428865053' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.974 227766 DEBUG oslo_concurrency.processutils [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.764s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.981 227766 DEBUG nova.compute.provider_tree [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:11:52 np0005593234 nova_compute[227762]: 2026-01-23 10:11:52.998 227766 DEBUG nova.scheduler.client.report [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:11:53 np0005593234 nova_compute[227762]: 2026-01-23 10:11:53.017 227766 DEBUG oslo_concurrency.lockutils [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.941s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:53 np0005593234 nova_compute[227762]: 2026-01-23 10:11:53.061 227766 INFO nova.scheduler.client.report [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Deleted allocations for instance da6a1a46-4a6b-44a0-b5a2-35d2634865be#033[00m
Jan 23 05:11:53 np0005593234 nova_compute[227762]: 2026-01-23 10:11:53.129 227766 DEBUG oslo_concurrency.lockutils [None req-55340d23-e207-43aa-83a0-b02078e9cdec eb500aabc93044e380f4bc905205803d f00cc6e26e5c435b902306c6421e146d - - default default] Lock "da6a1a46-4a6b-44a0-b5a2-35d2634865be" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.829s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:53 np0005593234 nova_compute[227762]: 2026-01-23 10:11:53.182 227766 DEBUG neutronclient.v2_0.client [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 87b7656f-9fbc-466f-bfe3-06171df90096 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Jan 23 05:11:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:11:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3797733701' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:11:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:11:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3797733701' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:11:53 np0005593234 nova_compute[227762]: 2026-01-23 10:11:53.287 227766 DEBUG oslo_concurrency.lockutils [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:53 np0005593234 nova_compute[227762]: 2026-01-23 10:11:53.287 227766 DEBUG oslo_concurrency.lockutils [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:53 np0005593234 nova_compute[227762]: 2026-01-23 10:11:53.287 227766 DEBUG oslo_concurrency.lockutils [None req-14e0c743-b013-4bda-87da-8f34d660792a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:53.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:54 np0005593234 nova_compute[227762]: 2026-01-23 10:11:54.160 227766 DEBUG nova.compute.manager [req-23e6e034-c6b8-47b5-8d6c-1d702760e38b req-cba8e35e-e088-4ce6-b199-4147fba3b6d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Received event network-vif-deleted-475aff6e-e556-418f-8d36-87ae65b950ae external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:54 np0005593234 nova_compute[227762]: 2026-01-23 10:11:54.160 227766 DEBUG nova.compute.manager [req-23e6e034-c6b8-47b5-8d6c-1d702760e38b req-cba8e35e-e088-4ce6-b199-4147fba3b6d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:54 np0005593234 nova_compute[227762]: 2026-01-23 10:11:54.160 227766 DEBUG oslo_concurrency.lockutils [req-23e6e034-c6b8-47b5-8d6c-1d702760e38b req-cba8e35e-e088-4ce6-b199-4147fba3b6d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:11:54 np0005593234 nova_compute[227762]: 2026-01-23 10:11:54.161 227766 DEBUG oslo_concurrency.lockutils [req-23e6e034-c6b8-47b5-8d6c-1d702760e38b req-cba8e35e-e088-4ce6-b199-4147fba3b6d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:11:54 np0005593234 nova_compute[227762]: 2026-01-23 10:11:54.161 227766 DEBUG oslo_concurrency.lockutils [req-23e6e034-c6b8-47b5-8d6c-1d702760e38b req-cba8e35e-e088-4ce6-b199-4147fba3b6d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:11:54 np0005593234 nova_compute[227762]: 2026-01-23 10:11:54.161 227766 DEBUG nova.compute.manager [req-23e6e034-c6b8-47b5-8d6c-1d702760e38b req-cba8e35e-e088-4ce6-b199-4147fba3b6d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] No waiting events found dispatching network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:11:54 np0005593234 nova_compute[227762]: 2026-01-23 10:11:54.161 227766 WARNING nova.compute.manager [req-23e6e034-c6b8-47b5-8d6c-1d702760e38b req-cba8e35e-e088-4ce6-b199-4147fba3b6d7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received unexpected event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 05:11:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:11:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.0 total, 600.0 interval#012Cumulative writes: 11K writes, 59K keys, 11K commit groups, 1.0 writes per commit group, ingest: 0.12 GB, 0.03 MB/s#012Cumulative WAL: 11K writes, 11K syncs, 1.00 writes per sync, written: 0.12 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1640 writes, 8247 keys, 1640 commit groups, 1.0 writes per commit group, ingest: 16.52 MB, 0.03 MB/s#012Interval WAL: 1640 writes, 1640 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     54.0      1.37              0.21        36    0.038       0      0       0.0       0.0#012  L6      1/0   12.15 MB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   4.5    109.2     92.2      3.62              1.12        35    0.103    220K    19K       0.0       0.0#012 Sum      1/0   12.15 MB   0.0      0.4     0.1      0.3       0.4      0.1       0.0   5.5     79.3     81.7      4.99              1.33        71    0.070    220K    19K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.3    107.1    112.0      0.70              0.22        12    0.059     49K   3127       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.4     0.1      0.3       0.3      0.0       0.0   0.0    109.2     92.2      3.62              1.12        35    0.103    220K    19K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     54.1      1.37              0.21        35    0.039       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 4200.0 total, 600.0 interval#012Flush(GB): cumulative 0.072, interval 0.012#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.40 GB write, 0.10 MB/s write, 0.39 GB read, 0.09 MB/s read, 5.0 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.07 GB read, 0.13 MB/s read, 0.7 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f10fc171f0#2 capacity: 304.00 MB usage: 43.43 MB table_size: 0 occupancy: 18446744073709551615 collections: 8 last_copies: 0 last_secs: 0.000271 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(2503,41.78 MB,13.7441%) FilterBlock(71,621.55 KB,0.199664%) IndexBlock(71,1.05 MB,0.343885%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:11:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:54.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:55 np0005593234 nova_compute[227762]: 2026-01-23 10:11:55.530 227766 DEBUG nova.compute.manager [req-35a18290-a23a-4826-b034-0e16db4e6147 req-1c19cb95-010a-4aa3-bec6-93e88bf97af7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-changed-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:11:55 np0005593234 nova_compute[227762]: 2026-01-23 10:11:55.530 227766 DEBUG nova.compute.manager [req-35a18290-a23a-4826-b034-0e16db4e6147 req-1c19cb95-010a-4aa3-bec6-93e88bf97af7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Refreshing instance network info cache due to event network-changed-87b7656f-9fbc-466f-bfe3-06171df90096. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:11:55 np0005593234 nova_compute[227762]: 2026-01-23 10:11:55.530 227766 DEBUG oslo_concurrency.lockutils [req-35a18290-a23a-4826-b034-0e16db4e6147 req-1c19cb95-010a-4aa3-bec6-93e88bf97af7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:11:55 np0005593234 nova_compute[227762]: 2026-01-23 10:11:55.530 227766 DEBUG oslo_concurrency.lockutils [req-35a18290-a23a-4826-b034-0e16db4e6147 req-1c19cb95-010a-4aa3-bec6-93e88bf97af7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:11:55 np0005593234 nova_compute[227762]: 2026-01-23 10:11:55.531 227766 DEBUG nova.network.neutron [req-35a18290-a23a-4826-b034-0e16db4e6147 req-1c19cb95-010a-4aa3-bec6-93e88bf97af7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Refreshing network info cache for port 87b7656f-9fbc-466f-bfe3-06171df90096 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:11:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:55.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:56.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:56 np0005593234 nova_compute[227762]: 2026-01-23 10:11:56.982 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:57 np0005593234 nova_compute[227762]: 2026-01-23 10:11:57.077 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:11:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:11:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:11:57 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1400589595' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:11:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:11:57 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1400589595' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:11:57 np0005593234 nova_compute[227762]: 2026-01-23 10:11:57.531 227766 DEBUG nova.network.neutron [req-35a18290-a23a-4826-b034-0e16db4e6147 req-1c19cb95-010a-4aa3-bec6-93e88bf97af7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updated VIF entry in instance network info cache for port 87b7656f-9fbc-466f-bfe3-06171df90096. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:11:57 np0005593234 nova_compute[227762]: 2026-01-23 10:11:57.531 227766 DEBUG nova.network.neutron [req-35a18290-a23a-4826-b034-0e16db4e6147 req-1c19cb95-010a-4aa3-bec6-93e88bf97af7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updating instance_info_cache with network_info: [{"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:11:57 np0005593234 nova_compute[227762]: 2026-01-23 10:11:57.553 227766 DEBUG oslo_concurrency.lockutils [req-35a18290-a23a-4826-b034-0e16db4e6147 req-1c19cb95-010a-4aa3-bec6-93e88bf97af7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:11:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:57.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:58.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:11:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:11:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:11:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:59.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e294 e294: 3 total, 3 up, 3 in
Jan 23 05:12:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:00.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:00.958 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=48, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=47) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:12:00 np0005593234 nova_compute[227762]: 2026-01-23 10:12:00.958 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:00.959 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:12:01 np0005593234 nova_compute[227762]: 2026-01-23 10:12:01.208 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:12:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:01.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:12:01 np0005593234 nova_compute[227762]: 2026-01-23 10:12:01.983 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:02 np0005593234 nova_compute[227762]: 2026-01-23 10:12:02.079 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:02 np0005593234 nova_compute[227762]: 2026-01-23 10:12:02.199 227766 DEBUG nova.compute.manager [req-8b8b184d-09ef-4796-a9e6-846a5bb82fac req-f4313b99-945d-4774-afa9-6c383d2b70c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:02 np0005593234 nova_compute[227762]: 2026-01-23 10:12:02.200 227766 DEBUG oslo_concurrency.lockutils [req-8b8b184d-09ef-4796-a9e6-846a5bb82fac req-f4313b99-945d-4774-afa9-6c383d2b70c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:02 np0005593234 nova_compute[227762]: 2026-01-23 10:12:02.200 227766 DEBUG oslo_concurrency.lockutils [req-8b8b184d-09ef-4796-a9e6-846a5bb82fac req-f4313b99-945d-4774-afa9-6c383d2b70c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:02 np0005593234 nova_compute[227762]: 2026-01-23 10:12:02.201 227766 DEBUG oslo_concurrency.lockutils [req-8b8b184d-09ef-4796-a9e6-846a5bb82fac req-f4313b99-945d-4774-afa9-6c383d2b70c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:02 np0005593234 nova_compute[227762]: 2026-01-23 10:12:02.201 227766 DEBUG nova.compute.manager [req-8b8b184d-09ef-4796-a9e6-846a5bb82fac req-f4313b99-945d-4774-afa9-6c383d2b70c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] No waiting events found dispatching network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:12:02 np0005593234 nova_compute[227762]: 2026-01-23 10:12:02.202 227766 WARNING nova.compute.manager [req-8b8b184d-09ef-4796-a9e6-846a5bb82fac req-f4313b99-945d-4774-afa9-6c383d2b70c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received unexpected event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 23 05:12:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:02.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:02.961 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '48'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:03.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:04 np0005593234 nova_compute[227762]: 2026-01-23 10:12:04.438 227766 DEBUG nova.compute.manager [req-923f0934-72d2-44f2-8422-71d22decaba3 req-016605a5-5206-4b50-a726-a57983246d2e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:04 np0005593234 nova_compute[227762]: 2026-01-23 10:12:04.439 227766 DEBUG oslo_concurrency.lockutils [req-923f0934-72d2-44f2-8422-71d22decaba3 req-016605a5-5206-4b50-a726-a57983246d2e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:04 np0005593234 nova_compute[227762]: 2026-01-23 10:12:04.439 227766 DEBUG oslo_concurrency.lockutils [req-923f0934-72d2-44f2-8422-71d22decaba3 req-016605a5-5206-4b50-a726-a57983246d2e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:04 np0005593234 nova_compute[227762]: 2026-01-23 10:12:04.439 227766 DEBUG oslo_concurrency.lockutils [req-923f0934-72d2-44f2-8422-71d22decaba3 req-016605a5-5206-4b50-a726-a57983246d2e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:04 np0005593234 nova_compute[227762]: 2026-01-23 10:12:04.439 227766 DEBUG nova.compute.manager [req-923f0934-72d2-44f2-8422-71d22decaba3 req-016605a5-5206-4b50-a726-a57983246d2e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] No waiting events found dispatching network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:12:04 np0005593234 nova_compute[227762]: 2026-01-23 10:12:04.440 227766 WARNING nova.compute.manager [req-923f0934-72d2-44f2-8422-71d22decaba3 req-016605a5-5206-4b50-a726-a57983246d2e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received unexpected event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 for instance with vm_state resized and task_state None.#033[00m
Jan 23 05:12:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:04.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:05 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:05Z|00536|binding|INFO|Releasing lport b57bd565-3bb1-4ecc-8df0-a7c439ac84a6 from this chassis (sb_readonly=0)
Jan 23 05:12:05 np0005593234 nova_compute[227762]: 2026-01-23 10:12:05.516 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:05 np0005593234 nova_compute[227762]: 2026-01-23 10:12:05.542 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163110.54048, da6a1a46-4a6b-44a0-b5a2-35d2634865be => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:12:05 np0005593234 nova_compute[227762]: 2026-01-23 10:12:05.542 227766 INFO nova.compute.manager [-] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:12:05 np0005593234 nova_compute[227762]: 2026-01-23 10:12:05.567 227766 DEBUG nova.compute.manager [None req-8974accf-4381-408a-b60c-121efb475b0f - - - - - -] [instance: da6a1a46-4a6b-44a0-b5a2-35d2634865be] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:05.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:06 np0005593234 nova_compute[227762]: 2026-01-23 10:12:06.844 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163111.8439648, a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:12:06 np0005593234 nova_compute[227762]: 2026-01-23 10:12:06.845 227766 INFO nova.compute.manager [-] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:12:06 np0005593234 nova_compute[227762]: 2026-01-23 10:12:06.904 227766 DEBUG nova.compute.manager [None req-080400c1-6dfe-45c2-840e-076bb46182ae - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:06 np0005593234 nova_compute[227762]: 2026-01-23 10:12:06.907 227766 DEBUG nova.compute.manager [None req-080400c1-6dfe-45c2-840e-076bb46182ae - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:12:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:06.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:06 np0005593234 nova_compute[227762]: 2026-01-23 10:12:06.931 227766 INFO nova.compute.manager [None req-080400c1-6dfe-45c2-840e-076bb46182ae - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 23 05:12:06 np0005593234 nova_compute[227762]: 2026-01-23 10:12:06.985 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:07Z|00537|binding|INFO|Releasing lport b57bd565-3bb1-4ecc-8df0-a7c439ac84a6 from this chassis (sb_readonly=0)
Jan 23 05:12:07 np0005593234 nova_compute[227762]: 2026-01-23 10:12:07.080 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:07 np0005593234 nova_compute[227762]: 2026-01-23 10:12:07.111 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:07.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:07 np0005593234 nova_compute[227762]: 2026-01-23 10:12:07.810 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:08 np0005593234 nova_compute[227762]: 2026-01-23 10:12:08.640 227766 DEBUG nova.compute.manager [req-7074a644-3664-45a2-abf8-af3823623710 req-7e7412e3-0234-4375-aafd-9798ff4eac7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-vif-unplugged-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:08 np0005593234 nova_compute[227762]: 2026-01-23 10:12:08.640 227766 DEBUG oslo_concurrency.lockutils [req-7074a644-3664-45a2-abf8-af3823623710 req-7e7412e3-0234-4375-aafd-9798ff4eac7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:08 np0005593234 nova_compute[227762]: 2026-01-23 10:12:08.641 227766 DEBUG oslo_concurrency.lockutils [req-7074a644-3664-45a2-abf8-af3823623710 req-7e7412e3-0234-4375-aafd-9798ff4eac7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:08 np0005593234 nova_compute[227762]: 2026-01-23 10:12:08.641 227766 DEBUG oslo_concurrency.lockutils [req-7074a644-3664-45a2-abf8-af3823623710 req-7e7412e3-0234-4375-aafd-9798ff4eac7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:08 np0005593234 nova_compute[227762]: 2026-01-23 10:12:08.641 227766 DEBUG nova.compute.manager [req-7074a644-3664-45a2-abf8-af3823623710 req-7e7412e3-0234-4375-aafd-9798ff4eac7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] No waiting events found dispatching network-vif-unplugged-87b7656f-9fbc-466f-bfe3-06171df90096 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:12:08 np0005593234 nova_compute[227762]: 2026-01-23 10:12:08.641 227766 WARNING nova.compute.manager [req-7074a644-3664-45a2-abf8-af3823623710 req-7e7412e3-0234-4375-aafd-9798ff4eac7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received unexpected event network-vif-unplugged-87b7656f-9fbc-466f-bfe3-06171df90096 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 23 05:12:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:08.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:09.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:09 np0005593234 podman[289060]: 2026-01-23 10:12:09.788559282 +0000 UTC m=+0.069093161 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:12:10 np0005593234 nova_compute[227762]: 2026-01-23 10:12:10.311 227766 INFO nova.compute.manager [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Swapping old allocation on dict_keys(['89873210-bee9-46e9-9f9d-0cd7a156c3a8']) held by migration 01ea1762-4856-47b8-a387-167e93cabc21 for instance#033[00m
Jan 23 05:12:10 np0005593234 nova_compute[227762]: 2026-01-23 10:12:10.353 227766 DEBUG nova.scheduler.client.report [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Overwriting current allocation {'allocations': {'0e4a8508-835c-4c0a-aa74-aae2c6536573': {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}, 'generation': 70}}, 'project_id': '9dd869ce76e44fc8a82b8bbee1654d33', 'user_id': 'aca3cab576d641d3b89e7dddf155d467', 'consumer_generation': 1} on consumer a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Jan 23 05:12:10 np0005593234 nova_compute[227762]: 2026-01-23 10:12:10.718 227766 INFO nova.network.neutron [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updating port 87b7656f-9fbc-466f-bfe3-06171df90096 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 23 05:12:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:10.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:11 np0005593234 nova_compute[227762]: 2026-01-23 10:12:11.018 227766 DEBUG nova.compute.manager [req-f74dbd75-d2a3-4bc1-ab55-042a0ed026a5 req-242d5e0f-614b-49dc-be0f-3b1c74c2d501 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:11 np0005593234 nova_compute[227762]: 2026-01-23 10:12:11.018 227766 DEBUG oslo_concurrency.lockutils [req-f74dbd75-d2a3-4bc1-ab55-042a0ed026a5 req-242d5e0f-614b-49dc-be0f-3b1c74c2d501 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:11 np0005593234 nova_compute[227762]: 2026-01-23 10:12:11.019 227766 DEBUG oslo_concurrency.lockutils [req-f74dbd75-d2a3-4bc1-ab55-042a0ed026a5 req-242d5e0f-614b-49dc-be0f-3b1c74c2d501 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:11 np0005593234 nova_compute[227762]: 2026-01-23 10:12:11.019 227766 DEBUG oslo_concurrency.lockutils [req-f74dbd75-d2a3-4bc1-ab55-042a0ed026a5 req-242d5e0f-614b-49dc-be0f-3b1c74c2d501 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:11 np0005593234 nova_compute[227762]: 2026-01-23 10:12:11.019 227766 DEBUG nova.compute.manager [req-f74dbd75-d2a3-4bc1-ab55-042a0ed026a5 req-242d5e0f-614b-49dc-be0f-3b1c74c2d501 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] No waiting events found dispatching network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:12:11 np0005593234 nova_compute[227762]: 2026-01-23 10:12:11.020 227766 WARNING nova.compute.manager [req-f74dbd75-d2a3-4bc1-ab55-042a0ed026a5 req-242d5e0f-614b-49dc-be0f-3b1c74c2d501 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received unexpected event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 23 05:12:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:11.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:11 np0005593234 nova_compute[227762]: 2026-01-23 10:12:11.766 227766 DEBUG oslo_concurrency.lockutils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:12:11 np0005593234 nova_compute[227762]: 2026-01-23 10:12:11.767 227766 DEBUG oslo_concurrency.lockutils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquired lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:12:11 np0005593234 nova_compute[227762]: 2026-01-23 10:12:11.767 227766 DEBUG nova.network.neutron [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:12:11 np0005593234 nova_compute[227762]: 2026-01-23 10:12:11.986 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:12 np0005593234 nova_compute[227762]: 2026-01-23 10:12:12.081 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:12 np0005593234 nova_compute[227762]: 2026-01-23 10:12:12.508 227766 DEBUG nova.compute.manager [req-6cfd2254-0d84-4421-913c-69643e1118fc req-d8118929-ea14-41e8-8d0b-65a6d4faba1b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-changed-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:12 np0005593234 nova_compute[227762]: 2026-01-23 10:12:12.508 227766 DEBUG nova.compute.manager [req-6cfd2254-0d84-4421-913c-69643e1118fc req-d8118929-ea14-41e8-8d0b-65a6d4faba1b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Refreshing instance network info cache due to event network-changed-87b7656f-9fbc-466f-bfe3-06171df90096. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:12:12 np0005593234 nova_compute[227762]: 2026-01-23 10:12:12.508 227766 DEBUG oslo_concurrency.lockutils [req-6cfd2254-0d84-4421-913c-69643e1118fc req-d8118929-ea14-41e8-8d0b-65a6d4faba1b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:12:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:12.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:12:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:13.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.833 227766 DEBUG nova.network.neutron [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updating instance_info_cache with network_info: [{"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.863 227766 DEBUG oslo_concurrency.lockutils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Releasing lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.865 227766 DEBUG os_brick.utils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.866 227766 DEBUG oslo_concurrency.lockutils [req-6cfd2254-0d84-4421-913c-69643e1118fc req-d8118929-ea14-41e8-8d0b-65a6d4faba1b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.866 227766 DEBUG nova.network.neutron [req-6cfd2254-0d84-4421-913c-69643e1118fc req-d8118929-ea14-41e8-8d0b-65a6d4faba1b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Refreshing network info cache for port 87b7656f-9fbc-466f-bfe3-06171df90096 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.866 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.880 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.881 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[a7d4f12e-0852-475b-b19e-994f96558074]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.882 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.892 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.892 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[ca7e49c4-75ed-4e27-8477-53d0a805f0eb]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.894 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.924 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.030s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.924 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[8518b82a-a13e-4235-b8bf-62ce3785d995]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.926 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[896120c9-0039-4e66-aabc-91da182f636c]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.927 227766 DEBUG oslo_concurrency.processutils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:14.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.951 227766 DEBUG oslo_concurrency.processutils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "nvme version" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.954 227766 DEBUG os_brick.initiator.connectors.lightos [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.954 227766 DEBUG os_brick.initiator.connectors.lightos [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.954 227766 DEBUG os_brick.initiator.connectors.lightos [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:12:14 np0005593234 nova_compute[227762]: 2026-01-23 10:12:14.955 227766 DEBUG os_brick.utils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] <== get_connector_properties: return (89ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:12:14 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 23 05:12:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:15.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.149 227766 DEBUG nova.virt.libvirt.driver [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.366 227766 DEBUG nova.storage.rbd_utils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rolling back rbd image(a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk) to snapshot(nova-resize) rollback_to_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:505#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.499 227766 DEBUG nova.storage.rbd_utils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] removing snapshot(nova-resize) on rbd image(a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 05:12:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e295 e295: 3 total, 3 up, 3 in
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.933 227766 DEBUG nova.virt.libvirt.driver [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Start _get_guest_xml network_info=[{"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [{'boot_index': None, 'mount_device': '/dev/vdb', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-649f8ce8-126a-4838-b42c-047bd1f41e67', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '649f8ce8-126a-4838-b42c-047bd1f41e67', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'attaching', 'instance': 'a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c', 'attached_at': '2026-01-23T10:12:15.000000', 'detached_at': '', 'volume_id': '649f8ce8-126a-4838-b42c-047bd1f41e67', 'serial': '649f8ce8-126a-4838-b42c-047bd1f41e67'}, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '68e12a79-6f68-4607-ab84-fa22faab90a9', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.936 227766 WARNING nova.virt.libvirt.driver [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:12:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:16.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.943 227766 DEBUG nova.virt.libvirt.host [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.943 227766 DEBUG nova.virt.libvirt.host [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.946 227766 DEBUG nova.virt.libvirt.host [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.946 227766 DEBUG nova.virt.libvirt.host [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.947 227766 DEBUG nova.virt.libvirt.driver [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.947 227766 DEBUG nova.virt.hardware [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.948 227766 DEBUG nova.virt.hardware [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.948 227766 DEBUG nova.virt.hardware [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.948 227766 DEBUG nova.virt.hardware [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.948 227766 DEBUG nova.virt.hardware [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.948 227766 DEBUG nova.virt.hardware [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.949 227766 DEBUG nova.virt.hardware [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.949 227766 DEBUG nova.virt.hardware [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.949 227766 DEBUG nova.virt.hardware [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.949 227766 DEBUG nova.virt.hardware [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.949 227766 DEBUG nova.virt.hardware [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.949 227766 DEBUG nova.objects.instance [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'vcpu_model' on Instance uuid a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.972 227766 DEBUG oslo_concurrency.processutils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:16 np0005593234 nova_compute[227762]: 2026-01-23 10:12:16.994 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.082 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e295 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.350 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:12:17 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1391679130' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.387 227766 DEBUG oslo_concurrency.processutils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.421 227766 DEBUG oslo_concurrency.processutils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:17.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:12:17 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/830348341' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.846 227766 DEBUG oslo_concurrency.processutils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.889 227766 DEBUG nova.virt.libvirt.vif [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:10:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1307986454',display_name='tempest-ServerActionsTestOtherB-server-1307986454',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1307986454',id=130,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:12:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-7gk6dzv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:12:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": 
"tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.889 227766 DEBUG nova.network.os_vif_util [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.890 227766 DEBUG nova.network.os_vif_util [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:3b:96,bridge_name='br-int',has_traffic_filtering=True,id=87b7656f-9fbc-466f-bfe3-06171df90096,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b7656f-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.893 227766 DEBUG nova.virt.libvirt.driver [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  <uuid>a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c</uuid>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  <name>instance-00000082</name>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerActionsTestOtherB-server-1307986454</nova:name>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:12:16</nova:creationTime>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <nova:user uuid="aca3cab576d641d3b89e7dddf155d467">tempest-ServerActionsTestOtherB-1052932467-project-member</nova:user>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <nova:project uuid="9dd869ce76e44fc8a82b8bbee1654d33">tempest-ServerActionsTestOtherB-1052932467</nova:project>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <nova:port uuid="87b7656f-9fbc-466f-bfe3-06171df90096">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <entry name="serial">a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c</entry>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <entry name="uuid">a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c</entry>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_disk.config">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="volumes/volume-649f8ce8-126a-4838-b42c-047bd1f41e67">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <target dev="vdb" bus="virtio"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <serial>649f8ce8-126a-4838-b42c-047bd1f41e67</serial>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:a4:3b:96"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <target dev="tap87b7656f-9f"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c/console.log" append="off"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <input type="keyboard" bus="usb"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:12:17 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:12:17 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:12:17 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:12:17 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.894 227766 DEBUG nova.compute.manager [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Preparing to wait for external event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.895 227766 DEBUG oslo_concurrency.lockutils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.895 227766 DEBUG oslo_concurrency.lockutils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.895 227766 DEBUG oslo_concurrency.lockutils [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.896 227766 DEBUG nova.virt.libvirt.vif [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:10:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1307986454',display_name='tempest-ServerActionsTestOtherB-server-1307986454',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1307986454',id=130,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:12:02Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-7gk6dzv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:12:05Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.896 227766 DEBUG nova.network.os_vif_util [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.897 227766 DEBUG nova.network.os_vif_util [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:3b:96,bridge_name='br-int',has_traffic_filtering=True,id=87b7656f-9fbc-466f-bfe3-06171df90096,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b7656f-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.897 227766 DEBUG os_vif [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:3b:96,bridge_name='br-int',has_traffic_filtering=True,id=87b7656f-9fbc-466f-bfe3-06171df90096,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b7656f-9f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.898 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.898 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.899 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.902 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.903 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap87b7656f-9f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.903 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap87b7656f-9f, col_values=(('external_ids', {'iface-id': '87b7656f-9fbc-466f-bfe3-06171df90096', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a4:3b:96', 'vm-uuid': 'a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.904 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:17 np0005593234 NetworkManager[48942]: <info>  [1769163137.9055] manager: (tap87b7656f-9f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/271)
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.907 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.910 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:17 np0005593234 nova_compute[227762]: 2026-01-23 10:12:17.911 227766 INFO os_vif [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:3b:96,bridge_name='br-int',has_traffic_filtering=True,id=87b7656f-9fbc-466f-bfe3-06171df90096,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b7656f-9f')#033[00m
Jan 23 05:12:18 np0005593234 kernel: tap87b7656f-9f: entered promiscuous mode
Jan 23 05:12:18 np0005593234 NetworkManager[48942]: <info>  [1769163138.0194] manager: (tap87b7656f-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/272)
Jan 23 05:12:18 np0005593234 nova_compute[227762]: 2026-01-23 10:12:18.020 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:18 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:18Z|00538|binding|INFO|Claiming lport 87b7656f-9fbc-466f-bfe3-06171df90096 for this chassis.
Jan 23 05:12:18 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:18Z|00539|binding|INFO|87b7656f-9fbc-466f-bfe3-06171df90096: Claiming fa:16:3e:a4:3b:96 10.100.0.12
Jan 23 05:12:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:18.027 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:3b:96 10.100.0.12'], port_security=['fa:16:3e:a4:3b:96 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8d9599b4-8855-4310-af02-cdd058438f7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dd869ce76e44fc8a82b8bbee1654d33', 'neutron:revision_number': '10', 'neutron:security_group_ids': 'cf3e0bf9-33c6-483b-a880-c8297a0be71f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.199'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=875f4baa-cb85-49ca-8f02-78715d351fdb, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=87b7656f-9fbc-466f-bfe3-06171df90096) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:12:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:18.029 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 87b7656f-9fbc-466f-bfe3-06171df90096 in datapath 8d9599b4-8855-4310-af02-cdd058438f7d bound to our chassis#033[00m
Jan 23 05:12:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:18.030 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8d9599b4-8855-4310-af02-cdd058438f7d#033[00m
Jan 23 05:12:18 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:18Z|00540|binding|INFO|Setting lport 87b7656f-9fbc-466f-bfe3-06171df90096 ovn-installed in OVS
Jan 23 05:12:18 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:18Z|00541|binding|INFO|Setting lport 87b7656f-9fbc-466f-bfe3-06171df90096 up in Southbound
Jan 23 05:12:18 np0005593234 nova_compute[227762]: 2026-01-23 10:12:18.040 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:18 np0005593234 nova_compute[227762]: 2026-01-23 10:12:18.042 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:18.046 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e97eaab4-9b66-42ee-a4cf-34cd5b15bdba]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:18 np0005593234 systemd-udevd[289248]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:12:18 np0005593234 podman[289208]: 2026-01-23 10:12:18.059237675 +0000 UTC m=+0.110154455 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:12:18 np0005593234 systemd-machined[195626]: New machine qemu-64-instance-00000082.
Jan 23 05:12:18 np0005593234 NetworkManager[48942]: <info>  [1769163138.0678] device (tap87b7656f-9f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:12:18 np0005593234 NetworkManager[48942]: <info>  [1769163138.0687] device (tap87b7656f-9f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:12:18 np0005593234 systemd[1]: Started Virtual Machine qemu-64-instance-00000082.
Jan 23 05:12:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:18.080 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[76b90abd-cad0-46f3-a75d-40ecb016fc96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:18.085 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[0b82cd2a-cbf8-459a-9ef7-9dcbffc6ae50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:18.110 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[be893f74-579b-4225-bedf-d898227d0026]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:18.125 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0a1a3d6f-e378-4283-9d9e-a0d37d5c9a1e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8d9599b4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:a1:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 9, 'rx_bytes': 700, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687969, 'reachable_time': 24655, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289261, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:18.138 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[366e5203-2339-4309-9d74-fa66b4f967fd]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8d9599b4-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687982, 'tstamp': 687982}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289262, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8d9599b4-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687985, 'tstamp': 687985}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289262, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:18.140 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d9599b4-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:18 np0005593234 nova_compute[227762]: 2026-01-23 10:12:18.141 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:18 np0005593234 nova_compute[227762]: 2026-01-23 10:12:18.142 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:18.143 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d9599b4-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:18.143 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:12:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:18.143 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8d9599b4-80, col_values=(('external_ids', {'iface-id': 'b57bd565-3bb1-4ecc-8df0-a7c439ac84a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:18.144 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:12:18 np0005593234 nova_compute[227762]: 2026-01-23 10:12:18.320 227766 DEBUG nova.network.neutron [req-6cfd2254-0d84-4421-913c-69643e1118fc req-d8118929-ea14-41e8-8d0b-65a6d4faba1b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updated VIF entry in instance network info cache for port 87b7656f-9fbc-466f-bfe3-06171df90096. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:12:18 np0005593234 nova_compute[227762]: 2026-01-23 10:12:18.321 227766 DEBUG nova.network.neutron [req-6cfd2254-0d84-4421-913c-69643e1118fc req-d8118929-ea14-41e8-8d0b-65a6d4faba1b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updating instance_info_cache with network_info: [{"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:12:18 np0005593234 nova_compute[227762]: 2026-01-23 10:12:18.372 227766 DEBUG oslo_concurrency.lockutils [req-6cfd2254-0d84-4421-913c-69643e1118fc req-d8118929-ea14-41e8-8d0b-65a6d4faba1b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:12:18 np0005593234 nova_compute[227762]: 2026-01-23 10:12:18.608 227766 DEBUG nova.compute.manager [req-c1dbf397-743a-47c3-9cb4-63696a7360d5 req-2e1a555e-6f7a-42c4-aa66-59ae51e53201 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:18 np0005593234 nova_compute[227762]: 2026-01-23 10:12:18.609 227766 DEBUG oslo_concurrency.lockutils [req-c1dbf397-743a-47c3-9cb4-63696a7360d5 req-2e1a555e-6f7a-42c4-aa66-59ae51e53201 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:18 np0005593234 nova_compute[227762]: 2026-01-23 10:12:18.609 227766 DEBUG oslo_concurrency.lockutils [req-c1dbf397-743a-47c3-9cb4-63696a7360d5 req-2e1a555e-6f7a-42c4-aa66-59ae51e53201 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:18 np0005593234 nova_compute[227762]: 2026-01-23 10:12:18.609 227766 DEBUG oslo_concurrency.lockutils [req-c1dbf397-743a-47c3-9cb4-63696a7360d5 req-2e1a555e-6f7a-42c4-aa66-59ae51e53201 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:18 np0005593234 nova_compute[227762]: 2026-01-23 10:12:18.610 227766 DEBUG nova.compute.manager [req-c1dbf397-743a-47c3-9cb4-63696a7360d5 req-2e1a555e-6f7a-42c4-aa66-59ae51e53201 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Processing event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:12:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:18.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.082 227766 DEBUG nova.compute.manager [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.083 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163139.082301, a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.083 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] VM Started (Lifecycle Event)#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.089 227766 INFO nova.virt.libvirt.driver [-] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Instance running successfully.#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.090 227766 DEBUG nova.virt.libvirt.driver [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.113 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.116 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.280 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.280 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163139.0831733, a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.281 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.306 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.310 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163139.08584, a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.311 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.385 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.389 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.494 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.535 227766 INFO nova.compute.manager [None req-f2e789ae-fa14-4d30-ac2a-ede8fc3cfc83 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updating instance to original state: 'active'#033[00m
Jan 23 05:12:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:19.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:19 np0005593234 nova_compute[227762]: 2026-01-23 10:12:19.910 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:20 np0005593234 nova_compute[227762]: 2026-01-23 10:12:20.895 227766 DEBUG nova.compute.manager [req-212b2678-c192-4c4f-a873-cdae02795cdb req-f36ba851-3dac-4b8e-81ba-a2fe6e0a0ed8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:20 np0005593234 nova_compute[227762]: 2026-01-23 10:12:20.895 227766 DEBUG oslo_concurrency.lockutils [req-212b2678-c192-4c4f-a873-cdae02795cdb req-f36ba851-3dac-4b8e-81ba-a2fe6e0a0ed8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:20 np0005593234 nova_compute[227762]: 2026-01-23 10:12:20.895 227766 DEBUG oslo_concurrency.lockutils [req-212b2678-c192-4c4f-a873-cdae02795cdb req-f36ba851-3dac-4b8e-81ba-a2fe6e0a0ed8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:20 np0005593234 nova_compute[227762]: 2026-01-23 10:12:20.896 227766 DEBUG oslo_concurrency.lockutils [req-212b2678-c192-4c4f-a873-cdae02795cdb req-f36ba851-3dac-4b8e-81ba-a2fe6e0a0ed8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:20 np0005593234 nova_compute[227762]: 2026-01-23 10:12:20.896 227766 DEBUG nova.compute.manager [req-212b2678-c192-4c4f-a873-cdae02795cdb req-f36ba851-3dac-4b8e-81ba-a2fe6e0a0ed8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] No waiting events found dispatching network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:12:20 np0005593234 nova_compute[227762]: 2026-01-23 10:12:20.896 227766 WARNING nova.compute.manager [req-212b2678-c192-4c4f-a873-cdae02795cdb req-f36ba851-3dac-4b8e-81ba-a2fe6e0a0ed8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received unexpected event network-vif-plugged-87b7656f-9fbc-466f-bfe3-06171df90096 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:12:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:20.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:21.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:21 np0005593234 nova_compute[227762]: 2026-01-23 10:12:21.990 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e295 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.601 227766 DEBUG oslo_concurrency.lockutils [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.601 227766 DEBUG oslo_concurrency.lockutils [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.602 227766 DEBUG oslo_concurrency.lockutils [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.602 227766 DEBUG oslo_concurrency.lockutils [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.602 227766 DEBUG oslo_concurrency.lockutils [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.604 227766 INFO nova.compute.manager [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Terminating instance#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.605 227766 DEBUG nova.compute.manager [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:12:22 np0005593234 kernel: tap87b7656f-9f (unregistering): left promiscuous mode
Jan 23 05:12:22 np0005593234 NetworkManager[48942]: <info>  [1769163142.6474] device (tap87b7656f-9f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.709 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:22Z|00542|binding|INFO|Releasing lport 87b7656f-9fbc-466f-bfe3-06171df90096 from this chassis (sb_readonly=0)
Jan 23 05:12:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:22Z|00543|binding|INFO|Setting lport 87b7656f-9fbc-466f-bfe3-06171df90096 down in Southbound
Jan 23 05:12:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:22Z|00544|binding|INFO|Removing iface tap87b7656f-9f ovn-installed in OVS
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.710 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:22.719 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a4:3b:96 10.100.0.12'], port_security=['fa:16:3e:a4:3b:96 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8d9599b4-8855-4310-af02-cdd058438f7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dd869ce76e44fc8a82b8bbee1654d33', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'cf3e0bf9-33c6-483b-a880-c8297a0be71f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.199', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=875f4baa-cb85-49ca-8f02-78715d351fdb, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=87b7656f-9fbc-466f-bfe3-06171df90096) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:12:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:22.721 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 87b7656f-9fbc-466f-bfe3-06171df90096 in datapath 8d9599b4-8855-4310-af02-cdd058438f7d unbound from our chassis#033[00m
Jan 23 05:12:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:22.722 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8d9599b4-8855-4310-af02-cdd058438f7d#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.727 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:22.737 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[60d073f4-a783-4257-a771-04c1a696e029]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:22 np0005593234 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000082.scope: Deactivated successfully.
Jan 23 05:12:22 np0005593234 systemd[1]: machine-qemu\x2d64\x2dinstance\x2d00000082.scope: Consumed 4.640s CPU time.
Jan 23 05:12:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:22.761 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f1c18ee2-70ad-4319-ab91-c7a7dc417bd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:22 np0005593234 systemd-machined[195626]: Machine qemu-64-instance-00000082 terminated.
Jan 23 05:12:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:22.765 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b0dbc1c3-f020-4e5e-960d-d3b7fe2e6e3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:22.790 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb57bb4-02a1-4f74-aa28-c8a368a9bddb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:22.807 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a60497e3-490c-4096-9998-1ec5005de2e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8d9599b4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:a1:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 11, 'rx_bytes': 700, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687969, 'reachable_time': 24655, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 289337, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:22.822 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[052a10c8-8574-4857-9e5f-7d17279eb330]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8d9599b4-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687982, 'tstamp': 687982}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289338, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8d9599b4-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687985, 'tstamp': 687985}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 289338, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:22.825 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d9599b4-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.826 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.831 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:22.832 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d9599b4-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:22.833 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:12:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:22.833 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8d9599b4-80, col_values=(('external_ids', {'iface-id': 'b57bd565-3bb1-4ecc-8df0-a7c439ac84a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:22.833 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.836 227766 INFO nova.virt.libvirt.driver [-] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Instance destroyed successfully.#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.836 227766 DEBUG nova.objects.instance [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'resources' on Instance uuid a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.858 227766 DEBUG nova.virt.libvirt.vif [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:10:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1307986454',display_name='tempest-ServerActionsTestOtherB-server-1307986454',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1307986454',id=130,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:12:19Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-7gk6dzv9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:12:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.859 227766 DEBUG nova.network.os_vif_util [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "87b7656f-9fbc-466f-bfe3-06171df90096", "address": "fa:16:3e:a4:3b:96", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap87b7656f-9f", "ovs_interfaceid": "87b7656f-9fbc-466f-bfe3-06171df90096", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.859 227766 DEBUG nova.network.os_vif_util [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a4:3b:96,bridge_name='br-int',has_traffic_filtering=True,id=87b7656f-9fbc-466f-bfe3-06171df90096,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b7656f-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.859 227766 DEBUG os_vif [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:3b:96,bridge_name='br-int',has_traffic_filtering=True,id=87b7656f-9fbc-466f-bfe3-06171df90096,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b7656f-9f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.861 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.861 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap87b7656f-9f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.862 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.864 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:22 np0005593234 nova_compute[227762]: 2026-01-23 10:12:22.867 227766 INFO os_vif [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a4:3b:96,bridge_name='br-int',has_traffic_filtering=True,id=87b7656f-9fbc-466f-bfe3-06171df90096,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap87b7656f-9f')#033[00m
Jan 23 05:12:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:22.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:23.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:23 np0005593234 nova_compute[227762]: 2026-01-23 10:12:23.894 227766 INFO nova.virt.libvirt.driver [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Deleting instance files /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_del#033[00m
Jan 23 05:12:23 np0005593234 nova_compute[227762]: 2026-01-23 10:12:23.895 227766 INFO nova.virt.libvirt.driver [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Deletion of /var/lib/nova/instances/a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c_del complete#033[00m
Jan 23 05:12:23 np0005593234 nova_compute[227762]: 2026-01-23 10:12:23.952 227766 INFO nova.compute.manager [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Took 1.35 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:12:23 np0005593234 nova_compute[227762]: 2026-01-23 10:12:23.952 227766 DEBUG oslo.service.loopingcall [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:12:23 np0005593234 nova_compute[227762]: 2026-01-23 10:12:23.953 227766 DEBUG nova.compute.manager [-] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:12:23 np0005593234 nova_compute[227762]: 2026-01-23 10:12:23.953 227766 DEBUG nova.network.neutron [-] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:12:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:24.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:25 np0005593234 nova_compute[227762]: 2026-01-23 10:12:25.090 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:25 np0005593234 nova_compute[227762]: 2026-01-23 10:12:25.190 227766 DEBUG nova.network.neutron [-] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:12:25 np0005593234 nova_compute[227762]: 2026-01-23 10:12:25.242 227766 INFO nova.compute.manager [-] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Took 1.29 seconds to deallocate network for instance.#033[00m
Jan 23 05:12:25 np0005593234 nova_compute[227762]: 2026-01-23 10:12:25.348 227766 DEBUG nova.compute.manager [req-5dab22ff-83e3-4396-b47a-1a34504c22e8 req-f82af195-2789-4726-962e-e8dd41ec2199 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Received event network-vif-deleted-87b7656f-9fbc-466f-bfe3-06171df90096 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:25 np0005593234 nova_compute[227762]: 2026-01-23 10:12:25.498 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:25 np0005593234 nova_compute[227762]: 2026-01-23 10:12:25.577 227766 INFO nova.compute.manager [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Took 0.33 seconds to detach 1 volumes for instance.#033[00m
Jan 23 05:12:25 np0005593234 nova_compute[227762]: 2026-01-23 10:12:25.639 227766 DEBUG oslo_concurrency.lockutils [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:25 np0005593234 nova_compute[227762]: 2026-01-23 10:12:25.639 227766 DEBUG oslo_concurrency.lockutils [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:25 np0005593234 nova_compute[227762]: 2026-01-23 10:12:25.721 227766 DEBUG oslo_concurrency.processutils [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:25.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:12:26 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/516050119' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:12:26 np0005593234 nova_compute[227762]: 2026-01-23 10:12:26.174 227766 DEBUG oslo_concurrency.processutils [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:26 np0005593234 nova_compute[227762]: 2026-01-23 10:12:26.181 227766 DEBUG nova.compute.provider_tree [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:12:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 e296: 3 total, 3 up, 3 in
Jan 23 05:12:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:26.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:26 np0005593234 nova_compute[227762]: 2026-01-23 10:12:26.992 227766 DEBUG nova.scheduler.client.report [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:12:26 np0005593234 nova_compute[227762]: 2026-01-23 10:12:26.996 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:27 np0005593234 nova_compute[227762]: 2026-01-23 10:12:27.226 227766 DEBUG oslo_concurrency.lockutils [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:27 np0005593234 nova_compute[227762]: 2026-01-23 10:12:27.254 227766 INFO nova.scheduler.client.report [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Deleted allocations for instance a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c#033[00m
Jan 23 05:12:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:27 np0005593234 nova_compute[227762]: 2026-01-23 10:12:27.403 227766 DEBUG oslo_concurrency.lockutils [None req-d802b322-21cd-46cd-a7fa-d53fd54907b1 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.802s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:27.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:27 np0005593234 nova_compute[227762]: 2026-01-23 10:12:27.863 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:12:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:28.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:12:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:12:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 4200.5 total, 600.0 interval#012Cumulative writes: 48K writes, 195K keys, 48K commit groups, 1.0 writes per commit group, ingest: 0.19 GB, 0.05 MB/s#012Cumulative WAL: 48K writes, 17K syncs, 2.78 writes per sync, written: 0.19 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 10K writes, 44K keys, 10K commit groups, 1.0 writes per commit group, ingest: 45.27 MB, 0.08 MB/s#012Interval WAL: 10K writes, 4188 syncs, 2.55 writes per sync, written: 0.04 GB, 0.08 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:12:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:29.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:30.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:31.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:31 np0005593234 nova_compute[227762]: 2026-01-23 10:12:31.996 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:32 np0005593234 nova_compute[227762]: 2026-01-23 10:12:32.864 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:32.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:33.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:34 np0005593234 nova_compute[227762]: 2026-01-23 10:12:34.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:34 np0005593234 nova_compute[227762]: 2026-01-23 10:12:34.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:34 np0005593234 nova_compute[227762]: 2026-01-23 10:12:34.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:34 np0005593234 nova_compute[227762]: 2026-01-23 10:12:34.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:34 np0005593234 nova_compute[227762]: 2026-01-23 10:12:34.773 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:12:34 np0005593234 nova_compute[227762]: 2026-01-23 10:12:34.774 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:34 np0005593234 nova_compute[227762]: 2026-01-23 10:12:34.928 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "33559028-00d9-4918-9015-26172db3d00c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:34 np0005593234 nova_compute[227762]: 2026-01-23 10:12:34.929 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "33559028-00d9-4918-9015-26172db3d00c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:34 np0005593234 nova_compute[227762]: 2026-01-23 10:12:34.948 227766 DEBUG nova.compute.manager [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:12:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:34.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.041 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.041 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.050 227766 DEBUG nova.virt.hardware [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.050 227766 INFO nova.compute.claims [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:12:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:12:35 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3376187422' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.214 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.273 227766 DEBUG oslo_concurrency.processutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.305 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000007f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.305 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000007f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.485 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.486 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4252MB free_disk=20.9219970703125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.486 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.514 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.515 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.531 227766 DEBUG nova.compute.manager [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.616 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:12:35 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3027339988' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.773 227766 DEBUG oslo_concurrency.processutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.780 227766 DEBUG nova.compute.provider_tree [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.804 227766 DEBUG nova.scheduler.client.report [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:12:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:35.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.849 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.807s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.850 227766 DEBUG nova.compute.manager [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.852 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.863 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.967 227766 DEBUG nova.compute.manager [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 05:12:35 np0005593234 nova_compute[227762]: 2026-01-23 10:12:35.967 227766 DEBUG nova.network.neutron [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.016 227766 INFO nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.038 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.039 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 33559028-00d9-4918-9015-26172db3d00c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.064 227766 DEBUG nova.compute.manager [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.068 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance c38b8bfe-1b70-4daf-b676-250c1e933ed4 has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.069 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.069 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.185 227766 DEBUG nova.compute.manager [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.187 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.187 227766 INFO nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Creating image(s)
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.411 227766 DEBUG nova.storage.rbd_utils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image 33559028-00d9-4918-9015-26172db3d00c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.528 227766 DEBUG nova.storage.rbd_utils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image 33559028-00d9-4918-9015-26172db3d00c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.556 227766 DEBUG nova.storage.rbd_utils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image 33559028-00d9-4918-9015-26172db3d00c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.560 227766 DEBUG oslo_concurrency.processutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.583 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.609 227766 DEBUG nova.policy [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'aca3cab576d641d3b89e7dddf155d467', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9dd869ce76e44fc8a82b8bbee1654d33', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.628 227766 DEBUG oslo_concurrency.processutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.629 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.630 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.630 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.660 227766 DEBUG nova.storage.rbd_utils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image 33559028-00d9-4918-9015-26172db3d00c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.665 227766 DEBUG oslo_concurrency.processutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 33559028-00d9-4918-9015-26172db3d00c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:12:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:36.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:36 np0005593234 nova_compute[227762]: 2026-01-23 10:12:36.997 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:12:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:12:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2548925223' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.064 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.071 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.099 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.135 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.136 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.136 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 1.521s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.144 227766 DEBUG nova.virt.hardware [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.145 227766 INFO nova.compute.claims [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Claim successful on node compute-2.ctlplane.example.com
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.331 227766 DEBUG oslo_concurrency.processutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:12:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:12:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/297396611' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:12:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:37.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.817 227766 DEBUG oslo_concurrency.processutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.823 227766 DEBUG nova.compute.provider_tree [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.835 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163142.8341208, a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.836 227766 INFO nova.compute.manager [-] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] VM Stopped (Lifecycle Event)
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.849 227766 DEBUG nova.scheduler.client.report [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.865 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.877 227766 DEBUG nova.compute.manager [None req-07f7a3fa-5468-48b9-87c3-7caf216f0df9 - - - - - -] [instance: a4f2647f-5c8b-4e7d-bbf2-eb149db4db2c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.891 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.755s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.892 227766 DEBUG nova.compute.manager [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.940 227766 DEBUG nova.compute.manager [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.941 227766 DEBUG nova.network.neutron [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.968 227766 INFO nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 05:12:37 np0005593234 nova_compute[227762]: 2026-01-23 10:12:37.998 227766 DEBUG nova.compute.manager [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.135 227766 DEBUG nova.network.neutron [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Successfully created port: 5004fad4-5788-4709-9c83-b5fe075c0aa7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.138 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.142 227766 DEBUG nova.compute.manager [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.143 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.143 227766 INFO nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Creating image(s)
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.165 227766 DEBUG nova.storage.rbd_utils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.188 227766 DEBUG nova.storage.rbd_utils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.210 227766 DEBUG nova.storage.rbd_utils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.213 227766 DEBUG oslo_concurrency.processutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.236 227766 DEBUG nova.policy [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fae914e59ec54f6b80928ef3cc68dbdb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '0a6ba16c4b9d49d3bc24cd7b44935d1f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.271 227766 DEBUG oslo_concurrency.processutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.272 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.273 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.273 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.294 227766 DEBUG nova.storage.rbd_utils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.298 227766 DEBUG oslo_concurrency.processutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.376 227766 DEBUG oslo_concurrency.processutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 33559028-00d9-4918-9015-26172db3d00c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.710s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:12:38 np0005593234 nova_compute[227762]: 2026-01-23 10:12:38.440 227766 DEBUG nova.storage.rbd_utils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] resizing rbd image 33559028-00d9-4918-9015-26172db3d00c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 05:12:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:38.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:39 np0005593234 nova_compute[227762]: 2026-01-23 10:12:39.027 227766 DEBUG nova.network.neutron [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Successfully updated port: 5004fad4-5788-4709-9c83-b5fe075c0aa7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:12:39 np0005593234 nova_compute[227762]: 2026-01-23 10:12:39.045 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:12:39 np0005593234 nova_compute[227762]: 2026-01-23 10:12:39.045 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquired lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:12:39 np0005593234 nova_compute[227762]: 2026-01-23 10:12:39.045 227766 DEBUG nova.network.neutron [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:12:39 np0005593234 nova_compute[227762]: 2026-01-23 10:12:39.082 227766 DEBUG nova.network.neutron [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Successfully created port: 483c7ca9-a908-4082-bbad-1ea123d6a3f1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:12:39 np0005593234 nova_compute[227762]: 2026-01-23 10:12:39.224 227766 DEBUG nova.compute.manager [req-a1fb33ff-48e5-4ef5-9760-2efdad5a7b30 req-1d15a647-8ed6-48cf-83a8-03d9a7f1677d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Received event network-changed-5004fad4-5788-4709-9c83-b5fe075c0aa7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:39 np0005593234 nova_compute[227762]: 2026-01-23 10:12:39.224 227766 DEBUG nova.compute.manager [req-a1fb33ff-48e5-4ef5-9760-2efdad5a7b30 req-1d15a647-8ed6-48cf-83a8-03d9a7f1677d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Refreshing instance network info cache due to event network-changed-5004fad4-5788-4709-9c83-b5fe075c0aa7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:12:39 np0005593234 nova_compute[227762]: 2026-01-23 10:12:39.224 227766 DEBUG oslo_concurrency.lockutils [req-a1fb33ff-48e5-4ef5-9760-2efdad5a7b30 req-1d15a647-8ed6-48cf-83a8-03d9a7f1677d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:12:39 np0005593234 nova_compute[227762]: 2026-01-23 10:12:39.281 227766 DEBUG nova.network.neutron [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:12:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:39.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:40 np0005593234 nova_compute[227762]: 2026-01-23 10:12:40.440 227766 DEBUG nova.network.neutron [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Successfully updated port: 483c7ca9-a908-4082-bbad-1ea123d6a3f1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:12:40 np0005593234 nova_compute[227762]: 2026-01-23 10:12:40.470 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:12:40 np0005593234 nova_compute[227762]: 2026-01-23 10:12:40.471 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquired lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:12:40 np0005593234 nova_compute[227762]: 2026-01-23 10:12:40.471 227766 DEBUG nova.network.neutron [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:12:40 np0005593234 nova_compute[227762]: 2026-01-23 10:12:40.590 227766 DEBUG nova.compute.manager [req-a999fa20-83c4-407b-8b60-8ce1b4f7e616 req-591ee1cf-5956-4516-98e3-80c0581950fd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received event network-changed-483c7ca9-a908-4082-bbad-1ea123d6a3f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:40 np0005593234 nova_compute[227762]: 2026-01-23 10:12:40.591 227766 DEBUG nova.compute.manager [req-a999fa20-83c4-407b-8b60-8ce1b4f7e616 req-591ee1cf-5956-4516-98e3-80c0581950fd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Refreshing instance network info cache due to event network-changed-483c7ca9-a908-4082-bbad-1ea123d6a3f1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:12:40 np0005593234 nova_compute[227762]: 2026-01-23 10:12:40.591 227766 DEBUG oslo_concurrency.lockutils [req-a999fa20-83c4-407b-8b60-8ce1b4f7e616 req-591ee1cf-5956-4516-98e3-80c0581950fd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:12:40 np0005593234 nova_compute[227762]: 2026-01-23 10:12:40.704 227766 DEBUG nova.network.neutron [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:12:40 np0005593234 nova_compute[227762]: 2026-01-23 10:12:40.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:40 np0005593234 podman[289915]: 2026-01-23 10:12:40.766421779 +0000 UTC m=+0.056574250 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:12:40 np0005593234 nova_compute[227762]: 2026-01-23 10:12:40.905 227766 DEBUG nova.network.neutron [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Updating instance_info_cache with network_info: [{"id": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "address": "fa:16:3e:95:4c:75", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5004fad4-57", "ovs_interfaceid": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:12:40 np0005593234 nova_compute[227762]: 2026-01-23 10:12:40.940 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Releasing lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:12:40 np0005593234 nova_compute[227762]: 2026-01-23 10:12:40.940 227766 DEBUG nova.compute.manager [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Instance network_info: |[{"id": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "address": "fa:16:3e:95:4c:75", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5004fad4-57", "ovs_interfaceid": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:12:40 np0005593234 nova_compute[227762]: 2026-01-23 10:12:40.940 227766 DEBUG oslo_concurrency.lockutils [req-a1fb33ff-48e5-4ef5-9760-2efdad5a7b30 req-1d15a647-8ed6-48cf-83a8-03d9a7f1677d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:12:40 np0005593234 nova_compute[227762]: 2026-01-23 10:12:40.941 227766 DEBUG nova.network.neutron [req-a1fb33ff-48e5-4ef5-9760-2efdad5a7b30 req-1d15a647-8ed6-48cf-83a8-03d9a7f1677d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Refreshing network info cache for port 5004fad4-5788-4709-9c83-b5fe075c0aa7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:12:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:12:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:40.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:12:41 np0005593234 nova_compute[227762]: 2026-01-23 10:12:41.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:12:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:41.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:12:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:12:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:12:41 np0005593234 nova_compute[227762]: 2026-01-23 10:12:41.998 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.085 227766 DEBUG oslo_concurrency.processutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.787s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.123 227766 DEBUG nova.objects.instance [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'migration_context' on Instance uuid 33559028-00d9-4918-9015-26172db3d00c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.162 227766 DEBUG nova.storage.rbd_utils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] resizing rbd image c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.259 227766 DEBUG nova.objects.instance [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'migration_context' on Instance uuid c38b8bfe-1b70-4daf-b676-250c1e933ed4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.323 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.323 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Ensure instance console log exists: /var/lib/nova/instances/33559028-00d9-4918-9015-26172db3d00c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.323 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.323 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.324 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.326 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Start _get_guest_xml network_info=[{"id": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "address": "fa:16:3e:95:4c:75", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5004fad4-57", "ovs_interfaceid": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.327 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.327 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Ensure instance console log exists: /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.327 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.328 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.328 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.331 227766 WARNING nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.337 227766 DEBUG nova.virt.libvirt.host [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.338 227766 DEBUG nova.virt.libvirt.host [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.342 227766 DEBUG nova.virt.libvirt.host [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.342 227766 DEBUG nova.virt.libvirt.host [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.343 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.343 227766 DEBUG nova.virt.hardware [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.344 227766 DEBUG nova.virt.hardware [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.344 227766 DEBUG nova.virt.hardware [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.344 227766 DEBUG nova.virt.hardware [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.344 227766 DEBUG nova.virt.hardware [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.344 227766 DEBUG nova.virt.hardware [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.345 227766 DEBUG nova.virt.hardware [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.345 227766 DEBUG nova.virt.hardware [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.345 227766 DEBUG nova.virt.hardware [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.345 227766 DEBUG nova.virt.hardware [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.345 227766 DEBUG nova.virt.hardware [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.348 227766 DEBUG oslo_concurrency.processutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:12:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3647155075' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.809 227766 DEBUG oslo_concurrency.processutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.837 227766 DEBUG nova.storage.rbd_utils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image 33559028-00d9-4918-9015-26172db3d00c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.841 227766 DEBUG oslo_concurrency.processutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:42.848 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:42.849 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:42.849 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.868 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:42.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:42 np0005593234 nova_compute[227762]: 2026-01-23 10:12:42.979 227766 DEBUG nova.network.neutron [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Updating instance_info_cache with network_info: [{"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.008 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Releasing lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.008 227766 DEBUG nova.compute.manager [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Instance network_info: |[{"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.009 227766 DEBUG oslo_concurrency.lockutils [req-a999fa20-83c4-407b-8b60-8ce1b4f7e616 req-591ee1cf-5956-4516-98e3-80c0581950fd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.009 227766 DEBUG nova.network.neutron [req-a999fa20-83c4-407b-8b60-8ce1b4f7e616 req-591ee1cf-5956-4516-98e3-80c0581950fd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Refreshing network info cache for port 483c7ca9-a908-4082-bbad-1ea123d6a3f1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.012 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Start _get_guest_xml network_info=[{"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.016 227766 WARNING nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.020 227766 DEBUG nova.virt.libvirt.host [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.020 227766 DEBUG nova.virt.libvirt.host [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:12:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:12:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:12:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.023 227766 DEBUG nova.virt.libvirt.host [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.023 227766 DEBUG nova.virt.libvirt.host [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.024 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.024 227766 DEBUG nova.virt.hardware [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.024 227766 DEBUG nova.virt.hardware [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.025 227766 DEBUG nova.virt.hardware [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.025 227766 DEBUG nova.virt.hardware [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.025 227766 DEBUG nova.virt.hardware [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.025 227766 DEBUG nova.virt.hardware [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.025 227766 DEBUG nova.virt.hardware [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.025 227766 DEBUG nova.virt.hardware [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.025 227766 DEBUG nova.virt.hardware [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.026 227766 DEBUG nova.virt.hardware [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.026 227766 DEBUG nova.virt.hardware [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.029 227766 DEBUG oslo_concurrency.processutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:12:43 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1227403747' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.273 227766 DEBUG oslo_concurrency.processutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.274 227766 DEBUG nova.virt.libvirt.vif [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:12:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1716094682',display_name='tempest-ServerActionsTestOtherB-server-1716094682',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1716094682',id=134,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-frjx5azl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:12:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=33559028-00d9-4918-9015-26172db3d00c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "address": "fa:16:3e:95:4c:75", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5004fad4-57", "ovs_interfaceid": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.275 227766 DEBUG nova.network.os_vif_util [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "address": "fa:16:3e:95:4c:75", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5004fad4-57", "ovs_interfaceid": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.276 227766 DEBUG nova.network.os_vif_util [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:4c:75,bridge_name='br-int',has_traffic_filtering=True,id=5004fad4-5788-4709-9c83-b5fe075c0aa7,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5004fad4-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.277 227766 DEBUG nova.objects.instance [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'pci_devices' on Instance uuid 33559028-00d9-4918-9015-26172db3d00c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.315 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <uuid>33559028-00d9-4918-9015-26172db3d00c</uuid>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <name>instance-00000086</name>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerActionsTestOtherB-server-1716094682</nova:name>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:12:42</nova:creationTime>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:user uuid="aca3cab576d641d3b89e7dddf155d467">tempest-ServerActionsTestOtherB-1052932467-project-member</nova:user>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:project uuid="9dd869ce76e44fc8a82b8bbee1654d33">tempest-ServerActionsTestOtherB-1052932467</nova:project>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:port uuid="5004fad4-5788-4709-9c83-b5fe075c0aa7">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <entry name="serial">33559028-00d9-4918-9015-26172db3d00c</entry>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <entry name="uuid">33559028-00d9-4918-9015-26172db3d00c</entry>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/33559028-00d9-4918-9015-26172db3d00c_disk">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/33559028-00d9-4918-9015-26172db3d00c_disk.config">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:95:4c:75"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <target dev="tap5004fad4-57"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/33559028-00d9-4918-9015-26172db3d00c/console.log" append="off"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:12:43 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:12:43 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.316 227766 DEBUG nova.compute.manager [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Preparing to wait for external event network-vif-plugged-5004fad4-5788-4709-9c83-b5fe075c0aa7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.316 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "33559028-00d9-4918-9015-26172db3d00c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.316 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "33559028-00d9-4918-9015-26172db3d00c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.317 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "33559028-00d9-4918-9015-26172db3d00c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.317 227766 DEBUG nova.virt.libvirt.vif [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:12:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1716094682',display_name='tempest-ServerActionsTestOtherB-server-1716094682',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1716094682',id=134,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-frjx5azl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:12:36Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=33559028-00d9-4918-9015-26172db3d00c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "address": "fa:16:3e:95:4c:75", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5004fad4-57", "ovs_interfaceid": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.317 227766 DEBUG nova.network.os_vif_util [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "address": "fa:16:3e:95:4c:75", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5004fad4-57", "ovs_interfaceid": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.318 227766 DEBUG nova.network.os_vif_util [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:4c:75,bridge_name='br-int',has_traffic_filtering=True,id=5004fad4-5788-4709-9c83-b5fe075c0aa7,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5004fad4-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.318 227766 DEBUG os_vif [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:4c:75,bridge_name='br-int',has_traffic_filtering=True,id=5004fad4-5788-4709-9c83-b5fe075c0aa7,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5004fad4-57') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.319 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.319 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.319 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.322 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.322 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5004fad4-57, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.322 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5004fad4-57, col_values=(('external_ids', {'iface-id': '5004fad4-5788-4709-9c83-b5fe075c0aa7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:95:4c:75', 'vm-uuid': '33559028-00d9-4918-9015-26172db3d00c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.324 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:43 np0005593234 NetworkManager[48942]: <info>  [1769163163.3259] manager: (tap5004fad4-57): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/273)
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.326 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.331 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.332 227766 INFO os_vif [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:4c:75,bridge_name='br-int',has_traffic_filtering=True,id=5004fad4-5788-4709-9c83-b5fe075c0aa7,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5004fad4-57')#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.406 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.407 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.407 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No VIF found with MAC fa:16:3e:95:4c:75, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.407 227766 INFO nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Using config drive#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.431 227766 DEBUG nova.storage.rbd_utils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image 33559028-00d9-4918-9015-26172db3d00c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:12:43 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1366097735' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.479 227766 DEBUG oslo_concurrency.processutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.505 227766 DEBUG nova.storage.rbd_utils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.509 227766 DEBUG oslo_concurrency.processutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:43.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:12:43 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2026659715' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.948 227766 DEBUG oslo_concurrency.processutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.950 227766 DEBUG nova.virt.libvirt.vif [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:12:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-526971967',display_name='tempest-ServerRescueNegativeTestJSON-server-526971967',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-526971967',id=135,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0a6ba16c4b9d49d3bc24cd7b44935d1f',ramdisk_id='',reservation_id='r-rypzatwh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-87224704',owner_user_name='tempest-ServerRescueNegativeTestJSON-87224704-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:12:38Z,user_data=None,user_id='fae914e59ec54f6b80928ef3cc68dbdb',uuid=c38b8bfe-1b70-4daf-b676-250c1e933ed4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.950 227766 DEBUG nova.network.os_vif_util [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converting VIF {"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.951 227766 DEBUG nova.network.os_vif_util [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:81:35,bridge_name='br-int',has_traffic_filtering=True,id=483c7ca9-a908-4082-bbad-1ea123d6a3f1,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483c7ca9-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.952 227766 DEBUG nova.objects.instance [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'pci_devices' on Instance uuid c38b8bfe-1b70-4daf-b676-250c1e933ed4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.970 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <uuid>c38b8bfe-1b70-4daf-b676-250c1e933ed4</uuid>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <name>instance-00000087</name>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-526971967</nova:name>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:12:43</nova:creationTime>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:user uuid="fae914e59ec54f6b80928ef3cc68dbdb">tempest-ServerRescueNegativeTestJSON-87224704-project-member</nova:user>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:project uuid="0a6ba16c4b9d49d3bc24cd7b44935d1f">tempest-ServerRescueNegativeTestJSON-87224704</nova:project>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <nova:port uuid="483c7ca9-a908-4082-bbad-1ea123d6a3f1">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <entry name="serial">c38b8bfe-1b70-4daf-b676-250c1e933ed4</entry>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <entry name="uuid">c38b8bfe-1b70-4daf-b676-250c1e933ed4</entry>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.config">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:53:81:35"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <target dev="tap483c7ca9-a9"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/console.log" append="off"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:12:43 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:12:43 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:12:43 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:12:43 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.970 227766 DEBUG nova.compute.manager [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Preparing to wait for external event network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.970 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.971 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.971 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.972 227766 DEBUG nova.virt.libvirt.vif [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:12:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-526971967',display_name='tempest-ServerRescueNegativeTestJSON-server-526971967',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-526971967',id=135,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='0a6ba16c4b9d49d3bc24cd7b44935d1f',ramdisk_id='',reservation_id='r-rypzatwh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerRescueNegativeTestJSON-87224704',owner_user_name='tempest-ServerRescueNegativeTestJSON-87224704-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:12:38Z,user_data=None,user_id='fae914e59ec54f6b80928ef3cc68dbdb',uuid=c38b8bfe-1b70-4daf-b676-250c1e933ed4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.972 227766 DEBUG nova.network.os_vif_util [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converting VIF {"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.973 227766 DEBUG nova.network.os_vif_util [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:53:81:35,bridge_name='br-int',has_traffic_filtering=True,id=483c7ca9-a908-4082-bbad-1ea123d6a3f1,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483c7ca9-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.973 227766 DEBUG os_vif [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:81:35,bridge_name='br-int',has_traffic_filtering=True,id=483c7ca9-a908-4082-bbad-1ea123d6a3f1,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483c7ca9-a9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.973 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.974 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.974 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.977 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.977 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap483c7ca9-a9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.978 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap483c7ca9-a9, col_values=(('external_ids', {'iface-id': '483c7ca9-a908-4082-bbad-1ea123d6a3f1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:53:81:35', 'vm-uuid': 'c38b8bfe-1b70-4daf-b676-250c1e933ed4'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.979 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:43 np0005593234 NetworkManager[48942]: <info>  [1769163163.9801] manager: (tap483c7ca9-a9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/274)
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.982 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.986 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:43 np0005593234 nova_compute[227762]: 2026-01-23 10:12:43.986 227766 INFO os_vif [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:53:81:35,bridge_name='br-int',has_traffic_filtering=True,id=483c7ca9-a908-4082-bbad-1ea123d6a3f1,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483c7ca9-a9')#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.046 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.046 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.046 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No VIF found with MAC fa:16:3e:53:81:35, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.047 227766 INFO nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Using config drive#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.073 227766 DEBUG nova.storage.rbd_utils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.484 227766 INFO nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Creating config drive at /var/lib/nova/instances/33559028-00d9-4918-9015-26172db3d00c/disk.config#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.490 227766 DEBUG oslo_concurrency.processutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/33559028-00d9-4918-9015-26172db3d00c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy1_zhny4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:12:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/991099039' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:12:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:12:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/991099039' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.625 227766 DEBUG oslo_concurrency.processutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/33559028-00d9-4918-9015-26172db3d00c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpy1_zhny4" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.653 227766 DEBUG nova.storage.rbd_utils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] rbd image 33559028-00d9-4918-9015-26172db3d00c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.657 227766 DEBUG oslo_concurrency.processutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/33559028-00d9-4918-9015-26172db3d00c/disk.config 33559028-00d9-4918-9015-26172db3d00c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.682 227766 INFO nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Creating config drive at /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/disk.config#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.687 227766 DEBUG oslo_concurrency.processutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp533h31mh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.817 227766 DEBUG oslo_concurrency.processutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/33559028-00d9-4918-9015-26172db3d00c/disk.config 33559028-00d9-4918-9015-26172db3d00c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.818 227766 INFO nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Deleting local config drive /var/lib/nova/instances/33559028-00d9-4918-9015-26172db3d00c/disk.config because it was imported into RBD.#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.820 227766 DEBUG oslo_concurrency.processutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp533h31mh" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.849 227766 DEBUG nova.storage.rbd_utils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.853 227766 DEBUG oslo_concurrency.processutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/disk.config c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:12:44 np0005593234 kernel: tap5004fad4-57: entered promiscuous mode
Jan 23 05:12:44 np0005593234 NetworkManager[48942]: <info>  [1769163164.8683] manager: (tap5004fad4-57): new Tun device (/org/freedesktop/NetworkManager/Devices/275)
Jan 23 05:12:44 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:44Z|00545|binding|INFO|Claiming lport 5004fad4-5788-4709-9c83-b5fe075c0aa7 for this chassis.
Jan 23 05:12:44 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:44Z|00546|binding|INFO|5004fad4-5788-4709-9c83-b5fe075c0aa7: Claiming fa:16:3e:95:4c:75 10.100.0.7
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.878 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:44.877 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:4c:75 10.100.0.7'], port_security=['fa:16:3e:95:4c:75 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '33559028-00d9-4918-9015-26172db3d00c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8d9599b4-8855-4310-af02-cdd058438f7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dd869ce76e44fc8a82b8bbee1654d33', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cf3e0bf9-33c6-483b-a880-c8297a0be71f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=875f4baa-cb85-49ca-8f02-78715d351fdb, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=5004fad4-5788-4709-9c83-b5fe075c0aa7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:12:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:44.879 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 5004fad4-5788-4709-9c83-b5fe075c0aa7 in datapath 8d9599b4-8855-4310-af02-cdd058438f7d bound to our chassis#033[00m
Jan 23 05:12:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:44.880 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8d9599b4-8855-4310-af02-cdd058438f7d#033[00m
Jan 23 05:12:44 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:44Z|00547|binding|INFO|Setting lport 5004fad4-5788-4709-9c83-b5fe075c0aa7 ovn-installed in OVS
Jan 23 05:12:44 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:44Z|00548|binding|INFO|Setting lport 5004fad4-5788-4709-9c83-b5fe075c0aa7 up in Southbound
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.891 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:44 np0005593234 systemd-udevd[290268]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:12:44 np0005593234 systemd-machined[195626]: New machine qemu-65-instance-00000086.
Jan 23 05:12:44 np0005593234 nova_compute[227762]: 2026-01-23 10:12:44.899 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:44.903 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ced56cde-7c31-4b39-bf3c-38f598ef2343]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:44 np0005593234 NetworkManager[48942]: <info>  [1769163164.9103] device (tap5004fad4-57): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:12:44 np0005593234 NetworkManager[48942]: <info>  [1769163164.9112] device (tap5004fad4-57): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:12:44 np0005593234 systemd[1]: Started Virtual Machine qemu-65-instance-00000086.
Jan 23 05:12:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:44.934 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9370d0a4-8a01-493f-a9e9-b932d87054a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:44.937 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e21bd0b8-117b-49e3-ac24-c550999b4834]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:44.966 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[bc581495-635c-4285-982e-487dda8769d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:44.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:44.985 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[94f1d230-37eb-44d3-8fff-fd3b3a310277]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8d9599b4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:a1:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 13, 'rx_bytes': 700, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687969, 'reachable_time': 24655, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290296, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.001 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3a1bd0f2-f2f1-40ba-bf08-909ae9017732]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8d9599b4-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687982, 'tstamp': 687982}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290298, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8d9599b4-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687985, 'tstamp': 687985}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290298, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.003 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d9599b4-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.005 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.010 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.011 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d9599b4-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.011 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.012 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8d9599b4-80, col_values=(('external_ids', {'iface-id': 'b57bd565-3bb1-4ecc-8df0-a7c439ac84a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.012 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.015 227766 DEBUG nova.network.neutron [req-a1fb33ff-48e5-4ef5-9760-2efdad5a7b30 req-1d15a647-8ed6-48cf-83a8-03d9a7f1677d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Updated VIF entry in instance network info cache for port 5004fad4-5788-4709-9c83-b5fe075c0aa7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.016 227766 DEBUG nova.network.neutron [req-a1fb33ff-48e5-4ef5-9760-2efdad5a7b30 req-1d15a647-8ed6-48cf-83a8-03d9a7f1677d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Updating instance_info_cache with network_info: [{"id": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "address": "fa:16:3e:95:4c:75", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5004fad4-57", "ovs_interfaceid": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.035 227766 DEBUG oslo_concurrency.lockutils [req-a1fb33ff-48e5-4ef5-9760-2efdad5a7b30 req-1d15a647-8ed6-48cf-83a8-03d9a7f1677d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.159 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.166 227766 DEBUG oslo_concurrency.processutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/disk.config c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.313s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.167 227766 INFO nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Deleting local config drive /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/disk.config because it was imported into RBD.#033[00m
Jan 23 05:12:45 np0005593234 kernel: tap483c7ca9-a9: entered promiscuous mode
Jan 23 05:12:45 np0005593234 NetworkManager[48942]: <info>  [1769163165.2155] manager: (tap483c7ca9-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/276)
Jan 23 05:12:45 np0005593234 systemd-udevd[290279]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:12:45 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:45Z|00549|binding|INFO|Claiming lport 483c7ca9-a908-4082-bbad-1ea123d6a3f1 for this chassis.
Jan 23 05:12:45 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:45Z|00550|binding|INFO|483c7ca9-a908-4082-bbad-1ea123d6a3f1: Claiming fa:16:3e:53:81:35 10.100.0.12
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.219 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.223 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:81:35 10.100.0.12'], port_security=['fa:16:3e:53:81:35 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c38b8bfe-1b70-4daf-b676-250c1e933ed4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a6ba16c4b9d49d3bc24cd7b44935d1f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6fc0d424-7779-4175-b5e0-e2613de6ecef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fb685af-2efd-4d70-8868-8a86ed4c3ca6, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=483c7ca9-a908-4082-bbad-1ea123d6a3f1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.224 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 483c7ca9-a908-4082-bbad-1ea123d6a3f1 in datapath 00bd3319-bfe5-4acd-b2e4-17830ee847f9 bound to our chassis#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.226 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 00bd3319-bfe5-4acd-b2e4-17830ee847f9#033[00m
Jan 23 05:12:45 np0005593234 NetworkManager[48942]: <info>  [1769163165.2316] device (tap483c7ca9-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:12:45 np0005593234 NetworkManager[48942]: <info>  [1769163165.2333] device (tap483c7ca9-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:12:45 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:45Z|00551|binding|INFO|Setting lport 483c7ca9-a908-4082-bbad-1ea123d6a3f1 ovn-installed in OVS
Jan 23 05:12:45 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:45Z|00552|binding|INFO|Setting lport 483c7ca9-a908-4082-bbad-1ea123d6a3f1 up in Southbound
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.236 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.240 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.242 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[df9cac93-1c7f-498b-ae09-46f30fa08e71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.242 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap00bd3319-b1 in ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.244 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap00bd3319-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.244 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3728b063-e209-4729-ae33-c115288aa3d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.245 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1f955adb-b0b4-4dfe-814a-b77638e820ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 systemd-machined[195626]: New machine qemu-66-instance-00000087.
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.262 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[cad7297f-0c54-4a3b-8d8a-8bf3fdd1dd0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 systemd[1]: Started Virtual Machine qemu-66-instance-00000087.
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.280 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f0c9a07b-5299-45b8-a950-e7c89e76c0be]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.310 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[6bdbb0bb-655d-4e6e-af3a-18a69b519c0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.315 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9872ec6d-c840-428e-b60c-df092fd3189e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 NetworkManager[48942]: <info>  [1769163165.3161] manager: (tap00bd3319-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/277)
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.350 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[94762e48-d54c-4ade-b20d-fe4edae7f317]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.354 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[eba3b1cc-4be2-400e-b14d-93cbf716bf46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 NetworkManager[48942]: <info>  [1769163165.3761] device (tap00bd3319-b0): carrier: link connected
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.384 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f73b476c-3191-4b1c-ac63-edd4c663bfc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.402 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a4ba88ee-a33a-4961-bba1-2ba6c54d1d5f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00bd3319-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:83:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706033, 'reachable_time': 22199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290346, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.419 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7e1cafef-03af-4696-ad83-71bd7ffd6325]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6b:83f8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 706033, 'tstamp': 706033}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 290347, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.435 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[77ff5ae4-97b2-4744-8ee1-4ebe9537cd28]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00bd3319-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:83:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 177], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706033, 'reachable_time': 22199, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 290348, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.468 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4d30dc29-f1b6-4eba-a18e-764441b03cb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.522 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e22437d9-321b-4d4f-a8b3-88df9df7515b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.524 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00bd3319-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.525 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.525 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00bd3319-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:45 np0005593234 NetworkManager[48942]: <info>  [1769163165.5282] manager: (tap00bd3319-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/278)
Jan 23 05:12:45 np0005593234 kernel: tap00bd3319-b0: entered promiscuous mode
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.527 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.533 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap00bd3319-b0, col_values=(('external_ids', {'iface-id': '1788b5e6-601b-4e3d-a584-c0138c3308f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.535 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:45 np0005593234 ovn_controller[134547]: 2026-01-23T10:12:45Z|00553|binding|INFO|Releasing lport 1788b5e6-601b-4e3d-a584-c0138c3308f6 from this chassis (sb_readonly=0)
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.536 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.537 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.538 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d2c0ea68-61cb-4fa4-a97f-684cb241dfc4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.540 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-00bd3319-bfe5-4acd-b2e4-17830ee847f9
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 00bd3319-bfe5-4acd-b2e4-17830ee847f9
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:12:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:12:45.542 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'env', 'PROCESS_TAG=haproxy-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/00bd3319-bfe5-4acd-b2e4-17830ee847f9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.589 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.815 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163165.8139277, c38b8bfe-1b70-4daf-b676-250c1e933ed4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.815 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] VM Started (Lifecycle Event)#033[00m
Jan 23 05:12:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:45.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.852 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.856 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163165.8141527, c38b8bfe-1b70-4daf-b676-250c1e933ed4 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.857 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.886 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.923 227766 DEBUG nova.network.neutron [req-a999fa20-83c4-407b-8b60-8ce1b4f7e616 req-591ee1cf-5956-4516-98e3-80c0581950fd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Updated VIF entry in instance network info cache for port 483c7ca9-a908-4082-bbad-1ea123d6a3f1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.923 227766 DEBUG nova.network.neutron [req-a999fa20-83c4-407b-8b60-8ce1b4f7e616 req-591ee1cf-5956-4516-98e3-80c0581950fd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Updating instance_info_cache with network_info: [{"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.927 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.954 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.955 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163165.8213873, 33559028-00d9-4918-9015-26172db3d00c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.955 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 33559028-00d9-4918-9015-26172db3d00c] VM Started (Lifecycle Event)#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.956 227766 DEBUG oslo_concurrency.lockutils [req-a999fa20-83c4-407b-8b60-8ce1b4f7e616 req-591ee1cf-5956-4516-98e3-80c0581950fd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.981 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 33559028-00d9-4918-9015-26172db3d00c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.985 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163165.821453, 33559028-00d9-4918-9015-26172db3d00c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:12:45 np0005593234 nova_compute[227762]: 2026-01-23 10:12:45.985 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 33559028-00d9-4918-9015-26172db3d00c] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:12:46 np0005593234 nova_compute[227762]: 2026-01-23 10:12:46.013 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 33559028-00d9-4918-9015-26172db3d00c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:46 np0005593234 nova_compute[227762]: 2026-01-23 10:12:46.017 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 33559028-00d9-4918-9015-26172db3d00c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:12:46 np0005593234 podman[290464]: 2026-01-23 10:12:46.029855644 +0000 UTC m=+0.053060150 container create a1b5f360e55e398afedb7a2d8c5f483f9fbb859ad57b9d47792cfe20ab566912 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:12:46 np0005593234 nova_compute[227762]: 2026-01-23 10:12:46.048 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 33559028-00d9-4918-9015-26172db3d00c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:12:46 np0005593234 systemd[1]: Started libpod-conmon-a1b5f360e55e398afedb7a2d8c5f483f9fbb859ad57b9d47792cfe20ab566912.scope.
Jan 23 05:12:46 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:12:46 np0005593234 podman[290464]: 2026-01-23 10:12:46.000525612 +0000 UTC m=+0.023730118 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:12:46 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/24c5a93fcf163a4d5328d81142dc11bb2c0e791a6fd7d6eaf57dfb43edd86207/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:12:46 np0005593234 podman[290464]: 2026-01-23 10:12:46.110996016 +0000 UTC m=+0.134200522 container init a1b5f360e55e398afedb7a2d8c5f483f9fbb859ad57b9d47792cfe20ab566912 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:12:46 np0005593234 podman[290464]: 2026-01-23 10:12:46.116079404 +0000 UTC m=+0.139283920 container start a1b5f360e55e398afedb7a2d8c5f483f9fbb859ad57b9d47792cfe20ab566912 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:12:46 np0005593234 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[290480]: [NOTICE]   (290484) : New worker (290486) forked
Jan 23 05:12:46 np0005593234 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[290480]: [NOTICE]   (290484) : Loading success.
Jan 23 05:12:46 np0005593234 nova_compute[227762]: 2026-01-23 10:12:46.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:12:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:46.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:12:47 np0005593234 nova_compute[227762]: 2026-01-23 10:12:47.043 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:47.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.380 227766 DEBUG nova.compute.manager [req-dcebb302-f3a8-4e3d-bbaf-0099108b74b1 req-393c393e-fc59-4725-90bd-f4d820e3a468 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Received event network-vif-plugged-5004fad4-5788-4709-9c83-b5fe075c0aa7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.381 227766 DEBUG oslo_concurrency.lockutils [req-dcebb302-f3a8-4e3d-bbaf-0099108b74b1 req-393c393e-fc59-4725-90bd-f4d820e3a468 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "33559028-00d9-4918-9015-26172db3d00c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.381 227766 DEBUG oslo_concurrency.lockutils [req-dcebb302-f3a8-4e3d-bbaf-0099108b74b1 req-393c393e-fc59-4725-90bd-f4d820e3a468 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "33559028-00d9-4918-9015-26172db3d00c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.381 227766 DEBUG oslo_concurrency.lockutils [req-dcebb302-f3a8-4e3d-bbaf-0099108b74b1 req-393c393e-fc59-4725-90bd-f4d820e3a468 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "33559028-00d9-4918-9015-26172db3d00c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.381 227766 DEBUG nova.compute.manager [req-dcebb302-f3a8-4e3d-bbaf-0099108b74b1 req-393c393e-fc59-4725-90bd-f4d820e3a468 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Processing event network-vif-plugged-5004fad4-5788-4709-9c83-b5fe075c0aa7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.382 227766 DEBUG nova.compute.manager [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.386 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163168.38594, 33559028-00d9-4918-9015-26172db3d00c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.386 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 33559028-00d9-4918-9015-26172db3d00c] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.388 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.391 227766 INFO nova.virt.libvirt.driver [-] [instance: 33559028-00d9-4918-9015-26172db3d00c] Instance spawned successfully.#033[00m
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.392 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.432 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 33559028-00d9-4918-9015-26172db3d00c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.437 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 33559028-00d9-4918-9015-26172db3d00c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.440 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.440 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.441 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.441 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.441 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.442 227766 DEBUG nova.virt.libvirt.driver [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.507 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 33559028-00d9-4918-9015-26172db3d00c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.548 227766 INFO nova.compute.manager [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Took 12.36 seconds to spawn the instance on the hypervisor.
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.548 227766 DEBUG nova.compute.manager [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.620 227766 INFO nova.compute.manager [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Took 13.61 seconds to build instance.
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.637 227766 DEBUG oslo_concurrency.lockutils [None req-664bac42-2617-4993-97ea-2764005bf71f aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "33559028-00d9-4918-9015-26172db3d00c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.709s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:12:48 np0005593234 podman[290501]: 2026-01-23 10:12:48.780404759 +0000 UTC m=+0.077682005 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 05:12:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:48.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:48 np0005593234 nova_compute[227762]: 2026-01-23 10:12:48.980 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:12:49 np0005593234 nova_compute[227762]: 2026-01-23 10:12:49.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:12:49 np0005593234 nova_compute[227762]: 2026-01-23 10:12:49.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:12:49 np0005593234 nova_compute[227762]: 2026-01-23 10:12:49.800 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 05:12:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:49.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #118. Immutable memtables: 0.
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.172294) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 73] Flushing memtable with next log file: 118
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163170172362, "job": 73, "event": "flush_started", "num_memtables": 1, "num_entries": 1013, "num_deletes": 252, "total_data_size": 1904492, "memory_usage": 1937568, "flush_reason": "Manual Compaction"}
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 73] Level-0 flush table #119: started
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163170180798, "cf_name": "default", "job": 73, "event": "table_file_creation", "file_number": 119, "file_size": 1255303, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 58913, "largest_seqno": 59921, "table_properties": {"data_size": 1250773, "index_size": 2118, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10853, "raw_average_key_size": 20, "raw_value_size": 1241281, "raw_average_value_size": 2311, "num_data_blocks": 93, "num_entries": 537, "num_filter_entries": 537, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163102, "oldest_key_time": 1769163102, "file_creation_time": 1769163170, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 119, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 73] Flush lasted 8546 microseconds, and 3466 cpu microseconds.
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.180851) [db/flush_job.cc:967] [default] [JOB 73] Level-0 flush table #119: 1255303 bytes OK
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.180869) [db/memtable_list.cc:519] [default] Level-0 commit table #119 started
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.184493) [db/memtable_list.cc:722] [default] Level-0 commit table #119: memtable #1 done
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.184505) EVENT_LOG_v1 {"time_micros": 1769163170184502, "job": 73, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.184522) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 73] Try to delete WAL files size 1899357, prev total WAL file size 1920154, number of live WAL files 2.
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000115.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.185458) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035303230' seq:72057594037927935, type:22 .. '7061786F730035323732' seq:0, type:0; will stop at (end)
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 74] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 73 Base level 0, inputs: [119(1225KB)], [117(12MB)]
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163170185709, "job": 74, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [119], "files_L6": [117], "score": -1, "input_data_size": 13992256, "oldest_snapshot_seqno": -1}
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 74] Generated table #120: 8168 keys, 12128141 bytes, temperature: kUnknown
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163170276718, "cf_name": "default", "job": 74, "event": "table_file_creation", "file_number": 120, "file_size": 12128141, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12073773, "index_size": 32816, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20485, "raw_key_size": 212393, "raw_average_key_size": 26, "raw_value_size": 11928589, "raw_average_value_size": 1460, "num_data_blocks": 1287, "num_entries": 8168, "num_filter_entries": 8168, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769163170, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 120, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.276944) [db/compaction/compaction_job.cc:1663] [default] [JOB 74] Compacted 1@0 + 1@6 files to L6 => 12128141 bytes
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.278376) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.6 rd, 133.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.2, 12.1 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(20.8) write-amplify(9.7) OK, records in: 8689, records dropped: 521 output_compression: NoCompression
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.278414) EVENT_LOG_v1 {"time_micros": 1769163170278385, "job": 74, "event": "compaction_finished", "compaction_time_micros": 91077, "compaction_time_cpu_micros": 40149, "output_level": 6, "num_output_files": 1, "total_output_size": 12128141, "num_input_records": 8689, "num_output_records": 8168, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000119.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163170278697, "job": 74, "event": "table_file_deletion", "file_number": 119}
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000117.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163170280678, "job": 74, "event": "table_file_deletion", "file_number": 117}
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.185211) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.280821) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.280825) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.280826) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.280828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:12:50 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:12:50.280829) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.628 227766 DEBUG nova.compute.manager [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Received event network-vif-plugged-5004fad4-5788-4709-9c83-b5fe075c0aa7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.628 227766 DEBUG oslo_concurrency.lockutils [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "33559028-00d9-4918-9015-26172db3d00c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.629 227766 DEBUG oslo_concurrency.lockutils [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "33559028-00d9-4918-9015-26172db3d00c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.629 227766 DEBUG oslo_concurrency.lockutils [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "33559028-00d9-4918-9015-26172db3d00c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.629 227766 DEBUG nova.compute.manager [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] No waiting events found dispatching network-vif-plugged-5004fad4-5788-4709-9c83-b5fe075c0aa7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.629 227766 WARNING nova.compute.manager [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Received unexpected event network-vif-plugged-5004fad4-5788-4709-9c83-b5fe075c0aa7 for instance with vm_state active and task_state None.
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.629 227766 DEBUG nova.compute.manager [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received event network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.630 227766 DEBUG oslo_concurrency.lockutils [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.630 227766 DEBUG oslo_concurrency.lockutils [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.630 227766 DEBUG oslo_concurrency.lockutils [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.630 227766 DEBUG nova.compute.manager [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Processing event network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.630 227766 DEBUG nova.compute.manager [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received event network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.630 227766 DEBUG oslo_concurrency.lockutils [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.631 227766 DEBUG oslo_concurrency.lockutils [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.631 227766 DEBUG oslo_concurrency.lockutils [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.631 227766 DEBUG nova.compute.manager [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] No waiting events found dispatching network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.631 227766 WARNING nova.compute.manager [req-335ca8a1-8bd9-43b1-912d-d0c10b637727 req-56e6122f-6111-4706-9686-c1ac984c2039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received unexpected event network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 for instance with vm_state building and task_state spawning.
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.632 227766 DEBUG nova.compute.manager [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.640 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.645 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163170.6451325, c38b8bfe-1b70-4daf-b676-250c1e933ed4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.645 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] VM Resumed (Lifecycle Event)
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.652 227766 INFO nova.virt.libvirt.driver [-] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Instance spawned successfully.
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.653 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.677 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.684 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.687 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.688 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.688 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.689 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.689 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.689 227766 DEBUG nova.virt.libvirt.driver [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.738 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.803 227766 INFO nova.compute.manager [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Took 12.66 seconds to spawn the instance on the hypervisor.
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.803 227766 DEBUG nova.compute.manager [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.891 227766 INFO nova.compute.manager [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Took 15.30 seconds to build instance.
Jan 23 05:12:50 np0005593234 nova_compute[227762]: 2026-01-23 10:12:50.925 227766 DEBUG oslo_concurrency.lockutils [None req-b8395125-61df-468c-96cd-93aedfad9e67 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.410s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:12:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:12:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:50.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:12:51 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:12:51 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:12:51 np0005593234 nova_compute[227762]: 2026-01-23 10:12:51.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:12:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:12:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:51.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:12:52 np0005593234 nova_compute[227762]: 2026-01-23 10:12:52.045 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:52.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:53.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:53 np0005593234 nova_compute[227762]: 2026-01-23 10:12:53.983 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:12:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:54.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:12:55 np0005593234 nova_compute[227762]: 2026-01-23 10:12:55.032 227766 DEBUG nova.compute.manager [req-e1441622-e554-421c-b9c0-61a470076e0c req-6e016cfa-5a46-4d1a-b5f5-6efaa659ba14 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Received event network-changed-5004fad4-5788-4709-9c83-b5fe075c0aa7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:12:55 np0005593234 nova_compute[227762]: 2026-01-23 10:12:55.032 227766 DEBUG nova.compute.manager [req-e1441622-e554-421c-b9c0-61a470076e0c req-6e016cfa-5a46-4d1a-b5f5-6efaa659ba14 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Refreshing instance network info cache due to event network-changed-5004fad4-5788-4709-9c83-b5fe075c0aa7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:12:55 np0005593234 nova_compute[227762]: 2026-01-23 10:12:55.033 227766 DEBUG oslo_concurrency.lockutils [req-e1441622-e554-421c-b9c0-61a470076e0c req-6e016cfa-5a46-4d1a-b5f5-6efaa659ba14 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:12:55 np0005593234 nova_compute[227762]: 2026-01-23 10:12:55.033 227766 DEBUG oslo_concurrency.lockutils [req-e1441622-e554-421c-b9c0-61a470076e0c req-6e016cfa-5a46-4d1a-b5f5-6efaa659ba14 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:12:55 np0005593234 nova_compute[227762]: 2026-01-23 10:12:55.033 227766 DEBUG nova.network.neutron [req-e1441622-e554-421c-b9c0-61a470076e0c req-6e016cfa-5a46-4d1a-b5f5-6efaa659ba14 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Refreshing network info cache for port 5004fad4-5788-4709-9c83-b5fe075c0aa7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:12:55 np0005593234 nova_compute[227762]: 2026-01-23 10:12:55.257 227766 INFO nova.compute.manager [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Rescuing#033[00m
Jan 23 05:12:55 np0005593234 nova_compute[227762]: 2026-01-23 10:12:55.258 227766 DEBUG oslo_concurrency.lockutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:12:55 np0005593234 nova_compute[227762]: 2026-01-23 10:12:55.258 227766 DEBUG oslo_concurrency.lockutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquired lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:12:55 np0005593234 nova_compute[227762]: 2026-01-23 10:12:55.259 227766 DEBUG nova.network.neutron [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:12:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:55.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:56.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:57 np0005593234 nova_compute[227762]: 2026-01-23 10:12:57.047 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:12:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:57.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:58.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:58 np0005593234 nova_compute[227762]: 2026-01-23 10:12:58.986 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:12:59 np0005593234 nova_compute[227762]: 2026-01-23 10:12:59.191 227766 DEBUG nova.network.neutron [req-e1441622-e554-421c-b9c0-61a470076e0c req-6e016cfa-5a46-4d1a-b5f5-6efaa659ba14 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Updated VIF entry in instance network info cache for port 5004fad4-5788-4709-9c83-b5fe075c0aa7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:12:59 np0005593234 nova_compute[227762]: 2026-01-23 10:12:59.192 227766 DEBUG nova.network.neutron [req-e1441622-e554-421c-b9c0-61a470076e0c req-6e016cfa-5a46-4d1a-b5f5-6efaa659ba14 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Updating instance_info_cache with network_info: [{"id": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "address": "fa:16:3e:95:4c:75", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5004fad4-57", "ovs_interfaceid": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:12:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:12:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:12:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:59.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:12:59 np0005593234 nova_compute[227762]: 2026-01-23 10:12:59.980 227766 DEBUG oslo_concurrency.lockutils [req-e1441622-e554-421c-b9c0-61a470076e0c req-6e016cfa-5a46-4d1a-b5f5-6efaa659ba14 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:13:00 np0005593234 nova_compute[227762]: 2026-01-23 10:13:00.156 227766 DEBUG nova.network.neutron [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Updating instance_info_cache with network_info: [{"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:13:00 np0005593234 nova_compute[227762]: 2026-01-23 10:13:00.566 227766 DEBUG oslo_concurrency.lockutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Releasing lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:13:00 np0005593234 nova_compute[227762]: 2026-01-23 10:13:00.900 227766 DEBUG nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:13:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:00.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:01 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:01Z|00056|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:95:4c:75 10.100.0.7
Jan 23 05:13:01 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:01Z|00057|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:95:4c:75 10.100.0.7
Jan 23 05:13:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:01.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:02 np0005593234 nova_compute[227762]: 2026-01-23 10:13:02.046 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:02.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:13:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:03.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:13:03 np0005593234 nova_compute[227762]: 2026-01-23 10:13:03.989 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:04 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:04Z|00058|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:53:81:35 10.100.0.12
Jan 23 05:13:04 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:04Z|00059|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:53:81:35 10.100.0.12
Jan 23 05:13:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:04.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:13:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:05.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:13:06 np0005593234 nova_compute[227762]: 2026-01-23 10:13:06.156 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:06.156 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=49, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=48) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:13:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:06.158 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:13:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:06.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:07 np0005593234 nova_compute[227762]: 2026-01-23 10:13:07.047 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:07.161 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '49'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:13:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:07.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:08.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:08 np0005593234 nova_compute[227762]: 2026-01-23 10:13:08.992 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:09.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:10 np0005593234 nova_compute[227762]: 2026-01-23 10:13:10.941 227766 DEBUG nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101#033[00m
Jan 23 05:13:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:10.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:11 np0005593234 podman[290685]: 2026-01-23 10:13:11.784444479 +0000 UTC m=+0.062250516 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:13:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:13:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:11.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:13:12 np0005593234 nova_compute[227762]: 2026-01-23 10:13:12.050 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:12.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:13.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:14 np0005593234 nova_compute[227762]: 2026-01-23 10:13:14.019 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:14 np0005593234 kernel: tap483c7ca9-a9 (unregistering): left promiscuous mode
Jan 23 05:13:14 np0005593234 NetworkManager[48942]: <info>  [1769163194.4002] device (tap483c7ca9-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:13:14 np0005593234 nova_compute[227762]: 2026-01-23 10:13:14.409 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:14Z|00554|binding|INFO|Releasing lport 483c7ca9-a908-4082-bbad-1ea123d6a3f1 from this chassis (sb_readonly=0)
Jan 23 05:13:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:14Z|00555|binding|INFO|Setting lport 483c7ca9-a908-4082-bbad-1ea123d6a3f1 down in Southbound
Jan 23 05:13:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:14Z|00556|binding|INFO|Removing iface tap483c7ca9-a9 ovn-installed in OVS
Jan 23 05:13:14 np0005593234 nova_compute[227762]: 2026-01-23 10:13:14.412 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:14.417 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:81:35 10.100.0.12'], port_security=['fa:16:3e:53:81:35 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c38b8bfe-1b70-4daf-b676-250c1e933ed4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a6ba16c4b9d49d3bc24cd7b44935d1f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6fc0d424-7779-4175-b5e0-e2613de6ecef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fb685af-2efd-4d70-8868-8a86ed4c3ca6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=483c7ca9-a908-4082-bbad-1ea123d6a3f1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:13:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:14.419 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 483c7ca9-a908-4082-bbad-1ea123d6a3f1 in datapath 00bd3319-bfe5-4acd-b2e4-17830ee847f9 unbound from our chassis#033[00m
Jan 23 05:13:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:14.420 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 00bd3319-bfe5-4acd-b2e4-17830ee847f9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:13:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:14.423 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9925b3fa-8e70-4abf-9032-5267c15a3831]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:14.424 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 namespace which is not needed anymore#033[00m
Jan 23 05:13:14 np0005593234 nova_compute[227762]: 2026-01-23 10:13:14.426 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:14 np0005593234 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000087.scope: Deactivated successfully.
Jan 23 05:13:14 np0005593234 systemd[1]: machine-qemu\x2d66\x2dinstance\x2d00000087.scope: Consumed 14.358s CPU time.
Jan 23 05:13:14 np0005593234 systemd-machined[195626]: Machine qemu-66-instance-00000087 terminated.
Jan 23 05:13:14 np0005593234 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[290480]: [NOTICE]   (290484) : haproxy version is 2.8.14-c23fe91
Jan 23 05:13:14 np0005593234 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[290480]: [NOTICE]   (290484) : path to executable is /usr/sbin/haproxy
Jan 23 05:13:14 np0005593234 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[290480]: [WARNING]  (290484) : Exiting Master process...
Jan 23 05:13:14 np0005593234 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[290480]: [ALERT]    (290484) : Current worker (290486) exited with code 143 (Terminated)
Jan 23 05:13:14 np0005593234 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[290480]: [WARNING]  (290484) : All workers exited. Exiting... (0)
Jan 23 05:13:14 np0005593234 systemd[1]: libpod-a1b5f360e55e398afedb7a2d8c5f483f9fbb859ad57b9d47792cfe20ab566912.scope: Deactivated successfully.
Jan 23 05:13:14 np0005593234 podman[290731]: 2026-01-23 10:13:14.555645907 +0000 UTC m=+0.047303682 container died a1b5f360e55e398afedb7a2d8c5f483f9fbb859ad57b9d47792cfe20ab566912 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:13:14 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a1b5f360e55e398afedb7a2d8c5f483f9fbb859ad57b9d47792cfe20ab566912-userdata-shm.mount: Deactivated successfully.
Jan 23 05:13:14 np0005593234 systemd[1]: var-lib-containers-storage-overlay-24c5a93fcf163a4d5328d81142dc11bb2c0e791a6fd7d6eaf57dfb43edd86207-merged.mount: Deactivated successfully.
Jan 23 05:13:14 np0005593234 podman[290731]: 2026-01-23 10:13:14.59790316 +0000 UTC m=+0.089560935 container cleanup a1b5f360e55e398afedb7a2d8c5f483f9fbb859ad57b9d47792cfe20ab566912 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 05:13:14 np0005593234 systemd[1]: libpod-conmon-a1b5f360e55e398afedb7a2d8c5f483f9fbb859ad57b9d47792cfe20ab566912.scope: Deactivated successfully.
Jan 23 05:13:14 np0005593234 nova_compute[227762]: 2026-01-23 10:13:14.628 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:14 np0005593234 nova_compute[227762]: 2026-01-23 10:13:14.632 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:14 np0005593234 podman[290759]: 2026-01-23 10:13:14.666797832 +0000 UTC m=+0.048913572 container remove a1b5f360e55e398afedb7a2d8c5f483f9fbb859ad57b9d47792cfe20ab566912 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:13:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:14.674 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[91dc9413-770f-441c-9b77-3e1cad2d6ba3]: (4, ('Fri Jan 23 10:13:14 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 (a1b5f360e55e398afedb7a2d8c5f483f9fbb859ad57b9d47792cfe20ab566912)\na1b5f360e55e398afedb7a2d8c5f483f9fbb859ad57b9d47792cfe20ab566912\nFri Jan 23 10:13:14 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 (a1b5f360e55e398afedb7a2d8c5f483f9fbb859ad57b9d47792cfe20ab566912)\na1b5f360e55e398afedb7a2d8c5f483f9fbb859ad57b9d47792cfe20ab566912\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:14.678 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5b863665-0a32-4aa5-baab-aa41d27b4f6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:14.679 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00bd3319-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:13:14 np0005593234 kernel: tap00bd3319-b0: left promiscuous mode
Jan 23 05:13:14 np0005593234 nova_compute[227762]: 2026-01-23 10:13:14.682 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:14 np0005593234 nova_compute[227762]: 2026-01-23 10:13:14.701 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:14.704 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6f0938b8-c637-435f-98c2-0fc44a66b299]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:14.719 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c694dc56-cef6-47e9-ae64-0baaf8c9bcaf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:14.720 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[189f7d68-9357-48e5-887f-b76248a2be7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:14.737 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[730ae456-30bb-4770-b28f-920fab942526]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 706026, 'reachable_time': 36524, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 290788, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:14.741 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:13:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:14.741 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[14b14f68-323c-45af-9963-e3f77ea4f6fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:14 np0005593234 systemd[1]: run-netns-ovnmeta\x2d00bd3319\x2dbfe5\x2d4acd\x2db2e4\x2d17830ee847f9.mount: Deactivated successfully.
Jan 23 05:13:14 np0005593234 nova_compute[227762]: 2026-01-23 10:13:14.958 227766 INFO nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Instance shutdown successfully after 14 seconds.#033[00m
Jan 23 05:13:14 np0005593234 nova_compute[227762]: 2026-01-23 10:13:14.964 227766 INFO nova.virt.libvirt.driver [-] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Instance destroyed successfully.#033[00m
Jan 23 05:13:14 np0005593234 nova_compute[227762]: 2026-01-23 10:13:14.965 227766 DEBUG nova.objects.instance [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'numa_topology' on Instance uuid c38b8bfe-1b70-4daf-b676-250c1e933ed4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:13:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:14.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:15 np0005593234 nova_compute[227762]: 2026-01-23 10:13:15.529 227766 INFO nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Attempting rescue#033[00m
Jan 23 05:13:15 np0005593234 nova_compute[227762]: 2026-01-23 10:13:15.531 227766 DEBUG nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 23 05:13:15 np0005593234 nova_compute[227762]: 2026-01-23 10:13:15.537 227766 DEBUG nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 05:13:15 np0005593234 nova_compute[227762]: 2026-01-23 10:13:15.537 227766 INFO nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Creating image(s)#033[00m
Jan 23 05:13:15 np0005593234 nova_compute[227762]: 2026-01-23 10:13:15.566 227766 DEBUG nova.storage.rbd_utils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:13:15 np0005593234 nova_compute[227762]: 2026-01-23 10:13:15.569 227766 DEBUG nova.objects.instance [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'trusted_certs' on Instance uuid c38b8bfe-1b70-4daf-b676-250c1e933ed4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:13:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:15.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:16 np0005593234 nova_compute[227762]: 2026-01-23 10:13:16.518 227766 DEBUG nova.storage.rbd_utils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:13:16 np0005593234 nova_compute[227762]: 2026-01-23 10:13:16.542 227766 DEBUG nova.storage.rbd_utils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:13:16 np0005593234 nova_compute[227762]: 2026-01-23 10:13:16.546 227766 DEBUG oslo_concurrency.processutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:16 np0005593234 nova_compute[227762]: 2026-01-23 10:13:16.608 227766 DEBUG oslo_concurrency.processutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:16 np0005593234 nova_compute[227762]: 2026-01-23 10:13:16.609 227766 DEBUG oslo_concurrency.lockutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:16 np0005593234 nova_compute[227762]: 2026-01-23 10:13:16.610 227766 DEBUG oslo_concurrency.lockutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:16 np0005593234 nova_compute[227762]: 2026-01-23 10:13:16.610 227766 DEBUG oslo_concurrency.lockutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:16 np0005593234 nova_compute[227762]: 2026-01-23 10:13:16.633 227766 DEBUG nova.storage.rbd_utils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:13:16 np0005593234 nova_compute[227762]: 2026-01-23 10:13:16.636 227766 DEBUG oslo_concurrency.processutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:16 np0005593234 nova_compute[227762]: 2026-01-23 10:13:16.915 227766 DEBUG oslo_concurrency.processutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.279s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:16 np0005593234 nova_compute[227762]: 2026-01-23 10:13:16.917 227766 DEBUG nova.objects.instance [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'migration_context' on Instance uuid c38b8bfe-1b70-4daf-b676-250c1e933ed4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:13:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:16.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.052 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.472 227766 DEBUG nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.473 227766 DEBUG nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Start _get_guest_xml network_info=[{"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "vif_mac": "fa:16:3e:53:81:35"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'disk.rescue': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config.rescue': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed', 'kernel_id': '', 'ramdisk_id': ''} block_device_info=None _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.473 227766 DEBUG nova.objects.instance [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'resources' on Instance uuid c38b8bfe-1b70-4daf-b676-250c1e933ed4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.495 227766 WARNING nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.504 227766 DEBUG nova.virt.libvirt.host [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.504 227766 DEBUG nova.virt.libvirt.host [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.507 227766 DEBUG nova.virt.libvirt.host [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.507 227766 DEBUG nova.virt.libvirt.host [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.508 227766 DEBUG nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:13:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.509 227766 DEBUG nova.virt.hardware [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.509 227766 DEBUG nova.virt.hardware [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.510 227766 DEBUG nova.virt.hardware [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.510 227766 DEBUG nova.virt.hardware [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.510 227766 DEBUG nova.virt.hardware [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.510 227766 DEBUG nova.virt.hardware [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.511 227766 DEBUG nova.virt.hardware [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.511 227766 DEBUG nova.virt.hardware [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.511 227766 DEBUG nova.virt.hardware [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.511 227766 DEBUG nova.virt.hardware [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.511 227766 DEBUG nova.virt.hardware [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.512 227766 DEBUG nova.objects.instance [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'vcpu_model' on Instance uuid c38b8bfe-1b70-4daf-b676-250c1e933ed4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.537 227766 DEBUG oslo_concurrency.processutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.599 227766 DEBUG nova.compute.manager [req-2336125f-4889-4d22-bce5-8e744683e210 req-3bac1fc3-20ef-44d1-a2dc-b1a2b34ac8bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received event network-vif-unplugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.600 227766 DEBUG oslo_concurrency.lockutils [req-2336125f-4889-4d22-bce5-8e744683e210 req-3bac1fc3-20ef-44d1-a2dc-b1a2b34ac8bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.600 227766 DEBUG oslo_concurrency.lockutils [req-2336125f-4889-4d22-bce5-8e744683e210 req-3bac1fc3-20ef-44d1-a2dc-b1a2b34ac8bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.600 227766 DEBUG oslo_concurrency.lockutils [req-2336125f-4889-4d22-bce5-8e744683e210 req-3bac1fc3-20ef-44d1-a2dc-b1a2b34ac8bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.601 227766 DEBUG nova.compute.manager [req-2336125f-4889-4d22-bce5-8e744683e210 req-3bac1fc3-20ef-44d1-a2dc-b1a2b34ac8bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] No waiting events found dispatching network-vif-unplugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.601 227766 WARNING nova.compute.manager [req-2336125f-4889-4d22-bce5-8e744683e210 req-3bac1fc3-20ef-44d1-a2dc-b1a2b34ac8bc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received unexpected event network-vif-unplugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:13:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:13:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:17.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:13:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:13:17 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2193555327' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.980 227766 DEBUG oslo_concurrency.processutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:17 np0005593234 nova_compute[227762]: 2026-01-23 10:13:17.981 227766 DEBUG oslo_concurrency.processutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:13:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2151799074' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:13:18 np0005593234 nova_compute[227762]: 2026-01-23 10:13:18.452 227766 DEBUG oslo_concurrency.processutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:18 np0005593234 nova_compute[227762]: 2026-01-23 10:13:18.454 227766 DEBUG oslo_concurrency.processutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:13:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4171807297' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:13:18 np0005593234 nova_compute[227762]: 2026-01-23 10:13:18.915 227766 DEBUG oslo_concurrency.processutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:18 np0005593234 nova_compute[227762]: 2026-01-23 10:13:18.918 227766 DEBUG nova.virt.libvirt.vif [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:12:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-526971967',display_name='tempest-ServerRescueNegativeTestJSON-server-526971967',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-526971967',id=135,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:12:50Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0a6ba16c4b9d49d3bc24cd7b44935d1f',ramdisk_id='',reservation_id='r-rypzatwh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-87224704',owner_user_name='tempest-ServerRescueNegativeTestJSON-87224704-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:12:50Z,user_data=None,user_id='fae914e59ec54f6b80928ef3cc68dbdb',uuid=c38b8bfe-1b70-4daf-b676-250c1e933ed4,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "vif_mac": "fa:16:3e:53:81:35"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:13:18 np0005593234 nova_compute[227762]: 2026-01-23 10:13:18.919 227766 DEBUG nova.network.os_vif_util [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converting VIF {"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "vif_mac": "fa:16:3e:53:81:35"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:13:18 np0005593234 nova_compute[227762]: 2026-01-23 10:13:18.920 227766 DEBUG nova.network.os_vif_util [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:53:81:35,bridge_name='br-int',has_traffic_filtering=True,id=483c7ca9-a908-4082-bbad-1ea123d6a3f1,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483c7ca9-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:13:18 np0005593234 nova_compute[227762]: 2026-01-23 10:13:18.921 227766 DEBUG nova.objects.instance [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'pci_devices' on Instance uuid c38b8bfe-1b70-4daf-b676-250c1e933ed4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:13:18 np0005593234 nova_compute[227762]: 2026-01-23 10:13:18.951 227766 DEBUG nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  <uuid>c38b8bfe-1b70-4daf-b676-250c1e933ed4</uuid>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  <name>instance-00000087</name>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerRescueNegativeTestJSON-server-526971967</nova:name>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:13:17</nova:creationTime>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <nova:user uuid="fae914e59ec54f6b80928ef3cc68dbdb">tempest-ServerRescueNegativeTestJSON-87224704-project-member</nova:user>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <nova:project uuid="0a6ba16c4b9d49d3bc24cd7b44935d1f">tempest-ServerRescueNegativeTestJSON-87224704</nova:project>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <nova:port uuid="483c7ca9-a908-4082-bbad-1ea123d6a3f1">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <entry name="serial">c38b8bfe-1b70-4daf-b676-250c1e933ed4</entry>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <entry name="uuid">c38b8bfe-1b70-4daf-b676-250c1e933ed4</entry>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.rescue">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <target dev="vdb" bus="virtio"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.config.rescue">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:53:81:35"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <target dev="tap483c7ca9-a9"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/console.log" append="off"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:13:18 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:13:18 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:13:18 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:13:18 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:13:18 np0005593234 nova_compute[227762]: 2026-01-23 10:13:18.958 227766 INFO nova.virt.libvirt.driver [-] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Instance destroyed successfully.#033[00m
Jan 23 05:13:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:19.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.018 227766 DEBUG nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.018 227766 DEBUG nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.018 227766 DEBUG nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.019 227766 DEBUG nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] No VIF found with MAC fa:16:3e:53:81:35, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.019 227766 INFO nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Using config drive#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.046 227766 DEBUG nova.storage.rbd_utils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.052 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:19 np0005593234 podman[290952]: 2026-01-23 10:13:19.073470696 +0000 UTC m=+0.079525143 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.080 227766 DEBUG nova.objects.instance [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'ec2_ids' on Instance uuid c38b8bfe-1b70-4daf-b676-250c1e933ed4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.122 227766 DEBUG nova.objects.instance [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'keypairs' on Instance uuid c38b8bfe-1b70-4daf-b676-250c1e933ed4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.520 227766 INFO nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Creating config drive at /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/disk.config.rescue#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.525 227766 DEBUG oslo_concurrency.processutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwwm8h8a5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.656 227766 DEBUG oslo_concurrency.processutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwwm8h8a5" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.707 227766 DEBUG nova.storage.rbd_utils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] rbd image c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.710 227766 DEBUG oslo_concurrency.processutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/disk.config.rescue c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.734 227766 DEBUG nova.compute.manager [req-6e3b1d5b-e7e8-4d9c-8c25-ee98d0c33130 req-a31b75c3-6065-42b0-8c16-55361fefa704 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received event network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.734 227766 DEBUG oslo_concurrency.lockutils [req-6e3b1d5b-e7e8-4d9c-8c25-ee98d0c33130 req-a31b75c3-6065-42b0-8c16-55361fefa704 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.735 227766 DEBUG oslo_concurrency.lockutils [req-6e3b1d5b-e7e8-4d9c-8c25-ee98d0c33130 req-a31b75c3-6065-42b0-8c16-55361fefa704 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.735 227766 DEBUG oslo_concurrency.lockutils [req-6e3b1d5b-e7e8-4d9c-8c25-ee98d0c33130 req-a31b75c3-6065-42b0-8c16-55361fefa704 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.735 227766 DEBUG nova.compute.manager [req-6e3b1d5b-e7e8-4d9c-8c25-ee98d0c33130 req-a31b75c3-6065-42b0-8c16-55361fefa704 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] No waiting events found dispatching network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:13:19 np0005593234 nova_compute[227762]: 2026-01-23 10:13:19.735 227766 WARNING nova.compute.manager [req-6e3b1d5b-e7e8-4d9c-8c25-ee98d0c33130 req-a31b75c3-6065-42b0-8c16-55361fefa704 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received unexpected event network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:13:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:13:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:19.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:13:20 np0005593234 nova_compute[227762]: 2026-01-23 10:13:20.422 227766 DEBUG oslo_concurrency.processutils [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/disk.config.rescue c38b8bfe-1b70-4daf-b676-250c1e933ed4_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.711s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:20 np0005593234 nova_compute[227762]: 2026-01-23 10:13:20.422 227766 INFO nova.virt.libvirt.driver [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Deleting local config drive /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4/disk.config.rescue because it was imported into RBD.#033[00m
Jan 23 05:13:20 np0005593234 kernel: tap483c7ca9-a9: entered promiscuous mode
Jan 23 05:13:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:20Z|00557|binding|INFO|Claiming lport 483c7ca9-a908-4082-bbad-1ea123d6a3f1 for this chassis.
Jan 23 05:13:20 np0005593234 NetworkManager[48942]: <info>  [1769163200.4761] manager: (tap483c7ca9-a9): new Tun device (/org/freedesktop/NetworkManager/Devices/279)
Jan 23 05:13:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:20Z|00558|binding|INFO|483c7ca9-a908-4082-bbad-1ea123d6a3f1: Claiming fa:16:3e:53:81:35 10.100.0.12
Jan 23 05:13:20 np0005593234 nova_compute[227762]: 2026-01-23 10:13:20.475 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.487 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:81:35 10.100.0.12'], port_security=['fa:16:3e:53:81:35 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c38b8bfe-1b70-4daf-b676-250c1e933ed4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a6ba16c4b9d49d3bc24cd7b44935d1f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '6fc0d424-7779-4175-b5e0-e2613de6ecef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fb685af-2efd-4d70-8868-8a86ed4c3ca6, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=483c7ca9-a908-4082-bbad-1ea123d6a3f1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.489 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 483c7ca9-a908-4082-bbad-1ea123d6a3f1 in datapath 00bd3319-bfe5-4acd-b2e4-17830ee847f9 bound to our chassis#033[00m
Jan 23 05:13:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:20Z|00559|binding|INFO|Setting lport 483c7ca9-a908-4082-bbad-1ea123d6a3f1 ovn-installed in OVS
Jan 23 05:13:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:20Z|00560|binding|INFO|Setting lport 483c7ca9-a908-4082-bbad-1ea123d6a3f1 up in Southbound
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.491 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 00bd3319-bfe5-4acd-b2e4-17830ee847f9#033[00m
Jan 23 05:13:20 np0005593234 nova_compute[227762]: 2026-01-23 10:13:20.492 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:20 np0005593234 nova_compute[227762]: 2026-01-23 10:13:20.495 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:20 np0005593234 systemd-udevd[291050]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.503 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c9e85ad5-ec3d-47fd-bb97-68bfbe81627f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.504 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap00bd3319-b1 in ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.505 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap00bd3319-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.505 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[672d058d-d4f0-4239-b5f2-0beb53926752]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.506 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[022d61cd-fff0-448f-993e-01e229ddec6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 NetworkManager[48942]: <info>  [1769163200.5127] device (tap483c7ca9-a9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:13:20 np0005593234 systemd-machined[195626]: New machine qemu-67-instance-00000087.
Jan 23 05:13:20 np0005593234 NetworkManager[48942]: <info>  [1769163200.5140] device (tap483c7ca9-a9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.518 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[559d8533-1666-413e-b827-e6ccee65bb3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 systemd[1]: Started Virtual Machine qemu-67-instance-00000087.
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.541 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b7a70770-1abe-40d2-9238-d3f709e2c81c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.570 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[90c821d7-a825-4ee5-815d-6acfae5a597f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 NetworkManager[48942]: <info>  [1769163200.5780] manager: (tap00bd3319-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/280)
Jan 23 05:13:20 np0005593234 systemd-udevd[291055]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.577 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0983fea2-a75b-432e-a8dd-493c797600fa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.612 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[fb554e84-a7bf-4be2-8607-809915c51725]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.615 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[80a5acf1-25f6-4dcd-b291-2d9ab3b23f4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 NetworkManager[48942]: <info>  [1769163200.6371] device (tap00bd3319-b0): carrier: link connected
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.643 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[dc10c48a-5a7d-4437-8762-2804f62b6204]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.660 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3088d594-af12-4d70-a8ce-eaddea0277cb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00bd3319-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:83:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 709559, 'reachable_time': 16953, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 291084, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.677 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[acca4929-4543-4039-8d32-eb3974c75182]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6b:83f8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 709559, 'tstamp': 709559}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291085, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.695 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[90cb396d-8fcd-423c-865a-6821f1c1f600]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap00bd3319-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6b:83:f8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 180], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 709559, 'reachable_time': 16953, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 291086, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.721 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a33606df-2f1c-4834-be50-9a4b4b701b13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.765 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2dffbf93-df15-40f6-9a52-9d200a787a7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.766 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00bd3319-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.766 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.766 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap00bd3319-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:13:20 np0005593234 NetworkManager[48942]: <info>  [1769163200.8083] manager: (tap00bd3319-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/281)
Jan 23 05:13:20 np0005593234 kernel: tap00bd3319-b0: entered promiscuous mode
Jan 23 05:13:20 np0005593234 nova_compute[227762]: 2026-01-23 10:13:20.808 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:20 np0005593234 nova_compute[227762]: 2026-01-23 10:13:20.811 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.811 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap00bd3319-b0, col_values=(('external_ids', {'iface-id': '1788b5e6-601b-4e3d-a584-c0138c3308f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:13:20 np0005593234 nova_compute[227762]: 2026-01-23 10:13:20.812 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:20Z|00561|binding|INFO|Releasing lport 1788b5e6-601b-4e3d-a584-c0138c3308f6 from this chassis (sb_readonly=0)
Jan 23 05:13:20 np0005593234 nova_compute[227762]: 2026-01-23 10:13:20.826 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.827 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.828 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea46a42-268f-4d24-97d5-d331f5a3160a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.829 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-00bd3319-bfe5-4acd-b2e4-17830ee847f9
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/00bd3319-bfe5-4acd-b2e4-17830ee847f9.pid.haproxy
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 00bd3319-bfe5-4acd-b2e4-17830ee847f9
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:13:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:20.829 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'env', 'PROCESS_TAG=haproxy-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/00bd3319-bfe5-4acd-b2e4-17830ee847f9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:13:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:21.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:21 np0005593234 podman[291118]: 2026-01-23 10:13:21.190674256 +0000 UTC m=+0.045414873 container create ee3889c8cef92f17028e6943ced27b4ed5f1d570c7b0341caa3248d41be5b15d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 23 05:13:21 np0005593234 systemd[1]: Started libpod-conmon-ee3889c8cef92f17028e6943ced27b4ed5f1d570c7b0341caa3248d41be5b15d.scope.
Jan 23 05:13:21 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:13:21 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d7737df349e5a533369b961e4f4e817d56e212f83ad18b150fc76c07b26c2124/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:13:21 np0005593234 podman[291118]: 2026-01-23 10:13:21.169020273 +0000 UTC m=+0.023760910 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:13:21 np0005593234 podman[291118]: 2026-01-23 10:13:21.268519695 +0000 UTC m=+0.123260332 container init ee3889c8cef92f17028e6943ced27b4ed5f1d570c7b0341caa3248d41be5b15d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 05:13:21 np0005593234 podman[291118]: 2026-01-23 10:13:21.273376117 +0000 UTC m=+0.128116734 container start ee3889c8cef92f17028e6943ced27b4ed5f1d570c7b0341caa3248d41be5b15d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 05:13:21 np0005593234 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[291184]: [NOTICE]   (291196) : New worker (291198) forked
Jan 23 05:13:21 np0005593234 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[291184]: [NOTICE]   (291196) : Loading success.
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.364 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for c38b8bfe-1b70-4daf-b676-250c1e933ed4 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.364 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163201.3634636, c38b8bfe-1b70-4daf-b676-250c1e933ed4 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.364 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.369 227766 DEBUG nova.compute.manager [None req-53b7dc21-018f-42d6-8317-44d33fa65348 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.414 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.417 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.464 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163201.3644629, c38b8bfe-1b70-4daf-b676-250c1e933ed4 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.464 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] VM Started (Lifecycle Event)#033[00m
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.481 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.485 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:13:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:21.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.980 227766 DEBUG nova.compute.manager [req-604854ac-73b7-4e8c-9a59-fcb1094f9114 req-5254e40e-0b2c-427c-b1e6-f8e29b618f24 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received event network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.981 227766 DEBUG oslo_concurrency.lockutils [req-604854ac-73b7-4e8c-9a59-fcb1094f9114 req-5254e40e-0b2c-427c-b1e6-f8e29b618f24 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.982 227766 DEBUG oslo_concurrency.lockutils [req-604854ac-73b7-4e8c-9a59-fcb1094f9114 req-5254e40e-0b2c-427c-b1e6-f8e29b618f24 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.982 227766 DEBUG oslo_concurrency.lockutils [req-604854ac-73b7-4e8c-9a59-fcb1094f9114 req-5254e40e-0b2c-427c-b1e6-f8e29b618f24 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.982 227766 DEBUG nova.compute.manager [req-604854ac-73b7-4e8c-9a59-fcb1094f9114 req-5254e40e-0b2c-427c-b1e6-f8e29b618f24 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] No waiting events found dispatching network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:13:21 np0005593234 nova_compute[227762]: 2026-01-23 10:13:21.982 227766 WARNING nova.compute.manager [req-604854ac-73b7-4e8c-9a59-fcb1094f9114 req-5254e40e-0b2c-427c-b1e6-f8e29b618f24 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received unexpected event network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 for instance with vm_state rescued and task_state None.#033[00m
Jan 23 05:13:22 np0005593234 nova_compute[227762]: 2026-01-23 10:13:22.054 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:23.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:23.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:24 np0005593234 nova_compute[227762]: 2026-01-23 10:13:24.057 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:24 np0005593234 nova_compute[227762]: 2026-01-23 10:13:24.391 227766 DEBUG nova.compute.manager [req-fc48f8ef-184a-4834-9827-5e1c15ff0e05 req-e69496df-c482-4b3b-9f90-c5f919a10fcf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received event network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:13:24 np0005593234 nova_compute[227762]: 2026-01-23 10:13:24.392 227766 DEBUG oslo_concurrency.lockutils [req-fc48f8ef-184a-4834-9827-5e1c15ff0e05 req-e69496df-c482-4b3b-9f90-c5f919a10fcf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:24 np0005593234 nova_compute[227762]: 2026-01-23 10:13:24.392 227766 DEBUG oslo_concurrency.lockutils [req-fc48f8ef-184a-4834-9827-5e1c15ff0e05 req-e69496df-c482-4b3b-9f90-c5f919a10fcf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:24 np0005593234 nova_compute[227762]: 2026-01-23 10:13:24.392 227766 DEBUG oslo_concurrency.lockutils [req-fc48f8ef-184a-4834-9827-5e1c15ff0e05 req-e69496df-c482-4b3b-9f90-c5f919a10fcf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:24 np0005593234 nova_compute[227762]: 2026-01-23 10:13:24.393 227766 DEBUG nova.compute.manager [req-fc48f8ef-184a-4834-9827-5e1c15ff0e05 req-e69496df-c482-4b3b-9f90-c5f919a10fcf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] No waiting events found dispatching network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:13:24 np0005593234 nova_compute[227762]: 2026-01-23 10:13:24.393 227766 WARNING nova.compute.manager [req-fc48f8ef-184a-4834-9827-5e1c15ff0e05 req-e69496df-c482-4b3b-9f90-c5f919a10fcf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received unexpected event network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 for instance with vm_state rescued and task_state None.#033[00m
Jan 23 05:13:24 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:24Z|00562|binding|INFO|Releasing lport 1788b5e6-601b-4e3d-a584-c0138c3308f6 from this chassis (sb_readonly=0)
Jan 23 05:13:24 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:24Z|00563|binding|INFO|Releasing lport b57bd565-3bb1-4ecc-8df0-a7c439ac84a6 from this chassis (sb_readonly=0)
Jan 23 05:13:24 np0005593234 nova_compute[227762]: 2026-01-23 10:13:24.570 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:25.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:25.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:27.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:27 np0005593234 nova_compute[227762]: 2026-01-23 10:13:27.056 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:27.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:29.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:29 np0005593234 nova_compute[227762]: 2026-01-23 10:13:29.059 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:29.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:31.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:31.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:32 np0005593234 nova_compute[227762]: 2026-01-23 10:13:32.058 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:33.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:33.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:34 np0005593234 nova_compute[227762]: 2026-01-23 10:13:34.063 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:34Z|00564|binding|INFO|Releasing lport 1788b5e6-601b-4e3d-a584-c0138c3308f6 from this chassis (sb_readonly=0)
Jan 23 05:13:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:34Z|00565|binding|INFO|Releasing lport b57bd565-3bb1-4ecc-8df0-a7c439ac84a6 from this chassis (sb_readonly=0)
Jan 23 05:13:34 np0005593234 nova_compute[227762]: 2026-01-23 10:13:34.552 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:35.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e297 e297: 3 total, 3 up, 3 in
Jan 23 05:13:35 np0005593234 ovn_controller[134547]: 2026-01-23T10:13:35Z|00060|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:53:81:35 10.100.0.12
Jan 23 05:13:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:35.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e298 e298: 3 total, 3 up, 3 in
Jan 23 05:13:36 np0005593234 nova_compute[227762]: 2026-01-23 10:13:36.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:36 np0005593234 nova_compute[227762]: 2026-01-23 10:13:36.787 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:36 np0005593234 nova_compute[227762]: 2026-01-23 10:13:36.788 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:36 np0005593234 nova_compute[227762]: 2026-01-23 10:13:36.788 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:36 np0005593234 nova_compute[227762]: 2026-01-23 10:13:36.788 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:13:36 np0005593234 nova_compute[227762]: 2026-01-23 10:13:36.789 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 05:13:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:37.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 05:13:37 np0005593234 nova_compute[227762]: 2026-01-23 10:13:37.105 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e299 e299: 3 total, 3 up, 3 in
Jan 23 05:13:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:13:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3279848503' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:13:37 np0005593234 nova_compute[227762]: 2026-01-23 10:13:37.289 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:37 np0005593234 nova_compute[227762]: 2026-01-23 10:13:37.468 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000007f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:13:37 np0005593234 nova_compute[227762]: 2026-01-23 10:13:37.468 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000007f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:13:37 np0005593234 nova_compute[227762]: 2026-01-23 10:13:37.472 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000086 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:13:37 np0005593234 nova_compute[227762]: 2026-01-23 10:13:37.472 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000086 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:13:37 np0005593234 nova_compute[227762]: 2026-01-23 10:13:37.476 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000087 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:13:37 np0005593234 nova_compute[227762]: 2026-01-23 10:13:37.476 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000087 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:13:37 np0005593234 nova_compute[227762]: 2026-01-23 10:13:37.476 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000087 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:13:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:37 np0005593234 nova_compute[227762]: 2026-01-23 10:13:37.646 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:13:37 np0005593234 nova_compute[227762]: 2026-01-23 10:13:37.648 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3814MB free_disk=20.69784164428711GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:13:37 np0005593234 nova_compute[227762]: 2026-01-23 10:13:37.648 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:37 np0005593234 nova_compute[227762]: 2026-01-23 10:13:37.648 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:37.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:38 np0005593234 nova_compute[227762]: 2026-01-23 10:13:38.146 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:13:38 np0005593234 nova_compute[227762]: 2026-01-23 10:13:38.147 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 33559028-00d9-4918-9015-26172db3d00c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:13:38 np0005593234 nova_compute[227762]: 2026-01-23 10:13:38.147 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance c38b8bfe-1b70-4daf-b676-250c1e933ed4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:13:38 np0005593234 nova_compute[227762]: 2026-01-23 10:13:38.147 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:13:38 np0005593234 nova_compute[227762]: 2026-01-23 10:13:38.147 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:13:38 np0005593234 nova_compute[227762]: 2026-01-23 10:13:38.424 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:13:38 np0005593234 nova_compute[227762]: 2026-01-23 10:13:38.649 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:13:38 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/171840230' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:13:38 np0005593234 nova_compute[227762]: 2026-01-23 10:13:38.893 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:13:38 np0005593234 nova_compute[227762]: 2026-01-23 10:13:38.899 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:13:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:39.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:39 np0005593234 nova_compute[227762]: 2026-01-23 10:13:39.065 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:39 np0005593234 nova_compute[227762]: 2026-01-23 10:13:39.324 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:13:39 np0005593234 nova_compute[227762]: 2026-01-23 10:13:39.370 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:13:39 np0005593234 nova_compute[227762]: 2026-01-23 10:13:39.371 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.722s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:13:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:39.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:13:40 np0005593234 nova_compute[227762]: 2026-01-23 10:13:40.371 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:41.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:41.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e300 e300: 3 total, 3 up, 3 in
Jan 23 05:13:42 np0005593234 nova_compute[227762]: 2026-01-23 10:13:42.107 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:42 np0005593234 nova_compute[227762]: 2026-01-23 10:13:42.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:42 np0005593234 nova_compute[227762]: 2026-01-23 10:13:42.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:42 np0005593234 podman[291314]: 2026-01-23 10:13:42.764097427 +0000 UTC m=+0.059020676 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:13:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:42.850 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:13:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:42.851 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:13:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:13:42.852 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:13:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:43.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:43.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:44 np0005593234 nova_compute[227762]: 2026-01-23 10:13:44.066 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:13:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1361485578' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:13:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:13:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1361485578' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:13:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:45.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:13:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:45.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:13:46 np0005593234 nova_compute[227762]: 2026-01-23 10:13:46.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:46 np0005593234 nova_compute[227762]: 2026-01-23 10:13:46.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:46 np0005593234 nova_compute[227762]: 2026-01-23 10:13:46.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:13:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:47.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:47 np0005593234 nova_compute[227762]: 2026-01-23 10:13:47.108 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:47 np0005593234 nova_compute[227762]: 2026-01-23 10:13:47.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:47.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:49.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:49 np0005593234 nova_compute[227762]: 2026-01-23 10:13:49.069 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:49 np0005593234 podman[291360]: 2026-01-23 10:13:49.299309942 +0000 UTC m=+0.086720947 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:13:49 np0005593234 nova_compute[227762]: 2026-01-23 10:13:49.351 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:49.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:50 np0005593234 nova_compute[227762]: 2026-01-23 10:13:50.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:50 np0005593234 nova_compute[227762]: 2026-01-23 10:13:50.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:13:50 np0005593234 nova_compute[227762]: 2026-01-23 10:13:50.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:13:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:51.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:51 np0005593234 nova_compute[227762]: 2026-01-23 10:13:51.158 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:13:51 np0005593234 nova_compute[227762]: 2026-01-23 10:13:51.159 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:13:51 np0005593234 nova_compute[227762]: 2026-01-23 10:13:51.160 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:13:51 np0005593234 nova_compute[227762]: 2026-01-23 10:13:51.160 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:13:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:13:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:51.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:13:52 np0005593234 nova_compute[227762]: 2026-01-23 10:13:52.111 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:13:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:13:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:13:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:13:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:13:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:53.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:53 np0005593234 nova_compute[227762]: 2026-01-23 10:13:53.322 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Updating instance_info_cache with network_info: [{"id": "8be6de92-c581-49d7-a315-1d1b8c33153a", "address": "fa:16:3e:d6:07:56", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8be6de92-c5", "ovs_interfaceid": "8be6de92-c581-49d7-a315-1d1b8c33153a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:13:53 np0005593234 nova_compute[227762]: 2026-01-23 10:13:53.859 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:13:53 np0005593234 nova_compute[227762]: 2026-01-23 10:13:53.859 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:13:53 np0005593234 nova_compute[227762]: 2026-01-23 10:13:53.860 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:53.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:54 np0005593234 nova_compute[227762]: 2026-01-23 10:13:54.073 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:13:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:55.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:13:55 np0005593234 nova_compute[227762]: 2026-01-23 10:13:55.855 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:13:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:55.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:57.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:57 np0005593234 nova_compute[227762]: 2026-01-23 10:13:57.136 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:13:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:57.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:13:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:13:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:59.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:13:59 np0005593234 nova_compute[227762]: 2026-01-23 10:13:59.076 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:13:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:13:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:13:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:59.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:14:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:14:00 np0005593234 nova_compute[227762]: 2026-01-23 10:14:00.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:01.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:01 np0005593234 nova_compute[227762]: 2026-01-23 10:14:01.153 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:01.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:02 np0005593234 nova_compute[227762]: 2026-01-23 10:14:02.138 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:03.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:03 np0005593234 nova_compute[227762]: 2026-01-23 10:14:03.522 227766 DEBUG oslo_concurrency.lockutils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "33559028-00d9-4918-9015-26172db3d00c" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:03 np0005593234 nova_compute[227762]: 2026-01-23 10:14:03.523 227766 DEBUG oslo_concurrency.lockutils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "33559028-00d9-4918-9015-26172db3d00c" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:03 np0005593234 nova_compute[227762]: 2026-01-23 10:14:03.523 227766 INFO nova.compute.manager [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Shelving#033[00m
Jan 23 05:14:03 np0005593234 nova_compute[227762]: 2026-01-23 10:14:03.549 227766 DEBUG nova.virt.libvirt.driver [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:14:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:03.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:04 np0005593234 nova_compute[227762]: 2026-01-23 10:14:04.103 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:05.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:05.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:06 np0005593234 kernel: tap5004fad4-57 (unregistering): left promiscuous mode
Jan 23 05:14:06 np0005593234 NetworkManager[48942]: <info>  [1769163246.0502] device (tap5004fad4-57): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:14:06 np0005593234 ovn_controller[134547]: 2026-01-23T10:14:06Z|00566|binding|INFO|Releasing lport 5004fad4-5788-4709-9c83-b5fe075c0aa7 from this chassis (sb_readonly=0)
Jan 23 05:14:06 np0005593234 ovn_controller[134547]: 2026-01-23T10:14:06Z|00567|binding|INFO|Setting lport 5004fad4-5788-4709-9c83-b5fe075c0aa7 down in Southbound
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.060 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:06 np0005593234 ovn_controller[134547]: 2026-01-23T10:14:06Z|00568|binding|INFO|Removing iface tap5004fad4-57 ovn-installed in OVS
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.064 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:06.074 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:95:4c:75 10.100.0.7'], port_security=['fa:16:3e:95:4c:75 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '33559028-00d9-4918-9015-26172db3d00c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8d9599b4-8855-4310-af02-cdd058438f7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dd869ce76e44fc8a82b8bbee1654d33', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cf3e0bf9-33c6-483b-a880-c8297a0be71f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.199'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=875f4baa-cb85-49ca-8f02-78715d351fdb, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=5004fad4-5788-4709-9c83-b5fe075c0aa7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:14:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:06.076 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 5004fad4-5788-4709-9c83-b5fe075c0aa7 in datapath 8d9599b4-8855-4310-af02-cdd058438f7d unbound from our chassis#033[00m
Jan 23 05:14:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:06.077 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8d9599b4-8855-4310-af02-cdd058438f7d#033[00m
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.078 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:06.096 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4b2fe1ef-faf6-4f5f-a9bb-5aa65bab49be]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:06 np0005593234 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000086.scope: Deactivated successfully.
Jan 23 05:14:06 np0005593234 systemd[1]: machine-qemu\x2d65\x2dinstance\x2d00000086.scope: Consumed 16.592s CPU time.
Jan 23 05:14:06 np0005593234 systemd-machined[195626]: Machine qemu-65-instance-00000086 terminated.
Jan 23 05:14:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:06.130 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[60f559b9-630b-4397-bc34-430e6ec9f1ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:06.134 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[aa1377d1-d3ec-4178-8036-fe4a86c449e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:06.166 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[cabe75f3-ed88-4947-8b44-75bec3f66cad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:06.183 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aba9cab0-52a1-4b26-9c16-7c672b21bfdc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8d9599b4-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:a1:2b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 15, 'rx_bytes': 784, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 164], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687969, 'reachable_time': 29760, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 291614, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:06.196 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4569dbd2-513b-4209-9565-1350bec5679b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tap8d9599b4-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687982, 'tstamp': 687982}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291615, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap8d9599b4-81'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 687985, 'tstamp': 687985}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 291615, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:06.198 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d9599b4-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.256 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.261 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:06.261 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8d9599b4-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:06.262 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:14:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:06.262 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8d9599b4-80, col_values=(('external_ids', {'iface-id': 'b57bd565-3bb1-4ecc-8df0-a7c439ac84a6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:06.262 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.278 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.283 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.532 227766 DEBUG nova.compute.manager [req-f7f38388-b841-4c16-8eb1-adfb5026e981 req-33f8d9c0-9d3e-4449-a6a0-771aa25c0ea9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Received event network-vif-unplugged-5004fad4-5788-4709-9c83-b5fe075c0aa7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.534 227766 DEBUG oslo_concurrency.lockutils [req-f7f38388-b841-4c16-8eb1-adfb5026e981 req-33f8d9c0-9d3e-4449-a6a0-771aa25c0ea9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "33559028-00d9-4918-9015-26172db3d00c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.534 227766 DEBUG oslo_concurrency.lockutils [req-f7f38388-b841-4c16-8eb1-adfb5026e981 req-33f8d9c0-9d3e-4449-a6a0-771aa25c0ea9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "33559028-00d9-4918-9015-26172db3d00c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.535 227766 DEBUG oslo_concurrency.lockutils [req-f7f38388-b841-4c16-8eb1-adfb5026e981 req-33f8d9c0-9d3e-4449-a6a0-771aa25c0ea9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "33559028-00d9-4918-9015-26172db3d00c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.535 227766 DEBUG nova.compute.manager [req-f7f38388-b841-4c16-8eb1-adfb5026e981 req-33f8d9c0-9d3e-4449-a6a0-771aa25c0ea9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] No waiting events found dispatching network-vif-unplugged-5004fad4-5788-4709-9c83-b5fe075c0aa7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.535 227766 WARNING nova.compute.manager [req-f7f38388-b841-4c16-8eb1-adfb5026e981 req-33f8d9c0-9d3e-4449-a6a0-771aa25c0ea9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Received unexpected event network-vif-unplugged-5004fad4-5788-4709-9c83-b5fe075c0aa7 for instance with vm_state active and task_state shelving.#033[00m
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.569 227766 INFO nova.virt.libvirt.driver [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.576 227766 INFO nova.virt.libvirt.driver [-] [instance: 33559028-00d9-4918-9015-26172db3d00c] Instance destroyed successfully.#033[00m
Jan 23 05:14:06 np0005593234 nova_compute[227762]: 2026-01-23 10:14:06.578 227766 DEBUG nova.objects.instance [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'numa_topology' on Instance uuid 33559028-00d9-4918-9015-26172db3d00c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:14:07 np0005593234 nova_compute[227762]: 2026-01-23 10:14:07.017 227766 INFO nova.virt.libvirt.driver [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Beginning cold snapshot process#033[00m
Jan 23 05:14:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:07.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:07 np0005593234 nova_compute[227762]: 2026-01-23 10:14:07.141 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:07 np0005593234 nova_compute[227762]: 2026-01-23 10:14:07.227 227766 DEBUG nova.virt.libvirt.imagebackend [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 05:14:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:14:07Z|00569|binding|INFO|Releasing lport 1788b5e6-601b-4e3d-a584-c0138c3308f6 from this chassis (sb_readonly=0)
Jan 23 05:14:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:14:07Z|00570|binding|INFO|Releasing lport b57bd565-3bb1-4ecc-8df0-a7c439ac84a6 from this chassis (sb_readonly=0)
Jan 23 05:14:07 np0005593234 nova_compute[227762]: 2026-01-23 10:14:07.440 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e300 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:07 np0005593234 nova_compute[227762]: 2026-01-23 10:14:07.584 227766 DEBUG nova.storage.rbd_utils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] creating snapshot(4d6b52c923bb41b49a0d3a14619f038b) on rbd image(33559028-00d9-4918-9015-26172db3d00c_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:14:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:07.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e301 e301: 3 total, 3 up, 3 in
Jan 23 05:14:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:08.605 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=50, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=49) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:14:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:08.607 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:14:08 np0005593234 nova_compute[227762]: 2026-01-23 10:14:08.609 227766 DEBUG nova.storage.rbd_utils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] cloning vms/33559028-00d9-4918-9015-26172db3d00c_disk@4d6b52c923bb41b49a0d3a14619f038b to images/91667598-4041-4c0e-ba8d-b3a19e535259 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:14:08 np0005593234 nova_compute[227762]: 2026-01-23 10:14:08.640 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:08 np0005593234 nova_compute[227762]: 2026-01-23 10:14:08.679 227766 DEBUG nova.compute.manager [req-93c1145d-ab31-4f55-aee1-dfc302d929b4 req-fce720de-96ba-46ef-909a-72714f8d0a6d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Received event network-vif-plugged-5004fad4-5788-4709-9c83-b5fe075c0aa7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:14:08 np0005593234 nova_compute[227762]: 2026-01-23 10:14:08.680 227766 DEBUG oslo_concurrency.lockutils [req-93c1145d-ab31-4f55-aee1-dfc302d929b4 req-fce720de-96ba-46ef-909a-72714f8d0a6d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "33559028-00d9-4918-9015-26172db3d00c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:08 np0005593234 nova_compute[227762]: 2026-01-23 10:14:08.680 227766 DEBUG oslo_concurrency.lockutils [req-93c1145d-ab31-4f55-aee1-dfc302d929b4 req-fce720de-96ba-46ef-909a-72714f8d0a6d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "33559028-00d9-4918-9015-26172db3d00c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:08 np0005593234 nova_compute[227762]: 2026-01-23 10:14:08.680 227766 DEBUG oslo_concurrency.lockutils [req-93c1145d-ab31-4f55-aee1-dfc302d929b4 req-fce720de-96ba-46ef-909a-72714f8d0a6d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "33559028-00d9-4918-9015-26172db3d00c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:08 np0005593234 nova_compute[227762]: 2026-01-23 10:14:08.680 227766 DEBUG nova.compute.manager [req-93c1145d-ab31-4f55-aee1-dfc302d929b4 req-fce720de-96ba-46ef-909a-72714f8d0a6d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] No waiting events found dispatching network-vif-plugged-5004fad4-5788-4709-9c83-b5fe075c0aa7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:14:08 np0005593234 nova_compute[227762]: 2026-01-23 10:14:08.681 227766 WARNING nova.compute.manager [req-93c1145d-ab31-4f55-aee1-dfc302d929b4 req-fce720de-96ba-46ef-909a-72714f8d0a6d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Received unexpected event network-vif-plugged-5004fad4-5788-4709-9c83-b5fe075c0aa7 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 23 05:14:08 np0005593234 nova_compute[227762]: 2026-01-23 10:14:08.734 227766 DEBUG nova.storage.rbd_utils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] flattening images/91667598-4041-4c0e-ba8d-b3a19e535259 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 05:14:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:09.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:09 np0005593234 nova_compute[227762]: 2026-01-23 10:14:09.111 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:09 np0005593234 nova_compute[227762]: 2026-01-23 10:14:09.125 227766 DEBUG nova.storage.rbd_utils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] removing snapshot(4d6b52c923bb41b49a0d3a14619f038b) on rbd image(33559028-00d9-4918-9015-26172db3d00c_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 05:14:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e302 e302: 3 total, 3 up, 3 in
Jan 23 05:14:09 np0005593234 nova_compute[227762]: 2026-01-23 10:14:09.799 227766 DEBUG nova.storage.rbd_utils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] creating snapshot(snap) on rbd image(91667598-4041-4c0e-ba8d-b3a19e535259) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:14:09 np0005593234 nova_compute[227762]: 2026-01-23 10:14:09.928 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:09 np0005593234 nova_compute[227762]: 2026-01-23 10:14:09.928 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:14:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:14:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:09.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:14:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e303 e303: 3 total, 3 up, 3 in
Jan 23 05:14:10 np0005593234 nova_compute[227762]: 2026-01-23 10:14:10.813 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:14:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:11.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:14:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:14:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:11.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:14:12 np0005593234 nova_compute[227762]: 2026-01-23 10:14:12.144 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:13.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:13 np0005593234 nova_compute[227762]: 2026-01-23 10:14:13.370 227766 INFO nova.virt.libvirt.driver [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Snapshot image upload complete#033[00m
Jan 23 05:14:13 np0005593234 nova_compute[227762]: 2026-01-23 10:14:13.370 227766 DEBUG nova.compute.manager [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:14:13 np0005593234 nova_compute[227762]: 2026-01-23 10:14:13.451 227766 INFO nova.compute.manager [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Shelve offloading#033[00m
Jan 23 05:14:13 np0005593234 nova_compute[227762]: 2026-01-23 10:14:13.459 227766 INFO nova.virt.libvirt.driver [-] [instance: 33559028-00d9-4918-9015-26172db3d00c] Instance destroyed successfully.#033[00m
Jan 23 05:14:13 np0005593234 nova_compute[227762]: 2026-01-23 10:14:13.460 227766 DEBUG nova.compute.manager [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:14:13 np0005593234 nova_compute[227762]: 2026-01-23 10:14:13.462 227766 DEBUG oslo_concurrency.lockutils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:14:13 np0005593234 nova_compute[227762]: 2026-01-23 10:14:13.462 227766 DEBUG oslo_concurrency.lockutils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquired lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:14:13 np0005593234 nova_compute[227762]: 2026-01-23 10:14:13.463 227766 DEBUG nova.network.neutron [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:14:13 np0005593234 podman[291820]: 2026-01-23 10:14:13.76235385 +0000 UTC m=+0.054416493 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:14:13 np0005593234 nova_compute[227762]: 2026-01-23 10:14:13.762 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:13 np0005593234 nova_compute[227762]: 2026-01-23 10:14:13.762 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:14:13 np0005593234 nova_compute[227762]: 2026-01-23 10:14:13.783 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:14:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:14:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:13.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:14:14 np0005593234 nova_compute[227762]: 2026-01-23 10:14:14.114 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:15.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:15 np0005593234 nova_compute[227762]: 2026-01-23 10:14:15.083 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:15 np0005593234 nova_compute[227762]: 2026-01-23 10:14:15.327 227766 DEBUG nova.network.neutron [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Updating instance_info_cache with network_info: [{"id": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "address": "fa:16:3e:95:4c:75", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5004fad4-57", "ovs_interfaceid": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:14:15 np0005593234 nova_compute[227762]: 2026-01-23 10:14:15.362 227766 DEBUG oslo_concurrency.lockutils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Releasing lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:14:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:15.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:16 np0005593234 ovn_controller[134547]: 2026-01-23T10:14:16Z|00571|binding|INFO|Releasing lport 1788b5e6-601b-4e3d-a584-c0138c3308f6 from this chassis (sb_readonly=0)
Jan 23 05:14:16 np0005593234 ovn_controller[134547]: 2026-01-23T10:14:16Z|00572|binding|INFO|Releasing lport b57bd565-3bb1-4ecc-8df0-a7c439ac84a6 from this chassis (sb_readonly=0)
Jan 23 05:14:16 np0005593234 nova_compute[227762]: 2026-01-23 10:14:16.589 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:16 np0005593234 nova_compute[227762]: 2026-01-23 10:14:16.903 227766 INFO nova.virt.libvirt.driver [-] [instance: 33559028-00d9-4918-9015-26172db3d00c] Instance destroyed successfully.#033[00m
Jan 23 05:14:16 np0005593234 nova_compute[227762]: 2026-01-23 10:14:16.903 227766 DEBUG nova.objects.instance [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'resources' on Instance uuid 33559028-00d9-4918-9015-26172db3d00c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:14:16 np0005593234 nova_compute[227762]: 2026-01-23 10:14:16.930 227766 DEBUG nova.virt.libvirt.vif [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:12:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1716094682',display_name='tempest-ServerActionsTestOtherB-server-1716094682',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1716094682',id=134,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPpuWItOSZUstL5LlOZAhtyKqrmFs0bJ/+DBMLk1rKDBu2SnttdOypH9Db6AMV4nGhLXOyr97hIMUaALurv7OcM9NkoB1CxFMDb3d0IWPDnRphumt71Jz0jUP0kiZtXBTQ==',key_name='tempest-keypair-1844396132',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:12:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-frjx5azl',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member',shelved_at='2026-01-23T10:14:13.370474',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='91667598-4041-4c0e-ba8d-b3a19e535259'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:14:07Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='aca3cab576d641d3b89e7dddf155d467',uuid=33559028-00d9-4918-9015-26172db3d00c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "address": "fa:16:3e:95:4c:75", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5004fad4-57", "ovs_interfaceid": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:14:16 np0005593234 nova_compute[227762]: 2026-01-23 10:14:16.931 227766 DEBUG nova.network.os_vif_util [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "address": "fa:16:3e:95:4c:75", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5004fad4-57", "ovs_interfaceid": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:14:16 np0005593234 nova_compute[227762]: 2026-01-23 10:14:16.932 227766 DEBUG nova.network.os_vif_util [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:95:4c:75,bridge_name='br-int',has_traffic_filtering=True,id=5004fad4-5788-4709-9c83-b5fe075c0aa7,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5004fad4-57') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:14:16 np0005593234 nova_compute[227762]: 2026-01-23 10:14:16.933 227766 DEBUG os_vif [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:4c:75,bridge_name='br-int',has_traffic_filtering=True,id=5004fad4-5788-4709-9c83-b5fe075c0aa7,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5004fad4-57') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:14:16 np0005593234 nova_compute[227762]: 2026-01-23 10:14:16.936 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:16 np0005593234 nova_compute[227762]: 2026-01-23 10:14:16.936 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5004fad4-57, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:16 np0005593234 nova_compute[227762]: 2026-01-23 10:14:16.938 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:16 np0005593234 nova_compute[227762]: 2026-01-23 10:14:16.939 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:16 np0005593234 nova_compute[227762]: 2026-01-23 10:14:16.942 227766 INFO os_vif [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:95:4c:75,bridge_name='br-int',has_traffic_filtering=True,id=5004fad4-5788-4709-9c83-b5fe075c0aa7,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5004fad4-57')#033[00m
Jan 23 05:14:17 np0005593234 nova_compute[227762]: 2026-01-23 10:14:17.037 227766 DEBUG nova.compute.manager [req-784fad11-3269-405a-890b-1fb336251774 req-94f0e000-ae1e-4e2f-8c9b-c5e3ce6e8840 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Received event network-changed-5004fad4-5788-4709-9c83-b5fe075c0aa7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:14:17 np0005593234 nova_compute[227762]: 2026-01-23 10:14:17.037 227766 DEBUG nova.compute.manager [req-784fad11-3269-405a-890b-1fb336251774 req-94f0e000-ae1e-4e2f-8c9b-c5e3ce6e8840 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Refreshing instance network info cache due to event network-changed-5004fad4-5788-4709-9c83-b5fe075c0aa7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:14:17 np0005593234 nova_compute[227762]: 2026-01-23 10:14:17.037 227766 DEBUG oslo_concurrency.lockutils [req-784fad11-3269-405a-890b-1fb336251774 req-94f0e000-ae1e-4e2f-8c9b-c5e3ce6e8840 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:14:17 np0005593234 nova_compute[227762]: 2026-01-23 10:14:17.037 227766 DEBUG oslo_concurrency.lockutils [req-784fad11-3269-405a-890b-1fb336251774 req-94f0e000-ae1e-4e2f-8c9b-c5e3ce6e8840 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:14:17 np0005593234 nova_compute[227762]: 2026-01-23 10:14:17.038 227766 DEBUG nova.network.neutron [req-784fad11-3269-405a-890b-1fb336251774 req-94f0e000-ae1e-4e2f-8c9b-c5e3ce6e8840 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Refreshing network info cache for port 5004fad4-5788-4709-9c83-b5fe075c0aa7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:14:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:17.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:17 np0005593234 nova_compute[227762]: 2026-01-23 10:14:17.146 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e303 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:17 np0005593234 nova_compute[227762]: 2026-01-23 10:14:17.894 227766 INFO nova.virt.libvirt.driver [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Deleting instance files /var/lib/nova/instances/33559028-00d9-4918-9015-26172db3d00c_del#033[00m
Jan 23 05:14:17 np0005593234 nova_compute[227762]: 2026-01-23 10:14:17.895 227766 INFO nova.virt.libvirt.driver [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Deletion of /var/lib/nova/instances/33559028-00d9-4918-9015-26172db3d00c_del complete#033[00m
Jan 23 05:14:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:17.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:17 np0005593234 nova_compute[227762]: 2026-01-23 10:14:17.980 227766 INFO nova.scheduler.client.report [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Deleted allocations for instance 33559028-00d9-4918-9015-26172db3d00c#033[00m
Jan 23 05:14:18 np0005593234 nova_compute[227762]: 2026-01-23 10:14:18.033 227766 DEBUG oslo_concurrency.lockutils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:18 np0005593234 nova_compute[227762]: 2026-01-23 10:14:18.033 227766 DEBUG oslo_concurrency.lockutils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:18 np0005593234 nova_compute[227762]: 2026-01-23 10:14:18.241 227766 DEBUG oslo_concurrency.processutils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:18.609 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '50'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:14:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1584159286' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:14:18 np0005593234 nova_compute[227762]: 2026-01-23 10:14:18.716 227766 DEBUG oslo_concurrency.processutils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:18 np0005593234 nova_compute[227762]: 2026-01-23 10:14:18.722 227766 DEBUG nova.compute.provider_tree [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:14:18 np0005593234 nova_compute[227762]: 2026-01-23 10:14:18.745 227766 DEBUG nova.scheduler.client.report [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:14:18 np0005593234 nova_compute[227762]: 2026-01-23 10:14:18.768 227766 DEBUG oslo_concurrency.lockutils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.735s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e304 e304: 3 total, 3 up, 3 in
Jan 23 05:14:18 np0005593234 nova_compute[227762]: 2026-01-23 10:14:18.829 227766 DEBUG oslo_concurrency.lockutils [None req-25fcfe25-dbf5-423e-97d4-99e565edb1e4 aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "33559028-00d9-4918-9015-26172db3d00c" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 15.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:19.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:19 np0005593234 podman[291885]: 2026-01-23 10:14:19.78719838 +0000 UTC m=+0.083524107 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 05:14:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:19.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:21.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:21 np0005593234 nova_compute[227762]: 2026-01-23 10:14:21.244 227766 DEBUG nova.network.neutron [req-784fad11-3269-405a-890b-1fb336251774 req-94f0e000-ae1e-4e2f-8c9b-c5e3ce6e8840 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Updated VIF entry in instance network info cache for port 5004fad4-5788-4709-9c83-b5fe075c0aa7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:14:21 np0005593234 nova_compute[227762]: 2026-01-23 10:14:21.244 227766 DEBUG nova.network.neutron [req-784fad11-3269-405a-890b-1fb336251774 req-94f0e000-ae1e-4e2f-8c9b-c5e3ce6e8840 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 33559028-00d9-4918-9015-26172db3d00c] Updating instance_info_cache with network_info: [{"id": "5004fad4-5788-4709-9c83-b5fe075c0aa7", "address": "fa:16:3e:95:4c:75", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": null, "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.199", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap5004fad4-57", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:14:21 np0005593234 nova_compute[227762]: 2026-01-23 10:14:21.271 227766 DEBUG oslo_concurrency.lockutils [req-784fad11-3269-405a-890b-1fb336251774 req-94f0e000-ae1e-4e2f-8c9b-c5e3ce6e8840 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-33559028-00d9-4918-9015-26172db3d00c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:14:21 np0005593234 nova_compute[227762]: 2026-01-23 10:14:21.290 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163246.2885418, 33559028-00d9-4918-9015-26172db3d00c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:14:21 np0005593234 nova_compute[227762]: 2026-01-23 10:14:21.290 227766 INFO nova.compute.manager [-] [instance: 33559028-00d9-4918-9015-26172db3d00c] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:14:21 np0005593234 nova_compute[227762]: 2026-01-23 10:14:21.343 227766 DEBUG nova.compute.manager [None req-3d5aa96a-d22e-4f46-976c-dc5e38b89fbd - - - - - -] [instance: 33559028-00d9-4918-9015-26172db3d00c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:14:21 np0005593234 nova_compute[227762]: 2026-01-23 10:14:21.939 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:21.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e305 e305: 3 total, 3 up, 3 in
Jan 23 05:14:22 np0005593234 nova_compute[227762]: 2026-01-23 10:14:22.148 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:14:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:23.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:14:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:23.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:25.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:25.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:26 np0005593234 nova_compute[227762]: 2026-01-23 10:14:26.941 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:14:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:27.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:14:27 np0005593234 nova_compute[227762]: 2026-01-23 10:14:27.150 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:27.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:29.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:29 np0005593234 nova_compute[227762]: 2026-01-23 10:14:29.100 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:29.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:31.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:31 np0005593234 nova_compute[227762]: 2026-01-23 10:14:31.944 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:14:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:31.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:14:32 np0005593234 nova_compute[227762]: 2026-01-23 10:14:32.152 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:33.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:33.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:35.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:35.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:36 np0005593234 nova_compute[227762]: 2026-01-23 10:14:36.946 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:37.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:37 np0005593234 nova_compute[227762]: 2026-01-23 10:14:37.154 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:37 np0005593234 nova_compute[227762]: 2026-01-23 10:14:37.766 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:37 np0005593234 nova_compute[227762]: 2026-01-23 10:14:37.766 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:37 np0005593234 nova_compute[227762]: 2026-01-23 10:14:37.793 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:37 np0005593234 nova_compute[227762]: 2026-01-23 10:14:37.793 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:37 np0005593234 nova_compute[227762]: 2026-01-23 10:14:37.793 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:37 np0005593234 nova_compute[227762]: 2026-01-23 10:14:37.794 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:14:37 np0005593234 nova_compute[227762]: 2026-01-23 10:14:37.794 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:37.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:14:38 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3420402570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.238 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.350 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000007f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.351 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000007f as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.354 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000087 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.354 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000087 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.354 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000087 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.544 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.545 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3986MB free_disk=20.809932708740234GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.545 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.545 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.622 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.622 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance c38b8bfe-1b70-4daf-b676-250c1e933ed4 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.623 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.623 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:14:38 np0005593234 nova_compute[227762]: 2026-01-23 10:14:38.687 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:39.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:14:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1937065965' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:14:39 np0005593234 nova_compute[227762]: 2026-01-23 10:14:39.119 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:39 np0005593234 nova_compute[227762]: 2026-01-23 10:14:39.125 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:14:39 np0005593234 nova_compute[227762]: 2026-01-23 10:14:39.691 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:14:39 np0005593234 nova_compute[227762]: 2026-01-23 10:14:39.702 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:14:39 np0005593234 nova_compute[227762]: 2026-01-23 10:14:39.703 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:39.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:41.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:41 np0005593234 nova_compute[227762]: 2026-01-23 10:14:41.948 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:14:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:41.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:14:42 np0005593234 nova_compute[227762]: 2026-01-23 10:14:42.305 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:42.850 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:42.851 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:42.852 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:43.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:43.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:44 np0005593234 nova_compute[227762]: 2026-01-23 10:14:44.015 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "633b85ea-a47c-4be0-b06d-388aa421728b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:44 np0005593234 nova_compute[227762]: 2026-01-23 10:14:44.015 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:44 np0005593234 nova_compute[227762]: 2026-01-23 10:14:44.062 227766 DEBUG nova.compute.manager [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:14:44 np0005593234 nova_compute[227762]: 2026-01-23 10:14:44.157 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:44 np0005593234 nova_compute[227762]: 2026-01-23 10:14:44.158 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:44 np0005593234 nova_compute[227762]: 2026-01-23 10:14:44.165 227766 DEBUG nova.virt.hardware [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:14:44 np0005593234 nova_compute[227762]: 2026-01-23 10:14:44.165 227766 INFO nova.compute.claims [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:14:44 np0005593234 nova_compute[227762]: 2026-01-23 10:14:44.382 227766 DEBUG oslo_concurrency.processutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:44 np0005593234 nova_compute[227762]: 2026-01-23 10:14:44.682 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:44 np0005593234 nova_compute[227762]: 2026-01-23 10:14:44.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:45.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:14:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3864811406' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.403 227766 DEBUG oslo_concurrency.processutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.021s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.410 227766 DEBUG nova.compute.provider_tree [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.437 227766 DEBUG nova.scheduler.client.report [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.472 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.314s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.473 227766 DEBUG nova.compute.manager [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:14:45 np0005593234 podman[292039]: 2026-01-23 10:14:45.480540311 +0000 UTC m=+0.080252215 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.541 227766 DEBUG nova.compute.manager [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.542 227766 DEBUG nova.network.neutron [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.564 227766 INFO nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.590 227766 DEBUG nova.compute.manager [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.660 227766 INFO nova.virt.block_device [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Booting with volume bdfc6219-716c-481a-916d-78de375d66c3 at /dev/vda#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.776 227766 DEBUG nova.policy [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '95ac13194f0940128d42af3d45d130fa', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3ae621f21a8e438fb95152309b38cee5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.849 227766 DEBUG os_brick.utils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.851 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.864 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.864 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[5a17896b-1bb7-4638-a182-76fe4060783a]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.865 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.873 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.873 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[ed37ff64-3301-4119-8a53-9155a1d94ae3]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.875 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.884 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.884 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[0f36680e-ceac-41de-9d31-4c83719d6de0]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.886 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[78db25bb-a6d4-4aa0-a97e-eeedd44a0046]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.886 227766 DEBUG oslo_concurrency.processutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.913 227766 DEBUG oslo_concurrency.processutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "nvme version" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.916 227766 DEBUG os_brick.initiator.connectors.lightos [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.916 227766 DEBUG os_brick.initiator.connectors.lightos [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.916 227766 DEBUG os_brick.initiator.connectors.lightos [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.917 227766 DEBUG os_brick.utils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] <== get_connector_properties: return (67ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:14:45 np0005593234 nova_compute[227762]: 2026-01-23 10:14:45.917 227766 DEBUG nova.virt.block_device [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updating existing volume attachment record: 7b54bc93-9572-4745-82ec-cfb9dba6b408 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:14:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:45.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:46 np0005593234 nova_compute[227762]: 2026-01-23 10:14:46.984 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:47 np0005593234 nova_compute[227762]: 2026-01-23 10:14:47.309 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:14:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:47.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:14:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e305 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:14:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2009978594' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:14:47 np0005593234 nova_compute[227762]: 2026-01-23 10:14:47.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:47 np0005593234 nova_compute[227762]: 2026-01-23 10:14:47.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:14:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:47.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:48 np0005593234 nova_compute[227762]: 2026-01-23 10:14:48.084 227766 DEBUG nova.network.neutron [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Successfully created port: 2957b316-2d74-4b52-bfc9-52a2c5b56c01 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:14:48 np0005593234 nova_compute[227762]: 2026-01-23 10:14:48.351 227766 DEBUG nova.compute.manager [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:14:48 np0005593234 nova_compute[227762]: 2026-01-23 10:14:48.352 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:14:48 np0005593234 nova_compute[227762]: 2026-01-23 10:14:48.353 227766 INFO nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Creating image(s)#033[00m
Jan 23 05:14:48 np0005593234 nova_compute[227762]: 2026-01-23 10:14:48.353 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 05:14:48 np0005593234 nova_compute[227762]: 2026-01-23 10:14:48.354 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Ensure instance console log exists: /var/lib/nova/instances/633b85ea-a47c-4be0-b06d-388aa421728b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:14:48 np0005593234 nova_compute[227762]: 2026-01-23 10:14:48.354 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:48 np0005593234 nova_compute[227762]: 2026-01-23 10:14:48.354 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:48 np0005593234 nova_compute[227762]: 2026-01-23 10:14:48.355 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:48 np0005593234 nova_compute[227762]: 2026-01-23 10:14:48.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:49.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:49 np0005593234 nova_compute[227762]: 2026-01-23 10:14:49.412 227766 DEBUG nova.network.neutron [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Successfully updated port: 2957b316-2d74-4b52-bfc9-52a2c5b56c01 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:14:49 np0005593234 nova_compute[227762]: 2026-01-23 10:14:49.445 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:14:49 np0005593234 nova_compute[227762]: 2026-01-23 10:14:49.445 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquired lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:14:49 np0005593234 nova_compute[227762]: 2026-01-23 10:14:49.445 227766 DEBUG nova.network.neutron [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:14:49 np0005593234 nova_compute[227762]: 2026-01-23 10:14:49.669 227766 DEBUG nova.network.neutron [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:14:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:50.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:14:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/22953883' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:14:50 np0005593234 nova_compute[227762]: 2026-01-23 10:14:50.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:50 np0005593234 nova_compute[227762]: 2026-01-23 10:14:50.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:14:50 np0005593234 podman[292123]: 2026-01-23 10:14:50.788159128 +0000 UTC m=+0.084694203 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 05:14:50 np0005593234 nova_compute[227762]: 2026-01-23 10:14:50.792 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 33559028-00d9-4918-9015-26172db3d00c] Skipping network cache update for instance because it has been migrated to another host. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9902#033[00m
Jan 23 05:14:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e306 e306: 3 total, 3 up, 3 in
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.318 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.319 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.319 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:14:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:14:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:51.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.591 227766 DEBUG nova.network.neutron [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updating instance_info_cache with network_info: [{"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.617 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Releasing lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.617 227766 DEBUG nova.compute.manager [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Instance network_info: |[{"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.620 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Start _get_guest_xml network_info=[{"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'boot_index': 0, 'mount_device': '/dev/vda', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-bdfc6219-716c-481a-916d-78de375d66c3', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'bdfc6219-716c-481a-916d-78de375d66c3', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '633b85ea-a47c-4be0-b06d-388aa421728b', 'attached_at': '', 'detached_at': '', 'volume_id': 'bdfc6219-716c-481a-916d-78de375d66c3', 'serial': 'bdfc6219-716c-481a-916d-78de375d66c3'}, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '7b54bc93-9572-4745-82ec-cfb9dba6b408', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.624 227766 WARNING nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.647 227766 DEBUG nova.virt.libvirt.host [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.648 227766 DEBUG nova.virt.libvirt.host [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.652 227766 DEBUG nova.virt.libvirt.host [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.653 227766 DEBUG nova.virt.libvirt.host [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.654 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.654 227766 DEBUG nova.virt.hardware [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.655 227766 DEBUG nova.virt.hardware [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.655 227766 DEBUG nova.virt.hardware [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.655 227766 DEBUG nova.virt.hardware [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.655 227766 DEBUG nova.virt.hardware [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.656 227766 DEBUG nova.virt.hardware [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.656 227766 DEBUG nova.virt.hardware [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.656 227766 DEBUG nova.virt.hardware [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.656 227766 DEBUG nova.virt.hardware [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.656 227766 DEBUG nova.virt.hardware [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.657 227766 DEBUG nova.virt.hardware [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.684 227766 DEBUG nova.storage.rbd_utils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] rbd image 633b85ea-a47c-4be0-b06d-388aa421728b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.689 227766 DEBUG oslo_concurrency.processutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.870 227766 DEBUG nova.compute.manager [req-14e01a94-6010-4a49-b86a-9bcd040eab47 req-57350d7a-63c5-4ca2-8dee-16b914d91821 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Received event network-changed-2957b316-2d74-4b52-bfc9-52a2c5b56c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.870 227766 DEBUG nova.compute.manager [req-14e01a94-6010-4a49-b86a-9bcd040eab47 req-57350d7a-63c5-4ca2-8dee-16b914d91821 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Refreshing instance network info cache due to event network-changed-2957b316-2d74-4b52-bfc9-52a2c5b56c01. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.870 227766 DEBUG oslo_concurrency.lockutils [req-14e01a94-6010-4a49-b86a-9bcd040eab47 req-57350d7a-63c5-4ca2-8dee-16b914d91821 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.870 227766 DEBUG oslo_concurrency.lockutils [req-14e01a94-6010-4a49-b86a-9bcd040eab47 req-57350d7a-63c5-4ca2-8dee-16b914d91821 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.871 227766 DEBUG nova.network.neutron [req-14e01a94-6010-4a49-b86a-9bcd040eab47 req-57350d7a-63c5-4ca2-8dee-16b914d91821 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Refreshing network info cache for port 2957b316-2d74-4b52-bfc9-52a2c5b56c01 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:14:51 np0005593234 nova_compute[227762]: 2026-01-23 10:14:51.987 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:52.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:14:52 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2418747019' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.153 227766 DEBUG oslo_concurrency.processutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.211 227766 DEBUG nova.virt.libvirt.vif [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:14:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-205659850',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-205659850',id=140,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPuMczToXGmZUNyxG5fVGeV6xaoJVOpQ6Lh9dx5t6v22bv4xalVGQLUjYNEpg7ajkuOU/WHiNfvMhffjZHY/YojnQQYOX+q0GTa9+NPbkGDFf1XELa+vTNvIe6ZV8CwP9g==',key_name='tempest-TestInstancesWithCinderVolumes-232096272',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ae621f21a8e438fb95152309b38cee5',ramdisk_id='',reservation_id='r-rpa9cnu8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestInstancesWithCinderVolumes-565485208',owner_user_name='tempest-TestInstancesWithCinderVolumes-565485208-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:14:45Z,user_data=None,user_id='95ac13194f0940128d42af3d45d130fa',uuid=633b85ea-a47c-4be0-b06d-388aa421728b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.211 227766 DEBUG nova.network.os_vif_util [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Converting VIF {"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.212 227766 DEBUG nova.network.os_vif_util [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:c7:7e,bridge_name='br-int',has_traffic_filtering=True,id=2957b316-2d74-4b52-bfc9-52a2c5b56c01,network=Network(f98d79de-4a23-4f29-9848-c5d4c5683a5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2957b316-2d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.213 227766 DEBUG nova.objects.instance [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 633b85ea-a47c-4be0-b06d-388aa421728b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.286 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  <uuid>633b85ea-a47c-4be0-b06d-388aa421728b</uuid>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  <name>instance-0000008c</name>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestInstancesWithCinderVolumes-server-205659850</nova:name>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:14:51</nova:creationTime>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <nova:user uuid="95ac13194f0940128d42af3d45d130fa">tempest-TestInstancesWithCinderVolumes-565485208-project-member</nova:user>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <nova:project uuid="3ae621f21a8e438fb95152309b38cee5">tempest-TestInstancesWithCinderVolumes-565485208</nova:project>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <nova:port uuid="2957b316-2d74-4b52-bfc9-52a2c5b56c01">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <entry name="serial">633b85ea-a47c-4be0-b06d-388aa421728b</entry>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <entry name="uuid">633b85ea-a47c-4be0-b06d-388aa421728b</entry>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/633b85ea-a47c-4be0-b06d-388aa421728b_disk.config">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="volumes/volume-bdfc6219-716c-481a-916d-78de375d66c3">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <serial>bdfc6219-716c-481a-916d-78de375d66c3</serial>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:89:c7:7e"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <target dev="tap2957b316-2d"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/633b85ea-a47c-4be0-b06d-388aa421728b/console.log" append="off"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:14:52 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:14:52 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:14:52 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:14:52 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.286 227766 DEBUG nova.compute.manager [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Preparing to wait for external event network-vif-plugged-2957b316-2d74-4b52-bfc9-52a2c5b56c01 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.287 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "633b85ea-a47c-4be0-b06d-388aa421728b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.287 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.287 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.288 227766 DEBUG nova.virt.libvirt.vif [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:14:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-205659850',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-205659850',id=140,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPuMczToXGmZUNyxG5fVGeV6xaoJVOpQ6Lh9dx5t6v22bv4xalVGQLUjYNEpg7ajkuOU/WHiNfvMhffjZHY/YojnQQYOX+q0GTa9+NPbkGDFf1XELa+vTNvIe6ZV8CwP9g==',key_name='tempest-TestInstancesWithCinderVolumes-232096272',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ae621f21a8e438fb95152309b38cee5',ramdisk_id='',reservation_id='r-rpa9cnu8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',
owner_project_name='tempest-TestInstancesWithCinderVolumes-565485208',owner_user_name='tempest-TestInstancesWithCinderVolumes-565485208-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:14:45Z,user_data=None,user_id='95ac13194f0940128d42af3d45d130fa',uuid=633b85ea-a47c-4be0-b06d-388aa421728b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.288 227766 DEBUG nova.network.os_vif_util [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Converting VIF {"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.289 227766 DEBUG nova.network.os_vif_util [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:89:c7:7e,bridge_name='br-int',has_traffic_filtering=True,id=2957b316-2d74-4b52-bfc9-52a2c5b56c01,network=Network(f98d79de-4a23-4f29-9848-c5d4c5683a5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2957b316-2d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.289 227766 DEBUG os_vif [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:c7:7e,bridge_name='br-int',has_traffic_filtering=True,id=2957b316-2d74-4b52-bfc9-52a2c5b56c01,network=Network(f98d79de-4a23-4f29-9848-c5d4c5683a5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2957b316-2d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.290 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.290 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.290 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.294 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.295 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2957b316-2d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.295 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2957b316-2d, col_values=(('external_ids', {'iface-id': '2957b316-2d74-4b52-bfc9-52a2c5b56c01', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:89:c7:7e', 'vm-uuid': '633b85ea-a47c-4be0-b06d-388aa421728b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.297 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:52 np0005593234 NetworkManager[48942]: <info>  [1769163292.2977] manager: (tap2957b316-2d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/282)
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.299 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.302 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.303 227766 INFO os_vif [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:89:c7:7e,bridge_name='br-int',has_traffic_filtering=True,id=2957b316-2d74-4b52-bfc9-52a2c5b56c01,network=Network(f98d79de-4a23-4f29-9848-c5d4c5683a5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2957b316-2d')#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.311 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.378 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.378 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.378 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No VIF found with MAC fa:16:3e:89:c7:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.378 227766 INFO nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Using config drive#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.405 227766 DEBUG nova.storage.rbd_utils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] rbd image 633b85ea-a47c-4be0-b06d-388aa421728b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:14:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.954 227766 INFO nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Creating config drive at /var/lib/nova/instances/633b85ea-a47c-4be0-b06d-388aa421728b/disk.config#033[00m
Jan 23 05:14:52 np0005593234 nova_compute[227762]: 2026-01-23 10:14:52.960 227766 DEBUG oslo_concurrency.processutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/633b85ea-a47c-4be0-b06d-388aa421728b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8fdxef4b execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:53 np0005593234 nova_compute[227762]: 2026-01-23 10:14:53.092 227766 DEBUG oslo_concurrency.processutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/633b85ea-a47c-4be0-b06d-388aa421728b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8fdxef4b" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:53 np0005593234 nova_compute[227762]: 2026-01-23 10:14:53.123 227766 DEBUG nova.storage.rbd_utils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] rbd image 633b85ea-a47c-4be0-b06d-388aa421728b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:14:53 np0005593234 nova_compute[227762]: 2026-01-23 10:14:53.127 227766 DEBUG oslo_concurrency.processutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/633b85ea-a47c-4be0-b06d-388aa421728b/disk.config 633b85ea-a47c-4be0-b06d-388aa421728b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:14:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:53.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:53 np0005593234 nova_compute[227762]: 2026-01-23 10:14:53.524 227766 DEBUG oslo_concurrency.processutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/633b85ea-a47c-4be0-b06d-388aa421728b/disk.config 633b85ea-a47c-4be0-b06d-388aa421728b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:14:53 np0005593234 nova_compute[227762]: 2026-01-23 10:14:53.526 227766 INFO nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Deleting local config drive /var/lib/nova/instances/633b85ea-a47c-4be0-b06d-388aa421728b/disk.config because it was imported into RBD.#033[00m
Jan 23 05:14:53 np0005593234 kernel: tap2957b316-2d: entered promiscuous mode
Jan 23 05:14:53 np0005593234 NetworkManager[48942]: <info>  [1769163293.5784] manager: (tap2957b316-2d): new Tun device (/org/freedesktop/NetworkManager/Devices/283)
Jan 23 05:14:53 np0005593234 nova_compute[227762]: 2026-01-23 10:14:53.579 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:53 np0005593234 ovn_controller[134547]: 2026-01-23T10:14:53Z|00573|binding|INFO|Claiming lport 2957b316-2d74-4b52-bfc9-52a2c5b56c01 for this chassis.
Jan 23 05:14:53 np0005593234 ovn_controller[134547]: 2026-01-23T10:14:53Z|00574|binding|INFO|2957b316-2d74-4b52-bfc9-52a2c5b56c01: Claiming fa:16:3e:89:c7:7e 10.100.0.12
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.591 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:c7:7e 10.100.0.12'], port_security=['fa:16:3e:89:c7:7e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '633b85ea-a47c-4be0-b06d-388aa421728b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ae621f21a8e438fb95152309b38cee5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3b0a0b41-45a8-4582-a4d2-a9aff1f1a18c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5888498-07d6-4c96-95ee-546974eebd82, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=2957b316-2d74-4b52-bfc9-52a2c5b56c01) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.594 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 2957b316-2d74-4b52-bfc9-52a2c5b56c01 in datapath f98d79de-4a23-4f29-9848-c5d4c5683a5d bound to our chassis#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.596 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f98d79de-4a23-4f29-9848-c5d4c5683a5d#033[00m
Jan 23 05:14:53 np0005593234 ovn_controller[134547]: 2026-01-23T10:14:53Z|00575|binding|INFO|Setting lport 2957b316-2d74-4b52-bfc9-52a2c5b56c01 ovn-installed in OVS
Jan 23 05:14:53 np0005593234 ovn_controller[134547]: 2026-01-23T10:14:53Z|00576|binding|INFO|Setting lport 2957b316-2d74-4b52-bfc9-52a2c5b56c01 up in Southbound
Jan 23 05:14:53 np0005593234 nova_compute[227762]: 2026-01-23 10:14:53.600 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:53 np0005593234 nova_compute[227762]: 2026-01-23 10:14:53.602 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.611 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e517aa4c-ba40-4799-92d9-fbeb2b9c8368]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.612 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf98d79de-41 in ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.615 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf98d79de-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.615 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c01fc7b0-edc0-4794-b7c5-fd4cf3132829]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.616 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[945287d6-478d-4c69-a966-a6a8da861e7e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 systemd-udevd[292263]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:14:53 np0005593234 systemd-machined[195626]: New machine qemu-68-instance-0000008c.
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.628 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[98a0594d-5977-4cad-bb7d-9ca91d095ab5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 NetworkManager[48942]: <info>  [1769163293.6319] device (tap2957b316-2d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:14:53 np0005593234 NetworkManager[48942]: <info>  [1769163293.6329] device (tap2957b316-2d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:14:53 np0005593234 systemd[1]: Started Virtual Machine qemu-68-instance-0000008c.
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.652 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ad4cdad2-507e-4c43-85cb-0e9a8393a6d9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.681 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b70add2f-7d14-4eaf-a049-e62f70d59fa8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.686 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c12bfa52-80b2-4878-9403-cdd2070f8c6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 systemd-udevd[292267]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:14:53 np0005593234 NetworkManager[48942]: <info>  [1769163293.6878] manager: (tapf98d79de-40): new Veth device (/org/freedesktop/NetworkManager/Devices/284)
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.720 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[445b6591-000c-4615-9685-4f398074653c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.723 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[bf1d359a-34cb-4045-8574-7198b5632c52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 NetworkManager[48942]: <info>  [1769163293.7447] device (tapf98d79de-40): carrier: link connected
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.749 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b40dd15a-c5e1-4ddb-ab84-bf945ffb462d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.768 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[89cbc9c2-8e8f-4f35-ad0b-bccf0dad26b9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf98d79de-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:3d:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718870, 'reachable_time': 36864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 292296, 'error': None, 'target': 'ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.787 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4663d589-4a84-4b70-9c97-f54f1144976d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe79:3d5f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 718870, 'tstamp': 718870}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 292297, 'error': None, 'target': 'ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.801 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f62d5dd7-9188-4690-9e83-1c7357f582e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf98d79de-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:79:3d:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 183], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718870, 'reachable_time': 36864, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 292298, 'error': None, 'target': 'ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.837 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[76dcaf8f-b7ac-484c-a09d-0cf59c7f6653]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.896 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b9fce204-695e-487a-aa88-9dbe8a31870f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.897 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf98d79de-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.898 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.899 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf98d79de-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:53 np0005593234 kernel: tapf98d79de-40: entered promiscuous mode
Jan 23 05:14:53 np0005593234 nova_compute[227762]: 2026-01-23 10:14:53.900 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:53 np0005593234 NetworkManager[48942]: <info>  [1769163293.9011] manager: (tapf98d79de-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/285)
Jan 23 05:14:53 np0005593234 nova_compute[227762]: 2026-01-23 10:14:53.902 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.904 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf98d79de-40, col_values=(('external_ids', {'iface-id': '2c16e447-27d9-4516-bf23-ec948f375c10'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:14:53 np0005593234 nova_compute[227762]: 2026-01-23 10:14:53.905 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:53 np0005593234 ovn_controller[134547]: 2026-01-23T10:14:53Z|00577|binding|INFO|Releasing lport 2c16e447-27d9-4516-bf23-ec948f375c10 from this chassis (sb_readonly=0)
Jan 23 05:14:53 np0005593234 nova_compute[227762]: 2026-01-23 10:14:53.919 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.921 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f98d79de-4a23-4f29-9848-c5d4c5683a5d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f98d79de-4a23-4f29-9848-c5d4c5683a5d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.922 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[468c0e22-fcfe-401f-ba50-3ce9d17784c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.922 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-f98d79de-4a23-4f29-9848-c5d4c5683a5d
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/f98d79de-4a23-4f29-9848-c5d4c5683a5d.pid.haproxy
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID f98d79de-4a23-4f29-9848-c5d4c5683a5d
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:14:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:14:53.923 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'env', 'PROCESS_TAG=haproxy-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f98d79de-4a23-4f29-9848-c5d4c5683a5d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:14:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:54.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.379 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163294.3785324, 633b85ea-a47c-4be0-b06d-388aa421728b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.380 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] VM Started (Lifecycle Event)#033[00m
Jan 23 05:14:54 np0005593234 podman[292366]: 2026-01-23 10:14:54.288430479 +0000 UTC m=+0.024252295 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.404 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.409 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163294.3792272, 633b85ea-a47c-4be0-b06d-388aa421728b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.409 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.458 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.461 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.491 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.741 227766 DEBUG nova.compute.manager [req-ab01e823-d8ca-401f-9e10-6c4653cfa421 req-6ed52b8d-f116-4b72-94b1-21559bd743d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Received event network-vif-plugged-2957b316-2d74-4b52-bfc9-52a2c5b56c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.742 227766 DEBUG oslo_concurrency.lockutils [req-ab01e823-d8ca-401f-9e10-6c4653cfa421 req-6ed52b8d-f116-4b72-94b1-21559bd743d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "633b85ea-a47c-4be0-b06d-388aa421728b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.742 227766 DEBUG oslo_concurrency.lockutils [req-ab01e823-d8ca-401f-9e10-6c4653cfa421 req-6ed52b8d-f116-4b72-94b1-21559bd743d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.742 227766 DEBUG oslo_concurrency.lockutils [req-ab01e823-d8ca-401f-9e10-6c4653cfa421 req-6ed52b8d-f116-4b72-94b1-21559bd743d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.742 227766 DEBUG nova.compute.manager [req-ab01e823-d8ca-401f-9e10-6c4653cfa421 req-6ed52b8d-f116-4b72-94b1-21559bd743d5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Processing event network-vif-plugged-2957b316-2d74-4b52-bfc9-52a2c5b56c01 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.743 227766 DEBUG nova.compute.manager [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.746 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163294.7460303, 633b85ea-a47c-4be0-b06d-388aa421728b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.746 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.748 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.750 227766 INFO nova.virt.libvirt.driver [-] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Instance spawned successfully.#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.751 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.783 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.788 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.792 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.792 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.793 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.793 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.793 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.794 227766 DEBUG nova.virt.libvirt.driver [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.825 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:14:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:14:54 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4276578476' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.869 227766 INFO nova.compute.manager [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Took 6.52 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.869 227766 DEBUG nova.compute.manager [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.914 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Updating instance_info_cache with network_info: [{"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:14:54 np0005593234 podman[292366]: 2026-01-23 10:14:54.932803378 +0000 UTC m=+0.668625184 container create aa6b1ec4b90d38ca76ec3aaca7ac8c3b7ae78610187fae9eaca8d61fd7ef4f38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.943 227766 INFO nova.compute.manager [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Took 10.83 seconds to build instance.#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.945 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.945 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.945 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.960 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.960 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.962 227766 DEBUG nova.network.neutron [req-14e01a94-6010-4a49-b86a-9bcd040eab47 req-57350d7a-63c5-4ca2-8dee-16b914d91821 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updated VIF entry in instance network info cache for port 2957b316-2d74-4b52-bfc9-52a2c5b56c01. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.962 227766 DEBUG nova.network.neutron [req-14e01a94-6010-4a49-b86a-9bcd040eab47 req-57350d7a-63c5-4ca2-8dee-16b914d91821 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updating instance_info_cache with network_info: [{"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.980 227766 DEBUG oslo_concurrency.lockutils [None req-8e791dad-16bb-4df3-8aec-b08833cc38cd 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.965s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:54 np0005593234 systemd[1]: Started libpod-conmon-aa6b1ec4b90d38ca76ec3aaca7ac8c3b7ae78610187fae9eaca8d61fd7ef4f38.scope.
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.991 227766 DEBUG oslo_concurrency.lockutils [req-14e01a94-6010-4a49-b86a-9bcd040eab47 req-57350d7a-63c5-4ca2-8dee-16b914d91821 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.994 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Triggering sync for uuid 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.994 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Triggering sync for uuid c38b8bfe-1b70-4daf-b676-250c1e933ed4 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.994 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Triggering sync for uuid 633b85ea-a47c-4be0-b06d-388aa421728b _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.995 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.995 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.995 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.996 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.996 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "633b85ea-a47c-4be0-b06d-388aa421728b" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:54 np0005593234 nova_compute[227762]: 2026-01-23 10:14:54.996 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:55 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:14:55 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3aa1dd529c3ccbaab298f2eb778eabfaf5ca007f4f142a37e078d0fc7a1999f9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:14:55 np0005593234 nova_compute[227762]: 2026-01-23 10:14:55.030 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:55 np0005593234 nova_compute[227762]: 2026-01-23 10:14:55.034 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:55 np0005593234 nova_compute[227762]: 2026-01-23 10:14:55.035 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:55 np0005593234 podman[292366]: 2026-01-23 10:14:55.037235014 +0000 UTC m=+0.773056840 container init aa6b1ec4b90d38ca76ec3aaca7ac8c3b7ae78610187fae9eaca8d61fd7ef4f38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:14:55 np0005593234 podman[292366]: 2026-01-23 10:14:55.043839529 +0000 UTC m=+0.779661335 container start aa6b1ec4b90d38ca76ec3aaca7ac8c3b7ae78610187fae9eaca8d61fd7ef4f38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:14:55 np0005593234 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[292387]: [NOTICE]   (292391) : New worker (292393) forked
Jan 23 05:14:55 np0005593234 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[292387]: [NOTICE]   (292391) : Loading success.
Jan 23 05:14:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:55.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:56.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:56 np0005593234 nova_compute[227762]: 2026-01-23 10:14:56.915 227766 DEBUG nova.compute.manager [req-f92fd391-c274-48aa-8799-341e2490459e req-0ee54004-f95f-4262-b028-db50987bd844 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Received event network-vif-plugged-2957b316-2d74-4b52-bfc9-52a2c5b56c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:14:56 np0005593234 nova_compute[227762]: 2026-01-23 10:14:56.916 227766 DEBUG oslo_concurrency.lockutils [req-f92fd391-c274-48aa-8799-341e2490459e req-0ee54004-f95f-4262-b028-db50987bd844 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "633b85ea-a47c-4be0-b06d-388aa421728b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:56 np0005593234 nova_compute[227762]: 2026-01-23 10:14:56.917 227766 DEBUG oslo_concurrency.lockutils [req-f92fd391-c274-48aa-8799-341e2490459e req-0ee54004-f95f-4262-b028-db50987bd844 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:56 np0005593234 nova_compute[227762]: 2026-01-23 10:14:56.917 227766 DEBUG oslo_concurrency.lockutils [req-f92fd391-c274-48aa-8799-341e2490459e req-0ee54004-f95f-4262-b028-db50987bd844 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:56 np0005593234 nova_compute[227762]: 2026-01-23 10:14:56.917 227766 DEBUG nova.compute.manager [req-f92fd391-c274-48aa-8799-341e2490459e req-0ee54004-f95f-4262-b028-db50987bd844 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] No waiting events found dispatching network-vif-plugged-2957b316-2d74-4b52-bfc9-52a2c5b56c01 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:14:56 np0005593234 nova_compute[227762]: 2026-01-23 10:14:56.917 227766 WARNING nova.compute.manager [req-f92fd391-c274-48aa-8799-341e2490459e req-0ee54004-f95f-4262-b028-db50987bd844 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Received unexpected event network-vif-plugged-2957b316-2d74-4b52-bfc9-52a2c5b56c01 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:14:57 np0005593234 nova_compute[227762]: 2026-01-23 10:14:57.277 227766 DEBUG oslo_concurrency.lockutils [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "633b85ea-a47c-4be0-b06d-388aa421728b" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:57 np0005593234 nova_compute[227762]: 2026-01-23 10:14:57.277 227766 DEBUG oslo_concurrency.lockutils [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:57 np0005593234 nova_compute[227762]: 2026-01-23 10:14:57.299 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:57 np0005593234 nova_compute[227762]: 2026-01-23 10:14:57.312 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:14:57 np0005593234 nova_compute[227762]: 2026-01-23 10:14:57.322 227766 DEBUG nova.objects.instance [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'flavor' on Instance uuid 633b85ea-a47c-4be0-b06d-388aa421728b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:14:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:57.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:57 np0005593234 nova_compute[227762]: 2026-01-23 10:14:57.396 227766 DEBUG oslo_concurrency.lockutils [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:14:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e306 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:14:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:58.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e307 e307: 3 total, 3 up, 3 in
Jan 23 05:14:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:14:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:14:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:59.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:14:59 np0005593234 nova_compute[227762]: 2026-01-23 10:14:59.501 227766 DEBUG oslo_concurrency.lockutils [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "633b85ea-a47c-4be0-b06d-388aa421728b" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:14:59 np0005593234 nova_compute[227762]: 2026-01-23 10:14:59.501 227766 DEBUG oslo_concurrency.lockutils [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:14:59 np0005593234 nova_compute[227762]: 2026-01-23 10:14:59.502 227766 INFO nova.compute.manager [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Attaching volume 06664261-bdf3-44c1-9f40-a29c0039b070 to /dev/vdb#033[00m
Jan 23 05:15:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:00.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.238 227766 DEBUG os_brick.utils [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.239 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.254 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.254 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[becd8c85-f2ce-4b0b-b83c-04a6720238e2]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.256 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.267 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.268 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[ca47f66f-45e0-4fb0-9d4f-2749ea9491f4]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.269 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.280 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.280 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[fa59ce56-db80-4c44-85f7-41bd4bf6d78f]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.281 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[c4d3ad30-c4cd-4216-880a-f1355227d720]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.282 227766 DEBUG oslo_concurrency.processutils [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.305 227766 DEBUG oslo_concurrency.processutils [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "nvme version" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.308 227766 DEBUG os_brick.initiator.connectors.lightos [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.309 227766 DEBUG os_brick.initiator.connectors.lightos [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.309 227766 DEBUG os_brick.initiator.connectors.lightos [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.310 227766 DEBUG os_brick.utils [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] <== get_connector_properties: return (70ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:15:00 np0005593234 nova_compute[227762]: 2026-01-23 10:15:00.310 227766 DEBUG nova.virt.block_device [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updating existing volume attachment record: 4746c25b-1702-4d42-aee7-aa52d6e3ff92 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:15:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:01.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:15:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:15:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:15:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:15:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:15:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:15:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e308 e308: 3 total, 3 up, 3 in
Jan 23 05:15:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:15:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:02.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:15:02 np0005593234 nova_compute[227762]: 2026-01-23 10:15:02.302 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:02 np0005593234 nova_compute[227762]: 2026-01-23 10:15:02.314 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:15:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e308 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:02 np0005593234 nova_compute[227762]: 2026-01-23 10:15:02.778 227766 DEBUG nova.objects.instance [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'flavor' on Instance uuid 633b85ea-a47c-4be0-b06d-388aa421728b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:02 np0005593234 nova_compute[227762]: 2026-01-23 10:15:02.863 227766 DEBUG nova.virt.libvirt.driver [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Attempting to attach volume 06664261-bdf3-44c1-9f40-a29c0039b070 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:15:02 np0005593234 nova_compute[227762]: 2026-01-23 10:15:02.866 227766 DEBUG nova.virt.libvirt.guest [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:15:02 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:15:02 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-06664261-bdf3-44c1-9f40-a29c0039b070">
Jan 23 05:15:02 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:15:02 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:15:02 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:15:02 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:15:02 np0005593234 nova_compute[227762]:  <auth username="openstack">
Jan 23 05:15:02 np0005593234 nova_compute[227762]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:15:02 np0005593234 nova_compute[227762]:  </auth>
Jan 23 05:15:02 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:15:02 np0005593234 nova_compute[227762]:  <serial>06664261-bdf3-44c1-9f40-a29c0039b070</serial>
Jan 23 05:15:02 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:15:02 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:15:03 np0005593234 nova_compute[227762]: 2026-01-23 10:15:03.315 227766 DEBUG nova.virt.libvirt.driver [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:03 np0005593234 nova_compute[227762]: 2026-01-23 10:15:03.315 227766 DEBUG nova.virt.libvirt.driver [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:03 np0005593234 nova_compute[227762]: 2026-01-23 10:15:03.316 227766 DEBUG nova.virt.libvirt.driver [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:03 np0005593234 nova_compute[227762]: 2026-01-23 10:15:03.316 227766 DEBUG nova.virt.libvirt.driver [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No VIF found with MAC fa:16:3e:89:c7:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:15:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:03.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:04.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:05 np0005593234 nova_compute[227762]: 2026-01-23 10:15:05.093 227766 DEBUG oslo_concurrency.lockutils [None req-3eef7be0-b971-4089-8822-83e5d516a535 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 5.591s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:05.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:15:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:06.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:15:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:15:06 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1555104024' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:15:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 e309: 3 total, 3 up, 3 in
Jan 23 05:15:07 np0005593234 nova_compute[227762]: 2026-01-23 10:15:07.306 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:07 np0005593234 nova_compute[227762]: 2026-01-23 10:15:07.317 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:15:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:07.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:15:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:15:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:15:08 np0005593234 nova_compute[227762]: 2026-01-23 10:15:08.021 227766 DEBUG oslo_concurrency.lockutils [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "633b85ea-a47c-4be0-b06d-388aa421728b" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:08 np0005593234 nova_compute[227762]: 2026-01-23 10:15:08.021 227766 DEBUG oslo_concurrency.lockutils [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:08.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:08 np0005593234 nova_compute[227762]: 2026-01-23 10:15:08.115 227766 DEBUG nova.objects.instance [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'flavor' on Instance uuid 633b85ea-a47c-4be0-b06d-388aa421728b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:08 np0005593234 nova_compute[227762]: 2026-01-23 10:15:08.453 227766 DEBUG oslo_concurrency.lockutils [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.432s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:09.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:09 np0005593234 nova_compute[227762]: 2026-01-23 10:15:09.669 227766 DEBUG oslo_concurrency.lockutils [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "633b85ea-a47c-4be0-b06d-388aa421728b" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:09 np0005593234 nova_compute[227762]: 2026-01-23 10:15:09.670 227766 DEBUG oslo_concurrency.lockutils [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:09 np0005593234 nova_compute[227762]: 2026-01-23 10:15:09.670 227766 INFO nova.compute.manager [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Attaching volume eb20e12a-5d9c-4034-8e50-a0b613fd5f3d to /dev/vdc#033[00m
Jan 23 05:15:09 np0005593234 ovn_controller[134547]: 2026-01-23T10:15:09Z|00061|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:89:c7:7e 10.100.0.12
Jan 23 05:15:09 np0005593234 ovn_controller[134547]: 2026-01-23T10:15:09Z|00062|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:89:c7:7e 10.100.0.12
Jan 23 05:15:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:10.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.571 227766 DEBUG os_brick.utils [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.572 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.584 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.584 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[6f9f97a9-1c67-4a03-a7d1-dce475b17e7d]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.585 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.594 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.595 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[93916b0f-d565-498e-a835-f79e9f6a37c0]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.596 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.605 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.605 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[b5e3611e-1c96-4224-88dc-87b8ea4d8062]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.606 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[72d1c9ef-9f5a-4272-9f63-59a02b113f3b]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.606 227766 DEBUG oslo_concurrency.processutils [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.634 227766 DEBUG oslo_concurrency.processutils [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "nvme version" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.637 227766 DEBUG os_brick.initiator.connectors.lightos [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.637 227766 DEBUG os_brick.initiator.connectors.lightos [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.637 227766 DEBUG os_brick.initiator.connectors.lightos [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.638 227766 DEBUG os_brick.utils [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] <== get_connector_properties: return (66ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:15:10 np0005593234 nova_compute[227762]: 2026-01-23 10:15:10.638 227766 DEBUG nova.virt.block_device [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updating existing volume attachment record: 90f1549a-989c-45d3-941a-597a1c413490 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:15:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:15:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:11.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:15:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:12.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:15:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2682085718' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:15:12 np0005593234 nova_compute[227762]: 2026-01-23 10:15:12.310 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:12 np0005593234 nova_compute[227762]: 2026-01-23 10:15:12.320 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:12 np0005593234 nova_compute[227762]: 2026-01-23 10:15:12.524 227766 DEBUG nova.objects.instance [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'flavor' on Instance uuid 633b85ea-a47c-4be0-b06d-388aa421728b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:12 np0005593234 nova_compute[227762]: 2026-01-23 10:15:12.554 227766 DEBUG nova.virt.libvirt.driver [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Attempting to attach volume eb20e12a-5d9c-4034-8e50-a0b613fd5f3d with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:15:12 np0005593234 nova_compute[227762]: 2026-01-23 10:15:12.556 227766 DEBUG nova.virt.libvirt.guest [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:15:12 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:15:12 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-eb20e12a-5d9c-4034-8e50-a0b613fd5f3d">
Jan 23 05:15:12 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:15:12 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:15:12 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:15:12 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:15:12 np0005593234 nova_compute[227762]:  <auth username="openstack">
Jan 23 05:15:12 np0005593234 nova_compute[227762]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:15:12 np0005593234 nova_compute[227762]:  </auth>
Jan 23 05:15:12 np0005593234 nova_compute[227762]:  <target dev="vdc" bus="virtio"/>
Jan 23 05:15:12 np0005593234 nova_compute[227762]:  <serial>eb20e12a-5d9c-4034-8e50-a0b613fd5f3d</serial>
Jan 23 05:15:12 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:15:12 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:15:12 np0005593234 nova_compute[227762]: 2026-01-23 10:15:12.736 227766 DEBUG nova.virt.libvirt.driver [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:12 np0005593234 nova_compute[227762]: 2026-01-23 10:15:12.737 227766 DEBUG nova.virt.libvirt.driver [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:12 np0005593234 nova_compute[227762]: 2026-01-23 10:15:12.737 227766 DEBUG nova.virt.libvirt.driver [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:12 np0005593234 nova_compute[227762]: 2026-01-23 10:15:12.737 227766 DEBUG nova.virt.libvirt.driver [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:15:12 np0005593234 nova_compute[227762]: 2026-01-23 10:15:12.738 227766 DEBUG nova.virt.libvirt.driver [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] No VIF found with MAC fa:16:3e:89:c7:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:15:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:13.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:14.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:15.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:15 np0005593234 podman[292820]: 2026-01-23 10:15:15.766537596 +0000 UTC m=+0.056700964 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:15:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:16.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:17 np0005593234 nova_compute[227762]: 2026-01-23 10:15:17.314 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:17 np0005593234 nova_compute[227762]: 2026-01-23 10:15:17.322 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:17.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:15:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:18.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:15:19 np0005593234 nova_compute[227762]: 2026-01-23 10:15:19.403 227766 DEBUG oslo_concurrency.lockutils [None req-45981b71-1525-41ad-810c-54e04dab77a0 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 9.734s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:19.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:20.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:21.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:21 np0005593234 podman[292842]: 2026-01-23 10:15:21.786801125 +0000 UTC m=+0.086412718 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:15:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:22.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:22 np0005593234 nova_compute[227762]: 2026-01-23 10:15:22.317 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:22 np0005593234 nova_compute[227762]: 2026-01-23 10:15:22.324 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:23.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:24.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:24 np0005593234 nova_compute[227762]: 2026-01-23 10:15:24.622 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:24.621 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=51, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=50) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:15:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:24.624 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:15:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:25.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:26.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:27 np0005593234 nova_compute[227762]: 2026-01-23 10:15:27.320 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:27 np0005593234 nova_compute[227762]: 2026-01-23 10:15:27.327 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:27.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:27 np0005593234 nova_compute[227762]: 2026-01-23 10:15:27.453 227766 DEBUG nova.compute.manager [req-ba02075b-5c6a-4d31-853c-2026a1e8a25d req-459e1c29-b9ca-4d23-b1fd-90598a2865f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Received event network-changed-2957b316-2d74-4b52-bfc9-52a2c5b56c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:27 np0005593234 nova_compute[227762]: 2026-01-23 10:15:27.453 227766 DEBUG nova.compute.manager [req-ba02075b-5c6a-4d31-853c-2026a1e8a25d req-459e1c29-b9ca-4d23-b1fd-90598a2865f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Refreshing instance network info cache due to event network-changed-2957b316-2d74-4b52-bfc9-52a2c5b56c01. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:15:27 np0005593234 nova_compute[227762]: 2026-01-23 10:15:27.454 227766 DEBUG oslo_concurrency.lockutils [req-ba02075b-5c6a-4d31-853c-2026a1e8a25d req-459e1c29-b9ca-4d23-b1fd-90598a2865f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:15:27 np0005593234 nova_compute[227762]: 2026-01-23 10:15:27.454 227766 DEBUG oslo_concurrency.lockutils [req-ba02075b-5c6a-4d31-853c-2026a1e8a25d req-459e1c29-b9ca-4d23-b1fd-90598a2865f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:15:27 np0005593234 nova_compute[227762]: 2026-01-23 10:15:27.455 227766 DEBUG nova.network.neutron [req-ba02075b-5c6a-4d31-853c-2026a1e8a25d req-459e1c29-b9ca-4d23-b1fd-90598a2865f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Refreshing network info cache for port 2957b316-2d74-4b52-bfc9-52a2c5b56c01 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:15:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:27.626 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '51'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:15:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1472672131' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:15:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:15:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1472672131' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:15:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:15:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:28.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.556 227766 DEBUG oslo_concurrency.lockutils [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.557 227766 DEBUG oslo_concurrency.lockutils [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.557 227766 DEBUG oslo_concurrency.lockutils [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.557 227766 DEBUG oslo_concurrency.lockutils [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.557 227766 DEBUG oslo_concurrency.lockutils [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.559 227766 INFO nova.compute.manager [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Terminating instance#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.560 227766 DEBUG nova.compute.manager [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:15:28 np0005593234 kernel: tap8be6de92-c5 (unregistering): left promiscuous mode
Jan 23 05:15:28 np0005593234 NetworkManager[48942]: <info>  [1769163328.6331] device (tap8be6de92-c5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.644 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:15:28Z|00578|binding|INFO|Releasing lport 8be6de92-c581-49d7-a315-1d1b8c33153a from this chassis (sb_readonly=0)
Jan 23 05:15:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:15:28Z|00579|binding|INFO|Setting lport 8be6de92-c581-49d7-a315-1d1b8c33153a down in Southbound
Jan 23 05:15:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:15:28Z|00580|binding|INFO|Removing iface tap8be6de92-c5 ovn-installed in OVS
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.648 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:28.659 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:07:56 10.100.0.8'], port_security=['fa:16:3e:d6:07:56 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8d9599b4-8855-4310-af02-cdd058438f7d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9dd869ce76e44fc8a82b8bbee1654d33', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b5b72284-9167-4768-aa53-98b2ad243e70', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=875f4baa-cb85-49ca-8f02-78715d351fdb, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=8be6de92-c581-49d7-a315-1d1b8c33153a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:15:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:28.661 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 8be6de92-c581-49d7-a315-1d1b8c33153a in datapath 8d9599b4-8855-4310-af02-cdd058438f7d unbound from our chassis#033[00m
Jan 23 05:15:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:28.663 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8d9599b4-8855-4310-af02-cdd058438f7d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.663 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:28.666 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[83967750-0048-415f-a637-d814bcbc7626]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:28.667 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d namespace which is not needed anymore#033[00m
Jan 23 05:15:28 np0005593234 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000007f.scope: Deactivated successfully.
Jan 23 05:15:28 np0005593234 systemd[1]: machine-qemu\x2d59\x2dinstance\x2d0000007f.scope: Consumed 27.578s CPU time.
Jan 23 05:15:28 np0005593234 systemd-machined[195626]: Machine qemu-59-instance-0000007f terminated.
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.802 227766 INFO nova.virt.libvirt.driver [-] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Instance destroyed successfully.#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.803 227766 DEBUG nova.objects.instance [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lazy-loading 'resources' on Instance uuid 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:28 np0005593234 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[286235]: [NOTICE]   (286239) : haproxy version is 2.8.14-c23fe91
Jan 23 05:15:28 np0005593234 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[286235]: [NOTICE]   (286239) : path to executable is /usr/sbin/haproxy
Jan 23 05:15:28 np0005593234 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[286235]: [WARNING]  (286239) : Exiting Master process...
Jan 23 05:15:28 np0005593234 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[286235]: [WARNING]  (286239) : Exiting Master process...
Jan 23 05:15:28 np0005593234 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[286235]: [ALERT]    (286239) : Current worker (286241) exited with code 143 (Terminated)
Jan 23 05:15:28 np0005593234 neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d[286235]: [WARNING]  (286239) : All workers exited. Exiting... (0)
Jan 23 05:15:28 np0005593234 systemd[1]: libpod-d5cdaa854f806be32587ef3b74d7c4eac6df59bc023c39125f0f7ab7f630dfa0.scope: Deactivated successfully.
Jan 23 05:15:28 np0005593234 podman[292895]: 2026-01-23 10:15:28.817198812 +0000 UTC m=+0.051640636 container died d5cdaa854f806be32587ef3b74d7c4eac6df59bc023c39125f0f7ab7f630dfa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 05:15:28 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d5cdaa854f806be32587ef3b74d7c4eac6df59bc023c39125f0f7ab7f630dfa0-userdata-shm.mount: Deactivated successfully.
Jan 23 05:15:28 np0005593234 systemd[1]: var-lib-containers-storage-overlay-e172f9831372312d99f1618df0ac2daa8d2ad83be2d9f75197e922d10e277efd-merged.mount: Deactivated successfully.
Jan 23 05:15:28 np0005593234 podman[292895]: 2026-01-23 10:15:28.865037969 +0000 UTC m=+0.099479793 container cleanup d5cdaa854f806be32587ef3b74d7c4eac6df59bc023c39125f0f7ab7f630dfa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:15:28 np0005593234 systemd[1]: libpod-conmon-d5cdaa854f806be32587ef3b74d7c4eac6df59bc023c39125f0f7ab7f630dfa0.scope: Deactivated successfully.
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.900 227766 DEBUG nova.virt.libvirt.vif [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:09:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerActionsTestOtherB-server-1792998',display_name='tempest-ServerActionsTestOtherB-server-1792998',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionstestotherb-server-1792998',id=127,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:09:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9dd869ce76e44fc8a82b8bbee1654d33',ramdisk_id='',reservation_id='r-3xuyr6l4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsTestOtherB-1052932467',owner_user_name='tempest-ServerActionsTestOtherB-1052932467-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:09:45Z,user_data=None,user_id='aca3cab576d641d3b89e7dddf155d467',uuid=11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "8be6de92-c581-49d7-a315-1d1b8c33153a", "address": "fa:16:3e:d6:07:56", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8be6de92-c5", "ovs_interfaceid": "8be6de92-c581-49d7-a315-1d1b8c33153a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.901 227766 DEBUG nova.network.os_vif_util [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converting VIF {"id": "8be6de92-c581-49d7-a315-1d1b8c33153a", "address": "fa:16:3e:d6:07:56", "network": {"id": "8d9599b4-8855-4310-af02-cdd058438f7d", "bridge": "br-int", "label": "tempest-ServerActionsTestOtherB-1325714374-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9dd869ce76e44fc8a82b8bbee1654d33", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap8be6de92-c5", "ovs_interfaceid": "8be6de92-c581-49d7-a315-1d1b8c33153a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.901 227766 DEBUG nova.network.os_vif_util [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d6:07:56,bridge_name='br-int',has_traffic_filtering=True,id=8be6de92-c581-49d7-a315-1d1b8c33153a,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8be6de92-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.902 227766 DEBUG os_vif [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:07:56,bridge_name='br-int',has_traffic_filtering=True,id=8be6de92-c581-49d7-a315-1d1b8c33153a,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8be6de92-c5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.904 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.904 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8be6de92-c5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.907 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.910 227766 INFO os_vif [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:07:56,bridge_name='br-int',has_traffic_filtering=True,id=8be6de92-c581-49d7-a315-1d1b8c33153a,network=Network(8d9599b4-8855-4310-af02-cdd058438f7d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap8be6de92-c5')#033[00m
Jan 23 05:15:28 np0005593234 podman[292936]: 2026-01-23 10:15:28.929627796 +0000 UTC m=+0.042911464 container remove d5cdaa854f806be32587ef3b74d7c4eac6df59bc023c39125f0f7ab7f630dfa0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:15:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:28.936 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7e831cbf-046e-4a06-a9f2-233941dbf977]: (4, ('Fri Jan 23 10:15:28 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d (d5cdaa854f806be32587ef3b74d7c4eac6df59bc023c39125f0f7ab7f630dfa0)\nd5cdaa854f806be32587ef3b74d7c4eac6df59bc023c39125f0f7ab7f630dfa0\nFri Jan 23 10:15:28 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d (d5cdaa854f806be32587ef3b74d7c4eac6df59bc023c39125f0f7ab7f630dfa0)\nd5cdaa854f806be32587ef3b74d7c4eac6df59bc023c39125f0f7ab7f630dfa0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:28.939 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc0f532-7ea0-4fe1-80a5-ecf3ca0d12ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:28.942 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8d9599b4-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.943 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593234 kernel: tap8d9599b4-80: left promiscuous mode
Jan 23 05:15:28 np0005593234 nova_compute[227762]: 2026-01-23 10:15:28.959 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:28.962 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[052cf1e1-ebc5-400e-8df8-484634dbddd5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:28.983 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5827d05c-0e3f-420c-843e-35be332dffd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:28.984 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c917a0c4-c9a3-4b77-9d92-450c008db200]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:29.000 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[44605f17-88d0-4aef-bbb0-d5b4f63df3a9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 687946, 'reachable_time': 43360, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 292969, 'error': None, 'target': 'ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:29 np0005593234 systemd[1]: run-netns-ovnmeta\x2d8d9599b4\x2d8855\x2d4310\x2daf02\x2dcdd058438f7d.mount: Deactivated successfully.
Jan 23 05:15:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:29.003 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8d9599b4-8855-4310-af02-cdd058438f7d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:15:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:29.004 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[7f63d9d9-4011-42aa-836f-fa8cb5758fa3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:15:29 np0005593234 nova_compute[227762]: 2026-01-23 10:15:29.320 227766 INFO nova.virt.libvirt.driver [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Deleting instance files /var/lib/nova/instances/11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_del#033[00m
Jan 23 05:15:29 np0005593234 nova_compute[227762]: 2026-01-23 10:15:29.321 227766 INFO nova.virt.libvirt.driver [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Deletion of /var/lib/nova/instances/11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae_del complete#033[00m
Jan 23 05:15:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:29.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:29 np0005593234 nova_compute[227762]: 2026-01-23 10:15:29.602 227766 INFO nova.compute.manager [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Took 1.04 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:15:29 np0005593234 nova_compute[227762]: 2026-01-23 10:15:29.602 227766 DEBUG oslo.service.loopingcall [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:15:29 np0005593234 nova_compute[227762]: 2026-01-23 10:15:29.603 227766 DEBUG nova.compute.manager [-] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:15:29 np0005593234 nova_compute[227762]: 2026-01-23 10:15:29.603 227766 DEBUG nova.network.neutron [-] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:15:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:30.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.148 227766 DEBUG nova.compute.manager [req-b010a8a5-a95e-499f-b08e-1c94422c93c3 req-aa8620f3-a5b7-404b-b74b-bbb847f0f6f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Received event network-changed-2957b316-2d74-4b52-bfc9-52a2c5b56c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.148 227766 DEBUG nova.compute.manager [req-b010a8a5-a95e-499f-b08e-1c94422c93c3 req-aa8620f3-a5b7-404b-b74b-bbb847f0f6f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Refreshing instance network info cache due to event network-changed-2957b316-2d74-4b52-bfc9-52a2c5b56c01. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.149 227766 DEBUG oslo_concurrency.lockutils [req-b010a8a5-a95e-499f-b08e-1c94422c93c3 req-aa8620f3-a5b7-404b-b74b-bbb847f0f6f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.284 227766 DEBUG nova.compute.manager [req-19a8a32c-75c7-401e-abac-0b1dd0e1dbc9 req-6c229363-2c81-49a2-b94e-263108644d85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Received event network-vif-unplugged-8be6de92-c581-49d7-a315-1d1b8c33153a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.285 227766 DEBUG oslo_concurrency.lockutils [req-19a8a32c-75c7-401e-abac-0b1dd0e1dbc9 req-6c229363-2c81-49a2-b94e-263108644d85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.285 227766 DEBUG oslo_concurrency.lockutils [req-19a8a32c-75c7-401e-abac-0b1dd0e1dbc9 req-6c229363-2c81-49a2-b94e-263108644d85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.285 227766 DEBUG oslo_concurrency.lockutils [req-19a8a32c-75c7-401e-abac-0b1dd0e1dbc9 req-6c229363-2c81-49a2-b94e-263108644d85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.286 227766 DEBUG nova.compute.manager [req-19a8a32c-75c7-401e-abac-0b1dd0e1dbc9 req-6c229363-2c81-49a2-b94e-263108644d85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] No waiting events found dispatching network-vif-unplugged-8be6de92-c581-49d7-a315-1d1b8c33153a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.286 227766 DEBUG nova.compute.manager [req-19a8a32c-75c7-401e-abac-0b1dd0e1dbc9 req-6c229363-2c81-49a2-b94e-263108644d85 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Received event network-vif-unplugged-8be6de92-c581-49d7-a315-1d1b8c33153a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:15:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:15:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:31.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.433 227766 DEBUG nova.network.neutron [req-ba02075b-5c6a-4d31-853c-2026a1e8a25d req-459e1c29-b9ca-4d23-b1fd-90598a2865f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updated VIF entry in instance network info cache for port 2957b316-2d74-4b52-bfc9-52a2c5b56c01. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.434 227766 DEBUG nova.network.neutron [req-ba02075b-5c6a-4d31-853c-2026a1e8a25d req-459e1c29-b9ca-4d23-b1fd-90598a2865f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updating instance_info_cache with network_info: [{"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.488 227766 DEBUG oslo_concurrency.lockutils [req-ba02075b-5c6a-4d31-853c-2026a1e8a25d req-459e1c29-b9ca-4d23-b1fd-90598a2865f1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.489 227766 DEBUG oslo_concurrency.lockutils [req-b010a8a5-a95e-499f-b08e-1c94422c93c3 req-aa8620f3-a5b7-404b-b74b-bbb847f0f6f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.489 227766 DEBUG nova.network.neutron [req-b010a8a5-a95e-499f-b08e-1c94422c93c3 req-aa8620f3-a5b7-404b-b74b-bbb847f0f6f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Refreshing network info cache for port 2957b316-2d74-4b52-bfc9-52a2c5b56c01 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.691 227766 DEBUG nova.network.neutron [-] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.725 227766 INFO nova.compute.manager [-] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Took 2.12 seconds to deallocate network for instance.#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.841 227766 DEBUG nova.compute.manager [req-94f4bb4c-194a-4a13-a3ae-9380f3829e7d req-98e6a660-55d5-422e-b6d3-7e247b243afb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Received event network-vif-deleted-8be6de92-c581-49d7-a315-1d1b8c33153a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.855 227766 DEBUG oslo_concurrency.lockutils [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:31 np0005593234 nova_compute[227762]: 2026-01-23 10:15:31.855 227766 DEBUG oslo_concurrency.lockutils [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:32 np0005593234 nova_compute[227762]: 2026-01-23 10:15:32.041 227766 DEBUG oslo_concurrency.processutils [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:15:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:32.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:15:32 np0005593234 nova_compute[227762]: 2026-01-23 10:15:32.328 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:15:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3927844482' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:15:32 np0005593234 nova_compute[227762]: 2026-01-23 10:15:32.491 227766 DEBUG oslo_concurrency.processutils [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:32 np0005593234 nova_compute[227762]: 2026-01-23 10:15:32.497 227766 DEBUG nova.compute.provider_tree [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:15:32 np0005593234 nova_compute[227762]: 2026-01-23 10:15:32.541 227766 DEBUG nova.scheduler.client.report [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:15:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:32 np0005593234 nova_compute[227762]: 2026-01-23 10:15:32.570 227766 DEBUG oslo_concurrency.lockutils [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.715s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:32 np0005593234 nova_compute[227762]: 2026-01-23 10:15:32.639 227766 INFO nova.scheduler.client.report [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Deleted allocations for instance 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae#033[00m
Jan 23 05:15:32 np0005593234 nova_compute[227762]: 2026-01-23 10:15:32.780 227766 DEBUG oslo_concurrency.lockutils [None req-fa102482-656f-4d64-8c84-3846786c2a9a aca3cab576d641d3b89e7dddf155d467 9dd869ce76e44fc8a82b8bbee1654d33 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.224s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:15:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:33.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:15:33 np0005593234 nova_compute[227762]: 2026-01-23 10:15:33.553 227766 DEBUG nova.compute.manager [req-3c87a4a8-e0c7-4ecc-a95f-2cff71858805 req-e4644613-f5c3-4543-869a-d8cadc0df97b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Received event network-changed-2957b316-2d74-4b52-bfc9-52a2c5b56c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:33 np0005593234 nova_compute[227762]: 2026-01-23 10:15:33.553 227766 DEBUG nova.compute.manager [req-3c87a4a8-e0c7-4ecc-a95f-2cff71858805 req-e4644613-f5c3-4543-869a-d8cadc0df97b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Refreshing instance network info cache due to event network-changed-2957b316-2d74-4b52-bfc9-52a2c5b56c01. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:15:33 np0005593234 nova_compute[227762]: 2026-01-23 10:15:33.553 227766 DEBUG oslo_concurrency.lockutils [req-3c87a4a8-e0c7-4ecc-a95f-2cff71858805 req-e4644613-f5c3-4543-869a-d8cadc0df97b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:15:33 np0005593234 nova_compute[227762]: 2026-01-23 10:15:33.745 227766 DEBUG nova.compute.manager [req-5cf26439-f3c4-4e86-b103-0fac42abcbc6 req-5176e168-c15a-44dd-810f-afebf2d2622f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Received event network-vif-plugged-8be6de92-c581-49d7-a315-1d1b8c33153a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:33 np0005593234 nova_compute[227762]: 2026-01-23 10:15:33.746 227766 DEBUG oslo_concurrency.lockutils [req-5cf26439-f3c4-4e86-b103-0fac42abcbc6 req-5176e168-c15a-44dd-810f-afebf2d2622f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:33 np0005593234 nova_compute[227762]: 2026-01-23 10:15:33.746 227766 DEBUG oslo_concurrency.lockutils [req-5cf26439-f3c4-4e86-b103-0fac42abcbc6 req-5176e168-c15a-44dd-810f-afebf2d2622f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:33 np0005593234 nova_compute[227762]: 2026-01-23 10:15:33.746 227766 DEBUG oslo_concurrency.lockutils [req-5cf26439-f3c4-4e86-b103-0fac42abcbc6 req-5176e168-c15a-44dd-810f-afebf2d2622f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:33 np0005593234 nova_compute[227762]: 2026-01-23 10:15:33.746 227766 DEBUG nova.compute.manager [req-5cf26439-f3c4-4e86-b103-0fac42abcbc6 req-5176e168-c15a-44dd-810f-afebf2d2622f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] No waiting events found dispatching network-vif-plugged-8be6de92-c581-49d7-a315-1d1b8c33153a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:15:33 np0005593234 nova_compute[227762]: 2026-01-23 10:15:33.747 227766 WARNING nova.compute.manager [req-5cf26439-f3c4-4e86-b103-0fac42abcbc6 req-5176e168-c15a-44dd-810f-afebf2d2622f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Received unexpected event network-vif-plugged-8be6de92-c581-49d7-a315-1d1b8c33153a for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:15:33 np0005593234 nova_compute[227762]: 2026-01-23 10:15:33.908 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:34.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:35.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:15:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:36.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:15:36 np0005593234 nova_compute[227762]: 2026-01-23 10:15:36.591 227766 DEBUG nova.network.neutron [req-b010a8a5-a95e-499f-b08e-1c94422c93c3 req-aa8620f3-a5b7-404b-b74b-bbb847f0f6f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updated VIF entry in instance network info cache for port 2957b316-2d74-4b52-bfc9-52a2c5b56c01. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:15:36 np0005593234 nova_compute[227762]: 2026-01-23 10:15:36.592 227766 DEBUG nova.network.neutron [req-b010a8a5-a95e-499f-b08e-1c94422c93c3 req-aa8620f3-a5b7-404b-b74b-bbb847f0f6f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updating instance_info_cache with network_info: [{"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:15:36 np0005593234 nova_compute[227762]: 2026-01-23 10:15:36.631 227766 DEBUG oslo_concurrency.lockutils [req-b010a8a5-a95e-499f-b08e-1c94422c93c3 req-aa8620f3-a5b7-404b-b74b-bbb847f0f6f8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:15:36 np0005593234 nova_compute[227762]: 2026-01-23 10:15:36.633 227766 DEBUG oslo_concurrency.lockutils [req-3c87a4a8-e0c7-4ecc-a95f-2cff71858805 req-e4644613-f5c3-4543-869a-d8cadc0df97b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:15:36 np0005593234 nova_compute[227762]: 2026-01-23 10:15:36.633 227766 DEBUG nova.network.neutron [req-3c87a4a8-e0c7-4ecc-a95f-2cff71858805 req-e4644613-f5c3-4543-869a-d8cadc0df97b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Refreshing network info cache for port 2957b316-2d74-4b52-bfc9-52a2c5b56c01 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:15:37 np0005593234 nova_compute[227762]: 2026-01-23 10:15:37.331 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:37.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:38.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:38 np0005593234 nova_compute[227762]: 2026-01-23 10:15:38.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:38 np0005593234 nova_compute[227762]: 2026-01-23 10:15:38.774 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:38 np0005593234 nova_compute[227762]: 2026-01-23 10:15:38.775 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:38 np0005593234 nova_compute[227762]: 2026-01-23 10:15:38.775 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:38 np0005593234 nova_compute[227762]: 2026-01-23 10:15:38.776 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:15:38 np0005593234 nova_compute[227762]: 2026-01-23 10:15:38.776 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:38 np0005593234 nova_compute[227762]: 2026-01-23 10:15:38.911 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:15:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/828020188' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.238 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.383 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.384 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.384 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.384 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.388 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000087 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.388 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000087 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.389 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000087 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:15:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:39.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.591 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.592 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3966MB free_disk=20.788330078125GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.592 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.593 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.724 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance c38b8bfe-1b70-4daf-b676-250c1e933ed4 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.725 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 633b85ea-a47c-4be0-b06d-388aa421728b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.725 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:15:39 np0005593234 nova_compute[227762]: 2026-01-23 10:15:39.726 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:15:39 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #54. Immutable memtables: 10.
Jan 23 05:15:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:40.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:40 np0005593234 nova_compute[227762]: 2026-01-23 10:15:40.184 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:15:40 np0005593234 nova_compute[227762]: 2026-01-23 10:15:40.532 227766 DEBUG nova.network.neutron [req-3c87a4a8-e0c7-4ecc-a95f-2cff71858805 req-e4644613-f5c3-4543-869a-d8cadc0df97b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updated VIF entry in instance network info cache for port 2957b316-2d74-4b52-bfc9-52a2c5b56c01. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:15:40 np0005593234 nova_compute[227762]: 2026-01-23 10:15:40.533 227766 DEBUG nova.network.neutron [req-3c87a4a8-e0c7-4ecc-a95f-2cff71858805 req-e4644613-f5c3-4543-869a-d8cadc0df97b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updating instance_info_cache with network_info: [{"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:15:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:15:40 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1306845608' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:15:40 np0005593234 nova_compute[227762]: 2026-01-23 10:15:40.657 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:15:40 np0005593234 nova_compute[227762]: 2026-01-23 10:15:40.663 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:15:40 np0005593234 nova_compute[227762]: 2026-01-23 10:15:40.788 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:15:40 np0005593234 nova_compute[227762]: 2026-01-23 10:15:40.802 227766 DEBUG oslo_concurrency.lockutils [req-3c87a4a8-e0c7-4ecc-a95f-2cff71858805 req-e4644613-f5c3-4543-869a-d8cadc0df97b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:15:40 np0005593234 nova_compute[227762]: 2026-01-23 10:15:40.867 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:15:40 np0005593234 nova_compute[227762]: 2026-01-23 10:15:40.867 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.275s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:41 np0005593234 nova_compute[227762]: 2026-01-23 10:15:41.106 227766 DEBUG nova.compute.manager [req-0f980335-138f-4a62-a702-f132b87539af req-78684391-0b4c-4e51-a3d1-07709271b586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Received event network-changed-2957b316-2d74-4b52-bfc9-52a2c5b56c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:41 np0005593234 nova_compute[227762]: 2026-01-23 10:15:41.106 227766 DEBUG nova.compute.manager [req-0f980335-138f-4a62-a702-f132b87539af req-78684391-0b4c-4e51-a3d1-07709271b586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Refreshing instance network info cache due to event network-changed-2957b316-2d74-4b52-bfc9-52a2c5b56c01. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:15:41 np0005593234 nova_compute[227762]: 2026-01-23 10:15:41.107 227766 DEBUG oslo_concurrency.lockutils [req-0f980335-138f-4a62-a702-f132b87539af req-78684391-0b4c-4e51-a3d1-07709271b586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:15:41 np0005593234 nova_compute[227762]: 2026-01-23 10:15:41.108 227766 DEBUG oslo_concurrency.lockutils [req-0f980335-138f-4a62-a702-f132b87539af req-78684391-0b4c-4e51-a3d1-07709271b586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:15:41 np0005593234 nova_compute[227762]: 2026-01-23 10:15:41.108 227766 DEBUG nova.network.neutron [req-0f980335-138f-4a62-a702-f132b87539af req-78684391-0b4c-4e51-a3d1-07709271b586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Refreshing network info cache for port 2957b316-2d74-4b52-bfc9-52a2c5b56c01 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:15:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:41.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:41 np0005593234 nova_compute[227762]: 2026-01-23 10:15:41.868 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:42.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:42 np0005593234 nova_compute[227762]: 2026-01-23 10:15:42.334 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:42.852 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:42.852 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:15:42.853 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:43.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:43 np0005593234 nova_compute[227762]: 2026-01-23 10:15:43.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:43 np0005593234 nova_compute[227762]: 2026-01-23 10:15:43.801 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163328.8001475, 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:15:43 np0005593234 nova_compute[227762]: 2026-01-23 10:15:43.802 227766 INFO nova.compute.manager [-] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:15:43 np0005593234 nova_compute[227762]: 2026-01-23 10:15:43.891 227766 DEBUG nova.compute.manager [None req-639f6086-0313-49a1-b583-4a26dadb67fb - - - - - -] [instance: 11b0b7c7-5997-43f8-a4c4-ef7a927ce7ae] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:15:43 np0005593234 nova_compute[227762]: 2026-01-23 10:15:43.915 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:44.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:44 np0005593234 nova_compute[227762]: 2026-01-23 10:15:44.623 227766 DEBUG nova.network.neutron [req-0f980335-138f-4a62-a702-f132b87539af req-78684391-0b4c-4e51-a3d1-07709271b586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updated VIF entry in instance network info cache for port 2957b316-2d74-4b52-bfc9-52a2c5b56c01. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:15:44 np0005593234 nova_compute[227762]: 2026-01-23 10:15:44.623 227766 DEBUG nova.network.neutron [req-0f980335-138f-4a62-a702-f132b87539af req-78684391-0b4c-4e51-a3d1-07709271b586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updating instance_info_cache with network_info: [{"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:15:44 np0005593234 nova_compute[227762]: 2026-01-23 10:15:44.647 227766 DEBUG oslo_concurrency.lockutils [req-0f980335-138f-4a62-a702-f132b87539af req-78684391-0b4c-4e51-a3d1-07709271b586 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:15:44 np0005593234 nova_compute[227762]: 2026-01-23 10:15:44.762 227766 DEBUG oslo_concurrency.lockutils [None req-4b25ade7-0d89-44b0-87a0-6af6b01a2933 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "633b85ea-a47c-4be0-b06d-388aa421728b" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:44 np0005593234 nova_compute[227762]: 2026-01-23 10:15:44.763 227766 DEBUG oslo_concurrency.lockutils [None req-4b25ade7-0d89-44b0-87a0-6af6b01a2933 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:44 np0005593234 nova_compute[227762]: 2026-01-23 10:15:44.779 227766 INFO nova.compute.manager [None req-4b25ade7-0d89-44b0-87a0-6af6b01a2933 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Detaching volume 06664261-bdf3-44c1-9f40-a29c0039b070#033[00m
Jan 23 05:15:45 np0005593234 ovn_controller[134547]: 2026-01-23T10:15:45Z|00581|binding|INFO|Releasing lport 1788b5e6-601b-4e3d-a584-c0138c3308f6 from this chassis (sb_readonly=0)
Jan 23 05:15:45 np0005593234 ovn_controller[134547]: 2026-01-23T10:15:45Z|00582|binding|INFO|Releasing lport 2c16e447-27d9-4516-bf23-ec948f375c10 from this chassis (sb_readonly=0)
Jan 23 05:15:45 np0005593234 nova_compute[227762]: 2026-01-23 10:15:45.084 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:45 np0005593234 nova_compute[227762]: 2026-01-23 10:15:45.086 227766 INFO nova.virt.block_device [None req-4b25ade7-0d89-44b0-87a0-6af6b01a2933 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Attempting to driver detach volume 06664261-bdf3-44c1-9f40-a29c0039b070 from mountpoint /dev/vdb#033[00m
Jan 23 05:15:45 np0005593234 nova_compute[227762]: 2026-01-23 10:15:45.095 227766 DEBUG nova.virt.libvirt.driver [None req-4b25ade7-0d89-44b0-87a0-6af6b01a2933 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Attempting to detach device vdb from instance 633b85ea-a47c-4be0-b06d-388aa421728b from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:15:45 np0005593234 nova_compute[227762]: 2026-01-23 10:15:45.096 227766 DEBUG nova.virt.libvirt.guest [None req-4b25ade7-0d89-44b0-87a0-6af6b01a2933 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:15:45 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:15:45 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-06664261-bdf3-44c1-9f40-a29c0039b070">
Jan 23 05:15:45 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:15:45 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:15:45 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:15:45 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:15:45 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:15:45 np0005593234 nova_compute[227762]:  <serial>06664261-bdf3-44c1-9f40-a29c0039b070</serial>
Jan 23 05:15:45 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:15:45 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:15:45 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:15:45 np0005593234 nova_compute[227762]: 2026-01-23 10:15:45.128 227766 INFO nova.virt.libvirt.driver [None req-4b25ade7-0d89-44b0-87a0-6af6b01a2933 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Successfully detached device vdb from instance 633b85ea-a47c-4be0-b06d-388aa421728b from the persistent domain config.#033[00m
Jan 23 05:15:45 np0005593234 nova_compute[227762]: 2026-01-23 10:15:45.128 227766 DEBUG nova.virt.libvirt.driver [None req-4b25ade7-0d89-44b0-87a0-6af6b01a2933 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 633b85ea-a47c-4be0-b06d-388aa421728b from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:15:45 np0005593234 nova_compute[227762]: 2026-01-23 10:15:45.129 227766 DEBUG nova.virt.libvirt.guest [None req-4b25ade7-0d89-44b0-87a0-6af6b01a2933 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:15:45 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:15:45 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-06664261-bdf3-44c1-9f40-a29c0039b070">
Jan 23 05:15:45 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:15:45 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:15:45 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:15:45 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:15:45 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:15:45 np0005593234 nova_compute[227762]:  <serial>06664261-bdf3-44c1-9f40-a29c0039b070</serial>
Jan 23 05:15:45 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:15:45 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:15:45 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:15:45 np0005593234 nova_compute[227762]: 2026-01-23 10:15:45.253 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769163345.2530851, 633b85ea-a47c-4be0-b06d-388aa421728b => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:15:45 np0005593234 nova_compute[227762]: 2026-01-23 10:15:45.255 227766 DEBUG nova.virt.libvirt.driver [None req-4b25ade7-0d89-44b0-87a0-6af6b01a2933 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 633b85ea-a47c-4be0-b06d-388aa421728b _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:15:45 np0005593234 nova_compute[227762]: 2026-01-23 10:15:45.257 227766 INFO nova.virt.libvirt.driver [None req-4b25ade7-0d89-44b0-87a0-6af6b01a2933 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Successfully detached device vdb from instance 633b85ea-a47c-4be0-b06d-388aa421728b from the live domain config.#033[00m
Jan 23 05:15:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:45.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:46.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:46 np0005593234 nova_compute[227762]: 2026-01-23 10:15:46.219 227766 DEBUG nova.objects.instance [None req-4b25ade7-0d89-44b0-87a0-6af6b01a2933 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'flavor' on Instance uuid 633b85ea-a47c-4be0-b06d-388aa421728b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:46 np0005593234 nova_compute[227762]: 2026-01-23 10:15:46.278 227766 DEBUG oslo_concurrency.lockutils [None req-4b25ade7-0d89-44b0-87a0-6af6b01a2933 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.515s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:46 np0005593234 nova_compute[227762]: 2026-01-23 10:15:46.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:46 np0005593234 podman[293099]: 2026-01-23 10:15:46.752397475 +0000 UTC m=+0.049396356 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #121. Immutable memtables: 0.
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.011453) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 75] Flushing memtable with next log file: 121
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163347011553, "job": 75, "event": "flush_started", "num_memtables": 1, "num_entries": 2235, "num_deletes": 256, "total_data_size": 5091492, "memory_usage": 5160184, "flush_reason": "Manual Compaction"}
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 75] Level-0 flush table #122: started
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163347174121, "cf_name": "default", "job": 75, "event": "table_file_creation", "file_number": 122, "file_size": 2083545, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 59926, "largest_seqno": 62156, "table_properties": {"data_size": 2076516, "index_size": 3719, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2245, "raw_key_size": 18820, "raw_average_key_size": 21, "raw_value_size": 2060815, "raw_average_value_size": 2357, "num_data_blocks": 163, "num_entries": 874, "num_filter_entries": 874, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163170, "oldest_key_time": 1769163170, "file_creation_time": 1769163347, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 122, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 75] Flush lasted 162710 microseconds, and 6435 cpu microseconds.
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.174190) [db/flush_job.cc:967] [default] [JOB 75] Level-0 flush table #122: 2083545 bytes OK
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.174209) [db/memtable_list.cc:519] [default] Level-0 commit table #122 started
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.178839) [db/memtable_list.cc:722] [default] Level-0 commit table #122: memtable #1 done
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.178866) EVENT_LOG_v1 {"time_micros": 1769163347178858, "job": 75, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.178884) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 75] Try to delete WAL files size 5081411, prev total WAL file size 5081411, number of live WAL files 2.
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000118.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.180252) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032303036' seq:72057594037927935, type:22 .. '6D6772737461740032323539' seq:0, type:0; will stop at (end)
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 76] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 75 Base level 0, inputs: [122(2034KB)], [120(11MB)]
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163347180394, "job": 76, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [122], "files_L6": [120], "score": -1, "input_data_size": 14211686, "oldest_snapshot_seqno": -1}
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 76] Generated table #123: 8590 keys, 11574756 bytes, temperature: kUnknown
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163347267514, "cf_name": "default", "job": 76, "event": "table_file_creation", "file_number": 123, "file_size": 11574756, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11519705, "index_size": 32475, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21509, "raw_key_size": 221666, "raw_average_key_size": 25, "raw_value_size": 11369297, "raw_average_value_size": 1323, "num_data_blocks": 1276, "num_entries": 8590, "num_filter_entries": 8590, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769163347, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 123, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.267794) [db/compaction/compaction_job.cc:1663] [default] [JOB 76] Compacted 1@0 + 1@6 files to L6 => 11574756 bytes
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.270161) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 162.9 rd, 132.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 11.6 +0.0 blob) out(11.0 +0.0 blob), read-write-amplify(12.4) write-amplify(5.6) OK, records in: 9042, records dropped: 452 output_compression: NoCompression
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.270202) EVENT_LOG_v1 {"time_micros": 1769163347270186, "job": 76, "event": "compaction_finished", "compaction_time_micros": 87216, "compaction_time_cpu_micros": 34758, "output_level": 6, "num_output_files": 1, "total_output_size": 11574756, "num_input_records": 9042, "num_output_records": 8590, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000122.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163347270714, "job": 76, "event": "table_file_deletion", "file_number": 122}
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000120.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163347272668, "job": 76, "event": "table_file_deletion", "file_number": 120}
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.180097) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.272768) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.272774) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.272776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.272777) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:15:47.272779) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:15:47 np0005593234 nova_compute[227762]: 2026-01-23 10:15:47.337 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:47.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:47 np0005593234 nova_compute[227762]: 2026-01-23 10:15:47.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:48.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:48 np0005593234 nova_compute[227762]: 2026-01-23 10:15:48.918 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:49.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:49 np0005593234 nova_compute[227762]: 2026-01-23 10:15:49.748 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:49 np0005593234 nova_compute[227762]: 2026-01-23 10:15:49.748 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:49 np0005593234 nova_compute[227762]: 2026-01-23 10:15:49.748 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:15:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:50.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:51.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:52.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.338 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.472 227766 DEBUG oslo_concurrency.lockutils [None req-0fad180c-1d1c-49cc-951d-ac638dee1bb4 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "633b85ea-a47c-4be0-b06d-388aa421728b" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.473 227766 DEBUG oslo_concurrency.lockutils [None req-0fad180c-1d1c-49cc-951d-ac638dee1bb4 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.491 227766 INFO nova.compute.manager [None req-0fad180c-1d1c-49cc-951d-ac638dee1bb4 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Detaching volume eb20e12a-5d9c-4034-8e50-a0b613fd5f3d#033[00m
Jan 23 05:15:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:15:52 np0005593234 podman[293171]: 2026-01-23 10:15:52.779349302 +0000 UTC m=+0.078103699 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.869 227766 INFO nova.virt.block_device [None req-0fad180c-1d1c-49cc-951d-ac638dee1bb4 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Attempting to driver detach volume eb20e12a-5d9c-4034-8e50-a0b613fd5f3d from mountpoint /dev/vdc#033[00m
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.879 227766 DEBUG nova.virt.libvirt.driver [None req-0fad180c-1d1c-49cc-951d-ac638dee1bb4 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Attempting to detach device vdc from instance 633b85ea-a47c-4be0-b06d-388aa421728b from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.880 227766 DEBUG nova.virt.libvirt.guest [None req-0fad180c-1d1c-49cc-951d-ac638dee1bb4 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:15:52 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:15:52 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-eb20e12a-5d9c-4034-8e50-a0b613fd5f3d">
Jan 23 05:15:52 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:15:52 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:15:52 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:15:52 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:15:52 np0005593234 nova_compute[227762]:  <target dev="vdc" bus="virtio"/>
Jan 23 05:15:52 np0005593234 nova_compute[227762]:  <serial>eb20e12a-5d9c-4034-8e50-a0b613fd5f3d</serial>
Jan 23 05:15:52 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 23 05:15:52 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:15:52 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.887 227766 INFO nova.virt.libvirt.driver [None req-0fad180c-1d1c-49cc-951d-ac638dee1bb4 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Successfully detached device vdc from instance 633b85ea-a47c-4be0-b06d-388aa421728b from the persistent domain config.#033[00m
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.888 227766 DEBUG nova.virt.libvirt.driver [None req-0fad180c-1d1c-49cc-951d-ac638dee1bb4 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance 633b85ea-a47c-4be0-b06d-388aa421728b from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.888 227766 DEBUG nova.virt.libvirt.guest [None req-0fad180c-1d1c-49cc-951d-ac638dee1bb4 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:15:52 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:15:52 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-eb20e12a-5d9c-4034-8e50-a0b613fd5f3d">
Jan 23 05:15:52 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:15:52 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:15:52 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:15:52 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:15:52 np0005593234 nova_compute[227762]:  <target dev="vdc" bus="virtio"/>
Jan 23 05:15:52 np0005593234 nova_compute[227762]:  <serial>eb20e12a-5d9c-4034-8e50-a0b613fd5f3d</serial>
Jan 23 05:15:52 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 23 05:15:52 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:15:52 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.948 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769163352.9482827, 633b85ea-a47c-4be0-b06d-388aa421728b => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.950 227766 DEBUG nova.virt.libvirt.driver [None req-0fad180c-1d1c-49cc-951d-ac638dee1bb4 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance 633b85ea-a47c-4be0-b06d-388aa421728b _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:15:52 np0005593234 nova_compute[227762]: 2026-01-23 10:15:52.953 227766 INFO nova.virt.libvirt.driver [None req-0fad180c-1d1c-49cc-951d-ac638dee1bb4 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Successfully detached device vdc from instance 633b85ea-a47c-4be0-b06d-388aa421728b from the live domain config.#033[00m
Jan 23 05:15:53 np0005593234 nova_compute[227762]: 2026-01-23 10:15:53.225 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:15:53 np0005593234 nova_compute[227762]: 2026-01-23 10:15:53.225 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:15:53 np0005593234 nova_compute[227762]: 2026-01-23 10:15:53.226 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:15:53 np0005593234 nova_compute[227762]: 2026-01-23 10:15:53.226 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c38b8bfe-1b70-4daf-b676-250c1e933ed4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:53.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:53 np0005593234 nova_compute[227762]: 2026-01-23 10:15:53.588 227766 DEBUG nova.objects.instance [None req-0fad180c-1d1c-49cc-951d-ac638dee1bb4 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'flavor' on Instance uuid 633b85ea-a47c-4be0-b06d-388aa421728b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:15:53 np0005593234 nova_compute[227762]: 2026-01-23 10:15:53.660 227766 DEBUG oslo_concurrency.lockutils [None req-0fad180c-1d1c-49cc-951d-ac638dee1bb4 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:15:53 np0005593234 nova_compute[227762]: 2026-01-23 10:15:53.922 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:54.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:55.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:55 np0005593234 nova_compute[227762]: 2026-01-23 10:15:55.530 227766 DEBUG nova.compute.manager [req-d5f6b038-b602-4111-8ff6-45be5e06f224 req-fcb1d082-507d-4d43-acc5-a13ba6e2b89c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Received event network-changed-2957b316-2d74-4b52-bfc9-52a2c5b56c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:15:55 np0005593234 nova_compute[227762]: 2026-01-23 10:15:55.530 227766 DEBUG nova.compute.manager [req-d5f6b038-b602-4111-8ff6-45be5e06f224 req-fcb1d082-507d-4d43-acc5-a13ba6e2b89c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Refreshing instance network info cache due to event network-changed-2957b316-2d74-4b52-bfc9-52a2c5b56c01. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:15:55 np0005593234 nova_compute[227762]: 2026-01-23 10:15:55.531 227766 DEBUG oslo_concurrency.lockutils [req-d5f6b038-b602-4111-8ff6-45be5e06f224 req-fcb1d082-507d-4d43-acc5-a13ba6e2b89c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:15:55 np0005593234 nova_compute[227762]: 2026-01-23 10:15:55.531 227766 DEBUG oslo_concurrency.lockutils [req-d5f6b038-b602-4111-8ff6-45be5e06f224 req-fcb1d082-507d-4d43-acc5-a13ba6e2b89c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:15:55 np0005593234 nova_compute[227762]: 2026-01-23 10:15:55.531 227766 DEBUG nova.network.neutron [req-d5f6b038-b602-4111-8ff6-45be5e06f224 req-fcb1d082-507d-4d43-acc5-a13ba6e2b89c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Refreshing network info cache for port 2957b316-2d74-4b52-bfc9-52a2c5b56c01 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:15:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:56.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:56 np0005593234 nova_compute[227762]: 2026-01-23 10:15:56.760 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Updating instance_info_cache with network_info: [{"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:15:56 np0005593234 nova_compute[227762]: 2026-01-23 10:15:56.852 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-c38b8bfe-1b70-4daf-b676-250c1e933ed4" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:15:56 np0005593234 nova_compute[227762]: 2026-01-23 10:15:56.852 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:15:56 np0005593234 nova_compute[227762]: 2026-01-23 10:15:56.853 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:15:57 np0005593234 nova_compute[227762]: 2026-01-23 10:15:57.342 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:57.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:15:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:58.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:15:58 np0005593234 nova_compute[227762]: 2026-01-23 10:15:58.926 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:15:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:15:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:15:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:59.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:16:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:00.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:16:00 np0005593234 nova_compute[227762]: 2026-01-23 10:16:00.847 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:01.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:02.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:02 np0005593234 nova_compute[227762]: 2026-01-23 10:16:02.267 227766 DEBUG nova.network.neutron [req-d5f6b038-b602-4111-8ff6-45be5e06f224 req-fcb1d082-507d-4d43-acc5-a13ba6e2b89c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updated VIF entry in instance network info cache for port 2957b316-2d74-4b52-bfc9-52a2c5b56c01. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:16:02 np0005593234 nova_compute[227762]: 2026-01-23 10:16:02.267 227766 DEBUG nova.network.neutron [req-d5f6b038-b602-4111-8ff6-45be5e06f224 req-fcb1d082-507d-4d43-acc5-a13ba6e2b89c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updating instance_info_cache with network_info: [{"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:16:02 np0005593234 nova_compute[227762]: 2026-01-23 10:16:02.312 227766 DEBUG oslo_concurrency.lockutils [req-d5f6b038-b602-4111-8ff6-45be5e06f224 req-fcb1d082-507d-4d43-acc5-a13ba6e2b89c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:16:02 np0005593234 nova_compute[227762]: 2026-01-23 10:16:02.344 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:03.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:03 np0005593234 nova_compute[227762]: 2026-01-23 10:16:03.929 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:04.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:05 np0005593234 nova_compute[227762]: 2026-01-23 10:16:05.407 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:05.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:06.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #124. Immutable memtables: 0.
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.923644) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 77] Flushing memtable with next log file: 124
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163366923764, "job": 77, "event": "flush_started", "num_memtables": 1, "num_entries": 445, "num_deletes": 251, "total_data_size": 515422, "memory_usage": 525224, "flush_reason": "Manual Compaction"}
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 77] Level-0 flush table #125: started
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163366928075, "cf_name": "default", "job": 77, "event": "table_file_creation", "file_number": 125, "file_size": 339893, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 62161, "largest_seqno": 62601, "table_properties": {"data_size": 337452, "index_size": 541, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6041, "raw_average_key_size": 18, "raw_value_size": 332596, "raw_average_value_size": 1032, "num_data_blocks": 24, "num_entries": 322, "num_filter_entries": 322, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163348, "oldest_key_time": 1769163348, "file_creation_time": 1769163366, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 125, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 77] Flush lasted 4445 microseconds, and 1829 cpu microseconds.
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.928120) [db/flush_job.cc:967] [default] [JOB 77] Level-0 flush table #125: 339893 bytes OK
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.928137) [db/memtable_list.cc:519] [default] Level-0 commit table #125 started
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.929651) [db/memtable_list.cc:722] [default] Level-0 commit table #125: memtable #1 done
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.929663) EVENT_LOG_v1 {"time_micros": 1769163366929659, "job": 77, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.929678) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 77] Try to delete WAL files size 512644, prev total WAL file size 512644, number of live WAL files 2.
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000121.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.930082) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035323731' seq:72057594037927935, type:22 .. '7061786F730035353233' seq:0, type:0; will stop at (end)
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 78] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 77 Base level 0, inputs: [125(331KB)], [123(11MB)]
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163366930191, "job": 78, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [125], "files_L6": [123], "score": -1, "input_data_size": 11914649, "oldest_snapshot_seqno": -1}
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 78] Generated table #126: 8402 keys, 10025560 bytes, temperature: kUnknown
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163366986019, "cf_name": "default", "job": 78, "event": "table_file_creation", "file_number": 126, "file_size": 10025560, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9973139, "index_size": 30293, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21061, "raw_key_size": 218453, "raw_average_key_size": 26, "raw_value_size": 9827337, "raw_average_value_size": 1169, "num_data_blocks": 1177, "num_entries": 8402, "num_filter_entries": 8402, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769163366, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 126, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.986243) [db/compaction/compaction_job.cc:1663] [default] [JOB 78] Compacted 1@0 + 1@6 files to L6 => 10025560 bytes
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.987497) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 213.1 rd, 179.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 11.0 +0.0 blob) out(9.6 +0.0 blob), read-write-amplify(64.6) write-amplify(29.5) OK, records in: 8912, records dropped: 510 output_compression: NoCompression
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.987517) EVENT_LOG_v1 {"time_micros": 1769163366987508, "job": 78, "event": "compaction_finished", "compaction_time_micros": 55898, "compaction_time_cpu_micros": 23953, "output_level": 6, "num_output_files": 1, "total_output_size": 10025560, "num_input_records": 8912, "num_output_records": 8402, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000125.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163366987718, "job": 78, "event": "table_file_deletion", "file_number": 125}
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000123.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163366989678, "job": 78, "event": "table_file_deletion", "file_number": 123}
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.929988) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.989767) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.989772) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.989774) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.989776) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:06 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:06.989777) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:07 np0005593234 nova_compute[227762]: 2026-01-23 10:16:07.346 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:07.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:08.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:08 np0005593234 nova_compute[227762]: 2026-01-23 10:16:08.932 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:16:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:16:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:16:08 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:16:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:09.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:10.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:11.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #127. Immutable memtables: 0.
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.008051) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 79] Flushing memtable with next log file: 127
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163372008137, "job": 79, "event": "flush_started", "num_memtables": 1, "num_entries": 312, "num_deletes": 250, "total_data_size": 130265, "memory_usage": 137608, "flush_reason": "Manual Compaction"}
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 79] Level-0 flush table #128: started
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163372010463, "cf_name": "default", "job": 79, "event": "table_file_creation", "file_number": 128, "file_size": 86250, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 62602, "largest_seqno": 62913, "table_properties": {"data_size": 84237, "index_size": 176, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 4296, "raw_average_key_size": 15, "raw_value_size": 80279, "raw_average_value_size": 283, "num_data_blocks": 8, "num_entries": 283, "num_filter_entries": 283, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163367, "oldest_key_time": 1769163367, "file_creation_time": 1769163372, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 128, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 79] Flush lasted 2423 microseconds, and 1054 cpu microseconds.
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.010489) [db/flush_job.cc:967] [default] [JOB 79] Level-0 flush table #128: 86250 bytes OK
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.010501) [db/memtable_list.cc:519] [default] Level-0 commit table #128 started
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.011554) [db/memtable_list.cc:722] [default] Level-0 commit table #128: memtable #1 done
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.011599) EVENT_LOG_v1 {"time_micros": 1769163372011594, "job": 79, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.011616) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 79] Try to delete WAL files size 128011, prev total WAL file size 128011, number of live WAL files 2.
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000124.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.011979) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323530' seq:72057594037927935, type:22 .. '6B7600353031' seq:0, type:0; will stop at (end)
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 80] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 79 Base level 0, inputs: [128(84KB)], [126(9790KB)]
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163372012060, "job": 80, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [128], "files_L6": [126], "score": -1, "input_data_size": 10111810, "oldest_snapshot_seqno": -1}
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 80] Generated table #129: 8175 keys, 9041903 bytes, temperature: kUnknown
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163372069797, "cf_name": "default", "job": 80, "event": "table_file_creation", "file_number": 129, "file_size": 9041903, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8991845, "index_size": 28505, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20485, "raw_key_size": 215458, "raw_average_key_size": 26, "raw_value_size": 8850657, "raw_average_value_size": 1082, "num_data_blocks": 1084, "num_entries": 8175, "num_filter_entries": 8175, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769163372, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 129, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.070023) [db/compaction/compaction_job.cc:1663] [default] [JOB 80] Compacted 1@0 + 1@6 files to L6 => 9041903 bytes
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.071535) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.9 rd, 156.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.1, 9.6 +0.0 blob) out(8.6 +0.0 blob), read-write-amplify(222.1) write-amplify(104.8) OK, records in: 8685, records dropped: 510 output_compression: NoCompression
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.071552) EVENT_LOG_v1 {"time_micros": 1769163372071545, "job": 80, "event": "compaction_finished", "compaction_time_micros": 57804, "compaction_time_cpu_micros": 23359, "output_level": 6, "num_output_files": 1, "total_output_size": 9041903, "num_input_records": 8685, "num_output_records": 8175, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000128.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163372071729, "job": 80, "event": "table_file_deletion", "file_number": 128}
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000126.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163372073912, "job": 80, "event": "table_file_deletion", "file_number": 126}
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.011847) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.074034) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.074040) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.074041) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.074042) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:16:12.074044) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:16:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:12.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:12 np0005593234 nova_compute[227762]: 2026-01-23 10:16:12.385 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:16:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:13.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:16:13 np0005593234 nova_compute[227762]: 2026-01-23 10:16:13.601 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:13 np0005593234 nova_compute[227762]: 2026-01-23 10:16:13.933 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:14.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:15.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:16:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:16:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:16.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:17 np0005593234 nova_compute[227762]: 2026-01-23 10:16:17.387 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:17 np0005593234 podman[293445]: 2026-01-23 10:16:17.422597415 +0000 UTC m=+0.062058020 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:16:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:16:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:17.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:16:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:16:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:18.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:16:18 np0005593234 nova_compute[227762]: 2026-01-23 10:16:18.935 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:19.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:20.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:21.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:22.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:22 np0005593234 nova_compute[227762]: 2026-01-23 10:16:22.390 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.272 227766 DEBUG oslo_concurrency.lockutils [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.272 227766 DEBUG oslo_concurrency.lockutils [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.272 227766 DEBUG oslo_concurrency.lockutils [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.273 227766 DEBUG oslo_concurrency.lockutils [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.273 227766 DEBUG oslo_concurrency.lockutils [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.274 227766 INFO nova.compute.manager [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Terminating instance#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.275 227766 DEBUG nova.compute.manager [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:16:23 np0005593234 kernel: tap483c7ca9-a9 (unregistering): left promiscuous mode
Jan 23 05:16:23 np0005593234 NetworkManager[48942]: <info>  [1769163383.3445] device (tap483c7ca9-a9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.352 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:23 np0005593234 ovn_controller[134547]: 2026-01-23T10:16:23Z|00583|binding|INFO|Releasing lport 483c7ca9-a908-4082-bbad-1ea123d6a3f1 from this chassis (sb_readonly=0)
Jan 23 05:16:23 np0005593234 ovn_controller[134547]: 2026-01-23T10:16:23Z|00584|binding|INFO|Setting lport 483c7ca9-a908-4082-bbad-1ea123d6a3f1 down in Southbound
Jan 23 05:16:23 np0005593234 ovn_controller[134547]: 2026-01-23T10:16:23Z|00585|binding|INFO|Removing iface tap483c7ca9-a9 ovn-installed in OVS
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.355 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.366 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:23 np0005593234 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000087.scope: Deactivated successfully.
Jan 23 05:16:23 np0005593234 systemd[1]: machine-qemu\x2d67\x2dinstance\x2d00000087.scope: Consumed 21.567s CPU time.
Jan 23 05:16:23 np0005593234 systemd-machined[195626]: Machine qemu-67-instance-00000087 terminated.
Jan 23 05:16:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:23.397 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:53:81:35 10.100.0.12'], port_security=['fa:16:3e:53:81:35 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'c38b8bfe-1b70-4daf-b676-250c1e933ed4', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '0a6ba16c4b9d49d3bc24cd7b44935d1f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '6fc0d424-7779-4175-b5e0-e2613de6ecef', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5fb685af-2efd-4d70-8868-8a86ed4c3ca6, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=483c7ca9-a908-4082-bbad-1ea123d6a3f1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:16:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:23.399 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 483c7ca9-a908-4082-bbad-1ea123d6a3f1 in datapath 00bd3319-bfe5-4acd-b2e4-17830ee847f9 unbound from our chassis#033[00m
Jan 23 05:16:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:23.401 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 00bd3319-bfe5-4acd-b2e4-17830ee847f9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:16:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:23.403 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b9adff0f-bbae-4381-aa29-79c7bdd1b01e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:23.404 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 namespace which is not needed anymore#033[00m
Jan 23 05:16:23 np0005593234 podman[293468]: 2026-01-23 10:16:23.462334977 +0000 UTC m=+0.087526212 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:16:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:23.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.516 227766 INFO nova.virt.libvirt.driver [-] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Instance destroyed successfully.#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.517 227766 DEBUG nova.objects.instance [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lazy-loading 'resources' on Instance uuid c38b8bfe-1b70-4daf-b676-250c1e933ed4 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:16:23 np0005593234 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[291184]: [NOTICE]   (291196) : haproxy version is 2.8.14-c23fe91
Jan 23 05:16:23 np0005593234 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[291184]: [NOTICE]   (291196) : path to executable is /usr/sbin/haproxy
Jan 23 05:16:23 np0005593234 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[291184]: [WARNING]  (291196) : Exiting Master process...
Jan 23 05:16:23 np0005593234 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[291184]: [ALERT]    (291196) : Current worker (291198) exited with code 143 (Terminated)
Jan 23 05:16:23 np0005593234 neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9[291184]: [WARNING]  (291196) : All workers exited. Exiting... (0)
Jan 23 05:16:23 np0005593234 systemd[1]: libpod-ee3889c8cef92f17028e6943ced27b4ed5f1d570c7b0341caa3248d41be5b15d.scope: Deactivated successfully.
Jan 23 05:16:23 np0005593234 podman[293515]: 2026-01-23 10:16:23.532983114 +0000 UTC m=+0.052704000 container died ee3889c8cef92f17028e6943ced27b4ed5f1d570c7b0341caa3248d41be5b15d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:16:23 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ee3889c8cef92f17028e6943ced27b4ed5f1d570c7b0341caa3248d41be5b15d-userdata-shm.mount: Deactivated successfully.
Jan 23 05:16:23 np0005593234 systemd[1]: var-lib-containers-storage-overlay-d7737df349e5a533369b961e4f4e817d56e212f83ad18b150fc76c07b26c2124-merged.mount: Deactivated successfully.
Jan 23 05:16:23 np0005593234 podman[293515]: 2026-01-23 10:16:23.590043938 +0000 UTC m=+0.109764814 container cleanup ee3889c8cef92f17028e6943ced27b4ed5f1d570c7b0341caa3248d41be5b15d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:16:23 np0005593234 systemd[1]: libpod-conmon-ee3889c8cef92f17028e6943ced27b4ed5f1d570c7b0341caa3248d41be5b15d.scope: Deactivated successfully.
Jan 23 05:16:23 np0005593234 podman[293556]: 2026-01-23 10:16:23.649923868 +0000 UTC m=+0.040675095 container remove ee3889c8cef92f17028e6943ced27b4ed5f1d570c7b0341caa3248d41be5b15d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:16:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:23.656 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[593b6135-5cf1-4a57-be9c-30a6519730d4]: (4, ('Fri Jan 23 10:16:23 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 (ee3889c8cef92f17028e6943ced27b4ed5f1d570c7b0341caa3248d41be5b15d)\nee3889c8cef92f17028e6943ced27b4ed5f1d570c7b0341caa3248d41be5b15d\nFri Jan 23 10:16:23 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 (ee3889c8cef92f17028e6943ced27b4ed5f1d570c7b0341caa3248d41be5b15d)\nee3889c8cef92f17028e6943ced27b4ed5f1d570c7b0341caa3248d41be5b15d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:23.659 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b72e5c13-00b9-40a8-b793-b0a91781ecca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:23.661 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap00bd3319-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.663 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:23 np0005593234 kernel: tap00bd3319-b0: left promiscuous mode
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.684 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:23.687 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8db2c927-acc4-4975-8a34-10e0071bf432]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.707 227766 DEBUG nova.virt.libvirt.vif [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:12:34Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerRescueNegativeTestJSON-server-526971967',display_name='tempest-ServerRescueNegativeTestJSON-server-526971967',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverrescuenegativetestjson-server-526971967',id=135,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:13:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='0a6ba16c4b9d49d3bc24cd7b44935d1f',ramdisk_id='',reservation_id='r-rypzatwh',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerRescueNegativeTestJSON-87224704',owner_user_name='tempest-ServerRescueNegativeTestJSON-87224704-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:13:21Z,user_data=None,user_id='fae914e59ec54f6b80928ef3cc68dbdb',uuid=c38b8bfe-1b70-4daf-b676-250c1e933ed4,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='rescued') vif={"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.707 227766 DEBUG nova.network.os_vif_util [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converting VIF {"id": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "address": "fa:16:3e:53:81:35", "network": {"id": "00bd3319-bfe5-4acd-b2e4-17830ee847f9", "bridge": "br-int", "label": "tempest-ServerRescueNegativeTestJSON-921943436-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0a6ba16c4b9d49d3bc24cd7b44935d1f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap483c7ca9-a9", "ovs_interfaceid": "483c7ca9-a908-4082-bbad-1ea123d6a3f1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.708 227766 DEBUG nova.network.os_vif_util [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:53:81:35,bridge_name='br-int',has_traffic_filtering=True,id=483c7ca9-a908-4082-bbad-1ea123d6a3f1,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483c7ca9-a9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.708 227766 DEBUG os_vif [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:53:81:35,bridge_name='br-int',has_traffic_filtering=True,id=483c7ca9-a908-4082-bbad-1ea123d6a3f1,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483c7ca9-a9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.710 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.710 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap483c7ca9-a9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:16:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:23.711 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e80bdec1-b67e-42d7-a7b8-d2b765793476]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:23.712 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3935d368-e5fb-40de-8e40-9d47225e443c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.714 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:16:23 np0005593234 nova_compute[227762]: 2026-01-23 10:16:23.717 227766 INFO os_vif [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:53:81:35,bridge_name='br-int',has_traffic_filtering=True,id=483c7ca9-a908-4082-bbad-1ea123d6a3f1,network=Network(00bd3319-bfe5-4acd-b2e4-17830ee847f9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap483c7ca9-a9')#033[00m
Jan 23 05:16:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:23.730 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[28659362-91ff-40b2-94a5-281f884305ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 709552, 'reachable_time': 15632, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 293575, 'error': None, 'target': 'ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:23 np0005593234 systemd[1]: run-netns-ovnmeta\x2d00bd3319\x2dbfe5\x2d4acd\x2db2e4\x2d17830ee847f9.mount: Deactivated successfully.
Jan 23 05:16:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:23.733 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-00bd3319-bfe5-4acd-b2e4-17830ee847f9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:16:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:23.733 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[f799572a-a856-49eb-ab86-f4a340ad8158]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:16:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:24.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:24 np0005593234 nova_compute[227762]: 2026-01-23 10:16:24.460 227766 INFO nova.virt.libvirt.driver [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Deleting instance files /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4_del#033[00m
Jan 23 05:16:24 np0005593234 nova_compute[227762]: 2026-01-23 10:16:24.461 227766 INFO nova.virt.libvirt.driver [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Deletion of /var/lib/nova/instances/c38b8bfe-1b70-4daf-b676-250c1e933ed4_del complete#033[00m
Jan 23 05:16:25 np0005593234 nova_compute[227762]: 2026-01-23 10:16:25.217 227766 DEBUG nova.compute.manager [req-e3aed57d-c0dd-45e6-b8e3-4faf9e6963a7 req-1419f05b-fd8d-4c2c-abe4-3283604ac082 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received event network-vif-unplugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:16:25 np0005593234 nova_compute[227762]: 2026-01-23 10:16:25.217 227766 DEBUG oslo_concurrency.lockutils [req-e3aed57d-c0dd-45e6-b8e3-4faf9e6963a7 req-1419f05b-fd8d-4c2c-abe4-3283604ac082 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:25 np0005593234 nova_compute[227762]: 2026-01-23 10:16:25.217 227766 DEBUG oslo_concurrency.lockutils [req-e3aed57d-c0dd-45e6-b8e3-4faf9e6963a7 req-1419f05b-fd8d-4c2c-abe4-3283604ac082 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:25 np0005593234 nova_compute[227762]: 2026-01-23 10:16:25.218 227766 DEBUG oslo_concurrency.lockutils [req-e3aed57d-c0dd-45e6-b8e3-4faf9e6963a7 req-1419f05b-fd8d-4c2c-abe4-3283604ac082 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:25 np0005593234 nova_compute[227762]: 2026-01-23 10:16:25.218 227766 DEBUG nova.compute.manager [req-e3aed57d-c0dd-45e6-b8e3-4faf9e6963a7 req-1419f05b-fd8d-4c2c-abe4-3283604ac082 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] No waiting events found dispatching network-vif-unplugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:16:25 np0005593234 nova_compute[227762]: 2026-01-23 10:16:25.218 227766 DEBUG nova.compute.manager [req-e3aed57d-c0dd-45e6-b8e3-4faf9e6963a7 req-1419f05b-fd8d-4c2c-abe4-3283604ac082 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received event network-vif-unplugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:16:25 np0005593234 nova_compute[227762]: 2026-01-23 10:16:25.447 227766 INFO nova.compute.manager [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Took 2.17 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:16:25 np0005593234 nova_compute[227762]: 2026-01-23 10:16:25.448 227766 DEBUG oslo.service.loopingcall [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:16:25 np0005593234 nova_compute[227762]: 2026-01-23 10:16:25.448 227766 DEBUG nova.compute.manager [-] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:16:25 np0005593234 nova_compute[227762]: 2026-01-23 10:16:25.449 227766 DEBUG nova.network.neutron [-] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:16:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:25.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:26.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:27 np0005593234 nova_compute[227762]: 2026-01-23 10:16:27.433 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:27.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.030 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.033 227766 DEBUG nova.compute.manager [req-c0e24644-33fb-4a06-b221-666cc1b0af46 req-3c8f0cd4-fa79-4443-b04a-cd266c6f5c56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received event network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.033 227766 DEBUG oslo_concurrency.lockutils [req-c0e24644-33fb-4a06-b221-666cc1b0af46 req-3c8f0cd4-fa79-4443-b04a-cd266c6f5c56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.034 227766 DEBUG oslo_concurrency.lockutils [req-c0e24644-33fb-4a06-b221-666cc1b0af46 req-3c8f0cd4-fa79-4443-b04a-cd266c6f5c56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.034 227766 DEBUG oslo_concurrency.lockutils [req-c0e24644-33fb-4a06-b221-666cc1b0af46 req-3c8f0cd4-fa79-4443-b04a-cd266c6f5c56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.034 227766 DEBUG nova.compute.manager [req-c0e24644-33fb-4a06-b221-666cc1b0af46 req-3c8f0cd4-fa79-4443-b04a-cd266c6f5c56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] No waiting events found dispatching network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.035 227766 WARNING nova.compute.manager [req-c0e24644-33fb-4a06-b221-666cc1b0af46 req-3c8f0cd4-fa79-4443-b04a-cd266c6f5c56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received unexpected event network-vif-plugged-483c7ca9-a908-4082-bbad-1ea123d6a3f1 for instance with vm_state rescued and task_state deleting.#033[00m
Jan 23 05:16:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:28.037 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=52, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=51) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:16:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:28.038 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:16:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:28.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.280 227766 DEBUG nova.network.neutron [-] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.347 227766 INFO nova.compute.manager [-] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Took 2.90 seconds to deallocate network for instance.#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.452 227766 DEBUG oslo_concurrency.lockutils [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.452 227766 DEBUG oslo_concurrency.lockutils [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.486 227766 DEBUG nova.scheduler.client.report [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.504 227766 DEBUG nova.scheduler.client.report [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.505 227766 DEBUG nova.compute.provider_tree [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.529 227766 DEBUG nova.scheduler.client.report [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.541 227766 DEBUG nova.compute.manager [req-0cb68a2d-1b1a-4187-9b4c-2a81265dd5f7 req-16e114d9-e381-4329-be6c-19de047bf95e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Received event network-vif-deleted-483c7ca9-a908-4082-bbad-1ea123d6a3f1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.566 227766 DEBUG nova.scheduler.client.report [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.678 227766 DEBUG oslo_concurrency.processutils [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:28 np0005593234 nova_compute[227762]: 2026-01-23 10:16:28.712 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:16:29 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2790695261' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:16:29 np0005593234 nova_compute[227762]: 2026-01-23 10:16:29.140 227766 DEBUG oslo_concurrency.processutils [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:29 np0005593234 nova_compute[227762]: 2026-01-23 10:16:29.147 227766 DEBUG nova.compute.provider_tree [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:16:29 np0005593234 nova_compute[227762]: 2026-01-23 10:16:29.194 227766 DEBUG nova.scheduler.client.report [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:16:29 np0005593234 nova_compute[227762]: 2026-01-23 10:16:29.264 227766 DEBUG oslo_concurrency.lockutils [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.811s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:29 np0005593234 nova_compute[227762]: 2026-01-23 10:16:29.297 227766 INFO nova.scheduler.client.report [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Deleted allocations for instance c38b8bfe-1b70-4daf-b676-250c1e933ed4#033[00m
Jan 23 05:16:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:29.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:29 np0005593234 nova_compute[227762]: 2026-01-23 10:16:29.660 227766 DEBUG oslo_concurrency.lockutils [None req-b779970f-3653-4d4d-af4f-19b756431aa2 fae914e59ec54f6b80928ef3cc68dbdb 0a6ba16c4b9d49d3bc24cd7b44935d1f - - default default] Lock "c38b8bfe-1b70-4daf-b676-250c1e933ed4" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.388s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:30.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:31.039 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '52'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:16:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:31.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:32.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:32 np0005593234 nova_compute[227762]: 2026-01-23 10:16:32.480 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:33.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:33 np0005593234 nova_compute[227762]: 2026-01-23 10:16:33.716 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:34.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:35.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:35 np0005593234 ovn_controller[134547]: 2026-01-23T10:16:35Z|00586|binding|INFO|Releasing lport 2c16e447-27d9-4516-bf23-ec948f375c10 from this chassis (sb_readonly=0)
Jan 23 05:16:35 np0005593234 nova_compute[227762]: 2026-01-23 10:16:35.980 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:36.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:37 np0005593234 nova_compute[227762]: 2026-01-23 10:16:37.481 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:37.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:38.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:38 np0005593234 nova_compute[227762]: 2026-01-23 10:16:38.514 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163383.5138345, c38b8bfe-1b70-4daf-b676-250c1e933ed4 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:16:38 np0005593234 nova_compute[227762]: 2026-01-23 10:16:38.515 227766 INFO nova.compute.manager [-] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:16:38 np0005593234 nova_compute[227762]: 2026-01-23 10:16:38.548 227766 DEBUG nova.compute.manager [None req-531fda50-c130-462d-98cc-aa8ac1248d2b - - - - - -] [instance: c38b8bfe-1b70-4daf-b676-250c1e933ed4] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:16:38 np0005593234 nova_compute[227762]: 2026-01-23 10:16:38.718 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:38 np0005593234 nova_compute[227762]: 2026-01-23 10:16:38.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.061 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.061 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.061 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.061 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.062 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:16:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4149537340' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:16:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:39.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.506 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.655 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.656 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.819 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.820 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4204MB free_disk=20.987842559814453GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.820 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.821 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.926 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 633b85ea-a47c-4be0-b06d-388aa421728b actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.927 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:16:39 np0005593234 nova_compute[227762]: 2026-01-23 10:16:39.927 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:16:40 np0005593234 nova_compute[227762]: 2026-01-23 10:16:40.008 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:16:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:40.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:16:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:41.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:16:41 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/444975129' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:16:41 np0005593234 nova_compute[227762]: 2026-01-23 10:16:41.760 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.752s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:41 np0005593234 nova_compute[227762]: 2026-01-23 10:16:41.767 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:16:41 np0005593234 nova_compute[227762]: 2026-01-23 10:16:41.797 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:16:41 np0005593234 nova_compute[227762]: 2026-01-23 10:16:41.840 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:16:41 np0005593234 nova_compute[227762]: 2026-01-23 10:16:41.840 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.020s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:42.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:42 np0005593234 nova_compute[227762]: 2026-01-23 10:16:42.346 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:42 np0005593234 nova_compute[227762]: 2026-01-23 10:16:42.482 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:42 np0005593234 nova_compute[227762]: 2026-01-23 10:16:42.841 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:42.853 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:42.853 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:16:42.854 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:43.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:43 np0005593234 nova_compute[227762]: 2026-01-23 10:16:43.720 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:44.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:44 np0005593234 nova_compute[227762]: 2026-01-23 10:16:44.853 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:45.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:46.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:46 np0005593234 nova_compute[227762]: 2026-01-23 10:16:46.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:47 np0005593234 nova_compute[227762]: 2026-01-23 10:16:47.485 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:47.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:47 np0005593234 podman[293724]: 2026-01-23 10:16:47.765596781 +0000 UTC m=+0.058859870 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 05:16:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:16:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:48.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:16:48 np0005593234 nova_compute[227762]: 2026-01-23 10:16:48.723 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:49.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:49 np0005593234 nova_compute[227762]: 2026-01-23 10:16:49.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:49 np0005593234 nova_compute[227762]: 2026-01-23 10:16:49.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:49 np0005593234 nova_compute[227762]: 2026-01-23 10:16:49.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:16:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:50.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:16:50Z|00587|binding|INFO|Releasing lport 2c16e447-27d9-4516-bf23-ec948f375c10 from this chassis (sb_readonly=0)
Jan 23 05:16:50 np0005593234 nova_compute[227762]: 2026-01-23 10:16:50.749 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:51.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:51 np0005593234 nova_compute[227762]: 2026-01-23 10:16:51.790 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Acquiring lock "59fbb9c5-c8a9-4238-be6a-07598275a158" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:51 np0005593234 nova_compute[227762]: 2026-01-23 10:16:51.791 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:51 np0005593234 nova_compute[227762]: 2026-01-23 10:16:51.809 227766 DEBUG nova.compute.manager [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:16:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:52.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:52 np0005593234 nova_compute[227762]: 2026-01-23 10:16:52.487 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:52 np0005593234 nova_compute[227762]: 2026-01-23 10:16:52.602 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:16:52 np0005593234 nova_compute[227762]: 2026-01-23 10:16:52.603 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:16:52 np0005593234 nova_compute[227762]: 2026-01-23 10:16:52.609 227766 DEBUG nova.virt.hardware [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:16:52 np0005593234 nova_compute[227762]: 2026-01-23 10:16:52.610 227766 INFO nova.compute.claims [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:16:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:52 np0005593234 nova_compute[227762]: 2026-01-23 10:16:52.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:16:52 np0005593234 nova_compute[227762]: 2026-01-23 10:16:52.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:16:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:53.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:53 np0005593234 nova_compute[227762]: 2026-01-23 10:16:53.724 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:53 np0005593234 podman[293796]: 2026-01-23 10:16:53.782331302 +0000 UTC m=+0.074852988 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:16:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:54.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:54 np0005593234 nova_compute[227762]: 2026-01-23 10:16:54.470 227766 DEBUG oslo_concurrency.processutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:16:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:16:54 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3398619664' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:16:54 np0005593234 nova_compute[227762]: 2026-01-23 10:16:54.945 227766 DEBUG oslo_concurrency.processutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:16:54 np0005593234 nova_compute[227762]: 2026-01-23 10:16:54.953 227766 DEBUG nova.compute.provider_tree [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:16:54 np0005593234 nova_compute[227762]: 2026-01-23 10:16:54.978 227766 DEBUG nova.scheduler.client.report [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:16:55 np0005593234 nova_compute[227762]: 2026-01-23 10:16:55.025 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.422s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:16:55 np0005593234 nova_compute[227762]: 2026-01-23 10:16:55.026 227766 DEBUG nova.compute.manager [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:16:55 np0005593234 nova_compute[227762]: 2026-01-23 10:16:55.147 227766 DEBUG nova.compute.manager [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:16:55 np0005593234 nova_compute[227762]: 2026-01-23 10:16:55.147 227766 DEBUG nova.network.neutron [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:16:55 np0005593234 nova_compute[227762]: 2026-01-23 10:16:55.282 227766 INFO nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:16:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:55.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:56.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:57 np0005593234 nova_compute[227762]: 2026-01-23 10:16:57.489 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:57.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:16:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:58.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:16:58 np0005593234 nova_compute[227762]: 2026-01-23 10:16:58.727 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:16:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:16:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:16:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:59.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:17:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:00.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:17:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:01.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:02.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:02 np0005593234 nova_compute[227762]: 2026-01-23 10:17:02.490 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:02 np0005593234 nova_compute[227762]: 2026-01-23 10:17:02.509 227766 DEBUG nova.compute.manager [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:17:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:02 np0005593234 nova_compute[227762]: 2026-01-23 10:17:02.852 227766 INFO nova.virt.block_device [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Booting with volume aee959cd-89dc-45e7-ba7b-58dc8568e292 at /dev/vda#033[00m
Jan 23 05:17:03 np0005593234 nova_compute[227762]: 2026-01-23 10:17:03.260 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:17:03 np0005593234 nova_compute[227762]: 2026-01-23 10:17:03.261 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:17:03 np0005593234 nova_compute[227762]: 2026-01-23 10:17:03.261 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:17:03 np0005593234 nova_compute[227762]: 2026-01-23 10:17:03.309 227766 DEBUG nova.policy [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '18f5dbf0e00d41b2b913cc1a517bc922', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '59cfb6a6a5ea438fb4b12029b4fcea0f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:17:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:17:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:03.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:17:03 np0005593234 nova_compute[227762]: 2026-01-23 10:17:03.729 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:03 np0005593234 nova_compute[227762]: 2026-01-23 10:17:03.994 227766 DEBUG os_brick.utils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:17:03 np0005593234 nova_compute[227762]: 2026-01-23 10:17:03.996 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.009 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.010 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[ceaf76a4-c808-427b-a10a-16629458328a]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.011 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.019 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.019 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[df13046f-12f6-40be-a54f-145d8b2d5769]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.021 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.030 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.031 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[86336668-bae2-4b11-afc9-69057e0e46b1]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.033 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[e38c7d45-6a59-4d49-9dc2-409156b15ca6]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.033 227766 DEBUG oslo_concurrency.processutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.059 227766 DEBUG oslo_concurrency.processutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CMD "nvme version" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.062 227766 DEBUG os_brick.initiator.connectors.lightos [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.062 227766 DEBUG os_brick.initiator.connectors.lightos [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.063 227766 DEBUG os_brick.initiator.connectors.lightos [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.063 227766 DEBUG os_brick.utils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] <== get_connector_properties: return (68ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.063 227766 DEBUG nova.virt.block_device [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Updating existing volume attachment record: 4e2d473b-0246-4a58-9f58-5de378436645 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:17:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:04.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:04 np0005593234 nova_compute[227762]: 2026-01-23 10:17:04.912 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:04.912 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=53, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=52) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:17:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:04.913 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:17:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:17:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2070612728' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:17:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:17:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:05.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:17:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:17:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:06.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:17:06 np0005593234 nova_compute[227762]: 2026-01-23 10:17:06.886 227766 DEBUG nova.network.neutron [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Successfully created port: d384450c-fad9-4b71-a4a4-8f666b98276f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:17:07 np0005593234 nova_compute[227762]: 2026-01-23 10:17:07.075 227766 DEBUG nova.compute.manager [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:17:07 np0005593234 nova_compute[227762]: 2026-01-23 10:17:07.076 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:17:07 np0005593234 nova_compute[227762]: 2026-01-23 10:17:07.077 227766 INFO nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Creating image(s)#033[00m
Jan 23 05:17:07 np0005593234 nova_compute[227762]: 2026-01-23 10:17:07.077 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 05:17:07 np0005593234 nova_compute[227762]: 2026-01-23 10:17:07.077 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Ensure instance console log exists: /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:17:07 np0005593234 nova_compute[227762]: 2026-01-23 10:17:07.078 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:07 np0005593234 nova_compute[227762]: 2026-01-23 10:17:07.078 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:07 np0005593234 nova_compute[227762]: 2026-01-23 10:17:07.078 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:07 np0005593234 nova_compute[227762]: 2026-01-23 10:17:07.491 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:07.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:08.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:08 np0005593234 nova_compute[227762]: 2026-01-23 10:17:08.730 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:08 np0005593234 nova_compute[227762]: 2026-01-23 10:17:08.843 227766 DEBUG nova.network.neutron [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Successfully updated port: d384450c-fad9-4b71-a4a4-8f666b98276f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:17:08 np0005593234 nova_compute[227762]: 2026-01-23 10:17:08.942 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Acquiring lock "refresh_cache-59fbb9c5-c8a9-4238-be6a-07598275a158" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:17:08 np0005593234 nova_compute[227762]: 2026-01-23 10:17:08.942 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Acquired lock "refresh_cache-59fbb9c5-c8a9-4238-be6a-07598275a158" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:17:08 np0005593234 nova_compute[227762]: 2026-01-23 10:17:08.943 227766 DEBUG nova.network.neutron [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:17:09 np0005593234 nova_compute[227762]: 2026-01-23 10:17:09.185 227766 DEBUG nova.compute.manager [req-65c36076-9041-44f2-bd8f-8a3b2c941d67 req-7752246c-cea8-4803-8997-648669a79116 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received event network-changed-d384450c-fad9-4b71-a4a4-8f666b98276f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:17:09 np0005593234 nova_compute[227762]: 2026-01-23 10:17:09.185 227766 DEBUG nova.compute.manager [req-65c36076-9041-44f2-bd8f-8a3b2c941d67 req-7752246c-cea8-4803-8997-648669a79116 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Refreshing instance network info cache due to event network-changed-d384450c-fad9-4b71-a4a4-8f666b98276f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:17:09 np0005593234 nova_compute[227762]: 2026-01-23 10:17:09.186 227766 DEBUG oslo_concurrency.lockutils [req-65c36076-9041-44f2-bd8f-8a3b2c941d67 req-7752246c-cea8-4803-8997-648669a79116 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-59fbb9c5-c8a9-4238-be6a-07598275a158" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:17:09 np0005593234 nova_compute[227762]: 2026-01-23 10:17:09.325 227766 DEBUG nova.network.neutron [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:17:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:09.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:10 np0005593234 nova_compute[227762]: 2026-01-23 10:17:10.115 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:10.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:10 np0005593234 nova_compute[227762]: 2026-01-23 10:17:10.168 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updating instance_info_cache with network_info: [{"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:17:10 np0005593234 nova_compute[227762]: 2026-01-23 10:17:10.214 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:17:10 np0005593234 nova_compute[227762]: 2026-01-23 10:17:10.215 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:17:10 np0005593234 nova_compute[227762]: 2026-01-23 10:17:10.215 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:11.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.707 227766 DEBUG nova.network.neutron [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Updating instance_info_cache with network_info: [{"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.748 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Releasing lock "refresh_cache-59fbb9c5-c8a9-4238-be6a-07598275a158" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.748 227766 DEBUG nova.compute.manager [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Instance network_info: |[{"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.749 227766 DEBUG oslo_concurrency.lockutils [req-65c36076-9041-44f2-bd8f-8a3b2c941d67 req-7752246c-cea8-4803-8997-648669a79116 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-59fbb9c5-c8a9-4238-be6a-07598275a158" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.749 227766 DEBUG nova.network.neutron [req-65c36076-9041-44f2-bd8f-8a3b2c941d67 req-7752246c-cea8-4803-8997-648669a79116 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Refreshing network info cache for port d384450c-fad9-4b71-a4a4-8f666b98276f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.754 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Start _get_guest_xml network_info=[{"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'boot_index': 0, 'mount_device': '/dev/vda', 'delete_on_termination': True, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-aee959cd-89dc-45e7-ba7b-58dc8568e292', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'aee959cd-89dc-45e7-ba7b-58dc8568e292', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '59fbb9c5-c8a9-4238-be6a-07598275a158', 'attached_at': '', 'detached_at': '', 'volume_id': 'aee959cd-89dc-45e7-ba7b-58dc8568e292', 'serial': 'aee959cd-89dc-45e7-ba7b-58dc8568e292'}, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '4e2d473b-0246-4a58-9f58-5de378436645', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.760 227766 WARNING nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.766 227766 DEBUG nova.virt.libvirt.host [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.767 227766 DEBUG nova.virt.libvirt.host [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.771 227766 DEBUG nova.virt.libvirt.host [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.771 227766 DEBUG nova.virt.libvirt.host [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.773 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.773 227766 DEBUG nova.virt.hardware [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.773 227766 DEBUG nova.virt.hardware [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.773 227766 DEBUG nova.virt.hardware [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.774 227766 DEBUG nova.virt.hardware [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.774 227766 DEBUG nova.virt.hardware [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.774 227766 DEBUG nova.virt.hardware [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.774 227766 DEBUG nova.virt.hardware [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.774 227766 DEBUG nova.virt.hardware [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.774 227766 DEBUG nova.virt.hardware [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.775 227766 DEBUG nova.virt.hardware [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:17:11 np0005593234 nova_compute[227762]: 2026-01-23 10:17:11.775 227766 DEBUG nova.virt.hardware [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:17:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:11.916 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '53'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.071 227766 DEBUG nova.storage.rbd_utils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] rbd image 59fbb9c5-c8a9-4238-be6a-07598275a158_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.076 227766 DEBUG oslo_concurrency.processutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:12.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:17:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3472513583' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.530 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.548 227766 DEBUG oslo_concurrency.processutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.472s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.592 227766 DEBUG nova.virt.libvirt.vif [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:16:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-299074738',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-299074738',id=144,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIO8SoVPFpcE8uzIjQTRaXKTFlQPbF3ozpTg0KSfakFf4/i5eGAVsKo/QuxEFtXxl3uwg/ipbk/ufHC2kKF3qYN3uuybCD/TIi0ND+JRge+qnaUvHHmVHBBPdL11iJCEJQ==',key_name='tempest-keypair-657216728',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='59cfb6a6a5ea438fb4b12029b4fcea0f',ramdisk_id='',reservation_id='r-p45q8ss2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-ServerActionsV293TestJSON-1456789510',owner_user_name='tempest-ServerActionsV293TestJSON-1456789510-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:17:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='18f5dbf0e00d41b2b913cc1a517bc922',uuid=59fbb9c5-c8a9-4238-be6a-07598275a158,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.592 227766 DEBUG nova.network.os_vif_util [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Converting VIF {"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.593 227766 DEBUG nova.network.os_vif_util [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:5b:7f,bridge_name='br-int',has_traffic_filtering=True,id=d384450c-fad9-4b71-a4a4-8f666b98276f,network=Network(8f36dc80-2fd9-4680-a74d-5f599bd98395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd384450c-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.594 227766 DEBUG nova.objects.instance [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lazy-loading 'pci_devices' on Instance uuid 59fbb9c5-c8a9-4238-be6a-07598275a158 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.633 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  <uuid>59fbb9c5-c8a9-4238-be6a-07598275a158</uuid>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  <name>instance-00000090</name>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerActionsV293TestJSON-server-299074738</nova:name>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:17:11</nova:creationTime>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <nova:user uuid="18f5dbf0e00d41b2b913cc1a517bc922">tempest-ServerActionsV293TestJSON-1456789510-project-member</nova:user>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <nova:project uuid="59cfb6a6a5ea438fb4b12029b4fcea0f">tempest-ServerActionsV293TestJSON-1456789510</nova:project>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <nova:port uuid="d384450c-fad9-4b71-a4a4-8f666b98276f">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <entry name="serial">59fbb9c5-c8a9-4238-be6a-07598275a158</entry>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <entry name="uuid">59fbb9c5-c8a9-4238-be6a-07598275a158</entry>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/59fbb9c5-c8a9-4238-be6a-07598275a158_disk.config">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="volumes/volume-aee959cd-89dc-45e7-ba7b-58dc8568e292">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <serial>aee959cd-89dc-45e7-ba7b-58dc8568e292</serial>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:62:5b:7f"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <target dev="tapd384450c-fa"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/console.log" append="off"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:17:12 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:17:12 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:17:12 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:17:12 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.636 227766 DEBUG nova.compute.manager [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Preparing to wait for external event network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.636 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Acquiring lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.636 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.637 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.637 227766 DEBUG nova.virt.libvirt.vif [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:16:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-299074738',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-299074738',id=144,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIO8SoVPFpcE8uzIjQTRaXKTFlQPbF3ozpTg0KSfakFf4/i5eGAVsKo/QuxEFtXxl3uwg/ipbk/ufHC2kKF3qYN3uuybCD/TIi0ND+JRge+qnaUvHHmVHBBPdL11iJCEJQ==',key_name='tempest-keypair-657216728',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='59cfb6a6a5ea438fb4b12029b4fcea0f',ramdisk_id='',reservation_id='r-p45q8ss2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-ServerActionsV293TestJSON-1456789510',owner_user_name='tempest-ServerActionsV293TestJSON-1456789510-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:17:02Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='18f5dbf0e00d41b2b913cc1a517bc922',uuid=59fbb9c5-c8a9-4238-be6a-07598275a158,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.638 227766 DEBUG nova.network.os_vif_util [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Converting VIF {"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.638 227766 DEBUG nova.network.os_vif_util [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:62:5b:7f,bridge_name='br-int',has_traffic_filtering=True,id=d384450c-fad9-4b71-a4a4-8f666b98276f,network=Network(8f36dc80-2fd9-4680-a74d-5f599bd98395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd384450c-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.638 227766 DEBUG os_vif [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:5b:7f,bridge_name='br-int',has_traffic_filtering=True,id=d384450c-fad9-4b71-a4a4-8f666b98276f,network=Network(8f36dc80-2fd9-4680-a74d-5f599bd98395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd384450c-fa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.639 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.639 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.640 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.643 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.643 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd384450c-fa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.643 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd384450c-fa, col_values=(('external_ids', {'iface-id': 'd384450c-fad9-4b71-a4a4-8f666b98276f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:5b:7f', 'vm-uuid': '59fbb9c5-c8a9-4238-be6a-07598275a158'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.645 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:12 np0005593234 NetworkManager[48942]: <info>  [1769163432.6465] manager: (tapd384450c-fa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/286)
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.647 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.650 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.651 227766 INFO os_vif [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:62:5b:7f,bridge_name='br-int',has_traffic_filtering=True,id=d384450c-fad9-4b71-a4a4-8f666b98276f,network=Network(8f36dc80-2fd9-4680-a74d-5f599bd98395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd384450c-fa')#033[00m
Jan 23 05:17:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.739 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.739 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.739 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] No VIF found with MAC fa:16:3e:62:5b:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.740 227766 INFO nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Using config drive#033[00m
Jan 23 05:17:12 np0005593234 nova_compute[227762]: 2026-01-23 10:17:12.768 227766 DEBUG nova.storage.rbd_utils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] rbd image 59fbb9c5-c8a9-4238-be6a-07598275a158_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:17:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:13.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:13 np0005593234 nova_compute[227762]: 2026-01-23 10:17:13.630 227766 INFO nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Creating config drive at /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/disk.config#033[00m
Jan 23 05:17:13 np0005593234 nova_compute[227762]: 2026-01-23 10:17:13.638 227766 DEBUG oslo_concurrency.processutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4j2xggvg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:13 np0005593234 nova_compute[227762]: 2026-01-23 10:17:13.770 227766 DEBUG oslo_concurrency.processutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4j2xggvg" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:13 np0005593234 nova_compute[227762]: 2026-01-23 10:17:13.800 227766 DEBUG nova.storage.rbd_utils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] rbd image 59fbb9c5-c8a9-4238-be6a-07598275a158_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:17:13 np0005593234 nova_compute[227762]: 2026-01-23 10:17:13.803 227766 DEBUG oslo_concurrency.processutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/disk.config 59fbb9c5-c8a9-4238-be6a-07598275a158_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:13 np0005593234 nova_compute[227762]: 2026-01-23 10:17:13.952 227766 DEBUG oslo_concurrency.processutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/disk.config 59fbb9c5-c8a9-4238-be6a-07598275a158_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:13 np0005593234 nova_compute[227762]: 2026-01-23 10:17:13.953 227766 INFO nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Deleting local config drive /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/disk.config because it was imported into RBD.#033[00m
Jan 23 05:17:13 np0005593234 kernel: tapd384450c-fa: entered promiscuous mode
Jan 23 05:17:14 np0005593234 NetworkManager[48942]: <info>  [1769163434.0005] manager: (tapd384450c-fa): new Tun device (/org/freedesktop/NetworkManager/Devices/287)
Jan 23 05:17:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:17:14Z|00588|binding|INFO|Claiming lport d384450c-fad9-4b71-a4a4-8f666b98276f for this chassis.
Jan 23 05:17:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:17:14Z|00589|binding|INFO|d384450c-fad9-4b71-a4a4-8f666b98276f: Claiming fa:16:3e:62:5b:7f 10.100.0.4
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.001 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:17:14Z|00590|binding|INFO|Setting lport d384450c-fad9-4b71-a4a4-8f666b98276f ovn-installed in OVS
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.016 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.019 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:14 np0005593234 systemd-udevd[294020]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:17:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:17:14Z|00591|binding|INFO|Setting lport d384450c-fad9-4b71-a4a4-8f666b98276f up in Southbound
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.037 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:5b:7f 10.100.0.4'], port_security=['fa:16:3e:62:5b:7f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '59fbb9c5-c8a9-4238-be6a-07598275a158', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '59cfb6a6a5ea438fb4b12029b4fcea0f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '86cc459d-07e9-4599-8119-e9daeae5f0bd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=989a90b0-402b-45c7-85bc-096f22ca1841, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=d384450c-fad9-4b71-a4a4-8f666b98276f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.039 144381 INFO neutron.agent.ovn.metadata.agent [-] Port d384450c-fad9-4b71-a4a4-8f666b98276f in datapath 8f36dc80-2fd9-4680-a74d-5f599bd98395 bound to our chassis#033[00m
Jan 23 05:17:14 np0005593234 systemd-machined[195626]: New machine qemu-69-instance-00000090.
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.040 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8f36dc80-2fd9-4680-a74d-5f599bd98395#033[00m
Jan 23 05:17:14 np0005593234 NetworkManager[48942]: <info>  [1769163434.0441] device (tapd384450c-fa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:17:14 np0005593234 NetworkManager[48942]: <info>  [1769163434.0450] device (tapd384450c-fa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.052 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[011a5f81-c8d1-4c27-a201-07d191372157]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.053 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8f36dc80-21 in ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:17:14 np0005593234 systemd[1]: Started Virtual Machine qemu-69-instance-00000090.
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.055 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8f36dc80-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.055 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ecb2cf0b-1918-4e3d-bbaa-d1f5c49c2d0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.056 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[05b2b4c2-4d8c-4617-9570-2e0da747532a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.069 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[29604603-a0bc-4e17-8400-a20dcfc9a72d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.083 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0afe8e41-836c-4846-adc4-acac36dde14a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.114 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[07ae7ed2-b2f6-446e-801b-a03ca7a0ea86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 NetworkManager[48942]: <info>  [1769163434.1209] manager: (tap8f36dc80-20): new Veth device (/org/freedesktop/NetworkManager/Devices/288)
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.121 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a6481bcb-feb3-451f-b466-0472232e747b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 systemd-udevd[294024]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.150 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f671a11c-5be4-427d-b692-a5cf29e816ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.152 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[52531d30-8d87-4465-b5e4-0dd949578702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:14.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:14 np0005593234 NetworkManager[48942]: <info>  [1769163434.1740] device (tap8f36dc80-20): carrier: link connected
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.179 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[aa3d0ff0-68cb-4537-9fff-9c76e8cb727c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.195 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9ad5f43f-93f6-43dc-97c0-babac317e458]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f36dc80-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:48:d3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 732913, 'reachable_time': 29525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294056, 'error': None, 'target': 'ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.210 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.211 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8b31e490-cf43-46ad-9654-5d8547cc9617]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:48d3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 732913, 'tstamp': 732913}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294057, 'error': None, 'target': 'ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.228 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d4ef564d-767f-440b-99b5-ce49fd0ccc21]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f36dc80-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:48:d3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 187], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 732913, 'reachable_time': 29525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 294058, 'error': None, 'target': 'ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.256 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2865fe4a-2147-4be5-afa0-179ac16f913b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.312 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0a88eb7d-6768-4887-ad4e-5ed721738acd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.316 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f36dc80-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.317 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.317 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f36dc80-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.319 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:14 np0005593234 NetworkManager[48942]: <info>  [1769163434.3204] manager: (tap8f36dc80-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/289)
Jan 23 05:17:14 np0005593234 kernel: tap8f36dc80-20: entered promiscuous mode
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.322 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.323 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8f36dc80-20, col_values=(('external_ids', {'iface-id': '2dbd8718-7ffd-46bd-89c9-1311fab1c368'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.324 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:17:14Z|00592|binding|INFO|Releasing lport 2dbd8718-7ffd-46bd-89c9-1311fab1c368 from this chassis (sb_readonly=0)
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.337 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.338 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8f36dc80-2fd9-4680-a74d-5f599bd98395.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8f36dc80-2fd9-4680-a74d-5f599bd98395.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.339 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[65c6654e-635a-48cb-bb2f-e81e10576255]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.340 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-8f36dc80-2fd9-4680-a74d-5f599bd98395
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/8f36dc80-2fd9-4680-a74d-5f599bd98395.pid.haproxy
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 8f36dc80-2fd9-4680-a74d-5f599bd98395
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:17:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:14.341 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'env', 'PROCESS_TAG=haproxy-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8f36dc80-2fd9-4680-a74d-5f599bd98395.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.439 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163434.4390554, 59fbb9c5-c8a9-4238-be6a-07598275a158 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.439 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] VM Started (Lifecycle Event)#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.493 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.497 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163434.4398386, 59fbb9c5-c8a9-4238-be6a-07598275a158 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.497 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.593 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.598 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.664 227766 DEBUG nova.compute.manager [req-34ef9e29-505b-40ab-8844-5b10e189c0cf req-2f3a5418-d39c-4c0c-81aa-c316ef099db0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received event network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.665 227766 DEBUG oslo_concurrency.lockutils [req-34ef9e29-505b-40ab-8844-5b10e189c0cf req-2f3a5418-d39c-4c0c-81aa-c316ef099db0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.665 227766 DEBUG oslo_concurrency.lockutils [req-34ef9e29-505b-40ab-8844-5b10e189c0cf req-2f3a5418-d39c-4c0c-81aa-c316ef099db0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.666 227766 DEBUG oslo_concurrency.lockutils [req-34ef9e29-505b-40ab-8844-5b10e189c0cf req-2f3a5418-d39c-4c0c-81aa-c316ef099db0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.666 227766 DEBUG nova.compute.manager [req-34ef9e29-505b-40ab-8844-5b10e189c0cf req-2f3a5418-d39c-4c0c-81aa-c316ef099db0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Processing event network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.667 227766 DEBUG nova.compute.manager [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.671 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.674 227766 INFO nova.virt.libvirt.driver [-] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Instance spawned successfully.#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.674 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.690 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.691 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163434.6706173, 59fbb9c5-c8a9-4238-be6a-07598275a158 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.691 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:17:14 np0005593234 podman[294131]: 2026-01-23 10:17:14.720269073 +0000 UTC m=+0.060520252 container create 41e79cf3464acf99bc484ea4a9c8203c2b2badaf699f2cbef8d8f30e9e91b14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 05:17:14 np0005593234 systemd[1]: Started libpod-conmon-41e79cf3464acf99bc484ea4a9c8203c2b2badaf699f2cbef8d8f30e9e91b14c.scope.
Jan 23 05:17:14 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:17:14 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0d8772cf921bc909322887ae7600757cecfb458652805a95dcbf09de8e51080/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:17:14 np0005593234 podman[294131]: 2026-01-23 10:17:14.688271078 +0000 UTC m=+0.028522277 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.796 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:17:14 np0005593234 podman[294131]: 2026-01-23 10:17:14.803401806 +0000 UTC m=+0.143652995 container init 41e79cf3464acf99bc484ea4a9c8203c2b2badaf699f2cbef8d8f30e9e91b14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.802 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.806 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.806 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.807 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.807 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.807 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.808 227766 DEBUG nova.virt.libvirt.driver [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:17:14 np0005593234 podman[294131]: 2026-01-23 10:17:14.810184238 +0000 UTC m=+0.150435407 container start 41e79cf3464acf99bc484ea4a9c8203c2b2badaf699f2cbef8d8f30e9e91b14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:17:14 np0005593234 neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395[294146]: [NOTICE]   (294150) : New worker (294152) forked
Jan 23 05:17:14 np0005593234 neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395[294146]: [NOTICE]   (294150) : Loading success.
Jan 23 05:17:14 np0005593234 nova_compute[227762]: 2026-01-23 10:17:14.882 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:17:15 np0005593234 nova_compute[227762]: 2026-01-23 10:17:15.038 227766 INFO nova.compute.manager [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Took 7.96 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:17:15 np0005593234 nova_compute[227762]: 2026-01-23 10:17:15.039 227766 DEBUG nova.compute.manager [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:17:15 np0005593234 nova_compute[227762]: 2026-01-23 10:17:15.336 227766 INFO nova.compute.manager [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Took 22.84 seconds to build instance.#033[00m
Jan 23 05:17:15 np0005593234 nova_compute[227762]: 2026-01-23 10:17:15.436 227766 DEBUG oslo_concurrency.lockutils [None req-720cc1b1-0676-4143-8ca2-e6ca5a671aa1 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:17:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:15.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:17:15 np0005593234 nova_compute[227762]: 2026-01-23 10:17:15.896 227766 DEBUG nova.network.neutron [req-65c36076-9041-44f2-bd8f-8a3b2c941d67 req-7752246c-cea8-4803-8997-648669a79116 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Updated VIF entry in instance network info cache for port d384450c-fad9-4b71-a4a4-8f666b98276f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:17:15 np0005593234 nova_compute[227762]: 2026-01-23 10:17:15.896 227766 DEBUG nova.network.neutron [req-65c36076-9041-44f2-bd8f-8a3b2c941d67 req-7752246c-cea8-4803-8997-648669a79116 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Updating instance_info_cache with network_info: [{"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:17:15 np0005593234 nova_compute[227762]: 2026-01-23 10:17:15.951 227766 DEBUG oslo_concurrency.lockutils [req-65c36076-9041-44f2-bd8f-8a3b2c941d67 req-7752246c-cea8-4803-8997-648669a79116 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-59fbb9c5-c8a9-4238-be6a-07598275a158" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:17:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:16.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:17:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:17:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:17:17 np0005593234 nova_compute[227762]: 2026-01-23 10:17:17.427 227766 DEBUG nova.compute.manager [req-c74c0c59-394b-4b33-abf6-3ac3813f4fd1 req-f2e5ba20-4ace-45d3-8465-cdf2dc318176 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received event network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:17:17 np0005593234 nova_compute[227762]: 2026-01-23 10:17:17.427 227766 DEBUG oslo_concurrency.lockutils [req-c74c0c59-394b-4b33-abf6-3ac3813f4fd1 req-f2e5ba20-4ace-45d3-8465-cdf2dc318176 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:17 np0005593234 nova_compute[227762]: 2026-01-23 10:17:17.428 227766 DEBUG oslo_concurrency.lockutils [req-c74c0c59-394b-4b33-abf6-3ac3813f4fd1 req-f2e5ba20-4ace-45d3-8465-cdf2dc318176 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:17 np0005593234 nova_compute[227762]: 2026-01-23 10:17:17.428 227766 DEBUG oslo_concurrency.lockutils [req-c74c0c59-394b-4b33-abf6-3ac3813f4fd1 req-f2e5ba20-4ace-45d3-8465-cdf2dc318176 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:17 np0005593234 nova_compute[227762]: 2026-01-23 10:17:17.428 227766 DEBUG nova.compute.manager [req-c74c0c59-394b-4b33-abf6-3ac3813f4fd1 req-f2e5ba20-4ace-45d3-8465-cdf2dc318176 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] No waiting events found dispatching network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:17:17 np0005593234 nova_compute[227762]: 2026-01-23 10:17:17.428 227766 WARNING nova.compute.manager [req-c74c0c59-394b-4b33-abf6-3ac3813f4fd1 req-f2e5ba20-4ace-45d3-8465-cdf2dc318176 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received unexpected event network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f for instance with vm_state active and task_state None.#033[00m
Jan 23 05:17:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:17.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:17 np0005593234 nova_compute[227762]: 2026-01-23 10:17:17.532 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:17 np0005593234 nova_compute[227762]: 2026-01-23 10:17:17.646 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:18.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:18 np0005593234 podman[294294]: 2026-01-23 10:17:18.768405732 +0000 UTC m=+0.059072627 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:17:19 np0005593234 nova_compute[227762]: 2026-01-23 10:17:19.122 227766 DEBUG nova.compute.manager [req-4a396461-ec62-4363-ac45-725fbaca3166 req-c729df00-5f97-4034-992a-dfd182aa8aa3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received event network-changed-d384450c-fad9-4b71-a4a4-8f666b98276f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:17:19 np0005593234 nova_compute[227762]: 2026-01-23 10:17:19.123 227766 DEBUG nova.compute.manager [req-4a396461-ec62-4363-ac45-725fbaca3166 req-c729df00-5f97-4034-992a-dfd182aa8aa3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Refreshing instance network info cache due to event network-changed-d384450c-fad9-4b71-a4a4-8f666b98276f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:17:19 np0005593234 nova_compute[227762]: 2026-01-23 10:17:19.123 227766 DEBUG oslo_concurrency.lockutils [req-4a396461-ec62-4363-ac45-725fbaca3166 req-c729df00-5f97-4034-992a-dfd182aa8aa3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-59fbb9c5-c8a9-4238-be6a-07598275a158" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:17:19 np0005593234 nova_compute[227762]: 2026-01-23 10:17:19.123 227766 DEBUG oslo_concurrency.lockutils [req-4a396461-ec62-4363-ac45-725fbaca3166 req-c729df00-5f97-4034-992a-dfd182aa8aa3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-59fbb9c5-c8a9-4238-be6a-07598275a158" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:17:19 np0005593234 nova_compute[227762]: 2026-01-23 10:17:19.123 227766 DEBUG nova.network.neutron [req-4a396461-ec62-4363-ac45-725fbaca3166 req-c729df00-5f97-4034-992a-dfd182aa8aa3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Refreshing network info cache for port d384450c-fad9-4b71-a4a4-8f666b98276f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:17:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:19.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:20.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:20 np0005593234 nova_compute[227762]: 2026-01-23 10:17:20.445 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:17:21 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2477827470' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:17:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:21.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:21 np0005593234 nova_compute[227762]: 2026-01-23 10:17:21.842 227766 DEBUG nova.network.neutron [req-4a396461-ec62-4363-ac45-725fbaca3166 req-c729df00-5f97-4034-992a-dfd182aa8aa3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Updated VIF entry in instance network info cache for port d384450c-fad9-4b71-a4a4-8f666b98276f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:17:21 np0005593234 nova_compute[227762]: 2026-01-23 10:17:21.843 227766 DEBUG nova.network.neutron [req-4a396461-ec62-4363-ac45-725fbaca3166 req-c729df00-5f97-4034-992a-dfd182aa8aa3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Updating instance_info_cache with network_info: [{"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:17:21 np0005593234 nova_compute[227762]: 2026-01-23 10:17:21.876 227766 DEBUG oslo_concurrency.lockutils [req-4a396461-ec62-4363-ac45-725fbaca3166 req-c729df00-5f97-4034-992a-dfd182aa8aa3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-59fbb9c5-c8a9-4238-be6a-07598275a158" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:17:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:22.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:22 np0005593234 nova_compute[227762]: 2026-01-23 10:17:22.533 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:22 np0005593234 nova_compute[227762]: 2026-01-23 10:17:22.648 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:17:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:17:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:23.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:17:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:24.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:17:24 np0005593234 podman[294366]: 2026-01-23 10:17:24.786508044 +0000 UTC m=+0.082507015 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 23 05:17:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:17:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:25.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:17:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:26.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:26 np0005593234 nova_compute[227762]: 2026-01-23 10:17:26.780 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:27 np0005593234 nova_compute[227762]: 2026-01-23 10:17:27.535 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:17:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:27.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:17:27 np0005593234 nova_compute[227762]: 2026-01-23 10:17:27.649 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:17:28Z|00063|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:62:5b:7f 10.100.0.4
Jan 23 05:17:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:17:28Z|00064|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:62:5b:7f 10.100.0.4
Jan 23 05:17:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:17:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:28.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:17:28 np0005593234 ceph-mgr[77448]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 05:17:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:29.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:30.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:31.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:32.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:32 np0005593234 nova_compute[227762]: 2026-01-23 10:17:32.536 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:32 np0005593234 nova_compute[227762]: 2026-01-23 10:17:32.651 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:33.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:34.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:35.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:36.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:37 np0005593234 nova_compute[227762]: 2026-01-23 10:17:37.539 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:37.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:37 np0005593234 nova_compute[227762]: 2026-01-23 10:17:37.654 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:38.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:38 np0005593234 nova_compute[227762]: 2026-01-23 10:17:38.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.006 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.007 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.007 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.007 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.008 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.036 227766 INFO nova.compute.manager [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Rebuilding instance#033[00m
Jan 23 05:17:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:17:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3315566149' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.495 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.538 227766 DEBUG nova.objects.instance [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lazy-loading 'trusted_certs' on Instance uuid 59fbb9c5-c8a9-4238-be6a-07598275a158 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:17:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:39.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.570 227766 DEBUG nova.compute.manager [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.614 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000090 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.615 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000090 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.618 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.618 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000008c as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.645 227766 DEBUG nova.objects.instance [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lazy-loading 'pci_requests' on Instance uuid 59fbb9c5-c8a9-4238-be6a-07598275a158 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.677 227766 DEBUG nova.objects.instance [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lazy-loading 'pci_devices' on Instance uuid 59fbb9c5-c8a9-4238-be6a-07598275a158 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.693 227766 DEBUG nova.objects.instance [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lazy-loading 'resources' on Instance uuid 59fbb9c5-c8a9-4238-be6a-07598275a158 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.707 227766 DEBUG nova.objects.instance [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lazy-loading 'migration_context' on Instance uuid 59fbb9c5-c8a9-4238-be6a-07598275a158 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.720 227766 DEBUG nova.objects.instance [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.723 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.804 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.806 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3934MB free_disk=20.966922760009766GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.806 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.806 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.947 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 633b85ea-a47c-4be0-b06d-388aa421728b actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.948 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 59fbb9c5-c8a9-4238-be6a-07598275a158 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.948 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:17:39 np0005593234 nova_compute[227762]: 2026-01-23 10:17:39.948 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:17:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:40.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:40 np0005593234 nova_compute[227762]: 2026-01-23 10:17:40.431 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:17:40 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3261356760' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:17:40 np0005593234 nova_compute[227762]: 2026-01-23 10:17:40.897 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:40 np0005593234 nova_compute[227762]: 2026-01-23 10:17:40.902 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:17:41 np0005593234 nova_compute[227762]: 2026-01-23 10:17:41.414 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:17:41 np0005593234 nova_compute[227762]: 2026-01-23 10:17:41.527 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:17:41 np0005593234 nova_compute[227762]: 2026-01-23 10:17:41.528 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.721s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:41.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:42 np0005593234 kernel: tapd384450c-fa (unregistering): left promiscuous mode
Jan 23 05:17:42 np0005593234 NetworkManager[48942]: <info>  [1769163462.0483] device (tapd384450c-fa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.058 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:42 np0005593234 ovn_controller[134547]: 2026-01-23T10:17:42Z|00593|binding|INFO|Releasing lport d384450c-fad9-4b71-a4a4-8f666b98276f from this chassis (sb_readonly=0)
Jan 23 05:17:42 np0005593234 ovn_controller[134547]: 2026-01-23T10:17:42Z|00594|binding|INFO|Setting lport d384450c-fad9-4b71-a4a4-8f666b98276f down in Southbound
Jan 23 05:17:42 np0005593234 ovn_controller[134547]: 2026-01-23T10:17:42Z|00595|binding|INFO|Removing iface tapd384450c-fa ovn-installed in OVS
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.073 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.073 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:5b:7f 10.100.0.4'], port_security=['fa:16:3e:62:5b:7f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '59fbb9c5-c8a9-4238-be6a-07598275a158', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '59cfb6a6a5ea438fb4b12029b4fcea0f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '86cc459d-07e9-4599-8119-e9daeae5f0bd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.237'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=989a90b0-402b-45c7-85bc-096f22ca1841, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=d384450c-fad9-4b71-a4a4-8f666b98276f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.075 144381 INFO neutron.agent.ovn.metadata.agent [-] Port d384450c-fad9-4b71-a4a4-8f666b98276f in datapath 8f36dc80-2fd9-4680-a74d-5f599bd98395 unbound from our chassis#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.077 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8f36dc80-2fd9-4680-a74d-5f599bd98395, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.079 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[342b9b88-90f4-4660-8744-f3ff2dd169ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.080 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395 namespace which is not needed anymore#033[00m
Jan 23 05:17:42 np0005593234 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000090.scope: Deactivated successfully.
Jan 23 05:17:42 np0005593234 systemd[1]: machine-qemu\x2d69\x2dinstance\x2d00000090.scope: Consumed 14.583s CPU time.
Jan 23 05:17:42 np0005593234 systemd-machined[195626]: Machine qemu-69-instance-00000090 terminated.
Jan 23 05:17:42 np0005593234 neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395[294146]: [NOTICE]   (294150) : haproxy version is 2.8.14-c23fe91
Jan 23 05:17:42 np0005593234 neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395[294146]: [NOTICE]   (294150) : path to executable is /usr/sbin/haproxy
Jan 23 05:17:42 np0005593234 neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395[294146]: [WARNING]  (294150) : Exiting Master process...
Jan 23 05:17:42 np0005593234 neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395[294146]: [ALERT]    (294150) : Current worker (294152) exited with code 143 (Terminated)
Jan 23 05:17:42 np0005593234 neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395[294146]: [WARNING]  (294150) : All workers exited. Exiting... (0)
Jan 23 05:17:42 np0005593234 systemd[1]: libpod-41e79cf3464acf99bc484ea4a9c8203c2b2badaf699f2cbef8d8f30e9e91b14c.scope: Deactivated successfully.
Jan 23 05:17:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:42.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:42 np0005593234 podman[294518]: 2026-01-23 10:17:42.209375234 +0000 UTC m=+0.045467765 container died 41e79cf3464acf99bc484ea4a9c8203c2b2badaf699f2cbef8d8f30e9e91b14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:17:42 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-41e79cf3464acf99bc484ea4a9c8203c2b2badaf699f2cbef8d8f30e9e91b14c-userdata-shm.mount: Deactivated successfully.
Jan 23 05:17:42 np0005593234 systemd[1]: var-lib-containers-storage-overlay-b0d8772cf921bc909322887ae7600757cecfb458652805a95dcbf09de8e51080-merged.mount: Deactivated successfully.
Jan 23 05:17:42 np0005593234 podman[294518]: 2026-01-23 10:17:42.24303708 +0000 UTC m=+0.079129611 container cleanup 41e79cf3464acf99bc484ea4a9c8203c2b2badaf699f2cbef8d8f30e9e91b14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:17:42 np0005593234 systemd[1]: libpod-conmon-41e79cf3464acf99bc484ea4a9c8203c2b2badaf699f2cbef8d8f30e9e91b14c.scope: Deactivated successfully.
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.296 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.302 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:42 np0005593234 podman[294549]: 2026-01-23 10:17:42.436667859 +0000 UTC m=+0.175313081 container remove 41e79cf3464acf99bc484ea4a9c8203c2b2badaf699f2cbef8d8f30e9e91b14c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.443 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3e528b16-d581-4f45-bc93-a03105a16051]: (4, ('Fri Jan 23 10:17:42 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395 (41e79cf3464acf99bc484ea4a9c8203c2b2badaf699f2cbef8d8f30e9e91b14c)\n41e79cf3464acf99bc484ea4a9c8203c2b2badaf699f2cbef8d8f30e9e91b14c\nFri Jan 23 10:17:42 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395 (41e79cf3464acf99bc484ea4a9c8203c2b2badaf699f2cbef8d8f30e9e91b14c)\n41e79cf3464acf99bc484ea4a9c8203c2b2badaf699f2cbef8d8f30e9e91b14c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.445 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bcd46ddf-ae68-446e-b0a7-ac9b688d3936]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.447 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f36dc80-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.449 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:42 np0005593234 kernel: tap8f36dc80-20: left promiscuous mode
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.466 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.469 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9b7fce27-651f-4f98-8feb-8f7db5a876dc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.484 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[24298056-f716-422c-ad65-06d47d96b28f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.485 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a795b63b-c5ea-4bc4-a47e-b23d5ce6d971]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.499 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6fa5ec1b-a278-4a5b-8d82-41d84569f6d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 732906, 'reachable_time': 31784, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294577, 'error': None, 'target': 'ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.502 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.503 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[7f2a2045-bf37-40ff-a4c9-d3be20c90922]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:42 np0005593234 systemd[1]: run-netns-ovnmeta\x2d8f36dc80\x2d2fd9\x2d4680\x2da74d\x2d5f599bd98395.mount: Deactivated successfully.
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.540 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.655 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.739 227766 INFO nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.743 227766 INFO nova.virt.libvirt.driver [-] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Instance destroyed successfully.#033[00m
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.749 227766 INFO nova.virt.libvirt.driver [-] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Instance destroyed successfully.#033[00m
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.749 227766 DEBUG nova.virt.libvirt.vif [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:16:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-1260279439',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-299074738',id=144,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIO8SoVPFpcE8uzIjQTRaXKTFlQPbF3ozpTg0KSfakFf4/i5eGAVsKo/QuxEFtXxl3uwg/ipbk/ufHC2kKF3qYN3uuybCD/TIi0ND+JRge+qnaUvHHmVHBBPdL11iJCEJQ==',key_name='tempest-keypair-657216728',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:17:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='59cfb6a6a5ea438fb4b12029b4fcea0f',ramdisk_id='',reservation_id='r-p45q8ss2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsV293TestJSON-1456789510',owner_user_name='tempest-ServerActionsV293TestJSON-1456789510-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:17:37Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='18f5dbf0e00d41b2b913cc1a517bc922',uuid=59fbb9c5-c8a9-4238-be6a-07598275a158,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.750 227766 DEBUG nova.network.os_vif_util [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Converting VIF {"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.751 227766 DEBUG nova.network.os_vif_util [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:5b:7f,bridge_name='br-int',has_traffic_filtering=True,id=d384450c-fad9-4b71-a4a4-8f666b98276f,network=Network(8f36dc80-2fd9-4680-a74d-5f599bd98395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd384450c-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.751 227766 DEBUG os_vif [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:5b:7f,bridge_name='br-int',has_traffic_filtering=True,id=d384450c-fad9-4b71-a4a4-8f666b98276f,network=Network(8f36dc80-2fd9-4680-a74d-5f599bd98395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd384450c-fa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.753 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.754 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd384450c-fa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.755 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.757 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:42 np0005593234 nova_compute[227762]: 2026-01-23 10:17:42.759 227766 INFO os_vif [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:5b:7f,bridge_name='br-int',has_traffic_filtering=True,id=d384450c-fad9-4b71-a4a4-8f666b98276f,network=Network(8f36dc80-2fd9-4680-a74d-5f599bd98395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd384450c-fa')#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.853 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.854 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:17:42.855 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:43 np0005593234 nova_compute[227762]: 2026-01-23 10:17:43.394 227766 INFO nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Deleting instance files /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158_del#033[00m
Jan 23 05:17:43 np0005593234 nova_compute[227762]: 2026-01-23 10:17:43.395 227766 INFO nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Deletion of /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158_del complete#033[00m
Jan 23 05:17:43 np0005593234 nova_compute[227762]: 2026-01-23 10:17:43.485 227766 DEBUG nova.compute.manager [req-f96126f9-d7c7-4382-afa7-f7799fe2b55e req-a78e9f19-7ad1-4e6f-92a3-8691df71970f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received event network-vif-unplugged-d384450c-fad9-4b71-a4a4-8f666b98276f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:17:43 np0005593234 nova_compute[227762]: 2026-01-23 10:17:43.486 227766 DEBUG oslo_concurrency.lockutils [req-f96126f9-d7c7-4382-afa7-f7799fe2b55e req-a78e9f19-7ad1-4e6f-92a3-8691df71970f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:43 np0005593234 nova_compute[227762]: 2026-01-23 10:17:43.486 227766 DEBUG oslo_concurrency.lockutils [req-f96126f9-d7c7-4382-afa7-f7799fe2b55e req-a78e9f19-7ad1-4e6f-92a3-8691df71970f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:43 np0005593234 nova_compute[227762]: 2026-01-23 10:17:43.486 227766 DEBUG oslo_concurrency.lockutils [req-f96126f9-d7c7-4382-afa7-f7799fe2b55e req-a78e9f19-7ad1-4e6f-92a3-8691df71970f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:43 np0005593234 nova_compute[227762]: 2026-01-23 10:17:43.486 227766 DEBUG nova.compute.manager [req-f96126f9-d7c7-4382-afa7-f7799fe2b55e req-a78e9f19-7ad1-4e6f-92a3-8691df71970f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] No waiting events found dispatching network-vif-unplugged-d384450c-fad9-4b71-a4a4-8f666b98276f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:17:43 np0005593234 nova_compute[227762]: 2026-01-23 10:17:43.486 227766 WARNING nova.compute.manager [req-f96126f9-d7c7-4382-afa7-f7799fe2b55e req-a78e9f19-7ad1-4e6f-92a3-8691df71970f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received unexpected event network-vif-unplugged-d384450c-fad9-4b71-a4a4-8f666b98276f for instance with vm_state active and task_state rebuilding.#033[00m
Jan 23 05:17:43 np0005593234 nova_compute[227762]: 2026-01-23 10:17:43.528 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:43.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:43 np0005593234 nova_compute[227762]: 2026-01-23 10:17:43.954 227766 WARNING nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] During detach_volume, instance disappeared.: nova.exception.InstanceNotFound: Instance 59fbb9c5-c8a9-4238-be6a-07598275a158 could not be found.#033[00m
Jan 23 05:17:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:17:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:44.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:17:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:45.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:45 np0005593234 nova_compute[227762]: 2026-01-23 10:17:45.698 227766 DEBUG nova.compute.manager [req-2af66e44-5fdf-4698-8366-09182f516997 req-8321bd1b-9d0b-43a2-ba43-7f7f4f26a9b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received event network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:17:45 np0005593234 nova_compute[227762]: 2026-01-23 10:17:45.698 227766 DEBUG oslo_concurrency.lockutils [req-2af66e44-5fdf-4698-8366-09182f516997 req-8321bd1b-9d0b-43a2-ba43-7f7f4f26a9b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:45 np0005593234 nova_compute[227762]: 2026-01-23 10:17:45.698 227766 DEBUG oslo_concurrency.lockutils [req-2af66e44-5fdf-4698-8366-09182f516997 req-8321bd1b-9d0b-43a2-ba43-7f7f4f26a9b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:45 np0005593234 nova_compute[227762]: 2026-01-23 10:17:45.699 227766 DEBUG oslo_concurrency.lockutils [req-2af66e44-5fdf-4698-8366-09182f516997 req-8321bd1b-9d0b-43a2-ba43-7f7f4f26a9b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:45 np0005593234 nova_compute[227762]: 2026-01-23 10:17:45.699 227766 DEBUG nova.compute.manager [req-2af66e44-5fdf-4698-8366-09182f516997 req-8321bd1b-9d0b-43a2-ba43-7f7f4f26a9b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] No waiting events found dispatching network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:17:45 np0005593234 nova_compute[227762]: 2026-01-23 10:17:45.699 227766 WARNING nova.compute.manager [req-2af66e44-5fdf-4698-8366-09182f516997 req-8321bd1b-9d0b-43a2-ba43-7f7f4f26a9b5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received unexpected event network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f for instance with vm_state active and task_state rebuilding.#033[00m
Jan 23 05:17:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:46.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:46 np0005593234 nova_compute[227762]: 2026-01-23 10:17:46.219 227766 DEBUG nova.compute.manager [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Preparing to wait for external event volume-reimaged-aee959cd-89dc-45e7-ba7b-58dc8568e292 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:17:46 np0005593234 nova_compute[227762]: 2026-01-23 10:17:46.220 227766 DEBUG oslo_concurrency.lockutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Acquiring lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:46 np0005593234 nova_compute[227762]: 2026-01-23 10:17:46.220 227766 DEBUG oslo_concurrency.lockutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:46 np0005593234 nova_compute[227762]: 2026-01-23 10:17:46.221 227766 DEBUG oslo_concurrency.lockutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:46 np0005593234 nova_compute[227762]: 2026-01-23 10:17:46.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:46 np0005593234 nova_compute[227762]: 2026-01-23 10:17:46.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:47 np0005593234 nova_compute[227762]: 2026-01-23 10:17:47.541 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:47.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:47 np0005593234 nova_compute[227762]: 2026-01-23 10:17:47.755 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:17:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:48.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:17:48 np0005593234 nova_compute[227762]: 2026-01-23 10:17:48.741 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:49.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:17:49 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4195139317' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:17:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:17:49 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4195139317' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:17:49 np0005593234 podman[294601]: 2026-01-23 10:17:49.760429426 +0000 UTC m=+0.047420195 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:17:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:50.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:50 np0005593234 nova_compute[227762]: 2026-01-23 10:17:50.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:50 np0005593234 nova_compute[227762]: 2026-01-23 10:17:50.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:17:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:51.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:51 np0005593234 nova_compute[227762]: 2026-01-23 10:17:51.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:52.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:52 np0005593234 nova_compute[227762]: 2026-01-23 10:17:52.542 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:52 np0005593234 nova_compute[227762]: 2026-01-23 10:17:52.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:52 np0005593234 nova_compute[227762]: 2026-01-23 10:17:52.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:17:52 np0005593234 nova_compute[227762]: 2026-01-23 10:17:52.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:17:52 np0005593234 nova_compute[227762]: 2026-01-23 10:17:52.757 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:53 np0005593234 nova_compute[227762]: 2026-01-23 10:17:53.128 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:17:53 np0005593234 nova_compute[227762]: 2026-01-23 10:17:53.129 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:17:53 np0005593234 nova_compute[227762]: 2026-01-23 10:17:53.129 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:17:53 np0005593234 nova_compute[227762]: 2026-01-23 10:17:53.129 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 633b85ea-a47c-4be0-b06d-388aa421728b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:17:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:53.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:54.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:17:54 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2440939990' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:17:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:17:54 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2440939990' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:17:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:17:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:55.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:17:55 np0005593234 podman[294673]: 2026-01-23 10:17:55.782352858 +0000 UTC m=+0.074210038 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:17:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:56.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.245 227766 DEBUG nova.compute.manager [req-bb049b55-e424-45d6-b770-2c5190b06b82 req-40842455-5ac6-4084-9293-7710c85ff6ce 56391f81dbdc4570a9c80a65db98922d 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received event volume-reimaged-aee959cd-89dc-45e7-ba7b-58dc8568e292 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.245 227766 DEBUG oslo_concurrency.lockutils [req-bb049b55-e424-45d6-b770-2c5190b06b82 req-40842455-5ac6-4084-9293-7710c85ff6ce 56391f81dbdc4570a9c80a65db98922d 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.245 227766 DEBUG oslo_concurrency.lockutils [req-bb049b55-e424-45d6-b770-2c5190b06b82 req-40842455-5ac6-4084-9293-7710c85ff6ce 56391f81dbdc4570a9c80a65db98922d 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.245 227766 DEBUG oslo_concurrency.lockutils [req-bb049b55-e424-45d6-b770-2c5190b06b82 req-40842455-5ac6-4084-9293-7710c85ff6ce 56391f81dbdc4570a9c80a65db98922d 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.246 227766 DEBUG nova.compute.manager [req-bb049b55-e424-45d6-b770-2c5190b06b82 req-40842455-5ac6-4084-9293-7710c85ff6ce 56391f81dbdc4570a9c80a65db98922d 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Processing event volume-reimaged-aee959cd-89dc-45e7-ba7b-58dc8568e292 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.246 227766 DEBUG nova.compute.manager [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Instance event wait completed in 8 seconds for volume-reimaged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.319 227766 INFO nova.virt.block_device [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Booting with volume aee959cd-89dc-45e7-ba7b-58dc8568e292 at /dev/vda#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.616 227766 DEBUG os_brick.utils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.618 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.629 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.629 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[29344417-09f7-45bd-b319-dfa1d8f55315]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.631 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.639 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.640 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[e39440d7-1534-4ad7-a9d4-8fdade86bf26]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.641 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.650 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.650 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[9eb24c55-5b51-4b22-b2b1-1b22d5b09410]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.652 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[e7cb82e4-9c73-413b-b0e2-3b17f665f7f6]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.652 227766 DEBUG oslo_concurrency.processutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.676 227766 DEBUG oslo_concurrency.processutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CMD "nvme version" returned: 0 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.679 227766 DEBUG os_brick.initiator.connectors.lightos [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.680 227766 DEBUG os_brick.initiator.connectors.lightos [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.680 227766 DEBUG os_brick.initiator.connectors.lightos [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.680 227766 DEBUG os_brick.utils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] <== get_connector_properties: return (64ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.681 227766 DEBUG nova.virt.block_device [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Updating existing volume attachment record: 5e83a544-6ca3-4c1e-beb8-65d5969306e7 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.950 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updating instance_info_cache with network_info: [{"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.973 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.973 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:17:56 np0005593234 nova_compute[227762]: 2026-01-23 10:17:56.974 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:17:57 np0005593234 nova_compute[227762]: 2026-01-23 10:17:57.314 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163462.312478, 59fbb9c5-c8a9-4238-be6a-07598275a158 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:17:57 np0005593234 nova_compute[227762]: 2026-01-23 10:17:57.314 227766 INFO nova.compute.manager [-] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:17:57 np0005593234 nova_compute[227762]: 2026-01-23 10:17:57.350 227766 DEBUG nova.compute.manager [None req-470636ea-b978-4354-9a0a-4eae520d0ca7 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:17:57 np0005593234 nova_compute[227762]: 2026-01-23 10:17:57.544 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:17:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:57.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:17:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:17:57 np0005593234 nova_compute[227762]: 2026-01-23 10:17:57.759 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:58.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.299 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.300 227766 INFO nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Creating image(s)#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.300 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.301 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Ensure instance console log exists: /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.301 227766 DEBUG oslo_concurrency.lockutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.302 227766 DEBUG oslo_concurrency.lockutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.302 227766 DEBUG oslo_concurrency.lockutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.307 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Start _get_guest_xml network_info=[{"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'boot_index': 0, 'mount_device': '/dev/vda', 'delete_on_termination': True, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-aee959cd-89dc-45e7-ba7b-58dc8568e292', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'aee959cd-89dc-45e7-ba7b-58dc8568e292', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '59fbb9c5-c8a9-4238-be6a-07598275a158', 'attached_at': '', 'detached_at': '', 'volume_id': 'aee959cd-89dc-45e7-ba7b-58dc8568e292', 'serial': 'aee959cd-89dc-45e7-ba7b-58dc8568e292'}, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '5e83a544-6ca3-4c1e-beb8-65d5969306e7', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.313 227766 WARNING nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.324 227766 DEBUG nova.virt.libvirt.host [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.324 227766 DEBUG nova.virt.libvirt.host [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.329 227766 DEBUG nova.virt.libvirt.host [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.330 227766 DEBUG nova.virt.libvirt.host [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.332 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.333 227766 DEBUG nova.virt.hardware [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.333 227766 DEBUG nova.virt.hardware [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.334 227766 DEBUG nova.virt.hardware [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.334 227766 DEBUG nova.virt.hardware [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.335 227766 DEBUG nova.virt.hardware [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.335 227766 DEBUG nova.virt.hardware [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.336 227766 DEBUG nova.virt.hardware [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.336 227766 DEBUG nova.virt.hardware [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.337 227766 DEBUG nova.virt.hardware [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.337 227766 DEBUG nova.virt.hardware [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.337 227766 DEBUG nova.virt.hardware [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.338 227766 DEBUG nova.objects.instance [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lazy-loading 'vcpu_model' on Instance uuid 59fbb9c5-c8a9-4238-be6a-07598275a158 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.394 227766 DEBUG nova.storage.rbd_utils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] rbd image 59fbb9c5-c8a9-4238-be6a-07598275a158_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.398 227766 DEBUG oslo_concurrency.processutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:17:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/39492881' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.850 227766 DEBUG oslo_concurrency.processutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.901 227766 DEBUG nova.virt.libvirt.vif [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:16:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-1260279439',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-299074738',id=144,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIO8SoVPFpcE8uzIjQTRaXKTFlQPbF3ozpTg0KSfakFf4/i5eGAVsKo/QuxEFtXxl3uwg/ipbk/ufHC2kKF3qYN3uuybCD/TIi0ND+JRge+qnaUvHHmVHBBPdL11iJCEJQ==',key_name='tempest-keypair-657216728',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:17:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='59cfb6a6a5ea438fb4b12029b4fcea0f',ramdisk_id='',reservation_id='r-p45q8ss2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_typ
e='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsV293TestJSON-1456789510',owner_user_name='tempest-ServerActionsV293TestJSON-1456789510-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:17:56Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='18f5dbf0e00d41b2b913cc1a517bc922',uuid=59fbb9c5-c8a9-4238-be6a-07598275a158,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.902 227766 DEBUG nova.network.os_vif_util [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Converting VIF {"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.903 227766 DEBUG nova.network.os_vif_util [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:5b:7f,bridge_name='br-int',has_traffic_filtering=True,id=d384450c-fad9-4b71-a4a4-8f666b98276f,network=Network(8f36dc80-2fd9-4680-a74d-5f599bd98395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd384450c-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.906 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  <uuid>59fbb9c5-c8a9-4238-be6a-07598275a158</uuid>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  <name>instance-00000090</name>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerActionsV293TestJSON-server-1260279439</nova:name>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:17:58</nova:creationTime>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <nova:user uuid="18f5dbf0e00d41b2b913cc1a517bc922">tempest-ServerActionsV293TestJSON-1456789510-project-member</nova:user>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <nova:project uuid="59cfb6a6a5ea438fb4b12029b4fcea0f">tempest-ServerActionsV293TestJSON-1456789510</nova:project>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <nova:port uuid="d384450c-fad9-4b71-a4a4-8f666b98276f">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <entry name="serial">59fbb9c5-c8a9-4238-be6a-07598275a158</entry>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <entry name="uuid">59fbb9c5-c8a9-4238-be6a-07598275a158</entry>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/59fbb9c5-c8a9-4238-be6a-07598275a158_disk.config">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="volumes/volume-aee959cd-89dc-45e7-ba7b-58dc8568e292">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <serial>aee959cd-89dc-45e7-ba7b-58dc8568e292</serial>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:62:5b:7f"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <target dev="tapd384450c-fa"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/console.log" append="off"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:17:58 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:17:58 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:17:58 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:17:58 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.907 227766 DEBUG nova.virt.libvirt.vif [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:16:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-1260279439',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-299074738',id=144,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIO8SoVPFpcE8uzIjQTRaXKTFlQPbF3ozpTg0KSfakFf4/i5eGAVsKo/QuxEFtXxl3uwg/ipbk/ufHC2kKF3qYN3uuybCD/TIi0ND+JRge+qnaUvHHmVHBBPdL11iJCEJQ==',key_name='tempest-keypair-657216728',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:17:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='59cfb6a6a5ea438fb4b12029b4fcea0f',ramdisk_id='',reservation_id='r-p45q8ss2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsV293TestJSON-1456789510',owner_user_name='tempest-ServerActionsV293TestJSON-1456789510-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:17:56Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='18f5dbf0e00d41b2b913cc1a517bc922',uuid=59fbb9c5-c8a9-4238-be6a-07598275a158,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.907 227766 DEBUG nova.network.os_vif_util [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Converting VIF {"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.908 227766 DEBUG nova.network.os_vif_util [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:5b:7f,bridge_name='br-int',has_traffic_filtering=True,id=d384450c-fad9-4b71-a4a4-8f666b98276f,network=Network(8f36dc80-2fd9-4680-a74d-5f599bd98395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd384450c-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.908 227766 DEBUG os_vif [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:5b:7f,bridge_name='br-int',has_traffic_filtering=True,id=d384450c-fad9-4b71-a4a4-8f666b98276f,network=Network(8f36dc80-2fd9-4680-a74d-5f599bd98395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd384450c-fa') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.909 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.909 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.910 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.913 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.913 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd384450c-fa, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.914 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd384450c-fa, col_values=(('external_ids', {'iface-id': 'd384450c-fad9-4b71-a4a4-8f666b98276f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:62:5b:7f', 'vm-uuid': '59fbb9c5-c8a9-4238-be6a-07598275a158'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.916 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:58 np0005593234 NetworkManager[48942]: <info>  [1769163478.9168] manager: (tapd384450c-fa): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/290)
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.918 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.922 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:17:58 np0005593234 nova_compute[227762]: 2026-01-23 10:17:58.923 227766 INFO os_vif [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:5b:7f,bridge_name='br-int',has_traffic_filtering=True,id=d384450c-fad9-4b71-a4a4-8f666b98276f,network=Network(8f36dc80-2fd9-4680-a74d-5f599bd98395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd384450c-fa')#033[00m
Jan 23 05:17:59 np0005593234 nova_compute[227762]: 2026-01-23 10:17:59.085 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:17:59 np0005593234 nova_compute[227762]: 2026-01-23 10:17:59.085 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:17:59 np0005593234 nova_compute[227762]: 2026-01-23 10:17:59.086 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] No VIF found with MAC fa:16:3e:62:5b:7f, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:17:59 np0005593234 nova_compute[227762]: 2026-01-23 10:17:59.086 227766 INFO nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Using config drive#033[00m
Jan 23 05:17:59 np0005593234 nova_compute[227762]: 2026-01-23 10:17:59.111 227766 DEBUG nova.storage.rbd_utils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] rbd image 59fbb9c5-c8a9-4238-be6a-07598275a158_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:17:59 np0005593234 nova_compute[227762]: 2026-01-23 10:17:59.143 227766 DEBUG nova.objects.instance [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lazy-loading 'ec2_ids' on Instance uuid 59fbb9c5-c8a9-4238-be6a-07598275a158 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:17:59 np0005593234 nova_compute[227762]: 2026-01-23 10:17:59.176 227766 DEBUG nova.objects.instance [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lazy-loading 'keypairs' on Instance uuid 59fbb9c5-c8a9-4238-be6a-07598275a158 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:17:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:17:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:17:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:59.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:17:59 np0005593234 nova_compute[227762]: 2026-01-23 10:17:59.617 227766 INFO nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Creating config drive at /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/disk.config#033[00m
Jan 23 05:17:59 np0005593234 nova_compute[227762]: 2026-01-23 10:17:59.623 227766 DEBUG oslo_concurrency.processutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbjuyhoc_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:59 np0005593234 nova_compute[227762]: 2026-01-23 10:17:59.753 227766 DEBUG oslo_concurrency.processutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbjuyhoc_" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:59 np0005593234 nova_compute[227762]: 2026-01-23 10:17:59.784 227766 DEBUG nova.storage.rbd_utils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] rbd image 59fbb9c5-c8a9-4238-be6a-07598275a158_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:17:59 np0005593234 nova_compute[227762]: 2026-01-23 10:17:59.788 227766 DEBUG oslo_concurrency.processutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/disk.config 59fbb9c5-c8a9-4238-be6a-07598275a158_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:17:59 np0005593234 nova_compute[227762]: 2026-01-23 10:17:59.959 227766 DEBUG oslo_concurrency.processutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/disk.config 59fbb9c5-c8a9-4238-be6a-07598275a158_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:17:59 np0005593234 nova_compute[227762]: 2026-01-23 10:17:59.959 227766 INFO nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Deleting local config drive /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158/disk.config because it was imported into RBD.#033[00m
Jan 23 05:18:00 np0005593234 kernel: tapd384450c-fa: entered promiscuous mode
Jan 23 05:18:00 np0005593234 NetworkManager[48942]: <info>  [1769163480.0079] manager: (tapd384450c-fa): new Tun device (/org/freedesktop/NetworkManager/Devices/291)
Jan 23 05:18:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:00Z|00596|binding|INFO|Claiming lport d384450c-fad9-4b71-a4a4-8f666b98276f for this chassis.
Jan 23 05:18:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:00Z|00597|binding|INFO|d384450c-fad9-4b71-a4a4-8f666b98276f: Claiming fa:16:3e:62:5b:7f 10.100.0.4
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.009 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:00Z|00598|binding|INFO|Setting lport d384450c-fad9-4b71-a4a4-8f666b98276f ovn-installed in OVS
Jan 23 05:18:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:00Z|00599|binding|INFO|Setting lport d384450c-fad9-4b71-a4a4-8f666b98276f up in Southbound
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.023 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.025 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:5b:7f 10.100.0.4'], port_security=['fa:16:3e:62:5b:7f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '59fbb9c5-c8a9-4238-be6a-07598275a158', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '59cfb6a6a5ea438fb4b12029b4fcea0f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '86cc459d-07e9-4599-8119-e9daeae5f0bd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.237'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=989a90b0-402b-45c7-85bc-096f22ca1841, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=d384450c-fad9-4b71-a4a4-8f666b98276f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.026 144381 INFO neutron.agent.ovn.metadata.agent [-] Port d384450c-fad9-4b71-a4a4-8f666b98276f in datapath 8f36dc80-2fd9-4680-a74d-5f599bd98395 bound to our chassis#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.027 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.027 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8f36dc80-2fd9-4680-a74d-5f599bd98395#033[00m
Jan 23 05:18:00 np0005593234 systemd-udevd[294822]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.038 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3e33d5bd-d8c5-4c0e-b61f-71c73cbc6bc1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.038 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8f36dc80-21 in ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:18:00 np0005593234 systemd-machined[195626]: New machine qemu-70-instance-00000090.
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.040 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8f36dc80-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.040 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[67a06b8f-3e09-453d-a90f-459f3b803296]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.041 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d2ed833e-66ed-4e51-9f84-58b341133072]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 NetworkManager[48942]: <info>  [1769163480.0479] device (tapd384450c-fa): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:18:00 np0005593234 NetworkManager[48942]: <info>  [1769163480.0489] device (tapd384450c-fa): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.052 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[d3383705-20c4-4d78-9f06-8aeafa898897]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 systemd[1]: Started Virtual Machine qemu-70-instance-00000090.
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.075 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5235df85-2f7f-4452-bd21-560be2a6263c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.103 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[03289785-466b-4a57-9135-9d053645b7dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 systemd-udevd[294825]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.110 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8201850a-14bc-49e6-9e4f-c9316d5d696b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 NetworkManager[48942]: <info>  [1769163480.1110] manager: (tap8f36dc80-20): new Veth device (/org/freedesktop/NetworkManager/Devices/292)
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.140 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9e02273d-5226-4063-a25c-117911c31c47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.142 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e49dbebf-9771-4670-9231-f42272c9436f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 NetworkManager[48942]: <info>  [1769163480.1617] device (tap8f36dc80-20): carrier: link connected
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.165 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[092aeb4e-8f6c-406d-a895-8b83d55de8f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.180 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3c4cecd9-6719-47c4-a924-5106ef950832]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f36dc80-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:48:d3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 737512, 'reachable_time': 30748, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 294855, 'error': None, 'target': 'ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.195 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[02659823-3e30-4803-856c-d1f2c716fb6b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:48d3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 737512, 'tstamp': 737512}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 294856, 'error': None, 'target': 'ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.211 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b70ca6c2-351e-4a6a-9494-1311cf3d87d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8f36dc80-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:48:d3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 190], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 737512, 'reachable_time': 30748, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 294857, 'error': None, 'target': 'ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:00.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.241 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a02db0-2bac-480b-8884-7830b67466fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.296 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2d5d04a3-cf40-490a-a975-ea342eafe7d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.298 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f36dc80-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.298 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.298 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8f36dc80-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:00 np0005593234 kernel: tap8f36dc80-20: entered promiscuous mode
Jan 23 05:18:00 np0005593234 NetworkManager[48942]: <info>  [1769163480.3014] manager: (tap8f36dc80-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/293)
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.300 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.303 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8f36dc80-20, col_values=(('external_ids', {'iface-id': '2dbd8718-7ffd-46bd-89c9-1311fab1c368'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.304 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:00Z|00600|binding|INFO|Releasing lport 2dbd8718-7ffd-46bd-89c9-1311fab1c368 from this chassis (sb_readonly=0)
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.320 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.321 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8f36dc80-2fd9-4680-a74d-5f599bd98395.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8f36dc80-2fd9-4680-a74d-5f599bd98395.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.322 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d54abab7-ae17-4980-961c-f6962f57d6ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.323 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-8f36dc80-2fd9-4680-a74d-5f599bd98395
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/8f36dc80-2fd9-4680-a74d-5f599bd98395.pid.haproxy
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 8f36dc80-2fd9-4680-a74d-5f599bd98395
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:18:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:00.323 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'env', 'PROCESS_TAG=haproxy-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8f36dc80-2fd9-4680-a74d-5f599bd98395.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.513 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163480.5128143, 59fbb9c5-c8a9-4238-be6a-07598275a158 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.513 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.516 227766 DEBUG nova.compute.manager [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.516 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.520 227766 INFO nova.virt.libvirt.driver [-] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Instance spawned successfully.#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.520 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.555 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.558 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.573 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.573 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.574 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.574 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.574 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.575 227766 DEBUG nova.virt.libvirt.driver [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.636 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.637 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163480.5156431, 59fbb9c5-c8a9-4238-be6a-07598275a158 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.637 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] VM Started (Lifecycle Event)#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.667 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.672 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.699 227766 DEBUG nova.compute.manager [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:18:00 np0005593234 podman[294931]: 2026-01-23 10:18:00.717830269 +0000 UTC m=+0.060038978 container create c7bf45d4bb11390d29f2452d4389c4ffd43dc6551974b643a148d85dd4f5a85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.745 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 05:18:00 np0005593234 systemd[1]: Started libpod-conmon-c7bf45d4bb11390d29f2452d4389c4ffd43dc6551974b643a148d85dd4f5a85c.scope.
Jan 23 05:18:00 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:18:00 np0005593234 podman[294931]: 2026-01-23 10:18:00.688503927 +0000 UTC m=+0.030712636 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:18:00 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d6d926bede6dcc692c13ee54a09c6d93b8f474a9434e2d11ff154c258795599/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.791 227766 DEBUG oslo_concurrency.lockutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.792 227766 DEBUG oslo_concurrency.lockutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.793 227766 DEBUG nova.objects.instance [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 05:18:00 np0005593234 podman[294931]: 2026-01-23 10:18:00.805805413 +0000 UTC m=+0.148014092 container init c7bf45d4bb11390d29f2452d4389c4ffd43dc6551974b643a148d85dd4f5a85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:18:00 np0005593234 podman[294931]: 2026-01-23 10:18:00.811093067 +0000 UTC m=+0.153301746 container start c7bf45d4bb11390d29f2452d4389c4ffd43dc6551974b643a148d85dd4f5a85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 05:18:00 np0005593234 neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395[294946]: [NOTICE]   (294950) : New worker (294952) forked
Jan 23 05:18:00 np0005593234 neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395[294946]: [NOTICE]   (294950) : Loading success.
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.925 227766 DEBUG oslo_concurrency.lockutils [None req-bb049b55-e424-45d6-b770-2c5190b06b82 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.947 227766 DEBUG nova.compute.manager [req-446dbee6-ec81-4ffe-984c-355b3e6290e3 req-c6b3ceb2-fa80-4e89-8567-1a0aa115eb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received event network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.948 227766 DEBUG oslo_concurrency.lockutils [req-446dbee6-ec81-4ffe-984c-355b3e6290e3 req-c6b3ceb2-fa80-4e89-8567-1a0aa115eb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.948 227766 DEBUG oslo_concurrency.lockutils [req-446dbee6-ec81-4ffe-984c-355b3e6290e3 req-c6b3ceb2-fa80-4e89-8567-1a0aa115eb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.948 227766 DEBUG oslo_concurrency.lockutils [req-446dbee6-ec81-4ffe-984c-355b3e6290e3 req-c6b3ceb2-fa80-4e89-8567-1a0aa115eb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.948 227766 DEBUG nova.compute.manager [req-446dbee6-ec81-4ffe-984c-355b3e6290e3 req-c6b3ceb2-fa80-4e89-8567-1a0aa115eb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] No waiting events found dispatching network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:18:00 np0005593234 nova_compute[227762]: 2026-01-23 10:18:00.949 227766 WARNING nova.compute.manager [req-446dbee6-ec81-4ffe-984c-355b3e6290e3 req-c6b3ceb2-fa80-4e89-8567-1a0aa115eb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received unexpected event network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f for instance with vm_state active and task_state None.
Jan 23 05:18:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:01.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #130. Immutable memtables: 0.
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.047612) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 81] Flushing memtable with next log file: 130
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163482047672, "job": 81, "event": "flush_started", "num_memtables": 1, "num_entries": 1388, "num_deletes": 257, "total_data_size": 2977565, "memory_usage": 3017136, "flush_reason": "Manual Compaction"}
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 81] Level-0 flush table #131: started
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163482061119, "cf_name": "default", "job": 81, "event": "table_file_creation", "file_number": 131, "file_size": 1954264, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 62918, "largest_seqno": 64301, "table_properties": {"data_size": 1948456, "index_size": 3074, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12831, "raw_average_key_size": 19, "raw_value_size": 1936579, "raw_average_value_size": 2970, "num_data_blocks": 136, "num_entries": 652, "num_filter_entries": 652, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163372, "oldest_key_time": 1769163372, "file_creation_time": 1769163482, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 131, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 81] Flush lasted 13611 microseconds, and 4948 cpu microseconds.
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.061222) [db/flush_job.cc:967] [default] [JOB 81] Level-0 flush table #131: 1954264 bytes OK
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.061245) [db/memtable_list.cc:519] [default] Level-0 commit table #131 started
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.062899) [db/memtable_list.cc:722] [default] Level-0 commit table #131: memtable #1 done
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.062912) EVENT_LOG_v1 {"time_micros": 1769163482062908, "job": 81, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.062928) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 81] Try to delete WAL files size 2970964, prev total WAL file size 2970964, number of live WAL files 2.
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000127.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.063682) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032323539' seq:72057594037927935, type:22 .. '6C6F676D0032353132' seq:0, type:0; will stop at (end)
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 82] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 81 Base level 0, inputs: [131(1908KB)], [129(8829KB)]
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163482063811, "job": 82, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [131], "files_L6": [129], "score": -1, "input_data_size": 10996167, "oldest_snapshot_seqno": -1}
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 82] Generated table #132: 8300 keys, 10860180 bytes, temperature: kUnknown
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163482133179, "cf_name": "default", "job": 82, "event": "table_file_creation", "file_number": 132, "file_size": 10860180, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10807277, "index_size": 31030, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20805, "raw_key_size": 219076, "raw_average_key_size": 26, "raw_value_size": 10661902, "raw_average_value_size": 1284, "num_data_blocks": 1189, "num_entries": 8300, "num_filter_entries": 8300, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769163482, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 132, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.133455) [db/compaction/compaction_job.cc:1663] [default] [JOB 82] Compacted 1@0 + 1@6 files to L6 => 10860180 bytes
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.134690) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 158.3 rd, 156.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 8.6 +0.0 blob) out(10.4 +0.0 blob), read-write-amplify(11.2) write-amplify(5.6) OK, records in: 8827, records dropped: 527 output_compression: NoCompression
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.134706) EVENT_LOG_v1 {"time_micros": 1769163482134699, "job": 82, "event": "compaction_finished", "compaction_time_micros": 69470, "compaction_time_cpu_micros": 25276, "output_level": 6, "num_output_files": 1, "total_output_size": 10860180, "num_input_records": 8827, "num_output_records": 8300, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000131.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163482135220, "job": 82, "event": "table_file_deletion", "file_number": 131}
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000129.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163482136852, "job": 82, "event": "table_file_deletion", "file_number": 129}
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.063557) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.136948) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.136952) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.136953) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.136955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:18:02.136956) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:18:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:18:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:02.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:18:02 np0005593234 nova_compute[227762]: 2026-01-23 10:18:02.546 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:02 np0005593234 nova_compute[227762]: 2026-01-23 10:18:02.968 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:18:03 np0005593234 nova_compute[227762]: 2026-01-23 10:18:03.145 227766 DEBUG nova.compute.manager [req-4ea976a3-9b22-41ba-a6f2-cc101d664269 req-da142516-f672-4f5e-804c-27e9d884c9e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received event network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:18:03 np0005593234 nova_compute[227762]: 2026-01-23 10:18:03.145 227766 DEBUG oslo_concurrency.lockutils [req-4ea976a3-9b22-41ba-a6f2-cc101d664269 req-da142516-f672-4f5e-804c-27e9d884c9e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:18:03 np0005593234 nova_compute[227762]: 2026-01-23 10:18:03.145 227766 DEBUG oslo_concurrency.lockutils [req-4ea976a3-9b22-41ba-a6f2-cc101d664269 req-da142516-f672-4f5e-804c-27e9d884c9e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:18:03 np0005593234 nova_compute[227762]: 2026-01-23 10:18:03.146 227766 DEBUG oslo_concurrency.lockutils [req-4ea976a3-9b22-41ba-a6f2-cc101d664269 req-da142516-f672-4f5e-804c-27e9d884c9e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:18:03 np0005593234 nova_compute[227762]: 2026-01-23 10:18:03.146 227766 DEBUG nova.compute.manager [req-4ea976a3-9b22-41ba-a6f2-cc101d664269 req-da142516-f672-4f5e-804c-27e9d884c9e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] No waiting events found dispatching network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:18:03 np0005593234 nova_compute[227762]: 2026-01-23 10:18:03.146 227766 WARNING nova.compute.manager [req-4ea976a3-9b22-41ba-a6f2-cc101d664269 req-da142516-f672-4f5e-804c-27e9d884c9e3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received unexpected event network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f for instance with vm_state active and task_state None.
Jan 23 05:18:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:03.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:03 np0005593234 nova_compute[227762]: 2026-01-23 10:18:03.916 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:18:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:04.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:18:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:05.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:18:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:06.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:18:07 np0005593234 nova_compute[227762]: 2026-01-23 10:18:07.549 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:07.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:08.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:08 np0005593234 nova_compute[227762]: 2026-01-23 10:18:08.918 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:09.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:10.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:11.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:18:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:12.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:18:12 np0005593234 nova_compute[227762]: 2026-01-23 10:18:12.551 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:13.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:13 np0005593234 nova_compute[227762]: 2026-01-23 10:18:13.922 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:14.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:14 np0005593234 nova_compute[227762]: 2026-01-23 10:18:14.586 227766 DEBUG nova.compute.manager [req-bb26b0be-ef0a-4925-9b7e-8549134ff680 req-7ff89c83-b083-4e7b-afff-23e537774b7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Received event network-changed-2957b316-2d74-4b52-bfc9-52a2c5b56c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:18:14 np0005593234 nova_compute[227762]: 2026-01-23 10:18:14.587 227766 DEBUG nova.compute.manager [req-bb26b0be-ef0a-4925-9b7e-8549134ff680 req-7ff89c83-b083-4e7b-afff-23e537774b7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Refreshing instance network info cache due to event network-changed-2957b316-2d74-4b52-bfc9-52a2c5b56c01. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:18:14 np0005593234 nova_compute[227762]: 2026-01-23 10:18:14.587 227766 DEBUG oslo_concurrency.lockutils [req-bb26b0be-ef0a-4925-9b7e-8549134ff680 req-7ff89c83-b083-4e7b-afff-23e537774b7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:18:14 np0005593234 nova_compute[227762]: 2026-01-23 10:18:14.587 227766 DEBUG oslo_concurrency.lockutils [req-bb26b0be-ef0a-4925-9b7e-8549134ff680 req-7ff89c83-b083-4e7b-afff-23e537774b7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:18:14 np0005593234 nova_compute[227762]: 2026-01-23 10:18:14.587 227766 DEBUG nova.network.neutron [req-bb26b0be-ef0a-4925-9b7e-8549134ff680 req-7ff89c83-b083-4e7b-afff-23e537774b7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Refreshing network info cache for port 2957b316-2d74-4b52-bfc9-52a2c5b56c01 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:18:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:14Z|00065|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:62:5b:7f 10.100.0.4
Jan 23 05:18:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:14Z|00066|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:62:5b:7f 10.100.0.4
Jan 23 05:18:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:15.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:16.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:17 np0005593234 nova_compute[227762]: 2026-01-23 10:18:17.553 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:17.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:18.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:18 np0005593234 nova_compute[227762]: 2026-01-23 10:18:18.898 227766 DEBUG nova.network.neutron [req-bb26b0be-ef0a-4925-9b7e-8549134ff680 req-7ff89c83-b083-4e7b-afff-23e537774b7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updated VIF entry in instance network info cache for port 2957b316-2d74-4b52-bfc9-52a2c5b56c01. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:18:18 np0005593234 nova_compute[227762]: 2026-01-23 10:18:18.899 227766 DEBUG nova.network.neutron [req-bb26b0be-ef0a-4925-9b7e-8549134ff680 req-7ff89c83-b083-4e7b-afff-23e537774b7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updating instance_info_cache with network_info: [{"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:18:18 np0005593234 nova_compute[227762]: 2026-01-23 10:18:18.925 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:18 np0005593234 nova_compute[227762]: 2026-01-23 10:18:18.932 227766 DEBUG oslo_concurrency.lockutils [req-bb26b0be-ef0a-4925-9b7e-8549134ff680 req-7ff89c83-b083-4e7b-afff-23e537774b7b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-633b85ea-a47c-4be0-b06d-388aa421728b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:18:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:19.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:20.224 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=54, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=53) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:18:20 np0005593234 nova_compute[227762]: 2026-01-23 10:18:20.225 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:20.227 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:18:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:20.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:20 np0005593234 podman[295022]: 2026-01-23 10:18:20.759323955 +0000 UTC m=+0.048472878 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:18:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:21.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:22.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:22 np0005593234 nova_compute[227762]: 2026-01-23 10:18:22.554 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 05:18:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:23.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 05:18:23 np0005593234 nova_compute[227762]: 2026-01-23 10:18:23.927 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:24.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:18:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:25.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:25 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:18:25 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:18:25 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:18:25 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:18:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:26.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:26Z|00601|binding|INFO|Releasing lport 2dbd8718-7ffd-46bd-89c9-1311fab1c368 from this chassis (sb_readonly=0)
Jan 23 05:18:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:26Z|00602|binding|INFO|Releasing lport 2c16e447-27d9-4516-bf23-ec948f375c10 from this chassis (sb_readonly=0)
Jan 23 05:18:26 np0005593234 nova_compute[227762]: 2026-01-23 10:18:26.426 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:26 np0005593234 podman[295175]: 2026-01-23 10:18:26.795543549 +0000 UTC m=+0.085311503 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 23 05:18:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:27.228 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '54'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:18:27 np0005593234 nova_compute[227762]: 2026-01-23 10:18:27.555 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:27.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:28.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:28 np0005593234 nova_compute[227762]: 2026-01-23 10:18:28.931 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:18:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:29.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:18:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:30.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:31.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:31 np0005593234 nova_compute[227762]: 2026-01-23 10:18:31.921 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:32 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:18:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:32.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:32 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:18:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:18:32 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.444 227766 DEBUG oslo_concurrency.lockutils [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "633b85ea-a47c-4be0-b06d-388aa421728b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.444 227766 DEBUG oslo_concurrency.lockutils [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.444 227766 DEBUG oslo_concurrency.lockutils [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "633b85ea-a47c-4be0-b06d-388aa421728b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.445 227766 DEBUG oslo_concurrency.lockutils [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.445 227766 DEBUG oslo_concurrency.lockutils [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.446 227766 INFO nova.compute.manager [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Terminating instance
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.447 227766 DEBUG nova.compute.manager [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 05:18:32 np0005593234 kernel: tap2957b316-2d (unregistering): left promiscuous mode
Jan 23 05:18:32 np0005593234 NetworkManager[48942]: <info>  [1769163512.5145] device (tap2957b316-2d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:18:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:32Z|00603|binding|INFO|Releasing lport 2957b316-2d74-4b52-bfc9-52a2c5b56c01 from this chassis (sb_readonly=0)
Jan 23 05:18:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:32Z|00604|binding|INFO|Setting lport 2957b316-2d74-4b52-bfc9-52a2c5b56c01 down in Southbound
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.525 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:32Z|00605|binding|INFO|Removing iface tap2957b316-2d ovn-installed in OVS
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.527 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.539 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.586 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.588 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:c7:7e 10.100.0.12'], port_security=['fa:16:3e:89:c7:7e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '633b85ea-a47c-4be0-b06d-388aa421728b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ae621f21a8e438fb95152309b38cee5', 'neutron:revision_number': '6', 'neutron:security_group_ids': '3b0a0b41-45a8-4582-a4d2-a9aff1f1a18c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5888498-07d6-4c96-95ee-546974eebd82, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=2957b316-2d74-4b52-bfc9-52a2c5b56c01) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.589 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 2957b316-2d74-4b52-bfc9-52a2c5b56c01 in datapath f98d79de-4a23-4f29-9848-c5d4c5683a5d unbound from our chassis
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.593 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f98d79de-4a23-4f29-9848-c5d4c5683a5d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.594 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e08b08e9-30de-4b72-b81d-b9d682db60c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.596 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d namespace which is not needed anymore
Jan 23 05:18:32 np0005593234 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000008c.scope: Deactivated successfully.
Jan 23 05:18:32 np0005593234 systemd[1]: machine-qemu\x2d68\x2dinstance\x2d0000008c.scope: Consumed 23.716s CPU time.
Jan 23 05:18:32 np0005593234 systemd-machined[195626]: Machine qemu-68-instance-0000008c terminated.
Jan 23 05:18:32 np0005593234 kernel: tap2957b316-2d: entered promiscuous mode
Jan 23 05:18:32 np0005593234 NetworkManager[48942]: <info>  [1769163512.6648] manager: (tap2957b316-2d): new Tun device (/org/freedesktop/NetworkManager/Devices/294)
Jan 23 05:18:32 np0005593234 kernel: tap2957b316-2d (unregistering): left promiscuous mode
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.672 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:32Z|00606|binding|INFO|Claiming lport 2957b316-2d74-4b52-bfc9-52a2c5b56c01 for this chassis.
Jan 23 05:18:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:32Z|00607|binding|INFO|2957b316-2d74-4b52-bfc9-52a2c5b56c01: Claiming fa:16:3e:89:c7:7e 10.100.0.12
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.684 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:c7:7e 10.100.0.12'], port_security=['fa:16:3e:89:c7:7e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '633b85ea-a47c-4be0-b06d-388aa421728b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ae621f21a8e438fb95152309b38cee5', 'neutron:revision_number': '6', 'neutron:security_group_ids': '3b0a0b41-45a8-4582-a4d2-a9aff1f1a18c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5888498-07d6-4c96-95ee-546974eebd82, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=2957b316-2d74-4b52-bfc9-52a2c5b56c01) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:18:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:32Z|00608|binding|INFO|Setting lport 2957b316-2d74-4b52-bfc9-52a2c5b56c01 ovn-installed in OVS
Jan 23 05:18:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:32Z|00609|binding|INFO|Setting lport 2957b316-2d74-4b52-bfc9-52a2c5b56c01 up in Southbound
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.699 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:32Z|00610|binding|INFO|Releasing lport 2957b316-2d74-4b52-bfc9-52a2c5b56c01 from this chassis (sb_readonly=1)
Jan 23 05:18:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:32Z|00611|if_status|INFO|Dropped 4 log messages in last 799 seconds (most recently, 799 seconds ago) due to excessive rate
Jan 23 05:18:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:32Z|00612|if_status|INFO|Not setting lport 2957b316-2d74-4b52-bfc9-52a2c5b56c01 down as sb is readonly
Jan 23 05:18:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:32Z|00613|binding|INFO|Removing iface tap2957b316-2d ovn-installed in OVS
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.702 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:32Z|00614|binding|INFO|Releasing lport 2957b316-2d74-4b52-bfc9-52a2c5b56c01 from this chassis (sb_readonly=0)
Jan 23 05:18:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:32Z|00615|binding|INFO|Setting lport 2957b316-2d74-4b52-bfc9-52a2c5b56c01 down in Southbound
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.705 227766 INFO nova.virt.libvirt.driver [-] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Instance destroyed successfully.
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.706 227766 DEBUG nova.objects.instance [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lazy-loading 'resources' on Instance uuid 633b85ea-a47c-4be0-b06d-388aa421728b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.713 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.721 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:89:c7:7e 10.100.0.12'], port_security=['fa:16:3e:89:c7:7e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '633b85ea-a47c-4be0-b06d-388aa421728b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ae621f21a8e438fb95152309b38cee5', 'neutron:revision_number': '6', 'neutron:security_group_ids': '3b0a0b41-45a8-4582-a4d2-a9aff1f1a18c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e5888498-07d6-4c96-95ee-546974eebd82, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=2957b316-2d74-4b52-bfc9-52a2c5b56c01) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:18:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.728 227766 DEBUG nova.virt.libvirt.vif [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:14:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-TestInstancesWithCinderVolumes-server-205659850',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testinstanceswithcindervolumes-server-205659850',id=140,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPuMczToXGmZUNyxG5fVGeV6xaoJVOpQ6Lh9dx5t6v22bv4xalVGQLUjYNEpg7ajkuOU/WHiNfvMhffjZHY/YojnQQYOX+q0GTa9+NPbkGDFf1XELa+vTNvIe6ZV8CwP9g==',key_name='tempest-TestInstancesWithCinderVolumes-232096272',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:14:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ae621f21a8e438fb95152309b38cee5',ramdisk_id='',reservation_id='r-rpa9cnu8',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestInstancesWithCinderVolumes-565485208',owner_user_name='tempest-TestInstancesWithCinderVolumes-565485208-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:14:54Z,user_data=None,user_id='95ac13194f0940128d42af3d45d130fa',uuid=633b85ea-a47c-4be0-b06d-388aa421728b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.729 227766 DEBUG nova.network.os_vif_util [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Converting VIF {"id": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "address": "fa:16:3e:89:c7:7e", "network": {"id": "f98d79de-4a23-4f29-9848-c5d4c5683a5d", "bridge": "br-int", "label": "tempest-TestInstancesWithCinderVolumes-1507431135-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ae621f21a8e438fb95152309b38cee5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2957b316-2d", "ovs_interfaceid": "2957b316-2d74-4b52-bfc9-52a2c5b56c01", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.729 227766 DEBUG nova.network.os_vif_util [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:89:c7:7e,bridge_name='br-int',has_traffic_filtering=True,id=2957b316-2d74-4b52-bfc9-52a2c5b56c01,network=Network(f98d79de-4a23-4f29-9848-c5d4c5683a5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2957b316-2d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.730 227766 DEBUG os_vif [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:c7:7e,bridge_name='br-int',has_traffic_filtering=True,id=2957b316-2d74-4b52-bfc9-52a2c5b56c01,network=Network(f98d79de-4a23-4f29-9848-c5d4c5683a5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2957b316-2d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.731 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.732 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2957b316-2d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.733 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.736 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.738 227766 INFO os_vif [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:89:c7:7e,bridge_name='br-int',has_traffic_filtering=True,id=2957b316-2d74-4b52-bfc9-52a2c5b56c01,network=Network(f98d79de-4a23-4f29-9848-c5d4c5683a5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2957b316-2d')#033[00m
Jan 23 05:18:32 np0005593234 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[292387]: [NOTICE]   (292391) : haproxy version is 2.8.14-c23fe91
Jan 23 05:18:32 np0005593234 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[292387]: [NOTICE]   (292391) : path to executable is /usr/sbin/haproxy
Jan 23 05:18:32 np0005593234 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[292387]: [WARNING]  (292391) : Exiting Master process...
Jan 23 05:18:32 np0005593234 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[292387]: [ALERT]    (292391) : Current worker (292393) exited with code 143 (Terminated)
Jan 23 05:18:32 np0005593234 neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d[292387]: [WARNING]  (292391) : All workers exited. Exiting... (0)
Jan 23 05:18:32 np0005593234 systemd[1]: libpod-aa6b1ec4b90d38ca76ec3aaca7ac8c3b7ae78610187fae9eaca8d61fd7ef4f38.scope: Deactivated successfully.
Jan 23 05:18:32 np0005593234 podman[295334]: 2026-01-23 10:18:32.753311316 +0000 UTC m=+0.049358106 container died aa6b1ec4b90d38ca76ec3aaca7ac8c3b7ae78610187fae9eaca8d61fd7ef4f38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:18:32 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aa6b1ec4b90d38ca76ec3aaca7ac8c3b7ae78610187fae9eaca8d61fd7ef4f38-userdata-shm.mount: Deactivated successfully.
Jan 23 05:18:32 np0005593234 systemd[1]: var-lib-containers-storage-overlay-3aa1dd529c3ccbaab298f2eb778eabfaf5ca007f4f142a37e078d0fc7a1999f9-merged.mount: Deactivated successfully.
Jan 23 05:18:32 np0005593234 podman[295334]: 2026-01-23 10:18:32.802221016 +0000 UTC m=+0.098267806 container cleanup aa6b1ec4b90d38ca76ec3aaca7ac8c3b7ae78610187fae9eaca8d61fd7ef4f38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:18:32 np0005593234 systemd[1]: libpod-conmon-aa6b1ec4b90d38ca76ec3aaca7ac8c3b7ae78610187fae9eaca8d61fd7ef4f38.scope: Deactivated successfully.
Jan 23 05:18:32 np0005593234 podman[295385]: 2026-01-23 10:18:32.86569304 +0000 UTC m=+0.039744247 container remove aa6b1ec4b90d38ca76ec3aaca7ac8c3b7ae78610187fae9eaca8d61fd7ef4f38 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.871 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bfd6330f-d53a-4d47-8a20-d3f164e23d42]: (4, ('Fri Jan 23 10:18:32 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d (aa6b1ec4b90d38ca76ec3aaca7ac8c3b7ae78610187fae9eaca8d61fd7ef4f38)\naa6b1ec4b90d38ca76ec3aaca7ac8c3b7ae78610187fae9eaca8d61fd7ef4f38\nFri Jan 23 10:18:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d (aa6b1ec4b90d38ca76ec3aaca7ac8c3b7ae78610187fae9eaca8d61fd7ef4f38)\naa6b1ec4b90d38ca76ec3aaca7ac8c3b7ae78610187fae9eaca8d61fd7ef4f38\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.873 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[efeeb1ac-f351-435d-b0ac-bc7f79c16fa0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.874 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf98d79de-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:32 np0005593234 kernel: tapf98d79de-40: left promiscuous mode
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.877 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.890 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.893 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1e7efa7d-bfc1-4972-8065-61ebc062265a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.909 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3f91e164-97ff-488f-9ecc-ba11cd472cc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.910 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ae4a5e3c-85fd-4cc5-a21e-39da231a2e4f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.925 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[906c804f-6df8-4854-a1cd-0565313d5fd9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 718863, 'reachable_time': 28188, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295402, 'error': None, 'target': 'ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:32 np0005593234 systemd[1]: run-netns-ovnmeta\x2df98d79de\x2d4a23\x2d4f29\x2d9848\x2dc5d4c5683a5d.mount: Deactivated successfully.
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.929 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f98d79de-4a23-4f29-9848-c5d4c5683a5d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.930 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[a91ff908-2638-40c5-ad40-f54e3cc92122]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.931 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 2957b316-2d74-4b52-bfc9-52a2c5b56c01 in datapath f98d79de-4a23-4f29-9848-c5d4c5683a5d unbound from our chassis#033[00m
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.933 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f98d79de-4a23-4f29-9848-c5d4c5683a5d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.933 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cd6811c3-bb04-48b9-8e31-1607fc15dc20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.934 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 2957b316-2d74-4b52-bfc9-52a2c5b56c01 in datapath f98d79de-4a23-4f29-9848-c5d4c5683a5d unbound from our chassis#033[00m
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.935 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f98d79de-4a23-4f29-9848-c5d4c5683a5d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:18:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:32.936 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6b38be52-57ef-413b-8b48-48e0d83e3b85]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.968 227766 INFO nova.virt.libvirt.driver [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Deleting instance files /var/lib/nova/instances/633b85ea-a47c-4be0-b06d-388aa421728b_del#033[00m
Jan 23 05:18:32 np0005593234 nova_compute[227762]: 2026-01-23 10:18:32.969 227766 INFO nova.virt.libvirt.driver [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Deletion of /var/lib/nova/instances/633b85ea-a47c-4be0-b06d-388aa421728b_del complete#033[00m
Jan 23 05:18:33 np0005593234 nova_compute[227762]: 2026-01-23 10:18:33.078 227766 INFO nova.compute.manager [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Took 0.63 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:18:33 np0005593234 nova_compute[227762]: 2026-01-23 10:18:33.079 227766 DEBUG oslo.service.loopingcall [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:18:33 np0005593234 nova_compute[227762]: 2026-01-23 10:18:33.079 227766 DEBUG nova.compute.manager [-] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:18:33 np0005593234 nova_compute[227762]: 2026-01-23 10:18:33.079 227766 DEBUG nova.network.neutron [-] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:18:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:33.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:34 np0005593234 nova_compute[227762]: 2026-01-23 10:18:34.110 227766 DEBUG nova.network.neutron [-] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:18:34 np0005593234 nova_compute[227762]: 2026-01-23 10:18:34.135 227766 INFO nova.compute.manager [-] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Took 1.06 seconds to deallocate network for instance.#033[00m
Jan 23 05:18:34 np0005593234 nova_compute[227762]: 2026-01-23 10:18:34.229 227766 DEBUG nova.compute.manager [req-76fab88c-478b-488d-bb7a-815b53fb1bd6 req-a00812e4-50b4-4923-8f8d-6344ce777737 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Received event network-vif-deleted-2957b316-2d74-4b52-bfc9-52a2c5b56c01 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:18:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:18:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:34.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:18:34 np0005593234 nova_compute[227762]: 2026-01-23 10:18:34.478 227766 INFO nova.compute.manager [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Took 0.34 seconds to detach 1 volumes for instance.#033[00m
Jan 23 05:18:34 np0005593234 nova_compute[227762]: 2026-01-23 10:18:34.583 227766 DEBUG oslo_concurrency.lockutils [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:34 np0005593234 nova_compute[227762]: 2026-01-23 10:18:34.584 227766 DEBUG oslo_concurrency.lockutils [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:34 np0005593234 nova_compute[227762]: 2026-01-23 10:18:34.737 227766 DEBUG oslo_concurrency.processutils [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:18:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:18:35 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2653520427' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.193 227766 DEBUG oslo_concurrency.processutils [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.199 227766 DEBUG nova.compute.provider_tree [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.225 227766 DEBUG nova.scheduler.client.report [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.263 227766 DEBUG oslo_concurrency.lockutils [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.352 227766 INFO nova.scheduler.client.report [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Deleted allocations for instance 633b85ea-a47c-4be0-b06d-388aa421728b#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.429 227766 DEBUG oslo_concurrency.lockutils [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Acquiring lock "59fbb9c5-c8a9-4238-be6a-07598275a158" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.429 227766 DEBUG oslo_concurrency.lockutils [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.430 227766 DEBUG oslo_concurrency.lockutils [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Acquiring lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.430 227766 DEBUG oslo_concurrency.lockutils [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.430 227766 DEBUG oslo_concurrency.lockutils [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.431 227766 INFO nova.compute.manager [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Terminating instance#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.433 227766 DEBUG nova.compute.manager [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:18:35 np0005593234 kernel: tapd384450c-fa (unregistering): left promiscuous mode
Jan 23 05:18:35 np0005593234 NetworkManager[48942]: <info>  [1769163515.4840] device (tapd384450c-fa): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.490 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:35 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:35Z|00616|binding|INFO|Releasing lport d384450c-fad9-4b71-a4a4-8f666b98276f from this chassis (sb_readonly=0)
Jan 23 05:18:35 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:35Z|00617|binding|INFO|Setting lport d384450c-fad9-4b71-a4a4-8f666b98276f down in Southbound
Jan 23 05:18:35 np0005593234 ovn_controller[134547]: 2026-01-23T10:18:35Z|00618|binding|INFO|Removing iface tapd384450c-fa ovn-installed in OVS
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.493 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.507 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.514 227766 DEBUG oslo_concurrency.lockutils [None req-aeea3c7b-837e-45d2-a2b5-e3d7d8a3ebc1 95ac13194f0940128d42af3d45d130fa 3ae621f21a8e438fb95152309b38cee5 - - default default] Lock "633b85ea-a47c-4be0-b06d-388aa421728b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:35.514 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:62:5b:7f 10.100.0.4'], port_security=['fa:16:3e:62:5b:7f 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '59fbb9c5-c8a9-4238-be6a-07598275a158', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '59cfb6a6a5ea438fb4b12029b4fcea0f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '86cc459d-07e9-4599-8119-e9daeae5f0bd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.237', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=989a90b0-402b-45c7-85bc-096f22ca1841, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=d384450c-fad9-4b71-a4a4-8f666b98276f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:18:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:35.515 144381 INFO neutron.agent.ovn.metadata.agent [-] Port d384450c-fad9-4b71-a4a4-8f666b98276f in datapath 8f36dc80-2fd9-4680-a74d-5f599bd98395 unbound from our chassis#033[00m
Jan 23 05:18:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:35.517 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8f36dc80-2fd9-4680-a74d-5f599bd98395, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:18:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:35.517 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fea85c1d-04a2-4031-9e50-e7f190a71fc7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:35.518 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395 namespace which is not needed anymore#033[00m
Jan 23 05:18:35 np0005593234 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000090.scope: Deactivated successfully.
Jan 23 05:18:35 np0005593234 systemd[1]: machine-qemu\x2d70\x2dinstance\x2d00000090.scope: Consumed 14.168s CPU time.
Jan 23 05:18:35 np0005593234 systemd-machined[195626]: Machine qemu-70-instance-00000090 terminated.
Jan 23 05:18:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:35.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:35 np0005593234 NetworkManager[48942]: <info>  [1769163515.6507] manager: (tapd384450c-fa): new Tun device (/org/freedesktop/NetworkManager/Devices/295)
Jan 23 05:18:35 np0005593234 neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395[294946]: [NOTICE]   (294950) : haproxy version is 2.8.14-c23fe91
Jan 23 05:18:35 np0005593234 neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395[294946]: [NOTICE]   (294950) : path to executable is /usr/sbin/haproxy
Jan 23 05:18:35 np0005593234 neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395[294946]: [WARNING]  (294950) : Exiting Master process...
Jan 23 05:18:35 np0005593234 neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395[294946]: [ALERT]    (294950) : Current worker (294952) exited with code 143 (Terminated)
Jan 23 05:18:35 np0005593234 neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395[294946]: [WARNING]  (294950) : All workers exited. Exiting... (0)
Jan 23 05:18:35 np0005593234 systemd[1]: libpod-c7bf45d4bb11390d29f2452d4389c4ffd43dc6551974b643a148d85dd4f5a85c.scope: Deactivated successfully.
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.666 227766 INFO nova.virt.libvirt.driver [-] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Instance destroyed successfully.#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.667 227766 DEBUG nova.objects.instance [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lazy-loading 'resources' on Instance uuid 59fbb9c5-c8a9-4238-be6a-07598275a158 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:18:35 np0005593234 podman[295451]: 2026-01-23 10:18:35.670632016 +0000 UTC m=+0.053214935 container died c7bf45d4bb11390d29f2452d4389c4ffd43dc6551974b643a148d85dd4f5a85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.686 227766 DEBUG nova.virt.libvirt.vif [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:16:47Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-ServerActionsV293TestJSON-server-1260279439',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serveractionsv293testjson-server-299074738',id=144,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIO8SoVPFpcE8uzIjQTRaXKTFlQPbF3ozpTg0KSfakFf4/i5eGAVsKo/QuxEFtXxl3uwg/ipbk/ufHC2kKF3qYN3uuybCD/TIi0ND+JRge+qnaUvHHmVHBBPdL11iJCEJQ==',key_name='tempest-keypair-657216728',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={rebuild='server'},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='59cfb6a6a5ea438fb4b12029b4fcea0f',ramdisk_id='',reservation_id='r-p45q8ss2',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='vi
rtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerActionsV293TestJSON-1456789510',owner_user_name='tempest-ServerActionsV293TestJSON-1456789510-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:00Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='18f5dbf0e00d41b2b913cc1a517bc922',uuid=59fbb9c5-c8a9-4238-be6a-07598275a158,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.688 227766 DEBUG nova.network.os_vif_util [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Converting VIF {"id": "d384450c-fad9-4b71-a4a4-8f666b98276f", "address": "fa:16:3e:62:5b:7f", "network": {"id": "8f36dc80-2fd9-4680-a74d-5f599bd98395", "bridge": "br-int", "label": "tempest-ServerActionsV293TestJSON-1747630754-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "59cfb6a6a5ea438fb4b12029b4fcea0f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd384450c-fa", "ovs_interfaceid": "d384450c-fad9-4b71-a4a4-8f666b98276f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.689 227766 DEBUG nova.network.os_vif_util [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:62:5b:7f,bridge_name='br-int',has_traffic_filtering=True,id=d384450c-fad9-4b71-a4a4-8f666b98276f,network=Network(8f36dc80-2fd9-4680-a74d-5f599bd98395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd384450c-fa') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.689 227766 DEBUG os_vif [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:5b:7f,bridge_name='br-int',has_traffic_filtering=True,id=d384450c-fad9-4b71-a4a4-8f666b98276f,network=Network(8f36dc80-2fd9-4680-a74d-5f599bd98395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd384450c-fa') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.690 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.691 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd384450c-fa, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.692 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.694 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.696 227766 INFO os_vif [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:62:5b:7f,bridge_name='br-int',has_traffic_filtering=True,id=d384450c-fad9-4b71-a4a4-8f666b98276f,network=Network(8f36dc80-2fd9-4680-a74d-5f599bd98395),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd384450c-fa')#033[00m
Jan 23 05:18:35 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c7bf45d4bb11390d29f2452d4389c4ffd43dc6551974b643a148d85dd4f5a85c-userdata-shm.mount: Deactivated successfully.
Jan 23 05:18:35 np0005593234 systemd[1]: var-lib-containers-storage-overlay-3d6d926bede6dcc692c13ee54a09c6d93b8f474a9434e2d11ff154c258795599-merged.mount: Deactivated successfully.
Jan 23 05:18:35 np0005593234 podman[295451]: 2026-01-23 10:18:35.716376687 +0000 UTC m=+0.098959576 container cleanup c7bf45d4bb11390d29f2452d4389c4ffd43dc6551974b643a148d85dd4f5a85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:18:35 np0005593234 systemd[1]: libpod-conmon-c7bf45d4bb11390d29f2452d4389c4ffd43dc6551974b643a148d85dd4f5a85c.scope: Deactivated successfully.
Jan 23 05:18:35 np0005593234 podman[295503]: 2026-01-23 10:18:35.772719349 +0000 UTC m=+0.036862087 container remove c7bf45d4bb11390d29f2452d4389c4ffd43dc6551974b643a148d85dd4f5a85c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:18:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:35.778 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[813cb95c-f535-40d5-b45b-0415a36bdb14]: (4, ('Fri Jan 23 10:18:35 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395 (c7bf45d4bb11390d29f2452d4389c4ffd43dc6551974b643a148d85dd4f5a85c)\nc7bf45d4bb11390d29f2452d4389c4ffd43dc6551974b643a148d85dd4f5a85c\nFri Jan 23 10:18:35 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395 (c7bf45d4bb11390d29f2452d4389c4ffd43dc6551974b643a148d85dd4f5a85c)\nc7bf45d4bb11390d29f2452d4389c4ffd43dc6551974b643a148d85dd4f5a85c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:35.779 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[26450d46-7937-4241-834f-51dc6dfdb8f1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:35.781 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8f36dc80-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.782 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:35 np0005593234 kernel: tap8f36dc80-20: left promiscuous mode
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.796 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:35.799 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1d4e005e-8d9c-4dec-ac9c-8dc761b1ceeb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:35.815 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[af8754f0-5b6d-410f-9531-991d4aa97d3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:35.816 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[73dfb0ad-68cd-4779-b5fa-9300567a312b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:35.831 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bf339024-5830-4d2c-9379-f2237e8e3d8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 737505, 'reachable_time': 42386, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 295522, 'error': None, 'target': 'ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:35 np0005593234 systemd[1]: run-netns-ovnmeta\x2d8f36dc80\x2d2fd9\x2d4680\x2da74d\x2d5f599bd98395.mount: Deactivated successfully.
Jan 23 05:18:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:35.834 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8f36dc80-2fd9-4680-a74d-5f599bd98395 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:18:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:35.835 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[c5ce6b5c-98a0-4230-b50b-ed2819706ecc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.911 227766 INFO nova.virt.libvirt.driver [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Deleting instance files /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158_del#033[00m
Jan 23 05:18:35 np0005593234 nova_compute[227762]: 2026-01-23 10:18:35.911 227766 INFO nova.virt.libvirt.driver [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Deletion of /var/lib/nova/instances/59fbb9c5-c8a9-4238-be6a-07598275a158_del complete#033[00m
Jan 23 05:18:36 np0005593234 nova_compute[227762]: 2026-01-23 10:18:36.022 227766 DEBUG nova.compute.manager [req-d5da2a42-db34-4138-bbf8-d60f1a0970cc req-7d58cf96-32fa-45ca-bc0d-bc2994441400 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received event network-vif-unplugged-d384450c-fad9-4b71-a4a4-8f666b98276f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:18:36 np0005593234 nova_compute[227762]: 2026-01-23 10:18:36.022 227766 DEBUG oslo_concurrency.lockutils [req-d5da2a42-db34-4138-bbf8-d60f1a0970cc req-7d58cf96-32fa-45ca-bc0d-bc2994441400 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:36 np0005593234 nova_compute[227762]: 2026-01-23 10:18:36.022 227766 DEBUG oslo_concurrency.lockutils [req-d5da2a42-db34-4138-bbf8-d60f1a0970cc req-7d58cf96-32fa-45ca-bc0d-bc2994441400 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:36 np0005593234 nova_compute[227762]: 2026-01-23 10:18:36.023 227766 DEBUG oslo_concurrency.lockutils [req-d5da2a42-db34-4138-bbf8-d60f1a0970cc req-7d58cf96-32fa-45ca-bc0d-bc2994441400 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:36 np0005593234 nova_compute[227762]: 2026-01-23 10:18:36.023 227766 DEBUG nova.compute.manager [req-d5da2a42-db34-4138-bbf8-d60f1a0970cc req-7d58cf96-32fa-45ca-bc0d-bc2994441400 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] No waiting events found dispatching network-vif-unplugged-d384450c-fad9-4b71-a4a4-8f666b98276f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:18:36 np0005593234 nova_compute[227762]: 2026-01-23 10:18:36.023 227766 DEBUG nova.compute.manager [req-d5da2a42-db34-4138-bbf8-d60f1a0970cc req-7d58cf96-32fa-45ca-bc0d-bc2994441400 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received event network-vif-unplugged-d384450c-fad9-4b71-a4a4-8f666b98276f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:18:36 np0005593234 nova_compute[227762]: 2026-01-23 10:18:36.029 227766 INFO nova.compute.manager [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Took 0.60 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:18:36 np0005593234 nova_compute[227762]: 2026-01-23 10:18:36.030 227766 DEBUG oslo.service.loopingcall [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:18:36 np0005593234 nova_compute[227762]: 2026-01-23 10:18:36.030 227766 DEBUG nova.compute.manager [-] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:18:36 np0005593234 nova_compute[227762]: 2026-01-23 10:18:36.030 227766 DEBUG nova.network.neutron [-] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:18:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:36.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:37 np0005593234 nova_compute[227762]: 2026-01-23 10:18:37.587 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:18:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:37.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:18:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:18:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3854041957' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:18:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:18:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3854041957' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:18:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:38.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:38 np0005593234 nova_compute[227762]: 2026-01-23 10:18:38.698 227766 DEBUG nova.compute.manager [req-24001707-449e-4641-9f72-153a4d6db099 req-6bd35931-84ef-4e88-a440-d7b028146df7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received event network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:18:38 np0005593234 nova_compute[227762]: 2026-01-23 10:18:38.699 227766 DEBUG oslo_concurrency.lockutils [req-24001707-449e-4641-9f72-153a4d6db099 req-6bd35931-84ef-4e88-a440-d7b028146df7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:38 np0005593234 nova_compute[227762]: 2026-01-23 10:18:38.699 227766 DEBUG oslo_concurrency.lockutils [req-24001707-449e-4641-9f72-153a4d6db099 req-6bd35931-84ef-4e88-a440-d7b028146df7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:38 np0005593234 nova_compute[227762]: 2026-01-23 10:18:38.699 227766 DEBUG oslo_concurrency.lockutils [req-24001707-449e-4641-9f72-153a4d6db099 req-6bd35931-84ef-4e88-a440-d7b028146df7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:38 np0005593234 nova_compute[227762]: 2026-01-23 10:18:38.699 227766 DEBUG nova.compute.manager [req-24001707-449e-4641-9f72-153a4d6db099 req-6bd35931-84ef-4e88-a440-d7b028146df7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] No waiting events found dispatching network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:18:38 np0005593234 nova_compute[227762]: 2026-01-23 10:18:38.700 227766 WARNING nova.compute.manager [req-24001707-449e-4641-9f72-153a4d6db099 req-6bd35931-84ef-4e88-a440-d7b028146df7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received unexpected event network-vif-plugged-d384450c-fad9-4b71-a4a4-8f666b98276f for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:18:38 np0005593234 nova_compute[227762]: 2026-01-23 10:18:38.966 227766 DEBUG nova.network.neutron [-] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:18:38 np0005593234 nova_compute[227762]: 2026-01-23 10:18:38.989 227766 INFO nova.compute.manager [-] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Took 2.96 seconds to deallocate network for instance.#033[00m
Jan 23 05:18:39 np0005593234 nova_compute[227762]: 2026-01-23 10:18:39.122 227766 DEBUG nova.compute.manager [req-867ed263-0531-4da4-a56c-54ffde833e47 req-d48b91a7-1e64-46f7-b437-ca770ad65c1f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Received event network-vif-deleted-d384450c-fad9-4b71-a4a4-8f666b98276f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:18:39 np0005593234 nova_compute[227762]: 2026-01-23 10:18:39.313 227766 INFO nova.compute.manager [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Took 0.32 seconds to detach 1 volumes for instance.#033[00m
Jan 23 05:18:39 np0005593234 nova_compute[227762]: 2026-01-23 10:18:39.314 227766 DEBUG nova.compute.manager [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Deleting volume: aee959cd-89dc-45e7-ba7b-58dc8568e292 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 23 05:18:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:39.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:39 np0005593234 nova_compute[227762]: 2026-01-23 10:18:39.670 227766 DEBUG oslo_concurrency.lockutils [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:39 np0005593234 nova_compute[227762]: 2026-01-23 10:18:39.671 227766 DEBUG oslo_concurrency.lockutils [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:39 np0005593234 nova_compute[227762]: 2026-01-23 10:18:39.875 227766 DEBUG oslo_concurrency.processutils [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:18:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:40.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:18:40 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1063256803' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:18:40 np0005593234 nova_compute[227762]: 2026-01-23 10:18:40.344 227766 DEBUG oslo_concurrency.processutils [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:18:40 np0005593234 nova_compute[227762]: 2026-01-23 10:18:40.350 227766 DEBUG nova.compute.provider_tree [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:18:40 np0005593234 nova_compute[227762]: 2026-01-23 10:18:40.421 227766 DEBUG nova.scheduler.client.report [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:18:40 np0005593234 nova_compute[227762]: 2026-01-23 10:18:40.466 227766 DEBUG oslo_concurrency.lockutils [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.795s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:40 np0005593234 nova_compute[227762]: 2026-01-23 10:18:40.596 227766 INFO nova.scheduler.client.report [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Deleted allocations for instance 59fbb9c5-c8a9-4238-be6a-07598275a158#033[00m
Jan 23 05:18:40 np0005593234 nova_compute[227762]: 2026-01-23 10:18:40.693 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:40 np0005593234 nova_compute[227762]: 2026-01-23 10:18:40.706 227766 DEBUG oslo_concurrency.lockutils [None req-3ed3ab32-0749-400d-92ac-41259243d99f 18f5dbf0e00d41b2b913cc1a517bc922 59cfb6a6a5ea438fb4b12029b4fcea0f - - default default] Lock "59fbb9c5-c8a9-4238-be6a-07598275a158" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.277s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:40 np0005593234 nova_compute[227762]: 2026-01-23 10:18:40.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:40 np0005593234 nova_compute[227762]: 2026-01-23 10:18:40.772 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:40 np0005593234 nova_compute[227762]: 2026-01-23 10:18:40.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:40 np0005593234 nova_compute[227762]: 2026-01-23 10:18:40.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:40 np0005593234 nova_compute[227762]: 2026-01-23 10:18:40.774 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:18:40 np0005593234 nova_compute[227762]: 2026-01-23 10:18:40.774 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:18:40 np0005593234 nova_compute[227762]: 2026-01-23 10:18:40.813 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:18:41 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1428374408' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:18:41 np0005593234 nova_compute[227762]: 2026-01-23 10:18:41.305 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.531s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:18:41 np0005593234 nova_compute[227762]: 2026-01-23 10:18:41.465 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:18:41 np0005593234 nova_compute[227762]: 2026-01-23 10:18:41.466 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4387MB free_disk=20.94278335571289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:18:41 np0005593234 nova_compute[227762]: 2026-01-23 10:18:41.466 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:41 np0005593234 nova_compute[227762]: 2026-01-23 10:18:41.467 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:41 np0005593234 nova_compute[227762]: 2026-01-23 10:18:41.569 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:18:41 np0005593234 nova_compute[227762]: 2026-01-23 10:18:41.569 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:18:41 np0005593234 nova_compute[227762]: 2026-01-23 10:18:41.616 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:18:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:18:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:41.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:18:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:42.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:18:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2476475721' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:18:42 np0005593234 nova_compute[227762]: 2026-01-23 10:18:42.426 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.809s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:18:42 np0005593234 nova_compute[227762]: 2026-01-23 10:18:42.431 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:18:42 np0005593234 nova_compute[227762]: 2026-01-23 10:18:42.452 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:18:42 np0005593234 nova_compute[227762]: 2026-01-23 10:18:42.480 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:18:42 np0005593234 nova_compute[227762]: 2026-01-23 10:18:42.481 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.014s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:42 np0005593234 nova_compute[227762]: 2026-01-23 10:18:42.590 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e309 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:42.854 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:42.855 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:18:42.855 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:43.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:44.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:44 np0005593234 nova_compute[227762]: 2026-01-23 10:18:44.482 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:18:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3253091716' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:18:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:18:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3253091716' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:18:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:45.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:45 np0005593234 nova_compute[227762]: 2026-01-23 10:18:45.694 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e310 e310: 3 total, 3 up, 3 in
Jan 23 05:18:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:18:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:46.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:18:47 np0005593234 nova_compute[227762]: 2026-01-23 10:18:47.593 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:47.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:47 np0005593234 nova_compute[227762]: 2026-01-23 10:18:47.701 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163512.6987846, 633b85ea-a47c-4be0-b06d-388aa421728b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:18:47 np0005593234 nova_compute[227762]: 2026-01-23 10:18:47.701 227766 INFO nova.compute.manager [-] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:18:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e310 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:47 np0005593234 nova_compute[227762]: 2026-01-23 10:18:47.739 227766 DEBUG nova.compute.manager [None req-8f170d21-cdaf-4f4e-ad17-8ac26124d8d4 - - - - - -] [instance: 633b85ea-a47c-4be0-b06d-388aa421728b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:18:47 np0005593234 nova_compute[227762]: 2026-01-23 10:18:47.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:47 np0005593234 nova_compute[227762]: 2026-01-23 10:18:47.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:18:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:48.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:18:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:49.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:50.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:50 np0005593234 nova_compute[227762]: 2026-01-23 10:18:50.665 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163515.6631875, 59fbb9c5-c8a9-4238-be6a-07598275a158 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:18:50 np0005593234 nova_compute[227762]: 2026-01-23 10:18:50.665 227766 INFO nova.compute.manager [-] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:18:50 np0005593234 nova_compute[227762]: 2026-01-23 10:18:50.694 227766 DEBUG nova.compute.manager [None req-3019461a-eb2e-46a2-93bb-d0150cc6a5e0 - - - - - -] [instance: 59fbb9c5-c8a9-4238-be6a-07598275a158] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:18:50 np0005593234 nova_compute[227762]: 2026-01-23 10:18:50.696 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:50 np0005593234 nova_compute[227762]: 2026-01-23 10:18:50.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:50 np0005593234 nova_compute[227762]: 2026-01-23 10:18:50.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:18:51 np0005593234 nova_compute[227762]: 2026-01-23 10:18:51.307 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:51 np0005593234 podman[295625]: 2026-01-23 10:18:51.472077888 +0000 UTC m=+0.049445338 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 05:18:51 np0005593234 nova_compute[227762]: 2026-01-23 10:18:51.518 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a0be3878-0750-42e1-8219-5dd9d4b3412c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:51 np0005593234 nova_compute[227762]: 2026-01-23 10:18:51.518 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:51 np0005593234 nova_compute[227762]: 2026-01-23 10:18:51.521 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:51 np0005593234 nova_compute[227762]: 2026-01-23 10:18:51.555 227766 DEBUG nova.compute.manager [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:18:51 np0005593234 nova_compute[227762]: 2026-01-23 10:18:51.655 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:51 np0005593234 nova_compute[227762]: 2026-01-23 10:18:51.656 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:51 np0005593234 nova_compute[227762]: 2026-01-23 10:18:51.666 227766 DEBUG nova.virt.hardware [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:18:51 np0005593234 nova_compute[227762]: 2026-01-23 10:18:51.666 227766 INFO nova.compute.claims [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:18:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:51.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:51 np0005593234 nova_compute[227762]: 2026-01-23 10:18:51.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:51 np0005593234 nova_compute[227762]: 2026-01-23 10:18:51.818 227766 DEBUG oslo_concurrency.processutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:18:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e311 e311: 3 total, 3 up, 3 in
Jan 23 05:18:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:18:52 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3894811592' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.280 227766 DEBUG oslo_concurrency.processutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.286 227766 DEBUG nova.compute.provider_tree [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:18:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:52.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.318 227766 DEBUG nova.scheduler.client.report [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.350 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.694s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.351 227766 DEBUG nova.compute.manager [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.413 227766 DEBUG nova.compute.manager [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.413 227766 DEBUG nova.network.neutron [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.437 227766 INFO nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.461 227766 DEBUG nova.compute.manager [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.594 227766 DEBUG nova.compute.manager [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.595 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.595 227766 INFO nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Creating image(s)#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.621 227766 DEBUG nova.storage.rbd_utils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image a0be3878-0750-42e1-8219-5dd9d4b3412c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.649 227766 DEBUG nova.storage.rbd_utils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image a0be3878-0750-42e1-8219-5dd9d4b3412c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.671 227766 DEBUG nova.storage.rbd_utils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image a0be3878-0750-42e1-8219-5dd9d4b3412c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.678 227766 DEBUG oslo_concurrency.processutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.703 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.717 227766 DEBUG nova.policy [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60291ce86b6946629a2e48f6680312cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:18:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.748 227766 DEBUG oslo_concurrency.processutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.749 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.749 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.749 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.774 227766 DEBUG nova.storage.rbd_utils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image a0be3878-0750-42e1-8219-5dd9d4b3412c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.778 227766 DEBUG oslo_concurrency.processutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 a0be3878-0750-42e1-8219-5dd9d4b3412c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:18:52 np0005593234 nova_compute[227762]: 2026-01-23 10:18:52.801 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:18:53 np0005593234 nova_compute[227762]: 2026-01-23 10:18:53.085 227766 DEBUG oslo_concurrency.processutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 a0be3878-0750-42e1-8219-5dd9d4b3412c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.307s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:18:53 np0005593234 nova_compute[227762]: 2026-01-23 10:18:53.149 227766 DEBUG nova.storage.rbd_utils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] resizing rbd image a0be3878-0750-42e1-8219-5dd9d4b3412c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:18:53 np0005593234 nova_compute[227762]: 2026-01-23 10:18:53.242 227766 DEBUG nova.objects.instance [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'migration_context' on Instance uuid a0be3878-0750-42e1-8219-5dd9d4b3412c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:18:53 np0005593234 nova_compute[227762]: 2026-01-23 10:18:53.267 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:18:53 np0005593234 nova_compute[227762]: 2026-01-23 10:18:53.267 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Ensure instance console log exists: /var/lib/nova/instances/a0be3878-0750-42e1-8219-5dd9d4b3412c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:18:53 np0005593234 nova_compute[227762]: 2026-01-23 10:18:53.268 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:18:53 np0005593234 nova_compute[227762]: 2026-01-23 10:18:53.268 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:18:53 np0005593234 nova_compute[227762]: 2026-01-23 10:18:53.268 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:18:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:53.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:54.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:54 np0005593234 nova_compute[227762]: 2026-01-23 10:18:54.660 227766 DEBUG nova.network.neutron [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Successfully created port: a8c7e858-f840-4ed1-b47f-7d4497071e55 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:18:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:55.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:55 np0005593234 nova_compute[227762]: 2026-01-23 10:18:55.698 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:56.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:56 np0005593234 nova_compute[227762]: 2026-01-23 10:18:56.741 227766 DEBUG nova.network.neutron [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Successfully updated port: a8c7e858-f840-4ed1-b47f-7d4497071e55 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:18:56 np0005593234 nova_compute[227762]: 2026-01-23 10:18:56.757 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:18:56 np0005593234 nova_compute[227762]: 2026-01-23 10:18:56.757 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquired lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:18:56 np0005593234 nova_compute[227762]: 2026-01-23 10:18:56.757 227766 DEBUG nova.network.neutron [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:18:57 np0005593234 nova_compute[227762]: 2026-01-23 10:18:57.153 227766 DEBUG nova.compute.manager [req-9aefa9b9-b64a-4d57-aabf-c56a8f498685 req-8226bcdc-6e8f-47ef-a1d1-5fe17fcf14ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Received event network-changed-a8c7e858-f840-4ed1-b47f-7d4497071e55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:18:57 np0005593234 nova_compute[227762]: 2026-01-23 10:18:57.154 227766 DEBUG nova.compute.manager [req-9aefa9b9-b64a-4d57-aabf-c56a8f498685 req-8226bcdc-6e8f-47ef-a1d1-5fe17fcf14ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Refreshing instance network info cache due to event network-changed-a8c7e858-f840-4ed1-b47f-7d4497071e55. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:18:57 np0005593234 nova_compute[227762]: 2026-01-23 10:18:57.154 227766 DEBUG oslo_concurrency.lockutils [req-9aefa9b9-b64a-4d57-aabf-c56a8f498685 req-8226bcdc-6e8f-47ef-a1d1-5fe17fcf14ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:18:57 np0005593234 nova_compute[227762]: 2026-01-23 10:18:57.344 227766 DEBUG nova.network.neutron [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:18:57 np0005593234 nova_compute[227762]: 2026-01-23 10:18:57.595 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:18:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:57.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:18:57 np0005593234 nova_compute[227762]: 2026-01-23 10:18:57.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:57 np0005593234 podman[295862]: 2026-01-23 10:18:57.809544467 +0000 UTC m=+0.100579177 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 05:18:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:58.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.406 227766 DEBUG nova.network.neutron [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Updating instance_info_cache with network_info: [{"id": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "address": "fa:16:3e:69:17:60", "network": {"id": "ef2a274e-4da0-400b-bcb7-ccf7f53401c1", "bridge": "br-int", "label": "tempest-network-smoke--403303166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8c7e858-f8", "ovs_interfaceid": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.429 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Releasing lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.429 227766 DEBUG nova.compute.manager [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Instance network_info: |[{"id": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "address": "fa:16:3e:69:17:60", "network": {"id": "ef2a274e-4da0-400b-bcb7-ccf7f53401c1", "bridge": "br-int", "label": "tempest-network-smoke--403303166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8c7e858-f8", "ovs_interfaceid": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.429 227766 DEBUG oslo_concurrency.lockutils [req-9aefa9b9-b64a-4d57-aabf-c56a8f498685 req-8226bcdc-6e8f-47ef-a1d1-5fe17fcf14ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.430 227766 DEBUG nova.network.neutron [req-9aefa9b9-b64a-4d57-aabf-c56a8f498685 req-8226bcdc-6e8f-47ef-a1d1-5fe17fcf14ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Refreshing network info cache for port a8c7e858-f840-4ed1-b47f-7d4497071e55 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.432 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Start _get_guest_xml network_info=[{"id": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "address": "fa:16:3e:69:17:60", "network": {"id": "ef2a274e-4da0-400b-bcb7-ccf7f53401c1", "bridge": "br-int", "label": "tempest-network-smoke--403303166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8c7e858-f8", "ovs_interfaceid": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.436 227766 WARNING nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.440 227766 DEBUG nova.virt.libvirt.host [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.441 227766 DEBUG nova.virt.libvirt.host [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.445 227766 DEBUG nova.virt.libvirt.host [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.446 227766 DEBUG nova.virt.libvirt.host [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.447 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.447 227766 DEBUG nova.virt.hardware [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.448 227766 DEBUG nova.virt.hardware [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.448 227766 DEBUG nova.virt.hardware [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.448 227766 DEBUG nova.virt.hardware [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.448 227766 DEBUG nova.virt.hardware [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.449 227766 DEBUG nova.virt.hardware [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.449 227766 DEBUG nova.virt.hardware [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.449 227766 DEBUG nova.virt.hardware [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.449 227766 DEBUG nova.virt.hardware [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.449 227766 DEBUG nova.virt.hardware [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.450 227766 DEBUG nova.virt.hardware [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.453 227766 DEBUG oslo_concurrency.processutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:18:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:18:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:18:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:59.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:18:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:18:59 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1783431175' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.869 227766 DEBUG oslo_concurrency.processutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.896 227766 DEBUG nova.storage.rbd_utils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image a0be3878-0750-42e1-8219-5dd9d4b3412c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:18:59 np0005593234 nova_compute[227762]: 2026-01-23 10:18:59.900 227766 DEBUG oslo_concurrency.processutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:00.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:19:00 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/853829937' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.353 227766 DEBUG oslo_concurrency.processutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.355 227766 DEBUG nova.virt.libvirt.vif [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:18:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-942916977',display_name='tempest-TestNetworkBasicOps-server-942916977',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-942916977',id=149,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJdB97H3R8BgDR2jd94b/eFyJTAqvmLTTsSC7oR+dOUgKelzDzIxuLparKQHADcuxki2LCEgJ5UyTzGmYLOA+Tbo16wmjo1/o8NzBKLh8f0Lbh7anUmFdDARVVV8OKyqUg==',key_name='tempest-TestNetworkBasicOps-1305641656',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-s1tm5r0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:18:52Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=a0be3878-0750-42e1-8219-5dd9d4b3412c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "address": "fa:16:3e:69:17:60", "network": {"id": "ef2a274e-4da0-400b-bcb7-ccf7f53401c1", "bridge": "br-int", "label": "tempest-network-smoke--403303166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8c7e858-f8", "ovs_interfaceid": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.356 227766 DEBUG nova.network.os_vif_util [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "address": "fa:16:3e:69:17:60", "network": {"id": "ef2a274e-4da0-400b-bcb7-ccf7f53401c1", "bridge": "br-int", "label": "tempest-network-smoke--403303166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8c7e858-f8", "ovs_interfaceid": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.356 227766 DEBUG nova.network.os_vif_util [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:17:60,bridge_name='br-int',has_traffic_filtering=True,id=a8c7e858-f840-4ed1-b47f-7d4497071e55,network=Network(ef2a274e-4da0-400b-bcb7-ccf7f53401c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8c7e858-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.358 227766 DEBUG nova.objects.instance [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid a0be3878-0750-42e1-8219-5dd9d4b3412c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.395 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  <uuid>a0be3878-0750-42e1-8219-5dd9d4b3412c</uuid>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  <name>instance-00000095</name>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestNetworkBasicOps-server-942916977</nova:name>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:18:59</nova:creationTime>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <nova:port uuid="a8c7e858-f840-4ed1-b47f-7d4497071e55">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <entry name="serial">a0be3878-0750-42e1-8219-5dd9d4b3412c</entry>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <entry name="uuid">a0be3878-0750-42e1-8219-5dd9d4b3412c</entry>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a0be3878-0750-42e1-8219-5dd9d4b3412c_disk">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a0be3878-0750-42e1-8219-5dd9d4b3412c_disk.config">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:69:17:60"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <target dev="tapa8c7e858-f8"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/a0be3878-0750-42e1-8219-5dd9d4b3412c/console.log" append="off"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:19:00 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:19:00 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:19:00 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:19:00 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.396 227766 DEBUG nova.compute.manager [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Preparing to wait for external event network-vif-plugged-a8c7e858-f840-4ed1-b47f-7d4497071e55 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.396 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.397 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.397 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.398 227766 DEBUG nova.virt.libvirt.vif [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:18:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-942916977',display_name='tempest-TestNetworkBasicOps-server-942916977',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-942916977',id=149,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJdB97H3R8BgDR2jd94b/eFyJTAqvmLTTsSC7oR+dOUgKelzDzIxuLparKQHADcuxki2LCEgJ5UyTzGmYLOA+Tbo16wmjo1/o8NzBKLh8f0Lbh7anUmFdDARVVV8OKyqUg==',key_name='tempest-TestNetworkBasicOps-1305641656',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-s1tm5r0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:18:52Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=a0be3878-0750-42e1-8219-5dd9d4b3412c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "address": "fa:16:3e:69:17:60", "network": {"id": "ef2a274e-4da0-400b-bcb7-ccf7f53401c1", "bridge": "br-int", "label": "tempest-network-smoke--403303166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8c7e858-f8", "ovs_interfaceid": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.398 227766 DEBUG nova.network.os_vif_util [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "address": "fa:16:3e:69:17:60", "network": {"id": "ef2a274e-4da0-400b-bcb7-ccf7f53401c1", "bridge": "br-int", "label": "tempest-network-smoke--403303166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8c7e858-f8", "ovs_interfaceid": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.399 227766 DEBUG nova.network.os_vif_util [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:17:60,bridge_name='br-int',has_traffic_filtering=True,id=a8c7e858-f840-4ed1-b47f-7d4497071e55,network=Network(ef2a274e-4da0-400b-bcb7-ccf7f53401c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8c7e858-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.399 227766 DEBUG os_vif [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:17:60,bridge_name='br-int',has_traffic_filtering=True,id=a8c7e858-f840-4ed1-b47f-7d4497071e55,network=Network(ef2a274e-4da0-400b-bcb7-ccf7f53401c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8c7e858-f8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.400 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.400 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.401 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.403 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.404 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa8c7e858-f8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.404 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa8c7e858-f8, col_values=(('external_ids', {'iface-id': 'a8c7e858-f840-4ed1-b47f-7d4497071e55', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:69:17:60', 'vm-uuid': 'a0be3878-0750-42e1-8219-5dd9d4b3412c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:00 np0005593234 NetworkManager[48942]: <info>  [1769163540.4065] manager: (tapa8c7e858-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/296)
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.406 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.408 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.413 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.413 227766 INFO os_vif [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:17:60,bridge_name='br-int',has_traffic_filtering=True,id=a8c7e858-f840-4ed1-b47f-7d4497071e55,network=Network(ef2a274e-4da0-400b-bcb7-ccf7f53401c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8c7e858-f8')#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.488 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.488 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.489 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No VIF found with MAC fa:16:3e:69:17:60, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.489 227766 INFO nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Using config drive#033[00m
Jan 23 05:19:00 np0005593234 nova_compute[227762]: 2026-01-23 10:19:00.511 227766 DEBUG nova.storage.rbd_utils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image a0be3878-0750-42e1-8219-5dd9d4b3412c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:01.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:01 np0005593234 nova_compute[227762]: 2026-01-23 10:19:01.908 227766 INFO nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Creating config drive at /var/lib/nova/instances/a0be3878-0750-42e1-8219-5dd9d4b3412c/disk.config#033[00m
Jan 23 05:19:01 np0005593234 nova_compute[227762]: 2026-01-23 10:19:01.913 227766 DEBUG oslo_concurrency.processutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a0be3878-0750-42e1-8219-5dd9d4b3412c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8mf88pfw execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.044 227766 DEBUG oslo_concurrency.processutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a0be3878-0750-42e1-8219-5dd9d4b3412c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8mf88pfw" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.075 227766 DEBUG nova.storage.rbd_utils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image a0be3878-0750-42e1-8219-5dd9d4b3412c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.079 227766 DEBUG oslo_concurrency.processutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a0be3878-0750-42e1-8219-5dd9d4b3412c/disk.config a0be3878-0750-42e1-8219-5dd9d4b3412c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.251 227766 DEBUG oslo_concurrency.processutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a0be3878-0750-42e1-8219-5dd9d4b3412c/disk.config a0be3878-0750-42e1-8219-5dd9d4b3412c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.172s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.252 227766 INFO nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Deleting local config drive /var/lib/nova/instances/a0be3878-0750-42e1-8219-5dd9d4b3412c/disk.config because it was imported into RBD.#033[00m
Jan 23 05:19:02 np0005593234 kernel: tapa8c7e858-f8: entered promiscuous mode
Jan 23 05:19:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:02.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:02 np0005593234 NetworkManager[48942]: <info>  [1769163542.3082] manager: (tapa8c7e858-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/297)
Jan 23 05:19:02 np0005593234 systemd-udevd[296024]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:19:02 np0005593234 ovn_controller[134547]: 2026-01-23T10:19:02Z|00619|binding|INFO|Claiming lport a8c7e858-f840-4ed1-b47f-7d4497071e55 for this chassis.
Jan 23 05:19:02 np0005593234 ovn_controller[134547]: 2026-01-23T10:19:02Z|00620|binding|INFO|a8c7e858-f840-4ed1-b47f-7d4497071e55: Claiming fa:16:3e:69:17:60 10.100.0.3
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.345 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.350 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:02 np0005593234 NetworkManager[48942]: <info>  [1769163542.3569] device (tapa8c7e858-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:19:02 np0005593234 NetworkManager[48942]: <info>  [1769163542.3574] device (tapa8c7e858-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.361 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:17:60 10.100.0.3'], port_security=['fa:16:3e:69:17:60 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'a0be3878-0750-42e1-8219-5dd9d4b3412c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef2a274e-4da0-400b-bcb7-ccf7f53401c1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3f0c6dd0-0716-4c13-90bf-a399480fe5a7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=211b0b88-3773-4743-9d70-bd2a35ab028e, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=a8c7e858-f840-4ed1-b47f-7d4497071e55) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.362 144381 INFO neutron.agent.ovn.metadata.agent [-] Port a8c7e858-f840-4ed1-b47f-7d4497071e55 in datapath ef2a274e-4da0-400b-bcb7-ccf7f53401c1 bound to our chassis#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.363 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network ef2a274e-4da0-400b-bcb7-ccf7f53401c1#033[00m
Jan 23 05:19:02 np0005593234 systemd-machined[195626]: New machine qemu-71-instance-00000095.
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.374 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9ebce474-9eb6-4e70-972a-b42123f9afa5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.375 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapef2a274e-41 in ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.377 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapef2a274e-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.377 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4f8dacc4-0a6b-42f4-bb32-542b634c3d71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.378 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1e5654fa-e1ee-4bee-bbec-b38f77754a35]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.391 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[75d818ef-6e7c-4610-98b6-a93069dfeafe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 systemd[1]: Started Virtual Machine qemu-71-instance-00000095.
Jan 23 05:19:02 np0005593234 ovn_controller[134547]: 2026-01-23T10:19:02Z|00621|binding|INFO|Setting lport a8c7e858-f840-4ed1-b47f-7d4497071e55 ovn-installed in OVS
Jan 23 05:19:02 np0005593234 ovn_controller[134547]: 2026-01-23T10:19:02Z|00622|binding|INFO|Setting lport a8c7e858-f840-4ed1-b47f-7d4497071e55 up in Southbound
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.412 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.416 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dab7c92c-e895-4656-834e-ba48d85ee32f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.446 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[95612397-a383-4ba7-a0ae-9f9c6c721e4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.451 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[764686c3-1d4f-4332-85d8-f6869c676208]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 NetworkManager[48942]: <info>  [1769163542.4522] manager: (tapef2a274e-40): new Veth device (/org/freedesktop/NetworkManager/Devices/298)
Jan 23 05:19:02 np0005593234 systemd-udevd[296028]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.481 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9f048a85-bcdf-44c0-88f5-8b7bd789f57f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.484 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e2f63d36-b2b5-48db-be30-bbaa8bf93027]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 NetworkManager[48942]: <info>  [1769163542.5038] device (tapef2a274e-40): carrier: link connected
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.508 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[93c05774-adf9-4911-9b5c-188212cc8c22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.522 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b11e99b8-3aba-4075-a656-4ddd00bca30a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef2a274e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:af:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 194], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 743746, 'reachable_time': 29460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 296060, 'error': None, 'target': 'ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.540 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b56fd7c1-4678-4e17-a508-e7afc8ce3138]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1c:af99'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 743746, 'tstamp': 743746}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 296061, 'error': None, 'target': 'ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.564 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0a888485-ec35-4176-8b39-b3c2f26c7224]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapef2a274e-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1c:af:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 194], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 743746, 'reachable_time': 29460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 296062, 'error': None, 'target': 'ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.577 227766 DEBUG nova.network.neutron [req-9aefa9b9-b64a-4d57-aabf-c56a8f498685 req-8226bcdc-6e8f-47ef-a1d1-5fe17fcf14ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Updated VIF entry in instance network info cache for port a8c7e858-f840-4ed1-b47f-7d4497071e55. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.578 227766 DEBUG nova.network.neutron [req-9aefa9b9-b64a-4d57-aabf-c56a8f498685 req-8226bcdc-6e8f-47ef-a1d1-5fe17fcf14ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Updating instance_info_cache with network_info: [{"id": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "address": "fa:16:3e:69:17:60", "network": {"id": "ef2a274e-4da0-400b-bcb7-ccf7f53401c1", "bridge": "br-int", "label": "tempest-network-smoke--403303166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8c7e858-f8", "ovs_interfaceid": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.596 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.605 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[715d3d7b-660d-437e-a1bd-be931b16b42a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.610 227766 DEBUG oslo_concurrency.lockutils [req-9aefa9b9-b64a-4d57-aabf-c56a8f498685 req-8226bcdc-6e8f-47ef-a1d1-5fe17fcf14ca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.670 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf16435-1e6d-480b-ad90-dce2586cc713]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.672 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef2a274e-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.672 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.673 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapef2a274e-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.675 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:02 np0005593234 NetworkManager[48942]: <info>  [1769163542.6758] manager: (tapef2a274e-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/299)
Jan 23 05:19:02 np0005593234 kernel: tapef2a274e-40: entered promiscuous mode
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.679 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapef2a274e-40, col_values=(('external_ids', {'iface-id': '4dd3507c-09b2-4097-8357-2c398ef8e03c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:19:02 np0005593234 ovn_controller[134547]: 2026-01-23T10:19:02Z|00623|binding|INFO|Releasing lport 4dd3507c-09b2-4097-8357-2c398ef8e03c from this chassis (sb_readonly=0)
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.682 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.683 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/ef2a274e-4da0-400b-bcb7-ccf7f53401c1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/ef2a274e-4da0-400b-bcb7-ccf7f53401c1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.694 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ed99d270-03e5-4ebf-805d-cac192f1107a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:19:02 np0005593234 nova_compute[227762]: 2026-01-23 10:19:02.695 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.696 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-ef2a274e-4da0-400b-bcb7-ccf7f53401c1
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/ef2a274e-4da0-400b-bcb7-ccf7f53401c1.pid.haproxy
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID ef2a274e-4da0-400b-bcb7-ccf7f53401c1
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:19:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:02.696 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1', 'env', 'PROCESS_TAG=haproxy-ef2a274e-4da0-400b-bcb7-ccf7f53401c1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/ef2a274e-4da0-400b-bcb7-ccf7f53401c1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:19:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.021 227766 DEBUG nova.compute.manager [req-932eb3f4-398d-464f-ad9d-f3c8d3833868 req-c62e5124-a92b-4d48-89d4-adbd3e47d805 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Received event network-vif-plugged-a8c7e858-f840-4ed1-b47f-7d4497071e55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.022 227766 DEBUG oslo_concurrency.lockutils [req-932eb3f4-398d-464f-ad9d-f3c8d3833868 req-c62e5124-a92b-4d48-89d4-adbd3e47d805 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.022 227766 DEBUG oslo_concurrency.lockutils [req-932eb3f4-398d-464f-ad9d-f3c8d3833868 req-c62e5124-a92b-4d48-89d4-adbd3e47d805 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.022 227766 DEBUG oslo_concurrency.lockutils [req-932eb3f4-398d-464f-ad9d-f3c8d3833868 req-c62e5124-a92b-4d48-89d4-adbd3e47d805 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.023 227766 DEBUG nova.compute.manager [req-932eb3f4-398d-464f-ad9d-f3c8d3833868 req-c62e5124-a92b-4d48-89d4-adbd3e47d805 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Processing event network-vif-plugged-a8c7e858-f840-4ed1-b47f-7d4497071e55 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:19:03 np0005593234 podman[296109]: 2026-01-23 10:19:03.047891922 +0000 UTC m=+0.044175125 container create bdfbd5c6c0d5aa3237496008b3f16083c3f7345fe88d5ad3e3056fdf209484a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:19:03 np0005593234 systemd[1]: Started libpod-conmon-bdfbd5c6c0d5aa3237496008b3f16083c3f7345fe88d5ad3e3056fdf209484a1.scope.
Jan 23 05:19:03 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:19:03 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/185b1b81c0f784497799f7e52a890ed056602473d9ea8be925ef01ef7cbb8ce2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:19:03 np0005593234 podman[296109]: 2026-01-23 10:19:03.104415198 +0000 UTC m=+0.100698411 container init bdfbd5c6c0d5aa3237496008b3f16083c3f7345fe88d5ad3e3056fdf209484a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:19:03 np0005593234 podman[296109]: 2026-01-23 10:19:03.110066893 +0000 UTC m=+0.106350106 container start bdfbd5c6c0d5aa3237496008b3f16083c3f7345fe88d5ad3e3056fdf209484a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:19:03 np0005593234 podman[296109]: 2026-01-23 10:19:03.023476402 +0000 UTC m=+0.019759635 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:19:03 np0005593234 neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1[296150]: [NOTICE]   (296155) : New worker (296157) forked
Jan 23 05:19:03 np0005593234 neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1[296150]: [NOTICE]   (296155) : Loading success.
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.139 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163543.1391716, a0be3878-0750-42e1-8219-5dd9d4b3412c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.140 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] VM Started (Lifecycle Event)#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.142 227766 DEBUG nova.compute.manager [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.146 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.149 227766 INFO nova.virt.libvirt.driver [-] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Instance spawned successfully.#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.150 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.177 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.183 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.186 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.186 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.187 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.187 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.188 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.188 227766 DEBUG nova.virt.libvirt.driver [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.228 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.229 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163543.1394386, a0be3878-0750-42e1-8219-5dd9d4b3412c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.229 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] VM Paused (Lifecycle Event)
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.269 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.273 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163543.1452758, a0be3878-0750-42e1-8219-5dd9d4b3412c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.273 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] VM Resumed (Lifecycle Event)
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.295 227766 INFO nova.compute.manager [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Took 10.70 seconds to spawn the instance on the hypervisor.
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.296 227766 DEBUG nova.compute.manager [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.297 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.302 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.361 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.405 227766 INFO nova.compute.manager [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Took 11.79 seconds to build instance.
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.505 227766 DEBUG oslo_concurrency.lockutils [None req-48e8e75f-6afe-4ed5-85b5-732d05c90472 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.987s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:19:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:19:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:03.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:19:03 np0005593234 nova_compute[227762]: 2026-01-23 10:19:03.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:19:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:04.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:05 np0005593234 nova_compute[227762]: 2026-01-23 10:19:05.407 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:05.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:06 np0005593234 nova_compute[227762]: 2026-01-23 10:19:06.312 227766 DEBUG nova.compute.manager [req-6100c8a1-a86e-4d6c-addb-9468907dd59f req-a6a8b0dd-65a6-463c-903f-61bd02a4ee90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Received event network-vif-plugged-a8c7e858-f840-4ed1-b47f-7d4497071e55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:19:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:06 np0005593234 nova_compute[227762]: 2026-01-23 10:19:06.313 227766 DEBUG oslo_concurrency.lockutils [req-6100c8a1-a86e-4d6c-addb-9468907dd59f req-a6a8b0dd-65a6-463c-903f-61bd02a4ee90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:19:06 np0005593234 nova_compute[227762]: 2026-01-23 10:19:06.313 227766 DEBUG oslo_concurrency.lockutils [req-6100c8a1-a86e-4d6c-addb-9468907dd59f req-a6a8b0dd-65a6-463c-903f-61bd02a4ee90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:19:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:06.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:06 np0005593234 nova_compute[227762]: 2026-01-23 10:19:06.314 227766 DEBUG oslo_concurrency.lockutils [req-6100c8a1-a86e-4d6c-addb-9468907dd59f req-a6a8b0dd-65a6-463c-903f-61bd02a4ee90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:19:06 np0005593234 nova_compute[227762]: 2026-01-23 10:19:06.314 227766 DEBUG nova.compute.manager [req-6100c8a1-a86e-4d6c-addb-9468907dd59f req-a6a8b0dd-65a6-463c-903f-61bd02a4ee90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] No waiting events found dispatching network-vif-plugged-a8c7e858-f840-4ed1-b47f-7d4497071e55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:19:06 np0005593234 nova_compute[227762]: 2026-01-23 10:19:06.314 227766 WARNING nova.compute.manager [req-6100c8a1-a86e-4d6c-addb-9468907dd59f req-a6a8b0dd-65a6-463c-903f-61bd02a4ee90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Received unexpected event network-vif-plugged-a8c7e858-f840-4ed1-b47f-7d4497071e55 for instance with vm_state active and task_state None.
Jan 23 05:19:07 np0005593234 nova_compute[227762]: 2026-01-23 10:19:07.598 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:07.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:08.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:09.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:10.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:10 np0005593234 nova_compute[227762]: 2026-01-23 10:19:10.410 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:19:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:11.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:19:11 np0005593234 NetworkManager[48942]: <info>  [1769163551.7446] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/300)
Jan 23 05:19:11 np0005593234 NetworkManager[48942]: <info>  [1769163551.7457] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/301)
Jan 23 05:19:11 np0005593234 nova_compute[227762]: 2026-01-23 10:19:11.744 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:11 np0005593234 nova_compute[227762]: 2026-01-23 10:19:11.759 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:19:11 np0005593234 nova_compute[227762]: 2026-01-23 10:19:11.761 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 05:19:11 np0005593234 nova_compute[227762]: 2026-01-23 10:19:11.844 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:11 np0005593234 ovn_controller[134547]: 2026-01-23T10:19:11Z|00624|binding|INFO|Releasing lport 4dd3507c-09b2-4097-8357-2c398ef8e03c from this chassis (sb_readonly=0)
Jan 23 05:19:11 np0005593234 nova_compute[227762]: 2026-01-23 10:19:11.854 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:12.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:12 np0005593234 nova_compute[227762]: 2026-01-23 10:19:12.390 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:12 np0005593234 nova_compute[227762]: 2026-01-23 10:19:12.600 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:12 np0005593234 nova_compute[227762]: 2026-01-23 10:19:12.684 227766 DEBUG nova.compute.manager [req-9cff93d5-087c-4dd7-98d0-a15c877bdba2 req-5fb4ca9b-fcda-412e-a85d-447d569d56ee 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Received event network-changed-a8c7e858-f840-4ed1-b47f-7d4497071e55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:19:12 np0005593234 nova_compute[227762]: 2026-01-23 10:19:12.684 227766 DEBUG nova.compute.manager [req-9cff93d5-087c-4dd7-98d0-a15c877bdba2 req-5fb4ca9b-fcda-412e-a85d-447d569d56ee 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Refreshing instance network info cache due to event network-changed-a8c7e858-f840-4ed1-b47f-7d4497071e55. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:19:12 np0005593234 nova_compute[227762]: 2026-01-23 10:19:12.685 227766 DEBUG oslo_concurrency.lockutils [req-9cff93d5-087c-4dd7-98d0-a15c877bdba2 req-5fb4ca9b-fcda-412e-a85d-447d569d56ee 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:19:12 np0005593234 nova_compute[227762]: 2026-01-23 10:19:12.685 227766 DEBUG oslo_concurrency.lockutils [req-9cff93d5-087c-4dd7-98d0-a15c877bdba2 req-5fb4ca9b-fcda-412e-a85d-447d569d56ee 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:19:12 np0005593234 nova_compute[227762]: 2026-01-23 10:19:12.685 227766 DEBUG nova.network.neutron [req-9cff93d5-087c-4dd7-98d0-a15c877bdba2 req-5fb4ca9b-fcda-412e-a85d-447d569d56ee 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Refreshing network info cache for port a8c7e858-f840-4ed1-b47f-7d4497071e55 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:19:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:19:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:13.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:19:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:14.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:14 np0005593234 nova_compute[227762]: 2026-01-23 10:19:14.775 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:19:14 np0005593234 nova_compute[227762]: 2026-01-23 10:19:14.775 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 05:19:14 np0005593234 nova_compute[227762]: 2026-01-23 10:19:14.803 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 05:19:15 np0005593234 nova_compute[227762]: 2026-01-23 10:19:15.411 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:15 np0005593234 ovn_controller[134547]: 2026-01-23T10:19:15Z|00067|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:69:17:60 10.100.0.3
Jan 23 05:19:15 np0005593234 ovn_controller[134547]: 2026-01-23T10:19:15Z|00068|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:69:17:60 10.100.0.3
Jan 23 05:19:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:15.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:16 np0005593234 nova_compute[227762]: 2026-01-23 10:19:16.244 227766 DEBUG nova.network.neutron [req-9cff93d5-087c-4dd7-98d0-a15c877bdba2 req-5fb4ca9b-fcda-412e-a85d-447d569d56ee 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Updated VIF entry in instance network info cache for port a8c7e858-f840-4ed1-b47f-7d4497071e55. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 05:19:16 np0005593234 nova_compute[227762]: 2026-01-23 10:19:16.245 227766 DEBUG nova.network.neutron [req-9cff93d5-087c-4dd7-98d0-a15c877bdba2 req-5fb4ca9b-fcda-412e-a85d-447d569d56ee 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Updating instance_info_cache with network_info: [{"id": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "address": "fa:16:3e:69:17:60", "network": {"id": "ef2a274e-4da0-400b-bcb7-ccf7f53401c1", "bridge": "br-int", "label": "tempest-network-smoke--403303166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8c7e858-f8", "ovs_interfaceid": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:19:16 np0005593234 nova_compute[227762]: 2026-01-23 10:19:16.288 227766 DEBUG oslo_concurrency.lockutils [req-9cff93d5-087c-4dd7-98d0-a15c877bdba2 req-5fb4ca9b-fcda-412e-a85d-447d569d56ee 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:19:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:16.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:17 np0005593234 nova_compute[227762]: 2026-01-23 10:19:17.602 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:19:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:17.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:19:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:18.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:19 np0005593234 nova_compute[227762]: 2026-01-23 10:19:19.102 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:19.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:20.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:20 np0005593234 nova_compute[227762]: 2026-01-23 10:19:20.412 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:20 np0005593234 nova_compute[227762]: 2026-01-23 10:19:20.995 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:20.996 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=55, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=54) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:19:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:20.997 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:19:21 np0005593234 nova_compute[227762]: 2026-01-23 10:19:21.618 227766 INFO nova.compute.manager [None req-759c80ab-ab2a-474b-8ca9-b30804e74e48 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Get console output
Jan 23 05:19:21 np0005593234 nova_compute[227762]: 2026-01-23 10:19:21.625 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 05:19:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:21.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:21 np0005593234 podman[296228]: 2026-01-23 10:19:21.766046744 +0000 UTC m=+0.058795589 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 05:19:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:19:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:22.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:19:22 np0005593234 nova_compute[227762]: 2026-01-23 10:19:22.603 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:19:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:23.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:23.999 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '55'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:19:24 np0005593234 nova_compute[227762]: 2026-01-23 10:19:24.114 227766 DEBUG nova.compute.manager [req-69160822-071e-4388-bed4-f0d4656b1234 req-ebd40514-8b25-4b8e-b2f3-5a198653d14b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Received event network-changed-a8c7e858-f840-4ed1-b47f-7d4497071e55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:19:24 np0005593234 nova_compute[227762]: 2026-01-23 10:19:24.115 227766 DEBUG nova.compute.manager [req-69160822-071e-4388-bed4-f0d4656b1234 req-ebd40514-8b25-4b8e-b2f3-5a198653d14b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Refreshing instance network info cache due to event network-changed-a8c7e858-f840-4ed1-b47f-7d4497071e55. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:19:24 np0005593234 nova_compute[227762]: 2026-01-23 10:19:24.115 227766 DEBUG oslo_concurrency.lockutils [req-69160822-071e-4388-bed4-f0d4656b1234 req-ebd40514-8b25-4b8e-b2f3-5a198653d14b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:19:24 np0005593234 nova_compute[227762]: 2026-01-23 10:19:24.118 227766 DEBUG oslo_concurrency.lockutils [req-69160822-071e-4388-bed4-f0d4656b1234 req-ebd40514-8b25-4b8e-b2f3-5a198653d14b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:19:24 np0005593234 nova_compute[227762]: 2026-01-23 10:19:24.119 227766 DEBUG nova.network.neutron [req-69160822-071e-4388-bed4-f0d4656b1234 req-ebd40514-8b25-4b8e-b2f3-5a198653d14b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Refreshing network info cache for port a8c7e858-f840-4ed1-b47f-7d4497071e55 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:19:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:19:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:24.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:19:25 np0005593234 nova_compute[227762]: 2026-01-23 10:19:25.415 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:25.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:26.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:26 np0005593234 nova_compute[227762]: 2026-01-23 10:19:26.350 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:27 np0005593234 nova_compute[227762]: 2026-01-23 10:19:27.606 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:27 np0005593234 nova_compute[227762]: 2026-01-23 10:19:27.670 227766 DEBUG nova.network.neutron [req-69160822-071e-4388-bed4-f0d4656b1234 req-ebd40514-8b25-4b8e-b2f3-5a198653d14b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Updated VIF entry in instance network info cache for port a8c7e858-f840-4ed1-b47f-7d4497071e55. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:19:27 np0005593234 nova_compute[227762]: 2026-01-23 10:19:27.670 227766 DEBUG nova.network.neutron [req-69160822-071e-4388-bed4-f0d4656b1234 req-ebd40514-8b25-4b8e-b2f3-5a198653d14b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Updating instance_info_cache with network_info: [{"id": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "address": "fa:16:3e:69:17:60", "network": {"id": "ef2a274e-4da0-400b-bcb7-ccf7f53401c1", "bridge": "br-int", "label": "tempest-network-smoke--403303166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8c7e858-f8", "ovs_interfaceid": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:19:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:27.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:27 np0005593234 nova_compute[227762]: 2026-01-23 10:19:27.724 227766 DEBUG oslo_concurrency.lockutils [req-69160822-071e-4388-bed4-f0d4656b1234 req-ebd40514-8b25-4b8e-b2f3-5a198653d14b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:19:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:28.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:28 np0005593234 podman[296252]: 2026-01-23 10:19:28.801095847 +0000 UTC m=+0.089606306 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:19:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:29.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:30.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:30 np0005593234 nova_compute[227762]: 2026-01-23 10:19:30.416 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:31.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:32.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:32 np0005593234 nova_compute[227762]: 2026-01-23 10:19:32.607 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:19:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:33.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #133. Immutable memtables: 0.
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.818143) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 83] Flushing memtable with next log file: 133
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163573818202, "job": 83, "event": "flush_started", "num_memtables": 1, "num_entries": 1183, "num_deletes": 253, "total_data_size": 2529236, "memory_usage": 2552032, "flush_reason": "Manual Compaction"}
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 83] Level-0 flush table #134: started
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163573828530, "cf_name": "default", "job": 83, "event": "table_file_creation", "file_number": 134, "file_size": 1670104, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 64306, "largest_seqno": 65484, "table_properties": {"data_size": 1664752, "index_size": 2747, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11991, "raw_average_key_size": 20, "raw_value_size": 1653925, "raw_average_value_size": 2808, "num_data_blocks": 120, "num_entries": 589, "num_filter_entries": 589, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163483, "oldest_key_time": 1769163483, "file_creation_time": 1769163573, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 134, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 83] Flush lasted 10501 microseconds, and 4416 cpu microseconds.
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.828644) [db/flush_job.cc:967] [default] [JOB 83] Level-0 flush table #134: 1670104 bytes OK
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.828664) [db/memtable_list.cc:519] [default] Level-0 commit table #134 started
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.830420) [db/memtable_list.cc:722] [default] Level-0 commit table #134: memtable #1 done
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.830432) EVENT_LOG_v1 {"time_micros": 1769163573830428, "job": 83, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.830450) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 83] Try to delete WAL files size 2523450, prev total WAL file size 2523450, number of live WAL files 2.
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000130.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.831273) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035353232' seq:72057594037927935, type:22 .. '7061786F730035373734' seq:0, type:0; will stop at (end)
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 84] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 83 Base level 0, inputs: [134(1630KB)], [132(10MB)]
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163573831405, "job": 84, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [134], "files_L6": [132], "score": -1, "input_data_size": 12530284, "oldest_snapshot_seqno": -1}
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 84] Generated table #135: 8364 keys, 10637705 bytes, temperature: kUnknown
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163573939327, "cf_name": "default", "job": 84, "event": "table_file_creation", "file_number": 135, "file_size": 10637705, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10584602, "index_size": 31102, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 20933, "raw_key_size": 221201, "raw_average_key_size": 26, "raw_value_size": 10438329, "raw_average_value_size": 1248, "num_data_blocks": 1187, "num_entries": 8364, "num_filter_entries": 8364, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769163573, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 135, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.939626) [db/compaction/compaction_job.cc:1663] [default] [JOB 84] Compacted 1@0 + 1@6 files to L6 => 10637705 bytes
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.943155) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 116.0 rd, 98.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.6, 10.4 +0.0 blob) out(10.1 +0.0 blob), read-write-amplify(13.9) write-amplify(6.4) OK, records in: 8889, records dropped: 525 output_compression: NoCompression
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.943184) EVENT_LOG_v1 {"time_micros": 1769163573943172, "job": 84, "event": "compaction_finished", "compaction_time_micros": 107996, "compaction_time_cpu_micros": 30696, "output_level": 6, "num_output_files": 1, "total_output_size": 10637705, "num_input_records": 8889, "num_output_records": 8364, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000134.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163573943643, "job": 84, "event": "table_file_deletion", "file_number": 134}
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000132.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163573945873, "job": 84, "event": "table_file_deletion", "file_number": 132}
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.831135) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.945945) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.945950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.945951) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.945953) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:19:33 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:19:33.945955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:19:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:34.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:19:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:19:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:19:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:19:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:19:35 np0005593234 nova_compute[227762]: 2026-01-23 10:19:35.419 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:19:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:35.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:19:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:36.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:37 np0005593234 nova_compute[227762]: 2026-01-23 10:19:37.489 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:37 np0005593234 nova_compute[227762]: 2026-01-23 10:19:37.610 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:37.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:19:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:38.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:19:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:39.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:40.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:40 np0005593234 nova_compute[227762]: 2026-01-23 10:19:40.420 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:41.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:19:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:19:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:42.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:42 np0005593234 nova_compute[227762]: 2026-01-23 10:19:42.611 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:42 np0005593234 nova_compute[227762]: 2026-01-23 10:19:42.773 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:42.855 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:42.856 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:19:42.856 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:42 np0005593234 nova_compute[227762]: 2026-01-23 10:19:42.952 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:42 np0005593234 nova_compute[227762]: 2026-01-23 10:19:42.953 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:42 np0005593234 nova_compute[227762]: 2026-01-23 10:19:42.953 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:42 np0005593234 nova_compute[227762]: 2026-01-23 10:19:42.954 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:19:42 np0005593234 nova_compute[227762]: 2026-01-23 10:19:42.954 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:19:43 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3839139129' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:43 np0005593234 nova_compute[227762]: 2026-01-23 10:19:43.434 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:43 np0005593234 nova_compute[227762]: 2026-01-23 10:19:43.686 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000095 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:19:43 np0005593234 nova_compute[227762]: 2026-01-23 10:19:43.687 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000095 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:19:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:19:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:43.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:19:43 np0005593234 nova_compute[227762]: 2026-01-23 10:19:43.839 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:19:43 np0005593234 nova_compute[227762]: 2026-01-23 10:19:43.840 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4208MB free_disk=20.810218811035156GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:19:43 np0005593234 nova_compute[227762]: 2026-01-23 10:19:43.840 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:19:43 np0005593234 nova_compute[227762]: 2026-01-23 10:19:43.841 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:19:44 np0005593234 nova_compute[227762]: 2026-01-23 10:19:44.183 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance a0be3878-0750-42e1-8219-5dd9d4b3412c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:19:44 np0005593234 nova_compute[227762]: 2026-01-23 10:19:44.184 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:19:44 np0005593234 nova_compute[227762]: 2026-01-23 10:19:44.184 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:19:44 np0005593234 nova_compute[227762]: 2026-01-23 10:19:44.262 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:19:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:44.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:19:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1098800442' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:19:45 np0005593234 nova_compute[227762]: 2026-01-23 10:19:45.060 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.798s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:19:45 np0005593234 nova_compute[227762]: 2026-01-23 10:19:45.067 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:19:45 np0005593234 nova_compute[227762]: 2026-01-23 10:19:45.106 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:19:45 np0005593234 nova_compute[227762]: 2026-01-23 10:19:45.153 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:19:45 np0005593234 nova_compute[227762]: 2026-01-23 10:19:45.153 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:19:45 np0005593234 nova_compute[227762]: 2026-01-23 10:19:45.421 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:45.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:46.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:47 np0005593234 nova_compute[227762]: 2026-01-23 10:19:47.125 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:47 np0005593234 nova_compute[227762]: 2026-01-23 10:19:47.614 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:47.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:19:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:48.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:19:48 np0005593234 nova_compute[227762]: 2026-01-23 10:19:48.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:49 np0005593234 nova_compute[227762]: 2026-01-23 10:19:49.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:49.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:50.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:50 np0005593234 nova_compute[227762]: 2026-01-23 10:19:50.423 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:50 np0005593234 nova_compute[227762]: 2026-01-23 10:19:50.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:50 np0005593234 nova_compute[227762]: 2026-01-23 10:19:50.767 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:50 np0005593234 nova_compute[227762]: 2026-01-23 10:19:50.768 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:19:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:19:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:51.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:19:52 np0005593234 podman[296591]: 2026-01-23 10:19:52.043663831 +0000 UTC m=+0.055527727 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 23 05:19:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:52.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:52 np0005593234 nova_compute[227762]: 2026-01-23 10:19:52.616 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:52 np0005593234 nova_compute[227762]: 2026-01-23 10:19:52.695 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:52 np0005593234 nova_compute[227762]: 2026-01-23 10:19:52.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:19:52 np0005593234 nova_compute[227762]: 2026-01-23 10:19:52.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:19:52 np0005593234 nova_compute[227762]: 2026-01-23 10:19:52.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:19:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e311 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:53 np0005593234 nova_compute[227762]: 2026-01-23 10:19:53.500 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:19:53 np0005593234 nova_compute[227762]: 2026-01-23 10:19:53.501 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:19:53 np0005593234 nova_compute[227762]: 2026-01-23 10:19:53.501 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:19:53 np0005593234 nova_compute[227762]: 2026-01-23 10:19:53.501 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a0be3878-0750-42e1-8219-5dd9d4b3412c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:19:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:53.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:54.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:55 np0005593234 nova_compute[227762]: 2026-01-23 10:19:55.426 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:19:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:55.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:19:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:56.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 e312: 3 total, 3 up, 3 in
Jan 23 05:19:57 np0005593234 nova_compute[227762]: 2026-01-23 10:19:57.618 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:19:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:57.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:19:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:58.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:19:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:19:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:59.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:19:59 np0005593234 podman[296640]: 2026-01-23 10:19:59.780755842 +0000 UTC m=+0.081762902 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 05:20:00 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 05:20:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:00.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:00 np0005593234 nova_compute[227762]: 2026-01-23 10:20:00.427 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:01.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:02.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:02 np0005593234 nova_compute[227762]: 2026-01-23 10:20:02.620 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:03.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:20:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:04.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:20:05 np0005593234 nova_compute[227762]: 2026-01-23 10:20:05.429 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:05.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:20:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:06.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:20:07 np0005593234 nova_compute[227762]: 2026-01-23 10:20:07.621 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:20:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:07.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:20:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:08.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:20:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:09.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:20:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:10.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:10 np0005593234 nova_compute[227762]: 2026-01-23 10:20:10.431 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:11.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:12.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:12 np0005593234 nova_compute[227762]: 2026-01-23 10:20:12.672 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:13.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:14 np0005593234 nova_compute[227762]: 2026-01-23 10:20:14.059 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Updating instance_info_cache with network_info: [{"id": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "address": "fa:16:3e:69:17:60", "network": {"id": "ef2a274e-4da0-400b-bcb7-ccf7f53401c1", "bridge": "br-int", "label": "tempest-network-smoke--403303166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8c7e858-f8", "ovs_interfaceid": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:20:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:20:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:14.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:20:15 np0005593234 nova_compute[227762]: 2026-01-23 10:20:15.432 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:15.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:16.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:16 np0005593234 nova_compute[227762]: 2026-01-23 10:20:16.632 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:20:16 np0005593234 nova_compute[227762]: 2026-01-23 10:20:16.632 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:20:16 np0005593234 nova_compute[227762]: 2026-01-23 10:20:16.633 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:16 np0005593234 nova_compute[227762]: 2026-01-23 10:20:16.633 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:17 np0005593234 nova_compute[227762]: 2026-01-23 10:20:17.674 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:20:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:17.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:20:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:18.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:19.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:20.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:20 np0005593234 nova_compute[227762]: 2026-01-23 10:20:20.435 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:21.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:22.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:22 np0005593234 nova_compute[227762]: 2026-01-23 10:20:22.675 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:22 np0005593234 podman[296729]: 2026-01-23 10:20:22.747987768 +0000 UTC m=+0.045081363 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:20:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:23.496 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=56, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=55) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:20:23 np0005593234 nova_compute[227762]: 2026-01-23 10:20:23.497 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:23.497 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:20:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:23.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:24.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:24 np0005593234 nova_compute[227762]: 2026-01-23 10:20:24.627 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:25 np0005593234 nova_compute[227762]: 2026-01-23 10:20:25.437 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:25.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:26.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:27 np0005593234 nova_compute[227762]: 2026-01-23 10:20:27.676 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:27.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:28.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:29.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:30.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:30 np0005593234 nova_compute[227762]: 2026-01-23 10:20:30.438 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:30 np0005593234 ovn_controller[134547]: 2026-01-23T10:20:30Z|00625|binding|INFO|Releasing lport 4dd3507c-09b2-4097-8357-2c398ef8e03c from this chassis (sb_readonly=0)
Jan 23 05:20:30 np0005593234 nova_compute[227762]: 2026-01-23 10:20:30.488 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:30.498 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '56'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:20:30 np0005593234 ovn_controller[134547]: 2026-01-23T10:20:30Z|00626|binding|INFO|Releasing lport 4dd3507c-09b2-4097-8357-2c398ef8e03c from this chassis (sb_readonly=0)
Jan 23 05:20:30 np0005593234 nova_compute[227762]: 2026-01-23 10:20:30.709 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:30 np0005593234 podman[296752]: 2026-01-23 10:20:30.790285348 +0000 UTC m=+0.073823745 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 05:20:30 np0005593234 nova_compute[227762]: 2026-01-23 10:20:30.827 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:30 np0005593234 nova_compute[227762]: 2026-01-23 10:20:30.828 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:30 np0005593234 nova_compute[227762]: 2026-01-23 10:20:30.890 227766 DEBUG nova.compute.manager [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:20:31 np0005593234 nova_compute[227762]: 2026-01-23 10:20:31.193 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:31 np0005593234 nova_compute[227762]: 2026-01-23 10:20:31.193 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:31 np0005593234 nova_compute[227762]: 2026-01-23 10:20:31.202 227766 DEBUG nova.virt.hardware [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:20:31 np0005593234 nova_compute[227762]: 2026-01-23 10:20:31.202 227766 INFO nova.compute.claims [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:20:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:31.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:32.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:32 np0005593234 nova_compute[227762]: 2026-01-23 10:20:32.678 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:32 np0005593234 nova_compute[227762]: 2026-01-23 10:20:32.914 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:20:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/139530930' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:20:33 np0005593234 nova_compute[227762]: 2026-01-23 10:20:33.360 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:33 np0005593234 nova_compute[227762]: 2026-01-23 10:20:33.366 227766 DEBUG nova.compute.provider_tree [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:20:33 np0005593234 nova_compute[227762]: 2026-01-23 10:20:33.417 227766 DEBUG nova.scheduler.client.report [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:20:33 np0005593234 nova_compute[227762]: 2026-01-23 10:20:33.509 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.316s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:33 np0005593234 nova_compute[227762]: 2026-01-23 10:20:33.510 227766 DEBUG nova.compute.manager [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:20:33 np0005593234 nova_compute[227762]: 2026-01-23 10:20:33.674 227766 DEBUG nova.compute.manager [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:20:33 np0005593234 nova_compute[227762]: 2026-01-23 10:20:33.675 227766 DEBUG nova.network.neutron [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:20:33 np0005593234 nova_compute[227762]: 2026-01-23 10:20:33.718 227766 INFO nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:20:33 np0005593234 nova_compute[227762]: 2026-01-23 10:20:33.755 227766 DEBUG nova.compute.manager [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:20:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:33.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:33 np0005593234 nova_compute[227762]: 2026-01-23 10:20:33.979 227766 DEBUG nova.compute.manager [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:20:33 np0005593234 nova_compute[227762]: 2026-01-23 10:20:33.980 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:20:33 np0005593234 nova_compute[227762]: 2026-01-23 10:20:33.981 227766 INFO nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Creating image(s)#033[00m
Jan 23 05:20:34 np0005593234 nova_compute[227762]: 2026-01-23 10:20:34.015 227766 DEBUG nova.storage.rbd_utils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] rbd image a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:20:34 np0005593234 nova_compute[227762]: 2026-01-23 10:20:34.054 227766 DEBUG nova.storage.rbd_utils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] rbd image a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:20:34 np0005593234 nova_compute[227762]: 2026-01-23 10:20:34.081 227766 DEBUG nova.storage.rbd_utils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] rbd image a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:20:34 np0005593234 nova_compute[227762]: 2026-01-23 10:20:34.084 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "db5146c726563dd06be5c3f5cc1141007148d79c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:34 np0005593234 nova_compute[227762]: 2026-01-23 10:20:34.085 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "db5146c726563dd06be5c3f5cc1141007148d79c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:34 np0005593234 nova_compute[227762]: 2026-01-23 10:20:34.403 227766 DEBUG nova.policy [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c041da0a601a4260b29fc9c65719597f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b976daabc8124a99814954633f99ed7b', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:20:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:34.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:35 np0005593234 nova_compute[227762]: 2026-01-23 10:20:35.440 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:35.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:20:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:36.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:20:36 np0005593234 nova_compute[227762]: 2026-01-23 10:20:36.740 227766 DEBUG nova.virt.libvirt.imagebackend [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/bba8f8ac-6563-4b96-a735-670d31b1818b/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/bba8f8ac-6563-4b96-a735-670d31b1818b/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 05:20:37 np0005593234 nova_compute[227762]: 2026-01-23 10:20:37.681 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:37.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:38.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:20:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:39.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:20:40 np0005593234 nova_compute[227762]: 2026-01-23 10:20:40.441 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:40.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:41.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:41 np0005593234 nova_compute[227762]: 2026-01-23 10:20:41.911 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/db5146c726563dd06be5c3f5cc1141007148d79c.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:41 np0005593234 nova_compute[227762]: 2026-01-23 10:20:41.987 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/db5146c726563dd06be5c3f5cc1141007148d79c.part --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:41 np0005593234 nova_compute[227762]: 2026-01-23 10:20:41.988 227766 DEBUG nova.virt.images [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] bba8f8ac-6563-4b96-a735-670d31b1818b was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Jan 23 05:20:41 np0005593234 nova_compute[227762]: 2026-01-23 10:20:41.989 227766 DEBUG nova.privsep.utils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Jan 23 05:20:41 np0005593234 nova_compute[227762]: 2026-01-23 10:20:41.989 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/db5146c726563dd06be5c3f5cc1141007148d79c.part /var/lib/nova/instances/_base/db5146c726563dd06be5c3f5cc1141007148d79c.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:42 np0005593234 nova_compute[227762]: 2026-01-23 10:20:42.212 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/db5146c726563dd06be5c3f5cc1141007148d79c.part /var/lib/nova/instances/_base/db5146c726563dd06be5c3f5cc1141007148d79c.converted" returned: 0 in 0.222s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:42 np0005593234 nova_compute[227762]: 2026-01-23 10:20:42.217 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/db5146c726563dd06be5c3f5cc1141007148d79c.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:42 np0005593234 nova_compute[227762]: 2026-01-23 10:20:42.281 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/db5146c726563dd06be5c3f5cc1141007148d79c.converted --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:42 np0005593234 nova_compute[227762]: 2026-01-23 10:20:42.283 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "db5146c726563dd06be5c3f5cc1141007148d79c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 8.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:42 np0005593234 nova_compute[227762]: 2026-01-23 10:20:42.310 227766 DEBUG nova.storage.rbd_utils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] rbd image a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:20:42 np0005593234 nova_compute[227762]: 2026-01-23 10:20:42.314 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/db5146c726563dd06be5c3f5cc1141007148d79c a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:42.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:42 np0005593234 nova_compute[227762]: 2026-01-23 10:20:42.664 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/db5146c726563dd06be5c3f5cc1141007148d79c a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.350s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:42 np0005593234 nova_compute[227762]: 2026-01-23 10:20:42.698 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:42 np0005593234 nova_compute[227762]: 2026-01-23 10:20:42.742 227766 DEBUG nova.storage.rbd_utils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] resizing rbd image a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:20:42 np0005593234 nova_compute[227762]: 2026-01-23 10:20:42.782 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:42.856 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:42.857 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:42 np0005593234 nova_compute[227762]: 2026-01-23 10:20:42.857 227766 DEBUG nova.objects.instance [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'migration_context' on Instance uuid a00a5042-ce71-4ecf-ab8f-d9e596d48035 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:20:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:42.857 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:20:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:20:43 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:20:43 np0005593234 nova_compute[227762]: 2026-01-23 10:20:43.331 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:20:43 np0005593234 nova_compute[227762]: 2026-01-23 10:20:43.332 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Ensure instance console log exists: /var/lib/nova/instances/a00a5042-ce71-4ecf-ab8f-d9e596d48035/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:20:43 np0005593234 nova_compute[227762]: 2026-01-23 10:20:43.333 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:43 np0005593234 nova_compute[227762]: 2026-01-23 10:20:43.333 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:43 np0005593234 nova_compute[227762]: 2026-01-23 10:20:43.333 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:43 np0005593234 nova_compute[227762]: 2026-01-23 10:20:43.356 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:43 np0005593234 nova_compute[227762]: 2026-01-23 10:20:43.356 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:43 np0005593234 nova_compute[227762]: 2026-01-23 10:20:43.357 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:43 np0005593234 nova_compute[227762]: 2026-01-23 10:20:43.357 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:20:43 np0005593234 nova_compute[227762]: 2026-01-23 10:20:43.357 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:43 np0005593234 nova_compute[227762]: 2026-01-23 10:20:43.470 227766 DEBUG nova.network.neutron [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Successfully created port: d4f06800-1f0a-4f50-b00d-b10219301efc _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:20:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:20:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1543620271' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:20:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:43.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:44 np0005593234 nova_compute[227762]: 2026-01-23 10:20:44.296 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.939s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:44.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:44 np0005593234 nova_compute[227762]: 2026-01-23 10:20:44.499 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000095 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:20:44 np0005593234 nova_compute[227762]: 2026-01-23 10:20:44.499 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-00000095 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:20:44 np0005593234 nova_compute[227762]: 2026-01-23 10:20:44.650 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:20:44 np0005593234 nova_compute[227762]: 2026-01-23 10:20:44.651 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4168MB free_disk=20.851818084716797GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:20:44 np0005593234 nova_compute[227762]: 2026-01-23 10:20:44.651 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:44 np0005593234 nova_compute[227762]: 2026-01-23 10:20:44.651 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:45 np0005593234 nova_compute[227762]: 2026-01-23 10:20:45.147 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance a0be3878-0750-42e1-8219-5dd9d4b3412c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:20:45 np0005593234 nova_compute[227762]: 2026-01-23 10:20:45.147 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance a00a5042-ce71-4ecf-ab8f-d9e596d48035 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:20:45 np0005593234 nova_compute[227762]: 2026-01-23 10:20:45.148 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:20:45 np0005593234 nova_compute[227762]: 2026-01-23 10:20:45.148 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:20:45 np0005593234 nova_compute[227762]: 2026-01-23 10:20:45.442 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:45 np0005593234 nova_compute[227762]: 2026-01-23 10:20:45.577 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:45.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:20:46 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/910478037' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:20:46 np0005593234 nova_compute[227762]: 2026-01-23 10:20:46.021 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:46 np0005593234 nova_compute[227762]: 2026-01-23 10:20:46.027 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:20:46 np0005593234 nova_compute[227762]: 2026-01-23 10:20:46.106 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:20:46 np0005593234 NetworkManager[48942]: <info>  [1769163646.1265] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/302)
Jan 23 05:20:46 np0005593234 NetworkManager[48942]: <info>  [1769163646.1277] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/303)
Jan 23 05:20:46 np0005593234 nova_compute[227762]: 2026-01-23 10:20:46.126 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:46 np0005593234 nova_compute[227762]: 2026-01-23 10:20:46.208 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:20:46 np0005593234 nova_compute[227762]: 2026-01-23 10:20:46.209 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:46 np0005593234 nova_compute[227762]: 2026-01-23 10:20:46.286 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:46 np0005593234 ovn_controller[134547]: 2026-01-23T10:20:46Z|00627|binding|INFO|Releasing lport 4dd3507c-09b2-4097-8357-2c398ef8e03c from this chassis (sb_readonly=0)
Jan 23 05:20:46 np0005593234 nova_compute[227762]: 2026-01-23 10:20:46.304 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:46.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:47 np0005593234 nova_compute[227762]: 2026-01-23 10:20:47.685 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:47.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:48.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:49 np0005593234 nova_compute[227762]: 2026-01-23 10:20:49.172 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:49 np0005593234 nova_compute[227762]: 2026-01-23 10:20:49.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:49.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:20:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:20:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:20:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:50.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:20:50 np0005593234 nova_compute[227762]: 2026-01-23 10:20:50.469 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:50 np0005593234 nova_compute[227762]: 2026-01-23 10:20:50.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:50 np0005593234 nova_compute[227762]: 2026-01-23 10:20:50.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:20:51 np0005593234 nova_compute[227762]: 2026-01-23 10:20:51.619 227766 DEBUG nova.network.neutron [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Successfully updated port: d4f06800-1f0a-4f50-b00d-b10219301efc _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:20:51 np0005593234 nova_compute[227762]: 2026-01-23 10:20:51.696 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:20:51 np0005593234 nova_compute[227762]: 2026-01-23 10:20:51.697 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquired lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:20:51 np0005593234 nova_compute[227762]: 2026-01-23 10:20:51.697 227766 DEBUG nova.network.neutron [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:20:51 np0005593234 nova_compute[227762]: 2026-01-23 10:20:51.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:51.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:52.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:52 np0005593234 nova_compute[227762]: 2026-01-23 10:20:52.692 227766 DEBUG nova.compute.manager [req-f18d896b-1830-442d-b0ba-e676f674fac3 req-d053b213-06c2-4fe7-98bc-07802c81bb3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-changed-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:20:52 np0005593234 nova_compute[227762]: 2026-01-23 10:20:52.692 227766 DEBUG nova.compute.manager [req-f18d896b-1830-442d-b0ba-e676f674fac3 req-d053b213-06c2-4fe7-98bc-07802c81bb3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Refreshing instance network info cache due to event network-changed-d4f06800-1f0a-4f50-b00d-b10219301efc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:20:52 np0005593234 nova_compute[227762]: 2026-01-23 10:20:52.693 227766 DEBUG oslo_concurrency.lockutils [req-f18d896b-1830-442d-b0ba-e676f674fac3 req-d053b213-06c2-4fe7-98bc-07802c81bb3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:20:52 np0005593234 nova_compute[227762]: 2026-01-23 10:20:52.701 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:52 np0005593234 nova_compute[227762]: 2026-01-23 10:20:52.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:52 np0005593234 nova_compute[227762]: 2026-01-23 10:20:52.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:20:52 np0005593234 nova_compute[227762]: 2026-01-23 10:20:52.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:20:52 np0005593234 nova_compute[227762]: 2026-01-23 10:20:52.776 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:20:52 np0005593234 nova_compute[227762]: 2026-01-23 10:20:52.855 227766 DEBUG nova.network.neutron [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:20:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:53 np0005593234 nova_compute[227762]: 2026-01-23 10:20:53.235 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:20:53 np0005593234 nova_compute[227762]: 2026-01-23 10:20:53.235 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:20:53 np0005593234 nova_compute[227762]: 2026-01-23 10:20:53.235 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:20:53 np0005593234 nova_compute[227762]: 2026-01-23 10:20:53.235 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a0be3878-0750-42e1-8219-5dd9d4b3412c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:20:53 np0005593234 podman[297314]: 2026-01-23 10:20:53.789055175 +0000 UTC m=+0.088776140 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:20:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:20:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:53.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:20:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:54.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.471 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.625 227766 DEBUG nova.network.neutron [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updating instance_info_cache with network_info: [{"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.674 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Releasing lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.674 227766 DEBUG nova.compute.manager [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Instance network_info: |[{"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.675 227766 DEBUG oslo_concurrency.lockutils [req-f18d896b-1830-442d-b0ba-e676f674fac3 req-d053b213-06c2-4fe7-98bc-07802c81bb3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.675 227766 DEBUG nova.network.neutron [req-f18d896b-1830-442d-b0ba-e676f674fac3 req-d053b213-06c2-4fe7-98bc-07802c81bb3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Refreshing network info cache for port d4f06800-1f0a-4f50-b00d-b10219301efc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.678 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Start _get_guest_xml network_info=[{"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:19:54Z,direct_url=<?>,disk_format='qcow2',id=bba8f8ac-6563-4b96-a735-670d31b1818b,min_disk=0,min_ram=0,name='tempest-scenario-img--131466346',owner='b976daabc8124a99814954633f99ed7b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:19:57Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': 'bba8f8ac-6563-4b96-a735-670d31b1818b'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.681 227766 WARNING nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.688 227766 DEBUG nova.virt.libvirt.host [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.688 227766 DEBUG nova.virt.libvirt.host [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.692 227766 DEBUG nova.virt.libvirt.host [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.693 227766 DEBUG nova.virt.libvirt.host [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.695 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.695 227766 DEBUG nova.virt.hardware [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:19:54Z,direct_url=<?>,disk_format='qcow2',id=bba8f8ac-6563-4b96-a735-670d31b1818b,min_disk=0,min_ram=0,name='tempest-scenario-img--131466346',owner='b976daabc8124a99814954633f99ed7b',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:19:57Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.695 227766 DEBUG nova.virt.hardware [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.696 227766 DEBUG nova.virt.hardware [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.696 227766 DEBUG nova.virt.hardware [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.696 227766 DEBUG nova.virt.hardware [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.696 227766 DEBUG nova.virt.hardware [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.696 227766 DEBUG nova.virt.hardware [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.697 227766 DEBUG nova.virt.hardware [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.697 227766 DEBUG nova.virt.hardware [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.697 227766 DEBUG nova.virt.hardware [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.697 227766 DEBUG nova.virt.hardware [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:20:55 np0005593234 nova_compute[227762]: 2026-01-23 10:20:55.700 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:55.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:20:56 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/353415393' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.138 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.166 227766 DEBUG nova.storage.rbd_utils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] rbd image a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.170 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:56.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:20:56 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1556264655' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.601 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.603 227766 DEBUG nova.virt.libvirt.vif [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:20:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-79223605',display_name='tempest-TestMinimumBasicScenario-server-79223605',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-79223605',id=154,image_ref='bba8f8ac-6563-4b96-a735-670d31b1818b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8Y9UEL92/+TB+I+GNhaZt1mYMByc7/BrYfEDaKlAAZo7j91A8ceJavobN2fd/HuU5MXKggpmRNE2fbSVJxSSFeNnWSzt9Sqrij7kFCnUGkI6fsAtTHbWMsV0NwSH55dw==',key_name='tempest-TestMinimumBasicScenario-867634297',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b976daabc8124a99814954633f99ed7b',ramdisk_id='',reservation_id='r-o6qhqwf7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bba8f8ac-6563-4b96-a735-670d31b1818b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1465373740',owner_user_name='tempest-TestMinimumBasicScenario-1465373740-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:20:33Z,user_data=None,user_id='c041da0a601a4260b29fc9c65719597f',uuid=a00a5042-ce71-4ecf-ab8f-d9e596d48035,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.604 227766 DEBUG nova.network.os_vif_util [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converting VIF {"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.605 227766 DEBUG nova.network.os_vif_util [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:4f:d5,bridge_name='br-int',has_traffic_filtering=True,id=d4f06800-1f0a-4f50-b00d-b10219301efc,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f06800-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.606 227766 DEBUG nova.objects.instance [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'pci_devices' on Instance uuid a00a5042-ce71-4ecf-ab8f-d9e596d48035 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.640 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  <uuid>a00a5042-ce71-4ecf-ab8f-d9e596d48035</uuid>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  <name>instance-0000009a</name>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestMinimumBasicScenario-server-79223605</nova:name>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:20:55</nova:creationTime>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <nova:user uuid="c041da0a601a4260b29fc9c65719597f">tempest-TestMinimumBasicScenario-1465373740-project-member</nova:user>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <nova:project uuid="b976daabc8124a99814954633f99ed7b">tempest-TestMinimumBasicScenario-1465373740</nova:project>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="bba8f8ac-6563-4b96-a735-670d31b1818b"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <nova:port uuid="d4f06800-1f0a-4f50-b00d-b10219301efc">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <entry name="serial">a00a5042-ce71-4ecf-ab8f-d9e596d48035</entry>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <entry name="uuid">a00a5042-ce71-4ecf-ab8f-d9e596d48035</entry>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk.config">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:01:4f:d5"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <target dev="tapd4f06800-1f"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/a00a5042-ce71-4ecf-ab8f-d9e596d48035/console.log" append="off"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:20:56 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:20:56 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:20:56 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:20:56 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.641 227766 DEBUG nova.compute.manager [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Preparing to wait for external event network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.642 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.642 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.642 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.643 227766 DEBUG nova.virt.libvirt.vif [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:20:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-79223605',display_name='tempest-TestMinimumBasicScenario-server-79223605',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-79223605',id=154,image_ref='bba8f8ac-6563-4b96-a735-670d31b1818b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8Y9UEL92/+TB+I+GNhaZt1mYMByc7/BrYfEDaKlAAZo7j91A8ceJavobN2fd/HuU5MXKggpmRNE2fbSVJxSSFeNnWSzt9Sqrij7kFCnUGkI6fsAtTHbWMsV0NwSH55dw==',key_name='tempest-TestMinimumBasicScenario-867634297',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b976daabc8124a99814954633f99ed7b',ramdisk_id='',reservation_id='r-o6qhqwf7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bba8f8ac-6563-4b96-a735-670d31b1818b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestMinimumBasicScenario-1465373740',owner_user_name='tempest-TestMinimumBasicScenario-1465373740-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:20:33Z,user_data=None,user_id='c041da0a601a4260b29fc9c65719597f',uuid=a00a5042-ce71-4ecf-ab8f-d9e596d48035,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.643 227766 DEBUG nova.network.os_vif_util [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converting VIF {"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.644 227766 DEBUG nova.network.os_vif_util [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:01:4f:d5,bridge_name='br-int',has_traffic_filtering=True,id=d4f06800-1f0a-4f50-b00d-b10219301efc,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f06800-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.644 227766 DEBUG os_vif [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:4f:d5,bridge_name='br-int',has_traffic_filtering=True,id=d4f06800-1f0a-4f50-b00d-b10219301efc,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f06800-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.645 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.645 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.646 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.650 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.650 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4f06800-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.651 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd4f06800-1f, col_values=(('external_ids', {'iface-id': 'd4f06800-1f0a-4f50-b00d-b10219301efc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:4f:d5', 'vm-uuid': 'a00a5042-ce71-4ecf-ab8f-d9e596d48035'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:20:56 np0005593234 NetworkManager[48942]: <info>  [1769163656.6533] manager: (tapd4f06800-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/304)
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.655 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.657 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.659 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.660 227766 INFO os_vif [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:01:4f:d5,bridge_name='br-int',has_traffic_filtering=True,id=d4f06800-1f0a-4f50-b00d-b10219301efc,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f06800-1f')#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.769 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.769 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.770 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] No VIF found with MAC fa:16:3e:01:4f:d5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.770 227766 INFO nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Using config drive#033[00m
Jan 23 05:20:56 np0005593234 nova_compute[227762]: 2026-01-23 10:20:56.797 227766 DEBUG nova.storage.rbd_utils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] rbd image a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:20:57 np0005593234 nova_compute[227762]: 2026-01-23 10:20:57.077 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Updating instance_info_cache with network_info: [{"id": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "address": "fa:16:3e:69:17:60", "network": {"id": "ef2a274e-4da0-400b-bcb7-ccf7f53401c1", "bridge": "br-int", "label": "tempest-network-smoke--403303166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8c7e858-f8", "ovs_interfaceid": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:20:57 np0005593234 nova_compute[227762]: 2026-01-23 10:20:57.119 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-a0be3878-0750-42e1-8219-5dd9d4b3412c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:20:57 np0005593234 nova_compute[227762]: 2026-01-23 10:20:57.120 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:20:57 np0005593234 nova_compute[227762]: 2026-01-23 10:20:57.120 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:57 np0005593234 nova_compute[227762]: 2026-01-23 10:20:57.704 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:57.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:20:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:58.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:58 np0005593234 nova_compute[227762]: 2026-01-23 10:20:58.480 227766 INFO nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Creating config drive at /var/lib/nova/instances/a00a5042-ce71-4ecf-ab8f-d9e596d48035/disk.config#033[00m
Jan 23 05:20:58 np0005593234 nova_compute[227762]: 2026-01-23 10:20:58.485 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a00a5042-ce71-4ecf-ab8f-d9e596d48035/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_o9qn896 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:58 np0005593234 nova_compute[227762]: 2026-01-23 10:20:58.618 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a00a5042-ce71-4ecf-ab8f-d9e596d48035/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_o9qn896" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:58 np0005593234 nova_compute[227762]: 2026-01-23 10:20:58.648 227766 DEBUG nova.storage.rbd_utils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] rbd image a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:20:58 np0005593234 nova_compute[227762]: 2026-01-23 10:20:58.652 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/a00a5042-ce71-4ecf-ab8f-d9e596d48035/disk.config a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:20:58 np0005593234 nova_compute[227762]: 2026-01-23 10:20:58.802 227766 DEBUG oslo_concurrency.processutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/a00a5042-ce71-4ecf-ab8f-d9e596d48035/disk.config a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.149s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:20:58 np0005593234 nova_compute[227762]: 2026-01-23 10:20:58.802 227766 INFO nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Deleting local config drive /var/lib/nova/instances/a00a5042-ce71-4ecf-ab8f-d9e596d48035/disk.config because it was imported into RBD.#033[00m
Jan 23 05:20:58 np0005593234 kernel: tapd4f06800-1f: entered promiscuous mode
Jan 23 05:20:58 np0005593234 NetworkManager[48942]: <info>  [1769163658.8532] manager: (tapd4f06800-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/305)
Jan 23 05:20:58 np0005593234 nova_compute[227762]: 2026-01-23 10:20:58.854 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:58 np0005593234 ovn_controller[134547]: 2026-01-23T10:20:58Z|00628|binding|INFO|Claiming lport d4f06800-1f0a-4f50-b00d-b10219301efc for this chassis.
Jan 23 05:20:58 np0005593234 ovn_controller[134547]: 2026-01-23T10:20:58Z|00629|binding|INFO|d4f06800-1f0a-4f50-b00d-b10219301efc: Claiming fa:16:3e:01:4f:d5 10.100.0.13
Jan 23 05:20:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:58.867 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:4f:d5 10.100.0.13'], port_security=['fa:16:3e:01:4f:d5 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a00a5042-ce71-4ecf-ab8f-d9e596d48035', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b976daabc8124a99814954633f99ed7b', 'neutron:revision_number': '2', 'neutron:security_group_ids': '77e10692-5f18-4d4e-ba14-6f09047b276a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0878063b-8606-438f-ae03-20f399cd80c4, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=d4f06800-1f0a-4f50-b00d-b10219301efc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:20:58 np0005593234 ovn_controller[134547]: 2026-01-23T10:20:58Z|00630|binding|INFO|Setting lport d4f06800-1f0a-4f50-b00d-b10219301efc ovn-installed in OVS
Jan 23 05:20:58 np0005593234 ovn_controller[134547]: 2026-01-23T10:20:58Z|00631|binding|INFO|Setting lport d4f06800-1f0a-4f50-b00d-b10219301efc up in Southbound
Jan 23 05:20:58 np0005593234 nova_compute[227762]: 2026-01-23 10:20:58.871 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:58.873 144381 INFO neutron.agent.ovn.metadata.agent [-] Port d4f06800-1f0a-4f50-b00d-b10219301efc in datapath 8b38c3ca-73e5-4583-a277-cd0670deffdb bound to our chassis#033[00m
Jan 23 05:20:58 np0005593234 nova_compute[227762]: 2026-01-23 10:20:58.875 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:58.875 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8b38c3ca-73e5-4583-a277-cd0670deffdb#033[00m
Jan 23 05:20:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:58.890 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[badb7381-5007-4089-8122-cfd2facc6c4b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:58.892 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8b38c3ca-71 in ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:20:58 np0005593234 systemd-machined[195626]: New machine qemu-72-instance-0000009a.
Jan 23 05:20:58 np0005593234 systemd-udevd[297472]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:20:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:58.896 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8b38c3ca-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:20:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:58.896 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0f21c52a-535d-4558-8e6a-d3b2b41308ba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:58.897 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[49295b75-548d-451e-908a-d2bba9d97027]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:58 np0005593234 systemd[1]: Started Virtual Machine qemu-72-instance-0000009a.
Jan 23 05:20:58 np0005593234 NetworkManager[48942]: <info>  [1769163658.9110] device (tapd4f06800-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:20:58 np0005593234 NetworkManager[48942]: <info>  [1769163658.9120] device (tapd4f06800-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:20:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:58.912 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[e6d605c9-204c-40e5-bde7-119ab147ed4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:58.940 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d31a3ba4-a4e6-458f-a856-352f89ead7f2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:58.977 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[5e5402c1-e5ba-4942-b3c5-74bd4656b8c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:58 np0005593234 NetworkManager[48942]: <info>  [1769163658.9845] manager: (tap8b38c3ca-70): new Veth device (/org/freedesktop/NetworkManager/Devices/306)
Jan 23 05:20:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:58.983 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f63b26ec-7a33-4c3f-abaa-553217ee4049]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.018 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[62741f15-f296-4f82-bc18-67bc0e01b5a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.020 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[2e22c928-a20a-4411-be06-4d2620d0d24f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:59 np0005593234 NetworkManager[48942]: <info>  [1769163659.0431] device (tap8b38c3ca-70): carrier: link connected
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.048 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[512868dc-12d1-4083-8e26-ba254836d952]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.063 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[77687f6c-55eb-4a36-9897-38af1b341e29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b38c3ca-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:fa:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 755400, 'reachable_time': 23747, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297504, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.077 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[df3ac547-5cb0-4d0c-a61f-34cb5f0e712f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8e:fa5b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 755400, 'tstamp': 755400}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 297505, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.098 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4ad2e289-27bc-45a0-8065-f48e0037cc86]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b38c3ca-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:fa:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 196], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 755400, 'reachable_time': 23747, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 297506, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.130 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e8b49a71-30db-4e1b-94d3-89d5e0d297d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.185 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a19b3848-7782-4ee7-96d9-5e388ddce695]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.187 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b38c3ca-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.187 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.188 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b38c3ca-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:20:59 np0005593234 kernel: tap8b38c3ca-70: entered promiscuous mode
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.189 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:59 np0005593234 NetworkManager[48942]: <info>  [1769163659.1905] manager: (tap8b38c3ca-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/307)
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.196 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8b38c3ca-70, col_values=(('external_ids', {'iface-id': '120d9d64-6853-4b50-a095-bddadd015ba1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.197 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:59 np0005593234 ovn_controller[134547]: 2026-01-23T10:20:59Z|00632|binding|INFO|Releasing lport 120d9d64-6853-4b50-a095-bddadd015ba1 from this chassis (sb_readonly=0)
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.199 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.200 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8b38c3ca-73e5-4583-a277-cd0670deffdb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8b38c3ca-73e5-4583-a277-cd0670deffdb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.200 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae00d1e-5f8c-4a4c-8a9e-4e82ba81d121]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.201 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-8b38c3ca-73e5-4583-a277-cd0670deffdb
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/8b38c3ca-73e5-4583-a277-cd0670deffdb.pid.haproxy
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 8b38c3ca-73e5-4583-a277-cd0670deffdb
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:20:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:20:59.202 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'env', 'PROCESS_TAG=haproxy-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8b38c3ca-73e5-4583-a277-cd0670deffdb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.212 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.216 227766 DEBUG nova.network.neutron [req-f18d896b-1830-442d-b0ba-e676f674fac3 req-d053b213-06c2-4fe7-98bc-07802c81bb3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updated VIF entry in instance network info cache for port d4f06800-1f0a-4f50-b00d-b10219301efc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.216 227766 DEBUG nova.network.neutron [req-f18d896b-1830-442d-b0ba-e676f674fac3 req-d053b213-06c2-4fe7-98bc-07802c81bb3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updating instance_info_cache with network_info: [{"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.266 227766 DEBUG oslo_concurrency.lockutils [req-f18d896b-1830-442d-b0ba-e676f674fac3 req-d053b213-06c2-4fe7-98bc-07802c81bb3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:20:59 np0005593234 podman[297538]: 2026-01-23 10:20:59.596133915 +0000 UTC m=+0.049362305 container create 59df55eae410b89e6fc4247090b4f882ebc501e5091d515adb73f6b4dba4ffb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:20:59 np0005593234 systemd[1]: Started libpod-conmon-59df55eae410b89e6fc4247090b4f882ebc501e5091d515adb73f6b4dba4ffb6.scope.
Jan 23 05:20:59 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:20:59 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8297876adb61a8222b954e849161ea0bc6d013fdaa6ff0f8793e50a99c82c48c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:20:59 np0005593234 podman[297538]: 2026-01-23 10:20:59.569739424 +0000 UTC m=+0.022967834 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:20:59 np0005593234 podman[297538]: 2026-01-23 10:20:59.671036242 +0000 UTC m=+0.124264652 container init 59df55eae410b89e6fc4247090b4f882ebc501e5091d515adb73f6b4dba4ffb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 05:20:59 np0005593234 podman[297538]: 2026-01-23 10:20:59.676402428 +0000 UTC m=+0.129630818 container start 59df55eae410b89e6fc4247090b4f882ebc501e5091d515adb73f6b4dba4ffb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.685 227766 DEBUG nova.compute.manager [req-01953a63-1ebc-4afd-98ee-f43f4ef36772 req-fdfc2e26-33b7-4bd1-ac9e-a8c14de151dc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.685 227766 DEBUG oslo_concurrency.lockutils [req-01953a63-1ebc-4afd-98ee-f43f4ef36772 req-fdfc2e26-33b7-4bd1-ac9e-a8c14de151dc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.685 227766 DEBUG oslo_concurrency.lockutils [req-01953a63-1ebc-4afd-98ee-f43f4ef36772 req-fdfc2e26-33b7-4bd1-ac9e-a8c14de151dc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.686 227766 DEBUG oslo_concurrency.lockutils [req-01953a63-1ebc-4afd-98ee-f43f4ef36772 req-fdfc2e26-33b7-4bd1-ac9e-a8c14de151dc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.686 227766 DEBUG nova.compute.manager [req-01953a63-1ebc-4afd-98ee-f43f4ef36772 req-fdfc2e26-33b7-4bd1-ac9e-a8c14de151dc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Processing event network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:20:59 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[297554]: [NOTICE]   (297558) : New worker (297575) forked
Jan 23 05:20:59 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[297554]: [NOTICE]   (297558) : Loading success.
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:20:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:20:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:20:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:59.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.874 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163659.8741975, a00a5042-ce71-4ecf-ab8f-d9e596d48035 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.875 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] VM Started (Lifecycle Event)#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.877 227766 DEBUG nova.compute.manager [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.880 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.884 227766 INFO nova.virt.libvirt.driver [-] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Instance spawned successfully.#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.885 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.915 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.922 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.923 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.923 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.924 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.925 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.926 227766 DEBUG nova.virt.libvirt.driver [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:20:59 np0005593234 nova_compute[227762]: 2026-01-23 10:20:59.933 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:21:00 np0005593234 nova_compute[227762]: 2026-01-23 10:21:00.059 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:21:00 np0005593234 nova_compute[227762]: 2026-01-23 10:21:00.060 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163659.8744404, a00a5042-ce71-4ecf-ab8f-d9e596d48035 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:21:00 np0005593234 nova_compute[227762]: 2026-01-23 10:21:00.060 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:21:00 np0005593234 nova_compute[227762]: 2026-01-23 10:21:00.117 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:21:00 np0005593234 nova_compute[227762]: 2026-01-23 10:21:00.120 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163659.88021, a00a5042-ce71-4ecf-ab8f-d9e596d48035 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:21:00 np0005593234 nova_compute[227762]: 2026-01-23 10:21:00.120 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:21:00 np0005593234 nova_compute[227762]: 2026-01-23 10:21:00.181 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:21:00 np0005593234 nova_compute[227762]: 2026-01-23 10:21:00.185 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:21:00 np0005593234 nova_compute[227762]: 2026-01-23 10:21:00.207 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:21:00 np0005593234 nova_compute[227762]: 2026-01-23 10:21:00.223 227766 INFO nova.compute.manager [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Took 26.24 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:21:00 np0005593234 nova_compute[227762]: 2026-01-23 10:21:00.223 227766 DEBUG nova.compute.manager [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:21:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:00.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:00 np0005593234 nova_compute[227762]: 2026-01-23 10:21:00.605 227766 INFO nova.compute.manager [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Took 29.49 seconds to build instance.#033[00m
Jan 23 05:21:01 np0005593234 nova_compute[227762]: 2026-01-23 10:21:01.654 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:01 np0005593234 nova_compute[227762]: 2026-01-23 10:21:01.782 227766 DEBUG oslo_concurrency.lockutils [None req-6f930a82-0e49-4cf9-bfc9-c43c21ccee47 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 30.955s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:01 np0005593234 podman[297612]: 2026-01-23 10:21:01.805364521 +0000 UTC m=+0.104541450 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:21:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:01.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:02 np0005593234 nova_compute[227762]: 2026-01-23 10:21:02.162 227766 DEBUG nova.compute.manager [req-67fc8977-b226-494d-b42e-6ce6566b52ee req-4b72574d-705a-4248-8749-e7eb5c89e17d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:02 np0005593234 nova_compute[227762]: 2026-01-23 10:21:02.163 227766 DEBUG oslo_concurrency.lockutils [req-67fc8977-b226-494d-b42e-6ce6566b52ee req-4b72574d-705a-4248-8749-e7eb5c89e17d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:02 np0005593234 nova_compute[227762]: 2026-01-23 10:21:02.163 227766 DEBUG oslo_concurrency.lockutils [req-67fc8977-b226-494d-b42e-6ce6566b52ee req-4b72574d-705a-4248-8749-e7eb5c89e17d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:02 np0005593234 nova_compute[227762]: 2026-01-23 10:21:02.163 227766 DEBUG oslo_concurrency.lockutils [req-67fc8977-b226-494d-b42e-6ce6566b52ee req-4b72574d-705a-4248-8749-e7eb5c89e17d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:02 np0005593234 nova_compute[227762]: 2026-01-23 10:21:02.163 227766 DEBUG nova.compute.manager [req-67fc8977-b226-494d-b42e-6ce6566b52ee req-4b72574d-705a-4248-8749-e7eb5c89e17d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] No waiting events found dispatching network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:21:02 np0005593234 nova_compute[227762]: 2026-01-23 10:21:02.164 227766 WARNING nova.compute.manager [req-67fc8977-b226-494d-b42e-6ce6566b52ee req-4b72574d-705a-4248-8749-e7eb5c89e17d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received unexpected event network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc for instance with vm_state active and task_state None.#033[00m
Jan 23 05:21:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:02.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:02 np0005593234 nova_compute[227762]: 2026-01-23 10:21:02.706 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:02 np0005593234 nova_compute[227762]: 2026-01-23 10:21:02.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:03.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:04.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:05.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:06.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:06 np0005593234 nova_compute[227762]: 2026-01-23 10:21:06.657 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:07 np0005593234 nova_compute[227762]: 2026-01-23 10:21:07.746 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:07.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:08.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:09.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:10.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:10 np0005593234 nova_compute[227762]: 2026-01-23 10:21:10.850 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:11 np0005593234 nova_compute[227762]: 2026-01-23 10:21:11.664 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:11.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:12.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:12 np0005593234 nova_compute[227762]: 2026-01-23 10:21:12.747 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:13.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:21:13Z|00069|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:01:4f:d5 10.100.0.13
Jan 23 05:21:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:21:13Z|00070|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:01:4f:d5 10.100.0.13
Jan 23 05:21:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:14.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.086 227766 DEBUG oslo_concurrency.lockutils [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a0be3878-0750-42e1-8219-5dd9d4b3412c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.086 227766 DEBUG oslo_concurrency.lockutils [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.087 227766 DEBUG oslo_concurrency.lockutils [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.087 227766 DEBUG oslo_concurrency.lockutils [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.087 227766 DEBUG oslo_concurrency.lockutils [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.088 227766 INFO nova.compute.manager [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Terminating instance#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.089 227766 DEBUG nova.compute.manager [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:21:15 np0005593234 kernel: tapa8c7e858-f8 (unregistering): left promiscuous mode
Jan 23 05:21:15 np0005593234 NetworkManager[48942]: <info>  [1769163675.1415] device (tapa8c7e858-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:21:15 np0005593234 ovn_controller[134547]: 2026-01-23T10:21:15Z|00633|binding|INFO|Releasing lport a8c7e858-f840-4ed1-b47f-7d4497071e55 from this chassis (sb_readonly=0)
Jan 23 05:21:15 np0005593234 ovn_controller[134547]: 2026-01-23T10:21:15Z|00634|binding|INFO|Setting lport a8c7e858-f840-4ed1-b47f-7d4497071e55 down in Southbound
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.151 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:15 np0005593234 ovn_controller[134547]: 2026-01-23T10:21:15Z|00635|binding|INFO|Removing iface tapa8c7e858-f8 ovn-installed in OVS
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.166 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:15.171 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:17:60 10.100.0.3'], port_security=['fa:16:3e:69:17:60 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'a0be3878-0750-42e1-8219-5dd9d4b3412c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ef2a274e-4da0-400b-bcb7-ccf7f53401c1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3f0c6dd0-0716-4c13-90bf-a399480fe5a7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=211b0b88-3773-4743-9d70-bd2a35ab028e, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=a8c7e858-f840-4ed1-b47f-7d4497071e55) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:21:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:15.173 144381 INFO neutron.agent.ovn.metadata.agent [-] Port a8c7e858-f840-4ed1-b47f-7d4497071e55 in datapath ef2a274e-4da0-400b-bcb7-ccf7f53401c1 unbound from our chassis#033[00m
Jan 23 05:21:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:15.175 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network ef2a274e-4da0-400b-bcb7-ccf7f53401c1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:21:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:15.177 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fdd72957-5e6e-4c75-8ba2-4bd27b6cca49]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:15.178 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1 namespace which is not needed anymore#033[00m
Jan 23 05:21:15 np0005593234 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000095.scope: Deactivated successfully.
Jan 23 05:21:15 np0005593234 systemd[1]: machine-qemu\x2d71\x2dinstance\x2d00000095.scope: Consumed 18.982s CPU time.
Jan 23 05:21:15 np0005593234 systemd-machined[195626]: Machine qemu-71-instance-00000095 terminated.
Jan 23 05:21:15 np0005593234 neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1[296150]: [NOTICE]   (296155) : haproxy version is 2.8.14-c23fe91
Jan 23 05:21:15 np0005593234 neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1[296150]: [NOTICE]   (296155) : path to executable is /usr/sbin/haproxy
Jan 23 05:21:15 np0005593234 neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1[296150]: [WARNING]  (296155) : Exiting Master process...
Jan 23 05:21:15 np0005593234 neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1[296150]: [ALERT]    (296155) : Current worker (296157) exited with code 143 (Terminated)
Jan 23 05:21:15 np0005593234 neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1[296150]: [WARNING]  (296155) : All workers exited. Exiting... (0)
Jan 23 05:21:15 np0005593234 systemd[1]: libpod-bdfbd5c6c0d5aa3237496008b3f16083c3f7345fe88d5ad3e3056fdf209484a1.scope: Deactivated successfully.
Jan 23 05:21:15 np0005593234 podman[297722]: 2026-01-23 10:21:15.306108021 +0000 UTC m=+0.041195641 container died bdfbd5c6c0d5aa3237496008b3f16083c3f7345fe88d5ad3e3056fdf209484a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.312 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.315 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.347 227766 INFO nova.virt.libvirt.driver [-] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Instance destroyed successfully.#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.348 227766 DEBUG nova.objects.instance [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'resources' on Instance uuid a0be3878-0750-42e1-8219-5dd9d4b3412c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:21:15 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bdfbd5c6c0d5aa3237496008b3f16083c3f7345fe88d5ad3e3056fdf209484a1-userdata-shm.mount: Deactivated successfully.
Jan 23 05:21:15 np0005593234 systemd[1]: var-lib-containers-storage-overlay-185b1b81c0f784497799f7e52a890ed056602473d9ea8be925ef01ef7cbb8ce2-merged.mount: Deactivated successfully.
Jan 23 05:21:15 np0005593234 podman[297722]: 2026-01-23 10:21:15.375419996 +0000 UTC m=+0.110507616 container cleanup bdfbd5c6c0d5aa3237496008b3f16083c3f7345fe88d5ad3e3056fdf209484a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:21:15 np0005593234 systemd[1]: libpod-conmon-bdfbd5c6c0d5aa3237496008b3f16083c3f7345fe88d5ad3e3056fdf209484a1.scope: Deactivated successfully.
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.392 227766 DEBUG nova.virt.libvirt.vif [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-942916977',display_name='tempest-TestNetworkBasicOps-server-942916977',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-942916977',id=149,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJdB97H3R8BgDR2jd94b/eFyJTAqvmLTTsSC7oR+dOUgKelzDzIxuLparKQHADcuxki2LCEgJ5UyTzGmYLOA+Tbo16wmjo1/o8NzBKLh8f0Lbh7anUmFdDARVVV8OKyqUg==',key_name='tempest-TestNetworkBasicOps-1305641656',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:19:03Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-s1tm5r0p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:19:03Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=a0be3878-0750-42e1-8219-5dd9d4b3412c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "address": "fa:16:3e:69:17:60", "network": {"id": "ef2a274e-4da0-400b-bcb7-ccf7f53401c1", "bridge": "br-int", "label": "tempest-network-smoke--403303166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8c7e858-f8", "ovs_interfaceid": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.392 227766 DEBUG nova.network.os_vif_util [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "address": "fa:16:3e:69:17:60", "network": {"id": "ef2a274e-4da0-400b-bcb7-ccf7f53401c1", "bridge": "br-int", "label": "tempest-network-smoke--403303166", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8c7e858-f8", "ovs_interfaceid": "a8c7e858-f840-4ed1-b47f-7d4497071e55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.393 227766 DEBUG nova.network.os_vif_util [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:69:17:60,bridge_name='br-int',has_traffic_filtering=True,id=a8c7e858-f840-4ed1-b47f-7d4497071e55,network=Network(ef2a274e-4da0-400b-bcb7-ccf7f53401c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8c7e858-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.394 227766 DEBUG os_vif [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:17:60,bridge_name='br-int',has_traffic_filtering=True,id=a8c7e858-f840-4ed1-b47f-7d4497071e55,network=Network(ef2a274e-4da0-400b-bcb7-ccf7f53401c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8c7e858-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.396 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.396 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa8c7e858-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.397 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.399 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.401 227766 INFO os_vif [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:17:60,bridge_name='br-int',has_traffic_filtering=True,id=a8c7e858-f840-4ed1-b47f-7d4497071e55,network=Network(ef2a274e-4da0-400b-bcb7-ccf7f53401c1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8c7e858-f8')#033[00m
Jan 23 05:21:15 np0005593234 podman[297760]: 2026-01-23 10:21:15.438360501 +0000 UTC m=+0.044395710 container remove bdfbd5c6c0d5aa3237496008b3f16083c3f7345fe88d5ad3e3056fdf209484a1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:21:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:15.444 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[671b7bcd-8ce8-4865-bffd-cc75db747af1]: (4, ('Fri Jan 23 10:21:15 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1 (bdfbd5c6c0d5aa3237496008b3f16083c3f7345fe88d5ad3e3056fdf209484a1)\nbdfbd5c6c0d5aa3237496008b3f16083c3f7345fe88d5ad3e3056fdf209484a1\nFri Jan 23 10:21:15 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1 (bdfbd5c6c0d5aa3237496008b3f16083c3f7345fe88d5ad3e3056fdf209484a1)\nbdfbd5c6c0d5aa3237496008b3f16083c3f7345fe88d5ad3e3056fdf209484a1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:15.446 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8b0e87-dc7e-4cfb-9cde-b0cdded909c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:15.448 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapef2a274e-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.449 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:15 np0005593234 kernel: tapef2a274e-40: left promiscuous mode
Jan 23 05:21:15 np0005593234 nova_compute[227762]: 2026-01-23 10:21:15.462 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:15.466 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[824b2281-a645-4154-8514-3ecc7da0f27a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:15.486 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d38830-e47f-4dbe-810c-77a18728c658]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:15.487 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fd9ea94d-07db-4e82-b8a1-b9383aa51e4e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:15.505 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3fe149c4-ff70-423b-9a19-fd01f7e3e9ca]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 743740, 'reachable_time': 33788, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 297793, 'error': None, 'target': 'ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:15 np0005593234 systemd[1]: run-netns-ovnmeta\x2def2a274e\x2d4da0\x2d400b\x2dbcb7\x2dccf7f53401c1.mount: Deactivated successfully.
Jan 23 05:21:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:15.510 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-ef2a274e-4da0-400b-bcb7-ccf7f53401c1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:21:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:15.510 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[bc8c9cc2-6c61-4d46-bd5b-b07ade4b8d65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:15.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:16.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.053 227766 DEBUG oslo_concurrency.lockutils [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.053 227766 DEBUG oslo_concurrency.lockutils [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.086 227766 DEBUG nova.objects.instance [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'flavor' on Instance uuid a00a5042-ce71-4ecf-ab8f-d9e596d48035 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.142 227766 DEBUG oslo_concurrency.lockutils [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.664 227766 INFO nova.virt.libvirt.driver [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Deleting instance files /var/lib/nova/instances/a0be3878-0750-42e1-8219-5dd9d4b3412c_del#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.665 227766 INFO nova.virt.libvirt.driver [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Deletion of /var/lib/nova/instances/a0be3878-0750-42e1-8219-5dd9d4b3412c_del complete#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.749 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.763 227766 DEBUG oslo_concurrency.lockutils [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.764 227766 DEBUG oslo_concurrency.lockutils [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.764 227766 INFO nova.compute.manager [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Attaching volume 7857fc75-1658-465f-a6a1-40f608f6408e to /dev/vdb#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.814 227766 INFO nova.compute.manager [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Took 2.72 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.815 227766 DEBUG oslo.service.loopingcall [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.815 227766 DEBUG nova.compute.manager [-] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.816 227766 DEBUG nova.network.neutron [-] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:21:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:17.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.898 227766 DEBUG nova.compute.manager [req-477029df-8514-41d7-9a10-238d3d8bfb13 req-c3724da3-7116-41f9-a363-e11e058ed9a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Received event network-vif-unplugged-a8c7e858-f840-4ed1-b47f-7d4497071e55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.899 227766 DEBUG oslo_concurrency.lockutils [req-477029df-8514-41d7-9a10-238d3d8bfb13 req-c3724da3-7116-41f9-a363-e11e058ed9a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.899 227766 DEBUG oslo_concurrency.lockutils [req-477029df-8514-41d7-9a10-238d3d8bfb13 req-c3724da3-7116-41f9-a363-e11e058ed9a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.900 227766 DEBUG oslo_concurrency.lockutils [req-477029df-8514-41d7-9a10-238d3d8bfb13 req-c3724da3-7116-41f9-a363-e11e058ed9a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.900 227766 DEBUG nova.compute.manager [req-477029df-8514-41d7-9a10-238d3d8bfb13 req-c3724da3-7116-41f9-a363-e11e058ed9a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] No waiting events found dispatching network-vif-unplugged-a8c7e858-f840-4ed1-b47f-7d4497071e55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:21:17 np0005593234 nova_compute[227762]: 2026-01-23 10:21:17.900 227766 DEBUG nova.compute.manager [req-477029df-8514-41d7-9a10-238d3d8bfb13 req-c3724da3-7116-41f9-a363-e11e058ed9a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Received event network-vif-unplugged-a8c7e858-f840-4ed1-b47f-7d4497071e55 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.222 227766 DEBUG os_brick.utils [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.225 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.249 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.249 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[698e125f-e45f-4bc5-8132-2d5b32ed134b]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.251 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.265 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.014s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.265 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[c8975c00-0a33-47cf-8c8f-11c1bf7b4ca7]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.267 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.282 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.283 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[23a2ace1-347f-4b67-9677-0402715288f1]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.284 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[e0172299-2d3e-4710-b289-ed8f8072f88d]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.285 227766 DEBUG oslo_concurrency.processutils [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.316 227766 DEBUG oslo_concurrency.processutils [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "nvme version" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.319 227766 DEBUG os_brick.initiator.connectors.lightos [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.320 227766 DEBUG os_brick.initiator.connectors.lightos [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.320 227766 DEBUG os_brick.initiator.connectors.lightos [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.320 227766 DEBUG os_brick.utils [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] <== get_connector_properties: return (97ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:21:18 np0005593234 nova_compute[227762]: 2026-01-23 10:21:18.321 227766 DEBUG nova.virt.block_device [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updating existing volume attachment record: 0c773f34-5811-4b94-875c-528e61a6d7b2 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:21:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:18.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:19 np0005593234 nova_compute[227762]: 2026-01-23 10:21:19.710 227766 DEBUG nova.network.neutron [-] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:21:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:21:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/471357842' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:21:19 np0005593234 nova_compute[227762]: 2026-01-23 10:21:19.753 227766 INFO nova.compute.manager [-] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Took 1.94 seconds to deallocate network for instance.#033[00m
Jan 23 05:21:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:21:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:19.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:21:19 np0005593234 nova_compute[227762]: 2026-01-23 10:21:19.882 227766 DEBUG oslo_concurrency.lockutils [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:19 np0005593234 nova_compute[227762]: 2026-01-23 10:21:19.883 227766 DEBUG oslo_concurrency.lockutils [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:19 np0005593234 nova_compute[227762]: 2026-01-23 10:21:19.962 227766 DEBUG nova.compute.manager [req-1d71448f-d7a2-4622-88d7-1a074c008400 req-4562f606-ffdb-462c-9580-7452a323f72e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Received event network-vif-deleted-a8c7e858-f840-4ed1-b47f-7d4497071e55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:20 np0005593234 nova_compute[227762]: 2026-01-23 10:21:20.027 227766 DEBUG nova.objects.instance [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'flavor' on Instance uuid a00a5042-ce71-4ecf-ab8f-d9e596d48035 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:21:20 np0005593234 nova_compute[227762]: 2026-01-23 10:21:20.073 227766 DEBUG nova.virt.libvirt.driver [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Attempting to attach volume 7857fc75-1658-465f-a6a1-40f608f6408e with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:21:20 np0005593234 nova_compute[227762]: 2026-01-23 10:21:20.076 227766 DEBUG nova.virt.libvirt.guest [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:21:20 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:21:20 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-7857fc75-1658-465f-a6a1-40f608f6408e">
Jan 23 05:21:20 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:21:20 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:21:20 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:21:20 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:21:20 np0005593234 nova_compute[227762]:  <auth username="openstack">
Jan 23 05:21:20 np0005593234 nova_compute[227762]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:21:20 np0005593234 nova_compute[227762]:  </auth>
Jan 23 05:21:20 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:21:20 np0005593234 nova_compute[227762]:  <serial>7857fc75-1658-465f-a6a1-40f608f6408e</serial>
Jan 23 05:21:20 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:21:20 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:21:20 np0005593234 nova_compute[227762]: 2026-01-23 10:21:20.345 227766 DEBUG nova.compute.manager [req-1a7be1ae-c0a0-4206-81b8-557a1da010f5 req-0a74d7bb-481d-49f1-97da-2a97ab3c5ad4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Received event network-vif-plugged-a8c7e858-f840-4ed1-b47f-7d4497071e55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:20 np0005593234 nova_compute[227762]: 2026-01-23 10:21:20.346 227766 DEBUG oslo_concurrency.lockutils [req-1a7be1ae-c0a0-4206-81b8-557a1da010f5 req-0a74d7bb-481d-49f1-97da-2a97ab3c5ad4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:20 np0005593234 nova_compute[227762]: 2026-01-23 10:21:20.346 227766 DEBUG oslo_concurrency.lockutils [req-1a7be1ae-c0a0-4206-81b8-557a1da010f5 req-0a74d7bb-481d-49f1-97da-2a97ab3c5ad4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:20 np0005593234 nova_compute[227762]: 2026-01-23 10:21:20.347 227766 DEBUG oslo_concurrency.lockutils [req-1a7be1ae-c0a0-4206-81b8-557a1da010f5 req-0a74d7bb-481d-49f1-97da-2a97ab3c5ad4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:20 np0005593234 nova_compute[227762]: 2026-01-23 10:21:20.347 227766 DEBUG nova.compute.manager [req-1a7be1ae-c0a0-4206-81b8-557a1da010f5 req-0a74d7bb-481d-49f1-97da-2a97ab3c5ad4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] No waiting events found dispatching network-vif-plugged-a8c7e858-f840-4ed1-b47f-7d4497071e55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:21:20 np0005593234 nova_compute[227762]: 2026-01-23 10:21:20.347 227766 WARNING nova.compute.manager [req-1a7be1ae-c0a0-4206-81b8-557a1da010f5 req-0a74d7bb-481d-49f1-97da-2a97ab3c5ad4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Received unexpected event network-vif-plugged-a8c7e858-f840-4ed1-b47f-7d4497071e55 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:21:20 np0005593234 nova_compute[227762]: 2026-01-23 10:21:20.397 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:20.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:20 np0005593234 nova_compute[227762]: 2026-01-23 10:21:20.567 227766 DEBUG nova.virt.libvirt.driver [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:21:20 np0005593234 nova_compute[227762]: 2026-01-23 10:21:20.568 227766 DEBUG nova.virt.libvirt.driver [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:21:20 np0005593234 nova_compute[227762]: 2026-01-23 10:21:20.568 227766 DEBUG nova.virt.libvirt.driver [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:21:20 np0005593234 nova_compute[227762]: 2026-01-23 10:21:20.568 227766 DEBUG nova.virt.libvirt.driver [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] No VIF found with MAC fa:16:3e:01:4f:d5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:21:21 np0005593234 nova_compute[227762]: 2026-01-23 10:21:21.078 227766 DEBUG oslo_concurrency.processutils [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:21 np0005593234 nova_compute[227762]: 2026-01-23 10:21:21.205 227766 DEBUG oslo_concurrency.lockutils [None req-02534ff4-fa20-468e-888f-ab2e8c131a9a c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 3.441s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:21:21 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2016328147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:21:21 np0005593234 nova_compute[227762]: 2026-01-23 10:21:21.518 227766 DEBUG oslo_concurrency.processutils [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:21 np0005593234 nova_compute[227762]: 2026-01-23 10:21:21.525 227766 DEBUG nova.compute.provider_tree [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:21:21 np0005593234 nova_compute[227762]: 2026-01-23 10:21:21.546 227766 DEBUG nova.scheduler.client.report [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:21:21 np0005593234 nova_compute[227762]: 2026-01-23 10:21:21.584 227766 DEBUG oslo_concurrency.lockutils [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:21.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:21 np0005593234 nova_compute[227762]: 2026-01-23 10:21:21.905 227766 INFO nova.scheduler.client.report [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Deleted allocations for instance a0be3878-0750-42e1-8219-5dd9d4b3412c#033[00m
Jan 23 05:21:22 np0005593234 nova_compute[227762]: 2026-01-23 10:21:22.226 227766 DEBUG oslo_concurrency.lockutils [None req-1e63a0ff-1fee-4cb9-ba6b-162a800f367a 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a0be3878-0750-42e1-8219-5dd9d4b3412c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:22 np0005593234 nova_compute[227762]: 2026-01-23 10:21:22.430 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:22.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:22 np0005593234 nova_compute[227762]: 2026-01-23 10:21:22.751 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e312 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:23.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:24.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:24 np0005593234 podman[297849]: 2026-01-23 10:21:24.774961802 +0000 UTC m=+0.058207570 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:21:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e313 e313: 3 total, 3 up, 3 in
Jan 23 05:21:25 np0005593234 nova_compute[227762]: 2026-01-23 10:21:25.398 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:21:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:25.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:21:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:26.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:27 np0005593234 nova_compute[227762]: 2026-01-23 10:21:27.753 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:27.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:28.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:29.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:30 np0005593234 nova_compute[227762]: 2026-01-23 10:21:30.344 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163675.3437085, a0be3878-0750-42e1-8219-5dd9d4b3412c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:21:30 np0005593234 nova_compute[227762]: 2026-01-23 10:21:30.344 227766 INFO nova.compute.manager [-] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:21:30 np0005593234 nova_compute[227762]: 2026-01-23 10:21:30.400 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:30.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:31 np0005593234 nova_compute[227762]: 2026-01-23 10:21:31.596 227766 DEBUG nova.compute.manager [None req-74030a8b-0bbf-4ce0-ace7-2123bc774222 - - - - - -] [instance: a0be3878-0750-42e1-8219-5dd9d4b3412c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:21:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:31.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:32.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:32 np0005593234 nova_compute[227762]: 2026-01-23 10:21:32.754 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:32 np0005593234 podman[297872]: 2026-01-23 10:21:32.777699766 +0000 UTC m=+0.072202535 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 23 05:21:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:33.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:34.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:35 np0005593234 nova_compute[227762]: 2026-01-23 10:21:35.403 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:35.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:36.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:37 np0005593234 nova_compute[227762]: 2026-01-23 10:21:37.768 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:37.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:38.176 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=57, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=56) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:21:38 np0005593234 nova_compute[227762]: 2026-01-23 10:21:38.176 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:38.177 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:21:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:38.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:39.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:40 np0005593234 nova_compute[227762]: 2026-01-23 10:21:40.406 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:40.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:41.179 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '57'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:21:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:41.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:42 np0005593234 ovn_controller[134547]: 2026-01-23T10:21:42Z|00636|binding|INFO|Releasing lport 120d9d64-6853-4b50-a095-bddadd015ba1 from this chassis (sb_readonly=0)
Jan 23 05:21:42 np0005593234 nova_compute[227762]: 2026-01-23 10:21:42.070 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:42.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:42 np0005593234 nova_compute[227762]: 2026-01-23 10:21:42.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:21:42 np0005593234 nova_compute[227762]: 2026-01-23 10:21:42.769 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:42 np0005593234 nova_compute[227762]: 2026-01-23 10:21:42.847 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:42 np0005593234 nova_compute[227762]: 2026-01-23 10:21:42.848 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:42 np0005593234 nova_compute[227762]: 2026-01-23 10:21:42.848 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:42 np0005593234 nova_compute[227762]: 2026-01-23 10:21:42.848 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:21:42 np0005593234 nova_compute[227762]: 2026-01-23 10:21:42.849 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:21:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:42.856 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:42.857 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:21:42.857 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:21:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:43 np0005593234 nova_compute[227762]: 2026-01-23 10:21:43.273 227766 DEBUG nova.compute.manager [req-7e2487c0-e9ad-46cb-9a3e-41638044fef5 req-ef8b6655-0e36-433a-b538-56983d2b9738 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-changed-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:21:43 np0005593234 nova_compute[227762]: 2026-01-23 10:21:43.273 227766 DEBUG nova.compute.manager [req-7e2487c0-e9ad-46cb-9a3e-41638044fef5 req-ef8b6655-0e36-433a-b538-56983d2b9738 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Refreshing instance network info cache due to event network-changed-d4f06800-1f0a-4f50-b00d-b10219301efc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:21:43 np0005593234 nova_compute[227762]: 2026-01-23 10:21:43.273 227766 DEBUG oslo_concurrency.lockutils [req-7e2487c0-e9ad-46cb-9a3e-41638044fef5 req-ef8b6655-0e36-433a-b538-56983d2b9738 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:21:43 np0005593234 nova_compute[227762]: 2026-01-23 10:21:43.273 227766 DEBUG oslo_concurrency.lockutils [req-7e2487c0-e9ad-46cb-9a3e-41638044fef5 req-ef8b6655-0e36-433a-b538-56983d2b9738 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:21:43 np0005593234 nova_compute[227762]: 2026-01-23 10:21:43.274 227766 DEBUG nova.network.neutron [req-7e2487c0-e9ad-46cb-9a3e-41638044fef5 req-ef8b6655-0e36-433a-b538-56983d2b9738 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Refreshing network info cache for port d4f06800-1f0a-4f50-b00d-b10219301efc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:21:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:21:43 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3277885011' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:21:43 np0005593234 nova_compute[227762]: 2026-01-23 10:21:43.294 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:21:43 np0005593234 nova_compute[227762]: 2026-01-23 10:21:43.638 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000009a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:21:43 np0005593234 nova_compute[227762]: 2026-01-23 10:21:43.638 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000009a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:21:43 np0005593234 nova_compute[227762]: 2026-01-23 10:21:43.639 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000009a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:21:43 np0005593234 nova_compute[227762]: 2026-01-23 10:21:43.791 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:21:43 np0005593234 nova_compute[227762]: 2026-01-23 10:21:43.792 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4145MB free_disk=20.89724349975586GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:21:43 np0005593234 nova_compute[227762]: 2026-01-23 10:21:43.792 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:21:43 np0005593234 nova_compute[227762]: 2026-01-23 10:21:43.792 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:21:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:21:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:43.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:21:44 np0005593234 nova_compute[227762]: 2026-01-23 10:21:44.125 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance a00a5042-ce71-4ecf-ab8f-d9e596d48035 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:21:44 np0005593234 nova_compute[227762]: 2026-01-23 10:21:44.125 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:21:44 np0005593234 nova_compute[227762]: 2026-01-23 10:21:44.125 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:21:44 np0005593234 nova_compute[227762]: 2026-01-23 10:21:44.174 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 05:21:44 np0005593234 nova_compute[227762]: 2026-01-23 10:21:44.224 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 05:21:44 np0005593234 nova_compute[227762]: 2026-01-23 10:21:44.225 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 05:21:44 np0005593234 nova_compute[227762]: 2026-01-23 10:21:44.258 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 05:21:44 np0005593234 nova_compute[227762]: 2026-01-23 10:21:44.287 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 05:21:44 np0005593234 nova_compute[227762]: 2026-01-23 10:21:44.374 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:21:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:21:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:44.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:21:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:21:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2534641836' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:21:44 np0005593234 nova_compute[227762]: 2026-01-23 10:21:44.820 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:21:44 np0005593234 nova_compute[227762]: 2026-01-23 10:21:44.827 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:21:44 np0005593234 nova_compute[227762]: 2026-01-23 10:21:44.855 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:21:45 np0005593234 nova_compute[227762]: 2026-01-23 10:21:45.409 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:21:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:45.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:21:46 np0005593234 nova_compute[227762]: 2026-01-23 10:21:46.223 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:21:46 np0005593234 nova_compute[227762]: 2026-01-23 10:21:46.224 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.432s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:21:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:46.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:47 np0005593234 nova_compute[227762]: 2026-01-23 10:21:47.808 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:47.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:48.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:49.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:50 np0005593234 nova_compute[227762]: 2026-01-23 10:21:50.412 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:50.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:50 np0005593234 podman[298178]: 2026-01-23 10:21:50.645764121 +0000 UTC m=+0.060686647 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=reef, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:21:50 np0005593234 podman[298178]: 2026-01-23 10:21:50.744982324 +0000 UTC m=+0.159904830 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:21:51 np0005593234 nova_compute[227762]: 2026-01-23 10:21:51.224 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:21:51 np0005593234 nova_compute[227762]: 2026-01-23 10:21:51.224 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:21:51 np0005593234 podman[298333]: 2026-01-23 10:21:51.300875727 +0000 UTC m=+0.054260898 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 05:21:51 np0005593234 podman[298333]: 2026-01-23 10:21:51.338346931 +0000 UTC m=+0.091732042 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 05:21:51 np0005593234 podman[298397]: 2026-01-23 10:21:51.5298017 +0000 UTC m=+0.052901215 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, release=1793, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, vendor=Red Hat, Inc., version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=keepalived for Ceph)
Jan 23 05:21:51 np0005593234 podman[298397]: 2026-01-23 10:21:51.541903096 +0000 UTC m=+0.065002611 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, architecture=x86_64, distribution-scope=public, com.redhat.component=keepalived-container, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, version=2.2.4, release=1793)
Jan 23 05:21:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:51.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:52 np0005593234 nova_compute[227762]: 2026-01-23 10:21:52.290 227766 DEBUG nova.network.neutron [req-7e2487c0-e9ad-46cb-9a3e-41638044fef5 req-ef8b6655-0e36-433a-b538-56983d2b9738 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updated VIF entry in instance network info cache for port d4f06800-1f0a-4f50-b00d-b10219301efc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 05:21:52 np0005593234 nova_compute[227762]: 2026-01-23 10:21:52.290 227766 DEBUG nova.network.neutron [req-7e2487c0-e9ad-46cb-9a3e-41638044fef5 req-ef8b6655-0e36-433a-b538-56983d2b9738 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updating instance_info_cache with network_info: [{"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:21:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:52.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:21:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:21:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:21:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:21:52 np0005593234 nova_compute[227762]: 2026-01-23 10:21:52.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:21:52 np0005593234 nova_compute[227762]: 2026-01-23 10:21:52.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:21:52 np0005593234 nova_compute[227762]: 2026-01-23 10:21:52.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 05:21:52 np0005593234 nova_compute[227762]: 2026-01-23 10:21:52.810 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:53 np0005593234 nova_compute[227762]: 2026-01-23 10:21:53.184 227766 DEBUG oslo_concurrency.lockutils [req-7e2487c0-e9ad-46cb-9a3e-41638044fef5 req-ef8b6655-0e36-433a-b538-56983d2b9738 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:21:53 np0005593234 nova_compute[227762]: 2026-01-23 10:21:53.684 227766 DEBUG nova.compute.manager [req-2e4b756c-4c47-418b-8f37-cd3297c53e2a req-cd9e255f-d1fb-4091-825b-151fac83aaf9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-changed-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:21:53 np0005593234 nova_compute[227762]: 2026-01-23 10:21:53.685 227766 DEBUG nova.compute.manager [req-2e4b756c-4c47-418b-8f37-cd3297c53e2a req-cd9e255f-d1fb-4091-825b-151fac83aaf9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Refreshing instance network info cache due to event network-changed-d4f06800-1f0a-4f50-b00d-b10219301efc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:21:53 np0005593234 nova_compute[227762]: 2026-01-23 10:21:53.685 227766 DEBUG oslo_concurrency.lockutils [req-2e4b756c-4c47-418b-8f37-cd3297c53e2a req-cd9e255f-d1fb-4091-825b-151fac83aaf9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:21:53 np0005593234 nova_compute[227762]: 2026-01-23 10:21:53.685 227766 DEBUG oslo_concurrency.lockutils [req-2e4b756c-4c47-418b-8f37-cd3297c53e2a req-cd9e255f-d1fb-4091-825b-151fac83aaf9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:21:53 np0005593234 nova_compute[227762]: 2026-01-23 10:21:53.685 227766 DEBUG nova.network.neutron [req-2e4b756c-4c47-418b-8f37-cd3297c53e2a req-cd9e255f-d1fb-4091-825b-151fac83aaf9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Refreshing network info cache for port d4f06800-1f0a-4f50-b00d-b10219301efc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:21:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:21:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:53.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:21:54 np0005593234 nova_compute[227762]: 2026-01-23 10:21:54.216 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:21:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:21:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 4800.0 total, 600.0 interval
Cumulative writes: 13K writes, 66K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.13 GB, 0.03 MB/s
Cumulative WAL: 13K writes, 13K syncs, 1.00 writes per sync, written: 0.13 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1572 writes, 7896 keys, 1572 commit groups, 1.0 writes per commit group, ingest: 15.81 MB, 0.03 MB/s
Interval WAL: 1572 writes, 1572 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     51.5      1.57              0.24        42    0.037       0      0       0.0       0.0
  L6      1/0   10.14 MB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   4.9    113.9     96.6      4.09              1.29        41    0.100    273K    22K       0.0       0.0
 Sum      1/0   10.14 MB   0.0      0.5     0.1      0.4       0.5      0.1       0.0   5.9     82.3     84.1      5.66              1.53        83    0.068    273K    22K       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   9.7    104.7    101.7      0.67              0.20        12    0.056     53K   3045       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   0.0    113.9     96.6      4.09              1.29        41    0.100    273K    22K       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     51.6      1.57              0.24        41    0.038       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 4800.0 total, 600.0 interval
Flush(GB): cumulative 0.079, interval 0.007
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.46 GB write, 0.10 MB/s write, 0.45 GB read, 0.10 MB/s read, 5.7 seconds
Interval compaction: 0.07 GB write, 0.11 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.7 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55f10fc171f0#2 capacity: 304.00 MB usage: 51.45 MB table_size: 0 occupancy: 18446744073709551615 collections: 9 last_copies: 0 last_secs: 0.000358 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(2958,49.44 MB,16.2625%) FilterBlock(83,774.80 KB,0.248894%) IndexBlock(83,1.25 MB,0.412605%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Jan 23 05:21:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:21:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:54.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:55 np0005593234 nova_compute[227762]: 2026-01-23 10:21:55.414 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:21:55 np0005593234 podman[298612]: 2026-01-23 10:21:55.762358326 +0000 UTC m=+0.056940839 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 23 05:21:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:21:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:55.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:21:55 np0005593234 nova_compute[227762]: 2026-01-23 10:21:55.969 227766 DEBUG nova.compute.manager [req-26040182-ca81-4b5a-a929-565377c4e360 req-7f04424b-7536-4fde-901c-ad1a492ff8e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-changed-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:21:55 np0005593234 nova_compute[227762]: 2026-01-23 10:21:55.969 227766 DEBUG nova.compute.manager [req-26040182-ca81-4b5a-a929-565377c4e360 req-7f04424b-7536-4fde-901c-ad1a492ff8e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Refreshing instance network info cache due to event network-changed-d4f06800-1f0a-4f50-b00d-b10219301efc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:21:55 np0005593234 nova_compute[227762]: 2026-01-23 10:21:55.969 227766 DEBUG oslo_concurrency.lockutils [req-26040182-ca81-4b5a-a929-565377c4e360 req-7f04424b-7536-4fde-901c-ad1a492ff8e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:21:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:21:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:56.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:21:57 np0005593234 nova_compute[227762]: 2026-01-23 10:21:57.437 227766 DEBUG nova.network.neutron [req-2e4b756c-4c47-418b-8f37-cd3297c53e2a req-cd9e255f-d1fb-4091-825b-151fac83aaf9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updated VIF entry in instance network info cache for port d4f06800-1f0a-4f50-b00d-b10219301efc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:21:57 np0005593234 nova_compute[227762]: 2026-01-23 10:21:57.437 227766 DEBUG nova.network.neutron [req-2e4b756c-4c47-418b-8f37-cd3297c53e2a req-cd9e255f-d1fb-4091-825b-151fac83aaf9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updating instance_info_cache with network_info: [{"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:21:57 np0005593234 nova_compute[227762]: 2026-01-23 10:21:57.464 227766 DEBUG oslo_concurrency.lockutils [req-2e4b756c-4c47-418b-8f37-cd3297c53e2a req-cd9e255f-d1fb-4091-825b-151fac83aaf9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:21:57 np0005593234 nova_compute[227762]: 2026-01-23 10:21:57.465 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:21:57 np0005593234 nova_compute[227762]: 2026-01-23 10:21:57.465 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:21:57 np0005593234 nova_compute[227762]: 2026-01-23 10:21:57.465 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a00a5042-ce71-4ecf-ab8f-d9e596d48035 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:21:57 np0005593234 nova_compute[227762]: 2026-01-23 10:21:57.812 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:21:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:57.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:21:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:58.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:21:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:21:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:21:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:59.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:00 np0005593234 nova_compute[227762]: 2026-01-23 10:22:00.416 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:00.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:22:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:22:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:01.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:02.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:02 np0005593234 nova_compute[227762]: 2026-01-23 10:22:02.814 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:03 np0005593234 nova_compute[227762]: 2026-01-23 10:22:03.176 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updating instance_info_cache with network_info: [{"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:03 np0005593234 nova_compute[227762]: 2026-01-23 10:22:03.219 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:03 np0005593234 nova_compute[227762]: 2026-01-23 10:22:03.219 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:22:03 np0005593234 nova_compute[227762]: 2026-01-23 10:22:03.220 227766 DEBUG oslo_concurrency.lockutils [req-26040182-ca81-4b5a-a929-565377c4e360 req-7f04424b-7536-4fde-901c-ad1a492ff8e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:03 np0005593234 nova_compute[227762]: 2026-01-23 10:22:03.220 227766 DEBUG nova.network.neutron [req-26040182-ca81-4b5a-a929-565377c4e360 req-7f04424b-7536-4fde-901c-ad1a492ff8e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Refreshing network info cache for port d4f06800-1f0a-4f50-b00d-b10219301efc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:22:03 np0005593234 nova_compute[227762]: 2026-01-23 10:22:03.221 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:03 np0005593234 nova_compute[227762]: 2026-01-23 10:22:03.221 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:03 np0005593234 nova_compute[227762]: 2026-01-23 10:22:03.221 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:03 np0005593234 nova_compute[227762]: 2026-01-23 10:22:03.221 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:03 np0005593234 nova_compute[227762]: 2026-01-23 10:22:03.221 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:22:03 np0005593234 podman[298687]: 2026-01-23 10:22:03.778349352 +0000 UTC m=+0.078161480 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 05:22:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:03.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:04 np0005593234 nova_compute[227762]: 2026-01-23 10:22:04.215 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:04 np0005593234 nova_compute[227762]: 2026-01-23 10:22:04.409 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:04.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:04 np0005593234 nova_compute[227762]: 2026-01-23 10:22:04.765 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:05 np0005593234 nova_compute[227762]: 2026-01-23 10:22:05.417 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:22:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:05.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:22:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:06.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:07 np0005593234 nova_compute[227762]: 2026-01-23 10:22:07.816 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:07.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:08 np0005593234 nova_compute[227762]: 2026-01-23 10:22:08.069 227766 DEBUG nova.network.neutron [req-26040182-ca81-4b5a-a929-565377c4e360 req-7f04424b-7536-4fde-901c-ad1a492ff8e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updated VIF entry in instance network info cache for port d4f06800-1f0a-4f50-b00d-b10219301efc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:22:08 np0005593234 nova_compute[227762]: 2026-01-23 10:22:08.069 227766 DEBUG nova.network.neutron [req-26040182-ca81-4b5a-a929-565377c4e360 req-7f04424b-7536-4fde-901c-ad1a492ff8e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updating instance_info_cache with network_info: [{"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:08.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:08 np0005593234 nova_compute[227762]: 2026-01-23 10:22:08.966 227766 DEBUG oslo_concurrency.lockutils [req-26040182-ca81-4b5a-a929-565377c4e360 req-7f04424b-7536-4fde-901c-ad1a492ff8e4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:09.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:10 np0005593234 nova_compute[227762]: 2026-01-23 10:22:10.419 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:10.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:11 np0005593234 nova_compute[227762]: 2026-01-23 10:22:11.860 227766 DEBUG nova.compute.manager [req-212ceaaf-6eb3-4608-b76e-be2412e23be9 req-592b4d3e-5b5a-489b-b863-5bbc8adfc5dd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-changed-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:11 np0005593234 nova_compute[227762]: 2026-01-23 10:22:11.860 227766 DEBUG nova.compute.manager [req-212ceaaf-6eb3-4608-b76e-be2412e23be9 req-592b4d3e-5b5a-489b-b863-5bbc8adfc5dd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Refreshing instance network info cache due to event network-changed-d4f06800-1f0a-4f50-b00d-b10219301efc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:22:11 np0005593234 nova_compute[227762]: 2026-01-23 10:22:11.861 227766 DEBUG oslo_concurrency.lockutils [req-212ceaaf-6eb3-4608-b76e-be2412e23be9 req-592b4d3e-5b5a-489b-b863-5bbc8adfc5dd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:22:11 np0005593234 nova_compute[227762]: 2026-01-23 10:22:11.861 227766 DEBUG oslo_concurrency.lockutils [req-212ceaaf-6eb3-4608-b76e-be2412e23be9 req-592b4d3e-5b5a-489b-b863-5bbc8adfc5dd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:11 np0005593234 nova_compute[227762]: 2026-01-23 10:22:11.861 227766 DEBUG nova.network.neutron [req-212ceaaf-6eb3-4608-b76e-be2412e23be9 req-592b4d3e-5b5a-489b-b863-5bbc8adfc5dd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Refreshing network info cache for port d4f06800-1f0a-4f50-b00d-b10219301efc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:22:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:11.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:12.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:12 np0005593234 nova_compute[227762]: 2026-01-23 10:22:12.818 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e313 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:13.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e314 e314: 3 total, 3 up, 3 in
Jan 23 05:22:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:14.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:15 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 23 05:22:15 np0005593234 nova_compute[227762]: 2026-01-23 10:22:15.421 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:15 np0005593234 nova_compute[227762]: 2026-01-23 10:22:15.516 227766 DEBUG oslo_concurrency.lockutils [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:15 np0005593234 nova_compute[227762]: 2026-01-23 10:22:15.516 227766 DEBUG oslo_concurrency.lockutils [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:15 np0005593234 nova_compute[227762]: 2026-01-23 10:22:15.516 227766 INFO nova.compute.manager [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Rebooting instance#033[00m
Jan 23 05:22:15 np0005593234 nova_compute[227762]: 2026-01-23 10:22:15.537 227766 DEBUG oslo_concurrency.lockutils [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:22:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:15.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:16 np0005593234 nova_compute[227762]: 2026-01-23 10:22:16.019 227766 DEBUG nova.network.neutron [req-212ceaaf-6eb3-4608-b76e-be2412e23be9 req-592b4d3e-5b5a-489b-b863-5bbc8adfc5dd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updated VIF entry in instance network info cache for port d4f06800-1f0a-4f50-b00d-b10219301efc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:22:16 np0005593234 nova_compute[227762]: 2026-01-23 10:22:16.019 227766 DEBUG nova.network.neutron [req-212ceaaf-6eb3-4608-b76e-be2412e23be9 req-592b4d3e-5b5a-489b-b863-5bbc8adfc5dd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updating instance_info_cache with network_info: [{"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:16 np0005593234 nova_compute[227762]: 2026-01-23 10:22:16.102 227766 DEBUG oslo_concurrency.lockutils [req-212ceaaf-6eb3-4608-b76e-be2412e23be9 req-592b4d3e-5b5a-489b-b863-5bbc8adfc5dd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:16 np0005593234 nova_compute[227762]: 2026-01-23 10:22:16.102 227766 DEBUG oslo_concurrency.lockutils [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquired lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:16 np0005593234 nova_compute[227762]: 2026-01-23 10:22:16.103 227766 DEBUG nova.network.neutron [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:22:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:16.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:17 np0005593234 nova_compute[227762]: 2026-01-23 10:22:17.819 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:17.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e314 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:18.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:19 np0005593234 nova_compute[227762]: 2026-01-23 10:22:19.044 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:19.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.064 227766 DEBUG nova.network.neutron [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updating instance_info_cache with network_info: [{"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.153 227766 DEBUG oslo_concurrency.lockutils [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Releasing lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.156 227766 DEBUG nova.compute.manager [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:22:20 np0005593234 kernel: tapd4f06800-1f (unregistering): left promiscuous mode
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.423 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:20 np0005593234 NetworkManager[48942]: <info>  [1769163740.4249] device (tapd4f06800-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:22:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:20Z|00637|binding|INFO|Releasing lport d4f06800-1f0a-4f50-b00d-b10219301efc from this chassis (sb_readonly=0)
Jan 23 05:22:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:20Z|00638|binding|INFO|Setting lport d4f06800-1f0a-4f50-b00d-b10219301efc down in Southbound
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.433 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:20Z|00639|binding|INFO|Removing iface tapd4f06800-1f ovn-installed in OVS
Jan 23 05:22:20 np0005593234 systemd[1]: Starting dnf makecache...
Jan 23 05:22:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:20.448 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:4f:d5 10.100.0.13'], port_security=['fa:16:3e:01:4f:d5 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a00a5042-ce71-4ecf-ab8f-d9e596d48035', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b976daabc8124a99814954633f99ed7b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '77e10692-5f18-4d4e-ba14-6f09047b276a f9cbd483-55b8-4d56-bdee-89214b08a0fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.236'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0878063b-8606-438f-ae03-20f399cd80c4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=d4f06800-1f0a-4f50-b00d-b10219301efc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:22:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:20.449 144381 INFO neutron.agent.ovn.metadata.agent [-] Port d4f06800-1f0a-4f50-b00d-b10219301efc in datapath 8b38c3ca-73e5-4583-a277-cd0670deffdb unbound from our chassis#033[00m
Jan 23 05:22:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:20.451 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8b38c3ca-73e5-4583-a277-cd0670deffdb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:22:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:20.452 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[df09fcd8-c38b-4e36-87b5-6360b4d46970]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:20.453 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb namespace which is not needed anymore#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.460 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:20 np0005593234 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000009a.scope: Deactivated successfully.
Jan 23 05:22:20 np0005593234 systemd[1]: machine-qemu\x2d72\x2dinstance\x2d0000009a.scope: Consumed 16.425s CPU time.
Jan 23 05:22:20 np0005593234 systemd-machined[195626]: Machine qemu-72-instance-0000009a terminated.
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.569 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.575 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.586 227766 INFO nova.virt.libvirt.driver [-] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Instance destroyed successfully.#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.586 227766 DEBUG nova.objects.instance [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'resources' on Instance uuid a00a5042-ce71-4ecf-ab8f-d9e596d48035 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:22:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:20.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.614 227766 DEBUG nova.virt.libvirt.vif [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:20:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-79223605',display_name='tempest-TestMinimumBasicScenario-server-79223605',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-79223605',id=154,image_ref='bba8f8ac-6563-4b96-a735-670d31b1818b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8Y9UEL92/+TB+I+GNhaZt1mYMByc7/BrYfEDaKlAAZo7j91A8ceJavobN2fd/HuU5MXKggpmRNE2fbSVJxSSFeNnWSzt9Sqrij7kFCnUGkI6fsAtTHbWMsV0NwSH55dw==',key_name='tempest-TestMinimumBasicScenario-867634297',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b976daabc8124a99814954633f99ed7b',ramdisk_id='',reservation_id='r-o6qhqwf7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bba8f8ac-6563-4b96-a735-670d31b1818b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1465373740',owner_user_name='tempest-TestMinimumBasicScenario-1465373740-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:22:20Z,user_data=None,user_id='c041da0a601a4260b29fc9c65719597f',uuid=a00a5042-ce71-4ecf-ab8f-d9e596d48035,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.615 227766 DEBUG nova.network.os_vif_util [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converting VIF {"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.615 227766 DEBUG nova.network.os_vif_util [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:4f:d5,bridge_name='br-int',has_traffic_filtering=True,id=d4f06800-1f0a-4f50-b00d-b10219301efc,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f06800-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.616 227766 DEBUG os_vif [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:4f:d5,bridge_name='br-int',has_traffic_filtering=True,id=d4f06800-1f0a-4f50-b00d-b10219301efc,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f06800-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.617 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.617 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4f06800-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.619 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.621 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.625 227766 INFO os_vif [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:4f:d5,bridge_name='br-int',has_traffic_filtering=True,id=d4f06800-1f0a-4f50-b00d-b10219301efc,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f06800-1f')#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.635 227766 DEBUG nova.virt.libvirt.driver [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Start _get_guest_xml network_info=[{"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=bba8f8ac-6563-4b96-a735-670d31b1818b,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': 'bba8f8ac-6563-4b96-a735-670d31b1818b'}], 'ephemerals': [], 'block_device_mapping': [{'boot_index': None, 'mount_device': '/dev/vdb', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-7857fc75-1658-465f-a6a1-40f608f6408e', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '7857fc75-1658-465f-a6a1-40f608f6408e', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'a00a5042-ce71-4ecf-ab8f-d9e596d48035', 'attached_at': '', 'detached_at': '', 'volume_id': '7857fc75-1658-465f-a6a1-40f608f6408e', 'serial': '7857fc75-1658-465f-a6a1-40f608f6408e'}, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '0c773f34-5811-4b94-875c-528e61a6d7b2', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.639 227766 WARNING nova.virt.libvirt.driver [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.653 227766 DEBUG nova.virt.libvirt.host [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.654 227766 DEBUG nova.virt.libvirt.host [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:22:20 np0005593234 dnf[298772]: Metadata cache refreshed recently.
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.663 227766 DEBUG nova.virt.libvirt.host [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.664 227766 DEBUG nova.virt.libvirt.host [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:22:20 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[297554]: [NOTICE]   (297558) : haproxy version is 2.8.14-c23fe91
Jan 23 05:22:20 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[297554]: [NOTICE]   (297558) : path to executable is /usr/sbin/haproxy
Jan 23 05:22:20 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[297554]: [WARNING]  (297558) : Exiting Master process...
Jan 23 05:22:20 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[297554]: [WARNING]  (297558) : Exiting Master process...
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.666 227766 DEBUG nova.virt.libvirt.driver [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.666 227766 DEBUG nova.virt.hardware [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=bba8f8ac-6563-4b96-a735-670d31b1818b,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.667 227766 DEBUG nova.virt.hardware [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.667 227766 DEBUG nova.virt.hardware [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.667 227766 DEBUG nova.virt.hardware [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.667 227766 DEBUG nova.virt.hardware [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.668 227766 DEBUG nova.virt.hardware [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.668 227766 DEBUG nova.virt.hardware [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.668 227766 DEBUG nova.virt.hardware [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.668 227766 DEBUG nova.virt.hardware [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.668 227766 DEBUG nova.virt.hardware [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.669 227766 DEBUG nova.virt.hardware [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.669 227766 DEBUG nova.objects.instance [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'vcpu_model' on Instance uuid a00a5042-ce71-4ecf-ab8f-d9e596d48035 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:22:20 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[297554]: [ALERT]    (297558) : Current worker (297575) exited with code 143 (Terminated)
Jan 23 05:22:20 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[297554]: [WARNING]  (297558) : All workers exited. Exiting... (0)
Jan 23 05:22:20 np0005593234 systemd[1]: libpod-59df55eae410b89e6fc4247090b4f882ebc501e5091d515adb73f6b4dba4ffb6.scope: Deactivated successfully.
Jan 23 05:22:20 np0005593234 podman[298798]: 2026-01-23 10:22:20.676389834 +0000 UTC m=+0.135848551 container died 59df55eae410b89e6fc4247090b4f882ebc501e5091d515adb73f6b4dba4ffb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.687 227766 DEBUG oslo_concurrency.processutils [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:20 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-59df55eae410b89e6fc4247090b4f882ebc501e5091d515adb73f6b4dba4ffb6-userdata-shm.mount: Deactivated successfully.
Jan 23 05:22:20 np0005593234 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 23 05:22:20 np0005593234 systemd[1]: Finished dnf makecache.
Jan 23 05:22:20 np0005593234 systemd[1]: var-lib-containers-storage-overlay-8297876adb61a8222b954e849161ea0bc6d013fdaa6ff0f8793e50a99c82c48c-merged.mount: Deactivated successfully.
Jan 23 05:22:20 np0005593234 podman[298798]: 2026-01-23 10:22:20.71968785 +0000 UTC m=+0.179146557 container cleanup 59df55eae410b89e6fc4247090b4f882ebc501e5091d515adb73f6b4dba4ffb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:22:20 np0005593234 systemd[1]: libpod-conmon-59df55eae410b89e6fc4247090b4f882ebc501e5091d515adb73f6b4dba4ffb6.scope: Deactivated successfully.
Jan 23 05:22:20 np0005593234 podman[298838]: 2026-01-23 10:22:20.777641991 +0000 UTC m=+0.037795125 container remove 59df55eae410b89e6fc4247090b4f882ebc501e5091d515adb73f6b4dba4ffb6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:22:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:20.783 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[03916593-2036-4e80-b883-33fb5d64d275]: (4, ('Fri Jan 23 10:22:20 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb (59df55eae410b89e6fc4247090b4f882ebc501e5091d515adb73f6b4dba4ffb6)\n59df55eae410b89e6fc4247090b4f882ebc501e5091d515adb73f6b4dba4ffb6\nFri Jan 23 10:22:20 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb (59df55eae410b89e6fc4247090b4f882ebc501e5091d515adb73f6b4dba4ffb6)\n59df55eae410b89e6fc4247090b4f882ebc501e5091d515adb73f6b4dba4ffb6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:20.785 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a5dc008d-953e-4453-acdf-298b30180511]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:20.786 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b38c3ca-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:20 np0005593234 kernel: tap8b38c3ca-70: left promiscuous mode
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.787 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:20.793 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e2a439b4-1c95-4348-833e-ade9cf2386b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:20 np0005593234 nova_compute[227762]: 2026-01-23 10:22:20.801 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:20.813 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[72537558-b22a-406d-91e1-0af78e449405]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:20.814 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a4492525-5068-43f5-877f-fbd47c44ad05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:20.832 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7097504e-1e89-4a22-b89d-bf92dee11c05]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 755393, 'reachable_time': 18986, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 298873, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:20.834 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:22:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:20.835 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[ace16f0c-7957-43ce-ae23-efb75a07d3d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:20 np0005593234 systemd[1]: run-netns-ovnmeta\x2d8b38c3ca\x2d73e5\x2d4583\x2da277\x2dcd0670deffdb.mount: Deactivated successfully.
Jan 23 05:22:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:22:21 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2730497085' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.118 227766 DEBUG oslo_concurrency.processutils [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.154 227766 DEBUG oslo_concurrency.processutils [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:22:21 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2884912591' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.581 227766 DEBUG oslo_concurrency.processutils [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.648 227766 DEBUG nova.virt.libvirt.vif [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:20:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-79223605',display_name='tempest-TestMinimumBasicScenario-server-79223605',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-79223605',id=154,image_ref='bba8f8ac-6563-4b96-a735-670d31b1818b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8Y9UEL92/+TB+I+GNhaZt1mYMByc7/BrYfEDaKlAAZo7j91A8ceJavobN2fd/HuU5MXKggpmRNE2fbSVJxSSFeNnWSzt9Sqrij7kFCnUGkI6fsAtTHbWMsV0NwSH55dw==',key_name='tempest-TestMinimumBasicScenario-867634297',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b976daabc8124a99814954633f99ed7b',ramdisk_id='',reservation_id='r-o6qhqwf7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bba8f8ac-6563-4b96-a735-670d31b1818b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1465373740',owner_user_name='tempest-TestMinimumBasicScenario-1465373740-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:22:20Z,user_data=None,user_id='c041da0a601a4260b29fc9c65719597f',uuid=a00a5042-ce71-4ecf-ab8f-d9e596d48035,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.649 227766 DEBUG nova.network.os_vif_util [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converting VIF {"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.649 227766 DEBUG nova.network.os_vif_util [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:4f:d5,bridge_name='br-int',has_traffic_filtering=True,id=d4f06800-1f0a-4f50-b00d-b10219301efc,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f06800-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.651 227766 DEBUG nova.objects.instance [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'pci_devices' on Instance uuid a00a5042-ce71-4ecf-ab8f-d9e596d48035 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.675 227766 DEBUG nova.virt.libvirt.driver [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  <uuid>a00a5042-ce71-4ecf-ab8f-d9e596d48035</uuid>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  <name>instance-0000009a</name>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestMinimumBasicScenario-server-79223605</nova:name>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:22:20</nova:creationTime>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <nova:user uuid="c041da0a601a4260b29fc9c65719597f">tempest-TestMinimumBasicScenario-1465373740-project-member</nova:user>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <nova:project uuid="b976daabc8124a99814954633f99ed7b">tempest-TestMinimumBasicScenario-1465373740</nova:project>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="bba8f8ac-6563-4b96-a735-670d31b1818b"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <nova:port uuid="d4f06800-1f0a-4f50-b00d-b10219301efc">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <entry name="serial">a00a5042-ce71-4ecf-ab8f-d9e596d48035</entry>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <entry name="uuid">a00a5042-ce71-4ecf-ab8f-d9e596d48035</entry>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/a00a5042-ce71-4ecf-ab8f-d9e596d48035_disk.config">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="volumes/volume-7857fc75-1658-465f-a6a1-40f608f6408e">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <target dev="vdb" bus="virtio"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <serial>7857fc75-1658-465f-a6a1-40f608f6408e</serial>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:01:4f:d5"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <target dev="tapd4f06800-1f"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/a00a5042-ce71-4ecf-ab8f-d9e596d48035/console.log" append="off"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <input type="keyboard" bus="usb"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:22:21 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:22:21 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:22:21 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:22:21 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.676 227766 DEBUG nova.virt.libvirt.driver [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] skipping disk for instance-0000009a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.676 227766 DEBUG nova.virt.libvirt.driver [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] skipping disk for instance-0000009a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.676 227766 DEBUG nova.virt.libvirt.driver [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] skipping disk for instance-0000009a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.677 227766 DEBUG nova.virt.libvirt.vif [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:20:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-79223605',display_name='tempest-TestMinimumBasicScenario-server-79223605',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-79223605',id=154,image_ref='bba8f8ac-6563-4b96-a735-670d31b1818b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8Y9UEL92/+TB+I+GNhaZt1mYMByc7/BrYfEDaKlAAZo7j91A8ceJavobN2fd/HuU5MXKggpmRNE2fbSVJxSSFeNnWSzt9Sqrij7kFCnUGkI6fsAtTHbWMsV0NwSH55dw==',key_name='tempest-TestMinimumBasicScenario-867634297',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=1,progress=0,project_id='b976daabc8124a99814954633f99ed7b',ramdisk_id='',reservation_id='r-o6qhqwf7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bba8f8ac-6563-4b96-a735-670d31b1818b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1465373740',owner_user_name='tempest-TestMinimumBasicScenario-1465373740-project-member'},tags=<?>,task_state='reboot_started_hard',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:22:20Z,user_data=None,user_id='c041da0a601a4260b29fc9c65719597f',uuid=a00a5042-ce71-4ecf-ab8f-d9e596d48035,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.677 227766 DEBUG nova.network.os_vif_util [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converting VIF {"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.677 227766 DEBUG nova.network.os_vif_util [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:4f:d5,bridge_name='br-int',has_traffic_filtering=True,id=d4f06800-1f0a-4f50-b00d-b10219301efc,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f06800-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.678 227766 DEBUG os_vif [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:4f:d5,bridge_name='br-int',has_traffic_filtering=True,id=d4f06800-1f0a-4f50-b00d-b10219301efc,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f06800-1f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.679 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.679 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.679 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.682 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.683 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd4f06800-1f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.683 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd4f06800-1f, col_values=(('external_ids', {'iface-id': 'd4f06800-1f0a-4f50-b00d-b10219301efc', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:01:4f:d5', 'vm-uuid': 'a00a5042-ce71-4ecf-ab8f-d9e596d48035'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.685 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:21 np0005593234 NetworkManager[48942]: <info>  [1769163741.6857] manager: (tapd4f06800-1f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/308)
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.687 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.691 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.692 227766 INFO os_vif [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:4f:d5,bridge_name='br-int',has_traffic_filtering=True,id=d4f06800-1f0a-4f50-b00d-b10219301efc,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f06800-1f')#033[00m
Jan 23 05:22:21 np0005593234 kernel: tapd4f06800-1f: entered promiscuous mode
Jan 23 05:22:21 np0005593234 NetworkManager[48942]: <info>  [1769163741.7644] manager: (tapd4f06800-1f): new Tun device (/org/freedesktop/NetworkManager/Devices/309)
Jan 23 05:22:21 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:21Z|00640|binding|INFO|Claiming lport d4f06800-1f0a-4f50-b00d-b10219301efc for this chassis.
Jan 23 05:22:21 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:21Z|00641|binding|INFO|d4f06800-1f0a-4f50-b00d-b10219301efc: Claiming fa:16:3e:01:4f:d5 10.100.0.13
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.763 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:21 np0005593234 systemd-udevd[298777]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.771 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:4f:d5 10.100.0.13'], port_security=['fa:16:3e:01:4f:d5 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a00a5042-ce71-4ecf-ab8f-d9e596d48035', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b976daabc8124a99814954633f99ed7b', 'neutron:revision_number': '5', 'neutron:security_group_ids': '77e10692-5f18-4d4e-ba14-6f09047b276a f9cbd483-55b8-4d56-bdee-89214b08a0fe', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.236'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0878063b-8606-438f-ae03-20f399cd80c4, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=d4f06800-1f0a-4f50-b00d-b10219301efc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.772 144381 INFO neutron.agent.ovn.metadata.agent [-] Port d4f06800-1f0a-4f50-b00d-b10219301efc in datapath 8b38c3ca-73e5-4583-a277-cd0670deffdb bound to our chassis#033[00m
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.774 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8b38c3ca-73e5-4583-a277-cd0670deffdb#033[00m
Jan 23 05:22:21 np0005593234 NetworkManager[48942]: <info>  [1769163741.7837] device (tapd4f06800-1f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:22:21 np0005593234 NetworkManager[48942]: <info>  [1769163741.7847] device (tapd4f06800-1f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:22:21 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:21Z|00642|binding|INFO|Setting lport d4f06800-1f0a-4f50-b00d-b10219301efc ovn-installed in OVS
Jan 23 05:22:21 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:21Z|00643|binding|INFO|Setting lport d4f06800-1f0a-4f50-b00d-b10219301efc up in Southbound
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.788 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.789 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[68e086bc-d382-4e8e-8961-412d3fc72e41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.790 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8b38c3ca-71 in ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.792 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8b38c3ca-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.792 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2f85ec5d-e868-4bfc-b8f5-a700ed37feba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:21 np0005593234 nova_compute[227762]: 2026-01-23 10:22:21.792 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.792 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[55458f26-3d27-4b65-a721-49d39b281db4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:21 np0005593234 systemd-machined[195626]: New machine qemu-73-instance-0000009a.
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.809 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[93b516b3-160e-4d67-bc82-fc34cf7c02a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:21 np0005593234 systemd[1]: Started Virtual Machine qemu-73-instance-0000009a.
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.830 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8c3ddfbe-689b-434a-90e1-433b637b4687]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.878 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d1595cfa-64fb-4832-8542-13c0ca6f9a9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.883 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ce264578-e422-4d3a-83cd-91e7d67c5d31]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:21 np0005593234 NetworkManager[48942]: <info>  [1769163741.8849] manager: (tap8b38c3ca-70): new Veth device (/org/freedesktop/NetworkManager/Devices/310)
Jan 23 05:22:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:21.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.914 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe4f040-ce2d-4a27-9660-fbb3fccd92fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.916 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d080d1d3-613f-4e50-a95f-ee7fb8206774]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:21 np0005593234 NetworkManager[48942]: <info>  [1769163741.9380] device (tap8b38c3ca-70): carrier: link connected
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.944 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7f32cc5e-89a0-4f8b-aee5-fd219afbd23c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.960 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f167599c-d9ef-4293-8cf9-313220866006]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b38c3ca-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:fa:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 763689, 'reachable_time': 35518, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 298961, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.976 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[892bdfab-b0b3-4be3-8271-25a2f5ee32d7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8e:fa5b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 763689, 'tstamp': 763689}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 298962, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:21.990 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b560cd4b-f54f-45a0-922d-12aa81c3d292]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8b38c3ca-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8e:fa:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 200], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 763689, 'reachable_time': 35518, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 298963, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:22.027 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[251ce580-0f73-4cd5-b654-e09777aa199f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:22.078 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[be042ed1-c67b-4aeb-9fa1-116e72944cd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:22.080 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b38c3ca-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:22.080 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:22.080 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8b38c3ca-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:22 np0005593234 NetworkManager[48942]: <info>  [1769163742.0830] manager: (tap8b38c3ca-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/311)
Jan 23 05:22:22 np0005593234 kernel: tap8b38c3ca-70: entered promiscuous mode
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.082 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:22.086 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8b38c3ca-70, col_values=(('external_ids', {'iface-id': '120d9d64-6853-4b50-a095-bddadd015ba1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:22Z|00644|binding|INFO|Releasing lport 120d9d64-6853-4b50-a095-bddadd015ba1 from this chassis (sb_readonly=0)
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.088 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:22.088 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8b38c3ca-73e5-4583-a277-cd0670deffdb.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8b38c3ca-73e5-4583-a277-cd0670deffdb.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:22.089 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1fdbf96f-f195-40ad-b14b-38787d6384f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:22.089 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-8b38c3ca-73e5-4583-a277-cd0670deffdb
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/8b38c3ca-73e5-4583-a277-cd0670deffdb.pid.haproxy
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 8b38c3ca-73e5-4583-a277-cd0670deffdb
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:22:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:22.090 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'env', 'PROCESS_TAG=haproxy-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8b38c3ca-73e5-4583-a277-cd0670deffdb.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.099 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e315 e315: 3 total, 3 up, 3 in
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.261 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for a00a5042-ce71-4ecf-ab8f-d9e596d48035 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.262 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163742.261318, a00a5042-ce71-4ecf-ab8f-d9e596d48035 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.262 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.264 227766 DEBUG nova.compute.manager [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.268 227766 INFO nova.virt.libvirt.driver [-] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Instance rebooted successfully.#033[00m
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.268 227766 DEBUG nova.compute.manager [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.311 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.314 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started_hard, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.382 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] During sync_power_state the instance has a pending task (reboot_started_hard). Skip.#033[00m
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.393 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163742.263935, a00a5042-ce71-4ecf-ab8f-d9e596d48035 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.393 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] VM Started (Lifecycle Event)#033[00m
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.400 227766 DEBUG oslo_concurrency.lockutils [None req-3aec43d6-e83d-40fb-96c8-7196ce1e7f07 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 6.884s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.437 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.440 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:22:22 np0005593234 podman[299057]: 2026-01-23 10:22:22.444783754 +0000 UTC m=+0.045769434 container create 1e80373eb38ef570f86ab277161cae7ff4155b8d8a5d3fc652e2b0947a9470f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:22:22 np0005593234 systemd[1]: Started libpod-conmon-1e80373eb38ef570f86ab277161cae7ff4155b8d8a5d3fc652e2b0947a9470f4.scope.
Jan 23 05:22:22 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:22:22 np0005593234 podman[299057]: 2026-01-23 10:22:22.420379915 +0000 UTC m=+0.021365615 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:22:22 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4f509d27450a5749b96a9920c873f79f4dba1895762471a6454651729a6ad10e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:22:22 np0005593234 podman[299057]: 2026-01-23 10:22:22.528419172 +0000 UTC m=+0.129404882 container init 1e80373eb38ef570f86ab277161cae7ff4155b8d8a5d3fc652e2b0947a9470f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 05:22:22 np0005593234 podman[299057]: 2026-01-23 10:22:22.533786199 +0000 UTC m=+0.134771879 container start 1e80373eb38ef570f86ab277161cae7ff4155b8d8a5d3fc652e2b0947a9470f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 05:22:22 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[299072]: [NOTICE]   (299076) : New worker (299078) forked
Jan 23 05:22:22 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[299072]: [NOTICE]   (299076) : Loading success.
Jan 23 05:22:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:22.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:22 np0005593234 nova_compute[227762]: 2026-01-23 10:22:22.821 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:22:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:23.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:22:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:22:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:24.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:22:25 np0005593234 nova_compute[227762]: 2026-01-23 10:22:25.179 227766 DEBUG nova.compute.manager [req-898d5477-5491-4b4f-9af0-1fed62609d4b req-0e5119ac-6182-4399-aeaa-e8a3a9982864 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-vif-unplugged-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:25 np0005593234 nova_compute[227762]: 2026-01-23 10:22:25.179 227766 DEBUG oslo_concurrency.lockutils [req-898d5477-5491-4b4f-9af0-1fed62609d4b req-0e5119ac-6182-4399-aeaa-e8a3a9982864 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:25 np0005593234 nova_compute[227762]: 2026-01-23 10:22:25.179 227766 DEBUG oslo_concurrency.lockutils [req-898d5477-5491-4b4f-9af0-1fed62609d4b req-0e5119ac-6182-4399-aeaa-e8a3a9982864 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:25 np0005593234 nova_compute[227762]: 2026-01-23 10:22:25.180 227766 DEBUG oslo_concurrency.lockutils [req-898d5477-5491-4b4f-9af0-1fed62609d4b req-0e5119ac-6182-4399-aeaa-e8a3a9982864 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:25 np0005593234 nova_compute[227762]: 2026-01-23 10:22:25.180 227766 DEBUG nova.compute.manager [req-898d5477-5491-4b4f-9af0-1fed62609d4b req-0e5119ac-6182-4399-aeaa-e8a3a9982864 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] No waiting events found dispatching network-vif-unplugged-d4f06800-1f0a-4f50-b00d-b10219301efc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:22:25 np0005593234 nova_compute[227762]: 2026-01-23 10:22:25.180 227766 WARNING nova.compute.manager [req-898d5477-5491-4b4f-9af0-1fed62609d4b req-0e5119ac-6182-4399-aeaa-e8a3a9982864 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received unexpected event network-vif-unplugged-d4f06800-1f0a-4f50-b00d-b10219301efc for instance with vm_state active and task_state None.#033[00m
Jan 23 05:22:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:25.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:26.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:26 np0005593234 nova_compute[227762]: 2026-01-23 10:22:26.685 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:26 np0005593234 podman[299089]: 2026-01-23 10:22:26.754474716 +0000 UTC m=+0.053813914 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.489 227766 DEBUG nova.compute.manager [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.489 227766 DEBUG oslo_concurrency.lockutils [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.489 227766 DEBUG oslo_concurrency.lockutils [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.489 227766 DEBUG oslo_concurrency.lockutils [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.489 227766 DEBUG nova.compute.manager [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] No waiting events found dispatching network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.490 227766 WARNING nova.compute.manager [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received unexpected event network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc for instance with vm_state active and task_state None.#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.490 227766 DEBUG nova.compute.manager [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.490 227766 DEBUG oslo_concurrency.lockutils [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.490 227766 DEBUG oslo_concurrency.lockutils [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.490 227766 DEBUG oslo_concurrency.lockutils [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.490 227766 DEBUG nova.compute.manager [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] No waiting events found dispatching network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.490 227766 WARNING nova.compute.manager [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received unexpected event network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc for instance with vm_state active and task_state None.#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.491 227766 DEBUG nova.compute.manager [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.491 227766 DEBUG oslo_concurrency.lockutils [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.491 227766 DEBUG oslo_concurrency.lockutils [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.491 227766 DEBUG oslo_concurrency.lockutils [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.491 227766 DEBUG nova.compute.manager [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] No waiting events found dispatching network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.491 227766 WARNING nova.compute.manager [req-af5f54d9-def7-48d3-aa21-23e838b3c8b1 req-014dba7a-c2d8-4624-82c5-e955a6408aba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received unexpected event network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc for instance with vm_state active and task_state None.#033[00m
Jan 23 05:22:27 np0005593234 nova_compute[227762]: 2026-01-23 10:22:27.824 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:27.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:28.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:22:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 4800.5 total, 600.0 interval
Cumulative writes: 56K writes, 226K keys, 56K commit groups, 1.0 writes per commit group, ingest: 0.23 GB, 0.05 MB/s
Cumulative WAL: 56K writes, 20K syncs, 2.73 writes per sync, written: 0.23 GB, 0.05 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 8055 writes, 30K keys, 8055 commit groups, 1.0 writes per commit group, ingest: 33.67 MB, 0.06 MB/s
Interval WAL: 8055 writes, 3248 syncs, 2.48 writes per sync, written: 0.03 GB, 0.06 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:22:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:29.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:30.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:31 np0005593234 nova_compute[227762]: 2026-01-23 10:22:31.689 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:31.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:32.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:32 np0005593234 nova_compute[227762]: 2026-01-23 10:22:32.826 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:33 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:33Z|00645|binding|INFO|Releasing lport 120d9d64-6853-4b50-a095-bddadd015ba1 from this chassis (sb_readonly=0)
Jan 23 05:22:33 np0005593234 nova_compute[227762]: 2026-01-23 10:22:33.472 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:22:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:33.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:22:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:34.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:34 np0005593234 podman[299162]: 2026-01-23 10:22:34.792494607 +0000 UTC m=+0.080646376 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:22:35 np0005593234 nova_compute[227762]: 2026-01-23 10:22:35.591 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "8010f6fe-77ef-48ec-952f-a3a65186cd59" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:35 np0005593234 nova_compute[227762]: 2026-01-23 10:22:35.592 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:35 np0005593234 nova_compute[227762]: 2026-01-23 10:22:35.635 227766 DEBUG nova.compute.manager [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:22:35 np0005593234 nova_compute[227762]: 2026-01-23 10:22:35.740 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:35 np0005593234 nova_compute[227762]: 2026-01-23 10:22:35.741 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:35 np0005593234 nova_compute[227762]: 2026-01-23 10:22:35.752 227766 DEBUG nova.virt.hardware [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:22:35 np0005593234 nova_compute[227762]: 2026-01-23 10:22:35.752 227766 INFO nova.compute.claims [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:22:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:35.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:36 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:36Z|00071|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:01:4f:d5 10.100.0.13
Jan 23 05:22:36 np0005593234 nova_compute[227762]: 2026-01-23 10:22:36.013 227766 DEBUG oslo_concurrency.processutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:22:36 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1690385329' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:22:36 np0005593234 nova_compute[227762]: 2026-01-23 10:22:36.470 227766 DEBUG oslo_concurrency.processutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:36 np0005593234 nova_compute[227762]: 2026-01-23 10:22:36.478 227766 DEBUG nova.compute.provider_tree [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:22:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:36.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:36 np0005593234 nova_compute[227762]: 2026-01-23 10:22:36.691 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:36 np0005593234 nova_compute[227762]: 2026-01-23 10:22:36.767 227766 DEBUG nova.scheduler.client.report [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:22:36 np0005593234 nova_compute[227762]: 2026-01-23 10:22:36.880 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:36 np0005593234 nova_compute[227762]: 2026-01-23 10:22:36.881 227766 DEBUG nova.compute.manager [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:22:36 np0005593234 nova_compute[227762]: 2026-01-23 10:22:36.973 227766 DEBUG nova.compute.manager [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:22:36 np0005593234 nova_compute[227762]: 2026-01-23 10:22:36.974 227766 DEBUG nova.network.neutron [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:22:36 np0005593234 nova_compute[227762]: 2026-01-23 10:22:36.998 227766 INFO nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.022 227766 DEBUG nova.compute.manager [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.233 227766 DEBUG nova.compute.manager [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.235 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.235 227766 INFO nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Creating image(s)#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.266 227766 DEBUG nova.storage.rbd_utils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 8010f6fe-77ef-48ec-952f-a3a65186cd59_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.299 227766 DEBUG nova.storage.rbd_utils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 8010f6fe-77ef-48ec-952f-a3a65186cd59_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.324 227766 DEBUG nova.storage.rbd_utils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 8010f6fe-77ef-48ec-952f-a3a65186cd59_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.328 227766 DEBUG oslo_concurrency.processutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.402 227766 DEBUG oslo_concurrency.processutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.403 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.404 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.405 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.435 227766 DEBUG nova.storage.rbd_utils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 8010f6fe-77ef-48ec-952f-a3a65186cd59_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.438 227766 DEBUG oslo_concurrency.processutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 8010f6fe-77ef-48ec-952f-a3a65186cd59_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.613 227766 DEBUG nova.policy [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60291ce86b6946629a2e48f6680312cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.747 227766 DEBUG oslo_concurrency.processutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 8010f6fe-77ef-48ec-952f-a3a65186cd59_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.309s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.809 227766 DEBUG nova.storage.rbd_utils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] resizing rbd image 8010f6fe-77ef-48ec-952f-a3a65186cd59_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.837 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.899 227766 DEBUG nova.objects.instance [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 8010f6fe-77ef-48ec-952f-a3a65186cd59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:22:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:37.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.923 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.924 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Ensure instance console log exists: /var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.924 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.925 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:37 np0005593234 nova_compute[227762]: 2026-01-23 10:22:37.925 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:38.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:39.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:40.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:40 np0005593234 nova_compute[227762]: 2026-01-23 10:22:40.742 227766 DEBUG nova.network.neutron [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Successfully created port: 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:22:41 np0005593234 nova_compute[227762]: 2026-01-23 10:22:41.693 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:41.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:42.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:42 np0005593234 nova_compute[227762]: 2026-01-23 10:22:42.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:42 np0005593234 nova_compute[227762]: 2026-01-23 10:22:42.828 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:42 np0005593234 nova_compute[227762]: 2026-01-23 10:22:42.841 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:42 np0005593234 nova_compute[227762]: 2026-01-23 10:22:42.841 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:42 np0005593234 nova_compute[227762]: 2026-01-23 10:22:42.842 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:42 np0005593234 nova_compute[227762]: 2026-01-23 10:22:42.842 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:22:42 np0005593234 nova_compute[227762]: 2026-01-23 10:22:42.842 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:42.857 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:42.859 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:42.859 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.022 227766 DEBUG nova.network.neutron [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Successfully updated port: 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.045 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.045 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquired lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.045 227766 DEBUG nova.network.neutron [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.210 227766 DEBUG nova.compute.manager [req-44b9b9c2-c39e-46cc-acb5-7176ff050a6d req-bd7a6642-bf98-4a1a-9f68-2a9cfbe5d940 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-changed-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.211 227766 DEBUG nova.compute.manager [req-44b9b9c2-c39e-46cc-acb5-7176ff050a6d req-bd7a6642-bf98-4a1a-9f68-2a9cfbe5d940 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Refreshing instance network info cache due to event network-changed-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.211 227766 DEBUG oslo_concurrency.lockutils [req-44b9b9c2-c39e-46cc-acb5-7176ff050a6d req-bd7a6642-bf98-4a1a-9f68-2a9cfbe5d940 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:22:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:22:43 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/619399140' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.275 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.346 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000009a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.347 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000009a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.347 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000009a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.508 227766 DEBUG nova.network.neutron [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.520 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.521 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4128MB free_disk=20.892227172851562GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.522 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.522 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.620 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance a00a5042-ce71-4ecf-ab8f-d9e596d48035 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.621 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 8010f6fe-77ef-48ec-952f-a3a65186cd59 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.621 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.621 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:22:43 np0005593234 nova_compute[227762]: 2026-01-23 10:22:43.685 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:43.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:22:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/551148813' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.101 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.106 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.126 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.156 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.157 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.635s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:44.502 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=58, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=57) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:22:44 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:44.503 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.537 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:44.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.768 227766 DEBUG nova.network.neutron [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updating instance_info_cache with network_info: [{"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.790 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Releasing lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.790 227766 DEBUG nova.compute.manager [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Instance network_info: |[{"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.791 227766 DEBUG oslo_concurrency.lockutils [req-44b9b9c2-c39e-46cc-acb5-7176ff050a6d req-bd7a6642-bf98-4a1a-9f68-2a9cfbe5d940 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.791 227766 DEBUG nova.network.neutron [req-44b9b9c2-c39e-46cc-acb5-7176ff050a6d req-bd7a6642-bf98-4a1a-9f68-2a9cfbe5d940 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Refreshing network info cache for port 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.794 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Start _get_guest_xml network_info=[{"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.797 227766 WARNING nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.801 227766 DEBUG nova.virt.libvirt.host [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.802 227766 DEBUG nova.virt.libvirt.host [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.804 227766 DEBUG nova.virt.libvirt.host [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.805 227766 DEBUG nova.virt.libvirt.host [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.806 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.806 227766 DEBUG nova.virt.hardware [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.807 227766 DEBUG nova.virt.hardware [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.807 227766 DEBUG nova.virt.hardware [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.807 227766 DEBUG nova.virt.hardware [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.807 227766 DEBUG nova.virt.hardware [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.807 227766 DEBUG nova.virt.hardware [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.808 227766 DEBUG nova.virt.hardware [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.808 227766 DEBUG nova.virt.hardware [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.808 227766 DEBUG nova.virt.hardware [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.809 227766 DEBUG nova.virt.hardware [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.809 227766 DEBUG nova.virt.hardware [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:22:44 np0005593234 nova_compute[227762]: 2026-01-23 10:22:44.812 227766 DEBUG oslo_concurrency.processutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:22:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3506404361' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.244 227766 DEBUG oslo_concurrency.processutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.282 227766 DEBUG nova.storage.rbd_utils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 8010f6fe-77ef-48ec-952f-a3a65186cd59_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.287 227766 DEBUG oslo_concurrency.processutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:22:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2314982328' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.749 227766 DEBUG oslo_concurrency.processutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.753 227766 DEBUG nova.virt.libvirt.vif [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:22:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1311975023',display_name='tempest-TestNetworkBasicOps-server-1311975023',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1311975023',id=158,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL/weReemtkH37O5IRYNNKB/TT2YWx12mLIlrues6ra7aWugRfActXRaloGeDbBIe3P4JQOsAt7qcWUIg9CugCzIu3x3QAkzyj6XRgUoI3pt8WnzsopEYd4m2Fn+OmRPbg==',key_name='tempest-TestNetworkBasicOps-1841453028',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-3yzv077h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:22:37Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=8010f6fe-77ef-48ec-952f-a3a65186cd59,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.753 227766 DEBUG nova.network.os_vif_util [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.755 227766 DEBUG nova.network.os_vif_util [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:52:70,bridge_name='br-int',has_traffic_filtering=True,id=1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9,network=Network(16290d86-0a8d-403e-83f2-0ae47fb80e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fea5a6d-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.757 227766 DEBUG nova.objects.instance [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8010f6fe-77ef-48ec-952f-a3a65186cd59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.781 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  <uuid>8010f6fe-77ef-48ec-952f-a3a65186cd59</uuid>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  <name>instance-0000009e</name>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestNetworkBasicOps-server-1311975023</nova:name>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:22:44</nova:creationTime>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <nova:port uuid="1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <entry name="serial">8010f6fe-77ef-48ec-952f-a3a65186cd59</entry>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <entry name="uuid">8010f6fe-77ef-48ec-952f-a3a65186cd59</entry>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/8010f6fe-77ef-48ec-952f-a3a65186cd59_disk">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/8010f6fe-77ef-48ec-952f-a3a65186cd59_disk.config">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:8e:52:70"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <target dev="tap1fea5a6d-70"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/console.log" append="off"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:22:45 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:22:45 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:22:45 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:22:45 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.783 227766 DEBUG nova.compute.manager [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Preparing to wait for external event network-vif-plugged-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.783 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.783 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.784 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.784 227766 DEBUG nova.virt.libvirt.vif [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:22:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1311975023',display_name='tempest-TestNetworkBasicOps-server-1311975023',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1311975023',id=158,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL/weReemtkH37O5IRYNNKB/TT2YWx12mLIlrues6ra7aWugRfActXRaloGeDbBIe3P4JQOsAt7qcWUIg9CugCzIu3x3QAkzyj6XRgUoI3pt8WnzsopEYd4m2Fn+OmRPbg==',key_name='tempest-TestNetworkBasicOps-1841453028',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-3yzv077h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:22:37Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=8010f6fe-77ef-48ec-952f-a3a65186cd59,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.784 227766 DEBUG nova.network.os_vif_util [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.785 227766 DEBUG nova.network.os_vif_util [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8e:52:70,bridge_name='br-int',has_traffic_filtering=True,id=1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9,network=Network(16290d86-0a8d-403e-83f2-0ae47fb80e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fea5a6d-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.785 227766 DEBUG os_vif [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:52:70,bridge_name='br-int',has_traffic_filtering=True,id=1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9,network=Network(16290d86-0a8d-403e-83f2-0ae47fb80e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fea5a6d-70') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.789 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.789 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.790 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.793 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.793 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1fea5a6d-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.794 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1fea5a6d-70, col_values=(('external_ids', {'iface-id': '1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8e:52:70', 'vm-uuid': '8010f6fe-77ef-48ec-952f-a3a65186cd59'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.839 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:45 np0005593234 NetworkManager[48942]: <info>  [1769163765.8400] manager: (tap1fea5a6d-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/312)
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.842 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.846 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.848 227766 INFO os_vif [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8e:52:70,bridge_name='br-int',has_traffic_filtering=True,id=1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9,network=Network(16290d86-0a8d-403e-83f2-0ae47fb80e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fea5a6d-70')#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.905 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.906 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.906 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No VIF found with MAC fa:16:3e:8e:52:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.907 227766 INFO nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Using config drive#033[00m
Jan 23 05:22:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:45.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:45 np0005593234 nova_compute[227762]: 2026-01-23 10:22:45.929 227766 DEBUG nova.storage.rbd_utils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 8010f6fe-77ef-48ec-952f-a3a65186cd59_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:22:46 np0005593234 nova_compute[227762]: 2026-01-23 10:22:46.443 227766 INFO nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Creating config drive at /var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/disk.config#033[00m
Jan 23 05:22:46 np0005593234 nova_compute[227762]: 2026-01-23 10:22:46.448 227766 DEBUG oslo_concurrency.processutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr078_0o3 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:46 np0005593234 nova_compute[227762]: 2026-01-23 10:22:46.476 227766 DEBUG nova.network.neutron [req-44b9b9c2-c39e-46cc-acb5-7176ff050a6d req-bd7a6642-bf98-4a1a-9f68-2a9cfbe5d940 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updated VIF entry in instance network info cache for port 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:22:46 np0005593234 nova_compute[227762]: 2026-01-23 10:22:46.477 227766 DEBUG nova.network.neutron [req-44b9b9c2-c39e-46cc-acb5-7176ff050a6d req-bd7a6642-bf98-4a1a-9f68-2a9cfbe5d940 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updating instance_info_cache with network_info: [{"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:46 np0005593234 nova_compute[227762]: 2026-01-23 10:22:46.493 227766 DEBUG oslo_concurrency.lockutils [req-44b9b9c2-c39e-46cc-acb5-7176ff050a6d req-bd7a6642-bf98-4a1a-9f68-2a9cfbe5d940 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:46 np0005593234 nova_compute[227762]: 2026-01-23 10:22:46.583 227766 DEBUG oslo_concurrency.processutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr078_0o3" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:46 np0005593234 nova_compute[227762]: 2026-01-23 10:22:46.614 227766 DEBUG nova.storage.rbd_utils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 8010f6fe-77ef-48ec-952f-a3a65186cd59_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:22:46 np0005593234 nova_compute[227762]: 2026-01-23 10:22:46.619 227766 DEBUG oslo_concurrency.processutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/disk.config 8010f6fe-77ef-48ec-952f-a3a65186cd59_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:22:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:46.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:46 np0005593234 nova_compute[227762]: 2026-01-23 10:22:46.777 227766 DEBUG oslo_concurrency.processutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/disk.config 8010f6fe-77ef-48ec-952f-a3a65186cd59_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.158s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:22:46 np0005593234 nova_compute[227762]: 2026-01-23 10:22:46.778 227766 INFO nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Deleting local config drive /var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/disk.config because it was imported into RBD.#033[00m
Jan 23 05:22:46 np0005593234 kernel: tap1fea5a6d-70: entered promiscuous mode
Jan 23 05:22:46 np0005593234 NetworkManager[48942]: <info>  [1769163766.8355] manager: (tap1fea5a6d-70): new Tun device (/org/freedesktop/NetworkManager/Devices/313)
Jan 23 05:22:46 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:46Z|00646|binding|INFO|Claiming lport 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 for this chassis.
Jan 23 05:22:46 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:46Z|00647|binding|INFO|1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9: Claiming fa:16:3e:8e:52:70 10.100.0.3
Jan 23 05:22:46 np0005593234 nova_compute[227762]: 2026-01-23 10:22:46.837 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:46.852 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:52:70 10.100.0.3'], port_security=['fa:16:3e:8e:52:70 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8010f6fe-77ef-48ec-952f-a3a65186cd59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16290d86-0a8d-403e-83f2-0ae47fb80e5f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'f0ab937f-5892-49c6-b2ad-6c661ac4d86b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bb4f46ff-5230-4660-946f-0fcefddd5977, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:22:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:46.853 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 in datapath 16290d86-0a8d-403e-83f2-0ae47fb80e5f bound to our chassis#033[00m
Jan 23 05:22:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:46.855 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 16290d86-0a8d-403e-83f2-0ae47fb80e5f#033[00m
Jan 23 05:22:46 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:46Z|00648|binding|INFO|Setting lport 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 up in Southbound
Jan 23 05:22:46 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:46Z|00649|binding|INFO|Setting lport 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 ovn-installed in OVS
Jan 23 05:22:46 np0005593234 nova_compute[227762]: 2026-01-23 10:22:46.868 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:46.869 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1d1f577a-8e63-42d0-8244-3bea15b1d95c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:46.869 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap16290d86-01 in ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:22:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:46.871 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap16290d86-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:22:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:46.872 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fc742ad4-ea73-49dc-8544-894c860c8dd3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:46.874 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[606244c0-6c45-4e8f-a330-19c0431eb8f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:46 np0005593234 nova_compute[227762]: 2026-01-23 10:22:46.874 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:46 np0005593234 systemd-machined[195626]: New machine qemu-74-instance-0000009e.
Jan 23 05:22:46 np0005593234 systemd-udevd[299562]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:22:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:46.890 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[d8b56742-4816-4324-a8da-e82666edb259]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:46 np0005593234 NetworkManager[48942]: <info>  [1769163766.8971] device (tap1fea5a6d-70): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:22:46 np0005593234 systemd[1]: Started Virtual Machine qemu-74-instance-0000009e.
Jan 23 05:22:46 np0005593234 NetworkManager[48942]: <info>  [1769163766.8980] device (tap1fea5a6d-70): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:22:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:46.916 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[833b89d0-c799-4837-8c72-5ea6c747b1f0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:46.945 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4b9c3097-8c73-475a-90b7-91e01b81854d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:46.954 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[527f6f8b-1fae-43ea-bd0c-5c96c368e3c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:46 np0005593234 NetworkManager[48942]: <info>  [1769163766.9559] manager: (tap16290d86-00): new Veth device (/org/freedesktop/NetworkManager/Devices/314)
Jan 23 05:22:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:46.992 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9e98a27c-fc70-4012-8d35-4358626bd2ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:46.996 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9d6920a5-2b9e-4248-ae79-f1702a3cd65c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:47 np0005593234 NetworkManager[48942]: <info>  [1769163767.0211] device (tap16290d86-00): carrier: link connected
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:47.025 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b0311011-f110-4fa2-bc70-a5eef6464d7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:47.045 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a80b9a74-81bd-4359-8a05-17cdcf4cdc66]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16290d86-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:f5:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 766198, 'reachable_time': 37439, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 299593, 'error': None, 'target': 'ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:47.066 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9b441a43-76ed-4043-bc3e-b768043ae0e2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe16:f52d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 766198, 'tstamp': 766198}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 299594, 'error': None, 'target': 'ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:47.088 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4b2e7201-1295-4582-8836-bfa0c0f829a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap16290d86-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:16:f5:2d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 202], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 766198, 'reachable_time': 37439, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 299599, 'error': None, 'target': 'ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:47.136 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6a070680-92b9-460d-936c-19ebc99b5634]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:47.208 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[10d4a7b9-f679-47bd-b978-1592bcccd914]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:47.209 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16290d86-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:47.210 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:47.210 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap16290d86-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.212 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:47 np0005593234 NetworkManager[48942]: <info>  [1769163767.2132] manager: (tap16290d86-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/315)
Jan 23 05:22:47 np0005593234 kernel: tap16290d86-00: entered promiscuous mode
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:47.216 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap16290d86-00, col_values=(('external_ids', {'iface-id': '941ae456-64ca-4338-b65f-ea519122a16f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.217 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:47 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:47Z|00650|binding|INFO|Releasing lport 941ae456-64ca-4338-b65f-ea519122a16f from this chassis (sb_readonly=0)
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.231 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:47.232 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/16290d86-0a8d-403e-83f2-0ae47fb80e5f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/16290d86-0a8d-403e-83f2-0ae47fb80e5f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:47.233 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[75c6a9e1-0402-4652-9e1a-b4fe2b5a349a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:47.233 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-16290d86-0a8d-403e-83f2-0ae47fb80e5f
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/16290d86-0a8d-403e-83f2-0ae47fb80e5f.pid.haproxy
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 16290d86-0a8d-403e-83f2-0ae47fb80e5f
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:22:47 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:47.234 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f', 'env', 'PROCESS_TAG=haproxy-16290d86-0a8d-403e-83f2-0ae47fb80e5f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/16290d86-0a8d-403e-83f2-0ae47fb80e5f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.253 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163767.2527623, 8010f6fe-77ef-48ec-952f-a3a65186cd59 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.253 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] VM Started (Lifecycle Event)#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.286 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.290 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163767.2528818, 8010f6fe-77ef-48ec-952f-a3a65186cd59 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.290 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.322 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.325 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.351 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:22:47 np0005593234 podman[299669]: 2026-01-23 10:22:47.608206162 +0000 UTC m=+0.062882854 container create c22b809e32081aaff0f7b4816564fa3ac1ca2bcf26aa3c01836e39e2888d5fcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 23 05:22:47 np0005593234 systemd[1]: Started libpod-conmon-c22b809e32081aaff0f7b4816564fa3ac1ca2bcf26aa3c01836e39e2888d5fcb.scope.
Jan 23 05:22:47 np0005593234 podman[299669]: 2026-01-23 10:22:47.57335382 +0000 UTC m=+0.028030562 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:22:47 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:22:47 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7233f00f7c1da591c032a081973e55dd4fc7f05eb49b31f769a4b069ecec4ac9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:22:47 np0005593234 podman[299669]: 2026-01-23 10:22:47.701744689 +0000 UTC m=+0.156421381 container init c22b809e32081aaff0f7b4816564fa3ac1ca2bcf26aa3c01836e39e2888d5fcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:22:47 np0005593234 podman[299669]: 2026-01-23 10:22:47.706678942 +0000 UTC m=+0.161355604 container start c22b809e32081aaff0f7b4816564fa3ac1ca2bcf26aa3c01836e39e2888d5fcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 05:22:47 np0005593234 neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f[299684]: [NOTICE]   (299688) : New worker (299690) forked
Jan 23 05:22:47 np0005593234 neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f[299684]: [NOTICE]   (299688) : Loading success.
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.831 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.853 227766 DEBUG nova.compute.manager [req-24e60658-8566-4e32-98b8-620cac000c56 req-089ca00c-ef17-4d3b-a15f-488415cc3d61 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-vif-plugged-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.853 227766 DEBUG oslo_concurrency.lockutils [req-24e60658-8566-4e32-98b8-620cac000c56 req-089ca00c-ef17-4d3b-a15f-488415cc3d61 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.853 227766 DEBUG oslo_concurrency.lockutils [req-24e60658-8566-4e32-98b8-620cac000c56 req-089ca00c-ef17-4d3b-a15f-488415cc3d61 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.854 227766 DEBUG oslo_concurrency.lockutils [req-24e60658-8566-4e32-98b8-620cac000c56 req-089ca00c-ef17-4d3b-a15f-488415cc3d61 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.854 227766 DEBUG nova.compute.manager [req-24e60658-8566-4e32-98b8-620cac000c56 req-089ca00c-ef17-4d3b-a15f-488415cc3d61 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Processing event network-vif-plugged-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.855 227766 DEBUG nova.compute.manager [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.859 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163767.8588197, 8010f6fe-77ef-48ec-952f-a3a65186cd59 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.859 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.860 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.864 227766 INFO nova.virt.libvirt.driver [-] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Instance spawned successfully.#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.864 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:22:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:47.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.946 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.949 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.955 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.955 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.956 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.957 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.958 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.958 227766 DEBUG nova.virt.libvirt.driver [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.983 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:22:47 np0005593234 nova_compute[227762]: 2026-01-23 10:22:47.984 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:48 np0005593234 nova_compute[227762]: 2026-01-23 10:22:48.033 227766 INFO nova.compute.manager [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Took 10.80 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:22:48 np0005593234 nova_compute[227762]: 2026-01-23 10:22:48.034 227766 DEBUG nova.compute.manager [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:22:48 np0005593234 nova_compute[227762]: 2026-01-23 10:22:48.202 227766 INFO nova.compute.manager [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Took 12.51 seconds to build instance.#033[00m
Jan 23 05:22:48 np0005593234 nova_compute[227762]: 2026-01-23 10:22:48.234 227766 DEBUG oslo_concurrency.lockutils [None req-5d9db404-df66-42ee-a923-675ed27c4305 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.642s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:48.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:49 np0005593234 nova_compute[227762]: 2026-01-23 10:22:49.157 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:49.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:50 np0005593234 nova_compute[227762]: 2026-01-23 10:22:50.011 227766 DEBUG nova.compute.manager [req-50024090-abbf-44a8-b5df-837f8290490a req-4695c033-5844-4303-8fef-beea705d1ef5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-vif-plugged-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:50 np0005593234 nova_compute[227762]: 2026-01-23 10:22:50.011 227766 DEBUG oslo_concurrency.lockutils [req-50024090-abbf-44a8-b5df-837f8290490a req-4695c033-5844-4303-8fef-beea705d1ef5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:22:50 np0005593234 nova_compute[227762]: 2026-01-23 10:22:50.012 227766 DEBUG oslo_concurrency.lockutils [req-50024090-abbf-44a8-b5df-837f8290490a req-4695c033-5844-4303-8fef-beea705d1ef5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:22:50 np0005593234 nova_compute[227762]: 2026-01-23 10:22:50.012 227766 DEBUG oslo_concurrency.lockutils [req-50024090-abbf-44a8-b5df-837f8290490a req-4695c033-5844-4303-8fef-beea705d1ef5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:22:50 np0005593234 nova_compute[227762]: 2026-01-23 10:22:50.012 227766 DEBUG nova.compute.manager [req-50024090-abbf-44a8-b5df-837f8290490a req-4695c033-5844-4303-8fef-beea705d1ef5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] No waiting events found dispatching network-vif-plugged-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:22:50 np0005593234 nova_compute[227762]: 2026-01-23 10:22:50.012 227766 WARNING nova.compute.manager [req-50024090-abbf-44a8-b5df-837f8290490a req-4695c033-5844-4303-8fef-beea705d1ef5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received unexpected event network-vif-plugged-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:22:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:22:50.505 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '58'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:22:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:50.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:50 np0005593234 nova_compute[227762]: 2026-01-23 10:22:50.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:50 np0005593234 nova_compute[227762]: 2026-01-23 10:22:50.839 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:22:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:51.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:22:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:52Z|00651|binding|INFO|Releasing lport 120d9d64-6853-4b50-a095-bddadd015ba1 from this chassis (sb_readonly=0)
Jan 23 05:22:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:22:52Z|00652|binding|INFO|Releasing lport 941ae456-64ca-4338-b65f-ea519122a16f from this chassis (sb_readonly=0)
Jan 23 05:22:52 np0005593234 nova_compute[227762]: 2026-01-23 10:22:52.422 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:52.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:52 np0005593234 nova_compute[227762]: 2026-01-23 10:22:52.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:52 np0005593234 nova_compute[227762]: 2026-01-23 10:22:52.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:22:52 np0005593234 nova_compute[227762]: 2026-01-23 10:22:52.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:22:52 np0005593234 nova_compute[227762]: 2026-01-23 10:22:52.833 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:53 np0005593234 nova_compute[227762]: 2026-01-23 10:22:53.194 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:22:53 np0005593234 nova_compute[227762]: 2026-01-23 10:22:53.194 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:53 np0005593234 nova_compute[227762]: 2026-01-23 10:22:53.195 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:22:53 np0005593234 nova_compute[227762]: 2026-01-23 10:22:53.195 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid a00a5042-ce71-4ecf-ab8f-d9e596d48035 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:22:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:22:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:53.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:22:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:22:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:54.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:22:54 np0005593234 nova_compute[227762]: 2026-01-23 10:22:54.755 227766 DEBUG nova.compute.manager [req-388cdc4e-7a53-479b-aece-c28d337877e8 req-8fa6d0e0-5235-47a9-b722-e484a0bddbfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-changed-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:54 np0005593234 nova_compute[227762]: 2026-01-23 10:22:54.756 227766 DEBUG nova.compute.manager [req-388cdc4e-7a53-479b-aece-c28d337877e8 req-8fa6d0e0-5235-47a9-b722-e484a0bddbfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Refreshing instance network info cache due to event network-changed-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:22:54 np0005593234 nova_compute[227762]: 2026-01-23 10:22:54.756 227766 DEBUG oslo_concurrency.lockutils [req-388cdc4e-7a53-479b-aece-c28d337877e8 req-8fa6d0e0-5235-47a9-b722-e484a0bddbfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:22:54 np0005593234 nova_compute[227762]: 2026-01-23 10:22:54.756 227766 DEBUG oslo_concurrency.lockutils [req-388cdc4e-7a53-479b-aece-c28d337877e8 req-8fa6d0e0-5235-47a9-b722-e484a0bddbfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:54 np0005593234 nova_compute[227762]: 2026-01-23 10:22:54.756 227766 DEBUG nova.network.neutron [req-388cdc4e-7a53-479b-aece-c28d337877e8 req-8fa6d0e0-5235-47a9-b722-e484a0bddbfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Refreshing network info cache for port 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:22:54 np0005593234 nova_compute[227762]: 2026-01-23 10:22:54.808 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:55 np0005593234 nova_compute[227762]: 2026-01-23 10:22:55.367 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updating instance_info_cache with network_info: [{"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.236", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:55 np0005593234 nova_compute[227762]: 2026-01-23 10:22:55.387 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:55 np0005593234 nova_compute[227762]: 2026-01-23 10:22:55.388 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:22:55 np0005593234 nova_compute[227762]: 2026-01-23 10:22:55.388 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:55 np0005593234 nova_compute[227762]: 2026-01-23 10:22:55.388 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:55 np0005593234 nova_compute[227762]: 2026-01-23 10:22:55.389 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:22:55 np0005593234 nova_compute[227762]: 2026-01-23 10:22:55.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_shelved_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:55 np0005593234 nova_compute[227762]: 2026-01-23 10:22:55.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:22:55 np0005593234 nova_compute[227762]: 2026-01-23 10:22:55.879 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:55.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:56.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:57 np0005593234 podman[299754]: 2026-01-23 10:22:57.758335733 +0000 UTC m=+0.054908858 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 23 05:22:57 np0005593234 nova_compute[227762]: 2026-01-23 10:22:57.835 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:22:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:22:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:57.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:58 np0005593234 nova_compute[227762]: 2026-01-23 10:22:58.222 227766 DEBUG nova.network.neutron [req-388cdc4e-7a53-479b-aece-c28d337877e8 req-8fa6d0e0-5235-47a9-b722-e484a0bddbfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updated VIF entry in instance network info cache for port 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:22:58 np0005593234 nova_compute[227762]: 2026-01-23 10:22:58.223 227766 DEBUG nova.network.neutron [req-388cdc4e-7a53-479b-aece-c28d337877e8 req-8fa6d0e0-5235-47a9-b722-e484a0bddbfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updating instance_info_cache with network_info: [{"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:22:58 np0005593234 nova_compute[227762]: 2026-01-23 10:22:58.253 227766 DEBUG oslo_concurrency.lockutils [req-388cdc4e-7a53-479b-aece-c28d337877e8 req-8fa6d0e0-5235-47a9-b722-e484a0bddbfa 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:22:58 np0005593234 nova_compute[227762]: 2026-01-23 10:22:58.320 227766 DEBUG nova.compute.manager [req-6cda09c7-2af9-45f6-be9b-b0ff80b6abd2 req-c315fa10-4ec6-46d7-8b59-8b5638c67ac8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-changed-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:22:58 np0005593234 nova_compute[227762]: 2026-01-23 10:22:58.321 227766 DEBUG nova.compute.manager [req-6cda09c7-2af9-45f6-be9b-b0ff80b6abd2 req-c315fa10-4ec6-46d7-8b59-8b5638c67ac8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Refreshing instance network info cache due to event network-changed-d4f06800-1f0a-4f50-b00d-b10219301efc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:22:58 np0005593234 nova_compute[227762]: 2026-01-23 10:22:58.321 227766 DEBUG oslo_concurrency.lockutils [req-6cda09c7-2af9-45f6-be9b-b0ff80b6abd2 req-c315fa10-4ec6-46d7-8b59-8b5638c67ac8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:22:58 np0005593234 nova_compute[227762]: 2026-01-23 10:22:58.321 227766 DEBUG oslo_concurrency.lockutils [req-6cda09c7-2af9-45f6-be9b-b0ff80b6abd2 req-c315fa10-4ec6-46d7-8b59-8b5638c67ac8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:22:58 np0005593234 nova_compute[227762]: 2026-01-23 10:22:58.321 227766 DEBUG nova.network.neutron [req-6cda09c7-2af9-45f6-be9b-b0ff80b6abd2 req-c315fa10-4ec6-46d7-8b59-8b5638c67ac8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Refreshing network info cache for port d4f06800-1f0a-4f50-b00d-b10219301efc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:22:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:58.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:22:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:22:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:22:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:59.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:23:00 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1491934135' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:00 np0005593234 nova_compute[227762]: 2026-01-23 10:23:00.553 227766 DEBUG nova.network.neutron [req-6cda09c7-2af9-45f6-be9b-b0ff80b6abd2 req-c315fa10-4ec6-46d7-8b59-8b5638c67ac8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updated VIF entry in instance network info cache for port d4f06800-1f0a-4f50-b00d-b10219301efc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:23:00 np0005593234 nova_compute[227762]: 2026-01-23 10:23:00.554 227766 DEBUG nova.network.neutron [req-6cda09c7-2af9-45f6-be9b-b0ff80b6abd2 req-c315fa10-4ec6-46d7-8b59-8b5638c67ac8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updating instance_info_cache with network_info: [{"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:00 np0005593234 nova_compute[227762]: 2026-01-23 10:23:00.575 227766 DEBUG oslo_concurrency.lockutils [req-6cda09c7-2af9-45f6-be9b-b0ff80b6abd2 req-c315fa10-4ec6-46d7-8b59-8b5638c67ac8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:23:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:00.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:00Z|00072|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8e:52:70 10.100.0.3
Jan 23 05:23:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:00Z|00073|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8e:52:70 10.100.0.3
Jan 23 05:23:00 np0005593234 nova_compute[227762]: 2026-01-23 10:23:00.881 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #136. Immutable memtables: 0.
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:01.905755) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 85] Flushing memtable with next log file: 136
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163781905868, "job": 85, "event": "flush_started", "num_memtables": 1, "num_entries": 2374, "num_deletes": 252, "total_data_size": 5612535, "memory_usage": 5700480, "flush_reason": "Manual Compaction"}
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 85] Level-0 flush table #137: started
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163781924591, "cf_name": "default", "job": 85, "event": "table_file_creation", "file_number": 137, "file_size": 3670488, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 65489, "largest_seqno": 67858, "table_properties": {"data_size": 3660948, "index_size": 5969, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20183, "raw_average_key_size": 20, "raw_value_size": 3641701, "raw_average_value_size": 3700, "num_data_blocks": 260, "num_entries": 984, "num_filter_entries": 984, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163574, "oldest_key_time": 1769163574, "file_creation_time": 1769163781, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 137, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 85] Flush lasted 18895 microseconds, and 7959 cpu microseconds.
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:01.924664) [db/flush_job.cc:967] [default] [JOB 85] Level-0 flush table #137: 3670488 bytes OK
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:01.924690) [db/memtable_list.cc:519] [default] Level-0 commit table #137 started
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:01.926380) [db/memtable_list.cc:722] [default] Level-0 commit table #137: memtable #1 done
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:01.926397) EVENT_LOG_v1 {"time_micros": 1769163781926391, "job": 85, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:01.926414) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 85] Try to delete WAL files size 5601989, prev total WAL file size 5622786, number of live WAL files 2.
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000133.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:01.928078) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730035373733' seq:72057594037927935, type:22 .. '7061786F730036303235' seq:0, type:0; will stop at (end)
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 86] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 85 Base level 0, inputs: [137(3584KB)], [135(10MB)]
Jan 23 05:23:01 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163781928198, "job": 86, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [137], "files_L6": [135], "score": -1, "input_data_size": 14308193, "oldest_snapshot_seqno": -1}
Jan 23 05:23:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:01.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 86] Generated table #138: 8825 keys, 12408197 bytes, temperature: kUnknown
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163782014885, "cf_name": "default", "job": 86, "event": "table_file_creation", "file_number": 138, "file_size": 12408197, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12350529, "index_size": 34515, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22085, "raw_key_size": 231612, "raw_average_key_size": 26, "raw_value_size": 12194814, "raw_average_value_size": 1381, "num_data_blocks": 1327, "num_entries": 8825, "num_filter_entries": 8825, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769163781, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 138, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:02.015162) [db/compaction/compaction_job.cc:1663] [default] [JOB 86] Compacted 1@0 + 1@6 files to L6 => 12408197 bytes
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:02.016842) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.9 rd, 143.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 10.1 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(7.3) write-amplify(3.4) OK, records in: 9348, records dropped: 523 output_compression: NoCompression
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:02.016861) EVENT_LOG_v1 {"time_micros": 1769163782016852, "job": 86, "event": "compaction_finished", "compaction_time_micros": 86765, "compaction_time_cpu_micros": 31803, "output_level": 6, "num_output_files": 1, "total_output_size": 12408197, "num_input_records": 9348, "num_output_records": 8825, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000137.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163782017529, "job": 86, "event": "table_file_deletion", "file_number": 137}
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000135.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163782019389, "job": 86, "event": "table_file_deletion", "file_number": 135}
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:01.927989) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:02.019488) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:02.019494) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:02.019497) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:02.019500) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:23:02.019502) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:23:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:02.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:02 np0005593234 nova_compute[227762]: 2026-01-23 10:23:02.838 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:23:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:03 np0005593234 nova_compute[227762]: 2026-01-23 10:23:03.655 227766 DEBUG nova.compute.manager [req-9b518d91-e616-4c82-9eba-d13ec3a9ea88 req-10e55a09-8871-4d07-b8c6-e41eb80bfff0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-changed-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:03 np0005593234 nova_compute[227762]: 2026-01-23 10:23:03.656 227766 DEBUG nova.compute.manager [req-9b518d91-e616-4c82-9eba-d13ec3a9ea88 req-10e55a09-8871-4d07-b8c6-e41eb80bfff0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Refreshing instance network info cache due to event network-changed-d4f06800-1f0a-4f50-b00d-b10219301efc. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:23:03 np0005593234 nova_compute[227762]: 2026-01-23 10:23:03.656 227766 DEBUG oslo_concurrency.lockutils [req-9b518d91-e616-4c82-9eba-d13ec3a9ea88 req-10e55a09-8871-4d07-b8c6-e41eb80bfff0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:03 np0005593234 nova_compute[227762]: 2026-01-23 10:23:03.656 227766 DEBUG oslo_concurrency.lockutils [req-9b518d91-e616-4c82-9eba-d13ec3a9ea88 req-10e55a09-8871-4d07-b8c6-e41eb80bfff0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:03 np0005593234 nova_compute[227762]: 2026-01-23 10:23:03.656 227766 DEBUG nova.network.neutron [req-9b518d91-e616-4c82-9eba-d13ec3a9ea88 req-10e55a09-8871-4d07-b8c6-e41eb80bfff0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Refreshing network info cache for port d4f06800-1f0a-4f50-b00d-b10219301efc _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:23:03 np0005593234 nova_compute[227762]: 2026-01-23 10:23:03.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:23:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:23:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:23:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:03.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:04.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:05 np0005593234 nova_compute[227762]: 2026-01-23 10:23:05.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:05 np0005593234 nova_compute[227762]: 2026-01-23 10:23:05.767 227766 DEBUG oslo_concurrency.lockutils [None req-179a9ee5-e6cb-4380-aa03-69a561ea7c3e c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:05 np0005593234 nova_compute[227762]: 2026-01-23 10:23:05.767 227766 DEBUG oslo_concurrency.lockutils [None req-179a9ee5-e6cb-4380-aa03-69a561ea7c3e c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:05 np0005593234 nova_compute[227762]: 2026-01-23 10:23:05.787 227766 INFO nova.compute.manager [None req-179a9ee5-e6cb-4380-aa03-69a561ea7c3e c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Detaching volume 7857fc75-1658-465f-a6a1-40f608f6408e#033[00m
Jan 23 05:23:05 np0005593234 podman[299908]: 2026-01-23 10:23:05.823541431 +0000 UTC m=+0.112769796 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2)
Jan 23 05:23:05 np0005593234 nova_compute[227762]: 2026-01-23 10:23:05.882 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:05.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:06 np0005593234 nova_compute[227762]: 2026-01-23 10:23:06.068 227766 INFO nova.virt.block_device [None req-179a9ee5-e6cb-4380-aa03-69a561ea7c3e c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Attempting to driver detach volume 7857fc75-1658-465f-a6a1-40f608f6408e from mountpoint /dev/vdb#033[00m
Jan 23 05:23:06 np0005593234 nova_compute[227762]: 2026-01-23 10:23:06.078 227766 DEBUG nova.virt.libvirt.driver [None req-179a9ee5-e6cb-4380-aa03-69a561ea7c3e c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Attempting to detach device vdb from instance a00a5042-ce71-4ecf-ab8f-d9e596d48035 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:23:06 np0005593234 nova_compute[227762]: 2026-01-23 10:23:06.078 227766 DEBUG nova.virt.libvirt.guest [None req-179a9ee5-e6cb-4380-aa03-69a561ea7c3e c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:23:06 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:23:06 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-7857fc75-1658-465f-a6a1-40f608f6408e">
Jan 23 05:23:06 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:23:06 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:23:06 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:23:06 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:23:06 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:23:06 np0005593234 nova_compute[227762]:  <serial>7857fc75-1658-465f-a6a1-40f608f6408e</serial>
Jan 23 05:23:06 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 23 05:23:06 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:23:06 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:23:06 np0005593234 nova_compute[227762]: 2026-01-23 10:23:06.086 227766 INFO nova.virt.libvirt.driver [None req-179a9ee5-e6cb-4380-aa03-69a561ea7c3e c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Successfully detached device vdb from instance a00a5042-ce71-4ecf-ab8f-d9e596d48035 from the persistent domain config.#033[00m
Jan 23 05:23:06 np0005593234 nova_compute[227762]: 2026-01-23 10:23:06.087 227766 DEBUG nova.virt.libvirt.driver [None req-179a9ee5-e6cb-4380-aa03-69a561ea7c3e c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance a00a5042-ce71-4ecf-ab8f-d9e596d48035 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:23:06 np0005593234 nova_compute[227762]: 2026-01-23 10:23:06.087 227766 DEBUG nova.virt.libvirt.guest [None req-179a9ee5-e6cb-4380-aa03-69a561ea7c3e c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:23:06 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:23:06 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-7857fc75-1658-465f-a6a1-40f608f6408e">
Jan 23 05:23:06 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:23:06 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:23:06 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:23:06 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:23:06 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:23:06 np0005593234 nova_compute[227762]:  <serial>7857fc75-1658-465f-a6a1-40f608f6408e</serial>
Jan 23 05:23:06 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 23 05:23:06 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:23:06 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:23:06 np0005593234 nova_compute[227762]: 2026-01-23 10:23:06.139 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769163786.13946, a00a5042-ce71-4ecf-ab8f-d9e596d48035 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:23:06 np0005593234 nova_compute[227762]: 2026-01-23 10:23:06.142 227766 DEBUG nova.virt.libvirt.driver [None req-179a9ee5-e6cb-4380-aa03-69a561ea7c3e c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance a00a5042-ce71-4ecf-ab8f-d9e596d48035 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:23:06 np0005593234 nova_compute[227762]: 2026-01-23 10:23:06.144 227766 INFO nova.virt.libvirt.driver [None req-179a9ee5-e6cb-4380-aa03-69a561ea7c3e c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Successfully detached device vdb from instance a00a5042-ce71-4ecf-ab8f-d9e596d48035 from the live domain config.#033[00m
Jan 23 05:23:06 np0005593234 nova_compute[227762]: 2026-01-23 10:23:06.205 227766 DEBUG nova.network.neutron [req-9b518d91-e616-4c82-9eba-d13ec3a9ea88 req-10e55a09-8871-4d07-b8c6-e41eb80bfff0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updated VIF entry in instance network info cache for port d4f06800-1f0a-4f50-b00d-b10219301efc. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:23:06 np0005593234 nova_compute[227762]: 2026-01-23 10:23:06.205 227766 DEBUG nova.network.neutron [req-9b518d91-e616-4c82-9eba-d13ec3a9ea88 req-10e55a09-8871-4d07-b8c6-e41eb80bfff0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updating instance_info_cache with network_info: [{"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:06 np0005593234 nova_compute[227762]: 2026-01-23 10:23:06.227 227766 DEBUG oslo_concurrency.lockutils [req-9b518d91-e616-4c82-9eba-d13ec3a9ea88 req-10e55a09-8871-4d07-b8c6-e41eb80bfff0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-a00a5042-ce71-4ecf-ab8f-d9e596d48035" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:23:06 np0005593234 nova_compute[227762]: 2026-01-23 10:23:06.497 227766 DEBUG nova.objects.instance [None req-179a9ee5-e6cb-4380-aa03-69a561ea7c3e c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'flavor' on Instance uuid a00a5042-ce71-4ecf-ab8f-d9e596d48035 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:23:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:06.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:07 np0005593234 nova_compute[227762]: 2026-01-23 10:23:07.088 227766 DEBUG oslo_concurrency.lockutils [None req-179a9ee5-e6cb-4380-aa03-69a561ea7c3e c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:07 np0005593234 nova_compute[227762]: 2026-01-23 10:23:07.810 227766 INFO nova.compute.manager [None req-3acab7cb-fc3b-4a5f-a0be-ddca89372f09 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Get console output#033[00m
Jan 23 05:23:07 np0005593234 nova_compute[227762]: 2026-01-23 10:23:07.815 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:23:07 np0005593234 nova_compute[227762]: 2026-01-23 10:23:07.839 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:07.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:08.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:09.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:23:10 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2835427716' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:23:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:23:10 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2835427716' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:23:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:10.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:10 np0005593234 nova_compute[227762]: 2026-01-23 10:23:10.884 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:23:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:23:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:11.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.132 227766 DEBUG oslo_concurrency.lockutils [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.133 227766 DEBUG oslo_concurrency.lockutils [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.133 227766 DEBUG oslo_concurrency.lockutils [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.134 227766 DEBUG oslo_concurrency.lockutils [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.134 227766 DEBUG oslo_concurrency.lockutils [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.136 227766 INFO nova.compute.manager [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Terminating instance#033[00m
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.137 227766 DEBUG nova.compute.manager [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:23:12 np0005593234 kernel: tapd4f06800-1f (unregistering): left promiscuous mode
Jan 23 05:23:12 np0005593234 NetworkManager[48942]: <info>  [1769163792.1857] device (tapd4f06800-1f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.196 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:12 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:12Z|00653|binding|INFO|Releasing lport d4f06800-1f0a-4f50-b00d-b10219301efc from this chassis (sb_readonly=0)
Jan 23 05:23:12 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:12Z|00654|binding|INFO|Setting lport d4f06800-1f0a-4f50-b00d-b10219301efc down in Southbound
Jan 23 05:23:12 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:12Z|00655|binding|INFO|Removing iface tapd4f06800-1f ovn-installed in OVS
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.198 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:12.348 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:01:4f:d5 10.100.0.13'], port_security=['fa:16:3e:01:4f:d5 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'a00a5042-ce71-4ecf-ab8f-d9e596d48035', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b976daabc8124a99814954633f99ed7b', 'neutron:revision_number': '8', 'neutron:security_group_ids': '77e10692-5f18-4d4e-ba14-6f09047b276a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0878063b-8606-438f-ae03-20f399cd80c4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=d4f06800-1f0a-4f50-b00d-b10219301efc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:23:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:12.350 144381 INFO neutron.agent.ovn.metadata.agent [-] Port d4f06800-1f0a-4f50-b00d-b10219301efc in datapath 8b38c3ca-73e5-4583-a277-cd0670deffdb unbound from our chassis#033[00m
Jan 23 05:23:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:12.351 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8b38c3ca-73e5-4583-a277-cd0670deffdb, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.373 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:12.373 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1f9744c3-ea94-48e7-90b0-0ac5ca251524]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:12.374 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb namespace which is not needed anymore#033[00m
Jan 23 05:23:12 np0005593234 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000009a.scope: Deactivated successfully.
Jan 23 05:23:12 np0005593234 systemd[1]: machine-qemu\x2d73\x2dinstance\x2d0000009a.scope: Consumed 15.474s CPU time.
Jan 23 05:23:12 np0005593234 systemd-machined[195626]: Machine qemu-73-instance-0000009a terminated.
Jan 23 05:23:12 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[299072]: [NOTICE]   (299076) : haproxy version is 2.8.14-c23fe91
Jan 23 05:23:12 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[299072]: [NOTICE]   (299076) : path to executable is /usr/sbin/haproxy
Jan 23 05:23:12 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[299072]: [WARNING]  (299076) : Exiting Master process...
Jan 23 05:23:12 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[299072]: [ALERT]    (299076) : Current worker (299078) exited with code 143 (Terminated)
Jan 23 05:23:12 np0005593234 neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb[299072]: [WARNING]  (299076) : All workers exited. Exiting... (0)
Jan 23 05:23:12 np0005593234 systemd[1]: libpod-1e80373eb38ef570f86ab277161cae7ff4155b8d8a5d3fc652e2b0947a9470f4.scope: Deactivated successfully.
Jan 23 05:23:12 np0005593234 podman[300015]: 2026-01-23 10:23:12.508236641 +0000 UTC m=+0.044633918 container died 1e80373eb38ef570f86ab277161cae7ff4155b8d8a5d3fc652e2b0947a9470f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:23:12 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1e80373eb38ef570f86ab277161cae7ff4155b8d8a5d3fc652e2b0947a9470f4-userdata-shm.mount: Deactivated successfully.
Jan 23 05:23:12 np0005593234 systemd[1]: var-lib-containers-storage-overlay-4f509d27450a5749b96a9920c873f79f4dba1895762471a6454651729a6ad10e-merged.mount: Deactivated successfully.
Jan 23 05:23:12 np0005593234 podman[300015]: 2026-01-23 10:23:12.54364003 +0000 UTC m=+0.080037307 container cleanup 1e80373eb38ef570f86ab277161cae7ff4155b8d8a5d3fc652e2b0947a9470f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:23:12 np0005593234 systemd[1]: libpod-conmon-1e80373eb38ef570f86ab277161cae7ff4155b8d8a5d3fc652e2b0947a9470f4.scope: Deactivated successfully.
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.572 227766 INFO nova.virt.libvirt.driver [-] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Instance destroyed successfully.#033[00m
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.573 227766 DEBUG nova.objects.instance [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lazy-loading 'resources' on Instance uuid a00a5042-ce71-4ecf-ab8f-d9e596d48035 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:23:12 np0005593234 podman[300048]: 2026-01-23 10:23:12.606707191 +0000 UTC m=+0.041930644 container remove 1e80373eb38ef570f86ab277161cae7ff4155b8d8a5d3fc652e2b0947a9470f4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.606 227766 DEBUG nova.virt.libvirt.vif [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:20:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestMinimumBasicScenario-server-79223605',display_name='tempest-TestMinimumBasicScenario-server-79223605',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testminimumbasicscenario-server-79223605',id=154,image_ref='bba8f8ac-6563-4b96-a735-670d31b1818b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO8Y9UEL92/+TB+I+GNhaZt1mYMByc7/BrYfEDaKlAAZo7j91A8ceJavobN2fd/HuU5MXKggpmRNE2fbSVJxSSFeNnWSzt9Sqrij7kFCnUGkI6fsAtTHbWMsV0NwSH55dw==',key_name='tempest-TestMinimumBasicScenario-867634297',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:00Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b976daabc8124a99814954633f99ed7b',ramdisk_id='',reservation_id='r-o6qhqwf7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='bba8f8ac-6563-4b96-a735-670d31b1818b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestMinimumBasicScenario-1465373740',owner_user_name='tempest-TestMinimumBasicScenario-1465373740-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:22:22Z,user_data=None,user_id='c041da0a601a4260b29fc9c65719597f',uuid=a00a5042-ce71-4ecf-ab8f-d9e596d48035,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.607 227766 DEBUG nova.network.os_vif_util [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converting VIF {"id": "d4f06800-1f0a-4f50-b00d-b10219301efc", "address": "fa:16:3e:01:4f:d5", "network": {"id": "8b38c3ca-73e5-4583-a277-cd0670deffdb", "bridge": "br-int", "label": "tempest-TestMinimumBasicScenario-1932395812-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b976daabc8124a99814954633f99ed7b", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd4f06800-1f", "ovs_interfaceid": "d4f06800-1f0a-4f50-b00d-b10219301efc", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.608 227766 DEBUG nova.network.os_vif_util [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:01:4f:d5,bridge_name='br-int',has_traffic_filtering=True,id=d4f06800-1f0a-4f50-b00d-b10219301efc,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f06800-1f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.608 227766 DEBUG os_vif [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:4f:d5,bridge_name='br-int',has_traffic_filtering=True,id=d4f06800-1f0a-4f50-b00d-b10219301efc,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f06800-1f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.609 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.610 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd4f06800-1f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.611 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.612 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:23:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:12.612 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[57a2ebb5-627a-467c-a49b-8dbecfaec408]: (4, ('Fri Jan 23 10:23:12 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb (1e80373eb38ef570f86ab277161cae7ff4155b8d8a5d3fc652e2b0947a9470f4)\n1e80373eb38ef570f86ab277161cae7ff4155b8d8a5d3fc652e2b0947a9470f4\nFri Jan 23 10:23:12 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb (1e80373eb38ef570f86ab277161cae7ff4155b8d8a5d3fc652e2b0947a9470f4)\n1e80373eb38ef570f86ab277161cae7ff4155b8d8a5d3fc652e2b0947a9470f4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:23:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:12.614 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7354435e-1bc4-4256-aa75-a4e45d7f69da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.614 227766 INFO os_vif [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:01:4f:d5,bridge_name='br-int',has_traffic_filtering=True,id=d4f06800-1f0a-4f50-b00d-b10219301efc,network=Network(8b38c3ca-73e5-4583-a277-cd0670deffdb),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd4f06800-1f')
Jan 23 05:23:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:12.615 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8b38c3ca-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:23:12 np0005593234 kernel: tap8b38c3ca-70: left promiscuous mode
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.632 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:23:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:12.632 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c776c1fb-e490-49f6-9051-99e113f23d30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:23:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:12.649 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d4682673-9d35-4e80-b915-4c700711ec7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:23:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:12.650 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0bebaefc-b12a-4e48-bac2-09ec077903e8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:23:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:12.665 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[695e7d77-ca45-4d28-b050-9324b63331cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 763683, 'reachable_time': 38584, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 300091, 'error': None, 'target': 'ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:12.668 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8b38c3ca-73e5-4583-a277-cd0670deffdb deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 05:23:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:12.669 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[d80a5989-ec21-43ad-a175-0dc1b6b1a0c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:23:12 np0005593234 systemd[1]: run-netns-ovnmeta\x2d8b38c3ca\x2d73e5\x2d4583\x2da277\x2dcd0670deffdb.mount: Deactivated successfully.
Jan 23 05:23:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:12.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:12 np0005593234 nova_compute[227762]: 2026-01-23 10:23:12.840 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:23:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e315 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.059 227766 INFO nova.virt.libvirt.driver [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Deleting instance files /var/lib/nova/instances/a00a5042-ce71-4ecf-ab8f-d9e596d48035_del
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.059 227766 INFO nova.virt.libvirt.driver [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Deletion of /var/lib/nova/instances/a00a5042-ce71-4ecf-ab8f-d9e596d48035_del complete
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.126 227766 INFO nova.compute.manager [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Took 0.99 seconds to destroy the instance on the hypervisor.
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.127 227766 DEBUG oslo.service.loopingcall [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.127 227766 DEBUG nova.compute.manager [-] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.128 227766 DEBUG nova.network.neutron [-] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.386 227766 DEBUG nova.compute.manager [req-e48ba7c7-0c99-44d3-8ac0-d799af4cbc9f req-75c68762-7717-4458-938c-c1265574a364 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-vif-unplugged-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.386 227766 DEBUG oslo_concurrency.lockutils [req-e48ba7c7-0c99-44d3-8ac0-d799af4cbc9f req-75c68762-7717-4458-938c-c1265574a364 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.387 227766 DEBUG oslo_concurrency.lockutils [req-e48ba7c7-0c99-44d3-8ac0-d799af4cbc9f req-75c68762-7717-4458-938c-c1265574a364 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.387 227766 DEBUG oslo_concurrency.lockutils [req-e48ba7c7-0c99-44d3-8ac0-d799af4cbc9f req-75c68762-7717-4458-938c-c1265574a364 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.387 227766 DEBUG nova.compute.manager [req-e48ba7c7-0c99-44d3-8ac0-d799af4cbc9f req-75c68762-7717-4458-938c-c1265574a364 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] No waiting events found dispatching network-vif-unplugged-d4f06800-1f0a-4f50-b00d-b10219301efc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.387 227766 DEBUG nova.compute.manager [req-e48ba7c7-0c99-44d3-8ac0-d799af4cbc9f req-75c68762-7717-4458-938c-c1265574a364 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-vif-unplugged-d4f06800-1f0a-4f50-b00d-b10219301efc for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.387 227766 DEBUG nova.compute.manager [req-e48ba7c7-0c99-44d3-8ac0-d799af4cbc9f req-75c68762-7717-4458-938c-c1265574a364 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.388 227766 DEBUG oslo_concurrency.lockutils [req-e48ba7c7-0c99-44d3-8ac0-d799af4cbc9f req-75c68762-7717-4458-938c-c1265574a364 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.388 227766 DEBUG oslo_concurrency.lockutils [req-e48ba7c7-0c99-44d3-8ac0-d799af4cbc9f req-75c68762-7717-4458-938c-c1265574a364 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.388 227766 DEBUG oslo_concurrency.lockutils [req-e48ba7c7-0c99-44d3-8ac0-d799af4cbc9f req-75c68762-7717-4458-938c-c1265574a364 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.388 227766 DEBUG nova.compute.manager [req-e48ba7c7-0c99-44d3-8ac0-d799af4cbc9f req-75c68762-7717-4458-938c-c1265574a364 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] No waiting events found dispatching network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.388 227766 WARNING nova.compute.manager [req-e48ba7c7-0c99-44d3-8ac0-d799af4cbc9f req-75c68762-7717-4458-938c-c1265574a364 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received unexpected event network-vif-plugged-d4f06800-1f0a-4f50-b00d-b10219301efc for instance with vm_state active and task_state deleting.
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.627 227766 DEBUG oslo_concurrency.lockutils [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "interface-8010f6fe-77ef-48ec-952f-a3a65186cd59-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.627 227766 DEBUG oslo_concurrency.lockutils [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "interface-8010f6fe-77ef-48ec-952f-a3a65186cd59-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:23:13 np0005593234 nova_compute[227762]: 2026-01-23 10:23:13.627 227766 DEBUG nova.objects.instance [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'flavor' on Instance uuid 8010f6fe-77ef-48ec-952f-a3a65186cd59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:23:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 23 05:23:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:13.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:14 np0005593234 nova_compute[227762]: 2026-01-23 10:23:14.430 227766 DEBUG nova.network.neutron [-] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:23:14 np0005593234 nova_compute[227762]: 2026-01-23 10:23:14.460 227766 INFO nova.compute.manager [-] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Took 1.33 seconds to deallocate network for instance.
Jan 23 05:23:14 np0005593234 nova_compute[227762]: 2026-01-23 10:23:14.486 227766 DEBUG nova.objects.instance [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'pci_requests' on Instance uuid 8010f6fe-77ef-48ec-952f-a3a65186cd59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:23:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:14.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:14 np0005593234 nova_compute[227762]: 2026-01-23 10:23:14.727 227766 DEBUG nova.compute.manager [req-ff55822c-fd18-4248-9b66-ca575eae1b2a req-2259bca3-50a8-46d2-be6f-c54cebdf166c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Received event network-vif-deleted-d4f06800-1f0a-4f50-b00d-b10219301efc external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:23:14 np0005593234 nova_compute[227762]: 2026-01-23 10:23:14.737 227766 DEBUG oslo_concurrency.lockutils [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:23:14 np0005593234 nova_compute[227762]: 2026-01-23 10:23:14.737 227766 DEBUG oslo_concurrency.lockutils [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:23:14 np0005593234 nova_compute[227762]: 2026-01-23 10:23:14.742 227766 DEBUG nova.network.neutron [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 05:23:14 np0005593234 nova_compute[227762]: 2026-01-23 10:23:14.847 227766 DEBUG oslo_concurrency.processutils [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:23:15 np0005593234 nova_compute[227762]: 2026-01-23 10:23:15.120 227766 DEBUG nova.policy [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60291ce86b6946629a2e48f6680312cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 05:23:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:23:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/301625785' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:15 np0005593234 nova_compute[227762]: 2026-01-23 10:23:15.271 227766 DEBUG oslo_concurrency.processutils [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:23:15 np0005593234 nova_compute[227762]: 2026-01-23 10:23:15.276 227766 DEBUG nova.compute.provider_tree [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:23:15 np0005593234 nova_compute[227762]: 2026-01-23 10:23:15.308 227766 DEBUG nova.scheduler.client.report [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:23:15 np0005593234 nova_compute[227762]: 2026-01-23 10:23:15.332 227766 DEBUG oslo_concurrency.lockutils [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:23:15 np0005593234 nova_compute[227762]: 2026-01-23 10:23:15.370 227766 INFO nova.scheduler.client.report [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Deleted allocations for instance a00a5042-ce71-4ecf-ab8f-d9e596d48035
Jan 23 05:23:15 np0005593234 nova_compute[227762]: 2026-01-23 10:23:15.458 227766 DEBUG oslo_concurrency.lockutils [None req-e53f2a87-a0e5-41bb-99e2-c3347924b674 c041da0a601a4260b29fc9c65719597f b976daabc8124a99814954633f99ed7b - - default default] Lock "a00a5042-ce71-4ecf-ab8f-d9e596d48035" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.325s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:23:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:15.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:16 np0005593234 nova_compute[227762]: 2026-01-23 10:23:16.003 227766 DEBUG nova.network.neutron [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Successfully created port: 419310e6-0055-4c1d-8cdb-be034824b754 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:23:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:16.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:17 np0005593234 nova_compute[227762]: 2026-01-23 10:23:17.244 227766 DEBUG nova.network.neutron [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Successfully updated port: 419310e6-0055-4c1d-8cdb-be034824b754 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:23:17 np0005593234 nova_compute[227762]: 2026-01-23 10:23:17.270 227766 DEBUG oslo_concurrency.lockutils [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:17 np0005593234 nova_compute[227762]: 2026-01-23 10:23:17.270 227766 DEBUG oslo_concurrency.lockutils [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquired lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:17 np0005593234 nova_compute[227762]: 2026-01-23 10:23:17.270 227766 DEBUG nova.network.neutron [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:23:17 np0005593234 nova_compute[227762]: 2026-01-23 10:23:17.353 227766 DEBUG nova.compute.manager [req-cae9b8e5-d62f-402a-876c-218ccea266f7 req-18c582c0-1790-4123-a9eb-0b5367021267 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-changed-419310e6-0055-4c1d-8cdb-be034824b754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:17 np0005593234 nova_compute[227762]: 2026-01-23 10:23:17.354 227766 DEBUG nova.compute.manager [req-cae9b8e5-d62f-402a-876c-218ccea266f7 req-18c582c0-1790-4123-a9eb-0b5367021267 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Refreshing instance network info cache due to event network-changed-419310e6-0055-4c1d-8cdb-be034824b754. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:23:17 np0005593234 nova_compute[227762]: 2026-01-23 10:23:17.354 227766 DEBUG oslo_concurrency.lockutils [req-cae9b8e5-d62f-402a-876c-218ccea266f7 req-18c582c0-1790-4123-a9eb-0b5367021267 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e316 e316: 3 total, 3 up, 3 in
Jan 23 05:23:17 np0005593234 nova_compute[227762]: 2026-01-23 10:23:17.613 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:17 np0005593234 nova_compute[227762]: 2026-01-23 10:23:17.843 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e316 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:17.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:18.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e317 e317: 3 total, 3 up, 3 in
Jan 23 05:23:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:19.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:20.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:21.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:22 np0005593234 nova_compute[227762]: 2026-01-23 10:23:22.104 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:22 np0005593234 nova_compute[227762]: 2026-01-23 10:23:22.677 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:22.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:22 np0005593234 nova_compute[227762]: 2026-01-23 10:23:22.844 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:22Z|00656|binding|INFO|Releasing lport 941ae456-64ca-4338-b65f-ea519122a16f from this chassis (sb_readonly=0)
Jan 23 05:23:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e317 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:22 np0005593234 nova_compute[227762]: 2026-01-23 10:23:22.949 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:22 np0005593234 nova_compute[227762]: 2026-01-23 10:23:22.990 227766 DEBUG nova.network.neutron [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updating instance_info_cache with network_info: [{"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.018 227766 DEBUG oslo_concurrency.lockutils [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Releasing lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.019 227766 DEBUG oslo_concurrency.lockutils [req-cae9b8e5-d62f-402a-876c-218ccea266f7 req-18c582c0-1790-4123-a9eb-0b5367021267 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.020 227766 DEBUG nova.network.neutron [req-cae9b8e5-d62f-402a-876c-218ccea266f7 req-18c582c0-1790-4123-a9eb-0b5367021267 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Refreshing network info cache for port 419310e6-0055-4c1d-8cdb-be034824b754 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.022 227766 DEBUG nova.virt.libvirt.vif [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:22:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1311975023',display_name='tempest-TestNetworkBasicOps-server-1311975023',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1311975023',id=158,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL/weReemtkH37O5IRYNNKB/TT2YWx12mLIlrues6ra7aWugRfActXRaloGeDbBIe3P4JQOsAt7qcWUIg9CugCzIu3x3QAkzyj6XRgUoI3pt8WnzsopEYd4m2Fn+OmRPbg==',key_name='tempest-TestNetworkBasicOps-1841453028',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:22:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-3yzv077h',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:22:48Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=8010f6fe-77ef-48ec-952f-a3a65186cd59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.022 227766 DEBUG nova.network.os_vif_util [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.023 227766 DEBUG nova.network.os_vif_util [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:44:0e,bridge_name='br-int',has_traffic_filtering=True,id=419310e6-0055-4c1d-8cdb-be034824b754,network=Network(e227a777-0e88-4409-a4a5-266ef225baae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap419310e6-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.023 227766 DEBUG os_vif [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:44:0e,bridge_name='br-int',has_traffic_filtering=True,id=419310e6-0055-4c1d-8cdb-be034824b754,network=Network(e227a777-0e88-4409-a4a5-266ef225baae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap419310e6-00') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.024 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.024 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.024 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.027 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.027 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap419310e6-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.027 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap419310e6-00, col_values=(('external_ids', {'iface-id': '419310e6-0055-4c1d-8cdb-be034824b754', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:14:44:0e', 'vm-uuid': '8010f6fe-77ef-48ec-952f-a3a65186cd59'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.028 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:23 np0005593234 NetworkManager[48942]: <info>  [1769163803.0295] manager: (tap419310e6-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/316)
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.030 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.034 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.035 227766 INFO os_vif [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:14:44:0e,bridge_name='br-int',has_traffic_filtering=True,id=419310e6-0055-4c1d-8cdb-be034824b754,network=Network(e227a777-0e88-4409-a4a5-266ef225baae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap419310e6-00')#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.036 227766 DEBUG nova.virt.libvirt.vif [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:22:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1311975023',display_name='tempest-TestNetworkBasicOps-server-1311975023',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1311975023',id=158,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL/weReemtkH37O5IRYNNKB/TT2YWx12mLIlrues6ra7aWugRfActXRaloGeDbBIe3P4JQOsAt7qcWUIg9CugCzIu3x3QAkzyj6XRgUoI3pt8WnzsopEYd4m2Fn+OmRPbg==',key_name='tempest-TestNetworkBasicOps-1841453028',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:22:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-3yzv077h',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:22:48Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=8010f6fe-77ef-48ec-952f-a3a65186cd59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.036 227766 DEBUG nova.network.os_vif_util [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.037 227766 DEBUG nova.network.os_vif_util [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:14:44:0e,bridge_name='br-int',has_traffic_filtering=True,id=419310e6-0055-4c1d-8cdb-be034824b754,network=Network(e227a777-0e88-4409-a4a5-266ef225baae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap419310e6-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.040 227766 DEBUG nova.virt.libvirt.guest [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] attach device xml: <interface type="ethernet">
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  <mac address="fa:16:3e:14:44:0e"/>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  <model type="virtio"/>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  <mtu size="1442"/>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  <target dev="tap419310e6-00"/>
Jan 23 05:23:23 np0005593234 nova_compute[227762]: </interface>
Jan 23 05:23:23 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:23:23 np0005593234 kernel: tap419310e6-00: entered promiscuous mode
Jan 23 05:23:23 np0005593234 NetworkManager[48942]: <info>  [1769163803.0518] manager: (tap419310e6-00): new Tun device (/org/freedesktop/NetworkManager/Devices/317)
Jan 23 05:23:23 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:23Z|00657|binding|INFO|Claiming lport 419310e6-0055-4c1d-8cdb-be034824b754 for this chassis.
Jan 23 05:23:23 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:23Z|00658|binding|INFO|419310e6-0055-4c1d-8cdb-be034824b754: Claiming fa:16:3e:14:44:0e 10.100.0.18
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.053 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.062 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:44:0e 10.100.0.18'], port_security=['fa:16:3e:14:44:0e 10.100.0.18'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': '8010f6fe-77ef-48ec-952f-a3a65186cd59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e227a777-0e88-4409-a4a5-266ef225baae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '104c556a-4616-455b-9049-a55a5af0ff57', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9352757c-3308-4452-a338-cff1ca2f64b6, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=419310e6-0055-4c1d-8cdb-be034824b754) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.063 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 419310e6-0055-4c1d-8cdb-be034824b754 in datapath e227a777-0e88-4409-a4a5-266ef225baae bound to our chassis#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.065 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e227a777-0e88-4409-a4a5-266ef225baae#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.075 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[54292499-0a22-4ec5-a97e-603490a3b497]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.075 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape227a777-01 in ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.077 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape227a777-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.077 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[83efac0a-e53d-4e50-b1eb-e7d8d2daacf5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.078 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f8c70573-b10d-4690-9184-0badc7e06b3a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 systemd-udevd[300179]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.090 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[ef972f0c-12e1-4a4d-9b86-dfe926f8afdd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.092 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:23 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:23Z|00659|binding|INFO|Setting lport 419310e6-0055-4c1d-8cdb-be034824b754 ovn-installed in OVS
Jan 23 05:23:23 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:23Z|00660|binding|INFO|Setting lport 419310e6-0055-4c1d-8cdb-be034824b754 up in Southbound
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.095 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:23 np0005593234 NetworkManager[48942]: <info>  [1769163803.0988] device (tap419310e6-00): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:23:23 np0005593234 NetworkManager[48942]: <info>  [1769163803.1002] device (tap419310e6-00): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.116 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0baba8a5-b46b-4b68-a89b-714e8bab3b93]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.147 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[693daa11-d24c-4dc4-a8f8-e5298ce4c47d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.151 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3af8ad7f-8d50-4f41-b2db-873ed1d92c48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 NetworkManager[48942]: <info>  [1769163803.1527] manager: (tape227a777-00): new Veth device (/org/freedesktop/NetworkManager/Devices/318)
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.156 227766 DEBUG nova.virt.libvirt.driver [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.156 227766 DEBUG nova.virt.libvirt.driver [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.157 227766 DEBUG nova.virt.libvirt.driver [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No VIF found with MAC fa:16:3e:8e:52:70, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.157 227766 DEBUG nova.virt.libvirt.driver [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No VIF found with MAC fa:16:3e:14:44:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.182 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[782e5e36-36c7-402d-b7c2-cec556a77ff3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.184 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d3878373-f3a4-41b4-8896-d75c6fbc2bcc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.188 227766 DEBUG nova.virt.libvirt.guest [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  <nova:name>tempest-TestNetworkBasicOps-server-1311975023</nova:name>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 10:23:23</nova:creationTime>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 05:23:23 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:    <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:    <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:    <nova:port uuid="1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9">
Jan 23 05:23:23 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:    <nova:port uuid="419310e6-0055-4c1d-8cdb-be034824b754">
Jan 23 05:23:23 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 05:23:23 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 05:23:23 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 05:23:23 np0005593234 nova_compute[227762]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 05:23:23 np0005593234 NetworkManager[48942]: <info>  [1769163803.2055] device (tape227a777-00): carrier: link connected
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.210 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[1e25e249-7441-4089-a116-e2b7f24b6bdc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.229 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0d4302af-09c5-409c-b83d-9f1b3845bdef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape227a777-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:39:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 769816, 'reachable_time': 23525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 300205, 'error': None, 'target': 'ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.230 227766 DEBUG oslo_concurrency.lockutils [None req-35af6789-352b-49fd-ba9d-5a185c6a09a7 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "interface-8010f6fe-77ef-48ec-952f-a3a65186cd59-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 9.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.242 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba8457c-bd58-4b91-ad7f-258e15288ac6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fead:3980'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 769816, 'tstamp': 769816}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 300206, 'error': None, 'target': 'ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.255 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fef79a69-088a-4fc7-992d-de7d005aedbe]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape227a777-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ad:39:80'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 205], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 769816, 'reachable_time': 23525, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 300207, 'error': None, 'target': 'ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.279 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3d37c38c-44d6-4d9a-bbe6-42a6baac4dbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.334 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf168a5-a93e-468b-aaa9-d0355c540be8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.335 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape227a777-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.335 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.336 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape227a777-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:23 np0005593234 kernel: tape227a777-00: entered promiscuous mode
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.337 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:23 np0005593234 NetworkManager[48942]: <info>  [1769163803.3381] manager: (tape227a777-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/319)
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.340 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape227a777-00, col_values=(('external_ids', {'iface-id': 'bec6dd0a-3f1b-4b60-8b52-1e4ba653a56d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.341 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:23 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:23Z|00661|binding|INFO|Releasing lport bec6dd0a-3f1b-4b60-8b52-1e4ba653a56d from this chassis (sb_readonly=0)
Jan 23 05:23:23 np0005593234 nova_compute[227762]: 2026-01-23 10:23:23.355 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.356 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e227a777-0e88-4409-a4a5-266ef225baae.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e227a777-0e88-4409-a4a5-266ef225baae.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.357 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c7a285a4-7f74-4dbd-8eab-e3140233405c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.358 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-e227a777-0e88-4409-a4a5-266ef225baae
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/e227a777-0e88-4409-a4a5-266ef225baae.pid.haproxy
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID e227a777-0e88-4409-a4a5-266ef225baae
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:23:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:23.359 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae', 'env', 'PROCESS_TAG=haproxy-e227a777-0e88-4409-a4a5-266ef225baae', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e227a777-0e88-4409-a4a5-266ef225baae.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:23:23 np0005593234 podman[300240]: 2026-01-23 10:23:23.702929788 +0000 UTC m=+0.061202563 container create bffed677e021f0f18575d0ec30c5b3063f4aa900b9c9f0290003f1985097f3bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:23:23 np0005593234 systemd[1]: Started libpod-conmon-bffed677e021f0f18575d0ec30c5b3063f4aa900b9c9f0290003f1985097f3bf.scope.
Jan 23 05:23:23 np0005593234 podman[300240]: 2026-01-23 10:23:23.668971803 +0000 UTC m=+0.027244668 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:23:23 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:23:23 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6f175c24ab90f6e7bcde53409dc2e1ecb05ada2bd33a5a2f79fb3d3161f4486/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:23:23 np0005593234 podman[300240]: 2026-01-23 10:23:23.79566491 +0000 UTC m=+0.153937705 container init bffed677e021f0f18575d0ec30c5b3063f4aa900b9c9f0290003f1985097f3bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:23:23 np0005593234 podman[300240]: 2026-01-23 10:23:23.802052918 +0000 UTC m=+0.160325693 container start bffed677e021f0f18575d0ec30c5b3063f4aa900b9c9f0290003f1985097f3bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:23:23 np0005593234 neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae[300256]: [NOTICE]   (300260) : New worker (300262) forked
Jan 23 05:23:23 np0005593234 neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae[300256]: [NOTICE]   (300260) : Loading success.
Jan 23 05:23:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:23.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:24 np0005593234 nova_compute[227762]: 2026-01-23 10:23:24.153 227766 DEBUG nova.compute.manager [req-cb2bd216-0e4c-48d1-8fbf-36926f741529 req-7741ecf3-b60f-47a9-92e6-feb4f8af03bf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-vif-plugged-419310e6-0055-4c1d-8cdb-be034824b754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:24 np0005593234 nova_compute[227762]: 2026-01-23 10:23:24.154 227766 DEBUG oslo_concurrency.lockutils [req-cb2bd216-0e4c-48d1-8fbf-36926f741529 req-7741ecf3-b60f-47a9-92e6-feb4f8af03bf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:24 np0005593234 nova_compute[227762]: 2026-01-23 10:23:24.154 227766 DEBUG oslo_concurrency.lockutils [req-cb2bd216-0e4c-48d1-8fbf-36926f741529 req-7741ecf3-b60f-47a9-92e6-feb4f8af03bf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:24 np0005593234 nova_compute[227762]: 2026-01-23 10:23:24.155 227766 DEBUG oslo_concurrency.lockutils [req-cb2bd216-0e4c-48d1-8fbf-36926f741529 req-7741ecf3-b60f-47a9-92e6-feb4f8af03bf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:24 np0005593234 nova_compute[227762]: 2026-01-23 10:23:24.155 227766 DEBUG nova.compute.manager [req-cb2bd216-0e4c-48d1-8fbf-36926f741529 req-7741ecf3-b60f-47a9-92e6-feb4f8af03bf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] No waiting events found dispatching network-vif-plugged-419310e6-0055-4c1d-8cdb-be034824b754 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:23:24 np0005593234 nova_compute[227762]: 2026-01-23 10:23:24.155 227766 WARNING nova.compute.manager [req-cb2bd216-0e4c-48d1-8fbf-36926f741529 req-7741ecf3-b60f-47a9-92e6-feb4f8af03bf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received unexpected event network-vif-plugged-419310e6-0055-4c1d-8cdb-be034824b754 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:23:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:24.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:25 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:25Z|00074|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:14:44:0e 10.100.0.18
Jan 23 05:23:25 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:25Z|00075|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:14:44:0e 10.100.0.18
Jan 23 05:23:25 np0005593234 nova_compute[227762]: 2026-01-23 10:23:25.932 227766 DEBUG nova.network.neutron [req-cae9b8e5-d62f-402a-876c-218ccea266f7 req-18c582c0-1790-4123-a9eb-0b5367021267 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updated VIF entry in instance network info cache for port 419310e6-0055-4c1d-8cdb-be034824b754. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:23:25 np0005593234 nova_compute[227762]: 2026-01-23 10:23:25.933 227766 DEBUG nova.network.neutron [req-cae9b8e5-d62f-402a-876c-218ccea266f7 req-18c582c0-1790-4123-a9eb-0b5367021267 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updating instance_info_cache with network_info: [{"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:25.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:25 np0005593234 nova_compute[227762]: 2026-01-23 10:23:25.974 227766 DEBUG oslo_concurrency.lockutils [req-cae9b8e5-d62f-402a-876c-218ccea266f7 req-18c582c0-1790-4123-a9eb-0b5367021267 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:23:26 np0005593234 nova_compute[227762]: 2026-01-23 10:23:26.374 227766 DEBUG nova.compute.manager [req-a331be14-f80a-4e13-90fa-69d811abe967 req-eb2bfaae-a371-4d75-986d-7f0cfc3eed9a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-vif-plugged-419310e6-0055-4c1d-8cdb-be034824b754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:26 np0005593234 nova_compute[227762]: 2026-01-23 10:23:26.375 227766 DEBUG oslo_concurrency.lockutils [req-a331be14-f80a-4e13-90fa-69d811abe967 req-eb2bfaae-a371-4d75-986d-7f0cfc3eed9a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:26 np0005593234 nova_compute[227762]: 2026-01-23 10:23:26.375 227766 DEBUG oslo_concurrency.lockutils [req-a331be14-f80a-4e13-90fa-69d811abe967 req-eb2bfaae-a371-4d75-986d-7f0cfc3eed9a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:26 np0005593234 nova_compute[227762]: 2026-01-23 10:23:26.375 227766 DEBUG oslo_concurrency.lockutils [req-a331be14-f80a-4e13-90fa-69d811abe967 req-eb2bfaae-a371-4d75-986d-7f0cfc3eed9a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:26 np0005593234 nova_compute[227762]: 2026-01-23 10:23:26.376 227766 DEBUG nova.compute.manager [req-a331be14-f80a-4e13-90fa-69d811abe967 req-eb2bfaae-a371-4d75-986d-7f0cfc3eed9a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] No waiting events found dispatching network-vif-plugged-419310e6-0055-4c1d-8cdb-be034824b754 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:23:26 np0005593234 nova_compute[227762]: 2026-01-23 10:23:26.376 227766 WARNING nova.compute.manager [req-a331be14-f80a-4e13-90fa-69d811abe967 req-eb2bfaae-a371-4d75-986d-7f0cfc3eed9a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received unexpected event network-vif-plugged-419310e6-0055-4c1d-8cdb-be034824b754 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:23:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:26.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e318 e318: 3 total, 3 up, 3 in
Jan 23 05:23:27 np0005593234 nova_compute[227762]: 2026-01-23 10:23:27.571 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163792.56988, a00a5042-ce71-4ecf-ab8f-d9e596d48035 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:23:27 np0005593234 nova_compute[227762]: 2026-01-23 10:23:27.572 227766 INFO nova.compute.manager [-] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:23:27 np0005593234 nova_compute[227762]: 2026-01-23 10:23:27.592 227766 DEBUG nova.compute.manager [None req-69865df8-e70a-47b9-a513-3562a196a40a - - - - - -] [instance: a00a5042-ce71-4ecf-ab8f-d9e596d48035] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:23:27 np0005593234 nova_compute[227762]: 2026-01-23 10:23:27.847 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:27.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:28 np0005593234 nova_compute[227762]: 2026-01-23 10:23:28.003 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:28 np0005593234 nova_compute[227762]: 2026-01-23 10:23:28.029 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:28.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:28 np0005593234 podman[300274]: 2026-01-23 10:23:28.761731737 +0000 UTC m=+0.052358457 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 05:23:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:29.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:30.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:31.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:32.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:32 np0005593234 nova_compute[227762]: 2026-01-23 10:23:32.850 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:33 np0005593234 nova_compute[227762]: 2026-01-23 10:23:33.031 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:33.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:34.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:34 np0005593234 nova_compute[227762]: 2026-01-23 10:23:34.844 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:35.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:36.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:36 np0005593234 podman[300348]: 2026-01-23 10:23:36.792857954 +0000 UTC m=+0.089798351 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:23:37 np0005593234 nova_compute[227762]: 2026-01-23 10:23:37.853 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:37.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:38 np0005593234 nova_compute[227762]: 2026-01-23 10:23:38.032 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:38.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:39 np0005593234 nova_compute[227762]: 2026-01-23 10:23:39.150 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:39 np0005593234 nova_compute[227762]: 2026-01-23 10:23:39.151 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:39 np0005593234 nova_compute[227762]: 2026-01-23 10:23:39.192 227766 DEBUG nova.compute.manager [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:23:39 np0005593234 nova_compute[227762]: 2026-01-23 10:23:39.282 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:39 np0005593234 nova_compute[227762]: 2026-01-23 10:23:39.283 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:39 np0005593234 nova_compute[227762]: 2026-01-23 10:23:39.300 227766 DEBUG nova.virt.hardware [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:23:39 np0005593234 nova_compute[227762]: 2026-01-23 10:23:39.301 227766 INFO nova.compute.claims [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:23:39 np0005593234 nova_compute[227762]: 2026-01-23 10:23:39.456 227766 DEBUG oslo_concurrency.processutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:23:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2987655499' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:39 np0005593234 nova_compute[227762]: 2026-01-23 10:23:39.871 227766 DEBUG oslo_concurrency.processutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:39 np0005593234 nova_compute[227762]: 2026-01-23 10:23:39.880 227766 DEBUG nova.compute.provider_tree [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:23:39 np0005593234 nova_compute[227762]: 2026-01-23 10:23:39.920 227766 DEBUG nova.scheduler.client.report [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:23:39 np0005593234 nova_compute[227762]: 2026-01-23 10:23:39.955 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:39 np0005593234 nova_compute[227762]: 2026-01-23 10:23:39.956 227766 DEBUG nova.compute.manager [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:23:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:39.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.047 227766 DEBUG nova.compute.manager [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.048 227766 DEBUG nova.network.neutron [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.073 227766 INFO nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.092 227766 DEBUG nova.compute.manager [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.203 227766 DEBUG nova.compute.manager [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.204 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.205 227766 INFO nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Creating image(s)#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.231 227766 DEBUG nova.storage.rbd_utils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] rbd image bd0fc955-63ff-41a4-b31b-369c2b584544_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.253 227766 DEBUG nova.storage.rbd_utils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] rbd image bd0fc955-63ff-41a4-b31b-369c2b584544_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.275 227766 DEBUG nova.storage.rbd_utils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] rbd image bd0fc955-63ff-41a4-b31b-369c2b584544_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.279 227766 DEBUG oslo_concurrency.processutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.351 227766 DEBUG oslo_concurrency.processutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.352 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.353 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.353 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.377 227766 DEBUG nova.storage.rbd_utils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] rbd image bd0fc955-63ff-41a4-b31b-369c2b584544_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.379 227766 DEBUG oslo_concurrency.processutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 bd0fc955-63ff-41a4-b31b-369c2b584544_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.628 227766 DEBUG nova.policy [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '01b7396ecc574dd6ba2df2f406921223', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c7c25c6bb33b41bf9cd8febb8259fd87', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.656 227766 DEBUG oslo_concurrency.processutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 bd0fc955-63ff-41a4-b31b-369c2b584544_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.276s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.725 227766 DEBUG nova.storage.rbd_utils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] resizing rbd image bd0fc955-63ff-41a4-b31b-369c2b584544_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:23:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:40.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.832 227766 DEBUG nova.objects.instance [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'migration_context' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.858 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.858 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Ensure instance console log exists: /var/lib/nova/instances/bd0fc955-63ff-41a4-b31b-369c2b584544/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.859 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.859 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:40 np0005593234 nova_compute[227762]: 2026-01-23 10:23:40.859 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:41.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:42 np0005593234 nova_compute[227762]: 2026-01-23 10:23:42.544 227766 DEBUG nova.network.neutron [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Successfully created port: a4401398-6f7f-4595-b308-33a66a468a1f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:23:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:42.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:42 np0005593234 nova_compute[227762]: 2026-01-23 10:23:42.855 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:42.858 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:42.860 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:42.861 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:43 np0005593234 nova_compute[227762]: 2026-01-23 10:23:43.035 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:43 np0005593234 nova_compute[227762]: 2026-01-23 10:23:43.177 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:43 np0005593234 nova_compute[227762]: 2026-01-23 10:23:43.551 227766 DEBUG nova.network.neutron [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Successfully updated port: a4401398-6f7f-4595-b308-33a66a468a1f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:23:43 np0005593234 nova_compute[227762]: 2026-01-23 10:23:43.623 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:43 np0005593234 nova_compute[227762]: 2026-01-23 10:23:43.623 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquired lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:43 np0005593234 nova_compute[227762]: 2026-01-23 10:23:43.624 227766 DEBUG nova.network.neutron [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:23:43 np0005593234 nova_compute[227762]: 2026-01-23 10:23:43.682 227766 DEBUG nova.compute.manager [req-6e4ec3e0-7072-4f0e-99c8-7087fb499d5d req-444627fe-7774-4d49-81a5-3456cec4972e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-changed-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:43 np0005593234 nova_compute[227762]: 2026-01-23 10:23:43.683 227766 DEBUG nova.compute.manager [req-6e4ec3e0-7072-4f0e-99c8-7087fb499d5d req-444627fe-7774-4d49-81a5-3456cec4972e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Refreshing instance network info cache due to event network-changed-a4401398-6f7f-4595-b308-33a66a468a1f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:23:43 np0005593234 nova_compute[227762]: 2026-01-23 10:23:43.683 227766 DEBUG oslo_concurrency.lockutils [req-6e4ec3e0-7072-4f0e-99c8-7087fb499d5d req-444627fe-7774-4d49-81a5-3456cec4972e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:43 np0005593234 nova_compute[227762]: 2026-01-23 10:23:43.864 227766 DEBUG nova.network.neutron [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:23:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:43.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:23:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:44.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:23:44 np0005593234 nova_compute[227762]: 2026-01-23 10:23:44.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:44 np0005593234 nova_compute[227762]: 2026-01-23 10:23:44.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:44 np0005593234 nova_compute[227762]: 2026-01-23 10:23:44.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:44 np0005593234 nova_compute[227762]: 2026-01-23 10:23:44.774 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:44 np0005593234 nova_compute[227762]: 2026-01-23 10:23:44.774 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:23:44 np0005593234 nova_compute[227762]: 2026-01-23 10:23:44.774 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:23:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3384550270' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.323 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.549s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.423 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000009e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.423 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-0000009e as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.570 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.572 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4151MB free_disk=20.893104553222656GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.572 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.572 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.714 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 8010f6fe-77ef-48ec-952f-a3a65186cd59 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.714 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance bd0fc955-63ff-41a4-b31b-369c2b584544 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.714 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.715 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.790 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.936 227766 DEBUG nova.network.neutron [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Updating instance_info_cache with network_info: [{"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.982 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Releasing lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.982 227766 DEBUG nova.compute.manager [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance network_info: |[{"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.983 227766 DEBUG oslo_concurrency.lockutils [req-6e4ec3e0-7072-4f0e-99c8-7087fb499d5d req-444627fe-7774-4d49-81a5-3456cec4972e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.983 227766 DEBUG nova.network.neutron [req-6e4ec3e0-7072-4f0e-99c8-7087fb499d5d req-444627fe-7774-4d49-81a5-3456cec4972e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Refreshing network info cache for port a4401398-6f7f-4595-b308-33a66a468a1f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.986 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Start _get_guest_xml network_info=[{"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.989 227766 WARNING nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:23:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:45.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.997 227766 DEBUG nova.virt.libvirt.host [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 05:23:45 np0005593234 nova_compute[227762]: 2026-01-23 10:23:45.997 227766 DEBUG nova.virt.libvirt.host [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.003 227766 DEBUG nova.virt.libvirt.host [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.004 227766 DEBUG nova.virt.libvirt.host [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.005 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.005 227766 DEBUG nova.virt.hardware [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.006 227766 DEBUG nova.virt.hardware [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.006 227766 DEBUG nova.virt.hardware [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.006 227766 DEBUG nova.virt.hardware [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.006 227766 DEBUG nova.virt.hardware [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.007 227766 DEBUG nova.virt.hardware [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.007 227766 DEBUG nova.virt.hardware [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.007 227766 DEBUG nova.virt.hardware [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.007 227766 DEBUG nova.virt.hardware [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.008 227766 DEBUG nova.virt.hardware [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.008 227766 DEBUG nova.virt.hardware [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.011 227766 DEBUG oslo_concurrency.processutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:23:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:23:46 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1622257089' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.210 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.215 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.242 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:23:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:23:46 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2544679930' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.275 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.276 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:23:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:23:46 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2776638880' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.420 227766 DEBUG oslo_concurrency.processutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.409s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.448 227766 DEBUG nova.storage.rbd_utils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] rbd image bd0fc955-63ff-41a4-b31b-369c2b584544_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.453 227766 DEBUG oslo_concurrency.processutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:23:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:46.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:23:46 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/599302499' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.881 227766 DEBUG oslo_concurrency.processutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.884 227766 DEBUG nova.virt.libvirt.vif [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:23:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-59893283',display_name='tempest-AttachVolumeTestJSON-server-59893283',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-59893283',id=162,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFhliJMNa+yXuf0fSBz/3snMSJFr8dgTPMjo/Saadc9lk7SWpQL9azGzggupkC6Qe7ig8O16HGbub4k3l0C30EmW82whCbHcGBU51cs0XAqEAsopgWQJtLRZgtshG7dxog==',key_name='tempest-keypair-1909141305',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c7c25c6bb33b41bf9cd8febb8259fd87',ramdisk_id='',reservation_id='r-gp4q9qzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-345871886',owner_user_name='tempest-AttachVolumeTestJSON-345871886-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:23:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='01b7396ecc574dd6ba2df2f406921223',uuid=bd0fc955-63ff-41a4-b31b-369c2b584544,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.884 227766 DEBUG nova.network.os_vif_util [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converting VIF {"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.885 227766 DEBUG nova.network.os_vif_util [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.886 227766 DEBUG nova.objects.instance [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'pci_devices' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.927 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  <uuid>bd0fc955-63ff-41a4-b31b-369c2b584544</uuid>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  <name>instance-000000a2</name>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <nova:name>tempest-AttachVolumeTestJSON-server-59893283</nova:name>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:23:45</nova:creationTime>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <nova:user uuid="01b7396ecc574dd6ba2df2f406921223">tempest-AttachVolumeTestJSON-345871886-project-member</nova:user>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <nova:project uuid="c7c25c6bb33b41bf9cd8febb8259fd87">tempest-AttachVolumeTestJSON-345871886</nova:project>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <nova:port uuid="a4401398-6f7f-4595-b308-33a66a468a1f">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <entry name="serial">bd0fc955-63ff-41a4-b31b-369c2b584544</entry>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <entry name="uuid">bd0fc955-63ff-41a4-b31b-369c2b584544</entry>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/bd0fc955-63ff-41a4-b31b-369c2b584544_disk">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/bd0fc955-63ff-41a4-b31b-369c2b584544_disk.config">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:d9:d9:06"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <target dev="tapa4401398-6f"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/bd0fc955-63ff-41a4-b31b-369c2b584544/console.log" append="off"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:23:46 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:23:46 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:23:46 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:23:46 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.929 227766 DEBUG nova.compute.manager [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Preparing to wait for external event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.929 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.930 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.930 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.931 227766 DEBUG nova.virt.libvirt.vif [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:23:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-59893283',display_name='tempest-AttachVolumeTestJSON-server-59893283',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-59893283',id=162,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFhliJMNa+yXuf0fSBz/3snMSJFr8dgTPMjo/Saadc9lk7SWpQL9azGzggupkC6Qe7ig8O16HGbub4k3l0C30EmW82whCbHcGBU51cs0XAqEAsopgWQJtLRZgtshG7dxog==',key_name='tempest-keypair-1909141305',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c7c25c6bb33b41bf9cd8febb8259fd87',ramdisk_id='',reservation_id='r-gp4q9qzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-345871886',owner_user_name='tempest-AttachVolumeTestJSON-345871886-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:23:40Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='01b7396ecc574dd6ba2df2f406921223',uuid=bd0fc955-63ff-41a4-b31b-369c2b584544,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.931 227766 DEBUG nova.network.os_vif_util [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converting VIF {"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.932 227766 DEBUG nova.network.os_vif_util [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.933 227766 DEBUG os_vif [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.933 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.934 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.934 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.938 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.939 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa4401398-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.939 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa4401398-6f, col_values=(('external_ids', {'iface-id': 'a4401398-6f7f-4595-b308-33a66a468a1f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:d9:06', 'vm-uuid': 'bd0fc955-63ff-41a4-b31b-369c2b584544'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.940 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:46 np0005593234 NetworkManager[48942]: <info>  [1769163826.9415] manager: (tapa4401398-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/320)
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.943 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.946 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:46 np0005593234 nova_compute[227762]: 2026-01-23 10:23:46.946 227766 INFO os_vif [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f')#033[00m
Jan 23 05:23:47 np0005593234 nova_compute[227762]: 2026-01-23 10:23:47.017 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:23:47 np0005593234 nova_compute[227762]: 2026-01-23 10:23:47.018 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:23:47 np0005593234 nova_compute[227762]: 2026-01-23 10:23:47.018 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No VIF found with MAC fa:16:3e:d9:d9:06, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:23:47 np0005593234 nova_compute[227762]: 2026-01-23 10:23:47.019 227766 INFO nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Using config drive#033[00m
Jan 23 05:23:47 np0005593234 nova_compute[227762]: 2026-01-23 10:23:47.043 227766 DEBUG nova.storage.rbd_utils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] rbd image bd0fc955-63ff-41a4-b31b-369c2b584544_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:47 np0005593234 nova_compute[227762]: 2026-01-23 10:23:47.626 227766 INFO nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Creating config drive at /var/lib/nova/instances/bd0fc955-63ff-41a4-b31b-369c2b584544/disk.config#033[00m
Jan 23 05:23:47 np0005593234 nova_compute[227762]: 2026-01-23 10:23:47.638 227766 DEBUG oslo_concurrency.processutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/bd0fc955-63ff-41a4-b31b-369c2b584544/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqijij7rx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:47 np0005593234 nova_compute[227762]: 2026-01-23 10:23:47.773 227766 DEBUG oslo_concurrency.processutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/bd0fc955-63ff-41a4-b31b-369c2b584544/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpqijij7rx" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:47 np0005593234 nova_compute[227762]: 2026-01-23 10:23:47.912 227766 DEBUG nova.storage.rbd_utils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] rbd image bd0fc955-63ff-41a4-b31b-369c2b584544_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:23:47 np0005593234 nova_compute[227762]: 2026-01-23 10:23:47.916 227766 DEBUG oslo_concurrency.processutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/bd0fc955-63ff-41a4-b31b-369c2b584544/disk.config bd0fc955-63ff-41a4-b31b-369c2b584544_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:23:47 np0005593234 nova_compute[227762]: 2026-01-23 10:23:47.938 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:47.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.102 227766 DEBUG oslo_concurrency.processutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/bd0fc955-63ff-41a4-b31b-369c2b584544/disk.config bd0fc955-63ff-41a4-b31b-369c2b584544_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.186s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.103 227766 INFO nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Deleting local config drive /var/lib/nova/instances/bd0fc955-63ff-41a4-b31b-369c2b584544/disk.config because it was imported into RBD.#033[00m
Jan 23 05:23:48 np0005593234 kernel: tapa4401398-6f: entered promiscuous mode
Jan 23 05:23:48 np0005593234 NetworkManager[48942]: <info>  [1769163828.1503] manager: (tapa4401398-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/321)
Jan 23 05:23:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:48Z|00662|binding|INFO|Claiming lport a4401398-6f7f-4595-b308-33a66a468a1f for this chassis.
Jan 23 05:23:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:48Z|00663|binding|INFO|a4401398-6f7f-4595-b308-33a66a468a1f: Claiming fa:16:3e:d9:d9:06 10.100.0.7
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.151 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.166 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:d9:06 10.100.0.7'], port_security=['fa:16:3e:d9:d9:06 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'bd0fc955-63ff-41a4-b31b-369c2b584544', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1280650e-e283-4ddc-81aa-357640520155', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c7c25c6bb33b41bf9cd8febb8259fd87', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c721f45-9254-46f2-b17b-2aa67f5ce3fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4684203-7828-4ea2-86ad-83030eb9aefe, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=a4401398-6f7f-4595-b308-33a66a468a1f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.168 144381 INFO neutron.agent.ovn.metadata.agent [-] Port a4401398-6f7f-4595-b308-33a66a468a1f in datapath 1280650e-e283-4ddc-81aa-357640520155 bound to our chassis#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.170 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1280650e-e283-4ddc-81aa-357640520155#033[00m
Jan 23 05:23:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:48Z|00664|binding|INFO|Setting lport a4401398-6f7f-4595-b308-33a66a468a1f ovn-installed in OVS
Jan 23 05:23:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:48Z|00665|binding|INFO|Setting lport a4401398-6f7f-4595-b308-33a66a468a1f up in Southbound
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.171 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.175 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:48 np0005593234 systemd-udevd[300752]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.184 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[665f4b9c-d682-4337-afca-27e233e08f34]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.185 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1280650e-e1 in ovnmeta-1280650e-e283-4ddc-81aa-357640520155 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:23:48 np0005593234 systemd-machined[195626]: New machine qemu-75-instance-000000a2.
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.187 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1280650e-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.187 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[37b8110f-9787-461b-a3f5-ad6ea67c596d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.188 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a21afe03-0752-479e-8007-562f2eb42a29]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 NetworkManager[48942]: <info>  [1769163828.2017] device (tapa4401398-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:23:48 np0005593234 NetworkManager[48942]: <info>  [1769163828.2028] device (tapa4401398-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:23:48 np0005593234 systemd[1]: Started Virtual Machine qemu-75-instance-000000a2.
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.203 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[795e358e-4afe-43ab-8126-8334ab5b354a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.229 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9b5e38ee-aad4-494a-9f97-c638758f51a5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.261 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d61439-e00d-41b2-b7ab-16238d5a12d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.267 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c4f40b0c-88f8-4a4b-9c06-af10d2d0ec2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 systemd-udevd[300755]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:23:48 np0005593234 NetworkManager[48942]: <info>  [1769163828.2698] manager: (tap1280650e-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/322)
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.302 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[1d4af73e-0b33-4487-bd71-2516982cb33b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.306 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[71a2e3ef-c8db-4b39-84cf-d6bf99a47da6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 NetworkManager[48942]: <info>  [1769163828.3319] device (tap1280650e-e0): carrier: link connected
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.338 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[5685a9ce-8b2f-4ca0-8818-d479f82a93bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.359 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a4608370-122b-4614-927c-e3eaba9eb54d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1280650e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:5b:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 207], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 772329, 'reachable_time': 19131, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 300784, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.375 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f65dd755-05a4-48c8-a357-b4b5d388f499]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feea:5b3e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 772329, 'tstamp': 772329}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 300785, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.397 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[94b29db6-510a-4719-bb43-1a4dd2c6f8d4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1280650e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:5b:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 180, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 207], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 772329, 'reachable_time': 19131, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 152, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 152, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 300786, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.426 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[48a12a72-a55a-44e6-bb11-310324adf2b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.484 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1083e734-7999-41c1-ad19-8e13f4c6fdc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.485 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1280650e-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.486 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.486 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1280650e-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:48 np0005593234 kernel: tap1280650e-e0: entered promiscuous mode
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.488 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:48 np0005593234 NetworkManager[48942]: <info>  [1769163828.4887] manager: (tap1280650e-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/323)
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.490 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1280650e-e0, col_values=(('external_ids', {'iface-id': '8ca9fbcb-59f5-4006-84df-ab99827a2b39'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:23:48Z|00666|binding|INFO|Releasing lport 8ca9fbcb-59f5-4006-84df-ab99827a2b39 from this chassis (sb_readonly=0)
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.507 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.508 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1280650e-e283-4ddc-81aa-357640520155.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1280650e-e283-4ddc-81aa-357640520155.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.509 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7b048200-c988-4faa-86e8-634c24c951dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.510 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-1280650e-e283-4ddc-81aa-357640520155
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/1280650e-e283-4ddc-81aa-357640520155.pid.haproxy
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 1280650e-e283-4ddc-81aa-357640520155
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.510 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'env', 'PROCESS_TAG=haproxy-1280650e-e283-4ddc-81aa-357640520155', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1280650e-e283-4ddc-81aa-357640520155.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.533 227766 DEBUG nova.network.neutron [req-6e4ec3e0-7072-4f0e-99c8-7087fb499d5d req-444627fe-7774-4d49-81a5-3456cec4972e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Updated VIF entry in instance network info cache for port a4401398-6f7f-4595-b308-33a66a468a1f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.534 227766 DEBUG nova.network.neutron [req-6e4ec3e0-7072-4f0e-99c8-7087fb499d5d req-444627fe-7774-4d49-81a5-3456cec4972e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Updating instance_info_cache with network_info: [{"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.560 227766 DEBUG oslo_concurrency.lockutils [req-6e4ec3e0-7072-4f0e-99c8-7087fb499d5d req-444627fe-7774-4d49-81a5-3456cec4972e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.601 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163828.6011465, bd0fc955-63ff-41a4-b31b-369c2b584544 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.602 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] VM Started (Lifecycle Event)#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.631 227766 DEBUG nova.compute.manager [req-8259ac55-29ce-418f-a59b-b4ef90693738 req-95db6b99-292b-4c86-aafe-002fa8444399 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.631 227766 DEBUG oslo_concurrency.lockutils [req-8259ac55-29ce-418f-a59b-b4ef90693738 req-95db6b99-292b-4c86-aafe-002fa8444399 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.632 227766 DEBUG oslo_concurrency.lockutils [req-8259ac55-29ce-418f-a59b-b4ef90693738 req-95db6b99-292b-4c86-aafe-002fa8444399 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.632 227766 DEBUG oslo_concurrency.lockutils [req-8259ac55-29ce-418f-a59b-b4ef90693738 req-95db6b99-292b-4c86-aafe-002fa8444399 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.632 227766 DEBUG nova.compute.manager [req-8259ac55-29ce-418f-a59b-b4ef90693738 req-95db6b99-292b-4c86-aafe-002fa8444399 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Processing event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.633 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.634 227766 DEBUG nova.compute.manager [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.639 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.649 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.652 227766 INFO nova.virt.libvirt.driver [-] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance spawned successfully.#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.653 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.691 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.691 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.692 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.693 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.693 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.694 227766 DEBUG nova.virt.libvirt.driver [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.698 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.698 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163828.6021261, bd0fc955-63ff-41a4-b31b-369c2b584544 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.698 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.733 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.736 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163828.6377394, bd0fc955-63ff-41a4-b31b-369c2b584544 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.736 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:23:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:48.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.771 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.775 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.783 227766 INFO nova.compute.manager [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Took 8.58 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.783 227766 DEBUG nova.compute.manager [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.874 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:48.876 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=59, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=58) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.890 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:23:48 np0005593234 podman[300860]: 2026-01-23 10:23:48.904016667 +0000 UTC m=+0.053585885 container create d41d72799919740267156b91e7f3905cf1c7e5177b776bce3fd3d1794b745281 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.946 227766 INFO nova.compute.manager [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Took 9.70 seconds to build instance.#033[00m
Jan 23 05:23:48 np0005593234 systemd[1]: Started libpod-conmon-d41d72799919740267156b91e7f3905cf1c7e5177b776bce3fd3d1794b745281.scope.
Jan 23 05:23:48 np0005593234 nova_compute[227762]: 2026-01-23 10:23:48.970 227766 DEBUG oslo_concurrency.lockutils [None req-ed2e8567-6151-452b-aaba-670332eaf91f 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.820s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:48 np0005593234 podman[300860]: 2026-01-23 10:23:48.877365429 +0000 UTC m=+0.026934677 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:23:48 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:23:48 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b972311d376ad8b6edbe482b92e488f1bef3059e0d46bcde74d28fb06bd94770/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:23:48 np0005593234 podman[300860]: 2026-01-23 10:23:48.98873962 +0000 UTC m=+0.138308838 container init d41d72799919740267156b91e7f3905cf1c7e5177b776bce3fd3d1794b745281 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 05:23:48 np0005593234 podman[300860]: 2026-01-23 10:23:48.993813967 +0000 UTC m=+0.143383185 container start d41d72799919740267156b91e7f3905cf1c7e5177b776bce3fd3d1794b745281 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 05:23:49 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[300873]: [NOTICE]   (300877) : New worker (300879) forked
Jan 23 05:23:49 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[300873]: [NOTICE]   (300877) : Loading success.
Jan 23 05:23:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:49.047 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:23:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:23:49.048 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '59'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:23:49 np0005593234 nova_compute[227762]: 2026-01-23 10:23:49.276 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:49.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:50.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:50 np0005593234 nova_compute[227762]: 2026-01-23 10:23:50.759 227766 DEBUG nova.compute.manager [req-b1302e5e-ba30-4943-bdfe-595b4452ee43 req-a76dc21b-558b-481c-bb46-bca6719bfc3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:50 np0005593234 nova_compute[227762]: 2026-01-23 10:23:50.768 227766 DEBUG oslo_concurrency.lockutils [req-b1302e5e-ba30-4943-bdfe-595b4452ee43 req-a76dc21b-558b-481c-bb46-bca6719bfc3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:23:50 np0005593234 nova_compute[227762]: 2026-01-23 10:23:50.768 227766 DEBUG oslo_concurrency.lockutils [req-b1302e5e-ba30-4943-bdfe-595b4452ee43 req-a76dc21b-558b-481c-bb46-bca6719bfc3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:23:50 np0005593234 nova_compute[227762]: 2026-01-23 10:23:50.769 227766 DEBUG oslo_concurrency.lockutils [req-b1302e5e-ba30-4943-bdfe-595b4452ee43 req-a76dc21b-558b-481c-bb46-bca6719bfc3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:23:50 np0005593234 nova_compute[227762]: 2026-01-23 10:23:50.770 227766 DEBUG nova.compute.manager [req-b1302e5e-ba30-4943-bdfe-595b4452ee43 req-a76dc21b-558b-481c-bb46-bca6719bfc3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] No waiting events found dispatching network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:23:50 np0005593234 nova_compute[227762]: 2026-01-23 10:23:50.770 227766 WARNING nova.compute.manager [req-b1302e5e-ba30-4943-bdfe-595b4452ee43 req-a76dc21b-558b-481c-bb46-bca6719bfc3b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received unexpected event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f for instance with vm_state active and task_state None.#033[00m
Jan 23 05:23:51 np0005593234 nova_compute[227762]: 2026-01-23 10:23:51.943 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:52.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:52 np0005593234 nova_compute[227762]: 2026-01-23 10:23:52.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:23:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:52.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:23:52 np0005593234 nova_compute[227762]: 2026-01-23 10:23:52.879 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:53 np0005593234 nova_compute[227762]: 2026-01-23 10:23:53.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:23:53 np0005593234 nova_compute[227762]: 2026-01-23 10:23:53.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:23:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:54.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:54 np0005593234 nova_compute[227762]: 2026-01-23 10:23:54.228 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:54 np0005593234 nova_compute[227762]: 2026-01-23 10:23:54.229 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:54 np0005593234 nova_compute[227762]: 2026-01-23 10:23:54.229 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:23:54 np0005593234 nova_compute[227762]: 2026-01-23 10:23:54.322 227766 DEBUG nova.compute.manager [req-be307ff0-67e5-47da-9d6c-0c782f4267d0 req-52b4c957-4a4d-4a77-a84d-6ba867fc8b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-changed-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:23:54 np0005593234 nova_compute[227762]: 2026-01-23 10:23:54.322 227766 DEBUG nova.compute.manager [req-be307ff0-67e5-47da-9d6c-0c782f4267d0 req-52b4c957-4a4d-4a77-a84d-6ba867fc8b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Refreshing instance network info cache due to event network-changed-a4401398-6f7f-4595-b308-33a66a468a1f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:23:54 np0005593234 nova_compute[227762]: 2026-01-23 10:23:54.322 227766 DEBUG oslo_concurrency.lockutils [req-be307ff0-67e5-47da-9d6c-0c782f4267d0 req-52b4c957-4a4d-4a77-a84d-6ba867fc8b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:23:54 np0005593234 nova_compute[227762]: 2026-01-23 10:23:54.323 227766 DEBUG oslo_concurrency.lockutils [req-be307ff0-67e5-47da-9d6c-0c782f4267d0 req-52b4c957-4a4d-4a77-a84d-6ba867fc8b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:23:54 np0005593234 nova_compute[227762]: 2026-01-23 10:23:54.323 227766 DEBUG nova.network.neutron [req-be307ff0-67e5-47da-9d6c-0c782f4267d0 req-52b4c957-4a4d-4a77-a84d-6ba867fc8b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Refreshing network info cache for port a4401398-6f7f-4595-b308-33a66a468a1f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:23:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:54.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:56.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:56.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:56 np0005593234 nova_compute[227762]: 2026-01-23 10:23:56.947 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:57 np0005593234 nova_compute[227762]: 2026-01-23 10:23:57.884 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:23:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:23:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:58.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:23:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:23:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:58.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:23:59 np0005593234 podman[300943]: 2026-01-23 10:23:59.760384183 +0000 UTC m=+0.053182244 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 23 05:24:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:00.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:00 np0005593234 nova_compute[227762]: 2026-01-23 10:24:00.636 227766 DEBUG nova.network.neutron [req-be307ff0-67e5-47da-9d6c-0c782f4267d0 req-52b4c957-4a4d-4a77-a84d-6ba867fc8b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Updated VIF entry in instance network info cache for port a4401398-6f7f-4595-b308-33a66a468a1f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:24:00 np0005593234 nova_compute[227762]: 2026-01-23 10:24:00.637 227766 DEBUG nova.network.neutron [req-be307ff0-67e5-47da-9d6c-0c782f4267d0 req-52b4c957-4a4d-4a77-a84d-6ba867fc8b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Updating instance_info_cache with network_info: [{"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:00.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:00 np0005593234 nova_compute[227762]: 2026-01-23 10:24:00.796 227766 DEBUG oslo_concurrency.lockutils [req-be307ff0-67e5-47da-9d6c-0c782f4267d0 req-52b4c957-4a4d-4a77-a84d-6ba867fc8b0f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:00Z|00076|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d9:d9:06 10.100.0.7
Jan 23 05:24:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:00Z|00077|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d9:d9:06 10.100.0.7
Jan 23 05:24:01 np0005593234 nova_compute[227762]: 2026-01-23 10:24:01.952 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:02.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:02.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:02 np0005593234 nova_compute[227762]: 2026-01-23 10:24:02.888 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:03 np0005593234 nova_compute[227762]: 2026-01-23 10:24:03.138 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updating instance_info_cache with network_info: [{"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:03 np0005593234 nova_compute[227762]: 2026-01-23 10:24:03.169 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:03 np0005593234 nova_compute[227762]: 2026-01-23 10:24:03.169 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:24:03 np0005593234 nova_compute[227762]: 2026-01-23 10:24:03.169 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:03 np0005593234 nova_compute[227762]: 2026-01-23 10:24:03.169 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:03 np0005593234 nova_compute[227762]: 2026-01-23 10:24:03.170 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:03 np0005593234 nova_compute[227762]: 2026-01-23 10:24:03.170 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:24:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:04.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:04 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #55. Immutable memtables: 11.
Jan 23 05:24:04 np0005593234 nova_compute[227762]: 2026-01-23 10:24:04.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:04.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:04 np0005593234 nova_compute[227762]: 2026-01-23 10:24:04.775 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:06.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:06 np0005593234 nova_compute[227762]: 2026-01-23 10:24:06.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:06.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:06 np0005593234 nova_compute[227762]: 2026-01-23 10:24:06.954 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #139. Immutable memtables: 0.
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.359138) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 87] Flushing memtable with next log file: 139
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163847359215, "job": 87, "event": "flush_started", "num_memtables": 1, "num_entries": 972, "num_deletes": 251, "total_data_size": 1889829, "memory_usage": 1919912, "flush_reason": "Manual Compaction"}
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 87] Level-0 flush table #140: started
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163847368375, "cf_name": "default", "job": 87, "event": "table_file_creation", "file_number": 140, "file_size": 824235, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 67863, "largest_seqno": 68830, "table_properties": {"data_size": 820435, "index_size": 1451, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10302, "raw_average_key_size": 21, "raw_value_size": 812251, "raw_average_value_size": 1664, "num_data_blocks": 64, "num_entries": 488, "num_filter_entries": 488, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163781, "oldest_key_time": 1769163781, "file_creation_time": 1769163847, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 140, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 87] Flush lasted 9276 microseconds, and 4559 cpu microseconds.
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.368433) [db/flush_job.cc:967] [default] [JOB 87] Level-0 flush table #140: 824235 bytes OK
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.368450) [db/memtable_list.cc:519] [default] Level-0 commit table #140 started
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.370067) [db/memtable_list.cc:722] [default] Level-0 commit table #140: memtable #1 done
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.370080) EVENT_LOG_v1 {"time_micros": 1769163847370076, "job": 87, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.370095) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 87] Try to delete WAL files size 1884898, prev total WAL file size 1884898, number of live WAL files 2.
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000136.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.370991) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032323538' seq:72057594037927935, type:22 .. '6D6772737461740032353039' seq:0, type:0; will stop at (end)
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 88] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 87 Base level 0, inputs: [140(804KB)], [138(11MB)]
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163847371110, "job": 88, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [140], "files_L6": [138], "score": -1, "input_data_size": 13232432, "oldest_snapshot_seqno": -1}
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 88] Generated table #141: 8821 keys, 9795518 bytes, temperature: kUnknown
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163847427552, "cf_name": "default", "job": 88, "event": "table_file_creation", "file_number": 141, "file_size": 9795518, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9741779, "index_size": 30586, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22085, "raw_key_size": 231824, "raw_average_key_size": 26, "raw_value_size": 9590143, "raw_average_value_size": 1087, "num_data_blocks": 1165, "num_entries": 8821, "num_filter_entries": 8821, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769163847, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 141, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.427804) [db/compaction/compaction_job.cc:1663] [default] [JOB 88] Compacted 1@0 + 1@6 files to L6 => 9795518 bytes
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.429472) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 234.1 rd, 173.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 11.8 +0.0 blob) out(9.3 +0.0 blob), read-write-amplify(27.9) write-amplify(11.9) OK, records in: 9313, records dropped: 492 output_compression: NoCompression
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.429489) EVENT_LOG_v1 {"time_micros": 1769163847429481, "job": 88, "event": "compaction_finished", "compaction_time_micros": 56532, "compaction_time_cpu_micros": 31493, "output_level": 6, "num_output_files": 1, "total_output_size": 9795518, "num_input_records": 9313, "num_output_records": 8821, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000140.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163847429751, "job": 88, "event": "table_file_deletion", "file_number": 140}
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000138.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163847431996, "job": 88, "event": "table_file_deletion", "file_number": 138}
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.370878) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.432053) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.432059) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.432061) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.432063) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:07.432065) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:07 np0005593234 podman[300968]: 2026-01-23 10:24:07.79250379 +0000 UTC m=+0.078940024 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Jan 23 05:24:07 np0005593234 nova_compute[227762]: 2026-01-23 10:24:07.889 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e318 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:08.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:08 np0005593234 nova_compute[227762]: 2026-01-23 10:24:08.110 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:08.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:10.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e319 e319: 3 total, 3 up, 3 in
Jan 23 05:24:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:10.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e320 e320: 3 total, 3 up, 3 in
Jan 23 05:24:11 np0005593234 nova_compute[227762]: 2026-01-23 10:24:11.957 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:12.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:12 np0005593234 nova_compute[227762]: 2026-01-23 10:24:12.260 227766 DEBUG oslo_concurrency.lockutils [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:12 np0005593234 nova_compute[227762]: 2026-01-23 10:24:12.260 227766 DEBUG oslo_concurrency.lockutils [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:12 np0005593234 nova_compute[227762]: 2026-01-23 10:24:12.276 227766 DEBUG nova.objects.instance [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'flavor' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:12 np0005593234 nova_compute[227762]: 2026-01-23 10:24:12.313 227766 DEBUG oslo_concurrency.lockutils [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e321 e321: 3 total, 3 up, 3 in
Jan 23 05:24:12 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:24:12 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:24:12 np0005593234 nova_compute[227762]: 2026-01-23 10:24:12.624 227766 DEBUG nova.compute.manager [req-723faac9-07ab-46db-9add-45bdb5a94d03 req-fb09972b-e201-46e5-94c1-25acb87f64a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-changed-419310e6-0055-4c1d-8cdb-be034824b754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:12 np0005593234 nova_compute[227762]: 2026-01-23 10:24:12.624 227766 DEBUG nova.compute.manager [req-723faac9-07ab-46db-9add-45bdb5a94d03 req-fb09972b-e201-46e5-94c1-25acb87f64a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Refreshing instance network info cache due to event network-changed-419310e6-0055-4c1d-8cdb-be034824b754. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:24:12 np0005593234 nova_compute[227762]: 2026-01-23 10:24:12.625 227766 DEBUG oslo_concurrency.lockutils [req-723faac9-07ab-46db-9add-45bdb5a94d03 req-fb09972b-e201-46e5-94c1-25acb87f64a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:24:12 np0005593234 nova_compute[227762]: 2026-01-23 10:24:12.625 227766 DEBUG oslo_concurrency.lockutils [req-723faac9-07ab-46db-9add-45bdb5a94d03 req-fb09972b-e201-46e5-94c1-25acb87f64a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:24:12 np0005593234 nova_compute[227762]: 2026-01-23 10:24:12.625 227766 DEBUG nova.network.neutron [req-723faac9-07ab-46db-9add-45bdb5a94d03 req-fb09972b-e201-46e5-94c1-25acb87f64a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Refreshing network info cache for port 419310e6-0055-4c1d-8cdb-be034824b754 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:24:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:12.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:12 np0005593234 nova_compute[227762]: 2026-01-23 10:24:12.891 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e321 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.419 227766 DEBUG oslo_concurrency.lockutils [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.419 227766 DEBUG oslo_concurrency.lockutils [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.420 227766 INFO nova.compute.manager [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Attaching volume 127a1e1e-4b4f-4404-b522-315ba62689fa to /dev/vdb#033[00m
Jan 23 05:24:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:24:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:24:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.694 227766 DEBUG os_brick.utils [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.696 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.708 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.709 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[864bd805-1288-4563-8a4a-dea8c9c424ba]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.710 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.719 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.720 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[915728e3-2531-4c20-93e8-6acfd5531233]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.721 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.731 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.732 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[1b2cc111-a2d0-4cea-bb46-dffb46db6d44]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.733 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[0df83a7d-04bf-4308-a204-b2f8179eea04]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.734 227766 DEBUG oslo_concurrency.processutils [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.763 227766 DEBUG oslo_concurrency.processutils [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "nvme version" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.765 227766 DEBUG os_brick.initiator.connectors.lightos [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.765 227766 DEBUG os_brick.initiator.connectors.lightos [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.766 227766 DEBUG os_brick.initiator.connectors.lightos [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.766 227766 DEBUG os_brick.utils [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] <== get_connector_properties: return (71ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:24:13 np0005593234 nova_compute[227762]: 2026-01-23 10:24:13.766 227766 DEBUG nova.virt.block_device [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Updating existing volume attachment record: 4db92ede-8dc6-42ee-bc0a-820c8697f507 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:24:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:14.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:24:14 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1630668236' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:24:14 np0005593234 nova_compute[227762]: 2026-01-23 10:24:14.725 227766 DEBUG nova.objects.instance [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'flavor' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:14 np0005593234 nova_compute[227762]: 2026-01-23 10:24:14.753 227766 DEBUG nova.virt.libvirt.driver [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Attempting to attach volume 127a1e1e-4b4f-4404-b522-315ba62689fa with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:24:14 np0005593234 nova_compute[227762]: 2026-01-23 10:24:14.758 227766 DEBUG nova.virt.libvirt.guest [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:24:14 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:24:14 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-127a1e1e-4b4f-4404-b522-315ba62689fa">
Jan 23 05:24:14 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:24:14 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:24:14 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:24:14 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:24:14 np0005593234 nova_compute[227762]:  <auth username="openstack">
Jan 23 05:24:14 np0005593234 nova_compute[227762]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:24:14 np0005593234 nova_compute[227762]:  </auth>
Jan 23 05:24:14 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:24:14 np0005593234 nova_compute[227762]:  <serial>127a1e1e-4b4f-4404-b522-315ba62689fa</serial>
Jan 23 05:24:14 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:24:14 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:24:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:14.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:15 np0005593234 nova_compute[227762]: 2026-01-23 10:24:15.061 227766 DEBUG nova.virt.libvirt.driver [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:24:15 np0005593234 nova_compute[227762]: 2026-01-23 10:24:15.061 227766 DEBUG nova.virt.libvirt.driver [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:24:15 np0005593234 nova_compute[227762]: 2026-01-23 10:24:15.065 227766 DEBUG nova.virt.libvirt.driver [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:24:15 np0005593234 nova_compute[227762]: 2026-01-23 10:24:15.065 227766 DEBUG nova.virt.libvirt.driver [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No VIF found with MAC fa:16:3e:d9:d9:06, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:24:15 np0005593234 nova_compute[227762]: 2026-01-23 10:24:15.306 227766 DEBUG nova.network.neutron [req-723faac9-07ab-46db-9add-45bdb5a94d03 req-fb09972b-e201-46e5-94c1-25acb87f64a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updated VIF entry in instance network info cache for port 419310e6-0055-4c1d-8cdb-be034824b754. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:24:15 np0005593234 nova_compute[227762]: 2026-01-23 10:24:15.307 227766 DEBUG nova.network.neutron [req-723faac9-07ab-46db-9add-45bdb5a94d03 req-fb09972b-e201-46e5-94c1-25acb87f64a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updating instance_info_cache with network_info: [{"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:15 np0005593234 nova_compute[227762]: 2026-01-23 10:24:15.325 227766 DEBUG oslo_concurrency.lockutils [None req-10a12814-e973-415b-b2ed-16685855fdad 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.905s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:15 np0005593234 nova_compute[227762]: 2026-01-23 10:24:15.331 227766 DEBUG oslo_concurrency.lockutils [req-723faac9-07ab-46db-9add-45bdb5a94d03 req-fb09972b-e201-46e5-94c1-25acb87f64a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:16.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:16 np0005593234 nova_compute[227762]: 2026-01-23 10:24:16.264 227766 DEBUG oslo_concurrency.lockutils [None req-ecc81cb5-8aab-42ee-a857-eaf98c8d6383 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:16 np0005593234 nova_compute[227762]: 2026-01-23 10:24:16.264 227766 DEBUG oslo_concurrency.lockutils [None req-ecc81cb5-8aab-42ee-a857-eaf98c8d6383 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:16 np0005593234 nova_compute[227762]: 2026-01-23 10:24:16.265 227766 DEBUG nova.compute.manager [None req-ecc81cb5-8aab-42ee-a857-eaf98c8d6383 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:16 np0005593234 nova_compute[227762]: 2026-01-23 10:24:16.268 227766 DEBUG nova.compute.manager [None req-ecc81cb5-8aab-42ee-a857-eaf98c8d6383 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 23 05:24:16 np0005593234 nova_compute[227762]: 2026-01-23 10:24:16.269 227766 DEBUG nova.objects.instance [None req-ecc81cb5-8aab-42ee-a857-eaf98c8d6383 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'flavor' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:16 np0005593234 nova_compute[227762]: 2026-01-23 10:24:16.302 227766 DEBUG nova.virt.libvirt.driver [None req-ecc81cb5-8aab-42ee-a857-eaf98c8d6383 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:24:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:16.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:16 np0005593234 nova_compute[227762]: 2026-01-23 10:24:16.961 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e322 e322: 3 total, 3 up, 3 in
Jan 23 05:24:17 np0005593234 nova_compute[227762]: 2026-01-23 10:24:17.893 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:18.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:18 np0005593234 kernel: tapa4401398-6f (unregistering): left promiscuous mode
Jan 23 05:24:18 np0005593234 NetworkManager[48942]: <info>  [1769163858.6416] device (tapa4401398-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:24:18 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:18Z|00667|binding|INFO|Releasing lport a4401398-6f7f-4595-b308-33a66a468a1f from this chassis (sb_readonly=0)
Jan 23 05:24:18 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:18Z|00668|binding|INFO|Setting lport a4401398-6f7f-4595-b308-33a66a468a1f down in Southbound
Jan 23 05:24:18 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:18Z|00669|binding|INFO|Removing iface tapa4401398-6f ovn-installed in OVS
Jan 23 05:24:18 np0005593234 nova_compute[227762]: 2026-01-23 10:24:18.653 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:18 np0005593234 nova_compute[227762]: 2026-01-23 10:24:18.655 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:18.662 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:d9:06 10.100.0.7'], port_security=['fa:16:3e:d9:d9:06 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'bd0fc955-63ff-41a4-b31b-369c2b584544', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1280650e-e283-4ddc-81aa-357640520155', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c7c25c6bb33b41bf9cd8febb8259fd87', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c721f45-9254-46f2-b17b-2aa67f5ce3fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4684203-7828-4ea2-86ad-83030eb9aefe, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=a4401398-6f7f-4595-b308-33a66a468a1f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:24:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:18.664 144381 INFO neutron.agent.ovn.metadata.agent [-] Port a4401398-6f7f-4595-b308-33a66a468a1f in datapath 1280650e-e283-4ddc-81aa-357640520155 unbound from our chassis#033[00m
Jan 23 05:24:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:18.667 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1280650e-e283-4ddc-81aa-357640520155, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:24:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:18.669 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[96358205-d66b-4092-a2c5-bf691ac7dcc8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:18.669 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1280650e-e283-4ddc-81aa-357640520155 namespace which is not needed anymore#033[00m
Jan 23 05:24:18 np0005593234 nova_compute[227762]: 2026-01-23 10:24:18.674 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:18 np0005593234 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000a2.scope: Deactivated successfully.
Jan 23 05:24:18 np0005593234 systemd[1]: machine-qemu\x2d75\x2dinstance\x2d000000a2.scope: Consumed 14.501s CPU time.
Jan 23 05:24:18 np0005593234 systemd-machined[195626]: Machine qemu-75-instance-000000a2 terminated.
Jan 23 05:24:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:18.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:18 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[300873]: [NOTICE]   (300877) : haproxy version is 2.8.14-c23fe91
Jan 23 05:24:18 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[300873]: [NOTICE]   (300877) : path to executable is /usr/sbin/haproxy
Jan 23 05:24:18 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[300873]: [WARNING]  (300877) : Exiting Master process...
Jan 23 05:24:18 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[300873]: [ALERT]    (300877) : Current worker (300879) exited with code 143 (Terminated)
Jan 23 05:24:18 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[300873]: [WARNING]  (300877) : All workers exited. Exiting... (0)
Jan 23 05:24:18 np0005593234 systemd[1]: libpod-d41d72799919740267156b91e7f3905cf1c7e5177b776bce3fd3d1794b745281.scope: Deactivated successfully.
Jan 23 05:24:18 np0005593234 podman[301234]: 2026-01-23 10:24:18.804768624 +0000 UTC m=+0.040126698 container died d41d72799919740267156b91e7f3905cf1c7e5177b776bce3fd3d1794b745281 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:24:18 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d41d72799919740267156b91e7f3905cf1c7e5177b776bce3fd3d1794b745281-userdata-shm.mount: Deactivated successfully.
Jan 23 05:24:18 np0005593234 systemd[1]: var-lib-containers-storage-overlay-b972311d376ad8b6edbe482b92e488f1bef3059e0d46bcde74d28fb06bd94770-merged.mount: Deactivated successfully.
Jan 23 05:24:18 np0005593234 podman[301234]: 2026-01-23 10:24:18.858247316 +0000 UTC m=+0.093605390 container cleanup d41d72799919740267156b91e7f3905cf1c7e5177b776bce3fd3d1794b745281 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:24:18 np0005593234 systemd[1]: libpod-conmon-d41d72799919740267156b91e7f3905cf1c7e5177b776bce3fd3d1794b745281.scope: Deactivated successfully.
Jan 23 05:24:19 np0005593234 nova_compute[227762]: 2026-01-23 10:24:19.077 227766 DEBUG nova.compute.manager [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-vif-unplugged-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:19 np0005593234 nova_compute[227762]: 2026-01-23 10:24:19.078 227766 DEBUG oslo_concurrency.lockutils [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:19 np0005593234 nova_compute[227762]: 2026-01-23 10:24:19.079 227766 DEBUG oslo_concurrency.lockutils [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:19 np0005593234 nova_compute[227762]: 2026-01-23 10:24:19.079 227766 DEBUG oslo_concurrency.lockutils [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:19 np0005593234 nova_compute[227762]: 2026-01-23 10:24:19.079 227766 DEBUG nova.compute.manager [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] No waiting events found dispatching network-vif-unplugged-a4401398-6f7f-4595-b308-33a66a468a1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:19 np0005593234 nova_compute[227762]: 2026-01-23 10:24:19.079 227766 WARNING nova.compute.manager [req-73666e33-9e93-479e-bb89-8e537c33ebf6 req-62819e4b-f1b6-4145-8f1a-b89ad03a1812 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received unexpected event network-vif-unplugged-a4401398-6f7f-4595-b308-33a66a468a1f for instance with vm_state active and task_state powering-off.#033[00m
Jan 23 05:24:19 np0005593234 podman[301270]: 2026-01-23 10:24:19.249020538 +0000 UTC m=+0.370099420 container remove d41d72799919740267156b91e7f3905cf1c7e5177b776bce3fd3d1794b745281 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:24:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:19.256 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[afc9de9c-16d4-4a04-a68a-3bb76f7897ea]: (4, ('Fri Jan 23 10:24:18 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155 (d41d72799919740267156b91e7f3905cf1c7e5177b776bce3fd3d1794b745281)\nd41d72799919740267156b91e7f3905cf1c7e5177b776bce3fd3d1794b745281\nFri Jan 23 10:24:18 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155 (d41d72799919740267156b91e7f3905cf1c7e5177b776bce3fd3d1794b745281)\nd41d72799919740267156b91e7f3905cf1c7e5177b776bce3fd3d1794b745281\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:19.257 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[49f0a9b6-ebcf-43d6-85ea-7e0d35c44fa4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:19.258 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1280650e-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:19 np0005593234 kernel: tap1280650e-e0: left promiscuous mode
Jan 23 05:24:19 np0005593234 nova_compute[227762]: 2026-01-23 10:24:19.259 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:19 np0005593234 nova_compute[227762]: 2026-01-23 10:24:19.291 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:19.294 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[de231b3c-67fe-412e-a7f3-fde2e79aef21]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593234 nova_compute[227762]: 2026-01-23 10:24:19.317 227766 INFO nova.virt.libvirt.driver [None req-ecc81cb5-8aab-42ee-a857-eaf98c8d6383 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:24:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:19.320 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2407b404-3830-445e-ba19-45809bf5e74f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:19.321 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e9657c7f-2bb1-4fdd-8202-ba1c328b2a9f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593234 nova_compute[227762]: 2026-01-23 10:24:19.322 227766 INFO nova.virt.libvirt.driver [-] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance destroyed successfully.#033[00m
Jan 23 05:24:19 np0005593234 nova_compute[227762]: 2026-01-23 10:24:19.322 227766 DEBUG nova.objects.instance [None req-ecc81cb5-8aab-42ee-a857-eaf98c8d6383 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'numa_topology' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:19 np0005593234 nova_compute[227762]: 2026-01-23 10:24:19.337 227766 DEBUG nova.compute.manager [None req-ecc81cb5-8aab-42ee-a857-eaf98c8d6383 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:19.337 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c852d71a-0db0-4161-8c21-aa7b14622466]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 772321, 'reachable_time': 41842, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 301294, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:19.341 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1280650e-e283-4ddc-81aa-357640520155 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:24:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:19.341 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[bb4ad95a-d04a-4bb0-9379-6146a729c695]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:19 np0005593234 systemd[1]: run-netns-ovnmeta\x2d1280650e\x2de283\x2d4ddc\x2d81aa\x2d357640520155.mount: Deactivated successfully.
Jan 23 05:24:19 np0005593234 nova_compute[227762]: 2026-01-23 10:24:19.554 227766 DEBUG oslo_concurrency.lockutils [None req-ecc81cb5-8aab-42ee-a857-eaf98c8d6383 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 3.290s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:20.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:20 np0005593234 nova_compute[227762]: 2026-01-23 10:24:20.644 227766 DEBUG nova.objects.instance [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'flavor' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:20 np0005593234 nova_compute[227762]: 2026-01-23 10:24:20.666 227766 DEBUG oslo_concurrency.lockutils [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:24:20 np0005593234 nova_compute[227762]: 2026-01-23 10:24:20.666 227766 DEBUG oslo_concurrency.lockutils [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquired lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:24:20 np0005593234 nova_compute[227762]: 2026-01-23 10:24:20.666 227766 DEBUG nova.network.neutron [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:24:20 np0005593234 nova_compute[227762]: 2026-01-23 10:24:20.666 227766 DEBUG nova.objects.instance [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'info_cache' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:20.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:21 np0005593234 nova_compute[227762]: 2026-01-23 10:24:21.237 227766 DEBUG nova.compute.manager [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:21 np0005593234 nova_compute[227762]: 2026-01-23 10:24:21.238 227766 DEBUG oslo_concurrency.lockutils [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:21 np0005593234 nova_compute[227762]: 2026-01-23 10:24:21.238 227766 DEBUG oslo_concurrency.lockutils [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:21 np0005593234 nova_compute[227762]: 2026-01-23 10:24:21.238 227766 DEBUG oslo_concurrency.lockutils [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:21 np0005593234 nova_compute[227762]: 2026-01-23 10:24:21.238 227766 DEBUG nova.compute.manager [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] No waiting events found dispatching network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:21 np0005593234 nova_compute[227762]: 2026-01-23 10:24:21.238 227766 WARNING nova.compute.manager [req-341d3b10-e9c9-49af-a294-cbfbd7808d21 req-80b345ed-6621-4981-a27d-84181e5713d0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received unexpected event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 23 05:24:21 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:24:21 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:24:21 np0005593234 nova_compute[227762]: 2026-01-23 10:24:21.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:21 np0005593234 nova_compute[227762]: 2026-01-23 10:24:21.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:24:21 np0005593234 nova_compute[227762]: 2026-01-23 10:24:21.963 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:22.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.349 227766 DEBUG nova.network.neutron [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Updating instance_info_cache with network_info: [{"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.445 227766 DEBUG oslo_concurrency.lockutils [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "interface-8010f6fe-77ef-48ec-952f-a3a65186cd59-419310e6-0055-4c1d-8cdb-be034824b754" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.445 227766 DEBUG oslo_concurrency.lockutils [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "interface-8010f6fe-77ef-48ec-952f-a3a65186cd59-419310e6-0055-4c1d-8cdb-be034824b754" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.688 227766 DEBUG oslo_concurrency.lockutils [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Releasing lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.693 227766 DEBUG nova.objects.instance [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'flavor' on Instance uuid 8010f6fe-77ef-48ec-952f-a3a65186cd59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.722 227766 DEBUG nova.virt.libvirt.vif [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:22:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1311975023',display_name='tempest-TestNetworkBasicOps-server-1311975023',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1311975023',id=158,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL/weReemtkH37O5IRYNNKB/TT2YWx12mLIlrues6ra7aWugRfActXRaloGeDbBIe3P4JQOsAt7qcWUIg9CugCzIu3x3QAkzyj6XRgUoI3pt8WnzsopEYd4m2Fn+OmRPbg==',key_name='tempest-TestNetworkBasicOps-1841453028',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:22:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-3yzv077h',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:22:48Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=8010f6fe-77ef-48ec-952f-a3a65186cd59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.723 227766 DEBUG nova.network.os_vif_util [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.724 227766 DEBUG nova.network.os_vif_util [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:14:44:0e,bridge_name='br-int',has_traffic_filtering=True,id=419310e6-0055-4c1d-8cdb-be034824b754,network=Network(e227a777-0e88-4409-a4a5-266ef225baae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap419310e6-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.725 227766 INFO nova.virt.libvirt.driver [-] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance destroyed successfully.#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.725 227766 DEBUG nova.objects.instance [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'numa_topology' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.728 227766 DEBUG nova.virt.libvirt.guest [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:14:44:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap419310e6-00"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.730 227766 DEBUG nova.virt.libvirt.guest [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:14:44:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap419310e6-00"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.733 227766 DEBUG nova.virt.libvirt.driver [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Attempting to detach device tap419310e6-00 from instance 8010f6fe-77ef-48ec-952f-a3a65186cd59 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.734 227766 DEBUG nova.virt.libvirt.guest [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] detach device xml: <interface type="ethernet">
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <mac address="fa:16:3e:14:44:0e"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <model type="virtio"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <mtu size="1442"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <target dev="tap419310e6-00"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]: </interface>
Jan 23 05:24:22 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.742 227766 DEBUG nova.virt.libvirt.guest [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:14:44:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap419310e6-00"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.744 227766 DEBUG nova.objects.instance [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'resources' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.746 227766 DEBUG nova.virt.libvirt.guest [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:14:44:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap419310e6-00"/></interface>not found in domain: <domain type='kvm' id='74'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <name>instance-0000009e</name>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <uuid>8010f6fe-77ef-48ec-952f-a3a65186cd59</uuid>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:name>tempest-TestNetworkBasicOps-server-1311975023</nova:name>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 10:23:23</nova:creationTime>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:port uuid="1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9">
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:port uuid="419310e6-0055-4c1d-8cdb-be034824b754">
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 05:24:22 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <memory unit='KiB'>131072</memory>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <vcpu placement='static'>1</vcpu>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <resource>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <partition>/machine</partition>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </resource>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <sysinfo type='smbios'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <entry name='manufacturer'>RDO</entry>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <entry name='serial'>8010f6fe-77ef-48ec-952f-a3a65186cd59</entry>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <entry name='uuid'>8010f6fe-77ef-48ec-952f-a3a65186cd59</entry>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <entry name='family'>Virtual Machine</entry>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <boot dev='hd'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <smbios mode='sysinfo'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <vmcoreinfo state='on'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <model fallback='forbid'>Nehalem</model>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <feature policy='require' name='x2apic'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <feature policy='require' name='hypervisor'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <feature policy='require' name='vme'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <clock offset='utc'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <timer name='hpet' present='no'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <on_poweroff>destroy</on_poweroff>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <on_reboot>restart</on_reboot>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <on_crash>destroy</on_crash>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <disk type='network' device='disk'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/8010f6fe-77ef-48ec-952f-a3a65186cd59_disk' index='2'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target dev='vda' bus='virtio'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='virtio-disk0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <disk type='network' device='cdrom'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/8010f6fe-77ef-48ec-952f-a3a65186cd59_disk.config' index='1'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target dev='sda' bus='sata'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <readonly/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='sata0-0-0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pcie.0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='1' port='0x10'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='2' port='0x11'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.2'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='3' port='0x12'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.3'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='4' port='0x13'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.4'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='5' port='0x14'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.5'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='6' port='0x15'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.6'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='7' port='0x16'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.7'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='8' port='0x17'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.8'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='9' port='0x18'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.9'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='10' port='0x19'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.10'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='11' port='0x1a'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.11'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='12' port='0x1b'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.12'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='13' port='0x1c'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.13'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='14' port='0x1d'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.14'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='15' port='0x1e'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.15'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='16' port='0x1f'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.16'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='17' port='0x20'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.17'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='18' port='0x21'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.18'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='19' port='0x22'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.19'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='20' port='0x23'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.20'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='21' port='0x24'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.21'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='22' port='0x25'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.22'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='23' port='0x26'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.23'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='24' port='0x27'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.24'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='25' port='0x28'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.25'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-pci-bridge'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.26'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='usb'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='sata' index='0'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='ide'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:8e:52:70'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target dev='tap1fea5a6d-70'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='net0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:14:44:0e'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target dev='tap419310e6-00'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='net1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <serial type='pty'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <source path='/dev/pts/1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/console.log' append='off'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target type='isa-serial' port='0'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <model name='isa-serial'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      </target>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <console type='pty' tty='/dev/pts/1'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <source path='/dev/pts/1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/console.log' append='off'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target type='serial' port='0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </console>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <input type='tablet' bus='usb'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='input0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='usb' bus='0' port='1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </input>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <input type='mouse' bus='ps2'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='input1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </input>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <input type='keyboard' bus='ps2'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='input2'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </input>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <listen type='address' address='::0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <audio id='1' type='none'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='video0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <watchdog model='itco' action='reset'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='watchdog0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </watchdog>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <memballoon model='virtio'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <stats period='10'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='balloon0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <rng model='virtio'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <backend model='random'>/dev/urandom</backend>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='rng0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <label>system_u:system_r:svirt_t:s0:c114,c349</label>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c114,c349</imagelabel>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <label>+107:+107</label>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <imagelabel>+107:+107</imagelabel>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 05:24:22 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:24:22 np0005593234 nova_compute[227762]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.746 227766 INFO nova.virt.libvirt.driver [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully detached device tap419310e6-00 from instance 8010f6fe-77ef-48ec-952f-a3a65186cd59 from the persistent domain config.#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.746 227766 DEBUG nova.virt.libvirt.driver [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] (1/8): Attempting to detach device tap419310e6-00 with device alias net1 from instance 8010f6fe-77ef-48ec-952f-a3a65186cd59 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.746 227766 DEBUG nova.virt.libvirt.guest [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] detach device xml: <interface type="ethernet">
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <mac address="fa:16:3e:14:44:0e"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <model type="virtio"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <mtu size="1442"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <target dev="tap419310e6-00"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]: </interface>
Jan 23 05:24:22 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.766 227766 DEBUG nova.virt.libvirt.vif [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:23:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-59893283',display_name='tempest-AttachVolumeTestJSON-server-59893283',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-59893283',id=162,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFhliJMNa+yXuf0fSBz/3snMSJFr8dgTPMjo/Saadc9lk7SWpQL9azGzggupkC6Qe7ig8O16HGbub4k3l0C30EmW82whCbHcGBU51cs0XAqEAsopgWQJtLRZgtshG7dxog==',key_name='tempest-keypair-1909141305',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:23:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c7c25c6bb33b41bf9cd8febb8259fd87',ramdisk_id='',reservation_id='r-gp4q9qzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-345871886',owner_user_name='tempest-AttachVolumeTestJSON-345871886-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:24:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='01b7396ecc574dd6ba2df2f406921223',uuid=bd0fc955-63ff-41a4-b31b-369c2b584544,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.767 227766 DEBUG nova.network.os_vif_util [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converting VIF {"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.768 227766 DEBUG nova.network.os_vif_util [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.768 227766 DEBUG os_vif [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.771 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.772 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa4401398-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.774 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.777 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.785 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.789 227766 INFO os_vif [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f')#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.799 227766 DEBUG nova.virt.libvirt.driver [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Start _get_guest_xml network_info=[{"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdb': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [{'boot_index': None, 'mount_device': '/dev/vdb', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-127a1e1e-4b4f-4404-b522-315ba62689fa', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '127a1e1e-4b4f-4404-b522-315ba62689fa', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'bd0fc955-63ff-41a4-b31b-369c2b584544', 'attached_at': '', 'detached_at': '', 'volume_id': '127a1e1e-4b4f-4404-b522-315ba62689fa', 'serial': '127a1e1e-4b4f-4404-b522-315ba62689fa'}, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '4db92ede-8dc6-42ee-bc0a-820c8697f507', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:24:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:22.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.803 227766 WARNING nova.virt.libvirt.driver [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.809 227766 DEBUG nova.virt.libvirt.host [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.810 227766 DEBUG nova.virt.libvirt.host [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.813 227766 DEBUG nova.virt.libvirt.host [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.813 227766 DEBUG nova.virt.libvirt.host [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.814 227766 DEBUG nova.virt.libvirt.driver [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.814 227766 DEBUG nova.virt.hardware [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.814 227766 DEBUG nova.virt.hardware [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.815 227766 DEBUG nova.virt.hardware [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.815 227766 DEBUG nova.virt.hardware [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.815 227766 DEBUG nova.virt.hardware [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.815 227766 DEBUG nova.virt.hardware [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.815 227766 DEBUG nova.virt.hardware [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.816 227766 DEBUG nova.virt.hardware [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.816 227766 DEBUG nova.virt.hardware [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.816 227766 DEBUG nova.virt.hardware [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.816 227766 DEBUG nova.virt.hardware [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.816 227766 DEBUG nova.objects.instance [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'vcpu_model' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.839 227766 DEBUG oslo_concurrency.processutils [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:22 np0005593234 kernel: tap419310e6-00 (unregistering): left promiscuous mode
Jan 23 05:24:22 np0005593234 NetworkManager[48942]: <info>  [1769163862.8620] device (tap419310e6-00): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.871 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:22Z|00670|binding|INFO|Releasing lport 419310e6-0055-4c1d-8cdb-be034824b754 from this chassis (sb_readonly=0)
Jan 23 05:24:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:22Z|00671|binding|INFO|Setting lport 419310e6-0055-4c1d-8cdb-be034824b754 down in Southbound
Jan 23 05:24:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:22Z|00672|binding|INFO|Removing iface tap419310e6-00 ovn-installed in OVS
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.875 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769163862.8754218, 8010f6fe-77ef-48ec-952f-a3a65186cd59 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.876 227766 DEBUG nova.virt.libvirt.driver [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Start waiting for the detach event from libvirt for device tap419310e6-00 with device alias net1 for instance 8010f6fe-77ef-48ec-952f-a3a65186cd59 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.877 227766 DEBUG nova.virt.libvirt.guest [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:14:44:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap419310e6-00"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.881 227766 DEBUG nova.virt.libvirt.guest [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:14:44:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap419310e6-00"/></interface>not found in domain: <domain type='kvm' id='74'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <name>instance-0000009e</name>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <uuid>8010f6fe-77ef-48ec-952f-a3a65186cd59</uuid>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:name>tempest-TestNetworkBasicOps-server-1311975023</nova:name>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 10:23:23</nova:creationTime>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:port uuid="1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9">
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:port uuid="419310e6-0055-4c1d-8cdb-be034824b754">
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.18" ipVersion="4"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 05:24:22 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <memory unit='KiB'>131072</memory>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <vcpu placement='static'>1</vcpu>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <resource>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <partition>/machine</partition>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </resource>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <sysinfo type='smbios'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <entry name='manufacturer'>RDO</entry>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <entry name='serial'>8010f6fe-77ef-48ec-952f-a3a65186cd59</entry>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <entry name='uuid'>8010f6fe-77ef-48ec-952f-a3a65186cd59</entry>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <entry name='family'>Virtual Machine</entry>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <boot dev='hd'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <smbios mode='sysinfo'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <vmcoreinfo state='on'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <model fallback='forbid'>Nehalem</model>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <feature policy='require' name='x2apic'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <feature policy='require' name='hypervisor'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <feature policy='require' name='vme'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <clock offset='utc'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <timer name='hpet' present='no'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <on_poweroff>destroy</on_poweroff>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <on_reboot>restart</on_reboot>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <on_crash>destroy</on_crash>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <disk type='network' device='disk'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/8010f6fe-77ef-48ec-952f-a3a65186cd59_disk' index='2'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target dev='vda' bus='virtio'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='virtio-disk0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <disk type='network' device='cdrom'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/8010f6fe-77ef-48ec-952f-a3a65186cd59_disk.config' index='1'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target dev='sda' bus='sata'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <readonly/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='sata0-0-0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pcie.0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='1' port='0x10'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='2' port='0x11'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.2'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='3' port='0x12'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.3'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='4' port='0x13'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.4'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='5' port='0x14'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.5'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='6' port='0x15'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.6'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='7' port='0x16'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.7'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='8' port='0x17'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.8'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='9' port='0x18'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.9'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='10' port='0x19'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.10'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='11' port='0x1a'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.11'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='12' port='0x1b'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.12'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='13' port='0x1c'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.13'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='14' port='0x1d'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.14'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='15' port='0x1e'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.15'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='16' port='0x1f'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.16'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='17' port='0x20'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.17'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='18' port='0x21'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.18'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='19' port='0x22'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.19'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='20' port='0x23'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.20'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='21' port='0x24'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.21'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='22' port='0x25'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.22'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='23' port='0x26'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.23'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='24' port='0x27'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.24'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target chassis='25' port='0x28'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.25'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model name='pcie-pci-bridge'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='pci.26'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='usb'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <controller type='sata' index='0'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='ide'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:8e:52:70'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target dev='tap1fea5a6d-70'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='net0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <serial type='pty'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <source path='/dev/pts/1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/console.log' append='off'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target type='isa-serial' port='0'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:        <model name='isa-serial'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      </target>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <console type='pty' tty='/dev/pts/1'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <source path='/dev/pts/1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/console.log' append='off'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <target type='serial' port='0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </console>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <input type='tablet' bus='usb'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='input0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='usb' bus='0' port='1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </input>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <input type='mouse' bus='ps2'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='input1'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </input>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <input type='keyboard' bus='ps2'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='input2'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </input>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <listen type='address' address='::0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <audio id='1' type='none'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='video0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <watchdog model='itco' action='reset'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='watchdog0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </watchdog>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <memballoon model='virtio'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <stats period='10'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='balloon0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <rng model='virtio'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <backend model='random'>/dev/urandom</backend>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <alias name='rng0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <label>system_u:system_r:svirt_t:s0:c114,c349</label>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c114,c349</imagelabel>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <label>+107:+107</label>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <imagelabel>+107:+107</imagelabel>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 05:24:22 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:24:22 np0005593234 nova_compute[227762]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.882 227766 INFO nova.virt.libvirt.driver [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully detached device tap419310e6-00 from instance 8010f6fe-77ef-48ec-952f-a3a65186cd59 from the live domain config.#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.882 227766 DEBUG nova.virt.libvirt.vif [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:22:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1311975023',display_name='tempest-TestNetworkBasicOps-server-1311975023',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1311975023',id=158,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL/weReemtkH37O5IRYNNKB/TT2YWx12mLIlrues6ra7aWugRfActXRaloGeDbBIe3P4JQOsAt7qcWUIg9CugCzIu3x3QAkzyj6XRgUoI3pt8WnzsopEYd4m2Fn+OmRPbg==',key_name='tempest-TestNetworkBasicOps-1841453028',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:22:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-3yzv077h',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:22:48Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=8010f6fe-77ef-48ec-952f-a3a65186cd59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.883 227766 DEBUG nova.network.os_vif_util [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.884 227766 DEBUG nova.network.os_vif_util [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:14:44:0e,bridge_name='br-int',has_traffic_filtering=True,id=419310e6-0055-4c1d-8cdb-be034824b754,network=Network(e227a777-0e88-4409-a4a5-266ef225baae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap419310e6-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.884 227766 DEBUG os_vif [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:14:44:0e,bridge_name='br-int',has_traffic_filtering=True,id=419310e6-0055-4c1d-8cdb-be034824b754,network=Network(e227a777-0e88-4409-a4a5-266ef225baae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap419310e6-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:24:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:22.887 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:14:44:0e 10.100.0.18', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28', 'neutron:device_id': '8010f6fe-77ef-48ec-952f-a3a65186cd59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e227a777-0e88-4409-a4a5-266ef225baae', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9352757c-3308-4452-a338-cff1ca2f64b6, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=419310e6-0055-4c1d-8cdb-be034824b754) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.887 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.888 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap419310e6-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.889 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.890 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.891 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:22.892 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 419310e6-0055-4c1d-8cdb-be034824b754 in datapath e227a777-0e88-4409-a4a5-266ef225baae unbound from our chassis#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.893 227766 INFO os_vif [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:14:44:0e,bridge_name='br-int',has_traffic_filtering=True,id=419310e6-0055-4c1d-8cdb-be034824b754,network=Network(e227a777-0e88-4409-a4a5-266ef225baae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap419310e6-00')#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.894 227766 DEBUG nova.virt.libvirt.guest [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:name>tempest-TestNetworkBasicOps-server-1311975023</nova:name>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 10:24:22</nova:creationTime>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    <nova:port uuid="1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9">
Jan 23 05:24:22 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 05:24:22 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 05:24:22 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 05:24:22 np0005593234 nova_compute[227762]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 05:24:22 np0005593234 nova_compute[227762]: 2026-01-23 10:24:22.894 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:22.895 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e227a777-0e88-4409-a4a5-266ef225baae, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:24:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:22.896 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[30b6453b-84f6-4a10-a492-f8bb637f91fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:22.898 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae namespace which is not needed anymore#033[00m
Jan 23 05:24:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:23 np0005593234 neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae[300256]: [NOTICE]   (300260) : haproxy version is 2.8.14-c23fe91
Jan 23 05:24:23 np0005593234 neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae[300256]: [NOTICE]   (300260) : path to executable is /usr/sbin/haproxy
Jan 23 05:24:23 np0005593234 neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae[300256]: [WARNING]  (300260) : Exiting Master process...
Jan 23 05:24:23 np0005593234 neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae[300256]: [ALERT]    (300260) : Current worker (300262) exited with code 143 (Terminated)
Jan 23 05:24:23 np0005593234 neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae[300256]: [WARNING]  (300260) : All workers exited. Exiting... (0)
Jan 23 05:24:23 np0005593234 systemd[1]: libpod-bffed677e021f0f18575d0ec30c5b3063f4aa900b9c9f0290003f1985097f3bf.scope: Deactivated successfully.
Jan 23 05:24:23 np0005593234 podman[301372]: 2026-01-23 10:24:23.026272857 +0000 UTC m=+0.041686726 container died bffed677e021f0f18575d0ec30c5b3063f4aa900b9c9f0290003f1985097f3bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 05:24:23 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-bffed677e021f0f18575d0ec30c5b3063f4aa900b9c9f0290003f1985097f3bf-userdata-shm.mount: Deactivated successfully.
Jan 23 05:24:23 np0005593234 systemd[1]: var-lib-containers-storage-overlay-e6f175c24ab90f6e7bcde53409dc2e1ecb05ada2bd33a5a2f79fb3d3161f4486-merged.mount: Deactivated successfully.
Jan 23 05:24:23 np0005593234 podman[301372]: 2026-01-23 10:24:23.067679203 +0000 UTC m=+0.083093062 container cleanup bffed677e021f0f18575d0ec30c5b3063f4aa900b9c9f0290003f1985097f3bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:24:23 np0005593234 systemd[1]: libpod-conmon-bffed677e021f0f18575d0ec30c5b3063f4aa900b9c9f0290003f1985097f3bf.scope: Deactivated successfully.
Jan 23 05:24:23 np0005593234 podman[301416]: 2026-01-23 10:24:23.131239439 +0000 UTC m=+0.038931331 container remove bffed677e021f0f18575d0ec30c5b3063f4aa900b9c9f0290003f1985097f3bf (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:24:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:23.136 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a5b3bb27-e289-4246-b821-021cf3a118f6]: (4, ('Fri Jan 23 10:24:22 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae (bffed677e021f0f18575d0ec30c5b3063f4aa900b9c9f0290003f1985097f3bf)\nbffed677e021f0f18575d0ec30c5b3063f4aa900b9c9f0290003f1985097f3bf\nFri Jan 23 10:24:23 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae (bffed677e021f0f18575d0ec30c5b3063f4aa900b9c9f0290003f1985097f3bf)\nbffed677e021f0f18575d0ec30c5b3063f4aa900b9c9f0290003f1985097f3bf\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:23.138 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[63a3a438-95bb-4831-96d5-45ee2a3a92ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:23.138 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape227a777-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:23 np0005593234 kernel: tape227a777-00: left promiscuous mode
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.140 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.146 227766 DEBUG nova.compute.manager [req-be88cbaf-4db1-4965-97b7-02839c9a8c99 req-3366cd50-45f0-4171-bea9-c76f61a30e2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-vif-unplugged-419310e6-0055-4c1d-8cdb-be034824b754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.146 227766 DEBUG oslo_concurrency.lockutils [req-be88cbaf-4db1-4965-97b7-02839c9a8c99 req-3366cd50-45f0-4171-bea9-c76f61a30e2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.146 227766 DEBUG oslo_concurrency.lockutils [req-be88cbaf-4db1-4965-97b7-02839c9a8c99 req-3366cd50-45f0-4171-bea9-c76f61a30e2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.147 227766 DEBUG oslo_concurrency.lockutils [req-be88cbaf-4db1-4965-97b7-02839c9a8c99 req-3366cd50-45f0-4171-bea9-c76f61a30e2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.147 227766 DEBUG nova.compute.manager [req-be88cbaf-4db1-4965-97b7-02839c9a8c99 req-3366cd50-45f0-4171-bea9-c76f61a30e2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] No waiting events found dispatching network-vif-unplugged-419310e6-0055-4c1d-8cdb-be034824b754 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.147 227766 WARNING nova.compute.manager [req-be88cbaf-4db1-4965-97b7-02839c9a8c99 req-3366cd50-45f0-4171-bea9-c76f61a30e2d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received unexpected event network-vif-unplugged-419310e6-0055-4c1d-8cdb-be034824b754 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.167 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:23.170 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0fd8c50f-1739-4572-9c90-7ee795c41c2a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:23.186 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c71cae34-4f87-4198-b7dd-0996e024aa57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:23.187 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7dedb39b-2a19-4827-8d49-e248561e4a3e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:23.207 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[578db1a6-e784-4d0c-ab6a-bd1a5906e314]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 769810, 'reachable_time': 15189, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 301431, 'error': None, 'target': 'ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:23 np0005593234 systemd[1]: run-netns-ovnmeta\x2de227a777\x2d0e88\x2d4409\x2da4a5\x2d266ef225baae.mount: Deactivated successfully.
Jan 23 05:24:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:23.208 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e227a777-0e88-4409-a4a5-266ef225baae deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:24:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:23.209 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[0af92b42-7f1e-4a8a-b0a8-76f7af19e29a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:24:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2324780322' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.378 227766 DEBUG oslo_concurrency.processutils [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.539s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.435 227766 DEBUG oslo_concurrency.processutils [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.590 227766 DEBUG oslo_concurrency.lockutils [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.591 227766 DEBUG oslo_concurrency.lockutils [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquired lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.592 227766 DEBUG nova.network.neutron [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.639 227766 DEBUG nova.compute.manager [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-vif-deleted-419310e6-0055-4c1d-8cdb-be034824b754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.639 227766 INFO nova.compute.manager [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Neutron deleted interface 419310e6-0055-4c1d-8cdb-be034824b754; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.639 227766 DEBUG nova.network.neutron [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updating instance_info_cache with network_info: [{"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.671 227766 DEBUG nova.objects.instance [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lazy-loading 'system_metadata' on Instance uuid 8010f6fe-77ef-48ec-952f-a3a65186cd59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.696 227766 DEBUG nova.objects.instance [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lazy-loading 'flavor' on Instance uuid 8010f6fe-77ef-48ec-952f-a3a65186cd59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:24:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/94902904' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:24:23 np0005593234 nova_compute[227762]: 2026-01-23 10:24:23.874 227766 DEBUG oslo_concurrency.processutils [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:24:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:24.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.545 227766 DEBUG nova.virt.libvirt.vif [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:22:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1311975023',display_name='tempest-TestNetworkBasicOps-server-1311975023',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1311975023',id=158,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL/weReemtkH37O5IRYNNKB/TT2YWx12mLIlrues6ra7aWugRfActXRaloGeDbBIe3P4JQOsAt7qcWUIg9CugCzIu3x3QAkzyj6XRgUoI3pt8WnzsopEYd4m2Fn+OmRPbg==',key_name='tempest-TestNetworkBasicOps-1841453028',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:22:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-3yzv077h',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:22:48Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=8010f6fe-77ef-48ec-952f-a3a65186cd59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.546 227766 DEBUG nova.network.os_vif_util [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Converting VIF {"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.547 227766 DEBUG nova.network.os_vif_util [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:14:44:0e,bridge_name='br-int',has_traffic_filtering=True,id=419310e6-0055-4c1d-8cdb-be034824b754,network=Network(e227a777-0e88-4409-a4a5-266ef225baae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap419310e6-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.552 227766 DEBUG nova.virt.libvirt.guest [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:14:44:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap419310e6-00"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.557 227766 DEBUG nova.virt.libvirt.guest [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:14:44:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap419310e6-00"/></interface>not found in domain: <domain type='kvm' id='74'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <name>instance-0000009e</name>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <uuid>8010f6fe-77ef-48ec-952f-a3a65186cd59</uuid>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:name>tempest-TestNetworkBasicOps-server-1311975023</nova:name>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 10:24:22</nova:creationTime>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:port uuid="1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9">
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 05:24:24 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <memory unit='KiB'>131072</memory>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <vcpu placement='static'>1</vcpu>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <resource>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <partition>/machine</partition>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </resource>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <sysinfo type='smbios'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <entry name='manufacturer'>RDO</entry>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <entry name='serial'>8010f6fe-77ef-48ec-952f-a3a65186cd59</entry>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <entry name='uuid'>8010f6fe-77ef-48ec-952f-a3a65186cd59</entry>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <entry name='family'>Virtual Machine</entry>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <boot dev='hd'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <smbios mode='sysinfo'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <vmcoreinfo state='on'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <model fallback='forbid'>Nehalem</model>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <feature policy='require' name='x2apic'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <feature policy='require' name='hypervisor'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <feature policy='require' name='vme'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <clock offset='utc'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <timer name='hpet' present='no'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <on_poweroff>destroy</on_poweroff>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <on_reboot>restart</on_reboot>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <on_crash>destroy</on_crash>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <disk type='network' device='disk'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/8010f6fe-77ef-48ec-952f-a3a65186cd59_disk' index='2'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target dev='vda' bus='virtio'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='virtio-disk0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <disk type='network' device='cdrom'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/8010f6fe-77ef-48ec-952f-a3a65186cd59_disk.config' index='1'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target dev='sda' bus='sata'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <readonly/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='sata0-0-0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pcie.0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='1' port='0x10'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='2' port='0x11'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.2'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='3' port='0x12'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.3'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='4' port='0x13'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.4'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='5' port='0x14'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.5'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='6' port='0x15'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.6'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='7' port='0x16'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.7'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='8' port='0x17'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.8'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='9' port='0x18'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.9'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='10' port='0x19'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.10'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='11' port='0x1a'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.11'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='12' port='0x1b'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.12'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='13' port='0x1c'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.13'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='14' port='0x1d'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.14'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='15' port='0x1e'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.15'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='16' port='0x1f'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.16'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='17' port='0x20'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.17'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='18' port='0x21'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.18'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='19' port='0x22'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.19'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='20' port='0x23'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.20'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='21' port='0x24'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.21'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='22' port='0x25'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.22'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='23' port='0x26'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.23'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='24' port='0x27'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.24'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='25' port='0x28'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.25'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-pci-bridge'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.26'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='usb'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='sata' index='0'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='ide'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:8e:52:70'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target dev='tap1fea5a6d-70'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='net0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <serial type='pty'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <source path='/dev/pts/1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/console.log' append='off'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target type='isa-serial' port='0'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <model name='isa-serial'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      </target>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <console type='pty' tty='/dev/pts/1'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <source path='/dev/pts/1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/console.log' append='off'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target type='serial' port='0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </console>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <input type='tablet' bus='usb'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='input0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='usb' bus='0' port='1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </input>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <input type='mouse' bus='ps2'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='input1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </input>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <input type='keyboard' bus='ps2'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='input2'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </input>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <listen type='address' address='::0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <audio id='1' type='none'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='video0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <watchdog model='itco' action='reset'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='watchdog0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </watchdog>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <memballoon model='virtio'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <stats period='10'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='balloon0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <rng model='virtio'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <backend model='random'>/dev/urandom</backend>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='rng0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <label>system_u:system_r:svirt_t:s0:c114,c349</label>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c114,c349</imagelabel>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <label>+107:+107</label>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <imagelabel>+107:+107</imagelabel>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 05:24:24 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:24:24 np0005593234 nova_compute[227762]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.557 227766 DEBUG nova.virt.libvirt.guest [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:14:44:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap419310e6-00"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.560 227766 DEBUG nova.virt.libvirt.guest [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:14:44:0e"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap419310e6-00"/></interface>not found in domain: <domain type='kvm' id='74'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <name>instance-0000009e</name>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <uuid>8010f6fe-77ef-48ec-952f-a3a65186cd59</uuid>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:name>tempest-TestNetworkBasicOps-server-1311975023</nova:name>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 10:24:22</nova:creationTime>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:port uuid="1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9">
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 05:24:24 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <memory unit='KiB'>131072</memory>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <vcpu placement='static'>1</vcpu>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <resource>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <partition>/machine</partition>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </resource>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <sysinfo type='smbios'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <entry name='manufacturer'>RDO</entry>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <entry name='product'>OpenStack Compute</entry>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <entry name='serial'>8010f6fe-77ef-48ec-952f-a3a65186cd59</entry>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <entry name='uuid'>8010f6fe-77ef-48ec-952f-a3a65186cd59</entry>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <entry name='family'>Virtual Machine</entry>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <boot dev='hd'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <smbios mode='sysinfo'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <vmcoreinfo state='on'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <cpu mode='custom' match='exact' check='full'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <model fallback='forbid'>Nehalem</model>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <feature policy='require' name='x2apic'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <feature policy='require' name='hypervisor'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <feature policy='require' name='vme'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <clock offset='utc'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <timer name='pit' tickpolicy='delay'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <timer name='rtc' tickpolicy='catchup'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <timer name='hpet' present='no'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <on_poweroff>destroy</on_poweroff>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <on_reboot>restart</on_reboot>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <on_crash>destroy</on_crash>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <disk type='network' device='disk'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/8010f6fe-77ef-48ec-952f-a3a65186cd59_disk' index='2'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target dev='vda' bus='virtio'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='virtio-disk0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <disk type='network' device='cdrom'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <driver name='qemu' type='raw' cache='none'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <auth username='openstack'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <secret type='ceph' uuid='e1533653-0a5a-584c-b34b-8689f0d32e77'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <source protocol='rbd' name='vms/8010f6fe-77ef-48ec-952f-a3a65186cd59_disk.config' index='1'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <host name='192.168.122.100' port='6789'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <host name='192.168.122.102' port='6789'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <host name='192.168.122.101' port='6789'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target dev='sda' bus='sata'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <readonly/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='sata0-0-0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='0' model='pcie-root'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pcie.0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='1' port='0x10'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='2' port='0x11'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.2'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='3' port='0x12'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.3'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='4' port='0x13'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.4'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='5' port='0x14'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.5'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='6' port='0x15'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.6'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='7' port='0x16'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.7'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='8' port='0x17'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.8'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='9' port='0x18'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.9'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='10' port='0x19'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.10'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='11' port='0x1a'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.11'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='12' port='0x1b'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.12'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='13' port='0x1c'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.13'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='14' port='0x1d'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.14'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='15' port='0x1e'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.15'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='16' port='0x1f'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.16'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='17' port='0x20'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.17'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='18' port='0x21'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.18'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='19' port='0x22'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.19'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='20' port='0x23'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.20'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='21' port='0x24'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.21'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='22' port='0x25'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.22'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='23' port='0x26'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.23'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='24' port='0x27'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.24'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-root-port'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target chassis='25' port='0x28'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.25'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model name='pcie-pci-bridge'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='pci.26'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='usb'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <controller type='sata' index='0'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='ide'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </controller>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <interface type='ethernet'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <mac address='fa:16:3e:8e:52:70'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target dev='tap1fea5a6d-70'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model type='virtio'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <driver name='vhost' rx_queue_size='512'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <mtu size='1442'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='net0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <serial type='pty'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <source path='/dev/pts/1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/console.log' append='off'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target type='isa-serial' port='0'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:        <model name='isa-serial'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      </target>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <console type='pty' tty='/dev/pts/1'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <source path='/dev/pts/1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <log file='/var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59/console.log' append='off'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <target type='serial' port='0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='serial0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </console>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <input type='tablet' bus='usb'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='input0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='usb' bus='0' port='1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </input>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <input type='mouse' bus='ps2'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='input1'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </input>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <input type='keyboard' bus='ps2'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='input2'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </input>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <graphics type='vnc' port='5901' autoport='yes' listen='::0'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <listen type='address' address='::0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </graphics>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <audio id='1' type='none'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <model type='virtio' heads='1' primary='yes'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='video0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <watchdog model='itco' action='reset'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='watchdog0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </watchdog>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <memballoon model='virtio'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <stats period='10'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='balloon0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <rng model='virtio'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <backend model='random'>/dev/urandom</backend>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <alias name='rng0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <label>system_u:system_r:svirt_t:s0:c114,c349</label>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c114,c349</imagelabel>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <label>+107:+107</label>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <imagelabel>+107:+107</imagelabel>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </seclabel>
Jan 23 05:24:24 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:24:24 np0005593234 nova_compute[227762]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.561 227766 WARNING nova.virt.libvirt.driver [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Detaching interface fa:16:3e:14:44:0e failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap419310e6-00' not found.#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.561 227766 DEBUG nova.virt.libvirt.vif [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:22:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1311975023',display_name='tempest-TestNetworkBasicOps-server-1311975023',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1311975023',id=158,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL/weReemtkH37O5IRYNNKB/TT2YWx12mLIlrues6ra7aWugRfActXRaloGeDbBIe3P4JQOsAt7qcWUIg9CugCzIu3x3QAkzyj6XRgUoI3pt8WnzsopEYd4m2Fn+OmRPbg==',key_name='tempest-TestNetworkBasicOps-1841453028',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:22:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-3yzv077h',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:22:48Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=8010f6fe-77ef-48ec-952f-a3a65186cd59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.562 227766 DEBUG nova.network.os_vif_util [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Converting VIF {"id": "419310e6-0055-4c1d-8cdb-be034824b754", "address": "fa:16:3e:14:44:0e", "network": {"id": "e227a777-0e88-4409-a4a5-266ef225baae", "bridge": "br-int", "label": "tempest-network-smoke--2008856721", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.18", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap419310e6-00", "ovs_interfaceid": "419310e6-0055-4c1d-8cdb-be034824b754", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.562 227766 DEBUG nova.network.os_vif_util [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:14:44:0e,bridge_name='br-int',has_traffic_filtering=True,id=419310e6-0055-4c1d-8cdb-be034824b754,network=Network(e227a777-0e88-4409-a4a5-266ef225baae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap419310e6-00') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.562 227766 DEBUG os_vif [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:14:44:0e,bridge_name='br-int',has_traffic_filtering=True,id=419310e6-0055-4c1d-8cdb-be034824b754,network=Network(e227a777-0e88-4409-a4a5-266ef225baae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap419310e6-00') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.563 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.564 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap419310e6-00, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.564 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.567 227766 INFO os_vif [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:14:44:0e,bridge_name='br-int',has_traffic_filtering=True,id=419310e6-0055-4c1d-8cdb-be034824b754,network=Network(e227a777-0e88-4409-a4a5-266ef225baae),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap419310e6-00')#033[00m
Jan 23 05:24:24 np0005593234 nova_compute[227762]: 2026-01-23 10:24:24.567 227766 DEBUG nova.virt.libvirt.guest [req-9b40d62a-9053-47a9-86ea-65037bc586af req-6a751fcd-504c-4626-970a-2444c409a1ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:name>tempest-TestNetworkBasicOps-server-1311975023</nova:name>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:creationTime>2026-01-23 10:24:24</nova:creationTime>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:flavor name="m1.nano">
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:memory>128</nova:memory>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:disk>1</nova:disk>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:swap>0</nova:swap>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:vcpus>1</nova:vcpus>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </nova:flavor>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:owner>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </nova:owner>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  <nova:ports>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    <nova:port uuid="1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9">
Jan 23 05:24:24 np0005593234 nova_compute[227762]:      <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:    </nova:port>
Jan 23 05:24:24 np0005593234 nova_compute[227762]:  </nova:ports>
Jan 23 05:24:24 np0005593234 nova_compute[227762]: </nova:instance>
Jan 23 05:24:24 np0005593234 nova_compute[227762]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Jan 23 05:24:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:24.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:25 np0005593234 nova_compute[227762]: 2026-01-23 10:24:25.736 227766 DEBUG nova.virt.libvirt.vif [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:23:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-59893283',display_name='tempest-AttachVolumeTestJSON-server-59893283',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-59893283',id=162,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFhliJMNa+yXuf0fSBz/3snMSJFr8dgTPMjo/Saadc9lk7SWpQL9azGzggupkC6Qe7ig8O16HGbub4k3l0C30EmW82whCbHcGBU51cs0XAqEAsopgWQJtLRZgtshG7dxog==',key_name='tempest-keypair-1909141305',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:23:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c7c25c6bb33b41bf9cd8febb8259fd87',ramdisk_id='',reservation_id='r-gp4q9qzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-345871886',owner_user_name='tempest-AttachVolumeTestJSON-345871886-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:24:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='01b7396ecc574dd6ba2df2f406921223',uuid=bd0fc955-63ff-41a4-b31b-369c2b584544,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:24:25 np0005593234 nova_compute[227762]: 2026-01-23 10:24:25.737 227766 DEBUG nova.network.os_vif_util [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converting VIF {"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:24:25 np0005593234 nova_compute[227762]: 2026-01-23 10:24:25.738 227766 DEBUG nova.network.os_vif_util [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:24:25 np0005593234 nova_compute[227762]: 2026-01-23 10:24:25.740 227766 DEBUG nova.objects.instance [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'pci_devices' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.022 227766 DEBUG nova.compute.manager [req-d4c9a436-9753-4b9e-a98f-19b709a4287f req-47d88314-5dd5-4f2b-90b5-93897292d5af 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-vif-plugged-419310e6-0055-4c1d-8cdb-be034824b754 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.022 227766 DEBUG oslo_concurrency.lockutils [req-d4c9a436-9753-4b9e-a98f-19b709a4287f req-47d88314-5dd5-4f2b-90b5-93897292d5af 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.023 227766 DEBUG oslo_concurrency.lockutils [req-d4c9a436-9753-4b9e-a98f-19b709a4287f req-47d88314-5dd5-4f2b-90b5-93897292d5af 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.023 227766 DEBUG oslo_concurrency.lockutils [req-d4c9a436-9753-4b9e-a98f-19b709a4287f req-47d88314-5dd5-4f2b-90b5-93897292d5af 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.024 227766 DEBUG nova.compute.manager [req-d4c9a436-9753-4b9e-a98f-19b709a4287f req-47d88314-5dd5-4f2b-90b5-93897292d5af 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] No waiting events found dispatching network-vif-plugged-419310e6-0055-4c1d-8cdb-be034824b754 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.024 227766 WARNING nova.compute.manager [req-d4c9a436-9753-4b9e-a98f-19b709a4287f req-47d88314-5dd5-4f2b-90b5-93897292d5af 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received unexpected event network-vif-plugged-419310e6-0055-4c1d-8cdb-be034824b754 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:24:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:26.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.046 227766 DEBUG nova.virt.libvirt.driver [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  <uuid>bd0fc955-63ff-41a4-b31b-369c2b584544</uuid>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  <name>instance-000000a2</name>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <nova:name>tempest-AttachVolumeTestJSON-server-59893283</nova:name>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:24:22</nova:creationTime>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <nova:user uuid="01b7396ecc574dd6ba2df2f406921223">tempest-AttachVolumeTestJSON-345871886-project-member</nova:user>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <nova:project uuid="c7c25c6bb33b41bf9cd8febb8259fd87">tempest-AttachVolumeTestJSON-345871886</nova:project>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <nova:port uuid="a4401398-6f7f-4595-b308-33a66a468a1f">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <entry name="serial">bd0fc955-63ff-41a4-b31b-369c2b584544</entry>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <entry name="uuid">bd0fc955-63ff-41a4-b31b-369c2b584544</entry>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/bd0fc955-63ff-41a4-b31b-369c2b584544_disk">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/bd0fc955-63ff-41a4-b31b-369c2b584544_disk.config">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="volumes/volume-127a1e1e-4b4f-4404-b522-315ba62689fa">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <target dev="vdb" bus="virtio"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <serial>127a1e1e-4b4f-4404-b522-315ba62689fa</serial>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:d9:d9:06"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <target dev="tapa4401398-6f"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/bd0fc955-63ff-41a4-b31b-369c2b584544/console.log" append="off"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <input type="keyboard" bus="usb"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:24:26 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:24:26 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:24:26 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:24:26 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.047 227766 DEBUG nova.virt.libvirt.driver [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.048 227766 DEBUG nova.virt.libvirt.driver [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.048 227766 DEBUG nova.virt.libvirt.driver [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.049 227766 DEBUG nova.virt.libvirt.vif [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:23:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-59893283',display_name='tempest-AttachVolumeTestJSON-server-59893283',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-59893283',id=162,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFhliJMNa+yXuf0fSBz/3snMSJFr8dgTPMjo/Saadc9lk7SWpQL9azGzggupkC6Qe7ig8O16HGbub4k3l0C30EmW82whCbHcGBU51cs0XAqEAsopgWQJtLRZgtshG7dxog==',key_name='tempest-keypair-1909141305',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:23:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='c7c25c6bb33b41bf9cd8febb8259fd87',ramdisk_id='',reservation_id='r-gp4q9qzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-345871886',owner_user_name='tempest-AttachVolumeTestJSON-345871886-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:24:19Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='01b7396ecc574dd6ba2df2f406921223',uuid=bd0fc955-63ff-41a4-b31b-369c2b584544,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.050 227766 DEBUG nova.network.os_vif_util [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converting VIF {"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.051 227766 DEBUG nova.network.os_vif_util [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.052 227766 DEBUG os_vif [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.057 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.057 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.058 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.062 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.063 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa4401398-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.064 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa4401398-6f, col_values=(('external_ids', {'iface-id': 'a4401398-6f7f-4595-b308-33a66a468a1f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:d9:06', 'vm-uuid': 'bd0fc955-63ff-41a4-b31b-369c2b584544'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.066 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:26 np0005593234 NetworkManager[48942]: <info>  [1769163866.0681] manager: (tapa4401398-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/324)
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.069 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.074 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.076 227766 INFO os_vif [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f')#033[00m
Jan 23 05:24:26 np0005593234 kernel: tapa4401398-6f: entered promiscuous mode
Jan 23 05:24:26 np0005593234 NetworkManager[48942]: <info>  [1769163866.1602] manager: (tapa4401398-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/325)
Jan 23 05:24:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:26Z|00673|binding|INFO|Claiming lport a4401398-6f7f-4595-b308-33a66a468a1f for this chassis.
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.161 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:26Z|00674|binding|INFO|a4401398-6f7f-4595-b308-33a66a468a1f: Claiming fa:16:3e:d9:d9:06 10.100.0.7
Jan 23 05:24:26 np0005593234 systemd-udevd[301490]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:24:26 np0005593234 NetworkManager[48942]: <info>  [1769163866.1990] device (tapa4401398-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:24:26 np0005593234 NetworkManager[48942]: <info>  [1769163866.1995] device (tapa4401398-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:24:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:26Z|00675|binding|INFO|Setting lport a4401398-6f7f-4595-b308-33a66a468a1f ovn-installed in OVS
Jan 23 05:24:26 np0005593234 systemd-machined[195626]: New machine qemu-76-instance-000000a2.
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.249 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:26Z|00676|binding|INFO|Setting lport a4401398-6f7f-4595-b308-33a66a468a1f up in Southbound
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.250 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:d9:06 10.100.0.7'], port_security=['fa:16:3e:d9:d9:06 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'bd0fc955-63ff-41a4-b31b-369c2b584544', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1280650e-e283-4ddc-81aa-357640520155', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c7c25c6bb33b41bf9cd8febb8259fd87', 'neutron:revision_number': '5', 'neutron:security_group_ids': '3c721f45-9254-46f2-b17b-2aa67f5ce3fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4684203-7828-4ea2-86ad-83030eb9aefe, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=a4401398-6f7f-4595-b308-33a66a468a1f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.250 144381 INFO neutron.agent.ovn.metadata.agent [-] Port a4401398-6f7f-4595-b308-33a66a468a1f in datapath 1280650e-e283-4ddc-81aa-357640520155 bound to our chassis#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.252 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1280650e-e283-4ddc-81aa-357640520155#033[00m
Jan 23 05:24:26 np0005593234 systemd[1]: Started Virtual Machine qemu-76-instance-000000a2.
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.262 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[57414a15-794d-4525-b44d-a293c2eedfa8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.263 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1280650e-e1 in ovnmeta-1280650e-e283-4ddc-81aa-357640520155 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.265 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1280650e-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.265 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aa639593-9a01-4ddc-8264-68a6c0abdce5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.266 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c6f4e890-6e0d-4557-b247-038ca824652b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.278 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[8428b6d6-db47-4d07-93b7-046264ac69a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.293 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0708f06f-f6e9-411d-bad9-a40d2c580b6a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.334 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e757664b-4a9d-468a-b731-d25194cc5887]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.342 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e2dcc0c4-51f7-4869-8fba-3c8adce2baa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 NetworkManager[48942]: <info>  [1769163866.3441] manager: (tap1280650e-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/326)
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.374 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7013c0fb-b4bb-4690-936e-5a2e9a98382a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.377 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d70169e2-5b1b-403d-b58f-e6be4f673acb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 NetworkManager[48942]: <info>  [1769163866.3986] device (tap1280650e-e0): carrier: link connected
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.406 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[31d713ff-8c33-4405-bbea-9024280d9b7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.422 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e53b57bc-9376-4e63-a7cc-cda3c7579bd0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1280650e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:5b:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 776135, 'reachable_time': 41380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 301524, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.437 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f0cac5e3-53b1-4386-b385-c76962ca04d6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feea:5b3e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 776135, 'tstamp': 776135}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 301525, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.453 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d20f8e-06fc-4d1b-b992-d09670b17d52]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1280650e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:5b:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 210], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 776135, 'reachable_time': 41380, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 301526, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.484 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c6b3ff9d-2840-42f6-9efb-eb5bfaf20362]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.545 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2c9ce838-9224-4104-94d0-047cb1710392]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.546 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1280650e-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.547 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.547 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1280650e-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.549 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:26 np0005593234 NetworkManager[48942]: <info>  [1769163866.5500] manager: (tap1280650e-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/327)
Jan 23 05:24:26 np0005593234 kernel: tap1280650e-e0: entered promiscuous mode
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.552 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.553 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1280650e-e0, col_values=(('external_ids', {'iface-id': '8ca9fbcb-59f5-4006-84df-ab99827a2b39'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.555 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:26Z|00677|binding|INFO|Releasing lport 8ca9fbcb-59f5-4006-84df-ab99827a2b39 from this chassis (sb_readonly=0)
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.557 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.557 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1280650e-e283-4ddc-81aa-357640520155.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1280650e-e283-4ddc-81aa-357640520155.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.558 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[268332c4-8bfb-4a27-b464-fb1898dbfbc7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.559 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-1280650e-e283-4ddc-81aa-357640520155
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/1280650e-e283-4ddc-81aa-357640520155.pid.haproxy
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 1280650e-e283-4ddc-81aa-357640520155
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:24:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:26.560 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'env', 'PROCESS_TAG=haproxy-1280650e-e283-4ddc-81aa-357640520155', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1280650e-e283-4ddc-81aa-357640520155.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.570 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:26.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.901 227766 DEBUG nova.compute.manager [req-a4fa2ec6-ca5c-4201-8aca-7877784cc36d req-e6df1345-f3b3-4dcd-964e-915a94e4188f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.901 227766 DEBUG oslo_concurrency.lockutils [req-a4fa2ec6-ca5c-4201-8aca-7877784cc36d req-e6df1345-f3b3-4dcd-964e-915a94e4188f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.901 227766 DEBUG oslo_concurrency.lockutils [req-a4fa2ec6-ca5c-4201-8aca-7877784cc36d req-e6df1345-f3b3-4dcd-964e-915a94e4188f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.902 227766 DEBUG oslo_concurrency.lockutils [req-a4fa2ec6-ca5c-4201-8aca-7877784cc36d req-e6df1345-f3b3-4dcd-964e-915a94e4188f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.902 227766 DEBUG nova.compute.manager [req-a4fa2ec6-ca5c-4201-8aca-7877784cc36d req-e6df1345-f3b3-4dcd-964e-915a94e4188f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] No waiting events found dispatching network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:26 np0005593234 nova_compute[227762]: 2026-01-23 10:24:26.902 227766 WARNING nova.compute.manager [req-a4fa2ec6-ca5c-4201-8aca-7877784cc36d req-e6df1345-f3b3-4dcd-964e-915a94e4188f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received unexpected event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 23 05:24:26 np0005593234 podman[301612]: 2026-01-23 10:24:26.942649698 +0000 UTC m=+0.041827631 container create fbeecb1db736975457ef19744f55ef8254ec6a0f83ea0970bdbcbc2c9eac6d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:24:26 np0005593234 systemd[1]: Started libpod-conmon-fbeecb1db736975457ef19744f55ef8254ec6a0f83ea0970bdbcbc2c9eac6d08.scope.
Jan 23 05:24:26 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:24:26 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a4483c2bc6fe15e66de44a62fc9281fe1bd72db7a58c52dc8e4cb11fde05f2b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:24:27 np0005593234 podman[301612]: 2026-01-23 10:24:27.004796899 +0000 UTC m=+0.103974892 container init fbeecb1db736975457ef19744f55ef8254ec6a0f83ea0970bdbcbc2c9eac6d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:24:27 np0005593234 podman[301612]: 2026-01-23 10:24:27.010466835 +0000 UTC m=+0.109644778 container start fbeecb1db736975457ef19744f55ef8254ec6a0f83ea0970bdbcbc2c9eac6d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:24:27 np0005593234 podman[301612]: 2026-01-23 10:24:26.922736439 +0000 UTC m=+0.021914402 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:24:27 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[301629]: [NOTICE]   (301636) : New worker (301638) forked
Jan 23 05:24:27 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[301629]: [NOTICE]   (301636) : Loading success.
Jan 23 05:24:27 np0005593234 nova_compute[227762]: 2026-01-23 10:24:27.083 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for bd0fc955-63ff-41a4-b31b-369c2b584544 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:24:27 np0005593234 nova_compute[227762]: 2026-01-23 10:24:27.084 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163867.083126, bd0fc955-63ff-41a4-b31b-369c2b584544 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:24:27 np0005593234 nova_compute[227762]: 2026-01-23 10:24:27.084 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:24:27 np0005593234 nova_compute[227762]: 2026-01-23 10:24:27.085 227766 DEBUG nova.compute.manager [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:24:27 np0005593234 nova_compute[227762]: 2026-01-23 10:24:27.088 227766 INFO nova.virt.libvirt.driver [-] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance rebooted successfully.#033[00m
Jan 23 05:24:27 np0005593234 nova_compute[227762]: 2026-01-23 10:24:27.088 227766 DEBUG nova.compute.manager [None req-43385bfd-e7ff-4110-ab09-de45f28da2c2 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:27 np0005593234 nova_compute[227762]: 2026-01-23 10:24:27.141 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:27 np0005593234 nova_compute[227762]: 2026-01-23 10:24:27.144 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:24:27 np0005593234 nova_compute[227762]: 2026-01-23 10:24:27.328 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163867.083834, bd0fc955-63ff-41a4-b31b-369c2b584544 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:24:27 np0005593234 nova_compute[227762]: 2026-01-23 10:24:27.328 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] VM Started (Lifecycle Event)#033[00m
Jan 23 05:24:27 np0005593234 nova_compute[227762]: 2026-01-23 10:24:27.475 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:24:27 np0005593234 nova_compute[227762]: 2026-01-23 10:24:27.480 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:24:27 np0005593234 nova_compute[227762]: 2026-01-23 10:24:27.896 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:28 np0005593234 nova_compute[227762]: 2026-01-23 10:24:28.015 227766 INFO nova.network.neutron [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Port 419310e6-0055-4c1d-8cdb-be034824b754 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Jan 23 05:24:28 np0005593234 nova_compute[227762]: 2026-01-23 10:24:28.015 227766 DEBUG nova.network.neutron [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updating instance_info_cache with network_info: [{"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:28.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:28 np0005593234 nova_compute[227762]: 2026-01-23 10:24:28.048 227766 DEBUG oslo_concurrency.lockutils [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Releasing lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:28 np0005593234 nova_compute[227762]: 2026-01-23 10:24:28.076 227766 DEBUG oslo_concurrency.lockutils [None req-e1273f6d-78ad-4aa7-8228-e16699523144 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "interface-8010f6fe-77ef-48ec-952f-a3a65186cd59-419310e6-0055-4c1d-8cdb-be034824b754" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 5.631s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:28Z|00678|binding|INFO|Releasing lport 8ca9fbcb-59f5-4006-84df-ab99827a2b39 from this chassis (sb_readonly=0)
Jan 23 05:24:28 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:28Z|00679|binding|INFO|Releasing lport 941ae456-64ca-4338-b65f-ea519122a16f from this chassis (sb_readonly=0)
Jan 23 05:24:28 np0005593234 nova_compute[227762]: 2026-01-23 10:24:28.143 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:28 np0005593234 nova_compute[227762]: 2026-01-23 10:24:28.762 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:28 np0005593234 nova_compute[227762]: 2026-01-23 10:24:28.763 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:24:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:28.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:29 np0005593234 nova_compute[227762]: 2026-01-23 10:24:29.004 227766 DEBUG nova.compute.manager [req-9db008f9-191c-46e5-a748-d02ac60faab9 req-33d68669-fae1-4064-9021-9891ae17fe45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:29 np0005593234 nova_compute[227762]: 2026-01-23 10:24:29.005 227766 DEBUG oslo_concurrency.lockutils [req-9db008f9-191c-46e5-a748-d02ac60faab9 req-33d68669-fae1-4064-9021-9891ae17fe45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:29 np0005593234 nova_compute[227762]: 2026-01-23 10:24:29.005 227766 DEBUG oslo_concurrency.lockutils [req-9db008f9-191c-46e5-a748-d02ac60faab9 req-33d68669-fae1-4064-9021-9891ae17fe45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:29 np0005593234 nova_compute[227762]: 2026-01-23 10:24:29.005 227766 DEBUG oslo_concurrency.lockutils [req-9db008f9-191c-46e5-a748-d02ac60faab9 req-33d68669-fae1-4064-9021-9891ae17fe45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:29 np0005593234 nova_compute[227762]: 2026-01-23 10:24:29.005 227766 DEBUG nova.compute.manager [req-9db008f9-191c-46e5-a748-d02ac60faab9 req-33d68669-fae1-4064-9021-9891ae17fe45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] No waiting events found dispatching network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:29 np0005593234 nova_compute[227762]: 2026-01-23 10:24:29.005 227766 WARNING nova.compute.manager [req-9db008f9-191c-46e5-a748-d02ac60faab9 req-33d68669-fae1-4064-9021-9891ae17fe45 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received unexpected event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f for instance with vm_state active and task_state None.#033[00m
Jan 23 05:24:29 np0005593234 nova_compute[227762]: 2026-01-23 10:24:29.010 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:24:29 np0005593234 nova_compute[227762]: 2026-01-23 10:24:29.926 227766 DEBUG oslo_concurrency.lockutils [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "8010f6fe-77ef-48ec-952f-a3a65186cd59" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:29 np0005593234 nova_compute[227762]: 2026-01-23 10:24:29.926 227766 DEBUG oslo_concurrency.lockutils [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:29 np0005593234 nova_compute[227762]: 2026-01-23 10:24:29.927 227766 DEBUG oslo_concurrency.lockutils [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:29 np0005593234 nova_compute[227762]: 2026-01-23 10:24:29.927 227766 DEBUG oslo_concurrency.lockutils [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:29 np0005593234 nova_compute[227762]: 2026-01-23 10:24:29.927 227766 DEBUG oslo_concurrency.lockutils [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:29 np0005593234 nova_compute[227762]: 2026-01-23 10:24:29.928 227766 INFO nova.compute.manager [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Terminating instance#033[00m
Jan 23 05:24:29 np0005593234 nova_compute[227762]: 2026-01-23 10:24:29.929 227766 DEBUG nova.compute.manager [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:24:30 np0005593234 kernel: tap1fea5a6d-70 (unregistering): left promiscuous mode
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.030 227766 DEBUG nova.compute.manager [req-08d5b601-9dfa-4028-8f0e-4702dedcb70e req-8ba56a9e-2ba9-40ad-ba26-89e3c3785ac6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-changed-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:30 np0005593234 NetworkManager[48942]: <info>  [1769163870.0327] device (tap1fea5a6d-70): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.031 227766 DEBUG nova.compute.manager [req-08d5b601-9dfa-4028-8f0e-4702dedcb70e req-8ba56a9e-2ba9-40ad-ba26-89e3c3785ac6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Refreshing instance network info cache due to event network-changed-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.035 227766 DEBUG oslo_concurrency.lockutils [req-08d5b601-9dfa-4028-8f0e-4702dedcb70e req-8ba56a9e-2ba9-40ad-ba26-89e3c3785ac6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.035 227766 DEBUG oslo_concurrency.lockutils [req-08d5b601-9dfa-4028-8f0e-4702dedcb70e req-8ba56a9e-2ba9-40ad-ba26-89e3c3785ac6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.036 227766 DEBUG nova.network.neutron [req-08d5b601-9dfa-4028-8f0e-4702dedcb70e req-8ba56a9e-2ba9-40ad-ba26-89e3c3785ac6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Refreshing network info cache for port 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:24:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:30.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:30 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:30Z|00680|binding|INFO|Releasing lport 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 from this chassis (sb_readonly=0)
Jan 23 05:24:30 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:30Z|00681|binding|INFO|Setting lport 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 down in Southbound
Jan 23 05:24:30 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:30Z|00682|binding|INFO|Removing iface tap1fea5a6d-70 ovn-installed in OVS
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.089 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:30.096 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8e:52:70 10.100.0.3'], port_security=['fa:16:3e:8e:52:70 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '8010f6fe-77ef-48ec-952f-a3a65186cd59', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-16290d86-0a8d-403e-83f2-0ae47fb80e5f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'f0ab937f-5892-49c6-b2ad-6c661ac4d86b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bb4f46ff-5230-4660-946f-0fcefddd5977, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:24:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:30.098 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 in datapath 16290d86-0a8d-403e-83f2-0ae47fb80e5f unbound from our chassis#033[00m
Jan 23 05:24:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:30.100 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 16290d86-0a8d-403e-83f2-0ae47fb80e5f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:24:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:30.101 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4a17dc90-3579-47be-8c81-6c1854710e05]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:30.102 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f namespace which is not needed anymore#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.104 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:30 np0005593234 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000009e.scope: Deactivated successfully.
Jan 23 05:24:30 np0005593234 systemd[1]: machine-qemu\x2d74\x2dinstance\x2d0000009e.scope: Consumed 18.249s CPU time.
Jan 23 05:24:30 np0005593234 systemd-machined[195626]: Machine qemu-74-instance-0000009e terminated.
Jan 23 05:24:30 np0005593234 podman[301649]: 2026-01-23 10:24:30.137491259 +0000 UTC m=+0.082517085 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.147 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.152 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.160 227766 INFO nova.virt.libvirt.driver [-] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Instance destroyed successfully.#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.161 227766 DEBUG nova.objects.instance [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'resources' on Instance uuid 8010f6fe-77ef-48ec-952f-a3a65186cd59 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.180 227766 DEBUG nova.virt.libvirt.vif [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:22:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1311975023',display_name='tempest-TestNetworkBasicOps-server-1311975023',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1311975023',id=158,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL/weReemtkH37O5IRYNNKB/TT2YWx12mLIlrues6ra7aWugRfActXRaloGeDbBIe3P4JQOsAt7qcWUIg9CugCzIu3x3QAkzyj6XRgUoI3pt8WnzsopEYd4m2Fn+OmRPbg==',key_name='tempest-TestNetworkBasicOps-1841453028',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:22:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-3yzv077h',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:22:48Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=8010f6fe-77ef-48ec-952f-a3a65186cd59,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.181 227766 DEBUG nova.network.os_vif_util [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.191", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.182 227766 DEBUG nova.network.os_vif_util [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8e:52:70,bridge_name='br-int',has_traffic_filtering=True,id=1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9,network=Network(16290d86-0a8d-403e-83f2-0ae47fb80e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fea5a6d-70') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.182 227766 DEBUG os_vif [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:52:70,bridge_name='br-int',has_traffic_filtering=True,id=1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9,network=Network(16290d86-0a8d-403e-83f2-0ae47fb80e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fea5a6d-70') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.184 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.184 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1fea5a6d-70, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.185 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.189 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.190 227766 INFO os_vif [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8e:52:70,bridge_name='br-int',has_traffic_filtering=True,id=1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9,network=Network(16290d86-0a8d-403e-83f2-0ae47fb80e5f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1fea5a6d-70')#033[00m
Jan 23 05:24:30 np0005593234 neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f[299684]: [NOTICE]   (299688) : haproxy version is 2.8.14-c23fe91
Jan 23 05:24:30 np0005593234 neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f[299684]: [NOTICE]   (299688) : path to executable is /usr/sbin/haproxy
Jan 23 05:24:30 np0005593234 neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f[299684]: [WARNING]  (299688) : Exiting Master process...
Jan 23 05:24:30 np0005593234 neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f[299684]: [ALERT]    (299688) : Current worker (299690) exited with code 143 (Terminated)
Jan 23 05:24:30 np0005593234 neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f[299684]: [WARNING]  (299688) : All workers exited. Exiting... (0)
Jan 23 05:24:30 np0005593234 systemd[1]: libpod-c22b809e32081aaff0f7b4816564fa3ac1ca2bcf26aa3c01836e39e2888d5fcb.scope: Deactivated successfully.
Jan 23 05:24:30 np0005593234 podman[301693]: 2026-01-23 10:24:30.257261361 +0000 UTC m=+0.055839486 container died c22b809e32081aaff0f7b4816564fa3ac1ca2bcf26aa3c01836e39e2888d5fcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:24:30 np0005593234 systemd[1]: var-lib-containers-storage-overlay-7233f00f7c1da591c032a081973e55dd4fc7f05eb49b31f769a4b069ecec4ac9-merged.mount: Deactivated successfully.
Jan 23 05:24:30 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c22b809e32081aaff0f7b4816564fa3ac1ca2bcf26aa3c01836e39e2888d5fcb-userdata-shm.mount: Deactivated successfully.
Jan 23 05:24:30 np0005593234 podman[301693]: 2026-01-23 10:24:30.293884159 +0000 UTC m=+0.092462284 container cleanup c22b809e32081aaff0f7b4816564fa3ac1ca2bcf26aa3c01836e39e2888d5fcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 05:24:30 np0005593234 systemd[1]: libpod-conmon-c22b809e32081aaff0f7b4816564fa3ac1ca2bcf26aa3c01836e39e2888d5fcb.scope: Deactivated successfully.
Jan 23 05:24:30 np0005593234 podman[301738]: 2026-01-23 10:24:30.353041887 +0000 UTC m=+0.040519060 container remove c22b809e32081aaff0f7b4816564fa3ac1ca2bcf26aa3c01836e39e2888d5fcb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:24:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:30.358 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c1b7bacd-686d-4fb5-a863-9b0dce1651e1]: (4, ('Fri Jan 23 10:24:30 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f (c22b809e32081aaff0f7b4816564fa3ac1ca2bcf26aa3c01836e39e2888d5fcb)\nc22b809e32081aaff0f7b4816564fa3ac1ca2bcf26aa3c01836e39e2888d5fcb\nFri Jan 23 10:24:30 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f (c22b809e32081aaff0f7b4816564fa3ac1ca2bcf26aa3c01836e39e2888d5fcb)\nc22b809e32081aaff0f7b4816564fa3ac1ca2bcf26aa3c01836e39e2888d5fcb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:30.360 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f626543c-1f90-40f4-96f0-0cb2299d6456]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:30.361 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap16290d86-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:24:30 np0005593234 kernel: tap16290d86-00: left promiscuous mode
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.363 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:30 np0005593234 nova_compute[227762]: 2026-01-23 10:24:30.376 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:30.379 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a1971a-0d9b-40ca-9a33-e069daaa6765]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:30.391 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[16d26254-2101-4af2-93a0-118358a1b8fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:30.392 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fc24e6f3-5f7d-4e42-8405-c8767744af44]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:30.406 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[06e45610-7f4f-4710-80a0-9799868ee696]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 766190, 'reachable_time': 30222, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 301753, 'error': None, 'target': 'ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:30.408 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-16290d86-0a8d-403e-83f2-0ae47fb80e5f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:24:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:30.408 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[9055e0d9-99bd-46dd-98bf-391868e7dbc8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:24:30 np0005593234 systemd[1]: run-netns-ovnmeta\x2d16290d86\x2d0a8d\x2d403e\x2d83f2\x2d0ae47fb80e5f.mount: Deactivated successfully.
Jan 23 05:24:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:30.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:31 np0005593234 nova_compute[227762]: 2026-01-23 10:24:31.791 227766 INFO nova.virt.libvirt.driver [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Deleting instance files /var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59_del#033[00m
Jan 23 05:24:31 np0005593234 nova_compute[227762]: 2026-01-23 10:24:31.792 227766 INFO nova.virt.libvirt.driver [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Deletion of /var/lib/nova/instances/8010f6fe-77ef-48ec-952f-a3a65186cd59_del complete#033[00m
Jan 23 05:24:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:32.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:32.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:32 np0005593234 nova_compute[227762]: 2026-01-23 10:24:32.898 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:33 np0005593234 nova_compute[227762]: 2026-01-23 10:24:33.709 227766 INFO nova.compute.manager [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Took 3.78 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:24:33 np0005593234 nova_compute[227762]: 2026-01-23 10:24:33.709 227766 DEBUG oslo.service.loopingcall [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:24:33 np0005593234 nova_compute[227762]: 2026-01-23 10:24:33.710 227766 DEBUG nova.compute.manager [-] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:24:33 np0005593234 nova_compute[227762]: 2026-01-23 10:24:33.710 227766 DEBUG nova.network.neutron [-] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:24:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:34.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:34.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:35 np0005593234 nova_compute[227762]: 2026-01-23 10:24:35.188 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:36.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:36 np0005593234 nova_compute[227762]: 2026-01-23 10:24:36.111 227766 DEBUG nova.network.neutron [req-08d5b601-9dfa-4028-8f0e-4702dedcb70e req-8ba56a9e-2ba9-40ad-ba26-89e3c3785ac6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updated VIF entry in instance network info cache for port 1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:24:36 np0005593234 nova_compute[227762]: 2026-01-23 10:24:36.111 227766 DEBUG nova.network.neutron [req-08d5b601-9dfa-4028-8f0e-4702dedcb70e req-8ba56a9e-2ba9-40ad-ba26-89e3c3785ac6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updating instance_info_cache with network_info: [{"id": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "address": "fa:16:3e:8e:52:70", "network": {"id": "16290d86-0a8d-403e-83f2-0ae47fb80e5f", "bridge": "br-int", "label": "tempest-network-smoke--1218687721", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1fea5a6d-70", "ovs_interfaceid": "1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:36.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:37 np0005593234 nova_compute[227762]: 2026-01-23 10:24:37.405 227766 DEBUG nova.compute.manager [req-bb896f7a-34a9-447c-a0d4-9dcc9b2f224a req-acca6da9-54af-4003-a86f-a748bf98bd27 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-vif-unplugged-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:37 np0005593234 nova_compute[227762]: 2026-01-23 10:24:37.406 227766 DEBUG oslo_concurrency.lockutils [req-bb896f7a-34a9-447c-a0d4-9dcc9b2f224a req-acca6da9-54af-4003-a86f-a748bf98bd27 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:37 np0005593234 nova_compute[227762]: 2026-01-23 10:24:37.406 227766 DEBUG oslo_concurrency.lockutils [req-bb896f7a-34a9-447c-a0d4-9dcc9b2f224a req-acca6da9-54af-4003-a86f-a748bf98bd27 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:37 np0005593234 nova_compute[227762]: 2026-01-23 10:24:37.407 227766 DEBUG oslo_concurrency.lockutils [req-bb896f7a-34a9-447c-a0d4-9dcc9b2f224a req-acca6da9-54af-4003-a86f-a748bf98bd27 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:37 np0005593234 nova_compute[227762]: 2026-01-23 10:24:37.407 227766 DEBUG nova.compute.manager [req-bb896f7a-34a9-447c-a0d4-9dcc9b2f224a req-acca6da9-54af-4003-a86f-a748bf98bd27 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] No waiting events found dispatching network-vif-unplugged-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:37 np0005593234 nova_compute[227762]: 2026-01-23 10:24:37.407 227766 DEBUG nova.compute.manager [req-bb896f7a-34a9-447c-a0d4-9dcc9b2f224a req-acca6da9-54af-4003-a86f-a748bf98bd27 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-vif-unplugged-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:24:37 np0005593234 nova_compute[227762]: 2026-01-23 10:24:37.900 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:38.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:38 np0005593234 podman[301809]: 2026-01-23 10:24:38.794802094 +0000 UTC m=+0.090697679 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:24:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:38.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:40.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:40 np0005593234 nova_compute[227762]: 2026-01-23 10:24:40.221 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:40 np0005593234 nova_compute[227762]: 2026-01-23 10:24:40.327 227766 DEBUG oslo_concurrency.lockutils [req-08d5b601-9dfa-4028-8f0e-4702dedcb70e req-8ba56a9e-2ba9-40ad-ba26-89e3c3785ac6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-8010f6fe-77ef-48ec-952f-a3a65186cd59" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:40 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:40Z|00078|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d9:d9:06 10.100.0.7
Jan 23 05:24:40 np0005593234 nova_compute[227762]: 2026-01-23 10:24:40.410 227766 DEBUG nova.compute.manager [req-e3827da9-679f-4b2f-9bb1-35acbe8dc722 req-ad407c05-0550-4775-a61a-c515bf6d284c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-vif-plugged-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:24:40 np0005593234 nova_compute[227762]: 2026-01-23 10:24:40.411 227766 DEBUG oslo_concurrency.lockutils [req-e3827da9-679f-4b2f-9bb1-35acbe8dc722 req-ad407c05-0550-4775-a61a-c515bf6d284c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:24:40 np0005593234 nova_compute[227762]: 2026-01-23 10:24:40.411 227766 DEBUG oslo_concurrency.lockutils [req-e3827da9-679f-4b2f-9bb1-35acbe8dc722 req-ad407c05-0550-4775-a61a-c515bf6d284c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:24:40 np0005593234 nova_compute[227762]: 2026-01-23 10:24:40.411 227766 DEBUG oslo_concurrency.lockutils [req-e3827da9-679f-4b2f-9bb1-35acbe8dc722 req-ad407c05-0550-4775-a61a-c515bf6d284c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:40 np0005593234 nova_compute[227762]: 2026-01-23 10:24:40.411 227766 DEBUG nova.compute.manager [req-e3827da9-679f-4b2f-9bb1-35acbe8dc722 req-ad407c05-0550-4775-a61a-c515bf6d284c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] No waiting events found dispatching network-vif-plugged-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:24:40 np0005593234 nova_compute[227762]: 2026-01-23 10:24:40.411 227766 WARNING nova.compute.manager [req-e3827da9-679f-4b2f-9bb1-35acbe8dc722 req-ad407c05-0550-4775-a61a-c515bf6d284c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received unexpected event network-vif-plugged-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:24:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:40.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:41 np0005593234 nova_compute[227762]: 2026-01-23 10:24:41.078 227766 DEBUG nova.network.neutron [-] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:24:41 np0005593234 nova_compute[227762]: 2026-01-23 10:24:41.398 227766 INFO nova.compute.manager [-] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Took 7.69 seconds to deallocate network for instance.
Jan 23 05:24:41 np0005593234 nova_compute[227762]: 2026-01-23 10:24:41.606 227766 DEBUG oslo_concurrency.lockutils [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:24:41 np0005593234 nova_compute[227762]: 2026-01-23 10:24:41.606 227766 DEBUG oslo_concurrency.lockutils [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:24:41 np0005593234 nova_compute[227762]: 2026-01-23 10:24:41.714 227766 DEBUG oslo_concurrency.processutils [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:24:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:42.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1521024957' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:24:42 np0005593234 nova_compute[227762]: 2026-01-23 10:24:42.159 227766 DEBUG oslo_concurrency.processutils [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:24:42 np0005593234 nova_compute[227762]: 2026-01-23 10:24:42.165 227766 DEBUG nova.compute.provider_tree [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:24:42 np0005593234 nova_compute[227762]: 2026-01-23 10:24:42.259 227766 DEBUG nova.scheduler.client.report [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:24:42 np0005593234 nova_compute[227762]: 2026-01-23 10:24:42.324 227766 DEBUG oslo_concurrency.lockutils [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.717s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:24:42 np0005593234 nova_compute[227762]: 2026-01-23 10:24:42.437 227766 INFO nova.scheduler.client.report [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Deleted allocations for instance 8010f6fe-77ef-48ec-952f-a3a65186cd59
Jan 23 05:24:42 np0005593234 nova_compute[227762]: 2026-01-23 10:24:42.655 227766 DEBUG nova.compute.manager [req-3e717140-a788-417b-8480-bd79fc196cf4 req-d5cf2508-aa46-4bfb-813e-89a88368bb4a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Received event network-vif-deleted-1fea5a6d-7081-4c0c-9b0b-103fdebcbcc9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:24:42 np0005593234 nova_compute[227762]: 2026-01-23 10:24:42.659 227766 DEBUG oslo_concurrency.lockutils [None req-b3384337-e908-436e-999b-df0955533167 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "8010f6fe-77ef-48ec-952f-a3a65186cd59" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 12.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #142. Immutable memtables: 0.
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.683090) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 89] Flushing memtable with next log file: 142
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163882683139, "job": 89, "event": "flush_started", "num_memtables": 1, "num_entries": 693, "num_deletes": 257, "total_data_size": 1140801, "memory_usage": 1158272, "flush_reason": "Manual Compaction"}
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 89] Level-0 flush table #143: started
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163882690613, "cf_name": "default", "job": 89, "event": "table_file_creation", "file_number": 143, "file_size": 752775, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 68835, "largest_seqno": 69523, "table_properties": {"data_size": 749273, "index_size": 1345, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8106, "raw_average_key_size": 19, "raw_value_size": 742106, "raw_average_value_size": 1766, "num_data_blocks": 58, "num_entries": 420, "num_filter_entries": 420, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163848, "oldest_key_time": 1769163848, "file_creation_time": 1769163882, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 143, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 89] Flush lasted 7566 microseconds, and 3357 cpu microseconds.
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.690656) [db/flush_job.cc:967] [default] [JOB 89] Level-0 flush table #143: 752775 bytes OK
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.690676) [db/memtable_list.cc:519] [default] Level-0 commit table #143 started
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.691991) [db/memtable_list.cc:722] [default] Level-0 commit table #143: memtable #1 done
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.692002) EVENT_LOG_v1 {"time_micros": 1769163882691998, "job": 89, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.692020) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 89] Try to delete WAL files size 1136989, prev total WAL file size 1136989, number of live WAL files 2.
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000139.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.692535) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032353131' seq:72057594037927935, type:22 .. '6C6F676D0032373633' seq:0, type:0; will stop at (end)
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 90] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 89 Base level 0, inputs: [143(735KB)], [141(9565KB)]
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163882692598, "job": 90, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [143], "files_L6": [141], "score": -1, "input_data_size": 10548293, "oldest_snapshot_seqno": -1}
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 90] Generated table #144: 8709 keys, 10400575 bytes, temperature: kUnknown
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163882755436, "cf_name": "default", "job": 90, "event": "table_file_creation", "file_number": 144, "file_size": 10400575, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10346377, "index_size": 31301, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21829, "raw_key_size": 230465, "raw_average_key_size": 26, "raw_value_size": 10195505, "raw_average_value_size": 1170, "num_data_blocks": 1192, "num_entries": 8709, "num_filter_entries": 8709, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769163882, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 144, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.755702) [db/compaction/compaction_job.cc:1663] [default] [JOB 90] Compacted 1@0 + 1@6 files to L6 => 10400575 bytes
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.756874) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 167.7 rd, 165.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 9.3 +0.0 blob) out(9.9 +0.0 blob), read-write-amplify(27.8) write-amplify(13.8) OK, records in: 9241, records dropped: 532 output_compression: NoCompression
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.756892) EVENT_LOG_v1 {"time_micros": 1769163882756883, "job": 90, "event": "compaction_finished", "compaction_time_micros": 62915, "compaction_time_cpu_micros": 23996, "output_level": 6, "num_output_files": 1, "total_output_size": 10400575, "num_input_records": 9241, "num_output_records": 8709, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000143.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163882757163, "job": 90, "event": "table_file_deletion", "file_number": 143}
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000141.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163882758874, "job": 90, "event": "table_file_deletion", "file_number": 141}
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.692449) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.758943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.758949) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.758950) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.758951) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:24:42.758953) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:24:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:42.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:42.859 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:24:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:42.860 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:24:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:42.860 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:24:42 np0005593234 nova_compute[227762]: 2026-01-23 10:24:42.901 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:24:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:44.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:44.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:45 np0005593234 nova_compute[227762]: 2026-01-23 10:24:45.630 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:24:45 np0005593234 nova_compute[227762]: 2026-01-23 10:24:45.631 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163870.1566725, 8010f6fe-77ef-48ec-952f-a3a65186cd59 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:24:45 np0005593234 nova_compute[227762]: 2026-01-23 10:24:45.631 227766 INFO nova.compute.manager [-] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] VM Stopped (Lifecycle Event)
Jan 23 05:24:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:46.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:46 np0005593234 nova_compute[227762]: 2026-01-23 10:24:46.549 227766 DEBUG nova.compute.manager [None req-15d5ab99-63d1-4495-a0cb-ae8bcccdd9aa - - - - - -] [instance: 8010f6fe-77ef-48ec-952f-a3a65186cd59] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:24:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:46.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:46 np0005593234 nova_compute[227762]: 2026-01-23 10:24:46.992 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:24:47 np0005593234 nova_compute[227762]: 2026-01-23 10:24:47.903 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:24:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:24:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:48.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:24:48 np0005593234 nova_compute[227762]: 2026-01-23 10:24:48.119 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:24:48 np0005593234 nova_compute[227762]: 2026-01-23 10:24:48.120 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:24:48 np0005593234 nova_compute[227762]: 2026-01-23 10:24:48.120 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:24:48 np0005593234 nova_compute[227762]: 2026-01-23 10:24:48.120 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 05:24:48 np0005593234 nova_compute[227762]: 2026-01-23 10:24:48.121 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:24:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:24:48 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/509132170' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:24:48 np0005593234 nova_compute[227762]: 2026-01-23 10:24:48.594 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:24:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:48.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:49 np0005593234 nova_compute[227762]: 2026-01-23 10:24:49.239 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:24:49 np0005593234 nova_compute[227762]: 2026-01-23 10:24:49.240 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:24:49 np0005593234 nova_compute[227762]: 2026-01-23 10:24:49.240 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:24:49 np0005593234 nova_compute[227762]: 2026-01-23 10:24:49.389 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:24:49 np0005593234 nova_compute[227762]: 2026-01-23 10:24:49.390 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4175MB free_disk=20.851360321044922GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 05:24:49 np0005593234 nova_compute[227762]: 2026-01-23 10:24:49.390 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:24:49 np0005593234 nova_compute[227762]: 2026-01-23 10:24:49.391 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:24:49 np0005593234 nova_compute[227762]: 2026-01-23 10:24:49.520 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance bd0fc955-63ff-41a4-b31b-369c2b584544 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:24:49 np0005593234 nova_compute[227762]: 2026-01-23 10:24:49.522 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:24:49 np0005593234 nova_compute[227762]: 2026-01-23 10:24:49.522 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:24:49 np0005593234 nova_compute[227762]: 2026-01-23 10:24:49.562 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:24:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:24:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1710291228' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:24:50 np0005593234 nova_compute[227762]: 2026-01-23 10:24:50.028 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:24:50 np0005593234 nova_compute[227762]: 2026-01-23 10:24:50.036 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:24:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:50.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:50 np0005593234 nova_compute[227762]: 2026-01-23 10:24:50.170 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:24:50 np0005593234 nova_compute[227762]: 2026-01-23 10:24:50.207 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:24:50 np0005593234 nova_compute[227762]: 2026-01-23 10:24:50.208 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.817s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:24:50 np0005593234 nova_compute[227762]: 2026-01-23 10:24:50.209 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:50 np0005593234 nova_compute[227762]: 2026-01-23 10:24:50.631 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:50.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:52.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:24:52Z|00683|binding|INFO|Releasing lport 8ca9fbcb-59f5-4006-84df-ab99827a2b39 from this chassis (sb_readonly=0)
Jan 23 05:24:52 np0005593234 nova_compute[227762]: 2026-01-23 10:24:52.170 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:52 np0005593234 nova_compute[227762]: 2026-01-23 10:24:52.172 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:52.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:52 np0005593234 nova_compute[227762]: 2026-01-23 10:24:52.905 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:53 np0005593234 nova_compute[227762]: 2026-01-23 10:24:53.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:53 np0005593234 nova_compute[227762]: 2026-01-23 10:24:53.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:24:53 np0005593234 nova_compute[227762]: 2026-01-23 10:24:53.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:24:53 np0005593234 nova_compute[227762]: 2026-01-23 10:24:53.968 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:24:53 np0005593234 nova_compute[227762]: 2026-01-23 10:24:53.968 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:24:53 np0005593234 nova_compute[227762]: 2026-01-23 10:24:53.968 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:24:53 np0005593234 nova_compute[227762]: 2026-01-23 10:24:53.969 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:24:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:54.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:54.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:55 np0005593234 nova_compute[227762]: 2026-01-23 10:24:55.152 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:55 np0005593234 nova_compute[227762]: 2026-01-23 10:24:55.633 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:56.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:56 np0005593234 nova_compute[227762]: 2026-01-23 10:24:56.090 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Updating instance_info_cache with network_info: [{"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:24:56 np0005593234 nova_compute[227762]: 2026-01-23 10:24:56.271 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:24:56 np0005593234 nova_compute[227762]: 2026-01-23 10:24:56.271 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:24:56 np0005593234 nova_compute[227762]: 2026-01-23 10:24:56.272 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:56 np0005593234 nova_compute[227762]: 2026-01-23 10:24:56.272 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:56.334 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=60, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=59) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:24:56 np0005593234 nova_compute[227762]: 2026-01-23 10:24:56.334 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:24:56.335 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:24:56 np0005593234 nova_compute[227762]: 2026-01-23 10:24:56.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:56.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:24:57 np0005593234 nova_compute[227762]: 2026-01-23 10:24:57.907 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:24:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:24:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:24:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:58.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:24:58 np0005593234 nova_compute[227762]: 2026-01-23 10:24:58.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:24:58 np0005593234 nova_compute[227762]: 2026-01-23 10:24:58.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:24:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:24:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:24:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:58.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:00.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:00 np0005593234 nova_compute[227762]: 2026-01-23 10:25:00.635 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:00 np0005593234 podman[301963]: 2026-01-23 10:25:00.763346958 +0000 UTC m=+0.057695274 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 23 05:25:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:00.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:00 np0005593234 nova_compute[227762]: 2026-01-23 10:25:00.957 227766 DEBUG oslo_concurrency.lockutils [None req-5271780e-9e5e-44fd-bc6f-0bf9981342da 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:00 np0005593234 nova_compute[227762]: 2026-01-23 10:25:00.957 227766 DEBUG oslo_concurrency.lockutils [None req-5271780e-9e5e-44fd-bc6f-0bf9981342da 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:00 np0005593234 nova_compute[227762]: 2026-01-23 10:25:00.984 227766 INFO nova.compute.manager [None req-5271780e-9e5e-44fd-bc6f-0bf9981342da 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Detaching volume 127a1e1e-4b4f-4404-b522-315ba62689fa#033[00m
Jan 23 05:25:01 np0005593234 nova_compute[227762]: 2026-01-23 10:25:01.215 227766 INFO nova.virt.block_device [None req-5271780e-9e5e-44fd-bc6f-0bf9981342da 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Attempting to driver detach volume 127a1e1e-4b4f-4404-b522-315ba62689fa from mountpoint /dev/vdb#033[00m
Jan 23 05:25:01 np0005593234 nova_compute[227762]: 2026-01-23 10:25:01.224 227766 DEBUG nova.virt.libvirt.driver [None req-5271780e-9e5e-44fd-bc6f-0bf9981342da 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Attempting to detach device vdb from instance bd0fc955-63ff-41a4-b31b-369c2b584544 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:25:01 np0005593234 nova_compute[227762]: 2026-01-23 10:25:01.225 227766 DEBUG nova.virt.libvirt.guest [None req-5271780e-9e5e-44fd-bc6f-0bf9981342da 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:25:01 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:25:01 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-127a1e1e-4b4f-4404-b522-315ba62689fa">
Jan 23 05:25:01 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:25:01 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:25:01 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:25:01 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:25:01 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:25:01 np0005593234 nova_compute[227762]:  <serial>127a1e1e-4b4f-4404-b522-315ba62689fa</serial>
Jan 23 05:25:01 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 23 05:25:01 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:25:01 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:25:01 np0005593234 nova_compute[227762]: 2026-01-23 10:25:01.236 227766 INFO nova.virt.libvirt.driver [None req-5271780e-9e5e-44fd-bc6f-0bf9981342da 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Successfully detached device vdb from instance bd0fc955-63ff-41a4-b31b-369c2b584544 from the persistent domain config.#033[00m
Jan 23 05:25:01 np0005593234 nova_compute[227762]: 2026-01-23 10:25:01.236 227766 DEBUG nova.virt.libvirt.driver [None req-5271780e-9e5e-44fd-bc6f-0bf9981342da 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance bd0fc955-63ff-41a4-b31b-369c2b584544 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:25:01 np0005593234 nova_compute[227762]: 2026-01-23 10:25:01.237 227766 DEBUG nova.virt.libvirt.guest [None req-5271780e-9e5e-44fd-bc6f-0bf9981342da 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:25:01 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:25:01 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-127a1e1e-4b4f-4404-b522-315ba62689fa">
Jan 23 05:25:01 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:25:01 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:25:01 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:25:01 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:25:01 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:25:01 np0005593234 nova_compute[227762]:  <serial>127a1e1e-4b4f-4404-b522-315ba62689fa</serial>
Jan 23 05:25:01 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 23 05:25:01 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:25:01 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:25:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:01.338 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '60'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:01 np0005593234 nova_compute[227762]: 2026-01-23 10:25:01.377 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769163901.3771646, bd0fc955-63ff-41a4-b31b-369c2b584544 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:25:01 np0005593234 nova_compute[227762]: 2026-01-23 10:25:01.381 227766 DEBUG nova.virt.libvirt.driver [None req-5271780e-9e5e-44fd-bc6f-0bf9981342da 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance bd0fc955-63ff-41a4-b31b-369c2b584544 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:25:01 np0005593234 nova_compute[227762]: 2026-01-23 10:25:01.383 227766 INFO nova.virt.libvirt.driver [None req-5271780e-9e5e-44fd-bc6f-0bf9981342da 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Successfully detached device vdb from instance bd0fc955-63ff-41a4-b31b-369c2b584544 from the live domain config.#033[00m
Jan 23 05:25:01 np0005593234 nova_compute[227762]: 2026-01-23 10:25:01.592 227766 DEBUG nova.objects.instance [None req-5271780e-9e5e-44fd-bc6f-0bf9981342da 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'flavor' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:01 np0005593234 nova_compute[227762]: 2026-01-23 10:25:01.628 227766 DEBUG oslo_concurrency.lockutils [None req-5271780e-9e5e-44fd-bc6f-0bf9981342da 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.671s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:02.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:02 np0005593234 nova_compute[227762]: 2026-01-23 10:25:02.589 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:02 np0005593234 nova_compute[227762]: 2026-01-23 10:25:02.694 227766 DEBUG oslo_concurrency.lockutils [None req-56f8d969-01d2-42d5-a106-3f1e4efdd0a1 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:02 np0005593234 nova_compute[227762]: 2026-01-23 10:25:02.695 227766 DEBUG oslo_concurrency.lockutils [None req-56f8d969-01d2-42d5-a106-3f1e4efdd0a1 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:02 np0005593234 nova_compute[227762]: 2026-01-23 10:25:02.695 227766 DEBUG nova.compute.manager [None req-56f8d969-01d2-42d5-a106-3f1e4efdd0a1 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:02 np0005593234 nova_compute[227762]: 2026-01-23 10:25:02.698 227766 DEBUG nova.compute.manager [None req-56f8d969-01d2-42d5-a106-3f1e4efdd0a1 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 23 05:25:02 np0005593234 nova_compute[227762]: 2026-01-23 10:25:02.699 227766 DEBUG nova.objects.instance [None req-56f8d969-01d2-42d5-a106-3f1e4efdd0a1 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'flavor' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:02 np0005593234 nova_compute[227762]: 2026-01-23 10:25:02.727 227766 DEBUG nova.virt.libvirt.driver [None req-56f8d969-01d2-42d5-a106-3f1e4efdd0a1 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:25:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:02.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:02 np0005593234 nova_compute[227762]: 2026-01-23 10:25:02.941 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:04.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:04.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:05 np0005593234 nova_compute[227762]: 2026-01-23 10:25:05.636 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:05 np0005593234 nova_compute[227762]: 2026-01-23 10:25:05.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:05 np0005593234 kernel: tapa4401398-6f (unregistering): left promiscuous mode
Jan 23 05:25:05 np0005593234 NetworkManager[48942]: <info>  [1769163905.8258] device (tapa4401398-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:25:05 np0005593234 ovn_controller[134547]: 2026-01-23T10:25:05Z|00684|binding|INFO|Releasing lport a4401398-6f7f-4595-b308-33a66a468a1f from this chassis (sb_readonly=0)
Jan 23 05:25:05 np0005593234 ovn_controller[134547]: 2026-01-23T10:25:05Z|00685|binding|INFO|Setting lport a4401398-6f7f-4595-b308-33a66a468a1f down in Southbound
Jan 23 05:25:05 np0005593234 ovn_controller[134547]: 2026-01-23T10:25:05Z|00686|binding|INFO|Removing iface tapa4401398-6f ovn-installed in OVS
Jan 23 05:25:05 np0005593234 nova_compute[227762]: 2026-01-23 10:25:05.838 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:05 np0005593234 nova_compute[227762]: 2026-01-23 10:25:05.842 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:05 np0005593234 nova_compute[227762]: 2026-01-23 10:25:05.865 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:05 np0005593234 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a2.scope: Deactivated successfully.
Jan 23 05:25:05 np0005593234 systemd[1]: machine-qemu\x2d76\x2dinstance\x2d000000a2.scope: Consumed 15.672s CPU time.
Jan 23 05:25:05 np0005593234 systemd-machined[195626]: Machine qemu-76-instance-000000a2 terminated.
Jan 23 05:25:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:06.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:06 np0005593234 nova_compute[227762]: 2026-01-23 10:25:06.746 227766 INFO nova.virt.libvirt.driver [None req-56f8d969-01d2-42d5-a106-3f1e4efdd0a1 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance shutdown successfully after 4 seconds.#033[00m
Jan 23 05:25:06 np0005593234 nova_compute[227762]: 2026-01-23 10:25:06.751 227766 INFO nova.virt.libvirt.driver [-] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance destroyed successfully.#033[00m
Jan 23 05:25:06 np0005593234 nova_compute[227762]: 2026-01-23 10:25:06.751 227766 DEBUG nova.objects.instance [None req-56f8d969-01d2-42d5-a106-3f1e4efdd0a1 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'numa_topology' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:06.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:07.485 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:d9:06 10.100.0.7'], port_security=['fa:16:3e:d9:d9:06 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'bd0fc955-63ff-41a4-b31b-369c2b584544', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1280650e-e283-4ddc-81aa-357640520155', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c7c25c6bb33b41bf9cd8febb8259fd87', 'neutron:revision_number': '6', 'neutron:security_group_ids': '3c721f45-9254-46f2-b17b-2aa67f5ce3fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.207', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4684203-7828-4ea2-86ad-83030eb9aefe, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=a4401398-6f7f-4595-b308-33a66a468a1f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:25:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:07.487 144381 INFO neutron.agent.ovn.metadata.agent [-] Port a4401398-6f7f-4595-b308-33a66a468a1f in datapath 1280650e-e283-4ddc-81aa-357640520155 unbound from our chassis#033[00m
Jan 23 05:25:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:07.490 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1280650e-e283-4ddc-81aa-357640520155, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:25:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:07.491 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8d5699e9-e811-4b59-ad78-e5d2e0f58acb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:07.492 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1280650e-e283-4ddc-81aa-357640520155 namespace which is not needed anymore#033[00m
Jan 23 05:25:07 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[301629]: [NOTICE]   (301636) : haproxy version is 2.8.14-c23fe91
Jan 23 05:25:07 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[301629]: [NOTICE]   (301636) : path to executable is /usr/sbin/haproxy
Jan 23 05:25:07 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[301629]: [WARNING]  (301636) : Exiting Master process...
Jan 23 05:25:07 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[301629]: [ALERT]    (301636) : Current worker (301638) exited with code 143 (Terminated)
Jan 23 05:25:07 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[301629]: [WARNING]  (301636) : All workers exited. Exiting... (0)
Jan 23 05:25:07 np0005593234 systemd[1]: libpod-fbeecb1db736975457ef19744f55ef8254ec6a0f83ea0970bdbcbc2c9eac6d08.scope: Deactivated successfully.
Jan 23 05:25:07 np0005593234 podman[302023]: 2026-01-23 10:25:07.665615129 +0000 UTC m=+0.052129131 container died fbeecb1db736975457ef19744f55ef8254ec6a0f83ea0970bdbcbc2c9eac6d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:25:07 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fbeecb1db736975457ef19744f55ef8254ec6a0f83ea0970bdbcbc2c9eac6d08-userdata-shm.mount: Deactivated successfully.
Jan 23 05:25:07 np0005593234 systemd[1]: var-lib-containers-storage-overlay-4a4483c2bc6fe15e66de44a62fc9281fe1bd72db7a58c52dc8e4cb11fde05f2b-merged.mount: Deactivated successfully.
Jan 23 05:25:07 np0005593234 podman[302023]: 2026-01-23 10:25:07.703850076 +0000 UTC m=+0.090364048 container cleanup fbeecb1db736975457ef19744f55ef8254ec6a0f83ea0970bdbcbc2c9eac6d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:25:07 np0005593234 systemd[1]: libpod-conmon-fbeecb1db736975457ef19744f55ef8254ec6a0f83ea0970bdbcbc2c9eac6d08.scope: Deactivated successfully.
Jan 23 05:25:07 np0005593234 podman[302051]: 2026-01-23 10:25:07.768747533 +0000 UTC m=+0.043253574 container remove fbeecb1db736975457ef19744f55ef8254ec6a0f83ea0970bdbcbc2c9eac6d08 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:25:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:07.774 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[87e2a113-f1ee-43d6-a46d-4b643387219b]: (4, ('Fri Jan 23 10:25:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155 (fbeecb1db736975457ef19744f55ef8254ec6a0f83ea0970bdbcbc2c9eac6d08)\nfbeecb1db736975457ef19744f55ef8254ec6a0f83ea0970bdbcbc2c9eac6d08\nFri Jan 23 10:25:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155 (fbeecb1db736975457ef19744f55ef8254ec6a0f83ea0970bdbcbc2c9eac6d08)\nfbeecb1db736975457ef19744f55ef8254ec6a0f83ea0970bdbcbc2c9eac6d08\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:07.776 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5017aa24-40ac-49d8-867c-187cc8831feb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:07.777 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1280650e-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:07 np0005593234 nova_compute[227762]: 2026-01-23 10:25:07.778 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:07 np0005593234 kernel: tap1280650e-e0: left promiscuous mode
Jan 23 05:25:07 np0005593234 nova_compute[227762]: 2026-01-23 10:25:07.799 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:07.802 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[45a25b8e-f81d-4666-8e34-2e3f12a5846e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:07.814 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3b5af88a-b7c7-4f93-b41a-6616817dd61d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:07.816 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4d0cabaa-5514-4850-b61c-bd51e12d1128]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:07.829 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aa3f7fb5-9b9f-4b26-8233-e47593992630]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 776128, 'reachable_time': 15575, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302070, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:07.831 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1280650e-e283-4ddc-81aa-357640520155 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:25:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:07.831 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f5c13f-9906-4a69-a7c4-34d62779a820]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:07 np0005593234 systemd[1]: run-netns-ovnmeta\x2d1280650e\x2de283\x2d4ddc\x2d81aa\x2d357640520155.mount: Deactivated successfully.
Jan 23 05:25:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:07 np0005593234 nova_compute[227762]: 2026-01-23 10:25:07.943 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:08.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:08 np0005593234 nova_compute[227762]: 2026-01-23 10:25:08.659 227766 DEBUG nova.compute.manager [None req-56f8d969-01d2-42d5-a106-3f1e4efdd0a1 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:08.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:09 np0005593234 nova_compute[227762]: 2026-01-23 10:25:09.355 227766 DEBUG oslo_concurrency.lockutils [None req-56f8d969-01d2-42d5-a106-3f1e4efdd0a1 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 6.660s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:09 np0005593234 nova_compute[227762]: 2026-01-23 10:25:09.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:09 np0005593234 podman[302073]: 2026-01-23 10:25:09.923839937 +0000 UTC m=+0.211304637 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:25:10 np0005593234 nova_compute[227762]: 2026-01-23 10:25:10.050 227766 DEBUG nova.compute.manager [req-8834d565-80d4-4551-b5e5-2f9c06de8ced req-c8889a46-6445-4723-bdf9-33346f17c7c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-vif-unplugged-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:10 np0005593234 nova_compute[227762]: 2026-01-23 10:25:10.051 227766 DEBUG oslo_concurrency.lockutils [req-8834d565-80d4-4551-b5e5-2f9c06de8ced req-c8889a46-6445-4723-bdf9-33346f17c7c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:10 np0005593234 nova_compute[227762]: 2026-01-23 10:25:10.052 227766 DEBUG oslo_concurrency.lockutils [req-8834d565-80d4-4551-b5e5-2f9c06de8ced req-c8889a46-6445-4723-bdf9-33346f17c7c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:10 np0005593234 nova_compute[227762]: 2026-01-23 10:25:10.053 227766 DEBUG oslo_concurrency.lockutils [req-8834d565-80d4-4551-b5e5-2f9c06de8ced req-c8889a46-6445-4723-bdf9-33346f17c7c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:10 np0005593234 nova_compute[227762]: 2026-01-23 10:25:10.053 227766 DEBUG nova.compute.manager [req-8834d565-80d4-4551-b5e5-2f9c06de8ced req-c8889a46-6445-4723-bdf9-33346f17c7c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] No waiting events found dispatching network-vif-unplugged-a4401398-6f7f-4595-b308-33a66a468a1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:25:10 np0005593234 nova_compute[227762]: 2026-01-23 10:25:10.054 227766 WARNING nova.compute.manager [req-8834d565-80d4-4551-b5e5-2f9c06de8ced req-c8889a46-6445-4723-bdf9-33346f17c7c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received unexpected event network-vif-unplugged-a4401398-6f7f-4595-b308-33a66a468a1f for instance with vm_state stopped and task_state None.#033[00m
Jan 23 05:25:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:10.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:10 np0005593234 nova_compute[227762]: 2026-01-23 10:25:10.639 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:10.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:25:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2458242056' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:25:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:25:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2458242056' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:25:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:12.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:12 np0005593234 nova_compute[227762]: 2026-01-23 10:25:12.766 227766 DEBUG nova.compute.manager [req-8e44a783-9da8-46ae-a454-e29eed5d8338 req-2c8d4cce-a0f2-4192-a1a1-9ccafae2b98c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:12 np0005593234 nova_compute[227762]: 2026-01-23 10:25:12.766 227766 DEBUG oslo_concurrency.lockutils [req-8e44a783-9da8-46ae-a454-e29eed5d8338 req-2c8d4cce-a0f2-4192-a1a1-9ccafae2b98c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:12 np0005593234 nova_compute[227762]: 2026-01-23 10:25:12.766 227766 DEBUG oslo_concurrency.lockutils [req-8e44a783-9da8-46ae-a454-e29eed5d8338 req-2c8d4cce-a0f2-4192-a1a1-9ccafae2b98c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:12 np0005593234 nova_compute[227762]: 2026-01-23 10:25:12.767 227766 DEBUG oslo_concurrency.lockutils [req-8e44a783-9da8-46ae-a454-e29eed5d8338 req-2c8d4cce-a0f2-4192-a1a1-9ccafae2b98c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:12 np0005593234 nova_compute[227762]: 2026-01-23 10:25:12.767 227766 DEBUG nova.compute.manager [req-8e44a783-9da8-46ae-a454-e29eed5d8338 req-2c8d4cce-a0f2-4192-a1a1-9ccafae2b98c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] No waiting events found dispatching network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:25:12 np0005593234 nova_compute[227762]: 2026-01-23 10:25:12.767 227766 WARNING nova.compute.manager [req-8e44a783-9da8-46ae-a454-e29eed5d8338 req-2c8d4cce-a0f2-4192-a1a1-9ccafae2b98c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received unexpected event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f for instance with vm_state stopped and task_state None.#033[00m
Jan 23 05:25:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:12.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:12 np0005593234 nova_compute[227762]: 2026-01-23 10:25:12.945 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:13 np0005593234 nova_compute[227762]: 2026-01-23 10:25:13.595 227766 DEBUG nova.objects.instance [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'flavor' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:13 np0005593234 nova_compute[227762]: 2026-01-23 10:25:13.627 227766 DEBUG oslo_concurrency.lockutils [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:25:13 np0005593234 nova_compute[227762]: 2026-01-23 10:25:13.627 227766 DEBUG oslo_concurrency.lockutils [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquired lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:25:13 np0005593234 nova_compute[227762]: 2026-01-23 10:25:13.628 227766 DEBUG nova.network.neutron [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:25:13 np0005593234 nova_compute[227762]: 2026-01-23 10:25:13.628 227766 DEBUG nova.objects.instance [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'info_cache' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:14.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:14.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:15 np0005593234 nova_compute[227762]: 2026-01-23 10:25:15.640 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:16.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:16.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e322 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:17 np0005593234 nova_compute[227762]: 2026-01-23 10:25:17.945 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:17 np0005593234 nova_compute[227762]: 2026-01-23 10:25:17.991 227766 DEBUG nova.network.neutron [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Updating instance_info_cache with network_info: [{"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.020 227766 DEBUG oslo_concurrency.lockutils [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Releasing lock "refresh_cache-bd0fc955-63ff-41a4-b31b-369c2b584544" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.047 227766 INFO nova.virt.libvirt.driver [-] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance destroyed successfully.#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.047 227766 DEBUG nova.objects.instance [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'numa_topology' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.065 227766 DEBUG nova.objects.instance [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'resources' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:18.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.117 227766 DEBUG nova.virt.libvirt.vif [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:23:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-59893283',display_name='tempest-AttachVolumeTestJSON-server-59893283',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-59893283',id=162,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFhliJMNa+yXuf0fSBz/3snMSJFr8dgTPMjo/Saadc9lk7SWpQL9azGzggupkC6Qe7ig8O16HGbub4k3l0C30EmW82whCbHcGBU51cs0XAqEAsopgWQJtLRZgtshG7dxog==',key_name='tempest-keypair-1909141305',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:23:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c7c25c6bb33b41bf9cd8febb8259fd87',ramdisk_id='',reservation_id='r-gp4q9qzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-345871886',owner_user_name='tempest-AttachVolumeTestJSON-345871886-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:25:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='01b7396ecc574dd6ba2df2f406921223',uuid=bd0fc955-63ff-41a4-b31b-369c2b584544,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.117 227766 DEBUG nova.network.os_vif_util [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converting VIF {"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.118 227766 DEBUG nova.network.os_vif_util [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.118 227766 DEBUG os_vif [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.119 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.120 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa4401398-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.121 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.122 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.124 227766 INFO os_vif [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f')#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.131 227766 DEBUG nova.virt.libvirt.driver [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Start _get_guest_xml network_info=[{"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.135 227766 WARNING nova.virt.libvirt.driver [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.144 227766 DEBUG nova.virt.libvirt.host [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.145 227766 DEBUG nova.virt.libvirt.host [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.148 227766 DEBUG nova.virt.libvirt.host [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.149 227766 DEBUG nova.virt.libvirt.host [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.149 227766 DEBUG nova.virt.libvirt.driver [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.150 227766 DEBUG nova.virt.hardware [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.150 227766 DEBUG nova.virt.hardware [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.150 227766 DEBUG nova.virt.hardware [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.150 227766 DEBUG nova.virt.hardware [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.151 227766 DEBUG nova.virt.hardware [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.151 227766 DEBUG nova.virt.hardware [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.151 227766 DEBUG nova.virt.hardware [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.151 227766 DEBUG nova.virt.hardware [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.151 227766 DEBUG nova.virt.hardware [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.151 227766 DEBUG nova.virt.hardware [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.152 227766 DEBUG nova.virt.hardware [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.152 227766 DEBUG nova.objects.instance [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'vcpu_model' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.210 227766 DEBUG oslo_concurrency.processutils [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:25:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2032987589' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.618 227766 DEBUG oslo_concurrency.processutils [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.408s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:18 np0005593234 nova_compute[227762]: 2026-01-23 10:25:18.675 227766 DEBUG oslo_concurrency.processutils [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:18.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:25:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/543761693' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.655 227766 DEBUG oslo_concurrency.processutils [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.981s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.658 227766 DEBUG nova.virt.libvirt.vif [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:23:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-59893283',display_name='tempest-AttachVolumeTestJSON-server-59893283',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-59893283',id=162,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFhliJMNa+yXuf0fSBz/3snMSJFr8dgTPMjo/Saadc9lk7SWpQL9azGzggupkC6Qe7ig8O16HGbub4k3l0C30EmW82whCbHcGBU51cs0XAqEAsopgWQJtLRZgtshG7dxog==',key_name='tempest-keypair-1909141305',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:23:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c7c25c6bb33b41bf9cd8febb8259fd87',ramdisk_id='',reservation_id='r-gp4q9qzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-345871886',owner_user_name='tempest-AttachVolumeTestJSON-345871886-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:25:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='01b7396ecc574dd6ba2df2f406921223',uuid=bd0fc955-63ff-41a4-b31b-369c2b584544,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.658 227766 DEBUG nova.network.os_vif_util [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converting VIF {"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.659 227766 DEBUG nova.network.os_vif_util [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.660 227766 DEBUG nova.objects.instance [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'pci_devices' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.679 227766 DEBUG nova.virt.libvirt.driver [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  <uuid>bd0fc955-63ff-41a4-b31b-369c2b584544</uuid>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  <name>instance-000000a2</name>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <nova:name>tempest-AttachVolumeTestJSON-server-59893283</nova:name>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:25:18</nova:creationTime>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <nova:user uuid="01b7396ecc574dd6ba2df2f406921223">tempest-AttachVolumeTestJSON-345871886-project-member</nova:user>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <nova:project uuid="c7c25c6bb33b41bf9cd8febb8259fd87">tempest-AttachVolumeTestJSON-345871886</nova:project>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <nova:port uuid="a4401398-6f7f-4595-b308-33a66a468a1f">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <entry name="serial">bd0fc955-63ff-41a4-b31b-369c2b584544</entry>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <entry name="uuid">bd0fc955-63ff-41a4-b31b-369c2b584544</entry>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/bd0fc955-63ff-41a4-b31b-369c2b584544_disk">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/bd0fc955-63ff-41a4-b31b-369c2b584544_disk.config">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:d9:d9:06"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <target dev="tapa4401398-6f"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/bd0fc955-63ff-41a4-b31b-369c2b584544/console.log" append="off"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <input type="keyboard" bus="usb"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:25:19 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:25:19 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:25:19 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:25:19 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.681 227766 DEBUG nova.virt.libvirt.driver [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.681 227766 DEBUG nova.virt.libvirt.driver [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] skipping disk for instance-000000a2 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.681 227766 DEBUG nova.virt.libvirt.vif [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:23:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-59893283',display_name='tempest-AttachVolumeTestJSON-server-59893283',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-59893283',id=162,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFhliJMNa+yXuf0fSBz/3snMSJFr8dgTPMjo/Saadc9lk7SWpQL9azGzggupkC6Qe7ig8O16HGbub4k3l0C30EmW82whCbHcGBU51cs0XAqEAsopgWQJtLRZgtshG7dxog==',key_name='tempest-keypair-1909141305',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:23:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='c7c25c6bb33b41bf9cd8febb8259fd87',ramdisk_id='',reservation_id='r-gp4q9qzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-345871886',owner_user_name='tempest-AttachVolumeTestJSON-345871886-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:25:09Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='01b7396ecc574dd6ba2df2f406921223',uuid=bd0fc955-63ff-41a4-b31b-369c2b584544,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.682 227766 DEBUG nova.network.os_vif_util [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converting VIF {"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.682 227766 DEBUG nova.network.os_vif_util [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.683 227766 DEBUG os_vif [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.683 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.684 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.684 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.686 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.686 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa4401398-6f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.687 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa4401398-6f, col_values=(('external_ids', {'iface-id': 'a4401398-6f7f-4595-b308-33a66a468a1f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:d9:06', 'vm-uuid': 'bd0fc955-63ff-41a4-b31b-369c2b584544'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.688 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:19 np0005593234 NetworkManager[48942]: <info>  [1769163919.6899] manager: (tapa4401398-6f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/328)
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.691 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.694 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.696 227766 INFO os_vif [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f')#033[00m
Jan 23 05:25:19 np0005593234 kernel: tapa4401398-6f: entered promiscuous mode
Jan 23 05:25:19 np0005593234 NetworkManager[48942]: <info>  [1769163919.7654] manager: (tapa4401398-6f): new Tun device (/org/freedesktop/NetworkManager/Devices/329)
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.766 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:19 np0005593234 ovn_controller[134547]: 2026-01-23T10:25:19Z|00687|binding|INFO|Claiming lport a4401398-6f7f-4595-b308-33a66a468a1f for this chassis.
Jan 23 05:25:19 np0005593234 ovn_controller[134547]: 2026-01-23T10:25:19Z|00688|binding|INFO|a4401398-6f7f-4595-b308-33a66a468a1f: Claiming fa:16:3e:d9:d9:06 10.100.0.7
Jan 23 05:25:19 np0005593234 ovn_controller[134547]: 2026-01-23T10:25:19Z|00689|binding|INFO|Setting lport a4401398-6f7f-4595-b308-33a66a468a1f ovn-installed in OVS
Jan 23 05:25:19 np0005593234 ovn_controller[134547]: 2026-01-23T10:25:19Z|00690|binding|INFO|Setting lport a4401398-6f7f-4595-b308-33a66a468a1f up in Southbound
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.778 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:d9:06 10.100.0.7'], port_security=['fa:16:3e:d9:d9:06 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'bd0fc955-63ff-41a4-b31b-369c2b584544', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1280650e-e283-4ddc-81aa-357640520155', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c7c25c6bb33b41bf9cd8febb8259fd87', 'neutron:revision_number': '7', 'neutron:security_group_ids': '3c721f45-9254-46f2-b17b-2aa67f5ce3fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.207'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4684203-7828-4ea2-86ad-83030eb9aefe, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=a4401398-6f7f-4595-b308-33a66a468a1f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.780 144381 INFO neutron.agent.ovn.metadata.agent [-] Port a4401398-6f7f-4595-b308-33a66a468a1f in datapath 1280650e-e283-4ddc-81aa-357640520155 bound to our chassis#033[00m
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.781 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1280650e-e283-4ddc-81aa-357640520155#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.781 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.783 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:19 np0005593234 systemd-udevd[302231]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.794 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fe3420b3-fd9f-4a32-b13e-4d0350e9bc11]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.795 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1280650e-e1 in ovnmeta-1280650e-e283-4ddc-81aa-357640520155 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.797 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1280650e-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.797 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6462203c-323b-41bf-a224-91cded40f1a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.798 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8512cd39-7d7f-4b9f-9903-6896adbdbd8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:19 np0005593234 systemd-machined[195626]: New machine qemu-77-instance-000000a2.
Jan 23 05:25:19 np0005593234 NetworkManager[48942]: <info>  [1769163919.8073] device (tapa4401398-6f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:25:19 np0005593234 NetworkManager[48942]: <info>  [1769163919.8079] device (tapa4401398-6f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.810 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[16a63504-1cfb-4d62-9bdf-4586d1248379]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:19 np0005593234 systemd[1]: Started Virtual Machine qemu-77-instance-000000a2.
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.832 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e4c01675-b86a-4831-9b97-c4502cc36f3e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.860 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[6bafec12-ff11-4bd8-8d3e-6895e99d0d7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:19 np0005593234 NetworkManager[48942]: <info>  [1769163919.8661] manager: (tap1280650e-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/330)
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.865 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f32091c3-8e87-43aa-919a-5684076f147e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:19 np0005593234 systemd-udevd[302235]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.895 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[66c73f94-33f4-495e-a05f-c5bf4a06b0d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.897 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ffcacf3f-61e0-4729-a84e-1b75f3d8adb5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:19 np0005593234 NetworkManager[48942]: <info>  [1769163919.9191] device (tap1280650e-e0): carrier: link connected
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.924 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b6606579-2ae8-4b1d-b3fc-873682f2e927]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.940 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[42fd155f-2404-4142-a21b-d3ff14ba7e57]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1280650e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:5b:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 781487, 'reachable_time': 24586, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302264, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.955 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[81c68062-b7dc-4d99-b6ac-9eafc592b6bf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feea:5b3e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 781487, 'tstamp': 781487}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 302265, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.960 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:19.971 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[67737a47-bf74-432a-b6c2-062bf3cb6c5c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1280650e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:5b:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 214], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 781487, 'reachable_time': 24586, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 302266, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.985 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Triggering sync for uuid bd0fc955-63ff-41a4-b31b-369c2b584544 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.985 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.986 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.986 227766 INFO nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 23 05:25:19 np0005593234 nova_compute[227762]: 2026-01-23 10:25:19.986 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:20.003 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[84eefba0-7017-4aaf-bed2-194aff54de55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:20.056 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[799c29d5-92bf-4d15-a0b7-d000d337e600]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:20.058 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1280650e-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:20.059 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:20.059 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1280650e-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:20 np0005593234 NetworkManager[48942]: <info>  [1769163920.0624] manager: (tap1280650e-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/331)
Jan 23 05:25:20 np0005593234 kernel: tap1280650e-e0: entered promiscuous mode
Jan 23 05:25:20 np0005593234 nova_compute[227762]: 2026-01-23 10:25:20.061 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:20.065 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1280650e-e0, col_values=(('external_ids', {'iface-id': '8ca9fbcb-59f5-4006-84df-ab99827a2b39'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:20 np0005593234 nova_compute[227762]: 2026-01-23 10:25:20.066 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:20 np0005593234 nova_compute[227762]: 2026-01-23 10:25:20.067 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:25:20Z|00691|binding|INFO|Releasing lport 8ca9fbcb-59f5-4006-84df-ab99827a2b39 from this chassis (sb_readonly=0)
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:20.068 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1280650e-e283-4ddc-81aa-357640520155.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1280650e-e283-4ddc-81aa-357640520155.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:20.074 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1ce50638-a55d-49d7-acb5-a4e70cc66fd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:20.075 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-1280650e-e283-4ddc-81aa-357640520155
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/1280650e-e283-4ddc-81aa-357640520155.pid.haproxy
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 1280650e-e283-4ddc-81aa-357640520155
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:25:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:20.078 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'env', 'PROCESS_TAG=haproxy-1280650e-e283-4ddc-81aa-357640520155', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1280650e-e283-4ddc-81aa-357640520155.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:25:20 np0005593234 nova_compute[227762]: 2026-01-23 10:25:20.082 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:20.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:20 np0005593234 nova_compute[227762]: 2026-01-23 10:25:20.460 227766 DEBUG nova.compute.manager [req-894b4513-9534-48d8-84c6-e1ce6c32fb07 req-9feccf48-6d80-4919-96a0-3b24a878cc7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:20 np0005593234 nova_compute[227762]: 2026-01-23 10:25:20.460 227766 DEBUG oslo_concurrency.lockutils [req-894b4513-9534-48d8-84c6-e1ce6c32fb07 req-9feccf48-6d80-4919-96a0-3b24a878cc7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:20 np0005593234 nova_compute[227762]: 2026-01-23 10:25:20.460 227766 DEBUG oslo_concurrency.lockutils [req-894b4513-9534-48d8-84c6-e1ce6c32fb07 req-9feccf48-6d80-4919-96a0-3b24a878cc7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:20 np0005593234 nova_compute[227762]: 2026-01-23 10:25:20.460 227766 DEBUG oslo_concurrency.lockutils [req-894b4513-9534-48d8-84c6-e1ce6c32fb07 req-9feccf48-6d80-4919-96a0-3b24a878cc7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:20 np0005593234 nova_compute[227762]: 2026-01-23 10:25:20.460 227766 DEBUG nova.compute.manager [req-894b4513-9534-48d8-84c6-e1ce6c32fb07 req-9feccf48-6d80-4919-96a0-3b24a878cc7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] No waiting events found dispatching network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:25:20 np0005593234 nova_compute[227762]: 2026-01-23 10:25:20.461 227766 WARNING nova.compute.manager [req-894b4513-9534-48d8-84c6-e1ce6c32fb07 req-9feccf48-6d80-4919-96a0-3b24a878cc7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received unexpected event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 23 05:25:20 np0005593234 podman[302300]: 2026-01-23 10:25:20.485040308 +0000 UTC m=+0.055109683 container create 6399cd1a01f1a9115abdc4496fd9a50ddeb7226be1f0c1d05e2e9aa86da0d966 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:25:20 np0005593234 systemd[1]: Started libpod-conmon-6399cd1a01f1a9115abdc4496fd9a50ddeb7226be1f0c1d05e2e9aa86da0d966.scope.
Jan 23 05:25:20 np0005593234 podman[302300]: 2026-01-23 10:25:20.456816521 +0000 UTC m=+0.026885916 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:25:20 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:25:20 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef4d34486f722c2a0878656b39536da0d3a8baf13b68436d099a28bcfb613688/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:25:20 np0005593234 podman[302300]: 2026-01-23 10:25:20.576634644 +0000 UTC m=+0.146704069 container init 6399cd1a01f1a9115abdc4496fd9a50ddeb7226be1f0c1d05e2e9aa86da0d966 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 23 05:25:20 np0005593234 podman[302300]: 2026-01-23 10:25:20.582260888 +0000 UTC m=+0.152330273 container start 6399cd1a01f1a9115abdc4496fd9a50ddeb7226be1f0c1d05e2e9aa86da0d966 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:25:20 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[302316]: [NOTICE]   (302338) : New worker (302340) forked
Jan 23 05:25:20 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[302316]: [NOTICE]   (302338) : Loading success.
Jan 23 05:25:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:20.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:21 np0005593234 nova_compute[227762]: 2026-01-23 10:25:21.092 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163906.0909493, bd0fc955-63ff-41a4-b31b-369c2b584544 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:25:21 np0005593234 nova_compute[227762]: 2026-01-23 10:25:21.092 227766 INFO nova.compute.manager [-] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:25:21 np0005593234 nova_compute[227762]: 2026-01-23 10:25:21.312 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for bd0fc955-63ff-41a4-b31b-369c2b584544 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:25:21 np0005593234 nova_compute[227762]: 2026-01-23 10:25:21.313 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163921.3122115, bd0fc955-63ff-41a4-b31b-369c2b584544 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:25:21 np0005593234 nova_compute[227762]: 2026-01-23 10:25:21.313 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:25:21 np0005593234 nova_compute[227762]: 2026-01-23 10:25:21.315 227766 DEBUG nova.compute.manager [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:25:21 np0005593234 nova_compute[227762]: 2026-01-23 10:25:21.319 227766 INFO nova.virt.libvirt.driver [-] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance rebooted successfully.#033[00m
Jan 23 05:25:21 np0005593234 nova_compute[227762]: 2026-01-23 10:25:21.319 227766 DEBUG nova.compute.manager [None req-91588f92-6286-412c-b7a8-621cd486c606 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:21 np0005593234 nova_compute[227762]: 2026-01-23 10:25:21.970 227766 DEBUG nova.compute.manager [None req-aa0ea035-8c83-4810-ba28-dba21afe288d - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:22.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e323 e323: 3 total, 3 up, 3 in
Jan 23 05:25:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:25:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:25:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:25:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:25:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:25:22 np0005593234 nova_compute[227762]: 2026-01-23 10:25:22.893 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.026000816s ======
Jan 23 05:25:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:22.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.026000816s
Jan 23 05:25:22 np0005593234 nova_compute[227762]: 2026-01-23 10:25:22.897 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:25:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:22 np0005593234 nova_compute[227762]: 2026-01-23 10:25:22.947 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:23 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:25:23 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:25:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:24.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:24 np0005593234 nova_compute[227762]: 2026-01-23 10:25:24.516 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163921.3130202, bd0fc955-63ff-41a4-b31b-369c2b584544 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:25:24 np0005593234 nova_compute[227762]: 2026-01-23 10:25:24.517 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] VM Started (Lifecycle Event)#033[00m
Jan 23 05:25:24 np0005593234 nova_compute[227762]: 2026-01-23 10:25:24.580 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:25:24 np0005593234 nova_compute[227762]: 2026-01-23 10:25:24.583 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:25:24 np0005593234 nova_compute[227762]: 2026-01-23 10:25:24.689 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:24.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:25 np0005593234 nova_compute[227762]: 2026-01-23 10:25:25.095 227766 DEBUG nova.compute.manager [req-baaecdee-915d-4d58-94fb-cb77908d4a56 req-0976d78c-1921-492d-b633-5de5e7a0a48a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:25 np0005593234 nova_compute[227762]: 2026-01-23 10:25:25.095 227766 DEBUG oslo_concurrency.lockutils [req-baaecdee-915d-4d58-94fb-cb77908d4a56 req-0976d78c-1921-492d-b633-5de5e7a0a48a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:25 np0005593234 nova_compute[227762]: 2026-01-23 10:25:25.096 227766 DEBUG oslo_concurrency.lockutils [req-baaecdee-915d-4d58-94fb-cb77908d4a56 req-0976d78c-1921-492d-b633-5de5e7a0a48a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:25 np0005593234 nova_compute[227762]: 2026-01-23 10:25:25.096 227766 DEBUG oslo_concurrency.lockutils [req-baaecdee-915d-4d58-94fb-cb77908d4a56 req-0976d78c-1921-492d-b633-5de5e7a0a48a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:25 np0005593234 nova_compute[227762]: 2026-01-23 10:25:25.096 227766 DEBUG nova.compute.manager [req-baaecdee-915d-4d58-94fb-cb77908d4a56 req-0976d78c-1921-492d-b633-5de5e7a0a48a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] No waiting events found dispatching network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:25:25 np0005593234 nova_compute[227762]: 2026-01-23 10:25:25.096 227766 WARNING nova.compute.manager [req-baaecdee-915d-4d58-94fb-cb77908d4a56 req-0976d78c-1921-492d-b633-5de5e7a0a48a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received unexpected event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f for instance with vm_state active and task_state None.#033[00m
Jan 23 05:25:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:26.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:26.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e323 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:27 np0005593234 nova_compute[227762]: 2026-01-23 10:25:27.951 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:28.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:28.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:25:29Z|00692|binding|INFO|Releasing lport 8ca9fbcb-59f5-4006-84df-ab99827a2b39 from this chassis (sb_readonly=0)
Jan 23 05:25:29 np0005593234 nova_compute[227762]: 2026-01-23 10:25:29.437 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:29 np0005593234 nova_compute[227762]: 2026-01-23 10:25:29.692 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:30.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:25:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:25:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:30.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:31 np0005593234 podman[302681]: 2026-01-23 10:25:31.788081312 +0000 UTC m=+0.067900661 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:25:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:32.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e324 e324: 3 total, 3 up, 3 in
Jan 23 05:25:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:32.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:32 np0005593234 nova_compute[227762]: 2026-01-23 10:25:32.952 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:33 np0005593234 ovn_controller[134547]: 2026-01-23T10:25:33Z|00079|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d9:d9:06 10.100.0.7
Jan 23 05:25:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:34.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:34 np0005593234 nova_compute[227762]: 2026-01-23 10:25:34.731 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:34.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:36.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:36.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:37 np0005593234 nova_compute[227762]: 2026-01-23 10:25:37.955 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:38.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:38.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:39 np0005593234 nova_compute[227762]: 2026-01-23 10:25:39.747 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:40.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:40 np0005593234 podman[302758]: 2026-01-23 10:25:40.530006325 +0000 UTC m=+0.075320921 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:25:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:40.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.675 227766 DEBUG oslo_concurrency.lockutils [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.675 227766 DEBUG oslo_concurrency.lockutils [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.676 227766 DEBUG oslo_concurrency.lockutils [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.676 227766 DEBUG oslo_concurrency.lockutils [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.676 227766 DEBUG oslo_concurrency.lockutils [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.677 227766 INFO nova.compute.manager [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Terminating instance#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.678 227766 DEBUG nova.compute.manager [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:25:41 np0005593234 kernel: tapa4401398-6f (unregistering): left promiscuous mode
Jan 23 05:25:41 np0005593234 NetworkManager[48942]: <info>  [1769163941.7511] device (tapa4401398-6f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.758 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:25:41Z|00693|binding|INFO|Releasing lport a4401398-6f7f-4595-b308-33a66a468a1f from this chassis (sb_readonly=0)
Jan 23 05:25:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:25:41Z|00694|binding|INFO|Setting lport a4401398-6f7f-4595-b308-33a66a468a1f down in Southbound
Jan 23 05:25:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:25:41Z|00695|binding|INFO|Removing iface tapa4401398-6f ovn-installed in OVS
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.760 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:41.767 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:d9:06 10.100.0.7'], port_security=['fa:16:3e:d9:d9:06 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'bd0fc955-63ff-41a4-b31b-369c2b584544', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1280650e-e283-4ddc-81aa-357640520155', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c7c25c6bb33b41bf9cd8febb8259fd87', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3c721f45-9254-46f2-b17b-2aa67f5ce3fb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.207', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4684203-7828-4ea2-86ad-83030eb9aefe, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=a4401398-6f7f-4595-b308-33a66a468a1f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:25:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:41.771 144381 INFO neutron.agent.ovn.metadata.agent [-] Port a4401398-6f7f-4595-b308-33a66a468a1f in datapath 1280650e-e283-4ddc-81aa-357640520155 unbound from our chassis#033[00m
Jan 23 05:25:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:41.773 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1280650e-e283-4ddc-81aa-357640520155, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:25:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:41.776 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ed005474-81d7-469a-b1dd-d614b1adc703]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:41.777 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1280650e-e283-4ddc-81aa-357640520155 namespace which is not needed anymore#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.783 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:41 np0005593234 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a2.scope: Deactivated successfully.
Jan 23 05:25:41 np0005593234 systemd[1]: machine-qemu\x2d77\x2dinstance\x2d000000a2.scope: Consumed 14.070s CPU time.
Jan 23 05:25:41 np0005593234 systemd-machined[195626]: Machine qemu-77-instance-000000a2 terminated.
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.898 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:41 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[302316]: [NOTICE]   (302338) : haproxy version is 2.8.14-c23fe91
Jan 23 05:25:41 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[302316]: [NOTICE]   (302338) : path to executable is /usr/sbin/haproxy
Jan 23 05:25:41 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[302316]: [WARNING]  (302338) : Exiting Master process...
Jan 23 05:25:41 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[302316]: [ALERT]    (302338) : Current worker (302340) exited with code 143 (Terminated)
Jan 23 05:25:41 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[302316]: [WARNING]  (302338) : All workers exited. Exiting... (0)
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.902 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:41 np0005593234 systemd[1]: libpod-6399cd1a01f1a9115abdc4496fd9a50ddeb7226be1f0c1d05e2e9aa86da0d966.scope: Deactivated successfully.
Jan 23 05:25:41 np0005593234 podman[302809]: 2026-01-23 10:25:41.910686027 +0000 UTC m=+0.043954208 container died 6399cd1a01f1a9115abdc4496fd9a50ddeb7226be1f0c1d05e2e9aa86da0d966 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.915 227766 INFO nova.virt.libvirt.driver [-] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Instance destroyed successfully.#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.916 227766 DEBUG nova.objects.instance [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'resources' on Instance uuid bd0fc955-63ff-41a4-b31b-369c2b584544 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.933 227766 DEBUG nova.virt.libvirt.vif [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:23:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-59893283',display_name='tempest-AttachVolumeTestJSON-server-59893283',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-59893283',id=162,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFhliJMNa+yXuf0fSBz/3snMSJFr8dgTPMjo/Saadc9lk7SWpQL9azGzggupkC6Qe7ig8O16HGbub4k3l0C30EmW82whCbHcGBU51cs0XAqEAsopgWQJtLRZgtshG7dxog==',key_name='tempest-keypair-1909141305',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:23:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c7c25c6bb33b41bf9cd8febb8259fd87',ramdisk_id='',reservation_id='r-gp4q9qzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-345871886',owner_user_name='tempest-AttachVolumeTestJSON-345871886-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:25:23Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='01b7396ecc574dd6ba2df2f406921223',uuid=bd0fc955-63ff-41a4-b31b-369c2b584544,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.934 227766 DEBUG nova.network.os_vif_util [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converting VIF {"id": "a4401398-6f7f-4595-b308-33a66a468a1f", "address": "fa:16:3e:d9:d9:06", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.207", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa4401398-6f", "ovs_interfaceid": "a4401398-6f7f-4595-b308-33a66a468a1f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.934 227766 DEBUG nova.network.os_vif_util [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.935 227766 DEBUG os_vif [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:25:41 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6399cd1a01f1a9115abdc4496fd9a50ddeb7226be1f0c1d05e2e9aa86da0d966-userdata-shm.mount: Deactivated successfully.
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.939 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.940 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa4401398-6f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:41 np0005593234 systemd[1]: var-lib-containers-storage-overlay-ef4d34486f722c2a0878656b39536da0d3a8baf13b68436d099a28bcfb613688-merged.mount: Deactivated successfully.
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.942 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.943 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:41 np0005593234 nova_compute[227762]: 2026-01-23 10:25:41.946 227766 INFO os_vif [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:d9:06,bridge_name='br-int',has_traffic_filtering=True,id=a4401398-6f7f-4595-b308-33a66a468a1f,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa4401398-6f')#033[00m
Jan 23 05:25:41 np0005593234 podman[302809]: 2026-01-23 10:25:41.950009829 +0000 UTC m=+0.083278010 container cleanup 6399cd1a01f1a9115abdc4496fd9a50ddeb7226be1f0c1d05e2e9aa86da0d966 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 05:25:41 np0005593234 systemd[1]: libpod-conmon-6399cd1a01f1a9115abdc4496fd9a50ddeb7226be1f0c1d05e2e9aa86da0d966.scope: Deactivated successfully.
Jan 23 05:25:42 np0005593234 podman[302855]: 2026-01-23 10:25:42.013403708 +0000 UTC m=+0.041916703 container remove 6399cd1a01f1a9115abdc4496fd9a50ddeb7226be1f0c1d05e2e9aa86da0d966 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:25:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:42.020 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d92aa7a2-9607-4358-8b74-ec4f9078dd9c]: (4, ('Fri Jan 23 10:25:41 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155 (6399cd1a01f1a9115abdc4496fd9a50ddeb7226be1f0c1d05e2e9aa86da0d966)\n6399cd1a01f1a9115abdc4496fd9a50ddeb7226be1f0c1d05e2e9aa86da0d966\nFri Jan 23 10:25:41 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155 (6399cd1a01f1a9115abdc4496fd9a50ddeb7226be1f0c1d05e2e9aa86da0d966)\n6399cd1a01f1a9115abdc4496fd9a50ddeb7226be1f0c1d05e2e9aa86da0d966\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:42.022 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b468aa59-4e6e-4382-a673-ea86aa9cfe88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:42.024 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1280650e-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.026 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:42 np0005593234 kernel: tap1280650e-e0: left promiscuous mode
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.045 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:42.049 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d0cb8aa5-8dda-4fb8-8493-71de7fb4d362]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:42.066 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6399dd49-65a6-431b-91a5-60ef30cd2dbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:42.067 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aa8dac5a-82d5-4d4f-9aae-85eb00644879]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:42.081 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[15daa64c-9108-497d-93d8-ea4299bd88e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 781481, 'reachable_time': 22793, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 302878, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:42.084 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1280650e-e283-4ddc-81aa-357640520155 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:25:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:42.084 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[0761e668-6ca9-47da-b0d5-1e0ae0e47b25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:25:42 np0005593234 systemd[1]: run-netns-ovnmeta\x2d1280650e\x2de283\x2d4ddc\x2d81aa\x2d357640520155.mount: Deactivated successfully.
Jan 23 05:25:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:25:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:42.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.399 227766 INFO nova.virt.libvirt.driver [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Deleting instance files /var/lib/nova/instances/bd0fc955-63ff-41a4-b31b-369c2b584544_del#033[00m
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.400 227766 INFO nova.virt.libvirt.driver [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Deletion of /var/lib/nova/instances/bd0fc955-63ff-41a4-b31b-369c2b584544_del complete#033[00m
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.455 227766 INFO nova.compute.manager [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Took 0.78 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.455 227766 DEBUG oslo.service.loopingcall [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.456 227766 DEBUG nova.compute.manager [-] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.456 227766 DEBUG nova.network.neutron [-] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.771 227766 DEBUG nova.compute.manager [req-6620f0db-d586-45ed-b612-cc68e2a8832b req-e0599356-76a0-41fb-85c6-bc0f968c1236 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-vif-unplugged-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.772 227766 DEBUG oslo_concurrency.lockutils [req-6620f0db-d586-45ed-b612-cc68e2a8832b req-e0599356-76a0-41fb-85c6-bc0f968c1236 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.773 227766 DEBUG oslo_concurrency.lockutils [req-6620f0db-d586-45ed-b612-cc68e2a8832b req-e0599356-76a0-41fb-85c6-bc0f968c1236 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.773 227766 DEBUG oslo_concurrency.lockutils [req-6620f0db-d586-45ed-b612-cc68e2a8832b req-e0599356-76a0-41fb-85c6-bc0f968c1236 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.773 227766 DEBUG nova.compute.manager [req-6620f0db-d586-45ed-b612-cc68e2a8832b req-e0599356-76a0-41fb-85c6-bc0f968c1236 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] No waiting events found dispatching network-vif-unplugged-a4401398-6f7f-4595-b308-33a66a468a1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.774 227766 DEBUG nova.compute.manager [req-6620f0db-d586-45ed-b612-cc68e2a8832b req-e0599356-76a0-41fb-85c6-bc0f968c1236 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-vif-unplugged-a4401398-6f7f-4595-b308-33a66a468a1f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:25:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:42.860 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:42.860 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:42.861 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:42 np0005593234 nova_compute[227762]: 2026-01-23 10:25:42.957 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:42.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:43 np0005593234 nova_compute[227762]: 2026-01-23 10:25:43.501 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:43 np0005593234 nova_compute[227762]: 2026-01-23 10:25:43.565 227766 DEBUG nova.network.neutron [-] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:25:43 np0005593234 nova_compute[227762]: 2026-01-23 10:25:43.596 227766 INFO nova.compute.manager [-] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Took 1.14 seconds to deallocate network for instance.#033[00m
Jan 23 05:25:43 np0005593234 nova_compute[227762]: 2026-01-23 10:25:43.653 227766 DEBUG oslo_concurrency.lockutils [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:43 np0005593234 nova_compute[227762]: 2026-01-23 10:25:43.654 227766 DEBUG oslo_concurrency.lockutils [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:43 np0005593234 nova_compute[227762]: 2026-01-23 10:25:43.687 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:43 np0005593234 nova_compute[227762]: 2026-01-23 10:25:43.701 227766 DEBUG nova.compute.manager [req-af1bd100-6023-4767-b6e7-60bf3a7b2e87 req-ed99797d-5aaa-41fb-a76c-156aab73989c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-vif-deleted-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:43 np0005593234 nova_compute[227762]: 2026-01-23 10:25:43.760 227766 DEBUG oslo_concurrency.processutils [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:44.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:25:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1589239683' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:44 np0005593234 nova_compute[227762]: 2026-01-23 10:25:44.218 227766 DEBUG oslo_concurrency.processutils [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:44 np0005593234 nova_compute[227762]: 2026-01-23 10:25:44.225 227766 DEBUG nova.compute.provider_tree [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:25:44 np0005593234 nova_compute[227762]: 2026-01-23 10:25:44.245 227766 DEBUG nova.scheduler.client.report [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:25:44 np0005593234 nova_compute[227762]: 2026-01-23 10:25:44.271 227766 DEBUG oslo_concurrency.lockutils [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.617s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:44 np0005593234 nova_compute[227762]: 2026-01-23 10:25:44.334 227766 INFO nova.scheduler.client.report [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Deleted allocations for instance bd0fc955-63ff-41a4-b31b-369c2b584544#033[00m
Jan 23 05:25:44 np0005593234 nova_compute[227762]: 2026-01-23 10:25:44.444 227766 DEBUG oslo_concurrency.lockutils [None req-b1fb0848-5b34-4844-8a1b-ee1d9e9a2f42 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.768s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:44 np0005593234 nova_compute[227762]: 2026-01-23 10:25:44.921 227766 DEBUG nova.compute.manager [req-84b257f7-68d0-45ab-a4e3-d81201ed12e5 req-5f776140-da84-4c95-a543-62786ed8a911 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:25:44 np0005593234 nova_compute[227762]: 2026-01-23 10:25:44.922 227766 DEBUG oslo_concurrency.lockutils [req-84b257f7-68d0-45ab-a4e3-d81201ed12e5 req-5f776140-da84-4c95-a543-62786ed8a911 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:44 np0005593234 nova_compute[227762]: 2026-01-23 10:25:44.922 227766 DEBUG oslo_concurrency.lockutils [req-84b257f7-68d0-45ab-a4e3-d81201ed12e5 req-5f776140-da84-4c95-a543-62786ed8a911 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:44 np0005593234 nova_compute[227762]: 2026-01-23 10:25:44.922 227766 DEBUG oslo_concurrency.lockutils [req-84b257f7-68d0-45ab-a4e3-d81201ed12e5 req-5f776140-da84-4c95-a543-62786ed8a911 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "bd0fc955-63ff-41a4-b31b-369c2b584544-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:44 np0005593234 nova_compute[227762]: 2026-01-23 10:25:44.922 227766 DEBUG nova.compute.manager [req-84b257f7-68d0-45ab-a4e3-d81201ed12e5 req-5f776140-da84-4c95-a543-62786ed8a911 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] No waiting events found dispatching network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:25:44 np0005593234 nova_compute[227762]: 2026-01-23 10:25:44.923 227766 WARNING nova.compute.manager [req-84b257f7-68d0-45ab-a4e3-d81201ed12e5 req-5f776140-da84-4c95-a543-62786ed8a911 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Received unexpected event network-vif-plugged-a4401398-6f7f-4595-b308-33a66a468a1f for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:25:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:44.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:46.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:46 np0005593234 nova_compute[227762]: 2026-01-23 10:25:46.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:46 np0005593234 nova_compute[227762]: 2026-01-23 10:25:46.770 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:46 np0005593234 nova_compute[227762]: 2026-01-23 10:25:46.771 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:46 np0005593234 nova_compute[227762]: 2026-01-23 10:25:46.771 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:46 np0005593234 nova_compute[227762]: 2026-01-23 10:25:46.771 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:25:46 np0005593234 nova_compute[227762]: 2026-01-23 10:25:46.771 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:46 np0005593234 nova_compute[227762]: 2026-01-23 10:25:46.943 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:46.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:25:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2046212888' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:47 np0005593234 nova_compute[227762]: 2026-01-23 10:25:47.196 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:47 np0005593234 nova_compute[227762]: 2026-01-23 10:25:47.355 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:25:47 np0005593234 nova_compute[227762]: 2026-01-23 10:25:47.356 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4398MB free_disk=20.921573638916016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:25:47 np0005593234 nova_compute[227762]: 2026-01-23 10:25:47.356 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:47 np0005593234 nova_compute[227762]: 2026-01-23 10:25:47.357 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:47 np0005593234 nova_compute[227762]: 2026-01-23 10:25:47.435 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:25:47 np0005593234 nova_compute[227762]: 2026-01-23 10:25:47.436 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:25:47 np0005593234 nova_compute[227762]: 2026-01-23 10:25:47.456 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:25:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2173456230' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:47 np0005593234 nova_compute[227762]: 2026-01-23 10:25:47.910 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:47 np0005593234 nova_compute[227762]: 2026-01-23 10:25:47.915 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:25:47 np0005593234 nova_compute[227762]: 2026-01-23 10:25:47.934 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:25:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:47 np0005593234 nova_compute[227762]: 2026-01-23 10:25:47.960 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:47 np0005593234 nova_compute[227762]: 2026-01-23 10:25:47.963 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:25:47 np0005593234 nova_compute[227762]: 2026-01-23 10:25:47.963 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:48.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:48.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:49 np0005593234 nova_compute[227762]: 2026-01-23 10:25:49.964 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:50.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:50.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:51 np0005593234 nova_compute[227762]: 2026-01-23 10:25:51.945 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:52.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:52 np0005593234 nova_compute[227762]: 2026-01-23 10:25:52.962 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:25:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:52.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:54.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:54 np0005593234 nova_compute[227762]: 2026-01-23 10:25:54.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:54.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:55 np0005593234 nova_compute[227762]: 2026-01-23 10:25:55.481 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "12b37be9-93a2-4e10-9056-68a743ed2673" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:55 np0005593234 nova_compute[227762]: 2026-01-23 10:25:55.482 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:55 np0005593234 nova_compute[227762]: 2026-01-23 10:25:55.501 227766 DEBUG nova.compute.manager [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:25:55 np0005593234 nova_compute[227762]: 2026-01-23 10:25:55.610 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:25:55 np0005593234 nova_compute[227762]: 2026-01-23 10:25:55.610 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:25:55 np0005593234 nova_compute[227762]: 2026-01-23 10:25:55.615 227766 DEBUG nova.virt.hardware [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:25:55 np0005593234 nova_compute[227762]: 2026-01-23 10:25:55.615 227766 INFO nova.compute.claims [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:25:55 np0005593234 nova_compute[227762]: 2026-01-23 10:25:55.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:25:55 np0005593234 nova_compute[227762]: 2026-01-23 10:25:55.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:25:55 np0005593234 nova_compute[227762]: 2026-01-23 10:25:55.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:25:55 np0005593234 nova_compute[227762]: 2026-01-23 10:25:55.766 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:25:55 np0005593234 nova_compute[227762]: 2026-01-23 10:25:55.766 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:25:55 np0005593234 nova_compute[227762]: 2026-01-23 10:25:55.777 227766 DEBUG oslo_concurrency.processutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:25:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:56.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:25:56 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2642236723' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.209 227766 DEBUG oslo_concurrency.processutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.213 227766 DEBUG nova.compute.provider_tree [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.231 227766 DEBUG nova.scheduler.client.report [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.255 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.256 227766 DEBUG nova.compute.manager [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.312 227766 DEBUG nova.compute.manager [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.313 227766 DEBUG nova.network.neutron [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.335 227766 INFO nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.356 227766 DEBUG nova.compute.manager [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 05:25:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:56.467 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=61, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=60) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:25:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:25:56.468 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.468 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.537 227766 DEBUG nova.compute.manager [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.538 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.538 227766 INFO nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Creating image(s)
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.567 227766 DEBUG nova.storage.rbd_utils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] rbd image 12b37be9-93a2-4e10-9056-68a743ed2673_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.594 227766 DEBUG nova.storage.rbd_utils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] rbd image 12b37be9-93a2-4e10-9056-68a743ed2673_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.617 227766 DEBUG nova.storage.rbd_utils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] rbd image 12b37be9-93a2-4e10-9056-68a743ed2673_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.620 227766 DEBUG oslo_concurrency.processutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.686 227766 DEBUG oslo_concurrency.processutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.687 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.687 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.688 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.709 227766 DEBUG nova.storage.rbd_utils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] rbd image 12b37be9-93a2-4e10-9056-68a743ed2673_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.712 227766 DEBUG oslo_concurrency.processutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 12b37be9-93a2-4e10-9056-68a743ed2673_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.914 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163941.913742, bd0fc955-63ff-41a4-b31b-369c2b584544 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.914 227766 INFO nova.compute.manager [-] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] VM Stopped (Lifecycle Event)
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.943 227766 DEBUG nova.compute.manager [None req-f1e0cab5-0169-4991-a635-47a0b764102b - - - - - -] [instance: bd0fc955-63ff-41a4-b31b-369c2b584544] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.946 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:25:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:56.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:56 np0005593234 nova_compute[227762]: 2026-01-23 10:25:56.986 227766 DEBUG oslo_concurrency.processutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 12b37be9-93a2-4e10-9056-68a743ed2673_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.274s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:25:57 np0005593234 nova_compute[227762]: 2026-01-23 10:25:57.060 227766 DEBUG nova.storage.rbd_utils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] resizing rbd image 12b37be9-93a2-4e10-9056-68a743ed2673_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 05:25:57 np0005593234 nova_compute[227762]: 2026-01-23 10:25:57.173 227766 DEBUG nova.objects.instance [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'migration_context' on Instance uuid 12b37be9-93a2-4e10-9056-68a743ed2673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:25:57 np0005593234 nova_compute[227762]: 2026-01-23 10:25:57.204 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 05:25:57 np0005593234 nova_compute[227762]: 2026-01-23 10:25:57.204 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Ensure instance console log exists: /var/lib/nova/instances/12b37be9-93a2-4e10-9056-68a743ed2673/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 05:25:57 np0005593234 nova_compute[227762]: 2026-01-23 10:25:57.204 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:25:57 np0005593234 nova_compute[227762]: 2026-01-23 10:25:57.205 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:25:57 np0005593234 nova_compute[227762]: 2026-01-23 10:25:57.205 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:25:57 np0005593234 nova_compute[227762]: 2026-01-23 10:25:57.254 227766 DEBUG nova.policy [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '01b7396ecc574dd6ba2df2f406921223', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c7c25c6bb33b41bf9cd8febb8259fd87', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 05:25:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:25:57 np0005593234 nova_compute[227762]: 2026-01-23 10:25:57.964 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:25:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:25:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:58.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:25:58 np0005593234 nova_compute[227762]: 2026-01-23 10:25:58.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:25:58 np0005593234 nova_compute[227762]: 2026-01-23 10:25:58.763 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:25:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:25:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:25:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:58.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:25:59 np0005593234 nova_compute[227762]: 2026-01-23 10:25:59.510 227766 DEBUG nova.network.neutron [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Successfully created port: 33699e18-9d87-4a6b-9145-84572bf07525 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 05:26:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:00.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:00 np0005593234 nova_compute[227762]: 2026-01-23 10:26:00.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:26:00 np0005593234 nova_compute[227762]: 2026-01-23 10:26:00.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:26:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:00.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:01 np0005593234 nova_compute[227762]: 2026-01-23 10:26:01.948 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:02.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:02.470 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '61'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:26:02 np0005593234 podman[303197]: 2026-01-23 10:26:02.771367399 +0000 UTC m=+0.061549923 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 05:26:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:02 np0005593234 nova_compute[227762]: 2026-01-23 10:26:02.966 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:02.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:04.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:04.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:05 np0005593234 nova_compute[227762]: 2026-01-23 10:26:05.906 227766 DEBUG nova.network.neutron [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Successfully updated port: 33699e18-9d87-4a6b-9145-84572bf07525 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 05:26:05 np0005593234 nova_compute[227762]: 2026-01-23 10:26:05.939 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "refresh_cache-12b37be9-93a2-4e10-9056-68a743ed2673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:26:05 np0005593234 nova_compute[227762]: 2026-01-23 10:26:05.939 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquired lock "refresh_cache-12b37be9-93a2-4e10-9056-68a743ed2673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:26:05 np0005593234 nova_compute[227762]: 2026-01-23 10:26:05.940 227766 DEBUG nova.network.neutron [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 05:26:06 np0005593234 nova_compute[227762]: 2026-01-23 10:26:06.099 227766 DEBUG nova.compute.manager [req-2e38fb54-29fd-49fe-b6f6-5adeebcc58e2 req-aaed9c22-b68f-4ac1-96f4-0a88a2a03d47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Received event network-changed-33699e18-9d87-4a6b-9145-84572bf07525 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:26:06 np0005593234 nova_compute[227762]: 2026-01-23 10:26:06.099 227766 DEBUG nova.compute.manager [req-2e38fb54-29fd-49fe-b6f6-5adeebcc58e2 req-aaed9c22-b68f-4ac1-96f4-0a88a2a03d47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Refreshing instance network info cache due to event network-changed-33699e18-9d87-4a6b-9145-84572bf07525. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:26:06 np0005593234 nova_compute[227762]: 2026-01-23 10:26:06.099 227766 DEBUG oslo_concurrency.lockutils [req-2e38fb54-29fd-49fe-b6f6-5adeebcc58e2 req-aaed9c22-b68f-4ac1-96f4-0a88a2a03d47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-12b37be9-93a2-4e10-9056-68a743ed2673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:26:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:06.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:06 np0005593234 nova_compute[227762]: 2026-01-23 10:26:06.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:26:06 np0005593234 nova_compute[227762]: 2026-01-23 10:26:06.784 227766 DEBUG nova.network.neutron [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 05:26:06 np0005593234 nova_compute[227762]: 2026-01-23 10:26:06.950 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:06.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:07 np0005593234 nova_compute[227762]: 2026-01-23 10:26:07.969 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:08.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:08.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:10.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:10 np0005593234 podman[303223]: 2026-01-23 10:26:10.814719088 +0000 UTC m=+0.111097913 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.902 227766 DEBUG nova.network.neutron [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Updating instance_info_cache with network_info: [{"id": "33699e18-9d87-4a6b-9145-84572bf07525", "address": "fa:16:3e:09:3c:ad", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33699e18-9d", "ovs_interfaceid": "33699e18-9d87-4a6b-9145-84572bf07525", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.942 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Releasing lock "refresh_cache-12b37be9-93a2-4e10-9056-68a743ed2673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.942 227766 DEBUG nova.compute.manager [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Instance network_info: |[{"id": "33699e18-9d87-4a6b-9145-84572bf07525", "address": "fa:16:3e:09:3c:ad", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33699e18-9d", "ovs_interfaceid": "33699e18-9d87-4a6b-9145-84572bf07525", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.943 227766 DEBUG oslo_concurrency.lockutils [req-2e38fb54-29fd-49fe-b6f6-5adeebcc58e2 req-aaed9c22-b68f-4ac1-96f4-0a88a2a03d47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-12b37be9-93a2-4e10-9056-68a743ed2673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.943 227766 DEBUG nova.network.neutron [req-2e38fb54-29fd-49fe-b6f6-5adeebcc58e2 req-aaed9c22-b68f-4ac1-96f4-0a88a2a03d47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Refreshing network info cache for port 33699e18-9d87-4a6b-9145-84572bf07525 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.945 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Start _get_guest_xml network_info=[{"id": "33699e18-9d87-4a6b-9145-84572bf07525", "address": "fa:16:3e:09:3c:ad", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33699e18-9d", "ovs_interfaceid": "33699e18-9d87-4a6b-9145-84572bf07525", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.950 227766 WARNING nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.959 227766 DEBUG nova.virt.libvirt.host [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.961 227766 DEBUG nova.virt.libvirt.host [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.966 227766 DEBUG nova.virt.libvirt.host [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.967 227766 DEBUG nova.virt.libvirt.host [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.970 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.970 227766 DEBUG nova.virt.hardware [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.970 227766 DEBUG nova.virt.hardware [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.970 227766 DEBUG nova.virt.hardware [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.971 227766 DEBUG nova.virt.hardware [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.971 227766 DEBUG nova.virt.hardware [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.971 227766 DEBUG nova.virt.hardware [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.971 227766 DEBUG nova.virt.hardware [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.971 227766 DEBUG nova.virt.hardware [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.972 227766 DEBUG nova.virt.hardware [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.972 227766 DEBUG nova.virt.hardware [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.972 227766 DEBUG nova.virt.hardware [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:26:10 np0005593234 nova_compute[227762]: 2026-01-23 10:26:10.975 227766 DEBUG oslo_concurrency.processutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:10.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:26:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1300651057' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.388 227766 DEBUG oslo_concurrency.processutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.412 227766 DEBUG nova.storage.rbd_utils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] rbd image 12b37be9-93a2-4e10-9056-68a743ed2673_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.416 227766 DEBUG oslo_concurrency.processutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:26:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/31623512' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.840 227766 DEBUG oslo_concurrency.processutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.842 227766 DEBUG nova.virt.libvirt.vif [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:25:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1294604962',display_name='tempest-AttachVolumeTestJSON-server-1294604962',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-1294604962',id=166,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIl3PJnyMfFo6c2aSbcndGrUeCJDPvPLm0yGb1s4ShCt8tulHq0w/NQxJ8e+7tYOXA+JcLABr8r+WehowojFcIvgwp+tPzjk76SVFT14Seq+GvrBIJWX3DZtL3BrePr6pw==',key_name='tempest-keypair-1905896331',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c7c25c6bb33b41bf9cd8febb8259fd87',ramdisk_id='',reservation_id='r-292m7c9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-345871886',owner_user_name='tempest-AttachVolumeTestJSON-345871886-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:25:56Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='01b7396ecc574dd6ba2df2f406921223',uuid=12b37be9-93a2-4e10-9056-68a743ed2673,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "33699e18-9d87-4a6b-9145-84572bf07525", "address": "fa:16:3e:09:3c:ad", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33699e18-9d", "ovs_interfaceid": "33699e18-9d87-4a6b-9145-84572bf07525", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.842 227766 DEBUG nova.network.os_vif_util [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converting VIF {"id": "33699e18-9d87-4a6b-9145-84572bf07525", "address": "fa:16:3e:09:3c:ad", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33699e18-9d", "ovs_interfaceid": "33699e18-9d87-4a6b-9145-84572bf07525", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.843 227766 DEBUG nova.network.os_vif_util [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:3c:ad,bridge_name='br-int',has_traffic_filtering=True,id=33699e18-9d87-4a6b-9145-84572bf07525,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33699e18-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.844 227766 DEBUG nova.objects.instance [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'pci_devices' on Instance uuid 12b37be9-93a2-4e10-9056-68a743ed2673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.859 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  <uuid>12b37be9-93a2-4e10-9056-68a743ed2673</uuid>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  <name>instance-000000a6</name>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <nova:name>tempest-AttachVolumeTestJSON-server-1294604962</nova:name>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:26:10</nova:creationTime>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <nova:user uuid="01b7396ecc574dd6ba2df2f406921223">tempest-AttachVolumeTestJSON-345871886-project-member</nova:user>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <nova:project uuid="c7c25c6bb33b41bf9cd8febb8259fd87">tempest-AttachVolumeTestJSON-345871886</nova:project>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <nova:port uuid="33699e18-9d87-4a6b-9145-84572bf07525">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <entry name="serial">12b37be9-93a2-4e10-9056-68a743ed2673</entry>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <entry name="uuid">12b37be9-93a2-4e10-9056-68a743ed2673</entry>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/12b37be9-93a2-4e10-9056-68a743ed2673_disk">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/12b37be9-93a2-4e10-9056-68a743ed2673_disk.config">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:09:3c:ad"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <target dev="tap33699e18-9d"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/12b37be9-93a2-4e10-9056-68a743ed2673/console.log" append="off"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:26:11 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:26:11 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:26:11 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:26:11 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.861 227766 DEBUG nova.compute.manager [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Preparing to wait for external event network-vif-plugged-33699e18-9d87-4a6b-9145-84572bf07525 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.861 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.861 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.861 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.862 227766 DEBUG nova.virt.libvirt.vif [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:25:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1294604962',display_name='tempest-AttachVolumeTestJSON-server-1294604962',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-1294604962',id=166,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIl3PJnyMfFo6c2aSbcndGrUeCJDPvPLm0yGb1s4ShCt8tulHq0w/NQxJ8e+7tYOXA+JcLABr8r+WehowojFcIvgwp+tPzjk76SVFT14Seq+GvrBIJWX3DZtL3BrePr6pw==',key_name='tempest-keypair-1905896331',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c7c25c6bb33b41bf9cd8febb8259fd87',ramdisk_id='',reservation_id='r-292m7c9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeTestJSON-345871886',owner_user_name='tempest-AttachVolumeTestJSON-345871886-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:25:56Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='01b7396ecc574dd6ba2df2f406921223',uuid=12b37be9-93a2-4e10-9056-68a743ed2673,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "33699e18-9d87-4a6b-9145-84572bf07525", "address": "fa:16:3e:09:3c:ad", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33699e18-9d", "ovs_interfaceid": "33699e18-9d87-4a6b-9145-84572bf07525", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.862 227766 DEBUG nova.network.os_vif_util [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converting VIF {"id": "33699e18-9d87-4a6b-9145-84572bf07525", "address": "fa:16:3e:09:3c:ad", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33699e18-9d", "ovs_interfaceid": "33699e18-9d87-4a6b-9145-84572bf07525", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.863 227766 DEBUG nova.network.os_vif_util [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:09:3c:ad,bridge_name='br-int',has_traffic_filtering=True,id=33699e18-9d87-4a6b-9145-84572bf07525,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33699e18-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.863 227766 DEBUG os_vif [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:3c:ad,bridge_name='br-int',has_traffic_filtering=True,id=33699e18-9d87-4a6b-9145-84572bf07525,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33699e18-9d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.864 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.864 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.865 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.867 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.867 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33699e18-9d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.867 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap33699e18-9d, col_values=(('external_ids', {'iface-id': '33699e18-9d87-4a6b-9145-84572bf07525', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:09:3c:ad', 'vm-uuid': '12b37be9-93a2-4e10-9056-68a743ed2673'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:11 np0005593234 NetworkManager[48942]: <info>  [1769163971.8702] manager: (tap33699e18-9d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/332)
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.870 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.874 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.875 227766 INFO os_vif [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:09:3c:ad,bridge_name='br-int',has_traffic_filtering=True,id=33699e18-9d87-4a6b-9145-84572bf07525,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33699e18-9d')#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.919 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.919 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.919 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No VIF found with MAC fa:16:3e:09:3c:ad, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.920 227766 INFO nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Using config drive#033[00m
Jan 23 05:26:11 np0005593234 nova_compute[227762]: 2026-01-23 10:26:11.944 227766 DEBUG nova.storage.rbd_utils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] rbd image 12b37be9-93a2-4e10-9056-68a743ed2673_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:12.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:12 np0005593234 nova_compute[227762]: 2026-01-23 10:26:12.598 227766 INFO nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Creating config drive at /var/lib/nova/instances/12b37be9-93a2-4e10-9056-68a743ed2673/disk.config#033[00m
Jan 23 05:26:12 np0005593234 nova_compute[227762]: 2026-01-23 10:26:12.607 227766 DEBUG oslo_concurrency.processutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/12b37be9-93a2-4e10-9056-68a743ed2673/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvheqlo8e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:12 np0005593234 nova_compute[227762]: 2026-01-23 10:26:12.737 227766 DEBUG oslo_concurrency.processutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/12b37be9-93a2-4e10-9056-68a743ed2673/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvheqlo8e" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:12 np0005593234 nova_compute[227762]: 2026-01-23 10:26:12.767 227766 DEBUG nova.storage.rbd_utils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] rbd image 12b37be9-93a2-4e10-9056-68a743ed2673_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:12 np0005593234 nova_compute[227762]: 2026-01-23 10:26:12.770 227766 DEBUG oslo_concurrency.processutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/12b37be9-93a2-4e10-9056-68a743ed2673/disk.config 12b37be9-93a2-4e10-9056-68a743ed2673_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:12 np0005593234 nova_compute[227762]: 2026-01-23 10:26:12.921 227766 DEBUG oslo_concurrency.processutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/12b37be9-93a2-4e10-9056-68a743ed2673/disk.config 12b37be9-93a2-4e10-9056-68a743ed2673_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:12 np0005593234 nova_compute[227762]: 2026-01-23 10:26:12.921 227766 INFO nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Deleting local config drive /var/lib/nova/instances/12b37be9-93a2-4e10-9056-68a743ed2673/disk.config because it was imported into RBD.#033[00m
Jan 23 05:26:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:12 np0005593234 kernel: tap33699e18-9d: entered promiscuous mode
Jan 23 05:26:12 np0005593234 NetworkManager[48942]: <info>  [1769163972.9681] manager: (tap33699e18-9d): new Tun device (/org/freedesktop/NetworkManager/Devices/333)
Jan 23 05:26:12 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:12Z|00696|binding|INFO|Claiming lport 33699e18-9d87-4a6b-9145-84572bf07525 for this chassis.
Jan 23 05:26:12 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:12Z|00697|binding|INFO|33699e18-9d87-4a6b-9145-84572bf07525: Claiming fa:16:3e:09:3c:ad 10.100.0.11
Jan 23 05:26:12 np0005593234 nova_compute[227762]: 2026-01-23 10:26:12.968 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:12 np0005593234 nova_compute[227762]: 2026-01-23 10:26:12.971 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:12 np0005593234 systemd-udevd[303382]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:26:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:13.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:13 np0005593234 NetworkManager[48942]: <info>  [1769163973.0069] device (tap33699e18-9d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:26:13 np0005593234 NetworkManager[48942]: <info>  [1769163973.0089] device (tap33699e18-9d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:26:13 np0005593234 nova_compute[227762]: 2026-01-23 10:26:13.031 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:13Z|00698|binding|INFO|Setting lport 33699e18-9d87-4a6b-9145-84572bf07525 ovn-installed in OVS
Jan 23 05:26:13 np0005593234 systemd-machined[195626]: New machine qemu-78-instance-000000a6.
Jan 23 05:26:13 np0005593234 nova_compute[227762]: 2026-01-23 10:26:13.037 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:13 np0005593234 systemd[1]: Started Virtual Machine qemu-78-instance-000000a6.
Jan 23 05:26:13 np0005593234 nova_compute[227762]: 2026-01-23 10:26:13.489 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163973.4881692, 12b37be9-93a2-4e10-9056-68a743ed2673 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:13 np0005593234 nova_compute[227762]: 2026-01-23 10:26:13.490 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] VM Started (Lifecycle Event)#033[00m
Jan 23 05:26:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:13Z|00699|binding|INFO|Setting lport 33699e18-9d87-4a6b-9145-84572bf07525 up in Southbound
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.717 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:3c:ad 10.100.0.11'], port_security=['fa:16:3e:09:3c:ad 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '12b37be9-93a2-4e10-9056-68a743ed2673', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1280650e-e283-4ddc-81aa-357640520155', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c7c25c6bb33b41bf9cd8febb8259fd87', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3994afaf-c8e7-4265-9c7f-619a98353860', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4684203-7828-4ea2-86ad-83030eb9aefe, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=33699e18-9d87-4a6b-9145-84572bf07525) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.718 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 33699e18-9d87-4a6b-9145-84572bf07525 in datapath 1280650e-e283-4ddc-81aa-357640520155 bound to our chassis#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.719 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1280650e-e283-4ddc-81aa-357640520155#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.731 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[221612ff-a479-45cf-b1a0-f92f3bed232e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.732 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1280650e-e1 in ovnmeta-1280650e-e283-4ddc-81aa-357640520155 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.735 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1280650e-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.736 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d5541861-5d10-4005-b1a6-7e4115393d72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.736 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5c5470c2-6eae-4c2f-af42-6355be268e67]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.747 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[858af28e-9b2f-40c3-a643-b05f38ec7e63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.763 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8d658916-889a-4f55-86c4-08565ffac779]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.794 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9b2a796c-a236-48f9-a09e-e1c68a9c30b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:13 np0005593234 systemd-udevd[303384]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:26:13 np0005593234 NetworkManager[48942]: <info>  [1769163973.8015] manager: (tap1280650e-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/334)
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.800 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a059606a-361b-4a26-80e1-e09feea90b96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.832 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ecda8bb4-0c2c-466e-b570-c9eed415180e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.834 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[fa206433-1bb5-4474-8daf-aec61ca38496]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:13 np0005593234 NetworkManager[48942]: <info>  [1769163973.8557] device (tap1280650e-e0): carrier: link connected
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.861 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[54d5b4b2-2a76-47ed-8015-29c11f0f8498]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.876 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3227060b-e81c-463b-9d41-da655450331b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1280650e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:5b:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786881, 'reachable_time': 21919, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 303460, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.895 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea03dd1-4319-4301-93c5-1a51e9edb3fc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feea:5b3e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 786881, 'tstamp': 786881}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 303461, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.918 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[39c15613-7a73-4b0e-a813-01ceec0b65c5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1280650e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ea:5b:3e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 217], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786881, 'reachable_time': 21919, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 303462, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:13.950 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb3e48e-b6b3-41e5-871a-7aa69ec83c94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:14.013 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7453e6ef-fc40-4161-9df5-43a2683d50ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:14.014 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1280650e-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:14.015 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:14.016 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1280650e-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:14 np0005593234 nova_compute[227762]: 2026-01-23 10:26:14.034 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:14 np0005593234 NetworkManager[48942]: <info>  [1769163974.0347] manager: (tap1280650e-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/335)
Jan 23 05:26:14 np0005593234 kernel: tap1280650e-e0: entered promiscuous mode
Jan 23 05:26:14 np0005593234 nova_compute[227762]: 2026-01-23 10:26:14.036 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:14.038 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1280650e-e0, col_values=(('external_ids', {'iface-id': '8ca9fbcb-59f5-4006-84df-ab99827a2b39'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:14 np0005593234 nova_compute[227762]: 2026-01-23 10:26:14.039 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:14Z|00700|binding|INFO|Releasing lport 8ca9fbcb-59f5-4006-84df-ab99827a2b39 from this chassis (sb_readonly=1)
Jan 23 05:26:14 np0005593234 nova_compute[227762]: 2026-01-23 10:26:14.057 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:14 np0005593234 nova_compute[227762]: 2026-01-23 10:26:14.059 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:14.060 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1280650e-e283-4ddc-81aa-357640520155.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1280650e-e283-4ddc-81aa-357640520155.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:14.061 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8381a3d7-5165-4c3c-a024-a278704cb87d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:14.062 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-1280650e-e283-4ddc-81aa-357640520155
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/1280650e-e283-4ddc-81aa-357640520155.pid.haproxy
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 1280650e-e283-4ddc-81aa-357640520155
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:26:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:14.063 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'env', 'PROCESS_TAG=haproxy-1280650e-e283-4ddc-81aa-357640520155', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1280650e-e283-4ddc-81aa-357640520155.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:26:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:14.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:14 np0005593234 nova_compute[227762]: 2026-01-23 10:26:14.362 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:14 np0005593234 nova_compute[227762]: 2026-01-23 10:26:14.367 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163973.48964, 12b37be9-93a2-4e10-9056-68a743ed2673 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:14 np0005593234 nova_compute[227762]: 2026-01-23 10:26:14.368 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:26:14 np0005593234 podman[303496]: 2026-01-23 10:26:14.406577026 +0000 UTC m=+0.048775366 container create 1e27abb58bc5026297d6ebeba50c7c056aeac06acfcfebd7717dafd476e3e0aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:26:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:14Z|00701|binding|INFO|Releasing lport 8ca9fbcb-59f5-4006-84df-ab99827a2b39 from this chassis (sb_readonly=0)
Jan 23 05:26:14 np0005593234 systemd[1]: Started libpod-conmon-1e27abb58bc5026297d6ebeba50c7c056aeac06acfcfebd7717dafd476e3e0aa.scope.
Jan 23 05:26:14 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:26:14 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b7ad02a52b5c851035b5b90dcc6111692236ef4a8982f301b569b01ee01337b2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:26:14 np0005593234 nova_compute[227762]: 2026-01-23 10:26:14.463 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:14 np0005593234 nova_compute[227762]: 2026-01-23 10:26:14.466 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:26:14 np0005593234 podman[303496]: 2026-01-23 10:26:14.469099869 +0000 UTC m=+0.111298229 container init 1e27abb58bc5026297d6ebeba50c7c056aeac06acfcfebd7717dafd476e3e0aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:26:14 np0005593234 podman[303496]: 2026-01-23 10:26:14.475132776 +0000 UTC m=+0.117331116 container start 1e27abb58bc5026297d6ebeba50c7c056aeac06acfcfebd7717dafd476e3e0aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:26:14 np0005593234 podman[303496]: 2026-01-23 10:26:14.381540588 +0000 UTC m=+0.023738948 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:26:14 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[303512]: [NOTICE]   (303516) : New worker (303518) forked
Jan 23 05:26:14 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[303512]: [NOTICE]   (303516) : Loading success.
Jan 23 05:26:14 np0005593234 nova_compute[227762]: 2026-01-23 10:26:14.534 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:26:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:15.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:16.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:16 np0005593234 nova_compute[227762]: 2026-01-23 10:26:16.685 227766 DEBUG nova.network.neutron [req-2e38fb54-29fd-49fe-b6f6-5adeebcc58e2 req-aaed9c22-b68f-4ac1-96f4-0a88a2a03d47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Updated VIF entry in instance network info cache for port 33699e18-9d87-4a6b-9145-84572bf07525. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:26:16 np0005593234 nova_compute[227762]: 2026-01-23 10:26:16.687 227766 DEBUG nova.network.neutron [req-2e38fb54-29fd-49fe-b6f6-5adeebcc58e2 req-aaed9c22-b68f-4ac1-96f4-0a88a2a03d47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Updating instance_info_cache with network_info: [{"id": "33699e18-9d87-4a6b-9145-84572bf07525", "address": "fa:16:3e:09:3c:ad", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33699e18-9d", "ovs_interfaceid": "33699e18-9d87-4a6b-9145-84572bf07525", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:16 np0005593234 nova_compute[227762]: 2026-01-23 10:26:16.729 227766 DEBUG oslo_concurrency.lockutils [req-2e38fb54-29fd-49fe-b6f6-5adeebcc58e2 req-aaed9c22-b68f-4ac1-96f4-0a88a2a03d47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-12b37be9-93a2-4e10-9056-68a743ed2673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:16 np0005593234 nova_compute[227762]: 2026-01-23 10:26:16.871 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:17.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.683 227766 DEBUG nova.compute.manager [req-6e135abf-0bf7-495d-8116-a5baf6274cbd req-bf3bd932-501b-48d4-83d8-e9372ba61c5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Received event network-vif-plugged-33699e18-9d87-4a6b-9145-84572bf07525 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.683 227766 DEBUG oslo_concurrency.lockutils [req-6e135abf-0bf7-495d-8116-a5baf6274cbd req-bf3bd932-501b-48d4-83d8-e9372ba61c5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.684 227766 DEBUG oslo_concurrency.lockutils [req-6e135abf-0bf7-495d-8116-a5baf6274cbd req-bf3bd932-501b-48d4-83d8-e9372ba61c5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.684 227766 DEBUG oslo_concurrency.lockutils [req-6e135abf-0bf7-495d-8116-a5baf6274cbd req-bf3bd932-501b-48d4-83d8-e9372ba61c5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.684 227766 DEBUG nova.compute.manager [req-6e135abf-0bf7-495d-8116-a5baf6274cbd req-bf3bd932-501b-48d4-83d8-e9372ba61c5e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Processing event network-vif-plugged-33699e18-9d87-4a6b-9145-84572bf07525 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.685 227766 DEBUG nova.compute.manager [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.688 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163977.6885, 12b37be9-93a2-4e10-9056-68a743ed2673 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.688 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.690 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.694 227766 INFO nova.virt.libvirt.driver [-] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Instance spawned successfully.#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.694 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.725 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.729 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.729 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.730 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.730 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.731 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.731 227766 DEBUG nova.virt.libvirt.driver [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.735 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.786 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.810 227766 INFO nova.compute.manager [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Took 21.27 seconds to spawn the instance on the hypervisor.
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.811 227766 DEBUG nova.compute.manager [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.900 227766 INFO nova.compute.manager [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Took 22.33 seconds to build instance.
Jan 23 05:26:17 np0005593234 nova_compute[227762]: 2026-01-23 10:26:17.941 227766 DEBUG oslo_concurrency.lockutils [None req-b66136c9-c020-4b97-9ef0-81f314c890f6 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.459s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:26:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:18 np0005593234 nova_compute[227762]: 2026-01-23 10:26:18.031 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:18.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:19.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:19 np0005593234 nova_compute[227762]: 2026-01-23 10:26:19.850 227766 DEBUG nova.compute.manager [req-d1d4c1ca-fa52-4535-8fc7-e121fc0e273b req-d225b4bb-2a89-4a5c-82c7-4db38c705152 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Received event network-vif-plugged-33699e18-9d87-4a6b-9145-84572bf07525 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:26:19 np0005593234 nova_compute[227762]: 2026-01-23 10:26:19.850 227766 DEBUG oslo_concurrency.lockutils [req-d1d4c1ca-fa52-4535-8fc7-e121fc0e273b req-d225b4bb-2a89-4a5c-82c7-4db38c705152 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:26:19 np0005593234 nova_compute[227762]: 2026-01-23 10:26:19.851 227766 DEBUG oslo_concurrency.lockutils [req-d1d4c1ca-fa52-4535-8fc7-e121fc0e273b req-d225b4bb-2a89-4a5c-82c7-4db38c705152 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:26:19 np0005593234 nova_compute[227762]: 2026-01-23 10:26:19.851 227766 DEBUG oslo_concurrency.lockutils [req-d1d4c1ca-fa52-4535-8fc7-e121fc0e273b req-d225b4bb-2a89-4a5c-82c7-4db38c705152 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:26:19 np0005593234 nova_compute[227762]: 2026-01-23 10:26:19.851 227766 DEBUG nova.compute.manager [req-d1d4c1ca-fa52-4535-8fc7-e121fc0e273b req-d225b4bb-2a89-4a5c-82c7-4db38c705152 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] No waiting events found dispatching network-vif-plugged-33699e18-9d87-4a6b-9145-84572bf07525 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:26:19 np0005593234 nova_compute[227762]: 2026-01-23 10:26:19.851 227766 WARNING nova.compute.manager [req-d1d4c1ca-fa52-4535-8fc7-e121fc0e273b req-d225b4bb-2a89-4a5c-82c7-4db38c705152 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Received unexpected event network-vif-plugged-33699e18-9d87-4a6b-9145-84572bf07525 for instance with vm_state active and task_state None.
Jan 23 05:26:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:20.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:21.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:21 np0005593234 nova_compute[227762]: 2026-01-23 10:26:21.874 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:22.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:23.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:23 np0005593234 nova_compute[227762]: 2026-01-23 10:26:23.034 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:23 np0005593234 nova_compute[227762]: 2026-01-23 10:26:23.623 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:23 np0005593234 NetworkManager[48942]: <info>  [1769163983.6355] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/336)
Jan 23 05:26:23 np0005593234 NetworkManager[48942]: <info>  [1769163983.6364] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/337)
Jan 23 05:26:23 np0005593234 nova_compute[227762]: 2026-01-23 10:26:23.782 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:23 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:23Z|00702|binding|INFO|Releasing lport 8ca9fbcb-59f5-4006-84df-ab99827a2b39 from this chassis (sb_readonly=0)
Jan 23 05:26:23 np0005593234 nova_compute[227762]: 2026-01-23 10:26:23.797 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:26:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:24.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:24 np0005593234 nova_compute[227762]: 2026-01-23 10:26:24.248 227766 DEBUG nova.compute.manager [req-259747b7-cf6c-4979-a18d-e222411d8670 req-f69894a0-71c2-4b01-8d7b-64184d64ed69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Received event network-changed-33699e18-9d87-4a6b-9145-84572bf07525 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:26:24 np0005593234 nova_compute[227762]: 2026-01-23 10:26:24.248 227766 DEBUG nova.compute.manager [req-259747b7-cf6c-4979-a18d-e222411d8670 req-f69894a0-71c2-4b01-8d7b-64184d64ed69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Refreshing instance network info cache due to event network-changed-33699e18-9d87-4a6b-9145-84572bf07525. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:26:24 np0005593234 nova_compute[227762]: 2026-01-23 10:26:24.249 227766 DEBUG oslo_concurrency.lockutils [req-259747b7-cf6c-4979-a18d-e222411d8670 req-f69894a0-71c2-4b01-8d7b-64184d64ed69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-12b37be9-93a2-4e10-9056-68a743ed2673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:26:24 np0005593234 nova_compute[227762]: 2026-01-23 10:26:24.249 227766 DEBUG oslo_concurrency.lockutils [req-259747b7-cf6c-4979-a18d-e222411d8670 req-f69894a0-71c2-4b01-8d7b-64184d64ed69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-12b37be9-93a2-4e10-9056-68a743ed2673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:26:24 np0005593234 nova_compute[227762]: 2026-01-23 10:26:24.250 227766 DEBUG nova.network.neutron [req-259747b7-cf6c-4979-a18d-e222411d8670 req-f69894a0-71c2-4b01-8d7b-64184d64ed69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Refreshing network info cache for port 33699e18-9d87-4a6b-9145-84572bf07525 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:26:24 np0005593234 nova_compute[227762]: 2026-01-23 10:26:24.301 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "b31fafb4-3888-4647-9d5d-5f528ff795b5" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:26:24 np0005593234 nova_compute[227762]: 2026-01-23 10:26:24.302 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:26:24 np0005593234 nova_compute[227762]: 2026-01-23 10:26:24.357 227766 DEBUG nova.compute.manager [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 05:26:24 np0005593234 nova_compute[227762]: 2026-01-23 10:26:24.571 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:26:24 np0005593234 nova_compute[227762]: 2026-01-23 10:26:24.572 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:26:24 np0005593234 nova_compute[227762]: 2026-01-23 10:26:24.583 227766 DEBUG nova.virt.hardware [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 05:26:24 np0005593234 nova_compute[227762]: 2026-01-23 10:26:24.583 227766 INFO nova.compute.claims [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Claim successful on node compute-2.ctlplane.example.com
Jan 23 05:26:24 np0005593234 nova_compute[227762]: 2026-01-23 10:26:24.866 227766 DEBUG oslo_concurrency.processutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:26:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:25.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/668635943' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.294 227766 DEBUG oslo_concurrency.processutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.301 227766 DEBUG nova.compute.provider_tree [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.333 227766 DEBUG nova.scheduler.client.report [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.399 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.400 227766 DEBUG nova.compute.manager [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.481 227766 DEBUG nova.compute.manager [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.481 227766 DEBUG nova.network.neutron [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.505 227766 INFO nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.534 227766 DEBUG nova.compute.manager [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.623 227766 DEBUG nova.compute.manager [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.625 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.625 227766 INFO nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Creating image(s)
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.657 227766 DEBUG nova.storage.rbd_utils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image b31fafb4-3888-4647-9d5d-5f528ff795b5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.686 227766 DEBUG nova.storage.rbd_utils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image b31fafb4-3888-4647-9d5d-5f528ff795b5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.718 227766 DEBUG nova.storage.rbd_utils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image b31fafb4-3888-4647-9d5d-5f528ff795b5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.722 227766 DEBUG oslo_concurrency.processutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #145. Immutable memtables: 0.
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.726046) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 91] Flushing memtable with next log file: 145
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163985726240, "job": 91, "event": "flush_started", "num_memtables": 1, "num_entries": 1257, "num_deletes": 252, "total_data_size": 2761971, "memory_usage": 2794312, "flush_reason": "Manual Compaction"}
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 91] Level-0 flush table #146: started
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163985737393, "cf_name": "default", "job": 91, "event": "table_file_creation", "file_number": 146, "file_size": 1823916, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 69528, "largest_seqno": 70780, "table_properties": {"data_size": 1818311, "index_size": 2935, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12543, "raw_average_key_size": 20, "raw_value_size": 1806945, "raw_average_value_size": 2938, "num_data_blocks": 128, "num_entries": 615, "num_filter_entries": 615, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163883, "oldest_key_time": 1769163883, "file_creation_time": 1769163985, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 146, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 91] Flush lasted 11563 microseconds, and 5422 cpu microseconds.
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.737736) [db/flush_job.cc:967] [default] [JOB 91] Level-0 flush table #146: 1823916 bytes OK
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.737874) [db/memtable_list.cc:519] [default] Level-0 commit table #146 started
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.739658) [db/memtable_list.cc:722] [default] Level-0 commit table #146: memtable #1 done
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.739676) EVENT_LOG_v1 {"time_micros": 1769163985739669, "job": 91, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.739697) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 91] Try to delete WAL files size 2755872, prev total WAL file size 2755872, number of live WAL files 2.
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000142.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.741336) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036303234' seq:72057594037927935, type:22 .. '7061786F730036323736' seq:0, type:0; will stop at (end)
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 92] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 91 Base level 0, inputs: [146(1781KB)], [144(10156KB)]
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163985741451, "job": 92, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [146], "files_L6": [144], "score": -1, "input_data_size": 12224491, "oldest_snapshot_seqno": -1}
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 92] Generated table #147: 8801 keys, 10313418 bytes, temperature: kUnknown
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163985796518, "cf_name": "default", "job": 92, "event": "table_file_creation", "file_number": 147, "file_size": 10313418, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10258767, "index_size": 31562, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22021, "raw_key_size": 233236, "raw_average_key_size": 26, "raw_value_size": 10106346, "raw_average_value_size": 1148, "num_data_blocks": 1198, "num_entries": 8801, "num_filter_entries": 8801, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769163985, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 147, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.796 227766 DEBUG oslo_concurrency.processutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.797 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.796948) [db/compaction/compaction_job.cc:1663] [default] [JOB 92] Compacted 1@0 + 1@6 files to L6 => 10313418 bytes
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.798441) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 220.9 rd, 186.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 9.9 +0.0 blob) out(9.8 +0.0 blob), read-write-amplify(12.4) write-amplify(5.7) OK, records in: 9324, records dropped: 523 output_compression: NoCompression
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.798 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.798459) EVENT_LOG_v1 {"time_micros": 1769163985798450, "job": 92, "event": "compaction_finished", "compaction_time_micros": 55341, "compaction_time_cpu_micros": 24316, "output_level": 6, "num_output_files": 1, "total_output_size": 10313418, "num_input_records": 9324, "num_output_records": 8801, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000146.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163985799103, "job": 92, "event": "table_file_deletion", "file_number": 146}
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.798 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000144.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163985800767, "job": 92, "event": "table_file_deletion", "file_number": 144}
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.741229) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.800892) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.800900) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.800902) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.800904) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:26:25 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:26:25.800906) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.824 227766 DEBUG nova.storage.rbd_utils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image b31fafb4-3888-4647-9d5d-5f528ff795b5_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.827 227766 DEBUG oslo_concurrency.processutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 b31fafb4-3888-4647-9d5d-5f528ff795b5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:25 np0005593234 nova_compute[227762]: 2026-01-23 10:26:25.971 227766 DEBUG nova.policy [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'a3cd8c3758e14f9c8e4ad1a9a94a9995', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'b27af793a8cc42259216fbeaa302ba03', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:26:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:26.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:26 np0005593234 nova_compute[227762]: 2026-01-23 10:26:26.180 227766 DEBUG oslo_concurrency.processutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 b31fafb4-3888-4647-9d5d-5f528ff795b5_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.353s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:26 np0005593234 nova_compute[227762]: 2026-01-23 10:26:26.254 227766 DEBUG nova.storage.rbd_utils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] resizing rbd image b31fafb4-3888-4647-9d5d-5f528ff795b5_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:26:26 np0005593234 nova_compute[227762]: 2026-01-23 10:26:26.362 227766 DEBUG nova.objects.instance [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'migration_context' on Instance uuid b31fafb4-3888-4647-9d5d-5f528ff795b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:26 np0005593234 nova_compute[227762]: 2026-01-23 10:26:26.389 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:26:26 np0005593234 nova_compute[227762]: 2026-01-23 10:26:26.390 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Ensure instance console log exists: /var/lib/nova/instances/b31fafb4-3888-4647-9d5d-5f528ff795b5/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:26:26 np0005593234 nova_compute[227762]: 2026-01-23 10:26:26.390 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:26 np0005593234 nova_compute[227762]: 2026-01-23 10:26:26.390 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:26 np0005593234 nova_compute[227762]: 2026-01-23 10:26:26.391 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:26 np0005593234 nova_compute[227762]: 2026-01-23 10:26:26.876 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:26 np0005593234 nova_compute[227762]: 2026-01-23 10:26:26.916 227766 DEBUG nova.network.neutron [req-259747b7-cf6c-4979-a18d-e222411d8670 req-f69894a0-71c2-4b01-8d7b-64184d64ed69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Updated VIF entry in instance network info cache for port 33699e18-9d87-4a6b-9145-84572bf07525. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:26:26 np0005593234 nova_compute[227762]: 2026-01-23 10:26:26.917 227766 DEBUG nova.network.neutron [req-259747b7-cf6c-4979-a18d-e222411d8670 req-f69894a0-71c2-4b01-8d7b-64184d64ed69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Updating instance_info_cache with network_info: [{"id": "33699e18-9d87-4a6b-9145-84572bf07525", "address": "fa:16:3e:09:3c:ad", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33699e18-9d", "ovs_interfaceid": "33699e18-9d87-4a6b-9145-84572bf07525", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:27.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:27 np0005593234 nova_compute[227762]: 2026-01-23 10:26:27.176 227766 DEBUG oslo_concurrency.lockutils [req-259747b7-cf6c-4979-a18d-e222411d8670 req-f69894a0-71c2-4b01-8d7b-64184d64ed69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-12b37be9-93a2-4e10-9056-68a743ed2673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:27 np0005593234 nova_compute[227762]: 2026-01-23 10:26:27.823 227766 DEBUG nova.network.neutron [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Successfully created port: 54d9357e-ac9f-458b-b6ce-6da38bc7a025 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:26:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:28 np0005593234 nova_compute[227762]: 2026-01-23 10:26:28.036 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:28.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:29.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:30.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:30 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:30Z|00080|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:09:3c:ad 10.100.0.11
Jan 23 05:26:30 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:30Z|00081|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:09:3c:ad 10.100.0.11
Jan 23 05:26:30 np0005593234 nova_compute[227762]: 2026-01-23 10:26:30.714 227766 DEBUG nova.network.neutron [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Successfully updated port: 54d9357e-ac9f-458b-b6ce-6da38bc7a025 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:26:30 np0005593234 nova_compute[227762]: 2026-01-23 10:26:30.740 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:30 np0005593234 nova_compute[227762]: 2026-01-23 10:26:30.740 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquired lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:30 np0005593234 nova_compute[227762]: 2026-01-23 10:26:30.740 227766 DEBUG nova.network.neutron [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:26:30 np0005593234 nova_compute[227762]: 2026-01-23 10:26:30.817 227766 DEBUG nova.compute.manager [req-1e5fb0c0-a7b8-4b99-aae4-beafe1560aba req-5fcb887c-68a4-401a-adfc-cfae7c2c77f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Received event network-changed-54d9357e-ac9f-458b-b6ce-6da38bc7a025 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:30 np0005593234 nova_compute[227762]: 2026-01-23 10:26:30.817 227766 DEBUG nova.compute.manager [req-1e5fb0c0-a7b8-4b99-aae4-beafe1560aba req-5fcb887c-68a4-401a-adfc-cfae7c2c77f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Refreshing instance network info cache due to event network-changed-54d9357e-ac9f-458b-b6ce-6da38bc7a025. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:26:30 np0005593234 nova_compute[227762]: 2026-01-23 10:26:30.817 227766 DEBUG oslo_concurrency.lockutils [req-1e5fb0c0-a7b8-4b99-aae4-beafe1560aba req-5fcb887c-68a4-401a-adfc-cfae7c2c77f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:30 np0005593234 nova_compute[227762]: 2026-01-23 10:26:30.933 227766 DEBUG nova.network.neutron [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:26:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:31.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:31 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:26:31 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:26:31 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:26:31 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:26:31 np0005593234 nova_compute[227762]: 2026-01-23 10:26:31.878 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:31 np0005593234 nova_compute[227762]: 2026-01-23 10:26:31.950 227766 DEBUG nova.network.neutron [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Updating instance_info_cache with network_info: [{"id": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "address": "fa:16:3e:e4:7b:6d", "network": {"id": "e7a48d7e-0ec9-4b5d-b243-77d724af740b", "bridge": "br-int", "label": "tempest-network-smoke--114616362", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d9357e-ac", "ovs_interfaceid": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:31 np0005593234 nova_compute[227762]: 2026-01-23 10:26:31.973 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Releasing lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:31 np0005593234 nova_compute[227762]: 2026-01-23 10:26:31.973 227766 DEBUG nova.compute.manager [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Instance network_info: |[{"id": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "address": "fa:16:3e:e4:7b:6d", "network": {"id": "e7a48d7e-0ec9-4b5d-b243-77d724af740b", "bridge": "br-int", "label": "tempest-network-smoke--114616362", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d9357e-ac", "ovs_interfaceid": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:26:31 np0005593234 nova_compute[227762]: 2026-01-23 10:26:31.974 227766 DEBUG oslo_concurrency.lockutils [req-1e5fb0c0-a7b8-4b99-aae4-beafe1560aba req-5fcb887c-68a4-401a-adfc-cfae7c2c77f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:31 np0005593234 nova_compute[227762]: 2026-01-23 10:26:31.975 227766 DEBUG nova.network.neutron [req-1e5fb0c0-a7b8-4b99-aae4-beafe1560aba req-5fcb887c-68a4-401a-adfc-cfae7c2c77f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Refreshing network info cache for port 54d9357e-ac9f-458b-b6ce-6da38bc7a025 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:26:31 np0005593234 nova_compute[227762]: 2026-01-23 10:26:31.981 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Start _get_guest_xml network_info=[{"id": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "address": "fa:16:3e:e4:7b:6d", "network": {"id": "e7a48d7e-0ec9-4b5d-b243-77d724af740b", "bridge": "br-int", "label": "tempest-network-smoke--114616362", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d9357e-ac", "ovs_interfaceid": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:26:31 np0005593234 nova_compute[227762]: 2026-01-23 10:26:31.989 227766 WARNING nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.001 227766 DEBUG nova.virt.libvirt.host [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.002 227766 DEBUG nova.virt.libvirt.host [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.007 227766 DEBUG nova.virt.libvirt.host [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.007 227766 DEBUG nova.virt.libvirt.host [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.009 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.009 227766 DEBUG nova.virt.hardware [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.009 227766 DEBUG nova.virt.hardware [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.009 227766 DEBUG nova.virt.hardware [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.009 227766 DEBUG nova.virt.hardware [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.009 227766 DEBUG nova.virt.hardware [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.010 227766 DEBUG nova.virt.hardware [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.010 227766 DEBUG nova.virt.hardware [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.010 227766 DEBUG nova.virt.hardware [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.010 227766 DEBUG nova.virt.hardware [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.010 227766 DEBUG nova.virt.hardware [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.010 227766 DEBUG nova.virt.hardware [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.013 227766 DEBUG oslo_concurrency.processutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:32.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:26:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/870940329' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.468 227766 DEBUG oslo_concurrency.processutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.498 227766 DEBUG nova.storage.rbd_utils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image b31fafb4-3888-4647-9d5d-5f528ff795b5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.501 227766 DEBUG oslo_concurrency.processutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e324 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:26:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/25018226' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.976 227766 DEBUG oslo_concurrency.processutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.978 227766 DEBUG nova.virt.libvirt.vif [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:26:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-404892044',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-404892044',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-acc',id=169,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMBgFvZ4cbi7AnPS6dwqlDZxqi0tL9pk6Pv5TmYxIwyaaf9gGUq+Kaim4h5w6BHZOb0aX+j7fNILO3q4iXwnipSp+yyY1uOiInLjY+WwJtAHiBpUfsc4DJ7rPMVoRVR8SA==',key_name='tempest-TestSecurityGroupsBasicOps-1579063337',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-7p1o492a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:26:25Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=b31fafb4-3888-4647-9d5d-5f528ff795b5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "address": "fa:16:3e:e4:7b:6d", "network": {"id": "e7a48d7e-0ec9-4b5d-b243-77d724af740b", "bridge": "br-int", "label": "tempest-network-smoke--114616362", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d9357e-ac", "ovs_interfaceid": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.978 227766 DEBUG nova.network.os_vif_util [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "address": "fa:16:3e:e4:7b:6d", "network": {"id": "e7a48d7e-0ec9-4b5d-b243-77d724af740b", "bridge": "br-int", "label": "tempest-network-smoke--114616362", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d9357e-ac", "ovs_interfaceid": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.979 227766 DEBUG nova.network.os_vif_util [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:7b:6d,bridge_name='br-int',has_traffic_filtering=True,id=54d9357e-ac9f-458b-b6ce-6da38bc7a025,network=Network(e7a48d7e-0ec9-4b5d-b243-77d724af740b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d9357e-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:26:32 np0005593234 nova_compute[227762]: 2026-01-23 10:26:32.980 227766 DEBUG nova.objects.instance [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'pci_devices' on Instance uuid b31fafb4-3888-4647-9d5d-5f528ff795b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.008 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  <uuid>b31fafb4-3888-4647-9d5d-5f528ff795b5</uuid>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  <name>instance-000000a9</name>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-404892044</nova:name>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:26:31</nova:creationTime>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <nova:user uuid="a3cd8c3758e14f9c8e4ad1a9a94a9995">tempest-TestSecurityGroupsBasicOps-622349977-project-member</nova:user>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <nova:project uuid="b27af793a8cc42259216fbeaa302ba03">tempest-TestSecurityGroupsBasicOps-622349977</nova:project>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <nova:port uuid="54d9357e-ac9f-458b-b6ce-6da38bc7a025">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <entry name="serial">b31fafb4-3888-4647-9d5d-5f528ff795b5</entry>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <entry name="uuid">b31fafb4-3888-4647-9d5d-5f528ff795b5</entry>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/b31fafb4-3888-4647-9d5d-5f528ff795b5_disk">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/b31fafb4-3888-4647-9d5d-5f528ff795b5_disk.config">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:e4:7b:6d"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <target dev="tap54d9357e-ac"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/b31fafb4-3888-4647-9d5d-5f528ff795b5/console.log" append="off"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:26:33 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:26:33 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:26:33 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:26:33 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.009 227766 DEBUG nova.compute.manager [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Preparing to wait for external event network-vif-plugged-54d9357e-ac9f-458b-b6ce-6da38bc7a025 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.010 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.010 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.010 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.011 227766 DEBUG nova.virt.libvirt.vif [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:26:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-404892044',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-404892044',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-acc',id=169,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMBgFvZ4cbi7AnPS6dwqlDZxqi0tL9pk6Pv5TmYxIwyaaf9gGUq+Kaim4h5w6BHZOb0aX+j7fNILO3q4iXwnipSp+yyY1uOiInLjY+WwJtAHiBpUfsc4DJ7rPMVoRVR8SA==',key_name='tempest-TestSecurityGroupsBasicOps-1579063337',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-7p1o492a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:26:25Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=b31fafb4-3888-4647-9d5d-5f528ff795b5,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "address": "fa:16:3e:e4:7b:6d", "network": {"id": "e7a48d7e-0ec9-4b5d-b243-77d724af740b", "bridge": "br-int", "label": "tempest-network-smoke--114616362", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d9357e-ac", "ovs_interfaceid": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.011 227766 DEBUG nova.network.os_vif_util [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "address": "fa:16:3e:e4:7b:6d", "network": {"id": "e7a48d7e-0ec9-4b5d-b243-77d724af740b", "bridge": "br-int", "label": "tempest-network-smoke--114616362", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d9357e-ac", "ovs_interfaceid": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.012 227766 DEBUG nova.network.os_vif_util [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e4:7b:6d,bridge_name='br-int',has_traffic_filtering=True,id=54d9357e-ac9f-458b-b6ce-6da38bc7a025,network=Network(e7a48d7e-0ec9-4b5d-b243-77d724af740b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d9357e-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.012 227766 DEBUG os_vif [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:7b:6d,bridge_name='br-int',has_traffic_filtering=True,id=54d9357e-ac9f-458b-b6ce-6da38bc7a025,network=Network(e7a48d7e-0ec9-4b5d-b243-77d724af740b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d9357e-ac') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.013 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.013 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.013 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.016 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.017 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap54d9357e-ac, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.017 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap54d9357e-ac, col_values=(('external_ids', {'iface-id': '54d9357e-ac9f-458b-b6ce-6da38bc7a025', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e4:7b:6d', 'vm-uuid': 'b31fafb4-3888-4647-9d5d-5f528ff795b5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:33 np0005593234 NetworkManager[48942]: <info>  [1769163993.0197] manager: (tap54d9357e-ac): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/338)
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.018 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.021 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.025 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.026 227766 INFO os_vif [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e4:7b:6d,bridge_name='br-int',has_traffic_filtering=True,id=54d9357e-ac9f-458b-b6ce-6da38bc7a025,network=Network(e7a48d7e-0ec9-4b5d-b243-77d724af740b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d9357e-ac')#033[00m
Jan 23 05:26:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:33.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.037 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.103 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.105 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.105 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] No VIF found with MAC fa:16:3e:e4:7b:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.105 227766 INFO nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Using config drive#033[00m
Jan 23 05:26:33 np0005593234 podman[303973]: 2026-01-23 10:26:33.124342724 +0000 UTC m=+0.060977996 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.133 227766 DEBUG nova.storage.rbd_utils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image b31fafb4-3888-4647-9d5d-5f528ff795b5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.473 227766 DEBUG nova.network.neutron [req-1e5fb0c0-a7b8-4b99-aae4-beafe1560aba req-5fcb887c-68a4-401a-adfc-cfae7c2c77f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Updated VIF entry in instance network info cache for port 54d9357e-ac9f-458b-b6ce-6da38bc7a025. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.474 227766 DEBUG nova.network.neutron [req-1e5fb0c0-a7b8-4b99-aae4-beafe1560aba req-5fcb887c-68a4-401a-adfc-cfae7c2c77f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Updating instance_info_cache with network_info: [{"id": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "address": "fa:16:3e:e4:7b:6d", "network": {"id": "e7a48d7e-0ec9-4b5d-b243-77d724af740b", "bridge": "br-int", "label": "tempest-network-smoke--114616362", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d9357e-ac", "ovs_interfaceid": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.495 227766 DEBUG oslo_concurrency.lockutils [req-1e5fb0c0-a7b8-4b99-aae4-beafe1560aba req-5fcb887c-68a4-401a-adfc-cfae7c2c77f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.588 227766 INFO nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Creating config drive at /var/lib/nova/instances/b31fafb4-3888-4647-9d5d-5f528ff795b5/disk.config#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.593 227766 DEBUG oslo_concurrency.processutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b31fafb4-3888-4647-9d5d-5f528ff795b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvjo835cq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.727 227766 DEBUG oslo_concurrency.processutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b31fafb4-3888-4647-9d5d-5f528ff795b5/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvjo835cq" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.762 227766 DEBUG nova.storage.rbd_utils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] rbd image b31fafb4-3888-4647-9d5d-5f528ff795b5_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.765 227766 DEBUG oslo_concurrency.processutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/b31fafb4-3888-4647-9d5d-5f528ff795b5/disk.config b31fafb4-3888-4647-9d5d-5f528ff795b5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.944 227766 DEBUG oslo_concurrency.processutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/b31fafb4-3888-4647-9d5d-5f528ff795b5/disk.config b31fafb4-3888-4647-9d5d-5f528ff795b5_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.179s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:33 np0005593234 nova_compute[227762]: 2026-01-23 10:26:33.946 227766 INFO nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Deleting local config drive /var/lib/nova/instances/b31fafb4-3888-4647-9d5d-5f528ff795b5/disk.config because it was imported into RBD.#033[00m
Jan 23 05:26:34 np0005593234 kernel: tap54d9357e-ac: entered promiscuous mode
Jan 23 05:26:34 np0005593234 NetworkManager[48942]: <info>  [1769163994.0005] manager: (tap54d9357e-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/339)
Jan 23 05:26:34 np0005593234 nova_compute[227762]: 2026-01-23 10:26:34.002 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:34Z|00703|binding|INFO|Claiming lport 54d9357e-ac9f-458b-b6ce-6da38bc7a025 for this chassis.
Jan 23 05:26:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:34Z|00704|binding|INFO|54d9357e-ac9f-458b-b6ce-6da38bc7a025: Claiming fa:16:3e:e4:7b:6d 10.100.0.6
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.011 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:7b:6d 10.100.0.6'], port_security=['fa:16:3e:e4:7b:6d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b31fafb4-3888-4647-9d5d-5f528ff795b5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e7a48d7e-0ec9-4b5d-b243-77d724af740b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b27af793a8cc42259216fbeaa302ba03', 'neutron:revision_number': '2', 'neutron:security_group_ids': '44462e8d-74cf-41bb-9b11-7aa082a0a20c f254f9d1-4249-4434-8719-2f2e0b2c9d0c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b51bdcc-ae54-4ab6-8265-f5e2637692e4, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=54d9357e-ac9f-458b-b6ce-6da38bc7a025) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.014 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 54d9357e-ac9f-458b-b6ce-6da38bc7a025 in datapath e7a48d7e-0ec9-4b5d-b243-77d724af740b bound to our chassis#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.016 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e7a48d7e-0ec9-4b5d-b243-77d724af740b#033[00m
Jan 23 05:26:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:34Z|00705|binding|INFO|Setting lport 54d9357e-ac9f-458b-b6ce-6da38bc7a025 ovn-installed in OVS
Jan 23 05:26:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:34Z|00706|binding|INFO|Setting lport 54d9357e-ac9f-458b-b6ce-6da38bc7a025 up in Southbound
Jan 23 05:26:34 np0005593234 nova_compute[227762]: 2026-01-23 10:26:34.028 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.028 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[33b63eca-04e1-41f7-9a9e-f0b8fa08324c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.030 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape7a48d7e-01 in ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.032 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape7a48d7e-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.032 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[afbd1726-a496-4e0c-8b79-f1cfa1d3c83c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.033 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[22c9f782-354e-425e-9113-bccea8ce5a57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 nova_compute[227762]: 2026-01-23 10:26:34.034 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:34 np0005593234 systemd-udevd[304062]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:26:34 np0005593234 systemd-machined[195626]: New machine qemu-79-instance-000000a9.
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.046 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[599843a1-b5cf-4077-a7df-97a313676038]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 NetworkManager[48942]: <info>  [1769163994.0536] device (tap54d9357e-ac): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:26:34 np0005593234 NetworkManager[48942]: <info>  [1769163994.0545] device (tap54d9357e-ac): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:26:34 np0005593234 systemd[1]: Started Virtual Machine qemu-79-instance-000000a9.
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.073 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cd93d28b-eccb-46b2-a89b-c5e0dbe62129]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.101 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[5e7a0868-2b56-454f-9658-94dc8fe4b509]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.106 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f817fd96-371c-4bd4-ba03-8efa8143e75d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 NetworkManager[48942]: <info>  [1769163994.1079] manager: (tape7a48d7e-00): new Veth device (/org/freedesktop/NetworkManager/Devices/340)
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.135 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4da49416-e349-4127-b74d-aaadc3bd6ceb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.139 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[23f73e89-6275-4c46-bd36-858f76e9e5c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 NetworkManager[48942]: <info>  [1769163994.1647] device (tape7a48d7e-00): carrier: link connected
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.170 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7b887912-48b6-4a63-9b23-84371ad0a1ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:34.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.188 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[af90625f-b68c-456b-8d85-ce75557a2cec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape7a48d7e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:92:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 219], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 788912, 'reachable_time': 26468, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304095, 'error': None, 'target': 'ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.206 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[eb531e58-b58e-4e9b-bf90-3065dfbdeff6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:925f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 788912, 'tstamp': 788912}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304096, 'error': None, 'target': 'ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.225 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8f1aeb42-4d1a-4efc-8119-1798425105e5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape7a48d7e-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:90:92:5f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 219], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 788912, 'reachable_time': 26468, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 304097, 'error': None, 'target': 'ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.256 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aff54b62-578d-40df-a4f0-3febef4ea1e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.314 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0b1ce9a8-ab7f-469c-bc1c-6e9296366941]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.316 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7a48d7e-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.316 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.316 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape7a48d7e-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:34 np0005593234 nova_compute[227762]: 2026-01-23 10:26:34.318 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:34 np0005593234 kernel: tape7a48d7e-00: entered promiscuous mode
Jan 23 05:26:34 np0005593234 NetworkManager[48942]: <info>  [1769163994.3188] manager: (tape7a48d7e-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/341)
Jan 23 05:26:34 np0005593234 nova_compute[227762]: 2026-01-23 10:26:34.321 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.321 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape7a48d7e-00, col_values=(('external_ids', {'iface-id': 'ec60a79a-3050-4cfa-9c46-cf939e3eeae0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.325 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e7a48d7e-0ec9-4b5d-b243-77d724af740b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e7a48d7e-0ec9-4b5d-b243-77d724af740b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:26:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:34Z|00707|binding|INFO|Releasing lport ec60a79a-3050-4cfa-9c46-cf939e3eeae0 from this chassis (sb_readonly=0)
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.328 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9b4b37bc-40ae-4a69-899b-e914b8e1ae19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.329 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-e7a48d7e-0ec9-4b5d-b243-77d724af740b
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/e7a48d7e-0ec9-4b5d-b243-77d724af740b.pid.haproxy
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID e7a48d7e-0ec9-4b5d-b243-77d724af740b
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:26:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:34.329 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b', 'env', 'PROCESS_TAG=haproxy-e7a48d7e-0ec9-4b5d-b243-77d724af740b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e7a48d7e-0ec9-4b5d-b243-77d724af740b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:26:34 np0005593234 nova_compute[227762]: 2026-01-23 10:26:34.343 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:34 np0005593234 podman[304128]: 2026-01-23 10:26:34.702414569 +0000 UTC m=+0.061084468 container create 63d70534dfbcb3ce8927105116c2554685b573f03892128f30ccab2fb30d537f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:26:34 np0005593234 systemd[1]: Started libpod-conmon-63d70534dfbcb3ce8927105116c2554685b573f03892128f30ccab2fb30d537f.scope.
Jan 23 05:26:34 np0005593234 podman[304128]: 2026-01-23 10:26:34.66833979 +0000 UTC m=+0.027009719 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:26:34 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:26:34 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47f5b4ac7f9944b8b5ed54891ddb9cfdaa75aa6911998526254050d88735e779/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:26:34 np0005593234 podman[304128]: 2026-01-23 10:26:34.808034051 +0000 UTC m=+0.166703980 container init 63d70534dfbcb3ce8927105116c2554685b573f03892128f30ccab2fb30d537f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:26:34 np0005593234 podman[304128]: 2026-01-23 10:26:34.819064314 +0000 UTC m=+0.177734213 container start 63d70534dfbcb3ce8927105116c2554685b573f03892128f30ccab2fb30d537f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:26:34 np0005593234 neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b[304177]: [NOTICE]   (304183) : New worker (304185) forked
Jan 23 05:26:34 np0005593234 neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b[304177]: [NOTICE]   (304183) : Loading success.
Jan 23 05:26:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:35.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e325 e325: 3 total, 3 up, 3 in
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.160 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163995.1601684, b31fafb4-3888-4647-9d5d-5f528ff795b5 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.161 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] VM Started (Lifecycle Event)#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.180 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.184 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163995.1610458, b31fafb4-3888-4647-9d5d-5f528ff795b5 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.184 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.204 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.207 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.232 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.737 227766 DEBUG nova.compute.manager [req-fcb9487f-0078-41f4-9e5b-3709f255131f req-9de7d212-727e-4bbe-9c99-e57292423e1a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Received event network-vif-plugged-54d9357e-ac9f-458b-b6ce-6da38bc7a025 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.737 227766 DEBUG oslo_concurrency.lockutils [req-fcb9487f-0078-41f4-9e5b-3709f255131f req-9de7d212-727e-4bbe-9c99-e57292423e1a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.738 227766 DEBUG oslo_concurrency.lockutils [req-fcb9487f-0078-41f4-9e5b-3709f255131f req-9de7d212-727e-4bbe-9c99-e57292423e1a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.738 227766 DEBUG oslo_concurrency.lockutils [req-fcb9487f-0078-41f4-9e5b-3709f255131f req-9de7d212-727e-4bbe-9c99-e57292423e1a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.738 227766 DEBUG nova.compute.manager [req-fcb9487f-0078-41f4-9e5b-3709f255131f req-9de7d212-727e-4bbe-9c99-e57292423e1a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Processing event network-vif-plugged-54d9357e-ac9f-458b-b6ce-6da38bc7a025 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.739 227766 DEBUG nova.compute.manager [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.742 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769163995.7417908, b31fafb4-3888-4647-9d5d-5f528ff795b5 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.742 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.743 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.747 227766 INFO nova.virt.libvirt.driver [-] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Instance spawned successfully.#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.747 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.765 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.770 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.773 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.774 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.774 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.775 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.775 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.775 227766 DEBUG nova.virt.libvirt.driver [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.798 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.828 227766 INFO nova.compute.manager [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Took 10.20 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.828 227766 DEBUG nova.compute.manager [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.889 227766 INFO nova.compute.manager [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Took 11.45 seconds to build instance.#033[00m
Jan 23 05:26:35 np0005593234 nova_compute[227762]: 2026-01-23 10:26:35.915 227766 DEBUG oslo_concurrency.lockutils [None req-f37324b4-1ec9-4040-b0fe-25438d9086c8 a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.613s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e326 e326: 3 total, 3 up, 3 in
Jan 23 05:26:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:36.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:37.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e327 e327: 3 total, 3 up, 3 in
Jan 23 05:26:37 np0005593234 nova_compute[227762]: 2026-01-23 10:26:37.883 227766 DEBUG nova.compute.manager [req-78a40189-3c20-416a-96bf-4926b49ce932 req-47ecf891-152e-4f7c-bb6e-cc2d1d9e1511 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Received event network-vif-plugged-54d9357e-ac9f-458b-b6ce-6da38bc7a025 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:37 np0005593234 nova_compute[227762]: 2026-01-23 10:26:37.883 227766 DEBUG oslo_concurrency.lockutils [req-78a40189-3c20-416a-96bf-4926b49ce932 req-47ecf891-152e-4f7c-bb6e-cc2d1d9e1511 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:37 np0005593234 nova_compute[227762]: 2026-01-23 10:26:37.884 227766 DEBUG oslo_concurrency.lockutils [req-78a40189-3c20-416a-96bf-4926b49ce932 req-47ecf891-152e-4f7c-bb6e-cc2d1d9e1511 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:37 np0005593234 nova_compute[227762]: 2026-01-23 10:26:37.884 227766 DEBUG oslo_concurrency.lockutils [req-78a40189-3c20-416a-96bf-4926b49ce932 req-47ecf891-152e-4f7c-bb6e-cc2d1d9e1511 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:37 np0005593234 nova_compute[227762]: 2026-01-23 10:26:37.884 227766 DEBUG nova.compute.manager [req-78a40189-3c20-416a-96bf-4926b49ce932 req-47ecf891-152e-4f7c-bb6e-cc2d1d9e1511 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] No waiting events found dispatching network-vif-plugged-54d9357e-ac9f-458b-b6ce-6da38bc7a025 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:26:37 np0005593234 nova_compute[227762]: 2026-01-23 10:26:37.885 227766 WARNING nova.compute.manager [req-78a40189-3c20-416a-96bf-4926b49ce932 req-47ecf891-152e-4f7c-bb6e-cc2d1d9e1511 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Received unexpected event network-vif-plugged-54d9357e-ac9f-458b-b6ce-6da38bc7a025 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:26:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e327 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:38 np0005593234 nova_compute[227762]: 2026-01-23 10:26:38.020 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:38 np0005593234 nova_compute[227762]: 2026-01-23 10:26:38.041 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:38.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:39.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:26:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:26:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:40.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:41.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:41 np0005593234 podman[304303]: 2026-01-23 10:26:41.836527033 +0000 UTC m=+0.126724678 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:26:41 np0005593234 nova_compute[227762]: 2026-01-23 10:26:41.949 227766 DEBUG oslo_concurrency.lockutils [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "12b37be9-93a2-4e10-9056-68a743ed2673" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:41 np0005593234 nova_compute[227762]: 2026-01-23 10:26:41.949 227766 DEBUG oslo_concurrency.lockutils [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:41 np0005593234 nova_compute[227762]: 2026-01-23 10:26:41.970 227766 DEBUG nova.objects.instance [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'flavor' on Instance uuid 12b37be9-93a2-4e10-9056-68a743ed2673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.009 227766 DEBUG oslo_concurrency.lockutils [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.060s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:42 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:42Z|00708|binding|INFO|Releasing lport 8ca9fbcb-59f5-4006-84df-ab99827a2b39 from this chassis (sb_readonly=0)
Jan 23 05:26:42 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:42Z|00709|binding|INFO|Releasing lport ec60a79a-3050-4cfa-9c46-cf939e3eeae0 from this chassis (sb_readonly=0)
Jan 23 05:26:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:42.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.220 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.304 227766 DEBUG oslo_concurrency.lockutils [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "12b37be9-93a2-4e10-9056-68a743ed2673" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.305 227766 DEBUG oslo_concurrency.lockutils [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.305 227766 INFO nova.compute.manager [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Attaching volume fd29228e-f2b9-4bc3-af15-5ec512d8169b to /dev/vdb#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.506 227766 DEBUG os_brick.utils [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.509 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.520 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.521 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[fa851b5c-635e-4053-8b5e-a475309268ea]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.522 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.531 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.531 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[3b6aaadb-3ecb-48f5-b711-e02ed1fb3f7c]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.533 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.542 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.542 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[e78ffc6d-c940-435d-8dba-a5d3b83788ab]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.544 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[e86b22f1-0834-4a20-a379-d719e0758953]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.545 227766 DEBUG oslo_concurrency.processutils [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.572 227766 DEBUG oslo_concurrency.processutils [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "nvme version" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.575 227766 DEBUG os_brick.initiator.connectors.lightos [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.575 227766 DEBUG os_brick.initiator.connectors.lightos [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.575 227766 DEBUG os_brick.initiator.connectors.lightos [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.576 227766 DEBUG os_brick.utils [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] <== get_connector_properties: return (69ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:26:42 np0005593234 nova_compute[227762]: 2026-01-23 10:26:42.576 227766 DEBUG nova.virt.block_device [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Updating existing volume attachment record: 7bf13d07-b8cd-4849-8619-87436191ce4e _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:26:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 e328: 3 total, 3 up, 3 in
Jan 23 05:26:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:42.861 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:42.862 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:42.862 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:43 np0005593234 nova_compute[227762]: 2026-01-23 10:26:43.023 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:43 np0005593234 nova_compute[227762]: 2026-01-23 10:26:43.042 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:43.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:43 np0005593234 nova_compute[227762]: 2026-01-23 10:26:43.274 227766 DEBUG nova.compute.manager [req-34700fe2-2453-4929-af19-b956bfacc782 req-8123571d-a198-4044-840a-e7c88167679b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Received event network-changed-54d9357e-ac9f-458b-b6ce-6da38bc7a025 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:43 np0005593234 nova_compute[227762]: 2026-01-23 10:26:43.275 227766 DEBUG nova.compute.manager [req-34700fe2-2453-4929-af19-b956bfacc782 req-8123571d-a198-4044-840a-e7c88167679b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Refreshing instance network info cache due to event network-changed-54d9357e-ac9f-458b-b6ce-6da38bc7a025. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:26:43 np0005593234 nova_compute[227762]: 2026-01-23 10:26:43.276 227766 DEBUG oslo_concurrency.lockutils [req-34700fe2-2453-4929-af19-b956bfacc782 req-8123571d-a198-4044-840a-e7c88167679b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:43 np0005593234 nova_compute[227762]: 2026-01-23 10:26:43.276 227766 DEBUG oslo_concurrency.lockutils [req-34700fe2-2453-4929-af19-b956bfacc782 req-8123571d-a198-4044-840a-e7c88167679b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:43 np0005593234 nova_compute[227762]: 2026-01-23 10:26:43.277 227766 DEBUG nova.network.neutron [req-34700fe2-2453-4929-af19-b956bfacc782 req-8123571d-a198-4044-840a-e7c88167679b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Refreshing network info cache for port 54d9357e-ac9f-458b-b6ce-6da38bc7a025 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:26:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:26:43 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4140796097' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:26:43 np0005593234 nova_compute[227762]: 2026-01-23 10:26:43.607 227766 DEBUG nova.objects.instance [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'flavor' on Instance uuid 12b37be9-93a2-4e10-9056-68a743ed2673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:43 np0005593234 nova_compute[227762]: 2026-01-23 10:26:43.634 227766 DEBUG nova.virt.libvirt.driver [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Attempting to attach volume fd29228e-f2b9-4bc3-af15-5ec512d8169b with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:26:43 np0005593234 nova_compute[227762]: 2026-01-23 10:26:43.638 227766 DEBUG nova.virt.libvirt.guest [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:26:43 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:26:43 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-fd29228e-f2b9-4bc3-af15-5ec512d8169b">
Jan 23 05:26:43 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:26:43 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:26:43 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:26:43 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:26:43 np0005593234 nova_compute[227762]:  <auth username="openstack">
Jan 23 05:26:43 np0005593234 nova_compute[227762]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:26:43 np0005593234 nova_compute[227762]:  </auth>
Jan 23 05:26:43 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:26:43 np0005593234 nova_compute[227762]:  <serial>fd29228e-f2b9-4bc3-af15-5ec512d8169b</serial>
Jan 23 05:26:43 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:26:43 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:26:43 np0005593234 nova_compute[227762]: 2026-01-23 10:26:43.792 227766 DEBUG nova.virt.libvirt.driver [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:26:43 np0005593234 nova_compute[227762]: 2026-01-23 10:26:43.792 227766 DEBUG nova.virt.libvirt.driver [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:26:43 np0005593234 nova_compute[227762]: 2026-01-23 10:26:43.793 227766 DEBUG nova.virt.libvirt.driver [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:26:43 np0005593234 nova_compute[227762]: 2026-01-23 10:26:43.793 227766 DEBUG nova.virt.libvirt.driver [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No VIF found with MAC fa:16:3e:09:3c:ad, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:26:44 np0005593234 nova_compute[227762]: 2026-01-23 10:26:44.102 227766 DEBUG oslo_concurrency.lockutils [None req-8a411831-5e06-4240-a4b9-660634fbe137 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.797s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:44.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:45.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:46 np0005593234 nova_compute[227762]: 2026-01-23 10:26:46.077 227766 DEBUG nova.network.neutron [req-34700fe2-2453-4929-af19-b956bfacc782 req-8123571d-a198-4044-840a-e7c88167679b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Updated VIF entry in instance network info cache for port 54d9357e-ac9f-458b-b6ce-6da38bc7a025. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:26:46 np0005593234 nova_compute[227762]: 2026-01-23 10:26:46.078 227766 DEBUG nova.network.neutron [req-34700fe2-2453-4929-af19-b956bfacc782 req-8123571d-a198-4044-840a-e7c88167679b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Updating instance_info_cache with network_info: [{"id": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "address": "fa:16:3e:e4:7b:6d", "network": {"id": "e7a48d7e-0ec9-4b5d-b243-77d724af740b", "bridge": "br-int", "label": "tempest-network-smoke--114616362", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d9357e-ac", "ovs_interfaceid": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:46 np0005593234 nova_compute[227762]: 2026-01-23 10:26:46.103 227766 DEBUG oslo_concurrency.lockutils [req-34700fe2-2453-4929-af19-b956bfacc782 req-8123571d-a198-4044-840a-e7c88167679b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:46.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:46 np0005593234 nova_compute[227762]: 2026-01-23 10:26:46.623 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:46 np0005593234 nova_compute[227762]: 2026-01-23 10:26:46.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:46 np0005593234 nova_compute[227762]: 2026-01-23 10:26:46.768 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:46 np0005593234 nova_compute[227762]: 2026-01-23 10:26:46.769 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:46 np0005593234 nova_compute[227762]: 2026-01-23 10:26:46.769 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:46 np0005593234 nova_compute[227762]: 2026-01-23 10:26:46.770 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:26:46 np0005593234 nova_compute[227762]: 2026-01-23 10:26:46.770 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.052 227766 DEBUG oslo_concurrency.lockutils [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "12b37be9-93a2-4e10-9056-68a743ed2673" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.053 227766 DEBUG oslo_concurrency.lockutils [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:26:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:47.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.074 227766 DEBUG nova.objects.instance [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'flavor' on Instance uuid 12b37be9-93a2-4e10-9056-68a743ed2673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.124 227766 DEBUG oslo_concurrency.lockutils [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:26:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/378381195' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.215 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.307 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000a9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.308 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000a9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.312 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000a6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.312 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000a6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.313 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000a6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.433 227766 DEBUG oslo_concurrency.lockutils [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "12b37be9-93a2-4e10-9056-68a743ed2673" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.434 227766 DEBUG oslo_concurrency.lockutils [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.434 227766 INFO nova.compute.manager [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Attaching volume bc9f2ac6-a45e-44ec-92bc-8e8e9f030cb1 to /dev/vdc#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.519 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.521 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3980MB free_disk=20.855281829833984GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.521 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.521 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.624 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 12b37be9-93a2-4e10-9056-68a743ed2673 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.624 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance b31fafb4-3888-4647-9d5d-5f528ff795b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.624 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.625 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.632 227766 DEBUG os_brick.utils [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.634 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.646 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.646 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.647 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[b7699785-1978-4ed5-bdc5-9b263b9631ed]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.650 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.660 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.660 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[76fb2c41-b873-47c3-815d-5a856ee4db49]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.662 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.671 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.671 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.670 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.670 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[31f7d3b0-49ff-4031-85c7-89634d523f73]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.675 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[89309af8-c352-442f-a7f1-ac60db2d8de8]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.676 227766 DEBUG oslo_concurrency.processutils [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.701 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.705 227766 DEBUG oslo_concurrency.processutils [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "nvme version" returned: 0 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.706 227766 DEBUG os_brick.initiator.connectors.lightos [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.707 227766 DEBUG os_brick.initiator.connectors.lightos [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.707 227766 DEBUG os_brick.initiator.connectors.lightos [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.707 227766 DEBUG os_brick.utils [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] <== get_connector_properties: return (75ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.707 227766 DEBUG nova.virt.block_device [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Updating existing volume attachment record: 612a1775-2755-4c28-b014-c749b2705287 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.731 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:26:47 np0005593234 nova_compute[227762]: 2026-01-23 10:26:47.822 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:26:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.026 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.044 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:26:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:48.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:26:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:26:48 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3534917292' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.282 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.288 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.330 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.372 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.372 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.851s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:26:48 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2981338672' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.709 227766 DEBUG nova.objects.instance [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'flavor' on Instance uuid 12b37be9-93a2-4e10-9056-68a743ed2673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.742 227766 DEBUG nova.virt.libvirt.driver [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Attempting to attach volume bc9f2ac6-a45e-44ec-92bc-8e8e9f030cb1 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.745 227766 DEBUG nova.virt.libvirt.guest [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:26:48 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:26:48 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-bc9f2ac6-a45e-44ec-92bc-8e8e9f030cb1">
Jan 23 05:26:48 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:26:48 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:26:48 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:26:48 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:26:48 np0005593234 nova_compute[227762]:  <auth username="openstack">
Jan 23 05:26:48 np0005593234 nova_compute[227762]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:26:48 np0005593234 nova_compute[227762]:  </auth>
Jan 23 05:26:48 np0005593234 nova_compute[227762]:  <target dev="vdc" bus="virtio"/>
Jan 23 05:26:48 np0005593234 nova_compute[227762]:  <serial>bc9f2ac6-a45e-44ec-92bc-8e8e9f030cb1</serial>
Jan 23 05:26:48 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:26:48 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.879 227766 DEBUG nova.virt.libvirt.driver [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.880 227766 DEBUG nova.virt.libvirt.driver [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.880 227766 DEBUG nova.virt.libvirt.driver [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.880 227766 DEBUG nova.virt.libvirt.driver [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:26:48 np0005593234 nova_compute[227762]: 2026-01-23 10:26:48.881 227766 DEBUG nova.virt.libvirt.driver [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] No VIF found with MAC fa:16:3e:09:3c:ad, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:26:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:49.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:49 np0005593234 nova_compute[227762]: 2026-01-23 10:26:49.141 227766 DEBUG oslo_concurrency.lockutils [None req-55de22bc-0e14-4d71-bd6f-b2c3919d8d2c 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:50Z|00082|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e4:7b:6d 10.100.0.6
Jan 23 05:26:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:50Z|00083|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e4:7b:6d 10.100.0.6
Jan 23 05:26:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:50.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.017 227766 DEBUG oslo_concurrency.lockutils [None req-36e15907-14b9-4faf-8f26-391ad8d6a941 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "12b37be9-93a2-4e10-9056-68a743ed2673" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.017 227766 DEBUG oslo_concurrency.lockutils [None req-36e15907-14b9-4faf-8f26-391ad8d6a941 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.034 227766 INFO nova.compute.manager [None req-36e15907-14b9-4faf-8f26-391ad8d6a941 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Detaching volume fd29228e-f2b9-4bc3-af15-5ec512d8169b#033[00m
Jan 23 05:26:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:51.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.255 227766 INFO nova.virt.block_device [None req-36e15907-14b9-4faf-8f26-391ad8d6a941 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Attempting to driver detach volume fd29228e-f2b9-4bc3-af15-5ec512d8169b from mountpoint /dev/vdb#033[00m
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.263 227766 DEBUG nova.virt.libvirt.driver [None req-36e15907-14b9-4faf-8f26-391ad8d6a941 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Attempting to detach device vdb from instance 12b37be9-93a2-4e10-9056-68a743ed2673 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.263 227766 DEBUG nova.virt.libvirt.guest [None req-36e15907-14b9-4faf-8f26-391ad8d6a941 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:26:51 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:26:51 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-fd29228e-f2b9-4bc3-af15-5ec512d8169b">
Jan 23 05:26:51 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:26:51 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:26:51 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:26:51 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:26:51 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:26:51 np0005593234 nova_compute[227762]:  <serial>fd29228e-f2b9-4bc3-af15-5ec512d8169b</serial>
Jan 23 05:26:51 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:26:51 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:26:51 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.271 227766 INFO nova.virt.libvirt.driver [None req-36e15907-14b9-4faf-8f26-391ad8d6a941 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Successfully detached device vdb from instance 12b37be9-93a2-4e10-9056-68a743ed2673 from the persistent domain config.#033[00m
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.272 227766 DEBUG nova.virt.libvirt.driver [None req-36e15907-14b9-4faf-8f26-391ad8d6a941 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 12b37be9-93a2-4e10-9056-68a743ed2673 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.272 227766 DEBUG nova.virt.libvirt.guest [None req-36e15907-14b9-4faf-8f26-391ad8d6a941 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:26:51 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:26:51 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-fd29228e-f2b9-4bc3-af15-5ec512d8169b">
Jan 23 05:26:51 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:26:51 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:26:51 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:26:51 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:26:51 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:26:51 np0005593234 nova_compute[227762]:  <serial>fd29228e-f2b9-4bc3-af15-5ec512d8169b</serial>
Jan 23 05:26:51 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:26:51 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:26:51 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.318 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769164011.3186486, 12b37be9-93a2-4e10-9056-68a743ed2673 => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.320 227766 DEBUG nova.virt.libvirt.driver [None req-36e15907-14b9-4faf-8f26-391ad8d6a941 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 12b37be9-93a2-4e10-9056-68a743ed2673 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.322 227766 INFO nova.virt.libvirt.driver [None req-36e15907-14b9-4faf-8f26-391ad8d6a941 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Successfully detached device vdb from instance 12b37be9-93a2-4e10-9056-68a743ed2673 from the live domain config.#033[00m
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.372 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.568 227766 DEBUG nova.objects.instance [None req-36e15907-14b9-4faf-8f26-391ad8d6a941 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'flavor' on Instance uuid 12b37be9-93a2-4e10-9056-68a743ed2673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.611 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:51 np0005593234 nova_compute[227762]: 2026-01-23 10:26:51.644 227766 DEBUG oslo_concurrency.lockutils [None req-36e15907-14b9-4faf-8f26-391ad8d6a941 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:52.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:52 np0005593234 nova_compute[227762]: 2026-01-23 10:26:52.843 227766 DEBUG oslo_concurrency.lockutils [None req-6ed7576e-0225-4342-88ee-ce9cf39f5cd7 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "12b37be9-93a2-4e10-9056-68a743ed2673" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:52 np0005593234 nova_compute[227762]: 2026-01-23 10:26:52.844 227766 DEBUG oslo_concurrency.lockutils [None req-6ed7576e-0225-4342-88ee-ce9cf39f5cd7 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:52 np0005593234 nova_compute[227762]: 2026-01-23 10:26:52.863 227766 INFO nova.compute.manager [None req-6ed7576e-0225-4342-88ee-ce9cf39f5cd7 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Detaching volume bc9f2ac6-a45e-44ec-92bc-8e8e9f030cb1#033[00m
Jan 23 05:26:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:53 np0005593234 nova_compute[227762]: 2026-01-23 10:26:53.023 227766 INFO nova.virt.block_device [None req-6ed7576e-0225-4342-88ee-ce9cf39f5cd7 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Attempting to driver detach volume bc9f2ac6-a45e-44ec-92bc-8e8e9f030cb1 from mountpoint /dev/vdc#033[00m
Jan 23 05:26:53 np0005593234 nova_compute[227762]: 2026-01-23 10:26:53.030 227766 DEBUG nova.virt.libvirt.driver [None req-6ed7576e-0225-4342-88ee-ce9cf39f5cd7 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Attempting to detach device vdc from instance 12b37be9-93a2-4e10-9056-68a743ed2673 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:26:53 np0005593234 nova_compute[227762]: 2026-01-23 10:26:53.030 227766 DEBUG nova.virt.libvirt.guest [None req-6ed7576e-0225-4342-88ee-ce9cf39f5cd7 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:26:53 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:26:53 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-bc9f2ac6-a45e-44ec-92bc-8e8e9f030cb1">
Jan 23 05:26:53 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:26:53 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:26:53 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:26:53 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:26:53 np0005593234 nova_compute[227762]:  <target dev="vdc" bus="virtio"/>
Jan 23 05:26:53 np0005593234 nova_compute[227762]:  <serial>bc9f2ac6-a45e-44ec-92bc-8e8e9f030cb1</serial>
Jan 23 05:26:53 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 23 05:26:53 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:26:53 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:26:53 np0005593234 nova_compute[227762]: 2026-01-23 10:26:53.031 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:53 np0005593234 nova_compute[227762]: 2026-01-23 10:26:53.037 227766 INFO nova.virt.libvirt.driver [None req-6ed7576e-0225-4342-88ee-ce9cf39f5cd7 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Successfully detached device vdc from instance 12b37be9-93a2-4e10-9056-68a743ed2673 from the persistent domain config.#033[00m
Jan 23 05:26:53 np0005593234 nova_compute[227762]: 2026-01-23 10:26:53.037 227766 DEBUG nova.virt.libvirt.driver [None req-6ed7576e-0225-4342-88ee-ce9cf39f5cd7 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance 12b37be9-93a2-4e10-9056-68a743ed2673 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:26:53 np0005593234 nova_compute[227762]: 2026-01-23 10:26:53.038 227766 DEBUG nova.virt.libvirt.guest [None req-6ed7576e-0225-4342-88ee-ce9cf39f5cd7 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:26:53 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:26:53 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-bc9f2ac6-a45e-44ec-92bc-8e8e9f030cb1">
Jan 23 05:26:53 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:26:53 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:26:53 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:26:53 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:26:53 np0005593234 nova_compute[227762]:  <target dev="vdc" bus="virtio"/>
Jan 23 05:26:53 np0005593234 nova_compute[227762]:  <serial>bc9f2ac6-a45e-44ec-92bc-8e8e9f030cb1</serial>
Jan 23 05:26:53 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x07" slot="0x00" function="0x0"/>
Jan 23 05:26:53 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:26:53 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:26:53 np0005593234 nova_compute[227762]: 2026-01-23 10:26:53.045 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:53.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:53 np0005593234 nova_compute[227762]: 2026-01-23 10:26:53.087 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769164013.0875585, 12b37be9-93a2-4e10-9056-68a743ed2673 => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:26:53 np0005593234 nova_compute[227762]: 2026-01-23 10:26:53.089 227766 DEBUG nova.virt.libvirt.driver [None req-6ed7576e-0225-4342-88ee-ce9cf39f5cd7 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance 12b37be9-93a2-4e10-9056-68a743ed2673 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:26:53 np0005593234 nova_compute[227762]: 2026-01-23 10:26:53.090 227766 INFO nova.virt.libvirt.driver [None req-6ed7576e-0225-4342-88ee-ce9cf39f5cd7 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Successfully detached device vdc from instance 12b37be9-93a2-4e10-9056-68a743ed2673 from the live domain config.#033[00m
Jan 23 05:26:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:54.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:55.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:55 np0005593234 nova_compute[227762]: 2026-01-23 10:26:55.402 227766 DEBUG nova.objects.instance [None req-6ed7576e-0225-4342-88ee-ce9cf39f5cd7 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'flavor' on Instance uuid 12b37be9-93a2-4e10-9056-68a743ed2673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:55 np0005593234 nova_compute[227762]: 2026-01-23 10:26:55.663 227766 DEBUG oslo_concurrency.lockutils [None req-6ed7576e-0225-4342-88ee-ce9cf39f5cd7 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 2.819s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:55 np0005593234 nova_compute[227762]: 2026-01-23 10:26:55.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:55 np0005593234 nova_compute[227762]: 2026-01-23 10:26:55.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:26:55 np0005593234 nova_compute[227762]: 2026-01-23 10:26:55.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:26:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:56.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:56 np0005593234 nova_compute[227762]: 2026-01-23 10:26:56.216 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-12b37be9-93a2-4e10-9056-68a743ed2673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:26:56 np0005593234 nova_compute[227762]: 2026-01-23 10:26:56.217 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-12b37be9-93a2-4e10-9056-68a743ed2673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:26:56 np0005593234 nova_compute[227762]: 2026-01-23 10:26:56.217 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:26:56 np0005593234 nova_compute[227762]: 2026-01-23 10:26:56.217 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 12b37be9-93a2-4e10-9056-68a743ed2673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:57.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:57 np0005593234 nova_compute[227762]: 2026-01-23 10:26:57.497 227766 DEBUG oslo_concurrency.lockutils [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "12b37be9-93a2-4e10-9056-68a743ed2673" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:57 np0005593234 nova_compute[227762]: 2026-01-23 10:26:57.498 227766 DEBUG oslo_concurrency.lockutils [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:57 np0005593234 nova_compute[227762]: 2026-01-23 10:26:57.498 227766 DEBUG oslo_concurrency.lockutils [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:57 np0005593234 nova_compute[227762]: 2026-01-23 10:26:57.498 227766 DEBUG oslo_concurrency.lockutils [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:57 np0005593234 nova_compute[227762]: 2026-01-23 10:26:57.499 227766 DEBUG oslo_concurrency.lockutils [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:57 np0005593234 nova_compute[227762]: 2026-01-23 10:26:57.500 227766 INFO nova.compute.manager [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Terminating instance#033[00m
Jan 23 05:26:57 np0005593234 nova_compute[227762]: 2026-01-23 10:26:57.501 227766 DEBUG nova.compute.manager [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:26:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.033 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.047 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.122 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Updating instance_info_cache with network_info: [{"id": "33699e18-9d87-4a6b-9145-84572bf07525", "address": "fa:16:3e:09:3c:ad", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33699e18-9d", "ovs_interfaceid": "33699e18-9d87-4a6b-9145-84572bf07525", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.138 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-12b37be9-93a2-4e10-9056-68a743ed2673" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.138 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.138 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:58.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:58 np0005593234 kernel: tap33699e18-9d (unregistering): left promiscuous mode
Jan 23 05:26:58 np0005593234 NetworkManager[48942]: <info>  [1769164018.3552] device (tap33699e18-9d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:26:58 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:58Z|00710|binding|INFO|Releasing lport 33699e18-9d87-4a6b-9145-84572bf07525 from this chassis (sb_readonly=0)
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.366 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:58 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:58Z|00711|binding|INFO|Setting lport 33699e18-9d87-4a6b-9145-84572bf07525 down in Southbound
Jan 23 05:26:58 np0005593234 ovn_controller[134547]: 2026-01-23T10:26:58Z|00712|binding|INFO|Removing iface tap33699e18-9d ovn-installed in OVS
Jan 23 05:26:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:58.380 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:09:3c:ad 10.100.0.11'], port_security=['fa:16:3e:09:3c:ad 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '12b37be9-93a2-4e10-9056-68a743ed2673', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1280650e-e283-4ddc-81aa-357640520155', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c7c25c6bb33b41bf9cd8febb8259fd87', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3994afaf-c8e7-4265-9c7f-619a98353860', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.247'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d4684203-7828-4ea2-86ad-83030eb9aefe, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=33699e18-9d87-4a6b-9145-84572bf07525) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:26:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:58.381 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 33699e18-9d87-4a6b-9145-84572bf07525 in datapath 1280650e-e283-4ddc-81aa-357640520155 unbound from our chassis#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.382 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:58.383 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1280650e-e283-4ddc-81aa-357640520155, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:26:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:58.384 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[25e26521-bb29-4e8d-899d-ba82eec7d640]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:58.384 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1280650e-e283-4ddc-81aa-357640520155 namespace which is not needed anymore#033[00m
Jan 23 05:26:58 np0005593234 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a6.scope: Deactivated successfully.
Jan 23 05:26:58 np0005593234 systemd[1]: machine-qemu\x2d78\x2dinstance\x2d000000a6.scope: Consumed 14.817s CPU time.
Jan 23 05:26:58 np0005593234 systemd-machined[195626]: Machine qemu-78-instance-000000a6 terminated.
Jan 23 05:26:58 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[303512]: [NOTICE]   (303516) : haproxy version is 2.8.14-c23fe91
Jan 23 05:26:58 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[303512]: [NOTICE]   (303516) : path to executable is /usr/sbin/haproxy
Jan 23 05:26:58 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[303512]: [WARNING]  (303516) : Exiting Master process...
Jan 23 05:26:58 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[303512]: [WARNING]  (303516) : Exiting Master process...
Jan 23 05:26:58 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[303512]: [ALERT]    (303516) : Current worker (303518) exited with code 143 (Terminated)
Jan 23 05:26:58 np0005593234 neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155[303512]: [WARNING]  (303516) : All workers exited. Exiting... (0)
Jan 23 05:26:58 np0005593234 systemd[1]: libpod-1e27abb58bc5026297d6ebeba50c7c056aeac06acfcfebd7717dafd476e3e0aa.scope: Deactivated successfully.
Jan 23 05:26:58 np0005593234 podman[304514]: 2026-01-23 10:26:58.503952019 +0000 UTC m=+0.041365296 container died 1e27abb58bc5026297d6ebeba50c7c056aeac06acfcfebd7717dafd476e3e0aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:26:58 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1e27abb58bc5026297d6ebeba50c7c056aeac06acfcfebd7717dafd476e3e0aa-userdata-shm.mount: Deactivated successfully.
Jan 23 05:26:58 np0005593234 systemd[1]: var-lib-containers-storage-overlay-b7ad02a52b5c851035b5b90dcc6111692236ef4a8982f301b569b01ee01337b2-merged.mount: Deactivated successfully.
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.539 227766 INFO nova.virt.libvirt.driver [-] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Instance destroyed successfully.#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.539 227766 DEBUG nova.objects.instance [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lazy-loading 'resources' on Instance uuid 12b37be9-93a2-4e10-9056-68a743ed2673 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:26:58 np0005593234 podman[304514]: 2026-01-23 10:26:58.548665658 +0000 UTC m=+0.086078935 container cleanup 1e27abb58bc5026297d6ebeba50c7c056aeac06acfcfebd7717dafd476e3e0aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:26:58 np0005593234 systemd[1]: libpod-conmon-1e27abb58bc5026297d6ebeba50c7c056aeac06acfcfebd7717dafd476e3e0aa.scope: Deactivated successfully.
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.564 227766 DEBUG nova.virt.libvirt.vif [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:25:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-AttachVolumeTestJSON-server-1294604962',display_name='tempest-AttachVolumeTestJSON-server-1294604962',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumetestjson-server-1294604962',id=166,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIl3PJnyMfFo6c2aSbcndGrUeCJDPvPLm0yGb1s4ShCt8tulHq0w/NQxJ8e+7tYOXA+JcLABr8r+WehowojFcIvgwp+tPzjk76SVFT14Seq+GvrBIJWX3DZtL3BrePr6pw==',key_name='tempest-keypair-1905896331',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:26:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c7c25c6bb33b41bf9cd8febb8259fd87',ramdisk_id='',reservation_id='r-292m7c9k',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeTestJSON-345871886',owner_user_name='tempest-AttachVolumeTestJSON-345871886-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:26:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='01b7396ecc574dd6ba2df2f406921223',uuid=12b37be9-93a2-4e10-9056-68a743ed2673,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "33699e18-9d87-4a6b-9145-84572bf07525", "address": "fa:16:3e:09:3c:ad", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33699e18-9d", "ovs_interfaceid": "33699e18-9d87-4a6b-9145-84572bf07525", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.564 227766 DEBUG nova.network.os_vif_util [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converting VIF {"id": "33699e18-9d87-4a6b-9145-84572bf07525", "address": "fa:16:3e:09:3c:ad", "network": {"id": "1280650e-e283-4ddc-81aa-357640520155", "bridge": "br-int", "label": "tempest-AttachVolumeTestJSON-970937620-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c7c25c6bb33b41bf9cd8febb8259fd87", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33699e18-9d", "ovs_interfaceid": "33699e18-9d87-4a6b-9145-84572bf07525", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.565 227766 DEBUG nova.network.os_vif_util [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:09:3c:ad,bridge_name='br-int',has_traffic_filtering=True,id=33699e18-9d87-4a6b-9145-84572bf07525,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33699e18-9d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.565 227766 DEBUG os_vif [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:3c:ad,bridge_name='br-int',has_traffic_filtering=True,id=33699e18-9d87-4a6b-9145-84572bf07525,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33699e18-9d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.567 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.567 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33699e18-9d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.571 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.574 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.576 227766 INFO os_vif [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:09:3c:ad,bridge_name='br-int',has_traffic_filtering=True,id=33699e18-9d87-4a6b-9145-84572bf07525,network=Network(1280650e-e283-4ddc-81aa-357640520155),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33699e18-9d')#033[00m
Jan 23 05:26:58 np0005593234 podman[304554]: 2026-01-23 10:26:58.611520111 +0000 UTC m=+0.044332339 container remove 1e27abb58bc5026297d6ebeba50c7c056aeac06acfcfebd7717dafd476e3e0aa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 05:26:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:58.616 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b62e8510-3aea-45de-b7c7-6414abba9140]: (4, ('Fri Jan 23 10:26:58 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155 (1e27abb58bc5026297d6ebeba50c7c056aeac06acfcfebd7717dafd476e3e0aa)\n1e27abb58bc5026297d6ebeba50c7c056aeac06acfcfebd7717dafd476e3e0aa\nFri Jan 23 10:26:58 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-1280650e-e283-4ddc-81aa-357640520155 (1e27abb58bc5026297d6ebeba50c7c056aeac06acfcfebd7717dafd476e3e0aa)\n1e27abb58bc5026297d6ebeba50c7c056aeac06acfcfebd7717dafd476e3e0aa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:58.619 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[41971a7b-d17e-465d-b021-f4f18e101305]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:58.620 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1280650e-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.621 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:58 np0005593234 kernel: tap1280650e-e0: left promiscuous mode
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.635 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:26:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:58.636 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c93a1b08-06ff-4a83-b69f-9ddab4a535ab]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:58.655 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[83dbcf05-42f8-4394-a144-8b44cecbb52b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:58.656 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[683e263d-f53e-4ce2-ae45-2be6ca86b370]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:58.670 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b4ddfd00-7617-44b5-8eff-598330e4ca39]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 786874, 'reachable_time': 18420, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304587, 'error': None, 'target': 'ovnmeta-1280650e-e283-4ddc-81aa-357640520155', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:58 np0005593234 systemd[1]: run-netns-ovnmeta\x2d1280650e\x2de283\x2d4ddc\x2d81aa\x2d357640520155.mount: Deactivated successfully.
Jan 23 05:26:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:58.674 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1280650e-e283-4ddc-81aa-357640520155 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:26:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:26:58.674 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[a5c5f316-2e68-47cf-8933-0f3113bf7c1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.940 227766 DEBUG nova.compute.manager [req-bf6b8a01-dbe7-47d7-a3cb-30ef65275334 req-11a19c77-27e2-47e4-a253-866caa7ba013 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Received event network-vif-unplugged-33699e18-9d87-4a6b-9145-84572bf07525 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.941 227766 DEBUG oslo_concurrency.lockutils [req-bf6b8a01-dbe7-47d7-a3cb-30ef65275334 req-11a19c77-27e2-47e4-a253-866caa7ba013 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.941 227766 DEBUG oslo_concurrency.lockutils [req-bf6b8a01-dbe7-47d7-a3cb-30ef65275334 req-11a19c77-27e2-47e4-a253-866caa7ba013 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.941 227766 DEBUG oslo_concurrency.lockutils [req-bf6b8a01-dbe7-47d7-a3cb-30ef65275334 req-11a19c77-27e2-47e4-a253-866caa7ba013 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.941 227766 DEBUG nova.compute.manager [req-bf6b8a01-dbe7-47d7-a3cb-30ef65275334 req-11a19c77-27e2-47e4-a253-866caa7ba013 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] No waiting events found dispatching network-vif-unplugged-33699e18-9d87-4a6b-9145-84572bf07525 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:26:58 np0005593234 nova_compute[227762]: 2026-01-23 10:26:58.942 227766 DEBUG nova.compute.manager [req-bf6b8a01-dbe7-47d7-a3cb-30ef65275334 req-11a19c77-27e2-47e4-a253-866caa7ba013 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Received event network-vif-unplugged-33699e18-9d87-4a6b-9145-84572bf07525 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:26:59 np0005593234 nova_compute[227762]: 2026-01-23 10:26:59.060 227766 INFO nova.virt.libvirt.driver [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Deleting instance files /var/lib/nova/instances/12b37be9-93a2-4e10-9056-68a743ed2673_del#033[00m
Jan 23 05:26:59 np0005593234 nova_compute[227762]: 2026-01-23 10:26:59.061 227766 INFO nova.virt.libvirt.driver [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Deletion of /var/lib/nova/instances/12b37be9-93a2-4e10-9056-68a743ed2673_del complete#033[00m
Jan 23 05:26:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:26:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:26:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:59.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:26:59 np0005593234 nova_compute[227762]: 2026-01-23 10:26:59.146 227766 INFO nova.compute.manager [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Took 1.65 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:26:59 np0005593234 nova_compute[227762]: 2026-01-23 10:26:59.147 227766 DEBUG oslo.service.loopingcall [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:26:59 np0005593234 nova_compute[227762]: 2026-01-23 10:26:59.147 227766 DEBUG nova.compute.manager [-] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:26:59 np0005593234 nova_compute[227762]: 2026-01-23 10:26:59.147 227766 DEBUG nova.network.neutron [-] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:27:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:00.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:00 np0005593234 nova_compute[227762]: 2026-01-23 10:27:00.307 227766 DEBUG nova.network.neutron [-] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:27:00 np0005593234 nova_compute[227762]: 2026-01-23 10:27:00.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:01.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:01 np0005593234 nova_compute[227762]: 2026-01-23 10:27:01.164 227766 DEBUG nova.compute.manager [req-53b0fb3c-5c5e-4194-b93d-a552be05352b req-e443b56e-ae66-401b-ab6b-8767e3858fc8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Received event network-vif-deleted-33699e18-9d87-4a6b-9145-84572bf07525 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:01 np0005593234 nova_compute[227762]: 2026-01-23 10:27:01.165 227766 INFO nova.compute.manager [req-53b0fb3c-5c5e-4194-b93d-a552be05352b req-e443b56e-ae66-401b-ab6b-8767e3858fc8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Neutron deleted interface 33699e18-9d87-4a6b-9145-84572bf07525; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:27:01 np0005593234 nova_compute[227762]: 2026-01-23 10:27:01.165 227766 DEBUG nova.network.neutron [req-53b0fb3c-5c5e-4194-b93d-a552be05352b req-e443b56e-ae66-401b-ab6b-8767e3858fc8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:27:01 np0005593234 nova_compute[227762]: 2026-01-23 10:27:01.168 227766 DEBUG nova.compute.manager [req-a369b285-d83a-4c7a-9386-4f600e2aabab req-39d5487f-a8b1-4ee8-8206-213236903d08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Received event network-vif-plugged-33699e18-9d87-4a6b-9145-84572bf07525 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:01 np0005593234 nova_compute[227762]: 2026-01-23 10:27:01.168 227766 DEBUG oslo_concurrency.lockutils [req-a369b285-d83a-4c7a-9386-4f600e2aabab req-39d5487f-a8b1-4ee8-8206-213236903d08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:01 np0005593234 nova_compute[227762]: 2026-01-23 10:27:01.168 227766 DEBUG oslo_concurrency.lockutils [req-a369b285-d83a-4c7a-9386-4f600e2aabab req-39d5487f-a8b1-4ee8-8206-213236903d08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:01 np0005593234 nova_compute[227762]: 2026-01-23 10:27:01.168 227766 DEBUG oslo_concurrency.lockutils [req-a369b285-d83a-4c7a-9386-4f600e2aabab req-39d5487f-a8b1-4ee8-8206-213236903d08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:01 np0005593234 nova_compute[227762]: 2026-01-23 10:27:01.169 227766 DEBUG nova.compute.manager [req-a369b285-d83a-4c7a-9386-4f600e2aabab req-39d5487f-a8b1-4ee8-8206-213236903d08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] No waiting events found dispatching network-vif-plugged-33699e18-9d87-4a6b-9145-84572bf07525 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:01 np0005593234 nova_compute[227762]: 2026-01-23 10:27:01.169 227766 WARNING nova.compute.manager [req-a369b285-d83a-4c7a-9386-4f600e2aabab req-39d5487f-a8b1-4ee8-8206-213236903d08 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Received unexpected event network-vif-plugged-33699e18-9d87-4a6b-9145-84572bf07525 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:27:01 np0005593234 nova_compute[227762]: 2026-01-23 10:27:01.194 227766 INFO nova.compute.manager [-] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Took 2.05 seconds to deallocate network for instance.#033[00m
Jan 23 05:27:01 np0005593234 nova_compute[227762]: 2026-01-23 10:27:01.204 227766 DEBUG nova.compute.manager [req-53b0fb3c-5c5e-4194-b93d-a552be05352b req-e443b56e-ae66-401b-ab6b-8767e3858fc8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Detach interface failed, port_id=33699e18-9d87-4a6b-9145-84572bf07525, reason: Instance 12b37be9-93a2-4e10-9056-68a743ed2673 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 05:27:01 np0005593234 nova_compute[227762]: 2026-01-23 10:27:01.609 227766 DEBUG oslo_concurrency.lockutils [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:01 np0005593234 nova_compute[227762]: 2026-01-23 10:27:01.610 227766 DEBUG oslo_concurrency.lockutils [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:01 np0005593234 nova_compute[227762]: 2026-01-23 10:27:01.681 227766 DEBUG oslo_concurrency.processutils [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:27:02 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1873328022' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:27:02 np0005593234 nova_compute[227762]: 2026-01-23 10:27:02.133 227766 DEBUG oslo_concurrency.processutils [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:02 np0005593234 nova_compute[227762]: 2026-01-23 10:27:02.140 227766 DEBUG nova.compute.provider_tree [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:27:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:02.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:02 np0005593234 nova_compute[227762]: 2026-01-23 10:27:02.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:02 np0005593234 nova_compute[227762]: 2026-01-23 10:27:02.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:27:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:03 np0005593234 nova_compute[227762]: 2026-01-23 10:27:03.049 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:03.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:03 np0005593234 nova_compute[227762]: 2026-01-23 10:27:03.348 227766 DEBUG nova.scheduler.client.report [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:27:03 np0005593234 nova_compute[227762]: 2026-01-23 10:27:03.477 227766 DEBUG oslo_concurrency.lockutils [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.867s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:03 np0005593234 nova_compute[227762]: 2026-01-23 10:27:03.569 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:03 np0005593234 nova_compute[227762]: 2026-01-23 10:27:03.596 227766 INFO nova.scheduler.client.report [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Deleted allocations for instance 12b37be9-93a2-4e10-9056-68a743ed2673#033[00m
Jan 23 05:27:03 np0005593234 podman[304613]: 2026-01-23 10:27:03.777885713 +0000 UTC m=+0.070252604 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 05:27:03 np0005593234 nova_compute[227762]: 2026-01-23 10:27:03.810 227766 DEBUG oslo_concurrency.lockutils [None req-a13da01a-ccba-4ea1-bf75-f462da5d3a75 01b7396ecc574dd6ba2df2f406921223 c7c25c6bb33b41bf9cd8febb8259fd87 - - default default] Lock "12b37be9-93a2-4e10-9056-68a743ed2673" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:04.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:05.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:06.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:07.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:08 np0005593234 nova_compute[227762]: 2026-01-23 10:27:08.051 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:08 np0005593234 nova_compute[227762]: 2026-01-23 10:27:08.116 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "0691611c-0064-4559-8c77-6c437a3b1d28" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:08 np0005593234 nova_compute[227762]: 2026-01-23 10:27:08.117 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:08.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:08 np0005593234 nova_compute[227762]: 2026-01-23 10:27:08.572 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:08 np0005593234 nova_compute[227762]: 2026-01-23 10:27:08.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:09.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:10 np0005593234 nova_compute[227762]: 2026-01-23 10:27:10.019 227766 DEBUG nova.compute.manager [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:27:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:10.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:11.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:12 np0005593234 nova_compute[227762]: 2026-01-23 10:27:12.170 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:12 np0005593234 nova_compute[227762]: 2026-01-23 10:27:12.171 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:12 np0005593234 nova_compute[227762]: 2026-01-23 10:27:12.177 227766 DEBUG nova.virt.hardware [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:27:12 np0005593234 nova_compute[227762]: 2026-01-23 10:27:12.177 227766 INFO nova.compute.claims [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:27:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:12.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:12 np0005593234 podman[304638]: 2026-01-23 10:27:12.804279435 +0000 UTC m=+0.094776465 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:27:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:13 np0005593234 nova_compute[227762]: 2026-01-23 10:27:13.053 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:13.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:13 np0005593234 nova_compute[227762]: 2026-01-23 10:27:13.203 227766 DEBUG oslo_concurrency.processutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:13 np0005593234 nova_compute[227762]: 2026-01-23 10:27:13.536 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164018.5358787, 12b37be9-93a2-4e10-9056-68a743ed2673 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:27:13 np0005593234 nova_compute[227762]: 2026-01-23 10:27:13.537 227766 INFO nova.compute.manager [-] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:27:13 np0005593234 nova_compute[227762]: 2026-01-23 10:27:13.559 227766 DEBUG nova.compute.manager [None req-025cb5a7-0340-410f-a875-b00f21555962 - - - - - -] [instance: 12b37be9-93a2-4e10-9056-68a743ed2673] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:27:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:27:13 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3343046869' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:27:13 np0005593234 nova_compute[227762]: 2026-01-23 10:27:13.632 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:13 np0005593234 nova_compute[227762]: 2026-01-23 10:27:13.640 227766 DEBUG oslo_concurrency.processutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:13 np0005593234 nova_compute[227762]: 2026-01-23 10:27:13.646 227766 DEBUG nova.compute.provider_tree [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:27:13 np0005593234 nova_compute[227762]: 2026-01-23 10:27:13.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:13 np0005593234 nova_compute[227762]: 2026-01-23 10:27:13.991 227766 DEBUG nova.scheduler.client.report [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:27:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:14.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:14 np0005593234 nova_compute[227762]: 2026-01-23 10:27:14.274 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:14 np0005593234 nova_compute[227762]: 2026-01-23 10:27:14.275 227766 DEBUG nova.compute.manager [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:27:14 np0005593234 nova_compute[227762]: 2026-01-23 10:27:14.506 227766 DEBUG nova.compute.manager [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:27:14 np0005593234 nova_compute[227762]: 2026-01-23 10:27:14.506 227766 DEBUG nova.network.neutron [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:27:14 np0005593234 nova_compute[227762]: 2026-01-23 10:27:14.881 227766 DEBUG nova.policy [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60291ce86b6946629a2e48f6680312cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:27:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:15.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:15 np0005593234 nova_compute[227762]: 2026-01-23 10:27:15.142 227766 INFO nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:27:15 np0005593234 nova_compute[227762]: 2026-01-23 10:27:15.836 227766 DEBUG nova.compute.manager [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:27:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:16.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:17.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:17 np0005593234 nova_compute[227762]: 2026-01-23 10:27:17.814 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:17.814 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=62, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=61) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:27:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:17.816 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:27:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.055 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.216 227766 DEBUG nova.compute.manager [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.217 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.217 227766 INFO nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Creating image(s)#033[00m
Jan 23 05:27:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:18.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.238 227766 DEBUG nova.storage.rbd_utils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 0691611c-0064-4559-8c77-6c437a3b1d28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.264 227766 DEBUG nova.storage.rbd_utils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 0691611c-0064-4559-8c77-6c437a3b1d28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.288 227766 DEBUG nova.storage.rbd_utils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 0691611c-0064-4559-8c77-6c437a3b1d28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.291 227766 DEBUG oslo_concurrency.processutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.396 227766 DEBUG oslo_concurrency.processutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.397 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.398 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.399 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.425 227766 DEBUG nova.storage.rbd_utils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 0691611c-0064-4559-8c77-6c437a3b1d28_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.429 227766 DEBUG oslo_concurrency.processutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0691611c-0064-4559-8c77-6c437a3b1d28_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:18 np0005593234 nova_compute[227762]: 2026-01-23 10:27:18.634 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:19.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:19 np0005593234 nova_compute[227762]: 2026-01-23 10:27:19.322 227766 DEBUG nova.network.neutron [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Successfully created port: 3655f265-3e79-476b-b0ee-7515d802ad05 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:27:19 np0005593234 nova_compute[227762]: 2026-01-23 10:27:19.508 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:19 np0005593234 nova_compute[227762]: 2026-01-23 10:27:19.510 227766 DEBUG oslo_concurrency.processutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 0691611c-0064-4559-8c77-6c437a3b1d28_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:19 np0005593234 nova_compute[227762]: 2026-01-23 10:27:19.592 227766 DEBUG nova.storage.rbd_utils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] resizing rbd image 0691611c-0064-4559-8c77-6c437a3b1d28_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:27:19 np0005593234 nova_compute[227762]: 2026-01-23 10:27:19.719 227766 DEBUG nova.objects.instance [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 0691611c-0064-4559-8c77-6c437a3b1d28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:19 np0005593234 nova_compute[227762]: 2026-01-23 10:27:19.738 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:27:19 np0005593234 nova_compute[227762]: 2026-01-23 10:27:19.738 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Ensure instance console log exists: /var/lib/nova/instances/0691611c-0064-4559-8c77-6c437a3b1d28/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:27:19 np0005593234 nova_compute[227762]: 2026-01-23 10:27:19.739 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:19 np0005593234 nova_compute[227762]: 2026-01-23 10:27:19.739 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:19 np0005593234 nova_compute[227762]: 2026-01-23 10:27:19.739 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:20.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:21.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:22.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:27:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2824879916' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:27:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:27:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2824879916' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:27:23 np0005593234 nova_compute[227762]: 2026-01-23 10:27:23.057 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:23.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:23 np0005593234 nova_compute[227762]: 2026-01-23 10:27:23.682 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:24 np0005593234 nova_compute[227762]: 2026-01-23 10:27:24.106 227766 DEBUG nova.network.neutron [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Successfully updated port: 3655f265-3e79-476b-b0ee-7515d802ad05 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:27:24 np0005593234 nova_compute[227762]: 2026-01-23 10:27:24.145 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "refresh_cache-0691611c-0064-4559-8c77-6c437a3b1d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:27:24 np0005593234 nova_compute[227762]: 2026-01-23 10:27:24.145 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquired lock "refresh_cache-0691611c-0064-4559-8c77-6c437a3b1d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:27:24 np0005593234 nova_compute[227762]: 2026-01-23 10:27:24.146 227766 DEBUG nova.network.neutron [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:27:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:24.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:24 np0005593234 nova_compute[227762]: 2026-01-23 10:27:24.466 227766 DEBUG nova.compute.manager [req-93686b77-b0d1-4b54-8a83-0538e0b07df9 req-28f4a573-fcb2-47e7-ab92-dd9b78cebd80 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Received event network-changed-3655f265-3e79-476b-b0ee-7515d802ad05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:24 np0005593234 nova_compute[227762]: 2026-01-23 10:27:24.467 227766 DEBUG nova.compute.manager [req-93686b77-b0d1-4b54-8a83-0538e0b07df9 req-28f4a573-fcb2-47e7-ab92-dd9b78cebd80 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Refreshing instance network info cache due to event network-changed-3655f265-3e79-476b-b0ee-7515d802ad05. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:27:24 np0005593234 nova_compute[227762]: 2026-01-23 10:27:24.468 227766 DEBUG oslo_concurrency.lockutils [req-93686b77-b0d1-4b54-8a83-0538e0b07df9 req-28f4a573-fcb2-47e7-ab92-dd9b78cebd80 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-0691611c-0064-4559-8c77-6c437a3b1d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:27:24 np0005593234 nova_compute[227762]: 2026-01-23 10:27:24.545 227766 DEBUG nova.network.neutron [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:27:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:25.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:27:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3180018281' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:27:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:27:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3180018281' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:27:25 np0005593234 nova_compute[227762]: 2026-01-23 10:27:25.903 227766 DEBUG nova.network.neutron [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Updating instance_info_cache with network_info: [{"id": "3655f265-3e79-476b-b0ee-7515d802ad05", "address": "fa:16:3e:dd:0c:c9", "network": {"id": "e9a30b2a-861e-43f1-ad49-9f70838e4d73", "bridge": "br-int", "label": "tempest-network-smoke--796549785", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3655f265-3e", "ovs_interfaceid": "3655f265-3e79-476b-b0ee-7515d802ad05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:27:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:26.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.445 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Releasing lock "refresh_cache-0691611c-0064-4559-8c77-6c437a3b1d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.445 227766 DEBUG nova.compute.manager [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Instance network_info: |[{"id": "3655f265-3e79-476b-b0ee-7515d802ad05", "address": "fa:16:3e:dd:0c:c9", "network": {"id": "e9a30b2a-861e-43f1-ad49-9f70838e4d73", "bridge": "br-int", "label": "tempest-network-smoke--796549785", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3655f265-3e", "ovs_interfaceid": "3655f265-3e79-476b-b0ee-7515d802ad05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.446 227766 DEBUG oslo_concurrency.lockutils [req-93686b77-b0d1-4b54-8a83-0538e0b07df9 req-28f4a573-fcb2-47e7-ab92-dd9b78cebd80 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-0691611c-0064-4559-8c77-6c437a3b1d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.446 227766 DEBUG nova.network.neutron [req-93686b77-b0d1-4b54-8a83-0538e0b07df9 req-28f4a573-fcb2-47e7-ab92-dd9b78cebd80 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Refreshing network info cache for port 3655f265-3e79-476b-b0ee-7515d802ad05 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.450 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Start _get_guest_xml network_info=[{"id": "3655f265-3e79-476b-b0ee-7515d802ad05", "address": "fa:16:3e:dd:0c:c9", "network": {"id": "e9a30b2a-861e-43f1-ad49-9f70838e4d73", "bridge": "br-int", "label": "tempest-network-smoke--796549785", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3655f265-3e", "ovs_interfaceid": "3655f265-3e79-476b-b0ee-7515d802ad05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.457 227766 WARNING nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.466 227766 DEBUG nova.virt.libvirt.host [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.467 227766 DEBUG nova.virt.libvirt.host [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.471 227766 DEBUG nova.virt.libvirt.host [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.472 227766 DEBUG nova.virt.libvirt.host [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.473 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.474 227766 DEBUG nova.virt.hardware [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.474 227766 DEBUG nova.virt.hardware [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.475 227766 DEBUG nova.virt.hardware [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.475 227766 DEBUG nova.virt.hardware [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.475 227766 DEBUG nova.virt.hardware [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.476 227766 DEBUG nova.virt.hardware [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.476 227766 DEBUG nova.virt.hardware [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.476 227766 DEBUG nova.virt.hardware [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.476 227766 DEBUG nova.virt.hardware [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.477 227766 DEBUG nova.virt.hardware [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.477 227766 DEBUG nova.virt.hardware [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.480 227766 DEBUG oslo_concurrency.processutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:27:26 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3850688487' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.937 227766 DEBUG oslo_concurrency.processutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.960 227766 DEBUG nova.storage.rbd_utils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 0691611c-0064-4559-8c77-6c437a3b1d28_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:27:26 np0005593234 nova_compute[227762]: 2026-01-23 10:27:26.964 227766 DEBUG oslo_concurrency.processutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:27.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:27:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1900182275' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.423 227766 DEBUG oslo_concurrency.processutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.426 227766 DEBUG nova.virt.libvirt.vif [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:27:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-498738817',display_name='tempest-TestNetworkBasicOps-server-498738817',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-498738817',id=171,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ0MOr1ZXYTOMxqhCSzGVv7TnAMPzxDZWjTMFAcXgD/tRBlRHPzSyzU59fQfLx6eAnMbo1gdikjTHATMM7fFoGqqkpyHJCxu4lcKE1O3ar3uaP0DSC+sPXD6UKfjXuo70A==',key_name='tempest-TestNetworkBasicOps-340669632',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-u0c7ay0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:27:17Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=0691611c-0064-4559-8c77-6c437a3b1d28,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3655f265-3e79-476b-b0ee-7515d802ad05", "address": "fa:16:3e:dd:0c:c9", "network": {"id": "e9a30b2a-861e-43f1-ad49-9f70838e4d73", "bridge": "br-int", "label": "tempest-network-smoke--796549785", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3655f265-3e", "ovs_interfaceid": "3655f265-3e79-476b-b0ee-7515d802ad05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.426 227766 DEBUG nova.network.os_vif_util [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "3655f265-3e79-476b-b0ee-7515d802ad05", "address": "fa:16:3e:dd:0c:c9", "network": {"id": "e9a30b2a-861e-43f1-ad49-9f70838e4d73", "bridge": "br-int", "label": "tempest-network-smoke--796549785", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3655f265-3e", "ovs_interfaceid": "3655f265-3e79-476b-b0ee-7515d802ad05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.427 227766 DEBUG nova.network.os_vif_util [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:0c:c9,bridge_name='br-int',has_traffic_filtering=True,id=3655f265-3e79-476b-b0ee-7515d802ad05,network=Network(e9a30b2a-861e-43f1-ad49-9f70838e4d73),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3655f265-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.428 227766 DEBUG nova.objects.instance [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0691611c-0064-4559-8c77-6c437a3b1d28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.711 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  <uuid>0691611c-0064-4559-8c77-6c437a3b1d28</uuid>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  <name>instance-000000ab</name>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestNetworkBasicOps-server-498738817</nova:name>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:27:26</nova:creationTime>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <nova:port uuid="3655f265-3e79-476b-b0ee-7515d802ad05">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <entry name="serial">0691611c-0064-4559-8c77-6c437a3b1d28</entry>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <entry name="uuid">0691611c-0064-4559-8c77-6c437a3b1d28</entry>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/0691611c-0064-4559-8c77-6c437a3b1d28_disk">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/0691611c-0064-4559-8c77-6c437a3b1d28_disk.config">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:dd:0c:c9"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <target dev="tap3655f265-3e"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/0691611c-0064-4559-8c77-6c437a3b1d28/console.log" append="off"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:27:27 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:27:27 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:27:27 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:27:27 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.712 227766 DEBUG nova.compute.manager [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Preparing to wait for external event network-vif-plugged-3655f265-3e79-476b-b0ee-7515d802ad05 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.713 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.713 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.713 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.714 227766 DEBUG nova.virt.libvirt.vif [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:27:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-498738817',display_name='tempest-TestNetworkBasicOps-server-498738817',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-498738817',id=171,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ0MOr1ZXYTOMxqhCSzGVv7TnAMPzxDZWjTMFAcXgD/tRBlRHPzSyzU59fQfLx6eAnMbo1gdikjTHATMM7fFoGqqkpyHJCxu4lcKE1O3ar3uaP0DSC+sPXD6UKfjXuo70A==',key_name='tempest-TestNetworkBasicOps-340669632',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-u0c7ay0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:27:17Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=0691611c-0064-4559-8c77-6c437a3b1d28,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3655f265-3e79-476b-b0ee-7515d802ad05", "address": "fa:16:3e:dd:0c:c9", "network": {"id": "e9a30b2a-861e-43f1-ad49-9f70838e4d73", "bridge": "br-int", "label": "tempest-network-smoke--796549785", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": 
[], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3655f265-3e", "ovs_interfaceid": "3655f265-3e79-476b-b0ee-7515d802ad05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.714 227766 DEBUG nova.network.os_vif_util [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "3655f265-3e79-476b-b0ee-7515d802ad05", "address": "fa:16:3e:dd:0c:c9", "network": {"id": "e9a30b2a-861e-43f1-ad49-9f70838e4d73", "bridge": "br-int", "label": "tempest-network-smoke--796549785", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3655f265-3e", "ovs_interfaceid": "3655f265-3e79-476b-b0ee-7515d802ad05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.715 227766 DEBUG nova.network.os_vif_util [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:dd:0c:c9,bridge_name='br-int',has_traffic_filtering=True,id=3655f265-3e79-476b-b0ee-7515d802ad05,network=Network(e9a30b2a-861e-43f1-ad49-9f70838e4d73),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3655f265-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.715 227766 DEBUG os_vif [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:0c:c9,bridge_name='br-int',has_traffic_filtering=True,id=3655f265-3e79-476b-b0ee-7515d802ad05,network=Network(e9a30b2a-861e-43f1-ad49-9f70838e4d73),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3655f265-3e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.716 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.717 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.717 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.722 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.722 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3655f265-3e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.723 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3655f265-3e, col_values=(('external_ids', {'iface-id': '3655f265-3e79-476b-b0ee-7515d802ad05', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:dd:0c:c9', 'vm-uuid': '0691611c-0064-4559-8c77-6c437a3b1d28'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.725 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:27:27 np0005593234 NetworkManager[48942]: <info>  [1769164047.7259] manager: (tap3655f265-3e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/342)
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.728 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.730 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.731 227766 INFO os_vif [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:dd:0c:c9,bridge_name='br-int',has_traffic_filtering=True,id=3655f265-3e79-476b-b0ee-7515d802ad05,network=Network(e9a30b2a-861e-43f1-ad49-9f70838e4d73),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3655f265-3e')
Jan 23 05:27:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:27.818 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '62'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:27:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.990 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.990 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.990 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No VIF found with MAC fa:16:3e:dd:0c:c9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 05:27:27 np0005593234 nova_compute[227762]: 2026-01-23 10:27:27.991 227766 INFO nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Using config drive
Jan 23 05:27:28 np0005593234 nova_compute[227762]: 2026-01-23 10:27:28.011 227766 DEBUG nova.storage.rbd_utils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 0691611c-0064-4559-8c77-6c437a3b1d28_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:27:28 np0005593234 nova_compute[227762]: 2026-01-23 10:27:28.060 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:27:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:28.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:28 np0005593234 nova_compute[227762]: 2026-01-23 10:27:28.835 227766 INFO nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Creating config drive at /var/lib/nova/instances/0691611c-0064-4559-8c77-6c437a3b1d28/disk.config#033[00m
Jan 23 05:27:28 np0005593234 nova_compute[227762]: 2026-01-23 10:27:28.846 227766 DEBUG oslo_concurrency.processutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0691611c-0064-4559-8c77-6c437a3b1d28/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwweo2t8a execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:28 np0005593234 nova_compute[227762]: 2026-01-23 10:27:28.989 227766 DEBUG oslo_concurrency.processutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0691611c-0064-4559-8c77-6c437a3b1d28/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwweo2t8a" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.015 227766 DEBUG nova.storage.rbd_utils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 0691611c-0064-4559-8c77-6c437a3b1d28_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.019 227766 DEBUG oslo_concurrency.processutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/0691611c-0064-4559-8c77-6c437a3b1d28/disk.config 0691611c-0064-4559-8c77-6c437a3b1d28_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:29.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.171 227766 DEBUG oslo_concurrency.processutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/0691611c-0064-4559-8c77-6c437a3b1d28/disk.config 0691611c-0064-4559-8c77-6c437a3b1d28_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.152s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.172 227766 INFO nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Deleting local config drive /var/lib/nova/instances/0691611c-0064-4559-8c77-6c437a3b1d28/disk.config because it was imported into RBD.#033[00m
Jan 23 05:27:29 np0005593234 kernel: tap3655f265-3e: entered promiscuous mode
Jan 23 05:27:29 np0005593234 NetworkManager[48942]: <info>  [1769164049.2227] manager: (tap3655f265-3e): new Tun device (/org/freedesktop/NetworkManager/Devices/343)
Jan 23 05:27:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:27:29Z|00713|binding|INFO|Claiming lport 3655f265-3e79-476b-b0ee-7515d802ad05 for this chassis.
Jan 23 05:27:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:27:29Z|00714|binding|INFO|3655f265-3e79-476b-b0ee-7515d802ad05: Claiming fa:16:3e:dd:0c:c9 10.100.0.5
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.263 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:27:29Z|00715|binding|INFO|Setting lport 3655f265-3e79-476b-b0ee-7515d802ad05 ovn-installed in OVS
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.289 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:29 np0005593234 systemd-machined[195626]: New machine qemu-80-instance-000000ab.
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.291 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:29 np0005593234 systemd-udevd[305048]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:27:29 np0005593234 NetworkManager[48942]: <info>  [1769164049.3065] device (tap3655f265-3e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:27:29 np0005593234 systemd[1]: Started Virtual Machine qemu-80-instance-000000ab.
Jan 23 05:27:29 np0005593234 NetworkManager[48942]: <info>  [1769164049.3074] device (tap3655f265-3e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:27:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:27:29Z|00716|binding|INFO|Setting lport 3655f265-3e79-476b-b0ee-7515d802ad05 up in Southbound
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.649 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:0c:c9 10.100.0.5'], port_security=['fa:16:3e:dd:0c:c9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '0691611c-0064-4559-8c77-6c437a3b1d28', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9a30b2a-861e-43f1-ad49-9f70838e4d73', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8585edb1-05f5-4381-af60-0123ecc3bd9f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6730e18a-1eb5-4e1f-9ad1-d2d85c48cc63, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=3655f265-3e79-476b-b0ee-7515d802ad05) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.650 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 3655f265-3e79-476b-b0ee-7515d802ad05 in datapath e9a30b2a-861e-43f1-ad49-9f70838e4d73 bound to our chassis#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.653 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e9a30b2a-861e-43f1-ad49-9f70838e4d73#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.666 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[718cf778-7e93-4d38-8e9e-31184b7cdec7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.667 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape9a30b2a-81 in ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.670 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape9a30b2a-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.670 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8aefd8f6-d8cc-4e0c-991c-eab3955a4157]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.671 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[627edcdf-252f-4d76-a879-3ebae81a13cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.682 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[0f6b61f7-54ff-4724-b010-1f7a77d361f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.694 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[441c4966-109b-4906-95d6-9e4788e0856b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.726 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[16e2d43f-dcf5-4ef3-a2e6-abd4ea640f4a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 systemd-udevd[305050]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:27:29 np0005593234 NetworkManager[48942]: <info>  [1769164049.7339] manager: (tape9a30b2a-80): new Veth device (/org/freedesktop/NetworkManager/Devices/344)
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.734 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[57f647aa-5553-4ef5-ab94-3d43b8c257e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.766 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164049.7662482, 0691611c-0064-4559-8c77-6c437a3b1d28 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.767 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] VM Started (Lifecycle Event)#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.770 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[8980ef50-b6f9-459d-ae59-3f7c44027037]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.773 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[bba24c5c-58f0-4685-801a-af65c11c4664]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 NetworkManager[48942]: <info>  [1769164049.7957] device (tape9a30b2a-80): carrier: link connected
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.801 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[5622140a-01f4-43c6-93bd-e873ed2d576a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.803 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.809 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164049.766443, 0691611c-0064-4559-8c77-6c437a3b1d28 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.809 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.818 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2908db9d-61e1-49ce-8e99-847d571b7068]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape9a30b2a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:ba:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 794475, 'reachable_time': 44124, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305123, 'error': None, 'target': 'ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.832 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e93ffc4f-ec2f-4e53-b4e1-79ba718bd853]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9b:bad9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 794475, 'tstamp': 794475}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 305124, 'error': None, 'target': 'ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.848 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9c07be2b-1c1c-4192-8650-74b6960dcc01]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape9a30b2a-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9b:ba:d9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 222], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 794475, 'reachable_time': 44124, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 305125, 'error': None, 'target': 'ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.866 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.870 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.877 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[360faff4-d5c7-41c3-a8bb-f3258f7ea7d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.890 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.935 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b20559d0-3151-484d-9e6e-cdc6f5bbcf36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.937 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9a30b2a-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.937 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.938 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape9a30b2a-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.939 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:29 np0005593234 kernel: tape9a30b2a-80: entered promiscuous mode
Jan 23 05:27:29 np0005593234 NetworkManager[48942]: <info>  [1769164049.9409] manager: (tape9a30b2a-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/345)
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.942 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape9a30b2a-80, col_values=(('external_ids', {'iface-id': 'aab0e8f7-90c9-48c5-8303-2455c5df9b8d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.945 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:27:29Z|00717|binding|INFO|Releasing lport aab0e8f7-90c9-48c5-8303-2455c5df9b8d from this chassis (sb_readonly=0)
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.946 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.947 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e9a30b2a-861e-43f1-ad49-9f70838e4d73.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e9a30b2a-861e-43f1-ad49-9f70838e4d73.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.948 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d70bd620-ad1d-4919-afd8-9fbffdf33b4e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.948 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-e9a30b2a-861e-43f1-ad49-9f70838e4d73
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/e9a30b2a-861e-43f1-ad49-9f70838e4d73.pid.haproxy
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID e9a30b2a-861e-43f1-ad49-9f70838e4d73
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:27:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:29.949 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73', 'env', 'PROCESS_TAG=haproxy-e9a30b2a-861e-43f1-ad49-9f70838e4d73', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e9a30b2a-861e-43f1-ad49-9f70838e4d73.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:27:29 np0005593234 nova_compute[227762]: 2026-01-23 10:27:29.960 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.119 227766 DEBUG nova.compute.manager [req-c20d4550-38fb-4a29-b1ee-4ec66c3611c0 req-3ae0fb29-d355-48ca-81b5-907252988044 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Received event network-vif-plugged-3655f265-3e79-476b-b0ee-7515d802ad05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.119 227766 DEBUG oslo_concurrency.lockutils [req-c20d4550-38fb-4a29-b1ee-4ec66c3611c0 req-3ae0fb29-d355-48ca-81b5-907252988044 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.120 227766 DEBUG oslo_concurrency.lockutils [req-c20d4550-38fb-4a29-b1ee-4ec66c3611c0 req-3ae0fb29-d355-48ca-81b5-907252988044 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.121 227766 DEBUG oslo_concurrency.lockutils [req-c20d4550-38fb-4a29-b1ee-4ec66c3611c0 req-3ae0fb29-d355-48ca-81b5-907252988044 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.121 227766 DEBUG nova.compute.manager [req-c20d4550-38fb-4a29-b1ee-4ec66c3611c0 req-3ae0fb29-d355-48ca-81b5-907252988044 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Processing event network-vif-plugged-3655f265-3e79-476b-b0ee-7515d802ad05 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.121 227766 DEBUG nova.compute.manager [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.130 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164050.1301825, 0691611c-0064-4559-8c77-6c437a3b1d28 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.130 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.132 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.144 227766 INFO nova.virt.libvirt.driver [-] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Instance spawned successfully.#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.144 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.164 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.169 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.172 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.173 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.173 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.173 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.174 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.174 227766 DEBUG nova.virt.libvirt.driver [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.218 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:27:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:30.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.265 227766 INFO nova.compute.manager [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Took 12.05 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.265 227766 DEBUG nova.compute.manager [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.274 227766 DEBUG nova.network.neutron [req-93686b77-b0d1-4b54-8a83-0538e0b07df9 req-28f4a573-fcb2-47e7-ab92-dd9b78cebd80 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Updated VIF entry in instance network info cache for port 3655f265-3e79-476b-b0ee-7515d802ad05. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.275 227766 DEBUG nova.network.neutron [req-93686b77-b0d1-4b54-8a83-0538e0b07df9 req-28f4a573-fcb2-47e7-ab92-dd9b78cebd80 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Updating instance_info_cache with network_info: [{"id": "3655f265-3e79-476b-b0ee-7515d802ad05", "address": "fa:16:3e:dd:0c:c9", "network": {"id": "e9a30b2a-861e-43f1-ad49-9f70838e4d73", "bridge": "br-int", "label": "tempest-network-smoke--796549785", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3655f265-3e", "ovs_interfaceid": "3655f265-3e79-476b-b0ee-7515d802ad05", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:27:30 np0005593234 podman[305158]: 2026-01-23 10:27:30.29480286 +0000 UTC m=+0.049555081 container create e6aa8df7de50d6efd9250f8456be850ae57106181c4f867877f011033aa2ad9e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:27:30 np0005593234 systemd[1]: Started libpod-conmon-e6aa8df7de50d6efd9250f8456be850ae57106181c4f867877f011033aa2ad9e.scope.
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.336 227766 DEBUG oslo_concurrency.lockutils [req-93686b77-b0d1-4b54-8a83-0538e0b07df9 req-28f4a573-fcb2-47e7-ab92-dd9b78cebd80 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-0691611c-0064-4559-8c77-6c437a3b1d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:27:30 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:27:30 np0005593234 podman[305158]: 2026-01-23 10:27:30.267843093 +0000 UTC m=+0.022595344 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.367 227766 INFO nova.compute.manager [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Took 18.27 seconds to build instance.#033[00m
Jan 23 05:27:30 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73c8196ddf65a660e928bc8be039ec1db347b3521d221d97f16283f61b6e40bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:27:30 np0005593234 podman[305158]: 2026-01-23 10:27:30.383884288 +0000 UTC m=+0.138636529 container init e6aa8df7de50d6efd9250f8456be850ae57106181c4f867877f011033aa2ad9e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:27:30 np0005593234 nova_compute[227762]: 2026-01-23 10:27:30.387 227766 DEBUG oslo_concurrency.lockutils [None req-01654a47-984b-4fe7-918f-252842ab8c98 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:30 np0005593234 podman[305158]: 2026-01-23 10:27:30.390028919 +0000 UTC m=+0.144781140 container start e6aa8df7de50d6efd9250f8456be850ae57106181c4f867877f011033aa2ad9e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:27:30 np0005593234 neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73[305173]: [NOTICE]   (305177) : New worker (305179) forked
Jan 23 05:27:30 np0005593234 neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73[305173]: [NOTICE]   (305177) : Loading success.
Jan 23 05:27:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:31.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:32.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:32 np0005593234 nova_compute[227762]: 2026-01-23 10:27:32.727 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:32 np0005593234 nova_compute[227762]: 2026-01-23 10:27:32.868 227766 DEBUG nova.compute.manager [req-f59a4e93-3c51-4b18-9fe7-5919d0f0f962 req-85c684e0-05fd-45a7-a0e5-3bba8663ed68 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Received event network-vif-plugged-3655f265-3e79-476b-b0ee-7515d802ad05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:32 np0005593234 nova_compute[227762]: 2026-01-23 10:27:32.869 227766 DEBUG oslo_concurrency.lockutils [req-f59a4e93-3c51-4b18-9fe7-5919d0f0f962 req-85c684e0-05fd-45a7-a0e5-3bba8663ed68 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:32 np0005593234 nova_compute[227762]: 2026-01-23 10:27:32.869 227766 DEBUG oslo_concurrency.lockutils [req-f59a4e93-3c51-4b18-9fe7-5919d0f0f962 req-85c684e0-05fd-45a7-a0e5-3bba8663ed68 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:32 np0005593234 nova_compute[227762]: 2026-01-23 10:27:32.869 227766 DEBUG oslo_concurrency.lockutils [req-f59a4e93-3c51-4b18-9fe7-5919d0f0f962 req-85c684e0-05fd-45a7-a0e5-3bba8663ed68 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:32 np0005593234 nova_compute[227762]: 2026-01-23 10:27:32.870 227766 DEBUG nova.compute.manager [req-f59a4e93-3c51-4b18-9fe7-5919d0f0f962 req-85c684e0-05fd-45a7-a0e5-3bba8663ed68 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] No waiting events found dispatching network-vif-plugged-3655f265-3e79-476b-b0ee-7515d802ad05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:27:32 np0005593234 nova_compute[227762]: 2026-01-23 10:27:32.870 227766 WARNING nova.compute.manager [req-f59a4e93-3c51-4b18-9fe7-5919d0f0f962 req-85c684e0-05fd-45a7-a0e5-3bba8663ed68 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Received unexpected event network-vif-plugged-3655f265-3e79-476b-b0ee-7515d802ad05 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:27:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:33 np0005593234 nova_compute[227762]: 2026-01-23 10:27:33.061 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:33 np0005593234 nova_compute[227762]: 2026-01-23 10:27:33.130 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:27:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:33.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:27:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:34.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:27:34Z|00718|binding|INFO|Releasing lport ec60a79a-3050-4cfa-9c46-cf939e3eeae0 from this chassis (sb_readonly=0)
Jan 23 05:27:34 np0005593234 ovn_controller[134547]: 2026-01-23T10:27:34Z|00719|binding|INFO|Releasing lport aab0e8f7-90c9-48c5-8303-2455c5df9b8d from this chassis (sb_readonly=0)
Jan 23 05:27:34 np0005593234 nova_compute[227762]: 2026-01-23 10:27:34.339 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:34 np0005593234 podman[305190]: 2026-01-23 10:27:34.769613734 +0000 UTC m=+0.062934127 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:27:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:35.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:35 np0005593234 nova_compute[227762]: 2026-01-23 10:27:35.154 227766 DEBUG nova.compute.manager [req-2f016fa3-3c7e-4621-a42d-7a550bdde1e4 req-26e76b6f-e66e-4bc4-a943-6313f290ab76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Received event network-changed-3655f265-3e79-476b-b0ee-7515d802ad05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:27:35 np0005593234 nova_compute[227762]: 2026-01-23 10:27:35.155 227766 DEBUG nova.compute.manager [req-2f016fa3-3c7e-4621-a42d-7a550bdde1e4 req-26e76b6f-e66e-4bc4-a943-6313f290ab76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Refreshing instance network info cache due to event network-changed-3655f265-3e79-476b-b0ee-7515d802ad05. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:27:35 np0005593234 nova_compute[227762]: 2026-01-23 10:27:35.155 227766 DEBUG oslo_concurrency.lockutils [req-2f016fa3-3c7e-4621-a42d-7a550bdde1e4 req-26e76b6f-e66e-4bc4-a943-6313f290ab76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-0691611c-0064-4559-8c77-6c437a3b1d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:27:35 np0005593234 nova_compute[227762]: 2026-01-23 10:27:35.155 227766 DEBUG oslo_concurrency.lockutils [req-2f016fa3-3c7e-4621-a42d-7a550bdde1e4 req-26e76b6f-e66e-4bc4-a943-6313f290ab76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-0691611c-0064-4559-8c77-6c437a3b1d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:27:35 np0005593234 nova_compute[227762]: 2026-01-23 10:27:35.155 227766 DEBUG nova.network.neutron [req-2f016fa3-3c7e-4621-a42d-7a550bdde1e4 req-26e76b6f-e66e-4bc4-a943-6313f290ab76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Refreshing network info cache for port 3655f265-3e79-476b-b0ee-7515d802ad05 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:27:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:36.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:37.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:37 np0005593234 nova_compute[227762]: 2026-01-23 10:27:37.730 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:38 np0005593234 nova_compute[227762]: 2026-01-23 10:27:38.063 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:38.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:38 np0005593234 nova_compute[227762]: 2026-01-23 10:27:38.942 227766 DEBUG nova.network.neutron [req-2f016fa3-3c7e-4621-a42d-7a550bdde1e4 req-26e76b6f-e66e-4bc4-a943-6313f290ab76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Updated VIF entry in instance network info cache for port 3655f265-3e79-476b-b0ee-7515d802ad05. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:27:38 np0005593234 nova_compute[227762]: 2026-01-23 10:27:38.943 227766 DEBUG nova.network.neutron [req-2f016fa3-3c7e-4621-a42d-7a550bdde1e4 req-26e76b6f-e66e-4bc4-a943-6313f290ab76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Updating instance_info_cache with network_info: [{"id": "3655f265-3e79-476b-b0ee-7515d802ad05", "address": "fa:16:3e:dd:0c:c9", "network": {"id": "e9a30b2a-861e-43f1-ad49-9f70838e4d73", "bridge": "br-int", "label": "tempest-network-smoke--796549785", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3655f265-3e", "ovs_interfaceid": "3655f265-3e79-476b-b0ee-7515d802ad05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:27:38 np0005593234 nova_compute[227762]: 2026-01-23 10:27:38.980 227766 DEBUG oslo_concurrency.lockutils [req-2f016fa3-3c7e-4621-a42d-7a550bdde1e4 req-26e76b6f-e66e-4bc4-a943-6313f290ab76 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-0691611c-0064-4559-8c77-6c437a3b1d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:27:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:39.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:40.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:41.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:42.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:27:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:27:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:27:42 np0005593234 nova_compute[227762]: 2026-01-23 10:27:42.732 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:42.863 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:42.865 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:27:42.866 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:43 np0005593234 nova_compute[227762]: 2026-01-23 10:27:43.065 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:43.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:43 np0005593234 ovn_controller[134547]: 2026-01-23T10:27:43Z|00084|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:dd:0c:c9 10.100.0.5
Jan 23 05:27:43 np0005593234 ovn_controller[134547]: 2026-01-23T10:27:43Z|00085|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:0c:c9 10.100.0.5
Jan 23 05:27:43 np0005593234 podman[305399]: 2026-01-23 10:27:43.822063486 +0000 UTC m=+0.118255085 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:27:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:44.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:45.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:46.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:46 np0005593234 nova_compute[227762]: 2026-01-23 10:27:46.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:46 np0005593234 nova_compute[227762]: 2026-01-23 10:27:46.779 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:46 np0005593234 nova_compute[227762]: 2026-01-23 10:27:46.779 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:46 np0005593234 nova_compute[227762]: 2026-01-23 10:27:46.779 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:46 np0005593234 nova_compute[227762]: 2026-01-23 10:27:46.780 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:27:46 np0005593234 nova_compute[227762]: 2026-01-23 10:27:46.780 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:47.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:27:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3660968379' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.220 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.322 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000a9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.323 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000a9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.327 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000ab as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.327 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000ab as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.734 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.790 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.793 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3943MB free_disk=20.790687561035156GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.793 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.794 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.913 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance b31fafb4-3888-4647-9d5d-5f528ff795b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.913 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 0691611c-0064-4559-8c77-6c437a3b1d28 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.914 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.914 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:27:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:47 np0005593234 nova_compute[227762]: 2026-01-23 10:27:47.996 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:27:48 np0005593234 nova_compute[227762]: 2026-01-23 10:27:48.066 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:48.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:27:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:27:48 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3998462272' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:27:48 np0005593234 nova_compute[227762]: 2026-01-23 10:27:48.433 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:27:48 np0005593234 nova_compute[227762]: 2026-01-23 10:27:48.438 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:27:48 np0005593234 nova_compute[227762]: 2026-01-23 10:27:48.473 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:27:48 np0005593234 nova_compute[227762]: 2026-01-23 10:27:48.515 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:27:48 np0005593234 nova_compute[227762]: 2026-01-23 10:27:48.516 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.722s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:27:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:49.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:50.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:51.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:51 np0005593234 nova_compute[227762]: 2026-01-23 10:27:51.515 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:52 np0005593234 nova_compute[227762]: 2026-01-23 10:27:52.208 227766 INFO nova.compute.manager [None req-271fc330-477c-4b3d-89ff-de7b2b6a22f6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Get console output#033[00m
Jan 23 05:27:52 np0005593234 nova_compute[227762]: 2026-01-23 10:27:52.216 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:27:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:52.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:52 np0005593234 nova_compute[227762]: 2026-01-23 10:27:52.736 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:27:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:27:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:53 np0005593234 nova_compute[227762]: 2026-01-23 10:27:53.069 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:53 np0005593234 ovn_controller[134547]: 2026-01-23T10:27:53Z|00086|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:0c:c9 10.100.0.5
Jan 23 05:27:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:53.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:54.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:55.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:55 np0005593234 nova_compute[227762]: 2026-01-23 10:27:55.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:27:55 np0005593234 nova_compute[227762]: 2026-01-23 10:27:55.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:27:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:56.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:57.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:57 np0005593234 nova_compute[227762]: 2026-01-23 10:27:57.339 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:27:57 np0005593234 nova_compute[227762]: 2026-01-23 10:27:57.339 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:27:57 np0005593234 nova_compute[227762]: 2026-01-23 10:27:57.339 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:27:57 np0005593234 nova_compute[227762]: 2026-01-23 10:27:57.739 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:27:58 np0005593234 nova_compute[227762]: 2026-01-23 10:27:58.070 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:27:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:27:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:58.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:27:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:27:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:27:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:59.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:28:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:28:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:00.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:28:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:01.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:02.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:02 np0005593234 nova_compute[227762]: 2026-01-23 10:28:02.742 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:03 np0005593234 nova_compute[227762]: 2026-01-23 10:28:03.071 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:03.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:04.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:05.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:05 np0005593234 podman[305583]: 2026-01-23 10:28:05.750491336 +0000 UTC m=+0.046404473 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 23 05:28:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:28:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:06.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:28:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:07.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:07 np0005593234 nova_compute[227762]: 2026-01-23 10:28:07.745 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:08 np0005593234 nova_compute[227762]: 2026-01-23 10:28:08.110 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:28:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:08.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:28:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:09.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:10.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:28:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:11.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:28:12 np0005593234 nova_compute[227762]: 2026-01-23 10:28:12.243 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:12.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:12 np0005593234 nova_compute[227762]: 2026-01-23 10:28:12.747 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:13 np0005593234 nova_compute[227762]: 2026-01-23 10:28:13.111 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:13.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:14.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:14 np0005593234 podman[305608]: 2026-01-23 10:28:14.78826962 +0000 UTC m=+0.083475094 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 23 05:28:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:15.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:16.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:17 np0005593234 ovn_controller[134547]: 2026-01-23T10:28:17Z|00087|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:0c:c9 10.100.0.5
Jan 23 05:28:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:17.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:17 np0005593234 nova_compute[227762]: 2026-01-23 10:28:17.750 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:18 np0005593234 nova_compute[227762]: 2026-01-23 10:28:18.148 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:28:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:18.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:28:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:19.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:20.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:21.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:21 np0005593234 ovn_controller[134547]: 2026-01-23T10:28:21Z|00088|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:0c:c9 10.100.0.5
Jan 23 05:28:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:22.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:22 np0005593234 nova_compute[227762]: 2026-01-23 10:28:22.754 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:23 np0005593234 nova_compute[227762]: 2026-01-23 10:28:23.151 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:23.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:24.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:24 np0005593234 ovn_controller[134547]: 2026-01-23T10:28:24Z|00089|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:dd:0c:c9 10.100.0.5
Jan 23 05:28:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:25.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:26.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:26 np0005593234 nova_compute[227762]: 2026-01-23 10:28:26.625 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:26.626 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=63, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=62) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:28:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:26.628 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:28:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:27.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:27.630 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '63'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:28:27 np0005593234 nova_compute[227762]: 2026-01-23 10:28:27.756 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:28 np0005593234 nova_compute[227762]: 2026-01-23 10:28:28.202 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:28.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:29.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:30.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:31.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:32.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.615 227766 DEBUG oslo_concurrency.lockutils [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "0691611c-0064-4559-8c77-6c437a3b1d28" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.616 227766 DEBUG oslo_concurrency.lockutils [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.616 227766 DEBUG oslo_concurrency.lockutils [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.616 227766 DEBUG oslo_concurrency.lockutils [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.616 227766 DEBUG oslo_concurrency.lockutils [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.618 227766 INFO nova.compute.manager [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Terminating instance#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.619 227766 DEBUG nova.compute.manager [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.710 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Updating instance_info_cache with network_info: [{"id": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "address": "fa:16:3e:e4:7b:6d", "network": {"id": "e7a48d7e-0ec9-4b5d-b243-77d724af740b", "bridge": "br-int", "label": "tempest-network-smoke--114616362", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d9357e-ac", "ovs_interfaceid": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.759 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.774 227766 DEBUG nova.compute.manager [req-ebe65839-1d60-4811-bc7b-e65d8d67d128 req-daffc1de-3eb9-4592-b500-0734694bcef2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Received event network-changed-3655f265-3e79-476b-b0ee-7515d802ad05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.775 227766 DEBUG nova.compute.manager [req-ebe65839-1d60-4811-bc7b-e65d8d67d128 req-daffc1de-3eb9-4592-b500-0734694bcef2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Refreshing instance network info cache due to event network-changed-3655f265-3e79-476b-b0ee-7515d802ad05. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.775 227766 DEBUG oslo_concurrency.lockutils [req-ebe65839-1d60-4811-bc7b-e65d8d67d128 req-daffc1de-3eb9-4592-b500-0734694bcef2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-0691611c-0064-4559-8c77-6c437a3b1d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.775 227766 DEBUG oslo_concurrency.lockutils [req-ebe65839-1d60-4811-bc7b-e65d8d67d128 req-daffc1de-3eb9-4592-b500-0734694bcef2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-0691611c-0064-4559-8c77-6c437a3b1d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.775 227766 DEBUG nova.network.neutron [req-ebe65839-1d60-4811-bc7b-e65d8d67d128 req-daffc1de-3eb9-4592-b500-0734694bcef2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Refreshing network info cache for port 3655f265-3e79-476b-b0ee-7515d802ad05 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.812 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.812 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.813 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.813 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.813 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.814 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.814 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:32 np0005593234 nova_compute[227762]: 2026-01-23 10:28:32.814 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:28:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e328 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:33 np0005593234 kernel: tap3655f265-3e (unregistering): left promiscuous mode
Jan 23 05:28:33 np0005593234 NetworkManager[48942]: <info>  [1769164113.1321] device (tap3655f265-3e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:28:33 np0005593234 ovn_controller[134547]: 2026-01-23T10:28:33Z|00720|binding|INFO|Releasing lport 3655f265-3e79-476b-b0ee-7515d802ad05 from this chassis (sb_readonly=0)
Jan 23 05:28:33 np0005593234 ovn_controller[134547]: 2026-01-23T10:28:33Z|00721|binding|INFO|Setting lport 3655f265-3e79-476b-b0ee-7515d802ad05 down in Southbound
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.141 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:33 np0005593234 ovn_controller[134547]: 2026-01-23T10:28:33Z|00722|binding|INFO|Removing iface tap3655f265-3e ovn-installed in OVS
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.143 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.156 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.204 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:33 np0005593234 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000ab.scope: Deactivated successfully.
Jan 23 05:28:33 np0005593234 systemd[1]: machine-qemu\x2d80\x2dinstance\x2d000000ab.scope: Consumed 16.211s CPU time.
Jan 23 05:28:33 np0005593234 systemd-machined[195626]: Machine qemu-80-instance-000000ab terminated.
Jan 23 05:28:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:33.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.265 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.270 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.291 227766 INFO nova.virt.libvirt.driver [-] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Instance destroyed successfully.#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.292 227766 DEBUG nova.objects.instance [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'resources' on Instance uuid 0691611c-0064-4559-8c77-6c437a3b1d28 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:28:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:33.539 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:dd:0c:c9 10.100.0.5'], port_security=['fa:16:3e:dd:0c:c9 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '0691611c-0064-4559-8c77-6c437a3b1d28', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e9a30b2a-861e-43f1-ad49-9f70838e4d73', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8585edb1-05f5-4381-af60-0123ecc3bd9f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6730e18a-1eb5-4e1f-9ad1-d2d85c48cc63, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=3655f265-3e79-476b-b0ee-7515d802ad05) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:28:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:33.540 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 3655f265-3e79-476b-b0ee-7515d802ad05 in datapath e9a30b2a-861e-43f1-ad49-9f70838e4d73 unbound from our chassis#033[00m
Jan 23 05:28:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:33.542 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e9a30b2a-861e-43f1-ad49-9f70838e4d73, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:28:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:33.544 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4dea51cc-4a34-4d5f-80e4-d9e3e0c2ab47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:33.544 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73 namespace which is not needed anymore#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.559 227766 DEBUG nova.virt.libvirt.vif [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:27:01Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-498738817',display_name='tempest-TestNetworkBasicOps-server-498738817',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-498738817',id=171,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJ0MOr1ZXYTOMxqhCSzGVv7TnAMPzxDZWjTMFAcXgD/tRBlRHPzSyzU59fQfLx6eAnMbo1gdikjTHATMM7fFoGqqkpyHJCxu4lcKE1O3ar3uaP0DSC+sPXD6UKfjXuo70A==',key_name='tempest-TestNetworkBasicOps-340669632',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:27:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-u0c7ay0g',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:27:30Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=0691611c-0064-4559-8c77-6c437a3b1d28,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3655f265-3e79-476b-b0ee-7515d802ad05", "address": "fa:16:3e:dd:0c:c9", "network": {"id": "e9a30b2a-861e-43f1-ad49-9f70838e4d73", "bridge": "br-int", "label": "tempest-network-smoke--796549785", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3655f265-3e", "ovs_interfaceid": "3655f265-3e79-476b-b0ee-7515d802ad05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.560 227766 DEBUG nova.network.os_vif_util [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "3655f265-3e79-476b-b0ee-7515d802ad05", "address": "fa:16:3e:dd:0c:c9", "network": {"id": "e9a30b2a-861e-43f1-ad49-9f70838e4d73", "bridge": "br-int", "label": "tempest-network-smoke--796549785", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.217", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3655f265-3e", "ovs_interfaceid": "3655f265-3e79-476b-b0ee-7515d802ad05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.561 227766 DEBUG nova.network.os_vif_util [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:dd:0c:c9,bridge_name='br-int',has_traffic_filtering=True,id=3655f265-3e79-476b-b0ee-7515d802ad05,network=Network(e9a30b2a-861e-43f1-ad49-9f70838e4d73),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3655f265-3e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.562 227766 DEBUG os_vif [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:0c:c9,bridge_name='br-int',has_traffic_filtering=True,id=3655f265-3e79-476b-b0ee-7515d802ad05,network=Network(e9a30b2a-861e-43f1-ad49-9f70838e4d73),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3655f265-3e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.565 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.565 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3655f265-3e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.567 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.568 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.571 227766 INFO os_vif [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:dd:0c:c9,bridge_name='br-int',has_traffic_filtering=True,id=3655f265-3e79-476b-b0ee-7515d802ad05,network=Network(e9a30b2a-861e-43f1-ad49-9f70838e4d73),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap3655f265-3e')#033[00m
Jan 23 05:28:33 np0005593234 neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73[305173]: [NOTICE]   (305177) : haproxy version is 2.8.14-c23fe91
Jan 23 05:28:33 np0005593234 neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73[305173]: [NOTICE]   (305177) : path to executable is /usr/sbin/haproxy
Jan 23 05:28:33 np0005593234 neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73[305173]: [WARNING]  (305177) : Exiting Master process...
Jan 23 05:28:33 np0005593234 neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73[305173]: [WARNING]  (305177) : Exiting Master process...
Jan 23 05:28:33 np0005593234 neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73[305173]: [ALERT]    (305177) : Current worker (305179) exited with code 143 (Terminated)
Jan 23 05:28:33 np0005593234 neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73[305173]: [WARNING]  (305177) : All workers exited. Exiting... (0)
Jan 23 05:28:33 np0005593234 systemd[1]: libpod-e6aa8df7de50d6efd9250f8456be850ae57106181c4f867877f011033aa2ad9e.scope: Deactivated successfully.
Jan 23 05:28:33 np0005593234 podman[305744]: 2026-01-23 10:28:33.681105417 +0000 UTC m=+0.046200787 container died e6aa8df7de50d6efd9250f8456be850ae57106181c4f867877f011033aa2ad9e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 05:28:33 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e6aa8df7de50d6efd9250f8456be850ae57106181c4f867877f011033aa2ad9e-userdata-shm.mount: Deactivated successfully.
Jan 23 05:28:33 np0005593234 systemd[1]: var-lib-containers-storage-overlay-73c8196ddf65a660e928bc8be039ec1db347b3521d221d97f16283f61b6e40bb-merged.mount: Deactivated successfully.
Jan 23 05:28:33 np0005593234 podman[305744]: 2026-01-23 10:28:33.716788226 +0000 UTC m=+0.081883596 container cleanup e6aa8df7de50d6efd9250f8456be850ae57106181c4f867877f011033aa2ad9e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:28:33 np0005593234 systemd[1]: libpod-conmon-e6aa8df7de50d6efd9250f8456be850ae57106181c4f867877f011033aa2ad9e.scope: Deactivated successfully.
Jan 23 05:28:33 np0005593234 podman[305773]: 2026-01-23 10:28:33.787042469 +0000 UTC m=+0.049986684 container remove e6aa8df7de50d6efd9250f8456be850ae57106181c4f867877f011033aa2ad9e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:28:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:33.792 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d8a1576d-fe68-4d31-81ef-47185f2a17f6]: (4, ('Fri Jan 23 10:28:33 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73 (e6aa8df7de50d6efd9250f8456be850ae57106181c4f867877f011033aa2ad9e)\ne6aa8df7de50d6efd9250f8456be850ae57106181c4f867877f011033aa2ad9e\nFri Jan 23 10:28:33 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73 (e6aa8df7de50d6efd9250f8456be850ae57106181c4f867877f011033aa2ad9e)\ne6aa8df7de50d6efd9250f8456be850ae57106181c4f867877f011033aa2ad9e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:33.794 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[432d84c3-62cd-464b-b03c-8d46e269efd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:33.795 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape9a30b2a-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.797 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:33 np0005593234 kernel: tape9a30b2a-80: left promiscuous mode
Jan 23 05:28:33 np0005593234 nova_compute[227762]: 2026-01-23 10:28:33.811 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:33.814 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[08cd48b6-7a99-44d3-949e-1c7ddc069c7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:33.831 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6958f1a0-48a7-48ac-b6f3-eb164fd54584]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:33.832 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[de248ade-c70d-4662-bf3d-9a78b938e6d8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:33.845 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b25733ff-00ea-41ca-90fd-571c9324f8d5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 794468, 'reachable_time': 42702, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305788, 'error': None, 'target': 'ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:33 np0005593234 systemd[1]: run-netns-ovnmeta\x2de9a30b2a\x2d861e\x2d43f1\x2dad49\x2d9f70838e4d73.mount: Deactivated successfully.
Jan 23 05:28:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:33.849 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e9a30b2a-861e-43f1-ad49-9f70838e4d73 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:28:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:33.850 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[1846b25c-8804-4f33-a143-170812766994]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:28:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e329 e329: 3 total, 3 up, 3 in
Jan 23 05:28:34 np0005593234 nova_compute[227762]: 2026-01-23 10:28:34.168 227766 INFO nova.virt.libvirt.driver [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Deleting instance files /var/lib/nova/instances/0691611c-0064-4559-8c77-6c437a3b1d28_del#033[00m
Jan 23 05:28:34 np0005593234 nova_compute[227762]: 2026-01-23 10:28:34.169 227766 INFO nova.virt.libvirt.driver [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Deletion of /var/lib/nova/instances/0691611c-0064-4559-8c77-6c437a3b1d28_del complete#033[00m
Jan 23 05:28:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:34.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:35.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:35 np0005593234 nova_compute[227762]: 2026-01-23 10:28:35.338 227766 INFO nova.compute.manager [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Took 2.72 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:28:35 np0005593234 nova_compute[227762]: 2026-01-23 10:28:35.338 227766 DEBUG oslo.service.loopingcall [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:28:35 np0005593234 nova_compute[227762]: 2026-01-23 10:28:35.339 227766 DEBUG nova.compute.manager [-] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:28:35 np0005593234 nova_compute[227762]: 2026-01-23 10:28:35.339 227766 DEBUG nova.network.neutron [-] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:28:35 np0005593234 nova_compute[227762]: 2026-01-23 10:28:35.470 227766 DEBUG nova.compute.manager [req-cca8d91f-3cb4-45d5-8e6e-a2da56c11fa0 req-8d2265e9-138e-49e2-9e27-5d56c847b621 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Received event network-vif-unplugged-3655f265-3e79-476b-b0ee-7515d802ad05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:28:35 np0005593234 nova_compute[227762]: 2026-01-23 10:28:35.471 227766 DEBUG oslo_concurrency.lockutils [req-cca8d91f-3cb4-45d5-8e6e-a2da56c11fa0 req-8d2265e9-138e-49e2-9e27-5d56c847b621 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:35 np0005593234 nova_compute[227762]: 2026-01-23 10:28:35.471 227766 DEBUG oslo_concurrency.lockutils [req-cca8d91f-3cb4-45d5-8e6e-a2da56c11fa0 req-8d2265e9-138e-49e2-9e27-5d56c847b621 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:35 np0005593234 nova_compute[227762]: 2026-01-23 10:28:35.471 227766 DEBUG oslo_concurrency.lockutils [req-cca8d91f-3cb4-45d5-8e6e-a2da56c11fa0 req-8d2265e9-138e-49e2-9e27-5d56c847b621 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:35 np0005593234 nova_compute[227762]: 2026-01-23 10:28:35.471 227766 DEBUG nova.compute.manager [req-cca8d91f-3cb4-45d5-8e6e-a2da56c11fa0 req-8d2265e9-138e-49e2-9e27-5d56c847b621 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] No waiting events found dispatching network-vif-unplugged-3655f265-3e79-476b-b0ee-7515d802ad05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:28:35 np0005593234 nova_compute[227762]: 2026-01-23 10:28:35.472 227766 DEBUG nova.compute.manager [req-cca8d91f-3cb4-45d5-8e6e-a2da56c11fa0 req-8d2265e9-138e-49e2-9e27-5d56c847b621 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Received event network-vif-unplugged-3655f265-3e79-476b-b0ee-7515d802ad05 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:28:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:36.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:36 np0005593234 podman[305791]: 2026-01-23 10:28:36.791153675 +0000 UTC m=+0.081949988 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:28:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:37.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e329 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:38 np0005593234 nova_compute[227762]: 2026-01-23 10:28:38.206 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:38.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e330 e330: 3 total, 3 up, 3 in
Jan 23 05:28:38 np0005593234 nova_compute[227762]: 2026-01-23 10:28:38.418 227766 DEBUG nova.compute.manager [req-ec2af5f8-84b9-4572-9204-629b1c6bb6dc req-75b6ed8f-3230-4869-a1fb-cb17648d7cf0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Received event network-vif-plugged-3655f265-3e79-476b-b0ee-7515d802ad05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:28:38 np0005593234 nova_compute[227762]: 2026-01-23 10:28:38.419 227766 DEBUG oslo_concurrency.lockutils [req-ec2af5f8-84b9-4572-9204-629b1c6bb6dc req-75b6ed8f-3230-4869-a1fb-cb17648d7cf0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:38 np0005593234 nova_compute[227762]: 2026-01-23 10:28:38.419 227766 DEBUG oslo_concurrency.lockutils [req-ec2af5f8-84b9-4572-9204-629b1c6bb6dc req-75b6ed8f-3230-4869-a1fb-cb17648d7cf0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:38 np0005593234 nova_compute[227762]: 2026-01-23 10:28:38.419 227766 DEBUG oslo_concurrency.lockutils [req-ec2af5f8-84b9-4572-9204-629b1c6bb6dc req-75b6ed8f-3230-4869-a1fb-cb17648d7cf0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:38 np0005593234 nova_compute[227762]: 2026-01-23 10:28:38.419 227766 DEBUG nova.compute.manager [req-ec2af5f8-84b9-4572-9204-629b1c6bb6dc req-75b6ed8f-3230-4869-a1fb-cb17648d7cf0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] No waiting events found dispatching network-vif-plugged-3655f265-3e79-476b-b0ee-7515d802ad05 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:28:38 np0005593234 nova_compute[227762]: 2026-01-23 10:28:38.419 227766 WARNING nova.compute.manager [req-ec2af5f8-84b9-4572-9204-629b1c6bb6dc req-75b6ed8f-3230-4869-a1fb-cb17648d7cf0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Received unexpected event network-vif-plugged-3655f265-3e79-476b-b0ee-7515d802ad05 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:28:38 np0005593234 nova_compute[227762]: 2026-01-23 10:28:38.509 227766 DEBUG nova.network.neutron [req-ebe65839-1d60-4811-bc7b-e65d8d67d128 req-daffc1de-3eb9-4592-b500-0734694bcef2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Updated VIF entry in instance network info cache for port 3655f265-3e79-476b-b0ee-7515d802ad05. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:28:38 np0005593234 nova_compute[227762]: 2026-01-23 10:28:38.509 227766 DEBUG nova.network.neutron [req-ebe65839-1d60-4811-bc7b-e65d8d67d128 req-daffc1de-3eb9-4592-b500-0734694bcef2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Updating instance_info_cache with network_info: [{"id": "3655f265-3e79-476b-b0ee-7515d802ad05", "address": "fa:16:3e:dd:0c:c9", "network": {"id": "e9a30b2a-861e-43f1-ad49-9f70838e4d73", "bridge": "br-int", "label": "tempest-network-smoke--796549785", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3655f265-3e", "ovs_interfaceid": "3655f265-3e79-476b-b0ee-7515d802ad05", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:28:38 np0005593234 nova_compute[227762]: 2026-01-23 10:28:38.567 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:38 np0005593234 nova_compute[227762]: 2026-01-23 10:28:38.695 227766 DEBUG oslo_concurrency.lockutils [req-ebe65839-1d60-4811-bc7b-e65d8d67d128 req-daffc1de-3eb9-4592-b500-0734694bcef2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-0691611c-0064-4559-8c77-6c437a3b1d28" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:28:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:28:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:39.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:28:39 np0005593234 nova_compute[227762]: 2026-01-23 10:28:39.454 227766 DEBUG nova.network.neutron [-] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:28:39 np0005593234 nova_compute[227762]: 2026-01-23 10:28:39.496 227766 INFO nova.compute.manager [-] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Took 4.16 seconds to deallocate network for instance.#033[00m
Jan 23 05:28:39 np0005593234 nova_compute[227762]: 2026-01-23 10:28:39.561 227766 DEBUG oslo_concurrency.lockutils [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:39 np0005593234 nova_compute[227762]: 2026-01-23 10:28:39.562 227766 DEBUG oslo_concurrency.lockutils [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:39 np0005593234 nova_compute[227762]: 2026-01-23 10:28:39.630 227766 DEBUG oslo_concurrency.processutils [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:28:39 np0005593234 nova_compute[227762]: 2026-01-23 10:28:39.807 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:39 np0005593234 nova_compute[227762]: 2026-01-23 10:28:39.808 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:28:40 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4215887836' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:28:40 np0005593234 nova_compute[227762]: 2026-01-23 10:28:40.066 227766 DEBUG oslo_concurrency.processutils [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:28:40 np0005593234 nova_compute[227762]: 2026-01-23 10:28:40.072 227766 DEBUG nova.compute.provider_tree [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:28:40 np0005593234 nova_compute[227762]: 2026-01-23 10:28:40.089 227766 DEBUG nova.scheduler.client.report [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:28:40 np0005593234 nova_compute[227762]: 2026-01-23 10:28:40.111 227766 DEBUG oslo_concurrency.lockutils [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.549s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:40 np0005593234 nova_compute[227762]: 2026-01-23 10:28:40.140 227766 INFO nova.scheduler.client.report [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Deleted allocations for instance 0691611c-0064-4559-8c77-6c437a3b1d28#033[00m
Jan 23 05:28:40 np0005593234 nova_compute[227762]: 2026-01-23 10:28:40.221 227766 DEBUG oslo_concurrency.lockutils [None req-0479787b-55de-44fe-aa14-5c5022791ef1 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "0691611c-0064-4559-8c77-6c437a3b1d28" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:40.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:40 np0005593234 nova_compute[227762]: 2026-01-23 10:28:40.515 227766 DEBUG nova.compute.manager [req-ea745dbd-5da1-44a5-9d36-983ef82dd0f1 req-fdf347cc-6ba5-4494-afee-06ab6054dfa7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Received event network-vif-deleted-3655f265-3e79-476b-b0ee-7515d802ad05 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:28:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:41.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:42.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:42.864 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:42.864 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:28:42.865 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e330 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:43 np0005593234 nova_compute[227762]: 2026-01-23 10:28:43.208 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:28:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:43.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:28:43 np0005593234 nova_compute[227762]: 2026-01-23 10:28:43.454 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:43 np0005593234 nova_compute[227762]: 2026-01-23 10:28:43.568 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:44.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:28:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1762014429' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:28:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:28:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1762014429' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:28:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:28:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:45.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:28:45 np0005593234 podman[305886]: 2026-01-23 10:28:45.994480373 +0000 UTC m=+0.288986649 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:28:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:46.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:47.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e330 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:48 np0005593234 nova_compute[227762]: 2026-01-23 10:28:48.210 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:48 np0005593234 nova_compute[227762]: 2026-01-23 10:28:48.289 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164113.2883797, 0691611c-0064-4559-8c77-6c437a3b1d28 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:28:48 np0005593234 nova_compute[227762]: 2026-01-23 10:28:48.290 227766 INFO nova.compute.manager [-] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:28:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:48.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:48 np0005593234 nova_compute[227762]: 2026-01-23 10:28:48.536 227766 DEBUG nova.compute.manager [None req-48fee001-a0ae-49f0-96f9-23117740723d - - - - - -] [instance: 0691611c-0064-4559-8c77-6c437a3b1d28] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:28:48 np0005593234 nova_compute[227762]: 2026-01-23 10:28:48.570 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:28:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e331 e331: 3 total, 3 up, 3 in
Jan 23 05:28:48 np0005593234 nova_compute[227762]: 2026-01-23 10:28:48.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:28:48 np0005593234 nova_compute[227762]: 2026-01-23 10:28:48.781 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:28:48 np0005593234 nova_compute[227762]: 2026-01-23 10:28:48.782 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:28:48 np0005593234 nova_compute[227762]: 2026-01-23 10:28:48.782 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:28:48 np0005593234 nova_compute[227762]: 2026-01-23 10:28:48.782 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:28:48 np0005593234 nova_compute[227762]: 2026-01-23 10:28:48.783 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:28:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:28:49 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/609731149' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:28:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:49.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:49 np0005593234 nova_compute[227762]: 2026-01-23 10:28:49.256 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:28:49 np0005593234 nova_compute[227762]: 2026-01-23 10:28:49.425 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000a9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:28:49 np0005593234 nova_compute[227762]: 2026-01-23 10:28:49.426 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000a9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:28:49 np0005593234 nova_compute[227762]: 2026-01-23 10:28:49.617 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:28:49 np0005593234 nova_compute[227762]: 2026-01-23 10:28:49.619 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4111MB free_disk=20.85184097290039GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 05:28:49 np0005593234 nova_compute[227762]: 2026-01-23 10:28:49.619 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:28:49 np0005593234 nova_compute[227762]: 2026-01-23 10:28:49.620 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:28:49 np0005593234 ovn_controller[134547]: 2026-01-23T10:28:49Z|00723|binding|INFO|Releasing lport ec60a79a-3050-4cfa-9c46-cf939e3eeae0 from this chassis (sb_readonly=0)
Jan 23 05:28:49 np0005593234 nova_compute[227762]: 2026-01-23 10:28:49.694 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:28:49 np0005593234 nova_compute[227762]: 2026-01-23 10:28:49.832 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance b31fafb4-3888-4647-9d5d-5f528ff795b5 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:28:49 np0005593234 nova_compute[227762]: 2026-01-23 10:28:49.833 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:28:49 np0005593234 nova_compute[227762]: 2026-01-23 10:28:49.834 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:28:49 np0005593234 nova_compute[227762]: 2026-01-23 10:28:49.880 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:28:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:28:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:50.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:28:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:28:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1176425167' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:28:50 np0005593234 nova_compute[227762]: 2026-01-23 10:28:50.335 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:28:50 np0005593234 nova_compute[227762]: 2026-01-23 10:28:50.342 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:28:50 np0005593234 nova_compute[227762]: 2026-01-23 10:28:50.441 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:28:50 np0005593234 nova_compute[227762]: 2026-01-23 10:28:50.522 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:28:50 np0005593234 nova_compute[227762]: 2026-01-23 10:28:50.523 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.903s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:28:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:51.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:51 np0005593234 nova_compute[227762]: 2026-01-23 10:28:51.523 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:28:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:52.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:53 np0005593234 nova_compute[227762]: 2026-01-23 10:28:53.212 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:28:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:53.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:53 np0005593234 nova_compute[227762]: 2026-01-23 10:28:53.572 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:28:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:54.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:54 np0005593234 nova_compute[227762]: 2026-01-23 10:28:54.342 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:28:55 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:28:55 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:28:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:55.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:55 np0005593234 nova_compute[227762]: 2026-01-23 10:28:55.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:28:55 np0005593234 nova_compute[227762]: 2026-01-23 10:28:55.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:28:55 np0005593234 nova_compute[227762]: 2026-01-23 10:28:55.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 05:28:56 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:28:56 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:28:56 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:28:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:56.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:57.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:57 np0005593234 nova_compute[227762]: 2026-01-23 10:28:57.467 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:28:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e331 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:28:58 np0005593234 nova_compute[227762]: 2026-01-23 10:28:58.214 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:28:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:58.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 e332: 3 total, 3 up, 3 in
Jan 23 05:28:58 np0005593234 nova_compute[227762]: 2026-01-23 10:28:58.574 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:28:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:28:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:28:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:59.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:28:59 np0005593234 nova_compute[227762]: 2026-01-23 10:28:59.442 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:28:59 np0005593234 nova_compute[227762]: 2026-01-23 10:28:59.442 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:28:59 np0005593234 nova_compute[227762]: 2026-01-23 10:28:59.442 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 05:28:59 np0005593234 nova_compute[227762]: 2026-01-23 10:28:59.443 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid b31fafb4-3888-4647-9d5d-5f528ff795b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:29:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:00.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:01.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:02.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:02 np0005593234 nova_compute[227762]: 2026-01-23 10:29:02.757 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Updating instance_info_cache with network_info: [{"id": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "address": "fa:16:3e:e4:7b:6d", "network": {"id": "e7a48d7e-0ec9-4b5d-b243-77d724af740b", "bridge": "br-int", "label": "tempest-network-smoke--114616362", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d9357e-ac", "ovs_interfaceid": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:29:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:03 np0005593234 nova_compute[227762]: 2026-01-23 10:29:03.267 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:29:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:03.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:29:03 np0005593234 nova_compute[227762]: 2026-01-23 10:29:03.542 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:29:03 np0005593234 nova_compute[227762]: 2026-01-23 10:29:03.543 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 05:29:03 np0005593234 nova_compute[227762]: 2026-01-23 10:29:03.543 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:29:03 np0005593234 nova_compute[227762]: 2026-01-23 10:29:03.543 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:29:03 np0005593234 nova_compute[227762]: 2026-01-23 10:29:03.543 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:29:03 np0005593234 nova_compute[227762]: 2026-01-23 10:29:03.575 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:03 np0005593234 nova_compute[227762]: 2026-01-23 10:29:03.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:29:03 np0005593234 nova_compute[227762]: 2026-01-23 10:29:03.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:29:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:29:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:29:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:04.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:05.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:29:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:06.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:29:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:07.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:07 np0005593234 podman[306199]: 2026-01-23 10:29:07.761392066 +0000 UTC m=+0.054700760 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 23 05:29:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:08 np0005593234 nova_compute[227762]: 2026-01-23 10:29:08.269 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:08.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:08 np0005593234 nova_compute[227762]: 2026-01-23 10:29:08.579 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:09.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:09 np0005593234 nova_compute[227762]: 2026-01-23 10:29:09.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:29:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:10.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:11.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:12.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:13 np0005593234 nova_compute[227762]: 2026-01-23 10:29:13.272 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:13.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:13 np0005593234 nova_compute[227762]: 2026-01-23 10:29:13.580 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:14.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:14 np0005593234 nova_compute[227762]: 2026-01-23 10:29:14.740 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:29:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:15.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:16.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:16 np0005593234 nova_compute[227762]: 2026-01-23 10:29:16.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:29:16 np0005593234 podman[306225]: 2026-01-23 10:29:16.775782643 +0000 UTC m=+0.072738351 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:29:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:17.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:18 np0005593234 nova_compute[227762]: 2026-01-23 10:29:18.274 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:29:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:18.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:29:18 np0005593234 nova_compute[227762]: 2026-01-23 10:29:18.581 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:19.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:20.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:21.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:22.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:23 np0005593234 nova_compute[227762]: 2026-01-23 10:29:23.275 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:23.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:23 np0005593234 nova_compute[227762]: 2026-01-23 10:29:23.582 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:24.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:25.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:26 np0005593234 nova_compute[227762]: 2026-01-23 10:29:26.207 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:26.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:27.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:28 np0005593234 nova_compute[227762]: 2026-01-23 10:29:28.276 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:28.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:28 np0005593234 nova_compute[227762]: 2026-01-23 10:29:28.583 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:28.888 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=64, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=63) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:29:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:28.889 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:29:28 np0005593234 nova_compute[227762]: 2026-01-23 10:29:28.915 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:29:29Z|00724|binding|INFO|Releasing lport ec60a79a-3050-4cfa-9c46-cf939e3eeae0 from this chassis (sb_readonly=0)
Jan 23 05:29:29 np0005593234 nova_compute[227762]: 2026-01-23 10:29:29.079 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:29.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:30.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:30 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:30.890 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '64'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:29:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:31.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:32 np0005593234 nova_compute[227762]: 2026-01-23 10:29:32.002 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:32.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:33 np0005593234 nova_compute[227762]: 2026-01-23 10:29:33.277 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:33.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:33 np0005593234 nova_compute[227762]: 2026-01-23 10:29:33.584 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:33 np0005593234 nova_compute[227762]: 2026-01-23 10:29:33.620 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:33 np0005593234 nova_compute[227762]: 2026-01-23 10:29:33.620 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:29:33 np0005593234 nova_compute[227762]: 2026-01-23 10:29:33.638 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:29:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:34.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:34 np0005593234 nova_compute[227762]: 2026-01-23 10:29:34.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:34 np0005593234 nova_compute[227762]: 2026-01-23 10:29:34.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:29:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:35.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:36.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:36 np0005593234 ovn_controller[134547]: 2026-01-23T10:29:36Z|00725|binding|INFO|Releasing lport ec60a79a-3050-4cfa-9c46-cf939e3eeae0 from this chassis (sb_readonly=0)
Jan 23 05:29:36 np0005593234 nova_compute[227762]: 2026-01-23 10:29:36.528 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:37.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:38 np0005593234 nova_compute[227762]: 2026-01-23 10:29:38.280 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:38.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:38 np0005593234 nova_compute[227762]: 2026-01-23 10:29:38.585 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:38 np0005593234 podman[306362]: 2026-01-23 10:29:38.755644764 +0000 UTC m=+0.050778878 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent)
Jan 23 05:29:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:39.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:39 np0005593234 nova_compute[227762]: 2026-01-23 10:29:39.638 227766 DEBUG nova.compute.manager [req-18bea9db-5081-42eb-80ca-834acc8ee210 req-1465c273-6cb5-42d7-955b-caf868c86879 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Received event network-changed-54d9357e-ac9f-458b-b6ce-6da38bc7a025 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:29:39 np0005593234 nova_compute[227762]: 2026-01-23 10:29:39.639 227766 DEBUG nova.compute.manager [req-18bea9db-5081-42eb-80ca-834acc8ee210 req-1465c273-6cb5-42d7-955b-caf868c86879 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Refreshing instance network info cache due to event network-changed-54d9357e-ac9f-458b-b6ce-6da38bc7a025. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:29:39 np0005593234 nova_compute[227762]: 2026-01-23 10:29:39.639 227766 DEBUG oslo_concurrency.lockutils [req-18bea9db-5081-42eb-80ca-834acc8ee210 req-1465c273-6cb5-42d7-955b-caf868c86879 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:29:39 np0005593234 nova_compute[227762]: 2026-01-23 10:29:39.639 227766 DEBUG oslo_concurrency.lockutils [req-18bea9db-5081-42eb-80ca-834acc8ee210 req-1465c273-6cb5-42d7-955b-caf868c86879 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:29:39 np0005593234 nova_compute[227762]: 2026-01-23 10:29:39.639 227766 DEBUG nova.network.neutron [req-18bea9db-5081-42eb-80ca-834acc8ee210 req-1465c273-6cb5-42d7-955b-caf868c86879 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Refreshing network info cache for port 54d9357e-ac9f-458b-b6ce-6da38bc7a025 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:29:39 np0005593234 nova_compute[227762]: 2026-01-23 10:29:39.746 227766 DEBUG oslo_concurrency.lockutils [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "b31fafb4-3888-4647-9d5d-5f528ff795b5" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:39 np0005593234 nova_compute[227762]: 2026-01-23 10:29:39.747 227766 DEBUG oslo_concurrency.lockutils [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:39 np0005593234 nova_compute[227762]: 2026-01-23 10:29:39.747 227766 DEBUG oslo_concurrency.lockutils [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:39 np0005593234 nova_compute[227762]: 2026-01-23 10:29:39.748 227766 DEBUG oslo_concurrency.lockutils [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:39 np0005593234 nova_compute[227762]: 2026-01-23 10:29:39.748 227766 DEBUG oslo_concurrency.lockutils [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:39 np0005593234 nova_compute[227762]: 2026-01-23 10:29:39.749 227766 INFO nova.compute.manager [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Terminating instance#033[00m
Jan 23 05:29:39 np0005593234 nova_compute[227762]: 2026-01-23 10:29:39.750 227766 DEBUG nova.compute.manager [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:29:39 np0005593234 kernel: tap54d9357e-ac (unregistering): left promiscuous mode
Jan 23 05:29:39 np0005593234 NetworkManager[48942]: <info>  [1769164179.9692] device (tap54d9357e-ac): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:29:39 np0005593234 nova_compute[227762]: 2026-01-23 10:29:39.978 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:39 np0005593234 ovn_controller[134547]: 2026-01-23T10:29:39Z|00726|binding|INFO|Releasing lport 54d9357e-ac9f-458b-b6ce-6da38bc7a025 from this chassis (sb_readonly=0)
Jan 23 05:29:39 np0005593234 ovn_controller[134547]: 2026-01-23T10:29:39Z|00727|binding|INFO|Setting lport 54d9357e-ac9f-458b-b6ce-6da38bc7a025 down in Southbound
Jan 23 05:29:39 np0005593234 ovn_controller[134547]: 2026-01-23T10:29:39Z|00728|binding|INFO|Removing iface tap54d9357e-ac ovn-installed in OVS
Jan 23 05:29:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:39.988 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e4:7b:6d 10.100.0.6'], port_security=['fa:16:3e:e4:7b:6d 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'b31fafb4-3888-4647-9d5d-5f528ff795b5', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e7a48d7e-0ec9-4b5d-b243-77d724af740b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b27af793a8cc42259216fbeaa302ba03', 'neutron:revision_number': '4', 'neutron:security_group_ids': '44462e8d-74cf-41bb-9b11-7aa082a0a20c f254f9d1-4249-4434-8719-2f2e0b2c9d0c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2b51bdcc-ae54-4ab6-8265-f5e2637692e4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=54d9357e-ac9f-458b-b6ce-6da38bc7a025) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:29:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:39.989 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 54d9357e-ac9f-458b-b6ce-6da38bc7a025 in datapath e7a48d7e-0ec9-4b5d-b243-77d724af740b unbound from our chassis#033[00m
Jan 23 05:29:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:39.990 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e7a48d7e-0ec9-4b5d-b243-77d724af740b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:29:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:39.991 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[37ae1510-258c-41cc-8d5a-f3b12de41b83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:29:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:39.992 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b namespace which is not needed anymore#033[00m
Jan 23 05:29:39 np0005593234 nova_compute[227762]: 2026-01-23 10:29:39.997 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:40 np0005593234 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000a9.scope: Deactivated successfully.
Jan 23 05:29:40 np0005593234 systemd[1]: machine-qemu\x2d79\x2dinstance\x2d000000a9.scope: Consumed 20.711s CPU time.
Jan 23 05:29:40 np0005593234 systemd-machined[195626]: Machine qemu-79-instance-000000a9 terminated.
Jan 23 05:29:40 np0005593234 kernel: tap54d9357e-ac: entered promiscuous mode
Jan 23 05:29:40 np0005593234 NetworkManager[48942]: <info>  [1769164180.1656] manager: (tap54d9357e-ac): new Tun device (/org/freedesktop/NetworkManager/Devices/346)
Jan 23 05:29:40 np0005593234 kernel: tap54d9357e-ac (unregistering): left promiscuous mode
Jan 23 05:29:40 np0005593234 nova_compute[227762]: 2026-01-23 10:29:40.172 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:40 np0005593234 nova_compute[227762]: 2026-01-23 10:29:40.185 227766 INFO nova.virt.libvirt.driver [-] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Instance destroyed successfully.#033[00m
Jan 23 05:29:40 np0005593234 nova_compute[227762]: 2026-01-23 10:29:40.185 227766 DEBUG nova.objects.instance [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lazy-loading 'resources' on Instance uuid b31fafb4-3888-4647-9d5d-5f528ff795b5 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:29:40 np0005593234 nova_compute[227762]: 2026-01-23 10:29:40.203 227766 DEBUG nova.virt.libvirt.vif [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:26:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-404892044',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-622349977-access_point-404892044',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-622349977-acc',id=169,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMBgFvZ4cbi7AnPS6dwqlDZxqi0tL9pk6Pv5TmYxIwyaaf9gGUq+Kaim4h5w6BHZOb0aX+j7fNILO3q4iXwnipSp+yyY1uOiInLjY+WwJtAHiBpUfsc4DJ7rPMVoRVR8SA==',key_name='tempest-TestSecurityGroupsBasicOps-1579063337',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:26:35Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='b27af793a8cc42259216fbeaa302ba03',ramdisk_id='',reservation_id='r-7p1o492a',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-622349977',owner_user_name='tempest-TestSecurityGroupsBasicOps-622349977-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:26:35Z,user_data=None,user_id='a3cd8c3758e14f9c8e4ad1a9a94a9995',uuid=b31fafb4-3888-4647-9d5d-5f528ff795b5,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "address": "fa:16:3e:e4:7b:6d", "network": {"id": "e7a48d7e-0ec9-4b5d-b243-77d724af740b", "bridge": "br-int", "label": "tempest-network-smoke--114616362", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d9357e-ac", "ovs_interfaceid": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:29:40 np0005593234 nova_compute[227762]: 2026-01-23 10:29:40.204 227766 DEBUG nova.network.os_vif_util [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converting VIF {"id": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "address": "fa:16:3e:e4:7b:6d", "network": {"id": "e7a48d7e-0ec9-4b5d-b243-77d724af740b", "bridge": "br-int", "label": "tempest-network-smoke--114616362", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.226", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d9357e-ac", "ovs_interfaceid": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:29:40 np0005593234 nova_compute[227762]: 2026-01-23 10:29:40.204 227766 DEBUG nova.network.os_vif_util [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e4:7b:6d,bridge_name='br-int',has_traffic_filtering=True,id=54d9357e-ac9f-458b-b6ce-6da38bc7a025,network=Network(e7a48d7e-0ec9-4b5d-b243-77d724af740b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d9357e-ac') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:29:40 np0005593234 nova_compute[227762]: 2026-01-23 10:29:40.205 227766 DEBUG os_vif [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:7b:6d,bridge_name='br-int',has_traffic_filtering=True,id=54d9357e-ac9f-458b-b6ce-6da38bc7a025,network=Network(e7a48d7e-0ec9-4b5d-b243-77d724af740b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d9357e-ac') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:29:40 np0005593234 nova_compute[227762]: 2026-01-23 10:29:40.206 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:40 np0005593234 nova_compute[227762]: 2026-01-23 10:29:40.207 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap54d9357e-ac, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:29:40 np0005593234 nova_compute[227762]: 2026-01-23 10:29:40.208 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:40 np0005593234 nova_compute[227762]: 2026-01-23 10:29:40.210 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:40 np0005593234 nova_compute[227762]: 2026-01-23 10:29:40.212 227766 INFO os_vif [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e4:7b:6d,bridge_name='br-int',has_traffic_filtering=True,id=54d9357e-ac9f-458b-b6ce-6da38bc7a025,network=Network(e7a48d7e-0ec9-4b5d-b243-77d724af740b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap54d9357e-ac')#033[00m
Jan 23 05:29:40 np0005593234 neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b[304177]: [NOTICE]   (304183) : haproxy version is 2.8.14-c23fe91
Jan 23 05:29:40 np0005593234 neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b[304177]: [NOTICE]   (304183) : path to executable is /usr/sbin/haproxy
Jan 23 05:29:40 np0005593234 neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b[304177]: [WARNING]  (304183) : Exiting Master process...
Jan 23 05:29:40 np0005593234 neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b[304177]: [WARNING]  (304183) : Exiting Master process...
Jan 23 05:29:40 np0005593234 neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b[304177]: [ALERT]    (304183) : Current worker (304185) exited with code 143 (Terminated)
Jan 23 05:29:40 np0005593234 neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b[304177]: [WARNING]  (304183) : All workers exited. Exiting... (0)
Jan 23 05:29:40 np0005593234 systemd[1]: libpod-63d70534dfbcb3ce8927105116c2554685b573f03892128f30ccab2fb30d537f.scope: Deactivated successfully.
Jan 23 05:29:40 np0005593234 podman[306406]: 2026-01-23 10:29:40.247033662 +0000 UTC m=+0.181769151 container died 63d70534dfbcb3ce8927105116c2554685b573f03892128f30ccab2fb30d537f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 05:29:40 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-63d70534dfbcb3ce8927105116c2554685b573f03892128f30ccab2fb30d537f-userdata-shm.mount: Deactivated successfully.
Jan 23 05:29:40 np0005593234 systemd[1]: var-lib-containers-storage-overlay-47f5b4ac7f9944b8b5ed54891ddb9cfdaa75aa6911998526254050d88735e779-merged.mount: Deactivated successfully.
Jan 23 05:29:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:40.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:40 np0005593234 podman[306406]: 2026-01-23 10:29:40.392182126 +0000 UTC m=+0.326917625 container cleanup 63d70534dfbcb3ce8927105116c2554685b573f03892128f30ccab2fb30d537f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:29:40 np0005593234 systemd[1]: libpod-conmon-63d70534dfbcb3ce8927105116c2554685b573f03892128f30ccab2fb30d537f.scope: Deactivated successfully.
Jan 23 05:29:40 np0005593234 podman[306463]: 2026-01-23 10:29:40.691018083 +0000 UTC m=+0.279715600 container remove 63d70534dfbcb3ce8927105116c2554685b573f03892128f30ccab2fb30d537f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 05:29:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:40.697 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[102a78a6-d8b1-441c-8aff-73717664f02e]: (4, ('Fri Jan 23 10:29:40 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b (63d70534dfbcb3ce8927105116c2554685b573f03892128f30ccab2fb30d537f)\n63d70534dfbcb3ce8927105116c2554685b573f03892128f30ccab2fb30d537f\nFri Jan 23 10:29:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b (63d70534dfbcb3ce8927105116c2554685b573f03892128f30ccab2fb30d537f)\n63d70534dfbcb3ce8927105116c2554685b573f03892128f30ccab2fb30d537f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:29:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:40.699 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0c288a1a-3f97-4cba-b139-0310a60c54e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:29:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:40.700 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape7a48d7e-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:29:40 np0005593234 kernel: tape7a48d7e-00: left promiscuous mode
Jan 23 05:29:40 np0005593234 nova_compute[227762]: 2026-01-23 10:29:40.703 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:40 np0005593234 nova_compute[227762]: 2026-01-23 10:29:40.729 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:40.732 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[02344604-2a78-4886-913f-757677bd97e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:29:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:40.746 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dcea86b4-6fac-4706-9939-bd61df799094]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:29:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:40.747 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9c81995e-81ff-478a-b195-c7fbe304129a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:29:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:40.763 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f0eff994-65d7-4213-bc87-cad21f01237b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 788905, 'reachable_time': 17374, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 306478, 'error': None, 'target': 'ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:29:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:40.765 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e7a48d7e-0ec9-4b5d-b243-77d724af740b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:29:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:40.766 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[de43a6dc-285e-4d91-bffd-68f881fafeda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:29:40 np0005593234 systemd[1]: run-netns-ovnmeta\x2de7a48d7e\x2d0ec9\x2d4b5d\x2db243\x2d77d724af740b.mount: Deactivated successfully.
Jan 23 05:29:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:41.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:42 np0005593234 nova_compute[227762]: 2026-01-23 10:29:42.064 227766 DEBUG nova.compute.manager [req-556ddf07-b9cb-4677-a787-c2bd29a47a1e req-8c32bb18-041d-409a-bfeb-1d906f618680 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Received event network-vif-unplugged-54d9357e-ac9f-458b-b6ce-6da38bc7a025 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:29:42 np0005593234 nova_compute[227762]: 2026-01-23 10:29:42.065 227766 DEBUG oslo_concurrency.lockutils [req-556ddf07-b9cb-4677-a787-c2bd29a47a1e req-8c32bb18-041d-409a-bfeb-1d906f618680 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:42 np0005593234 nova_compute[227762]: 2026-01-23 10:29:42.065 227766 DEBUG oslo_concurrency.lockutils [req-556ddf07-b9cb-4677-a787-c2bd29a47a1e req-8c32bb18-041d-409a-bfeb-1d906f618680 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:42 np0005593234 nova_compute[227762]: 2026-01-23 10:29:42.065 227766 DEBUG oslo_concurrency.lockutils [req-556ddf07-b9cb-4677-a787-c2bd29a47a1e req-8c32bb18-041d-409a-bfeb-1d906f618680 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:42 np0005593234 nova_compute[227762]: 2026-01-23 10:29:42.066 227766 DEBUG nova.compute.manager [req-556ddf07-b9cb-4677-a787-c2bd29a47a1e req-8c32bb18-041d-409a-bfeb-1d906f618680 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] No waiting events found dispatching network-vif-unplugged-54d9357e-ac9f-458b-b6ce-6da38bc7a025 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:29:42 np0005593234 nova_compute[227762]: 2026-01-23 10:29:42.066 227766 DEBUG nova.compute.manager [req-556ddf07-b9cb-4677-a787-c2bd29a47a1e req-8c32bb18-041d-409a-bfeb-1d906f618680 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Received event network-vif-unplugged-54d9357e-ac9f-458b-b6ce-6da38bc7a025 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:29:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:29:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:42.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:29:42 np0005593234 nova_compute[227762]: 2026-01-23 10:29:42.658 227766 INFO nova.virt.libvirt.driver [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Deleting instance files /var/lib/nova/instances/b31fafb4-3888-4647-9d5d-5f528ff795b5_del#033[00m
Jan 23 05:29:42 np0005593234 nova_compute[227762]: 2026-01-23 10:29:42.658 227766 INFO nova.virt.libvirt.driver [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Deletion of /var/lib/nova/instances/b31fafb4-3888-4647-9d5d-5f528ff795b5_del complete#033[00m
Jan 23 05:29:42 np0005593234 nova_compute[227762]: 2026-01-23 10:29:42.752 227766 INFO nova.compute.manager [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Took 3.00 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:29:42 np0005593234 nova_compute[227762]: 2026-01-23 10:29:42.753 227766 DEBUG oslo.service.loopingcall [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:29:42 np0005593234 nova_compute[227762]: 2026-01-23 10:29:42.753 227766 DEBUG nova.compute.manager [-] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:29:42 np0005593234 nova_compute[227762]: 2026-01-23 10:29:42.753 227766 DEBUG nova.network.neutron [-] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:42.865 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:42.865 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:29:42.865 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:42 np0005593234 nova_compute[227762]: 2026-01-23 10:29:42.915 227766 DEBUG nova.network.neutron [req-18bea9db-5081-42eb-80ca-834acc8ee210 req-1465c273-6cb5-42d7-955b-caf868c86879 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Updated VIF entry in instance network info cache for port 54d9357e-ac9f-458b-b6ce-6da38bc7a025. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:29:42 np0005593234 nova_compute[227762]: 2026-01-23 10:29:42.916 227766 DEBUG nova.network.neutron [req-18bea9db-5081-42eb-80ca-834acc8ee210 req-1465c273-6cb5-42d7-955b-caf868c86879 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Updating instance_info_cache with network_info: [{"id": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "address": "fa:16:3e:e4:7b:6d", "network": {"id": "e7a48d7e-0ec9-4b5d-b243-77d724af740b", "bridge": "br-int", "label": "tempest-network-smoke--114616362", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "b27af793a8cc42259216fbeaa302ba03", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap54d9357e-ac", "ovs_interfaceid": "54d9357e-ac9f-458b-b6ce-6da38bc7a025", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:29:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:43 np0005593234 nova_compute[227762]: 2026-01-23 10:29:43.281 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:43.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:44 np0005593234 nova_compute[227762]: 2026-01-23 10:29:44.292 227766 DEBUG oslo_concurrency.lockutils [req-18bea9db-5081-42eb-80ca-834acc8ee210 req-1465c273-6cb5-42d7-955b-caf868c86879 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-b31fafb4-3888-4647-9d5d-5f528ff795b5" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:29:44 np0005593234 nova_compute[227762]: 2026-01-23 10:29:44.299 227766 DEBUG nova.compute.manager [req-cd29833f-8760-43f0-81f9-7cc3a88c1e14 req-696bbf18-fdc5-4839-813a-b7920714564c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Received event network-vif-plugged-54d9357e-ac9f-458b-b6ce-6da38bc7a025 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:29:44 np0005593234 nova_compute[227762]: 2026-01-23 10:29:44.299 227766 DEBUG oslo_concurrency.lockutils [req-cd29833f-8760-43f0-81f9-7cc3a88c1e14 req-696bbf18-fdc5-4839-813a-b7920714564c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:44 np0005593234 nova_compute[227762]: 2026-01-23 10:29:44.300 227766 DEBUG oslo_concurrency.lockutils [req-cd29833f-8760-43f0-81f9-7cc3a88c1e14 req-696bbf18-fdc5-4839-813a-b7920714564c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:44 np0005593234 nova_compute[227762]: 2026-01-23 10:29:44.300 227766 DEBUG oslo_concurrency.lockutils [req-cd29833f-8760-43f0-81f9-7cc3a88c1e14 req-696bbf18-fdc5-4839-813a-b7920714564c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:44 np0005593234 nova_compute[227762]: 2026-01-23 10:29:44.300 227766 DEBUG nova.compute.manager [req-cd29833f-8760-43f0-81f9-7cc3a88c1e14 req-696bbf18-fdc5-4839-813a-b7920714564c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] No waiting events found dispatching network-vif-plugged-54d9357e-ac9f-458b-b6ce-6da38bc7a025 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:29:44 np0005593234 nova_compute[227762]: 2026-01-23 10:29:44.301 227766 WARNING nova.compute.manager [req-cd29833f-8760-43f0-81f9-7cc3a88c1e14 req-696bbf18-fdc5-4839-813a-b7920714564c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Received unexpected event network-vif-plugged-54d9357e-ac9f-458b-b6ce-6da38bc7a025 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:29:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:44.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:45 np0005593234 nova_compute[227762]: 2026-01-23 10:29:45.210 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:45 np0005593234 nova_compute[227762]: 2026-01-23 10:29:45.245 227766 DEBUG nova.network.neutron [-] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:29:45 np0005593234 nova_compute[227762]: 2026-01-23 10:29:45.273 227766 INFO nova.compute.manager [-] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Took 2.52 seconds to deallocate network for instance.#033[00m
Jan 23 05:29:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:45.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:45 np0005593234 nova_compute[227762]: 2026-01-23 10:29:45.343 227766 DEBUG oslo_concurrency.lockutils [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:45 np0005593234 nova_compute[227762]: 2026-01-23 10:29:45.344 227766 DEBUG oslo_concurrency.lockutils [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:45 np0005593234 nova_compute[227762]: 2026-01-23 10:29:45.371 227766 DEBUG nova.compute.manager [req-2f1f392e-d3cc-47b8-b299-264c529321c8 req-7e1a8a41-f5a0-416a-9aca-b754d7b21361 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Received event network-vif-deleted-54d9357e-ac9f-458b-b6ce-6da38bc7a025 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:29:45 np0005593234 nova_compute[227762]: 2026-01-23 10:29:45.397 227766 DEBUG oslo_concurrency.processutils [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:29:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:29:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4010559459' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:29:45 np0005593234 nova_compute[227762]: 2026-01-23 10:29:45.887 227766 DEBUG oslo_concurrency.processutils [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:29:45 np0005593234 nova_compute[227762]: 2026-01-23 10:29:45.895 227766 DEBUG nova.compute.provider_tree [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:29:45 np0005593234 nova_compute[227762]: 2026-01-23 10:29:45.916 227766 DEBUG nova.scheduler.client.report [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:29:45 np0005593234 nova_compute[227762]: 2026-01-23 10:29:45.946 227766 DEBUG oslo_concurrency.lockutils [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:46 np0005593234 nova_compute[227762]: 2026-01-23 10:29:46.003 227766 INFO nova.scheduler.client.report [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Deleted allocations for instance b31fafb4-3888-4647-9d5d-5f528ff795b5#033[00m
Jan 23 05:29:46 np0005593234 nova_compute[227762]: 2026-01-23 10:29:46.083 227766 DEBUG oslo_concurrency.lockutils [None req-56cfae7e-4ca9-47ba-9ac1-1980f958e95c a3cd8c3758e14f9c8e4ad1a9a94a9995 b27af793a8cc42259216fbeaa302ba03 - - default default] Lock "b31fafb4-3888-4647-9d5d-5f528ff795b5" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.336s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:46.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:47.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:48 np0005593234 nova_compute[227762]: 2026-01-23 10:29:48.069 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:48 np0005593234 podman[306507]: 2026-01-23 10:29:48.096536564 +0000 UTC m=+0.382914868 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #148. Immutable memtables: 0.
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.148599) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 93] Flushing memtable with next log file: 148
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164188148690, "job": 93, "event": "flush_started", "num_memtables": 1, "num_entries": 2386, "num_deletes": 255, "total_data_size": 5558385, "memory_usage": 5648376, "flush_reason": "Manual Compaction"}
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 93] Level-0 flush table #149: started
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164188169866, "cf_name": "default", "job": 93, "event": "table_file_creation", "file_number": 149, "file_size": 3645647, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 70785, "largest_seqno": 73166, "table_properties": {"data_size": 3636033, "index_size": 6043, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20333, "raw_average_key_size": 20, "raw_value_size": 3616659, "raw_average_value_size": 3668, "num_data_blocks": 263, "num_entries": 986, "num_filter_entries": 986, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163986, "oldest_key_time": 1769163986, "file_creation_time": 1769164188, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 149, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 93] Flush lasted 21318 microseconds, and 6964 cpu microseconds.
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.169934) [db/flush_job.cc:967] [default] [JOB 93] Level-0 flush table #149: 3645647 bytes OK
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.169955) [db/memtable_list.cc:519] [default] Level-0 commit table #149 started
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.172271) [db/memtable_list.cc:722] [default] Level-0 commit table #149: memtable #1 done
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.172290) EVENT_LOG_v1 {"time_micros": 1769164188172283, "job": 93, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.172312) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 93] Try to delete WAL files size 5547788, prev total WAL file size 5547788, number of live WAL files 2.
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000145.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.173760) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036323735' seq:72057594037927935, type:22 .. '7061786F730036353237' seq:0, type:0; will stop at (end)
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 94] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 93 Base level 0, inputs: [149(3560KB)], [147(10071KB)]
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164188173913, "job": 94, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [149], "files_L6": [147], "score": -1, "input_data_size": 13959065, "oldest_snapshot_seqno": -1}
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 94] Generated table #150: 9259 keys, 12060731 bytes, temperature: kUnknown
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164188265293, "cf_name": "default", "job": 94, "event": "table_file_creation", "file_number": 150, "file_size": 12060731, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12001433, "index_size": 35057, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23173, "raw_key_size": 243562, "raw_average_key_size": 26, "raw_value_size": 11839464, "raw_average_value_size": 1278, "num_data_blocks": 1341, "num_entries": 9259, "num_filter_entries": 9259, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769164188, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 150, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.265668) [db/compaction/compaction_job.cc:1663] [default] [JOB 94] Compacted 1@0 + 1@6 files to L6 => 12060731 bytes
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.267525) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 152.6 rd, 131.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.5, 9.8 +0.0 blob) out(11.5 +0.0 blob), read-write-amplify(7.1) write-amplify(3.3) OK, records in: 9787, records dropped: 528 output_compression: NoCompression
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.267551) EVENT_LOG_v1 {"time_micros": 1769164188267539, "job": 94, "event": "compaction_finished", "compaction_time_micros": 91459, "compaction_time_cpu_micros": 41322, "output_level": 6, "num_output_files": 1, "total_output_size": 12060731, "num_input_records": 9787, "num_output_records": 9259, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000149.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164188268613, "job": 94, "event": "table_file_deletion", "file_number": 149}
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000147.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164188271181, "job": 94, "event": "table_file_deletion", "file_number": 147}
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.173548) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.271362) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.271369) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.271371) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.271373) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:29:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:29:48.271375) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:29:48 np0005593234 nova_compute[227762]: 2026-01-23 10:29:48.283 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:29:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:48.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:29:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:49.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:49 np0005593234 nova_compute[227762]: 2026-01-23 10:29:49.763 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:29:49 np0005593234 nova_compute[227762]: 2026-01-23 10:29:49.792 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:49 np0005593234 nova_compute[227762]: 2026-01-23 10:29:49.792 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:49 np0005593234 nova_compute[227762]: 2026-01-23 10:29:49.792 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:49 np0005593234 nova_compute[227762]: 2026-01-23 10:29:49.792 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:29:49 np0005593234 nova_compute[227762]: 2026-01-23 10:29:49.793 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:29:50 np0005593234 nova_compute[227762]: 2026-01-23 10:29:50.213 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:29:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4114465260' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:29:50 np0005593234 nova_compute[227762]: 2026-01-23 10:29:50.240 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:29:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:50.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:50 np0005593234 nova_compute[227762]: 2026-01-23 10:29:50.408 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:29:50 np0005593234 nova_compute[227762]: 2026-01-23 10:29:50.409 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4336MB free_disk=20.972171783447266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:29:50 np0005593234 nova_compute[227762]: 2026-01-23 10:29:50.409 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:29:50 np0005593234 nova_compute[227762]: 2026-01-23 10:29:50.410 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:29:50 np0005593234 nova_compute[227762]: 2026-01-23 10:29:50.472 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:29:50 np0005593234 nova_compute[227762]: 2026-01-23 10:29:50.473 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:29:50 np0005593234 nova_compute[227762]: 2026-01-23 10:29:50.492 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:29:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:29:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4084256948' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:29:50 np0005593234 nova_compute[227762]: 2026-01-23 10:29:50.943 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:29:50 np0005593234 nova_compute[227762]: 2026-01-23 10:29:50.950 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:29:50 np0005593234 nova_compute[227762]: 2026-01-23 10:29:50.974 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:29:51 np0005593234 nova_compute[227762]: 2026-01-23 10:29:51.003 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:29:51 np0005593234 nova_compute[227762]: 2026-01-23 10:29:51.003 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.593s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:29:51 np0005593234 nova_compute[227762]: 2026-01-23 10:29:51.084 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:51 np0005593234 nova_compute[227762]: 2026-01-23 10:29:51.303 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:29:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:51.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:29:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:52.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:29:52 np0005593234 nova_compute[227762]: 2026-01-23 10:29:52.985 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:29:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:53 np0005593234 nova_compute[227762]: 2026-01-23 10:29:53.285 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:53.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:54.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:55 np0005593234 nova_compute[227762]: 2026-01-23 10:29:55.184 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164180.182765, b31fafb4-3888-4647-9d5d-5f528ff795b5 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:29:55 np0005593234 nova_compute[227762]: 2026-01-23 10:29:55.184 227766 INFO nova.compute.manager [-] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] VM Stopped (Lifecycle Event)
Jan 23 05:29:55 np0005593234 nova_compute[227762]: 2026-01-23 10:29:55.210 227766 DEBUG nova.compute.manager [None req-3798f45f-6562-40cc-986a-565abaf35e50 - - - - - -] [instance: b31fafb4-3888-4647-9d5d-5f528ff795b5] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:29:55 np0005593234 nova_compute[227762]: 2026-01-23 10:29:55.215 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:55.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:55 np0005593234 nova_compute[227762]: 2026-01-23 10:29:55.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:29:55 np0005593234 nova_compute[227762]: 2026-01-23 10:29:55.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:29:55 np0005593234 nova_compute[227762]: 2026-01-23 10:29:55.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 05:29:55 np0005593234 nova_compute[227762]: 2026-01-23 10:29:55.784 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 05:29:55 np0005593234 nova_compute[227762]: 2026-01-23 10:29:55.950 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:29:55 np0005593234 nova_compute[227762]: 2026-01-23 10:29:55.951 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:29:55 np0005593234 nova_compute[227762]: 2026-01-23 10:29:55.982 227766 DEBUG nova.compute.manager [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 05:29:56 np0005593234 nova_compute[227762]: 2026-01-23 10:29:56.086 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:29:56 np0005593234 nova_compute[227762]: 2026-01-23 10:29:56.086 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:29:56 np0005593234 nova_compute[227762]: 2026-01-23 10:29:56.098 227766 DEBUG nova.virt.hardware [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 05:29:56 np0005593234 nova_compute[227762]: 2026-01-23 10:29:56.098 227766 INFO nova.compute.claims [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Claim successful on node compute-2.ctlplane.example.com
Jan 23 05:29:56 np0005593234 nova_compute[227762]: 2026-01-23 10:29:56.247 227766 DEBUG oslo_concurrency.processutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:29:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:56.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:29:56 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2957991947' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:29:56 np0005593234 nova_compute[227762]: 2026-01-23 10:29:56.693 227766 DEBUG oslo_concurrency.processutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:29:56 np0005593234 nova_compute[227762]: 2026-01-23 10:29:56.698 227766 DEBUG nova.compute.provider_tree [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:29:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:57.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:57 np0005593234 nova_compute[227762]: 2026-01-23 10:29:57.530 227766 DEBUG nova.scheduler.client.report [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:29:57 np0005593234 nova_compute[227762]: 2026-01-23 10:29:57.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:29:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:29:58 np0005593234 nova_compute[227762]: 2026-01-23 10:29:58.286 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:29:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:58.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:29:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:29:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:59.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:29:59 np0005593234 nova_compute[227762]: 2026-01-23 10:29:59.741 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 3.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:29:59 np0005593234 nova_compute[227762]: 2026-01-23 10:29:59.742 227766 DEBUG nova.compute.manager [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 05:30:00 np0005593234 nova_compute[227762]: 2026-01-23 10:30:00.218 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:00.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:00 np0005593234 nova_compute[227762]: 2026-01-23 10:30:00.461 227766 DEBUG nova.compute.manager [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 05:30:00 np0005593234 nova_compute[227762]: 2026-01-23 10:30:00.461 227766 DEBUG nova.network.neutron [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 05:30:00 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 05:30:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:01.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:01 np0005593234 nova_compute[227762]: 2026-01-23 10:30:01.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:30:01 np0005593234 nova_compute[227762]: 2026-01-23 10:30:01.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:30:01 np0005593234 nova_compute[227762]: 2026-01-23 10:30:01.839 227766 INFO nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 05:30:01 np0005593234 nova_compute[227762]: 2026-01-23 10:30:01.945 227766 DEBUG nova.compute.manager [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 05:30:02 np0005593234 nova_compute[227762]: 2026-01-23 10:30:02.314 227766 DEBUG nova.policy [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e1629a4b14764dddaabcadd16f3e1c1c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 05:30:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:02.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:02 np0005593234 nova_compute[227762]: 2026-01-23 10:30:02.886 227766 DEBUG nova.compute.manager [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 05:30:02 np0005593234 nova_compute[227762]: 2026-01-23 10:30:02.888 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:30:02 np0005593234 nova_compute[227762]: 2026-01-23 10:30:02.888 227766 INFO nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Creating image(s)
Jan 23 05:30:02 np0005593234 nova_compute[227762]: 2026-01-23 10:30:02.934 227766 DEBUG nova.storage.rbd_utils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:30:02 np0005593234 nova_compute[227762]: 2026-01-23 10:30:02.966 227766 DEBUG nova.storage.rbd_utils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.000 227766 DEBUG nova.storage.rbd_utils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.005 227766 DEBUG oslo_concurrency.processutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:30:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.079 227766 DEBUG oslo_concurrency.processutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.080 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.081 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.081 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.107 227766 DEBUG nova.storage.rbd_utils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.110 227766 DEBUG oslo_concurrency.processutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.289 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:03.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.404 227766 DEBUG oslo_concurrency.processutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.294s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.467 227766 DEBUG nova.storage.rbd_utils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] resizing rbd image 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.561 227766 DEBUG nova.objects.instance [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'migration_context' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.647 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.647 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Ensure instance console log exists: /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.648 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.648 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.648 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:30:03 np0005593234 nova_compute[227762]: 2026-01-23 10:30:03.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:30:04 np0005593234 nova_compute[227762]: 2026-01-23 10:30:04.036 227766 DEBUG nova.network.neutron [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Successfully created port: fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 05:30:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:30:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:04.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:30:05 np0005593234 nova_compute[227762]: 2026-01-23 10:30:05.221 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:30:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:05.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:06.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:30:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:30:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:30:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:07.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:07 np0005593234 nova_compute[227762]: 2026-01-23 10:30:07.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:30:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:30:07 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:30:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:08 np0005593234 nova_compute[227762]: 2026-01-23 10:30:08.291 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:08.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:08 np0005593234 nova_compute[227762]: 2026-01-23 10:30:08.988 227766 DEBUG nova.network.neutron [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Successfully updated port: fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:30:09 np0005593234 nova_compute[227762]: 2026-01-23 10:30:09.087 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:30:09 np0005593234 nova_compute[227762]: 2026-01-23 10:30:09.087 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquired lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:30:09 np0005593234 nova_compute[227762]: 2026-01-23 10:30:09.087 227766 DEBUG nova.network.neutron [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:30:09 np0005593234 nova_compute[227762]: 2026-01-23 10:30:09.216 227766 DEBUG nova.compute.manager [req-3f024bea-4051-480e-8305-ad33bce859bf req-76e24630-fee0-4678-8603-e5b2dc2848e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-changed-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:30:09 np0005593234 nova_compute[227762]: 2026-01-23 10:30:09.217 227766 DEBUG nova.compute.manager [req-3f024bea-4051-480e-8305-ad33bce859bf req-76e24630-fee0-4678-8603-e5b2dc2848e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Refreshing instance network info cache due to event network-changed-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:30:09 np0005593234 nova_compute[227762]: 2026-01-23 10:30:09.217 227766 DEBUG oslo_concurrency.lockutils [req-3f024bea-4051-480e-8305-ad33bce859bf req-76e24630-fee0-4678-8603-e5b2dc2848e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:30:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:09.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:09 np0005593234 nova_compute[227762]: 2026-01-23 10:30:09.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:09 np0005593234 podman[306959]: 2026-01-23 10:30:09.759660722 +0000 UTC m=+0.053685578 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 23 05:30:09 np0005593234 nova_compute[227762]: 2026-01-23 10:30:09.908 227766 DEBUG nova.network.neutron [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:30:10 np0005593234 nova_compute[227762]: 2026-01-23 10:30:10.224 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:30:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:10.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.265 227766 DEBUG nova.network.neutron [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Updating instance_info_cache with network_info: [{"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.299 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Releasing lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.300 227766 DEBUG nova.compute.manager [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Instance network_info: |[{"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.300 227766 DEBUG oslo_concurrency.lockutils [req-3f024bea-4051-480e-8305-ad33bce859bf req-76e24630-fee0-4678-8603-e5b2dc2848e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.300 227766 DEBUG nova.network.neutron [req-3f024bea-4051-480e-8305-ad33bce859bf req-76e24630-fee0-4678-8603-e5b2dc2848e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Refreshing network info cache for port fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.303 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Start _get_guest_xml network_info=[{"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.308 227766 WARNING nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.314 227766 DEBUG nova.virt.libvirt.host [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.315 227766 DEBUG nova.virt.libvirt.host [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.321 227766 DEBUG nova.virt.libvirt.host [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.321 227766 DEBUG nova.virt.libvirt.host [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.322 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.323 227766 DEBUG nova.virt.hardware [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.323 227766 DEBUG nova.virt.hardware [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.323 227766 DEBUG nova.virt.hardware [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.323 227766 DEBUG nova.virt.hardware [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.324 227766 DEBUG nova.virt.hardware [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.324 227766 DEBUG nova.virt.hardware [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.324 227766 DEBUG nova.virt.hardware [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.324 227766 DEBUG nova.virt.hardware [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.324 227766 DEBUG nova.virt.hardware [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.324 227766 DEBUG nova.virt.hardware [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.325 227766 DEBUG nova.virt.hardware [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.327 227766 DEBUG oslo_concurrency.processutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:11.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:30:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/980612188' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.751 227766 DEBUG oslo_concurrency.processutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.778 227766 DEBUG nova.storage.rbd_utils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:11 np0005593234 nova_compute[227762]: 2026-01-23 10:30:11.782 227766 DEBUG oslo_concurrency.processutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:30:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3242097505' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.222 227766 DEBUG oslo_concurrency.processutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.224 227766 DEBUG nova.virt.libvirt.vif [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:29:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1480329487',display_name='tempest-ServerStableDeviceRescueTest-server-1480329487',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1480329487',id=175,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='815b71acf60d4ed8933ebd05228fa0c0',ramdisk_id='',reservation_id='r-gplzbwwd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1802220041',owner_user_name='tempest-ServerStableDeviceRescueTest-1802220041-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:30:02Z,user_data=None,user_id='e1629a4b14764dddaabcadd16f3e1c1c',uuid=64ccc062-b11b-4cbc-96ba-620e43dfdb20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.224 227766 DEBUG nova.network.os_vif_util [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converting VIF {"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.226 227766 DEBUG nova.network.os_vif_util [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:3a:01,bridge_name='br-int',has_traffic_filtering=True,id=fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc7eda8e-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.227 227766 DEBUG nova.objects.instance [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.310 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  <uuid>64ccc062-b11b-4cbc-96ba-620e43dfdb20</uuid>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  <name>instance-000000af</name>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1480329487</nova:name>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:30:11</nova:creationTime>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <nova:user uuid="e1629a4b14764dddaabcadd16f3e1c1c">tempest-ServerStableDeviceRescueTest-1802220041-project-member</nova:user>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <nova:project uuid="815b71acf60d4ed8933ebd05228fa0c0">tempest-ServerStableDeviceRescueTest-1802220041</nova:project>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <nova:port uuid="fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <entry name="serial">64ccc062-b11b-4cbc-96ba-620e43dfdb20</entry>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <entry name="uuid">64ccc062-b11b-4cbc-96ba-620e43dfdb20</entry>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.config">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:9b:3a:01"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <target dev="tapfc7eda8e-2c"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/console.log" append="off"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:30:12 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:30:12 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:30:12 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:30:12 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.312 227766 DEBUG nova.compute.manager [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Preparing to wait for external event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.312 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.312 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.312 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.313 227766 DEBUG nova.virt.libvirt.vif [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:29:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1480329487',display_name='tempest-ServerStableDeviceRescueTest-server-1480329487',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1480329487',id=175,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='815b71acf60d4ed8933ebd05228fa0c0',ramdisk_id='',reservation_id='r-gplzbwwd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1802220041',owner
_user_name='tempest-ServerStableDeviceRescueTest-1802220041-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:30:02Z,user_data=None,user_id='e1629a4b14764dddaabcadd16f3e1c1c',uuid=64ccc062-b11b-4cbc-96ba-620e43dfdb20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.313 227766 DEBUG nova.network.os_vif_util [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converting VIF {"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.314 227766 DEBUG nova.network.os_vif_util [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:3a:01,bridge_name='br-int',has_traffic_filtering=True,id=fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc7eda8e-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.314 227766 DEBUG os_vif [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:3a:01,bridge_name='br-int',has_traffic_filtering=True,id=fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc7eda8e-2c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.315 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.315 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.316 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.318 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.318 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc7eda8e-2c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.319 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfc7eda8e-2c, col_values=(('external_ids', {'iface-id': 'fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:3a:01', 'vm-uuid': '64ccc062-b11b-4cbc-96ba-620e43dfdb20'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:12 np0005593234 NetworkManager[48942]: <info>  [1769164212.3214] manager: (tapfc7eda8e-2c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/347)
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.320 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.325 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.326 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.327 227766 INFO os_vif [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:3a:01,bridge_name='br-int',has_traffic_filtering=True,id=fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc7eda8e-2c')#033[00m
Jan 23 05:30:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:12.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.629 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.629 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.630 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No VIF found with MAC fa:16:3e:9b:3a:01, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.630 227766 INFO nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Using config drive#033[00m
Jan 23 05:30:12 np0005593234 nova_compute[227762]: 2026-01-23 10:30:12.657 227766 DEBUG nova.storage.rbd_utils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.108 227766 INFO nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Creating config drive at /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/disk.config#033[00m
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.113 227766 DEBUG oslo_concurrency.processutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcj878q1r execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.247 227766 DEBUG oslo_concurrency.processutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcj878q1r" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.273 227766 DEBUG nova.storage.rbd_utils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.276 227766 DEBUG oslo_concurrency.processutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/disk.config 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.298 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:13.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.409 227766 DEBUG oslo_concurrency.processutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/disk.config 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.409 227766 INFO nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Deleting local config drive /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/disk.config because it was imported into RBD.#033[00m
Jan 23 05:30:13 np0005593234 kernel: tapfc7eda8e-2c: entered promiscuous mode
Jan 23 05:30:13 np0005593234 NetworkManager[48942]: <info>  [1769164213.4610] manager: (tapfc7eda8e-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/348)
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.462 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:13Z|00729|binding|INFO|Claiming lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 for this chassis.
Jan 23 05:30:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:13Z|00730|binding|INFO|fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15: Claiming fa:16:3e:9b:3a:01 10.100.0.11
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.468 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.484 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:3a:01 10.100.0.11'], port_security=['fa:16:3e:9b:3a:01 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '64ccc062-b11b-4cbc-96ba-620e43dfdb20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.485 144381 INFO neutron.agent.ovn.metadata.agent [-] Port fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 bound to our chassis#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.486 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7d5530f-5227-4f75-bac0-2604bb3d68e2#033[00m
Jan 23 05:30:13 np0005593234 systemd-udevd[307115]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:30:13 np0005593234 systemd-machined[195626]: New machine qemu-81-instance-000000af.
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.499 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[383a497f-c8a6-4108-9fed-123d0ed84318]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.500 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7d5530f-51 in ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.502 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7d5530f-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.502 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d06cdcb0-aceb-4409-84fa-3e6b83cfb6d3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.503 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3af44e0f-7af7-4f9f-9d82-8ef29780c5ea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 NetworkManager[48942]: <info>  [1769164213.5127] device (tapfc7eda8e-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:30:13 np0005593234 NetworkManager[48942]: <info>  [1769164213.5137] device (tapfc7eda8e-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.516 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[2f32ade2-2eb8-4a33-ad8f-6c921e9f385c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.526 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:13Z|00731|binding|INFO|Setting lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 ovn-installed in OVS
Jan 23 05:30:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:13Z|00732|binding|INFO|Setting lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 up in Southbound
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.529 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:13 np0005593234 systemd[1]: Started Virtual Machine qemu-81-instance-000000af.
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.541 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6f323c0f-0498-4bb6-8754-a79238842d46]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.576 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ee4da5ed-e19a-4be1-8128-4b691b8a03d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 NetworkManager[48942]: <info>  [1769164213.5816] manager: (tapd7d5530f-50): new Veth device (/org/freedesktop/NetworkManager/Devices/349)
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.581 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7337f55a-6017-4503-928f-28e9a5b71046]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.612 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b062ac87-aa05-4911-96ed-6e5b53fc30cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.614 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[94ad78db-2c8e-42cf-8a46-e383939ba896]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 NetworkManager[48942]: <info>  [1769164213.6347] device (tapd7d5530f-50): carrier: link connected
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.639 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[1335bcc8-8f99-4a77-a38a-e09eb432b539]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.655 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8ccf4d75-1125-4303-aa1f-0a341ee0c0fb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 226], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 810859, 'reachable_time': 25656, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 307148, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.668 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4a5af235-1183-4ec0-a668-0f4c522f67c1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:67cc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 810859, 'tstamp': 810859}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 307149, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.682 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b7905cd9-379b-4650-a3ca-140d1199ec97]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 226], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 810859, 'reachable_time': 25656, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 307150, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.711 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c934de7a-be28-4a68-9dd8-8243614670f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.764 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cef85ae1-7c63-4fb9-b948-24253cb2065f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.766 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.767 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.767 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d5530f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.769 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:13 np0005593234 NetworkManager[48942]: <info>  [1769164213.7701] manager: (tapd7d5530f-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/350)
Jan 23 05:30:13 np0005593234 kernel: tapd7d5530f-50: entered promiscuous mode
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.776 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7d5530f-50, col_values=(('external_ids', {'iface-id': '4c99eeb5-c437-4d31-ac3b-bfd151140733'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.777 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:13Z|00733|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.781 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.781 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2aadfda9-e226-4fb8-a5ec-6ca651aff3dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.782 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-d7d5530f-5227-4f75-bac0-2604bb3d68e2
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID d7d5530f-5227-4f75-bac0-2604bb3d68e2
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:30:13 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:13.783 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'env', 'PROCESS_TAG=haproxy-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7d5530f-5227-4f75-bac0-2604bb3d68e2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:30:13 np0005593234 nova_compute[227762]: 2026-01-23 10:30:13.790 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:14 np0005593234 nova_compute[227762]: 2026-01-23 10:30:14.127 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164214.126336, 64ccc062-b11b-4cbc-96ba-620e43dfdb20 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:30:14 np0005593234 nova_compute[227762]: 2026-01-23 10:30:14.128 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] VM Started (Lifecycle Event)#033[00m
Jan 23 05:30:14 np0005593234 nova_compute[227762]: 2026-01-23 10:30:14.158 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:14 np0005593234 nova_compute[227762]: 2026-01-23 10:30:14.162 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164214.127793, 64ccc062-b11b-4cbc-96ba-620e43dfdb20 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:30:14 np0005593234 nova_compute[227762]: 2026-01-23 10:30:14.162 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:30:14 np0005593234 podman[307221]: 2026-01-23 10:30:14.183392287 +0000 UTC m=+0.048747074 container create 055c5f6d782a998fb85cf814f36571e53373314f9614fec17757e3ad11b96cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:30:14 np0005593234 nova_compute[227762]: 2026-01-23 10:30:14.189 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:14 np0005593234 nova_compute[227762]: 2026-01-23 10:30:14.194 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:30:14 np0005593234 nova_compute[227762]: 2026-01-23 10:30:14.215 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:30:14 np0005593234 systemd[1]: Started libpod-conmon-055c5f6d782a998fb85cf814f36571e53373314f9614fec17757e3ad11b96cf9.scope.
Jan 23 05:30:14 np0005593234 podman[307221]: 2026-01-23 10:30:14.1585459 +0000 UTC m=+0.023900707 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:30:14 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:30:14 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6444332d7aba38977c00e64875a1659f532e4b7fd2ef48b34023ee1aaa0e9a5a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:30:14 np0005593234 podman[307221]: 2026-01-23 10:30:14.273164892 +0000 UTC m=+0.138519699 container init 055c5f6d782a998fb85cf814f36571e53373314f9614fec17757e3ad11b96cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:30:14 np0005593234 podman[307221]: 2026-01-23 10:30:14.278733486 +0000 UTC m=+0.144088273 container start 055c5f6d782a998fb85cf814f36571e53373314f9614fec17757e3ad11b96cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:30:14 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[307237]: [NOTICE]   (307241) : New worker (307243) forked
Jan 23 05:30:14 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[307237]: [NOTICE]   (307241) : Loading success.
Jan 23 05:30:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:14.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:14 np0005593234 nova_compute[227762]: 2026-01-23 10:30:14.695 227766 DEBUG nova.network.neutron [req-3f024bea-4051-480e-8305-ad33bce859bf req-76e24630-fee0-4678-8603-e5b2dc2848e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Updated VIF entry in instance network info cache for port fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:30:14 np0005593234 nova_compute[227762]: 2026-01-23 10:30:14.696 227766 DEBUG nova.network.neutron [req-3f024bea-4051-480e-8305-ad33bce859bf req-76e24630-fee0-4678-8603-e5b2dc2848e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Updating instance_info_cache with network_info: [{"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:30:14 np0005593234 nova_compute[227762]: 2026-01-23 10:30:14.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:14 np0005593234 nova_compute[227762]: 2026-01-23 10:30:14.774 227766 DEBUG oslo_concurrency.lockutils [req-3f024bea-4051-480e-8305-ad33bce859bf req-76e24630-fee0-4678-8603-e5b2dc2848e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:30:15 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:30:15 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:30:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:15.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:16.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:17 np0005593234 nova_compute[227762]: 2026-01-23 10:30:17.322 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:17.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:18 np0005593234 nova_compute[227762]: 2026-01-23 10:30:18.358 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:18.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:18 np0005593234 podman[307354]: 2026-01-23 10:30:18.783499951 +0000 UTC m=+0.081996042 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:30:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:19.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:20.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:21.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:22 np0005593234 nova_compute[227762]: 2026-01-23 10:30:22.324 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:22.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:22 np0005593234 nova_compute[227762]: 2026-01-23 10:30:22.960 227766 DEBUG nova.compute.manager [req-08ee740f-b52a-407d-85cc-0ddc0bcb9618 req-065e4c36-4bc4-4007-b7e4-480dc6c0df6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:30:22 np0005593234 nova_compute[227762]: 2026-01-23 10:30:22.961 227766 DEBUG oslo_concurrency.lockutils [req-08ee740f-b52a-407d-85cc-0ddc0bcb9618 req-065e4c36-4bc4-4007-b7e4-480dc6c0df6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:22 np0005593234 nova_compute[227762]: 2026-01-23 10:30:22.961 227766 DEBUG oslo_concurrency.lockutils [req-08ee740f-b52a-407d-85cc-0ddc0bcb9618 req-065e4c36-4bc4-4007-b7e4-480dc6c0df6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:22 np0005593234 nova_compute[227762]: 2026-01-23 10:30:22.961 227766 DEBUG oslo_concurrency.lockutils [req-08ee740f-b52a-407d-85cc-0ddc0bcb9618 req-065e4c36-4bc4-4007-b7e4-480dc6c0df6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:22 np0005593234 nova_compute[227762]: 2026-01-23 10:30:22.961 227766 DEBUG nova.compute.manager [req-08ee740f-b52a-407d-85cc-0ddc0bcb9618 req-065e4c36-4bc4-4007-b7e4-480dc6c0df6a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Processing event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:30:22 np0005593234 nova_compute[227762]: 2026-01-23 10:30:22.962 227766 DEBUG nova.compute.manager [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:30:22 np0005593234 nova_compute[227762]: 2026-01-23 10:30:22.967 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164222.9674857, 64ccc062-b11b-4cbc-96ba-620e43dfdb20 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:30:22 np0005593234 nova_compute[227762]: 2026-01-23 10:30:22.967 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:30:22 np0005593234 nova_compute[227762]: 2026-01-23 10:30:22.970 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:30:22 np0005593234 nova_compute[227762]: 2026-01-23 10:30:22.975 227766 INFO nova.virt.libvirt.driver [-] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Instance spawned successfully.#033[00m
Jan 23 05:30:22 np0005593234 nova_compute[227762]: 2026-01-23 10:30:22.976 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:30:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e332 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:23 np0005593234 nova_compute[227762]: 2026-01-23 10:30:23.361 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:23.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:23 np0005593234 nova_compute[227762]: 2026-01-23 10:30:23.914 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:23 np0005593234 nova_compute[227762]: 2026-01-23 10:30:23.918 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:23 np0005593234 nova_compute[227762]: 2026-01-23 10:30:23.919 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:23 np0005593234 nova_compute[227762]: 2026-01-23 10:30:23.919 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:23 np0005593234 nova_compute[227762]: 2026-01-23 10:30:23.920 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:23 np0005593234 nova_compute[227762]: 2026-01-23 10:30:23.920 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:23 np0005593234 nova_compute[227762]: 2026-01-23 10:30:23.920 227766 DEBUG nova.virt.libvirt.driver [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:23 np0005593234 nova_compute[227762]: 2026-01-23 10:30:23.924 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:30:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e333 e333: 3 total, 3 up, 3 in
Jan 23 05:30:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:24.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:24 np0005593234 nova_compute[227762]: 2026-01-23 10:30:24.778 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:30:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:25.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:26 np0005593234 nova_compute[227762]: 2026-01-23 10:30:26.110 227766 INFO nova.compute.manager [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Took 23.22 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:30:26 np0005593234 nova_compute[227762]: 2026-01-23 10:30:26.111 227766 DEBUG nova.compute.manager [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e334 e334: 3 total, 3 up, 3 in
Jan 23 05:30:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:26.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:26 np0005593234 nova_compute[227762]: 2026-01-23 10:30:26.619 227766 INFO nova.compute.manager [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Took 30.56 seconds to build instance.#033[00m
Jan 23 05:30:27 np0005593234 nova_compute[227762]: 2026-01-23 10:30:27.037 227766 DEBUG oslo_concurrency.lockutils [None req-e22b89e2-2ed4-4ef8-b931-267217484c27 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 31.086s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:27 np0005593234 nova_compute[227762]: 2026-01-23 10:30:27.326 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e335 e335: 3 total, 3 up, 3 in
Jan 23 05:30:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:27.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:27 np0005593234 nova_compute[227762]: 2026-01-23 10:30:27.423 227766 DEBUG nova.compute.manager [req-90a139ac-8520-49ef-a96d-b6f8a0662252 req-3148f01a-beb5-4827-83c9-ba500c3a4f2f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:30:27 np0005593234 nova_compute[227762]: 2026-01-23 10:30:27.423 227766 DEBUG oslo_concurrency.lockutils [req-90a139ac-8520-49ef-a96d-b6f8a0662252 req-3148f01a-beb5-4827-83c9-ba500c3a4f2f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:27 np0005593234 nova_compute[227762]: 2026-01-23 10:30:27.424 227766 DEBUG oslo_concurrency.lockutils [req-90a139ac-8520-49ef-a96d-b6f8a0662252 req-3148f01a-beb5-4827-83c9-ba500c3a4f2f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:27 np0005593234 nova_compute[227762]: 2026-01-23 10:30:27.424 227766 DEBUG oslo_concurrency.lockutils [req-90a139ac-8520-49ef-a96d-b6f8a0662252 req-3148f01a-beb5-4827-83c9-ba500c3a4f2f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:27 np0005593234 nova_compute[227762]: 2026-01-23 10:30:27.424 227766 DEBUG nova.compute.manager [req-90a139ac-8520-49ef-a96d-b6f8a0662252 req-3148f01a-beb5-4827-83c9-ba500c3a4f2f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] No waiting events found dispatching network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:30:27 np0005593234 nova_compute[227762]: 2026-01-23 10:30:27.424 227766 WARNING nova.compute.manager [req-90a139ac-8520-49ef-a96d-b6f8a0662252 req-3148f01a-beb5-4827-83c9-ba500c3a4f2f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received unexpected event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:30:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:28 np0005593234 nova_compute[227762]: 2026-01-23 10:30:28.363 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:28.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:29 np0005593234 nova_compute[227762]: 2026-01-23 10:30:29.020 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:29.022 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=65, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=64) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:30:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:29.024 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:30:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:29.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:30:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:30.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:30:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:31.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:31 np0005593234 nova_compute[227762]: 2026-01-23 10:30:31.802 227766 DEBUG nova.compute.manager [None req-efbd8f3f-005b-4b2c-97fa-4f6e94f48d1d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:32 np0005593234 nova_compute[227762]: 2026-01-23 10:30:32.042 227766 INFO nova.compute.manager [None req-efbd8f3f-005b-4b2c-97fa-4f6e94f48d1d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] instance snapshotting#033[00m
Jan 23 05:30:32 np0005593234 nova_compute[227762]: 2026-01-23 10:30:32.329 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:32 np0005593234 nova_compute[227762]: 2026-01-23 10:30:32.396 227766 INFO nova.virt.libvirt.driver [None req-efbd8f3f-005b-4b2c-97fa-4f6e94f48d1d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Beginning live snapshot process#033[00m
Jan 23 05:30:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:32.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:32 np0005593234 nova_compute[227762]: 2026-01-23 10:30:32.717 227766 DEBUG nova.virt.libvirt.imagebackend [None req-efbd8f3f-005b-4b2c-97fa-4f6e94f48d1d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 05:30:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e335 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:33 np0005593234 nova_compute[227762]: 2026-01-23 10:30:33.088 227766 DEBUG nova.storage.rbd_utils [None req-efbd8f3f-005b-4b2c-97fa-4f6e94f48d1d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] creating snapshot(b6764d2daaa043c69538ab8f2a77dfa3) on rbd image(64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:30:33 np0005593234 nova_compute[227762]: 2026-01-23 10:30:33.367 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:33.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e336 e336: 3 total, 3 up, 3 in
Jan 23 05:30:33 np0005593234 nova_compute[227762]: 2026-01-23 10:30:33.515 227766 DEBUG nova.storage.rbd_utils [None req-efbd8f3f-005b-4b2c-97fa-4f6e94f48d1d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] cloning vms/64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk@b6764d2daaa043c69538ab8f2a77dfa3 to images/33bd8321-22af-4ee8-875d-6188b12bef8e clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:30:33 np0005593234 nova_compute[227762]: 2026-01-23 10:30:33.643 227766 DEBUG nova.storage.rbd_utils [None req-efbd8f3f-005b-4b2c-97fa-4f6e94f48d1d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] flattening images/33bd8321-22af-4ee8-875d-6188b12bef8e flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 05:30:33 np0005593234 nova_compute[227762]: 2026-01-23 10:30:33.972 227766 DEBUG nova.storage.rbd_utils [None req-efbd8f3f-005b-4b2c-97fa-4f6e94f48d1d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] removing snapshot(b6764d2daaa043c69538ab8f2a77dfa3) on rbd image(64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 05:30:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:34.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e337 e337: 3 total, 3 up, 3 in
Jan 23 05:30:34 np0005593234 nova_compute[227762]: 2026-01-23 10:30:34.535 227766 DEBUG nova.storage.rbd_utils [None req-efbd8f3f-005b-4b2c-97fa-4f6e94f48d1d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] creating snapshot(snap) on rbd image(33bd8321-22af-4ee8-875d-6188b12bef8e) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:30:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:35.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e338 e338: 3 total, 3 up, 3 in
Jan 23 05:30:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:36.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:36 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:36Z|00090|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9b:3a:01 10.100.0.11
Jan 23 05:30:36 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:36Z|00091|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9b:3a:01 10.100.0.11
Jan 23 05:30:37 np0005593234 nova_compute[227762]: 2026-01-23 10:30:37.331 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:30:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:37.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:30:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:38.026 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '65'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e338 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:38 np0005593234 nova_compute[227762]: 2026-01-23 10:30:38.367 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:38.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:38 np0005593234 nova_compute[227762]: 2026-01-23 10:30:38.621 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Acquiring lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:38 np0005593234 nova_compute[227762]: 2026-01-23 10:30:38.621 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:38 np0005593234 nova_compute[227762]: 2026-01-23 10:30:38.643 227766 DEBUG nova.compute.manager [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:30:38 np0005593234 nova_compute[227762]: 2026-01-23 10:30:38.739 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:38 np0005593234 nova_compute[227762]: 2026-01-23 10:30:38.740 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:38 np0005593234 nova_compute[227762]: 2026-01-23 10:30:38.746 227766 DEBUG nova.virt.hardware [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:30:38 np0005593234 nova_compute[227762]: 2026-01-23 10:30:38.747 227766 INFO nova.compute.claims [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:30:38 np0005593234 nova_compute[227762]: 2026-01-23 10:30:38.786 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:38 np0005593234 nova_compute[227762]: 2026-01-23 10:30:38.786 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:38 np0005593234 nova_compute[227762]: 2026-01-23 10:30:38.804 227766 DEBUG nova.compute.manager [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:30:39 np0005593234 nova_compute[227762]: 2026-01-23 10:30:39.123 227766 DEBUG oslo_concurrency.processutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:39 np0005593234 nova_compute[227762]: 2026-01-23 10:30:39.187 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:39.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:30:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/636027778' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:30:39 np0005593234 nova_compute[227762]: 2026-01-23 10:30:39.665 227766 DEBUG oslo_concurrency.processutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.543s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:39 np0005593234 nova_compute[227762]: 2026-01-23 10:30:39.674 227766 DEBUG nova.compute.provider_tree [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:30:39 np0005593234 nova_compute[227762]: 2026-01-23 10:30:39.943 227766 DEBUG nova.scheduler.client.report [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:30:40 np0005593234 nova_compute[227762]: 2026-01-23 10:30:40.216 227766 INFO nova.virt.libvirt.driver [None req-efbd8f3f-005b-4b2c-97fa-4f6e94f48d1d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Snapshot image upload complete#033[00m
Jan 23 05:30:40 np0005593234 nova_compute[227762]: 2026-01-23 10:30:40.216 227766 INFO nova.compute.manager [None req-efbd8f3f-005b-4b2c-97fa-4f6e94f48d1d e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Took 8.17 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 23 05:30:40 np0005593234 nova_compute[227762]: 2026-01-23 10:30:40.340 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:40 np0005593234 nova_compute[227762]: 2026-01-23 10:30:40.341 227766 DEBUG nova.compute.manager [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:30:40 np0005593234 nova_compute[227762]: 2026-01-23 10:30:40.343 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 1.156s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:40 np0005593234 nova_compute[227762]: 2026-01-23 10:30:40.351 227766 DEBUG nova.virt.hardware [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:30:40 np0005593234 nova_compute[227762]: 2026-01-23 10:30:40.352 227766 INFO nova.compute.claims [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:30:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:40.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:40 np0005593234 podman[307605]: 2026-01-23 10:30:40.766363683 +0000 UTC m=+0.057266850 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 23 05:30:40 np0005593234 nova_compute[227762]: 2026-01-23 10:30:40.855 227766 DEBUG nova.compute.manager [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:30:40 np0005593234 nova_compute[227762]: 2026-01-23 10:30:40.855 227766 DEBUG nova.network.neutron [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:30:40 np0005593234 nova_compute[227762]: 2026-01-23 10:30:40.889 227766 INFO nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:30:40 np0005593234 nova_compute[227762]: 2026-01-23 10:30:40.915 227766 DEBUG nova.compute.manager [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.035 227766 DEBUG oslo_concurrency.processutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.254 227766 DEBUG nova.compute.manager [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.256 227766 DEBUG nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.256 227766 INFO nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Creating image(s)#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.277 227766 DEBUG nova.storage.rbd_utils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] rbd image 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.299 227766 DEBUG nova.storage.rbd_utils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] rbd image 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.323 227766 DEBUG nova.storage.rbd_utils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] rbd image 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.326 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Acquiring lock "24b5f14653c54ae0908f395335c2f9a23eca4957" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.327 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "24b5f14653c54ae0908f395335c2f9a23eca4957" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:41.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:30:41 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4029670328' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.470 227766 DEBUG oslo_concurrency.processutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.475 227766 DEBUG nova.compute.provider_tree [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.563 227766 DEBUG nova.virt.libvirt.imagebackend [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/81a92860-f94f-4274-aba5-1ec35fd1f681/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/81a92860-f94f-4274-aba5-1ec35fd1f681/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.624 227766 DEBUG nova.scheduler.client.report [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.631 227766 DEBUG nova.virt.libvirt.imagebackend [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Selected location: {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/81a92860-f94f-4274-aba5-1ec35fd1f681/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.632 227766 DEBUG nova.storage.rbd_utils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] cloning images/81a92860-f94f-4274-aba5-1ec35fd1f681@snap to None/6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.938 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.939 227766 DEBUG nova.compute.manager [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.950 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "24b5f14653c54ae0908f395335c2f9a23eca4957" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:41 np0005593234 nova_compute[227762]: 2026-01-23 10:30:41.996 227766 DEBUG nova.policy [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '8e1f41f21f79408d8dff1331cfd1e0db', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '7be5cb5abaf44b0a9c0c307d348d8f75', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.047 227766 DEBUG nova.compute.manager [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.048 227766 DEBUG nova.network.neutron [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.108 227766 DEBUG nova.objects.instance [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lazy-loading 'migration_context' on Instance uuid 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.334 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:42.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.457 227766 INFO nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.461 227766 DEBUG nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.461 227766 DEBUG nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Ensure instance console log exists: /var/lib/nova/instances/6ed27eef-aee3-4b7d-a31f-8b7d753a25b9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.462 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.462 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.462 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.482 227766 DEBUG nova.compute.manager [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:30:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:42.866 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:42.867 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:42.868 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.879 227766 DEBUG nova.compute.manager [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.880 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.881 227766 INFO nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Creating image(s)#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.907 227766 DEBUG nova.storage.rbd_utils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image f10b70f9-c203-4706-8e68-a3c1cd3af7a9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.948 227766 DEBUG nova.storage.rbd_utils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image f10b70f9-c203-4706-8e68-a3c1cd3af7a9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.972 227766 DEBUG nova.storage.rbd_utils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image f10b70f9-c203-4706-8e68-a3c1cd3af7a9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:42 np0005593234 nova_compute[227762]: 2026-01-23 10:30:42.976 227766 DEBUG oslo_concurrency.processutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:43 np0005593234 nova_compute[227762]: 2026-01-23 10:30:43.001 227766 DEBUG nova.policy [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60291ce86b6946629a2e48f6680312cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:30:43 np0005593234 nova_compute[227762]: 2026-01-23 10:30:43.040 227766 DEBUG oslo_concurrency.processutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:43 np0005593234 nova_compute[227762]: 2026-01-23 10:30:43.040 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:43 np0005593234 nova_compute[227762]: 2026-01-23 10:30:43.041 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:43 np0005593234 nova_compute[227762]: 2026-01-23 10:30:43.041 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:43 np0005593234 nova_compute[227762]: 2026-01-23 10:30:43.064 227766 DEBUG nova.storage.rbd_utils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image f10b70f9-c203-4706-8e68-a3c1cd3af7a9_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:43 np0005593234 nova_compute[227762]: 2026-01-23 10:30:43.067 227766 DEBUG oslo_concurrency.processutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 f10b70f9-c203-4706-8e68-a3c1cd3af7a9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e338 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:43 np0005593234 nova_compute[227762]: 2026-01-23 10:30:43.370 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:30:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:43.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:30:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e339 e339: 3 total, 3 up, 3 in
Jan 23 05:30:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:44.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:44 np0005593234 nova_compute[227762]: 2026-01-23 10:30:44.892 227766 DEBUG oslo_concurrency.processutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 f10b70f9-c203-4706-8e68-a3c1cd3af7a9_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.825s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:44 np0005593234 nova_compute[227762]: 2026-01-23 10:30:44.970 227766 DEBUG nova.storage.rbd_utils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] resizing rbd image f10b70f9-c203-4706-8e68-a3c1cd3af7a9_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:30:45 np0005593234 nova_compute[227762]: 2026-01-23 10:30:45.075 227766 DEBUG nova.objects.instance [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'migration_context' on Instance uuid f10b70f9-c203-4706-8e68-a3c1cd3af7a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:30:45 np0005593234 nova_compute[227762]: 2026-01-23 10:30:45.128 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:30:45 np0005593234 nova_compute[227762]: 2026-01-23 10:30:45.129 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Ensure instance console log exists: /var/lib/nova/instances/f10b70f9-c203-4706-8e68-a3c1cd3af7a9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:30:45 np0005593234 nova_compute[227762]: 2026-01-23 10:30:45.129 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:45 np0005593234 nova_compute[227762]: 2026-01-23 10:30:45.129 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:45 np0005593234 nova_compute[227762]: 2026-01-23 10:30:45.130 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:30:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:45.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:30:45 np0005593234 nova_compute[227762]: 2026-01-23 10:30:45.420 227766 INFO nova.compute.manager [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Rescuing#033[00m
Jan 23 05:30:45 np0005593234 nova_compute[227762]: 2026-01-23 10:30:45.421 227766 DEBUG oslo_concurrency.lockutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:30:45 np0005593234 nova_compute[227762]: 2026-01-23 10:30:45.421 227766 DEBUG oslo_concurrency.lockutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquired lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:30:45 np0005593234 nova_compute[227762]: 2026-01-23 10:30:45.421 227766 DEBUG nova.network.neutron [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:30:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:46.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:46 np0005593234 nova_compute[227762]: 2026-01-23 10:30:46.478 227766 DEBUG nova.network.neutron [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Successfully created port: a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:30:47 np0005593234 nova_compute[227762]: 2026-01-23 10:30:47.336 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:47.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:47 np0005593234 nova_compute[227762]: 2026-01-23 10:30:47.448 227766 DEBUG nova.network.neutron [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Successfully created port: 85f614c4-b9b2-474a-a52e-5acbcb0a43c5 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:30:47 np0005593234 nova_compute[227762]: 2026-01-23 10:30:47.579 227766 DEBUG nova.network.neutron [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Updating instance_info_cache with network_info: [{"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:30:47 np0005593234 nova_compute[227762]: 2026-01-23 10:30:47.605 227766 DEBUG oslo_concurrency.lockutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Releasing lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:30:47 np0005593234 nova_compute[227762]: 2026-01-23 10:30:47.930 227766 DEBUG nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:30:47 np0005593234 nova_compute[227762]: 2026-01-23 10:30:47.953 227766 DEBUG nova.network.neutron [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Successfully updated port: a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:30:47 np0005593234 nova_compute[227762]: 2026-01-23 10:30:47.975 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Acquiring lock "refresh_cache-6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:30:47 np0005593234 nova_compute[227762]: 2026-01-23 10:30:47.975 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Acquired lock "refresh_cache-6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:30:47 np0005593234 nova_compute[227762]: 2026-01-23 10:30:47.976 227766 DEBUG nova.network.neutron [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:30:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:48 np0005593234 nova_compute[227762]: 2026-01-23 10:30:48.166 227766 DEBUG nova.compute.manager [req-bd22cd33-b298-4664-9b1c-e9e618eb2194 req-72eacd10-9975-42ae-870a-c8fc1829adea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Received event network-changed-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:30:48 np0005593234 nova_compute[227762]: 2026-01-23 10:30:48.166 227766 DEBUG nova.compute.manager [req-bd22cd33-b298-4664-9b1c-e9e618eb2194 req-72eacd10-9975-42ae-870a-c8fc1829adea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Refreshing instance network info cache due to event network-changed-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:30:48 np0005593234 nova_compute[227762]: 2026-01-23 10:30:48.167 227766 DEBUG oslo_concurrency.lockutils [req-bd22cd33-b298-4664-9b1c-e9e618eb2194 req-72eacd10-9975-42ae-870a-c8fc1829adea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:30:48 np0005593234 nova_compute[227762]: 2026-01-23 10:30:48.279 227766 DEBUG nova.network.neutron [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:30:48 np0005593234 nova_compute[227762]: 2026-01-23 10:30:48.372 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:48.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:30:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:49.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:30:49 np0005593234 podman[307993]: 2026-01-23 10:30:49.823968628 +0000 UTC m=+0.115303283 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true)
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.282 227766 DEBUG nova.network.neutron [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Successfully updated port: 85f614c4-b9b2-474a-a52e-5acbcb0a43c5 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.318 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "refresh_cache-f10b70f9-c203-4706-8e68-a3c1cd3af7a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.319 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquired lock "refresh_cache-f10b70f9-c203-4706-8e68-a3c1cd3af7a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.319 227766 DEBUG nova.network.neutron [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:30:50 np0005593234 kernel: tapfc7eda8e-2c (unregistering): left promiscuous mode
Jan 23 05:30:50 np0005593234 NetworkManager[48942]: <info>  [1769164250.3970] device (tapfc7eda8e-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:30:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:50Z|00734|binding|INFO|Releasing lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 from this chassis (sb_readonly=0)
Jan 23 05:30:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:50Z|00735|binding|INFO|Setting lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 down in Southbound
Jan 23 05:30:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:50Z|00736|binding|INFO|Removing iface tapfc7eda8e-2c ovn-installed in OVS
Jan 23 05:30:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:50.410 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:3a:01 10.100.0.11'], port_security=['fa:16:3e:9b:3a:01 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '64ccc062-b11b-4cbc-96ba-620e43dfdb20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:30:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:50.411 144381 INFO neutron.agent.ovn.metadata.agent [-] Port fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 unbound from our chassis#033[00m
Jan 23 05:30:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:50.412 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7d5530f-5227-4f75-bac0-2604bb3d68e2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:30:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:50.414 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[090ef38d-c5e7-4ba5-9937-eb145569d74f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:50.414 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 namespace which is not needed anymore#033[00m
Jan 23 05:30:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:50.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:50 np0005593234 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000af.scope: Deactivated successfully.
Jan 23 05:30:50 np0005593234 systemd[1]: machine-qemu\x2d81\x2dinstance\x2d000000af.scope: Consumed 14.185s CPU time.
Jan 23 05:30:50 np0005593234 systemd-machined[195626]: Machine qemu-81-instance-000000af terminated.
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.500 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:50 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[307237]: [NOTICE]   (307241) : haproxy version is 2.8.14-c23fe91
Jan 23 05:30:50 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[307237]: [NOTICE]   (307241) : path to executable is /usr/sbin/haproxy
Jan 23 05:30:50 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[307237]: [WARNING]  (307241) : Exiting Master process...
Jan 23 05:30:50 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[307237]: [ALERT]    (307241) : Current worker (307243) exited with code 143 (Terminated)
Jan 23 05:30:50 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[307237]: [WARNING]  (307241) : All workers exited. Exiting... (0)
Jan 23 05:30:50 np0005593234 systemd[1]: libpod-055c5f6d782a998fb85cf814f36571e53373314f9614fec17757e3ad11b96cf9.scope: Deactivated successfully.
Jan 23 05:30:50 np0005593234 podman[308045]: 2026-01-23 10:30:50.547319208 +0000 UTC m=+0.041660733 container died 055c5f6d782a998fb85cf814f36571e53373314f9614fec17757e3ad11b96cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:30:50 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-055c5f6d782a998fb85cf814f36571e53373314f9614fec17757e3ad11b96cf9-userdata-shm.mount: Deactivated successfully.
Jan 23 05:30:50 np0005593234 systemd[1]: var-lib-containers-storage-overlay-6444332d7aba38977c00e64875a1659f532e4b7fd2ef48b34023ee1aaa0e9a5a-merged.mount: Deactivated successfully.
Jan 23 05:30:50 np0005593234 podman[308045]: 2026-01-23 10:30:50.586035128 +0000 UTC m=+0.080376653 container cleanup 055c5f6d782a998fb85cf814f36571e53373314f9614fec17757e3ad11b96cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:30:50 np0005593234 systemd[1]: libpod-conmon-055c5f6d782a998fb85cf814f36571e53373314f9614fec17757e3ad11b96cf9.scope: Deactivated successfully.
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.631 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.635 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:50 np0005593234 podman[308076]: 2026-01-23 10:30:50.651660598 +0000 UTC m=+0.045971966 container remove 055c5f6d782a998fb85cf814f36571e53373314f9614fec17757e3ad11b96cf9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:30:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:50.656 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dbdfdbd7-d3ad-4c3f-be00-a1e6450c15fa]: (4, ('Fri Jan 23 10:30:50 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 (055c5f6d782a998fb85cf814f36571e53373314f9614fec17757e3ad11b96cf9)\n055c5f6d782a998fb85cf814f36571e53373314f9614fec17757e3ad11b96cf9\nFri Jan 23 10:30:50 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 (055c5f6d782a998fb85cf814f36571e53373314f9614fec17757e3ad11b96cf9)\n055c5f6d782a998fb85cf814f36571e53373314f9614fec17757e3ad11b96cf9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:50.658 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ad13b1be-6fc3-469e-b7d2-673487160c25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:50.660 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.661 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:50 np0005593234 kernel: tapd7d5530f-50: left promiscuous mode
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.679 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:50.682 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5205564e-4487-4a82-bc0e-4e464a7e8379]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:50.695 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7d17a423-7d42-4c30-b178-23c558065de5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:50.697 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d2bf3a1c-aad5-4845-bca5-87e0837433c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:50.713 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d90f36dc-15fb-493e-8250-979a0b0810a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 810853, 'reachable_time': 42282, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308103, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:50 np0005593234 systemd[1]: run-netns-ovnmeta\x2dd7d5530f\x2d5227\x2d4f75\x2dbac0\x2d2604bb3d68e2.mount: Deactivated successfully.
Jan 23 05:30:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:50.716 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:30:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:50.717 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[cd4c5be0-0675-473b-83ef-1004a69879e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.717 227766 DEBUG nova.compute.manager [req-bab7ef6b-c3bb-4388-8c5f-ca96274776a9 req-7920343d-1902-4b0a-b109-cc8be5cd1c64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-vif-unplugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.717 227766 DEBUG oslo_concurrency.lockutils [req-bab7ef6b-c3bb-4388-8c5f-ca96274776a9 req-7920343d-1902-4b0a-b109-cc8be5cd1c64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.718 227766 DEBUG oslo_concurrency.lockutils [req-bab7ef6b-c3bb-4388-8c5f-ca96274776a9 req-7920343d-1902-4b0a-b109-cc8be5cd1c64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.718 227766 DEBUG oslo_concurrency.lockutils [req-bab7ef6b-c3bb-4388-8c5f-ca96274776a9 req-7920343d-1902-4b0a-b109-cc8be5cd1c64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.718 227766 DEBUG nova.compute.manager [req-bab7ef6b-c3bb-4388-8c5f-ca96274776a9 req-7920343d-1902-4b0a-b109-cc8be5cd1c64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] No waiting events found dispatching network-vif-unplugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.718 227766 WARNING nova.compute.manager [req-bab7ef6b-c3bb-4388-8c5f-ca96274776a9 req-7920343d-1902-4b0a-b109-cc8be5cd1c64 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received unexpected event network-vif-unplugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.731 227766 DEBUG nova.network.neutron [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Updating instance_info_cache with network_info: [{"id": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "address": "fa:16:3e:4f:dc:d1", "network": {"id": "bd95237d-0845-479e-9505-318e01879565", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-2097735183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7be5cb5abaf44b0a9c0c307d348d8f75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c66cbc-51", "ovs_interfaceid": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.736 227766 DEBUG nova.compute.manager [req-767e2c28-50e8-48a7-8f3f-769ecacd2462 req-db4eb2bc-d1b9-4edf-a24d-bce3651a6ef1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Received event network-changed-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.736 227766 DEBUG nova.compute.manager [req-767e2c28-50e8-48a7-8f3f-769ecacd2462 req-db4eb2bc-d1b9-4edf-a24d-bce3651a6ef1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Refreshing instance network info cache due to event network-changed-85f614c4-b9b2-474a-a52e-5acbcb0a43c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.736 227766 DEBUG oslo_concurrency.lockutils [req-767e2c28-50e8-48a7-8f3f-769ecacd2462 req-db4eb2bc-d1b9-4edf-a24d-bce3651a6ef1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-f10b70f9-c203-4706-8e68-a3c1cd3af7a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.750 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Releasing lock "refresh_cache-6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.750 227766 DEBUG nova.compute.manager [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Instance network_info: |[{"id": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "address": "fa:16:3e:4f:dc:d1", "network": {"id": "bd95237d-0845-479e-9505-318e01879565", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-2097735183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7be5cb5abaf44b0a9c0c307d348d8f75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c66cbc-51", "ovs_interfaceid": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.751 227766 DEBUG oslo_concurrency.lockutils [req-bd22cd33-b298-4664-9b1c-e9e618eb2194 req-72eacd10-9975-42ae-870a-c8fc1829adea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.751 227766 DEBUG nova.network.neutron [req-bd22cd33-b298-4664-9b1c-e9e618eb2194 req-72eacd10-9975-42ae-870a-c8fc1829adea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Refreshing network info cache for port a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.754 227766 DEBUG nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Start _get_guest_xml network_info=[{"id": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "address": "fa:16:3e:4f:dc:d1", "network": {"id": "bd95237d-0845-479e-9505-318e01879565", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-2097735183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7be5cb5abaf44b0a9c0c307d348d8f75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c66cbc-51", "ovs_interfaceid": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-23T10:30:19Z,direct_url=<?>,disk_format='raw',id=81a92860-f94f-4274-aba5-1ec35fd1f681,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-1660473649',owner='7be5cb5abaf44b0a9c0c307d348d8f75',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-23T10:30:31Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '81a92860-f94f-4274-aba5-1ec35fd1f681'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.758 227766 WARNING nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.762 227766 DEBUG nova.virt.libvirt.host [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.763 227766 DEBUG nova.virt.libvirt.host [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.765 227766 DEBUG nova.virt.libvirt.host [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.766 227766 DEBUG nova.virt.libvirt.host [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.767 227766 DEBUG nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.767 227766 DEBUG nova.virt.hardware [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-23T10:30:19Z,direct_url=<?>,disk_format='raw',id=81a92860-f94f-4274-aba5-1ec35fd1f681,min_disk=1,min_ram=0,name='tempest-TestSnapshotPatternsnapshot-1660473649',owner='7be5cb5abaf44b0a9c0c307d348d8f75',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-23T10:30:31Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.767 227766 DEBUG nova.virt.hardware [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.768 227766 DEBUG nova.virt.hardware [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.768 227766 DEBUG nova.virt.hardware [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.768 227766 DEBUG nova.virt.hardware [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.768 227766 DEBUG nova.virt.hardware [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.768 227766 DEBUG nova.virt.hardware [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.769 227766 DEBUG nova.virt.hardware [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.769 227766 DEBUG nova.virt.hardware [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.769 227766 DEBUG nova.virt.hardware [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.769 227766 DEBUG nova.virt.hardware [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.771 227766 DEBUG oslo_concurrency.processutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.794 227766 DEBUG nova.network.neutron [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.944 227766 INFO nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.949 227766 INFO nova.virt.libvirt.driver [-] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Instance destroyed successfully.#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.950 227766 DEBUG nova.objects.instance [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:30:50 np0005593234 nova_compute[227762]: 2026-01-23 10:30:50.975 227766 INFO nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Attempting a stable device rescue#033[00m
Jan 23 05:30:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:30:51 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3597543033' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.188 227766 DEBUG oslo_concurrency.processutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.227 227766 DEBUG nova.storage.rbd_utils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] rbd image 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.230 227766 DEBUG oslo_concurrency.processutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.281 227766 DEBUG nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.286 227766 DEBUG nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.287 227766 INFO nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Creating image(s)#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.314 227766 DEBUG nova.storage.rbd_utils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.318 227766 DEBUG nova.objects.instance [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.377 227766 DEBUG nova.storage.rbd_utils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.408 227766 DEBUG nova.storage.rbd_utils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.412 227766 DEBUG oslo_concurrency.lockutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "f7dea98795a45aafbdd9781e1b01169c66716d24" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.413 227766 DEBUG oslo_concurrency.lockutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "f7dea98795a45aafbdd9781e1b01169c66716d24" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:51.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.642 227766 DEBUG nova.virt.libvirt.imagebackend [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/33bd8321-22af-4ee8-875d-6188b12bef8e/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/33bd8321-22af-4ee8-875d-6188b12bef8e/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 05:30:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:30:51 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3382498619' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.699 227766 DEBUG oslo_concurrency.processutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.701 227766 DEBUG nova.virt.libvirt.vif [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:30:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-720792772',display_name='tempest-TestSnapshotPattern-server-720792772',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-720792772',id=177,image_ref='81a92860-f94f-4274-aba5-1ec35fd1f681',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO0lEKBsepjSG1BUrT2qgopJ/7aCoBcgDi3hhuJKTvppGpJeuS7bRrTAjsHpfJAjqSviKitZ9vmMFVrUxqv9t4cjKwPE6pfdP8/KJg/bYjfHtBTugoC0prDbk1bWow1ivA==',key_name='tempest-TestSnapshotPattern-313488550',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7be5cb5abaf44b0a9c0c307d348d8f75',ramdisk_id='',reservation_id='r-as8m7jan',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='713eba08-716b-48ed-866e-e231d09ebfaf',image_min_disk='1',image_min_ram='0',image_owner_id='7be5cb5abaf44b0a9c0c307d348d8f75',image_owner_project_name='tempest-TestSnapshotPattern-428739353',image_owner_user_name='tempest-TestSnapshotPattern-428739353-project-member',image_user_id='8e1f41f21f79408d8dff1331cfd1e0db',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-428739353',owner_user_name='tempest-TestSnapshotPattern-428739353-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:30:40Z,user_data=None,user_id='8e1f41f21f79408d8dff1331cfd1e0db',uuid=6ed27eef-aee3-4b7d-a31f-8b7d753a25b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "address": "fa:16:3e:4f:dc:d1", "network": {"id": "bd95237d-0845-479e-9505-318e01879565", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-2097735183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7be5cb5abaf44b0a9c0c307d348d8f75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c66cbc-51", "ovs_interfaceid": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.701 227766 DEBUG nova.network.os_vif_util [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Converting VIF {"id": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "address": "fa:16:3e:4f:dc:d1", "network": {"id": "bd95237d-0845-479e-9505-318e01879565", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-2097735183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7be5cb5abaf44b0a9c0c307d348d8f75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c66cbc-51", "ovs_interfaceid": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.702 227766 DEBUG nova.network.os_vif_util [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:dc:d1,bridge_name='br-int',has_traffic_filtering=True,id=a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471,network=Network(bd95237d-0845-479e-9505-318e01879565),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2c66cbc-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.703 227766 DEBUG nova.objects.instance [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.708 227766 DEBUG nova.virt.libvirt.imagebackend [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Selected location: {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/33bd8321-22af-4ee8-875d-6188b12bef8e/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.709 227766 DEBUG nova.storage.rbd_utils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] cloning images/33bd8321-22af-4ee8-875d-6188b12bef8e@snap to None/64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.rescue clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.739 227766 DEBUG nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  <uuid>6ed27eef-aee3-4b7d-a31f-8b7d753a25b9</uuid>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  <name>instance-000000b1</name>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestSnapshotPattern-server-720792772</nova:name>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:30:50</nova:creationTime>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <nova:user uuid="8e1f41f21f79408d8dff1331cfd1e0db">tempest-TestSnapshotPattern-428739353-project-member</nova:user>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <nova:project uuid="7be5cb5abaf44b0a9c0c307d348d8f75">tempest-TestSnapshotPattern-428739353</nova:project>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="81a92860-f94f-4274-aba5-1ec35fd1f681"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <nova:port uuid="a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <entry name="serial">6ed27eef-aee3-4b7d-a31f-8b7d753a25b9</entry>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <entry name="uuid">6ed27eef-aee3-4b7d-a31f-8b7d753a25b9</entry>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_disk">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_disk.config">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:4f:dc:d1"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <target dev="tapa2c66cbc-51"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/6ed27eef-aee3-4b7d-a31f-8b7d753a25b9/console.log" append="off"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <input type="keyboard" bus="usb"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:30:51 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:30:51 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:30:51 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:30:51 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.740 227766 DEBUG nova.compute.manager [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Preparing to wait for external event network-vif-plugged-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.740 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Acquiring lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.741 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.741 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.741 227766 DEBUG nova.virt.libvirt.vif [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:30:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-720792772',display_name='tempest-TestSnapshotPattern-server-720792772',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-720792772',id=177,image_ref='81a92860-f94f-4274-aba5-1ec35fd1f681',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO0lEKBsepjSG1BUrT2qgopJ/7aCoBcgDi3hhuJKTvppGpJeuS7bRrTAjsHpfJAjqSviKitZ9vmMFVrUxqv9t4cjKwPE6pfdP8/KJg/bYjfHtBTugoC0prDbk1bWow1ivA==',key_name='tempest-TestSnapshotPattern-313488550',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='7be5cb5abaf44b0a9c0c307d348d8f75',ramdisk_id='',reservation_id='r-as8m7jan',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='713eba08-716b-48ed-866e-e231d09ebfaf',image_min_disk='1',image_min_ram='0',image_owner_id='7be5cb5abaf44b0a9c0c307d348d8f75',image_owner_project_name='tempest-TestSnapshotPattern-428739353',image_owner_user_name='tempest-TestSnapshotPattern-428739353-project-member',image_user_id='8e1f41f21f79408d8dff1331cfd1e0db',image_version='8.0',network_allocated='True',owner_project_name='tempest-TestSnapshotPattern-428739353',owner_user_name='tempest-TestSnapshotPattern-428739353-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:30:40Z,user_data=None,user_id='8e1f41f21f79408d8dff1331cfd1e0db',uuid=6ed27eef-aee3-4b7d-a31f-8b7d753a25b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "address": "fa:16:3e:4f:dc:d1", "network": {"id": "bd95237d-0845-479e-9505-318e01879565", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-2097735183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7be5cb5abaf44b0a9c0c307d348d8f75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c66cbc-51", "ovs_interfaceid": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.742 227766 DEBUG nova.network.os_vif_util [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Converting VIF {"id": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "address": "fa:16:3e:4f:dc:d1", "network": {"id": "bd95237d-0845-479e-9505-318e01879565", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-2097735183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7be5cb5abaf44b0a9c0c307d348d8f75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c66cbc-51", "ovs_interfaceid": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.742 227766 DEBUG nova.network.os_vif_util [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4f:dc:d1,bridge_name='br-int',has_traffic_filtering=True,id=a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471,network=Network(bd95237d-0845-479e-9505-318e01879565),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2c66cbc-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.743 227766 DEBUG os_vif [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:dc:d1,bridge_name='br-int',has_traffic_filtering=True,id=a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471,network=Network(bd95237d-0845-479e-9505-318e01879565),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2c66cbc-51') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.743 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.744 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.744 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.748 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.748 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2c66cbc-51, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.749 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa2c66cbc-51, col_values=(('external_ids', {'iface-id': 'a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4f:dc:d1', 'vm-uuid': '6ed27eef-aee3-4b7d-a31f-8b7d753a25b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.750 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:51 np0005593234 NetworkManager[48942]: <info>  [1769164251.7511] manager: (tapa2c66cbc-51): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/351)
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.752 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.758 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.758 227766 INFO os_vif [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4f:dc:d1,bridge_name='br-int',has_traffic_filtering=True,id=a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471,network=Network(bd95237d-0845-479e-9505-318e01879565),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2c66cbc-51')#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.766 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.767 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.767 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.767 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.768 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.859 227766 DEBUG nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.859 227766 DEBUG nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.860 227766 DEBUG nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] No VIF found with MAC fa:16:3e:4f:dc:d1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.861 227766 INFO nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Using config drive#033[00m
Jan 23 05:30:51 np0005593234 nova_compute[227762]: 2026-01-23 10:30:51.891 227766 DEBUG nova.storage.rbd_utils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] rbd image 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:30:52 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4135882153' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.180 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.412s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.265 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.266 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.268 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000b1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.269 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000b1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.305 227766 INFO nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Creating config drive at /var/lib/nova/instances/6ed27eef-aee3-4b7d-a31f-8b7d753a25b9/disk.config#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.312 227766 DEBUG oslo_concurrency.processutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6ed27eef-aee3-4b7d-a31f-8b7d753a25b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4m8b3ipv execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.359 227766 DEBUG oslo_concurrency.lockutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "f7dea98795a45aafbdd9781e1b01169c66716d24" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.945s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.414 227766 DEBUG nova.objects.instance [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'migration_context' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.433 227766 DEBUG nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.435 227766 DEBUG nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Start _get_guest_xml network_info=[{"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "vif_mac": "fa:16:3e:9b:3a:01"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'scsi', 'dev': 'sdb', 'type': 'disk'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '33bd8321-22af-4ee8-875d-6188b12bef8e', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.436 227766 DEBUG nova.objects.instance [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'resources' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:30:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:52.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.445 227766 DEBUG oslo_concurrency.processutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6ed27eef-aee3-4b7d-a31f-8b7d753a25b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4m8b3ipv" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.473 227766 DEBUG nova.storage.rbd_utils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] rbd image 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.477 227766 DEBUG oslo_concurrency.processutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6ed27eef-aee3-4b7d-a31f-8b7d753a25b9/disk.config 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.513 227766 WARNING nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.521 227766 DEBUG nova.virt.libvirt.host [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.522 227766 DEBUG nova.virt.libvirt.host [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.525 227766 DEBUG nova.virt.libvirt.host [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.526 227766 DEBUG nova.virt.libvirt.host [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.527 227766 DEBUG nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.528 227766 DEBUG nova.virt.hardware [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.528 227766 DEBUG nova.virt.hardware [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.528 227766 DEBUG nova.virt.hardware [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.529 227766 DEBUG nova.virt.hardware [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.529 227766 DEBUG nova.virt.hardware [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.529 227766 DEBUG nova.virt.hardware [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.529 227766 DEBUG nova.virt.hardware [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.530 227766 DEBUG nova.virt.hardware [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.530 227766 DEBUG nova.virt.hardware [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.530 227766 DEBUG nova.virt.hardware [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.531 227766 DEBUG nova.virt.hardware [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.531 227766 DEBUG nova.objects.instance [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.553 227766 DEBUG oslo_concurrency.processutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.645 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.646 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4237MB free_disk=20.809986114501953GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.646 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.647 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.659 227766 DEBUG oslo_concurrency.processutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6ed27eef-aee3-4b7d-a31f-8b7d753a25b9/disk.config 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.181s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.659 227766 INFO nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Deleting local config drive /var/lib/nova/instances/6ed27eef-aee3-4b7d-a31f-8b7d753a25b9/disk.config because it was imported into RBD.#033[00m
Jan 23 05:30:52 np0005593234 kernel: tapa2c66cbc-51: entered promiscuous mode
Jan 23 05:30:52 np0005593234 systemd-udevd[308026]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:30:52 np0005593234 NetworkManager[48942]: <info>  [1769164252.7077] manager: (tapa2c66cbc-51): new Tun device (/org/freedesktop/NetworkManager/Devices/352)
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.712 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:52Z|00737|binding|INFO|Claiming lport a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 for this chassis.
Jan 23 05:30:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:52Z|00738|binding|INFO|a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471: Claiming fa:16:3e:4f:dc:d1 10.100.0.9
Jan 23 05:30:52 np0005593234 NetworkManager[48942]: <info>  [1769164252.7211] device (tapa2c66cbc-51): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:30:52 np0005593234 NetworkManager[48942]: <info>  [1769164252.7216] device (tapa2c66cbc-51): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.726 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.730 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:52 np0005593234 NetworkManager[48942]: <info>  [1769164252.7311] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/353)
Jan 23 05:30:52 np0005593234 NetworkManager[48942]: <info>  [1769164252.7315] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/354)
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.735 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:dc:d1 10.100.0.9'], port_security=['fa:16:3e:4f:dc:d1 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6ed27eef-aee3-4b7d-a31f-8b7d753a25b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd95237d-0845-479e-9505-318e01879565', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7be5cb5abaf44b0a9c0c307d348d8f75', 'neutron:revision_number': '2', 'neutron:security_group_ids': '9caee2dd-fa48-495f-923a-9b90f0b8d219', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fddb1949-170b-4939-a509-14ac4d8149d1, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.737 144381 INFO neutron.agent.ovn.metadata.agent [-] Port a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 in datapath bd95237d-0845-479e-9505-318e01879565 bound to our chassis#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.738 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bd95237d-0845-479e-9505-318e01879565#033[00m
Jan 23 05:30:52 np0005593234 systemd-machined[195626]: New machine qemu-82-instance-000000b1.
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.749 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[db42ba33-0f19-43fc-a069-bd4b245f972a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.750 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbd95237d-01 in ovnmeta-bd95237d-0845-479e-9505-318e01879565 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.757 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbd95237d-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.757 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d8f617ee-5328-4050-be84-67ec0ca3b2fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.759 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fcae08b0-184c-49c2-ab8c-9d29230d4a61]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.770 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[ef0644c8-28aa-48fb-874c-a4771710fafd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:52 np0005593234 systemd[1]: Started Virtual Machine qemu-82-instance-000000b1.
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.793 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 64ccc062-b11b-4cbc-96ba-620e43dfdb20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.794 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.795 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance f10b70f9-c203-4706-8e68-a3c1cd3af7a9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.796 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.796 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.797 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[256e1330-eed4-48bd-b849-8ca3ac1d9d91]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.828 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ef69e9d0-6426-486f-a819-dbc270aa1a79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.835 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[af388f97-3b1b-4e02-9fb3-ca85a6b133da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:52 np0005593234 NetworkManager[48942]: <info>  [1769164252.8366] manager: (tapbd95237d-00): new Veth device (/org/freedesktop/NetworkManager/Devices/355)
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.870 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9b64ea29-1189-4e19-8203-5097230bba11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.873 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f4faa19d-43cb-4b72-a8e1-f61c404f2e09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:52 np0005593234 NetworkManager[48942]: <info>  [1769164252.8978] device (tapbd95237d-00): carrier: link connected
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.904 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[47a8ade0-370f-4ff6-9683-f4988a8fcd9d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.920 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cb48770b-bf4e-4618-959e-d40ac20c3fff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd95237d-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:77:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 814785, 'reachable_time': 35283, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308460, 'error': None, 'target': 'ovnmeta-bd95237d-0845-479e-9505-318e01879565', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.924 227766 DEBUG nova.compute.manager [req-5093c292-6f9f-4185-a517-ba396baec061 req-9deee166-4885-4add-b36a-ea0d70cee6ea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.924 227766 DEBUG oslo_concurrency.lockutils [req-5093c292-6f9f-4185-a517-ba396baec061 req-9deee166-4885-4add-b36a-ea0d70cee6ea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.924 227766 DEBUG oslo_concurrency.lockutils [req-5093c292-6f9f-4185-a517-ba396baec061 req-9deee166-4885-4add-b36a-ea0d70cee6ea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.925 227766 DEBUG oslo_concurrency.lockutils [req-5093c292-6f9f-4185-a517-ba396baec061 req-9deee166-4885-4add-b36a-ea0d70cee6ea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.925 227766 DEBUG nova.compute.manager [req-5093c292-6f9f-4185-a517-ba396baec061 req-9deee166-4885-4add-b36a-ea0d70cee6ea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] No waiting events found dispatching network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.925 227766 WARNING nova.compute.manager [req-5093c292-6f9f-4185-a517-ba396baec061 req-9deee166-4885-4add-b36a-ea0d70cee6ea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received unexpected event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.935 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0e76df7c-ddcf-41b4-9d3b-9e28511c9f98]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee9:776b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 814785, 'tstamp': 814785}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308461, 'error': None, 'target': 'ovnmeta-bd95237d-0845-479e-9505-318e01879565', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.950 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe61a63-ddf8-485c-b8d1-dd28ff76db68]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd95237d-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:77:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 229], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 814785, 'reachable_time': 35283, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308462, 'error': None, 'target': 'ovnmeta-bd95237d-0845-479e-9505-318e01879565', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.974 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:52 np0005593234 nova_compute[227762]: 2026-01-23 10:30:52.978 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:52.981 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[236d211b-7c87-4609-a49e-e518bb00fc04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.006 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:30:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2316458686' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:30:53 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:53Z|00739|binding|INFO|Setting lport a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 ovn-installed in OVS
Jan 23 05:30:53 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:53Z|00740|binding|INFO|Setting lport a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 up in Southbound
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.021 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.034 227766 DEBUG oslo_concurrency.processutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:53.036 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[671de7dc-d1dd-4e45-93a1-cb4d0cddabd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:53.037 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd95237d-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:53.038 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:53.038 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd95237d-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:53 np0005593234 NetworkManager[48942]: <info>  [1769164253.0406] manager: (tapbd95237d-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/356)
Jan 23 05:30:53 np0005593234 kernel: tapbd95237d-00: entered promiscuous mode
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:53.047 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbd95237d-00, col_values=(('external_ids', {'iface-id': '0f4f3525-34df-42ca-96c3-3c7e0c388556'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:53 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:53Z|00741|binding|INFO|Releasing lport 0f4f3525-34df-42ca-96c3-3c7e0c388556 from this chassis (sb_readonly=0)
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:53.066 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bd95237d-0845-479e-9505-318e01879565.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bd95237d-0845-479e-9505-318e01879565.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:53.067 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1b3e7ca6-c87c-4139-8bef-f61cdbedae35]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:53.068 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-bd95237d-0845-479e-9505-318e01879565
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/bd95237d-0845-479e-9505-318e01879565.pid.haproxy
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID bd95237d-0845-479e-9505-318e01879565
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.070 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:53 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:53.071 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bd95237d-0845-479e-9505-318e01879565', 'env', 'PROCESS_TAG=haproxy-bd95237d-0845-479e-9505-318e01879565', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bd95237d-0845-479e-9505-318e01879565.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.077 227766 DEBUG oslo_concurrency.processutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.126 227766 DEBUG nova.network.neutron [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Updating instance_info_cache with network_info: [{"id": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "address": "fa:16:3e:43:0a:48", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f614c4-b9", "ovs_interfaceid": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.147 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Releasing lock "refresh_cache-f10b70f9-c203-4706-8e68-a3c1cd3af7a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.148 227766 DEBUG nova.compute.manager [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Instance network_info: |[{"id": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "address": "fa:16:3e:43:0a:48", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f614c4-b9", "ovs_interfaceid": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.149 227766 DEBUG oslo_concurrency.lockutils [req-767e2c28-50e8-48a7-8f3f-769ecacd2462 req-db4eb2bc-d1b9-4edf-a24d-bce3651a6ef1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-f10b70f9-c203-4706-8e68-a3c1cd3af7a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.150 227766 DEBUG nova.network.neutron [req-767e2c28-50e8-48a7-8f3f-769ecacd2462 req-db4eb2bc-d1b9-4edf-a24d-bce3651a6ef1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Refreshing network info cache for port 85f614c4-b9b2-474a-a52e-5acbcb0a43c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.154 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Start _get_guest_xml network_info=[{"id": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "address": "fa:16:3e:43:0a:48", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f614c4-b9", "ovs_interfaceid": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.160 227766 WARNING nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.167 227766 DEBUG nova.virt.libvirt.host [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.168 227766 DEBUG nova.virt.libvirt.host [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.171 227766 DEBUG nova.virt.libvirt.host [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.172 227766 DEBUG nova.virt.libvirt.host [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.173 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.174 227766 DEBUG nova.virt.hardware [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.174 227766 DEBUG nova.virt.hardware [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.175 227766 DEBUG nova.virt.hardware [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.175 227766 DEBUG nova.virt.hardware [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.175 227766 DEBUG nova.virt.hardware [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.175 227766 DEBUG nova.virt.hardware [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.176 227766 DEBUG nova.virt.hardware [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.176 227766 DEBUG nova.virt.hardware [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.176 227766 DEBUG nova.virt.hardware [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.177 227766 DEBUG nova.virt.hardware [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.177 227766 DEBUG nova.virt.hardware [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.180 227766 DEBUG oslo_concurrency.processutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.203 227766 DEBUG nova.network.neutron [req-bd22cd33-b298-4664-9b1c-e9e618eb2194 req-72eacd10-9975-42ae-870a-c8fc1829adea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Updated VIF entry in instance network info cache for port a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.204 227766 DEBUG nova.network.neutron [req-bd22cd33-b298-4664-9b1c-e9e618eb2194 req-72eacd10-9975-42ae-870a-c8fc1829adea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Updating instance_info_cache with network_info: [{"id": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "address": "fa:16:3e:4f:dc:d1", "network": {"id": "bd95237d-0845-479e-9505-318e01879565", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-2097735183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7be5cb5abaf44b0a9c0c307d348d8f75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c66cbc-51", "ovs_interfaceid": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.226 227766 DEBUG oslo_concurrency.lockutils [req-bd22cd33-b298-4664-9b1c-e9e618eb2194 req-72eacd10-9975-42ae-870a-c8fc1829adea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.419 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:53.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:30:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3693693389' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.484 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:53 np0005593234 podman[308595]: 2026-01-23 10:30:53.489130153 +0000 UTC m=+0.050027934 container create f3a67086a1b5b734813bb9249a7036a672f220fbd3771e2de10cb2c936e9883c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.493 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:30:53 np0005593234 systemd[1]: Started libpod-conmon-f3a67086a1b5b734813bb9249a7036a672f220fbd3771e2de10cb2c936e9883c.scope.
Jan 23 05:30:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.528 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:30:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/387739996' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:30:53 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:30:53 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c48f63baae2323e23727824b9b0b8926232bfe1829633028b20dce53780300c2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.548 227766 DEBUG oslo_concurrency.processutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.549 227766 DEBUG oslo_concurrency.processutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:53 np0005593234 podman[308595]: 2026-01-23 10:30:53.460752496 +0000 UTC m=+0.021650307 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:30:53 np0005593234 podman[308595]: 2026-01-23 10:30:53.565846479 +0000 UTC m=+0.126744280 container init f3a67086a1b5b734813bb9249a7036a672f220fbd3771e2de10cb2c936e9883c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:30:53 np0005593234 podman[308595]: 2026-01-23 10:30:53.572379423 +0000 UTC m=+0.133277194 container start f3a67086a1b5b734813bb9249a7036a672f220fbd3771e2de10cb2c936e9883c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.574 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.574 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.928s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:53 np0005593234 neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565[308637]: [NOTICE]   (308646) : New worker (308648) forked
Jan 23 05:30:53 np0005593234 neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565[308637]: [NOTICE]   (308646) : Loading success.
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.597 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164253.597241, 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.598 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] VM Started (Lifecycle Event)#033[00m
Jan 23 05:30:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:30:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/568395303' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.630 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.636 227766 DEBUG oslo_concurrency.processutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.659 227766 DEBUG nova.storage.rbd_utils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image f10b70f9-c203-4706-8e68-a3c1cd3af7a9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.663 227766 DEBUG oslo_concurrency.processutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.687 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164253.5975482, 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.688 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.722 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.725 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.746 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:30:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:30:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/678164882' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.982 227766 DEBUG oslo_concurrency.processutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.984 227766 DEBUG nova.virt.libvirt.vif [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:29:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1480329487',display_name='tempest-ServerStableDeviceRescueTest-server-1480329487',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1480329487',id=175,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:30:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='815b71acf60d4ed8933ebd05228fa0c0',ramdisk_id='',reservation_id='r-gplzbwwd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1802220041',owner_user_name='tempest-ServerStableDeviceRescueTest-1802220041-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:30:40Z,user_data=None,user_id='e1629a4b14764dddaabcadd16f3e1c1c',uuid=64ccc062-b11b-4cbc-96ba-620e43dfdb20,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "vif_mac": "fa:16:3e:9b:3a:01"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.984 227766 DEBUG nova.network.os_vif_util [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converting VIF {"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "vif_mac": "fa:16:3e:9b:3a:01"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.985 227766 DEBUG nova.network.os_vif_util [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9b:3a:01,bridge_name='br-int',has_traffic_filtering=True,id=fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc7eda8e-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:30:53 np0005593234 nova_compute[227762]: 2026-01-23 10:30:53.986 227766 DEBUG nova.objects.instance [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.032 227766 DEBUG nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <uuid>64ccc062-b11b-4cbc-96ba-620e43dfdb20</uuid>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <name>instance-000000af</name>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-1480329487</nova:name>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:30:52</nova:creationTime>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:user uuid="e1629a4b14764dddaabcadd16f3e1c1c">tempest-ServerStableDeviceRescueTest-1802220041-project-member</nova:user>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:project uuid="815b71acf60d4ed8933ebd05228fa0c0">tempest-ServerStableDeviceRescueTest-1802220041</nova:project>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:port uuid="fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <entry name="serial">64ccc062-b11b-4cbc-96ba-620e43dfdb20</entry>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <entry name="uuid">64ccc062-b11b-4cbc-96ba-620e43dfdb20</entry>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.config">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.rescue">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <target dev="sdb" bus="scsi"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <boot order="1"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:9b:3a:01"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <target dev="tapfc7eda8e-2c"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/console.log" append="off"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:30:54 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:30:54 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.048 227766 INFO nova.virt.libvirt.driver [-] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Instance destroyed successfully.#033[00m
Jan 23 05:30:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:30:54 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2014567966' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.076 227766 DEBUG oslo_concurrency.processutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.077 227766 DEBUG nova.virt.libvirt.vif [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:30:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1500420906',display_name='tempest-TestNetworkBasicOps-server-1500420906',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1500420906',id=178,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDBwx9H6lyetcD0klU5EtF5tu3Bgw3j8Vtfzx918uyQtIQS8G8H59VLbqgzhuLaaEzLIvZnDjv0NJTs71foX7/dVyJnhuy3i18FefCvqxPKfNTiayVm3kq0+RpjimjAsHA==',key_name='tempest-TestNetworkBasicOps-503821714',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-1t28963m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:30:42Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=f10b70f9-c203-4706-8e68-a3c1cd3af7a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "address": "fa:16:3e:43:0a:48", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f614c4-b9", "ovs_interfaceid": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.077 227766 DEBUG nova.network.os_vif_util [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "address": "fa:16:3e:43:0a:48", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f614c4-b9", "ovs_interfaceid": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.078 227766 DEBUG nova.network.os_vif_util [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:0a:48,bridge_name='br-int',has_traffic_filtering=True,id=85f614c4-b9b2-474a-a52e-5acbcb0a43c5,network=Network(3128fa93-5584-4fd7-b8b2-100d4babba87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f614c4-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.079 227766 DEBUG nova.objects.instance [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid f10b70f9-c203-4706-8e68-a3c1cd3af7a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.102 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <uuid>f10b70f9-c203-4706-8e68-a3c1cd3af7a9</uuid>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <name>instance-000000b2</name>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestNetworkBasicOps-server-1500420906</nova:name>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:30:53</nova:creationTime>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <nova:port uuid="85f614c4-b9b2-474a-a52e-5acbcb0a43c5">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <entry name="serial">f10b70f9-c203-4706-8e68-a3c1cd3af7a9</entry>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <entry name="uuid">f10b70f9-c203-4706-8e68-a3c1cd3af7a9</entry>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/f10b70f9-c203-4706-8e68-a3c1cd3af7a9_disk">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/f10b70f9-c203-4706-8e68-a3c1cd3af7a9_disk.config">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:43:0a:48"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <target dev="tap85f614c4-b9"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/f10b70f9-c203-4706-8e68-a3c1cd3af7a9/console.log" append="off"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:30:54 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:30:54 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:30:54 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:30:54 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.103 227766 DEBUG nova.compute.manager [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Preparing to wait for external event network-vif-plugged-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.103 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.103 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.104 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.104 227766 DEBUG nova.virt.libvirt.vif [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:30:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1500420906',display_name='tempest-TestNetworkBasicOps-server-1500420906',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1500420906',id=178,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDBwx9H6lyetcD0klU5EtF5tu3Bgw3j8Vtfzx918uyQtIQS8G8H59VLbqgzhuLaaEzLIvZnDjv0NJTs71foX7/dVyJnhuy3i18FefCvqxPKfNTiayVm3kq0+RpjimjAsHA==',key_name='tempest-TestNetworkBasicOps-503821714',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-1t28963m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:30:42Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=f10b70f9-c203-4706-8e68-a3c1cd3af7a9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "address": "fa:16:3e:43:0a:48", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f614c4-b9", "ovs_interfaceid": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.104 227766 DEBUG nova.network.os_vif_util [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "address": "fa:16:3e:43:0a:48", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f614c4-b9", "ovs_interfaceid": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.105 227766 DEBUG nova.network.os_vif_util [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:0a:48,bridge_name='br-int',has_traffic_filtering=True,id=85f614c4-b9b2-474a-a52e-5acbcb0a43c5,network=Network(3128fa93-5584-4fd7-b8b2-100d4babba87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f614c4-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.105 227766 DEBUG os_vif [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:0a:48,bridge_name='br-int',has_traffic_filtering=True,id=85f614c4-b9b2-474a-a52e-5acbcb0a43c5,network=Network(3128fa93-5584-4fd7-b8b2-100d4babba87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f614c4-b9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.106 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.106 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.107 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.111 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.111 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap85f614c4-b9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.111 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap85f614c4-b9, col_values=(('external_ids', {'iface-id': '85f614c4-b9b2-474a-a52e-5acbcb0a43c5', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:43:0a:48', 'vm-uuid': 'f10b70f9-c203-4706-8e68-a3c1cd3af7a9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.112 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:54 np0005593234 NetworkManager[48942]: <info>  [1769164254.1136] manager: (tap85f614c4-b9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/357)
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.115 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.119 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.120 227766 INFO os_vif [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:0a:48,bridge_name='br-int',has_traffic_filtering=True,id=85f614c4-b9b2-474a-a52e-5acbcb0a43c5,network=Network(3128fa93-5584-4fd7-b8b2-100d4babba87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f614c4-b9')#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.133 227766 DEBUG nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.134 227766 DEBUG nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.134 227766 DEBUG nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name sdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.134 227766 DEBUG nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No VIF found with MAC fa:16:3e:9b:3a:01, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.134 227766 INFO nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Using config drive#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.158 227766 DEBUG nova.storage.rbd_utils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:54.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.545 227766 DEBUG nova.objects.instance [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.560 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.561 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.561 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No VIF found with MAC fa:16:3e:43:0a:48, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.562 227766 INFO nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Using config drive#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.588 227766 DEBUG nova.storage.rbd_utils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image f10b70f9-c203-4706-8e68-a3c1cd3af7a9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.592 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:54 np0005593234 nova_compute[227762]: 2026-01-23 10:30:54.616 227766 DEBUG nova.objects.instance [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'keypairs' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.092 227766 INFO nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Creating config drive at /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/disk.config.rescue#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.104 227766 DEBUG oslo_concurrency.processutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsb4o37i4 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.158 227766 DEBUG nova.compute.manager [req-b2e6caeb-975a-4cdc-a20b-d4bd4dffca9a req-ee818fbd-cfc3-4283-9c70-14d9369dbe03 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Received event network-vif-plugged-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.159 227766 DEBUG oslo_concurrency.lockutils [req-b2e6caeb-975a-4cdc-a20b-d4bd4dffca9a req-ee818fbd-cfc3-4283-9c70-14d9369dbe03 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.159 227766 DEBUG oslo_concurrency.lockutils [req-b2e6caeb-975a-4cdc-a20b-d4bd4dffca9a req-ee818fbd-cfc3-4283-9c70-14d9369dbe03 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.160 227766 DEBUG oslo_concurrency.lockutils [req-b2e6caeb-975a-4cdc-a20b-d4bd4dffca9a req-ee818fbd-cfc3-4283-9c70-14d9369dbe03 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.160 227766 DEBUG nova.compute.manager [req-b2e6caeb-975a-4cdc-a20b-d4bd4dffca9a req-ee818fbd-cfc3-4283-9c70-14d9369dbe03 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Processing event network-vif-plugged-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.161 227766 DEBUG nova.compute.manager [req-b2e6caeb-975a-4cdc-a20b-d4bd4dffca9a req-ee818fbd-cfc3-4283-9c70-14d9369dbe03 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Received event network-vif-plugged-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.162 227766 DEBUG oslo_concurrency.lockutils [req-b2e6caeb-975a-4cdc-a20b-d4bd4dffca9a req-ee818fbd-cfc3-4283-9c70-14d9369dbe03 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.162 227766 DEBUG oslo_concurrency.lockutils [req-b2e6caeb-975a-4cdc-a20b-d4bd4dffca9a req-ee818fbd-cfc3-4283-9c70-14d9369dbe03 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.163 227766 DEBUG oslo_concurrency.lockutils [req-b2e6caeb-975a-4cdc-a20b-d4bd4dffca9a req-ee818fbd-cfc3-4283-9c70-14d9369dbe03 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.163 227766 DEBUG nova.compute.manager [req-b2e6caeb-975a-4cdc-a20b-d4bd4dffca9a req-ee818fbd-cfc3-4283-9c70-14d9369dbe03 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] No waiting events found dispatching network-vif-plugged-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.164 227766 WARNING nova.compute.manager [req-b2e6caeb-975a-4cdc-a20b-d4bd4dffca9a req-ee818fbd-cfc3-4283-9c70-14d9369dbe03 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Received unexpected event network-vif-plugged-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 for instance with vm_state building and task_state spawning.#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.165 227766 DEBUG nova.compute.manager [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.170 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164255.169643, 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.170 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.175 227766 DEBUG nova.virt.libvirt.driver [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.181 227766 INFO nova.virt.libvirt.driver [-] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Instance spawned successfully.#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.182 227766 INFO nova.compute.manager [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Took 13.93 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.183 227766 DEBUG nova.compute.manager [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.199 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.207 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.236 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.244 227766 INFO nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Creating config drive at /var/lib/nova/instances/f10b70f9-c203-4706-8e68-a3c1cd3af7a9/disk.config#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.254 227766 DEBUG oslo_concurrency.processutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f10b70f9-c203-4706-8e68-a3c1cd3af7a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph0lco4cx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.293 227766 DEBUG oslo_concurrency.processutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsb4o37i4" returned: 0 in 0.189s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.334 227766 DEBUG nova.storage.rbd_utils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.339 227766 DEBUG oslo_concurrency.processutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/disk.config.rescue 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.375 227766 INFO nova.compute.manager [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Took 16.66 seconds to build instance.#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.400 227766 DEBUG oslo_concurrency.lockutils [None req-5156c1fa-7ec8-431d-8888-db5935df459f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.778s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.405 227766 DEBUG oslo_concurrency.processutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f10b70f9-c203-4706-8e68-a3c1cd3af7a9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph0lco4cx" returned: 0 in 0.151s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:55.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.431 227766 DEBUG nova.storage.rbd_utils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image f10b70f9-c203-4706-8e68-a3c1cd3af7a9_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.434 227766 DEBUG oslo_concurrency.processutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/f10b70f9-c203-4706-8e68-a3c1cd3af7a9/disk.config f10b70f9-c203-4706-8e68-a3c1cd3af7a9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.514 227766 DEBUG oslo_concurrency.processutils [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/disk.config.rescue 64ccc062-b11b-4cbc-96ba-620e43dfdb20_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.175s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.515 227766 INFO nova.virt.libvirt.driver [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Deleting local config drive /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20/disk.config.rescue because it was imported into RBD.#033[00m
Jan 23 05:30:55 np0005593234 kernel: tapfc7eda8e-2c: entered promiscuous mode
Jan 23 05:30:55 np0005593234 NetworkManager[48942]: <info>  [1769164255.5560] manager: (tapfc7eda8e-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/358)
Jan 23 05:30:55 np0005593234 systemd-udevd[308642]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:30:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:55Z|00742|binding|INFO|Claiming lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 for this chassis.
Jan 23 05:30:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:55Z|00743|binding|INFO|fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15: Claiming fa:16:3e:9b:3a:01 10.100.0.11
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.607 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:55 np0005593234 NetworkManager[48942]: <info>  [1769164255.6134] device (tapfc7eda8e-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:30:55 np0005593234 NetworkManager[48942]: <info>  [1769164255.6141] device (tapfc7eda8e-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.617 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:3a:01 10.100.0.11'], port_security=['fa:16:3e:9b:3a:01 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '64ccc062-b11b-4cbc-96ba-620e43dfdb20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '5', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.619 144381 INFO neutron.agent.ovn.metadata.agent [-] Port fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 bound to our chassis#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.621 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7d5530f-5227-4f75-bac0-2604bb3d68e2#033[00m
Jan 23 05:30:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:55Z|00744|binding|INFO|Setting lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 ovn-installed in OVS
Jan 23 05:30:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:55Z|00745|binding|INFO|Setting lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 up in Southbound
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.630 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:55 np0005593234 systemd-machined[195626]: New machine qemu-83-instance-000000af.
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.631 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0425aa62-b66b-43f5-a4ba-292e5fb3f115]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.636 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7d5530f-51 in ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.637 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7d5530f-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.637 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8d762073-7e63-4013-a85d-f9477eaaf5d0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.638 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d6ad29ef-953b-49e1-8200-4bd31d3c222a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 systemd[1]: Started Virtual Machine qemu-83-instance-000000af.
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.652 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[f082d5bf-ef7a-414a-84b2-b256fd12f400]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.656 227766 DEBUG oslo_concurrency.processutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/f10b70f9-c203-4706-8e68-a3c1cd3af7a9/disk.config f10b70f9-c203-4706-8e68-a3c1cd3af7a9_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.222s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.657 227766 INFO nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Deleting local config drive /var/lib/nova/instances/f10b70f9-c203-4706-8e68-a3c1cd3af7a9/disk.config because it was imported into RBD.#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.679 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[52d69a6c-4209-4654-88d2-396abebaa457]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.709 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[cba4db8c-93e9-449a-a20c-c5f68d02feab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 systemd-udevd[308853]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.715 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[476288a0-79cf-4ec6-aebd-f9e2d15104dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 kernel: tap85f614c4-b9: entered promiscuous mode
Jan 23 05:30:55 np0005593234 NetworkManager[48942]: <info>  [1769164255.7220] manager: (tap85f614c4-b9): new Tun device (/org/freedesktop/NetworkManager/Devices/359)
Jan 23 05:30:55 np0005593234 NetworkManager[48942]: <info>  [1769164255.7231] manager: (tapd7d5530f-50): new Veth device (/org/freedesktop/NetworkManager/Devices/360)
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.725 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:55Z|00746|binding|INFO|Claiming lport 85f614c4-b9b2-474a-a52e-5acbcb0a43c5 for this chassis.
Jan 23 05:30:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:55Z|00747|binding|INFO|85f614c4-b9b2-474a-a52e-5acbcb0a43c5: Claiming fa:16:3e:43:0a:48 10.100.0.3
Jan 23 05:30:55 np0005593234 NetworkManager[48942]: <info>  [1769164255.7270] device (tap85f614c4-b9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:30:55 np0005593234 NetworkManager[48942]: <info>  [1769164255.7275] device (tap85f614c4-b9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.739 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:0a:48 10.100.0.3'], port_security=['fa:16:3e:43:0a:48 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f10b70f9-c203-4706-8e68-a3c1cd3af7a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3128fa93-5584-4fd7-b8b2-100d4babba87', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2fde87e7-bf35-4066-8d9f-5bce5d8c471c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7d423e3-a129-4092-a097-e9db38a84e9f, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=85f614c4-b9b2-474a-a52e-5acbcb0a43c5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:30:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:55Z|00748|binding|INFO|Setting lport 85f614c4-b9b2-474a-a52e-5acbcb0a43c5 ovn-installed in OVS
Jan 23 05:30:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:55Z|00749|binding|INFO|Setting lport 85f614c4-b9b2-474a-a52e-5acbcb0a43c5 up in Southbound
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.744 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:55 np0005593234 systemd-machined[195626]: New machine qemu-84-instance-000000b2.
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.760 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7ff8a5c9-6f9d-4bc1-9b4c-28cae584057f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.762 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a6ace265-accb-48ae-b35b-062e86c679e5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 systemd[1]: Started Virtual Machine qemu-84-instance-000000b2.
Jan 23 05:30:55 np0005593234 NetworkManager[48942]: <info>  [1769164255.7838] device (tapd7d5530f-50): carrier: link connected
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.789 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[8197efa1-5725-4e84-ab33-9e8f060d4e0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.805 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dcf7a2ff-8583-4440-8a06-f72cf4b80c5b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 232], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815074, 'reachable_time': 23426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 308905, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.821 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9f3f52b1-6abb-4a8f-90b2-3ce2a5faa270]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:67cc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815074, 'tstamp': 815074}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 308907, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.836 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5c469f2c-2232-4988-afc2-0de65ff8c5ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 232], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815074, 'reachable_time': 23426, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 308909, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.862 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4fcaf0a3-32ca-47e7-a115-07507b6287fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.917 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[728a4dea-f6c1-4eec-b0f0-f03dab44de34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.919 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.920 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:30:55 np0005593234 kernel: tapd7d5530f-50: entered promiscuous mode
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.920 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d5530f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.922 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:55 np0005593234 NetworkManager[48942]: <info>  [1769164255.9247] manager: (tapd7d5530f-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/361)
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.926 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7d5530f-50, col_values=(('external_ids', {'iface-id': '4c99eeb5-c437-4d31-ac3b-bfd151140733'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.927 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.928 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:55 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:55Z|00750|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.928 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.929 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[47647258-a81d-46b9-832d-d571a96e77bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.930 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-d7d5530f-5227-4f75-bac0-2604bb3d68e2
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID d7d5530f-5227-4f75-bac0-2604bb3d68e2
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:30:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:55.931 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'env', 'PROCESS_TAG=haproxy-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7d5530f-5227-4f75-bac0-2604bb3d68e2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.942 227766 DEBUG nova.network.neutron [req-767e2c28-50e8-48a7-8f3f-769ecacd2462 req-db4eb2bc-d1b9-4edf-a24d-bce3651a6ef1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Updated VIF entry in instance network info cache for port 85f614c4-b9b2-474a-a52e-5acbcb0a43c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.943 227766 DEBUG nova.network.neutron [req-767e2c28-50e8-48a7-8f3f-769ecacd2462 req-db4eb2bc-d1b9-4edf-a24d-bce3651a6ef1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Updating instance_info_cache with network_info: [{"id": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "address": "fa:16:3e:43:0a:48", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f614c4-b9", "ovs_interfaceid": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.946 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:55 np0005593234 nova_compute[227762]: 2026-01-23 10:30:55.961 227766 DEBUG oslo_concurrency.lockutils [req-767e2c28-50e8-48a7-8f3f-769ecacd2462 req-db4eb2bc-d1b9-4edf-a24d-bce3651a6ef1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-f10b70f9-c203-4706-8e68-a3c1cd3af7a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.141 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164256.141236, f10b70f9-c203-4706-8e68-a3c1cd3af7a9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.142 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] VM Started (Lifecycle Event)#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.171 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.178 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164256.1414015, f10b70f9-c203-4706-8e68-a3c1cd3af7a9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.179 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.201 227766 DEBUG nova.compute.manager [req-e27d6b44-7eb7-4143-9500-d1c1b642df65 req-4bf7b5a7-9239-4651-8a19-7f1f1d7afe7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Received event network-vif-plugged-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.202 227766 DEBUG oslo_concurrency.lockutils [req-e27d6b44-7eb7-4143-9500-d1c1b642df65 req-4bf7b5a7-9239-4651-8a19-7f1f1d7afe7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.202 227766 DEBUG oslo_concurrency.lockutils [req-e27d6b44-7eb7-4143-9500-d1c1b642df65 req-4bf7b5a7-9239-4651-8a19-7f1f1d7afe7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.203 227766 DEBUG oslo_concurrency.lockutils [req-e27d6b44-7eb7-4143-9500-d1c1b642df65 req-4bf7b5a7-9239-4651-8a19-7f1f1d7afe7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.203 227766 DEBUG nova.compute.manager [req-e27d6b44-7eb7-4143-9500-d1c1b642df65 req-4bf7b5a7-9239-4651-8a19-7f1f1d7afe7f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Processing event network-vif-plugged-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.203 227766 DEBUG nova.compute.manager [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.207 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.210 227766 INFO nova.virt.libvirt.driver [-] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Instance spawned successfully.#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.210 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.213 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.216 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164256.2064867, f10b70f9-c203-4706-8e68-a3c1cd3af7a9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.217 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.232 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.233 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.233 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.234 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.234 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.235 227766 DEBUG nova.virt.libvirt.driver [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.292 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.296 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.319 227766 INFO nova.compute.manager [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Took 13.44 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.319 227766 DEBUG nova.compute.manager [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.320 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:30:56 np0005593234 podman[308981]: 2026-01-23 10:30:56.327299038 +0000 UTC m=+0.057603992 container create 44835b2418899f5246dc3bb2d3ab17a29842f5fcca4548aaac4167f17c8a9ff6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Jan 23 05:30:56 np0005593234 systemd[1]: Started libpod-conmon-44835b2418899f5246dc3bb2d3ab17a29842f5fcca4548aaac4167f17c8a9ff6.scope.
Jan 23 05:30:56 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:30:56 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e434875e75a56e311dc78339893b0819f53f2762e4282be9044978fb1e09396/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:30:56 np0005593234 podman[308981]: 2026-01-23 10:30:56.301034767 +0000 UTC m=+0.031339741 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.396 227766 INFO nova.compute.manager [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Took 17.51 seconds to build instance.#033[00m
Jan 23 05:30:56 np0005593234 podman[308981]: 2026-01-23 10:30:56.403823499 +0000 UTC m=+0.134128473 container init 44835b2418899f5246dc3bb2d3ab17a29842f5fcca4548aaac4167f17c8a9ff6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 05:30:56 np0005593234 podman[308981]: 2026-01-23 10:30:56.408864396 +0000 UTC m=+0.139169350 container start 44835b2418899f5246dc3bb2d3ab17a29842f5fcca4548aaac4167f17c8a9ff6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.422 227766 DEBUG oslo_concurrency.lockutils [None req-5888afcb-44ad-492f-ae19-ae832e02cfa6 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.636s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:56 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[309030]: [NOTICE]   (309052) : New worker (309056) forked
Jan 23 05:30:56 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[309030]: [NOTICE]   (309052) : Loading success.
Jan 23 05:30:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:56.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.481 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 85f614c4-b9b2-474a-a52e-5acbcb0a43c5 in datapath 3128fa93-5584-4fd7-b8b2-100d4babba87 unbound from our chassis#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.483 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3128fa93-5584-4fd7-b8b2-100d4babba87#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.493 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aa361d83-99fb-4cfa-aec4-740ed03b77f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.494 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3128fa93-51 in ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.496 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3128fa93-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.496 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1f08eaaa-03ac-4a96-b478-36388286f2b6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.497 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[93654178-16be-4159-999f-1239a8f440bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.509 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[40cafb88-0d00-4d8a-be7e-7ff5abbd1096]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.531 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9bfcae56-ca35-41e9-9ad4-95f4defe04ba]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.573 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[1ded25a5-09b0-4ee0-94d9-4215f649d91e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 NetworkManager[48942]: <info>  [1769164256.5908] manager: (tap3128fa93-50): new Veth device (/org/freedesktop/NetworkManager/Devices/362)
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.592 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[65053816-cef1-4ec7-8da4-b536c14e66f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 systemd-udevd[309072]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.630 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9921f87a-095b-47e1-84ea-8375a0c726f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.634 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[19966fdb-3402-45e1-8dfd-5be4d3062932]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 NetworkManager[48942]: <info>  [1769164256.6675] device (tap3128fa93-50): carrier: link connected
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.673 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[aaed7193-7efd-4050-9e88-a60ffe9dde2d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.688 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1c6e0951-4e3c-4f45-8d1b-1a9de9bff8a4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3128fa93-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9e:9d:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 233], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815162, 'reachable_time': 30520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309091, 'error': None, 'target': 'ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.711 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6d920ec7-524d-4f8c-9967-d0406c2a2004]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe9e:9de0'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815162, 'tstamp': 815162}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 309092, 'error': None, 'target': 'ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.729 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[afdbac11-62af-42d5-83c2-8a52a0698622]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3128fa93-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:9e:9d:e0'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 233], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815162, 'reachable_time': 30520, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 309093, 'error': None, 'target': 'ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.747 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.762 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[66db1bdc-7667-4503-9051-66bd10cb091f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.780 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.781 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.781 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.781 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.822 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[18c63039-d6d0-4eb7-8d58-03e0b0755c7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.823 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3128fa93-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.824 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.824 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3128fa93-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.825 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:56 np0005593234 kernel: tap3128fa93-50: entered promiscuous mode
Jan 23 05:30:56 np0005593234 NetworkManager[48942]: <info>  [1769164256.8266] manager: (tap3128fa93-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/363)
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.828 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3128fa93-50, col_values=(('external_ids', {'iface-id': '40f3a1ed-d213-498b-a2eb-96feaa1eae36'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.829 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.832 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:56 np0005593234 ovn_controller[134547]: 2026-01-23T10:30:56Z|00751|binding|INFO|Releasing lport 40f3a1ed-d213-498b-a2eb-96feaa1eae36 from this chassis (sb_readonly=0)
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.833 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3128fa93-5584-4fd7-b8b2-100d4babba87.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3128fa93-5584-4fd7-b8b2-100d4babba87.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.834 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[492918c0-6255-4be7-81a0-7d2ef077d6cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.834 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-3128fa93-5584-4fd7-b8b2-100d4babba87
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/3128fa93-5584-4fd7-b8b2-100d4babba87.pid.haproxy
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 3128fa93-5584-4fd7-b8b2-100d4babba87
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:30:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:30:56.835 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87', 'env', 'PROCESS_TAG=haproxy-3128fa93-5584-4fd7-b8b2-100d4babba87', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3128fa93-5584-4fd7-b8b2-100d4babba87.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:30:56 np0005593234 nova_compute[227762]: 2026-01-23 10:30:56.848 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:57 np0005593234 podman[309125]: 2026-01-23 10:30:57.21745595 +0000 UTC m=+0.048731844 container create 80786340444bb6832f286309c63318023bb5dad6755862684db43ac431a0cb44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:30:57 np0005593234 systemd[1]: Started libpod-conmon-80786340444bb6832f286309c63318023bb5dad6755862684db43ac431a0cb44.scope.
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.274 227766 DEBUG nova.compute.manager [req-8a4d334a-15c7-49ac-b2af-74f371d3b589 req-373cdbb5-1ab4-4ca6-a693-cb183dbd89e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.275 227766 DEBUG oslo_concurrency.lockutils [req-8a4d334a-15c7-49ac-b2af-74f371d3b589 req-373cdbb5-1ab4-4ca6-a693-cb183dbd89e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.276 227766 DEBUG oslo_concurrency.lockutils [req-8a4d334a-15c7-49ac-b2af-74f371d3b589 req-373cdbb5-1ab4-4ca6-a693-cb183dbd89e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.276 227766 DEBUG oslo_concurrency.lockutils [req-8a4d334a-15c7-49ac-b2af-74f371d3b589 req-373cdbb5-1ab4-4ca6-a693-cb183dbd89e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.276 227766 DEBUG nova.compute.manager [req-8a4d334a-15c7-49ac-b2af-74f371d3b589 req-373cdbb5-1ab4-4ca6-a693-cb183dbd89e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] No waiting events found dispatching network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:30:57 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.278 227766 WARNING nova.compute.manager [req-8a4d334a-15c7-49ac-b2af-74f371d3b589 req-373cdbb5-1ab4-4ca6-a693-cb183dbd89e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received unexpected event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:30:57 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e204cf57fe8ad17aebafeff5d09caa0d0a9f8a5ce877fa8afe3c02027cd65654/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.285 227766 DEBUG nova.compute.manager [req-8a4d334a-15c7-49ac-b2af-74f371d3b589 req-373cdbb5-1ab4-4ca6-a693-cb183dbd89e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.285 227766 DEBUG oslo_concurrency.lockutils [req-8a4d334a-15c7-49ac-b2af-74f371d3b589 req-373cdbb5-1ab4-4ca6-a693-cb183dbd89e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.285 227766 DEBUG oslo_concurrency.lockutils [req-8a4d334a-15c7-49ac-b2af-74f371d3b589 req-373cdbb5-1ab4-4ca6-a693-cb183dbd89e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.285 227766 DEBUG oslo_concurrency.lockutils [req-8a4d334a-15c7-49ac-b2af-74f371d3b589 req-373cdbb5-1ab4-4ca6-a693-cb183dbd89e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.286 227766 DEBUG nova.compute.manager [req-8a4d334a-15c7-49ac-b2af-74f371d3b589 req-373cdbb5-1ab4-4ca6-a693-cb183dbd89e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] No waiting events found dispatching network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.286 227766 WARNING nova.compute.manager [req-8a4d334a-15c7-49ac-b2af-74f371d3b589 req-373cdbb5-1ab4-4ca6-a693-cb183dbd89e5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received unexpected event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:30:57 np0005593234 podman[309125]: 2026-01-23 10:30:57.192896123 +0000 UTC m=+0.024172047 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:30:57 np0005593234 podman[309125]: 2026-01-23 10:30:57.307009528 +0000 UTC m=+0.138285452 container init 80786340444bb6832f286309c63318023bb5dad6755862684db43ac431a0cb44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Jan 23 05:30:57 np0005593234 podman[309125]: 2026-01-23 10:30:57.314305245 +0000 UTC m=+0.145581139 container start 80786340444bb6832f286309c63318023bb5dad6755862684db43ac431a0cb44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:30:57 np0005593234 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[309138]: [NOTICE]   (309147) : New worker (309149) forked
Jan 23 05:30:57 np0005593234 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[309138]: [NOTICE]   (309147) : Loading success.
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.427 227766 DEBUG nova.compute.manager [None req-530b31f7-c56b-47a1-a552-2f92281975a1 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.428 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for 64ccc062-b11b-4cbc-96ba-620e43dfdb20 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.429 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164257.428139, 64ccc062-b11b-4cbc-96ba-620e43dfdb20 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.429 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:30:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:57.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.503 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.507 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.544 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] During sync_power_state the instance has a pending task (rescuing). Skip.#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.545 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164257.4281962, 64ccc062-b11b-4cbc-96ba-620e43dfdb20 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.545 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] VM Started (Lifecycle Event)#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.566 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:30:57 np0005593234 nova_compute[227762]: 2026-01-23 10:30:57.570 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:30:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.359 227766 DEBUG nova.compute.manager [req-9e93c07e-7bc4-4541-98a4-3bc0549a8e23 req-1eb9f762-e018-42bc-aac4-c29d20201b89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Received event network-vif-plugged-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.361 227766 DEBUG oslo_concurrency.lockutils [req-9e93c07e-7bc4-4541-98a4-3bc0549a8e23 req-1eb9f762-e018-42bc-aac4-c29d20201b89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.362 227766 DEBUG oslo_concurrency.lockutils [req-9e93c07e-7bc4-4541-98a4-3bc0549a8e23 req-1eb9f762-e018-42bc-aac4-c29d20201b89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.362 227766 DEBUG oslo_concurrency.lockutils [req-9e93c07e-7bc4-4541-98a4-3bc0549a8e23 req-1eb9f762-e018-42bc-aac4-c29d20201b89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.362 227766 DEBUG nova.compute.manager [req-9e93c07e-7bc4-4541-98a4-3bc0549a8e23 req-1eb9f762-e018-42bc-aac4-c29d20201b89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] No waiting events found dispatching network-vif-plugged-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.363 227766 WARNING nova.compute.manager [req-9e93c07e-7bc4-4541-98a4-3bc0549a8e23 req-1eb9f762-e018-42bc-aac4-c29d20201b89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Received unexpected event network-vif-plugged-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.420 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:58.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.479 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Updating instance_info_cache with network_info: [{"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.538 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.539 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.715 227766 INFO nova.compute.manager [None req-ba3b4e40-b0af-42d7-b758-1ac883af504a e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Unrescuing#033[00m
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.717 227766 DEBUG oslo_concurrency.lockutils [None req-ba3b4e40-b0af-42d7-b758-1ac883af504a e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.718 227766 DEBUG oslo_concurrency.lockutils [None req-ba3b4e40-b0af-42d7-b758-1ac883af504a e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquired lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.720 227766 DEBUG nova.network.neutron [None req-ba3b4e40-b0af-42d7-b758-1ac883af504a e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:30:58 np0005593234 nova_compute[227762]: 2026-01-23 10:30:58.750 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:30:59 np0005593234 nova_compute[227762]: 2026-01-23 10:30:59.113 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:30:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:30:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:30:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:59.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:00.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:00 np0005593234 nova_compute[227762]: 2026-01-23 10:31:00.761 227766 DEBUG nova.compute.manager [req-b162cd24-a654-4215-ae16-19d188815bfb req-52c0a213-2f01-4f01-9dba-d6d3cdacd725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Received event network-changed-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:00 np0005593234 nova_compute[227762]: 2026-01-23 10:31:00.762 227766 DEBUG nova.compute.manager [req-b162cd24-a654-4215-ae16-19d188815bfb req-52c0a213-2f01-4f01-9dba-d6d3cdacd725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Refreshing instance network info cache due to event network-changed-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:31:00 np0005593234 nova_compute[227762]: 2026-01-23 10:31:00.762 227766 DEBUG oslo_concurrency.lockutils [req-b162cd24-a654-4215-ae16-19d188815bfb req-52c0a213-2f01-4f01-9dba-d6d3cdacd725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:31:00 np0005593234 nova_compute[227762]: 2026-01-23 10:31:00.762 227766 DEBUG oslo_concurrency.lockutils [req-b162cd24-a654-4215-ae16-19d188815bfb req-52c0a213-2f01-4f01-9dba-d6d3cdacd725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:31:00 np0005593234 nova_compute[227762]: 2026-01-23 10:31:00.762 227766 DEBUG nova.network.neutron [req-b162cd24-a654-4215-ae16-19d188815bfb req-52c0a213-2f01-4f01-9dba-d6d3cdacd725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Refreshing network info cache for port a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:31:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:01.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:01 np0005593234 nova_compute[227762]: 2026-01-23 10:31:01.563 227766 DEBUG nova.network.neutron [None req-ba3b4e40-b0af-42d7-b758-1ac883af504a e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Updating instance_info_cache with network_info: [{"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:31:01 np0005593234 nova_compute[227762]: 2026-01-23 10:31:01.589 227766 DEBUG oslo_concurrency.lockutils [None req-ba3b4e40-b0af-42d7-b758-1ac883af504a e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Releasing lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:31:01 np0005593234 nova_compute[227762]: 2026-01-23 10:31:01.594 227766 DEBUG nova.objects.instance [None req-ba3b4e40-b0af-42d7-b758-1ac883af504a e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'flavor' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:31:01 np0005593234 kernel: tapfc7eda8e-2c (unregistering): left promiscuous mode
Jan 23 05:31:01 np0005593234 NetworkManager[48942]: <info>  [1769164261.6954] device (tapfc7eda8e-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:31:01 np0005593234 nova_compute[227762]: 2026-01-23 10:31:01.704 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:01 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:01Z|00752|binding|INFO|Releasing lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 from this chassis (sb_readonly=0)
Jan 23 05:31:01 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:01Z|00753|binding|INFO|Setting lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 down in Southbound
Jan 23 05:31:01 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:01Z|00754|binding|INFO|Removing iface tapfc7eda8e-2c ovn-installed in OVS
Jan 23 05:31:01 np0005593234 nova_compute[227762]: 2026-01-23 10:31:01.707 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:01.721 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:3a:01 10.100.0.11'], port_security=['fa:16:3e:9b:3a:01 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '64ccc062-b11b-4cbc-96ba-620e43dfdb20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '6', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:31:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:01.723 144381 INFO neutron.agent.ovn.metadata.agent [-] Port fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 unbound from our chassis#033[00m
Jan 23 05:31:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:01.724 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7d5530f-5227-4f75-bac0-2604bb3d68e2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:31:01 np0005593234 nova_compute[227762]: 2026-01-23 10:31:01.727 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:01.726 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7ef78e44-c024-4315-abc9-4e5e766889a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:01.739 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 namespace which is not needed anymore#033[00m
Jan 23 05:31:01 np0005593234 nova_compute[227762]: 2026-01-23 10:31:01.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:01 np0005593234 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000af.scope: Deactivated successfully.
Jan 23 05:31:01 np0005593234 systemd[1]: machine-qemu\x2d83\x2dinstance\x2d000000af.scope: Consumed 5.199s CPU time.
Jan 23 05:31:01 np0005593234 systemd-machined[195626]: Machine qemu-83-instance-000000af terminated.
Jan 23 05:31:01 np0005593234 nova_compute[227762]: 2026-01-23 10:31:01.863 227766 INFO nova.virt.libvirt.driver [-] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Instance destroyed successfully.#033[00m
Jan 23 05:31:01 np0005593234 nova_compute[227762]: 2026-01-23 10:31:01.864 227766 DEBUG nova.objects.instance [None req-ba3b4e40-b0af-42d7-b758-1ac883af504a e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:31:01 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[309030]: [NOTICE]   (309052) : haproxy version is 2.8.14-c23fe91
Jan 23 05:31:01 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[309030]: [NOTICE]   (309052) : path to executable is /usr/sbin/haproxy
Jan 23 05:31:01 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[309030]: [WARNING]  (309052) : Exiting Master process...
Jan 23 05:31:01 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[309030]: [ALERT]    (309052) : Current worker (309056) exited with code 143 (Terminated)
Jan 23 05:31:01 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[309030]: [WARNING]  (309052) : All workers exited. Exiting... (0)
Jan 23 05:31:01 np0005593234 systemd[1]: libpod-44835b2418899f5246dc3bb2d3ab17a29842f5fcca4548aaac4167f17c8a9ff6.scope: Deactivated successfully.
Jan 23 05:31:01 np0005593234 systemd-udevd[309213]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:31:01 np0005593234 kernel: tapfc7eda8e-2c: entered promiscuous mode
Jan 23 05:31:01 np0005593234 NetworkManager[48942]: <info>  [1769164261.9621] manager: (tapfc7eda8e-2c): new Tun device (/org/freedesktop/NetworkManager/Devices/364)
Jan 23 05:31:01 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:01Z|00755|binding|INFO|Claiming lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 for this chassis.
Jan 23 05:31:01 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:01Z|00756|binding|INFO|fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15: Claiming fa:16:3e:9b:3a:01 10.100.0.11
Jan 23 05:31:01 np0005593234 podman[309244]: 2026-01-23 10:31:01.964199566 +0000 UTC m=+0.048443364 container died 44835b2418899f5246dc3bb2d3ab17a29842f5fcca4548aaac4167f17c8a9ff6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:31:01 np0005593234 nova_compute[227762]: 2026-01-23 10:31:01.964 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:01.970 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:3a:01 10.100.0.11'], port_security=['fa:16:3e:9b:3a:01 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '64ccc062-b11b-4cbc-96ba-620e43dfdb20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '6', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:31:01 np0005593234 NetworkManager[48942]: <info>  [1769164261.9723] device (tapfc7eda8e-2c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:31:01 np0005593234 NetworkManager[48942]: <info>  [1769164261.9731] device (tapfc7eda8e-2c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:31:01 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:01Z|00757|binding|INFO|Setting lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 ovn-installed in OVS
Jan 23 05:31:01 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:01Z|00758|binding|INFO|Setting lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 up in Southbound
Jan 23 05:31:01 np0005593234 nova_compute[227762]: 2026-01-23 10:31:01.982 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:01 np0005593234 nova_compute[227762]: 2026-01-23 10:31:01.984 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:01 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-44835b2418899f5246dc3bb2d3ab17a29842f5fcca4548aaac4167f17c8a9ff6-userdata-shm.mount: Deactivated successfully.
Jan 23 05:31:02 np0005593234 systemd[1]: var-lib-containers-storage-overlay-6e434875e75a56e311dc78339893b0819f53f2762e4282be9044978fb1e09396-merged.mount: Deactivated successfully.
Jan 23 05:31:02 np0005593234 systemd-machined[195626]: New machine qemu-85-instance-000000af.
Jan 23 05:31:02 np0005593234 podman[309244]: 2026-01-23 10:31:02.015786839 +0000 UTC m=+0.100030637 container cleanup 44835b2418899f5246dc3bb2d3ab17a29842f5fcca4548aaac4167f17c8a9ff6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:31:02 np0005593234 systemd[1]: Started Virtual Machine qemu-85-instance-000000af.
Jan 23 05:31:02 np0005593234 systemd[1]: libpod-conmon-44835b2418899f5246dc3bb2d3ab17a29842f5fcca4548aaac4167f17c8a9ff6.scope: Deactivated successfully.
Jan 23 05:31:02 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:31:02 np0005593234 podman[309286]: 2026-01-23 10:31:02.083787963 +0000 UTC m=+0.045494532 container remove 44835b2418899f5246dc3bb2d3ab17a29842f5fcca4548aaac4167f17c8a9ff6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.093 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[536f5906-b8ed-46e2-8071-8faba267cad8]: (4, ('Fri Jan 23 10:31:01 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 (44835b2418899f5246dc3bb2d3ab17a29842f5fcca4548aaac4167f17c8a9ff6)\n44835b2418899f5246dc3bb2d3ab17a29842f5fcca4548aaac4167f17c8a9ff6\nFri Jan 23 10:31:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 (44835b2418899f5246dc3bb2d3ab17a29842f5fcca4548aaac4167f17c8a9ff6)\n44835b2418899f5246dc3bb2d3ab17a29842f5fcca4548aaac4167f17c8a9ff6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.095 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a8dcfcd2-286b-4e0b-a2f1-300d7d95d6b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.096 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.098 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:02 np0005593234 kernel: tapd7d5530f-50: left promiscuous mode
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.112 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.115 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[81262058-c7cd-453f-98a4-c1c0489256b1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.130 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d5ff1208-4ea7-459a-84f4-54b2a3ea29d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.131 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[992a3a8a-cb43-43cf-89c0-0f0cefcbf345]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.147 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[20428125-e285-49ea-83cb-f0c398e3a4c3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815066, 'reachable_time': 22568, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309306, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 systemd[1]: run-netns-ovnmeta\x2dd7d5530f\x2d5227\x2d4f75\x2dbac0\x2d2604bb3d68e2.mount: Deactivated successfully.
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.153 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.154 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[ad31482a-7c10-4eaf-8ba5-1ddf847f72ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.154 144381 INFO neutron.agent.ovn.metadata.agent [-] Port fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 unbound from our chassis#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.156 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7d5530f-5227-4f75-bac0-2604bb3d68e2#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.166 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[628ff007-e2da-43e3-a07f-783fe6578216]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.167 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd7d5530f-51 in ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.168 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd7d5530f-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.169 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[13f43f06-8a54-4498-a79f-e734cc237050]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.169 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[141cd5fb-8a11-4001-98b7-92edcf61ead2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.183 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[0fb5a2dd-b051-49f3-b98b-ff387092d5dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.208 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e3548414-c06c-414b-aa9f-32e93fabe7a0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.236 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[5123111e-a2fa-4327-b70f-6809d6f8f618]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 NetworkManager[48942]: <info>  [1769164262.2433] manager: (tapd7d5530f-50): new Veth device (/org/freedesktop/NetworkManager/Devices/365)
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.243 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[538a7931-19c7-4f39-97e8-a20f0b8df833]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.369 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f900e7a9-617f-44c3-8cc0-cb3edb8fa215]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.372 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[38706df0-eaf6-45ff-97a1-9d68c6d88711]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.387 227766 DEBUG nova.compute.manager [req-0cfb4484-a68c-4f99-8e92-062b8f659d19 req-0335cc20-2550-411e-b5d0-1fc0e0d3a291 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-vif-unplugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.388 227766 DEBUG oslo_concurrency.lockutils [req-0cfb4484-a68c-4f99-8e92-062b8f659d19 req-0335cc20-2550-411e-b5d0-1fc0e0d3a291 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.388 227766 DEBUG oslo_concurrency.lockutils [req-0cfb4484-a68c-4f99-8e92-062b8f659d19 req-0335cc20-2550-411e-b5d0-1fc0e0d3a291 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.388 227766 DEBUG oslo_concurrency.lockutils [req-0cfb4484-a68c-4f99-8e92-062b8f659d19 req-0335cc20-2550-411e-b5d0-1fc0e0d3a291 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.388 227766 DEBUG nova.compute.manager [req-0cfb4484-a68c-4f99-8e92-062b8f659d19 req-0335cc20-2550-411e-b5d0-1fc0e0d3a291 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] No waiting events found dispatching network-vif-unplugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.389 227766 WARNING nova.compute.manager [req-0cfb4484-a68c-4f99-8e92-062b8f659d19 req-0335cc20-2550-411e-b5d0-1fc0e0d3a291 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received unexpected event network-vif-unplugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 23 05:31:02 np0005593234 NetworkManager[48942]: <info>  [1769164262.3920] device (tapd7d5530f-50): carrier: link connected
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.398 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[09a7efb6-2af5-4389-bf5a-7976e9f4b6bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.415 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7f985120-048e-424b-b287-009b4350652e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815735, 'reachable_time': 37082, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 309332, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.431 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[95730f92-c785-47e2-ae7b-bde4cd5776da]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe00:67cc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815735, 'tstamp': 815735}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 309333, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.447 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[32028067-5a48-47a2-984a-9e029fe1b805]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815735, 'reachable_time': 37082, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 309334, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:02.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.477 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b6e7bf8a-cac6-40e0-ad53-096e90d464e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.540 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[46384890-4937-4d70-acee-7866da93059f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.542 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.542 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.542 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d5530f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:02 np0005593234 NetworkManager[48942]: <info>  [1769164262.5450] manager: (tapd7d5530f-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/366)
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.544 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:02 np0005593234 kernel: tapd7d5530f-50: entered promiscuous mode
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.546 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7d5530f-50, col_values=(('external_ids', {'iface-id': '4c99eeb5-c437-4d31-ac3b-bfd151140733'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.549 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.550 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:31:02 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:02Z|00759|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.553 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[744288e1-0488-4ff8-a97f-b08d06e464be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.555 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-d7d5530f-5227-4f75-bac0-2604bb3d68e2
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/d7d5530f-5227-4f75-bac0-2604bb3d68e2.pid.haproxy
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID d7d5530f-5227-4f75-bac0-2604bb3d68e2
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:31:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:02.556 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'env', 'PROCESS_TAG=haproxy-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d7d5530f-5227-4f75-bac0-2604bb3d68e2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.570 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.745 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for 64ccc062-b11b-4cbc-96ba-620e43dfdb20 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.745 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164262.744655, 64ccc062-b11b-4cbc-96ba-620e43dfdb20 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.745 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.779 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.785 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.810 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.811 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164262.744791, 64ccc062-b11b-4cbc-96ba-620e43dfdb20 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.811 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] VM Started (Lifecycle Event)#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.844 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.849 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:31:02 np0005593234 nova_compute[227762]: 2026-01-23 10:31:02.897 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 23 05:31:02 np0005593234 podman[309424]: 2026-01-23 10:31:02.95617917 +0000 UTC m=+0.052922894 container create b7110cedb5c47d1bb54b16f82e6c6efc95624c5ecabab3d63b81781910e7eb79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:31:03 np0005593234 systemd[1]: Started libpod-conmon-b7110cedb5c47d1bb54b16f82e6c6efc95624c5ecabab3d63b81781910e7eb79.scope.
Jan 23 05:31:03 np0005593234 podman[309424]: 2026-01-23 10:31:02.930217659 +0000 UTC m=+0.026961403 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:31:03 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:31:03 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f510ff51a8f0c21529f751bc8982caa4c2675a0cfd8d6fd7477afc40144634d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:31:03 np0005593234 podman[309424]: 2026-01-23 10:31:03.057599589 +0000 UTC m=+0.154343333 container init b7110cedb5c47d1bb54b16f82e6c6efc95624c5ecabab3d63b81781910e7eb79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Jan 23 05:31:03 np0005593234 podman[309424]: 2026-01-23 10:31:03.06691034 +0000 UTC m=+0.163654064 container start b7110cedb5c47d1bb54b16f82e6c6efc95624c5ecabab3d63b81781910e7eb79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:31:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:03 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[309439]: [NOTICE]   (309444) : New worker (309446) forked
Jan 23 05:31:03 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[309439]: [NOTICE]   (309444) : Loading success.
Jan 23 05:31:03 np0005593234 nova_compute[227762]: 2026-01-23 10:31:03.236 227766 DEBUG nova.compute.manager [req-5bb518e3-e3cb-45ae-86b4-c98d4ec174ed req-9b8aa877-3b37-432e-9bea-542247d6ed42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Received event network-changed-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:03 np0005593234 nova_compute[227762]: 2026-01-23 10:31:03.237 227766 DEBUG nova.compute.manager [req-5bb518e3-e3cb-45ae-86b4-c98d4ec174ed req-9b8aa877-3b37-432e-9bea-542247d6ed42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Refreshing instance network info cache due to event network-changed-85f614c4-b9b2-474a-a52e-5acbcb0a43c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:31:03 np0005593234 nova_compute[227762]: 2026-01-23 10:31:03.237 227766 DEBUG oslo_concurrency.lockutils [req-5bb518e3-e3cb-45ae-86b4-c98d4ec174ed req-9b8aa877-3b37-432e-9bea-542247d6ed42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-f10b70f9-c203-4706-8e68-a3c1cd3af7a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:31:03 np0005593234 nova_compute[227762]: 2026-01-23 10:31:03.238 227766 DEBUG oslo_concurrency.lockutils [req-5bb518e3-e3cb-45ae-86b4-c98d4ec174ed req-9b8aa877-3b37-432e-9bea-542247d6ed42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-f10b70f9-c203-4706-8e68-a3c1cd3af7a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:31:03 np0005593234 nova_compute[227762]: 2026-01-23 10:31:03.238 227766 DEBUG nova.network.neutron [req-5bb518e3-e3cb-45ae-86b4-c98d4ec174ed req-9b8aa877-3b37-432e-9bea-542247d6ed42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Refreshing network info cache for port 85f614c4-b9b2-474a-a52e-5acbcb0a43c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:31:03 np0005593234 nova_compute[227762]: 2026-01-23 10:31:03.304 227766 DEBUG nova.compute.manager [None req-ba3b4e40-b0af-42d7-b758-1ac883af504a e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:31:03 np0005593234 nova_compute[227762]: 2026-01-23 10:31:03.421 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:03.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:03 np0005593234 nova_compute[227762]: 2026-01-23 10:31:03.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:03 np0005593234 nova_compute[227762]: 2026-01-23 10:31:03.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:03 np0005593234 nova_compute[227762]: 2026-01-23 10:31:03.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:31:03 np0005593234 nova_compute[227762]: 2026-01-23 10:31:03.964 227766 DEBUG nova.network.neutron [req-b162cd24-a654-4215-ae16-19d188815bfb req-52c0a213-2f01-4f01-9dba-d6d3cdacd725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Updated VIF entry in instance network info cache for port a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:31:03 np0005593234 nova_compute[227762]: 2026-01-23 10:31:03.965 227766 DEBUG nova.network.neutron [req-b162cd24-a654-4215-ae16-19d188815bfb req-52c0a213-2f01-4f01-9dba-d6d3cdacd725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Updating instance_info_cache with network_info: [{"id": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "address": "fa:16:3e:4f:dc:d1", "network": {"id": "bd95237d-0845-479e-9505-318e01879565", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-2097735183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7be5cb5abaf44b0a9c0c307d348d8f75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c66cbc-51", "ovs_interfaceid": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.116 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.201 227766 DEBUG oslo_concurrency.lockutils [req-b162cd24-a654-4215-ae16-19d188815bfb req-52c0a213-2f01-4f01-9dba-d6d3cdacd725 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:31:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:04.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.595 227766 DEBUG nova.compute.manager [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.596 227766 DEBUG oslo_concurrency.lockutils [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.596 227766 DEBUG oslo_concurrency.lockutils [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.597 227766 DEBUG oslo_concurrency.lockutils [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.597 227766 DEBUG nova.compute.manager [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] No waiting events found dispatching network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.598 227766 WARNING nova.compute.manager [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received unexpected event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.598 227766 DEBUG nova.compute.manager [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.599 227766 DEBUG oslo_concurrency.lockutils [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.599 227766 DEBUG oslo_concurrency.lockutils [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.599 227766 DEBUG oslo_concurrency.lockutils [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.600 227766 DEBUG nova.compute.manager [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] No waiting events found dispatching network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.600 227766 WARNING nova.compute.manager [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received unexpected event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.601 227766 DEBUG nova.compute.manager [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.601 227766 DEBUG oslo_concurrency.lockutils [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.601 227766 DEBUG oslo_concurrency.lockutils [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.602 227766 DEBUG oslo_concurrency.lockutils [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.602 227766 DEBUG nova.compute.manager [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] No waiting events found dispatching network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:31:04 np0005593234 nova_compute[227762]: 2026-01-23 10:31:04.602 227766 WARNING nova.compute.manager [req-6ddb280a-cee5-4f5a-8724-72e2eebc0316 req-ae821ed4-ba05-4e45-aac1-592d14458998 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received unexpected event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:31:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:05.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:06 np0005593234 nova_compute[227762]: 2026-01-23 10:31:06.355 227766 DEBUG nova.network.neutron [req-5bb518e3-e3cb-45ae-86b4-c98d4ec174ed req-9b8aa877-3b37-432e-9bea-542247d6ed42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Updated VIF entry in instance network info cache for port 85f614c4-b9b2-474a-a52e-5acbcb0a43c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:31:06 np0005593234 nova_compute[227762]: 2026-01-23 10:31:06.356 227766 DEBUG nova.network.neutron [req-5bb518e3-e3cb-45ae-86b4-c98d4ec174ed req-9b8aa877-3b37-432e-9bea-542247d6ed42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Updating instance_info_cache with network_info: [{"id": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "address": "fa:16:3e:43:0a:48", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f614c4-b9", "ovs_interfaceid": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:31:06 np0005593234 nova_compute[227762]: 2026-01-23 10:31:06.421 227766 DEBUG oslo_concurrency.lockutils [req-5bb518e3-e3cb-45ae-86b4-c98d4ec174ed req-9b8aa877-3b37-432e-9bea-542247d6ed42 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-f10b70f9-c203-4706-8e68-a3c1cd3af7a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:31:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:06.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:07.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:08 np0005593234 nova_compute[227762]: 2026-01-23 10:31:08.424 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:31:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:08.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:31:09 np0005593234 nova_compute[227762]: 2026-01-23 10:31:09.119 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:09.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:09 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #56. Immutable memtables: 12.
Jan 23 05:31:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.017000537s ======
Jan 23 05:31:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:10.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.017000537s
Jan 23 05:31:11 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:11Z|00092|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.5 does not match offer 10.100.0.9
Jan 23 05:31:11 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:11Z|00093|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:4f:dc:d1 10.100.0.9
Jan 23 05:31:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:11.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:11 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:11Z|00094|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:43:0a:48 10.100.0.3
Jan 23 05:31:11 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:11Z|00095|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:43:0a:48 10.100.0.3
Jan 23 05:31:11 np0005593234 nova_compute[227762]: 2026-01-23 10:31:11.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:11 np0005593234 podman[309461]: 2026-01-23 10:31:11.761385589 +0000 UTC m=+0.054012318 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:31:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:31:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:12.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:31:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:31:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3293519195' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:31:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:13 np0005593234 nova_compute[227762]: 2026-01-23 10:31:13.427 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:13.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:14 np0005593234 nova_compute[227762]: 2026-01-23 10:31:14.121 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:14.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:15.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:16 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:16Z|00096|pinctrl(ovn_pinctrl0)|WARN|DHCPREQUEST requested IP 10.100.0.5 does not match offer 10.100.0.9
Jan 23 05:31:16 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:16Z|00097|pinctrl(ovn_pinctrl0)|INFO|DHCPNAK fa:16:3e:4f:dc:d1 10.100.0.9
Jan 23 05:31:16 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:16Z|00098|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:4f:dc:d1 10.100.0.9
Jan 23 05:31:16 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:16Z|00099|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:4f:dc:d1 10.100.0.9
Jan 23 05:31:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:16.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:16 np0005593234 nova_compute[227762]: 2026-01-23 10:31:16.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:31:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:31:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:31:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:31:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:31:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:31:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:31:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:17.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:18 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:18Z|00100|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9b:3a:01 10.100.0.11
Jan 23 05:31:18 np0005593234 nova_compute[227762]: 2026-01-23 10:31:18.459 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:18.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:19 np0005593234 nova_compute[227762]: 2026-01-23 10:31:19.124 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:19.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:20.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:20 np0005593234 podman[309669]: 2026-01-23 10:31:20.816847227 +0000 UTC m=+0.113827807 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 05:31:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:21.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:22.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:23 np0005593234 nova_compute[227762]: 2026-01-23 10:31:23.462 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:23.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #151. Immutable memtables: 0.
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.563970) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 95] Flushing memtable with next log file: 151
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164283564058, "job": 95, "event": "flush_started", "num_memtables": 1, "num_entries": 1355, "num_deletes": 260, "total_data_size": 2826193, "memory_usage": 2878432, "flush_reason": "Manual Compaction"}
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 95] Level-0 flush table #152: started
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164283581396, "cf_name": "default", "job": 95, "event": "table_file_creation", "file_number": 152, "file_size": 1842297, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 73172, "largest_seqno": 74521, "table_properties": {"data_size": 1836455, "index_size": 3108, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 13115, "raw_average_key_size": 20, "raw_value_size": 1824385, "raw_average_value_size": 2789, "num_data_blocks": 136, "num_entries": 654, "num_filter_entries": 654, "num_deletions": 260, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164188, "oldest_key_time": 1769164188, "file_creation_time": 1769164283, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 152, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 95] Flush lasted 17831 microseconds, and 8557 cpu microseconds.
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.581805) [db/flush_job.cc:967] [default] [JOB 95] Level-0 flush table #152: 1842297 bytes OK
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.581868) [db/memtable_list.cc:519] [default] Level-0 commit table #152 started
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.583593) [db/memtable_list.cc:722] [default] Level-0 commit table #152: memtable #1 done
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.583607) EVENT_LOG_v1 {"time_micros": 1769164283583603, "job": 95, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.583627) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 95] Try to delete WAL files size 2819652, prev total WAL file size 2819652, number of live WAL files 2.
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000148.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.584705) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0032373632' seq:72057594037927935, type:22 .. '6C6F676D0033303136' seq:0, type:0; will stop at (end)
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 96] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 95 Base level 0, inputs: [152(1799KB)], [150(11MB)]
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164283584823, "job": 96, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [152], "files_L6": [150], "score": -1, "input_data_size": 13903028, "oldest_snapshot_seqno": -1}
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 96] Generated table #153: 9375 keys, 13758695 bytes, temperature: kUnknown
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164283676088, "cf_name": "default", "job": 96, "event": "table_file_creation", "file_number": 153, "file_size": 13758695, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13696542, "index_size": 37592, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23493, "raw_key_size": 247073, "raw_average_key_size": 26, "raw_value_size": 13530599, "raw_average_value_size": 1443, "num_data_blocks": 1446, "num_entries": 9375, "num_filter_entries": 9375, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769164283, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 153, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.676437) [db/compaction/compaction_job.cc:1663] [default] [JOB 96] Compacted 1@0 + 1@6 files to L6 => 13758695 bytes
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.677954) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 152.5 rd, 150.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 11.5 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(15.0) write-amplify(7.5) OK, records in: 9913, records dropped: 538 output_compression: NoCompression
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.677976) EVENT_LOG_v1 {"time_micros": 1769164283677965, "job": 96, "event": "compaction_finished", "compaction_time_micros": 91187, "compaction_time_cpu_micros": 29357, "output_level": 6, "num_output_files": 1, "total_output_size": 13758695, "num_input_records": 9913, "num_output_records": 9375, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000152.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164283678502, "job": 96, "event": "table_file_deletion", "file_number": 152}
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000150.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164283681264, "job": 96, "event": "table_file_deletion", "file_number": 150}
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.584557) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.681336) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.681343) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.681344) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.681346) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:31:23 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:31:23.681347) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:31:24 np0005593234 nova_compute[227762]: 2026-01-23 10:31:24.126 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:24.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:31:24 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:31:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:25.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:31:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:26.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:31:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:27.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:28 np0005593234 nova_compute[227762]: 2026-01-23 10:31:28.466 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:28.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:29 np0005593234 nova_compute[227762]: 2026-01-23 10:31:29.162 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:29.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:30.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:31.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:32.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:33.350 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=66, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=65) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:31:33 np0005593234 nova_compute[227762]: 2026-01-23 10:31:33.350 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:33.351 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:31:33 np0005593234 nova_compute[227762]: 2026-01-23 10:31:33.467 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:33.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:34 np0005593234 nova_compute[227762]: 2026-01-23 10:31:34.164 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:31:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:34.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:31:35 np0005593234 nova_compute[227762]: 2026-01-23 10:31:35.217 227766 DEBUG nova.compute.manager [None req-ec895481-afa7-4043-ac84-7000f092bb9f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:31:35 np0005593234 nova_compute[227762]: 2026-01-23 10:31:35.325 227766 INFO nova.compute.manager [None req-ec895481-afa7-4043-ac84-7000f092bb9f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] instance snapshotting#033[00m
Jan 23 05:31:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:35.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:35 np0005593234 nova_compute[227762]: 2026-01-23 10:31:35.846 227766 INFO nova.virt.libvirt.driver [None req-ec895481-afa7-4043-ac84-7000f092bb9f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Beginning live snapshot process#033[00m
Jan 23 05:31:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:36.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:37 np0005593234 nova_compute[227762]: 2026-01-23 10:31:37.385 227766 DEBUG nova.storage.rbd_utils [None req-ec895481-afa7-4043-ac84-7000f092bb9f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] creating snapshot(c76544eb9dd142e195e049f2b2972c4e) on rbd image(6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:31:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:37.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e339 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e340 e340: 3 total, 3 up, 3 in
Jan 23 05:31:38 np0005593234 nova_compute[227762]: 2026-01-23 10:31:38.470 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:38.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:38 np0005593234 nova_compute[227762]: 2026-01-23 10:31:38.580 227766 DEBUG nova.storage.rbd_utils [None req-ec895481-afa7-4043-ac84-7000f092bb9f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] cloning vms/6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_disk@c76544eb9dd142e195e049f2b2972c4e to images/45316771-2c9a-4201-a1b6-cc39488f10bc clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:31:38 np0005593234 nova_compute[227762]: 2026-01-23 10:31:38.743 227766 DEBUG nova.storage.rbd_utils [None req-ec895481-afa7-4043-ac84-7000f092bb9f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] flattening images/45316771-2c9a-4201-a1b6-cc39488f10bc flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.166 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.387 227766 DEBUG nova.storage.rbd_utils [None req-ec895481-afa7-4043-ac84-7000f092bb9f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] removing snapshot(c76544eb9dd142e195e049f2b2972c4e) on rbd image(6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.460 227766 DEBUG oslo_concurrency.lockutils [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.460 227766 DEBUG oslo_concurrency.lockutils [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.460 227766 DEBUG oslo_concurrency.lockutils [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.460 227766 DEBUG oslo_concurrency.lockutils [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.460 227766 DEBUG oslo_concurrency.lockutils [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.461 227766 INFO nova.compute.manager [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Terminating instance#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.462 227766 DEBUG nova.compute.manager [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:31:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:39.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e341 e341: 3 total, 3 up, 3 in
Jan 23 05:31:39 np0005593234 kernel: tap85f614c4-b9 (unregistering): left promiscuous mode
Jan 23 05:31:39 np0005593234 NetworkManager[48942]: <info>  [1769164299.6022] device (tap85f614c4-b9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:31:39 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:39Z|00760|binding|INFO|Releasing lport 85f614c4-b9b2-474a-a52e-5acbcb0a43c5 from this chassis (sb_readonly=0)
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.613 227766 DEBUG nova.storage.rbd_utils [None req-ec895481-afa7-4043-ac84-7000f092bb9f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] creating snapshot(snap) on rbd image(45316771-2c9a-4201-a1b6-cc39488f10bc) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:31:39 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:39Z|00761|binding|INFO|Setting lport 85f614c4-b9b2-474a-a52e-5acbcb0a43c5 down in Southbound
Jan 23 05:31:39 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:39Z|00762|binding|INFO|Removing iface tap85f614c4-b9 ovn-installed in OVS
Jan 23 05:31:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:39.627 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:0a:48 10.100.0.3'], port_security=['fa:16:3e:43:0a:48 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f10b70f9-c203-4706-8e68-a3c1cd3af7a9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3128fa93-5584-4fd7-b8b2-100d4babba87', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2fde87e7-bf35-4066-8d9f-5bce5d8c471c', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a7d423e3-a129-4092-a097-e9db38a84e9f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=85f614c4-b9b2-474a-a52e-5acbcb0a43c5) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:31:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:39.628 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 85f614c4-b9b2-474a-a52e-5acbcb0a43c5 in datapath 3128fa93-5584-4fd7-b8b2-100d4babba87 unbound from our chassis#033[00m
Jan 23 05:31:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:39.630 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3128fa93-5584-4fd7-b8b2-100d4babba87, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:31:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:39.631 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[64d611fa-fe5a-4d0c-bb61-85d09836c0ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:39.631 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87 namespace which is not needed anymore#033[00m
Jan 23 05:31:39 np0005593234 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000b2.scope: Deactivated successfully.
Jan 23 05:31:39 np0005593234 systemd[1]: machine-qemu\x2d84\x2dinstance\x2d000000b2.scope: Consumed 16.354s CPU time.
Jan 23 05:31:39 np0005593234 systemd-machined[195626]: Machine qemu-84-instance-000000b2 terminated.
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.734 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:39 np0005593234 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[309138]: [NOTICE]   (309147) : haproxy version is 2.8.14-c23fe91
Jan 23 05:31:39 np0005593234 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[309138]: [NOTICE]   (309147) : path to executable is /usr/sbin/haproxy
Jan 23 05:31:39 np0005593234 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[309138]: [WARNING]  (309147) : Exiting Master process...
Jan 23 05:31:39 np0005593234 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[309138]: [WARNING]  (309147) : Exiting Master process...
Jan 23 05:31:39 np0005593234 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[309138]: [ALERT]    (309147) : Current worker (309149) exited with code 143 (Terminated)
Jan 23 05:31:39 np0005593234 neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87[309138]: [WARNING]  (309147) : All workers exited. Exiting... (0)
Jan 23 05:31:39 np0005593234 systemd[1]: libpod-80786340444bb6832f286309c63318023bb5dad6755862684db43ac431a0cb44.scope: Deactivated successfully.
Jan 23 05:31:39 np0005593234 podman[309969]: 2026-01-23 10:31:39.774984694 +0000 UTC m=+0.043106937 container died 80786340444bb6832f286309c63318023bb5dad6755862684db43ac431a0cb44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:31:39 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80786340444bb6832f286309c63318023bb5dad6755862684db43ac431a0cb44-userdata-shm.mount: Deactivated successfully.
Jan 23 05:31:39 np0005593234 systemd[1]: var-lib-containers-storage-overlay-e204cf57fe8ad17aebafeff5d09caa0d0a9f8a5ce877fa8afe3c02027cd65654-merged.mount: Deactivated successfully.
Jan 23 05:31:39 np0005593234 podman[309969]: 2026-01-23 10:31:39.80810426 +0000 UTC m=+0.076226503 container cleanup 80786340444bb6832f286309c63318023bb5dad6755862684db43ac431a0cb44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:31:39 np0005593234 systemd[1]: libpod-conmon-80786340444bb6832f286309c63318023bb5dad6755862684db43ac431a0cb44.scope: Deactivated successfully.
Jan 23 05:31:39 np0005593234 podman[309998]: 2026-01-23 10:31:39.863426328 +0000 UTC m=+0.038080901 container remove 80786340444bb6832f286309c63318023bb5dad6755862684db43ac431a0cb44 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:31:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:39.870 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9f347e91-09a8-4380-a59a-a9de43bc3204]: (4, ('Fri Jan 23 10:31:39 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87 (80786340444bb6832f286309c63318023bb5dad6755862684db43ac431a0cb44)\n80786340444bb6832f286309c63318023bb5dad6755862684db43ac431a0cb44\nFri Jan 23 10:31:39 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87 (80786340444bb6832f286309c63318023bb5dad6755862684db43ac431a0cb44)\n80786340444bb6832f286309c63318023bb5dad6755862684db43ac431a0cb44\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:39.873 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8ec3bd71-6e48-4f6d-ac96-d0c4e0fbcb6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:39.874 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3128fa93-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.877 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:39 np0005593234 kernel: tap3128fa93-50: left promiscuous mode
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.891 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.895 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:39.898 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[47b91c07-4a6b-48bf-9b8c-3283211f48b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.904 227766 INFO nova.virt.libvirt.driver [-] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Instance destroyed successfully.#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.904 227766 DEBUG nova.objects.instance [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'resources' on Instance uuid f10b70f9-c203-4706-8e68-a3c1cd3af7a9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:31:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:39.912 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1c785d76-6d7d-4f71-9f4d-2b9fb1397478]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:39.913 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a8be99e1-d547-4797-8729-53f8a1dd2dc6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.924 227766 DEBUG nova.virt.libvirt.vif [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:30:37Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1500420906',display_name='tempest-TestNetworkBasicOps-server-1500420906',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1500420906',id=178,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDBwx9H6lyetcD0klU5EtF5tu3Bgw3j8Vtfzx918uyQtIQS8G8H59VLbqgzhuLaaEzLIvZnDjv0NJTs71foX7/dVyJnhuy3i18FefCvqxPKfNTiayVm3kq0+RpjimjAsHA==',key_name='tempest-TestNetworkBasicOps-503821714',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:30:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-1t28963m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:30:56Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=f10b70f9-c203-4706-8e68-a3c1cd3af7a9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "address": "fa:16:3e:43:0a:48", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f614c4-b9", "ovs_interfaceid": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.925 227766 DEBUG nova.network.os_vif_util [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "address": "fa:16:3e:43:0a:48", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.227", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f614c4-b9", "ovs_interfaceid": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.926 227766 DEBUG nova.network.os_vif_util [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:43:0a:48,bridge_name='br-int',has_traffic_filtering=True,id=85f614c4-b9b2-474a-a52e-5acbcb0a43c5,network=Network(3128fa93-5584-4fd7-b8b2-100d4babba87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f614c4-b9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.926 227766 DEBUG os_vif [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:0a:48,bridge_name='br-int',has_traffic_filtering=True,id=85f614c4-b9b2-474a-a52e-5acbcb0a43c5,network=Network(3128fa93-5584-4fd7-b8b2-100d4babba87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f614c4-b9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.928 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:39.927 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7c9422b9-3436-4058-9ea6-0aafcdc25e4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815152, 'reachable_time': 43522, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310023, 'error': None, 'target': 'ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.928 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap85f614c4-b9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:39 np0005593234 systemd[1]: run-netns-ovnmeta\x2d3128fa93\x2d5584\x2d4fd7\x2db8b2\x2d100d4babba87.mount: Deactivated successfully.
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.930 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.932 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:31:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:39.932 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3128fa93-5584-4fd7-b8b2-100d4babba87 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:31:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:39.932 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[6a855964-fb26-4cff-8c3c-8f50fffcb361]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.933 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:39 np0005593234 nova_compute[227762]: 2026-01-23 10:31:39.935 227766 INFO os_vif [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:0a:48,bridge_name='br-int',has_traffic_filtering=True,id=85f614c4-b9b2-474a-a52e-5acbcb0a43c5,network=Network(3128fa93-5584-4fd7-b8b2-100d4babba87),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap85f614c4-b9')#033[00m
Jan 23 05:31:40 np0005593234 nova_compute[227762]: 2026-01-23 10:31:40.369 227766 INFO nova.virt.libvirt.driver [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Deleting instance files /var/lib/nova/instances/f10b70f9-c203-4706-8e68-a3c1cd3af7a9_del#033[00m
Jan 23 05:31:40 np0005593234 nova_compute[227762]: 2026-01-23 10:31:40.370 227766 INFO nova.virt.libvirt.driver [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Deletion of /var/lib/nova/instances/f10b70f9-c203-4706-8e68-a3c1cd3af7a9_del complete#033[00m
Jan 23 05:31:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:40.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:40 np0005593234 nova_compute[227762]: 2026-01-23 10:31:40.507 227766 INFO nova.compute.manager [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Took 1.04 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:31:40 np0005593234 nova_compute[227762]: 2026-01-23 10:31:40.508 227766 DEBUG oslo.service.loopingcall [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:31:40 np0005593234 nova_compute[227762]: 2026-01-23 10:31:40.508 227766 DEBUG nova.compute.manager [-] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:31:40 np0005593234 nova_compute[227762]: 2026-01-23 10:31:40.508 227766 DEBUG nova.network.neutron [-] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:31:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e342 e342: 3 total, 3 up, 3 in
Jan 23 05:31:40 np0005593234 nova_compute[227762]: 2026-01-23 10:31:40.891 227766 DEBUG nova.compute.manager [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Received event network-changed-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:40 np0005593234 nova_compute[227762]: 2026-01-23 10:31:40.891 227766 DEBUG nova.compute.manager [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Refreshing instance network info cache due to event network-changed-85f614c4-b9b2-474a-a52e-5acbcb0a43c5. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:31:40 np0005593234 nova_compute[227762]: 2026-01-23 10:31:40.892 227766 DEBUG oslo_concurrency.lockutils [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-f10b70f9-c203-4706-8e68-a3c1cd3af7a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:31:40 np0005593234 nova_compute[227762]: 2026-01-23 10:31:40.892 227766 DEBUG oslo_concurrency.lockutils [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-f10b70f9-c203-4706-8e68-a3c1cd3af7a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:31:40 np0005593234 nova_compute[227762]: 2026-01-23 10:31:40.893 227766 DEBUG nova.network.neutron [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Refreshing network info cache for port 85f614c4-b9b2-474a-a52e-5acbcb0a43c5 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:31:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:41.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:42.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:42 np0005593234 podman[310047]: 2026-01-23 10:31:42.763184459 +0000 UTC m=+0.054808454 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:31:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:42.868 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:42.869 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:42.869 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e342 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:43.353 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '66'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:43 np0005593234 nova_compute[227762]: 2026-01-23 10:31:43.472 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:43 np0005593234 nova_compute[227762]: 2026-01-23 10:31:43.479 227766 DEBUG nova.network.neutron [-] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:31:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:43.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:43 np0005593234 nova_compute[227762]: 2026-01-23 10:31:43.553 227766 INFO nova.compute.manager [-] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Took 3.04 seconds to deallocate network for instance.#033[00m
Jan 23 05:31:43 np0005593234 nova_compute[227762]: 2026-01-23 10:31:43.650 227766 DEBUG nova.compute.manager [req-3e320097-3aaa-421d-8575-351aca0a065e req-971f9f31-545c-4df8-aaf0-1b8568160913 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Received event network-vif-deleted-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:43 np0005593234 nova_compute[227762]: 2026-01-23 10:31:43.684 227766 DEBUG oslo_concurrency.lockutils [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:43 np0005593234 nova_compute[227762]: 2026-01-23 10:31:43.685 227766 DEBUG oslo_concurrency.lockutils [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:43 np0005593234 nova_compute[227762]: 2026-01-23 10:31:43.855 227766 DEBUG nova.compute.manager [req-e74f3cde-4349-4140-83ef-e50740342dc9 req-a072c5de-664f-4e33-b4da-f3430ab58114 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Received event network-vif-unplugged-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:43 np0005593234 nova_compute[227762]: 2026-01-23 10:31:43.855 227766 DEBUG oslo_concurrency.lockutils [req-e74f3cde-4349-4140-83ef-e50740342dc9 req-a072c5de-664f-4e33-b4da-f3430ab58114 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:43 np0005593234 nova_compute[227762]: 2026-01-23 10:31:43.855 227766 DEBUG oslo_concurrency.lockutils [req-e74f3cde-4349-4140-83ef-e50740342dc9 req-a072c5de-664f-4e33-b4da-f3430ab58114 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:43 np0005593234 nova_compute[227762]: 2026-01-23 10:31:43.856 227766 DEBUG oslo_concurrency.lockutils [req-e74f3cde-4349-4140-83ef-e50740342dc9 req-a072c5de-664f-4e33-b4da-f3430ab58114 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:43 np0005593234 nova_compute[227762]: 2026-01-23 10:31:43.856 227766 DEBUG nova.compute.manager [req-e74f3cde-4349-4140-83ef-e50740342dc9 req-a072c5de-664f-4e33-b4da-f3430ab58114 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] No waiting events found dispatching network-vif-unplugged-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:31:43 np0005593234 nova_compute[227762]: 2026-01-23 10:31:43.856 227766 WARNING nova.compute.manager [req-e74f3cde-4349-4140-83ef-e50740342dc9 req-a072c5de-664f-4e33-b4da-f3430ab58114 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Received unexpected event network-vif-unplugged-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:31:44 np0005593234 nova_compute[227762]: 2026-01-23 10:31:44.237 227766 DEBUG oslo_concurrency.processutils [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:31:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:44.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:31:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:31:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3598822370' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:31:44 np0005593234 nova_compute[227762]: 2026-01-23 10:31:44.664 227766 DEBUG oslo_concurrency.processutils [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:31:44 np0005593234 nova_compute[227762]: 2026-01-23 10:31:44.670 227766 DEBUG nova.compute.provider_tree [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:31:44 np0005593234 nova_compute[227762]: 2026-01-23 10:31:44.716 227766 DEBUG nova.scheduler.client.report [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:31:44 np0005593234 nova_compute[227762]: 2026-01-23 10:31:44.777 227766 DEBUG oslo_concurrency.lockutils [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.092s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:44 np0005593234 nova_compute[227762]: 2026-01-23 10:31:44.812 227766 INFO nova.scheduler.client.report [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Deleted allocations for instance f10b70f9-c203-4706-8e68-a3c1cd3af7a9#033[00m
Jan 23 05:31:44 np0005593234 nova_compute[227762]: 2026-01-23 10:31:44.930 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:31:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1940378036' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:31:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:31:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1940378036' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:31:44 np0005593234 nova_compute[227762]: 2026-01-23 10:31:44.958 227766 INFO nova.virt.libvirt.driver [None req-ec895481-afa7-4043-ac84-7000f092bb9f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Snapshot image upload complete#033[00m
Jan 23 05:31:44 np0005593234 nova_compute[227762]: 2026-01-23 10:31:44.958 227766 INFO nova.compute.manager [None req-ec895481-afa7-4043-ac84-7000f092bb9f 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Took 9.63 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 23 05:31:45 np0005593234 nova_compute[227762]: 2026-01-23 10:31:45.016 227766 DEBUG nova.network.neutron [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Updated VIF entry in instance network info cache for port 85f614c4-b9b2-474a-a52e-5acbcb0a43c5. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:31:45 np0005593234 nova_compute[227762]: 2026-01-23 10:31:45.016 227766 DEBUG nova.network.neutron [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Updating instance_info_cache with network_info: [{"id": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "address": "fa:16:3e:43:0a:48", "network": {"id": "3128fa93-5584-4fd7-b8b2-100d4babba87", "bridge": "br-int", "label": "tempest-network-smoke--1879452470", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap85f614c4-b9", "ovs_interfaceid": "85f614c4-b9b2-474a-a52e-5acbcb0a43c5", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:31:45 np0005593234 nova_compute[227762]: 2026-01-23 10:31:45.055 227766 DEBUG oslo_concurrency.lockutils [None req-8b1d1a19-209e-4642-b7ea-e601de0b3703 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.595s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:45 np0005593234 nova_compute[227762]: 2026-01-23 10:31:45.106 227766 DEBUG oslo_concurrency.lockutils [req-a14bbc9b-510d-4dd0-95c4-07247ecb7910 req-c949add0-a3c9-4d49-85e2-d8016fcaf95a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-f10b70f9-c203-4706-8e68-a3c1cd3af7a9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:31:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:45.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:46 np0005593234 nova_compute[227762]: 2026-01-23 10:31:46.092 227766 DEBUG nova.compute.manager [req-124e06b0-3ecc-4e87-9555-7ca23916f0bd req-636d884d-1a12-499d-ba58-3523a3d44d91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Received event network-vif-plugged-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:46 np0005593234 nova_compute[227762]: 2026-01-23 10:31:46.093 227766 DEBUG oslo_concurrency.lockutils [req-124e06b0-3ecc-4e87-9555-7ca23916f0bd req-636d884d-1a12-499d-ba58-3523a3d44d91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:46 np0005593234 nova_compute[227762]: 2026-01-23 10:31:46.093 227766 DEBUG oslo_concurrency.lockutils [req-124e06b0-3ecc-4e87-9555-7ca23916f0bd req-636d884d-1a12-499d-ba58-3523a3d44d91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:46 np0005593234 nova_compute[227762]: 2026-01-23 10:31:46.093 227766 DEBUG oslo_concurrency.lockutils [req-124e06b0-3ecc-4e87-9555-7ca23916f0bd req-636d884d-1a12-499d-ba58-3523a3d44d91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "f10b70f9-c203-4706-8e68-a3c1cd3af7a9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:46 np0005593234 nova_compute[227762]: 2026-01-23 10:31:46.093 227766 DEBUG nova.compute.manager [req-124e06b0-3ecc-4e87-9555-7ca23916f0bd req-636d884d-1a12-499d-ba58-3523a3d44d91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] No waiting events found dispatching network-vif-plugged-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:31:46 np0005593234 nova_compute[227762]: 2026-01-23 10:31:46.094 227766 WARNING nova.compute.manager [req-124e06b0-3ecc-4e87-9555-7ca23916f0bd req-636d884d-1a12-499d-ba58-3523a3d44d91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Received unexpected event network-vif-plugged-85f614c4-b9b2-474a-a52e-5acbcb0a43c5 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:31:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:46.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e343 e343: 3 total, 3 up, 3 in
Jan 23 05:31:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:47.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e343 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:48 np0005593234 nova_compute[227762]: 2026-01-23 10:31:48.473 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:48.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e344 e344: 3 total, 3 up, 3 in
Jan 23 05:31:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:49.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:49 np0005593234 nova_compute[227762]: 2026-01-23 10:31:49.932 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:50.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:51.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:51 np0005593234 podman[310095]: 2026-01-23 10:31:51.824594322 +0000 UTC m=+0.105522707 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.384 227766 DEBUG oslo_concurrency.lockutils [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Acquiring lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.384 227766 DEBUG oslo_concurrency.lockutils [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.385 227766 DEBUG oslo_concurrency.lockutils [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Acquiring lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.385 227766 DEBUG oslo_concurrency.lockutils [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.385 227766 DEBUG oslo_concurrency.lockutils [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.386 227766 INFO nova.compute.manager [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Terminating instance#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.387 227766 DEBUG nova.compute.manager [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:31:52 np0005593234 kernel: tapa2c66cbc-51 (unregistering): left promiscuous mode
Jan 23 05:31:52 np0005593234 NetworkManager[48942]: <info>  [1769164312.4684] device (tapa2c66cbc-51): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:31:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:52Z|00763|binding|INFO|Releasing lport a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 from this chassis (sb_readonly=0)
Jan 23 05:31:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:52Z|00764|binding|INFO|Setting lport a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 down in Southbound
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.476 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:31:52Z|00765|binding|INFO|Removing iface tapa2c66cbc-51 ovn-installed in OVS
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.480 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:52.489 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4f:dc:d1 10.100.0.9'], port_security=['fa:16:3e:4f:dc:d1 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '6ed27eef-aee3-4b7d-a31f-8b7d753a25b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd95237d-0845-479e-9505-318e01879565', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7be5cb5abaf44b0a9c0c307d348d8f75', 'neutron:revision_number': '4', 'neutron:security_group_ids': '9caee2dd-fa48-495f-923a-9b90f0b8d219', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=fddb1949-170b-4939-a509-14ac4d8149d1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:31:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:52.490 144381 INFO neutron.agent.ovn.metadata.agent [-] Port a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 in datapath bd95237d-0845-479e-9505-318e01879565 unbound from our chassis#033[00m
Jan 23 05:31:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:52.491 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bd95237d-0845-479e-9505-318e01879565, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:31:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:52.492 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f077f762-aaa7-4627-97fa-a079448a4839]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:52.493 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bd95237d-0845-479e-9505-318e01879565 namespace which is not needed anymore#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.494 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:52.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:52 np0005593234 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000b1.scope: Deactivated successfully.
Jan 23 05:31:52 np0005593234 systemd[1]: machine-qemu\x2d82\x2dinstance\x2d000000b1.scope: Consumed 17.547s CPU time.
Jan 23 05:31:52 np0005593234 systemd-machined[195626]: Machine qemu-82-instance-000000b1 terminated.
Jan 23 05:31:52 np0005593234 neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565[308637]: [NOTICE]   (308646) : haproxy version is 2.8.14-c23fe91
Jan 23 05:31:52 np0005593234 neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565[308637]: [NOTICE]   (308646) : path to executable is /usr/sbin/haproxy
Jan 23 05:31:52 np0005593234 neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565[308637]: [WARNING]  (308646) : Exiting Master process...
Jan 23 05:31:52 np0005593234 neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565[308637]: [WARNING]  (308646) : Exiting Master process...
Jan 23 05:31:52 np0005593234 neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565[308637]: [ALERT]    (308646) : Current worker (308648) exited with code 143 (Terminated)
Jan 23 05:31:52 np0005593234 neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565[308637]: [WARNING]  (308646) : All workers exited. Exiting... (0)
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.626 227766 INFO nova.virt.libvirt.driver [-] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Instance destroyed successfully.#033[00m
Jan 23 05:31:52 np0005593234 systemd[1]: libpod-f3a67086a1b5b734813bb9249a7036a672f220fbd3771e2de10cb2c936e9883c.scope: Deactivated successfully.
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.626 227766 DEBUG nova.objects.instance [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lazy-loading 'resources' on Instance uuid 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:31:52 np0005593234 podman[310145]: 2026-01-23 10:31:52.631430921 +0000 UTC m=+0.052730538 container died f3a67086a1b5b734813bb9249a7036a672f220fbd3771e2de10cb2c936e9883c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 05:31:52 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f3a67086a1b5b734813bb9249a7036a672f220fbd3771e2de10cb2c936e9883c-userdata-shm.mount: Deactivated successfully.
Jan 23 05:31:52 np0005593234 systemd[1]: var-lib-containers-storage-overlay-c48f63baae2323e23727824b9b0b8926232bfe1829633028b20dce53780300c2-merged.mount: Deactivated successfully.
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.668 227766 DEBUG nova.virt.libvirt.vif [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:30:36Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestSnapshotPattern-server-720792772',display_name='tempest-TestSnapshotPattern-server-720792772',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testsnapshotpattern-server-720792772',id=177,image_ref='81a92860-f94f-4274-aba5-1ec35fd1f681',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO0lEKBsepjSG1BUrT2qgopJ/7aCoBcgDi3hhuJKTvppGpJeuS7bRrTAjsHpfJAjqSviKitZ9vmMFVrUxqv9t4cjKwPE6pfdP8/KJg/bYjfHtBTugoC0prDbk1bWow1ivA==',key_name='tempest-TestSnapshotPattern-313488550',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:30:55Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7be5cb5abaf44b0a9c0c307d348d8f75',ramdisk_id='',reservation_id='r-as8m7jan',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_boot_roles='reader,member',image_container_format='bare',image_disk_format='raw',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_image_location='snapshot',image_image_state='available',image_image_type='snapshot',image_instance_uuid='713eba08-716b-48ed-866e-e231d09ebfaf',image_min_disk='1',image_min_ram='0',image_owner_id='7be5cb5abaf44b0a9c0c307d348d8f75',image_owner_project_name='tempest-TestSnapshotPattern-428739353',image_owner_user_name='tempest-TestSnapshotPattern-428739353-project-member',image_user_id='8e1f41f21f79408d8dff1331cfd1e0db',image_version='8.0',owner_project_name='tempest-TestSnapshotPattern-428739353',owner_user_name='tempest-TestSnapshotPattern-428739353-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:31:45Z,user_data=None,user_id='8e1f41f21f79408d8dff1331cfd1e0db',uuid=6ed27eef-aee3-4b7d-a31f-8b7d753a25b9,vcpu_model=<?>
,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "address": "fa:16:3e:4f:dc:d1", "network": {"id": "bd95237d-0845-479e-9505-318e01879565", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-2097735183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7be5cb5abaf44b0a9c0c307d348d8f75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c66cbc-51", "ovs_interfaceid": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.669 227766 DEBUG nova.network.os_vif_util [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Converting VIF {"id": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "address": "fa:16:3e:4f:dc:d1", "network": {"id": "bd95237d-0845-479e-9505-318e01879565", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-2097735183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.198", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7be5cb5abaf44b0a9c0c307d348d8f75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c66cbc-51", "ovs_interfaceid": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.670 227766 DEBUG nova.network.os_vif_util [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:4f:dc:d1,bridge_name='br-int',has_traffic_filtering=True,id=a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471,network=Network(bd95237d-0845-479e-9505-318e01879565),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2c66cbc-51') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.670 227766 DEBUG os_vif [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:dc:d1,bridge_name='br-int',has_traffic_filtering=True,id=a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471,network=Network(bd95237d-0845-479e-9505-318e01879565),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2c66cbc-51') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.672 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:52 np0005593234 podman[310145]: 2026-01-23 10:31:52.672623838 +0000 UTC m=+0.093923465 container cleanup f3a67086a1b5b734813bb9249a7036a672f220fbd3771e2de10cb2c936e9883c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.672 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2c66cbc-51, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.674 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.676 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.679 227766 INFO os_vif [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:4f:dc:d1,bridge_name='br-int',has_traffic_filtering=True,id=a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471,network=Network(bd95237d-0845-479e-9505-318e01879565),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa2c66cbc-51')#033[00m
Jan 23 05:31:52 np0005593234 systemd[1]: libpod-conmon-f3a67086a1b5b734813bb9249a7036a672f220fbd3771e2de10cb2c936e9883c.scope: Deactivated successfully.
Jan 23 05:31:52 np0005593234 podman[310185]: 2026-01-23 10:31:52.732822129 +0000 UTC m=+0.038784942 container remove f3a67086a1b5b734813bb9249a7036a672f220fbd3771e2de10cb2c936e9883c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:31:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:52.737 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[57cd6297-08dc-4868-8ef2-0813207ba5df]: (4, ('Fri Jan 23 10:31:52 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565 (f3a67086a1b5b734813bb9249a7036a672f220fbd3771e2de10cb2c936e9883c)\nf3a67086a1b5b734813bb9249a7036a672f220fbd3771e2de10cb2c936e9883c\nFri Jan 23 10:31:52 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bd95237d-0845-479e-9505-318e01879565 (f3a67086a1b5b734813bb9249a7036a672f220fbd3771e2de10cb2c936e9883c)\nf3a67086a1b5b734813bb9249a7036a672f220fbd3771e2de10cb2c936e9883c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:52.739 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[32b9367d-db2a-4554-8704-384cf5ff854f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:52.739 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd95237d-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.741 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:52 np0005593234 kernel: tapbd95237d-00: left promiscuous mode
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.761 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:52.764 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[59a138ec-4b75-4417-b459-fe899d03a762]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:52.784 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[64ac466a-ea5e-4545-b921-52b2234bd478]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:52.786 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6527df87-b575-4147-9e62-1eeb64245c97]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:52.799 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[64683a04-c261-4cd4-84ad-dd365c53ed16]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 814778, 'reachable_time': 30198, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 310218, 'error': None, 'target': 'ovnmeta-bd95237d-0845-479e-9505-318e01879565', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:52 np0005593234 systemd[1]: run-netns-ovnmeta\x2dbd95237d\x2d0845\x2d479e\x2d9505\x2d318e01879565.mount: Deactivated successfully.
Jan 23 05:31:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:52.803 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bd95237d-0845-479e-9505-318e01879565 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:31:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:31:52.803 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[c52d6a0c-8e9a-471b-8b07-8a25622c7e47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.993 227766 DEBUG nova.compute.manager [req-175ae1dd-cb92-4d44-ba2d-387cd1f1211a req-b82c6b05-c323-404c-9e4d-bb19f80144b1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Received event network-vif-unplugged-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.994 227766 DEBUG oslo_concurrency.lockutils [req-175ae1dd-cb92-4d44-ba2d-387cd1f1211a req-b82c6b05-c323-404c-9e4d-bb19f80144b1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.994 227766 DEBUG oslo_concurrency.lockutils [req-175ae1dd-cb92-4d44-ba2d-387cd1f1211a req-b82c6b05-c323-404c-9e4d-bb19f80144b1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.994 227766 DEBUG oslo_concurrency.lockutils [req-175ae1dd-cb92-4d44-ba2d-387cd1f1211a req-b82c6b05-c323-404c-9e4d-bb19f80144b1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.994 227766 DEBUG nova.compute.manager [req-175ae1dd-cb92-4d44-ba2d-387cd1f1211a req-b82c6b05-c323-404c-9e4d-bb19f80144b1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] No waiting events found dispatching network-vif-unplugged-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:31:52 np0005593234 nova_compute[227762]: 2026-01-23 10:31:52.994 227766 DEBUG nova.compute.manager [req-175ae1dd-cb92-4d44-ba2d-387cd1f1211a req-b82c6b05-c323-404c-9e4d-bb19f80144b1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Received event network-vif-unplugged-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:31:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e344 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:53 np0005593234 nova_compute[227762]: 2026-01-23 10:31:53.177 227766 INFO nova.virt.libvirt.driver [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Deleting instance files /var/lib/nova/instances/6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_del#033[00m
Jan 23 05:31:53 np0005593234 nova_compute[227762]: 2026-01-23 10:31:53.179 227766 INFO nova.virt.libvirt.driver [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Deletion of /var/lib/nova/instances/6ed27eef-aee3-4b7d-a31f-8b7d753a25b9_del complete#033[00m
Jan 23 05:31:53 np0005593234 nova_compute[227762]: 2026-01-23 10:31:53.311 227766 INFO nova.compute.manager [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Took 0.92 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:31:53 np0005593234 nova_compute[227762]: 2026-01-23 10:31:53.312 227766 DEBUG oslo.service.loopingcall [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:31:53 np0005593234 nova_compute[227762]: 2026-01-23 10:31:53.312 227766 DEBUG nova.compute.manager [-] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:31:53 np0005593234 nova_compute[227762]: 2026-01-23 10:31:53.313 227766 DEBUG nova.network.neutron [-] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:31:53 np0005593234 nova_compute[227762]: 2026-01-23 10:31:53.510 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:31:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:53.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:31:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e345 e345: 3 total, 3 up, 3 in
Jan 23 05:31:53 np0005593234 nova_compute[227762]: 2026-01-23 10:31:53.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:53 np0005593234 nova_compute[227762]: 2026-01-23 10:31:53.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:53 np0005593234 nova_compute[227762]: 2026-01-23 10:31:53.769 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:53 np0005593234 nova_compute[227762]: 2026-01-23 10:31:53.769 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:53 np0005593234 nova_compute[227762]: 2026-01-23 10:31:53.769 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:53 np0005593234 nova_compute[227762]: 2026-01-23 10:31:53.769 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:31:53 np0005593234 nova_compute[227762]: 2026-01-23 10:31:53.770 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:31:54 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3449950488' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.248 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.381 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.381 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:31:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:54.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:31:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.0 total, 600.0 interval#012Cumulative writes: 14K writes, 74K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.15 GB, 0.03 MB/s#012Cumulative WAL: 14K writes, 14K syncs, 1.00 writes per sync, written: 0.15 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1588 writes, 7967 keys, 1588 commit groups, 1.0 writes per commit group, ingest: 16.15 MB, 0.03 MB/s#012Interval WAL: 1588 writes, 1588 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     56.1      1.66              0.27        48    0.035       0      0       0.0       0.0#012  L6      1/0   13.12 MB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   5.0    119.2    101.6      4.53              1.48        47    0.096    330K    25K       0.0       0.0#012 Sum      1/0   13.12 MB   0.0      0.5     0.1      0.4       0.5      0.1       0.0   6.0     87.3     89.4      6.19              1.75        95    0.065    330K    25K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   6.5    140.5    146.1      0.53              0.22        12    0.044     56K   3136       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.5     0.1      0.4       0.4      0.0       0.0   0.0    119.2    101.6      4.53              1.48        47    0.096    330K    25K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     56.2      1.66              0.27        47    0.035       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 5400.0 total, 600.0 interval#012Flush(GB): cumulative 0.091, interval 0.012#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.54 GB write, 0.10 MB/s write, 0.53 GB read, 0.10 MB/s read, 6.2 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f10fc171f0#2 capacity: 304.00 MB usage: 59.64 MB table_size: 0 occupancy: 18446744073709551615 collections: 10 last_copies: 0 last_secs: 0.000345 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3419,57.24 MB,18.8303%) FilterBlock(95,930.17 KB,0.298806%) IndexBlock(95,1.49 MB,0.489079%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.550 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.551 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4014MB free_disk=20.797321319580078GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.551 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.552 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.862 227766 DEBUG nova.compute.manager [req-2c965f32-8f09-4b82-a4f2-ebb89ad4e83c req-ec54defa-5c84-40ff-9baf-be5d53679802 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Received event network-changed-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.863 227766 DEBUG nova.compute.manager [req-2c965f32-8f09-4b82-a4f2-ebb89ad4e83c req-ec54defa-5c84-40ff-9baf-be5d53679802 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Refreshing instance network info cache due to event network-changed-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.863 227766 DEBUG oslo_concurrency.lockutils [req-2c965f32-8f09-4b82-a4f2-ebb89ad4e83c req-ec54defa-5c84-40ff-9baf-be5d53679802 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.864 227766 DEBUG oslo_concurrency.lockutils [req-2c965f32-8f09-4b82-a4f2-ebb89ad4e83c req-ec54defa-5c84-40ff-9baf-be5d53679802 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.864 227766 DEBUG nova.network.neutron [req-2c965f32-8f09-4b82-a4f2-ebb89ad4e83c req-ec54defa-5c84-40ff-9baf-be5d53679802 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Refreshing network info cache for port a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.890 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 64ccc062-b11b-4cbc-96ba-620e43dfdb20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.890 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.890 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.890 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.903 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164299.901346, f10b70f9-c203-4706-8e68-a3c1cd3af7a9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.904 227766 INFO nova.compute.manager [-] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.922 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:31:54 np0005593234 nova_compute[227762]: 2026-01-23 10:31:54.948 227766 DEBUG nova.compute.manager [None req-6c8d31fe-a687-4897-8dea-7f59c1ff5d7e - - - - - -] [instance: f10b70f9-c203-4706-8e68-a3c1cd3af7a9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:31:55 np0005593234 nova_compute[227762]: 2026-01-23 10:31:55.044 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:31:55 np0005593234 nova_compute[227762]: 2026-01-23 10:31:55.044 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:31:55 np0005593234 nova_compute[227762]: 2026-01-23 10:31:55.122 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:31:55 np0005593234 nova_compute[227762]: 2026-01-23 10:31:55.200 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:31:55 np0005593234 nova_compute[227762]: 2026-01-23 10:31:55.432 227766 DEBUG nova.compute.manager [req-18c486a2-0747-4c0e-89aa-f6b9955c9b86 req-4fdac65c-2b39-4b5e-9f95-e9afec0187c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Received event network-vif-plugged-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:55 np0005593234 nova_compute[227762]: 2026-01-23 10:31:55.433 227766 DEBUG oslo_concurrency.lockutils [req-18c486a2-0747-4c0e-89aa-f6b9955c9b86 req-4fdac65c-2b39-4b5e-9f95-e9afec0187c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:55 np0005593234 nova_compute[227762]: 2026-01-23 10:31:55.433 227766 DEBUG oslo_concurrency.lockutils [req-18c486a2-0747-4c0e-89aa-f6b9955c9b86 req-4fdac65c-2b39-4b5e-9f95-e9afec0187c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:55 np0005593234 nova_compute[227762]: 2026-01-23 10:31:55.433 227766 DEBUG oslo_concurrency.lockutils [req-18c486a2-0747-4c0e-89aa-f6b9955c9b86 req-4fdac65c-2b39-4b5e-9f95-e9afec0187c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:55 np0005593234 nova_compute[227762]: 2026-01-23 10:31:55.434 227766 DEBUG nova.compute.manager [req-18c486a2-0747-4c0e-89aa-f6b9955c9b86 req-4fdac65c-2b39-4b5e-9f95-e9afec0187c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] No waiting events found dispatching network-vif-plugged-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:31:55 np0005593234 nova_compute[227762]: 2026-01-23 10:31:55.434 227766 WARNING nova.compute.manager [req-18c486a2-0747-4c0e-89aa-f6b9955c9b86 req-4fdac65c-2b39-4b5e-9f95-e9afec0187c8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Received unexpected event network-vif-plugged-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:31:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:55.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:55 np0005593234 nova_compute[227762]: 2026-01-23 10:31:55.526 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:31:55 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2125367175' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:31:55 np0005593234 nova_compute[227762]: 2026-01-23 10:31:55.939 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.413s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:31:55 np0005593234 nova_compute[227762]: 2026-01-23 10:31:55.948 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:31:55 np0005593234 nova_compute[227762]: 2026-01-23 10:31:55.975 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:31:56 np0005593234 nova_compute[227762]: 2026-01-23 10:31:56.032 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:31:56 np0005593234 nova_compute[227762]: 2026-01-23 10:31:56.033 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.481s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:56 np0005593234 nova_compute[227762]: 2026-01-23 10:31:56.407 227766 DEBUG nova.network.neutron [-] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:31:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:56.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:56 np0005593234 nova_compute[227762]: 2026-01-23 10:31:56.551 227766 INFO nova.compute.manager [-] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Took 3.24 seconds to deallocate network for instance.#033[00m
Jan 23 05:31:56 np0005593234 nova_compute[227762]: 2026-01-23 10:31:56.826 227766 DEBUG oslo_concurrency.lockutils [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:31:56 np0005593234 nova_compute[227762]: 2026-01-23 10:31:56.826 227766 DEBUG oslo_concurrency.lockutils [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:31:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e346 e346: 3 total, 3 up, 3 in
Jan 23 05:31:56 np0005593234 nova_compute[227762]: 2026-01-23 10:31:56.936 227766 DEBUG oslo_concurrency.processutils [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:31:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:31:57 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1621687805' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:31:57 np0005593234 nova_compute[227762]: 2026-01-23 10:31:57.388 227766 DEBUG oslo_concurrency.processutils [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:31:57 np0005593234 nova_compute[227762]: 2026-01-23 10:31:57.394 227766 DEBUG nova.compute.provider_tree [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:31:57 np0005593234 nova_compute[227762]: 2026-01-23 10:31:57.414 227766 DEBUG nova.scheduler.client.report [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:31:57 np0005593234 nova_compute[227762]: 2026-01-23 10:31:57.477 227766 DEBUG oslo_concurrency.lockutils [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.650s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:31:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:57.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:31:57 np0005593234 nova_compute[227762]: 2026-01-23 10:31:57.532 227766 INFO nova.scheduler.client.report [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Deleted allocations for instance 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9#033[00m
Jan 23 05:31:57 np0005593234 nova_compute[227762]: 2026-01-23 10:31:57.675 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e347 e347: 3 total, 3 up, 3 in
Jan 23 05:31:57 np0005593234 nova_compute[227762]: 2026-01-23 10:31:57.982 227766 DEBUG nova.compute.manager [req-5860289a-7725-40a9-b01a-901abaddae8c req-97ec68af-6db8-4195-87d4-92d0ea602b43 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Received event network-vif-deleted-a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:31:58 np0005593234 nova_compute[227762]: 2026-01-23 10:31:58.085 227766 DEBUG oslo_concurrency.lockutils [None req-5f4e496b-982f-4bb7-9346-f560248e5e1c 8e1f41f21f79408d8dff1331cfd1e0db 7be5cb5abaf44b0a9c0c307d348d8f75 - - default default] Lock "6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:31:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e347 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:31:58 np0005593234 nova_compute[227762]: 2026-01-23 10:31:58.514 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:31:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:58.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e348 e348: 3 total, 3 up, 3 in
Jan 23 05:31:59 np0005593234 nova_compute[227762]: 2026-01-23 10:31:59.033 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:31:59 np0005593234 nova_compute[227762]: 2026-01-23 10:31:59.034 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:31:59 np0005593234 nova_compute[227762]: 2026-01-23 10:31:59.187 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:31:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:31:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:31:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:59.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:31:59 np0005593234 nova_compute[227762]: 2026-01-23 10:31:59.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:00.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:00 np0005593234 nova_compute[227762]: 2026-01-23 10:32:00.543 227766 DEBUG nova.network.neutron [req-2c965f32-8f09-4b82-a4f2-ebb89ad4e83c req-ec54defa-5c84-40ff-9baf-be5d53679802 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Updated VIF entry in instance network info cache for port a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:32:00 np0005593234 nova_compute[227762]: 2026-01-23 10:32:00.543 227766 DEBUG nova.network.neutron [req-2c965f32-8f09-4b82-a4f2-ebb89ad4e83c req-ec54defa-5c84-40ff-9baf-be5d53679802 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Updating instance_info_cache with network_info: [{"id": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "address": "fa:16:3e:4f:dc:d1", "network": {"id": "bd95237d-0845-479e-9505-318e01879565", "bridge": "br-int", "label": "tempest-TestSnapshotPattern-2097735183-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7be5cb5abaf44b0a9c0c307d348d8f75", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa2c66cbc-51", "ovs_interfaceid": "a2c66cbc-516c-4bb9-a6a0-db2a9dcfe471", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:32:00 np0005593234 nova_compute[227762]: 2026-01-23 10:32:00.601 227766 DEBUG oslo_concurrency.lockutils [req-2c965f32-8f09-4b82-a4f2-ebb89ad4e83c req-ec54defa-5c84-40ff-9baf-be5d53679802 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6ed27eef-aee3-4b7d-a31f-8b7d753a25b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:32:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e349 e349: 3 total, 3 up, 3 in
Jan 23 05:32:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:01.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:01 np0005593234 nova_compute[227762]: 2026-01-23 10:32:01.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:02.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:02 np0005593234 nova_compute[227762]: 2026-01-23 10:32:02.677 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:03 np0005593234 nova_compute[227762]: 2026-01-23 10:32:03.515 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:03.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:03 np0005593234 nova_compute[227762]: 2026-01-23 10:32:03.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:03 np0005593234 nova_compute[227762]: 2026-01-23 10:32:03.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:03 np0005593234 nova_compute[227762]: 2026-01-23 10:32:03.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:32:04 np0005593234 ovn_controller[134547]: 2026-01-23T10:32:04Z|00766|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:32:04 np0005593234 nova_compute[227762]: 2026-01-23 10:32:04.402 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:04.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:04 np0005593234 ovn_controller[134547]: 2026-01-23T10:32:04Z|00767|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:32:04 np0005593234 nova_compute[227762]: 2026-01-23 10:32:04.677 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:05.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:06.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:07.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:07 np0005593234 nova_compute[227762]: 2026-01-23 10:32:07.625 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164312.6236415, 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:32:07 np0005593234 nova_compute[227762]: 2026-01-23 10:32:07.625 227766 INFO nova.compute.manager [-] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:32:07 np0005593234 nova_compute[227762]: 2026-01-23 10:32:07.680 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:07 np0005593234 nova_compute[227762]: 2026-01-23 10:32:07.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:07 np0005593234 nova_compute[227762]: 2026-01-23 10:32:07.779 227766 DEBUG nova.compute.manager [None req-d04388b5-7366-4da1-bcb3-ea602cd4e776 - - - - - -] [instance: 6ed27eef-aee3-4b7d-a31f-8b7d753a25b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:32:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e349 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:08 np0005593234 nova_compute[227762]: 2026-01-23 10:32:08.517 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:08.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e350 e350: 3 total, 3 up, 3 in
Jan 23 05:32:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:09.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:10.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:32:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:11.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:32:11 np0005593234 nova_compute[227762]: 2026-01-23 10:32:11.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:12.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:12 np0005593234 nova_compute[227762]: 2026-01-23 10:32:12.682 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:13 np0005593234 nova_compute[227762]: 2026-01-23 10:32:13.520 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:13.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:13 np0005593234 podman[310349]: 2026-01-23 10:32:13.768427653 +0000 UTC m=+0.058653543 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 05:32:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:14.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:15 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 23 05:32:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:15.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:32:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:16.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:32:16 np0005593234 nova_compute[227762]: 2026-01-23 10:32:16.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:17.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:17 np0005593234 nova_compute[227762]: 2026-01-23 10:32:17.686 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:18.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:18 np0005593234 nova_compute[227762]: 2026-01-23 10:32:18.582 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:19.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:20.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:21.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:32:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:22.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:32:22 np0005593234 nova_compute[227762]: 2026-01-23 10:32:22.688 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:22 np0005593234 podman[310423]: 2026-01-23 10:32:22.780679852 +0000 UTC m=+0.073281981 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 05:32:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:23.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:23 np0005593234 nova_compute[227762]: 2026-01-23 10:32:23.627 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:24.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:24 np0005593234 podman[310623]: 2026-01-23 10:32:24.978968865 +0000 UTC m=+0.178532199 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:32:25 np0005593234 podman[310623]: 2026-01-23 10:32:25.103278598 +0000 UTC m=+0.302841942 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=reef, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0)
Jan 23 05:32:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:25.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:25 np0005593234 podman[310776]: 2026-01-23 10:32:25.744878585 +0000 UTC m=+0.129970412 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 05:32:25 np0005593234 podman[310776]: 2026-01-23 10:32:25.756945092 +0000 UTC m=+0.142036909 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 05:32:26 np0005593234 podman[310843]: 2026-01-23 10:32:26.217375677 +0000 UTC m=+0.218769325 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, io.openshift.tags=Ceph keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.28.2, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.expose-services=)
Jan 23 05:32:26 np0005593234 podman[310843]: 2026-01-23 10:32:26.230847669 +0000 UTC m=+0.232241297 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, build-date=2023-02-22T09:23:20, release=1793, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., version=2.2.4, architecture=x86_64, io.openshift.tags=Ceph keepalived, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, io.buildah.version=1.28.2)
Jan 23 05:32:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:26.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:27.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:27 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:27 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:27 np0005593234 nova_compute[227762]: 2026-01-23 10:32:27.691 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:28 np0005593234 ceph-mgr[77448]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 05:32:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:28.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:28 np0005593234 nova_compute[227762]: 2026-01-23 10:32:28.627 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #154. Immutable memtables: 0.
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.773217) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 97] Flushing memtable with next log file: 154
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164348773327, "job": 97, "event": "flush_started", "num_memtables": 1, "num_entries": 1104, "num_deletes": 253, "total_data_size": 2232835, "memory_usage": 2268504, "flush_reason": "Manual Compaction"}
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 97] Level-0 flush table #155: started
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164348836562, "cf_name": "default", "job": 97, "event": "table_file_creation", "file_number": 155, "file_size": 1052623, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 74526, "largest_seqno": 75625, "table_properties": {"data_size": 1048141, "index_size": 2005, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 11611, "raw_average_key_size": 21, "raw_value_size": 1038658, "raw_average_value_size": 1930, "num_data_blocks": 86, "num_entries": 538, "num_filter_entries": 538, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164283, "oldest_key_time": 1769164283, "file_creation_time": 1769164348, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 155, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 97] Flush lasted 63457 microseconds, and 3680 cpu microseconds.
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.836697) [db/flush_job.cc:967] [default] [JOB 97] Level-0 flush table #155: 1052623 bytes OK
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.836715) [db/memtable_list.cc:519] [default] Level-0 commit table #155 started
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.838812) [db/memtable_list.cc:722] [default] Level-0 commit table #155: memtable #1 done
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.838828) EVENT_LOG_v1 {"time_micros": 1769164348838823, "job": 97, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.838848) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 97] Try to delete WAL files size 2227314, prev total WAL file size 2227578, number of live WAL files 2.
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000151.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.839796) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032353038' seq:72057594037927935, type:22 .. '6D6772737461740032373539' seq:0, type:0; will stop at (end)
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 98] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 97 Base level 0, inputs: [155(1027KB)], [153(13MB)]
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164348839916, "job": 98, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [155], "files_L6": [153], "score": -1, "input_data_size": 14811318, "oldest_snapshot_seqno": -1}
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 98] Generated table #156: 9413 keys, 11351106 bytes, temperature: kUnknown
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164348966753, "cf_name": "default", "job": 98, "event": "table_file_creation", "file_number": 156, "file_size": 11351106, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11292312, "index_size": 34169, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23557, "raw_key_size": 248217, "raw_average_key_size": 26, "raw_value_size": 11129179, "raw_average_value_size": 1182, "num_data_blocks": 1305, "num_entries": 9413, "num_filter_entries": 9413, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769164348, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 156, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.967241) [db/compaction/compaction_job.cc:1663] [default] [JOB 98] Compacted 1@0 + 1@6 files to L6 => 11351106 bytes
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.968790) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 116.7 rd, 89.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.0, 13.1 +0.0 blob) out(10.8 +0.0 blob), read-write-amplify(24.9) write-amplify(10.8) OK, records in: 9913, records dropped: 500 output_compression: NoCompression
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.968808) EVENT_LOG_v1 {"time_micros": 1769164348968799, "job": 98, "event": "compaction_finished", "compaction_time_micros": 126902, "compaction_time_cpu_micros": 27116, "output_level": 6, "num_output_files": 1, "total_output_size": 11351106, "num_input_records": 9913, "num_output_records": 9413, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000155.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164348969236, "job": 98, "event": "table_file_deletion", "file_number": 155}
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000153.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164348971816, "job": 98, "event": "table_file_deletion", "file_number": 153}
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.839665) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.971881) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.971903) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.971905) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.971907) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:28 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:28.971909) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:32:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 5400.5 total, 600.0 interval#012Cumulative writes: 64K writes, 255K keys, 64K commit groups, 1.0 writes per commit group, ingest: 0.26 GB, 0.05 MB/s#012Cumulative WAL: 64K writes, 23K syncs, 2.70 writes per sync, written: 0.26 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 7508 writes, 29K keys, 7508 commit groups, 1.0 writes per commit group, ingest: 29.86 MB, 0.05 MB/s#012Interval WAL: 7508 writes, 2986 syncs, 2.51 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:32:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:29.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:30.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:31.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:32.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:32 np0005593234 nova_compute[227762]: 2026-01-23 10:32:32.693 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:33.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:32:33.614 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=67, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=66) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:32:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:32:33.615 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:32:33 np0005593234 nova_compute[227762]: 2026-01-23 10:32:33.616 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:33 np0005593234 nova_compute[227762]: 2026-01-23 10:32:33.628 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:34.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:35.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:36 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:36 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:32:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:36.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:37.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:37 np0005593234 nova_compute[227762]: 2026-01-23 10:32:37.694 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:38.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:38 np0005593234 nova_compute[227762]: 2026-01-23 10:32:38.678 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:39.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:40.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:41.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:42.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:32:42.617 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '67'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:32:42 np0005593234 nova_compute[227762]: 2026-01-23 10:32:42.697 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:32:42.868 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:32:42.869 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:32:42.869 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:43.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:43 np0005593234 nova_compute[227762]: 2026-01-23 10:32:43.727 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:44.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:44 np0005593234 podman[311234]: 2026-01-23 10:32:44.765261917 +0000 UTC m=+0.057842078 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:32:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:45.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:46.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:47.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:47 np0005593234 nova_compute[227762]: 2026-01-23 10:32:47.700 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:48.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:48 np0005593234 nova_compute[227762]: 2026-01-23 10:32:48.730 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:49.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:50.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:32:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:51.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:32:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:32:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:52.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:32:52 np0005593234 nova_compute[227762]: 2026-01-23 10:32:52.702 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:53.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:53 np0005593234 nova_compute[227762]: 2026-01-23 10:32:53.732 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:53 np0005593234 podman[311257]: 2026-01-23 10:32:53.811034212 +0000 UTC m=+0.102515044 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:32:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:54.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #157. Immutable memtables: 0.
Jan 23 05:32:54 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:54.997015) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:32:54 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 99] Flushing memtable with next log file: 157
Jan 23 05:32:54 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164374997168, "job": 99, "event": "flush_started", "num_memtables": 1, "num_entries": 513, "num_deletes": 251, "total_data_size": 776174, "memory_usage": 786920, "flush_reason": "Manual Compaction"}
Jan 23 05:32:54 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 99] Level-0 flush table #158: started
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164375004532, "cf_name": "default", "job": 99, "event": "table_file_creation", "file_number": 158, "file_size": 512617, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 75631, "largest_seqno": 76138, "table_properties": {"data_size": 509827, "index_size": 825, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 901, "raw_key_size": 6568, "raw_average_key_size": 19, "raw_value_size": 504300, "raw_average_value_size": 1461, "num_data_blocks": 36, "num_entries": 345, "num_filter_entries": 345, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164348, "oldest_key_time": 1769164348, "file_creation_time": 1769164374, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 158, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 99] Flush lasted 7584 microseconds, and 4228 cpu microseconds.
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.004615) [db/flush_job.cc:967] [default] [JOB 99] Level-0 flush table #158: 512617 bytes OK
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.004642) [db/memtable_list.cc:519] [default] Level-0 commit table #158 started
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.006546) [db/memtable_list.cc:722] [default] Level-0 commit table #158: memtable #1 done
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.006602) EVENT_LOG_v1 {"time_micros": 1769164375006596, "job": 99, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.006625) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 99] Try to delete WAL files size 773136, prev total WAL file size 773136, number of live WAL files 2.
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000154.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.007117) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036353236' seq:72057594037927935, type:22 .. '7061786F730036373738' seq:0, type:0; will stop at (end)
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 100] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 99 Base level 0, inputs: [158(500KB)], [156(10MB)]
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164375007210, "job": 100, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [158], "files_L6": [156], "score": -1, "input_data_size": 11863723, "oldest_snapshot_seqno": -1}
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.010 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.010 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 100] Generated table #159: 9244 keys, 9912157 bytes, temperature: kUnknown
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164375077263, "cf_name": "default", "job": 100, "event": "table_file_creation", "file_number": 159, "file_size": 9912157, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 9855650, "index_size": 32266, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 23173, "raw_key_size": 245374, "raw_average_key_size": 26, "raw_value_size": 9696656, "raw_average_value_size": 1048, "num_data_blocks": 1218, "num_entries": 9244, "num_filter_entries": 9244, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769164375, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 159, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.077724) [db/compaction/compaction_job.cc:1663] [default] [JOB 100] Compacted 1@0 + 1@6 files to L6 => 9912157 bytes
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.079206) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 169.1 rd, 141.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 10.8 +0.0 blob) out(9.5 +0.0 blob), read-write-amplify(42.5) write-amplify(19.3) OK, records in: 9758, records dropped: 514 output_compression: NoCompression
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.079230) EVENT_LOG_v1 {"time_micros": 1769164375079218, "job": 100, "event": "compaction_finished", "compaction_time_micros": 70176, "compaction_time_cpu_micros": 24577, "output_level": 6, "num_output_files": 1, "total_output_size": 9912157, "num_input_records": 9758, "num_output_records": 9244, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000158.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164375079557, "job": 100, "event": "table_file_deletion", "file_number": 158}
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000156.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164375082445, "job": 100, "event": "table_file_deletion", "file_number": 156}
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.006996) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.082520) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.082526) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.082528) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.082530) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:55 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:32:55.082532) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.142 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "298e1080-6898-4a9b-903e-052965024e8a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.143 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.147 227766 DEBUG nova.compute.manager [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.187 227766 DEBUG nova.compute.manager [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.438 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.439 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.450 227766 DEBUG nova.virt.hardware [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.450 227766 INFO nova.compute.claims [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.481 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:55.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.802 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:55 np0005593234 nova_compute[227762]: 2026-01-23 10:32:55.908 227766 DEBUG oslo_concurrency.processutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:32:56 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2914338837' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:32:56 np0005593234 nova_compute[227762]: 2026-01-23 10:32:56.327 227766 DEBUG oslo_concurrency.processutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:56 np0005593234 nova_compute[227762]: 2026-01-23 10:32:56.333 227766 DEBUG nova.compute.provider_tree [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:32:56 np0005593234 nova_compute[227762]: 2026-01-23 10:32:56.372 227766 DEBUG nova.scheduler.client.report [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:32:56 np0005593234 nova_compute[227762]: 2026-01-23 10:32:56.403 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.964s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:56 np0005593234 nova_compute[227762]: 2026-01-23 10:32:56.405 227766 DEBUG nova.compute.manager [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:32:56 np0005593234 nova_compute[227762]: 2026-01-23 10:32:56.409 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.929s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:56 np0005593234 nova_compute[227762]: 2026-01-23 10:32:56.428 227766 DEBUG nova.virt.hardware [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:32:56 np0005593234 nova_compute[227762]: 2026-01-23 10:32:56.429 227766 INFO nova.compute.claims [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:32:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:32:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:56.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:32:57 np0005593234 nova_compute[227762]: 2026-01-23 10:32:57.536 227766 DEBUG nova.compute.manager [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:32:57 np0005593234 nova_compute[227762]: 2026-01-23 10:32:57.537 227766 DEBUG nova.network.neutron [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:32:57 np0005593234 nova_compute[227762]: 2026-01-23 10:32:57.589 227766 INFO nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:32:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:57.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:57 np0005593234 nova_compute[227762]: 2026-01-23 10:32:57.622 227766 DEBUG nova.compute.manager [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:32:57 np0005593234 nova_compute[227762]: 2026-01-23 10:32:57.704 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:57 np0005593234 nova_compute[227762]: 2026-01-23 10:32:57.707 227766 DEBUG oslo_concurrency.processutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:32:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2201993384' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.126 227766 DEBUG oslo_concurrency.processutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.132 227766 DEBUG nova.compute.provider_tree [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.156 227766 DEBUG nova.scheduler.client.report [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.169 227766 DEBUG nova.compute.manager [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:32:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.171 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.171 227766 INFO nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Creating image(s)#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.197 227766 DEBUG nova.storage.rbd_utils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 1114ae68-dab9-46b3-abab-53f135df78d8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.225 227766 DEBUG nova.storage.rbd_utils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 1114ae68-dab9-46b3-abab-53f135df78d8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.251 227766 DEBUG nova.storage.rbd_utils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 1114ae68-dab9-46b3-abab-53f135df78d8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.255 227766 DEBUG oslo_concurrency.processutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.278 227766 DEBUG nova.policy [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'e1629a4b14764dddaabcadd16f3e1c1c', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.318 227766 DEBUG oslo_concurrency.processutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.318 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.319 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.319 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.344 227766 DEBUG nova.storage.rbd_utils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 1114ae68-dab9-46b3-abab-53f135df78d8_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.348 227766 DEBUG oslo_concurrency.processutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 1114ae68-dab9-46b3-abab-53f135df78d8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.477 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.068s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.479 227766 DEBUG nova.compute.manager [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.482 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 2.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.482 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.483 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.483 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:58.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.661 227766 DEBUG oslo_concurrency.processutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 1114ae68-dab9-46b3-abab-53f135df78d8_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.313s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.734 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.736 227766 DEBUG nova.compute.manager [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.737 227766 DEBUG nova.network.neutron [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.744 227766 DEBUG nova.storage.rbd_utils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] resizing rbd image 1114ae68-dab9-46b3-abab-53f135df78d8_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.863 227766 DEBUG nova.objects.instance [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'migration_context' on Instance uuid 1114ae68-dab9-46b3-abab-53f135df78d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:32:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:32:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1011652076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.960 227766 INFO nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.963 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.965 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.966 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Ensure instance console log exists: /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.966 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.966 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:58 np0005593234 nova_compute[227762]: 2026-01-23 10:32:58.967 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.024 227766 DEBUG nova.compute.manager [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.090 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.091 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.206 227766 DEBUG nova.policy [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60291ce86b6946629a2e48f6680312cb', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.209 227766 DEBUG nova.compute.manager [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.211 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.211 227766 INFO nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Creating image(s)#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.231 227766 DEBUG nova.storage.rbd_utils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 298e1080-6898-4a9b-903e-052965024e8a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.253 227766 DEBUG nova.storage.rbd_utils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 298e1080-6898-4a9b-903e-052965024e8a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.274 227766 DEBUG nova.storage.rbd_utils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 298e1080-6898-4a9b-903e-052965024e8a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.278 227766 DEBUG oslo_concurrency.processutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.343 227766 DEBUG oslo_concurrency.processutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.345 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.346 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.347 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.369 227766 DEBUG nova.storage.rbd_utils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 298e1080-6898-4a9b-903e-052965024e8a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.373 227766 DEBUG oslo_concurrency.processutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 298e1080-6898-4a9b-903e-052965024e8a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.413 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.415 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3987MB free_disk=20.89715576171875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.415 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.415 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.613 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 64ccc062-b11b-4cbc-96ba-620e43dfdb20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.613 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 1114ae68-dab9-46b3-abab-53f135df78d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.613 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 298e1080-6898-4a9b-903e-052965024e8a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.614 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.614 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:32:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:32:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:32:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:59.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.664 227766 DEBUG oslo_concurrency.processutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 298e1080-6898-4a9b-903e-052965024e8a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.291s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.745 227766 DEBUG nova.storage.rbd_utils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] resizing rbd image 298e1080-6898-4a9b-903e-052965024e8a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.864 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.906 227766 DEBUG nova.objects.instance [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'migration_context' on Instance uuid 298e1080-6898-4a9b-903e-052965024e8a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.934 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.934 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Ensure instance console log exists: /var/lib/nova/instances/298e1080-6898-4a9b-903e-052965024e8a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.935 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.935 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:32:59 np0005593234 nova_compute[227762]: 2026-01-23 10:32:59.935 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:33:00 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3110654776' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:33:00 np0005593234 nova_compute[227762]: 2026-01-23 10:33:00.319 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:00 np0005593234 nova_compute[227762]: 2026-01-23 10:33:00.321 227766 DEBUG nova.network.neutron [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Successfully created port: 1827509f-e3b0-49ea-b1ff-982db21148b8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:33:00 np0005593234 nova_compute[227762]: 2026-01-23 10:33:00.329 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:33:00 np0005593234 nova_compute[227762]: 2026-01-23 10:33:00.361 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:33:00 np0005593234 nova_compute[227762]: 2026-01-23 10:33:00.452 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:33:00 np0005593234 nova_compute[227762]: 2026-01-23 10:33:00.452 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:00.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:01 np0005593234 nova_compute[227762]: 2026-01-23 10:33:01.549 227766 DEBUG nova.network.neutron [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Successfully created port: b6ddc2d2-277d-4859-8c63-6920fe72886a _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:33:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:01.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:02 np0005593234 nova_compute[227762]: 2026-01-23 10:33:02.453 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:02 np0005593234 nova_compute[227762]: 2026-01-23 10:33:02.454 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:33:02 np0005593234 nova_compute[227762]: 2026-01-23 10:33:02.454 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:33:02 np0005593234 nova_compute[227762]: 2026-01-23 10:33:02.478 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:33:02 np0005593234 nova_compute[227762]: 2026-01-23 10:33:02.478 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:33:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:02.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:02 np0005593234 nova_compute[227762]: 2026-01-23 10:33:02.706 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:02 np0005593234 nova_compute[227762]: 2026-01-23 10:33:02.775 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:33:02 np0005593234 nova_compute[227762]: 2026-01-23 10:33:02.776 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:33:02 np0005593234 nova_compute[227762]: 2026-01-23 10:33:02.776 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:33:02 np0005593234 nova_compute[227762]: 2026-01-23 10:33:02.776 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:02 np0005593234 nova_compute[227762]: 2026-01-23 10:33:02.997 227766 DEBUG nova.network.neutron [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Successfully updated port: 1827509f-e3b0-49ea-b1ff-982db21148b8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:33:03 np0005593234 nova_compute[227762]: 2026-01-23 10:33:03.030 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:33:03 np0005593234 nova_compute[227762]: 2026-01-23 10:33:03.031 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquired lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:33:03 np0005593234 nova_compute[227762]: 2026-01-23 10:33:03.031 227766 DEBUG nova.network.neutron [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:33:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:03 np0005593234 nova_compute[227762]: 2026-01-23 10:33:03.345 227766 DEBUG nova.compute.manager [req-29b3b731-a714-42a5-a5ce-bbf374822a03 req-47a5b949-158e-4be0-8810-130eb67dff8f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-changed-1827509f-e3b0-49ea-b1ff-982db21148b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:03 np0005593234 nova_compute[227762]: 2026-01-23 10:33:03.345 227766 DEBUG nova.compute.manager [req-29b3b731-a714-42a5-a5ce-bbf374822a03 req-47a5b949-158e-4be0-8810-130eb67dff8f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Refreshing instance network info cache due to event network-changed-1827509f-e3b0-49ea-b1ff-982db21148b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:33:03 np0005593234 nova_compute[227762]: 2026-01-23 10:33:03.346 227766 DEBUG oslo_concurrency.lockutils [req-29b3b731-a714-42a5-a5ce-bbf374822a03 req-47a5b949-158e-4be0-8810-130eb67dff8f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:33:03 np0005593234 nova_compute[227762]: 2026-01-23 10:33:03.415 227766 DEBUG nova.network.neutron [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:33:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:33:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:03.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:33:03 np0005593234 nova_compute[227762]: 2026-01-23 10:33:03.736 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:04 np0005593234 nova_compute[227762]: 2026-01-23 10:33:04.411 227766 DEBUG nova.network.neutron [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Successfully updated port: b6ddc2d2-277d-4859-8c63-6920fe72886a _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:33:04 np0005593234 nova_compute[227762]: 2026-01-23 10:33:04.443 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "refresh_cache-298e1080-6898-4a9b-903e-052965024e8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:33:04 np0005593234 nova_compute[227762]: 2026-01-23 10:33:04.444 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquired lock "refresh_cache-298e1080-6898-4a9b-903e-052965024e8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:33:04 np0005593234 nova_compute[227762]: 2026-01-23 10:33:04.444 227766 DEBUG nova.network.neutron [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:33:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:04.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:04 np0005593234 nova_compute[227762]: 2026-01-23 10:33:04.891 227766 DEBUG nova.network.neutron [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:33:05 np0005593234 nova_compute[227762]: 2026-01-23 10:33:05.590 227766 DEBUG nova.compute.manager [req-d98a0300-3de5-49ba-979f-c46f9249b5b9 req-fbb15b0a-5d4f-49b3-8290-10b4dbf775d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Received event network-changed-b6ddc2d2-277d-4859-8c63-6920fe72886a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:05 np0005593234 nova_compute[227762]: 2026-01-23 10:33:05.590 227766 DEBUG nova.compute.manager [req-d98a0300-3de5-49ba-979f-c46f9249b5b9 req-fbb15b0a-5d4f-49b3-8290-10b4dbf775d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Refreshing instance network info cache due to event network-changed-b6ddc2d2-277d-4859-8c63-6920fe72886a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:33:05 np0005593234 nova_compute[227762]: 2026-01-23 10:33:05.591 227766 DEBUG oslo_concurrency.lockutils [req-d98a0300-3de5-49ba-979f-c46f9249b5b9 req-fbb15b0a-5d4f-49b3-8290-10b4dbf775d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-298e1080-6898-4a9b-903e-052965024e8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:33:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:05.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.268 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Updating instance_info_cache with network_info: [{"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.304 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.304 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.305 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.305 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.305 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.305 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.306 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.346 227766 DEBUG nova.network.neutron [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Updating instance_info_cache with network_info: [{"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.366 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Releasing lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.366 227766 DEBUG nova.compute.manager [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Instance network_info: |[{"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.367 227766 DEBUG oslo_concurrency.lockutils [req-29b3b731-a714-42a5-a5ce-bbf374822a03 req-47a5b949-158e-4be0-8810-130eb67dff8f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.367 227766 DEBUG nova.network.neutron [req-29b3b731-a714-42a5-a5ce-bbf374822a03 req-47a5b949-158e-4be0-8810-130eb67dff8f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Refreshing network info cache for port 1827509f-e3b0-49ea-b1ff-982db21148b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.369 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Start _get_guest_xml network_info=[{"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.373 227766 WARNING nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.377 227766 DEBUG nova.virt.libvirt.host [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.378 227766 DEBUG nova.virt.libvirt.host [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.380 227766 DEBUG nova.virt.libvirt.host [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.381 227766 DEBUG nova.virt.libvirt.host [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.382 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.382 227766 DEBUG nova.virt.hardware [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.382 227766 DEBUG nova.virt.hardware [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.383 227766 DEBUG nova.virt.hardware [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.383 227766 DEBUG nova.virt.hardware [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.383 227766 DEBUG nova.virt.hardware [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.383 227766 DEBUG nova.virt.hardware [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.384 227766 DEBUG nova.virt.hardware [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.384 227766 DEBUG nova.virt.hardware [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.384 227766 DEBUG nova.virt.hardware [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.384 227766 DEBUG nova.virt.hardware [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.384 227766 DEBUG nova.virt.hardware [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.387 227766 DEBUG oslo_concurrency.processutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:06.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:33:06 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/793951531' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.803 227766 DEBUG oslo_concurrency.processutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.828 227766 DEBUG nova.storage.rbd_utils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 1114ae68-dab9-46b3-abab-53f135df78d8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:33:06 np0005593234 nova_compute[227762]: 2026-01-23 10:33:06.831 227766 DEBUG oslo_concurrency.processutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:33:07 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3548047752' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.269 227766 DEBUG oslo_concurrency.processutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.271 227766 DEBUG nova.virt.libvirt.vif [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:32:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-182552386',display_name='tempest-ServerStableDeviceRescueTest-server-182552386',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-182552386',id=181,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='815b71acf60d4ed8933ebd05228fa0c0',ramdisk_id='',reservation_id='r-491now07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1802220041',owner_user_name='t
empest-ServerStableDeviceRescueTest-1802220041-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:32:57Z,user_data=None,user_id='e1629a4b14764dddaabcadd16f3e1c1c',uuid=1114ae68-dab9-46b3-abab-53f135df78d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.271 227766 DEBUG nova.network.os_vif_util [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converting VIF {"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.272 227766 DEBUG nova.network.os_vif_util [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:16:71,bridge_name='br-int',has_traffic_filtering=True,id=1827509f-e3b0-49ea-b1ff-982db21148b8,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1827509f-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.273 227766 DEBUG nova.objects.instance [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1114ae68-dab9-46b3-abab-53f135df78d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.300 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  <uuid>1114ae68-dab9-46b3-abab-53f135df78d8</uuid>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  <name>instance-000000b5</name>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-182552386</nova:name>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:33:06</nova:creationTime>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <nova:user uuid="e1629a4b14764dddaabcadd16f3e1c1c">tempest-ServerStableDeviceRescueTest-1802220041-project-member</nova:user>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <nova:project uuid="815b71acf60d4ed8933ebd05228fa0c0">tempest-ServerStableDeviceRescueTest-1802220041</nova:project>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <nova:port uuid="1827509f-e3b0-49ea-b1ff-982db21148b8">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <entry name="serial">1114ae68-dab9-46b3-abab-53f135df78d8</entry>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <entry name="uuid">1114ae68-dab9-46b3-abab-53f135df78d8</entry>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1114ae68-dab9-46b3-abab-53f135df78d8_disk">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1114ae68-dab9-46b3-abab-53f135df78d8_disk.config">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:02:16:71"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <target dev="tap1827509f-e3"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/console.log" append="off"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:33:07 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:33:07 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:33:07 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:33:07 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.302 227766 DEBUG nova.compute.manager [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Preparing to wait for external event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.302 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.303 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.303 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.303 227766 DEBUG nova.virt.libvirt.vif [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:32:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-182552386',display_name='tempest-ServerStableDeviceRescueTest-server-182552386',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-182552386',id=181,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='815b71acf60d4ed8933ebd05228fa0c0',ramdisk_id='',reservation_id='r-491now07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-ServerStableDeviceRescueTest-1802220041',owner_us
er_name='tempest-ServerStableDeviceRescueTest-1802220041-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:32:57Z,user_data=None,user_id='e1629a4b14764dddaabcadd16f3e1c1c',uuid=1114ae68-dab9-46b3-abab-53f135df78d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.304 227766 DEBUG nova.network.os_vif_util [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converting VIF {"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.304 227766 DEBUG nova.network.os_vif_util [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:16:71,bridge_name='br-int',has_traffic_filtering=True,id=1827509f-e3b0-49ea-b1ff-982db21148b8,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1827509f-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.305 227766 DEBUG os_vif [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:16:71,bridge_name='br-int',has_traffic_filtering=True,id=1827509f-e3b0-49ea-b1ff-982db21148b8,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1827509f-e3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.305 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.306 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.306 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.309 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.309 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1827509f-e3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.310 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1827509f-e3, col_values=(('external_ids', {'iface-id': '1827509f-e3b0-49ea-b1ff-982db21148b8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:16:71', 'vm-uuid': '1114ae68-dab9-46b3-abab-53f135df78d8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.311 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:07 np0005593234 NetworkManager[48942]: <info>  [1769164387.3122] manager: (tap1827509f-e3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/367)
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.314 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.322 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.323 227766 INFO os_vif [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:16:71,bridge_name='br-int',has_traffic_filtering=True,id=1827509f-e3b0-49ea-b1ff-982db21148b8,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1827509f-e3')#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.428 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.428 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.428 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No VIF found with MAC fa:16:3e:02:16:71, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.429 227766 INFO nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Using config drive#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.461 227766 DEBUG nova.storage.rbd_utils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 1114ae68-dab9-46b3-abab-53f135df78d8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.559 227766 DEBUG nova.network.neutron [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Updating instance_info_cache with network_info: [{"id": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "address": "fa:16:3e:c2:a4:da", "network": {"id": "bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998", "bridge": "br-int", "label": "tempest-network-smoke--156740505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6ddc2d2-27", "ovs_interfaceid": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.587 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Releasing lock "refresh_cache-298e1080-6898-4a9b-903e-052965024e8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.587 227766 DEBUG nova.compute.manager [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Instance network_info: |[{"id": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "address": "fa:16:3e:c2:a4:da", "network": {"id": "bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998", "bridge": "br-int", "label": "tempest-network-smoke--156740505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6ddc2d2-27", "ovs_interfaceid": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.587 227766 DEBUG oslo_concurrency.lockutils [req-d98a0300-3de5-49ba-979f-c46f9249b5b9 req-fbb15b0a-5d4f-49b3-8290-10b4dbf775d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-298e1080-6898-4a9b-903e-052965024e8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.588 227766 DEBUG nova.network.neutron [req-d98a0300-3de5-49ba-979f-c46f9249b5b9 req-fbb15b0a-5d4f-49b3-8290-10b4dbf775d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Refreshing network info cache for port b6ddc2d2-277d-4859-8c63-6920fe72886a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.590 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Start _get_guest_xml network_info=[{"id": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "address": "fa:16:3e:c2:a4:da", "network": {"id": "bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998", "bridge": "br-int", "label": "tempest-network-smoke--156740505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6ddc2d2-27", "ovs_interfaceid": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.594 227766 WARNING nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.598 227766 DEBUG nova.virt.libvirt.host [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.598 227766 DEBUG nova.virt.libvirt.host [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.601 227766 DEBUG nova.virt.libvirt.host [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.602 227766 DEBUG nova.virt.libvirt.host [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.603 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.603 227766 DEBUG nova.virt.hardware [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.603 227766 DEBUG nova.virt.hardware [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.604 227766 DEBUG nova.virt.hardware [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.604 227766 DEBUG nova.virt.hardware [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.604 227766 DEBUG nova.virt.hardware [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.604 227766 DEBUG nova.virt.hardware [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.604 227766 DEBUG nova.virt.hardware [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.605 227766 DEBUG nova.virt.hardware [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.605 227766 DEBUG nova.virt.hardware [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.605 227766 DEBUG nova.virt.hardware [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.605 227766 DEBUG nova.virt.hardware [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:33:07 np0005593234 nova_compute[227762]: 2026-01-23 10:33:07.607 227766 DEBUG oslo_concurrency.processutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:07.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:33:08 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1371770727' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.026 227766 DEBUG oslo_concurrency.processutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.049 227766 DEBUG nova.storage.rbd_utils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 298e1080-6898-4a9b-903e-052965024e8a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.053 227766 DEBUG oslo_concurrency.processutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.093 227766 INFO nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Creating config drive at /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/disk.config#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.097 227766 DEBUG oslo_concurrency.processutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprfaqp_o8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.228 227766 DEBUG oslo_concurrency.processutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmprfaqp_o8" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.255 227766 DEBUG nova.storage.rbd_utils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 1114ae68-dab9-46b3-abab-53f135df78d8_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.258 227766 DEBUG oslo_concurrency.processutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/disk.config 1114ae68-dab9-46b3-abab-53f135df78d8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.400 227766 DEBUG oslo_concurrency.processutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/disk.config 1114ae68-dab9-46b3-abab-53f135df78d8_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.401 227766 INFO nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Deleting local config drive /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/disk.config because it was imported into RBD.#033[00m
Jan 23 05:33:08 np0005593234 kernel: tap1827509f-e3: entered promiscuous mode
Jan 23 05:33:08 np0005593234 NetworkManager[48942]: <info>  [1769164388.4541] manager: (tap1827509f-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/368)
Jan 23 05:33:08 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:08Z|00768|binding|INFO|Claiming lport 1827509f-e3b0-49ea-b1ff-982db21148b8 for this chassis.
Jan 23 05:33:08 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:08Z|00769|binding|INFO|1827509f-e3b0-49ea-b1ff-982db21148b8: Claiming fa:16:3e:02:16:71 10.100.0.14
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.460 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:08.470 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:16:71 10.100.0.14'], port_security=['fa:16:3e:02:16:71 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1114ae68-dab9-46b3-abab-53f135df78d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=1827509f-e3b0-49ea-b1ff-982db21148b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:33:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:08.472 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 1827509f-e3b0-49ea-b1ff-982db21148b8 in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 bound to our chassis#033[00m
Jan 23 05:33:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:08.473 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7d5530f-5227-4f75-bac0-2604bb3d68e2#033[00m
Jan 23 05:33:08 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:08Z|00770|binding|INFO|Setting lport 1827509f-e3b0-49ea-b1ff-982db21148b8 ovn-installed in OVS
Jan 23 05:33:08 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:08Z|00771|binding|INFO|Setting lport 1827509f-e3b0-49ea-b1ff-982db21148b8 up in Southbound
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.483 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:33:08 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3983975178' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:33:08 np0005593234 systemd-machined[195626]: New machine qemu-86-instance-000000b5.
Jan 23 05:33:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:08.494 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8bf031c4-66aa-41a6-b84b-c8c1b4df0ac3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:08 np0005593234 systemd[1]: Started Virtual Machine qemu-86-instance-000000b5.
Jan 23 05:33:08 np0005593234 systemd-udevd[311960]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.521 227766 DEBUG oslo_concurrency.processutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.522 227766 DEBUG nova.virt.libvirt.vif [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:32:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1855261785',display_name='tempest-TestNetworkBasicOps-server-1855261785',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1855261785',id=182,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCxvuKoRCdU6NfGCJ9/K6lYehfBrYexe6JocWr8Q1ZD1CqGS7uFQ9Epr7CcEOwAFDI68GcaE9FQsMJxvu2ytNjTEF1iupLEVG5hBjQgiqOY7KJssQlPmRhqcGHft2KD/pQ==',key_name='tempest-TestNetworkBasicOps-843741360',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-0kakalu9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:32:59Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=298e1080-6898-4a9b-903e-052965024e8a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "address": "fa:16:3e:c2:a4:da", "network": {"id": "bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998", "bridge": "br-int", "label": "tempest-network-smoke--156740505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6ddc2d2-27", "ovs_interfaceid": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.523 227766 DEBUG nova.network.os_vif_util [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "address": "fa:16:3e:c2:a4:da", "network": {"id": "bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998", "bridge": "br-int", "label": "tempest-network-smoke--156740505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6ddc2d2-27", "ovs_interfaceid": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.523 227766 DEBUG nova.network.os_vif_util [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:a4:da,bridge_name='br-int',has_traffic_filtering=True,id=b6ddc2d2-277d-4859-8c63-6920fe72886a,network=Network(bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6ddc2d2-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.524 227766 DEBUG nova.objects.instance [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'pci_devices' on Instance uuid 298e1080-6898-4a9b-903e-052965024e8a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:08 np0005593234 NetworkManager[48942]: <info>  [1769164388.5319] device (tap1827509f-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:33:08 np0005593234 NetworkManager[48942]: <info>  [1769164388.5328] device (tap1827509f-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:33:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:08.542 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b14107c2-5878-4027-b6a1-7bd4e0d32c8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:08.546 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[79586f29-6d49-4c59-bad7-cebf2be18369]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.555 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  <uuid>298e1080-6898-4a9b-903e-052965024e8a</uuid>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  <name>instance-000000b6</name>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestNetworkBasicOps-server-1855261785</nova:name>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:33:07</nova:creationTime>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <nova:user uuid="60291ce86b6946629a2e48f6680312cb">tempest-TestNetworkBasicOps-789276745-project-member</nova:user>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <nova:project uuid="98c94577fcdb4c3d893898ede79ea2d4">tempest-TestNetworkBasicOps-789276745</nova:project>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <nova:port uuid="b6ddc2d2-277d-4859-8c63-6920fe72886a">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <entry name="serial">298e1080-6898-4a9b-903e-052965024e8a</entry>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <entry name="uuid">298e1080-6898-4a9b-903e-052965024e8a</entry>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/298e1080-6898-4a9b-903e-052965024e8a_disk">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/298e1080-6898-4a9b-903e-052965024e8a_disk.config">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:c2:a4:da"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <target dev="tapb6ddc2d2-27"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/298e1080-6898-4a9b-903e-052965024e8a/console.log" append="off"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:33:08 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:33:08 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:33:08 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:33:08 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.557 227766 DEBUG nova.compute.manager [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Preparing to wait for external event network-vif-plugged-b6ddc2d2-277d-4859-8c63-6920fe72886a prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.558 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "298e1080-6898-4a9b-903e-052965024e8a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.558 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.558 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.559 227766 DEBUG nova.virt.libvirt.vif [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:32:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1855261785',display_name='tempest-TestNetworkBasicOps-server-1855261785',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1855261785',id=182,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCxvuKoRCdU6NfGCJ9/K6lYehfBrYexe6JocWr8Q1ZD1CqGS7uFQ9Epr7CcEOwAFDI68GcaE9FQsMJxvu2ytNjTEF1iupLEVG5hBjQgiqOY7KJssQlPmRhqcGHft2KD/pQ==',key_name='tempest-TestNetworkBasicOps-843741360',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-0kakalu9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:32:59Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=298e1080-6898-4a9b-903e-052965024e8a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "address": "fa:16:3e:c2:a4:da", "network": {"id": "bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998", "bridge": "br-int", "label": "tempest-network-smoke--156740505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6ddc2d2-27", "ovs_interfaceid": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.560 227766 DEBUG nova.network.os_vif_util [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "address": "fa:16:3e:c2:a4:da", "network": {"id": "bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998", "bridge": "br-int", "label": "tempest-network-smoke--156740505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6ddc2d2-27", "ovs_interfaceid": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.560 227766 DEBUG nova.network.os_vif_util [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c2:a4:da,bridge_name='br-int',has_traffic_filtering=True,id=b6ddc2d2-277d-4859-8c63-6920fe72886a,network=Network(bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6ddc2d2-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.561 227766 DEBUG os_vif [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:a4:da,bridge_name='br-int',has_traffic_filtering=True,id=b6ddc2d2-277d-4859-8c63-6920fe72886a,network=Network(bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6ddc2d2-27') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.561 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.562 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.562 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.565 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.565 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb6ddc2d2-27, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.566 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb6ddc2d2-27, col_values=(('external_ids', {'iface-id': 'b6ddc2d2-277d-4859-8c63-6920fe72886a', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c2:a4:da', 'vm-uuid': '298e1080-6898-4a9b-903e-052965024e8a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.568 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:08 np0005593234 NetworkManager[48942]: <info>  [1769164388.5688] manager: (tapb6ddc2d2-27): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/369)
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.571 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.575 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:08.575 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7e383d63-2b31-4161-a2fb-7aac0458fa40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.577 227766 INFO os_vif [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c2:a4:da,bridge_name='br-int',has_traffic_filtering=True,id=b6ddc2d2-277d-4859-8c63-6920fe72886a,network=Network(bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6ddc2d2-27')#033[00m
Jan 23 05:33:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:08.595 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9f25a58c-f7ff-425e-a296-99e1e83b0cab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815735, 'reachable_time': 37082, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 311972, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:08.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:08.613 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[668f76b7-ddaf-4a24-a5f6-afc2980117db]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd7d5530f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815745, 'tstamp': 815745}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 311974, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd7d5530f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815749, 'tstamp': 815749}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 311974, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:08.616 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.617 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.622 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:08.622 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d5530f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:08.622 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:33:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:08.623 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7d5530f-50, col_values=(('external_ids', {'iface-id': '4c99eeb5-c437-4d31-ac3b-bfd151140733'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:08.623 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.646 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.646 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.646 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] No VIF found with MAC fa:16:3e:c2:a4:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.647 227766 INFO nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Using config drive#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.671 227766 DEBUG nova.storage.rbd_utils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 298e1080-6898-4a9b-903e-052965024e8a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:33:08 np0005593234 nova_compute[227762]: 2026-01-23 10:33:08.738 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.189 227766 DEBUG nova.network.neutron [req-29b3b731-a714-42a5-a5ce-bbf374822a03 req-47a5b949-158e-4be0-8810-130eb67dff8f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Updated VIF entry in instance network info cache for port 1827509f-e3b0-49ea-b1ff-982db21148b8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.189 227766 DEBUG nova.network.neutron [req-29b3b731-a714-42a5-a5ce-bbf374822a03 req-47a5b949-158e-4be0-8810-130eb67dff8f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Updating instance_info_cache with network_info: [{"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.219 227766 DEBUG oslo_concurrency.lockutils [req-29b3b731-a714-42a5-a5ce-bbf374822a03 req-47a5b949-158e-4be0-8810-130eb67dff8f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.222 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164389.221704, 1114ae68-dab9-46b3-abab-53f135df78d8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.223 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] VM Started (Lifecycle Event)#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.263 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.267 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164389.2218027, 1114ae68-dab9-46b3-abab-53f135df78d8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.268 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.298 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.301 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.328 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.546 227766 INFO nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Creating config drive at /var/lib/nova/instances/298e1080-6898-4a9b-903e-052965024e8a/disk.config#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.552 227766 DEBUG oslo_concurrency.processutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/298e1080-6898-4a9b-903e-052965024e8a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoasv6any execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:09.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.691 227766 DEBUG oslo_concurrency.processutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/298e1080-6898-4a9b-903e-052965024e8a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpoasv6any" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.718 227766 DEBUG nova.storage.rbd_utils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] rbd image 298e1080-6898-4a9b-903e-052965024e8a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.722 227766 DEBUG oslo_concurrency.processutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/298e1080-6898-4a9b-903e-052965024e8a/disk.config 298e1080-6898-4a9b-903e-052965024e8a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.936 227766 DEBUG oslo_concurrency.processutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/298e1080-6898-4a9b-903e-052965024e8a/disk.config 298e1080-6898-4a9b-903e-052965024e8a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.214s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.938 227766 INFO nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Deleting local config drive /var/lib/nova/instances/298e1080-6898-4a9b-903e-052965024e8a/disk.config because it was imported into RBD.#033[00m
Jan 23 05:33:09 np0005593234 kernel: tapb6ddc2d2-27: entered promiscuous mode
Jan 23 05:33:09 np0005593234 NetworkManager[48942]: <info>  [1769164389.9874] manager: (tapb6ddc2d2-27): new Tun device (/org/freedesktop/NetworkManager/Devices/370)
Jan 23 05:33:09 np0005593234 systemd-udevd[311963]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:33:09 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:09Z|00772|binding|INFO|Claiming lport b6ddc2d2-277d-4859-8c63-6920fe72886a for this chassis.
Jan 23 05:33:09 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:09Z|00773|binding|INFO|b6ddc2d2-277d-4859-8c63-6920fe72886a: Claiming fa:16:3e:c2:a4:da 10.100.0.9
Jan 23 05:33:09 np0005593234 nova_compute[227762]: 2026-01-23 10:33:09.989 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:09 np0005593234 NetworkManager[48942]: <info>  [1769164389.9984] device (tapb6ddc2d2-27): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:33:09 np0005593234 NetworkManager[48942]: <info>  [1769164389.9996] device (tapb6ddc2d2-27): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:33:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:09.999 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:a4:da 10.100.0.9'], port_security=['fa:16:3e:c2:a4:da 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '298e1080-6898-4a9b-903e-052965024e8a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '2', 'neutron:security_group_ids': '59524fa5-1e38-4c78-bba4-1817dab86850', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bc6b799-b691-4f6b-8f97-c0d3cab241b4, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=b6ddc2d2-277d-4859-8c63-6920fe72886a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.000 144381 INFO neutron.agent.ovn.metadata.agent [-] Port b6ddc2d2-277d-4859-8c63-6920fe72886a in datapath bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998 bound to our chassis#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.001 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.011 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[09636c94-0d81-456b-af22-6700b8d48d41]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.012 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapbd8a7a46-f1 in ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.014 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapbd8a7a46-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.014 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[497346a5-9af4-4cc6-955b-b4ba4053a565]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.015 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b39f4312-697c-4350-b251-fbe46412aa72]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 systemd-machined[195626]: New machine qemu-87-instance-000000b6.
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.028 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[f6a98157-785f-4567-abb7-d940cff3c5b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 systemd[1]: Started Virtual Machine qemu-87-instance-000000b6.
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.052 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.052 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[605a98db-39fb-47ec-9637-e042851ce1e4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:10Z|00774|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:33:10 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:10Z|00775|binding|INFO|Setting lport b6ddc2d2-277d-4859-8c63-6920fe72886a ovn-installed in OVS
Jan 23 05:33:10 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:10Z|00776|binding|INFO|Setting lport b6ddc2d2-277d-4859-8c63-6920fe72886a up in Southbound
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.063 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.086 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[fa5d954d-766e-4573-a865-9e160eb03c82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.094 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[386a8af8-ee95-4982-8b4c-ffe5639d991f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 NetworkManager[48942]: <info>  [1769164390.0951] manager: (tapbd8a7a46-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/371)
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.131 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[20a436f5-a929-45c4-ae4a-29e6847ae05f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.134 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[c636f5ff-d6d9-4748-aa57-c0190223ae57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.157 227766 DEBUG nova.network.neutron [req-d98a0300-3de5-49ba-979f-c46f9249b5b9 req-fbb15b0a-5d4f-49b3-8290-10b4dbf775d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Updated VIF entry in instance network info cache for port b6ddc2d2-277d-4859-8c63-6920fe72886a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.157 227766 DEBUG nova.network.neutron [req-d98a0300-3de5-49ba-979f-c46f9249b5b9 req-fbb15b0a-5d4f-49b3-8290-10b4dbf775d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Updating instance_info_cache with network_info: [{"id": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "address": "fa:16:3e:c2:a4:da", "network": {"id": "bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998", "bridge": "br-int", "label": "tempest-network-smoke--156740505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6ddc2d2-27", "ovs_interfaceid": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:33:10 np0005593234 NetworkManager[48942]: <info>  [1769164390.1609] device (tapbd8a7a46-f0): carrier: link connected
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.165 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f7830015-fcaa-4a8f-8d0b-97791873bbc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.182 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[71901393-d2f0-462e-9646-e0c8adc75a29]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd8a7a46-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:9f:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 828512, 'reachable_time': 44824, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312123, 'error': None, 'target': 'ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.199 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5d5081a4-6b37-4cb7-b25b-658f40da2a2b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7b:9f05'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 828512, 'tstamp': 828512}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312124, 'error': None, 'target': 'ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.216 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a2e65d45-4cd9-4bb2-9540-f2be317b414a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapbd8a7a46-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7b:9f:05'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 241], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 828512, 'reachable_time': 44824, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 312125, 'error': None, 'target': 'ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.220 227766 DEBUG oslo_concurrency.lockutils [req-d98a0300-3de5-49ba-979f-c46f9249b5b9 req-fbb15b0a-5d4f-49b3-8290-10b4dbf775d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-298e1080-6898-4a9b-903e-052965024e8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.243 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[024fb5fb-bd54-4104-8c28-59013cf721f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.299 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1092efb0-c6bb-4267-9b40-d2e5b1df8847]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.301 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd8a7a46-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.301 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.302 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbd8a7a46-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:10 np0005593234 NetworkManager[48942]: <info>  [1769164390.3041] manager: (tapbd8a7a46-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/372)
Jan 23 05:33:10 np0005593234 kernel: tapbd8a7a46-f0: entered promiscuous mode
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.304 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.306 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapbd8a7a46-f0, col_values=(('external_ids', {'iface-id': 'f6117e93-58bd-4099-b49e-913018961730'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:10 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:10Z|00777|binding|INFO|Releasing lport f6117e93-58bd-4099-b49e-913018961730 from this chassis (sb_readonly=0)
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.307 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.321 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.323 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.324 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b59b4a3f-77a7-4821-8b90-d08e710ec44a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.325 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998.pid.haproxy
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:33:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:10.327 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998', 'env', 'PROCESS_TAG=haproxy-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:33:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:10.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.619 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164390.619209, 298e1080-6898-4a9b-903e-052965024e8a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.621 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 298e1080-6898-4a9b-903e-052965024e8a] VM Started (Lifecycle Event)#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.646 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.650 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164390.6202047, 298e1080-6898-4a9b-903e-052965024e8a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.650 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 298e1080-6898-4a9b-903e-052965024e8a] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:33:10 np0005593234 podman[312198]: 2026-01-23 10:33:10.683978309 +0000 UTC m=+0.053166892 container create 81b716528d990023997ef0f4437f321c00fcdc1ea36688ef67ca51c3e11bc2e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.680 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.691 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:33:10 np0005593234 systemd[1]: Started libpod-conmon-81b716528d990023997ef0f4437f321c00fcdc1ea36688ef67ca51c3e11bc2e1.scope.
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.728 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 298e1080-6898-4a9b-903e-052965024e8a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:33:10 np0005593234 podman[312198]: 2026-01-23 10:33:10.656879982 +0000 UTC m=+0.026068595 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:33:10 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:33:10 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8de1384971c726afa0086383a50c471889291a9e2da7aa3d907db0815419f56c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:33:10 np0005593234 podman[312198]: 2026-01-23 10:33:10.774100695 +0000 UTC m=+0.143289278 container init 81b716528d990023997ef0f4437f321c00fcdc1ea36688ef67ca51c3e11bc2e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:33:10 np0005593234 podman[312198]: 2026-01-23 10:33:10.779449402 +0000 UTC m=+0.148637985 container start 81b716528d990023997ef0f4437f321c00fcdc1ea36688ef67ca51c3e11bc2e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:33:10 np0005593234 neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998[312213]: [NOTICE]   (312217) : New worker (312219) forked
Jan 23 05:33:10 np0005593234 neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998[312213]: [NOTICE]   (312217) : Loading success.
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.977 227766 DEBUG nova.compute.manager [req-9112ce35-8daf-45f9-b1d3-da5ee50000d2 req-6db05749-792a-410e-aa1b-b127bb4126a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.977 227766 DEBUG oslo_concurrency.lockutils [req-9112ce35-8daf-45f9-b1d3-da5ee50000d2 req-6db05749-792a-410e-aa1b-b127bb4126a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.978 227766 DEBUG oslo_concurrency.lockutils [req-9112ce35-8daf-45f9-b1d3-da5ee50000d2 req-6db05749-792a-410e-aa1b-b127bb4126a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.978 227766 DEBUG oslo_concurrency.lockutils [req-9112ce35-8daf-45f9-b1d3-da5ee50000d2 req-6db05749-792a-410e-aa1b-b127bb4126a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.978 227766 DEBUG nova.compute.manager [req-9112ce35-8daf-45f9-b1d3-da5ee50000d2 req-6db05749-792a-410e-aa1b-b127bb4126a1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Processing event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.979 227766 DEBUG nova.compute.manager [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.982 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164390.982445, 1114ae68-dab9-46b3-abab-53f135df78d8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.983 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.984 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.987 227766 INFO nova.virt.libvirt.driver [-] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Instance spawned successfully.#033[00m
Jan 23 05:33:10 np0005593234 nova_compute[227762]: 2026-01-23 10:33:10.988 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:33:11 np0005593234 nova_compute[227762]: 2026-01-23 10:33:11.022 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:11 np0005593234 nova_compute[227762]: 2026-01-23 10:33:11.026 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:11 np0005593234 nova_compute[227762]: 2026-01-23 10:33:11.026 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:11 np0005593234 nova_compute[227762]: 2026-01-23 10:33:11.027 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:11 np0005593234 nova_compute[227762]: 2026-01-23 10:33:11.027 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:11 np0005593234 nova_compute[227762]: 2026-01-23 10:33:11.027 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:11 np0005593234 nova_compute[227762]: 2026-01-23 10:33:11.028 227766 DEBUG nova.virt.libvirt.driver [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:11 np0005593234 nova_compute[227762]: 2026-01-23 10:33:11.032 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:33:11 np0005593234 nova_compute[227762]: 2026-01-23 10:33:11.087 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:33:11 np0005593234 nova_compute[227762]: 2026-01-23 10:33:11.122 227766 INFO nova.compute.manager [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Took 12.95 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:33:11 np0005593234 nova_compute[227762]: 2026-01-23 10:33:11.123 227766 DEBUG nova.compute.manager [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:11 np0005593234 nova_compute[227762]: 2026-01-23 10:33:11.241 227766 INFO nova.compute.manager [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Took 15.85 seconds to build instance.#033[00m
Jan 23 05:33:11 np0005593234 nova_compute[227762]: 2026-01-23 10:33:11.283 227766 DEBUG oslo_concurrency.lockutils [None req-158d125c-2005-468c-91aa-4e580496f046 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:11.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:12.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e350 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:13 np0005593234 nova_compute[227762]: 2026-01-23 10:33:13.397 227766 DEBUG nova.compute.manager [req-1040f39f-459d-495b-8845-88b8e63189ce req-c86a1018-5651-43ea-b353-38c51756e274 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:13 np0005593234 nova_compute[227762]: 2026-01-23 10:33:13.398 227766 DEBUG oslo_concurrency.lockutils [req-1040f39f-459d-495b-8845-88b8e63189ce req-c86a1018-5651-43ea-b353-38c51756e274 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:13 np0005593234 nova_compute[227762]: 2026-01-23 10:33:13.398 227766 DEBUG oslo_concurrency.lockutils [req-1040f39f-459d-495b-8845-88b8e63189ce req-c86a1018-5651-43ea-b353-38c51756e274 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:13 np0005593234 nova_compute[227762]: 2026-01-23 10:33:13.398 227766 DEBUG oslo_concurrency.lockutils [req-1040f39f-459d-495b-8845-88b8e63189ce req-c86a1018-5651-43ea-b353-38c51756e274 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:13 np0005593234 nova_compute[227762]: 2026-01-23 10:33:13.398 227766 DEBUG nova.compute.manager [req-1040f39f-459d-495b-8845-88b8e63189ce req-c86a1018-5651-43ea-b353-38c51756e274 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] No waiting events found dispatching network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:33:13 np0005593234 nova_compute[227762]: 2026-01-23 10:33:13.398 227766 WARNING nova.compute.manager [req-1040f39f-459d-495b-8845-88b8e63189ce req-c86a1018-5651-43ea-b353-38c51756e274 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received unexpected event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:33:13 np0005593234 nova_compute[227762]: 2026-01-23 10:33:13.569 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:13.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:13 np0005593234 nova_compute[227762]: 2026-01-23 10:33:13.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:13 np0005593234 nova_compute[227762]: 2026-01-23 10:33:13.782 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:13 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 23 05:33:14 np0005593234 nova_compute[227762]: 2026-01-23 10:33:14.292 227766 DEBUG nova.compute.manager [None req-a50acd54-557d-4d89-83be-31893c9dd896 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:14 np0005593234 nova_compute[227762]: 2026-01-23 10:33:14.358 227766 INFO nova.compute.manager [None req-a50acd54-557d-4d89-83be-31893c9dd896 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] instance snapshotting#033[00m
Jan 23 05:33:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:33:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:14.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:33:14 np0005593234 nova_compute[227762]: 2026-01-23 10:33:14.899 227766 INFO nova.virt.libvirt.driver [None req-a50acd54-557d-4d89-83be-31893c9dd896 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Beginning live snapshot process#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.160 227766 DEBUG nova.virt.libvirt.imagebackend [None req-a50acd54-557d-4d89-83be-31893c9dd896 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.613 227766 DEBUG nova.storage.rbd_utils [None req-a50acd54-557d-4d89-83be-31893c9dd896 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] creating snapshot(2835c3f512474430b6a9de6740e5a480) on rbd image(1114ae68-dab9-46b3-abab-53f135df78d8_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:33:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:15.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:15 np0005593234 podman[312281]: 2026-01-23 10:33:15.76627081 +0000 UTC m=+0.057195639 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 05:33:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e351 e351: 3 total, 3 up, 3 in
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.913 227766 DEBUG nova.storage.rbd_utils [None req-a50acd54-557d-4d89-83be-31893c9dd896 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] cloning vms/1114ae68-dab9-46b3-abab-53f135df78d8_disk@2835c3f512474430b6a9de6740e5a480 to images/562f5b35-ce71-43ca-99f2-1c1231b35c14 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.952 227766 DEBUG nova.compute.manager [req-d8d10525-daa3-4a46-88bf-02045b650141 req-17d3a7ef-8265-441a-bcbb-77bb4169eed9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Received event network-vif-plugged-b6ddc2d2-277d-4859-8c63-6920fe72886a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.952 227766 DEBUG oslo_concurrency.lockutils [req-d8d10525-daa3-4a46-88bf-02045b650141 req-17d3a7ef-8265-441a-bcbb-77bb4169eed9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "298e1080-6898-4a9b-903e-052965024e8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.953 227766 DEBUG oslo_concurrency.lockutils [req-d8d10525-daa3-4a46-88bf-02045b650141 req-17d3a7ef-8265-441a-bcbb-77bb4169eed9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.953 227766 DEBUG oslo_concurrency.lockutils [req-d8d10525-daa3-4a46-88bf-02045b650141 req-17d3a7ef-8265-441a-bcbb-77bb4169eed9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.954 227766 DEBUG nova.compute.manager [req-d8d10525-daa3-4a46-88bf-02045b650141 req-17d3a7ef-8265-441a-bcbb-77bb4169eed9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Processing event network-vif-plugged-b6ddc2d2-277d-4859-8c63-6920fe72886a _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.954 227766 DEBUG nova.compute.manager [req-d8d10525-daa3-4a46-88bf-02045b650141 req-17d3a7ef-8265-441a-bcbb-77bb4169eed9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Received event network-vif-plugged-b6ddc2d2-277d-4859-8c63-6920fe72886a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.955 227766 DEBUG oslo_concurrency.lockutils [req-d8d10525-daa3-4a46-88bf-02045b650141 req-17d3a7ef-8265-441a-bcbb-77bb4169eed9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "298e1080-6898-4a9b-903e-052965024e8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.955 227766 DEBUG oslo_concurrency.lockutils [req-d8d10525-daa3-4a46-88bf-02045b650141 req-17d3a7ef-8265-441a-bcbb-77bb4169eed9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.956 227766 DEBUG oslo_concurrency.lockutils [req-d8d10525-daa3-4a46-88bf-02045b650141 req-17d3a7ef-8265-441a-bcbb-77bb4169eed9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.956 227766 DEBUG nova.compute.manager [req-d8d10525-daa3-4a46-88bf-02045b650141 req-17d3a7ef-8265-441a-bcbb-77bb4169eed9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] No waiting events found dispatching network-vif-plugged-b6ddc2d2-277d-4859-8c63-6920fe72886a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.956 227766 WARNING nova.compute.manager [req-d8d10525-daa3-4a46-88bf-02045b650141 req-17d3a7ef-8265-441a-bcbb-77bb4169eed9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Received unexpected event network-vif-plugged-b6ddc2d2-277d-4859-8c63-6920fe72886a for instance with vm_state building and task_state spawning.#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.958 227766 DEBUG nova.compute.manager [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Instance event wait completed in 5 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.962 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164395.9617736, 298e1080-6898-4a9b-903e-052965024e8a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.962 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 298e1080-6898-4a9b-903e-052965024e8a] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.964 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.967 227766 INFO nova.virt.libvirt.driver [-] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Instance spawned successfully.#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.968 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:33:15 np0005593234 nova_compute[227762]: 2026-01-23 10:33:15.996 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.005 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.011 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.012 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.012 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.013 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.014 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.015 227766 DEBUG nova.virt.libvirt.driver [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.049 227766 DEBUG nova.storage.rbd_utils [None req-a50acd54-557d-4d89-83be-31893c9dd896 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] flattening images/562f5b35-ce71-43ca-99f2-1c1231b35c14 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.104 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 298e1080-6898-4a9b-903e-052965024e8a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.164 227766 INFO nova.compute.manager [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Took 16.95 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.164 227766 DEBUG nova.compute.manager [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.338 227766 INFO nova.compute.manager [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Took 20.88 seconds to build instance.#033[00m
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.348 227766 DEBUG nova.storage.rbd_utils [None req-a50acd54-557d-4d89-83be-31893c9dd896 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] removing snapshot(2835c3f512474430b6a9de6740e5a480) on rbd image(1114ae68-dab9-46b3-abab-53f135df78d8_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.361 227766 DEBUG oslo_concurrency.lockutils [None req-5e9a5ade-30a2-4761-b3ce-ec68f8e8caa4 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 21.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:16.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e352 e352: 3 total, 3 up, 3 in
Jan 23 05:33:16 np0005593234 nova_compute[227762]: 2026-01-23 10:33:16.917 227766 DEBUG nova.storage.rbd_utils [None req-a50acd54-557d-4d89-83be-31893c9dd896 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] creating snapshot(snap) on rbd image(562f5b35-ce71-43ca-99f2-1c1231b35c14) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:33:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:17.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e353 e353: 3 total, 3 up, 3 in
Jan 23 05:33:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:18 np0005593234 nova_compute[227762]: 2026-01-23 10:33:18.572 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:18.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:18 np0005593234 nova_compute[227762]: 2026-01-23 10:33:18.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:18 np0005593234 nova_compute[227762]: 2026-01-23 10:33:18.819 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:19.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:20.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:21 np0005593234 nova_compute[227762]: 2026-01-23 10:33:21.303 227766 INFO nova.virt.libvirt.driver [None req-a50acd54-557d-4d89-83be-31893c9dd896 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Snapshot image upload complete#033[00m
Jan 23 05:33:21 np0005593234 nova_compute[227762]: 2026-01-23 10:33:21.304 227766 INFO nova.compute.manager [None req-a50acd54-557d-4d89-83be-31893c9dd896 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Took 6.94 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 23 05:33:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:21.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:21 np0005593234 nova_compute[227762]: 2026-01-23 10:33:21.788 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:21 np0005593234 NetworkManager[48942]: <info>  [1769164401.7903] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/373)
Jan 23 05:33:21 np0005593234 NetworkManager[48942]: <info>  [1769164401.7918] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/374)
Jan 23 05:33:21 np0005593234 nova_compute[227762]: 2026-01-23 10:33:21.895 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:21 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:21Z|00778|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:33:21 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:21Z|00779|binding|INFO|Releasing lport f6117e93-58bd-4099-b49e-913018961730 from this chassis (sb_readonly=0)
Jan 23 05:33:21 np0005593234 nova_compute[227762]: 2026-01-23 10:33:21.914 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:22 np0005593234 nova_compute[227762]: 2026-01-23 10:33:22.569 227766 DEBUG nova.compute.manager [req-97067ccd-7215-4cc5-9463-420a20256bec req-9478583c-2f1f-4db2-bf06-6ef1ead54f98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Received event network-changed-b6ddc2d2-277d-4859-8c63-6920fe72886a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:22 np0005593234 nova_compute[227762]: 2026-01-23 10:33:22.570 227766 DEBUG nova.compute.manager [req-97067ccd-7215-4cc5-9463-420a20256bec req-9478583c-2f1f-4db2-bf06-6ef1ead54f98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Refreshing instance network info cache due to event network-changed-b6ddc2d2-277d-4859-8c63-6920fe72886a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:33:22 np0005593234 nova_compute[227762]: 2026-01-23 10:33:22.570 227766 DEBUG oslo_concurrency.lockutils [req-97067ccd-7215-4cc5-9463-420a20256bec req-9478583c-2f1f-4db2-bf06-6ef1ead54f98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-298e1080-6898-4a9b-903e-052965024e8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:33:22 np0005593234 nova_compute[227762]: 2026-01-23 10:33:22.571 227766 DEBUG oslo_concurrency.lockutils [req-97067ccd-7215-4cc5-9463-420a20256bec req-9478583c-2f1f-4db2-bf06-6ef1ead54f98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-298e1080-6898-4a9b-903e-052965024e8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:33:22 np0005593234 nova_compute[227762]: 2026-01-23 10:33:22.571 227766 DEBUG nova.network.neutron [req-97067ccd-7215-4cc5-9463-420a20256bec req-9478583c-2f1f-4db2-bf06-6ef1ead54f98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Refreshing network info cache for port b6ddc2d2-277d-4859-8c63-6920fe72886a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:33:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:22.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e353 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:23 np0005593234 nova_compute[227762]: 2026-01-23 10:33:23.574 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.003000095s ======
Jan 23 05:33:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:23.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000095s
Jan 23 05:33:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 e354: 3 total, 3 up, 3 in
Jan 23 05:33:23 np0005593234 nova_compute[227762]: 2026-01-23 10:33:23.820 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:24.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:24 np0005593234 nova_compute[227762]: 2026-01-23 10:33:24.728 227766 INFO nova.compute.manager [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Rescuing#033[00m
Jan 23 05:33:24 np0005593234 nova_compute[227762]: 2026-01-23 10:33:24.728 227766 DEBUG oslo_concurrency.lockutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:33:24 np0005593234 nova_compute[227762]: 2026-01-23 10:33:24.729 227766 DEBUG oslo_concurrency.lockutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquired lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:33:24 np0005593234 nova_compute[227762]: 2026-01-23 10:33:24.729 227766 DEBUG nova.network.neutron [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:33:24 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:24Z|00101|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:02:16:71 10.100.0.14
Jan 23 05:33:24 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:24Z|00102|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:02:16:71 10.100.0.14
Jan 23 05:33:24 np0005593234 podman[312446]: 2026-01-23 10:33:24.798029127 +0000 UTC m=+0.086182144 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:33:24 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:24Z|00780|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:33:24 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:24Z|00781|binding|INFO|Releasing lport f6117e93-58bd-4099-b49e-913018961730 from this chassis (sb_readonly=0)
Jan 23 05:33:25 np0005593234 nova_compute[227762]: 2026-01-23 10:33:25.021 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:25.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:25 np0005593234 nova_compute[227762]: 2026-01-23 10:33:25.703 227766 DEBUG nova.network.neutron [req-97067ccd-7215-4cc5-9463-420a20256bec req-9478583c-2f1f-4db2-bf06-6ef1ead54f98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Updated VIF entry in instance network info cache for port b6ddc2d2-277d-4859-8c63-6920fe72886a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:33:25 np0005593234 nova_compute[227762]: 2026-01-23 10:33:25.704 227766 DEBUG nova.network.neutron [req-97067ccd-7215-4cc5-9463-420a20256bec req-9478583c-2f1f-4db2-bf06-6ef1ead54f98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Updating instance_info_cache with network_info: [{"id": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "address": "fa:16:3e:c2:a4:da", "network": {"id": "bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998", "bridge": "br-int", "label": "tempest-network-smoke--156740505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6ddc2d2-27", "ovs_interfaceid": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:33:25 np0005593234 nova_compute[227762]: 2026-01-23 10:33:25.742 227766 DEBUG oslo_concurrency.lockutils [req-97067ccd-7215-4cc5-9463-420a20256bec req-9478583c-2f1f-4db2-bf06-6ef1ead54f98 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-298e1080-6898-4a9b-903e-052965024e8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:33:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:26.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:27.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:28 np0005593234 nova_compute[227762]: 2026-01-23 10:33:28.062 227766 DEBUG nova.network.neutron [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Updating instance_info_cache with network_info: [{"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:33:28 np0005593234 nova_compute[227762]: 2026-01-23 10:33:28.097 227766 DEBUG oslo_concurrency.lockutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Releasing lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:33:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:28 np0005593234 nova_compute[227762]: 2026-01-23 10:33:28.578 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:28.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:28 np0005593234 nova_compute[227762]: 2026-01-23 10:33:28.794 227766 DEBUG nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:33:28 np0005593234 nova_compute[227762]: 2026-01-23 10:33:28.822 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:29.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:29Z|00103|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c2:a4:da 10.100.0.9
Jan 23 05:33:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:29Z|00104|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c2:a4:da 10.100.0.9
Jan 23 05:33:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:30.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:33:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:31.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:33:31 np0005593234 nova_compute[227762]: 2026-01-23 10:33:31.810 227766 INFO nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:33:31 np0005593234 kernel: tap1827509f-e3 (unregistering): left promiscuous mode
Jan 23 05:33:31 np0005593234 NetworkManager[48942]: <info>  [1769164411.8607] device (tap1827509f-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:33:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:31Z|00782|binding|INFO|Releasing lport 1827509f-e3b0-49ea-b1ff-982db21148b8 from this chassis (sb_readonly=0)
Jan 23 05:33:31 np0005593234 nova_compute[227762]: 2026-01-23 10:33:31.871 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:31Z|00783|binding|INFO|Setting lport 1827509f-e3b0-49ea-b1ff-982db21148b8 down in Southbound
Jan 23 05:33:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:31Z|00784|binding|INFO|Removing iface tap1827509f-e3 ovn-installed in OVS
Jan 23 05:33:31 np0005593234 nova_compute[227762]: 2026-01-23 10:33:31.873 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:31.879 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:16:71 10.100.0.14'], port_security=['fa:16:3e:02:16:71 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1114ae68-dab9-46b3-abab-53f135df78d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=1827509f-e3b0-49ea-b1ff-982db21148b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:33:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:31.881 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 1827509f-e3b0-49ea-b1ff-982db21148b8 in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 unbound from our chassis#033[00m
Jan 23 05:33:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:31.882 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7d5530f-5227-4f75-bac0-2604bb3d68e2#033[00m
Jan 23 05:33:31 np0005593234 nova_compute[227762]: 2026-01-23 10:33:31.885 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:31.897 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[030ac585-8549-4f09-8316-444f39b017d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:31.932 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[441e94eb-5a66-4301-8199-7fa8df858837]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:31 np0005593234 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b5.scope: Deactivated successfully.
Jan 23 05:33:31 np0005593234 systemd[1]: machine-qemu\x2d86\x2dinstance\x2d000000b5.scope: Consumed 13.580s CPU time.
Jan 23 05:33:31 np0005593234 systemd-machined[195626]: Machine qemu-86-instance-000000b5 terminated.
Jan 23 05:33:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:31.938 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[8b795451-733d-49f5-911b-0e84e2478926]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:31.967 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[059fc4e8-97b9-4a7d-abb7-f32989f9f01e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:31.984 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b39bc9b9-9df2-47f3-bf72-0595fe7c2967]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 7, 'rx_bytes': 658, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815735, 'reachable_time': 37082, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312488, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:32.001 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8e555c0c-e612-4268-9628-dcd023c1fa37]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd7d5530f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815745, 'tstamp': 815745}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312489, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd7d5530f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815749, 'tstamp': 815749}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312489, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:32.003 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.006 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.011 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:32.011 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d5530f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:32.012 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:33:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:32.012 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7d5530f-50, col_values=(('external_ids', {'iface-id': '4c99eeb5-c437-4d31-ac3b-bfd151140733'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:32.013 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.041 227766 INFO nova.virt.libvirt.driver [-] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Instance destroyed successfully.#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.042 227766 DEBUG nova.objects.instance [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 1114ae68-dab9-46b3-abab-53f135df78d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.061 227766 INFO nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Attempting a stable device rescue#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.386 227766 DEBUG nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rescue generated disk_info: {'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} rescue /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4314#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.390 227766 DEBUG nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.390 227766 INFO nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Creating image(s)#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.414 227766 DEBUG nova.storage.rbd_utils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 1114ae68-dab9-46b3-abab-53f135df78d8_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.416 227766 DEBUG nova.objects.instance [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1114ae68-dab9-46b3-abab-53f135df78d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.461 227766 DEBUG nova.storage.rbd_utils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 1114ae68-dab9-46b3-abab-53f135df78d8_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.486 227766 DEBUG nova.storage.rbd_utils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 1114ae68-dab9-46b3-abab-53f135df78d8_disk.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.490 227766 DEBUG oslo_concurrency.lockutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "99c4614cc3475963a0b76a1be4909a89ba376908" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.491 227766 DEBUG oslo_concurrency.lockutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "99c4614cc3475963a0b76a1be4909a89ba376908" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.495 227766 DEBUG nova.compute.manager [req-309979f8-191a-455d-b5a3-6a5350376a5b req-5097b0d9-8cfb-4b7a-89b8-d3cdb6aff891 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-vif-unplugged-1827509f-e3b0-49ea-b1ff-982db21148b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.495 227766 DEBUG oslo_concurrency.lockutils [req-309979f8-191a-455d-b5a3-6a5350376a5b req-5097b0d9-8cfb-4b7a-89b8-d3cdb6aff891 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.495 227766 DEBUG oslo_concurrency.lockutils [req-309979f8-191a-455d-b5a3-6a5350376a5b req-5097b0d9-8cfb-4b7a-89b8-d3cdb6aff891 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.495 227766 DEBUG oslo_concurrency.lockutils [req-309979f8-191a-455d-b5a3-6a5350376a5b req-5097b0d9-8cfb-4b7a-89b8-d3cdb6aff891 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.496 227766 DEBUG nova.compute.manager [req-309979f8-191a-455d-b5a3-6a5350376a5b req-5097b0d9-8cfb-4b7a-89b8-d3cdb6aff891 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] No waiting events found dispatching network-vif-unplugged-1827509f-e3b0-49ea-b1ff-982db21148b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.496 227766 WARNING nova.compute.manager [req-309979f8-191a-455d-b5a3-6a5350376a5b req-5097b0d9-8cfb-4b7a-89b8-d3cdb6aff891 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received unexpected event network-vif-unplugged-1827509f-e3b0-49ea-b1ff-982db21148b8 for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:33:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:32.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.852 227766 DEBUG nova.virt.libvirt.imagebackend [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/562f5b35-ce71-43ca-99f2-1c1231b35c14/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/562f5b35-ce71-43ca-99f2-1c1231b35c14/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.900 227766 DEBUG nova.virt.libvirt.imagebackend [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Selected location: {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/562f5b35-ce71-43ca-99f2-1c1231b35c14/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 23 05:33:32 np0005593234 nova_compute[227762]: 2026-01-23 10:33:32.900 227766 DEBUG nova.storage.rbd_utils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] cloning images/562f5b35-ce71-43ca-99f2-1c1231b35c14@snap to None/1114ae68-dab9-46b3-abab-53f135df78d8_disk.rescue clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.016 227766 DEBUG oslo_concurrency.lockutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "99c4614cc3475963a0b76a1be4909a89ba376908" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.525s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.063 227766 DEBUG nova.objects.instance [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'migration_context' on Instance uuid 1114ae68-dab9-46b3-abab-53f135df78d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.080 227766 DEBUG nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.082 227766 DEBUG nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Start _get_guest_xml network_info=[{"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "vif_mac": "fa:16:3e:02:16:71"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}, 'disk.rescue': {'bus': 'virtio', 'dev': 'vdb', 'type': 'disk'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue={'image_id': '562f5b35-ce71-43ca-99f2-1c1231b35c14', 'kernel_id': '', 'ramdisk_id': ''} block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.082 227766 DEBUG nova.objects.instance [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'resources' on Instance uuid 1114ae68-dab9-46b3-abab-53f135df78d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.103 227766 WARNING nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.112 227766 DEBUG nova.virt.libvirt.host [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.113 227766 DEBUG nova.virt.libvirt.host [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.117 227766 DEBUG nova.virt.libvirt.host [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.117 227766 DEBUG nova.virt.libvirt.host [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.118 227766 DEBUG nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.119 227766 DEBUG nova.virt.hardware [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.119 227766 DEBUG nova.virt.hardware [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.119 227766 DEBUG nova.virt.hardware [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.119 227766 DEBUG nova.virt.hardware [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.120 227766 DEBUG nova.virt.hardware [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.120 227766 DEBUG nova.virt.hardware [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.120 227766 DEBUG nova.virt.hardware [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.120 227766 DEBUG nova.virt.hardware [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.120 227766 DEBUG nova.virt.hardware [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.121 227766 DEBUG nova.virt.hardware [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.121 227766 DEBUG nova.virt.hardware [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.121 227766 DEBUG nova.objects.instance [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 1114ae68-dab9-46b3-abab-53f135df78d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.143 227766 DEBUG oslo_concurrency.processutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:33:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1843148416' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.580 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.590 227766 DEBUG oslo_concurrency.processutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.626 227766 DEBUG oslo_concurrency.processutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:33.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:33 np0005593234 nova_compute[227762]: 2026-01-23 10:33:33.824 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:33:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1664338125' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.093 227766 DEBUG oslo_concurrency.processutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.467s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.094 227766 DEBUG oslo_concurrency.processutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:33:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3130841521' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.518 227766 DEBUG oslo_concurrency.processutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.520 227766 DEBUG nova.virt.libvirt.vif [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:32:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-182552386',display_name='tempest-ServerStableDeviceRescueTest-server-182552386',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-182552386',id=181,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:33:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='815b71acf60d4ed8933ebd05228fa0c0',ramdisk_id='',reservation_id='r-491now07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_m
odel='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1802220041',owner_user_name='tempest-ServerStableDeviceRescueTest-1802220041-project-member'},tags=<?>,task_state='rescuing',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:33:21Z,user_data=None,user_id='e1629a4b14764dddaabcadd16f3e1c1c',uuid=1114ae68-dab9-46b3-abab-53f135df78d8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "vif_mac": "fa:16:3e:02:16:71"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.520 227766 DEBUG nova.network.os_vif_util [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converting VIF {"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [], "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "vif_mac": "fa:16:3e:02:16:71"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.521 227766 DEBUG nova.network.os_vif_util [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:16:71,bridge_name='br-int',has_traffic_filtering=True,id=1827509f-e3b0-49ea-b1ff-982db21148b8,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1827509f-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.523 227766 DEBUG nova.objects.instance [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1114ae68-dab9-46b3-abab-53f135df78d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:34.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.819 227766 DEBUG nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  <uuid>1114ae68-dab9-46b3-abab-53f135df78d8</uuid>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  <name>instance-000000b5</name>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerStableDeviceRescueTest-server-182552386</nova:name>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:33:33</nova:creationTime>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <nova:user uuid="e1629a4b14764dddaabcadd16f3e1c1c">tempest-ServerStableDeviceRescueTest-1802220041-project-member</nova:user>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <nova:project uuid="815b71acf60d4ed8933ebd05228fa0c0">tempest-ServerStableDeviceRescueTest-1802220041</nova:project>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <nova:port uuid="1827509f-e3b0-49ea-b1ff-982db21148b8">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <entry name="serial">1114ae68-dab9-46b3-abab-53f135df78d8</entry>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <entry name="uuid">1114ae68-dab9-46b3-abab-53f135df78d8</entry>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1114ae68-dab9-46b3-abab-53f135df78d8_disk">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1114ae68-dab9-46b3-abab-53f135df78d8_disk.config">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1114ae68-dab9-46b3-abab-53f135df78d8_disk.rescue">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <target dev="vdb" bus="virtio"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <boot order="1"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:02:16:71"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <target dev="tap1827509f-e3"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/console.log" append="off"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:33:34 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:33:34 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:33:34 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:33:34 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.831 227766 INFO nova.virt.libvirt.driver [-] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Instance destroyed successfully.#033[00m
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.943 227766 DEBUG nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.943 227766 DEBUG nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.943 227766 DEBUG nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.943 227766 DEBUG nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] No VIF found with MAC fa:16:3e:02:16:71, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.944 227766 INFO nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Using config drive#033[00m
Jan 23 05:33:34 np0005593234 nova_compute[227762]: 2026-01-23 10:33:34.968 227766 DEBUG nova.storage.rbd_utils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 1114ae68-dab9-46b3-abab-53f135df78d8_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:33:35 np0005593234 nova_compute[227762]: 2026-01-23 10:33:35.015 227766 DEBUG nova.compute.manager [req-89cfd78c-128f-4f72-8704-c1c7e559948b req-c466f44e-eb33-4efc-bf92-ea4a45b4ce20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:35 np0005593234 nova_compute[227762]: 2026-01-23 10:33:35.016 227766 DEBUG oslo_concurrency.lockutils [req-89cfd78c-128f-4f72-8704-c1c7e559948b req-c466f44e-eb33-4efc-bf92-ea4a45b4ce20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:35 np0005593234 nova_compute[227762]: 2026-01-23 10:33:35.016 227766 DEBUG oslo_concurrency.lockutils [req-89cfd78c-128f-4f72-8704-c1c7e559948b req-c466f44e-eb33-4efc-bf92-ea4a45b4ce20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:35 np0005593234 nova_compute[227762]: 2026-01-23 10:33:35.016 227766 DEBUG oslo_concurrency.lockutils [req-89cfd78c-128f-4f72-8704-c1c7e559948b req-c466f44e-eb33-4efc-bf92-ea4a45b4ce20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:35 np0005593234 nova_compute[227762]: 2026-01-23 10:33:35.017 227766 DEBUG nova.compute.manager [req-89cfd78c-128f-4f72-8704-c1c7e559948b req-c466f44e-eb33-4efc-bf92-ea4a45b4ce20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] No waiting events found dispatching network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:33:35 np0005593234 nova_compute[227762]: 2026-01-23 10:33:35.017 227766 WARNING nova.compute.manager [req-89cfd78c-128f-4f72-8704-c1c7e559948b req-c466f44e-eb33-4efc-bf92-ea4a45b4ce20 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received unexpected event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 for instance with vm_state active and task_state rescuing.#033[00m
Jan 23 05:33:35 np0005593234 nova_compute[227762]: 2026-01-23 10:33:35.020 227766 DEBUG nova.objects.instance [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 1114ae68-dab9-46b3-abab-53f135df78d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:35 np0005593234 nova_compute[227762]: 2026-01-23 10:33:35.072 227766 DEBUG nova.objects.instance [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'keypairs' on Instance uuid 1114ae68-dab9-46b3-abab-53f135df78d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:35.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:35 np0005593234 nova_compute[227762]: 2026-01-23 10:33:35.710 227766 INFO nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Creating config drive at /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/disk.config.rescue#033[00m
Jan 23 05:33:35 np0005593234 nova_compute[227762]: 2026-01-23 10:33:35.715 227766 DEBUG oslo_concurrency.processutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9ha65fbx execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:35 np0005593234 nova_compute[227762]: 2026-01-23 10:33:35.848 227766 DEBUG oslo_concurrency.processutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/disk.config.rescue -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9ha65fbx" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:35 np0005593234 nova_compute[227762]: 2026-01-23 10:33:35.875 227766 DEBUG nova.storage.rbd_utils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] rbd image 1114ae68-dab9-46b3-abab-53f135df78d8_disk.config.rescue does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:33:35 np0005593234 nova_compute[227762]: 2026-01-23 10:33:35.878 227766 DEBUG oslo_concurrency.processutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/disk.config.rescue 1114ae68-dab9-46b3-abab-53f135df78d8_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.050 227766 DEBUG oslo_concurrency.processutils [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/disk.config.rescue 1114ae68-dab9-46b3-abab-53f135df78d8_disk.config.rescue --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.172s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.051 227766 INFO nova.virt.libvirt.driver [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Deleting local config drive /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8/disk.config.rescue because it was imported into RBD.#033[00m
Jan 23 05:33:36 np0005593234 kernel: tap1827509f-e3: entered promiscuous mode
Jan 23 05:33:36 np0005593234 NetworkManager[48942]: <info>  [1769164416.0973] manager: (tap1827509f-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/375)
Jan 23 05:33:36 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:36Z|00785|binding|INFO|Claiming lport 1827509f-e3b0-49ea-b1ff-982db21148b8 for this chassis.
Jan 23 05:33:36 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:36Z|00786|binding|INFO|1827509f-e3b0-49ea-b1ff-982db21148b8: Claiming fa:16:3e:02:16:71 10.100.0.14
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.097 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.105 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:16:71 10.100.0.14'], port_security=['fa:16:3e:02:16:71 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1114ae68-dab9-46b3-abab-53f135df78d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '5', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=1827509f-e3b0-49ea-b1ff-982db21148b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.107 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 1827509f-e3b0-49ea-b1ff-982db21148b8 in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 bound to our chassis#033[00m
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.109 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7d5530f-5227-4f75-bac0-2604bb3d68e2#033[00m
Jan 23 05:33:36 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:36Z|00787|binding|INFO|Setting lport 1827509f-e3b0-49ea-b1ff-982db21148b8 ovn-installed in OVS
Jan 23 05:33:36 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:36Z|00788|binding|INFO|Setting lport 1827509f-e3b0-49ea-b1ff-982db21148b8 up in Southbound
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.117 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.123 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:36 np0005593234 systemd-udevd[312815]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.128 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[14d4a8ce-c636-42bb-866d-88103428d72b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:36 np0005593234 systemd-machined[195626]: New machine qemu-88-instance-000000b5.
Jan 23 05:33:36 np0005593234 NetworkManager[48942]: <info>  [1769164416.1387] device (tap1827509f-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:33:36 np0005593234 NetworkManager[48942]: <info>  [1769164416.1398] device (tap1827509f-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:33:36 np0005593234 systemd[1]: Started Virtual Machine qemu-88-instance-000000b5.
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.157 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d48ef3b9-8cfc-4357-ad95-845627896b9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.159 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[733f8c9e-3d9a-4d21-9b88-0e2dd3127ece]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.186 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[fa8de11c-fce1-4f76-90c5-5da223b65f7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.322 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0c57c3d9-c96f-4c60-bd04-771871685e50]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 9, 'rx_bytes': 658, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 9, 'rx_bytes': 658, 'tx_bytes': 522, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815735, 'reachable_time': 37082, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 312854, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.340 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ff88d7c5-4bd0-4839-bd60-f9595ee6101b]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd7d5530f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815745, 'tstamp': 815745}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312905, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd7d5530f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815749, 'tstamp': 815749}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 312905, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.342 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.344 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.345 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.345 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d5530f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.346 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.346 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7d5530f-50, col_values=(('external_ids', {'iface-id': '4c99eeb5-c437-4d31-ac3b-bfd151140733'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.347 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:33:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:36.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.669 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for 1114ae68-dab9-46b3-abab-53f135df78d8 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.670 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164416.6691082, 1114ae68-dab9-46b3-abab-53f135df78d8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.670 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] VM Resumed (Lifecycle Event)
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.677 227766 DEBUG nova.compute.manager [None req-050447b9-d609-42f3-b8f0-014bb0cbe696 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.732 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.735 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.759 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] During sync_power_state the instance has a pending task (rescuing). Skip.
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.759 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164416.6706843, 1114ae68-dab9-46b3-abab-53f135df78d8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.760 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] VM Started (Lifecycle Event)
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.797 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.801 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.952 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=68, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=67) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:33:36 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:36.953 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:33:36 np0005593234 nova_compute[227762]: 2026-01-23 10:33:36.954 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.116 227766 DEBUG nova.compute.manager [req-8d30f790-1d94-47ec-a308-3481f0788941 req-9decb23e-99b8-4d97-84af-6c093703c7ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.117 227766 DEBUG oslo_concurrency.lockutils [req-8d30f790-1d94-47ec-a308-3481f0788941 req-9decb23e-99b8-4d97-84af-6c093703c7ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.117 227766 DEBUG oslo_concurrency.lockutils [req-8d30f790-1d94-47ec-a308-3481f0788941 req-9decb23e-99b8-4d97-84af-6c093703c7ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.117 227766 DEBUG oslo_concurrency.lockutils [req-8d30f790-1d94-47ec-a308-3481f0788941 req-9decb23e-99b8-4d97-84af-6c093703c7ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.117 227766 DEBUG nova.compute.manager [req-8d30f790-1d94-47ec-a308-3481f0788941 req-9decb23e-99b8-4d97-84af-6c093703c7ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] No waiting events found dispatching network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.117 227766 WARNING nova.compute.manager [req-8d30f790-1d94-47ec-a308-3481f0788941 req-9decb23e-99b8-4d97-84af-6c093703c7ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received unexpected event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 for instance with vm_state rescued and task_state None.
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.117 227766 DEBUG nova.compute.manager [req-8d30f790-1d94-47ec-a308-3481f0788941 req-9decb23e-99b8-4d97-84af-6c093703c7ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.118 227766 DEBUG oslo_concurrency.lockutils [req-8d30f790-1d94-47ec-a308-3481f0788941 req-9decb23e-99b8-4d97-84af-6c093703c7ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.118 227766 DEBUG oslo_concurrency.lockutils [req-8d30f790-1d94-47ec-a308-3481f0788941 req-9decb23e-99b8-4d97-84af-6c093703c7ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.118 227766 DEBUG oslo_concurrency.lockutils [req-8d30f790-1d94-47ec-a308-3481f0788941 req-9decb23e-99b8-4d97-84af-6c093703c7ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.118 227766 DEBUG nova.compute.manager [req-8d30f790-1d94-47ec-a308-3481f0788941 req-9decb23e-99b8-4d97-84af-6c093703c7ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] No waiting events found dispatching network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.118 227766 WARNING nova.compute.manager [req-8d30f790-1d94-47ec-a308-3481f0788941 req-9decb23e-99b8-4d97-84af-6c093703c7ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received unexpected event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 for instance with vm_state rescued and task_state None.
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.443 227766 INFO nova.compute.manager [None req-6ab50cf5-6cdd-4c4c-a4d6-b2a4f9be4e8b 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Get console output
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.451 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 05:33:37 np0005593234 podman[313148]: 2026-01-23 10:33:37.590804304 +0000 UTC m=+0.045166712 container create 198c282eb40edb8278a04ef085e6ae0e142bb4da2deee35a8f1cd3df178433bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_buck, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 05:33:37 np0005593234 systemd[1]: Started libpod-conmon-198c282eb40edb8278a04ef085e6ae0e142bb4da2deee35a8f1cd3df178433bd.scope.
Jan 23 05:33:37 np0005593234 podman[313148]: 2026-01-23 10:33:37.567801365 +0000 UTC m=+0.022163803 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 05:33:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:37.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:37 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:33:37 np0005593234 podman[313148]: 2026-01-23 10:33:37.700370768 +0000 UTC m=+0.154733196 container init 198c282eb40edb8278a04ef085e6ae0e142bb4da2deee35a8f1cd3df178433bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_buck, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:33:37 np0005593234 podman[313148]: 2026-01-23 10:33:37.709018858 +0000 UTC m=+0.163381266 container start 198c282eb40edb8278a04ef085e6ae0e142bb4da2deee35a8f1cd3df178433bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_buck, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, ceph=True, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:33:37 np0005593234 podman[313148]: 2026-01-23 10:33:37.712735363 +0000 UTC m=+0.167097791 container attach 198c282eb40edb8278a04ef085e6ae0e142bb4da2deee35a8f1cd3df178433bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_buck, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Jan 23 05:33:37 np0005593234 sad_buck[313164]: 167 167
Jan 23 05:33:37 np0005593234 systemd[1]: libpod-198c282eb40edb8278a04ef085e6ae0e142bb4da2deee35a8f1cd3df178433bd.scope: Deactivated successfully.
Jan 23 05:33:37 np0005593234 podman[313148]: 2026-01-23 10:33:37.717603506 +0000 UTC m=+0.171965914 container died 198c282eb40edb8278a04ef085e6ae0e142bb4da2deee35a8f1cd3df178433bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_buck, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Jan 23 05:33:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:33:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:33:37 np0005593234 systemd[1]: var-lib-containers-storage-overlay-5ad14d8d3e1f36914b0bdc8024fedd5cb232663f683816caafc7ffc5ceb0e7b3-merged.mount: Deactivated successfully.
Jan 23 05:33:37 np0005593234 podman[313148]: 2026-01-23 10:33:37.756909164 +0000 UTC m=+0.211271582 container remove 198c282eb40edb8278a04ef085e6ae0e142bb4da2deee35a8f1cd3df178433bd (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=sad_buck, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, io.buildah.version=1.39.3)
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.771 227766 INFO nova.compute.manager [None req-9f8064ab-4cbf-4851-8159-e4e893b33121 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Unrescuing
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.772 227766 DEBUG oslo_concurrency.lockutils [None req-9f8064ab-4cbf-4851-8159-e4e893b33121 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.772 227766 DEBUG oslo_concurrency.lockutils [None req-9f8064ab-4cbf-4851-8159-e4e893b33121 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquired lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:33:37 np0005593234 nova_compute[227762]: 2026-01-23 10:33:37.773 227766 DEBUG nova.network.neutron [None req-9f8064ab-4cbf-4851-8159-e4e893b33121 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 05:33:37 np0005593234 systemd[1]: libpod-conmon-198c282eb40edb8278a04ef085e6ae0e142bb4da2deee35a8f1cd3df178433bd.scope: Deactivated successfully.
Jan 23 05:33:38 np0005593234 podman[313188]: 2026-01-23 10:33:38.006170042 +0000 UTC m=+0.112054252 container create 6e5ab2d2c4d3eea98628339c8e62b5fcf3efb6d71b3ebfa00a59e270babd9c71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_williamson, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
Jan 23 05:33:38 np0005593234 podman[313188]: 2026-01-23 10:33:37.916780419 +0000 UTC m=+0.022664649 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 05:33:38 np0005593234 systemd[1]: Started libpod-conmon-6e5ab2d2c4d3eea98628339c8e62b5fcf3efb6d71b3ebfa00a59e270babd9c71.scope.
Jan 23 05:33:38 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:33:38 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed42324cbe2642b433ce3c88d4be607d65ba1d068a595fe4ca5d9a60c59d9f05/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 05:33:38 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed42324cbe2642b433ce3c88d4be607d65ba1d068a595fe4ca5d9a60c59d9f05/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 05:33:38 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed42324cbe2642b433ce3c88d4be607d65ba1d068a595fe4ca5d9a60c59d9f05/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 05:33:38 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ed42324cbe2642b433ce3c88d4be607d65ba1d068a595fe4ca5d9a60c59d9f05/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 05:33:38 np0005593234 podman[313188]: 2026-01-23 10:33:38.096491943 +0000 UTC m=+0.202376163 container init 6e5ab2d2c4d3eea98628339c8e62b5fcf3efb6d71b3ebfa00a59e270babd9c71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_williamson, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.39.3, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:33:38 np0005593234 podman[313188]: 2026-01-23 10:33:38.104914617 +0000 UTC m=+0.210798827 container start 6e5ab2d2c4d3eea98628339c8e62b5fcf3efb6d71b3ebfa00a59e270babd9c71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_williamson, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, CEPH_REF=reef, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507)
Jan 23 05:33:38 np0005593234 podman[313188]: 2026-01-23 10:33:38.108131337 +0000 UTC m=+0.214015537 container attach 6e5ab2d2c4d3eea98628339c8e62b5fcf3efb6d71b3ebfa00a59e270babd9c71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_williamson, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 05:33:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:38Z|00789|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:33:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:38Z|00790|binding|INFO|Releasing lport f6117e93-58bd-4099-b49e-913018961730 from this chassis (sb_readonly=0)
Jan 23 05:33:38 np0005593234 nova_compute[227762]: 2026-01-23 10:33:38.209 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:38Z|00791|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:33:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:38Z|00792|binding|INFO|Releasing lport f6117e93-58bd-4099-b49e-913018961730 from this chassis (sb_readonly=0)
Jan 23 05:33:38 np0005593234 nova_compute[227762]: 2026-01-23 10:33:38.332 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:38 np0005593234 nova_compute[227762]: 2026-01-23 10:33:38.582 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:38.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:38 np0005593234 nova_compute[227762]: 2026-01-23 10:33:38.869 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]: [
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:    {
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:        "available": false,
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:        "ceph_device": false,
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:        "lsm_data": {},
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:        "lvs": [],
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:        "path": "/dev/sr0",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:        "rejected_reasons": [
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "Has a FileSystem",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "Insufficient space (<5GB)"
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:        ],
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:        "sys_api": {
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "actuators": null,
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "device_nodes": "sr0",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "devname": "sr0",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "human_readable_size": "482.00 KB",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "id_bus": "ata",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "model": "QEMU DVD-ROM",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "nr_requests": "2",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "parent": "/dev/sr0",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "partitions": {},
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "path": "/dev/sr0",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "removable": "1",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "rev": "2.5+",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "ro": "0",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "rotational": "1",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "sas_address": "",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "sas_device_handle": "",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "scheduler_mode": "mq-deadline",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "sectors": 0,
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "sectorsize": "2048",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "size": 493568.0,
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "support_discard": "2048",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "type": "disk",
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:            "vendor": "QEMU"
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:        }
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]:    }
Jan 23 05:33:39 np0005593234 naughty_williamson[313205]: ]
Jan 23 05:33:39 np0005593234 podman[313188]: 2026-01-23 10:33:39.472503126 +0000 UTC m=+1.578387346 container died 6e5ab2d2c4d3eea98628339c8e62b5fcf3efb6d71b3ebfa00a59e270babd9c71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_williamson, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, CEPH_REF=reef, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.39.3, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad)
Jan 23 05:33:39 np0005593234 systemd[1]: libpod-6e5ab2d2c4d3eea98628339c8e62b5fcf3efb6d71b3ebfa00a59e270babd9c71.scope: Deactivated successfully.
Jan 23 05:33:39 np0005593234 systemd[1]: libpod-6e5ab2d2c4d3eea98628339c8e62b5fcf3efb6d71b3ebfa00a59e270babd9c71.scope: Consumed 1.281s CPU time.
Jan 23 05:33:39 np0005593234 systemd[1]: var-lib-containers-storage-overlay-ed42324cbe2642b433ce3c88d4be607d65ba1d068a595fe4ca5d9a60c59d9f05-merged.mount: Deactivated successfully.
Jan 23 05:33:39 np0005593234 podman[313188]: 2026-01-23 10:33:39.536843806 +0000 UTC m=+1.642728016 container remove 6e5ab2d2c4d3eea98628339c8e62b5fcf3efb6d71b3ebfa00a59e270babd9c71 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=naughty_williamson, ceph=True, OSD_FLAVOR=default, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 05:33:39 np0005593234 systemd[1]: libpod-conmon-6e5ab2d2c4d3eea98628339c8e62b5fcf3efb6d71b3ebfa00a59e270babd9c71.scope: Deactivated successfully.
Jan 23 05:33:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:39.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:33:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:33:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:33:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:33:39 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:33:40 np0005593234 nova_compute[227762]: 2026-01-23 10:33:40.076 227766 INFO nova.compute.manager [None req-faca779d-4036-42ac-8347-30092286d11e 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Get console output#033[00m
Jan 23 05:33:40 np0005593234 nova_compute[227762]: 2026-01-23 10:33:40.083 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:33:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:40.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.007 227766 DEBUG nova.network.neutron [None req-9f8064ab-4cbf-4851-8159-e4e893b33121 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Updating instance_info_cache with network_info: [{"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.043 227766 DEBUG oslo_concurrency.lockutils [None req-9f8064ab-4cbf-4851-8159-e4e893b33121 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Releasing lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.044 227766 DEBUG nova.objects.instance [None req-9f8064ab-4cbf-4851-8159-e4e893b33121 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'flavor' on Instance uuid 1114ae68-dab9-46b3-abab-53f135df78d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.349 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:41 np0005593234 NetworkManager[48942]: <info>  [1769164421.3496] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/376)
Jan 23 05:33:41 np0005593234 NetworkManager[48942]: <info>  [1769164421.3508] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/377)
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.453 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:41Z|00793|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:33:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:41Z|00794|binding|INFO|Releasing lport f6117e93-58bd-4099-b49e-913018961730 from this chassis (sb_readonly=0)
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.465 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:41 np0005593234 kernel: tap1827509f-e3 (unregistering): left promiscuous mode
Jan 23 05:33:41 np0005593234 NetworkManager[48942]: <info>  [1769164421.4971] device (tap1827509f-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:33:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:41Z|00795|binding|INFO|Releasing lport 1827509f-e3b0-49ea-b1ff-982db21148b8 from this chassis (sb_readonly=0)
Jan 23 05:33:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:41Z|00796|binding|INFO|Setting lport 1827509f-e3b0-49ea-b1ff-982db21148b8 down in Southbound
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.506 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:41Z|00797|binding|INFO|Removing iface tap1827509f-e3 ovn-installed in OVS
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.508 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.516 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:16:71 10.100.0.14'], port_security=['fa:16:3e:02:16:71 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1114ae68-dab9-46b3-abab-53f135df78d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '6', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=1827509f-e3b0-49ea-b1ff-982db21148b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.517 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 1827509f-e3b0-49ea-b1ff-982db21148b8 in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 unbound from our chassis#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.518 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7d5530f-5227-4f75-bac0-2604bb3d68e2#033[00m
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.522 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.535 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[39a2947f-a0fe-4acb-b7a8-e5f16d9513c3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:41 np0005593234 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000b5.scope: Deactivated successfully.
Jan 23 05:33:41 np0005593234 systemd[1]: machine-qemu\x2d88\x2dinstance\x2d000000b5.scope: Consumed 5.063s CPU time.
Jan 23 05:33:41 np0005593234 systemd-machined[195626]: Machine qemu-88-instance-000000b5 terminated.
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.562 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4e658611-0697-46e5-9621-c8de68fe6715]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.565 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[800ab300-431c-4ba1-afad-807673108b1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.592 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[3564b9a9-e920-4fde-bd59-247c28b9f0e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.610 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a994656e-d1bc-4a0a-b2af-534915e4bf47]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 11, 'rx_bytes': 658, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 11, 'rx_bytes': 658, 'tx_bytes': 606, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815735, 'reachable_time': 37082, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 314658, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.626 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[07b05270-b7d9-4528-b760-c7aa323094e4]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd7d5530f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815745, 'tstamp': 815745}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 314659, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd7d5530f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815749, 'tstamp': 815749}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 314659, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.628 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.630 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.634 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.635 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d5530f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.635 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.636 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7d5530f-50, col_values=(('external_ids', {'iface-id': '4c99eeb5-c437-4d31-ac3b-bfd151140733'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.636 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:33:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:41.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.703 227766 INFO nova.virt.libvirt.driver [-] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Instance destroyed successfully.#033[00m
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.703 227766 DEBUG nova.objects.instance [None req-9f8064ab-4cbf-4851-8159-e4e893b33121 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'numa_topology' on Instance uuid 1114ae68-dab9-46b3-abab-53f135df78d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:41 np0005593234 kernel: tap1827509f-e3: entered promiscuous mode
Jan 23 05:33:41 np0005593234 systemd-udevd[314651]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:33:41 np0005593234 NetworkManager[48942]: <info>  [1769164421.8026] manager: (tap1827509f-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/378)
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.804 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:41Z|00798|binding|INFO|Claiming lport 1827509f-e3b0-49ea-b1ff-982db21148b8 for this chassis.
Jan 23 05:33:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:41Z|00799|binding|INFO|1827509f-e3b0-49ea-b1ff-982db21148b8: Claiming fa:16:3e:02:16:71 10.100.0.14
Jan 23 05:33:41 np0005593234 NetworkManager[48942]: <info>  [1769164421.8145] device (tap1827509f-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:33:41 np0005593234 NetworkManager[48942]: <info>  [1769164421.8155] device (tap1827509f-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.817 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:16:71 10.100.0.14'], port_security=['fa:16:3e:02:16:71 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1114ae68-dab9-46b3-abab-53f135df78d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '6', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=1827509f-e3b0-49ea-b1ff-982db21148b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.818 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 1827509f-e3b0-49ea-b1ff-982db21148b8 in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 bound to our chassis#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.819 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7d5530f-5227-4f75-bac0-2604bb3d68e2#033[00m
Jan 23 05:33:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:41Z|00800|binding|INFO|Setting lport 1827509f-e3b0-49ea-b1ff-982db21148b8 ovn-installed in OVS
Jan 23 05:33:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:41Z|00801|binding|INFO|Setting lport 1827509f-e3b0-49ea-b1ff-982db21148b8 up in Southbound
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.823 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.827 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.835 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bd44c9dd-0270-4009-a9f3-87732899abb3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:41 np0005593234 systemd-machined[195626]: New machine qemu-89-instance-000000b5.
Jan 23 05:33:41 np0005593234 systemd[1]: Started Virtual Machine qemu-89-instance-000000b5.
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.865 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[c92ca996-67c0-45b7-84e4-7c18401cb5e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.868 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[1a00d181-6ccd-404d-9f6c-9df1d7c6fd64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.898 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[001f2b10-de15-41f7-a04c-8e378547efe4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.915 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d9e0f476-b512-4fff-8ac3-accd1afcfd40]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 9, 'tx_packets': 13, 'rx_bytes': 658, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 9, 'tx_packets': 13, 'rx_bytes': 658, 'tx_bytes': 690, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815735, 'reachable_time': 37082, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 314697, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.931 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7b2928f1-ccea-4ff9-b7e6-59b776d813c2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd7d5530f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815745, 'tstamp': 815745}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 314698, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd7d5530f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815749, 'tstamp': 815749}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 314698, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.932 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.934 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:41 np0005593234 nova_compute[227762]: 2026-01-23 10:33:41.935 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.935 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d5530f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.935 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.935 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7d5530f-50, col_values=(('external_ids', {'iface-id': '4c99eeb5-c437-4d31-ac3b-bfd151140733'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:41.936 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:33:42 np0005593234 nova_compute[227762]: 2026-01-23 10:33:42.410 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for 1114ae68-dab9-46b3-abab-53f135df78d8 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:33:42 np0005593234 nova_compute[227762]: 2026-01-23 10:33:42.411 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164422.410222, 1114ae68-dab9-46b3-abab-53f135df78d8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:33:42 np0005593234 nova_compute[227762]: 2026-01-23 10:33:42.411 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:33:42 np0005593234 nova_compute[227762]: 2026-01-23 10:33:42.511 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:42 np0005593234 nova_compute[227762]: 2026-01-23 10:33:42.515 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:33:42 np0005593234 nova_compute[227762]: 2026-01-23 10:33:42.553 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 23 05:33:42 np0005593234 nova_compute[227762]: 2026-01-23 10:33:42.554 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164422.411242, 1114ae68-dab9-46b3-abab-53f135df78d8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:33:42 np0005593234 nova_compute[227762]: 2026-01-23 10:33:42.554 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] VM Started (Lifecycle Event)#033[00m
Jan 23 05:33:42 np0005593234 nova_compute[227762]: 2026-01-23 10:33:42.576 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:42 np0005593234 nova_compute[227762]: 2026-01-23 10:33:42.580 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Synchronizing instance power state after lifecycle event "Started"; current vm_state: rescued, current task_state: unrescuing, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:33:42 np0005593234 nova_compute[227762]: 2026-01-23 10:33:42.605 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] During sync_power_state the instance has a pending task (unrescuing). Skip.#033[00m
Jan 23 05:33:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:42.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:42.869 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:42.870 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:42.871 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:43 np0005593234 nova_compute[227762]: 2026-01-23 10:33:43.157 227766 DEBUG nova.compute.manager [req-832bd869-ce41-405e-9850-93d1329f1edb req-e028f560-290f-454e-9cf7-dce8d19813a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-vif-unplugged-1827509f-e3b0-49ea-b1ff-982db21148b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:43 np0005593234 nova_compute[227762]: 2026-01-23 10:33:43.158 227766 DEBUG oslo_concurrency.lockutils [req-832bd869-ce41-405e-9850-93d1329f1edb req-e028f560-290f-454e-9cf7-dce8d19813a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:43 np0005593234 nova_compute[227762]: 2026-01-23 10:33:43.158 227766 DEBUG oslo_concurrency.lockutils [req-832bd869-ce41-405e-9850-93d1329f1edb req-e028f560-290f-454e-9cf7-dce8d19813a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:43 np0005593234 nova_compute[227762]: 2026-01-23 10:33:43.158 227766 DEBUG oslo_concurrency.lockutils [req-832bd869-ce41-405e-9850-93d1329f1edb req-e028f560-290f-454e-9cf7-dce8d19813a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:43 np0005593234 nova_compute[227762]: 2026-01-23 10:33:43.158 227766 DEBUG nova.compute.manager [req-832bd869-ce41-405e-9850-93d1329f1edb req-e028f560-290f-454e-9cf7-dce8d19813a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] No waiting events found dispatching network-vif-unplugged-1827509f-e3b0-49ea-b1ff-982db21148b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:33:43 np0005593234 nova_compute[227762]: 2026-01-23 10:33:43.159 227766 WARNING nova.compute.manager [req-832bd869-ce41-405e-9850-93d1329f1edb req-e028f560-290f-454e-9cf7-dce8d19813a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received unexpected event network-vif-unplugged-1827509f-e3b0-49ea-b1ff-982db21148b8 for instance with vm_state rescued and task_state unrescuing.#033[00m
Jan 23 05:33:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:43 np0005593234 nova_compute[227762]: 2026-01-23 10:33:43.307 227766 INFO nova.compute.manager [None req-b19bb50a-f439-4a49-806c-abf7a5ac2faf 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Get console output#033[00m
Jan 23 05:33:43 np0005593234 nova_compute[227762]: 2026-01-23 10:33:43.312 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:33:43 np0005593234 nova_compute[227762]: 2026-01-23 10:33:43.586 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:43.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:43 np0005593234 nova_compute[227762]: 2026-01-23 10:33:43.865 227766 DEBUG nova.compute.manager [None req-9f8064ab-4cbf-4851-8159-e4e893b33121 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:33:43 np0005593234 nova_compute[227762]: 2026-01-23 10:33:43.871 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:44.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:45 np0005593234 nova_compute[227762]: 2026-01-23 10:33:45.325 227766 DEBUG nova.compute.manager [req-5d093d5f-f081-4f3c-8dbc-a9a375a3c64d req-e44ae21e-1f7a-483d-ad95-af98661241db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:45 np0005593234 nova_compute[227762]: 2026-01-23 10:33:45.326 227766 DEBUG oslo_concurrency.lockutils [req-5d093d5f-f081-4f3c-8dbc-a9a375a3c64d req-e44ae21e-1f7a-483d-ad95-af98661241db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:45 np0005593234 nova_compute[227762]: 2026-01-23 10:33:45.326 227766 DEBUG oslo_concurrency.lockutils [req-5d093d5f-f081-4f3c-8dbc-a9a375a3c64d req-e44ae21e-1f7a-483d-ad95-af98661241db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:45 np0005593234 nova_compute[227762]: 2026-01-23 10:33:45.326 227766 DEBUG oslo_concurrency.lockutils [req-5d093d5f-f081-4f3c-8dbc-a9a375a3c64d req-e44ae21e-1f7a-483d-ad95-af98661241db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:45 np0005593234 nova_compute[227762]: 2026-01-23 10:33:45.326 227766 DEBUG nova.compute.manager [req-5d093d5f-f081-4f3c-8dbc-a9a375a3c64d req-e44ae21e-1f7a-483d-ad95-af98661241db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] No waiting events found dispatching network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:33:45 np0005593234 nova_compute[227762]: 2026-01-23 10:33:45.327 227766 WARNING nova.compute.manager [req-5d093d5f-f081-4f3c-8dbc-a9a375a3c64d req-e44ae21e-1f7a-483d-ad95-af98661241db 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received unexpected event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:33:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:45.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:46.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:46 np0005593234 podman[314763]: 2026-01-23 10:33:46.768536193 +0000 UTC m=+0.055924478 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:33:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:46.957 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '68'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:47 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:33:47 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:33:47 np0005593234 nova_compute[227762]: 2026-01-23 10:33:47.577 227766 DEBUG nova.compute.manager [req-f2815f8f-a649-415e-9b0f-f7acb5419cdd req-37f1c613-9c0f-4662-863f-6064c3b985a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:47 np0005593234 nova_compute[227762]: 2026-01-23 10:33:47.577 227766 DEBUG oslo_concurrency.lockutils [req-f2815f8f-a649-415e-9b0f-f7acb5419cdd req-37f1c613-9c0f-4662-863f-6064c3b985a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:47 np0005593234 nova_compute[227762]: 2026-01-23 10:33:47.577 227766 DEBUG oslo_concurrency.lockutils [req-f2815f8f-a649-415e-9b0f-f7acb5419cdd req-37f1c613-9c0f-4662-863f-6064c3b985a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:47 np0005593234 nova_compute[227762]: 2026-01-23 10:33:47.577 227766 DEBUG oslo_concurrency.lockutils [req-f2815f8f-a649-415e-9b0f-f7acb5419cdd req-37f1c613-9c0f-4662-863f-6064c3b985a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:47 np0005593234 nova_compute[227762]: 2026-01-23 10:33:47.578 227766 DEBUG nova.compute.manager [req-f2815f8f-a649-415e-9b0f-f7acb5419cdd req-37f1c613-9c0f-4662-863f-6064c3b985a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] No waiting events found dispatching network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:33:47 np0005593234 nova_compute[227762]: 2026-01-23 10:33:47.578 227766 WARNING nova.compute.manager [req-f2815f8f-a649-415e-9b0f-f7acb5419cdd req-37f1c613-9c0f-4662-863f-6064c3b985a3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received unexpected event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:33:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:47.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.060 227766 DEBUG oslo_concurrency.lockutils [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "298e1080-6898-4a9b-903e-052965024e8a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.061 227766 DEBUG oslo_concurrency.lockutils [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.061 227766 DEBUG oslo_concurrency.lockutils [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "298e1080-6898-4a9b-903e-052965024e8a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.062 227766 DEBUG oslo_concurrency.lockutils [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.062 227766 DEBUG oslo_concurrency.lockutils [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.065 227766 INFO nova.compute.manager [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Terminating instance#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.067 227766 DEBUG nova.compute.manager [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:33:48 np0005593234 kernel: tapb6ddc2d2-27 (unregistering): left promiscuous mode
Jan 23 05:33:48 np0005593234 NetworkManager[48942]: <info>  [1769164428.2136] device (tapb6ddc2d2-27): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:33:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:48Z|00802|binding|INFO|Releasing lport b6ddc2d2-277d-4859-8c63-6920fe72886a from this chassis (sb_readonly=0)
Jan 23 05:33:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:48Z|00803|binding|INFO|Setting lport b6ddc2d2-277d-4859-8c63-6920fe72886a down in Southbound
Jan 23 05:33:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:48Z|00804|binding|INFO|Removing iface tapb6ddc2d2-27 ovn-installed in OVS
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.227 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:48.233 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c2:a4:da 10.100.0.9'], port_security=['fa:16:3e:c2:a4:da 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '298e1080-6898-4a9b-903e-052965024e8a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '98c94577fcdb4c3d893898ede79ea2d4', 'neutron:revision_number': '4', 'neutron:security_group_ids': '59524fa5-1e38-4c78-bba4-1817dab86850', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8bc6b799-b691-4f6b-8f97-c0d3cab241b4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=b6ddc2d2-277d-4859-8c63-6920fe72886a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:33:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:48.234 144381 INFO neutron.agent.ovn.metadata.agent [-] Port b6ddc2d2-277d-4859-8c63-6920fe72886a in datapath bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998 unbound from our chassis#033[00m
Jan 23 05:33:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:48.235 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:33:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:48.237 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ebe6e380-0eaf-4f0a-86ca-29e4fb63bd5a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:48.238 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998 namespace which is not needed anymore#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.240 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:48 np0005593234 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b6.scope: Deactivated successfully.
Jan 23 05:33:48 np0005593234 systemd[1]: machine-qemu\x2d87\x2dinstance\x2d000000b6.scope: Consumed 13.809s CPU time.
Jan 23 05:33:48 np0005593234 systemd-machined[195626]: Machine qemu-87-instance-000000b6 terminated.
Jan 23 05:33:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.290 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.295 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.304 227766 INFO nova.virt.libvirt.driver [-] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Instance destroyed successfully.#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.304 227766 DEBUG nova.objects.instance [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lazy-loading 'resources' on Instance uuid 298e1080-6898-4a9b-903e-052965024e8a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.334 227766 DEBUG nova.virt.libvirt.vif [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:32:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1855261785',display_name='tempest-TestNetworkBasicOps-server-1855261785',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1855261785',id=182,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCxvuKoRCdU6NfGCJ9/K6lYehfBrYexe6JocWr8Q1ZD1CqGS7uFQ9Epr7CcEOwAFDI68GcaE9FQsMJxvu2ytNjTEF1iupLEVG5hBjQgiqOY7KJssQlPmRhqcGHft2KD/pQ==',key_name='tempest-TestNetworkBasicOps-843741360',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:33:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='98c94577fcdb4c3d893898ede79ea2d4',ramdisk_id='',reservation_id='r-0kakalu9',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-789276745',owner_user_name='tempest-TestNetworkBasicOps-789276745-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:33:16Z,user_data=None,user_id='60291ce86b6946629a2e48f6680312cb',uuid=298e1080-6898-4a9b-903e-052965024e8a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "address": "fa:16:3e:c2:a4:da", "network": {"id": "bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998", "bridge": "br-int", "label": "tempest-network-smoke--156740505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6ddc2d2-27", "ovs_interfaceid": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.335 227766 DEBUG nova.network.os_vif_util [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converting VIF {"id": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "address": "fa:16:3e:c2:a4:da", "network": {"id": "bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998", "bridge": "br-int", "label": "tempest-network-smoke--156740505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.190", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6ddc2d2-27", "ovs_interfaceid": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.335 227766 DEBUG nova.network.os_vif_util [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c2:a4:da,bridge_name='br-int',has_traffic_filtering=True,id=b6ddc2d2-277d-4859-8c63-6920fe72886a,network=Network(bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6ddc2d2-27') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.336 227766 DEBUG os_vif [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:a4:da,bridge_name='br-int',has_traffic_filtering=True,id=b6ddc2d2-277d-4859-8c63-6920fe72886a,network=Network(bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6ddc2d2-27') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.338 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.338 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb6ddc2d2-27, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.340 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.342 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.345 227766 INFO os_vif [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c2:a4:da,bridge_name='br-int',has_traffic_filtering=True,id=b6ddc2d2-277d-4859-8c63-6920fe72886a,network=Network(bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb6ddc2d2-27')#033[00m
Jan 23 05:33:48 np0005593234 neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998[312213]: [NOTICE]   (312217) : haproxy version is 2.8.14-c23fe91
Jan 23 05:33:48 np0005593234 neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998[312213]: [NOTICE]   (312217) : path to executable is /usr/sbin/haproxy
Jan 23 05:33:48 np0005593234 neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998[312213]: [WARNING]  (312217) : Exiting Master process...
Jan 23 05:33:48 np0005593234 neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998[312213]: [ALERT]    (312217) : Current worker (312219) exited with code 143 (Terminated)
Jan 23 05:33:48 np0005593234 neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998[312213]: [WARNING]  (312217) : All workers exited. Exiting... (0)
Jan 23 05:33:48 np0005593234 systemd[1]: libpod-81b716528d990023997ef0f4437f321c00fcdc1ea36688ef67ca51c3e11bc2e1.scope: Deactivated successfully.
Jan 23 05:33:48 np0005593234 podman[314869]: 2026-01-23 10:33:48.389644313 +0000 UTC m=+0.044251924 container died 81b716528d990023997ef0f4437f321c00fcdc1ea36688ef67ca51c3e11bc2e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:33:48 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-81b716528d990023997ef0f4437f321c00fcdc1ea36688ef67ca51c3e11bc2e1-userdata-shm.mount: Deactivated successfully.
Jan 23 05:33:48 np0005593234 systemd[1]: var-lib-containers-storage-overlay-8de1384971c726afa0086383a50c471889291a9e2da7aa3d907db0815419f56c-merged.mount: Deactivated successfully.
Jan 23 05:33:48 np0005593234 podman[314869]: 2026-01-23 10:33:48.425811053 +0000 UTC m=+0.080418664 container cleanup 81b716528d990023997ef0f4437f321c00fcdc1ea36688ef67ca51c3e11bc2e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 05:33:48 np0005593234 systemd[1]: libpod-conmon-81b716528d990023997ef0f4437f321c00fcdc1ea36688ef67ca51c3e11bc2e1.scope: Deactivated successfully.
Jan 23 05:33:48 np0005593234 podman[314911]: 2026-01-23 10:33:48.505806372 +0000 UTC m=+0.055356640 container remove 81b716528d990023997ef0f4437f321c00fcdc1ea36688ef67ca51c3e11bc2e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:33:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:48.513 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bce47f80-a082-4b00-9565-d20ccfe331a0]: (4, ('Fri Jan 23 10:33:48 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998 (81b716528d990023997ef0f4437f321c00fcdc1ea36688ef67ca51c3e11bc2e1)\n81b716528d990023997ef0f4437f321c00fcdc1ea36688ef67ca51c3e11bc2e1\nFri Jan 23 10:33:48 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998 (81b716528d990023997ef0f4437f321c00fcdc1ea36688ef67ca51c3e11bc2e1)\n81b716528d990023997ef0f4437f321c00fcdc1ea36688ef67ca51c3e11bc2e1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:48.515 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6b4c5ab1-f25a-41f0-a3e1-5a2e4c9fb7c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:48.516 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbd8a7a46-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.517 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:48 np0005593234 kernel: tapbd8a7a46-f0: left promiscuous mode
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.520 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:48.526 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[111529c7-a127-4342-9766-91d418392857]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.538 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:48.551 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cc366364-32e3-422f-9e3b-06a8331ef35b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:48.553 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d4fe8578-b7bd-4f15-aad0-c67b5b85df88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:48.577 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ac25b28c-1050-455c-be77-fd0d180850ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 828504, 'reachable_time': 17510, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 314924, 'error': None, 'target': 'ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:48 np0005593234 systemd[1]: run-netns-ovnmeta\x2dbd8a7a46\x2df9bb\x2d4df9\x2dbc71\x2d7cdb77bd5998.mount: Deactivated successfully.
Jan 23 05:33:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:48.581 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:33:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:33:48.581 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[60edec18-c545-4eda-83d8-0d3c3c1aa528]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:33:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:48.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.836 227766 INFO nova.virt.libvirt.driver [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Deleting instance files /var/lib/nova/instances/298e1080-6898-4a9b-903e-052965024e8a_del#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.837 227766 INFO nova.virt.libvirt.driver [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Deletion of /var/lib/nova/instances/298e1080-6898-4a9b-903e-052965024e8a_del complete#033[00m
Jan 23 05:33:48 np0005593234 nova_compute[227762]: 2026-01-23 10:33:48.871 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:49 np0005593234 nova_compute[227762]: 2026-01-23 10:33:49.115 227766 INFO nova.compute.manager [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Took 1.05 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:33:49 np0005593234 nova_compute[227762]: 2026-01-23 10:33:49.116 227766 DEBUG oslo.service.loopingcall [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:33:49 np0005593234 nova_compute[227762]: 2026-01-23 10:33:49.116 227766 DEBUG nova.compute.manager [-] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:33:49 np0005593234 nova_compute[227762]: 2026-01-23 10:33:49.117 227766 DEBUG nova.network.neutron [-] [instance: 298e1080-6898-4a9b-903e-052965024e8a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:33:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:49.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:49 np0005593234 nova_compute[227762]: 2026-01-23 10:33:49.891 227766 DEBUG nova.compute.manager [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Received event network-changed-b6ddc2d2-277d-4859-8c63-6920fe72886a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:49 np0005593234 nova_compute[227762]: 2026-01-23 10:33:49.891 227766 DEBUG nova.compute.manager [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Refreshing instance network info cache due to event network-changed-b6ddc2d2-277d-4859-8c63-6920fe72886a. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:33:49 np0005593234 nova_compute[227762]: 2026-01-23 10:33:49.892 227766 DEBUG oslo_concurrency.lockutils [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-298e1080-6898-4a9b-903e-052965024e8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:33:49 np0005593234 nova_compute[227762]: 2026-01-23 10:33:49.892 227766 DEBUG oslo_concurrency.lockutils [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-298e1080-6898-4a9b-903e-052965024e8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:33:49 np0005593234 nova_compute[227762]: 2026-01-23 10:33:49.892 227766 DEBUG nova.network.neutron [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Refreshing network info cache for port b6ddc2d2-277d-4859-8c63-6920fe72886a _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:33:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:50.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:51.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:52 np0005593234 nova_compute[227762]: 2026-01-23 10:33:52.119 227766 DEBUG nova.compute.manager [req-6d5b7389-f0e5-4f69-b8a5-f47b106c678d req-2755a9df-8e84-43f5-a97f-81591e1fbb24 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Received event network-vif-plugged-b6ddc2d2-277d-4859-8c63-6920fe72886a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:52 np0005593234 nova_compute[227762]: 2026-01-23 10:33:52.120 227766 DEBUG oslo_concurrency.lockutils [req-6d5b7389-f0e5-4f69-b8a5-f47b106c678d req-2755a9df-8e84-43f5-a97f-81591e1fbb24 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "298e1080-6898-4a9b-903e-052965024e8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:52 np0005593234 nova_compute[227762]: 2026-01-23 10:33:52.120 227766 DEBUG oslo_concurrency.lockutils [req-6d5b7389-f0e5-4f69-b8a5-f47b106c678d req-2755a9df-8e84-43f5-a97f-81591e1fbb24 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:52 np0005593234 nova_compute[227762]: 2026-01-23 10:33:52.121 227766 DEBUG oslo_concurrency.lockutils [req-6d5b7389-f0e5-4f69-b8a5-f47b106c678d req-2755a9df-8e84-43f5-a97f-81591e1fbb24 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:52 np0005593234 nova_compute[227762]: 2026-01-23 10:33:52.121 227766 DEBUG nova.compute.manager [req-6d5b7389-f0e5-4f69-b8a5-f47b106c678d req-2755a9df-8e84-43f5-a97f-81591e1fbb24 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] No waiting events found dispatching network-vif-plugged-b6ddc2d2-277d-4859-8c63-6920fe72886a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:33:52 np0005593234 nova_compute[227762]: 2026-01-23 10:33:52.121 227766 WARNING nova.compute.manager [req-6d5b7389-f0e5-4f69-b8a5-f47b106c678d req-2755a9df-8e84-43f5-a97f-81591e1fbb24 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Received unexpected event network-vif-plugged-b6ddc2d2-277d-4859-8c63-6920fe72886a for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:33:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:52.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:53 np0005593234 nova_compute[227762]: 2026-01-23 10:33:53.342 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:53.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:53 np0005593234 nova_compute[227762]: 2026-01-23 10:33:53.873 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:54 np0005593234 nova_compute[227762]: 2026-01-23 10:33:54.593 227766 DEBUG nova.network.neutron [-] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:33:54 np0005593234 nova_compute[227762]: 2026-01-23 10:33:54.650 227766 INFO nova.compute.manager [-] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Took 5.53 seconds to deallocate network for instance.#033[00m
Jan 23 05:33:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:33:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:54.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:33:54 np0005593234 nova_compute[227762]: 2026-01-23 10:33:54.903 227766 DEBUG oslo_concurrency.lockutils [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:54 np0005593234 nova_compute[227762]: 2026-01-23 10:33:54.904 227766 DEBUG oslo_concurrency.lockutils [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:55 np0005593234 nova_compute[227762]: 2026-01-23 10:33:55.002 227766 DEBUG nova.compute.manager [req-97353647-b097-4bc9-a480-c70d465de88c req-769a2bbc-d57d-4ec0-bb84-57c56444f3f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Received event network-vif-deleted-b6ddc2d2-277d-4859-8c63-6920fe72886a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:55 np0005593234 nova_compute[227762]: 2026-01-23 10:33:55.174 227766 DEBUG oslo_concurrency.processutils [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:33:55 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1087747975' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:33:55 np0005593234 nova_compute[227762]: 2026-01-23 10:33:55.632 227766 DEBUG oslo_concurrency.processutils [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:55 np0005593234 nova_compute[227762]: 2026-01-23 10:33:55.638 227766 DEBUG nova.compute.provider_tree [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:33:55 np0005593234 nova_compute[227762]: 2026-01-23 10:33:55.674 227766 DEBUG nova.scheduler.client.report [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:33:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:55.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:55 np0005593234 nova_compute[227762]: 2026-01-23 10:33:55.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:55 np0005593234 nova_compute[227762]: 2026-01-23 10:33:55.773 227766 DEBUG oslo_concurrency.lockutils [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.869s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:55 np0005593234 podman[314951]: 2026-01-23 10:33:55.796781581 +0000 UTC m=+0.090476078 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:33:55 np0005593234 nova_compute[227762]: 2026-01-23 10:33:55.893 227766 INFO nova.scheduler.client.report [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Deleted allocations for instance 298e1080-6898-4a9b-903e-052965024e8a#033[00m
Jan 23 05:33:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:33:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:56.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:33:56 np0005593234 nova_compute[227762]: 2026-01-23 10:33:56.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:33:56 np0005593234 ovn_controller[134547]: 2026-01-23T10:33:56Z|00105|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:02:16:71 10.100.0.14
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.006 227766 DEBUG oslo_concurrency.lockutils [None req-07b4648e-d704-430b-805a-2e16aa4b1b1f 60291ce86b6946629a2e48f6680312cb 98c94577fcdb4c3d893898ede79ea2d4 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.945s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.047 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.048 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.048 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.049 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.049 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.406 227766 DEBUG nova.network.neutron [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Updated VIF entry in instance network info cache for port b6ddc2d2-277d-4859-8c63-6920fe72886a. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.408 227766 DEBUG nova.network.neutron [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Updating instance_info_cache with network_info: [{"id": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "address": "fa:16:3e:c2:a4:da", "network": {"id": "bd8a7a46-f9bb-4df9-bc71-7cdb77bd5998", "bridge": "br-int", "label": "tempest-network-smoke--156740505", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "98c94577fcdb4c3d893898ede79ea2d4", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb6ddc2d2-27", "ovs_interfaceid": "b6ddc2d2-277d-4859-8c63-6920fe72886a", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:33:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:33:57 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/967992686' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.500 227766 DEBUG oslo_concurrency.lockutils [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-298e1080-6898-4a9b-903e-052965024e8a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.500 227766 DEBUG nova.compute.manager [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.501 227766 DEBUG oslo_concurrency.lockutils [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.501 227766 DEBUG oslo_concurrency.lockutils [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.502 227766 DEBUG oslo_concurrency.lockutils [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.502 227766 DEBUG nova.compute.manager [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] No waiting events found dispatching network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.502 227766 WARNING nova.compute.manager [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received unexpected event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.503 227766 DEBUG nova.compute.manager [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Received event network-vif-unplugged-b6ddc2d2-277d-4859-8c63-6920fe72886a external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.503 227766 DEBUG oslo_concurrency.lockutils [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "298e1080-6898-4a9b-903e-052965024e8a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.503 227766 DEBUG oslo_concurrency.lockutils [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.504 227766 DEBUG oslo_concurrency.lockutils [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "298e1080-6898-4a9b-903e-052965024e8a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.504 227766 DEBUG nova.compute.manager [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] No waiting events found dispatching network-vif-unplugged-b6ddc2d2-277d-4859-8c63-6920fe72886a pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.504 227766 DEBUG nova.compute.manager [req-d8c6bde3-d38c-4d9c-81f2-49fb8448ce56 req-114ffc85-d390-475f-bada-6204f9c1bc56 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Received event network-vif-unplugged-b6ddc2d2-277d-4859-8c63-6920fe72886a for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.505 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.623 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.624 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.626 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000b5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.627 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000b5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:33:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:57.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.791 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.793 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3774MB free_disk=20.806060791015625GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.793 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.793 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.928 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 64ccc062-b11b-4cbc-96ba-620e43dfdb20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.929 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 1114ae68-dab9-46b3-abab-53f135df78d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.929 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:33:57 np0005593234 nova_compute[227762]: 2026-01-23 10:33:57.929 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:33:58 np0005593234 nova_compute[227762]: 2026-01-23 10:33:58.000 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:33:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:33:58 np0005593234 nova_compute[227762]: 2026-01-23 10:33:58.377 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:33:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2048867262' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:33:58 np0005593234 nova_compute[227762]: 2026-01-23 10:33:58.431 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:33:58 np0005593234 nova_compute[227762]: 2026-01-23 10:33:58.437 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:33:58 np0005593234 nova_compute[227762]: 2026-01-23 10:33:58.545 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:33:58 np0005593234 nova_compute[227762]: 2026-01-23 10:33:58.668 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:33:58 np0005593234 nova_compute[227762]: 2026-01-23 10:33:58.668 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.875s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:33:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:33:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:58.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:33:58 np0005593234 nova_compute[227762]: 2026-01-23 10:33:58.876 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:33:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:33:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:33:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:59.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:00 np0005593234 nova_compute[227762]: 2026-01-23 10:34:00.669 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:00 np0005593234 nova_compute[227762]: 2026-01-23 10:34:00.670 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:34:00 np0005593234 nova_compute[227762]: 2026-01-23 10:34:00.670 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:34:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:00.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:00 np0005593234 nova_compute[227762]: 2026-01-23 10:34:00.962 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:34:00 np0005593234 nova_compute[227762]: 2026-01-23 10:34:00.963 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:34:00 np0005593234 nova_compute[227762]: 2026-01-23 10:34:00.963 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:34:00 np0005593234 nova_compute[227762]: 2026-01-23 10:34:00.963 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:34:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:01.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:02.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:03 np0005593234 nova_compute[227762]: 2026-01-23 10:34:03.303 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164428.302378, 298e1080-6898-4a9b-903e-052965024e8a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:34:03 np0005593234 nova_compute[227762]: 2026-01-23 10:34:03.303 227766 INFO nova.compute.manager [-] [instance: 298e1080-6898-4a9b-903e-052965024e8a] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:34:03 np0005593234 nova_compute[227762]: 2026-01-23 10:34:03.335 227766 DEBUG nova.compute.manager [None req-c6a9569a-2af1-46ec-979f-6050414bb334 - - - - - -] [instance: 298e1080-6898-4a9b-903e-052965024e8a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:34:03 np0005593234 nova_compute[227762]: 2026-01-23 10:34:03.381 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:03.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:03 np0005593234 nova_compute[227762]: 2026-01-23 10:34:03.878 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:04.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:05 np0005593234 nova_compute[227762]: 2026-01-23 10:34:05.149 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Updating instance_info_cache with network_info: [{"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:34:05 np0005593234 nova_compute[227762]: 2026-01-23 10:34:05.167 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:34:05 np0005593234 nova_compute[227762]: 2026-01-23 10:34:05.167 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:34:05 np0005593234 nova_compute[227762]: 2026-01-23 10:34:05.167 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:05 np0005593234 nova_compute[227762]: 2026-01-23 10:34:05.168 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:05 np0005593234 nova_compute[227762]: 2026-01-23 10:34:05.168 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:05 np0005593234 nova_compute[227762]: 2026-01-23 10:34:05.168 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:34:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:05.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:06 np0005593234 ovn_controller[134547]: 2026-01-23T10:34:06Z|00805|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:34:06 np0005593234 nova_compute[227762]: 2026-01-23 10:34:06.165 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:06 np0005593234 ovn_controller[134547]: 2026-01-23T10:34:06Z|00806|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:34:06 np0005593234 nova_compute[227762]: 2026-01-23 10:34:06.308 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:06.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:06 np0005593234 nova_compute[227762]: 2026-01-23 10:34:06.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:07.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:08 np0005593234 nova_compute[227762]: 2026-01-23 10:34:08.384 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:08.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:08 np0005593234 nova_compute[227762]: 2026-01-23 10:34:08.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:08 np0005593234 nova_compute[227762]: 2026-01-23 10:34:08.880 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:09.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:10.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:11.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:12.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:13 np0005593234 nova_compute[227762]: 2026-01-23 10:34:13.387 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:13.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:13 np0005593234 nova_compute[227762]: 2026-01-23 10:34:13.883 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:14.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:14 np0005593234 nova_compute[227762]: 2026-01-23 10:34:14.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:15.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:16.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:17.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:17 np0005593234 podman[315088]: 2026-01-23 10:34:17.753872226 +0000 UTC m=+0.052966306 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:34:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:18 np0005593234 nova_compute[227762]: 2026-01-23 10:34:18.389 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:18.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:18 np0005593234 nova_compute[227762]: 2026-01-23 10:34:18.884 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:19.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:20.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:20 np0005593234 nova_compute[227762]: 2026-01-23 10:34:20.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:21.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:22.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:22 np0005593234 nova_compute[227762]: 2026-01-23 10:34:22.895 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:22 np0005593234 NetworkManager[48942]: <info>  [1769164462.8958] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/379)
Jan 23 05:34:22 np0005593234 NetworkManager[48942]: <info>  [1769164462.8970] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/380)
Jan 23 05:34:22 np0005593234 nova_compute[227762]: 2026-01-23 10:34:22.976 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:34:22Z|00807|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:34:22 np0005593234 nova_compute[227762]: 2026-01-23 10:34:22.983 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:23 np0005593234 nova_compute[227762]: 2026-01-23 10:34:23.391 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:23 np0005593234 nova_compute[227762]: 2026-01-23 10:34:23.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:23.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:23 np0005593234 nova_compute[227762]: 2026-01-23 10:34:23.886 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:24.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:25.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:26.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:26 np0005593234 podman[315163]: 2026-01-23 10:34:26.789818396 +0000 UTC m=+0.087805964 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:34:27 np0005593234 nova_compute[227762]: 2026-01-23 10:34:27.070 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:27.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:28 np0005593234 nova_compute[227762]: 2026-01-23 10:34:28.461 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:28.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:28 np0005593234 nova_compute[227762]: 2026-01-23 10:34:28.890 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:29 np0005593234 nova_compute[227762]: 2026-01-23 10:34:29.199 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:29.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:30.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:31.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:32.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:33 np0005593234 nova_compute[227762]: 2026-01-23 10:34:33.464 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:33.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:33 np0005593234 nova_compute[227762]: 2026-01-23 10:34:33.910 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:34.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:35.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:36.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:34:37.687 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=69, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=68) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:34:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:34:37.688 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:34:37 np0005593234 nova_compute[227762]: 2026-01-23 10:34:37.688 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:37.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e354 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:38 np0005593234 nova_compute[227762]: 2026-01-23 10:34:38.515 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:38.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:38 np0005593234 nova_compute[227762]: 2026-01-23 10:34:38.912 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:39.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:40.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:40 np0005593234 nova_compute[227762]: 2026-01-23 10:34:40.762 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:40 np0005593234 nova_compute[227762]: 2026-01-23 10:34:40.763 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:34:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:41.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:42 np0005593234 nova_compute[227762]: 2026-01-23 10:34:42.015 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e355 e355: 3 total, 3 up, 3 in
Jan 23 05:34:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:34:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:42.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:34:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:34:42.871 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:34:42.872 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:34:42.872 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e355 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e356 e356: 3 total, 3 up, 3 in
Jan 23 05:34:43 np0005593234 nova_compute[227762]: 2026-01-23 10:34:43.518 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:34:43.690 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '69'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:34:43 np0005593234 nova_compute[227762]: 2026-01-23 10:34:43.768 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:43 np0005593234 nova_compute[227762]: 2026-01-23 10:34:43.769 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:34:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:43.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:43 np0005593234 nova_compute[227762]: 2026-01-23 10:34:43.790 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:34:43 np0005593234 nova_compute[227762]: 2026-01-23 10:34:43.951 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e357 e357: 3 total, 3 up, 3 in
Jan 23 05:34:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:44.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:45.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:46.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:47.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e357 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:48 np0005593234 nova_compute[227762]: 2026-01-23 10:34:48.522 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:48.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:48 np0005593234 podman[315381]: 2026-01-23 10:34:48.772383589 +0000 UTC m=+0.061518193 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 23 05:34:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e358 e358: 3 total, 3 up, 3 in
Jan 23 05:34:48 np0005593234 nova_compute[227762]: 2026-01-23 10:34:48.953 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:34:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:34:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:34:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:49.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:34:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:34:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:34:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:34:50 np0005593234 nova_compute[227762]: 2026-01-23 10:34:50.190 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:50.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:51.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:52.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:53 np0005593234 nova_compute[227762]: 2026-01-23 10:34:53.525 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:53.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:54 np0005593234 nova_compute[227762]: 2026-01-23 10:34:54.015 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:54.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:55.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:56.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:57 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:34:57 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:34:57 np0005593234 nova_compute[227762]: 2026-01-23 10:34:57.765 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:57 np0005593234 nova_compute[227762]: 2026-01-23 10:34:57.766 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:34:57 np0005593234 podman[315406]: 2026-01-23 10:34:57.786521946 +0000 UTC m=+0.077590814 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 05:34:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:57.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:57 np0005593234 nova_compute[227762]: 2026-01-23 10:34:57.816 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:57 np0005593234 nova_compute[227762]: 2026-01-23 10:34:57.816 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:57 np0005593234 nova_compute[227762]: 2026-01-23 10:34:57.817 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:57 np0005593234 nova_compute[227762]: 2026-01-23 10:34:57.817 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:34:57 np0005593234 nova_compute[227762]: 2026-01-23 10:34:57.817 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:34:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:34:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2893501235' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.274 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:34:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.389 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.389 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.393 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000b5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.393 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000b5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.563 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.600 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.602 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3817MB free_disk=20.785118103027344GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.602 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.602 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:34:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:58.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.809 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 64ccc062-b11b-4cbc-96ba-620e43dfdb20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.809 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 1114ae68-dab9-46b3-abab-53f135df78d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.810 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.810 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:34:58 np0005593234 nova_compute[227762]: 2026-01-23 10:34:58.950 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:34:59 np0005593234 nova_compute[227762]: 2026-01-23 10:34:59.017 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:34:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:34:59 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/748362328' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:34:59 np0005593234 nova_compute[227762]: 2026-01-23 10:34:59.369 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.419s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:34:59 np0005593234 nova_compute[227762]: 2026-01-23 10:34:59.376 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:34:59 np0005593234 nova_compute[227762]: 2026-01-23 10:34:59.409 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:34:59 np0005593234 nova_compute[227762]: 2026-01-23 10:34:59.411 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:34:59 np0005593234 nova_compute[227762]: 2026-01-23 10:34:59.411 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.809s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:34:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:34:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:34:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:59.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:00.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:01.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:02 np0005593234 nova_compute[227762]: 2026-01-23 10:35:02.392 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:02 np0005593234 nova_compute[227762]: 2026-01-23 10:35:02.392 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:35:02 np0005593234 nova_compute[227762]: 2026-01-23 10:35:02.685 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:35:02 np0005593234 nova_compute[227762]: 2026-01-23 10:35:02.685 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:35:02 np0005593234 nova_compute[227762]: 2026-01-23 10:35:02.685 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:35:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:35:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:02.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:35:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:03 np0005593234 nova_compute[227762]: 2026-01-23 10:35:03.565 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:03 np0005593234 nova_compute[227762]: 2026-01-23 10:35:03.590 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Acquiring lock "6673e062-5d99-4c31-a3e0-673f55438d6e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:03 np0005593234 nova_compute[227762]: 2026-01-23 10:35:03.590 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:03 np0005593234 nova_compute[227762]: 2026-01-23 10:35:03.624 227766 DEBUG nova.compute.manager [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:35:03 np0005593234 nova_compute[227762]: 2026-01-23 10:35:03.710 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:03 np0005593234 nova_compute[227762]: 2026-01-23 10:35:03.711 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:03 np0005593234 nova_compute[227762]: 2026-01-23 10:35:03.721 227766 DEBUG nova.virt.hardware [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:35:03 np0005593234 nova_compute[227762]: 2026-01-23 10:35:03.721 227766 INFO nova.compute.claims [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:35:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:03.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:03 np0005593234 nova_compute[227762]: 2026-01-23 10:35:03.966 227766 DEBUG oslo_concurrency.processutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:04 np0005593234 nova_compute[227762]: 2026-01-23 10:35:04.018 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:35:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/479510188' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:35:04 np0005593234 nova_compute[227762]: 2026-01-23 10:35:04.426 227766 DEBUG oslo_concurrency.processutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:04 np0005593234 nova_compute[227762]: 2026-01-23 10:35:04.433 227766 DEBUG nova.compute.provider_tree [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:35:04 np0005593234 nova_compute[227762]: 2026-01-23 10:35:04.515 227766 DEBUG nova.scheduler.client.report [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:35:04 np0005593234 nova_compute[227762]: 2026-01-23 10:35:04.557 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.846s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:04 np0005593234 nova_compute[227762]: 2026-01-23 10:35:04.558 227766 DEBUG nova.compute.manager [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:35:04 np0005593234 nova_compute[227762]: 2026-01-23 10:35:04.715 227766 DEBUG nova.compute.manager [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:35:04 np0005593234 nova_compute[227762]: 2026-01-23 10:35:04.716 227766 DEBUG nova.network.neutron [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:35:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:04.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:04 np0005593234 nova_compute[227762]: 2026-01-23 10:35:04.751 227766 INFO nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:35:04 np0005593234 nova_compute[227762]: 2026-01-23 10:35:04.777 227766 DEBUG nova.compute.manager [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:35:05 np0005593234 nova_compute[227762]: 2026-01-23 10:35:05.250 227766 DEBUG nova.compute.manager [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:35:05 np0005593234 nova_compute[227762]: 2026-01-23 10:35:05.252 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:35:05 np0005593234 nova_compute[227762]: 2026-01-23 10:35:05.253 227766 INFO nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Creating image(s)#033[00m
Jan 23 05:35:05 np0005593234 nova_compute[227762]: 2026-01-23 10:35:05.411 227766 DEBUG nova.storage.rbd_utils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] rbd image 6673e062-5d99-4c31-a3e0-673f55438d6e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:35:05 np0005593234 nova_compute[227762]: 2026-01-23 10:35:05.447 227766 DEBUG nova.storage.rbd_utils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] rbd image 6673e062-5d99-4c31-a3e0-673f55438d6e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:35:05 np0005593234 nova_compute[227762]: 2026-01-23 10:35:05.481 227766 DEBUG nova.storage.rbd_utils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] rbd image 6673e062-5d99-4c31-a3e0-673f55438d6e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:35:05 np0005593234 nova_compute[227762]: 2026-01-23 10:35:05.485 227766 DEBUG oslo_concurrency.processutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:05 np0005593234 nova_compute[227762]: 2026-01-23 10:35:05.537 227766 DEBUG nova.policy [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a8ce4c88e8b46c5806ada5e3a6cdbbf', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd59dad6496894352a2f4c7eb66ca1914', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:35:05 np0005593234 nova_compute[227762]: 2026-01-23 10:35:05.559 227766 DEBUG oslo_concurrency.processutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:05 np0005593234 nova_compute[227762]: 2026-01-23 10:35:05.560 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:05 np0005593234 nova_compute[227762]: 2026-01-23 10:35:05.561 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:05 np0005593234 nova_compute[227762]: 2026-01-23 10:35:05.561 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:05 np0005593234 nova_compute[227762]: 2026-01-23 10:35:05.599 227766 DEBUG nova.storage.rbd_utils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] rbd image 6673e062-5d99-4c31-a3e0-673f55438d6e_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:35:05 np0005593234 nova_compute[227762]: 2026-01-23 10:35:05.604 227766 DEBUG oslo_concurrency.processutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 6673e062-5d99-4c31-a3e0-673f55438d6e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:05.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:06 np0005593234 nova_compute[227762]: 2026-01-23 10:35:06.448 227766 DEBUG oslo_concurrency.processutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 6673e062-5d99-4c31-a3e0-673f55438d6e_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.844s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:06 np0005593234 nova_compute[227762]: 2026-01-23 10:35:06.529 227766 DEBUG nova.storage.rbd_utils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] resizing rbd image 6673e062-5d99-4c31-a3e0-673f55438d6e_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:35:06 np0005593234 nova_compute[227762]: 2026-01-23 10:35:06.657 227766 DEBUG nova.objects.instance [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lazy-loading 'migration_context' on Instance uuid 6673e062-5d99-4c31-a3e0-673f55438d6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:35:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:06.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:06 np0005593234 nova_compute[227762]: 2026-01-23 10:35:06.744 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:35:06 np0005593234 nova_compute[227762]: 2026-01-23 10:35:06.744 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Ensure instance console log exists: /var/lib/nova/instances/6673e062-5d99-4c31-a3e0-673f55438d6e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:35:06 np0005593234 nova_compute[227762]: 2026-01-23 10:35:06.746 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:06 np0005593234 nova_compute[227762]: 2026-01-23 10:35:06.747 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:06 np0005593234 nova_compute[227762]: 2026-01-23 10:35:06.747 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:07 np0005593234 nova_compute[227762]: 2026-01-23 10:35:07.291 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Updating instance_info_cache with network_info: [{"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:35:07 np0005593234 nova_compute[227762]: 2026-01-23 10:35:07.322 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-1114ae68-dab9-46b3-abab-53f135df78d8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:35:07 np0005593234 nova_compute[227762]: 2026-01-23 10:35:07.322 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:35:07 np0005593234 nova_compute[227762]: 2026-01-23 10:35:07.323 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:07 np0005593234 nova_compute[227762]: 2026-01-23 10:35:07.323 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:07 np0005593234 nova_compute[227762]: 2026-01-23 10:35:07.323 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:07 np0005593234 nova_compute[227762]: 2026-01-23 10:35:07.323 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:07 np0005593234 nova_compute[227762]: 2026-01-23 10:35:07.324 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:35:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:07.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:08 np0005593234 nova_compute[227762]: 2026-01-23 10:35:08.479 227766 DEBUG nova.network.neutron [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Successfully created port: ddcd0522-401c-4c1d-90df-7c407812f643 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 05:35:08 np0005593234 nova_compute[227762]: 2026-01-23 10:35:08.568 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:35:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:08.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:09 np0005593234 nova_compute[227762]: 2026-01-23 10:35:09.020 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:35:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:09.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:09 np0005593234 nova_compute[227762]: 2026-01-23 10:35:09.981 227766 DEBUG nova.network.neutron [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Successfully updated port: ddcd0522-401c-4c1d-90df-7c407812f643 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 05:35:10 np0005593234 nova_compute[227762]: 2026-01-23 10:35:10.016 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Acquiring lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:35:10 np0005593234 nova_compute[227762]: 2026-01-23 10:35:10.017 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Acquired lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:35:10 np0005593234 nova_compute[227762]: 2026-01-23 10:35:10.017 227766 DEBUG nova.network.neutron [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 05:35:10 np0005593234 nova_compute[227762]: 2026-01-23 10:35:10.162 227766 DEBUG nova.compute.manager [req-9ed565f5-6985-4e3d-ab12-f41c27055df6 req-5cec8519-695d-4811-9bfb-9cc6b60d7051 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Received event network-changed-ddcd0522-401c-4c1d-90df-7c407812f643 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:35:10 np0005593234 nova_compute[227762]: 2026-01-23 10:35:10.163 227766 DEBUG nova.compute.manager [req-9ed565f5-6985-4e3d-ab12-f41c27055df6 req-5cec8519-695d-4811-9bfb-9cc6b60d7051 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Refreshing instance network info cache due to event network-changed-ddcd0522-401c-4c1d-90df-7c407812f643. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:35:10 np0005593234 nova_compute[227762]: 2026-01-23 10:35:10.163 227766 DEBUG oslo_concurrency.lockutils [req-9ed565f5-6985-4e3d-ab12-f41c27055df6 req-5cec8519-695d-4811-9bfb-9cc6b60d7051 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:35:10 np0005593234 nova_compute[227762]: 2026-01-23 10:35:10.517 227766 DEBUG nova.network.neutron [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 05:35:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:10.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.793 227766 DEBUG nova.network.neutron [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Updating instance_info_cache with network_info: [{"id": "ddcd0522-401c-4c1d-90df-7c407812f643", "address": "fa:16:3e:9f:74:a2", "network": {"id": "f2cadf27-eae4-40fd-be37-b605e054ab76", "bridge": "br-int", "label": "tempest-TestStampPattern-1832792840-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59dad6496894352a2f4c7eb66ca1914", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd0522-40", "ovs_interfaceid": "ddcd0522-401c-4c1d-90df-7c407812f643", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:35:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:11.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.828 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Releasing lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.829 227766 DEBUG nova.compute.manager [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Instance network_info: |[{"id": "ddcd0522-401c-4c1d-90df-7c407812f643", "address": "fa:16:3e:9f:74:a2", "network": {"id": "f2cadf27-eae4-40fd-be37-b605e054ab76", "bridge": "br-int", "label": "tempest-TestStampPattern-1832792840-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59dad6496894352a2f4c7eb66ca1914", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd0522-40", "ovs_interfaceid": "ddcd0522-401c-4c1d-90df-7c407812f643", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.829 227766 DEBUG oslo_concurrency.lockutils [req-9ed565f5-6985-4e3d-ab12-f41c27055df6 req-5cec8519-695d-4811-9bfb-9cc6b60d7051 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.829 227766 DEBUG nova.network.neutron [req-9ed565f5-6985-4e3d-ab12-f41c27055df6 req-5cec8519-695d-4811-9bfb-9cc6b60d7051 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Refreshing network info cache for port ddcd0522-401c-4c1d-90df-7c407812f643 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.832 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Start _get_guest_xml network_info=[{"id": "ddcd0522-401c-4c1d-90df-7c407812f643", "address": "fa:16:3e:9f:74:a2", "network": {"id": "f2cadf27-eae4-40fd-be37-b605e054ab76", "bridge": "br-int", "label": "tempest-TestStampPattern-1832792840-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59dad6496894352a2f4c7eb66ca1914", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd0522-40", "ovs_interfaceid": "ddcd0522-401c-4c1d-90df-7c407812f643", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.837 227766 WARNING nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.842 227766 DEBUG nova.virt.libvirt.host [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.843 227766 DEBUG nova.virt.libvirt.host [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.851 227766 DEBUG nova.virt.libvirt.host [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.852 227766 DEBUG nova.virt.libvirt.host [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.853 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.854 227766 DEBUG nova.virt.hardware [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.854 227766 DEBUG nova.virt.hardware [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.854 227766 DEBUG nova.virt.hardware [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.855 227766 DEBUG nova.virt.hardware [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.855 227766 DEBUG nova.virt.hardware [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.855 227766 DEBUG nova.virt.hardware [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.855 227766 DEBUG nova.virt.hardware [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.855 227766 DEBUG nova.virt.hardware [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.856 227766 DEBUG nova.virt.hardware [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.856 227766 DEBUG nova.virt.hardware [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.856 227766 DEBUG nova.virt.hardware [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 05:35:11 np0005593234 nova_compute[227762]: 2026-01-23 10:35:11.859 227766 DEBUG oslo_concurrency.processutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:35:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:35:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1983834689' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.314 227766 DEBUG oslo_concurrency.processutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.342 227766 DEBUG nova.storage.rbd_utils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] rbd image 6673e062-5d99-4c31-a3e0-673f55438d6e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.346 227766 DEBUG oslo_concurrency.processutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:35:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:12.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:35:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/956952719' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.801 227766 DEBUG oslo_concurrency.processutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.803 227766 DEBUG nova.virt.libvirt.vif [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:35:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestStampPattern-server-1669121032',display_name='tempest-TestStampPattern-server-1669121032',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-teststamppattern-server-1669121032',id=186,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMrKK+vqQ2ONAoFKX7V4eVrHBpyCPyjGn5U244sG4513gIb+5QaK2mU3GvydCfCOzo9xS+SqUIELsowqSaXGJbd+N0J3WtlcZAfr/OV3xzB4Bu/L3WF2HV34qxyNgfmi9Q==',key_name='tempest-TestStampPattern-1711959098',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d59dad6496894352a2f4c7eb66ca1914',ramdisk_id='',reservation_id='r-18qjfci3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestStampPattern-1763690147',owner_user_name='tempest-TestStampPattern-1763690147-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:35:04Z,user_data=None,user_id='9a8ce4c88e8b46c5806ada5e3a6cdbbf',uuid=6673e062-5d99-4c31-a3e0-673f55438d6e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ddcd0522-401c-4c1d-90df-7c407812f643", "address": "fa:16:3e:9f:74:a2", "network": {"id": "f2cadf27-eae4-40fd-be37-b605e054ab76", "bridge": "br-int", "label": "tempest-TestStampPattern-1832792840-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "d59dad6496894352a2f4c7eb66ca1914", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd0522-40", "ovs_interfaceid": "ddcd0522-401c-4c1d-90df-7c407812f643", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.803 227766 DEBUG nova.network.os_vif_util [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Converting VIF {"id": "ddcd0522-401c-4c1d-90df-7c407812f643", "address": "fa:16:3e:9f:74:a2", "network": {"id": "f2cadf27-eae4-40fd-be37-b605e054ab76", "bridge": "br-int", "label": "tempest-TestStampPattern-1832792840-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59dad6496894352a2f4c7eb66ca1914", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd0522-40", "ovs_interfaceid": "ddcd0522-401c-4c1d-90df-7c407812f643", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.804 227766 DEBUG nova.network.os_vif_util [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:74:a2,bridge_name='br-int',has_traffic_filtering=True,id=ddcd0522-401c-4c1d-90df-7c407812f643,network=Network(f2cadf27-eae4-40fd-be37-b605e054ab76),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddcd0522-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.805 227766 DEBUG nova.objects.instance [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6673e062-5d99-4c31-a3e0-673f55438d6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.909 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  <uuid>6673e062-5d99-4c31-a3e0-673f55438d6e</uuid>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  <name>instance-000000ba</name>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestStampPattern-server-1669121032</nova:name>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:35:11</nova:creationTime>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <nova:user uuid="9a8ce4c88e8b46c5806ada5e3a6cdbbf">tempest-TestStampPattern-1763690147-project-member</nova:user>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <nova:project uuid="d59dad6496894352a2f4c7eb66ca1914">tempest-TestStampPattern-1763690147</nova:project>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <nova:port uuid="ddcd0522-401c-4c1d-90df-7c407812f643">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <entry name="serial">6673e062-5d99-4c31-a3e0-673f55438d6e</entry>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <entry name="uuid">6673e062-5d99-4c31-a3e0-673f55438d6e</entry>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/6673e062-5d99-4c31-a3e0-673f55438d6e_disk">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/6673e062-5d99-4c31-a3e0-673f55438d6e_disk.config">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:9f:74:a2"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <target dev="tapddcd0522-40"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/6673e062-5d99-4c31-a3e0-673f55438d6e/console.log" append="off"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:35:12 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:35:12 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:35:12 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:35:12 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.910 227766 DEBUG nova.compute.manager [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Preparing to wait for external event network-vif-plugged-ddcd0522-401c-4c1d-90df-7c407812f643 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.911 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Acquiring lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.911 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.911 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.912 227766 DEBUG nova.virt.libvirt.vif [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:35:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestStampPattern-server-1669121032',display_name='tempest-TestStampPattern-server-1669121032',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-teststamppattern-server-1669121032',id=186,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMrKK+vqQ2ONAoFKX7V4eVrHBpyCPyjGn5U244sG4513gIb+5QaK2mU3GvydCfCOzo9xS+SqUIELsowqSaXGJbd+N0J3WtlcZAfr/OV3xzB4Bu/L3WF2HV34qxyNgfmi9Q==',key_name='tempest-TestStampPattern-1711959098',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d59dad6496894352a2f4c7eb66ca1914',ramdisk_id='',reservation_id='r-18qjfci3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestStampPattern-1763690147',owner_user_name='tempest-TestStampPattern-1763690147-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:35:04Z,user_data=None,user_id='9a8ce4c88e8b46c5806ada5e3a6cdbbf',uuid=6673e062-5d99-4c31-a3e0-673f55438d6e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "ddcd0522-401c-4c1d-90df-7c407812f643", "address": "fa:16:3e:9f:74:a2", "network": {"id": "f2cadf27-eae4-40fd-be37-b605e054ab76", "bridge": "br-int", "label": "tempest-TestStampPattern-1832792840-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "d59dad6496894352a2f4c7eb66ca1914", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd0522-40", "ovs_interfaceid": "ddcd0522-401c-4c1d-90df-7c407812f643", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.912 227766 DEBUG nova.network.os_vif_util [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Converting VIF {"id": "ddcd0522-401c-4c1d-90df-7c407812f643", "address": "fa:16:3e:9f:74:a2", "network": {"id": "f2cadf27-eae4-40fd-be37-b605e054ab76", "bridge": "br-int", "label": "tempest-TestStampPattern-1832792840-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59dad6496894352a2f4c7eb66ca1914", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd0522-40", "ovs_interfaceid": "ddcd0522-401c-4c1d-90df-7c407812f643", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.913 227766 DEBUG nova.network.os_vif_util [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9f:74:a2,bridge_name='br-int',has_traffic_filtering=True,id=ddcd0522-401c-4c1d-90df-7c407812f643,network=Network(f2cadf27-eae4-40fd-be37-b605e054ab76),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddcd0522-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.913 227766 DEBUG os_vif [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:74:a2,bridge_name='br-int',has_traffic_filtering=True,id=ddcd0522-401c-4c1d-90df-7c407812f643,network=Network(f2cadf27-eae4-40fd-be37-b605e054ab76),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddcd0522-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.914 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.914 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.915 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.919 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.919 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapddcd0522-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.919 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapddcd0522-40, col_values=(('external_ids', {'iface-id': 'ddcd0522-401c-4c1d-90df-7c407812f643', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9f:74:a2', 'vm-uuid': '6673e062-5d99-4c31-a3e0-673f55438d6e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.921 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:12 np0005593234 NetworkManager[48942]: <info>  [1769164512.9224] manager: (tapddcd0522-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/381)
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.923 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.929 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:12 np0005593234 nova_compute[227762]: 2026-01-23 10:35:12.931 227766 INFO os_vif [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9f:74:a2,bridge_name='br-int',has_traffic_filtering=True,id=ddcd0522-401c-4c1d-90df-7c407812f643,network=Network(f2cadf27-eae4-40fd-be37-b605e054ab76),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddcd0522-40')#033[00m
Jan 23 05:35:13 np0005593234 nova_compute[227762]: 2026-01-23 10:35:13.004 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:35:13 np0005593234 nova_compute[227762]: 2026-01-23 10:35:13.005 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:35:13 np0005593234 nova_compute[227762]: 2026-01-23 10:35:13.005 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] No VIF found with MAC fa:16:3e:9f:74:a2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:35:13 np0005593234 nova_compute[227762]: 2026-01-23 10:35:13.006 227766 INFO nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Using config drive#033[00m
Jan 23 05:35:13 np0005593234 nova_compute[227762]: 2026-01-23 10:35:13.030 227766 DEBUG nova.storage.rbd_utils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] rbd image 6673e062-5d99-4c31-a3e0-673f55438d6e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:35:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:13.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.001 227766 INFO nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Creating config drive at /var/lib/nova/instances/6673e062-5d99-4c31-a3e0-673f55438d6e/disk.config#033[00m
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.006 227766 DEBUG oslo_concurrency.processutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6673e062-5d99-4c31-a3e0-673f55438d6e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3jskl_iz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.029 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.139 227766 DEBUG oslo_concurrency.processutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6673e062-5d99-4c31-a3e0-673f55438d6e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp3jskl_iz" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.165 227766 DEBUG nova.storage.rbd_utils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] rbd image 6673e062-5d99-4c31-a3e0-673f55438d6e_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.168 227766 DEBUG oslo_concurrency.processutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6673e062-5d99-4c31-a3e0-673f55438d6e/disk.config 6673e062-5d99-4c31-a3e0-673f55438d6e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.543 227766 DEBUG nova.network.neutron [req-9ed565f5-6985-4e3d-ab12-f41c27055df6 req-5cec8519-695d-4811-9bfb-9cc6b60d7051 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Updated VIF entry in instance network info cache for port ddcd0522-401c-4c1d-90df-7c407812f643. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.544 227766 DEBUG nova.network.neutron [req-9ed565f5-6985-4e3d-ab12-f41c27055df6 req-5cec8519-695d-4811-9bfb-9cc6b60d7051 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Updating instance_info_cache with network_info: [{"id": "ddcd0522-401c-4c1d-90df-7c407812f643", "address": "fa:16:3e:9f:74:a2", "network": {"id": "f2cadf27-eae4-40fd-be37-b605e054ab76", "bridge": "br-int", "label": "tempest-TestStampPattern-1832792840-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59dad6496894352a2f4c7eb66ca1914", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd0522-40", "ovs_interfaceid": "ddcd0522-401c-4c1d-90df-7c407812f643", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.578 227766 DEBUG oslo_concurrency.lockutils [req-9ed565f5-6985-4e3d-ab12-f41c27055df6 req-5cec8519-695d-4811-9bfb-9cc6b60d7051 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:35:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:14.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.897 227766 DEBUG oslo_concurrency.processutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6673e062-5d99-4c31-a3e0-673f55438d6e/disk.config 6673e062-5d99-4c31-a3e0-673f55438d6e_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.729s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.898 227766 INFO nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Deleting local config drive /var/lib/nova/instances/6673e062-5d99-4c31-a3e0-673f55438d6e/disk.config because it was imported into RBD.#033[00m
Jan 23 05:35:14 np0005593234 NetworkManager[48942]: <info>  [1769164514.9528] manager: (tapddcd0522-40): new Tun device (/org/freedesktop/NetworkManager/Devices/382)
Jan 23 05:35:14 np0005593234 kernel: tapddcd0522-40: entered promiscuous mode
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.954 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.958 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:35:14Z|00808|binding|INFO|Claiming lport ddcd0522-401c-4c1d-90df-7c407812f643 for this chassis.
Jan 23 05:35:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:35:14Z|00809|binding|INFO|ddcd0522-401c-4c1d-90df-7c407812f643: Claiming fa:16:3e:9f:74:a2 10.100.0.10
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.980 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:14.989 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:74:a2 10.100.0.10'], port_security=['fa:16:3e:9f:74:a2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6673e062-5d99-4c31-a3e0-673f55438d6e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2cadf27-eae4-40fd-be37-b605e054ab76', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd59dad6496894352a2f4c7eb66ca1914', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b37d3a3b-2932-4053-8960-53ee514541cc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=abb49df5-6f5b-4d51-9b7e-cc6910f0a6bb, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=ddcd0522-401c-4c1d-90df-7c407812f643) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:35:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:14.991 144381 INFO neutron.agent.ovn.metadata.agent [-] Port ddcd0522-401c-4c1d-90df-7c407812f643 in datapath f2cadf27-eae4-40fd-be37-b605e054ab76 bound to our chassis#033[00m
Jan 23 05:35:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:14.992 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f2cadf27-eae4-40fd-be37-b605e054ab76#033[00m
Jan 23 05:35:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:35:14Z|00810|binding|INFO|Setting lport ddcd0522-401c-4c1d-90df-7c407812f643 ovn-installed in OVS
Jan 23 05:35:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:35:14Z|00811|binding|INFO|Setting lport ddcd0522-401c-4c1d-90df-7c407812f643 up in Southbound
Jan 23 05:35:14 np0005593234 nova_compute[227762]: 2026-01-23 10:35:14.998 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:15 np0005593234 systemd-udevd[315909]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.005 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4458ca4e-7790-4869-a39b-d266f4c38feb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.006 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf2cadf27-e1 in ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.008 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf2cadf27-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.008 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5a30c9e4-3a93-4fe6-923b-75c5f851f375]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.011 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[62ba675f-91b4-4f4f-886b-d2248599d78b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 systemd-machined[195626]: New machine qemu-90-instance-000000ba.
Jan 23 05:35:15 np0005593234 NetworkManager[48942]: <info>  [1769164515.0185] device (tapddcd0522-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:35:15 np0005593234 NetworkManager[48942]: <info>  [1769164515.0195] device (tapddcd0522-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:35:15 np0005593234 systemd[1]: Started Virtual Machine qemu-90-instance-000000ba.
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.027 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[33a33ea3-1aaa-4ed8-b0f2-9df1c88832d2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.042 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1c30649e-cee1-4ab6-9057-a7a5eebfba3c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.069 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[1046001c-8fc6-493e-a11a-326764fae660]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 NetworkManager[48942]: <info>  [1769164515.0750] manager: (tapf2cadf27-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/383)
Jan 23 05:35:15 np0005593234 systemd-udevd[315913]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.075 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2aa1fff6-1cd5-4070-af14-9e7fab3f07df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.103 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a5556195-874c-4323-8f0d-9950b9971023]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.105 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f18547c4-ff87-4083-9161-70bbec987a45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 NetworkManager[48942]: <info>  [1769164515.1325] device (tapf2cadf27-e0): carrier: link connected
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.138 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[6e4dcb14-062b-4bb1-bfc1-1a0887487a1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.154 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[491ab0e9-926c-4cf7-b7b4-07f9d2028a6f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2cadf27-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:5f:a8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 248], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 841009, 'reachable_time': 29037, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 315942, 'error': None, 'target': 'ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.175 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e0fef8b6-9b06-4275-95f6-c2b5e603c10f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe20:5fa8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 841009, 'tstamp': 841009}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 315943, 'error': None, 'target': 'ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.192 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[93e71f09-3eed-4adc-96b9-3b3a8013849e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf2cadf27-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:20:5f:a8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 248], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 841009, 'reachable_time': 29037, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 315944, 'error': None, 'target': 'ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.225 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[81696907-e5e3-4b66-8e7d-d559d3fa8aba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.292 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bc231cf8-f3f3-4e77-8a9b-3355a29bc702]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.293 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2cadf27-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.294 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.294 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf2cadf27-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.357 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:15 np0005593234 kernel: tapf2cadf27-e0: entered promiscuous mode
Jan 23 05:35:15 np0005593234 NetworkManager[48942]: <info>  [1769164515.3651] manager: (tapf2cadf27-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/384)
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.365 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.366 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf2cadf27-e0, col_values=(('external_ids', {'iface-id': 'f45d82fe-2ce9-4738-9a72-0ec8f8b8032e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.367 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:15 np0005593234 ovn_controller[134547]: 2026-01-23T10:35:15Z|00812|binding|INFO|Releasing lport f45d82fe-2ce9-4738-9a72-0ec8f8b8032e from this chassis (sb_readonly=0)
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.379 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.380 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f2cadf27-eae4-40fd-be37-b605e054ab76.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f2cadf27-eae4-40fd-be37-b605e054ab76.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.381 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c72a4f83-ed5e-4a11-857a-ca3a9f24dddd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.381 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-f2cadf27-eae4-40fd-be37-b605e054ab76
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/f2cadf27-eae4-40fd-be37-b605e054ab76.pid.haproxy
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID f2cadf27-eae4-40fd-be37-b605e054ab76
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:35:15 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:15.382 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76', 'env', 'PROCESS_TAG=haproxy-f2cadf27-eae4-40fd-be37-b605e054ab76', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f2cadf27-eae4-40fd-be37-b605e054ab76.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.383 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.606 227766 DEBUG nova.compute.manager [req-79e4b770-8a4c-4aee-9333-b66621e1620d req-e1612d1b-287b-42a1-ab1f-9780923a7706 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Received event network-vif-plugged-ddcd0522-401c-4c1d-90df-7c407812f643 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.607 227766 DEBUG oslo_concurrency.lockutils [req-79e4b770-8a4c-4aee-9333-b66621e1620d req-e1612d1b-287b-42a1-ab1f-9780923a7706 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.607 227766 DEBUG oslo_concurrency.lockutils [req-79e4b770-8a4c-4aee-9333-b66621e1620d req-e1612d1b-287b-42a1-ab1f-9780923a7706 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.607 227766 DEBUG oslo_concurrency.lockutils [req-79e4b770-8a4c-4aee-9333-b66621e1620d req-e1612d1b-287b-42a1-ab1f-9780923a7706 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.607 227766 DEBUG nova.compute.manager [req-79e4b770-8a4c-4aee-9333-b66621e1620d req-e1612d1b-287b-42a1-ab1f-9780923a7706 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Processing event network-vif-plugged-ddcd0522-401c-4c1d-90df-7c407812f643 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.622 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164515.622062, 6673e062-5d99-4c31-a3e0-673f55438d6e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.622 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] VM Started (Lifecycle Event)#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.625 227766 DEBUG nova.compute.manager [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.629 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.633 227766 INFO nova.virt.libvirt.driver [-] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Instance spawned successfully.#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.633 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.662 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.668 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.671 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.673 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.674 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.674 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.674 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.675 227766 DEBUG nova.virt.libvirt.driver [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.724 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.724 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164515.625021, 6673e062-5d99-4c31-a3e0-673f55438d6e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.725 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.768 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.772 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164515.627974, 6673e062-5d99-4c31-a3e0-673f55438d6e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.772 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.804 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.809 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:35:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:15.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.840 227766 INFO nova.compute.manager [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Took 10.59 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.841 227766 DEBUG nova.compute.manager [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.850 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:35:15 np0005593234 podman[316018]: 2026-01-23 10:35:15.780932305 +0000 UTC m=+0.024371043 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.947 227766 INFO nova.compute.manager [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Took 12.26 seconds to build instance.#033[00m
Jan 23 05:35:15 np0005593234 nova_compute[227762]: 2026-01-23 10:35:15.979 227766 DEBUG oslo_concurrency.lockutils [None req-a44e65e7-bf7a-46cf-9643-7b4002ff2789 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.389s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:16 np0005593234 podman[316018]: 2026-01-23 10:35:16.415536853 +0000 UTC m=+0.658975581 container create fd656eedec61d0b9c8857225367e4437e71c32d20a845354f7ea0109fce06cfd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:35:16 np0005593234 systemd[1]: Started libpod-conmon-fd656eedec61d0b9c8857225367e4437e71c32d20a845354f7ea0109fce06cfd.scope.
Jan 23 05:35:16 np0005593234 nova_compute[227762]: 2026-01-23 10:35:16.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:16 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:35:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:16.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:16 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebb1f0d7eea14f32c205ab6ac86368e14be592a046be81afc1b042df152dfa02/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:35:16 np0005593234 podman[316018]: 2026-01-23 10:35:16.777040237 +0000 UTC m=+1.020478965 container init fd656eedec61d0b9c8857225367e4437e71c32d20a845354f7ea0109fce06cfd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:35:16 np0005593234 podman[316018]: 2026-01-23 10:35:16.786938126 +0000 UTC m=+1.030376844 container start fd656eedec61d0b9c8857225367e4437e71c32d20a845354f7ea0109fce06cfd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Jan 23 05:35:16 np0005593234 neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76[316035]: [NOTICE]   (316039) : New worker (316041) forked
Jan 23 05:35:16 np0005593234 neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76[316035]: [NOTICE]   (316039) : Loading success.
Jan 23 05:35:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:17.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:17 np0005593234 nova_compute[227762]: 2026-01-23 10:35:17.923 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:17.971 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=70, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=69) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:35:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:17.972 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:35:18 np0005593234 nova_compute[227762]: 2026-01-23 10:35:18.014 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:18 np0005593234 nova_compute[227762]: 2026-01-23 10:35:18.064 227766 DEBUG nova.compute.manager [req-0a89b127-0dd3-4368-baf6-0e953112cb60 req-17dc0ada-0db2-42ce-b0d4-e3a3303d7016 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Received event network-vif-plugged-ddcd0522-401c-4c1d-90df-7c407812f643 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:35:18 np0005593234 nova_compute[227762]: 2026-01-23 10:35:18.065 227766 DEBUG oslo_concurrency.lockutils [req-0a89b127-0dd3-4368-baf6-0e953112cb60 req-17dc0ada-0db2-42ce-b0d4-e3a3303d7016 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:18 np0005593234 nova_compute[227762]: 2026-01-23 10:35:18.065 227766 DEBUG oslo_concurrency.lockutils [req-0a89b127-0dd3-4368-baf6-0e953112cb60 req-17dc0ada-0db2-42ce-b0d4-e3a3303d7016 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:18 np0005593234 nova_compute[227762]: 2026-01-23 10:35:18.065 227766 DEBUG oslo_concurrency.lockutils [req-0a89b127-0dd3-4368-baf6-0e953112cb60 req-17dc0ada-0db2-42ce-b0d4-e3a3303d7016 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:18 np0005593234 nova_compute[227762]: 2026-01-23 10:35:18.065 227766 DEBUG nova.compute.manager [req-0a89b127-0dd3-4368-baf6-0e953112cb60 req-17dc0ada-0db2-42ce-b0d4-e3a3303d7016 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] No waiting events found dispatching network-vif-plugged-ddcd0522-401c-4c1d-90df-7c407812f643 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:35:18 np0005593234 nova_compute[227762]: 2026-01-23 10:35:18.066 227766 WARNING nova.compute.manager [req-0a89b127-0dd3-4368-baf6-0e953112cb60 req-17dc0ada-0db2-42ce-b0d4-e3a3303d7016 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Received unexpected event network-vif-plugged-ddcd0522-401c-4c1d-90df-7c407812f643 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:35:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:18.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:19 np0005593234 nova_compute[227762]: 2026-01-23 10:35:19.027 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:19 np0005593234 podman[316051]: 2026-01-23 10:35:19.758527031 +0000 UTC m=+0.054799504 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:35:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:19.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:19.974 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '70'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:35:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:20.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:21.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:22 np0005593234 nova_compute[227762]: 2026-01-23 10:35:22.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:22.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:22 np0005593234 nova_compute[227762]: 2026-01-23 10:35:22.924 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:23 np0005593234 nova_compute[227762]: 2026-01-23 10:35:23.667 227766 DEBUG nova.compute.manager [req-0487f9d1-dc4e-444f-8579-965f5f8ff264 req-fda206b7-8965-4178-9a5b-a13c713a84a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Received event network-changed-ddcd0522-401c-4c1d-90df-7c407812f643 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:35:23 np0005593234 nova_compute[227762]: 2026-01-23 10:35:23.667 227766 DEBUG nova.compute.manager [req-0487f9d1-dc4e-444f-8579-965f5f8ff264 req-fda206b7-8965-4178-9a5b-a13c713a84a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Refreshing instance network info cache due to event network-changed-ddcd0522-401c-4c1d-90df-7c407812f643. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:35:23 np0005593234 nova_compute[227762]: 2026-01-23 10:35:23.667 227766 DEBUG oslo_concurrency.lockutils [req-0487f9d1-dc4e-444f-8579-965f5f8ff264 req-fda206b7-8965-4178-9a5b-a13c713a84a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:35:23 np0005593234 nova_compute[227762]: 2026-01-23 10:35:23.668 227766 DEBUG oslo_concurrency.lockutils [req-0487f9d1-dc4e-444f-8579-965f5f8ff264 req-fda206b7-8965-4178-9a5b-a13c713a84a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:35:23 np0005593234 nova_compute[227762]: 2026-01-23 10:35:23.668 227766 DEBUG nova.network.neutron [req-0487f9d1-dc4e-444f-8579-965f5f8ff264 req-fda206b7-8965-4178-9a5b-a13c713a84a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Refreshing network info cache for port ddcd0522-401c-4c1d-90df-7c407812f643 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:35:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:23.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:24 np0005593234 nova_compute[227762]: 2026-01-23 10:35:24.028 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:24.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:35:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:25.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:35:25 np0005593234 nova_compute[227762]: 2026-01-23 10:35:25.961 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:25 np0005593234 nova_compute[227762]: 2026-01-23 10:35:25.996 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Triggering sync for uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:35:25 np0005593234 nova_compute[227762]: 2026-01-23 10:35:25.996 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Triggering sync for uuid 1114ae68-dab9-46b3-abab-53f135df78d8 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:35:25 np0005593234 nova_compute[227762]: 2026-01-23 10:35:25.997 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Triggering sync for uuid 6673e062-5d99-4c31-a3e0-673f55438d6e _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:35:25 np0005593234 nova_compute[227762]: 2026-01-23 10:35:25.997 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:25 np0005593234 nova_compute[227762]: 2026-01-23 10:35:25.997 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:25 np0005593234 nova_compute[227762]: 2026-01-23 10:35:25.998 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:25 np0005593234 nova_compute[227762]: 2026-01-23 10:35:25.999 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "1114ae68-dab9-46b3-abab-53f135df78d8" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:25 np0005593234 nova_compute[227762]: 2026-01-23 10:35:25.999 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "6673e062-5d99-4c31-a3e0-673f55438d6e" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:25 np0005593234 nova_compute[227762]: 2026-01-23 10:35:25.999 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:26 np0005593234 nova_compute[227762]: 2026-01-23 10:35:26.037 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "1114ae68-dab9-46b3-abab-53f135df78d8" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.038s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:26 np0005593234 nova_compute[227762]: 2026-01-23 10:35:26.045 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.048s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:26 np0005593234 nova_compute[227762]: 2026-01-23 10:35:26.050 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.051s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:26.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:27.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:27 np0005593234 nova_compute[227762]: 2026-01-23 10:35:27.927 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:28 np0005593234 nova_compute[227762]: 2026-01-23 10:35:28.315 227766 DEBUG nova.network.neutron [req-0487f9d1-dc4e-444f-8579-965f5f8ff264 req-fda206b7-8965-4178-9a5b-a13c713a84a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Updated VIF entry in instance network info cache for port ddcd0522-401c-4c1d-90df-7c407812f643. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:35:28 np0005593234 nova_compute[227762]: 2026-01-23 10:35:28.316 227766 DEBUG nova.network.neutron [req-0487f9d1-dc4e-444f-8579-965f5f8ff264 req-fda206b7-8965-4178-9a5b-a13c713a84a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Updating instance_info_cache with network_info: [{"id": "ddcd0522-401c-4c1d-90df-7c407812f643", "address": "fa:16:3e:9f:74:a2", "network": {"id": "f2cadf27-eae4-40fd-be37-b605e054ab76", "bridge": "br-int", "label": "tempest-TestStampPattern-1832792840-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59dad6496894352a2f4c7eb66ca1914", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd0522-40", "ovs_interfaceid": "ddcd0522-401c-4c1d-90df-7c407812f643", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:35:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:28 np0005593234 nova_compute[227762]: 2026-01-23 10:35:28.408 227766 DEBUG oslo_concurrency.lockutils [req-0487f9d1-dc4e-444f-8579-965f5f8ff264 req-fda206b7-8965-4178-9a5b-a13c713a84a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:35:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:28.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:28 np0005593234 podman[316124]: 2026-01-23 10:35:28.813621447 +0000 UTC m=+0.096423113 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:35:29 np0005593234 nova_compute[227762]: 2026-01-23 10:35:29.029 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:29.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:30 np0005593234 ovn_controller[134547]: 2026-01-23T10:35:30Z|00106|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9f:74:a2 10.100.0.10
Jan 23 05:35:30 np0005593234 ovn_controller[134547]: 2026-01-23T10:35:30Z|00107|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9f:74:a2 10.100.0.10
Jan 23 05:35:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:30.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:31.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:32.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:32 np0005593234 nova_compute[227762]: 2026-01-23 10:35:32.930 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:33.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:34 np0005593234 nova_compute[227762]: 2026-01-23 10:35:34.030 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:34.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:35.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:36.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:37.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:37 np0005593234 nova_compute[227762]: 2026-01-23 10:35:37.932 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:38.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:39 np0005593234 nova_compute[227762]: 2026-01-23 10:35:39.078 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:39.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.214 227766 DEBUG oslo_concurrency.lockutils [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Acquiring lock "6673e062-5d99-4c31-a3e0-673f55438d6e" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.214 227766 DEBUG oslo_concurrency.lockutils [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.231 227766 DEBUG nova.objects.instance [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lazy-loading 'flavor' on Instance uuid 6673e062-5d99-4c31-a3e0-673f55438d6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.266 227766 DEBUG oslo_concurrency.lockutils [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.052s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.601 227766 DEBUG oslo_concurrency.lockutils [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Acquiring lock "6673e062-5d99-4c31-a3e0-673f55438d6e" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.602 227766 DEBUG oslo_concurrency.lockutils [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.602 227766 INFO nova.compute.manager [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Attaching volume cc391f04-6d6d-4c9d-866a-f0d2a56a62fb to /dev/vdb#033[00m
Jan 23 05:35:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:40.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.925 227766 DEBUG os_brick.utils [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.927 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.939 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.940 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[e00c0538-5686-4565-9981-cc2bff88b8ec]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.941 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.949 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.949 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[c0334dee-3bcf-4a91-a27c-a5b8fbc25fec]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.951 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.959 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.959 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[83630bc1-077c-43ee-a342-2845480db43f]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.961 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[e000e200-3c27-468c-b041-cfbae6b2299d]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.961 227766 DEBUG oslo_concurrency.processutils [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.990 227766 DEBUG oslo_concurrency.processutils [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] CMD "nvme version" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.992 227766 DEBUG os_brick.initiator.connectors.lightos [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.992 227766 DEBUG os_brick.initiator.connectors.lightos [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.992 227766 DEBUG os_brick.initiator.connectors.lightos [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.993 227766 DEBUG os_brick.utils [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] <== get_connector_properties: return (66ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:35:40 np0005593234 nova_compute[227762]: 2026-01-23 10:35:40.993 227766 DEBUG nova.virt.block_device [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Updating existing volume attachment record: f57c1c19-8996-4f32-9cf2-f3df971a579b _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:35:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:35:41 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/625066722' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:35:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:41.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:41 np0005593234 nova_compute[227762]: 2026-01-23 10:35:41.929 227766 DEBUG nova.objects.instance [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lazy-loading 'flavor' on Instance uuid 6673e062-5d99-4c31-a3e0-673f55438d6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:35:41 np0005593234 nova_compute[227762]: 2026-01-23 10:35:41.969 227766 DEBUG nova.virt.libvirt.driver [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Attempting to attach volume cc391f04-6d6d-4c9d-866a-f0d2a56a62fb with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:35:41 np0005593234 nova_compute[227762]: 2026-01-23 10:35:41.973 227766 DEBUG nova.virt.libvirt.guest [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:35:41 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:35:41 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-cc391f04-6d6d-4c9d-866a-f0d2a56a62fb">
Jan 23 05:35:41 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:35:41 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:35:41 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:35:41 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:35:41 np0005593234 nova_compute[227762]:  <auth username="openstack">
Jan 23 05:35:41 np0005593234 nova_compute[227762]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:35:41 np0005593234 nova_compute[227762]:  </auth>
Jan 23 05:35:41 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:35:41 np0005593234 nova_compute[227762]:  <serial>cc391f04-6d6d-4c9d-866a-f0d2a56a62fb</serial>
Jan 23 05:35:41 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:35:41 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:35:42 np0005593234 nova_compute[227762]: 2026-01-23 10:35:42.113 227766 DEBUG nova.virt.libvirt.driver [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:35:42 np0005593234 nova_compute[227762]: 2026-01-23 10:35:42.114 227766 DEBUG nova.virt.libvirt.driver [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:35:42 np0005593234 nova_compute[227762]: 2026-01-23 10:35:42.114 227766 DEBUG nova.virt.libvirt.driver [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:35:42 np0005593234 nova_compute[227762]: 2026-01-23 10:35:42.114 227766 DEBUG nova.virt.libvirt.driver [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] No VIF found with MAC fa:16:3e:9f:74:a2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:35:42 np0005593234 nova_compute[227762]: 2026-01-23 10:35:42.460 227766 DEBUG oslo_concurrency.lockutils [None req-3cec5717-5e9f-4ccb-b062-dd2353f4534e 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.858s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:42.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:42.872 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:42.873 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:35:42.874 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:42 np0005593234 nova_compute[227762]: 2026-01-23 10:35:42.934 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:43.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:44 np0005593234 nova_compute[227762]: 2026-01-23 10:35:44.080 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:35:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1978246974' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:35:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:35:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1978246974' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:35:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:44.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:45.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:45 np0005593234 nova_compute[227762]: 2026-01-23 10:35:45.960 227766 DEBUG oslo_concurrency.lockutils [None req-4eb73269-ded7-4ad8-a01a-fd74a86dd7b2 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Acquiring lock "6673e062-5d99-4c31-a3e0-673f55438d6e" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:45 np0005593234 nova_compute[227762]: 2026-01-23 10:35:45.960 227766 DEBUG oslo_concurrency.lockutils [None req-4eb73269-ded7-4ad8-a01a-fd74a86dd7b2 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:46 np0005593234 nova_compute[227762]: 2026-01-23 10:35:46.003 227766 INFO nova.compute.manager [None req-4eb73269-ded7-4ad8-a01a-fd74a86dd7b2 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Detaching volume cc391f04-6d6d-4c9d-866a-f0d2a56a62fb#033[00m
Jan 23 05:35:46 np0005593234 nova_compute[227762]: 2026-01-23 10:35:46.285 227766 INFO nova.virt.block_device [None req-4eb73269-ded7-4ad8-a01a-fd74a86dd7b2 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Attempting to driver detach volume cc391f04-6d6d-4c9d-866a-f0d2a56a62fb from mountpoint /dev/vdb#033[00m
Jan 23 05:35:46 np0005593234 nova_compute[227762]: 2026-01-23 10:35:46.295 227766 DEBUG nova.virt.libvirt.driver [None req-4eb73269-ded7-4ad8-a01a-fd74a86dd7b2 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Attempting to detach device vdb from instance 6673e062-5d99-4c31-a3e0-673f55438d6e from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:35:46 np0005593234 nova_compute[227762]: 2026-01-23 10:35:46.296 227766 DEBUG nova.virt.libvirt.guest [None req-4eb73269-ded7-4ad8-a01a-fd74a86dd7b2 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:35:46 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:35:46 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-cc391f04-6d6d-4c9d-866a-f0d2a56a62fb">
Jan 23 05:35:46 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:35:46 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:35:46 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:35:46 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:35:46 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:35:46 np0005593234 nova_compute[227762]:  <serial>cc391f04-6d6d-4c9d-866a-f0d2a56a62fb</serial>
Jan 23 05:35:46 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:35:46 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:35:46 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:35:46 np0005593234 nova_compute[227762]: 2026-01-23 10:35:46.302 227766 INFO nova.virt.libvirt.driver [None req-4eb73269-ded7-4ad8-a01a-fd74a86dd7b2 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Successfully detached device vdb from instance 6673e062-5d99-4c31-a3e0-673f55438d6e from the persistent domain config.#033[00m
Jan 23 05:35:46 np0005593234 nova_compute[227762]: 2026-01-23 10:35:46.303 227766 DEBUG nova.virt.libvirt.driver [None req-4eb73269-ded7-4ad8-a01a-fd74a86dd7b2 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance 6673e062-5d99-4c31-a3e0-673f55438d6e from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:35:46 np0005593234 nova_compute[227762]: 2026-01-23 10:35:46.303 227766 DEBUG nova.virt.libvirt.guest [None req-4eb73269-ded7-4ad8-a01a-fd74a86dd7b2 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:35:46 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:35:46 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-cc391f04-6d6d-4c9d-866a-f0d2a56a62fb">
Jan 23 05:35:46 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:35:46 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:35:46 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:35:46 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:35:46 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:35:46 np0005593234 nova_compute[227762]:  <serial>cc391f04-6d6d-4c9d-866a-f0d2a56a62fb</serial>
Jan 23 05:35:46 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:35:46 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:35:46 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:35:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:35:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:46.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:35:47 np0005593234 nova_compute[227762]: 2026-01-23 10:35:47.237 227766 DEBUG nova.virt.libvirt.driver [None req-4eb73269-ded7-4ad8-a01a-fd74a86dd7b2 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance 6673e062-5d99-4c31-a3e0-673f55438d6e _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:35:47 np0005593234 nova_compute[227762]: 2026-01-23 10:35:47.238 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769164547.2382543, 6673e062-5d99-4c31-a3e0-673f55438d6e => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:35:47 np0005593234 nova_compute[227762]: 2026-01-23 10:35:47.240 227766 INFO nova.virt.libvirt.driver [None req-4eb73269-ded7-4ad8-a01a-fd74a86dd7b2 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Successfully detached device vdb from instance 6673e062-5d99-4c31-a3e0-673f55438d6e from the live domain config.#033[00m
Jan 23 05:35:47 np0005593234 nova_compute[227762]: 2026-01-23 10:35:47.519 227766 DEBUG nova.objects.instance [None req-4eb73269-ded7-4ad8-a01a-fd74a86dd7b2 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lazy-loading 'flavor' on Instance uuid 6673e062-5d99-4c31-a3e0-673f55438d6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:35:47 np0005593234 nova_compute[227762]: 2026-01-23 10:35:47.570 227766 DEBUG oslo_concurrency.lockutils [None req-4eb73269-ded7-4ad8-a01a-fd74a86dd7b2 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 1.610s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:47.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:47 np0005593234 nova_compute[227762]: 2026-01-23 10:35:47.936 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e358 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:48.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:49 np0005593234 nova_compute[227762]: 2026-01-23 10:35:49.082 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:49.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e359 e359: 3 total, 3 up, 3 in
Jan 23 05:35:50 np0005593234 podman[316241]: 2026-01-23 10:35:50.759830514 +0000 UTC m=+0.050124567 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Jan 23 05:35:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:50.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:51.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:52.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:52 np0005593234 nova_compute[227762]: 2026-01-23 10:35:52.938 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:53 np0005593234 nova_compute[227762]: 2026-01-23 10:35:53.230 227766 DEBUG nova.compute.manager [None req-861d3ec9-bea7-4b42-a733-1b7233a64336 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:35:53 np0005593234 nova_compute[227762]: 2026-01-23 10:35:53.285 227766 INFO nova.compute.manager [None req-861d3ec9-bea7-4b42-a733-1b7233a64336 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] instance snapshotting#033[00m
Jan 23 05:35:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e359 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:53 np0005593234 nova_compute[227762]: 2026-01-23 10:35:53.732 227766 INFO nova.virt.libvirt.driver [None req-861d3ec9-bea7-4b42-a733-1b7233a64336 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Beginning live snapshot process#033[00m
Jan 23 05:35:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:53.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:54 np0005593234 nova_compute[227762]: 2026-01-23 10:35:54.164 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:54 np0005593234 nova_compute[227762]: 2026-01-23 10:35:54.171 227766 DEBUG nova.virt.libvirt.imagebackend [None req-861d3ec9-bea7-4b42-a733-1b7233a64336 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 05:35:54 np0005593234 nova_compute[227762]: 2026-01-23 10:35:54.575 227766 DEBUG nova.storage.rbd_utils [None req-861d3ec9-bea7-4b42-a733-1b7233a64336 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] creating snapshot(32737cc81e8545c8b8f5ad82fe20efef) on rbd image(6673e062-5d99-4c31-a3e0-673f55438d6e_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:35:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:54.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e360 e360: 3 total, 3 up, 3 in
Jan 23 05:35:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:55.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:56.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e361 e361: 3 total, 3 up, 3 in
Jan 23 05:35:57 np0005593234 nova_compute[227762]: 2026-01-23 10:35:57.518 227766 DEBUG nova.storage.rbd_utils [None req-861d3ec9-bea7-4b42-a733-1b7233a64336 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] cloning vms/6673e062-5d99-4c31-a3e0-673f55438d6e_disk@32737cc81e8545c8b8f5ad82fe20efef to images/60e9525a-9f0e-4c80-9c46-e7936c55e48b clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:35:57 np0005593234 nova_compute[227762]: 2026-01-23 10:35:57.708 227766 DEBUG nova.storage.rbd_utils [None req-861d3ec9-bea7-4b42-a733-1b7233a64336 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] flattening images/60e9525a-9f0e-4c80-9c46-e7936c55e48b flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 05:35:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:57.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:57 np0005593234 nova_compute[227762]: 2026-01-23 10:35:57.941 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e361 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:35:58 np0005593234 nova_compute[227762]: 2026-01-23 10:35:58.567 227766 DEBUG nova.storage.rbd_utils [None req-861d3ec9-bea7-4b42-a733-1b7233a64336 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] removing snapshot(32737cc81e8545c8b8f5ad82fe20efef) on rbd image(6673e062-5d99-4c31-a3e0-673f55438d6e_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 05:35:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:35:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:58.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:35:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e362 e362: 3 total, 3 up, 3 in
Jan 23 05:35:59 np0005593234 podman[316556]: 2026-01-23 10:35:59.023682841 +0000 UTC m=+0.077491762 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller)
Jan 23 05:35:59 np0005593234 nova_compute[227762]: 2026-01-23 10:35:59.069 227766 DEBUG nova.storage.rbd_utils [None req-861d3ec9-bea7-4b42-a733-1b7233a64336 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] creating snapshot(snap) on rbd image(60e9525a-9f0e-4c80-9c46-e7936c55e48b) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:35:59 np0005593234 nova_compute[227762]: 2026-01-23 10:35:59.112 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:35:59 np0005593234 nova_compute[227762]: 2026-01-23 10:35:59.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:59 np0005593234 nova_compute[227762]: 2026-01-23 10:35:59.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:35:59 np0005593234 nova_compute[227762]: 2026-01-23 10:35:59.789 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:35:59 np0005593234 nova_compute[227762]: 2026-01-23 10:35:59.789 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:35:59 np0005593234 nova_compute[227762]: 2026-01-23 10:35:59.789 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:35:59 np0005593234 nova_compute[227762]: 2026-01-23 10:35:59.790 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:35:59 np0005593234 nova_compute[227762]: 2026-01-23 10:35:59.790 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:35:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:35:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:35:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:59.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:35:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:35:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:35:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:35:59 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:36:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e363 e363: 3 total, 3 up, 3 in
Jan 23 05:36:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:36:00 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2296550670' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.410 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.620s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.515 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000ba as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.516 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000ba as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.520 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.520 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000af as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.525 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000b5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.526 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000b5 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.703 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.704 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3555MB free_disk=20.67003631591797GB free_vcpus=5 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.704 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.705 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:00.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.908 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 64ccc062-b11b-4cbc-96ba-620e43dfdb20 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.908 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 1114ae68-dab9-46b3-abab-53f135df78d8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.908 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 6673e062-5d99-4c31-a3e0-673f55438d6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.909 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:36:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:00.909 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=20GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:36:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:36:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:36:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:36:01 np0005593234 nova_compute[227762]: 2026-01-23 10:36:01.276 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:36:01 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4261012689' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:01 np0005593234 nova_compute[227762]: 2026-01-23 10:36:01.700 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:01 np0005593234 nova_compute[227762]: 2026-01-23 10:36:01.705 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:36:01 np0005593234 nova_compute[227762]: 2026-01-23 10:36:01.743 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:36:01 np0005593234 nova_compute[227762]: 2026-01-23 10:36:01.788 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:36:01 np0005593234 nova_compute[227762]: 2026-01-23 10:36:01.789 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.084s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:01.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:02 np0005593234 nova_compute[227762]: 2026-01-23 10:36:02.789 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:02 np0005593234 nova_compute[227762]: 2026-01-23 10:36:02.790 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:36:02 np0005593234 nova_compute[227762]: 2026-01-23 10:36:02.790 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:36:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:02.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:02 np0005593234 nova_compute[227762]: 2026-01-23 10:36:02.943 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:03 np0005593234 nova_compute[227762]: 2026-01-23 10:36:03.108 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:36:03 np0005593234 nova_compute[227762]: 2026-01-23 10:36:03.109 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:36:03 np0005593234 nova_compute[227762]: 2026-01-23 10:36:03.109 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:36:03 np0005593234 nova_compute[227762]: 2026-01-23 10:36:03.109 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:36:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:03.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:03 np0005593234 nova_compute[227762]: 2026-01-23 10:36:03.997 227766 INFO nova.virt.libvirt.driver [None req-861d3ec9-bea7-4b42-a733-1b7233a64336 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Snapshot image upload complete#033[00m
Jan 23 05:36:03 np0005593234 nova_compute[227762]: 2026-01-23 10:36:03.998 227766 INFO nova.compute.manager [None req-861d3ec9-bea7-4b42-a733-1b7233a64336 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Took 10.71 seconds to snapshot the instance on the hypervisor.#033[00m
Jan 23 05:36:04 np0005593234 nova_compute[227762]: 2026-01-23 10:36:04.087 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:04.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:05 np0005593234 nova_compute[227762]: 2026-01-23 10:36:05.819 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:05.823 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=71, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=70) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:36:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:05.824 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:36:05 np0005593234 nova_compute[227762]: 2026-01-23 10:36:05.902 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Updating instance_info_cache with network_info: [{"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:36:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:05.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:05 np0005593234 nova_compute[227762]: 2026-01-23 10:36:05.927 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-64ccc062-b11b-4cbc-96ba-620e43dfdb20" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:36:05 np0005593234 nova_compute[227762]: 2026-01-23 10:36:05.928 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:36:05 np0005593234 nova_compute[227762]: 2026-01-23 10:36:05.928 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:05 np0005593234 nova_compute[227762]: 2026-01-23 10:36:05.928 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:05 np0005593234 nova_compute[227762]: 2026-01-23 10:36:05.929 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:05 np0005593234 nova_compute[227762]: 2026-01-23 10:36:05.929 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:36:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:06.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:07.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:07 np0005593234 nova_compute[227762]: 2026-01-23 10:36:07.945 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e363 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:08 np0005593234 nova_compute[227762]: 2026-01-23 10:36:08.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:08.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:08.826 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '71'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e364 e364: 3 total, 3 up, 3 in
Jan 23 05:36:09 np0005593234 nova_compute[227762]: 2026-01-23 10:36:09.089 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:09 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:36:09 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:36:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:09.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:10.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:36:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:11.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:36:12 np0005593234 nova_compute[227762]: 2026-01-23 10:36:12.741 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:36:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:12.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:36:12 np0005593234 nova_compute[227762]: 2026-01-23 10:36:12.990 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #160. Immutable memtables: 0.
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.078817) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 101] Flushing memtable with next log file: 160
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164573079065, "job": 101, "event": "flush_started", "num_memtables": 1, "num_entries": 2438, "num_deletes": 254, "total_data_size": 5802617, "memory_usage": 5901616, "flush_reason": "Manual Compaction"}
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 101] Level-0 flush table #161: started
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164573104805, "cf_name": "default", "job": 101, "event": "table_file_creation", "file_number": 161, "file_size": 3784504, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 76143, "largest_seqno": 78576, "table_properties": {"data_size": 3774477, "index_size": 6392, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 20991, "raw_average_key_size": 20, "raw_value_size": 3754358, "raw_average_value_size": 3709, "num_data_blocks": 276, "num_entries": 1012, "num_filter_entries": 1012, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164376, "oldest_key_time": 1769164376, "file_creation_time": 1769164573, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 161, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 101] Flush lasted 25966 microseconds, and 10665 cpu microseconds.
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.104863) [db/flush_job.cc:967] [default] [JOB 101] Level-0 flush table #161: 3784504 bytes OK
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.104886) [db/memtable_list.cc:519] [default] Level-0 commit table #161 started
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.112473) [db/memtable_list.cc:722] [default] Level-0 commit table #161: memtable #1 done
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.112514) EVENT_LOG_v1 {"time_micros": 1769164573112506, "job": 101, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.112538) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 101] Try to delete WAL files size 5791729, prev total WAL file size 5791729, number of live WAL files 2.
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000157.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.113881) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730036373737' seq:72057594037927935, type:22 .. '7061786F730037303239' seq:0, type:0; will stop at (end)
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 102] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 101 Base level 0, inputs: [161(3695KB)], [159(9679KB)]
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164573113997, "job": 102, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [161], "files_L6": [159], "score": -1, "input_data_size": 13696661, "oldest_snapshot_seqno": -1}
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 102] Generated table #162: 9729 keys, 11804583 bytes, temperature: kUnknown
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164573219870, "cf_name": "default", "job": 102, "event": "table_file_creation", "file_number": 162, "file_size": 11804583, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11743170, "index_size": 35956, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24389, "raw_key_size": 256391, "raw_average_key_size": 26, "raw_value_size": 11574094, "raw_average_value_size": 1189, "num_data_blocks": 1369, "num_entries": 9729, "num_filter_entries": 9729, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769164573, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 162, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.220244) [db/compaction/compaction_job.cc:1663] [default] [JOB 102] Compacted 1@0 + 1@6 files to L6 => 11804583 bytes
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.277242) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 129.3 rd, 111.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.6, 9.5 +0.0 blob) out(11.3 +0.0 blob), read-write-amplify(6.7) write-amplify(3.1) OK, records in: 10256, records dropped: 527 output_compression: NoCompression
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.277274) EVENT_LOG_v1 {"time_micros": 1769164573277261, "job": 102, "event": "compaction_finished", "compaction_time_micros": 105951, "compaction_time_cpu_micros": 28763, "output_level": 6, "num_output_files": 1, "total_output_size": 11804583, "num_input_records": 10256, "num_output_records": 9729, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000161.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164573278056, "job": 102, "event": "table_file_deletion", "file_number": 161}
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000159.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164573279677, "job": 102, "event": "table_file_deletion", "file_number": 159}
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.113760) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.279790) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.279798) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.279801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.279803) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:36:13.279805) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:36:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e364 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:13.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:14 np0005593234 nova_compute[227762]: 2026-01-23 10:36:14.104 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:14.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e365 e365: 3 total, 3 up, 3 in
Jan 23 05:36:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:15.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:16 np0005593234 nova_compute[227762]: 2026-01-23 10:36:16.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:16.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.606 227766 DEBUG oslo_concurrency.lockutils [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.607 227766 DEBUG oslo_concurrency.lockutils [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.607 227766 DEBUG oslo_concurrency.lockutils [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.607 227766 DEBUG oslo_concurrency.lockutils [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.607 227766 DEBUG oslo_concurrency.lockutils [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.608 227766 INFO nova.compute.manager [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Terminating instance#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.609 227766 DEBUG nova.compute.manager [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:36:17 np0005593234 kernel: tap1827509f-e3 (unregistering): left promiscuous mode
Jan 23 05:36:17 np0005593234 NetworkManager[48942]: <info>  [1769164577.7369] device (tap1827509f-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:36:17 np0005593234 ovn_controller[134547]: 2026-01-23T10:36:17Z|00813|binding|INFO|Releasing lport 1827509f-e3b0-49ea-b1ff-982db21148b8 from this chassis (sb_readonly=0)
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.744 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:17 np0005593234 ovn_controller[134547]: 2026-01-23T10:36:17Z|00814|binding|INFO|Setting lport 1827509f-e3b0-49ea-b1ff-982db21148b8 down in Southbound
Jan 23 05:36:17 np0005593234 ovn_controller[134547]: 2026-01-23T10:36:17Z|00815|binding|INFO|Removing iface tap1827509f-e3 ovn-installed in OVS
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.748 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.761 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:17.761 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:16:71 10.100.0.14'], port_security=['fa:16:3e:02:16:71 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '1114ae68-dab9-46b3-abab-53f135df78d8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=1827509f-e3b0-49ea-b1ff-982db21148b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:36:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:17.764 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 1827509f-e3b0-49ea-b1ff-982db21148b8 in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 unbound from our chassis#033[00m
Jan 23 05:36:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:17.769 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d7d5530f-5227-4f75-bac0-2604bb3d68e2#033[00m
Jan 23 05:36:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:17.786 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9f8c7ac4-4aac-472b-92e5-2aaf8e8d8241]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:17 np0005593234 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000b5.scope: Deactivated successfully.
Jan 23 05:36:17 np0005593234 systemd[1]: machine-qemu\x2d89\x2dinstance\x2d000000b5.scope: Consumed 19.424s CPU time.
Jan 23 05:36:17 np0005593234 systemd-machined[195626]: Machine qemu-89-instance-000000b5 terminated.
Jan 23 05:36:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:17.816 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f923d630-f4e8-491b-8ebb-f41d6613eb7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:17.819 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[243ad3be-7637-4174-aa52-25ef309e4947]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.868 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:17.873 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[864c794c-b3fc-4bb4-be7a-1483670f9108]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.879 227766 INFO nova.virt.libvirt.driver [-] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Instance destroyed successfully.#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.880 227766 DEBUG nova.objects.instance [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'resources' on Instance uuid 1114ae68-dab9-46b3-abab-53f135df78d8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:36:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:17.892 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e609f612-634d-4854-b007-d8c39253c6c2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd7d5530f-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:00:67:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 11, 'tx_packets': 15, 'rx_bytes': 742, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 11, 'tx_packets': 15, 'rx_bytes': 742, 'tx_bytes': 774, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 236], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815735, 'reachable_time': 21815, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 316861, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.907 227766 DEBUG nova.virt.libvirt.vif [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:32:51Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-182552386',display_name='tempest-ServerStableDeviceRescueTest-server-182552386',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-182552386',id=181,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:33:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='815b71acf60d4ed8933ebd05228fa0c0',ramdisk_id='',reservation_id='r-491now07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1802220041',owner_user_name='tempest-ServerStableDeviceRescueTest-1802220041-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:33:43Z,user_data=None,user_id='e1629a4b14764dddaabcadd16f3e1c1c',uuid=1114ae68-dab9-46b3-abab-53f135df78d8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.908 227766 DEBUG nova.network.os_vif_util [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converting VIF {"id": "1827509f-e3b0-49ea-b1ff-982db21148b8", "address": "fa:16:3e:02:16:71", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1827509f-e3", "ovs_interfaceid": "1827509f-e3b0-49ea-b1ff-982db21148b8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.909 227766 DEBUG nova.network.os_vif_util [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:16:71,bridge_name='br-int',has_traffic_filtering=True,id=1827509f-e3b0-49ea-b1ff-982db21148b8,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1827509f-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.909 227766 DEBUG os_vif [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:16:71,bridge_name='br-int',has_traffic_filtering=True,id=1827509f-e3b0-49ea-b1ff-982db21148b8,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1827509f-e3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:36:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:17.909 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[75109711-8c69-42c4-a8db-7d187e642347]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd7d5530f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815745, 'tstamp': 815745}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 316862, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd7d5530f-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 815749, 'tstamp': 815749}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 316862, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.911 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:17.911 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.911 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1827509f-e3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.913 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.913 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:17.914 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd7d5530f-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:17.914 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:36:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:17.914 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd7d5530f-50, col_values=(('external_ids', {'iface-id': '4c99eeb5-c437-4d31-ac3b-bfd151140733'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:17.915 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:36:17 np0005593234 nova_compute[227762]: 2026-01-23 10:36:17.916 227766 INFO os_vif [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:16:71,bridge_name='br-int',has_traffic_filtering=True,id=1827509f-e3b0-49ea-b1ff-982db21148b8,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1827509f-e3')#033[00m
Jan 23 05:36:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:17.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e365 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:18 np0005593234 nova_compute[227762]: 2026-01-23 10:36:18.722 227766 DEBUG nova.compute.manager [req-08f99081-0847-4421-9c43-de83859db1ee req-ef2590ba-20b0-476a-b861-d4a5f2303825 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-vif-unplugged-1827509f-e3b0-49ea-b1ff-982db21148b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:18 np0005593234 nova_compute[227762]: 2026-01-23 10:36:18.722 227766 DEBUG oslo_concurrency.lockutils [req-08f99081-0847-4421-9c43-de83859db1ee req-ef2590ba-20b0-476a-b861-d4a5f2303825 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:18 np0005593234 nova_compute[227762]: 2026-01-23 10:36:18.722 227766 DEBUG oslo_concurrency.lockutils [req-08f99081-0847-4421-9c43-de83859db1ee req-ef2590ba-20b0-476a-b861-d4a5f2303825 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:18 np0005593234 nova_compute[227762]: 2026-01-23 10:36:18.723 227766 DEBUG oslo_concurrency.lockutils [req-08f99081-0847-4421-9c43-de83859db1ee req-ef2590ba-20b0-476a-b861-d4a5f2303825 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:18 np0005593234 nova_compute[227762]: 2026-01-23 10:36:18.723 227766 DEBUG nova.compute.manager [req-08f99081-0847-4421-9c43-de83859db1ee req-ef2590ba-20b0-476a-b861-d4a5f2303825 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] No waiting events found dispatching network-vif-unplugged-1827509f-e3b0-49ea-b1ff-982db21148b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:36:18 np0005593234 nova_compute[227762]: 2026-01-23 10:36:18.723 227766 DEBUG nova.compute.manager [req-08f99081-0847-4421-9c43-de83859db1ee req-ef2590ba-20b0-476a-b861-d4a5f2303825 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-vif-unplugged-1827509f-e3b0-49ea-b1ff-982db21148b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:36:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:18.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e366 e366: 3 total, 3 up, 3 in
Jan 23 05:36:19 np0005593234 nova_compute[227762]: 2026-01-23 10:36:19.144 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:19 np0005593234 nova_compute[227762]: 2026-01-23 10:36:19.149 227766 INFO nova.virt.libvirt.driver [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Deleting instance files /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8_del#033[00m
Jan 23 05:36:19 np0005593234 nova_compute[227762]: 2026-01-23 10:36:19.149 227766 INFO nova.virt.libvirt.driver [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Deletion of /var/lib/nova/instances/1114ae68-dab9-46b3-abab-53f135df78d8_del complete#033[00m
Jan 23 05:36:19 np0005593234 nova_compute[227762]: 2026-01-23 10:36:19.224 227766 INFO nova.compute.manager [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Took 1.61 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:36:19 np0005593234 nova_compute[227762]: 2026-01-23 10:36:19.225 227766 DEBUG oslo.service.loopingcall [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:36:19 np0005593234 nova_compute[227762]: 2026-01-23 10:36:19.225 227766 DEBUG nova.compute.manager [-] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:36:19 np0005593234 nova_compute[227762]: 2026-01-23 10:36:19.225 227766 DEBUG nova.network.neutron [-] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:36:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:19.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:20.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:20 np0005593234 nova_compute[227762]: 2026-01-23 10:36:20.909 227766 DEBUG nova.compute.manager [req-bb81c01e-a65f-4ed4-af5a-861fb2f3bbc7 req-88e36dc8-52c7-4b17-87af-8bc8b85faedb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:20 np0005593234 nova_compute[227762]: 2026-01-23 10:36:20.910 227766 DEBUG oslo_concurrency.lockutils [req-bb81c01e-a65f-4ed4-af5a-861fb2f3bbc7 req-88e36dc8-52c7-4b17-87af-8bc8b85faedb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:20 np0005593234 nova_compute[227762]: 2026-01-23 10:36:20.910 227766 DEBUG oslo_concurrency.lockutils [req-bb81c01e-a65f-4ed4-af5a-861fb2f3bbc7 req-88e36dc8-52c7-4b17-87af-8bc8b85faedb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:20 np0005593234 nova_compute[227762]: 2026-01-23 10:36:20.910 227766 DEBUG oslo_concurrency.lockutils [req-bb81c01e-a65f-4ed4-af5a-861fb2f3bbc7 req-88e36dc8-52c7-4b17-87af-8bc8b85faedb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:20 np0005593234 nova_compute[227762]: 2026-01-23 10:36:20.910 227766 DEBUG nova.compute.manager [req-bb81c01e-a65f-4ed4-af5a-861fb2f3bbc7 req-88e36dc8-52c7-4b17-87af-8bc8b85faedb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] No waiting events found dispatching network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:36:20 np0005593234 nova_compute[227762]: 2026-01-23 10:36:20.910 227766 WARNING nova.compute.manager [req-bb81c01e-a65f-4ed4-af5a-861fb2f3bbc7 req-88e36dc8-52c7-4b17-87af-8bc8b85faedb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received unexpected event network-vif-plugged-1827509f-e3b0-49ea-b1ff-982db21148b8 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:36:21 np0005593234 nova_compute[227762]: 2026-01-23 10:36:21.111 227766 DEBUG nova.network.neutron [-] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:36:21 np0005593234 nova_compute[227762]: 2026-01-23 10:36:21.140 227766 INFO nova.compute.manager [-] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Took 1.91 seconds to deallocate network for instance.#033[00m
Jan 23 05:36:21 np0005593234 nova_compute[227762]: 2026-01-23 10:36:21.233 227766 DEBUG oslo_concurrency.lockutils [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:21 np0005593234 nova_compute[227762]: 2026-01-23 10:36:21.234 227766 DEBUG oslo_concurrency.lockutils [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:21 np0005593234 nova_compute[227762]: 2026-01-23 10:36:21.248 227766 DEBUG nova.compute.manager [req-f6e59bc7-2b17-466d-b2fb-ebf4fd9d2d2e req-426aa5af-93af-486a-8f8a-cc0ee0d0e4d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Received event network-vif-deleted-1827509f-e3b0-49ea-b1ff-982db21148b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:21 np0005593234 podman[316909]: 2026-01-23 10:36:21.317431265 +0000 UTC m=+0.060989866 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Jan 23 05:36:21 np0005593234 nova_compute[227762]: 2026-01-23 10:36:21.380 227766 DEBUG oslo_concurrency.processutils [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:36:21 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/750607507' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:21 np0005593234 nova_compute[227762]: 2026-01-23 10:36:21.927 227766 DEBUG oslo_concurrency.processutils [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.547s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:21.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:21 np0005593234 nova_compute[227762]: 2026-01-23 10:36:21.934 227766 DEBUG nova.compute.provider_tree [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:36:21 np0005593234 nova_compute[227762]: 2026-01-23 10:36:21.958 227766 DEBUG nova.scheduler.client.report [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:36:21 np0005593234 nova_compute[227762]: 2026-01-23 10:36:21.987 227766 DEBUG oslo_concurrency.lockutils [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.753s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:22 np0005593234 nova_compute[227762]: 2026-01-23 10:36:22.038 227766 INFO nova.scheduler.client.report [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Deleted allocations for instance 1114ae68-dab9-46b3-abab-53f135df78d8#033[00m
Jan 23 05:36:22 np0005593234 nova_compute[227762]: 2026-01-23 10:36:22.175 227766 DEBUG oslo_concurrency.lockutils [None req-b799d3dc-db6e-4976-a995-009e42af9559 e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "1114ae68-dab9-46b3-abab-53f135df78d8" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:36:22Z|00816|binding|INFO|Releasing lport 4c99eeb5-c437-4d31-ac3b-bfd151140733 from this chassis (sb_readonly=0)
Jan 23 05:36:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:36:22Z|00817|binding|INFO|Releasing lport f45d82fe-2ce9-4738-9a72-0ec8f8b8032e from this chassis (sb_readonly=0)
Jan 23 05:36:22 np0005593234 nova_compute[227762]: 2026-01-23 10:36:22.241 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:22.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:22 np0005593234 nova_compute[227762]: 2026-01-23 10:36:22.912 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e366 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:23 np0005593234 nova_compute[227762]: 2026-01-23 10:36:23.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:23.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:24 np0005593234 nova_compute[227762]: 2026-01-23 10:36:24.146 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:24.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e367 e367: 3 total, 3 up, 3 in
Jan 23 05:36:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:25.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:26.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:27 np0005593234 nova_compute[227762]: 2026-01-23 10:36:27.913 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:27.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e367 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:36:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:28.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:36:29 np0005593234 nova_compute[227762]: 2026-01-23 10:36:29.148 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:29 np0005593234 podman[316981]: 2026-01-23 10:36:29.788350571 +0000 UTC m=+0.086940297 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:36:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:29.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:30.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:30 np0005593234 nova_compute[227762]: 2026-01-23 10:36:30.924 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:31.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:32.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:32 np0005593234 nova_compute[227762]: 2026-01-23 10:36:32.879 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164577.8774364, 1114ae68-dab9-46b3-abab-53f135df78d8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:36:32 np0005593234 nova_compute[227762]: 2026-01-23 10:36:32.880 227766 INFO nova.compute.manager [-] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:36:32 np0005593234 nova_compute[227762]: 2026-01-23 10:36:32.915 227766 DEBUG nova.compute.manager [None req-efead9ff-0f63-4b9c-be1a-60d9e2014cdc - - - - - -] [instance: 1114ae68-dab9-46b3-abab-53f135df78d8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:36:32 np0005593234 nova_compute[227762]: 2026-01-23 10:36:32.915 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e367 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e368 e368: 3 total, 3 up, 3 in
Jan 23 05:36:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:33.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:34 np0005593234 nova_compute[227762]: 2026-01-23 10:36:34.149 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:34.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:36:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:35.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:36:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:36.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e369 e369: 3 total, 3 up, 3 in
Jan 23 05:36:37 np0005593234 nova_compute[227762]: 2026-01-23 10:36:37.486 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:37.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:37 np0005593234 nova_compute[227762]: 2026-01-23 10:36:37.955 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:38 np0005593234 nova_compute[227762]: 2026-01-23 10:36:38.291 227766 DEBUG oslo_concurrency.lockutils [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:38 np0005593234 nova_compute[227762]: 2026-01-23 10:36:38.292 227766 DEBUG oslo_concurrency.lockutils [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:38 np0005593234 nova_compute[227762]: 2026-01-23 10:36:38.292 227766 DEBUG oslo_concurrency.lockutils [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:38 np0005593234 nova_compute[227762]: 2026-01-23 10:36:38.292 227766 DEBUG oslo_concurrency.lockutils [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:38 np0005593234 nova_compute[227762]: 2026-01-23 10:36:38.292 227766 DEBUG oslo_concurrency.lockutils [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:38 np0005593234 nova_compute[227762]: 2026-01-23 10:36:38.293 227766 INFO nova.compute.manager [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Terminating instance#033[00m
Jan 23 05:36:38 np0005593234 nova_compute[227762]: 2026-01-23 10:36:38.294 227766 DEBUG nova.compute.manager [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:36:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e369 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:38.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:38 np0005593234 kernel: tapfc7eda8e-2c (unregistering): left promiscuous mode
Jan 23 05:36:38 np0005593234 NetworkManager[48942]: <info>  [1769164598.8448] device (tapfc7eda8e-2c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:36:38 np0005593234 nova_compute[227762]: 2026-01-23 10:36:38.854 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:36:38Z|00818|binding|INFO|Releasing lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 from this chassis (sb_readonly=0)
Jan 23 05:36:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:36:38Z|00819|binding|INFO|Setting lport fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 down in Southbound
Jan 23 05:36:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:36:38Z|00820|binding|INFO|Removing iface tapfc7eda8e-2c ovn-installed in OVS
Jan 23 05:36:38 np0005593234 nova_compute[227762]: 2026-01-23 10:36:38.859 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:38.861 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:3a:01 10.100.0.11'], port_security=['fa:16:3e:9b:3a:01 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '64ccc062-b11b-4cbc-96ba-620e43dfdb20', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '815b71acf60d4ed8933ebd05228fa0c0', 'neutron:revision_number': '8', 'neutron:security_group_ids': '2840f436-c8a5-4177-8456-1f0b11461ed7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=11c3d371-746a-4085-8cb4-b3d90e2e50bf, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:36:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:38.864 144381 INFO neutron.agent.ovn.metadata.agent [-] Port fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 in datapath d7d5530f-5227-4f75-bac0-2604bb3d68e2 unbound from our chassis#033[00m
Jan 23 05:36:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:38.866 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d7d5530f-5227-4f75-bac0-2604bb3d68e2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:36:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:38.868 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5a4cb568-8874-4380-9dc4-d116c6521565]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:38 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:38.869 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 namespace which is not needed anymore#033[00m
Jan 23 05:36:38 np0005593234 nova_compute[227762]: 2026-01-23 10:36:38.873 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:38 np0005593234 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000af.scope: Deactivated successfully.
Jan 23 05:36:38 np0005593234 systemd[1]: machine-qemu\x2d85\x2dinstance\x2d000000af.scope: Consumed 27.131s CPU time.
Jan 23 05:36:38 np0005593234 systemd-machined[195626]: Machine qemu-85-instance-000000af terminated.
Jan 23 05:36:38 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[309439]: [NOTICE]   (309444) : haproxy version is 2.8.14-c23fe91
Jan 23 05:36:38 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[309439]: [NOTICE]   (309444) : path to executable is /usr/sbin/haproxy
Jan 23 05:36:38 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[309439]: [WARNING]  (309444) : Exiting Master process...
Jan 23 05:36:38 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[309439]: [ALERT]    (309444) : Current worker (309446) exited with code 143 (Terminated)
Jan 23 05:36:38 np0005593234 neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2[309439]: [WARNING]  (309444) : All workers exited. Exiting... (0)
Jan 23 05:36:39 np0005593234 systemd[1]: libpod-b7110cedb5c47d1bb54b16f82e6c6efc95624c5ecabab3d63b81781910e7eb79.scope: Deactivated successfully.
Jan 23 05:36:39 np0005593234 podman[317039]: 2026-01-23 10:36:39.008734893 +0000 UTC m=+0.044445960 container died b7110cedb5c47d1bb54b16f82e6c6efc95624c5ecabab3d63b81781910e7eb79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:36:39 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7110cedb5c47d1bb54b16f82e6c6efc95624c5ecabab3d63b81781910e7eb79-userdata-shm.mount: Deactivated successfully.
Jan 23 05:36:39 np0005593234 systemd[1]: var-lib-containers-storage-overlay-7f510ff51a8f0c21529f751bc8982caa4c2675a0cfd8d6fd7477afc40144634d-merged.mount: Deactivated successfully.
Jan 23 05:36:39 np0005593234 podman[317039]: 2026-01-23 10:36:39.049214167 +0000 UTC m=+0.084925204 container cleanup b7110cedb5c47d1bb54b16f82e6c6efc95624c5ecabab3d63b81781910e7eb79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 05:36:39 np0005593234 systemd[1]: libpod-conmon-b7110cedb5c47d1bb54b16f82e6c6efc95624c5ecabab3d63b81781910e7eb79.scope: Deactivated successfully.
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.129 227766 INFO nova.virt.libvirt.driver [-] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Instance destroyed successfully.#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.130 227766 DEBUG nova.objects.instance [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lazy-loading 'resources' on Instance uuid 64ccc062-b11b-4cbc-96ba-620e43dfdb20 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.158 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.224 227766 DEBUG nova.virt.libvirt.vif [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:29:55Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-ServerStableDeviceRescueTest-server-1480329487',display_name='tempest-ServerStableDeviceRescueTest-server-1480329487',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-serverstabledevicerescuetest-server-1480329487',id=175,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:30:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='815b71acf60d4ed8933ebd05228fa0c0',ramdisk_id='',reservation_id='r-gplzbwwd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-ServerStableDeviceRescueTest-1802220041',owner_user_name='tempest-ServerStableDeviceRescueTest-1802220041-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:31:03Z,user_data=None,user_id='e1629a4b14764dddaabcadd16f3e1c1c',uuid=64ccc062-b11b-4cbc-96ba-620e43dfdb20,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.225 227766 DEBUG nova.network.os_vif_util [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converting VIF {"id": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "address": "fa:16:3e:9b:3a:01", "network": {"id": "d7d5530f-5227-4f75-bac0-2604bb3d68e2", "bridge": "br-int", "label": "tempest-ServerStableDeviceRescueTest-1351383381-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "815b71acf60d4ed8933ebd05228fa0c0", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfc7eda8e-2c", "ovs_interfaceid": "fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.226 227766 DEBUG nova.network.os_vif_util [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9b:3a:01,bridge_name='br-int',has_traffic_filtering=True,id=fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc7eda8e-2c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.226 227766 DEBUG os_vif [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:3a:01,bridge_name='br-int',has_traffic_filtering=True,id=fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc7eda8e-2c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.228 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.228 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc7eda8e-2c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.230 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.232 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.234 227766 INFO os_vif [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:3a:01,bridge_name='br-int',has_traffic_filtering=True,id=fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15,network=Network(d7d5530f-5227-4f75-bac0-2604bb3d68e2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfc7eda8e-2c')#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.404 227766 DEBUG nova.compute.manager [req-3973fb84-86a1-4371-a989-12028148db40 req-48b20dbe-f4c5-46b6-981b-1d71e30daa90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-vif-unplugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.404 227766 DEBUG oslo_concurrency.lockutils [req-3973fb84-86a1-4371-a989-12028148db40 req-48b20dbe-f4c5-46b6-981b-1d71e30daa90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.405 227766 DEBUG oslo_concurrency.lockutils [req-3973fb84-86a1-4371-a989-12028148db40 req-48b20dbe-f4c5-46b6-981b-1d71e30daa90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.405 227766 DEBUG oslo_concurrency.lockutils [req-3973fb84-86a1-4371-a989-12028148db40 req-48b20dbe-f4c5-46b6-981b-1d71e30daa90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.405 227766 DEBUG nova.compute.manager [req-3973fb84-86a1-4371-a989-12028148db40 req-48b20dbe-f4c5-46b6-981b-1d71e30daa90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] No waiting events found dispatching network-vif-unplugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.405 227766 DEBUG nova.compute.manager [req-3973fb84-86a1-4371-a989-12028148db40 req-48b20dbe-f4c5-46b6-981b-1d71e30daa90 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-vif-unplugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:36:39 np0005593234 podman[317070]: 2026-01-23 10:36:39.595147674 +0000 UTC m=+0.525821310 container remove b7110cedb5c47d1bb54b16f82e6c6efc95624c5ecabab3d63b81781910e7eb79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:36:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:39.602 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[63b7d6a0-4828-4e18-ad6e-2636145d5ab0]: (4, ('Fri Jan 23 10:36:38 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 (b7110cedb5c47d1bb54b16f82e6c6efc95624c5ecabab3d63b81781910e7eb79)\nb7110cedb5c47d1bb54b16f82e6c6efc95624c5ecabab3d63b81781910e7eb79\nFri Jan 23 10:36:39 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 (b7110cedb5c47d1bb54b16f82e6c6efc95624c5ecabab3d63b81781910e7eb79)\nb7110cedb5c47d1bb54b16f82e6c6efc95624c5ecabab3d63b81781910e7eb79\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:39.604 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a91b7595-00c7-4e54-b038-1b541ebe53d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:39.605 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd7d5530f-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.607 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:39 np0005593234 kernel: tapd7d5530f-50: left promiscuous mode
Jan 23 05:36:39 np0005593234 nova_compute[227762]: 2026-01-23 10:36:39.620 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:39.624 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5509a902-a6e3-4e43-9363-3825fd783f4b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:39.650 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3a1dbb55-dcae-4d6c-95bd-570276417c06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:39.651 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3724fa64-341e-46bb-823e-85db3384d8e6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:39.669 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6d160cc7-f172-4f04-a6e4-903dacac0445]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 815719, 'reachable_time': 31381, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 317114, 'error': None, 'target': 'ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:39 np0005593234 systemd[1]: run-netns-ovnmeta\x2dd7d5530f\x2d5227\x2d4f75\x2dbac0\x2d2604bb3d68e2.mount: Deactivated successfully.
Jan 23 05:36:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:39.674 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d7d5530f-5227-4f75-bac0-2604bb3d68e2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:36:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:39.675 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[ce1f87ac-dd3e-4a7e-a234-bf179b23ce1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:36:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:36:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:39.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:36:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:40.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:41 np0005593234 nova_compute[227762]: 2026-01-23 10:36:41.275 227766 INFO nova.virt.libvirt.driver [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Deleting instance files /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20_del#033[00m
Jan 23 05:36:41 np0005593234 nova_compute[227762]: 2026-01-23 10:36:41.276 227766 INFO nova.virt.libvirt.driver [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Deletion of /var/lib/nova/instances/64ccc062-b11b-4cbc-96ba-620e43dfdb20_del complete#033[00m
Jan 23 05:36:41 np0005593234 nova_compute[227762]: 2026-01-23 10:36:41.503 227766 INFO nova.compute.manager [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Took 3.21 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:36:41 np0005593234 nova_compute[227762]: 2026-01-23 10:36:41.504 227766 DEBUG oslo.service.loopingcall [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:36:41 np0005593234 nova_compute[227762]: 2026-01-23 10:36:41.505 227766 DEBUG nova.compute.manager [-] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:36:41 np0005593234 nova_compute[227762]: 2026-01-23 10:36:41.505 227766 DEBUG nova.network.neutron [-] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:36:41 np0005593234 nova_compute[227762]: 2026-01-23 10:36:41.706 227766 DEBUG nova.compute.manager [req-f6999e32-1228-4d41-a028-528462c50ac6 req-e1a447f7-b56a-491d-bda6-af1fb364bbbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:41 np0005593234 nova_compute[227762]: 2026-01-23 10:36:41.707 227766 DEBUG oslo_concurrency.lockutils [req-f6999e32-1228-4d41-a028-528462c50ac6 req-e1a447f7-b56a-491d-bda6-af1fb364bbbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:41 np0005593234 nova_compute[227762]: 2026-01-23 10:36:41.708 227766 DEBUG oslo_concurrency.lockutils [req-f6999e32-1228-4d41-a028-528462c50ac6 req-e1a447f7-b56a-491d-bda6-af1fb364bbbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:41 np0005593234 nova_compute[227762]: 2026-01-23 10:36:41.708 227766 DEBUG oslo_concurrency.lockutils [req-f6999e32-1228-4d41-a028-528462c50ac6 req-e1a447f7-b56a-491d-bda6-af1fb364bbbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:41 np0005593234 nova_compute[227762]: 2026-01-23 10:36:41.708 227766 DEBUG nova.compute.manager [req-f6999e32-1228-4d41-a028-528462c50ac6 req-e1a447f7-b56a-491d-bda6-af1fb364bbbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] No waiting events found dispatching network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:36:41 np0005593234 nova_compute[227762]: 2026-01-23 10:36:41.709 227766 WARNING nova.compute.manager [req-f6999e32-1228-4d41-a028-528462c50ac6 req-e1a447f7-b56a-491d-bda6-af1fb364bbbb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received unexpected event network-vif-plugged-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:36:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:41.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:42 np0005593234 nova_compute[227762]: 2026-01-23 10:36:42.363 227766 DEBUG nova.network.neutron [-] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:36:42 np0005593234 nova_compute[227762]: 2026-01-23 10:36:42.391 227766 INFO nova.compute.manager [-] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Took 0.89 seconds to deallocate network for instance.#033[00m
Jan 23 05:36:42 np0005593234 nova_compute[227762]: 2026-01-23 10:36:42.455 227766 DEBUG nova.compute.manager [req-f9ffedd8-e187-4bbc-be00-fc074c27a619 req-e5e40a22-9d6c-4ba4-8e59-3ab2a2781d55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Received event network-vif-deleted-fc7eda8e-2c6e-4391-aa9c-5a0072ec4c15 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:36:42 np0005593234 nova_compute[227762]: 2026-01-23 10:36:42.466 227766 DEBUG oslo_concurrency.lockutils [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:42 np0005593234 nova_compute[227762]: 2026-01-23 10:36:42.466 227766 DEBUG oslo_concurrency.lockutils [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:42 np0005593234 nova_compute[227762]: 2026-01-23 10:36:42.533 227766 DEBUG oslo_concurrency.processutils [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:42.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:42.873 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:42.875 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:36:42.875 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e369 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:36:43 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2334137154' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:43 np0005593234 nova_compute[227762]: 2026-01-23 10:36:43.699 227766 DEBUG oslo_concurrency.processutils [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:43 np0005593234 nova_compute[227762]: 2026-01-23 10:36:43.705 227766 DEBUG nova.compute.provider_tree [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:36:43 np0005593234 nova_compute[227762]: 2026-01-23 10:36:43.761 227766 DEBUG nova.scheduler.client.report [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:36:43 np0005593234 nova_compute[227762]: 2026-01-23 10:36:43.789 227766 DEBUG oslo_concurrency.lockutils [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.323s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:43 np0005593234 nova_compute[227762]: 2026-01-23 10:36:43.818 227766 INFO nova.scheduler.client.report [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Deleted allocations for instance 64ccc062-b11b-4cbc-96ba-620e43dfdb20#033[00m
Jan 23 05:36:43 np0005593234 nova_compute[227762]: 2026-01-23 10:36:43.884 227766 DEBUG oslo_concurrency.lockutils [None req-4feb34e3-972a-47fa-8c0a-eda3cb8c342c e1629a4b14764dddaabcadd16f3e1c1c 815b71acf60d4ed8933ebd05228fa0c0 - - default default] Lock "64ccc062-b11b-4cbc-96ba-620e43dfdb20" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.593s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:43.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:36:44 np0005593234 nova_compute[227762]: 2026-01-23 10:36:44.159 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:44 np0005593234 nova_compute[227762]: 2026-01-23 10:36:44.230 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:36:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/657077111' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:36:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:36:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/657077111' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:36:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:44.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:45.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:46.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:47.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e369 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:48 np0005593234 ovn_controller[134547]: 2026-01-23T10:36:48Z|00821|binding|INFO|Releasing lport f45d82fe-2ce9-4738-9a72-0ec8f8b8032e from this chassis (sb_readonly=0)
Jan 23 05:36:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:48.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:48 np0005593234 nova_compute[227762]: 2026-01-23 10:36:48.865 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e370 e370: 3 total, 3 up, 3 in
Jan 23 05:36:49 np0005593234 nova_compute[227762]: 2026-01-23 10:36:49.161 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:49 np0005593234 nova_compute[227762]: 2026-01-23 10:36:49.231 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:49.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:50.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:51 np0005593234 podman[317195]: 2026-01-23 10:36:51.787280424 +0000 UTC m=+0.070201493 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:36:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:51.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:52.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:53.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:54 np0005593234 nova_compute[227762]: 2026-01-23 10:36:54.127 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164599.1255493, 64ccc062-b11b-4cbc-96ba-620e43dfdb20 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:36:54 np0005593234 nova_compute[227762]: 2026-01-23 10:36:54.127 227766 INFO nova.compute.manager [-] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:36:54 np0005593234 nova_compute[227762]: 2026-01-23 10:36:54.160 227766 DEBUG nova.compute.manager [None req-357dbe46-cac1-4eea-ac0b-7008fd9f8d25 - - - - - -] [instance: 64ccc062-b11b-4cbc-96ba-620e43dfdb20] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:36:54 np0005593234 nova_compute[227762]: 2026-01-23 10:36:54.163 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:54 np0005593234 nova_compute[227762]: 2026-01-23 10:36:54.232 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:54.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:55.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:56.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:57.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:36:58 np0005593234 nova_compute[227762]: 2026-01-23 10:36:58.775 227766 DEBUG oslo_concurrency.lockutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Acquiring lock "d077ff2d-0631-474d-b9a9-61fc36577163" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:58 np0005593234 nova_compute[227762]: 2026-01-23 10:36:58.776 227766 DEBUG oslo_concurrency.lockutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "d077ff2d-0631-474d-b9a9-61fc36577163" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:58 np0005593234 nova_compute[227762]: 2026-01-23 10:36:58.794 227766 DEBUG nova.compute.manager [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:36:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:36:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:58.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:36:58 np0005593234 nova_compute[227762]: 2026-01-23 10:36:58.980 227766 DEBUG oslo_concurrency.lockutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:36:58 np0005593234 nova_compute[227762]: 2026-01-23 10:36:58.980 227766 DEBUG oslo_concurrency.lockutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:36:58 np0005593234 nova_compute[227762]: 2026-01-23 10:36:58.988 227766 DEBUG nova.virt.hardware [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:36:58 np0005593234 nova_compute[227762]: 2026-01-23 10:36:58.988 227766 INFO nova.compute.claims [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.103 227766 DEBUG nova.scheduler.client.report [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.165 227766 DEBUG nova.scheduler.client.report [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.166 227766 DEBUG nova.compute.provider_tree [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.169 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.182 227766 DEBUG nova.scheduler.client.report [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.205 227766 DEBUG nova.scheduler.client.report [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.234 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.272 227766 DEBUG oslo_concurrency.processutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:36:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:36:59 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2411288539' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.709 227766 DEBUG oslo_concurrency.processutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.719 227766 DEBUG nova.compute.provider_tree [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.735 227766 DEBUG nova.scheduler.client.report [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.764 227766 DEBUG oslo_concurrency.lockutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.765 227766 DEBUG nova.compute.manager [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.809 227766 DEBUG nova.compute.manager [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Not allocating networking since 'none' was specified. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1948#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.835 227766 INFO nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.870 227766 DEBUG nova.compute.manager [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.963 227766 DEBUG nova.compute.manager [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.965 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:36:59 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.965 227766 INFO nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Creating image(s)#033[00m
Jan 23 05:36:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:36:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:36:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:59.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:36:59.999 227766 DEBUG nova.storage.rbd_utils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.028 227766 DEBUG nova.storage.rbd_utils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.060 227766 DEBUG nova.storage.rbd_utils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.064 227766 DEBUG oslo_concurrency.processutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.126 227766 DEBUG oslo_concurrency.processutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.127 227766 DEBUG oslo_concurrency.lockutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.128 227766 DEBUG oslo_concurrency.lockutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.128 227766 DEBUG oslo_concurrency.lockutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.162 227766 DEBUG nova.storage.rbd_utils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.166 227766 DEBUG oslo_concurrency.processutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 d077ff2d-0631-474d-b9a9-61fc36577163_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.704 227766 DEBUG oslo_concurrency.processutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 d077ff2d-0631-474d-b9a9-61fc36577163_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.538s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.782 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.790 227766 DEBUG nova.storage.rbd_utils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] resizing rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:37:00 np0005593234 podman[317336]: 2026-01-23 10:37:00.798648657 +0000 UTC m=+0.088376493 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 23 05:37:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:00.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.881 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.881 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.881 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.882 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.882 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.962 227766 DEBUG nova.objects.instance [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lazy-loading 'migration_context' on Instance uuid d077ff2d-0631-474d-b9a9-61fc36577163 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.977 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.978 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Ensure instance console log exists: /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.978 227766 DEBUG oslo_concurrency.lockutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.978 227766 DEBUG oslo_concurrency.lockutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.979 227766 DEBUG oslo_concurrency.lockutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.980 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.985 227766 WARNING nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.992 227766 DEBUG nova.virt.libvirt.host [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.992 227766 DEBUG nova.virt.libvirt.host [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.997 227766 DEBUG nova.virt.libvirt.host [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.997 227766 DEBUG nova.virt.libvirt.host [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.998 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.998 227766 DEBUG nova.virt.hardware [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.999 227766 DEBUG nova.virt.hardware [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.999 227766 DEBUG nova.virt.hardware [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:37:00 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.999 227766 DEBUG nova.virt.hardware [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.999 227766 DEBUG nova.virt.hardware [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:00.999 227766 DEBUG nova.virt.hardware [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.000 227766 DEBUG nova.virt.hardware [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.000 227766 DEBUG nova.virt.hardware [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.000 227766 DEBUG nova.virt.hardware [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.000 227766 DEBUG nova.virt.hardware [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.000 227766 DEBUG nova.virt.hardware [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.003 227766 DEBUG oslo_concurrency.processutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:37:01 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3347826092' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.314 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.383 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000ba as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.384 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000ba as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:37:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:37:01 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3387490758' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.437 227766 DEBUG oslo_concurrency.processutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.467 227766 DEBUG nova.storage.rbd_utils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.471 227766 DEBUG oslo_concurrency.processutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.620 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.621 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3893MB free_disk=20.913299560546875GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.621 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.621 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.720 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 6673e062-5d99-4c31-a3e0-673f55438d6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.720 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance d077ff2d-0631-474d-b9a9-61fc36577163 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.721 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.721 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:37:01 np0005593234 nova_compute[227762]: 2026-01-23 10:37:01.770 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:01.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:37:02 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3558338888' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:37:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:37:02 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3890734211' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:37:02 np0005593234 nova_compute[227762]: 2026-01-23 10:37:02.275 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:02 np0005593234 nova_compute[227762]: 2026-01-23 10:37:02.277 227766 DEBUG oslo_concurrency.processutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.806s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:02 np0005593234 nova_compute[227762]: 2026-01-23 10:37:02.279 227766 DEBUG nova.objects.instance [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lazy-loading 'pci_devices' on Instance uuid d077ff2d-0631-474d-b9a9-61fc36577163 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:37:02 np0005593234 nova_compute[227762]: 2026-01-23 10:37:02.288 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:37:02 np0005593234 nova_compute[227762]: 2026-01-23 10:37:02.305 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:37:02 np0005593234 nova_compute[227762]: 2026-01-23 10:37:02.310 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  <uuid>d077ff2d-0631-474d-b9a9-61fc36577163</uuid>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  <name>instance-000000be</name>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerShowV257Test-server-1695295809</nova:name>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:37:00</nova:creationTime>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <nova:user uuid="4016f133f441491ab245d0b8e9d6d7f5">tempest-ServerShowV257Test-1820049832-project-member</nova:user>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <nova:project uuid="b6db0c8b5ec04031aaacc904f210c5dd">tempest-ServerShowV257Test-1820049832</nova:project>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <entry name="serial">d077ff2d-0631-474d-b9a9-61fc36577163</entry>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <entry name="uuid">d077ff2d-0631-474d-b9a9-61fc36577163</entry>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/d077ff2d-0631-474d-b9a9-61fc36577163_disk">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/d077ff2d-0631-474d-b9a9-61fc36577163_disk.config">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/console.log" append="off"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:37:02 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:37:02 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:37:02 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:37:02 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:37:02 np0005593234 nova_compute[227762]: 2026-01-23 10:37:02.331 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:37:02 np0005593234 nova_compute[227762]: 2026-01-23 10:37:02.332 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.710s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:02 np0005593234 nova_compute[227762]: 2026-01-23 10:37:02.435 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:37:02 np0005593234 nova_compute[227762]: 2026-01-23 10:37:02.435 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:37:02 np0005593234 nova_compute[227762]: 2026-01-23 10:37:02.436 227766 INFO nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Using config drive#033[00m
Jan 23 05:37:02 np0005593234 nova_compute[227762]: 2026-01-23 10:37:02.464 227766 DEBUG nova.storage.rbd_utils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:37:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:02.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:02 np0005593234 nova_compute[227762]: 2026-01-23 10:37:02.873 227766 INFO nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Creating config drive at /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/disk.config#033[00m
Jan 23 05:37:02 np0005593234 nova_compute[227762]: 2026-01-23 10:37:02.878 227766 DEBUG oslo_concurrency.processutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2tijjxiz execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:03 np0005593234 nova_compute[227762]: 2026-01-23 10:37:03.017 227766 DEBUG oslo_concurrency.processutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp2tijjxiz" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:03 np0005593234 nova_compute[227762]: 2026-01-23 10:37:03.262 227766 DEBUG nova.storage.rbd_utils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:37:03 np0005593234 nova_compute[227762]: 2026-01-23 10:37:03.266 227766 DEBUG oslo_concurrency.processutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/disk.config d077ff2d-0631-474d-b9a9-61fc36577163_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:03 np0005593234 nova_compute[227762]: 2026-01-23 10:37:03.770 227766 DEBUG oslo_concurrency.processutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/disk.config d077ff2d-0631-474d-b9a9-61fc36577163_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:03 np0005593234 nova_compute[227762]: 2026-01-23 10:37:03.771 227766 INFO nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Deleting local config drive /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/disk.config because it was imported into RBD.#033[00m
Jan 23 05:37:03 np0005593234 systemd-machined[195626]: New machine qemu-91-instance-000000be.
Jan 23 05:37:03 np0005593234 systemd[1]: Started Virtual Machine qemu-91-instance-000000be.
Jan 23 05:37:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:03.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.168 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.236 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:04 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #57. Immutable memtables: 13.
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.294 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.294 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.362 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164624.3621895, d077ff2d-0631-474d-b9a9-61fc36577163 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.364 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.366 227766 DEBUG nova.compute.manager [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.367 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.372 227766 INFO nova.virt.libvirt.driver [-] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Instance spawned successfully.#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.373 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.386 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.391 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.396 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.396 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.397 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.397 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.398 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.398 227766 DEBUG nova.virt.libvirt.driver [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.426 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.427 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164624.3640769, d077ff2d-0631-474d-b9a9-61fc36577163 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.427 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] VM Started (Lifecycle Event)#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.461 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.466 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.478 227766 INFO nova.compute.manager [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Took 4.51 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.479 227766 DEBUG nova.compute.manager [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.512 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.546 227766 INFO nova.compute.manager [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Took 5.69 seconds to build instance.
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.571 227766 DEBUG oslo_concurrency.lockutils [None req-5e6e5fa9-589b-482f-90e7-2c8f745319ce 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "d077ff2d-0631-474d-b9a9-61fc36577163" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 5.796s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.654 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.654 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:37:04 np0005593234 nova_compute[227762]: 2026-01-23 10:37:04.655 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 05:37:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:04.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:37:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2689328159' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:37:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:05.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:06 np0005593234 nova_compute[227762]: 2026-01-23 10:37:06.008 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Updating instance_info_cache with network_info: [{"id": "ddcd0522-401c-4c1d-90df-7c407812f643", "address": "fa:16:3e:9f:74:a2", "network": {"id": "f2cadf27-eae4-40fd-be37-b605e054ab76", "bridge": "br-int", "label": "tempest-TestStampPattern-1832792840-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59dad6496894352a2f4c7eb66ca1914", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd0522-40", "ovs_interfaceid": "ddcd0522-401c-4c1d-90df-7c407812f643", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:37:06 np0005593234 nova_compute[227762]: 2026-01-23 10:37:06.029 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:37:06 np0005593234 nova_compute[227762]: 2026-01-23 10:37:06.030 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 05:37:06 np0005593234 nova_compute[227762]: 2026-01-23 10:37:06.030 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:37:06 np0005593234 nova_compute[227762]: 2026-01-23 10:37:06.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:37:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:06.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:07 np0005593234 nova_compute[227762]: 2026-01-23 10:37:07.229 227766 INFO nova.compute.manager [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Rebuilding instance
Jan 23 05:37:07 np0005593234 nova_compute[227762]: 2026-01-23 10:37:07.446 227766 DEBUG nova.objects.instance [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lazy-loading 'trusted_certs' on Instance uuid d077ff2d-0631-474d-b9a9-61fc36577163 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:37:07 np0005593234 nova_compute[227762]: 2026-01-23 10:37:07.464 227766 DEBUG nova.compute.manager [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:37:07 np0005593234 nova_compute[227762]: 2026-01-23 10:37:07.512 227766 DEBUG nova.objects.instance [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lazy-loading 'pci_requests' on Instance uuid d077ff2d-0631-474d-b9a9-61fc36577163 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:37:07 np0005593234 nova_compute[227762]: 2026-01-23 10:37:07.529 227766 DEBUG nova.objects.instance [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lazy-loading 'pci_devices' on Instance uuid d077ff2d-0631-474d-b9a9-61fc36577163 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:37:07 np0005593234 nova_compute[227762]: 2026-01-23 10:37:07.543 227766 DEBUG nova.objects.instance [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lazy-loading 'resources' on Instance uuid d077ff2d-0631-474d-b9a9-61fc36577163 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:37:07 np0005593234 nova_compute[227762]: 2026-01-23 10:37:07.557 227766 DEBUG nova.objects.instance [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lazy-loading 'migration_context' on Instance uuid d077ff2d-0631-474d-b9a9-61fc36577163 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:37:07 np0005593234 nova_compute[227762]: 2026-01-23 10:37:07.573 227766 DEBUG nova.objects.instance [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Jan 23 05:37:07 np0005593234 nova_compute[227762]: 2026-01-23 10:37:07.576 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071
Jan 23 05:37:07 np0005593234 nova_compute[227762]: 2026-01-23 10:37:07.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:37:07 np0005593234 nova_compute[227762]: 2026-01-23 10:37:07.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:37:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:07.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:08.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:09 np0005593234 nova_compute[227762]: 2026-01-23 10:37:09.170 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:37:09 np0005593234 nova_compute[227762]: 2026-01-23 10:37:09.239 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:37:09 np0005593234 nova_compute[227762]: 2026-01-23 10:37:09.965 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:37:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:09.966 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=72, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=71) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:37:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:09.968 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:37:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:09.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:37:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:37:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:37:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:37:10 np0005593234 nova_compute[227762]: 2026-01-23 10:37:10.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:37:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:10.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:11.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:12.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:37:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:14.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:37:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:14 np0005593234 nova_compute[227762]: 2026-01-23 10:37:14.173 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:37:14 np0005593234 nova_compute[227762]: 2026-01-23 10:37:14.242 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:37:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:14.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:16.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:16 np0005593234 nova_compute[227762]: 2026-01-23 10:37:16.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:37:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:16.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:17 np0005593234 nova_compute[227762]: 2026-01-23 10:37:17.623 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Instance in state 1 after 10 seconds - resending shutdown _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4101
Jan 23 05:37:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:17.970 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '72'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:37:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:37:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:18.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:37:18 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:37:18 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:37:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:37:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/574691209' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:37:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:37:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/574691209' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:37:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:18.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e370 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:19 np0005593234 nova_compute[227762]: 2026-01-23 10:37:19.215 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:37:19 np0005593234 nova_compute[227762]: 2026-01-23 10:37:19.243 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:37:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e371 e371: 3 total, 3 up, 3 in
Jan 23 05:37:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:20.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e372 e372: 3 total, 3 up, 3 in
Jan 23 05:37:20 np0005593234 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000be.scope: Deactivated successfully.
Jan 23 05:37:20 np0005593234 systemd[1]: machine-qemu\x2d91\x2dinstance\x2d000000be.scope: Consumed 13.043s CPU time.
Jan 23 05:37:20 np0005593234 systemd-machined[195626]: Machine qemu-91-instance-000000be terminated.
Jan 23 05:37:20 np0005593234 nova_compute[227762]: 2026-01-23 10:37:20.637 227766 INFO nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Instance shutdown successfully after 13 seconds.
Jan 23 05:37:20 np0005593234 nova_compute[227762]: 2026-01-23 10:37:20.643 227766 INFO nova.virt.libvirt.driver [-] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Instance destroyed successfully.
Jan 23 05:37:20 np0005593234 nova_compute[227762]: 2026-01-23 10:37:20.649 227766 INFO nova.virt.libvirt.driver [-] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Instance destroyed successfully.
Jan 23 05:37:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:20.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:21 np0005593234 nova_compute[227762]: 2026-01-23 10:37:21.123 227766 INFO nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Deleting instance files /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163_del
Jan 23 05:37:21 np0005593234 nova_compute[227762]: 2026-01-23 10:37:21.124 227766 INFO nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Deletion of /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163_del complete
Jan 23 05:37:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:22.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.136 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.137 227766 INFO nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Creating image(s)
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.170 227766 DEBUG nova.storage.rbd_utils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.202 227766 DEBUG nova.storage.rbd_utils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.234 227766 DEBUG nova.storage.rbd_utils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.238 227766 DEBUG oslo_concurrency.processutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.331 227766 DEBUG oslo_concurrency.processutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.332 227766 DEBUG oslo_concurrency.lockutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Acquiring lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.332 227766 DEBUG oslo_concurrency.lockutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.333 227766 DEBUG oslo_concurrency.lockutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.363 227766 DEBUG nova.storage.rbd_utils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.367 227766 DEBUG oslo_concurrency.processutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 d077ff2d-0631-474d-b9a9-61fc36577163_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.682 227766 DEBUG oslo_concurrency.processutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 d077ff2d-0631-474d-b9a9-61fc36577163_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.315s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.756 227766 DEBUG nova.storage.rbd_utils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] resizing rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 05:37:22 np0005593234 podman[318068]: 2026-01-23 10:37:22.777579055 +0000 UTC m=+0.062594227 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.795 227766 DEBUG nova.compute.manager [req-8a68d72a-ecbd-4905-933e-ddfe2460ae3a req-933a25df-b387-419b-9b03-08ed0826c256 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Received event network-changed-ddcd0522-401c-4c1d-90df-7c407812f643 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.796 227766 DEBUG nova.compute.manager [req-8a68d72a-ecbd-4905-933e-ddfe2460ae3a req-933a25df-b387-419b-9b03-08ed0826c256 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Refreshing instance network info cache due to event network-changed-ddcd0522-401c-4c1d-90df-7c407812f643. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.796 227766 DEBUG oslo_concurrency.lockutils [req-8a68d72a-ecbd-4905-933e-ddfe2460ae3a req-933a25df-b387-419b-9b03-08ed0826c256 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.796 227766 DEBUG oslo_concurrency.lockutils [req-8a68d72a-ecbd-4905-933e-ddfe2460ae3a req-933a25df-b387-419b-9b03-08ed0826c256 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.797 227766 DEBUG nova.network.neutron [req-8a68d72a-ecbd-4905-933e-ddfe2460ae3a req-933a25df-b387-419b-9b03-08ed0826c256 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Refreshing network info cache for port ddcd0522-401c-4c1d-90df-7c407812f643 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.866 227766 DEBUG oslo_concurrency.lockutils [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Acquiring lock "6673e062-5d99-4c31-a3e0-673f55438d6e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.867 227766 DEBUG oslo_concurrency.lockutils [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.867 227766 DEBUG oslo_concurrency.lockutils [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Acquiring lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.867 227766 DEBUG oslo_concurrency.lockutils [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.868 227766 DEBUG oslo_concurrency.lockutils [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.869 227766 INFO nova.compute.manager [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Terminating instance#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.870 227766 DEBUG nova.compute.manager [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.875 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.876 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Ensure instance console log exists: /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.876 227766 DEBUG oslo_concurrency.lockutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.877 227766 DEBUG oslo_concurrency.lockutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.877 227766 DEBUG oslo_concurrency.lockutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:22.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.878 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.885 227766 WARNING nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.894 227766 DEBUG nova.virt.libvirt.host [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.895 227766 DEBUG nova.virt.libvirt.host [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.902 227766 DEBUG nova.virt.libvirt.host [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.902 227766 DEBUG nova.virt.libvirt.host [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.904 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.904 227766 DEBUG nova.virt.hardware [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.904 227766 DEBUG nova.virt.hardware [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.905 227766 DEBUG nova.virt.hardware [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.905 227766 DEBUG nova.virt.hardware [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.905 227766 DEBUG nova.virt.hardware [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.905 227766 DEBUG nova.virt.hardware [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.906 227766 DEBUG nova.virt.hardware [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.906 227766 DEBUG nova.virt.hardware [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.906 227766 DEBUG nova.virt.hardware [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.906 227766 DEBUG nova.virt.hardware [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.906 227766 DEBUG nova.virt.hardware [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.907 227766 DEBUG nova.objects.instance [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lazy-loading 'vcpu_model' on Instance uuid d077ff2d-0631-474d-b9a9-61fc36577163 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.936 227766 DEBUG oslo_concurrency.processutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:22 np0005593234 kernel: tapddcd0522-40 (unregistering): left promiscuous mode
Jan 23 05:37:22 np0005593234 NetworkManager[48942]: <info>  [1769164642.9400] device (tapddcd0522-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:37:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:37:22Z|00822|binding|INFO|Releasing lport ddcd0522-401c-4c1d-90df-7c407812f643 from this chassis (sb_readonly=0)
Jan 23 05:37:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:37:22Z|00823|binding|INFO|Setting lport ddcd0522-401c-4c1d-90df-7c407812f643 down in Southbound
Jan 23 05:37:22 np0005593234 ovn_controller[134547]: 2026-01-23T10:37:22Z|00824|binding|INFO|Removing iface tapddcd0522-40 ovn-installed in OVS
Jan 23 05:37:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:22.961 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9f:74:a2 10.100.0.10'], port_security=['fa:16:3e:9f:74:a2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '6673e062-5d99-4c31-a3e0-673f55438d6e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f2cadf27-eae4-40fd-be37-b605e054ab76', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd59dad6496894352a2f4c7eb66ca1914', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b37d3a3b-2932-4053-8960-53ee514541cc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=abb49df5-6f5b-4d51-9b7e-cc6910f0a6bb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=ddcd0522-401c-4c1d-90df-7c407812f643) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:37:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:22.964 144381 INFO neutron.agent.ovn.metadata.agent [-] Port ddcd0522-401c-4c1d-90df-7c407812f643 in datapath f2cadf27-eae4-40fd-be37-b605e054ab76 unbound from our chassis#033[00m
Jan 23 05:37:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:22.966 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f2cadf27-eae4-40fd-be37-b605e054ab76, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:37:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:22.967 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0988df1a-2ae7-4677-a63f-ed2447bf1233]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:37:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:22.968 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76 namespace which is not needed anymore#033[00m
Jan 23 05:37:22 np0005593234 nova_compute[227762]: 2026-01-23 10:37:22.971 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:22 np0005593234 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000ba.scope: Deactivated successfully.
Jan 23 05:37:22 np0005593234 systemd[1]: machine-qemu\x2d90\x2dinstance\x2d000000ba.scope: Consumed 19.129s CPU time.
Jan 23 05:37:23 np0005593234 systemd-machined[195626]: Machine qemu-90-instance-000000ba terminated.
Jan 23 05:37:23 np0005593234 NetworkManager[48942]: <info>  [1769164643.0952] manager: (tapddcd0522-40): new Tun device (/org/freedesktop/NetworkManager/Devices/385)
Jan 23 05:37:23 np0005593234 neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76[316035]: [NOTICE]   (316039) : haproxy version is 2.8.14-c23fe91
Jan 23 05:37:23 np0005593234 neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76[316035]: [NOTICE]   (316039) : path to executable is /usr/sbin/haproxy
Jan 23 05:37:23 np0005593234 neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76[316035]: [WARNING]  (316039) : Exiting Master process...
Jan 23 05:37:23 np0005593234 neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76[316035]: [ALERT]    (316039) : Current worker (316041) exited with code 143 (Terminated)
Jan 23 05:37:23 np0005593234 neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76[316035]: [WARNING]  (316039) : All workers exited. Exiting... (0)
Jan 23 05:37:23 np0005593234 systemd[1]: libpod-fd656eedec61d0b9c8857225367e4437e71c32d20a845354f7ea0109fce06cfd.scope: Deactivated successfully.
Jan 23 05:37:23 np0005593234 podman[318184]: 2026-01-23 10:37:23.111306882 +0000 UTC m=+0.047004220 container died fd656eedec61d0b9c8857225367e4437e71c32d20a845354f7ea0109fce06cfd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.115 227766 INFO nova.virt.libvirt.driver [-] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Instance destroyed successfully.#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.116 227766 DEBUG nova.objects.instance [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lazy-loading 'resources' on Instance uuid 6673e062-5d99-4c31-a3e0-673f55438d6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:37:23 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fd656eedec61d0b9c8857225367e4437e71c32d20a845354f7ea0109fce06cfd-userdata-shm.mount: Deactivated successfully.
Jan 23 05:37:23 np0005593234 systemd[1]: var-lib-containers-storage-overlay-ebb1f0d7eea14f32c205ab6ac86368e14be592a046be81afc1b042df152dfa02-merged.mount: Deactivated successfully.
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.142 227766 DEBUG nova.virt.libvirt.vif [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:35:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestStampPattern-server-1669121032',display_name='tempest-TestStampPattern-server-1669121032',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-teststamppattern-server-1669121032',id=186,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMrKK+vqQ2ONAoFKX7V4eVrHBpyCPyjGn5U244sG4513gIb+5QaK2mU3GvydCfCOzo9xS+SqUIELsowqSaXGJbd+N0J3WtlcZAfr/OV3xzB4Bu/L3WF2HV34qxyNgfmi9Q==',key_name='tempest-TestStampPattern-1711959098',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:35:15Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d59dad6496894352a2f4c7eb66ca1914',ramdisk_id='',reservation_id='r-18qjfci3',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestStampPattern-1763690147',owner_user_name='tempest-TestStampPattern-1763690147-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:36:04Z,user_data=None,user_id='9a8ce4c88e8b46c5806ada5e3a6cdbbf',uuid=6673e062-5d99-4c31-a3e0-673f55438d6e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "ddcd0522-401c-4c1d-90df-7c407812f643", "address": "fa:16:3e:9f:74:a2", "network": {"id": "f2cadf27-eae4-40fd-be37-b605e054ab76", "bridge": "br-int", "label": "tempest-TestStampPattern-1832792840-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59dad6496894352a2f4c7eb66ca1914", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd0522-40", "ovs_interfaceid": "ddcd0522-401c-4c1d-90df-7c407812f643", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.143 227766 DEBUG nova.network.os_vif_util [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Converting VIF {"id": "ddcd0522-401c-4c1d-90df-7c407812f643", "address": "fa:16:3e:9f:74:a2", "network": {"id": "f2cadf27-eae4-40fd-be37-b605e054ab76", "bridge": "br-int", "label": "tempest-TestStampPattern-1832792840-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.223", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59dad6496894352a2f4c7eb66ca1914", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd0522-40", "ovs_interfaceid": "ddcd0522-401c-4c1d-90df-7c407812f643", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.144 227766 DEBUG nova.network.os_vif_util [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9f:74:a2,bridge_name='br-int',has_traffic_filtering=True,id=ddcd0522-401c-4c1d-90df-7c407812f643,network=Network(f2cadf27-eae4-40fd-be37-b605e054ab76),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddcd0522-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.144 227766 DEBUG os_vif [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:74:a2,bridge_name='br-int',has_traffic_filtering=True,id=ddcd0522-401c-4c1d-90df-7c407812f643,network=Network(f2cadf27-eae4-40fd-be37-b605e054ab76),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddcd0522-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.146 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.146 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapddcd0522-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.150 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.152 227766 INFO os_vif [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9f:74:a2,bridge_name='br-int',has_traffic_filtering=True,id=ddcd0522-401c-4c1d-90df-7c407812f643,network=Network(f2cadf27-eae4-40fd-be37-b605e054ab76),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapddcd0522-40')#033[00m
Jan 23 05:37:23 np0005593234 podman[318184]: 2026-01-23 10:37:23.155859804 +0000 UTC m=+0.091557132 container cleanup fd656eedec61d0b9c8857225367e4437e71c32d20a845354f7ea0109fce06cfd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:37:23 np0005593234 systemd[1]: libpod-conmon-fd656eedec61d0b9c8857225367e4437e71c32d20a845354f7ea0109fce06cfd.scope: Deactivated successfully.
Jan 23 05:37:23 np0005593234 podman[318248]: 2026-01-23 10:37:23.227675228 +0000 UTC m=+0.048951641 container remove fd656eedec61d0b9c8857225367e4437e71c32d20a845354f7ea0109fce06cfd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:23.239 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c456bcc7-e376-40c4-8a2b-e7bc59b29de5]: (4, ('Fri Jan 23 10:37:23 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76 (fd656eedec61d0b9c8857225367e4437e71c32d20a845354f7ea0109fce06cfd)\nfd656eedec61d0b9c8857225367e4437e71c32d20a845354f7ea0109fce06cfd\nFri Jan 23 10:37:23 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76 (fd656eedec61d0b9c8857225367e4437e71c32d20a845354f7ea0109fce06cfd)\nfd656eedec61d0b9c8857225367e4437e71c32d20a845354f7ea0109fce06cfd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:23.241 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f7504f18-1d2f-4d1a-8e32-b9bfdbc8a67c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:23.242 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf2cadf27-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.244 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:23 np0005593234 kernel: tapf2cadf27-e0: left promiscuous mode
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.249 227766 DEBUG nova.compute.manager [req-0d64387c-f8f9-4cc6-a920-6051e15e2394 req-10dabd26-110d-49a9-a4a0-658550c5d845 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Received event network-vif-unplugged-ddcd0522-401c-4c1d-90df-7c407812f643 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.249 227766 DEBUG oslo_concurrency.lockutils [req-0d64387c-f8f9-4cc6-a920-6051e15e2394 req-10dabd26-110d-49a9-a4a0-658550c5d845 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.250 227766 DEBUG oslo_concurrency.lockutils [req-0d64387c-f8f9-4cc6-a920-6051e15e2394 req-10dabd26-110d-49a9-a4a0-658550c5d845 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.250 227766 DEBUG oslo_concurrency.lockutils [req-0d64387c-f8f9-4cc6-a920-6051e15e2394 req-10dabd26-110d-49a9-a4a0-658550c5d845 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.250 227766 DEBUG nova.compute.manager [req-0d64387c-f8f9-4cc6-a920-6051e15e2394 req-10dabd26-110d-49a9-a4a0-658550c5d845 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] No waiting events found dispatching network-vif-unplugged-ddcd0522-401c-4c1d-90df-7c407812f643 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.265 227766 DEBUG nova.compute.manager [req-0d64387c-f8f9-4cc6-a920-6051e15e2394 req-10dabd26-110d-49a9-a4a0-658550c5d845 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Received event network-vif-unplugged-ddcd0522-401c-4c1d-90df-7c407812f643 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.270 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:23.273 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[99332ab6-1daf-4e70-a8db-d756168c79c7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:23.296 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2d0e1758-d246-444e-9308-ce6961b7440b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:23.297 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[99f2ed5a-d55e-40bd-993c-ff5d1db71a5f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:23.314 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8ec9f82f-e220-445e-89fa-fc14551c95b7]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 841002, 'reachable_time': 17838, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 318274, 'error': None, 'target': 'ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:37:23 np0005593234 systemd[1]: run-netns-ovnmeta\x2df2cadf27\x2deae4\x2d40fd\x2dbe37\x2db605e054ab76.mount: Deactivated successfully.
Jan 23 05:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:23.320 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f2cadf27-eae4-40fd-be37-b605e054ab76 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:37:23 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:23.320 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc8beee-5460-49f8-86cc-f74fe4c4826a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:37:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:37:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2671372570' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.431 227766 DEBUG oslo_concurrency.processutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.459 227766 DEBUG nova.storage.rbd_utils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.462 227766 DEBUG oslo_concurrency.processutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.724 227766 INFO nova.virt.libvirt.driver [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Deleting instance files /var/lib/nova/instances/6673e062-5d99-4c31-a3e0-673f55438d6e_del#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.725 227766 INFO nova.virt.libvirt.driver [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Deletion of /var/lib/nova/instances/6673e062-5d99-4c31-a3e0-673f55438d6e_del complete#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.779 227766 INFO nova.compute.manager [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Took 0.91 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.780 227766 DEBUG oslo.service.loopingcall [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.780 227766 DEBUG nova.compute.manager [-] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.780 227766 DEBUG nova.network.neutron [-] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:37:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:37:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/405688733' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.967 227766 DEBUG oslo_concurrency.processutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:23 np0005593234 nova_compute[227762]: 2026-01-23 10:37:23.971 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  <uuid>d077ff2d-0631-474d-b9a9-61fc36577163</uuid>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  <name>instance-000000be</name>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <nova:name>tempest-ServerShowV257Test-server-1695295809</nova:name>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:37:22</nova:creationTime>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <nova:user uuid="4016f133f441491ab245d0b8e9d6d7f5">tempest-ServerShowV257Test-1820049832-project-member</nova:user>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <nova:project uuid="b6db0c8b5ec04031aaacc904f210c5dd">tempest-ServerShowV257Test-1820049832</nova:project>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="ae1f9e37-418c-462f-81d1-3599a6d89de9"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <nova:ports/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <entry name="serial">d077ff2d-0631-474d-b9a9-61fc36577163</entry>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <entry name="uuid">d077ff2d-0631-474d-b9a9-61fc36577163</entry>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/d077ff2d-0631-474d-b9a9-61fc36577163_disk">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/d077ff2d-0631-474d-b9a9-61fc36577163_disk.config">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/console.log" append="off"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:37:23 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:37:23 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:37:23 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:37:23 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:37:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:24.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.025 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.026 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.026 227766 INFO nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Using config drive#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.054 227766 DEBUG nova.storage.rbd_utils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:37:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e372 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.080 227766 DEBUG nova.objects.instance [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lazy-loading 'ec2_ids' on Instance uuid d077ff2d-0631-474d-b9a9-61fc36577163 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.104 227766 DEBUG nova.objects.instance [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lazy-loading 'keypairs' on Instance uuid d077ff2d-0631-474d-b9a9-61fc36577163 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.187 227766 DEBUG nova.network.neutron [req-8a68d72a-ecbd-4905-933e-ddfe2460ae3a req-933a25df-b387-419b-9b03-08ed0826c256 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Updated VIF entry in instance network info cache for port ddcd0522-401c-4c1d-90df-7c407812f643. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.187 227766 DEBUG nova.network.neutron [req-8a68d72a-ecbd-4905-933e-ddfe2460ae3a req-933a25df-b387-419b-9b03-08ed0826c256 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Updating instance_info_cache with network_info: [{"id": "ddcd0522-401c-4c1d-90df-7c407812f643", "address": "fa:16:3e:9f:74:a2", "network": {"id": "f2cadf27-eae4-40fd-be37-b605e054ab76", "bridge": "br-int", "label": "tempest-TestStampPattern-1832792840-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d59dad6496894352a2f4c7eb66ca1914", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapddcd0522-40", "ovs_interfaceid": "ddcd0522-401c-4c1d-90df-7c407812f643", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.217 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.221 227766 DEBUG oslo_concurrency.lockutils [req-8a68d72a-ecbd-4905-933e-ddfe2460ae3a req-933a25df-b387-419b-9b03-08ed0826c256 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6673e062-5d99-4c31-a3e0-673f55438d6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.356 227766 INFO nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Creating config drive at /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/disk.config#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.363 227766 DEBUG oslo_concurrency.processutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd6r8z9pl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.477 227766 DEBUG nova.network.neutron [-] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.502 227766 INFO nova.compute.manager [-] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Took 0.72 seconds to deallocate network for instance.#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.503 227766 DEBUG oslo_concurrency.processutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpd6r8z9pl" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.541 227766 DEBUG nova.storage.rbd_utils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] rbd image d077ff2d-0631-474d-b9a9-61fc36577163_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.546 227766 DEBUG oslo_concurrency.processutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/disk.config d077ff2d-0631-474d-b9a9-61fc36577163_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.619 227766 DEBUG oslo_concurrency.lockutils [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.620 227766 DEBUG oslo_concurrency.lockutils [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.683 227766 DEBUG oslo_concurrency.processutils [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.731 227766 DEBUG oslo_concurrency.processutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/disk.config d077ff2d-0631-474d-b9a9-61fc36577163_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.185s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.732 227766 INFO nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Deleting local config drive /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163/disk.config because it was imported into RBD.#033[00m
Jan 23 05:37:24 np0005593234 nova_compute[227762]: 2026-01-23 10:37:24.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:37:24 np0005593234 systemd-machined[195626]: New machine qemu-92-instance-000000be.
Jan 23 05:37:24 np0005593234 systemd[1]: Started Virtual Machine qemu-92-instance-000000be.
Jan 23 05:37:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:24.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:37:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2796426136' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.107 227766 DEBUG oslo_concurrency.processutils [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.115 227766 DEBUG nova.compute.provider_tree [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.135 227766 DEBUG nova.scheduler.client.report [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.167 227766 DEBUG oslo_concurrency.lockutils [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.203 227766 INFO nova.scheduler.client.report [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Deleted allocations for instance 6673e062-5d99-4c31-a3e0-673f55438d6e#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.216 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for d077ff2d-0631-474d-b9a9-61fc36577163 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.216 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164645.2159982, d077ff2d-0631-474d-b9a9-61fc36577163 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.217 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.220 227766 DEBUG nova.compute.manager [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.220 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.227 227766 INFO nova.virt.libvirt.driver [-] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Instance spawned successfully.#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.228 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.261 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.265 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.274 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.274 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.275 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.275 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.275 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.276 227766 DEBUG nova.virt.libvirt.driver [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.306 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.307 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164645.2182553, d077ff2d-0631-474d-b9a9-61fc36577163 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.307 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] VM Started (Lifecycle Event)#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.347 227766 DEBUG oslo_concurrency.lockutils [None req-4e806961-ab83-4eff-9745-3f1b96b9e63b 9a8ce4c88e8b46c5806ada5e3a6cdbbf d59dad6496894352a2f4c7eb66ca1914 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.480s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.356 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.359 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.387 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.406 227766 DEBUG nova.compute.manager [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.411 227766 DEBUG nova.compute.manager [req-25139dd8-0649-4f05-8538-7ef61c8e1d8a req-6798a846-cb4f-4290-9a13-eefdb5d6f132 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Received event network-vif-plugged-ddcd0522-401c-4c1d-90df-7c407812f643 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.412 227766 DEBUG oslo_concurrency.lockutils [req-25139dd8-0649-4f05-8538-7ef61c8e1d8a req-6798a846-cb4f-4290-9a13-eefdb5d6f132 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.412 227766 DEBUG oslo_concurrency.lockutils [req-25139dd8-0649-4f05-8538-7ef61c8e1d8a req-6798a846-cb4f-4290-9a13-eefdb5d6f132 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.412 227766 DEBUG oslo_concurrency.lockutils [req-25139dd8-0649-4f05-8538-7ef61c8e1d8a req-6798a846-cb4f-4290-9a13-eefdb5d6f132 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6673e062-5d99-4c31-a3e0-673f55438d6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.412 227766 DEBUG nova.compute.manager [req-25139dd8-0649-4f05-8538-7ef61c8e1d8a req-6798a846-cb4f-4290-9a13-eefdb5d6f132 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] No waiting events found dispatching network-vif-plugged-ddcd0522-401c-4c1d-90df-7c407812f643 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.412 227766 WARNING nova.compute.manager [req-25139dd8-0649-4f05-8538-7ef61c8e1d8a req-6798a846-cb4f-4290-9a13-eefdb5d6f132 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Received unexpected event network-vif-plugged-ddcd0522-401c-4c1d-90df-7c407812f643 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.413 227766 DEBUG nova.compute.manager [req-25139dd8-0649-4f05-8538-7ef61c8e1d8a req-6798a846-cb4f-4290-9a13-eefdb5d6f132 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Received event network-vif-deleted-ddcd0522-401c-4c1d-90df-7c407812f643 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.468 227766 DEBUG oslo_concurrency.lockutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.468 227766 DEBUG oslo_concurrency.lockutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.468 227766 DEBUG nova.objects.instance [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 05:37:25 np0005593234 nova_compute[227762]: 2026-01-23 10:37:25.532 227766 DEBUG oslo_concurrency.lockutils [None req-93774634-6402-4672-bbb4-d2555483a4fb 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.064s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:37:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:26.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:37:26 np0005593234 nova_compute[227762]: 2026-01-23 10:37:26.465 227766 DEBUG oslo_concurrency.lockutils [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Acquiring lock "d077ff2d-0631-474d-b9a9-61fc36577163" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:26 np0005593234 nova_compute[227762]: 2026-01-23 10:37:26.466 227766 DEBUG oslo_concurrency.lockutils [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "d077ff2d-0631-474d-b9a9-61fc36577163" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:26 np0005593234 nova_compute[227762]: 2026-01-23 10:37:26.466 227766 DEBUG oslo_concurrency.lockutils [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Acquiring lock "d077ff2d-0631-474d-b9a9-61fc36577163-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:26 np0005593234 nova_compute[227762]: 2026-01-23 10:37:26.466 227766 DEBUG oslo_concurrency.lockutils [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "d077ff2d-0631-474d-b9a9-61fc36577163-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:26 np0005593234 nova_compute[227762]: 2026-01-23 10:37:26.466 227766 DEBUG oslo_concurrency.lockutils [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "d077ff2d-0631-474d-b9a9-61fc36577163-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:26 np0005593234 nova_compute[227762]: 2026-01-23 10:37:26.467 227766 INFO nova.compute.manager [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Terminating instance#033[00m
Jan 23 05:37:26 np0005593234 nova_compute[227762]: 2026-01-23 10:37:26.468 227766 DEBUG oslo_concurrency.lockutils [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Acquiring lock "refresh_cache-d077ff2d-0631-474d-b9a9-61fc36577163" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:37:26 np0005593234 nova_compute[227762]: 2026-01-23 10:37:26.468 227766 DEBUG oslo_concurrency.lockutils [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Acquired lock "refresh_cache-d077ff2d-0631-474d-b9a9-61fc36577163" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:37:26 np0005593234 nova_compute[227762]: 2026-01-23 10:37:26.469 227766 DEBUG nova.network.neutron [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:37:26 np0005593234 nova_compute[227762]: 2026-01-23 10:37:26.644 227766 DEBUG nova.network.neutron [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:37:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:26.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:26 np0005593234 nova_compute[227762]: 2026-01-23 10:37:26.919 227766 DEBUG nova.network.neutron [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:37:26 np0005593234 nova_compute[227762]: 2026-01-23 10:37:26.936 227766 DEBUG oslo_concurrency.lockutils [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Releasing lock "refresh_cache-d077ff2d-0631-474d-b9a9-61fc36577163" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:37:26 np0005593234 nova_compute[227762]: 2026-01-23 10:37:26.937 227766 DEBUG nova.compute.manager [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:37:26 np0005593234 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000be.scope: Deactivated successfully.
Jan 23 05:37:26 np0005593234 systemd[1]: machine-qemu\x2d92\x2dinstance\x2d000000be.scope: Consumed 2.267s CPU time.
Jan 23 05:37:26 np0005593234 systemd-machined[195626]: Machine qemu-92-instance-000000be terminated.
Jan 23 05:37:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:37:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/175992503' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:37:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:37:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/175992503' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:37:27 np0005593234 nova_compute[227762]: 2026-01-23 10:37:27.157 227766 INFO nova.virt.libvirt.driver [-] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Instance destroyed successfully.#033[00m
Jan 23 05:37:27 np0005593234 nova_compute[227762]: 2026-01-23 10:37:27.157 227766 DEBUG nova.objects.instance [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lazy-loading 'resources' on Instance uuid d077ff2d-0631-474d-b9a9-61fc36577163 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:37:27 np0005593234 nova_compute[227762]: 2026-01-23 10:37:27.690 227766 INFO nova.virt.libvirt.driver [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Deleting instance files /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163_del#033[00m
Jan 23 05:37:27 np0005593234 nova_compute[227762]: 2026-01-23 10:37:27.691 227766 INFO nova.virt.libvirt.driver [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Deletion of /var/lib/nova/instances/d077ff2d-0631-474d-b9a9-61fc36577163_del complete#033[00m
Jan 23 05:37:27 np0005593234 nova_compute[227762]: 2026-01-23 10:37:27.765 227766 INFO nova.compute.manager [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Took 0.83 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:37:27 np0005593234 nova_compute[227762]: 2026-01-23 10:37:27.765 227766 DEBUG oslo.service.loopingcall [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:37:27 np0005593234 nova_compute[227762]: 2026-01-23 10:37:27.766 227766 DEBUG nova.compute.manager [-] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:37:27 np0005593234 nova_compute[227762]: 2026-01-23 10:37:27.766 227766 DEBUG nova.network.neutron [-] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:37:27 np0005593234 nova_compute[227762]: 2026-01-23 10:37:27.888 227766 DEBUG nova.network.neutron [-] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:37:27 np0005593234 nova_compute[227762]: 2026-01-23 10:37:27.903 227766 DEBUG nova.network.neutron [-] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:37:27 np0005593234 nova_compute[227762]: 2026-01-23 10:37:27.915 227766 INFO nova.compute.manager [-] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Took 0.15 seconds to deallocate network for instance.#033[00m
Jan 23 05:37:27 np0005593234 nova_compute[227762]: 2026-01-23 10:37:27.956 227766 DEBUG oslo_concurrency.lockutils [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:27 np0005593234 nova_compute[227762]: 2026-01-23 10:37:27.957 227766 DEBUG oslo_concurrency.lockutils [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:28 np0005593234 nova_compute[227762]: 2026-01-23 10:37:28.002 227766 DEBUG oslo_concurrency.processutils [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:37:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:28.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:28 np0005593234 nova_compute[227762]: 2026-01-23 10:37:28.149 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:37:28 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1798156930' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:37:28 np0005593234 nova_compute[227762]: 2026-01-23 10:37:28.460 227766 DEBUG oslo_concurrency.processutils [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:37:28 np0005593234 nova_compute[227762]: 2026-01-23 10:37:28.466 227766 DEBUG nova.compute.provider_tree [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:37:28 np0005593234 nova_compute[227762]: 2026-01-23 10:37:28.512 227766 DEBUG nova.scheduler.client.report [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:37:28 np0005593234 nova_compute[227762]: 2026-01-23 10:37:28.597 227766 DEBUG oslo_concurrency.lockutils [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:28 np0005593234 nova_compute[227762]: 2026-01-23 10:37:28.622 227766 INFO nova.scheduler.client.report [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Deleted allocations for instance d077ff2d-0631-474d-b9a9-61fc36577163#033[00m
Jan 23 05:37:28 np0005593234 nova_compute[227762]: 2026-01-23 10:37:28.694 227766 DEBUG oslo_concurrency.lockutils [None req-081f5c38-62b1-47df-8145-264299441cd7 4016f133f441491ab245d0b8e9d6d7f5 b6db0c8b5ec04031aaacc904f210c5dd - - default default] Lock "d077ff2d-0631-474d-b9a9-61fc36577163" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.228s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:28.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:29 np0005593234 nova_compute[227762]: 2026-01-23 10:37:29.220 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e372 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e373 e373: 3 total, 3 up, 3 in
Jan 23 05:37:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:30.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:30.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:31 np0005593234 nova_compute[227762]: 2026-01-23 10:37:31.221 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:31 np0005593234 nova_compute[227762]: 2026-01-23 10:37:31.404 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:31 np0005593234 podman[318504]: 2026-01-23 10:37:31.820547934 +0000 UTC m=+0.100064168 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:37:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:32.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:37:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:32.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:37:33 np0005593234 nova_compute[227762]: 2026-01-23 10:37:33.205 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:34.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:34 np0005593234 nova_compute[227762]: 2026-01-23 10:37:34.221 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:34.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:36.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:36.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:38.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:38 np0005593234 nova_compute[227762]: 2026-01-23 10:37:38.114 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164643.1122754, 6673e062-5d99-4c31-a3e0-673f55438d6e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:37:38 np0005593234 nova_compute[227762]: 2026-01-23 10:37:38.114 227766 INFO nova.compute.manager [-] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:37:38 np0005593234 nova_compute[227762]: 2026-01-23 10:37:38.145 227766 DEBUG nova.compute.manager [None req-dd02832d-db1a-4002-afb9-5930ed80c2b9 - - - - - -] [instance: 6673e062-5d99-4c31-a3e0-673f55438d6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:37:38 np0005593234 nova_compute[227762]: 2026-01-23 10:37:38.207 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 05:37:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:38.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 05:37:39 np0005593234 nova_compute[227762]: 2026-01-23 10:37:39.223 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:40.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:40.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:42.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:42 np0005593234 nova_compute[227762]: 2026-01-23 10:37:42.156 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164647.1551697, d077ff2d-0631-474d-b9a9-61fc36577163 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:37:42 np0005593234 nova_compute[227762]: 2026-01-23 10:37:42.156 227766 INFO nova.compute.manager [-] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:37:42 np0005593234 nova_compute[227762]: 2026-01-23 10:37:42.526 227766 DEBUG nova.compute.manager [None req-0be747c0-2788-4037-9c77-11200c8d21e3 - - - - - -] [instance: d077ff2d-0631-474d-b9a9-61fc36577163] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:37:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:42.874 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:37:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:42.875 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:37:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:37:42.875 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:37:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:42.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:43 np0005593234 nova_compute[227762]: 2026-01-23 10:37:43.210 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:37:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:44.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:37:44 np0005593234 nova_compute[227762]: 2026-01-23 10:37:44.224 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:44.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:46.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:46.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:48.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:48 np0005593234 nova_compute[227762]: 2026-01-23 10:37:48.228 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:48.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:49 np0005593234 nova_compute[227762]: 2026-01-23 10:37:49.225 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:50.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:50.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:52.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:37:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:52.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:37:53 np0005593234 nova_compute[227762]: 2026-01-23 10:37:53.281 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:53 np0005593234 podman[318594]: 2026-01-23 10:37:53.77288516 +0000 UTC m=+0.060667056 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Jan 23 05:37:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:54.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:54 np0005593234 nova_compute[227762]: 2026-01-23 10:37:54.228 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:54.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:37:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:56.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:37:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:56.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:58.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:58 np0005593234 nova_compute[227762]: 2026-01-23 10:37:58.323 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:37:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:37:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:58.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:37:59 np0005593234 nova_compute[227762]: 2026-01-23 10:37:59.231 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #163. Immutable memtables: 0.
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:37:59.422907) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 103] Flushing memtable with next log file: 163
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164679423054, "job": 103, "event": "flush_started", "num_memtables": 1, "num_entries": 1421, "num_deletes": 265, "total_data_size": 2964864, "memory_usage": 2992864, "flush_reason": "Manual Compaction"}
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 103] Level-0 flush table #164: started
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164679647277, "cf_name": "default", "job": 103, "event": "table_file_creation", "file_number": 164, "file_size": 1955379, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 78581, "largest_seqno": 79997, "table_properties": {"data_size": 1949267, "index_size": 3314, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13695, "raw_average_key_size": 20, "raw_value_size": 1936705, "raw_average_value_size": 2873, "num_data_blocks": 146, "num_entries": 674, "num_filter_entries": 674, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164574, "oldest_key_time": 1769164574, "file_creation_time": 1769164679, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 164, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 103] Flush lasted 363600 microseconds, and 5404 cpu microseconds.
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:37:59.647341) [db/flush_job.cc:967] [default] [JOB 103] Level-0 flush table #164: 1955379 bytes OK
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:37:59.786668) [db/memtable_list.cc:519] [default] Level-0 commit table #164 started
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:37:59.790257) [db/memtable_list.cc:722] [default] Level-0 commit table #164: memtable #1 done
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:37:59.790307) EVENT_LOG_v1 {"time_micros": 1769164679790297, "job": 103, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:37:59.790335) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 103] Try to delete WAL files size 2958048, prev total WAL file size 2958048, number of live WAL files 2.
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000160.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:37:59.791811) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033303135' seq:72057594037927935, type:22 .. '6C6F676D0033323638' seq:0, type:0; will stop at (end)
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 104] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 103 Base level 0, inputs: [164(1909KB)], [162(11MB)]
Jan 23 05:37:59 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164679791934, "job": 104, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [164], "files_L6": [162], "score": -1, "input_data_size": 13759962, "oldest_snapshot_seqno": -1}
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 104] Generated table #165: 9859 keys, 13626151 bytes, temperature: kUnknown
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164680054467, "cf_name": "default", "job": 104, "event": "table_file_creation", "file_number": 165, "file_size": 13626151, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13561645, "index_size": 38729, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24709, "raw_key_size": 260154, "raw_average_key_size": 26, "raw_value_size": 13388048, "raw_average_value_size": 1357, "num_data_blocks": 1484, "num_entries": 9859, "num_filter_entries": 9859, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769164679, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 165, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:38:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:00.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:38:00.054821) [db/compaction/compaction_job.cc:1663] [default] [JOB 104] Compacted 1@0 + 1@6 files to L6 => 13626151 bytes
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:38:00.128742) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 52.4 rd, 51.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 11.3 +0.0 blob) out(13.0 +0.0 blob), read-write-amplify(14.0) write-amplify(7.0) OK, records in: 10403, records dropped: 544 output_compression: NoCompression
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:38:00.128779) EVENT_LOG_v1 {"time_micros": 1769164680128766, "job": 104, "event": "compaction_finished", "compaction_time_micros": 262672, "compaction_time_cpu_micros": 30331, "output_level": 6, "num_output_files": 1, "total_output_size": 13626151, "num_input_records": 10403, "num_output_records": 9859, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000164.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164680129217, "job": 104, "event": "table_file_deletion", "file_number": 164}
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000162.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164680131457, "job": 104, "event": "table_file_deletion", "file_number": 162}
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:37:59.791598) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:38:00.131535) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:38:00.131541) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:38:00.131543) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:38:00.131546) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:38:00 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:38:00.131549) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:38:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:38:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:00.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:38:01 np0005593234 nova_compute[227762]: 2026-01-23 10:38:01.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:02.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:02 np0005593234 podman[318642]: 2026-01-23 10:38:02.172813357 +0000 UTC m=+0.091489949 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:38:02 np0005593234 nova_compute[227762]: 2026-01-23 10:38:02.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:02 np0005593234 nova_compute[227762]: 2026-01-23 10:38:02.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:38:02 np0005593234 nova_compute[227762]: 2026-01-23 10:38:02.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:38:02 np0005593234 nova_compute[227762]: 2026-01-23 10:38:02.769 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:38:02 np0005593234 nova_compute[227762]: 2026-01-23 10:38:02.770 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:02 np0005593234 nova_compute[227762]: 2026-01-23 10:38:02.801 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:02 np0005593234 nova_compute[227762]: 2026-01-23 10:38:02.802 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:02 np0005593234 nova_compute[227762]: 2026-01-23 10:38:02.802 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:02 np0005593234 nova_compute[227762]: 2026-01-23 10:38:02.802 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:38:02 np0005593234 nova_compute[227762]: 2026-01-23 10:38:02.802 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:02.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:03 np0005593234 nova_compute[227762]: 2026-01-23 10:38:03.390 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:38:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1746753053' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:38:03 np0005593234 nova_compute[227762]: 2026-01-23 10:38:03.442 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.640s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:03 np0005593234 nova_compute[227762]: 2026-01-23 10:38:03.606 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:38:03 np0005593234 nova_compute[227762]: 2026-01-23 10:38:03.608 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4158MB free_disk=20.956493377685547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:38:03 np0005593234 nova_compute[227762]: 2026-01-23 10:38:03.608 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:03 np0005593234 nova_compute[227762]: 2026-01-23 10:38:03.608 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:03 np0005593234 nova_compute[227762]: 2026-01-23 10:38:03.690 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:38:03 np0005593234 nova_compute[227762]: 2026-01-23 10:38:03.691 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:38:03 np0005593234 nova_compute[227762]: 2026-01-23 10:38:03.712 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:04.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:04 np0005593234 nova_compute[227762]: 2026-01-23 10:38:04.232 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:38:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1629466076' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:38:04 np0005593234 nova_compute[227762]: 2026-01-23 10:38:04.524 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.812s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:04 np0005593234 nova_compute[227762]: 2026-01-23 10:38:04.531 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:38:04 np0005593234 nova_compute[227762]: 2026-01-23 10:38:04.737 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:38:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:04 np0005593234 nova_compute[227762]: 2026-01-23 10:38:04.763 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:38:04 np0005593234 nova_compute[227762]: 2026-01-23 10:38:04.763 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.155s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:04.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:06.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:06.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:08.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:08 np0005593234 nova_compute[227762]: 2026-01-23 10:38:08.394 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:08.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:09 np0005593234 nova_compute[227762]: 2026-01-23 10:38:09.233 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:09 np0005593234 nova_compute[227762]: 2026-01-23 10:38:09.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:09 np0005593234 nova_compute[227762]: 2026-01-23 10:38:09.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:09 np0005593234 nova_compute[227762]: 2026-01-23 10:38:09.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:09 np0005593234 nova_compute[227762]: 2026-01-23 10:38:09.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:38:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:10.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:10 np0005593234 nova_compute[227762]: 2026-01-23 10:38:10.486 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:10.485 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=73, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=72) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:38:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:10.486 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:38:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:10.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:12.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:12 np0005593234 nova_compute[227762]: 2026-01-23 10:38:12.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:12 np0005593234 nova_compute[227762]: 2026-01-23 10:38:12.758 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:12.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:13 np0005593234 nova_compute[227762]: 2026-01-23 10:38:13.444 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:14.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:14 np0005593234 nova_compute[227762]: 2026-01-23 10:38:14.236 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:14.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:16.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:16 np0005593234 nova_compute[227762]: 2026-01-23 10:38:16.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:16.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:18.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:18 np0005593234 nova_compute[227762]: 2026-01-23 10:38:18.529 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:18.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.122 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.122 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.141 227766 DEBUG nova.compute.manager [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.230 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.231 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.238 227766 DEBUG nova.virt.hardware [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.239 227766 INFO nova.compute.claims [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.241 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.366 227766 DEBUG oslo_concurrency.processutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:19 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:38:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:38:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/557767576' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.846 227766 DEBUG oslo_concurrency.processutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.852 227766 DEBUG nova.compute.provider_tree [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.878 227766 DEBUG nova.scheduler.client.report [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.897 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.667s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.898 227766 DEBUG nova.compute.manager [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.942 227766 DEBUG nova.compute.manager [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.943 227766 DEBUG nova.network.neutron [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.966 227766 INFO nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:38:19 np0005593234 nova_compute[227762]: 2026-01-23 10:38:19.983 227766 DEBUG nova.compute.manager [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.020 227766 INFO nova.virt.block_device [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Booting with volume 707d7a47-dd85-4006-890f-724df1ffbdae at /dev/vda#033[00m
Jan 23 05:38:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:20.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.211 227766 DEBUG nova.policy [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93cd560e84264023877c47122b5919de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e762fca3b634c7aa1d994314c059c54', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.286 227766 DEBUG os_brick.utils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.288 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.300 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.300 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[4d7a1538-2f8c-4e2a-86c4-1f04320c0214]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.302 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.308 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.309 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[60aec26c-3204-40ef-a154-e9120526022f]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.310 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.316 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.006s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.317 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[a8cfa351-2149-4674-ad8a-c09d1b032be4]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.318 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[31f967b6-df1b-4830-9d03-69c71dabfa30]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.318 227766 DEBUG oslo_concurrency.processutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.343 227766 DEBUG oslo_concurrency.processutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "nvme version" returned: 0 in 0.025s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.345 227766 DEBUG os_brick.initiator.connectors.lightos [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.346 227766 DEBUG os_brick.initiator.connectors.lightos [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.346 227766 DEBUG os_brick.initiator.connectors.lightos [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.346 227766 DEBUG os_brick.utils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] <== get_connector_properties: return (59ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:38:20 np0005593234 nova_compute[227762]: 2026-01-23 10:38:20.346 227766 DEBUG nova.virt.block_device [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Updating existing volume attachment record: cacaf0b8-fb4a-4f74-8db8-fa08eb2c8eae _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:38:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:20.487 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '73'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:38:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:38:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:20.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:21 np0005593234 nova_compute[227762]: 2026-01-23 10:38:21.132 227766 DEBUG nova.network.neutron [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Successfully created port: 93cbf6f2-1b0c-4fcf-b194-5f85394193db _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:38:21 np0005593234 nova_compute[227762]: 2026-01-23 10:38:21.983 227766 DEBUG nova.compute.manager [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:38:21 np0005593234 nova_compute[227762]: 2026-01-23 10:38:21.985 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:38:21 np0005593234 nova_compute[227762]: 2026-01-23 10:38:21.986 227766 INFO nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Creating image(s)#033[00m
Jan 23 05:38:21 np0005593234 nova_compute[227762]: 2026-01-23 10:38:21.986 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 05:38:21 np0005593234 nova_compute[227762]: 2026-01-23 10:38:21.987 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Ensure instance console log exists: /var/lib/nova/instances/307f203d-cfc0-45a9-a0cd-3acee0ef7133/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:38:21 np0005593234 nova_compute[227762]: 2026-01-23 10:38:21.987 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:21 np0005593234 nova_compute[227762]: 2026-01-23 10:38:21.987 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:21 np0005593234 nova_compute[227762]: 2026-01-23 10:38:21.988 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:38:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:22.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:38:22 np0005593234 nova_compute[227762]: 2026-01-23 10:38:22.248 227766 DEBUG nova.network.neutron [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Successfully updated port: 93cbf6f2-1b0c-4fcf-b194-5f85394193db _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:38:22 np0005593234 nova_compute[227762]: 2026-01-23 10:38:22.266 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:38:22 np0005593234 nova_compute[227762]: 2026-01-23 10:38:22.266 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquired lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:38:22 np0005593234 nova_compute[227762]: 2026-01-23 10:38:22.267 227766 DEBUG nova.network.neutron [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:38:22 np0005593234 nova_compute[227762]: 2026-01-23 10:38:22.381 227766 DEBUG nova.compute.manager [req-a42b08ae-8ff6-4e2a-835d-301753008747 req-bfc4cbf4-2c0d-4372-87bc-0e56ca401add 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Received event network-changed-93cbf6f2-1b0c-4fcf-b194-5f85394193db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:38:22 np0005593234 nova_compute[227762]: 2026-01-23 10:38:22.381 227766 DEBUG nova.compute.manager [req-a42b08ae-8ff6-4e2a-835d-301753008747 req-bfc4cbf4-2c0d-4372-87bc-0e56ca401add 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Refreshing instance network info cache due to event network-changed-93cbf6f2-1b0c-4fcf-b194-5f85394193db. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:38:22 np0005593234 nova_compute[227762]: 2026-01-23 10:38:22.382 227766 DEBUG oslo_concurrency.lockutils [req-a42b08ae-8ff6-4e2a-835d-301753008747 req-bfc4cbf4-2c0d-4372-87bc-0e56ca401add 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:38:22 np0005593234 nova_compute[227762]: 2026-01-23 10:38:22.423 227766 DEBUG nova.network.neutron [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:38:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:22.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.359 227766 DEBUG nova.network.neutron [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Updating instance_info_cache with network_info: [{"id": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "address": "fa:16:3e:9b:58:e7", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93cbf6f2-1b", "ovs_interfaceid": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.402 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Releasing lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.402 227766 DEBUG nova.compute.manager [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Instance network_info: |[{"id": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "address": "fa:16:3e:9b:58:e7", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93cbf6f2-1b", "ovs_interfaceid": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.402 227766 DEBUG oslo_concurrency.lockutils [req-a42b08ae-8ff6-4e2a-835d-301753008747 req-bfc4cbf4-2c0d-4372-87bc-0e56ca401add 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.403 227766 DEBUG nova.network.neutron [req-a42b08ae-8ff6-4e2a-835d-301753008747 req-bfc4cbf4-2c0d-4372-87bc-0e56ca401add 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Refreshing network info cache for port 93cbf6f2-1b0c-4fcf-b194-5f85394193db _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.406 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Start _get_guest_xml network_info=[{"id": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "address": "fa:16:3e:9b:58:e7", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93cbf6f2-1b", "ovs_interfaceid": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'boot_index': 0, 'mount_device': '/dev/vda', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-707d7a47-dd85-4006-890f-724df1ffbdae', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '707d7a47-dd85-4006-890f-724df1ffbdae', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '307f203d-cfc0-45a9-a0cd-3acee0ef7133', 'attached_at': '', 'detached_at': '', 'volume_id': '707d7a47-dd85-4006-890f-724df1ffbdae', 'serial': '707d7a47-dd85-4006-890f-724df1ffbdae', 'multiattach': True}, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': 'cacaf0b8-fb4a-4f74-8db8-fa08eb2c8eae', 'volume_type': None}], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.410 227766 WARNING nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.414 227766 DEBUG nova.virt.libvirt.host [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.415 227766 DEBUG nova.virt.libvirt.host [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.419 227766 DEBUG nova.virt.libvirt.host [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.419 227766 DEBUG nova.virt.libvirt.host [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.421 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.421 227766 DEBUG nova.virt.hardware [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.421 227766 DEBUG nova.virt.hardware [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.422 227766 DEBUG nova.virt.hardware [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.422 227766 DEBUG nova.virt.hardware [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.422 227766 DEBUG nova.virt.hardware [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.422 227766 DEBUG nova.virt.hardware [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.422 227766 DEBUG nova.virt.hardware [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.423 227766 DEBUG nova.virt.hardware [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.423 227766 DEBUG nova.virt.hardware [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.423 227766 DEBUG nova.virt.hardware [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.423 227766 DEBUG nova.virt.hardware [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.458 227766 DEBUG nova.storage.rbd_utils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image 307f203d-cfc0-45a9-a0cd-3acee0ef7133_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.462 227766 DEBUG oslo_concurrency.processutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:23 np0005593234 nova_compute[227762]: 2026-01-23 10:38:23.532 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:38:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3438946399' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.020 227766 DEBUG oslo_concurrency.processutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.558s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.053 227766 DEBUG nova.virt.libvirt.vif [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:38:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeMultiAttachTest-server-1180367469',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumemultiattachtest-server-1180367469',id=192,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e762fca3b634c7aa1d994314c059c54',ramdisk_id='',reservation_id='r-1lyck9l7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-63035580',owner_user_name='tempest-AttachVolumeMultiAttachTest-63035580-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:38:20Z,user_data=None,user_id='93cd560e84264023877c47122b5919de',uuid=307f203d-cfc0-45a9-a0cd-3acee0ef7133,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "address": "fa:16:3e:9b:58:e7", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93cbf6f2-1b", "ovs_interfaceid": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.054 227766 DEBUG nova.network.os_vif_util [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converting VIF {"id": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "address": "fa:16:3e:9b:58:e7", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93cbf6f2-1b", "ovs_interfaceid": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.055 227766 DEBUG nova.network.os_vif_util [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:58:e7,bridge_name='br-int',has_traffic_filtering=True,id=93cbf6f2-1b0c-4fcf-b194-5f85394193db,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93cbf6f2-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.056 227766 DEBUG nova.objects.instance [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'pci_devices' on Instance uuid 307f203d-cfc0-45a9-a0cd-3acee0ef7133 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.073 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  <uuid>307f203d-cfc0-45a9-a0cd-3acee0ef7133</uuid>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  <name>instance-000000c0</name>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <nova:name>tempest-AttachVolumeMultiAttachTest-server-1180367469</nova:name>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:38:23</nova:creationTime>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <nova:user uuid="93cd560e84264023877c47122b5919de">tempest-AttachVolumeMultiAttachTest-63035580-project-member</nova:user>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <nova:project uuid="6e762fca3b634c7aa1d994314c059c54">tempest-AttachVolumeMultiAttachTest-63035580</nova:project>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <nova:port uuid="93cbf6f2-1b0c-4fcf-b194-5f85394193db">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <entry name="serial">307f203d-cfc0-45a9-a0cd-3acee0ef7133</entry>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <entry name="uuid">307f203d-cfc0-45a9-a0cd-3acee0ef7133</entry>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/307f203d-cfc0-45a9-a0cd-3acee0ef7133_disk.config">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="volumes/volume-707d7a47-dd85-4006-890f-724df1ffbdae">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <serial>707d7a47-dd85-4006-890f-724df1ffbdae</serial>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <shareable/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:9b:58:e7"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <target dev="tap93cbf6f2-1b"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/307f203d-cfc0-45a9-a0cd-3acee0ef7133/console.log" append="off"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:38:24 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:38:24 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:38:24 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:38:24 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.075 227766 DEBUG nova.compute.manager [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Preparing to wait for external event network-vif-plugged-93cbf6f2-1b0c-4fcf-b194-5f85394193db prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.076 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.076 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.076 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.077 227766 DEBUG nova.virt.libvirt.vif [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:38:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeMultiAttachTest-server-1180367469',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumemultiattachtest-server-1180367469',id=192,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e762fca3b634c7aa1d994314c059c54',ramdisk_id='',reservation_id='r-1lyck9l7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-63035580',owner_user_name='tempest-AttachVolumeMultiAttachTest-63035580-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:38:20Z,user_data=None,user_id='93cd560e84264023877c47122b5919de',uuid=307f203d-cfc0-45a9-a0cd-3acee0ef7133,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "address": "fa:16:3e:9b:58:e7", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93cbf6f2-1b", "ovs_interfaceid": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.077 227766 DEBUG nova.network.os_vif_util [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converting VIF {"id": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "address": "fa:16:3e:9b:58:e7", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93cbf6f2-1b", "ovs_interfaceid": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.078 227766 DEBUG nova.network.os_vif_util [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:58:e7,bridge_name='br-int',has_traffic_filtering=True,id=93cbf6f2-1b0c-4fcf-b194-5f85394193db,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93cbf6f2-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.078 227766 DEBUG os_vif [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:58:e7,bridge_name='br-int',has_traffic_filtering=True,id=93cbf6f2-1b0c-4fcf-b194-5f85394193db,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93cbf6f2-1b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.082 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.082 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.082 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.085 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.085 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap93cbf6f2-1b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.086 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap93cbf6f2-1b, col_values=(('external_ids', {'iface-id': '93cbf6f2-1b0c-4fcf-b194-5f85394193db', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:58:e7', 'vm-uuid': '307f203d-cfc0-45a9-a0cd-3acee0ef7133'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.087 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:24 np0005593234 NetworkManager[48942]: <info>  [1769164704.0882] manager: (tap93cbf6f2-1b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/386)
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.089 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.096 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.097 227766 INFO os_vif [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:58:e7,bridge_name='br-int',has_traffic_filtering=True,id=93cbf6f2-1b0c-4fcf-b194-5f85394193db,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93cbf6f2-1b')#033[00m
Jan 23 05:38:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:24.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.157 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.157 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.158 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No VIF found with MAC fa:16:3e:9b:58:e7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.158 227766 INFO nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Using config drive#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.185 227766 DEBUG nova.storage.rbd_utils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image 307f203d-cfc0-45a9-a0cd-3acee0ef7133_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.241 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.708 227766 INFO nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Creating config drive at /var/lib/nova/instances/307f203d-cfc0-45a9-a0cd-3acee0ef7133/disk.config#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.713 227766 DEBUG oslo_concurrency.processutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/307f203d-cfc0-45a9-a0cd-3acee0ef7133/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvy_7_z26 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:24 np0005593234 podman[319021]: 2026-01-23 10:38:24.765463329 +0000 UTC m=+0.056557458 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 05:38:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.848 227766 DEBUG oslo_concurrency.processutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/307f203d-cfc0-45a9-a0cd-3acee0ef7133/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvy_7_z26" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.878 227766 DEBUG nova.storage.rbd_utils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image 307f203d-cfc0-45a9-a0cd-3acee0ef7133_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.881 227766 DEBUG oslo_concurrency.processutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/307f203d-cfc0-45a9-a0cd-3acee0ef7133/disk.config 307f203d-cfc0-45a9-a0cd-3acee0ef7133_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.926 227766 DEBUG nova.network.neutron [req-a42b08ae-8ff6-4e2a-835d-301753008747 req-bfc4cbf4-2c0d-4372-87bc-0e56ca401add 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Updated VIF entry in instance network info cache for port 93cbf6f2-1b0c-4fcf-b194-5f85394193db. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.927 227766 DEBUG nova.network.neutron [req-a42b08ae-8ff6-4e2a-835d-301753008747 req-bfc4cbf4-2c0d-4372-87bc-0e56ca401add 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Updating instance_info_cache with network_info: [{"id": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "address": "fa:16:3e:9b:58:e7", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93cbf6f2-1b", "ovs_interfaceid": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:38:24 np0005593234 nova_compute[227762]: 2026-01-23 10:38:24.948 227766 DEBUG oslo_concurrency.lockutils [req-a42b08ae-8ff6-4e2a-835d-301753008747 req-bfc4cbf4-2c0d-4372-87bc-0e56ca401add 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:38:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:24.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:25 np0005593234 nova_compute[227762]: 2026-01-23 10:38:25.434 227766 DEBUG oslo_concurrency.processutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/307f203d-cfc0-45a9-a0cd-3acee0ef7133/disk.config 307f203d-cfc0-45a9-a0cd-3acee0ef7133_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.553s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:38:25 np0005593234 nova_compute[227762]: 2026-01-23 10:38:25.434 227766 INFO nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Deleting local config drive /var/lib/nova/instances/307f203d-cfc0-45a9-a0cd-3acee0ef7133/disk.config because it was imported into RBD.#033[00m
Jan 23 05:38:25 np0005593234 kernel: tap93cbf6f2-1b: entered promiscuous mode
Jan 23 05:38:25 np0005593234 NetworkManager[48942]: <info>  [1769164705.4933] manager: (tap93cbf6f2-1b): new Tun device (/org/freedesktop/NetworkManager/Devices/387)
Jan 23 05:38:25 np0005593234 nova_compute[227762]: 2026-01-23 10:38:25.493 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:25 np0005593234 ovn_controller[134547]: 2026-01-23T10:38:25Z|00825|binding|INFO|Claiming lport 93cbf6f2-1b0c-4fcf-b194-5f85394193db for this chassis.
Jan 23 05:38:25 np0005593234 ovn_controller[134547]: 2026-01-23T10:38:25Z|00826|binding|INFO|93cbf6f2-1b0c-4fcf-b194-5f85394193db: Claiming fa:16:3e:9b:58:e7 10.100.0.12
Jan 23 05:38:25 np0005593234 nova_compute[227762]: 2026-01-23 10:38:25.501 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:25 np0005593234 nova_compute[227762]: 2026-01-23 10:38:25.504 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.516 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:58:e7 10.100.0.12'], port_security=['fa:16:3e:9b:58:e7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '307f203d-cfc0-45a9-a0cd-3acee0ef7133', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e762fca3b634c7aa1d994314c059c54', 'neutron:revision_number': '2', 'neutron:security_group_ids': '48274d25-9599-424c-bfd1-ff8c0b4eb8cc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0936335-b706-4400-8411-bdd084c8cdf7, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=93cbf6f2-1b0c-4fcf-b194-5f85394193db) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.518 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 93cbf6f2-1b0c-4fcf-b194-5f85394193db in datapath fba2ba4a-d82c-4f8b-9754-c13fbec41a04 bound to our chassis#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.521 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fba2ba4a-d82c-4f8b-9754-c13fbec41a04#033[00m
Jan 23 05:38:25 np0005593234 systemd-udevd[319095]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:38:25 np0005593234 systemd-machined[195626]: New machine qemu-93-instance-000000c0.
Jan 23 05:38:25 np0005593234 NetworkManager[48942]: <info>  [1769164705.5368] device (tap93cbf6f2-1b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:38:25 np0005593234 NetworkManager[48942]: <info>  [1769164705.5373] device (tap93cbf6f2-1b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.542 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1f7fd864-fdc3-4a4d-9a12-7a2d28d6bf8a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.544 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfba2ba4a-d1 in ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.546 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfba2ba4a-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.546 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[66c35190-3d88-430a-a9a1-89cc1c4940ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.548 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[22f3bf6d-0d9b-40e0-bf8d-6f69b98b7caa]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.560 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[712790e1-2a8b-4e1d-85c8-166e6c9656e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 systemd[1]: Started Virtual Machine qemu-93-instance-000000c0.
Jan 23 05:38:25 np0005593234 ovn_controller[134547]: 2026-01-23T10:38:25Z|00827|binding|INFO|Setting lport 93cbf6f2-1b0c-4fcf-b194-5f85394193db ovn-installed in OVS
Jan 23 05:38:25 np0005593234 ovn_controller[134547]: 2026-01-23T10:38:25Z|00828|binding|INFO|Setting lport 93cbf6f2-1b0c-4fcf-b194-5f85394193db up in Southbound
Jan 23 05:38:25 np0005593234 nova_compute[227762]: 2026-01-23 10:38:25.585 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.586 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[47c6cd6e-8851-4110-9686-a7ab14ee92a6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.618 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4146af8b-50e5-4ca2-b84c-5e4636ce92d0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 NetworkManager[48942]: <info>  [1769164705.6244] manager: (tapfba2ba4a-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/388)
Jan 23 05:38:25 np0005593234 systemd-udevd[319098]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.623 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f3c4ecf0-48c1-4ff3-a057-9dd3b519d7d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.654 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[87b46ca1-4c33-48fc-8afd-09e9d9b66ee5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.657 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[8176adff-0c20-4461-96dc-31426bf92df4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 NetworkManager[48942]: <info>  [1769164705.6798] device (tapfba2ba4a-d0): carrier: link connected
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.684 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e92e2936-552f-4427-9963-f803a9573ee2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.700 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ea153a86-0055-4d12-829c-0cc51ec22987]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfba2ba4a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:db:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 860063, 'reachable_time': 36917, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 319128, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.713 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a4b4f46b-c934-4a41-9650-34429b7bceb8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe27:db55'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 860063, 'tstamp': 860063}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 319129, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.729 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6e986509-cb99-4fc4-bbc2-560228e7e1d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfba2ba4a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:db:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 860063, 'reachable_time': 36917, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 319130, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 nova_compute[227762]: 2026-01-23 10:38:25.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.760 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6068d471-71fd-45ae-b455-c867ffdd53a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.821 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[16fb227a-9135-48ae-a07c-aa15ff4149de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.822 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfba2ba4a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.823 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.823 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfba2ba4a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:25 np0005593234 kernel: tapfba2ba4a-d0: entered promiscuous mode
Jan 23 05:38:25 np0005593234 NetworkManager[48942]: <info>  [1769164705.8253] manager: (tapfba2ba4a-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/389)
Jan 23 05:38:25 np0005593234 nova_compute[227762]: 2026-01-23 10:38:25.826 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.827 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfba2ba4a-d0, col_values=(('external_ids', {'iface-id': '2348ddba-3dc3-4456-a637-f3065ba0d8f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:38:25 np0005593234 ovn_controller[134547]: 2026-01-23T10:38:25Z|00829|binding|INFO|Releasing lport 2348ddba-3dc3-4456-a637-f3065ba0d8f6 from this chassis (sb_readonly=0)
Jan 23 05:38:25 np0005593234 nova_compute[227762]: 2026-01-23 10:38:25.828 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:25 np0005593234 nova_compute[227762]: 2026-01-23 10:38:25.843 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.844 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fba2ba4a-d82c-4f8b-9754-c13fbec41a04.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fba2ba4a-d82c-4f8b-9754-c13fbec41a04.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.845 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[360b3916-f27d-4460-bd9f-cc15713205fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.845 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-fba2ba4a-d82c-4f8b-9754-c13fbec41a04
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/fba2ba4a-d82c-4f8b-9754-c13fbec41a04.pid.haproxy
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID fba2ba4a-d82c-4f8b-9754-c13fbec41a04
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:38:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:25.846 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'env', 'PROCESS_TAG=haproxy-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fba2ba4a-d82c-4f8b-9754-c13fbec41a04.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:38:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:26.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:26 np0005593234 podman[319162]: 2026-01-23 10:38:26.257227828 +0000 UTC m=+0.083035495 container create a4c5e55c46e8eac7a1419e0106527a2ed13540afda10db4937667dcd63447f36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 05:38:26 np0005593234 podman[319162]: 2026-01-23 10:38:26.195651984 +0000 UTC m=+0.021459661 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:38:26 np0005593234 systemd[1]: Started libpod-conmon-a4c5e55c46e8eac7a1419e0106527a2ed13540afda10db4937667dcd63447f36.scope.
Jan 23 05:38:26 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:38:26 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d5a339b19b6ee5ecb76dff9da654c6e1b4afa3c7fb76d06945c9fb62ea2941b3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.368 227766 DEBUG nova.compute.manager [req-018cd70a-fce4-4526-969c-b4289a9335e8 req-79ec7b44-8552-4ba5-8ab0-172f4db75e32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Received event network-vif-plugged-93cbf6f2-1b0c-4fcf-b194-5f85394193db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.369 227766 DEBUG oslo_concurrency.lockutils [req-018cd70a-fce4-4526-969c-b4289a9335e8 req-79ec7b44-8552-4ba5-8ab0-172f4db75e32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.369 227766 DEBUG oslo_concurrency.lockutils [req-018cd70a-fce4-4526-969c-b4289a9335e8 req-79ec7b44-8552-4ba5-8ab0-172f4db75e32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.369 227766 DEBUG oslo_concurrency.lockutils [req-018cd70a-fce4-4526-969c-b4289a9335e8 req-79ec7b44-8552-4ba5-8ab0-172f4db75e32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.369 227766 DEBUG nova.compute.manager [req-018cd70a-fce4-4526-969c-b4289a9335e8 req-79ec7b44-8552-4ba5-8ab0-172f4db75e32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Processing event network-vif-plugged-93cbf6f2-1b0c-4fcf-b194-5f85394193db _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:38:26 np0005593234 podman[319162]: 2026-01-23 10:38:26.388881411 +0000 UTC m=+0.214689108 container init a4c5e55c46e8eac7a1419e0106527a2ed13540afda10db4937667dcd63447f36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:38:26 np0005593234 podman[319162]: 2026-01-23 10:38:26.395125456 +0000 UTC m=+0.220933133 container start a4c5e55c46e8eac7a1419e0106527a2ed13540afda10db4937667dcd63447f36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:38:26 np0005593234 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[319179]: [NOTICE]   (319183) : New worker (319189) forked
Jan 23 05:38:26 np0005593234 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[319179]: [NOTICE]   (319183) : Loading success.
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.620 227766 DEBUG nova.compute.manager [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.622 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164706.6213348, 307f203d-cfc0-45a9-a0cd-3acee0ef7133 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.622 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] VM Started (Lifecycle Event)#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.626 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.631 227766 INFO nova.virt.libvirt.driver [-] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Instance spawned successfully.#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.632 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.654 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.661 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.668 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.669 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.670 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.670 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.670 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.671 227766 DEBUG nova.virt.libvirt.driver [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.710 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.711 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164706.6244617, 307f203d-cfc0-45a9-a0cd-3acee0ef7133 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.711 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.743 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.747 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164706.626493, 307f203d-cfc0-45a9-a0cd-3acee0ef7133 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.747 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.752 227766 INFO nova.compute.manager [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Took 4.77 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.752 227766 DEBUG nova.compute.manager [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.788 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.791 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.827 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.875 227766 INFO nova.compute.manager [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Took 7.68 seconds to build instance.#033[00m
Jan 23 05:38:26 np0005593234 nova_compute[227762]: 2026-01-23 10:38:26.902 227766 DEBUG oslo_concurrency.lockutils [None req-23791a78-1f3b-4f1b-883c-b0ecb7b733ab 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.780s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:26.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:27 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:38:27 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:38:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:38:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:28.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:38:28 np0005593234 nova_compute[227762]: 2026-01-23 10:38:28.480 227766 DEBUG nova.compute.manager [req-35028115-b99b-4f55-bc4b-a3394ae87d7a req-3b7af47b-4f47-4be2-8dc4-9dc6fc8a07d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Received event network-vif-plugged-93cbf6f2-1b0c-4fcf-b194-5f85394193db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:38:28 np0005593234 nova_compute[227762]: 2026-01-23 10:38:28.481 227766 DEBUG oslo_concurrency.lockutils [req-35028115-b99b-4f55-bc4b-a3394ae87d7a req-3b7af47b-4f47-4be2-8dc4-9dc6fc8a07d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:28 np0005593234 nova_compute[227762]: 2026-01-23 10:38:28.481 227766 DEBUG oslo_concurrency.lockutils [req-35028115-b99b-4f55-bc4b-a3394ae87d7a req-3b7af47b-4f47-4be2-8dc4-9dc6fc8a07d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:28 np0005593234 nova_compute[227762]: 2026-01-23 10:38:28.481 227766 DEBUG oslo_concurrency.lockutils [req-35028115-b99b-4f55-bc4b-a3394ae87d7a req-3b7af47b-4f47-4be2-8dc4-9dc6fc8a07d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:28 np0005593234 nova_compute[227762]: 2026-01-23 10:38:28.481 227766 DEBUG nova.compute.manager [req-35028115-b99b-4f55-bc4b-a3394ae87d7a req-3b7af47b-4f47-4be2-8dc4-9dc6fc8a07d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] No waiting events found dispatching network-vif-plugged-93cbf6f2-1b0c-4fcf-b194-5f85394193db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:38:28 np0005593234 nova_compute[227762]: 2026-01-23 10:38:28.481 227766 WARNING nova.compute.manager [req-35028115-b99b-4f55-bc4b-a3394ae87d7a req-3b7af47b-4f47-4be2-8dc4-9dc6fc8a07d3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Received unexpected event network-vif-plugged-93cbf6f2-1b0c-4fcf-b194-5f85394193db for instance with vm_state active and task_state None.#033[00m
Jan 23 05:38:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:38:28 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3750079014' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:38:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:28.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:29 np0005593234 nova_compute[227762]: 2026-01-23 10:38:29.089 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:29 np0005593234 nova_compute[227762]: 2026-01-23 10:38:29.244 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e373 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e374 e374: 3 total, 3 up, 3 in
Jan 23 05:38:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:30.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:30.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e375 e375: 3 total, 3 up, 3 in
Jan 23 05:38:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:32.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e376 e376: 3 total, 3 up, 3 in
Jan 23 05:38:32 np0005593234 podman[319291]: 2026-01-23 10:38:32.805184781 +0000 UTC m=+0.100144949 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:38:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:32.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:34 np0005593234 nova_compute[227762]: 2026-01-23 10:38:34.092 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:34.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:34 np0005593234 nova_compute[227762]: 2026-01-23 10:38:34.244 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e377 e377: 3 total, 3 up, 3 in
Jan 23 05:38:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e377 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:34.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:36.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:36.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:38.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:38.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e378 e378: 3 total, 3 up, 3 in
Jan 23 05:38:39 np0005593234 nova_compute[227762]: 2026-01-23 10:38:39.096 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:39 np0005593234 nova_compute[227762]: 2026-01-23 10:38:39.246 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e378 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:40.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e379 e379: 3 total, 3 up, 3 in
Jan 23 05:38:40 np0005593234 ovn_controller[134547]: 2026-01-23T10:38:40Z|00108|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9b:58:e7 10.100.0.12
Jan 23 05:38:40 np0005593234 ovn_controller[134547]: 2026-01-23T10:38:40Z|00109|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9b:58:e7 10.100.0.12
Jan 23 05:38:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:40.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:38:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:42.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:38:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:42.875 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:38:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:42.876 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:38:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:38:42.876 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:38:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:38:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:42.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:38:44 np0005593234 nova_compute[227762]: 2026-01-23 10:38:44.100 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:44.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:44 np0005593234 nova_compute[227762]: 2026-01-23 10:38:44.247 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:44.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:46.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:46.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:48.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:48.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:49 np0005593234 nova_compute[227762]: 2026-01-23 10:38:49.103 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:49 np0005593234 nova_compute[227762]: 2026-01-23 10:38:49.248 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:38:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:50.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:38:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:50.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:52.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:52.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:53 np0005593234 nova_compute[227762]: 2026-01-23 10:38:53.413 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:53 np0005593234 NetworkManager[48942]: <info>  [1769164733.4141] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/390)
Jan 23 05:38:53 np0005593234 NetworkManager[48942]: <info>  [1769164733.4153] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/391)
Jan 23 05:38:53 np0005593234 nova_compute[227762]: 2026-01-23 10:38:53.646 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:53 np0005593234 ovn_controller[134547]: 2026-01-23T10:38:53Z|00830|binding|INFO|Releasing lport 2348ddba-3dc3-4456-a637-f3065ba0d8f6 from this chassis (sb_readonly=0)
Jan 23 05:38:53 np0005593234 nova_compute[227762]: 2026-01-23 10:38:53.667 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:54 np0005593234 nova_compute[227762]: 2026-01-23 10:38:54.105 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:38:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:54.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:38:54 np0005593234 nova_compute[227762]: 2026-01-23 10:38:54.250 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:38:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:54.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:55 np0005593234 podman[319380]: 2026-01-23 10:38:55.762405557 +0000 UTC m=+0.050541990 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:38:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:56.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:56.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:58.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:38:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:38:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:58.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:38:59 np0005593234 nova_compute[227762]: 2026-01-23 10:38:59.109 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:59 np0005593234 nova_compute[227762]: 2026-01-23 10:38:59.253 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:38:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:00.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:00 np0005593234 nova_compute[227762]: 2026-01-23 10:39:00.465 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "d0cea430-15ec-471d-963b-41fd4fa4777c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:00 np0005593234 nova_compute[227762]: 2026-01-23 10:39:00.465 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:00 np0005593234 nova_compute[227762]: 2026-01-23 10:39:00.497 227766 DEBUG nova.compute.manager [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:39:00 np0005593234 nova_compute[227762]: 2026-01-23 10:39:00.590 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:00 np0005593234 nova_compute[227762]: 2026-01-23 10:39:00.590 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:00 np0005593234 nova_compute[227762]: 2026-01-23 10:39:00.597 227766 DEBUG nova.virt.hardware [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:39:00 np0005593234 nova_compute[227762]: 2026-01-23 10:39:00.597 227766 INFO nova.compute.claims [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:39:00 np0005593234 nova_compute[227762]: 2026-01-23 10:39:00.721 227766 DEBUG oslo_concurrency.processutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:00.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:39:01 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1589936324' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.162 227766 DEBUG oslo_concurrency.processutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.169 227766 DEBUG nova.compute.provider_tree [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.203 227766 DEBUG nova.scheduler.client.report [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.229 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.639s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.230 227766 DEBUG nova.compute.manager [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.314 227766 DEBUG nova.compute.manager [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.314 227766 DEBUG nova.network.neutron [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.336 227766 INFO nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.367 227766 DEBUG nova.compute.manager [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.471 227766 DEBUG nova.compute.manager [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.472 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.472 227766 INFO nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Creating image(s)#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.501 227766 DEBUG nova.storage.rbd_utils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image d0cea430-15ec-471d-963b-41fd4fa4777c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.530 227766 DEBUG nova.storage.rbd_utils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image d0cea430-15ec-471d-963b-41fd4fa4777c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.565 227766 DEBUG nova.storage.rbd_utils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image d0cea430-15ec-471d-963b-41fd4fa4777c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.570 227766 DEBUG oslo_concurrency.processutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.644 227766 DEBUG oslo_concurrency.processutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.645 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.645 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.646 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.672 227766 DEBUG nova.storage.rbd_utils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image d0cea430-15ec-471d-963b-41fd4fa4777c_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.676 227766 DEBUG oslo_concurrency.processutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 d0cea430-15ec-471d-963b-41fd4fa4777c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:01 np0005593234 nova_compute[227762]: 2026-01-23 10:39:01.864 227766 DEBUG nova.policy [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '93cd560e84264023877c47122b5919de', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '6e762fca3b634c7aa1d994314c059c54', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:39:02 np0005593234 nova_compute[227762]: 2026-01-23 10:39:02.072 227766 DEBUG oslo_concurrency.processutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 d0cea430-15ec-471d-963b-41fd4fa4777c_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.397s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:02 np0005593234 nova_compute[227762]: 2026-01-23 10:39:02.150 227766 DEBUG nova.storage.rbd_utils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] resizing rbd image d0cea430-15ec-471d-963b-41fd4fa4777c_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:39:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:02.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:02 np0005593234 nova_compute[227762]: 2026-01-23 10:39:02.420 227766 DEBUG nova.objects.instance [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'migration_context' on Instance uuid d0cea430-15ec-471d-963b-41fd4fa4777c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:39:02 np0005593234 nova_compute[227762]: 2026-01-23 10:39:02.436 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:39:02 np0005593234 nova_compute[227762]: 2026-01-23 10:39:02.436 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Ensure instance console log exists: /var/lib/nova/instances/d0cea430-15ec-471d-963b-41fd4fa4777c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:39:02 np0005593234 nova_compute[227762]: 2026-01-23 10:39:02.437 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:02 np0005593234 nova_compute[227762]: 2026-01-23 10:39:02.437 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:02 np0005593234 nova_compute[227762]: 2026-01-23 10:39:02.438 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:02.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:03 np0005593234 nova_compute[227762]: 2026-01-23 10:39:03.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:03 np0005593234 nova_compute[227762]: 2026-01-23 10:39:03.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:03 np0005593234 nova_compute[227762]: 2026-01-23 10:39:03.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:03 np0005593234 nova_compute[227762]: 2026-01-23 10:39:03.778 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:03 np0005593234 nova_compute[227762]: 2026-01-23 10:39:03.778 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:39:03 np0005593234 nova_compute[227762]: 2026-01-23 10:39:03.778 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:03 np0005593234 podman[319642]: 2026-01-23 10:39:03.782792947 +0000 UTC m=+0.077104401 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.110 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:04.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:39:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4057397976' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.204 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.254 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.319 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.319 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.386 227766 DEBUG nova.network.neutron [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Successfully created port: 50f13d72-f6d6-4b3a-8853-76d0a2f50240 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.461 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.462 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3945MB free_disk=20.959747314453125GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.463 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.463 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.595 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 307f203d-cfc0-45a9-a0cd-3acee0ef7133 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.596 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance d0cea430-15ec-471d-963b-41fd4fa4777c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.596 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.596 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:39:04 np0005593234 nova_compute[227762]: 2026-01-23 10:39:04.689 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:04.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:39:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3446103517' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:39:05 np0005593234 nova_compute[227762]: 2026-01-23 10:39:05.119 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:05 np0005593234 nova_compute[227762]: 2026-01-23 10:39:05.125 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:39:05 np0005593234 nova_compute[227762]: 2026-01-23 10:39:05.155 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:39:05 np0005593234 nova_compute[227762]: 2026-01-23 10:39:05.196 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:39:05 np0005593234 nova_compute[227762]: 2026-01-23 10:39:05.196 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.733s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:39:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2108616938' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:39:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:06.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:06 np0005593234 nova_compute[227762]: 2026-01-23 10:39:06.197 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:06 np0005593234 nova_compute[227762]: 2026-01-23 10:39:06.197 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:39:06 np0005593234 nova_compute[227762]: 2026-01-23 10:39:06.197 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:39:06 np0005593234 nova_compute[227762]: 2026-01-23 10:39:06.224 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:39:06 np0005593234 nova_compute[227762]: 2026-01-23 10:39:06.746 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:39:06 np0005593234 nova_compute[227762]: 2026-01-23 10:39:06.746 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:39:06 np0005593234 nova_compute[227762]: 2026-01-23 10:39:06.747 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:39:06 np0005593234 nova_compute[227762]: 2026-01-23 10:39:06.747 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 307f203d-cfc0-45a9-a0cd-3acee0ef7133 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:39:06 np0005593234 nova_compute[227762]: 2026-01-23 10:39:06.875 227766 DEBUG nova.network.neutron [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Successfully updated port: 50f13d72-f6d6-4b3a-8853-76d0a2f50240 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:39:06 np0005593234 nova_compute[227762]: 2026-01-23 10:39:06.895 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:39:06 np0005593234 nova_compute[227762]: 2026-01-23 10:39:06.895 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquired lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:39:06 np0005593234 nova_compute[227762]: 2026-01-23 10:39:06.895 227766 DEBUG nova.network.neutron [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:39:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:06.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:07 np0005593234 nova_compute[227762]: 2026-01-23 10:39:07.001 227766 DEBUG nova.compute.manager [req-5ebcd5fb-2d23-44bd-b046-cdf38d5ae2ca req-de585656-a891-4dc3-83b4-3923e9d09a8d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Received event network-changed-50f13d72-f6d6-4b3a-8853-76d0a2f50240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:39:07 np0005593234 nova_compute[227762]: 2026-01-23 10:39:07.001 227766 DEBUG nova.compute.manager [req-5ebcd5fb-2d23-44bd-b046-cdf38d5ae2ca req-de585656-a891-4dc3-83b4-3923e9d09a8d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Refreshing instance network info cache due to event network-changed-50f13d72-f6d6-4b3a-8853-76d0a2f50240. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:39:07 np0005593234 nova_compute[227762]: 2026-01-23 10:39:07.001 227766 DEBUG oslo_concurrency.lockutils [req-5ebcd5fb-2d23-44bd-b046-cdf38d5ae2ca req-de585656-a891-4dc3-83b4-3923e9d09a8d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:39:07 np0005593234 nova_compute[227762]: 2026-01-23 10:39:07.118 227766 DEBUG nova.network.neutron [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:39:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:08.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.306 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Updating instance_info_cache with network_info: [{"id": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "address": "fa:16:3e:9b:58:e7", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93cbf6f2-1b", "ovs_interfaceid": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.330 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.331 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.368 227766 DEBUG nova.network.neutron [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Updating instance_info_cache with network_info: [{"id": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "address": "fa:16:3e:b1:07:ce", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f13d72-f6", "ovs_interfaceid": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.389 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Releasing lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.390 227766 DEBUG nova.compute.manager [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Instance network_info: |[{"id": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "address": "fa:16:3e:b1:07:ce", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f13d72-f6", "ovs_interfaceid": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.390 227766 DEBUG oslo_concurrency.lockutils [req-5ebcd5fb-2d23-44bd-b046-cdf38d5ae2ca req-de585656-a891-4dc3-83b4-3923e9d09a8d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.390 227766 DEBUG nova.network.neutron [req-5ebcd5fb-2d23-44bd-b046-cdf38d5ae2ca req-de585656-a891-4dc3-83b4-3923e9d09a8d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Refreshing network info cache for port 50f13d72-f6d6-4b3a-8853-76d0a2f50240 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.393 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Start _get_guest_xml network_info=[{"id": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "address": "fa:16:3e:b1:07:ce", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f13d72-f6", "ovs_interfaceid": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.397 227766 WARNING nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.405 227766 DEBUG nova.virt.libvirt.host [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.406 227766 DEBUG nova.virt.libvirt.host [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.409 227766 DEBUG nova.virt.libvirt.host [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.409 227766 DEBUG nova.virt.libvirt.host [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.410 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.411 227766 DEBUG nova.virt.hardware [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.411 227766 DEBUG nova.virt.hardware [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.412 227766 DEBUG nova.virt.hardware [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.412 227766 DEBUG nova.virt.hardware [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.412 227766 DEBUG nova.virt.hardware [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.412 227766 DEBUG nova.virt.hardware [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.412 227766 DEBUG nova.virt.hardware [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.413 227766 DEBUG nova.virt.hardware [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.413 227766 DEBUG nova.virt.hardware [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.413 227766 DEBUG nova.virt.hardware [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.413 227766 DEBUG nova.virt.hardware [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.416 227766 DEBUG oslo_concurrency.processutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:39:08 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3357850603' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.839 227766 DEBUG oslo_concurrency.processutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.869 227766 DEBUG nova.storage.rbd_utils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image d0cea430-15ec-471d-963b-41fd4fa4777c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:39:08 np0005593234 nova_compute[227762]: 2026-01-23 10:39:08.873 227766 DEBUG oslo_concurrency.processutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:08.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.114 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.257 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:39:09 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2820779731' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.312 227766 DEBUG oslo_concurrency.processutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.315 227766 DEBUG nova.virt.libvirt.vif [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:38:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=196,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP3FfIOd2lnI+tPBfDtyl7+3bVUJP3jvoQEZS2+zpCm94FEzq78d4QEW/4ixP6N6S+NwXEvQperhCcfeORiYVMygQWeTqWJgqUherQ/1aiNrcs4OJRb36XBDXhjh6k5P/Q==',key_name='tempest-keypair-529522234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e762fca3b634c7aa1d994314c059c54',ramdisk_id='',reservation_id='r-gcdmevgb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',netw
ork_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-63035580',owner_user_name='tempest-AttachVolumeMultiAttachTest-63035580-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:39:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93cd560e84264023877c47122b5919de',uuid=d0cea430-15ec-471d-963b-41fd4fa4777c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "address": "fa:16:3e:b1:07:ce", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f13d72-f6", "ovs_interfaceid": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.315 227766 DEBUG nova.network.os_vif_util [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converting VIF {"id": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "address": "fa:16:3e:b1:07:ce", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f13d72-f6", "ovs_interfaceid": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.316 227766 DEBUG nova.network.os_vif_util [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:07:ce,bridge_name='br-int',has_traffic_filtering=True,id=50f13d72-f6d6-4b3a-8853-76d0a2f50240,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f13d72-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.317 227766 DEBUG nova.objects.instance [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'pci_devices' on Instance uuid d0cea430-15ec-471d-963b-41fd4fa4777c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.334 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  <uuid>d0cea430-15ec-471d-963b-41fd4fa4777c</uuid>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  <name>instance-000000c4</name>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <nova:name>multiattach-server-1</nova:name>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:39:08</nova:creationTime>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <nova:user uuid="93cd560e84264023877c47122b5919de">tempest-AttachVolumeMultiAttachTest-63035580-project-member</nova:user>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <nova:project uuid="6e762fca3b634c7aa1d994314c059c54">tempest-AttachVolumeMultiAttachTest-63035580</nova:project>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <nova:port uuid="50f13d72-f6d6-4b3a-8853-76d0a2f50240">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <entry name="serial">d0cea430-15ec-471d-963b-41fd4fa4777c</entry>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <entry name="uuid">d0cea430-15ec-471d-963b-41fd4fa4777c</entry>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/d0cea430-15ec-471d-963b-41fd4fa4777c_disk">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/d0cea430-15ec-471d-963b-41fd4fa4777c_disk.config">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:b1:07:ce"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <target dev="tap50f13d72-f6"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/d0cea430-15ec-471d-963b-41fd4fa4777c/console.log" append="off"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:39:09 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:39:09 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:39:09 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:39:09 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.336 227766 DEBUG nova.compute.manager [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Preparing to wait for external event network-vif-plugged-50f13d72-f6d6-4b3a-8853-76d0a2f50240 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.337 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.337 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.338 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.339 227766 DEBUG nova.virt.libvirt.vif [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:38:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=196,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP3FfIOd2lnI+tPBfDtyl7+3bVUJP3jvoQEZS2+zpCm94FEzq78d4QEW/4ixP6N6S+NwXEvQperhCcfeORiYVMygQWeTqWJgqUherQ/1aiNrcs4OJRb36XBDXhjh6k5P/Q==',key_name='tempest-keypair-529522234',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='6e762fca3b634c7aa1d994314c059c54',ramdisk_id='',reservation_id='r-gcdmevgb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ra
m='0',network_allocated='True',owner_project_name='tempest-AttachVolumeMultiAttachTest-63035580',owner_user_name='tempest-AttachVolumeMultiAttachTest-63035580-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:39:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93cd560e84264023877c47122b5919de',uuid=d0cea430-15ec-471d-963b-41fd4fa4777c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "address": "fa:16:3e:b1:07:ce", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f13d72-f6", "ovs_interfaceid": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.339 227766 DEBUG nova.network.os_vif_util [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converting VIF {"id": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "address": "fa:16:3e:b1:07:ce", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f13d72-f6", "ovs_interfaceid": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.340 227766 DEBUG nova.network.os_vif_util [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b1:07:ce,bridge_name='br-int',has_traffic_filtering=True,id=50f13d72-f6d6-4b3a-8853-76d0a2f50240,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f13d72-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.341 227766 DEBUG os_vif [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:07:ce,bridge_name='br-int',has_traffic_filtering=True,id=50f13d72-f6d6-4b3a-8853-76d0a2f50240,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f13d72-f6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.341 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.342 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.343 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.347 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.348 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50f13d72-f6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.348 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap50f13d72-f6, col_values=(('external_ids', {'iface-id': '50f13d72-f6d6-4b3a-8853-76d0a2f50240', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b1:07:ce', 'vm-uuid': 'd0cea430-15ec-471d-963b-41fd4fa4777c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.350 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:09 np0005593234 NetworkManager[48942]: <info>  [1769164749.3510] manager: (tap50f13d72-f6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/392)
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.353 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.358 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.359 227766 INFO os_vif [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b1:07:ce,bridge_name='br-int',has_traffic_filtering=True,id=50f13d72-f6d6-4b3a-8853-76d0a2f50240,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f13d72-f6')#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.460 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.461 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.461 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No VIF found with MAC fa:16:3e:b1:07:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.462 227766 INFO nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Using config drive#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.491 227766 DEBUG nova.storage.rbd_utils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image d0cea430-15ec-471d-963b-41fd4fa4777c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:39:09 np0005593234 nova_compute[227762]: 2026-01-23 10:39:09.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.106 227766 INFO nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Creating config drive at /var/lib/nova/instances/d0cea430-15ec-471d-963b-41fd4fa4777c/disk.config#033[00m
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.112 227766 DEBUG oslo_concurrency.processutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d0cea430-15ec-471d-963b-41fd4fa4777c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsa5acrtm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:10.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.253 227766 DEBUG oslo_concurrency.processutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d0cea430-15ec-471d-963b-41fd4fa4777c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpsa5acrtm" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.287 227766 DEBUG nova.storage.rbd_utils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] rbd image d0cea430-15ec-471d-963b-41fd4fa4777c_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.292 227766 DEBUG oslo_concurrency.processutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d0cea430-15ec-471d-963b-41fd4fa4777c/disk.config d0cea430-15ec-471d-963b-41fd4fa4777c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.833 227766 DEBUG oslo_concurrency.processutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d0cea430-15ec-471d-963b-41fd4fa4777c/disk.config d0cea430-15ec-471d-963b-41fd4fa4777c_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.541s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.834 227766 INFO nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Deleting local config drive /var/lib/nova/instances/d0cea430-15ec-471d-963b-41fd4fa4777c/disk.config because it was imported into RBD.#033[00m
Jan 23 05:39:10 np0005593234 kernel: tap50f13d72-f6: entered promiscuous mode
Jan 23 05:39:10 np0005593234 NetworkManager[48942]: <info>  [1769164750.8890] manager: (tap50f13d72-f6): new Tun device (/org/freedesktop/NetworkManager/Devices/393)
Jan 23 05:39:10 np0005593234 ovn_controller[134547]: 2026-01-23T10:39:10Z|00831|binding|INFO|Claiming lport 50f13d72-f6d6-4b3a-8853-76d0a2f50240 for this chassis.
Jan 23 05:39:10 np0005593234 ovn_controller[134547]: 2026-01-23T10:39:10Z|00832|binding|INFO|50f13d72-f6d6-4b3a-8853-76d0a2f50240: Claiming fa:16:3e:b1:07:ce 10.100.0.3
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.889 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.896 227766 DEBUG nova.network.neutron [req-5ebcd5fb-2d23-44bd-b046-cdf38d5ae2ca req-de585656-a891-4dc3-83b4-3923e9d09a8d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Updated VIF entry in instance network info cache for port 50f13d72-f6d6-4b3a-8853-76d0a2f50240. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.896 227766 DEBUG nova.network.neutron [req-5ebcd5fb-2d23-44bd-b046-cdf38d5ae2ca req-de585656-a891-4dc3-83b4-3923e9d09a8d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Updating instance_info_cache with network_info: [{"id": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "address": "fa:16:3e:b1:07:ce", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f13d72-f6", "ovs_interfaceid": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:39:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:10.897 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:07:ce 10.100.0.3'], port_security=['fa:16:3e:b1:07:ce 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd0cea430-15ec-471d-963b-41fd4fa4777c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e762fca3b634c7aa1d994314c059c54', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ed138636-f650-4a09-b808-0b05f9067a5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0936335-b706-4400-8411-bdd084c8cdf7, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=50f13d72-f6d6-4b3a-8853-76d0a2f50240) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:39:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:10.898 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 50f13d72-f6d6-4b3a-8853-76d0a2f50240 in datapath fba2ba4a-d82c-4f8b-9754-c13fbec41a04 bound to our chassis#033[00m
Jan 23 05:39:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:10.900 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fba2ba4a-d82c-4f8b-9754-c13fbec41a04#033[00m
Jan 23 05:39:10 np0005593234 ovn_controller[134547]: 2026-01-23T10:39:10Z|00833|binding|INFO|Setting lport 50f13d72-f6d6-4b3a-8853-76d0a2f50240 ovn-installed in OVS
Jan 23 05:39:10 np0005593234 ovn_controller[134547]: 2026-01-23T10:39:10Z|00834|binding|INFO|Setting lport 50f13d72-f6d6-4b3a-8853-76d0a2f50240 up in Southbound
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.907 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.909 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:10 np0005593234 nova_compute[227762]: 2026-01-23 10:39:10.918 227766 DEBUG oslo_concurrency.lockutils [req-5ebcd5fb-2d23-44bd-b046-cdf38d5ae2ca req-de585656-a891-4dc3-83b4-3923e9d09a8d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:39:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:10.919 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e5e78ee0-5f57-4e9c-8cd5-99384a33bb09]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:10 np0005593234 systemd-udevd[319853]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:39:10 np0005593234 systemd-machined[195626]: New machine qemu-94-instance-000000c4.
Jan 23 05:39:10 np0005593234 NetworkManager[48942]: <info>  [1769164750.9388] device (tap50f13d72-f6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:39:10 np0005593234 NetworkManager[48942]: <info>  [1769164750.9393] device (tap50f13d72-f6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:39:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:10.949 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[aa243b48-005d-4e8e-8dea-22515c182d40]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:10.954 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[fc26791b-df45-4f7c-9196-73bbfc2633ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:10 np0005593234 systemd[1]: Started Virtual Machine qemu-94-instance-000000c4.
Jan 23 05:39:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:10.979 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[beddf615-d8fe-45e6-9566-9dac4fb2489b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:10.994 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2a4d0e56-5017-4c43-8029-33650defe020]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfba2ba4a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:db:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 8, 'tx_packets': 5, 'rx_bytes': 616, 'tx_bytes': 354, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 860063, 'reachable_time': 42218, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 319864, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:10.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:11.010 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8dbda760-50e7-4fd2-bcb7-a1cefb3fc380]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfba2ba4a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 860074, 'tstamp': 860074}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 319867, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfba2ba4a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 860077, 'tstamp': 860077}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 319867, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:11.012 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfba2ba4a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.014 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.015 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:11.015 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfba2ba4a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:11.015 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:39:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:11.016 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfba2ba4a-d0, col_values=(('external_ids', {'iface-id': '2348ddba-3dc3-4456-a637-f3065ba0d8f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:11.016 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.446 227766 DEBUG nova.compute.manager [req-8c83af5d-6e42-49b8-8e1a-56d2adfd79a8 req-f1ed553c-1975-4b7f-95d9-2e604f266bc9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Received event network-vif-plugged-50f13d72-f6d6-4b3a-8853-76d0a2f50240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.446 227766 DEBUG oslo_concurrency.lockutils [req-8c83af5d-6e42-49b8-8e1a-56d2adfd79a8 req-f1ed553c-1975-4b7f-95d9-2e604f266bc9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.446 227766 DEBUG oslo_concurrency.lockutils [req-8c83af5d-6e42-49b8-8e1a-56d2adfd79a8 req-f1ed553c-1975-4b7f-95d9-2e604f266bc9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.447 227766 DEBUG oslo_concurrency.lockutils [req-8c83af5d-6e42-49b8-8e1a-56d2adfd79a8 req-f1ed553c-1975-4b7f-95d9-2e604f266bc9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.447 227766 DEBUG nova.compute.manager [req-8c83af5d-6e42-49b8-8e1a-56d2adfd79a8 req-f1ed553c-1975-4b7f-95d9-2e604f266bc9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Processing event network-vif-plugged-50f13d72-f6d6-4b3a-8853-76d0a2f50240 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.489 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164751.4886668, d0cea430-15ec-471d-963b-41fd4fa4777c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.489 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] VM Started (Lifecycle Event)#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.491 227766 DEBUG nova.compute.manager [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.494 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.497 227766 INFO nova.virt.libvirt.driver [-] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Instance spawned successfully.#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.498 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.513 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.519 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.522 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.522 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.523 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.523 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.524 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.524 227766 DEBUG nova.virt.libvirt.driver [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.548 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.548 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164751.488998, d0cea430-15ec-471d-963b-41fd4fa4777c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.549 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.582 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.587 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164751.4939911, d0cea430-15ec-471d-963b-41fd4fa4777c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.587 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.593 227766 INFO nova.compute.manager [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Took 10.12 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.593 227766 DEBUG nova.compute.manager [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.610 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.614 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.677 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.972 227766 INFO nova.compute.manager [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Took 11.41 seconds to build instance.#033[00m
Jan 23 05:39:11 np0005593234 nova_compute[227762]: 2026-01-23 10:39:11.991 227766 DEBUG oslo_concurrency.lockutils [None req-6799e9cb-ec3a-4176-b5d2-90efcf564895 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.526s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:12.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:12.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:13 np0005593234 nova_compute[227762]: 2026-01-23 10:39:13.681 227766 DEBUG nova.compute.manager [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Received event network-vif-plugged-50f13d72-f6d6-4b3a-8853-76d0a2f50240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:39:13 np0005593234 nova_compute[227762]: 2026-01-23 10:39:13.682 227766 DEBUG oslo_concurrency.lockutils [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:13 np0005593234 nova_compute[227762]: 2026-01-23 10:39:13.682 227766 DEBUG oslo_concurrency.lockutils [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:13 np0005593234 nova_compute[227762]: 2026-01-23 10:39:13.682 227766 DEBUG oslo_concurrency.lockutils [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:13 np0005593234 nova_compute[227762]: 2026-01-23 10:39:13.682 227766 DEBUG nova.compute.manager [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] No waiting events found dispatching network-vif-plugged-50f13d72-f6d6-4b3a-8853-76d0a2f50240 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:39:13 np0005593234 nova_compute[227762]: 2026-01-23 10:39:13.683 227766 WARNING nova.compute.manager [req-78a3c245-e03b-4c9d-a567-19047df9cd9a req-1de2f41c-20d8-49f1-a3d1-75b998d59f8c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Received unexpected event network-vif-plugged-50f13d72-f6d6-4b3a-8853-76d0a2f50240 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:39:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:14.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:14 np0005593234 nova_compute[227762]: 2026-01-23 10:39:14.259 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:14 np0005593234 nova_compute[227762]: 2026-01-23 10:39:14.350 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:14 np0005593234 nova_compute[227762]: 2026-01-23 10:39:14.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:15.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:15 np0005593234 nova_compute[227762]: 2026-01-23 10:39:15.865 227766 DEBUG nova.compute.manager [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Received event network-changed-50f13d72-f6d6-4b3a-8853-76d0a2f50240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:39:15 np0005593234 nova_compute[227762]: 2026-01-23 10:39:15.865 227766 DEBUG nova.compute.manager [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Refreshing instance network info cache due to event network-changed-50f13d72-f6d6-4b3a-8853-76d0a2f50240. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:39:15 np0005593234 nova_compute[227762]: 2026-01-23 10:39:15.866 227766 DEBUG oslo_concurrency.lockutils [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:39:15 np0005593234 nova_compute[227762]: 2026-01-23 10:39:15.866 227766 DEBUG oslo_concurrency.lockutils [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:39:15 np0005593234 nova_compute[227762]: 2026-01-23 10:39:15.867 227766 DEBUG nova.network.neutron [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Refreshing network info cache for port 50f13d72-f6d6-4b3a-8853-76d0a2f50240 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:39:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:16.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:17.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:18 np0005593234 nova_compute[227762]: 2026-01-23 10:39:18.039 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:18.039 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=74, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=73) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:39:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:18.041 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:39:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:18.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:18 np0005593234 nova_compute[227762]: 2026-01-23 10:39:18.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:19.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:19 np0005593234 nova_compute[227762]: 2026-01-23 10:39:19.299 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:19 np0005593234 nova_compute[227762]: 2026-01-23 10:39:19.352 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:20.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:20 np0005593234 nova_compute[227762]: 2026-01-23 10:39:20.812 227766 DEBUG nova.network.neutron [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Updated VIF entry in instance network info cache for port 50f13d72-f6d6-4b3a-8853-76d0a2f50240. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:39:20 np0005593234 nova_compute[227762]: 2026-01-23 10:39:20.813 227766 DEBUG nova.network.neutron [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Updating instance_info_cache with network_info: [{"id": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "address": "fa:16:3e:b1:07:ce", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.206", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f13d72-f6", "ovs_interfaceid": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:39:20 np0005593234 nova_compute[227762]: 2026-01-23 10:39:20.849 227766 DEBUG oslo_concurrency.lockutils [req-f31a25f0-4be3-43fa-8026-22aedafc9eba req-6a19bb7a-d153-487d-abcf-8ba95147212a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:39:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:21.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:22.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:23.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:24.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:24 np0005593234 nova_compute[227762]: 2026-01-23 10:39:24.301 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:24 np0005593234 nova_compute[227762]: 2026-01-23 10:39:24.353 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:25.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:25 np0005593234 ovn_controller[134547]: 2026-01-23T10:39:25Z|00110|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b1:07:ce 10.100.0.3
Jan 23 05:39:25 np0005593234 ovn_controller[134547]: 2026-01-23T10:39:25Z|00111|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b1:07:ce 10.100.0.3
Jan 23 05:39:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:26.044 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '74'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:39:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:26.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:26 np0005593234 nova_compute[227762]: 2026-01-23 10:39:26.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:26 np0005593234 podman[319969]: 2026-01-23 10:39:26.791504229 +0000 UTC m=+0.083118568 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:39:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:27.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:39:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2917504573' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:39:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:28.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:29.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:29 np0005593234 nova_compute[227762]: 2026-01-23 10:39:29.303 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:29 np0005593234 nova_compute[227762]: 2026-01-23 10:39:29.355 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e379 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:29 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:39:29 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:39:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:30.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.252 227766 DEBUG oslo_concurrency.lockutils [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "d0cea430-15ec-471d-963b-41fd4fa4777c" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.252 227766 DEBUG oslo_concurrency.lockutils [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.287 227766 DEBUG nova.objects.instance [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'flavor' on Instance uuid d0cea430-15ec-471d-963b-41fd4fa4777c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.359 227766 DEBUG oslo_concurrency.lockutils [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.106s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.617 227766 DEBUG oslo_concurrency.lockutils [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "d0cea430-15ec-471d-963b-41fd4fa4777c" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.617 227766 DEBUG oslo_concurrency.lockutils [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.618 227766 INFO nova.compute.manager [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Attaching volume 6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5 to /dev/vdb#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.794 227766 DEBUG os_brick.utils [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.796 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.811 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.811 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[73f5daa0-b7f4-4971-bcf6-03b57557fa4d]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.812 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.821 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.821 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[e456e905-2036-4c72-9018-dc60243d959b]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.823 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.831 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.832 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[240f16a6-b9ae-4595-a62f-b02c510157e8]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.833 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[9b77c990-1c44-4b12-afd2-329b9e99ec9d]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.833 227766 DEBUG oslo_concurrency.processutils [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.861 227766 DEBUG oslo_concurrency.processutils [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "nvme version" returned: 0 in 0.028s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.863 227766 DEBUG os_brick.initiator.connectors.lightos [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.864 227766 DEBUG os_brick.initiator.connectors.lightos [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.864 227766 DEBUG os_brick.initiator.connectors.lightos [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.864 227766 DEBUG os_brick.utils [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] <== get_connector_properties: return (69ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:39:30 np0005593234 nova_compute[227762]: 2026-01-23 10:39:30.865 227766 DEBUG nova.virt.block_device [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Updating existing volume attachment record: 531ec09e-2927-4dbd-8de1-6fdc7aac4fcc _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:39:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:39:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:39:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:39:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:31.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:39:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2775486351' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:39:31 np0005593234 nova_compute[227762]: 2026-01-23 10:39:31.857 227766 DEBUG nova.objects.instance [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'flavor' on Instance uuid d0cea430-15ec-471d-963b-41fd4fa4777c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:39:31 np0005593234 nova_compute[227762]: 2026-01-23 10:39:31.879 227766 DEBUG nova.virt.libvirt.driver [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Attempting to attach volume 6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:39:31 np0005593234 nova_compute[227762]: 2026-01-23 10:39:31.883 227766 DEBUG nova.virt.libvirt.guest [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:39:31 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:39:31 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5">
Jan 23 05:39:31 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:39:31 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:39:31 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:39:31 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:39:31 np0005593234 nova_compute[227762]:  <auth username="openstack">
Jan 23 05:39:31 np0005593234 nova_compute[227762]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:39:31 np0005593234 nova_compute[227762]:  </auth>
Jan 23 05:39:31 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:39:31 np0005593234 nova_compute[227762]:  <serial>6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5</serial>
Jan 23 05:39:31 np0005593234 nova_compute[227762]:  <shareable/>
Jan 23 05:39:31 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:39:31 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:39:32 np0005593234 nova_compute[227762]: 2026-01-23 10:39:32.024 227766 DEBUG nova.virt.libvirt.driver [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:39:32 np0005593234 nova_compute[227762]: 2026-01-23 10:39:32.024 227766 DEBUG nova.virt.libvirt.driver [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:39:32 np0005593234 nova_compute[227762]: 2026-01-23 10:39:32.025 227766 DEBUG nova.virt.libvirt.driver [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:39:32 np0005593234 nova_compute[227762]: 2026-01-23 10:39:32.025 227766 DEBUG nova.virt.libvirt.driver [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] No VIF found with MAC fa:16:3e:b1:07:ce, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:39:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:32.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:32 np0005593234 nova_compute[227762]: 2026-01-23 10:39:32.289 227766 DEBUG oslo_concurrency.lockutils [None req-5668b29c-d0be-4117-9173-ff71d8ba051e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 1.672s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e380 e380: 3 total, 3 up, 3 in
Jan 23 05:39:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:33.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:39:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3454312985' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:39:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:39:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3454312985' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:39:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:39:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2227559862' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:39:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:39:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2227559862' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:39:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:34.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:34 np0005593234 nova_compute[227762]: 2026-01-23 10:39:34.306 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:34 np0005593234 nova_compute[227762]: 2026-01-23 10:39:34.356 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:34 np0005593234 podman[320153]: 2026-01-23 10:39:34.803727504 +0000 UTC m=+0.093474391 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:39:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e380 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:35.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:35 np0005593234 nova_compute[227762]: 2026-01-23 10:39:35.982 227766 DEBUG oslo_concurrency.lockutils [None req-2ac48889-55ae-40ec-a69d-a1ecde32a7f1 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "d0cea430-15ec-471d-963b-41fd4fa4777c" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:35 np0005593234 nova_compute[227762]: 2026-01-23 10:39:35.982 227766 DEBUG oslo_concurrency.lockutils [None req-2ac48889-55ae-40ec-a69d-a1ecde32a7f1 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:36 np0005593234 nova_compute[227762]: 2026-01-23 10:39:36.001 227766 INFO nova.compute.manager [None req-2ac48889-55ae-40ec-a69d-a1ecde32a7f1 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Detaching volume 6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5#033[00m
Jan 23 05:39:36 np0005593234 nova_compute[227762]: 2026-01-23 10:39:36.198 227766 INFO nova.virt.block_device [None req-2ac48889-55ae-40ec-a69d-a1ecde32a7f1 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Attempting to driver detach volume 6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5 from mountpoint /dev/vdb#033[00m
Jan 23 05:39:36 np0005593234 nova_compute[227762]: 2026-01-23 10:39:36.210 227766 DEBUG nova.virt.libvirt.driver [None req-2ac48889-55ae-40ec-a69d-a1ecde32a7f1 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Attempting to detach device vdb from instance d0cea430-15ec-471d-963b-41fd4fa4777c from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:39:36 np0005593234 nova_compute[227762]: 2026-01-23 10:39:36.211 227766 DEBUG nova.virt.libvirt.guest [None req-2ac48889-55ae-40ec-a69d-a1ecde32a7f1 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:39:36 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5">
Jan 23 05:39:36 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:  <serial>6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5</serial>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:  <shareable/>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:39:36 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:39:36 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:39:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:39:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:36.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:39:36 np0005593234 nova_compute[227762]: 2026-01-23 10:39:36.531 227766 INFO nova.virt.libvirt.driver [None req-2ac48889-55ae-40ec-a69d-a1ecde32a7f1 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Successfully detached device vdb from instance d0cea430-15ec-471d-963b-41fd4fa4777c from the persistent domain config.#033[00m
Jan 23 05:39:36 np0005593234 nova_compute[227762]: 2026-01-23 10:39:36.532 227766 DEBUG nova.virt.libvirt.driver [None req-2ac48889-55ae-40ec-a69d-a1ecde32a7f1 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance d0cea430-15ec-471d-963b-41fd4fa4777c from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:39:36 np0005593234 nova_compute[227762]: 2026-01-23 10:39:36.533 227766 DEBUG nova.virt.libvirt.guest [None req-2ac48889-55ae-40ec-a69d-a1ecde32a7f1 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:39:36 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5">
Jan 23 05:39:36 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:  <serial>6fd77dfa-97fc-4041-84a2-c8fe6e49c5d5</serial>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:  <shareable/>
Jan 23 05:39:36 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:39:36 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:39:36 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:39:36 np0005593234 nova_compute[227762]: 2026-01-23 10:39:36.592 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769164776.5920532, d0cea430-15ec-471d-963b-41fd4fa4777c => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:39:36 np0005593234 nova_compute[227762]: 2026-01-23 10:39:36.593 227766 DEBUG nova.virt.libvirt.driver [None req-2ac48889-55ae-40ec-a69d-a1ecde32a7f1 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance d0cea430-15ec-471d-963b-41fd4fa4777c _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:39:36 np0005593234 nova_compute[227762]: 2026-01-23 10:39:36.595 227766 INFO nova.virt.libvirt.driver [None req-2ac48889-55ae-40ec-a69d-a1ecde32a7f1 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Successfully detached device vdb from instance d0cea430-15ec-471d-963b-41fd4fa4777c from the live domain config.#033[00m
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #166. Immutable memtables: 0.
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.715088) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 105] Flushing memtable with next log file: 166
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164776715196, "job": 105, "event": "flush_started", "num_memtables": 1, "num_entries": 1302, "num_deletes": 251, "total_data_size": 2741567, "memory_usage": 2775264, "flush_reason": "Manual Compaction"}
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 105] Level-0 flush table #167: started
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164776736505, "cf_name": "default", "job": 105, "event": "table_file_creation", "file_number": 167, "file_size": 1809017, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 80002, "largest_seqno": 81299, "table_properties": {"data_size": 1803231, "index_size": 3116, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 13032, "raw_average_key_size": 20, "raw_value_size": 1791371, "raw_average_value_size": 2821, "num_data_blocks": 136, "num_entries": 635, "num_filter_entries": 635, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164680, "oldest_key_time": 1769164680, "file_creation_time": 1769164776, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 167, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 105] Flush lasted 21586 microseconds, and 4930 cpu microseconds.
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.736692) [db/flush_job.cc:967] [default] [JOB 105] Level-0 flush table #167: 1809017 bytes OK
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.736719) [db/memtable_list.cc:519] [default] Level-0 commit table #167 started
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.739600) [db/memtable_list.cc:722] [default] Level-0 commit table #167: memtable #1 done
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.739625) EVENT_LOG_v1 {"time_micros": 1769164776739618, "job": 105, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.739645) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 105] Try to delete WAL files size 2735262, prev total WAL file size 2735262, number of live WAL files 2.
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000163.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.740536) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037303238' seq:72057594037927935, type:22 .. '7061786F730037323830' seq:0, type:0; will stop at (end)
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 106] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 105 Base level 0, inputs: [167(1766KB)], [165(12MB)]
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164776740643, "job": 106, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [167], "files_L6": [165], "score": -1, "input_data_size": 15435168, "oldest_snapshot_seqno": -1}
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 106] Generated table #168: 9973 keys, 13459684 bytes, temperature: kUnknown
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164776856402, "cf_name": "default", "job": 106, "event": "table_file_creation", "file_number": 168, "file_size": 13459684, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13394546, "index_size": 39105, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24965, "raw_key_size": 263508, "raw_average_key_size": 26, "raw_value_size": 13219137, "raw_average_value_size": 1325, "num_data_blocks": 1493, "num_entries": 9973, "num_filter_entries": 9973, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769164776, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 168, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.856704) [db/compaction/compaction_job.cc:1663] [default] [JOB 106] Compacted 1@0 + 1@6 files to L6 => 13459684 bytes
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.862546) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 133.2 rd, 116.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.0 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(16.0) write-amplify(7.4) OK, records in: 10494, records dropped: 521 output_compression: NoCompression
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.862584) EVENT_LOG_v1 {"time_micros": 1769164776862574, "job": 106, "event": "compaction_finished", "compaction_time_micros": 115849, "compaction_time_cpu_micros": 34322, "output_level": 6, "num_output_files": 1, "total_output_size": 13459684, "num_input_records": 10494, "num_output_records": 9973, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000167.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164776862987, "job": 106, "event": "table_file_deletion", "file_number": 167}
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000165.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164776865241, "job": 106, "event": "table_file_deletion", "file_number": 165}
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.740471) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.865295) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.865300) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.865304) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.865305) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:36 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:39:36.865307) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:39:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:37.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:39:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:39:37 np0005593234 nova_compute[227762]: 2026-01-23 10:39:37.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:38 np0005593234 nova_compute[227762]: 2026-01-23 10:39:38.120 227766 DEBUG nova.objects.instance [None req-2ac48889-55ae-40ec-a69d-a1ecde32a7f1 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'flavor' on Instance uuid d0cea430-15ec-471d-963b-41fd4fa4777c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:39:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:38.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:38 np0005593234 nova_compute[227762]: 2026-01-23 10:39:38.248 227766 DEBUG oslo_concurrency.lockutils [None req-2ac48889-55ae-40ec-a69d-a1ecde32a7f1 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 2.266s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:39.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 e381: 3 total, 3 up, 3 in
Jan 23 05:39:39 np0005593234 nova_compute[227762]: 2026-01-23 10:39:39.309 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:39 np0005593234 nova_compute[227762]: 2026-01-23 10:39:39.358 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:40.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:41.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:39:41Z|00835|binding|INFO|Releasing lport 2348ddba-3dc3-4456-a637-f3065ba0d8f6 from this chassis (sb_readonly=0)
Jan 23 05:39:41 np0005593234 nova_compute[227762]: 2026-01-23 10:39:41.144 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:39:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:42.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:39:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:42.876 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:39:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:42.877 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:39:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:39:42.878 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:39:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:43.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:44.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:44 np0005593234 nova_compute[227762]: 2026-01-23 10:39:44.311 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:44 np0005593234 nova_compute[227762]: 2026-01-23 10:39:44.359 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:39:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/795972185' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:39:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:39:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/795972185' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:39:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:45.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:46 np0005593234 nova_compute[227762]: 2026-01-23 10:39:46.048 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:46 np0005593234 nova_compute[227762]: 2026-01-23 10:39:46.048 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:39:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:46.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:47.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:47 np0005593234 ovn_controller[134547]: 2026-01-23T10:39:47Z|00836|binding|INFO|Releasing lport 2348ddba-3dc3-4456-a637-f3065ba0d8f6 from this chassis (sb_readonly=0)
Jan 23 05:39:47 np0005593234 nova_compute[227762]: 2026-01-23 10:39:47.651 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:48.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:49.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:49 np0005593234 nova_compute[227762]: 2026-01-23 10:39:49.313 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:49 np0005593234 nova_compute[227762]: 2026-01-23 10:39:49.361 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:50.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:51.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:52.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:39:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:53.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:39:54 np0005593234 nova_compute[227762]: 2026-01-23 10:39:54.314 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:54.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:54 np0005593234 nova_compute[227762]: 2026-01-23 10:39:54.362 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:54 np0005593234 nova_compute[227762]: 2026-01-23 10:39:54.765 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:39:54 np0005593234 nova_compute[227762]: 2026-01-23 10:39:54.766 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:39:54 np0005593234 nova_compute[227762]: 2026-01-23 10:39:54.794 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:39:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:39:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:55.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:39:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:56.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:39:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:57.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:39:57 np0005593234 podman[320292]: 2026-01-23 10:39:57.755667074 +0000 UTC m=+0.050466398 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:39:58 np0005593234 nova_compute[227762]: 2026-01-23 10:39:58.159 227766 DEBUG nova.compute.manager [req-13ea32b8-1100-4dbf-beec-0e8ec8b664ba req-e5d79968-8b66-4494-afae-bf8e44173ca4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Received event network-changed-50f13d72-f6d6-4b3a-8853-76d0a2f50240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:39:58 np0005593234 nova_compute[227762]: 2026-01-23 10:39:58.160 227766 DEBUG nova.compute.manager [req-13ea32b8-1100-4dbf-beec-0e8ec8b664ba req-e5d79968-8b66-4494-afae-bf8e44173ca4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Refreshing instance network info cache due to event network-changed-50f13d72-f6d6-4b3a-8853-76d0a2f50240. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:39:58 np0005593234 nova_compute[227762]: 2026-01-23 10:39:58.160 227766 DEBUG oslo_concurrency.lockutils [req-13ea32b8-1100-4dbf-beec-0e8ec8b664ba req-e5d79968-8b66-4494-afae-bf8e44173ca4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:39:58 np0005593234 nova_compute[227762]: 2026-01-23 10:39:58.160 227766 DEBUG oslo_concurrency.lockutils [req-13ea32b8-1100-4dbf-beec-0e8ec8b664ba req-e5d79968-8b66-4494-afae-bf8e44173ca4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:39:58 np0005593234 nova_compute[227762]: 2026-01-23 10:39:58.160 227766 DEBUG nova.network.neutron [req-13ea32b8-1100-4dbf-beec-0e8ec8b664ba req-e5d79968-8b66-4494-afae-bf8e44173ca4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Refreshing network info cache for port 50f13d72-f6d6-4b3a-8853-76d0a2f50240 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:39:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:58.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:39:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:39:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:59.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:39:59 np0005593234 nova_compute[227762]: 2026-01-23 10:39:59.115 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:59 np0005593234 nova_compute[227762]: 2026-01-23 10:39:59.316 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:59 np0005593234 nova_compute[227762]: 2026-01-23 10:39:59.363 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:39:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:39:59 np0005593234 nova_compute[227762]: 2026-01-23 10:39:59.901 227766 DEBUG nova.network.neutron [req-13ea32b8-1100-4dbf-beec-0e8ec8b664ba req-e5d79968-8b66-4494-afae-bf8e44173ca4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Updated VIF entry in instance network info cache for port 50f13d72-f6d6-4b3a-8853-76d0a2f50240. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:39:59 np0005593234 nova_compute[227762]: 2026-01-23 10:39:59.901 227766 DEBUG nova.network.neutron [req-13ea32b8-1100-4dbf-beec-0e8ec8b664ba req-e5d79968-8b66-4494-afae-bf8e44173ca4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Updating instance_info_cache with network_info: [{"id": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "address": "fa:16:3e:b1:07:ce", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f13d72-f6", "ovs_interfaceid": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:39:59 np0005593234 nova_compute[227762]: 2026-01-23 10:39:59.927 227766 DEBUG oslo_concurrency.lockutils [req-13ea32b8-1100-4dbf-beec-0e8ec8b664ba req-e5d79968-8b66-4494-afae-bf8e44173ca4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:40:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:40:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:00.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:40:00 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 05:40:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:01.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:02.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:40:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:03.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:40:03 np0005593234 nova_compute[227762]: 2026-01-23 10:40:03.773 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:04 np0005593234 nova_compute[227762]: 2026-01-23 10:40:04.318 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:04 np0005593234 nova_compute[227762]: 2026-01-23 10:40:04.365 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:04.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:04 np0005593234 nova_compute[227762]: 2026-01-23 10:40:04.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:04 np0005593234 nova_compute[227762]: 2026-01-23 10:40:04.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:40:04 np0005593234 nova_compute[227762]: 2026-01-23 10:40:04.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:40:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:05.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:05 np0005593234 podman[320366]: 2026-01-23 10:40:05.776818316 +0000 UTC m=+0.072521397 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0)
Jan 23 05:40:05 np0005593234 nova_compute[227762]: 2026-01-23 10:40:05.842 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:40:05 np0005593234 nova_compute[227762]: 2026-01-23 10:40:05.843 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:40:05 np0005593234 nova_compute[227762]: 2026-01-23 10:40:05.843 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:40:05 np0005593234 nova_compute[227762]: 2026-01-23 10:40:05.843 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 307f203d-cfc0-45a9-a0cd-3acee0ef7133 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:40:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:06.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:07.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:07 np0005593234 nova_compute[227762]: 2026-01-23 10:40:07.259 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:40:07Z|00837|binding|INFO|Releasing lport 2348ddba-3dc3-4456-a637-f3065ba0d8f6 from this chassis (sb_readonly=0)
Jan 23 05:40:08 np0005593234 nova_compute[227762]: 2026-01-23 10:40:08.023 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:08.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:08 np0005593234 nova_compute[227762]: 2026-01-23 10:40:08.408 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Updating instance_info_cache with network_info: [{"id": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "address": "fa:16:3e:9b:58:e7", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93cbf6f2-1b", "ovs_interfaceid": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:40:08 np0005593234 nova_compute[227762]: 2026-01-23 10:40:08.439 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:40:08 np0005593234 nova_compute[227762]: 2026-01-23 10:40:08.439 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:40:08 np0005593234 nova_compute[227762]: 2026-01-23 10:40:08.439 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:08 np0005593234 nova_compute[227762]: 2026-01-23 10:40:08.466 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:08 np0005593234 nova_compute[227762]: 2026-01-23 10:40:08.466 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:08 np0005593234 nova_compute[227762]: 2026-01-23 10:40:08.466 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:08 np0005593234 nova_compute[227762]: 2026-01-23 10:40:08.466 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:40:08 np0005593234 nova_compute[227762]: 2026-01-23 10:40:08.467 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:40:09 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/929287308' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.051 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.584s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:09.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.213 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.213 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.216 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.216 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.320 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.366 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.397 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.398 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3714MB free_disk=20.87606430053711GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.398 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.398 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.483 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 307f203d-cfc0-45a9-a0cd-3acee0ef7133 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.484 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance d0cea430-15ec-471d-963b-41fd4fa4777c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.484 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.484 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:40:09 np0005593234 nova_compute[227762]: 2026-01-23 10:40:09.533 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:40:10 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3540502193' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:40:10 np0005593234 nova_compute[227762]: 2026-01-23 10:40:10.060 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:10 np0005593234 nova_compute[227762]: 2026-01-23 10:40:10.067 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:40:10 np0005593234 nova_compute[227762]: 2026-01-23 10:40:10.086 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:40:10 np0005593234 nova_compute[227762]: 2026-01-23 10:40:10.115 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:40:10 np0005593234 nova_compute[227762]: 2026-01-23 10:40:10.116 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.717s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:40:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:10.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:40:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:40:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:11.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:40:11 np0005593234 nova_compute[227762]: 2026-01-23 10:40:11.421 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:11 np0005593234 nova_compute[227762]: 2026-01-23 10:40:11.421 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:11 np0005593234 nova_compute[227762]: 2026-01-23 10:40:11.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:11 np0005593234 nova_compute[227762]: 2026-01-23 10:40:11.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:40:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:12.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:12 np0005593234 nova_compute[227762]: 2026-01-23 10:40:12.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:13.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:13 np0005593234 ovn_controller[134547]: 2026-01-23T10:40:13Z|00838|binding|INFO|Releasing lport 2348ddba-3dc3-4456-a637-f3065ba0d8f6 from this chassis (sb_readonly=0)
Jan 23 05:40:13 np0005593234 nova_compute[227762]: 2026-01-23 10:40:13.883 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:14 np0005593234 nova_compute[227762]: 2026-01-23 10:40:14.322 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:14 np0005593234 nova_compute[227762]: 2026-01-23 10:40:14.368 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:14.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:14 np0005593234 nova_compute[227762]: 2026-01-23 10:40:14.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:15.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:15 np0005593234 nova_compute[227762]: 2026-01-23 10:40:15.308 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:16.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:17.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:40:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:18.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:40:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:18.510 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=75, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=74) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:40:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:18.511 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:40:18 np0005593234 nova_compute[227762]: 2026-01-23 10:40:18.587 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:18 np0005593234 nova_compute[227762]: 2026-01-23 10:40:18.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:19.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:19 np0005593234 nova_compute[227762]: 2026-01-23 10:40:19.325 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:19 np0005593234 nova_compute[227762]: 2026-01-23 10:40:19.370 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:20.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:21.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:22.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:40:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:23.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:40:23 np0005593234 nova_compute[227762]: 2026-01-23 10:40:23.917 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:24 np0005593234 nova_compute[227762]: 2026-01-23 10:40:24.327 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:24 np0005593234 nova_compute[227762]: 2026-01-23 10:40:24.372 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:24.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:25.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:25.514 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '75'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:40:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:26.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:26 np0005593234 nova_compute[227762]: 2026-01-23 10:40:26.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:40:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:40:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:27.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:40:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:28.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:28 np0005593234 podman[320502]: 2026-01-23 10:40:28.767225038 +0000 UTC m=+0.058820529 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:40:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:29.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:29 np0005593234 nova_compute[227762]: 2026-01-23 10:40:29.329 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:29 np0005593234 nova_compute[227762]: 2026-01-23 10:40:29.417 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:30.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:31.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:32.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:33.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:34 np0005593234 nova_compute[227762]: 2026-01-23 10:40:34.332 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:34 np0005593234 nova_compute[227762]: 2026-01-23 10:40:34.418 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:34.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:35.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:40:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:36.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:40:36 np0005593234 podman[320523]: 2026-01-23 10:40:36.790841547 +0000 UTC m=+0.081761175 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 23 05:40:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:37.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:38.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:40:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:39.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:40:39 np0005593234 nova_compute[227762]: 2026-01-23 10:40:39.335 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:39 np0005593234 nova_compute[227762]: 2026-01-23 10:40:39.420 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:40.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:40:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:40:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:40:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:40:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:40:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:41.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:42.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:42.877 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:42.878 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:42.879 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:43 np0005593234 nova_compute[227762]: 2026-01-23 10:40:43.046 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "6b2d76e8-0aab-4760-b64e-0097520255ce" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:43 np0005593234 nova_compute[227762]: 2026-01-23 10:40:43.046 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:43 np0005593234 nova_compute[227762]: 2026-01-23 10:40:43.063 227766 DEBUG nova.compute.manager [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:40:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:40:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:43.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:40:43 np0005593234 nova_compute[227762]: 2026-01-23 10:40:43.164 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:43 np0005593234 nova_compute[227762]: 2026-01-23 10:40:43.165 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:43 np0005593234 nova_compute[227762]: 2026-01-23 10:40:43.171 227766 DEBUG nova.virt.hardware [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:40:43 np0005593234 nova_compute[227762]: 2026-01-23 10:40:43.172 227766 INFO nova.compute.claims [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:40:43 np0005593234 nova_compute[227762]: 2026-01-23 10:40:43.350 227766 DEBUG oslo_concurrency.processutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:40:43 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2237339453' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:40:43 np0005593234 nova_compute[227762]: 2026-01-23 10:40:43.811 227766 DEBUG oslo_concurrency.processutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:43 np0005593234 nova_compute[227762]: 2026-01-23 10:40:43.819 227766 DEBUG nova.compute.provider_tree [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:40:43 np0005593234 nova_compute[227762]: 2026-01-23 10:40:43.959 227766 DEBUG nova.scheduler.client.report [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:40:43 np0005593234 nova_compute[227762]: 2026-01-23 10:40:43.997 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.832s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:43 np0005593234 nova_compute[227762]: 2026-01-23 10:40:43.997 227766 DEBUG nova.compute.manager [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.079 227766 DEBUG nova.compute.manager [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.080 227766 DEBUG nova.network.neutron [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.114 227766 INFO nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.141 227766 DEBUG nova.compute.manager [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.200 227766 INFO nova.virt.block_device [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Booting with volume 98d92b88-169d-4578-860b-b4af33fc6e51 at /dev/vda#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.337 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.379 227766 DEBUG nova.policy [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eb70c3aee8b64273a1930c0c2c231aff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd27c5465284b48a5818ef931d6251c43', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.390 227766 DEBUG os_brick.utils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.392 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.405 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.406 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[d58733af-a8de-437d-836b-0dfc5576c503]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.407 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.417 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.417 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[7ce35f98-efaf-44e1-bc9f-ddc3a1bf261b]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.418 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.421 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.428 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.429 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9c2805-9bae-4546-ba8c-e47f5b2857ad]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.430 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[4da66960-838a-4bc0-8eec-99e0a66a4032]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.431 227766 DEBUG oslo_concurrency.processutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.458 227766 DEBUG oslo_concurrency.processutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "nvme version" returned: 0 in 0.027s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.460 227766 DEBUG os_brick.initiator.connectors.lightos [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.461 227766 DEBUG os_brick.initiator.connectors.lightos [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.461 227766 DEBUG os_brick.initiator.connectors.lightos [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.461 227766 DEBUG os_brick.utils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] <== get_connector_properties: return (70ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:40:44 np0005593234 nova_compute[227762]: 2026-01-23 10:40:44.461 227766 DEBUG nova.virt.block_device [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Updating existing volume attachment record: 7fcd26fc-1269-44d4-9f00-2b865b7fa8ba _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:40:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:44.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:45.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:45 np0005593234 nova_compute[227762]: 2026-01-23 10:40:45.435 227766 DEBUG nova.network.neutron [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Successfully created port: e5c4f897-26f7-4661-9447-da88c5a96ccd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:40:45 np0005593234 nova_compute[227762]: 2026-01-23 10:40:45.864 227766 DEBUG nova.compute.manager [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:40:45 np0005593234 nova_compute[227762]: 2026-01-23 10:40:45.865 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:40:45 np0005593234 nova_compute[227762]: 2026-01-23 10:40:45.866 227766 INFO nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Creating image(s)#033[00m
Jan 23 05:40:45 np0005593234 nova_compute[227762]: 2026-01-23 10:40:45.866 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 05:40:45 np0005593234 nova_compute[227762]: 2026-01-23 10:40:45.866 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Ensure instance console log exists: /var/lib/nova/instances/6b2d76e8-0aab-4760-b64e-0097520255ce/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:40:45 np0005593234 nova_compute[227762]: 2026-01-23 10:40:45.867 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:45 np0005593234 nova_compute[227762]: 2026-01-23 10:40:45.867 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:45 np0005593234 nova_compute[227762]: 2026-01-23 10:40:45.867 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:46.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:46 np0005593234 nova_compute[227762]: 2026-01-23 10:40:46.996 227766 DEBUG nova.network.neutron [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Successfully updated port: e5c4f897-26f7-4661-9447-da88c5a96ccd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:40:47 np0005593234 nova_compute[227762]: 2026-01-23 10:40:47.011 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "refresh_cache-6b2d76e8-0aab-4760-b64e-0097520255ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:40:47 np0005593234 nova_compute[227762]: 2026-01-23 10:40:47.011 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquired lock "refresh_cache-6b2d76e8-0aab-4760-b64e-0097520255ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:40:47 np0005593234 nova_compute[227762]: 2026-01-23 10:40:47.011 227766 DEBUG nova.network.neutron [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:40:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:47.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:47 np0005593234 nova_compute[227762]: 2026-01-23 10:40:47.140 227766 DEBUG nova.compute.manager [req-cfe3ecdf-d518-41ca-97a7-7e68d3449c80 req-425536e5-2f99-4635-90d9-27072ca0d329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Received event network-changed-e5c4f897-26f7-4661-9447-da88c5a96ccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:40:47 np0005593234 nova_compute[227762]: 2026-01-23 10:40:47.140 227766 DEBUG nova.compute.manager [req-cfe3ecdf-d518-41ca-97a7-7e68d3449c80 req-425536e5-2f99-4635-90d9-27072ca0d329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Refreshing instance network info cache due to event network-changed-e5c4f897-26f7-4661-9447-da88c5a96ccd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:40:47 np0005593234 nova_compute[227762]: 2026-01-23 10:40:47.140 227766 DEBUG oslo_concurrency.lockutils [req-cfe3ecdf-d518-41ca-97a7-7e68d3449c80 req-425536e5-2f99-4635-90d9-27072ca0d329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-6b2d76e8-0aab-4760-b64e-0097520255ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:40:47 np0005593234 nova_compute[227762]: 2026-01-23 10:40:47.216 227766 DEBUG nova.network.neutron [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.489 227766 DEBUG nova.network.neutron [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Updating instance_info_cache with network_info: [{"id": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "address": "fa:16:3e:6d:0f:da", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5c4f897-26", "ovs_interfaceid": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.513 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Releasing lock "refresh_cache-6b2d76e8-0aab-4760-b64e-0097520255ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.514 227766 DEBUG nova.compute.manager [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Instance network_info: |[{"id": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "address": "fa:16:3e:6d:0f:da", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5c4f897-26", "ovs_interfaceid": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.514 227766 DEBUG oslo_concurrency.lockutils [req-cfe3ecdf-d518-41ca-97a7-7e68d3449c80 req-425536e5-2f99-4635-90d9-27072ca0d329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-6b2d76e8-0aab-4760-b64e-0097520255ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.514 227766 DEBUG nova.network.neutron [req-cfe3ecdf-d518-41ca-97a7-7e68d3449c80 req-425536e5-2f99-4635-90d9-27072ca0d329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Refreshing network info cache for port e5c4f897-26f7-4661-9447-da88c5a96ccd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.517 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Start _get_guest_xml network_info=[{"id": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "address": "fa:16:3e:6d:0f:da", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5c4f897-26", "ovs_interfaceid": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'boot_index': 0, 'mount_device': '/dev/vda', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-98d92b88-169d-4578-860b-b4af33fc6e51', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '98d92b88-169d-4578-860b-b4af33fc6e51', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': True, 'cacheable': False}, 'status': 'reserved', 'instance': '6b2d76e8-0aab-4760-b64e-0097520255ce', 'attached_at': '', 'detached_at': '', 'volume_id': '98d92b88-169d-4578-860b-b4af33fc6e51', 'serial': '98d92b88-169d-4578-860b-b4af33fc6e51'}, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '7fcd26fc-1269-44d4-9f00-2b865b7fa8ba', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.522 227766 WARNING nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.527 227766 DEBUG nova.virt.libvirt.host [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.528 227766 DEBUG nova.virt.libvirt.host [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:40:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:48.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.536 227766 DEBUG nova.virt.libvirt.host [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.537 227766 DEBUG nova.virt.libvirt.host [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.538 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.539 227766 DEBUG nova.virt.hardware [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.539 227766 DEBUG nova.virt.hardware [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.539 227766 DEBUG nova.virt.hardware [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.539 227766 DEBUG nova.virt.hardware [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.540 227766 DEBUG nova.virt.hardware [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.540 227766 DEBUG nova.virt.hardware [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.540 227766 DEBUG nova.virt.hardware [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.540 227766 DEBUG nova.virt.hardware [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.540 227766 DEBUG nova.virt.hardware [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.541 227766 DEBUG nova.virt.hardware [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.541 227766 DEBUG nova.virt.hardware [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.569 227766 DEBUG nova.storage.rbd_utils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image 6b2d76e8-0aab-4760-b64e-0097520255ce_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:40:48 np0005593234 nova_compute[227762]: 2026-01-23 10:40:48.573 227766 DEBUG oslo_concurrency.processutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3832209500' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.044 227766 DEBUG oslo_concurrency.processutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:49.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.173 227766 DEBUG os_brick.encryptors [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Using volume encryption metadata '{'encryption_key_id': 'd0a00c7e-e16a-4a37-9cd5-b0d2539f706b', 'control_location': 'front-end', 'cipher': 'aes-xts-plain64', 'key_size': 256, 'provider': 'luks'}' for connection: {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-98d92b88-169d-4578-860b-b4af33fc6e51', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '98d92b88-169d-4578-860b-b4af33fc6e51', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': True, 'cacheable': False}, 'status': 'reserved', 'instance': '6b2d76e8-0aab-4760-b64e-0097520255ce', 'attached_at': '', 'detached_at': '', 'volume_id': '98d92b88-169d-4578-860b-b4af33fc6e51', 'serial': '} get_encryption_metadata /usr/lib/python3.9/site-packages/os_brick/encryptors/__init__.py:135#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.176 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Creating Client object Client /usr/lib/python3.9/site-packages/barbicanclient/client.py:163#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.339 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #169. Immutable memtables: 0.
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.360504) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 107] Flushing memtable with next log file: 169
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164849360657, "job": 107, "event": "flush_started", "num_memtables": 1, "num_entries": 993, "num_deletes": 253, "total_data_size": 2029883, "memory_usage": 2052872, "flush_reason": "Manual Compaction"}
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 107] Level-0 flush table #170: started
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.423 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.425 227766 DEBUG barbicanclient.v1.secrets [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Getting secret - Secret href: https://barbican-internal.openstack.svc:9311/secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b get /usr/lib/python3.9/site-packages/barbicanclient/v1/secrets.py:514#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.426 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.454 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.454 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.480 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.481 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.506 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.507 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.528 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.528 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.564 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.565 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.590 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.591 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164849605160, "cf_name": "default", "job": 107, "event": "table_file_creation", "file_number": 170, "file_size": 853521, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 81304, "largest_seqno": 82292, "table_properties": {"data_size": 849839, "index_size": 1397, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1221, "raw_key_size": 10230, "raw_average_key_size": 21, "raw_value_size": 841737, "raw_average_value_size": 1735, "num_data_blocks": 62, "num_entries": 485, "num_filter_entries": 485, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164777, "oldest_key_time": 1769164777, "file_creation_time": 1769164849, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 170, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 107] Flush lasted 244722 microseconds, and 3623 cpu microseconds.
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.613 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.613 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.605242) [db/flush_job.cc:967] [default] [JOB 107] Level-0 flush table #170: 853521 bytes OK
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.605263) [db/memtable_list.cc:519] [default] Level-0 commit table #170 started
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.616974) [db/memtable_list.cc:722] [default] Level-0 commit table #170: memtable #1 done
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.617013) EVENT_LOG_v1 {"time_micros": 1769164849617006, "job": 107, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.617029) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 107] Try to delete WAL files size 2024887, prev total WAL file size 2087683, number of live WAL files 2.
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000166.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.617803) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740032373538' seq:72057594037927935, type:22 .. '6D6772737461740033303130' seq:0, type:0; will stop at (end)
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 108] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 107 Base level 0, inputs: [170(833KB)], [168(12MB)]
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164849617901, "job": 108, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [170], "files_L6": [168], "score": -1, "input_data_size": 14313205, "oldest_snapshot_seqno": -1}
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.639 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.639 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.669 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.669 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 108] Generated table #171: 9965 keys, 10926877 bytes, temperature: kUnknown
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164849708097, "cf_name": "default", "job": 108, "event": "table_file_creation", "file_number": 171, "file_size": 10926877, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10865618, "index_size": 35226, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24965, "raw_key_size": 263524, "raw_average_key_size": 26, "raw_value_size": 10694155, "raw_average_value_size": 1073, "num_data_blocks": 1333, "num_entries": 9965, "num_filter_entries": 9965, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769164849, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 171, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.707 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.708 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.708743) [db/compaction/compaction_job.cc:1663] [default] [JOB 108] Compacted 1@0 + 1@6 files to L6 => 10926877 bytes
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.711055) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 158.3 rd, 120.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 12.8 +0.0 blob) out(10.4 +0.0 blob), read-write-amplify(29.6) write-amplify(12.8) OK, records in: 10458, records dropped: 493 output_compression: NoCompression
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.711106) EVENT_LOG_v1 {"time_micros": 1769164849711086, "job": 108, "event": "compaction_finished", "compaction_time_micros": 90438, "compaction_time_cpu_micros": 31459, "output_level": 6, "num_output_files": 1, "total_output_size": 10926877, "num_input_records": 10458, "num_output_records": 9965, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000170.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164849711808, "job": 108, "event": "table_file_deletion", "file_number": 170}
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000168.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164849714835, "job": 108, "event": "table_file_deletion", "file_number": 168}
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.617726) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.714912) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.714918) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.714920) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.714954) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:40:49.714955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.734 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.735 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.770 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.771 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.802 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.803 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.839 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.839 227766 INFO barbicanclient.base [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Calculated Secrets uuid ref: secrets/d0a00c7e-e16a-4a37-9cd5-b0d2539f706b#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.874 227766 DEBUG barbicanclient.client [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.875 227766 DEBUG nova.virt.libvirt.host [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Secret XML: <secret ephemeral="no" private="no">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  <usage type="volume">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <volume>98d92b88-169d-4578-860b-b4af33fc6e51</volume>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  </usage>
Jan 23 05:40:49 np0005593234 nova_compute[227762]: </secret>
Jan 23 05:40:49 np0005593234 nova_compute[227762]: create_secret /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1131#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.913 227766 DEBUG nova.virt.libvirt.vif [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:40:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-581451877',display_name='tempest-TestVolumeBootPattern-server-581451877',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-581451877',id=199,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-glo4qv0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:40:44Z,user_data=None,user_id=
'eb70c3aee8b64273a1930c0c2c231aff',uuid=6b2d76e8-0aab-4760-b64e-0097520255ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "address": "fa:16:3e:6d:0f:da", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5c4f897-26", "ovs_interfaceid": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.914 227766 DEBUG nova.network.os_vif_util [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "address": "fa:16:3e:6d:0f:da", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5c4f897-26", "ovs_interfaceid": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.915 227766 DEBUG nova.network.os_vif_util [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:0f:da,bridge_name='br-int',has_traffic_filtering=True,id=e5c4f897-26f7-4661-9447-da88c5a96ccd,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5c4f897-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.917 227766 DEBUG nova.objects.instance [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lazy-loading 'pci_devices' on Instance uuid 6b2d76e8-0aab-4760-b64e-0097520255ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.936 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  <uuid>6b2d76e8-0aab-4760-b64e-0097520255ce</uuid>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  <name>instance-000000c7</name>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestVolumeBootPattern-server-581451877</nova:name>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:40:48</nova:creationTime>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <nova:user uuid="eb70c3aee8b64273a1930c0c2c231aff">tempest-TestVolumeBootPattern-2139361132-project-member</nova:user>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <nova:project uuid="d27c5465284b48a5818ef931d6251c43">tempest-TestVolumeBootPattern-2139361132</nova:project>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <nova:port uuid="e5c4f897-26f7-4661-9447-da88c5a96ccd">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <entry name="serial">6b2d76e8-0aab-4760-b64e-0097520255ce</entry>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <entry name="uuid">6b2d76e8-0aab-4760-b64e-0097520255ce</entry>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/6b2d76e8-0aab-4760-b64e-0097520255ce_disk.config">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="volumes/volume-98d92b88-169d-4578-860b-b4af33fc6e51">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <serial>98d92b88-169d-4578-860b-b4af33fc6e51</serial>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <encryption format="luks">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:        <secret type="passphrase" uuid="98f20044-e019-43e1-aab0-3a3b9ac0ccd5"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      </encryption>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:6d:0f:da"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <target dev="tape5c4f897-26"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/6b2d76e8-0aab-4760-b64e-0097520255ce/console.log" append="off"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:40:49 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:40:49 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:40:49 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:40:49 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.938 227766 DEBUG nova.compute.manager [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Preparing to wait for external event network-vif-plugged-e5c4f897-26f7-4661-9447-da88c5a96ccd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.938 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.938 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.939 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.939 227766 DEBUG nova.virt.libvirt.vif [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:40:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-581451877',display_name='tempest-TestVolumeBootPattern-server-581451877',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-581451877',id=199,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-glo4qv0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:40:44Z,user_data=None,user_id='eb70c3aee8b64273a1930c0c2c231aff',uuid=6b2d76e8-0aab-4760-b64e-0097520255ce,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "address": "fa:16:3e:6d:0f:da", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5c4f897-26", "ovs_interfaceid": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.939 227766 DEBUG nova.network.os_vif_util [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "address": "fa:16:3e:6d:0f:da", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5c4f897-26", "ovs_interfaceid": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.940 227766 DEBUG nova.network.os_vif_util [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:0f:da,bridge_name='br-int',has_traffic_filtering=True,id=e5c4f897-26f7-4661-9447-da88c5a96ccd,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5c4f897-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.940 227766 DEBUG os_vif [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:0f:da,bridge_name='br-int',has_traffic_filtering=True,id=e5c4f897-26f7-4661-9447-da88c5a96ccd,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5c4f897-26') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.941 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.941 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.942 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.953 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.954 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape5c4f897-26, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.954 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape5c4f897-26, col_values=(('external_ids', {'iface-id': 'e5c4f897-26f7-4661-9447-da88c5a96ccd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6d:0f:da', 'vm-uuid': '6b2d76e8-0aab-4760-b64e-0097520255ce'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.956 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:49 np0005593234 NetworkManager[48942]: <info>  [1769164849.9571] manager: (tape5c4f897-26): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/394)
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.958 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.963 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:49 np0005593234 nova_compute[227762]: 2026-01-23 10:40:49.965 227766 INFO os_vif [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:0f:da,bridge_name='br-int',has_traffic_filtering=True,id=e5c4f897-26f7-4661-9447-da88c5a96ccd,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5c4f897-26')#033[00m
Jan 23 05:40:50 np0005593234 nova_compute[227762]: 2026-01-23 10:40:50.020 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:40:50 np0005593234 nova_compute[227762]: 2026-01-23 10:40:50.021 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:40:50 np0005593234 nova_compute[227762]: 2026-01-23 10:40:50.021 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No VIF found with MAC fa:16:3e:6d:0f:da, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:40:50 np0005593234 nova_compute[227762]: 2026-01-23 10:40:50.021 227766 INFO nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Using config drive#033[00m
Jan 23 05:40:50 np0005593234 nova_compute[227762]: 2026-01-23 10:40:50.046 227766 DEBUG nova.storage.rbd_utils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image 6b2d76e8-0aab-4760-b64e-0097520255ce_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:40:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:40:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:40:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:50.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:40:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:51.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:40:51 np0005593234 nova_compute[227762]: 2026-01-23 10:40:51.912 227766 INFO nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Creating config drive at /var/lib/nova/instances/6b2d76e8-0aab-4760-b64e-0097520255ce/disk.config#033[00m
Jan 23 05:40:51 np0005593234 nova_compute[227762]: 2026-01-23 10:40:51.918 227766 DEBUG oslo_concurrency.processutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/6b2d76e8-0aab-4760-b64e-0097520255ce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6q0pekyu execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:52 np0005593234 nova_compute[227762]: 2026-01-23 10:40:52.055 227766 DEBUG oslo_concurrency.processutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/6b2d76e8-0aab-4760-b64e-0097520255ce/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6q0pekyu" returned: 0 in 0.137s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:52 np0005593234 nova_compute[227762]: 2026-01-23 10:40:52.081 227766 DEBUG nova.storage.rbd_utils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image 6b2d76e8-0aab-4760-b64e-0097520255ce_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:40:52 np0005593234 nova_compute[227762]: 2026-01-23 10:40:52.084 227766 DEBUG oslo_concurrency.processutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/6b2d76e8-0aab-4760-b64e-0097520255ce/disk.config 6b2d76e8-0aab-4760-b64e-0097520255ce_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:40:52 np0005593234 nova_compute[227762]: 2026-01-23 10:40:52.259 227766 DEBUG oslo_concurrency.processutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/6b2d76e8-0aab-4760-b64e-0097520255ce/disk.config 6b2d76e8-0aab-4760-b64e-0097520255ce_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.175s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:40:52 np0005593234 nova_compute[227762]: 2026-01-23 10:40:52.260 227766 INFO nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Deleting local config drive /var/lib/nova/instances/6b2d76e8-0aab-4760-b64e-0097520255ce/disk.config because it was imported into RBD.#033[00m
Jan 23 05:40:52 np0005593234 NetworkManager[48942]: <info>  [1769164852.3122] manager: (tape5c4f897-26): new Tun device (/org/freedesktop/NetworkManager/Devices/395)
Jan 23 05:40:52 np0005593234 kernel: tape5c4f897-26: entered promiscuous mode
Jan 23 05:40:52 np0005593234 nova_compute[227762]: 2026-01-23 10:40:52.314 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:40:52Z|00839|binding|INFO|Claiming lport e5c4f897-26f7-4661-9447-da88c5a96ccd for this chassis.
Jan 23 05:40:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:40:52Z|00840|binding|INFO|e5c4f897-26f7-4661-9447-da88c5a96ccd: Claiming fa:16:3e:6d:0f:da 10.100.0.4
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.323 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:0f:da 10.100.0.4'], port_security=['fa:16:3e:6d:0f:da 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '6b2d76e8-0aab-4760-b64e-0097520255ce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-72854481-c2f9-4651-8ba1-fe321a8a5546', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd27c5465284b48a5818ef931d6251c43', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3a20e786-de0e-4392-a9c2-94b60112f57a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95d9ae35-aabe-45f7-a103-f14858b94e31, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=e5c4f897-26f7-4661-9447-da88c5a96ccd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.325 144381 INFO neutron.agent.ovn.metadata.agent [-] Port e5c4f897-26f7-4661-9447-da88c5a96ccd in datapath 72854481-c2f9-4651-8ba1-fe321a8a5546 bound to our chassis#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.328 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 72854481-c2f9-4651-8ba1-fe321a8a5546#033[00m
Jan 23 05:40:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:40:52Z|00841|binding|INFO|Setting lport e5c4f897-26f7-4661-9447-da88c5a96ccd ovn-installed in OVS
Jan 23 05:40:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:40:52Z|00842|binding|INFO|Setting lport e5c4f897-26f7-4661-9447-da88c5a96ccd up in Southbound
Jan 23 05:40:52 np0005593234 nova_compute[227762]: 2026-01-23 10:40:52.331 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:52 np0005593234 nova_compute[227762]: 2026-01-23 10:40:52.335 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.340 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[332fdcca-1ff9-4a2d-b2bc-bfe7421371e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.341 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap72854481-c1 in ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.343 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap72854481-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.343 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[557f7ace-ee63-4e88-a8e0-19d6186a2d0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.344 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cd06f0fd-5595-4301-8981-24b815d10c20]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 systemd-udevd[320933]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.357 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[02b5704e-3c45-46ce-b08a-c79373977253]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 systemd-machined[195626]: New machine qemu-95-instance-000000c7.
Jan 23 05:40:52 np0005593234 NetworkManager[48942]: <info>  [1769164852.3668] device (tape5c4f897-26): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:40:52 np0005593234 NetworkManager[48942]: <info>  [1769164852.3676] device (tape5c4f897-26): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:40:52 np0005593234 systemd[1]: Started Virtual Machine qemu-95-instance-000000c7.
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.384 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fbb36862-15c5-4881-bf69-79e3d8107a22]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.427 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[79606740-34ea-40f2-826a-89c8e1c078fd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 systemd-udevd[320936]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.434 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[51371058-3587-4eaf-a54c-34009fa95b6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 NetworkManager[48942]: <info>  [1769164852.4353] manager: (tap72854481-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/396)
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.468 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[538a3d38-a319-4383-a33e-694c77fd6e1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.472 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[1649ddb9-8198-4b5f-a7ba-cf58ecb149b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 NetworkManager[48942]: <info>  [1769164852.4965] device (tap72854481-c0): carrier: link connected
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.502 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a5ee5424-3585-465a-9127-cb3e676b9ffb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.521 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[94fa86b5-e2c7-4775-9cff-8ce5a0dbb22e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap72854481-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:b6:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 256], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 874745, 'reachable_time': 44177, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 320964, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.537 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a7509cea-cf9b-4393-a1f0-46eb892659dc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe81:b660'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 874745, 'tstamp': 874745}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 320965, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:52.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.556 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ca1dd120-37ab-487d-b813-32003f6bc9a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap72854481-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:b6:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 256], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 874745, 'reachable_time': 44177, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 320966, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.586 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3ecb233e-0de9-4a2f-acbe-3748df706b95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.647 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1442e439-a43f-4006-968c-ada7bf420a82]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.648 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72854481-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.648 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.649 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap72854481-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:40:52 np0005593234 nova_compute[227762]: 2026-01-23 10:40:52.650 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:52 np0005593234 NetworkManager[48942]: <info>  [1769164852.6512] manager: (tap72854481-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/397)
Jan 23 05:40:52 np0005593234 kernel: tap72854481-c0: entered promiscuous mode
Jan 23 05:40:52 np0005593234 nova_compute[227762]: 2026-01-23 10:40:52.652 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.653 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap72854481-c0, col_values=(('external_ids', {'iface-id': '6b08537e-a263-4eec-b987-1e42878f483a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:40:52 np0005593234 nova_compute[227762]: 2026-01-23 10:40:52.654 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:52 np0005593234 ovn_controller[134547]: 2026-01-23T10:40:52Z|00843|binding|INFO|Releasing lport 6b08537e-a263-4eec-b987-1e42878f483a from this chassis (sb_readonly=0)
Jan 23 05:40:52 np0005593234 nova_compute[227762]: 2026-01-23 10:40:52.656 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.657 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/72854481-c2f9-4651-8ba1-fe321a8a5546.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/72854481-c2f9-4651-8ba1-fe321a8a5546.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.657 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f2e28a51-10e6-489e-86ef-6c04f889b162]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.658 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-72854481-c2f9-4651-8ba1-fe321a8a5546
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/72854481-c2f9-4651-8ba1-fe321a8a5546.pid.haproxy
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 72854481-c2f9-4651-8ba1-fe321a8a5546
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:40:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:40:52.659 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'env', 'PROCESS_TAG=haproxy-72854481-c2f9-4651-8ba1-fe321a8a5546', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/72854481-c2f9-4651-8ba1-fe321a8a5546.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:40:52 np0005593234 nova_compute[227762]: 2026-01-23 10:40:52.671 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:53 np0005593234 podman[321034]: 2026-01-23 10:40:53.087305074 +0000 UTC m=+0.061966958 container create e52c232c896937e4ae2980d306f34267931f4ff779c5200b2e7f51375c6d654f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:40:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:53.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:53 np0005593234 systemd[1]: Started libpod-conmon-e52c232c896937e4ae2980d306f34267931f4ff779c5200b2e7f51375c6d654f.scope.
Jan 23 05:40:53 np0005593234 podman[321034]: 2026-01-23 10:40:53.053102575 +0000 UTC m=+0.027764479 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:40:53 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:40:53 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/db21e665ca16f0862de081301c6eef29cb334cee092026f5d743e21b28dd6baf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:40:53 np0005593234 podman[321034]: 2026-01-23 10:40:53.190163527 +0000 UTC m=+0.164825441 container init e52c232c896937e4ae2980d306f34267931f4ff779c5200b2e7f51375c6d654f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:40:53 np0005593234 podman[321034]: 2026-01-23 10:40:53.198063354 +0000 UTC m=+0.172725238 container start e52c232c896937e4ae2980d306f34267931f4ff779c5200b2e7f51375c6d654f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 05:40:53 np0005593234 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[321050]: [NOTICE]   (321054) : New worker (321056) forked
Jan 23 05:40:53 np0005593234 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[321050]: [NOTICE]   (321054) : Loading success.
Jan 23 05:40:54 np0005593234 nova_compute[227762]: 2026-01-23 10:40:54.341 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:54.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e381 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:54 np0005593234 nova_compute[227762]: 2026-01-23 10:40:54.957 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:40:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:40:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:55.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:40:55 np0005593234 nova_compute[227762]: 2026-01-23 10:40:55.258 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164855.2580774, 6b2d76e8-0aab-4760-b64e-0097520255ce => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:40:55 np0005593234 nova_compute[227762]: 2026-01-23 10:40:55.259 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] VM Started (Lifecycle Event)#033[00m
Jan 23 05:40:55 np0005593234 nova_compute[227762]: 2026-01-23 10:40:55.283 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:40:55 np0005593234 nova_compute[227762]: 2026-01-23 10:40:55.288 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164855.2589881, 6b2d76e8-0aab-4760-b64e-0097520255ce => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:40:55 np0005593234 nova_compute[227762]: 2026-01-23 10:40:55.288 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:40:55 np0005593234 nova_compute[227762]: 2026-01-23 10:40:55.356 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:40:55 np0005593234 nova_compute[227762]: 2026-01-23 10:40:55.360 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:40:55 np0005593234 nova_compute[227762]: 2026-01-23 10:40:55.388 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:40:55 np0005593234 nova_compute[227762]: 2026-01-23 10:40:55.970 227766 DEBUG nova.network.neutron [req-cfe3ecdf-d518-41ca-97a7-7e68d3449c80 req-425536e5-2f99-4635-90d9-27072ca0d329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Updated VIF entry in instance network info cache for port e5c4f897-26f7-4661-9447-da88c5a96ccd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:40:55 np0005593234 nova_compute[227762]: 2026-01-23 10:40:55.970 227766 DEBUG nova.network.neutron [req-cfe3ecdf-d518-41ca-97a7-7e68d3449c80 req-425536e5-2f99-4635-90d9-27072ca0d329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Updating instance_info_cache with network_info: [{"id": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "address": "fa:16:3e:6d:0f:da", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5c4f897-26", "ovs_interfaceid": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:40:55 np0005593234 nova_compute[227762]: 2026-01-23 10:40:55.993 227766 DEBUG oslo_concurrency.lockutils [req-cfe3ecdf-d518-41ca-97a7-7e68d3449c80 req-425536e5-2f99-4635-90d9-27072ca0d329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-6b2d76e8-0aab-4760-b64e-0097520255ce" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.222 227766 DEBUG nova.compute.manager [req-c6e8deac-61f4-4938-be95-8574754a9cee req-6a6dc636-3568-47fd-992c-09e57f0b5e58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Received event network-vif-plugged-e5c4f897-26f7-4661-9447-da88c5a96ccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.400 227766 DEBUG oslo_concurrency.lockutils [req-c6e8deac-61f4-4938-be95-8574754a9cee req-6a6dc636-3568-47fd-992c-09e57f0b5e58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.401 227766 DEBUG oslo_concurrency.lockutils [req-c6e8deac-61f4-4938-be95-8574754a9cee req-6a6dc636-3568-47fd-992c-09e57f0b5e58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.401 227766 DEBUG oslo_concurrency.lockutils [req-c6e8deac-61f4-4938-be95-8574754a9cee req-6a6dc636-3568-47fd-992c-09e57f0b5e58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.401 227766 DEBUG nova.compute.manager [req-c6e8deac-61f4-4938-be95-8574754a9cee req-6a6dc636-3568-47fd-992c-09e57f0b5e58 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Processing event network-vif-plugged-e5c4f897-26f7-4661-9447-da88c5a96ccd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.402 227766 DEBUG nova.compute.manager [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.405 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164856.4049923, 6b2d76e8-0aab-4760-b64e-0097520255ce => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.405 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] VM Resumed (Lifecycle Event)
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.407 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.410 227766 INFO nova.virt.libvirt.driver [-] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Instance spawned successfully.
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.411 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.522 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.526 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.526 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.526 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.527 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.527 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.527 227766 DEBUG nova.virt.libvirt.driver [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.533 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 05:40:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:40:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:56.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.681 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.749 227766 INFO nova.compute.manager [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Took 10.89 seconds to spawn the instance on the hypervisor.
Jan 23 05:40:56 np0005593234 nova_compute[227762]: 2026-01-23 10:40:56.750 227766 DEBUG nova.compute.manager [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:40:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:57.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:57 np0005593234 nova_compute[227762]: 2026-01-23 10:40:57.114 227766 INFO nova.compute.manager [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Took 13.98 seconds to build instance.
Jan 23 05:40:57 np0005593234 nova_compute[227762]: 2026-01-23 10:40:57.408 227766 DEBUG oslo_concurrency.lockutils [None req-72710fb9-9c09-4819-92d0-8680fc487732 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.361s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:40:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:40:57 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4194580106' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:40:58 np0005593234 nova_compute[227762]: 2026-01-23 10:40:58.453 227766 DEBUG nova.compute.manager [req-7c14998c-dbd7-41fc-84e7-bd898f0289da req-a043d431-965f-4758-bf52-d1846267d9cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Received event network-vif-plugged-e5c4f897-26f7-4661-9447-da88c5a96ccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:40:58 np0005593234 nova_compute[227762]: 2026-01-23 10:40:58.454 227766 DEBUG oslo_concurrency.lockutils [req-7c14998c-dbd7-41fc-84e7-bd898f0289da req-a043d431-965f-4758-bf52-d1846267d9cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:40:58 np0005593234 nova_compute[227762]: 2026-01-23 10:40:58.454 227766 DEBUG oslo_concurrency.lockutils [req-7c14998c-dbd7-41fc-84e7-bd898f0289da req-a043d431-965f-4758-bf52-d1846267d9cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:40:58 np0005593234 nova_compute[227762]: 2026-01-23 10:40:58.454 227766 DEBUG oslo_concurrency.lockutils [req-7c14998c-dbd7-41fc-84e7-bd898f0289da req-a043d431-965f-4758-bf52-d1846267d9cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:40:58 np0005593234 nova_compute[227762]: 2026-01-23 10:40:58.455 227766 DEBUG nova.compute.manager [req-7c14998c-dbd7-41fc-84e7-bd898f0289da req-a043d431-965f-4758-bf52-d1846267d9cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] No waiting events found dispatching network-vif-plugged-e5c4f897-26f7-4661-9447-da88c5a96ccd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:40:58 np0005593234 nova_compute[227762]: 2026-01-23 10:40:58.455 227766 WARNING nova.compute.manager [req-7c14998c-dbd7-41fc-84e7-bd898f0289da req-a043d431-965f-4758-bf52-d1846267d9cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Received unexpected event network-vif-plugged-e5c4f897-26f7-4661-9447-da88c5a96ccd for instance with vm_state active and task_state None.
Jan 23 05:40:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:58.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:40:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:40:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:59.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:40:59 np0005593234 nova_compute[227762]: 2026-01-23 10:40:59.343 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:40:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e382 e382: 3 total, 3 up, 3 in
Jan 23 05:40:59 np0005593234 podman[321074]: 2026-01-23 10:40:59.792143668 +0000 UTC m=+0.083748167 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:40:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:40:59 np0005593234 nova_compute[227762]: 2026-01-23 10:40:59.960 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:00.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.672 227766 DEBUG oslo_concurrency.lockutils [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "6b2d76e8-0aab-4760-b64e-0097520255ce" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.672 227766 DEBUG oslo_concurrency.lockutils [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.672 227766 DEBUG oslo_concurrency.lockutils [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.672 227766 DEBUG oslo_concurrency.lockutils [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.672 227766 DEBUG oslo_concurrency.lockutils [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.673 227766 INFO nova.compute.manager [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Terminating instance
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.674 227766 DEBUG nova.compute.manager [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 05:41:00 np0005593234 kernel: tape5c4f897-26 (unregistering): left promiscuous mode
Jan 23 05:41:00 np0005593234 NetworkManager[48942]: <info>  [1769164860.7175] device (tape5c4f897-26): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:41:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:41:00Z|00844|binding|INFO|Releasing lport e5c4f897-26f7-4661-9447-da88c5a96ccd from this chassis (sb_readonly=0)
Jan 23 05:41:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:41:00Z|00845|binding|INFO|Setting lport e5c4f897-26f7-4661-9447-da88c5a96ccd down in Southbound
Jan 23 05:41:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:41:00Z|00846|binding|INFO|Removing iface tape5c4f897-26 ovn-installed in OVS
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.726 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.728 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:00.740 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6d:0f:da 10.100.0.4'], port_security=['fa:16:3e:6d:0f:da 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '6b2d76e8-0aab-4760-b64e-0097520255ce', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-72854481-c2f9-4651-8ba1-fe321a8a5546', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd27c5465284b48a5818ef931d6251c43', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3a20e786-de0e-4392-a9c2-94b60112f57a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95d9ae35-aabe-45f7-a103-f14858b94e31, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=e5c4f897-26f7-4661-9447-da88c5a96ccd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:41:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:00.741 144381 INFO neutron.agent.ovn.metadata.agent [-] Port e5c4f897-26f7-4661-9447-da88c5a96ccd in datapath 72854481-c2f9-4651-8ba1-fe321a8a5546 unbound from our chassis
Jan 23 05:41:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:00.743 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 72854481-c2f9-4651-8ba1-fe321a8a5546, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.743 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:00.744 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fb1f5dd6-c340-466e-9bdf-461adba30d30]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:41:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:00.745 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 namespace which is not needed anymore
Jan 23 05:41:00 np0005593234 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000c7.scope: Deactivated successfully.
Jan 23 05:41:00 np0005593234 systemd[1]: machine-qemu\x2d95\x2dinstance\x2d000000c7.scope: Consumed 3.309s CPU time.
Jan 23 05:41:00 np0005593234 systemd-machined[195626]: Machine qemu-95-instance-000000c7 terminated.
Jan 23 05:41:00 np0005593234 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[321050]: [NOTICE]   (321054) : haproxy version is 2.8.14-c23fe91
Jan 23 05:41:00 np0005593234 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[321050]: [NOTICE]   (321054) : path to executable is /usr/sbin/haproxy
Jan 23 05:41:00 np0005593234 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[321050]: [WARNING]  (321054) : Exiting Master process...
Jan 23 05:41:00 np0005593234 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[321050]: [ALERT]    (321054) : Current worker (321056) exited with code 143 (Terminated)
Jan 23 05:41:00 np0005593234 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[321050]: [WARNING]  (321054) : All workers exited. Exiting... (0)
Jan 23 05:41:00 np0005593234 systemd[1]: libpod-e52c232c896937e4ae2980d306f34267931f4ff779c5200b2e7f51375c6d654f.scope: Deactivated successfully.
Jan 23 05:41:00 np0005593234 podman[321120]: 2026-01-23 10:41:00.875801336 +0000 UTC m=+0.044257274 container died e52c232c896937e4ae2980d306f34267931f4ff779c5200b2e7f51375c6d654f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.895 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.902 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:00 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e52c232c896937e4ae2980d306f34267931f4ff779c5200b2e7f51375c6d654f-userdata-shm.mount: Deactivated successfully.
Jan 23 05:41:00 np0005593234 systemd[1]: var-lib-containers-storage-overlay-db21e665ca16f0862de081301c6eef29cb334cee092026f5d743e21b28dd6baf-merged.mount: Deactivated successfully.
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.917 227766 INFO nova.virt.libvirt.driver [-] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Instance destroyed successfully.
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.918 227766 DEBUG nova.objects.instance [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lazy-loading 'resources' on Instance uuid 6b2d76e8-0aab-4760-b64e-0097520255ce obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:41:00 np0005593234 podman[321120]: 2026-01-23 10:41:00.922876147 +0000 UTC m=+0.091332085 container cleanup e52c232c896937e4ae2980d306f34267931f4ff779c5200b2e7f51375c6d654f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:41:00 np0005593234 systemd[1]: libpod-conmon-e52c232c896937e4ae2980d306f34267931f4ff779c5200b2e7f51375c6d654f.scope: Deactivated successfully.
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.942 227766 DEBUG nova.virt.libvirt.vif [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:40:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-581451877',display_name='tempest-TestVolumeBootPattern-server-581451877',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-581451877',id=199,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:40:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-glo4qv0m',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:40:56Z,user_data=None,user_id='eb70c3aee8b64273a1930c0c2c231aff',uuid=6b2d76e8-0aab-4760-b64e-0097520255ce,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "address": "fa:16:3e:6d:0f:da", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5c4f897-26", "ovs_interfaceid": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.943 227766 DEBUG nova.network.os_vif_util [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "address": "fa:16:3e:6d:0f:da", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape5c4f897-26", "ovs_interfaceid": "e5c4f897-26f7-4661-9447-da88c5a96ccd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.944 227766 DEBUG nova.network.os_vif_util [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6d:0f:da,bridge_name='br-int',has_traffic_filtering=True,id=e5c4f897-26f7-4661-9447-da88c5a96ccd,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5c4f897-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.945 227766 DEBUG os_vif [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:0f:da,bridge_name='br-int',has_traffic_filtering=True,id=e5c4f897-26f7-4661-9447-da88c5a96ccd,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5c4f897-26') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.946 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.946 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape5c4f897-26, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.948 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.949 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:00 np0005593234 nova_compute[227762]: 2026-01-23 10:41:00.952 227766 INFO os_vif [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6d:0f:da,bridge_name='br-int',has_traffic_filtering=True,id=e5c4f897-26f7-4661-9447-da88c5a96ccd,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape5c4f897-26')#033[00m
Jan 23 05:41:00 np0005593234 podman[321158]: 2026-01-23 10:41:00.992136421 +0000 UTC m=+0.044989277 container remove e52c232c896937e4ae2980d306f34267931f4ff779c5200b2e7f51375c6d654f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:41:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:00.999 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[80b54e95-9f0d-45fd-9380-fdb527b9ae70]: (4, ('Fri Jan 23 10:41:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 (e52c232c896937e4ae2980d306f34267931f4ff779c5200b2e7f51375c6d654f)\ne52c232c896937e4ae2980d306f34267931f4ff779c5200b2e7f51375c6d654f\nFri Jan 23 10:41:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 (e52c232c896937e4ae2980d306f34267931f4ff779c5200b2e7f51375c6d654f)\ne52c232c896937e4ae2980d306f34267931f4ff779c5200b2e7f51375c6d654f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:01.001 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3ad8bf30-16be-48a5-894b-f13ed549f68a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:01.002 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72854481-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:41:01 np0005593234 nova_compute[227762]: 2026-01-23 10:41:01.003 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:01 np0005593234 kernel: tap72854481-c0: left promiscuous mode
Jan 23 05:41:01 np0005593234 nova_compute[227762]: 2026-01-23 10:41:01.019 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:01.022 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aa1a0e29-9bd1-4bed-a114-9d4e2349c04a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:01.040 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[844a885f-f4a0-449b-b603-0f793b830456]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:01.042 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[43de8e56-b0d9-4c9e-b983-42c25706e775]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:01.064 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e418ea16-0d66-46f4-9c78-22124cdc4aeb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 874738, 'reachable_time': 29901, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 321188, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:01 np0005593234 systemd[1]: run-netns-ovnmeta\x2d72854481\x2dc2f9\x2d4651\x2d8ba1\x2dfe321a8a5546.mount: Deactivated successfully.
Jan 23 05:41:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:01.070 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:41:01 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:01.070 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[5162b16f-9508-4e12-ab4f-79dd27cf8d13]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:41:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:01.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:41:01 np0005593234 nova_compute[227762]: 2026-01-23 10:41:01.117 227766 DEBUG nova.compute.manager [req-d61e42e6-c3df-4aee-9150-f15bee450fba req-6ea602bb-5b28-4b37-a604-8f3279998afb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Received event network-vif-unplugged-e5c4f897-26f7-4661-9447-da88c5a96ccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:41:01 np0005593234 nova_compute[227762]: 2026-01-23 10:41:01.119 227766 DEBUG oslo_concurrency.lockutils [req-d61e42e6-c3df-4aee-9150-f15bee450fba req-6ea602bb-5b28-4b37-a604-8f3279998afb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:01 np0005593234 nova_compute[227762]: 2026-01-23 10:41:01.119 227766 DEBUG oslo_concurrency.lockutils [req-d61e42e6-c3df-4aee-9150-f15bee450fba req-6ea602bb-5b28-4b37-a604-8f3279998afb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:01 np0005593234 nova_compute[227762]: 2026-01-23 10:41:01.119 227766 DEBUG oslo_concurrency.lockutils [req-d61e42e6-c3df-4aee-9150-f15bee450fba req-6ea602bb-5b28-4b37-a604-8f3279998afb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:01 np0005593234 nova_compute[227762]: 2026-01-23 10:41:01.120 227766 DEBUG nova.compute.manager [req-d61e42e6-c3df-4aee-9150-f15bee450fba req-6ea602bb-5b28-4b37-a604-8f3279998afb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] No waiting events found dispatching network-vif-unplugged-e5c4f897-26f7-4661-9447-da88c5a96ccd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:41:01 np0005593234 nova_compute[227762]: 2026-01-23 10:41:01.120 227766 DEBUG nova.compute.manager [req-d61e42e6-c3df-4aee-9150-f15bee450fba req-6ea602bb-5b28-4b37-a604-8f3279998afb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Received event network-vif-unplugged-e5c4f897-26f7-4661-9447-da88c5a96ccd for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:41:01 np0005593234 nova_compute[227762]: 2026-01-23 10:41:01.203 227766 INFO nova.virt.libvirt.driver [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Deleting instance files /var/lib/nova/instances/6b2d76e8-0aab-4760-b64e-0097520255ce_del#033[00m
Jan 23 05:41:01 np0005593234 nova_compute[227762]: 2026-01-23 10:41:01.204 227766 INFO nova.virt.libvirt.driver [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Deletion of /var/lib/nova/instances/6b2d76e8-0aab-4760-b64e-0097520255ce_del complete#033[00m
Jan 23 05:41:01 np0005593234 nova_compute[227762]: 2026-01-23 10:41:01.771 227766 INFO nova.compute.manager [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Took 1.10 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:41:01 np0005593234 nova_compute[227762]: 2026-01-23 10:41:01.772 227766 DEBUG oslo.service.loopingcall [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:41:01 np0005593234 nova_compute[227762]: 2026-01-23 10:41:01.773 227766 DEBUG nova.compute.manager [-] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:41:01 np0005593234 nova_compute[227762]: 2026-01-23 10:41:01.773 227766 DEBUG nova.network.neutron [-] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:41:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:02.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:03.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:03 np0005593234 nova_compute[227762]: 2026-01-23 10:41:03.275 227766 DEBUG nova.compute.manager [req-7dab0cd5-993b-46d4-a533-9e0e1cb20bbe req-53c0e517-85c6-4830-b517-924b5b5ed632 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Received event network-vif-plugged-e5c4f897-26f7-4661-9447-da88c5a96ccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:41:03 np0005593234 nova_compute[227762]: 2026-01-23 10:41:03.276 227766 DEBUG oslo_concurrency.lockutils [req-7dab0cd5-993b-46d4-a533-9e0e1cb20bbe req-53c0e517-85c6-4830-b517-924b5b5ed632 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:03 np0005593234 nova_compute[227762]: 2026-01-23 10:41:03.276 227766 DEBUG oslo_concurrency.lockutils [req-7dab0cd5-993b-46d4-a533-9e0e1cb20bbe req-53c0e517-85c6-4830-b517-924b5b5ed632 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:03 np0005593234 nova_compute[227762]: 2026-01-23 10:41:03.276 227766 DEBUG oslo_concurrency.lockutils [req-7dab0cd5-993b-46d4-a533-9e0e1cb20bbe req-53c0e517-85c6-4830-b517-924b5b5ed632 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:03 np0005593234 nova_compute[227762]: 2026-01-23 10:41:03.276 227766 DEBUG nova.compute.manager [req-7dab0cd5-993b-46d4-a533-9e0e1cb20bbe req-53c0e517-85c6-4830-b517-924b5b5ed632 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] No waiting events found dispatching network-vif-plugged-e5c4f897-26f7-4661-9447-da88c5a96ccd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:41:03 np0005593234 nova_compute[227762]: 2026-01-23 10:41:03.277 227766 WARNING nova.compute.manager [req-7dab0cd5-993b-46d4-a533-9e0e1cb20bbe req-53c0e517-85c6-4830-b517-924b5b5ed632 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Received unexpected event network-vif-plugged-e5c4f897-26f7-4661-9447-da88c5a96ccd for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:41:04 np0005593234 nova_compute[227762]: 2026-01-23 10:41:04.345 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:41:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:04.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:41:04 np0005593234 nova_compute[227762]: 2026-01-23 10:41:04.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:41:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:05.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:05.943 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=76, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=75) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:41:05 np0005593234 nova_compute[227762]: 2026-01-23 10:41:05.943 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:05.944 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:41:05 np0005593234 nova_compute[227762]: 2026-01-23 10:41:05.948 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:06 np0005593234 nova_compute[227762]: 2026-01-23 10:41:06.080 227766 DEBUG nova.network.neutron [-] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:41:06 np0005593234 nova_compute[227762]: 2026-01-23 10:41:06.086 227766 DEBUG nova.compute.manager [req-59f6ba56-1235-4791-82d7-beb4e4cdb8bf req-ba509999-90cc-4d48-acbb-1c87bd8a35c9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Received event network-vif-deleted-e5c4f897-26f7-4661-9447-da88c5a96ccd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:41:06 np0005593234 nova_compute[227762]: 2026-01-23 10:41:06.087 227766 INFO nova.compute.manager [req-59f6ba56-1235-4791-82d7-beb4e4cdb8bf req-ba509999-90cc-4d48-acbb-1c87bd8a35c9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Neutron deleted interface e5c4f897-26f7-4661-9447-da88c5a96ccd; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 05:41:06 np0005593234 nova_compute[227762]: 2026-01-23 10:41:06.087 227766 DEBUG nova.network.neutron [req-59f6ba56-1235-4791-82d7-beb4e4cdb8bf req-ba509999-90cc-4d48-acbb-1c87bd8a35c9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:41:06 np0005593234 nova_compute[227762]: 2026-01-23 10:41:06.232 227766 INFO nova.compute.manager [-] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Took 4.46 seconds to deallocate network for instance.#033[00m
Jan 23 05:41:06 np0005593234 nova_compute[227762]: 2026-01-23 10:41:06.241 227766 DEBUG nova.compute.manager [req-59f6ba56-1235-4791-82d7-beb4e4cdb8bf req-ba509999-90cc-4d48-acbb-1c87bd8a35c9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Detach interface failed, port_id=e5c4f897-26f7-4661-9447-da88c5a96ccd, reason: Instance 6b2d76e8-0aab-4760-b64e-0097520255ce could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 05:41:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:41:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:06.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:41:06 np0005593234 nova_compute[227762]: 2026-01-23 10:41:06.635 227766 INFO nova.compute.manager [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Took 0.40 seconds to detach 1 volumes for instance.#033[00m
Jan 23 05:41:06 np0005593234 nova_compute[227762]: 2026-01-23 10:41:06.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:41:06 np0005593234 nova_compute[227762]: 2026-01-23 10:41:06.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:41:06 np0005593234 nova_compute[227762]: 2026-01-23 10:41:06.806 227766 DEBUG oslo_concurrency.lockutils [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:06 np0005593234 nova_compute[227762]: 2026-01-23 10:41:06.807 227766 DEBUG oslo_concurrency.lockutils [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:06 np0005593234 nova_compute[227762]: 2026-01-23 10:41:06.924 227766 DEBUG oslo_concurrency.processutils [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:41:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:07.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:41:07 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3609882392' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:41:07 np0005593234 nova_compute[227762]: 2026-01-23 10:41:07.397 227766 DEBUG oslo_concurrency.processutils [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:41:07 np0005593234 nova_compute[227762]: 2026-01-23 10:41:07.404 227766 DEBUG nova.compute.provider_tree [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:41:07 np0005593234 nova_compute[227762]: 2026-01-23 10:41:07.571 227766 DEBUG nova.scheduler.client.report [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:41:07 np0005593234 podman[321265]: 2026-01-23 10:41:07.812992611 +0000 UTC m=+0.107720387 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 05:41:07 np0005593234 nova_compute[227762]: 2026-01-23 10:41:07.881 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:41:07 np0005593234 nova_compute[227762]: 2026-01-23 10:41:07.881 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:41:07 np0005593234 nova_compute[227762]: 2026-01-23 10:41:07.882 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 05:41:07 np0005593234 nova_compute[227762]: 2026-01-23 10:41:07.960 227766 DEBUG oslo_concurrency.lockutils [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.153s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:41:07 np0005593234 nova_compute[227762]: 2026-01-23 10:41:07.995 227766 INFO nova.scheduler.client.report [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Deleted allocations for instance 6b2d76e8-0aab-4760-b64e-0097520255ce
Jan 23 05:41:08 np0005593234 nova_compute[227762]: 2026-01-23 10:41:08.161 227766 DEBUG oslo_concurrency.lockutils [None req-6dc04b5d-4d80-445a-b594-1e78f6cdb3fc eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "6b2d76e8-0aab-4760-b64e-0097520255ce" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.489s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:41:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:08.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:09.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:09 np0005593234 nova_compute[227762]: 2026-01-23 10:41:09.347 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e382 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:41:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:10.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:41:10 np0005593234 nova_compute[227762]: 2026-01-23 10:41:10.592 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Updating instance_info_cache with network_info: [{"id": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "address": "fa:16:3e:b1:07:ce", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f13d72-f6", "ovs_interfaceid": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:41:10 np0005593234 nova_compute[227762]: 2026-01-23 10:41:10.719 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:41:10 np0005593234 nova_compute[227762]: 2026-01-23 10:41:10.719 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 05:41:10 np0005593234 nova_compute[227762]: 2026-01-23 10:41:10.720 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:41:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:10.946 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '76'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:41:10 np0005593234 nova_compute[227762]: 2026-01-23 10:41:10.951 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:11.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:11 np0005593234 nova_compute[227762]: 2026-01-23 10:41:11.371 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:41:11 np0005593234 nova_compute[227762]: 2026-01-23 10:41:11.371 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:41:11 np0005593234 nova_compute[227762]: 2026-01-23 10:41:11.373 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:41:11 np0005593234 nova_compute[227762]: 2026-01-23 10:41:11.373 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 05:41:11 np0005593234 nova_compute[227762]: 2026-01-23 10:41:11.373 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:41:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:41:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2176120804' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:41:11 np0005593234 nova_compute[227762]: 2026-01-23 10:41:11.837 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:41:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e383 e383: 3 total, 3 up, 3 in
Jan 23 05:41:12 np0005593234 nova_compute[227762]: 2026-01-23 10:41:12.090 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:41:12 np0005593234 nova_compute[227762]: 2026-01-23 10:41:12.090 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:41:12 np0005593234 nova_compute[227762]: 2026-01-23 10:41:12.096 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:41:12 np0005593234 nova_compute[227762]: 2026-01-23 10:41:12.097 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 05:41:12 np0005593234 nova_compute[227762]: 2026-01-23 10:41:12.304 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 05:41:12 np0005593234 nova_compute[227762]: 2026-01-23 10:41:12.306 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3681MB free_disk=20.80577850341797GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 05:41:12 np0005593234 nova_compute[227762]: 2026-01-23 10:41:12.306 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:41:12 np0005593234 nova_compute[227762]: 2026-01-23 10:41:12.306 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:41:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:12.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:12 np0005593234 nova_compute[227762]: 2026-01-23 10:41:12.665 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 307f203d-cfc0-45a9-a0cd-3acee0ef7133 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:41:12 np0005593234 nova_compute[227762]: 2026-01-23 10:41:12.666 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance d0cea430-15ec-471d-963b-41fd4fa4777c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 05:41:12 np0005593234 nova_compute[227762]: 2026-01-23 10:41:12.666 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 05:41:12 np0005593234 nova_compute[227762]: 2026-01-23 10:41:12.666 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 05:41:12 np0005593234 nova_compute[227762]: 2026-01-23 10:41:12.742 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:41:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:41:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:13.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:41:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:41:13 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/224167961' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:41:13 np0005593234 nova_compute[227762]: 2026-01-23 10:41:13.189 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:41:13 np0005593234 nova_compute[227762]: 2026-01-23 10:41:13.195 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:41:13 np0005593234 nova_compute[227762]: 2026-01-23 10:41:13.512 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:41:13 np0005593234 nova_compute[227762]: 2026-01-23 10:41:13.675 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 05:41:13 np0005593234 nova_compute[227762]: 2026-01-23 10:41:13.675 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.369s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:41:13 np0005593234 nova_compute[227762]: 2026-01-23 10:41:13.700 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:41:13 np0005593234 nova_compute[227762]: 2026-01-23 10:41:13.700 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:41:13 np0005593234 nova_compute[227762]: 2026-01-23 10:41:13.701 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:41:13 np0005593234 nova_compute[227762]: 2026-01-23 10:41:13.701 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 05:41:14 np0005593234 nova_compute[227762]: 2026-01-23 10:41:14.350 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:14.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:14 np0005593234 nova_compute[227762]: 2026-01-23 10:41:14.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:41:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e383 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:41:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:15.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:41:15 np0005593234 nova_compute[227762]: 2026-01-23 10:41:15.916 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164860.9151373, 6b2d76e8-0aab-4760-b64e-0097520255ce => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:41:15 np0005593234 nova_compute[227762]: 2026-01-23 10:41:15.916 227766 INFO nova.compute.manager [-] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] VM Stopped (Lifecycle Event)
Jan 23 05:41:15 np0005593234 nova_compute[227762]: 2026-01-23 10:41:15.953 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:16 np0005593234 nova_compute[227762]: 2026-01-23 10:41:16.452 227766 DEBUG nova.compute.manager [None req-8d5a36bf-196b-41ad-89bb-3f200bb2c32d - - - - - -] [instance: 6b2d76e8-0aab-4760-b64e-0097520255ce] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:41:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:41:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:16.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:41:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:17.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:18.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:19.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:19 np0005593234 nova_compute[227762]: 2026-01-23 10:41:19.351 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e383 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:20.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:20 np0005593234 nova_compute[227762]: 2026-01-23 10:41:20.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:41:20 np0005593234 nova_compute[227762]: 2026-01-23 10:41:20.957 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:21.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:22.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:23.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:24 np0005593234 nova_compute[227762]: 2026-01-23 10:41:24.353 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e384 e384: 3 total, 3 up, 3 in
Jan 23 05:41:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:24.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e384 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:41:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:25.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:41:25 np0005593234 nova_compute[227762]: 2026-01-23 10:41:25.959 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:41:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:26.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:27.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:27 np0005593234 nova_compute[227762]: 2026-01-23 10:41:27.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:41:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:28.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:41:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:29.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:41:29 np0005593234 nova_compute[227762]: 2026-01-23 10:41:29.355 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e384 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:41:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:30.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:41:30 np0005593234 podman[321400]: 2026-01-23 10:41:30.765787105 +0000 UTC m=+0.050503419 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:41:31 np0005593234 nova_compute[227762]: 2026-01-23 10:41:31.006 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:31.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:32.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:33.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:34 np0005593234 nova_compute[227762]: 2026-01-23 10:41:34.355 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:34.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e384 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:35.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:36 np0005593234 nova_compute[227762]: 2026-01-23 10:41:36.008 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:36.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:37.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:38.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:38 np0005593234 podman[321423]: 2026-01-23 10:41:38.779406563 +0000 UTC m=+0.079568177 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 23 05:41:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:41:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:39.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:41:39 np0005593234 nova_compute[227762]: 2026-01-23 10:41:39.358 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e385 e385: 3 total, 3 up, 3 in
Jan 23 05:41:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:40.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:41 np0005593234 nova_compute[227762]: 2026-01-23 10:41:41.055 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:41.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:42.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:42.880 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:42.880 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:42.881 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:41:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:43.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:41:44 np0005593234 nova_compute[227762]: 2026-01-23 10:41:44.245 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:44 np0005593234 nova_compute[227762]: 2026-01-23 10:41:44.246 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:44 np0005593234 nova_compute[227762]: 2026-01-23 10:41:44.268 227766 DEBUG nova.compute.manager [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:41:44 np0005593234 nova_compute[227762]: 2026-01-23 10:41:44.411 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:44 np0005593234 nova_compute[227762]: 2026-01-23 10:41:44.412 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:44 np0005593234 nova_compute[227762]: 2026-01-23 10:41:44.412 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:44 np0005593234 nova_compute[227762]: 2026-01-23 10:41:44.420 227766 DEBUG nova.virt.hardware [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:41:44 np0005593234 nova_compute[227762]: 2026-01-23 10:41:44.420 227766 INFO nova.compute.claims [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:41:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:41:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2074444340' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:41:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:41:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2074444340' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:41:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:41:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:44.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:41:44 np0005593234 nova_compute[227762]: 2026-01-23 10:41:44.644 227766 DEBUG oslo_concurrency.processutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:41:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:45.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:41:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1843066479' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:41:45 np0005593234 nova_compute[227762]: 2026-01-23 10:41:45.214 227766 DEBUG oslo_concurrency.processutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.570s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:41:45 np0005593234 nova_compute[227762]: 2026-01-23 10:41:45.220 227766 DEBUG nova.compute.provider_tree [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:41:45 np0005593234 nova_compute[227762]: 2026-01-23 10:41:45.234 227766 DEBUG nova.scheduler.client.report [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:41:45 np0005593234 nova_compute[227762]: 2026-01-23 10:41:45.272 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.860s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:45 np0005593234 nova_compute[227762]: 2026-01-23 10:41:45.273 227766 DEBUG nova.compute.manager [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:41:45 np0005593234 nova_compute[227762]: 2026-01-23 10:41:45.326 227766 DEBUG nova.compute.manager [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:41:45 np0005593234 nova_compute[227762]: 2026-01-23 10:41:45.326 227766 DEBUG nova.network.neutron [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:41:45 np0005593234 nova_compute[227762]: 2026-01-23 10:41:45.389 227766 INFO nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:41:45 np0005593234 nova_compute[227762]: 2026-01-23 10:41:45.411 227766 DEBUG nova.compute.manager [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:41:45 np0005593234 nova_compute[227762]: 2026-01-23 10:41:45.460 227766 INFO nova.virt.block_device [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Booting with volume snapshot 56abda02-64b9-4403-a75d-7c77cd557d3a at /dev/vda#033[00m
Jan 23 05:41:45 np0005593234 nova_compute[227762]: 2026-01-23 10:41:45.547 227766 DEBUG nova.policy [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'eb70c3aee8b64273a1930c0c2c231aff', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd27c5465284b48a5818ef931d6251c43', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:41:46 np0005593234 nova_compute[227762]: 2026-01-23 10:41:46.058 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:46 np0005593234 nova_compute[227762]: 2026-01-23 10:41:46.490 227766 DEBUG nova.network.neutron [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Successfully created port: 5940561a-bc60-483b-b18f-1dc7f993cadb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:41:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:46.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:47.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:48 np0005593234 nova_compute[227762]: 2026-01-23 10:41:48.045 227766 DEBUG nova.network.neutron [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Successfully updated port: 5940561a-bc60-483b-b18f-1dc7f993cadb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:41:48 np0005593234 nova_compute[227762]: 2026-01-23 10:41:48.075 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "refresh_cache-875a53a5-020f-4a4e-a0cf-bcfd254ba895" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:41:48 np0005593234 nova_compute[227762]: 2026-01-23 10:41:48.076 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquired lock "refresh_cache-875a53a5-020f-4a4e-a0cf-bcfd254ba895" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:41:48 np0005593234 nova_compute[227762]: 2026-01-23 10:41:48.076 227766 DEBUG nova.network.neutron [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:41:48 np0005593234 nova_compute[227762]: 2026-01-23 10:41:48.166 227766 DEBUG nova.compute.manager [req-d87235ca-5afb-4133-a13b-6e94606f5a3b req-c15e25a0-fe15-482e-822b-a5bbd776e914 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Received event network-changed-5940561a-bc60-483b-b18f-1dc7f993cadb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:41:48 np0005593234 nova_compute[227762]: 2026-01-23 10:41:48.167 227766 DEBUG nova.compute.manager [req-d87235ca-5afb-4133-a13b-6e94606f5a3b req-c15e25a0-fe15-482e-822b-a5bbd776e914 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Refreshing instance network info cache due to event network-changed-5940561a-bc60-483b-b18f-1dc7f993cadb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:41:48 np0005593234 nova_compute[227762]: 2026-01-23 10:41:48.167 227766 DEBUG oslo_concurrency.lockutils [req-d87235ca-5afb-4133-a13b-6e94606f5a3b req-c15e25a0-fe15-482e-822b-a5bbd776e914 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-875a53a5-020f-4a4e-a0cf-bcfd254ba895" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:41:48 np0005593234 nova_compute[227762]: 2026-01-23 10:41:48.433 227766 DEBUG nova.network.neutron [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:41:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:48.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:48.971 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=77, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=76) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:41:48 np0005593234 nova_compute[227762]: 2026-01-23 10:41:48.972 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:48 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:48.972 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:41:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:41:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:49.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.384 227766 DEBUG nova.network.neutron [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Updating instance_info_cache with network_info: [{"id": "5940561a-bc60-483b-b18f-1dc7f993cadb", "address": "fa:16:3e:84:80:05", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5940561a-bc", "ovs_interfaceid": "5940561a-bc60-483b-b18f-1dc7f993cadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.404 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Releasing lock "refresh_cache-875a53a5-020f-4a4e-a0cf-bcfd254ba895" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.405 227766 DEBUG nova.compute.manager [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Instance network_info: |[{"id": "5940561a-bc60-483b-b18f-1dc7f993cadb", "address": "fa:16:3e:84:80:05", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5940561a-bc", "ovs_interfaceid": "5940561a-bc60-483b-b18f-1dc7f993cadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.405 227766 DEBUG oslo_concurrency.lockutils [req-d87235ca-5afb-4133-a13b-6e94606f5a3b req-c15e25a0-fe15-482e-822b-a5bbd776e914 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-875a53a5-020f-4a4e-a0cf-bcfd254ba895" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.405 227766 DEBUG nova.network.neutron [req-d87235ca-5afb-4133-a13b-6e94606f5a3b req-c15e25a0-fe15-482e-822b-a5bbd776e914 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Refreshing network info cache for port 5940561a-bc60-483b-b18f-1dc7f993cadb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.415 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e385 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.953 227766 DEBUG os_brick.utils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.955 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.967 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.012s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.968 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[e6d6224e-bb0f-459b-b70d-782e992a7a22]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.969 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.977 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.978 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[62ca05e3-97d2-43a8-8a2e-55498b0b3659]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.979 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.990 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.991 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[502cb642-9945-4fed-9298-1b1908bccae0]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.992 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[84aa6539-1a22-4ebf-9f47-78c03caf5a72]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:49 np0005593234 nova_compute[227762]: 2026-01-23 10:41:49.993 227766 DEBUG oslo_concurrency.processutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:41:50 np0005593234 nova_compute[227762]: 2026-01-23 10:41:50.029 227766 DEBUG oslo_concurrency.processutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "nvme version" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:41:50 np0005593234 nova_compute[227762]: 2026-01-23 10:41:50.032 227766 DEBUG os_brick.initiator.connectors.lightos [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:41:50 np0005593234 nova_compute[227762]: 2026-01-23 10:41:50.032 227766 DEBUG os_brick.initiator.connectors.lightos [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:41:50 np0005593234 nova_compute[227762]: 2026-01-23 10:41:50.032 227766 DEBUG os_brick.initiator.connectors.lightos [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:41:50 np0005593234 nova_compute[227762]: 2026-01-23 10:41:50.032 227766 DEBUG os_brick.utils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] <== get_connector_properties: return (78ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:41:50 np0005593234 nova_compute[227762]: 2026-01-23 10:41:50.033 227766 DEBUG nova.virt.block_device [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Updating existing volume attachment record: e4fd0a83-1264-4516-b2ab-863aaa086214 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:41:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:50.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:41:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3217911972' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:41:50 np0005593234 nova_compute[227762]: 2026-01-23 10:41:50.972 227766 DEBUG nova.network.neutron [req-d87235ca-5afb-4133-a13b-6e94606f5a3b req-c15e25a0-fe15-482e-822b-a5bbd776e914 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Updated VIF entry in instance network info cache for port 5940561a-bc60-483b-b18f-1dc7f993cadb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:41:50 np0005593234 nova_compute[227762]: 2026-01-23 10:41:50.972 227766 DEBUG nova.network.neutron [req-d87235ca-5afb-4133-a13b-6e94606f5a3b req-c15e25a0-fe15-482e-822b-a5bbd776e914 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Updating instance_info_cache with network_info: [{"id": "5940561a-bc60-483b-b18f-1dc7f993cadb", "address": "fa:16:3e:84:80:05", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5940561a-bc", "ovs_interfaceid": "5940561a-bc60-483b-b18f-1dc7f993cadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:41:50 np0005593234 nova_compute[227762]: 2026-01-23 10:41:50.990 227766 DEBUG oslo_concurrency.lockutils [req-d87235ca-5afb-4133-a13b-6e94606f5a3b req-c15e25a0-fe15-482e-822b-a5bbd776e914 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-875a53a5-020f-4a4e-a0cf-bcfd254ba895" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.059 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.116 227766 DEBUG nova.compute.manager [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.118 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.118 227766 INFO nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Creating image(s)#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.119 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.119 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Ensure instance console log exists: /var/lib/nova/instances/875a53a5-020f-4a4e-a0cf-bcfd254ba895/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.119 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.119 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.120 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.121 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Start _get_guest_xml network_info=[{"id": "5940561a-bc60-483b-b18f-1dc7f993cadb", "address": "fa:16:3e:84:80:05", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5940561a-bc", "ovs_interfaceid": "5940561a-bc60-483b-b18f-1dc7f993cadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': 
'/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'boot_index': 0, 'mount_device': '/dev/vda', 'delete_on_termination': True, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-53917a40-f345-4f87-a3c1-5297194341d6', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '53917a40-f345-4f87-a3c1-5297194341d6', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': '875a53a5-020f-4a4e-a0cf-bcfd254ba895', 'attached_at': '', 'detached_at': '', 'volume_id': '53917a40-f345-4f87-a3c1-5297194341d6', 'serial': '53917a40-f345-4f87-a3c1-5297194341d6'}, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': 'e4fd0a83-1264-4516-b2ab-863aaa086214', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.127 227766 WARNING nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.134 227766 DEBUG nova.virt.libvirt.host [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.134 227766 DEBUG nova.virt.libvirt.host [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.139 227766 DEBUG nova.virt.libvirt.host [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.139 227766 DEBUG nova.virt.libvirt.host [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.140 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.141 227766 DEBUG nova.virt.hardware [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.141 227766 DEBUG nova.virt.hardware [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.141 227766 DEBUG nova.virt.hardware [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.141 227766 DEBUG nova.virt.hardware [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.142 227766 DEBUG nova.virt.hardware [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.142 227766 DEBUG nova.virt.hardware [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.142 227766 DEBUG nova.virt.hardware [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.142 227766 DEBUG nova.virt.hardware [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.142 227766 DEBUG nova.virt.hardware [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.142 227766 DEBUG nova.virt.hardware [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.143 227766 DEBUG nova.virt.hardware [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:41:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:41:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:51.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.171 227766 DEBUG nova.storage.rbd_utils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image 875a53a5-020f-4a4e-a0cf-bcfd254ba895_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:41:51 np0005593234 nova_compute[227762]: 2026-01-23 10:41:51.176 227766 DEBUG oslo_concurrency.processutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:41:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:51.976 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '77'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:41:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:41:52 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4132706962' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.297 227766 DEBUG oslo_concurrency.processutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.121s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.330 227766 DEBUG nova.virt.libvirt.vif [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:41:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-559887613',display_name='tempest-TestVolumeBootPattern-server-559887613',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-559887613',id=200,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-d03gudju',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs
=None,updated_at=2026-01-23T10:41:45Z,user_data=None,user_id='eb70c3aee8b64273a1930c0c2c231aff',uuid=875a53a5-020f-4a4e-a0cf-bcfd254ba895,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5940561a-bc60-483b-b18f-1dc7f993cadb", "address": "fa:16:3e:84:80:05", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5940561a-bc", "ovs_interfaceid": "5940561a-bc60-483b-b18f-1dc7f993cadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.331 227766 DEBUG nova.network.os_vif_util [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "5940561a-bc60-483b-b18f-1dc7f993cadb", "address": "fa:16:3e:84:80:05", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5940561a-bc", "ovs_interfaceid": "5940561a-bc60-483b-b18f-1dc7f993cadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.332 227766 DEBUG nova.network.os_vif_util [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:80:05,bridge_name='br-int',has_traffic_filtering=True,id=5940561a-bc60-483b-b18f-1dc7f993cadb,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5940561a-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.333 227766 DEBUG nova.objects.instance [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lazy-loading 'pci_devices' on Instance uuid 875a53a5-020f-4a4e-a0cf-bcfd254ba895 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.351 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  <uuid>875a53a5-020f-4a4e-a0cf-bcfd254ba895</uuid>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  <name>instance-000000c8</name>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestVolumeBootPattern-server-559887613</nova:name>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:41:51</nova:creationTime>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <nova:user uuid="eb70c3aee8b64273a1930c0c2c231aff">tempest-TestVolumeBootPattern-2139361132-project-member</nova:user>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <nova:project uuid="d27c5465284b48a5818ef931d6251c43">tempest-TestVolumeBootPattern-2139361132</nova:project>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <nova:port uuid="5940561a-bc60-483b-b18f-1dc7f993cadb">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <entry name="serial">875a53a5-020f-4a4e-a0cf-bcfd254ba895</entry>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <entry name="uuid">875a53a5-020f-4a4e-a0cf-bcfd254ba895</entry>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/875a53a5-020f-4a4e-a0cf-bcfd254ba895_disk.config">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="volumes/volume-53917a40-f345-4f87-a3c1-5297194341d6">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <serial>53917a40-f345-4f87-a3c1-5297194341d6</serial>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:84:80:05"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <target dev="tap5940561a-bc"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/875a53a5-020f-4a4e-a0cf-bcfd254ba895/console.log" append="off"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:41:52 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:41:52 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:41:52 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:41:52 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.352 227766 DEBUG nova.compute.manager [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Preparing to wait for external event network-vif-plugged-5940561a-bc60-483b-b18f-1dc7f993cadb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.352 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.353 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.353 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.354 227766 DEBUG nova.virt.libvirt.vif [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:41:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-559887613',display_name='tempest-TestVolumeBootPattern-server-559887613',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-559887613',id=200,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-d03gudju',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=TagList,task_state='spawning',terminated_at=None,tru
sted_certs=None,updated_at=2026-01-23T10:41:45Z,user_data=None,user_id='eb70c3aee8b64273a1930c0c2c231aff',uuid=875a53a5-020f-4a4e-a0cf-bcfd254ba895,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5940561a-bc60-483b-b18f-1dc7f993cadb", "address": "fa:16:3e:84:80:05", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5940561a-bc", "ovs_interfaceid": "5940561a-bc60-483b-b18f-1dc7f993cadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.354 227766 DEBUG nova.network.os_vif_util [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "5940561a-bc60-483b-b18f-1dc7f993cadb", "address": "fa:16:3e:84:80:05", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5940561a-bc", "ovs_interfaceid": "5940561a-bc60-483b-b18f-1dc7f993cadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.355 227766 DEBUG nova.network.os_vif_util [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:80:05,bridge_name='br-int',has_traffic_filtering=True,id=5940561a-bc60-483b-b18f-1dc7f993cadb,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5940561a-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.357 227766 DEBUG os_vif [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:80:05,bridge_name='br-int',has_traffic_filtering=True,id=5940561a-bc60-483b-b18f-1dc7f993cadb,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5940561a-bc') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.357 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.358 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.358 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.363 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.364 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5940561a-bc, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.365 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5940561a-bc, col_values=(('external_ids', {'iface-id': '5940561a-bc60-483b-b18f-1dc7f993cadb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:84:80:05', 'vm-uuid': '875a53a5-020f-4a4e-a0cf-bcfd254ba895'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.410 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:52 np0005593234 NetworkManager[48942]: <info>  [1769164912.4123] manager: (tap5940561a-bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/398)
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.413 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.417 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.418 227766 INFO os_vif [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:80:05,bridge_name='br-int',has_traffic_filtering=True,id=5940561a-bc60-483b-b18f-1dc7f993cadb,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5940561a-bc')#033[00m
Jan 23 05:41:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:52.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.658 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.658 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.659 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] No VIF found with MAC fa:16:3e:84:80:05, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.659 227766 INFO nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Using config drive#033[00m
Jan 23 05:41:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:41:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:41:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:41:52 np0005593234 nova_compute[227762]: 2026-01-23 10:41:52.962 227766 DEBUG nova.storage.rbd_utils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image 875a53a5-020f-4a4e-a0cf-bcfd254ba895_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:41:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:41:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:53.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:41:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:41:54 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2582244694' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.209 227766 INFO nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Creating config drive at /var/lib/nova/instances/875a53a5-020f-4a4e-a0cf-bcfd254ba895/disk.config#033[00m
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.214 227766 DEBUG oslo_concurrency.processutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/875a53a5-020f-4a4e-a0cf-bcfd254ba895/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6nzb74om execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.349 227766 DEBUG oslo_concurrency.processutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/875a53a5-020f-4a4e-a0cf-bcfd254ba895/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6nzb74om" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.389 227766 DEBUG nova.storage.rbd_utils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] rbd image 875a53a5-020f-4a4e-a0cf-bcfd254ba895_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.394 227766 DEBUG oslo_concurrency.processutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/875a53a5-020f-4a4e-a0cf-bcfd254ba895/disk.config 875a53a5-020f-4a4e-a0cf-bcfd254ba895_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.421 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:41:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1111]
** DB Stats **
Uptime(secs): 6000.0 total, 600.0 interval
Cumulative writes: 16K writes, 82K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.16 GB, 0.03 MB/s
Cumulative WAL: 16K writes, 16K syncs, 1.00 writes per sync, written: 0.16 GB, 0.03 MB/s
Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
Interval writes: 1613 writes, 8099 keys, 1613 commit groups, 1.0 writes per commit group, ingest: 16.51 MB, 0.03 MB/s
Interval WAL: 1613 writes, 1613 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
Interval stall: 00:00:0.000 H:M:S, 0.0 percent

** Compaction Stats [default] **
Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     43.0      2.39              0.30        54    0.044       0      0       0.0       0.0
  L6      1/0   10.42 MB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   5.2    116.9     99.6      5.31              1.65        53    0.100    391K    28K       0.0       0.0
 Sum      1/0   10.42 MB   0.0      0.6     0.1      0.5       0.6      0.1       0.0   6.2     80.7     82.0      7.69              1.96       107    0.072    391K    28K       0.0       0.0
 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.1     53.4     51.6      1.50              0.21        12    0.125     61K   3099       0.0       0.0

** Compaction Stats [default] **
Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
 Low      0/0    0.00 KB   0.0      0.6     0.1      0.5       0.5      0.0       0.0   0.0    116.9     99.6      5.31              1.65        53    0.100    391K    28K       0.0       0.0
High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     43.0      2.38              0.30        53    0.045       0      0       0.0       0.0
User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0

Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0

Uptime(secs): 6000.0 total, 600.0 interval
Flush(GB): cumulative 0.100, interval 0.009
AddFile(GB): cumulative 0.000, interval 0.000
AddFile(Total Files): cumulative 0, interval 0
AddFile(L0 Files): cumulative 0, interval 0
AddFile(Keys): cumulative 0, interval 0
Cumulative compaction: 0.62 GB write, 0.11 MB/s write, 0.61 GB read, 0.10 MB/s read, 7.7 seconds
Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 1.5 seconds
Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
Block cache BinnedLRUCache@0x55f10fc171f0#2 capacity: 304.00 MB usage: 67.87 MB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 0 last_secs: 0.000455 secs_since: 0
Block cache entry stats(count,size,portion): DataBlock(3883,65.07 MB,21.4046%) FilterBlock(107,1.06 MB,0.350325%) IndexBlock(107,1.73 MB,0.569449%) Misc(1,0.00 KB,0%)

** File Read Latency Histogram By Level [default] **
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.560 227766 DEBUG oslo_concurrency.processutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/875a53a5-020f-4a4e-a0cf-bcfd254ba895/disk.config 875a53a5-020f-4a4e-a0cf-bcfd254ba895_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.166s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.561 227766 INFO nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Deleting local config drive /var/lib/nova/instances/875a53a5-020f-4a4e-a0cf-bcfd254ba895/disk.config because it was imported into RBD.#033[00m
Jan 23 05:41:54 np0005593234 kernel: tap5940561a-bc: entered promiscuous mode
Jan 23 05:41:54 np0005593234 NetworkManager[48942]: <info>  [1769164914.6119] manager: (tap5940561a-bc): new Tun device (/org/freedesktop/NetworkManager/Devices/399)
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.612 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:54 np0005593234 ovn_controller[134547]: 2026-01-23T10:41:54Z|00847|binding|INFO|Claiming lport 5940561a-bc60-483b-b18f-1dc7f993cadb for this chassis.
Jan 23 05:41:54 np0005593234 ovn_controller[134547]: 2026-01-23T10:41:54Z|00848|binding|INFO|5940561a-bc60-483b-b18f-1dc7f993cadb: Claiming fa:16:3e:84:80:05 10.100.0.8
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.621 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:80:05 10.100.0.8'], port_security=['fa:16:3e:84:80:05 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '875a53a5-020f-4a4e-a0cf-bcfd254ba895', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-72854481-c2f9-4651-8ba1-fe321a8a5546', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd27c5465284b48a5818ef931d6251c43', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3a20e786-de0e-4392-a9c2-94b60112f57a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95d9ae35-aabe-45f7-a103-f14858b94e31, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=5940561a-bc60-483b-b18f-1dc7f993cadb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.623 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 5940561a-bc60-483b-b18f-1dc7f993cadb in datapath 72854481-c2f9-4651-8ba1-fe321a8a5546 bound to our chassis#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.624 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 72854481-c2f9-4651-8ba1-fe321a8a5546#033[00m
Jan 23 05:41:54 np0005593234 ovn_controller[134547]: 2026-01-23T10:41:54Z|00849|binding|INFO|Setting lport 5940561a-bc60-483b-b18f-1dc7f993cadb ovn-installed in OVS
Jan 23 05:41:54 np0005593234 ovn_controller[134547]: 2026-01-23T10:41:54Z|00850|binding|INFO|Setting lport 5940561a-bc60-483b-b18f-1dc7f993cadb up in Southbound
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.628 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.631 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:54 np0005593234 systemd-udevd[321779]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.641 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8c7f6153-30fb-4087-9415-78b305d58d17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.642 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap72854481-c1 in ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:41:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:54.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.644 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap72854481-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.644 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d92cc407-5b0e-4077-b781-8ac65c373474]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.645 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6b01fb4c-2e93-4cf3-b064-12fd1bb3d804]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 NetworkManager[48942]: <info>  [1769164914.6514] device (tap5940561a-bc): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:41:54 np0005593234 NetworkManager[48942]: <info>  [1769164914.6520] device (tap5940561a-bc): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:41:54 np0005593234 systemd-machined[195626]: New machine qemu-96-instance-000000c8.
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.662 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[d1ada985-870e-4805-b2c8-908527d59c53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.678 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a7281ebb-b532-42fc-a910-ea80036d97a9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 systemd[1]: Started Virtual Machine qemu-96-instance-000000c8.
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.710 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e45c2d8b-e92e-45a6-ad3b-4a4654022ef6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 NetworkManager[48942]: <info>  [1769164914.7155] manager: (tap72854481-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/400)
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.714 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a7ed819a-7d76-4c60-8028-bc94f5295440]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 systemd-udevd[321784]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:41:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e386 e386: 3 total, 3 up, 3 in
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.746 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f0409c1e-cf24-48fa-8ff4-5ba4813254a8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.749 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7a0a23b4-0cd3-4a61-915d-c1a6b3930238]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 NetworkManager[48942]: <info>  [1769164914.7671] device (tap72854481-c0): carrier: link connected
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.771 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ba25dca6-b9ee-49b7-9b52-bd4ee3d1ef5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.785 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0de50413-c757-4157-a95c-d6fb49f2f683]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap72854481-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:b6:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 259], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 880972, 'reachable_time': 39834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 321813, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.800 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7a9dd572-8da7-420b-b748-569fdca404ec]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe81:b660'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 880972, 'tstamp': 880972}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 321814, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.814 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a8d05312-0b89-4449-b49b-927e0f4402f6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap72854481-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:81:b6:60'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 259], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 880972, 'reachable_time': 39834, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 321815, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.846 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5e82c0d4-a818-482c-a5e3-0ead299a4cad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.903 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e009b4dd-9489-4773-b5e6-91195abd0062]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.905 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72854481-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.905 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.906 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap72854481-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.908 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:54 np0005593234 NetworkManager[48942]: <info>  [1769164914.9087] manager: (tap72854481-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/401)
Jan 23 05:41:54 np0005593234 kernel: tap72854481-c0: entered promiscuous mode
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.910 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.911 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap72854481-c0, col_values=(('external_ids', {'iface-id': '6b08537e-a263-4eec-b987-1e42878f483a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.913 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:54 np0005593234 ovn_controller[134547]: 2026-01-23T10:41:54Z|00851|binding|INFO|Releasing lport 6b08537e-a263-4eec-b987-1e42878f483a from this chassis (sb_readonly=0)
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.925 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.926 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/72854481-c2f9-4651-8ba1-fe321a8a5546.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/72854481-c2f9-4651-8ba1-fe321a8a5546.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.927 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[985d31fe-6603-493a-a721-1d3ffedfbc73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.928 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-72854481-c2f9-4651-8ba1-fe321a8a5546
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/72854481-c2f9-4651-8ba1-fe321a8a5546.pid.haproxy
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 72854481-c2f9-4651-8ba1-fe321a8a5546
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:41:54 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:41:54.929 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'env', 'PROCESS_TAG=haproxy-72854481-c2f9-4651-8ba1-fe321a8a5546', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/72854481-c2f9-4651-8ba1-fe321a8a5546.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.935 227766 DEBUG nova.compute.manager [req-af36f555-efa1-4007-8632-87619b8d71d4 req-2a341264-a7cd-48f5-8b9f-ecceb748f28c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Received event network-vif-plugged-5940561a-bc60-483b-b18f-1dc7f993cadb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.935 227766 DEBUG oslo_concurrency.lockutils [req-af36f555-efa1-4007-8632-87619b8d71d4 req-2a341264-a7cd-48f5-8b9f-ecceb748f28c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.935 227766 DEBUG oslo_concurrency.lockutils [req-af36f555-efa1-4007-8632-87619b8d71d4 req-2a341264-a7cd-48f5-8b9f-ecceb748f28c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.936 227766 DEBUG oslo_concurrency.lockutils [req-af36f555-efa1-4007-8632-87619b8d71d4 req-2a341264-a7cd-48f5-8b9f-ecceb748f28c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:54 np0005593234 nova_compute[227762]: 2026-01-23 10:41:54.936 227766 DEBUG nova.compute.manager [req-af36f555-efa1-4007-8632-87619b8d71d4 req-2a341264-a7cd-48f5-8b9f-ecceb748f28c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Processing event network-vif-plugged-5940561a-bc60-483b-b18f-1dc7f993cadb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:41:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:41:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:55.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:41:55 np0005593234 podman[321881]: 2026-01-23 10:41:55.289465382 +0000 UTC m=+0.051057686 container create 0367cd68898aa086747d55b903983053fc9f193a45d708631f89e8ccf2baaae3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Jan 23 05:41:55 np0005593234 systemd[1]: Started libpod-conmon-0367cd68898aa086747d55b903983053fc9f193a45d708631f89e8ccf2baaae3.scope.
Jan 23 05:41:55 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:41:55 np0005593234 podman[321881]: 2026-01-23 10:41:55.262866761 +0000 UTC m=+0.024459085 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:41:55 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/57dbde3cc591cc88b43ae0e852f550236587c6e60fb3415bff5905af31a865cf/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:41:55 np0005593234 podman[321881]: 2026-01-23 10:41:55.373988443 +0000 UTC m=+0.135580767 container init 0367cd68898aa086747d55b903983053fc9f193a45d708631f89e8ccf2baaae3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:41:55 np0005593234 podman[321881]: 2026-01-23 10:41:55.379915288 +0000 UTC m=+0.141507592 container start 0367cd68898aa086747d55b903983053fc9f193a45d708631f89e8ccf2baaae3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:41:55 np0005593234 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[321896]: [NOTICE]   (321900) : New worker (321902) forked
Jan 23 05:41:55 np0005593234 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[321896]: [NOTICE]   (321900) : Loading success.
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.198 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164916.1983693, 875a53a5-020f-4a4e-a0cf-bcfd254ba895 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.199 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] VM Started (Lifecycle Event)#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.201 227766 DEBUG nova.compute.manager [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.203 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.206 227766 INFO nova.virt.libvirt.driver [-] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Instance spawned successfully.#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.206 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.221 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.226 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.229 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.229 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.230 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.230 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.231 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.231 227766 DEBUG nova.virt.libvirt.driver [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.256 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.257 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164916.2000976, 875a53a5-020f-4a4e-a0cf-bcfd254ba895 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.257 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.285 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.288 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769164916.202725, 875a53a5-020f-4a4e-a0cf-bcfd254ba895 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.288 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.296 227766 INFO nova.compute.manager [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Took 5.18 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.296 227766 DEBUG nova.compute.manager [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.308 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.311 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.355 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.394 227766 INFO nova.compute.manager [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Took 12.05 seconds to build instance.#033[00m
Jan 23 05:41:56 np0005593234 nova_compute[227762]: 2026-01-23 10:41:56.412 227766 DEBUG oslo_concurrency.lockutils [None req-29a55d40-67fb-4c21-9f15-18bab7825dfb eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.166s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:56.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:57 np0005593234 nova_compute[227762]: 2026-01-23 10:41:57.016 227766 DEBUG nova.compute.manager [req-e61cc063-ec3f-4e97-b355-968ca86e0bce req-996049e9-3d31-41e0-b712-cf164c341149 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Received event network-vif-plugged-5940561a-bc60-483b-b18f-1dc7f993cadb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:41:57 np0005593234 nova_compute[227762]: 2026-01-23 10:41:57.017 227766 DEBUG oslo_concurrency.lockutils [req-e61cc063-ec3f-4e97-b355-968ca86e0bce req-996049e9-3d31-41e0-b712-cf164c341149 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:41:57 np0005593234 nova_compute[227762]: 2026-01-23 10:41:57.018 227766 DEBUG oslo_concurrency.lockutils [req-e61cc063-ec3f-4e97-b355-968ca86e0bce req-996049e9-3d31-41e0-b712-cf164c341149 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:41:57 np0005593234 nova_compute[227762]: 2026-01-23 10:41:57.018 227766 DEBUG oslo_concurrency.lockutils [req-e61cc063-ec3f-4e97-b355-968ca86e0bce req-996049e9-3d31-41e0-b712-cf164c341149 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:41:57 np0005593234 nova_compute[227762]: 2026-01-23 10:41:57.018 227766 DEBUG nova.compute.manager [req-e61cc063-ec3f-4e97-b355-968ca86e0bce req-996049e9-3d31-41e0-b712-cf164c341149 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] No waiting events found dispatching network-vif-plugged-5940561a-bc60-483b-b18f-1dc7f993cadb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:41:57 np0005593234 nova_compute[227762]: 2026-01-23 10:41:57.019 227766 WARNING nova.compute.manager [req-e61cc063-ec3f-4e97-b355-968ca86e0bce req-996049e9-3d31-41e0-b712-cf164c341149 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Received unexpected event network-vif-plugged-5940561a-bc60-483b-b18f-1dc7f993cadb for instance with vm_state active and task_state None.#033[00m
Jan 23 05:41:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:41:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:57.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:41:57 np0005593234 nova_compute[227762]: 2026-01-23 10:41:57.457 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:41:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:58.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:41:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:41:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:41:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:59.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:41:59 np0005593234 nova_compute[227762]: 2026-01-23 10:41:59.418 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:41:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.209 227766 DEBUG oslo_concurrency.lockutils [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.210 227766 DEBUG oslo_concurrency.lockutils [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.211 227766 DEBUG oslo_concurrency.lockutils [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.211 227766 DEBUG oslo_concurrency.lockutils [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.212 227766 DEBUG oslo_concurrency.lockutils [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.213 227766 INFO nova.compute.manager [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Terminating instance#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.214 227766 DEBUG nova.compute.manager [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:42:00 np0005593234 kernel: tap5940561a-bc (unregistering): left promiscuous mode
Jan 23 05:42:00 np0005593234 NetworkManager[48942]: <info>  [1769164920.2594] device (tap5940561a-bc): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:42:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:42:00Z|00852|binding|INFO|Releasing lport 5940561a-bc60-483b-b18f-1dc7f993cadb from this chassis (sb_readonly=0)
Jan 23 05:42:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:42:00Z|00853|binding|INFO|Setting lport 5940561a-bc60-483b-b18f-1dc7f993cadb down in Southbound
Jan 23 05:42:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:42:00Z|00854|binding|INFO|Removing iface tap5940561a-bc ovn-installed in OVS
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.273 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:00.276 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:84:80:05 10.100.0.8'], port_security=['fa:16:3e:84:80:05 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '875a53a5-020f-4a4e-a0cf-bcfd254ba895', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-72854481-c2f9-4651-8ba1-fe321a8a5546', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd27c5465284b48a5818ef931d6251c43', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3a20e786-de0e-4392-a9c2-94b60112f57a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=95d9ae35-aabe-45f7-a103-f14858b94e31, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=5940561a-bc60-483b-b18f-1dc7f993cadb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:42:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:00.278 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 5940561a-bc60-483b-b18f-1dc7f993cadb in datapath 72854481-c2f9-4651-8ba1-fe321a8a5546 unbound from our chassis#033[00m
Jan 23 05:42:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:00.279 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 72854481-c2f9-4651-8ba1-fe321a8a5546, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:42:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:00.281 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aa9ce9e2-f37a-4ae0-afec-06020a345736]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:00.281 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 namespace which is not needed anymore#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.288 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:00 np0005593234 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000c8.scope: Deactivated successfully.
Jan 23 05:42:00 np0005593234 systemd[1]: machine-qemu\x2d96\x2dinstance\x2d000000c8.scope: Consumed 4.483s CPU time.
Jan 23 05:42:00 np0005593234 systemd-machined[195626]: Machine qemu-96-instance-000000c8 terminated.
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.437 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.443 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.453 227766 INFO nova.virt.libvirt.driver [-] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Instance destroyed successfully.#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.453 227766 DEBUG nova.objects.instance [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lazy-loading 'resources' on Instance uuid 875a53a5-020f-4a4e-a0cf-bcfd254ba895 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.467 227766 DEBUG nova.virt.libvirt.vif [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:41:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestVolumeBootPattern-server-559887613',display_name='tempest-TestVolumeBootPattern-server-559887613',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testvolumebootpattern-server-559887613',id=200,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:41:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d27c5465284b48a5818ef931d6251c43',ramdisk_id='',reservation_id='r-d03gudju',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestVolumeBootPattern-2139361132',owner_user_name='tempest-TestVolumeBootPattern-2139361132-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:41:56Z,user_data=None,user_id='eb70c3aee8b64273a1930c0c2c231aff',uuid=875a53a5-020f-4a4e-a0cf-bcfd254ba895,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5940561a-bc60-483b-b18f-1dc7f993cadb", "address": "fa:16:3e:84:80:05", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5940561a-bc", "ovs_interfaceid": "5940561a-bc60-483b-b18f-1dc7f993cadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.467 227766 DEBUG nova.network.os_vif_util [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converting VIF {"id": "5940561a-bc60-483b-b18f-1dc7f993cadb", "address": "fa:16:3e:84:80:05", "network": {"id": "72854481-c2f9-4651-8ba1-fe321a8a5546", "bridge": "br-int", "label": "tempest-TestVolumeBootPattern-823196877-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "d27c5465284b48a5818ef931d6251c43", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5940561a-bc", "ovs_interfaceid": "5940561a-bc60-483b-b18f-1dc7f993cadb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.469 227766 DEBUG nova.network.os_vif_util [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:84:80:05,bridge_name='br-int',has_traffic_filtering=True,id=5940561a-bc60-483b-b18f-1dc7f993cadb,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5940561a-bc') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.469 227766 DEBUG os_vif [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:80:05,bridge_name='br-int',has_traffic_filtering=True,id=5940561a-bc60-483b-b18f-1dc7f993cadb,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5940561a-bc') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.471 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.472 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5940561a-bc, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.473 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.475 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.478 227766 INFO os_vif [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:84:80:05,bridge_name='br-int',has_traffic_filtering=True,id=5940561a-bc60-483b-b18f-1dc7f993cadb,network=Network(72854481-c2f9-4651-8ba1-fe321a8a5546),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5940561a-bc')#033[00m
Jan 23 05:42:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:00.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.676 227766 DEBUG nova.compute.manager [req-14a18b19-ebdc-4e2f-b81e-6617eeafd515 req-3d5d8cda-22d4-4e90-9d0c-42dbad8fcb3e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Received event network-vif-unplugged-5940561a-bc60-483b-b18f-1dc7f993cadb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.676 227766 DEBUG oslo_concurrency.lockutils [req-14a18b19-ebdc-4e2f-b81e-6617eeafd515 req-3d5d8cda-22d4-4e90-9d0c-42dbad8fcb3e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.677 227766 DEBUG oslo_concurrency.lockutils [req-14a18b19-ebdc-4e2f-b81e-6617eeafd515 req-3d5d8cda-22d4-4e90-9d0c-42dbad8fcb3e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.677 227766 DEBUG oslo_concurrency.lockutils [req-14a18b19-ebdc-4e2f-b81e-6617eeafd515 req-3d5d8cda-22d4-4e90-9d0c-42dbad8fcb3e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.677 227766 DEBUG nova.compute.manager [req-14a18b19-ebdc-4e2f-b81e-6617eeafd515 req-3d5d8cda-22d4-4e90-9d0c-42dbad8fcb3e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] No waiting events found dispatching network-vif-unplugged-5940561a-bc60-483b-b18f-1dc7f993cadb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:42:00 np0005593234 nova_compute[227762]: 2026-01-23 10:42:00.677 227766 DEBUG nova.compute.manager [req-14a18b19-ebdc-4e2f-b81e-6617eeafd515 req-3d5d8cda-22d4-4e90-9d0c-42dbad8fcb3e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Received event network-vif-unplugged-5940561a-bc60-483b-b18f-1dc7f993cadb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:42:00 np0005593234 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[321896]: [NOTICE]   (321900) : haproxy version is 2.8.14-c23fe91
Jan 23 05:42:00 np0005593234 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[321896]: [NOTICE]   (321900) : path to executable is /usr/sbin/haproxy
Jan 23 05:42:00 np0005593234 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[321896]: [WARNING]  (321900) : Exiting Master process...
Jan 23 05:42:00 np0005593234 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[321896]: [ALERT]    (321900) : Current worker (321902) exited with code 143 (Terminated)
Jan 23 05:42:00 np0005593234 neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546[321896]: [WARNING]  (321900) : All workers exited. Exiting... (0)
Jan 23 05:42:00 np0005593234 systemd[1]: libpod-0367cd68898aa086747d55b903983053fc9f193a45d708631f89e8ccf2baaae3.scope: Deactivated successfully.
Jan 23 05:42:00 np0005593234 podman[321993]: 2026-01-23 10:42:00.9128604 +0000 UTC m=+0.543715289 container died 0367cd68898aa086747d55b903983053fc9f193a45d708631f89e8ccf2baaae3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:42:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:42:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:42:01 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0367cd68898aa086747d55b903983053fc9f193a45d708631f89e8ccf2baaae3-userdata-shm.mount: Deactivated successfully.
Jan 23 05:42:01 np0005593234 systemd[1]: var-lib-containers-storage-overlay-57dbde3cc591cc88b43ae0e852f550236587c6e60fb3415bff5905af31a865cf-merged.mount: Deactivated successfully.
Jan 23 05:42:01 np0005593234 podman[322036]: 2026-01-23 10:42:01.070410282 +0000 UTC m=+0.136730653 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 23 05:42:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:01.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:01 np0005593234 nova_compute[227762]: 2026-01-23 10:42:01.169 227766 INFO nova.virt.libvirt.driver [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Deleting instance files /var/lib/nova/instances/875a53a5-020f-4a4e-a0cf-bcfd254ba895_del#033[00m
Jan 23 05:42:01 np0005593234 nova_compute[227762]: 2026-01-23 10:42:01.169 227766 INFO nova.virt.libvirt.driver [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Deletion of /var/lib/nova/instances/875a53a5-020f-4a4e-a0cf-bcfd254ba895_del complete#033[00m
Jan 23 05:42:01 np0005593234 podman[321993]: 2026-01-23 10:42:01.219555672 +0000 UTC m=+0.850410561 container cleanup 0367cd68898aa086747d55b903983053fc9f193a45d708631f89e8ccf2baaae3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:42:01 np0005593234 nova_compute[227762]: 2026-01-23 10:42:01.222 227766 INFO nova.compute.manager [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Took 1.01 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:42:01 np0005593234 nova_compute[227762]: 2026-01-23 10:42:01.222 227766 DEBUG oslo.service.loopingcall [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:42:01 np0005593234 nova_compute[227762]: 2026-01-23 10:42:01.223 227766 DEBUG nova.compute.manager [-] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:42:01 np0005593234 nova_compute[227762]: 2026-01-23 10:42:01.223 227766 DEBUG nova.network.neutron [-] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:42:01 np0005593234 systemd[1]: libpod-conmon-0367cd68898aa086747d55b903983053fc9f193a45d708631f89e8ccf2baaae3.scope: Deactivated successfully.
Jan 23 05:42:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:42:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:02.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:42:02 np0005593234 podman[322073]: 2026-01-23 10:42:02.918100681 +0000 UTC m=+1.679132323 container remove 0367cd68898aa086747d55b903983053fc9f193a45d708631f89e8ccf2baaae3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:42:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:02.925 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e24d6e-9bf2-4da3-b8f5-edb79324fcab]: (4, ('Fri Jan 23 10:42:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 (0367cd68898aa086747d55b903983053fc9f193a45d708631f89e8ccf2baaae3)\n0367cd68898aa086747d55b903983053fc9f193a45d708631f89e8ccf2baaae3\nFri Jan 23 10:42:01 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 (0367cd68898aa086747d55b903983053fc9f193a45d708631f89e8ccf2baaae3)\n0367cd68898aa086747d55b903983053fc9f193a45d708631f89e8ccf2baaae3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:02.926 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a4860a6b-3758-4b0c-b7d1-93b718232087]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:02.927 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72854481-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:42:02 np0005593234 nova_compute[227762]: 2026-01-23 10:42:02.934 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:02 np0005593234 kernel: tap72854481-c0: left promiscuous mode
Jan 23 05:42:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:02.938 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[118ed1fa-5b9b-45a5-b167-2a988c1461ee]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:02 np0005593234 nova_compute[227762]: 2026-01-23 10:42:02.950 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:02.962 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0aaf056d-0e87-45fc-bd4e-22c51f82e1d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:02.963 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d25f144b-7437-40ad-81bf-ce1536f09952]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:02.978 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dd3a3c49-40f9-452d-87d1-23cdf000853d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 880966, 'reachable_time': 19433, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 322091, 'error': None, 'target': 'ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:02 np0005593234 systemd[1]: run-netns-ovnmeta\x2d72854481\x2dc2f9\x2d4651\x2d8ba1\x2dfe321a8a5546.mount: Deactivated successfully.
Jan 23 05:42:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:02.981 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-72854481-c2f9-4651-8ba1-fe321a8a5546 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:42:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:02.982 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[6ad0827e-8734-402f-af79-d09f48ed584e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:42:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:42:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:03.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.172 227766 DEBUG nova.compute.manager [req-fa1c8e18-5611-4c44-9d11-05917c397ff6 req-420e8145-7798-41f5-b280-ee4fcb95f4a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Received event network-vif-plugged-5940561a-bc60-483b-b18f-1dc7f993cadb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.173 227766 DEBUG oslo_concurrency.lockutils [req-fa1c8e18-5611-4c44-9d11-05917c397ff6 req-420e8145-7798-41f5-b280-ee4fcb95f4a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.173 227766 DEBUG oslo_concurrency.lockutils [req-fa1c8e18-5611-4c44-9d11-05917c397ff6 req-420e8145-7798-41f5-b280-ee4fcb95f4a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.173 227766 DEBUG oslo_concurrency.lockutils [req-fa1c8e18-5611-4c44-9d11-05917c397ff6 req-420e8145-7798-41f5-b280-ee4fcb95f4a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.173 227766 DEBUG nova.compute.manager [req-fa1c8e18-5611-4c44-9d11-05917c397ff6 req-420e8145-7798-41f5-b280-ee4fcb95f4a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] No waiting events found dispatching network-vif-plugged-5940561a-bc60-483b-b18f-1dc7f993cadb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.173 227766 WARNING nova.compute.manager [req-fa1c8e18-5611-4c44-9d11-05917c397ff6 req-420e8145-7798-41f5-b280-ee4fcb95f4a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Received unexpected event network-vif-plugged-5940561a-bc60-483b-b18f-1dc7f993cadb for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.221 227766 DEBUG nova.network.neutron [-] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.246 227766 INFO nova.compute.manager [-] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Took 2.02 seconds to deallocate network for instance.#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.338 227766 DEBUG nova.compute.manager [req-3e78e938-059a-43c3-ba41-951ec99bd892 req-6c2ac2f3-b0a0-48c3-99ab-e5f4428f01ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Received event network-vif-deleted-5940561a-bc60-483b-b18f-1dc7f993cadb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.513 227766 INFO nova.compute.manager [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Took 0.27 seconds to detach 1 volumes for instance.#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.515 227766 DEBUG nova.compute.manager [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Deleting volume: 53917a40-f345-4f87-a3c1-5297194341d6 _cleanup_volumes /usr/lib/python3.9/site-packages/nova/compute/manager.py:3217#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.780 227766 DEBUG oslo_concurrency.lockutils [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.780 227766 DEBUG oslo_concurrency.lockutils [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.874 227766 DEBUG nova.scheduler.client.report [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.950 227766 DEBUG nova.scheduler.client.report [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:42:03 np0005593234 nova_compute[227762]: 2026-01-23 10:42:03.951 227766 DEBUG nova.compute.provider_tree [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:42:04 np0005593234 nova_compute[227762]: 2026-01-23 10:42:04.010 227766 DEBUG nova.scheduler.client.report [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:42:04 np0005593234 nova_compute[227762]: 2026-01-23 10:42:04.035 227766 DEBUG nova.scheduler.client.report [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:42:04 np0005593234 nova_compute[227762]: 2026-01-23 10:42:04.121 227766 DEBUG oslo_concurrency.processutils [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:42:04 np0005593234 nova_compute[227762]: 2026-01-23 10:42:04.419 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:42:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2244105419' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:42:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:42:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2244105419' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:42:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:42:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1922835936' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:42:04 np0005593234 nova_compute[227762]: 2026-01-23 10:42:04.539 227766 DEBUG oslo_concurrency.processutils [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:42:04 np0005593234 nova_compute[227762]: 2026-01-23 10:42:04.546 227766 DEBUG nova.compute.provider_tree [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:42:04 np0005593234 nova_compute[227762]: 2026-01-23 10:42:04.565 227766 DEBUG nova.scheduler.client.report [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:42:04 np0005593234 nova_compute[227762]: 2026-01-23 10:42:04.598 227766 DEBUG oslo_concurrency.lockutils [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:04 np0005593234 nova_compute[227762]: 2026-01-23 10:42:04.645 227766 INFO nova.scheduler.client.report [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Deleted allocations for instance 875a53a5-020f-4a4e-a0cf-bcfd254ba895#033[00m
Jan 23 05:42:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:04.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:04 np0005593234 nova_compute[227762]: 2026-01-23 10:42:04.744 227766 DEBUG oslo_concurrency.lockutils [None req-795c5c86-4d62-4cba-9049-83518578b699 eb70c3aee8b64273a1930c0c2c231aff d27c5465284b48a5818ef931d6251c43 - - default default] Lock "875a53a5-020f-4a4e-a0cf-bcfd254ba895" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.534s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e386 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:05.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:05 np0005593234 nova_compute[227762]: 2026-01-23 10:42:05.473 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:06.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:06 np0005593234 nova_compute[227762]: 2026-01-23 10:42:06.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:07.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:07 np0005593234 nova_compute[227762]: 2026-01-23 10:42:07.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:07 np0005593234 nova_compute[227762]: 2026-01-23 10:42:07.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:42:07 np0005593234 nova_compute[227762]: 2026-01-23 10:42:07.747 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:42:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e387 e387: 3 total, 3 up, 3 in
Jan 23 05:42:08 np0005593234 nova_compute[227762]: 2026-01-23 10:42:08.476 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:42:08 np0005593234 nova_compute[227762]: 2026-01-23 10:42:08.476 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:42:08 np0005593234 nova_compute[227762]: 2026-01-23 10:42:08.476 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:42:08 np0005593234 nova_compute[227762]: 2026-01-23 10:42:08.476 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 307f203d-cfc0-45a9-a0cd-3acee0ef7133 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:42:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:08.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:42:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:09.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:42:09 np0005593234 nova_compute[227762]: 2026-01-23 10:42:09.421 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:09 np0005593234 podman[322169]: 2026-01-23 10:42:09.79333728 +0000 UTC m=+0.085023966 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:42:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:42:09 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2551777174' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:42:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:42:09 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2551777174' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:42:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e387 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.166 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Updating instance_info_cache with network_info: [{"id": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "address": "fa:16:3e:9b:58:e7", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93cbf6f2-1b", "ovs_interfaceid": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.196 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.196 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.196 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.218 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.219 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.219 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.219 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.219 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.476 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:42:10 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1771989375' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.635 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.415s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:42:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:10.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.724 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.724 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.728 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.728 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.878 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.880 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3723MB free_disk=20.805683135986328GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.880 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:10 np0005593234 nova_compute[227762]: 2026-01-23 10:42:10.880 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:11 np0005593234 nova_compute[227762]: 2026-01-23 10:42:11.003 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 307f203d-cfc0-45a9-a0cd-3acee0ef7133 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:42:11 np0005593234 nova_compute[227762]: 2026-01-23 10:42:11.004 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance d0cea430-15ec-471d-963b-41fd4fa4777c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:42:11 np0005593234 nova_compute[227762]: 2026-01-23 10:42:11.004 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:42:11 np0005593234 nova_compute[227762]: 2026-01-23 10:42:11.004 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:42:11 np0005593234 nova_compute[227762]: 2026-01-23 10:42:11.111 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:42:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:11.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:42:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1628488826' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:42:12 np0005593234 nova_compute[227762]: 2026-01-23 10:42:12.244 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:42:12 np0005593234 nova_compute[227762]: 2026-01-23 10:42:12.250 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:42:12 np0005593234 nova_compute[227762]: 2026-01-23 10:42:12.275 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:42:12 np0005593234 nova_compute[227762]: 2026-01-23 10:42:12.319 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:42:12 np0005593234 nova_compute[227762]: 2026-01-23 10:42:12.319 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.439s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:42:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:12.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:42:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:13.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:13 np0005593234 nova_compute[227762]: 2026-01-23 10:42:13.868 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:13 np0005593234 nova_compute[227762]: 2026-01-23 10:42:13.869 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:14 np0005593234 nova_compute[227762]: 2026-01-23 10:42:14.423 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:14.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:14 np0005593234 nova_compute[227762]: 2026-01-23 10:42:14.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:14 np0005593234 nova_compute[227762]: 2026-01-23 10:42:14.789 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:14 np0005593234 nova_compute[227762]: 2026-01-23 10:42:14.790 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:42:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e387 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:42:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:15.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:42:15 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 23 05:42:15 np0005593234 nova_compute[227762]: 2026-01-23 10:42:15.452 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164920.4509444, 875a53a5-020f-4a4e-a0cf-bcfd254ba895 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:42:15 np0005593234 nova_compute[227762]: 2026-01-23 10:42:15.453 227766 INFO nova.compute.manager [-] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:42:15 np0005593234 nova_compute[227762]: 2026-01-23 10:42:15.480 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:15 np0005593234 nova_compute[227762]: 2026-01-23 10:42:15.504 227766 DEBUG nova.compute.manager [None req-dcbab684-b597-43a6-a76e-b1eac683003f - - - - - -] [instance: 875a53a5-020f-4a4e-a0cf-bcfd254ba895] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:42:15 np0005593234 nova_compute[227762]: 2026-01-23 10:42:15.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:42:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:16.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:42:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:17.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:18.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:42:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:19.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:42:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e388 e388: 3 total, 3 up, 3 in
Jan 23 05:42:19 np0005593234 nova_compute[227762]: 2026-01-23 10:42:19.426 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:20 np0005593234 nova_compute[227762]: 2026-01-23 10:42:20.505 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:20.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:21.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:22.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:22 np0005593234 nova_compute[227762]: 2026-01-23 10:42:22.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:23.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:24 np0005593234 nova_compute[227762]: 2026-01-23 10:42:24.428 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:24.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:25.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:25 np0005593234 nova_compute[227762]: 2026-01-23 10:42:25.506 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:26.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:27.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:28.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:29.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:42:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6000.5 total, 600.0 interval#012Cumulative writes: 70K writes, 283K keys, 70K commit groups, 1.0 writes per commit group, ingest: 0.29 GB, 0.05 MB/s#012Cumulative WAL: 70K writes, 26K syncs, 2.71 writes per sync, written: 0.29 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 6615 writes, 27K keys, 6615 commit groups, 1.0 writes per commit group, ingest: 29.91 MB, 0.05 MB/s#012Interval WAL: 6615 writes, 2392 syncs, 2.77 writes per sync, written: 0.03 GB, 0.05 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.31              0.00         1    0.311       0      0       0.0       0.0#012 Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.31              0.00         1    0.311       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.31              0.00         1    0.311       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.5 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.3 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bc37c4b350#2 capacity: 1.09 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000123978%) FilterBlock(3,0.33 KB,2.86102e-05%) IndexBlock(3,0.34 KB,2.99726e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **#012#012** Compaction Stats [m-0] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-0] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.5 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55bc37c4b350#2 capacity: 1.09 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 5e-05 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000123978%) FilterBlock(3,0.33 KB,2.86102e-05%) IndexBlock(3,0.34 KB,2.99726e-05%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [m-0] **#012#012** Compaction Stats [m-1] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0#012#012** Compaction Stats [m-1] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6000.5 total, 4800.0 interval#012Flush(GB): cumulative 0.000, interval 0.000#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 me
Jan 23 05:42:29 np0005593234 nova_compute[227762]: 2026-01-23 10:42:29.430 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:29 np0005593234 nova_compute[227762]: 2026-01-23 10:42:29.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:42:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:30 np0005593234 nova_compute[227762]: 2026-01-23 10:42:30.508 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:42:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:30.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:42:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:31.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:31 np0005593234 podman[322302]: 2026-01-23 10:42:31.754477893 +0000 UTC m=+0.046223495 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:42:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:32.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:33.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:34 np0005593234 nova_compute[227762]: 2026-01-23 10:42:34.432 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:34.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:35.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:35 np0005593234 nova_compute[227762]: 2026-01-23 10:42:35.510 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:36.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:37.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:38.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:39.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:39 np0005593234 nova_compute[227762]: 2026-01-23 10:42:39.433 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:40 np0005593234 nova_compute[227762]: 2026-01-23 10:42:40.512 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:40.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:40 np0005593234 podman[322330]: 2026-01-23 10:42:40.79383903 +0000 UTC m=+0.089736885 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:42:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:41.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:42.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:42.880 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:42:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:42.881 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:42:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:42.882 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:42:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:43.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:44 np0005593234 nova_compute[227762]: 2026-01-23 10:42:44.435 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:42:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2851214116' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:42:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:42:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2851214116' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:42:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:44.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:45.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:45 np0005593234 nova_compute[227762]: 2026-01-23 10:42:45.513 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:46.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:47.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:48.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:49.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:49 np0005593234 nova_compute[227762]: 2026-01-23 10:42:49.436 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:50 np0005593234 nova_compute[227762]: 2026-01-23 10:42:50.517 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:50.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:51.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:52.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:53.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:54 np0005593234 nova_compute[227762]: 2026-01-23 10:42:54.439 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:54.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:42:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:42:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:55.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:42:55 np0005593234 nova_compute[227762]: 2026-01-23 10:42:55.556 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:56.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:57.101 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=78, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=77) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:42:57 np0005593234 nova_compute[227762]: 2026-01-23 10:42:57.101 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:42:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:57.102 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:42:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:57.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:42:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:42:58.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:42:59 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:42:59.104 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '78'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:42:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:42:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:42:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:42:59.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:42:59 np0005593234 nova_compute[227762]: 2026-01-23 10:42:59.439 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:00 np0005593234 nova_compute[227762]: 2026-01-23 10:43:00.558 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:00.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:43:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:01.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:43:01 np0005593234 podman[322592]: 2026-01-23 10:43:01.308780028 +0000 UTC m=+0.459072364 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, CEPH_REF=reef, io.buildah.version=1.39.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 05:43:01 np0005593234 podman[322592]: 2026-01-23 10:43:01.407924766 +0000 UTC m=+0.558217092 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, CEPH_REF=reef, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:43:01 np0005593234 podman[322714]: 2026-01-23 10:43:01.900887138 +0000 UTC m=+0.057365464 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:43:02 np0005593234 podman[322759]: 2026-01-23 10:43:02.11541582 +0000 UTC m=+0.160674650 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 05:43:02 np0005593234 podman[322781]: 2026-01-23 10:43:02.192748976 +0000 UTC m=+0.056734073 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 05:43:02 np0005593234 podman[322759]: 2026-01-23 10:43:02.223949081 +0000 UTC m=+0.269207901 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 05:43:02 np0005593234 podman[322826]: 2026-01-23 10:43:02.491640055 +0000 UTC m=+0.093497102 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, architecture=x86_64, io.openshift.tags=Ceph keepalived, release=1793, version=2.2.4, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9)
Jan 23 05:43:02 np0005593234 podman[322846]: 2026-01-23 10:43:02.696775824 +0000 UTC m=+0.184994780 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.buildah.version=1.28.2, name=keepalived, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, com.redhat.component=keepalived-container, description=keepalived for Ceph, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, distribution-scope=public)
Jan 23 05:43:02 np0005593234 podman[322826]: 2026-01-23 10:43:02.703935868 +0000 UTC m=+0.305792915 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, distribution-scope=public, com.redhat.component=keepalived-container, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Jan 23 05:43:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:02.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #172. Immutable memtables: 0.
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:02.855215) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 109] Flushing memtable with next log file: 172
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164982855356, "job": 109, "event": "flush_started", "num_memtables": 1, "num_entries": 1645, "num_deletes": 254, "total_data_size": 3663062, "memory_usage": 3715816, "flush_reason": "Manual Compaction"}
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 109] Level-0 flush table #173: started
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164982878086, "cf_name": "default", "job": 109, "event": "table_file_creation", "file_number": 173, "file_size": 2406749, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 82297, "largest_seqno": 83937, "table_properties": {"data_size": 2399853, "index_size": 3966, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 15021, "raw_average_key_size": 20, "raw_value_size": 2385849, "raw_average_value_size": 3246, "num_data_blocks": 174, "num_entries": 735, "num_filter_entries": 735, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164849, "oldest_key_time": 1769164849, "file_creation_time": 1769164982, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 173, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 109] Flush lasted 22935 microseconds, and 6353 cpu microseconds.
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:02.878152) [db/flush_job.cc:967] [default] [JOB 109] Level-0 flush table #173: 2406749 bytes OK
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:02.878173) [db/memtable_list.cc:519] [default] Level-0 commit table #173 started
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:02.883429) [db/memtable_list.cc:722] [default] Level-0 commit table #173: memtable #1 done
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:02.883485) EVENT_LOG_v1 {"time_micros": 1769164982883473, "job": 109, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:02.883523) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 109] Try to delete WAL files size 3655445, prev total WAL file size 3676500, number of live WAL files 2.
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000169.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:02.885856) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037323739' seq:72057594037927935, type:22 .. '7061786F730037353331' seq:0, type:0; will stop at (end)
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 110] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 109 Base level 0, inputs: [173(2350KB)], [171(10MB)]
Jan 23 05:43:02 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164982886126, "job": 110, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [173], "files_L6": [171], "score": -1, "input_data_size": 13333626, "oldest_snapshot_seqno": -1}
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 110] Generated table #174: 10175 keys, 11379417 bytes, temperature: kUnknown
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164983043105, "cf_name": "default", "job": 110, "event": "table_file_creation", "file_number": 174, "file_size": 11379417, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11316332, "index_size": 36540, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25477, "raw_key_size": 268681, "raw_average_key_size": 26, "raw_value_size": 11140927, "raw_average_value_size": 1094, "num_data_blocks": 1383, "num_entries": 10175, "num_filter_entries": 10175, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769164982, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 174, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:03.043489) [db/compaction/compaction_job.cc:1663] [default] [JOB 110] Compacted 1@0 + 1@6 files to L6 => 11379417 bytes
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:03.046804) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 84.9 rd, 72.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 10.4 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(10.3) write-amplify(4.7) OK, records in: 10700, records dropped: 525 output_compression: NoCompression
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:03.046845) EVENT_LOG_v1 {"time_micros": 1769164983046830, "job": 110, "event": "compaction_finished", "compaction_time_micros": 157068, "compaction_time_cpu_micros": 31874, "output_level": 6, "num_output_files": 1, "total_output_size": 11379417, "num_input_records": 10700, "num_output_records": 10175, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000173.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164983047531, "job": 110, "event": "table_file_deletion", "file_number": 173}
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000171.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164983049703, "job": 110, "event": "table_file_deletion", "file_number": 171}
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:02.885656) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:03.049740) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:03.049744) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:03.049746) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:03.049747) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:43:03.049749) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:43:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:03.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:43:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:43:04 np0005593234 nova_compute[227762]: 2026-01-23 10:43:04.442 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:43:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:04.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:43:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:43:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:05.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:43:05 np0005593234 nova_compute[227762]: 2026-01-23 10:43:05.560 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:43:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:06.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:43:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:07.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:08 np0005593234 nova_compute[227762]: 2026-01-23 10:43:08.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:08.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:09.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:09 np0005593234 nova_compute[227762]: 2026-01-23 10:43:09.443 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:09 np0005593234 nova_compute[227762]: 2026-01-23 10:43:09.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:09 np0005593234 nova_compute[227762]: 2026-01-23 10:43:09.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:43:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:43:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:43:10 np0005593234 nova_compute[227762]: 2026-01-23 10:43:10.563 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:10.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:11.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:11 np0005593234 podman[323095]: 2026-01-23 10:43:11.793641065 +0000 UTC m=+0.088275189 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:43:11 np0005593234 nova_compute[227762]: 2026-01-23 10:43:11.857 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:43:11 np0005593234 nova_compute[227762]: 2026-01-23 10:43:11.857 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:43:11 np0005593234 nova_compute[227762]: 2026-01-23 10:43:11.857 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:43:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:12.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:13.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:14 np0005593234 nova_compute[227762]: 2026-01-23 10:43:14.445 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:14 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 05:43:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:14.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:14 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 23 05:43:15 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 23 05:43:15 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 23 05:43:15 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 23 05:43:15 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 23 05:43:15 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 23 05:43:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e388 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:15.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:15 np0005593234 nova_compute[227762]: 2026-01-23 10:43:15.564 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e389 e389: 3 total, 3 up, 3 in
Jan 23 05:43:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:43:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:16.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:43:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:17.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:18.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:19.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:19 np0005593234 nova_compute[227762]: 2026-01-23 10:43:19.478 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e389 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:20 np0005593234 nova_compute[227762]: 2026-01-23 10:43:20.565 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e390 e390: 3 total, 3 up, 3 in
Jan 23 05:43:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:20.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:43:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:21.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:43:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:22.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:23.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.062 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Updating instance_info_cache with network_info: [{"id": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "address": "fa:16:3e:b1:07:ce", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f13d72-f6", "ovs_interfaceid": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.094 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-d0cea430-15ec-471d-963b-41fd4fa4777c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.095 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.095 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.095 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.095 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.096 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.096 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.096 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.096 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.124 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.125 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.125 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.125 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.125 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.482 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:43:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3985677121' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.585 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.701 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.702 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.707 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.707 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c4 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:43:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:43:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:24.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.910 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.912 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3714MB free_disk=20.805469512939453GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.912 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:43:24 np0005593234 nova_compute[227762]: 2026-01-23 10:43:24.912 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:43:25 np0005593234 nova_compute[227762]: 2026-01-23 10:43:25.014 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 307f203d-cfc0-45a9-a0cd-3acee0ef7133 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:43:25 np0005593234 nova_compute[227762]: 2026-01-23 10:43:25.015 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance d0cea430-15ec-471d-963b-41fd4fa4777c actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:43:25 np0005593234 nova_compute[227762]: 2026-01-23 10:43:25.015 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:43:25 np0005593234 nova_compute[227762]: 2026-01-23 10:43:25.015 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:43:25 np0005593234 nova_compute[227762]: 2026-01-23 10:43:25.112 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:43:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:43:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:25.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:43:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:43:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3551182004' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:43:25 np0005593234 nova_compute[227762]: 2026-01-23 10:43:25.543 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:43:25 np0005593234 nova_compute[227762]: 2026-01-23 10:43:25.548 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:43:25 np0005593234 nova_compute[227762]: 2026-01-23 10:43:25.566 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:25 np0005593234 nova_compute[227762]: 2026-01-23 10:43:25.570 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:43:25 np0005593234 nova_compute[227762]: 2026-01-23 10:43:25.572 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:43:25 np0005593234 nova_compute[227762]: 2026-01-23 10:43:25.572 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:43:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:43:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:26.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:43:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:27.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:28.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:43:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:29.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:43:29 np0005593234 nova_compute[227762]: 2026-01-23 10:43:29.483 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:30 np0005593234 nova_compute[227762]: 2026-01-23 10:43:30.569 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:43:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:30.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:43:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:43:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:31.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:43:32 np0005593234 podman[323226]: 2026-01-23 10:43:32.761742912 +0000 UTC m=+0.053388869 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:43:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:32.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:33.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:34 np0005593234 nova_compute[227762]: 2026-01-23 10:43:34.485 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:43:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:34.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:43:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e390 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:35.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:35 np0005593234 nova_compute[227762]: 2026-01-23 10:43:35.570 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:36.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:37.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e391 e391: 3 total, 3 up, 3 in
Jan 23 05:43:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e392 e392: 3 total, 3 up, 3 in
Jan 23 05:43:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:43:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:38.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:43:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:39.252 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:39 np0005593234 nova_compute[227762]: 2026-01-23 10:43:39.487 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:40 np0005593234 nova_compute[227762]: 2026-01-23 10:43:40.573 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:40.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:43:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:41.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:43:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:43:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2969203655' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:43:42 np0005593234 podman[323249]: 2026-01-23 10:43:42.773432067 +0000 UTC m=+0.071776943 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:43:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:43:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:42.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:43:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:43:42.881 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:43:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:43:42.882 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:43:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:43:42.882 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:43:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:43.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:44 np0005593234 nova_compute[227762]: 2026-01-23 10:43:44.488 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:44.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e392 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:45.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:45 np0005593234 nova_compute[227762]: 2026-01-23 10:43:45.574 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:46 np0005593234 nova_compute[227762]: 2026-01-23 10:43:46.565 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:46.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e393 e393: 3 total, 3 up, 3 in
Jan 23 05:43:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:47.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:43:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3679783447' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:43:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:43:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3679783447' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:43:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:48.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e394 e394: 3 total, 3 up, 3 in
Jan 23 05:43:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:49.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.491 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_image_cache_manager_pass run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.745 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.746 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.746 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.register_storage_use.<locals>.do_register_storage_use" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.747 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "storage-registry-lock" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.747 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "storage-registry-lock" acquired by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.747 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "storage-registry-lock" "released" by "nova.virt.storage_users.get_storage_users.<locals>.do_get_storage_users" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.804 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Verify base images _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:314#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.804 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Image id 84c0ef19-7f67-4bd3-95d8-507c3e0942ed yields fingerprint a6f655456a04e1d13ef2e44ed4544c38917863a2 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.804 227766 INFO nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] image 84c0ef19-7f67-4bd3-95d8-507c3e0942ed at (/var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2): checking#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.805 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] image 84c0ef19-7f67-4bd3-95d8-507c3e0942ed at (/var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2): image is in use _mark_in_use /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:279#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.806 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Image id  yields fingerprint da39a3ee5e6b4b0d3255bfef95601890afd80709 _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:319#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.807 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] 307f203d-cfc0-45a9-a0cd-3acee0ef7133 is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.807 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] d0cea430-15ec-471d-963b-41fd4fa4777c is a valid instance name _list_backing_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:126#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.807 227766 WARNING nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.807 227766 WARNING nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.807 227766 WARNING nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Unknown base file: /var/lib/nova/instances/_base/db5146c726563dd06be5c3f5cc1141007148d79c#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.807 227766 INFO nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Active base files: /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.808 227766 INFO nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Removable base files: /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af /var/lib/nova/instances/_base/db5146c726563dd06be5c3f5cc1141007148d79c#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.808 227766 INFO nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.808 227766 INFO nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/042c073dd2256184660c2c54412f562524aad4af#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.808 227766 INFO nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Base, swap or ephemeral file too young to remove: /var/lib/nova/instances/_base/db5146c726563dd06be5c3f5cc1141007148d79c#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.808 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Verification complete _age_and_verify_cached_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:350#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.808 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Verify swap images _age_and_verify_swap_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:299#033[00m
Jan 23 05:43:49 np0005593234 nova_compute[227762]: 2026-01-23 10:43:49.809 227766 DEBUG nova.virt.libvirt.imagecache [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Verify ephemeral images _age_and_verify_ephemeral_images /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagecache.py:284#033[00m
Jan 23 05:43:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e394 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:50 np0005593234 nova_compute[227762]: 2026-01-23 10:43:50.577 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:50.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:51.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:52.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:53.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e395 e395: 3 total, 3 up, 3 in
Jan 23 05:43:54 np0005593234 nova_compute[227762]: 2026-01-23 10:43:54.493 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:54.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e395 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:43:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:55.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:55 np0005593234 nova_compute[227762]: 2026-01-23 10:43:55.580 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:43:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:56.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:43:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:57.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:43:58.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:43:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:43:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:43:59.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:43:59 np0005593234 nova_compute[227762]: 2026-01-23 10:43:59.494 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:43:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e396 e396: 3 total, 3 up, 3 in
Jan 23 05:44:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:00 np0005593234 nova_compute[227762]: 2026-01-23 10:44:00.582 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:00.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:44:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:01.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:44:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:02.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:03.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:03 np0005593234 podman[323335]: 2026-01-23 10:44:03.749962078 +0000 UTC m=+0.042781437 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:44:04 np0005593234 nova_compute[227762]: 2026-01-23 10:44:04.496 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:44:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:04.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:44:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:05.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:44:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2409468524' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:44:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:44:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2409468524' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:44:05 np0005593234 nova_compute[227762]: 2026-01-23 10:44:05.583 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.324 227766 DEBUG oslo_concurrency.lockutils [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "d0cea430-15ec-471d-963b-41fd4fa4777c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.325 227766 DEBUG oslo_concurrency.lockutils [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.326 227766 DEBUG oslo_concurrency.lockutils [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.326 227766 DEBUG oslo_concurrency.lockutils [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.326 227766 DEBUG oslo_concurrency.lockutils [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.328 227766 INFO nova.compute.manager [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Terminating instance#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.329 227766 DEBUG nova.compute.manager [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:44:06 np0005593234 kernel: tap50f13d72-f6 (unregistering): left promiscuous mode
Jan 23 05:44:06 np0005593234 NetworkManager[48942]: <info>  [1769165046.3916] device (tap50f13d72-f6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:44:06 np0005593234 ovn_controller[134547]: 2026-01-23T10:44:06Z|00855|binding|INFO|Releasing lport 50f13d72-f6d6-4b3a-8853-76d0a2f50240 from this chassis (sb_readonly=0)
Jan 23 05:44:06 np0005593234 ovn_controller[134547]: 2026-01-23T10:44:06Z|00856|binding|INFO|Setting lport 50f13d72-f6d6-4b3a-8853-76d0a2f50240 down in Southbound
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.403 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:06 np0005593234 ovn_controller[134547]: 2026-01-23T10:44:06Z|00857|binding|INFO|Removing iface tap50f13d72-f6 ovn-installed in OVS
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.404 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.416 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:06.422 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b1:07:ce 10.100.0.3'], port_security=['fa:16:3e:b1:07:ce 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'd0cea430-15ec-471d-963b-41fd4fa4777c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e762fca3b634c7aa1d994314c059c54', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ed138636-f650-4a09-b808-0b05f9067a5a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0936335-b706-4400-8411-bdd084c8cdf7, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=50f13d72-f6d6-4b3a-8853-76d0a2f50240) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:44:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:06.423 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 50f13d72-f6d6-4b3a-8853-76d0a2f50240 in datapath fba2ba4a-d82c-4f8b-9754-c13fbec41a04 unbound from our chassis#033[00m
Jan 23 05:44:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:06.424 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fba2ba4a-d82c-4f8b-9754-c13fbec41a04#033[00m
Jan 23 05:44:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:06.443 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d73a62-24da-4c2f-82fb-1e8a25062c29]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:06 np0005593234 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000c4.scope: Deactivated successfully.
Jan 23 05:44:06 np0005593234 systemd[1]: machine-qemu\x2d94\x2dinstance\x2d000000c4.scope: Consumed 25.087s CPU time.
Jan 23 05:44:06 np0005593234 systemd-machined[195626]: Machine qemu-94-instance-000000c4 terminated.
Jan 23 05:44:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:06.476 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[dd033041-fae1-485e-810f-6f3fcad911ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:06.480 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4dc550-d77a-4acb-9104-dc9979167729]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:06.506 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[6a8d4694-d6df-4656-90d3-1c518c8d2137]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:06.522 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[763b25a3-b083-4b46-846e-7fe832f1124b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfba2ba4a-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:27:db:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 7, 'rx_bytes': 700, 'tx_bytes': 438, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 253], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 860063, 'reachable_time': 42218, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 6, 'inoctets': 448, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 228, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 6, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 448, 'outmcastoctets': 228, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 6, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 323419, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:06.535 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[80e54c2a-684e-4a1a-ae9c-0981a89d43da]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapfba2ba4a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 860074, 'tstamp': 860074}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 323420, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapfba2ba4a-d1'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 860077, 'tstamp': 860077}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 323420, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:06.538 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfba2ba4a-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.539 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.545 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:06.546 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfba2ba4a-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:44:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:06.547 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:44:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:06.547 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfba2ba4a-d0, col_values=(('external_ids', {'iface-id': '2348ddba-3dc3-4456-a637-f3065ba0d8f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:44:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:06.548 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.568 227766 INFO nova.virt.libvirt.driver [-] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Instance destroyed successfully.#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.569 227766 DEBUG nova.objects.instance [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'resources' on Instance uuid d0cea430-15ec-471d-963b-41fd4fa4777c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.630 227766 DEBUG nova.virt.libvirt.vif [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:38:58Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='multiattach-server-1',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='multiattach-server-1',id=196,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBP3FfIOd2lnI+tPBfDtyl7+3bVUJP3jvoQEZS2+zpCm94FEzq78d4QEW/4ixP6N6S+NwXEvQperhCcfeORiYVMygQWeTqWJgqUherQ/1aiNrcs4OJRb36XBDXhjh6k5P/Q==',key_name='tempest-keypair-529522234',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:39:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e762fca3b634c7aa1d994314c059c54',ramdisk_id='',reservation_id='r-gcdmevgb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',
image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeMultiAttachTest-63035580',owner_user_name='tempest-AttachVolumeMultiAttachTest-63035580-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:39:11Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='93cd560e84264023877c47122b5919de',uuid=d0cea430-15ec-471d-963b-41fd4fa4777c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "address": "fa:16:3e:b1:07:ce", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f13d72-f6", "ovs_interfaceid": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.630 227766 DEBUG nova.network.os_vif_util [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converting VIF {"id": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "address": "fa:16:3e:b1:07:ce", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50f13d72-f6", "ovs_interfaceid": "50f13d72-f6d6-4b3a-8853-76d0a2f50240", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.631 227766 DEBUG nova.network.os_vif_util [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b1:07:ce,bridge_name='br-int',has_traffic_filtering=True,id=50f13d72-f6d6-4b3a-8853-76d0a2f50240,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f13d72-f6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.631 227766 DEBUG os_vif [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:07:ce,bridge_name='br-int',has_traffic_filtering=True,id=50f13d72-f6d6-4b3a-8853-76d0a2f50240,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f13d72-f6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.633 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.633 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50f13d72-f6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.635 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.637 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:44:06 np0005593234 nova_compute[227762]: 2026-01-23 10:44:06.639 227766 INFO os_vif [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b1:07:ce,bridge_name='br-int',has_traffic_filtering=True,id=50f13d72-f6d6-4b3a-8853-76d0a2f50240,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50f13d72-f6')#033[00m
Jan 23 05:44:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:06.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:07 np0005593234 nova_compute[227762]: 2026-01-23 10:44:07.115 227766 INFO nova.virt.libvirt.driver [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Deleting instance files /var/lib/nova/instances/d0cea430-15ec-471d-963b-41fd4fa4777c_del#033[00m
Jan 23 05:44:07 np0005593234 nova_compute[227762]: 2026-01-23 10:44:07.115 227766 INFO nova.virt.libvirt.driver [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Deletion of /var/lib/nova/instances/d0cea430-15ec-471d-963b-41fd4fa4777c_del complete#033[00m
Jan 23 05:44:07 np0005593234 nova_compute[227762]: 2026-01-23 10:44:07.218 227766 INFO nova.compute.manager [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Took 0.89 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:44:07 np0005593234 nova_compute[227762]: 2026-01-23 10:44:07.219 227766 DEBUG oslo.service.loopingcall [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:44:07 np0005593234 nova_compute[227762]: 2026-01-23 10:44:07.219 227766 DEBUG nova.compute.manager [-] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:44:07 np0005593234 nova_compute[227762]: 2026-01-23 10:44:07.219 227766 DEBUG nova.network.neutron [-] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:44:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:44:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:07.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:44:07 np0005593234 nova_compute[227762]: 2026-01-23 10:44:07.351 227766 DEBUG nova.compute.manager [req-d4f61345-da60-408d-9848-e098744c6b4b req-c63cc6f4-4007-43bf-8417-f8903a476a32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Received event network-vif-unplugged-50f13d72-f6d6-4b3a-8853-76d0a2f50240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:44:07 np0005593234 nova_compute[227762]: 2026-01-23 10:44:07.351 227766 DEBUG oslo_concurrency.lockutils [req-d4f61345-da60-408d-9848-e098744c6b4b req-c63cc6f4-4007-43bf-8417-f8903a476a32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:07 np0005593234 nova_compute[227762]: 2026-01-23 10:44:07.352 227766 DEBUG oslo_concurrency.lockutils [req-d4f61345-da60-408d-9848-e098744c6b4b req-c63cc6f4-4007-43bf-8417-f8903a476a32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:07 np0005593234 nova_compute[227762]: 2026-01-23 10:44:07.352 227766 DEBUG oslo_concurrency.lockutils [req-d4f61345-da60-408d-9848-e098744c6b4b req-c63cc6f4-4007-43bf-8417-f8903a476a32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:07 np0005593234 nova_compute[227762]: 2026-01-23 10:44:07.352 227766 DEBUG nova.compute.manager [req-d4f61345-da60-408d-9848-e098744c6b4b req-c63cc6f4-4007-43bf-8417-f8903a476a32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] No waiting events found dispatching network-vif-unplugged-50f13d72-f6d6-4b3a-8853-76d0a2f50240 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:44:07 np0005593234 nova_compute[227762]: 2026-01-23 10:44:07.352 227766 DEBUG nova.compute.manager [req-d4f61345-da60-408d-9848-e098744c6b4b req-c63cc6f4-4007-43bf-8417-f8903a476a32 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Received event network-vif-unplugged-50f13d72-f6d6-4b3a-8853-76d0a2f50240 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:08.607 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=79, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=78) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:44:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:08.608 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:44:08 np0005593234 nova_compute[227762]: 2026-01-23 10:44:08.645 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:08 np0005593234 nova_compute[227762]: 2026-01-23 10:44:08.659 227766 DEBUG nova.network.neutron [-] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:44:08 np0005593234 nova_compute[227762]: 2026-01-23 10:44:08.678 227766 INFO nova.compute.manager [-] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Took 1.46 seconds to deallocate network for instance.#033[00m
Jan 23 05:44:08 np0005593234 nova_compute[227762]: 2026-01-23 10:44:08.742 227766 DEBUG oslo_concurrency.lockutils [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:08 np0005593234 nova_compute[227762]: 2026-01-23 10:44:08.742 227766 DEBUG oslo_concurrency.lockutils [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:08 np0005593234 nova_compute[227762]: 2026-01-23 10:44:08.764 227766 DEBUG nova.compute.manager [req-d4dd9c9a-a5fc-46ed-aa19-dffaf4c43135 req-99f076a0-f235-42a0-8cfc-49ad9deee5bb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Received event network-vif-deleted-50f13d72-f6d6-4b3a-8853-76d0a2f50240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:44:08 np0005593234 nova_compute[227762]: 2026-01-23 10:44:08.821 227766 DEBUG oslo_concurrency.processutils [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:44:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:44:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:08.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:44:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:09.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:44:09 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3447324059' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:44:09 np0005593234 nova_compute[227762]: 2026-01-23 10:44:09.325 227766 DEBUG oslo_concurrency.processutils [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:44:09 np0005593234 nova_compute[227762]: 2026-01-23 10:44:09.333 227766 DEBUG nova.compute.provider_tree [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:44:09 np0005593234 nova_compute[227762]: 2026-01-23 10:44:09.499 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:09 np0005593234 nova_compute[227762]: 2026-01-23 10:44:09.731 227766 DEBUG nova.scheduler.client.report [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:44:09 np0005593234 nova_compute[227762]: 2026-01-23 10:44:09.809 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:09 np0005593234 nova_compute[227762]: 2026-01-23 10:44:09.809 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:44:09 np0005593234 nova_compute[227762]: 2026-01-23 10:44:09.810 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:44:10 np0005593234 nova_compute[227762]: 2026-01-23 10:44:10.121 227766 DEBUG nova.compute.manager [req-4bc43448-51e3-4cf3-b185-dbc236468a70 req-fd95d3e2-ea05-4c10-941a-491b2c7e17f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Received event network-vif-plugged-50f13d72-f6d6-4b3a-8853-76d0a2f50240 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:44:10 np0005593234 nova_compute[227762]: 2026-01-23 10:44:10.122 227766 DEBUG oslo_concurrency.lockutils [req-4bc43448-51e3-4cf3-b185-dbc236468a70 req-fd95d3e2-ea05-4c10-941a-491b2c7e17f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:10 np0005593234 nova_compute[227762]: 2026-01-23 10:44:10.123 227766 DEBUG oslo_concurrency.lockutils [req-4bc43448-51e3-4cf3-b185-dbc236468a70 req-fd95d3e2-ea05-4c10-941a-491b2c7e17f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:10 np0005593234 nova_compute[227762]: 2026-01-23 10:44:10.123 227766 DEBUG oslo_concurrency.lockutils [req-4bc43448-51e3-4cf3-b185-dbc236468a70 req-fd95d3e2-ea05-4c10-941a-491b2c7e17f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:10 np0005593234 nova_compute[227762]: 2026-01-23 10:44:10.124 227766 DEBUG nova.compute.manager [req-4bc43448-51e3-4cf3-b185-dbc236468a70 req-fd95d3e2-ea05-4c10-941a-491b2c7e17f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] No waiting events found dispatching network-vif-plugged-50f13d72-f6d6-4b3a-8853-76d0a2f50240 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:44:10 np0005593234 nova_compute[227762]: 2026-01-23 10:44:10.124 227766 WARNING nova.compute.manager [req-4bc43448-51e3-4cf3-b185-dbc236468a70 req-fd95d3e2-ea05-4c10-941a-491b2c7e17f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Received unexpected event network-vif-plugged-50f13d72-f6d6-4b3a-8853-76d0a2f50240 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:44:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:10.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:11 np0005593234 nova_compute[227762]: 2026-01-23 10:44:11.271 227766 DEBUG oslo_concurrency.lockutils [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 2.529s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:11.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:11 np0005593234 nova_compute[227762]: 2026-01-23 10:44:11.635 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:44:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:44:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:44:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:44:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:44:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:44:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:12.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:44:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:13.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:13 np0005593234 podman[323610]: 2026-01-23 10:44:13.796560455 +0000 UTC m=+0.088060583 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:44:14 np0005593234 nova_compute[227762]: 2026-01-23 10:44:14.501 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:14 np0005593234 nova_compute[227762]: 2026-01-23 10:44:14.642 227766 INFO nova.scheduler.client.report [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Deleted allocations for instance d0cea430-15ec-471d-963b-41fd4fa4777c#033[00m
Jan 23 05:44:14 np0005593234 nova_compute[227762]: 2026-01-23 10:44:14.672 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:44:14 np0005593234 nova_compute[227762]: 2026-01-23 10:44:14.672 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:44:14 np0005593234 nova_compute[227762]: 2026-01-23 10:44:14.672 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:44:14 np0005593234 nova_compute[227762]: 2026-01-23 10:44:14.673 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 307f203d-cfc0-45a9-a0cd-3acee0ef7133 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:44:14 np0005593234 nova_compute[227762]: 2026-01-23 10:44:14.809 227766 DEBUG oslo_concurrency.lockutils [None req-43a1cb82-9d00-417c-b5fd-84def2fd974e 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "d0cea430-15ec-471d-963b-41fd4fa4777c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 8.484s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:14.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:15.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:16 np0005593234 nova_compute[227762]: 2026-01-23 10:44:16.638 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:16.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:44:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:17.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.339 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Updating instance_info_cache with network_info: [{"id": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "address": "fa:16:3e:9b:58:e7", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93cbf6f2-1b", "ovs_interfaceid": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.415 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-307f203d-cfc0-45a9-a0cd-3acee0ef7133" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.415 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.415 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.415 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.416 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.416 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.416 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.416 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.416 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.472 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.472 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.472 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.473 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.473 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:44:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:18.610 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '79'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:44:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:44:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:18.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:44:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:44:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1254656893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:44:18 np0005593234 nova_compute[227762]: 2026-01-23 10:44:18.926 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.088 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.089 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000c0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.231 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.232 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3892MB free_disk=20.972766876220703GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.232 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.232 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:19.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.335 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 307f203d-cfc0-45a9-a0cd-3acee0ef7133 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.336 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.336 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.370 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:44:19 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:44:19 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.504 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:44:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/640681781' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.795 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.800 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.819 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.974 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:44:19 np0005593234 nova_compute[227762]: 2026-01-23 10:44:19.975 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:20.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:21.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:21 np0005593234 nova_compute[227762]: 2026-01-23 10:44:21.567 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165046.5656853, d0cea430-15ec-471d-963b-41fd4fa4777c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:44:21 np0005593234 nova_compute[227762]: 2026-01-23 10:44:21.567 227766 INFO nova.compute.manager [-] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:44:21 np0005593234 nova_compute[227762]: 2026-01-23 10:44:21.639 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:21 np0005593234 nova_compute[227762]: 2026-01-23 10:44:21.650 227766 DEBUG nova.compute.manager [None req-6b56f2c0-d1d1-4a7c-8c26-a88c44d579f7 - - - - - -] [instance: d0cea430-15ec-471d-963b-41fd4fa4777c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:44:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:22.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:23.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.415 227766 DEBUG oslo_concurrency.lockutils [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.415 227766 DEBUG oslo_concurrency.lockutils [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.416 227766 DEBUG oslo_concurrency.lockutils [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.416 227766 DEBUG oslo_concurrency.lockutils [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.417 227766 DEBUG oslo_concurrency.lockutils [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.419 227766 INFO nova.compute.manager [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Terminating instance#033[00m
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.421 227766 DEBUG nova.compute.manager [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:44:24 np0005593234 kernel: tap93cbf6f2-1b (unregistering): left promiscuous mode
Jan 23 05:44:24 np0005593234 NetworkManager[48942]: <info>  [1769165064.4644] device (tap93cbf6f2-1b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:44:24 np0005593234 ovn_controller[134547]: 2026-01-23T10:44:24Z|00858|binding|INFO|Releasing lport 93cbf6f2-1b0c-4fcf-b194-5f85394193db from this chassis (sb_readonly=0)
Jan 23 05:44:24 np0005593234 ovn_controller[134547]: 2026-01-23T10:44:24Z|00859|binding|INFO|Setting lport 93cbf6f2-1b0c-4fcf-b194-5f85394193db down in Southbound
Jan 23 05:44:24 np0005593234 ovn_controller[134547]: 2026-01-23T10:44:24Z|00860|binding|INFO|Removing iface tap93cbf6f2-1b ovn-installed in OVS
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.480 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:24.491 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:58:e7 10.100.0.12'], port_security=['fa:16:3e:9b:58:e7 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '307f203d-cfc0-45a9-a0cd-3acee0ef7133', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '6e762fca3b634c7aa1d994314c059c54', 'neutron:revision_number': '4', 'neutron:security_group_ids': '48274d25-9599-424c-bfd1-ff8c0b4eb8cc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a0936335-b706-4400-8411-bdd084c8cdf7, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=93cbf6f2-1b0c-4fcf-b194-5f85394193db) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:44:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:24.493 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 93cbf6f2-1b0c-4fcf-b194-5f85394193db in datapath fba2ba4a-d82c-4f8b-9754-c13fbec41a04 unbound from our chassis#033[00m
Jan 23 05:44:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:24.495 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fba2ba4a-d82c-4f8b-9754-c13fbec41a04, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:44:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:24.496 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ad422f3c-60dc-41f2-a9d7-407027e47193]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:44:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:24.497 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04 namespace which is not needed anymore#033[00m
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.524 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:24 np0005593234 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000c0.scope: Deactivated successfully.
Jan 23 05:44:24 np0005593234 systemd[1]: machine-qemu\x2d93\x2dinstance\x2d000000c0.scope: Consumed 27.819s CPU time.
Jan 23 05:44:24 np0005593234 systemd-machined[195626]: Machine qemu-93-instance-000000c0 terminated.
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.658 227766 INFO nova.virt.libvirt.driver [-] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Instance destroyed successfully.#033[00m
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.658 227766 DEBUG nova.objects.instance [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lazy-loading 'resources' on Instance uuid 307f203d-cfc0-45a9-a0cd-3acee0ef7133 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.674 227766 DEBUG nova.virt.libvirt.vif [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:38:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeMultiAttachTest-server-1180367469',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumemultiattachtest-server-1180367469',id=192,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:38:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='6e762fca3b634c7aa1d994314c059c54',ramdisk_id='',reservation_id='r-1lyck9l7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-AttachVolumeMultiAttachTest-63035580',owner_user_name='tempest-AttachVolumeMultiAttachTest-63035580-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:38:26Z,user_data=None,user_id='93cd560e84264023877c47122b5919de',uuid=307f203d-cfc0-45a9-a0cd-3acee0ef7133,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "address": "fa:16:3e:9b:58:e7", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93cbf6f2-1b", "ovs_interfaceid": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.674 227766 DEBUG nova.network.os_vif_util [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converting VIF {"id": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "address": "fa:16:3e:9b:58:e7", "network": {"id": "fba2ba4a-d82c-4f8b-9754-c13fbec41a04", "bridge": "br-int", "label": "tempest-AttachVolumeMultiAttachTest-1976243271-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "6e762fca3b634c7aa1d994314c059c54", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap93cbf6f2-1b", "ovs_interfaceid": "93cbf6f2-1b0c-4fcf-b194-5f85394193db", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.675 227766 DEBUG nova.network.os_vif_util [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9b:58:e7,bridge_name='br-int',has_traffic_filtering=True,id=93cbf6f2-1b0c-4fcf-b194-5f85394193db,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93cbf6f2-1b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.675 227766 DEBUG os_vif [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:58:e7,bridge_name='br-int',has_traffic_filtering=True,id=93cbf6f2-1b0c-4fcf-b194-5f85394193db,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93cbf6f2-1b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.677 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.677 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap93cbf6f2-1b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.679 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.681 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.683 227766 INFO os_vif [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:58:e7,bridge_name='br-int',has_traffic_filtering=True,id=93cbf6f2-1b0c-4fcf-b194-5f85394193db,network=Network(fba2ba4a-d82c-4f8b-9754-c13fbec41a04),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap93cbf6f2-1b')
Jan 23 05:44:24 np0005593234 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[319179]: [NOTICE]   (319183) : haproxy version is 2.8.14-c23fe91
Jan 23 05:44:24 np0005593234 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[319179]: [NOTICE]   (319183) : path to executable is /usr/sbin/haproxy
Jan 23 05:44:24 np0005593234 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[319179]: [WARNING]  (319183) : Exiting Master process...
Jan 23 05:44:24 np0005593234 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[319179]: [WARNING]  (319183) : Exiting Master process...
Jan 23 05:44:24 np0005593234 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[319179]: [ALERT]    (319183) : Current worker (319189) exited with code 143 (Terminated)
Jan 23 05:44:24 np0005593234 neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04[319179]: [WARNING]  (319183) : All workers exited. Exiting... (0)
Jan 23 05:44:24 np0005593234 systemd[1]: libpod-a4c5e55c46e8eac7a1419e0106527a2ed13540afda10db4937667dcd63447f36.scope: Deactivated successfully.
Jan 23 05:44:24 np0005593234 podman[323762]: 2026-01-23 10:44:24.701403107 +0000 UTC m=+0.097995933 container died a4c5e55c46e8eac7a1419e0106527a2ed13540afda10db4937667dcd63447f36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:44:24 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a4c5e55c46e8eac7a1419e0106527a2ed13540afda10db4937667dcd63447f36-userdata-shm.mount: Deactivated successfully.
Jan 23 05:44:24 np0005593234 systemd[1]: var-lib-containers-storage-overlay-d5a339b19b6ee5ecb76dff9da654c6e1b4afa3c7fb76d06945c9fb62ea2941b3-merged.mount: Deactivated successfully.
Jan 23 05:44:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:44:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:24.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.905 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:44:24 np0005593234 nova_compute[227762]: 2026-01-23 10:44:24.927 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:44:24 np0005593234 podman[323762]: 2026-01-23 10:44:24.988966771 +0000 UTC m=+0.385559587 container cleanup a4c5e55c46e8eac7a1419e0106527a2ed13540afda10db4937667dcd63447f36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:44:24 np0005593234 systemd[1]: libpod-conmon-a4c5e55c46e8eac7a1419e0106527a2ed13540afda10db4937667dcd63447f36.scope: Deactivated successfully.
Jan 23 05:44:25 np0005593234 nova_compute[227762]: 2026-01-23 10:44:25.255 227766 INFO nova.virt.libvirt.driver [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Deleting instance files /var/lib/nova/instances/307f203d-cfc0-45a9-a0cd-3acee0ef7133_del
Jan 23 05:44:25 np0005593234 nova_compute[227762]: 2026-01-23 10:44:25.255 227766 INFO nova.virt.libvirt.driver [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Deletion of /var/lib/nova/instances/307f203d-cfc0-45a9-a0cd-3acee0ef7133_del complete
Jan 23 05:44:25 np0005593234 podman[323823]: 2026-01-23 10:44:25.291355789 +0000 UTC m=+0.273669121 container remove a4c5e55c46e8eac7a1419e0106527a2ed13540afda10db4937667dcd63447f36 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:44:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:44:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:25.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:44:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:25.298 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cdab21f1-f72f-4141-aaed-02178797fb2c]: (4, ('Fri Jan 23 10:44:24 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04 (a4c5e55c46e8eac7a1419e0106527a2ed13540afda10db4937667dcd63447f36)\na4c5e55c46e8eac7a1419e0106527a2ed13540afda10db4937667dcd63447f36\nFri Jan 23 10:44:24 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04 (a4c5e55c46e8eac7a1419e0106527a2ed13540afda10db4937667dcd63447f36)\na4c5e55c46e8eac7a1419e0106527a2ed13540afda10db4937667dcd63447f36\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:44:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:25.301 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[17639a1e-07f4-45be-bded-71d1998b7bd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:44:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:25.303 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfba2ba4a-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:44:25 np0005593234 nova_compute[227762]: 2026-01-23 10:44:25.306 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:44:25 np0005593234 kernel: tapfba2ba4a-d0: left promiscuous mode
Jan 23 05:44:25 np0005593234 nova_compute[227762]: 2026-01-23 10:44:25.318 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:44:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:25.321 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[79ac3c9d-0154-4cc6-9251-0292ffa59771]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:44:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:25.343 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[658b1256-8689-4919-a8d4-e13351b2f407]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:44:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:25.344 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[226e8e78-50b2-428e-9b8c-f4f4eaec2674]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:44:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:25.359 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[47ee38ee-3e24-4723-98e8-f6962e4517e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 860057, 'reachable_time': 21342, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 323889, 'error': None, 'target': 'ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:44:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:25.362 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fba2ba4a-d82c-4f8b-9754-c13fbec41a04 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 05:44:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:25.362 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[5c93480d-f8f3-4cfa-bc63-797ee98e0ff7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 05:44:25 np0005593234 systemd[1]: run-netns-ovnmeta\x2dfba2ba4a\x2dd82c\x2d4f8b\x2d9754\x2dc13fbec41a04.mount: Deactivated successfully.
Jan 23 05:44:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e396 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:25 np0005593234 nova_compute[227762]: 2026-01-23 10:44:25.951 227766 INFO nova.compute.manager [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Took 1.53 seconds to destroy the instance on the hypervisor.
Jan 23 05:44:25 np0005593234 nova_compute[227762]: 2026-01-23 10:44:25.951 227766 DEBUG oslo.service.loopingcall [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 05:44:25 np0005593234 nova_compute[227762]: 2026-01-23 10:44:25.952 227766 DEBUG nova.compute.manager [-] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 05:44:25 np0005593234 nova_compute[227762]: 2026-01-23 10:44:25.952 227766 DEBUG nova.network.neutron [-] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 05:44:26 np0005593234 nova_compute[227762]: 2026-01-23 10:44:26.260 227766 DEBUG nova.compute.manager [req-6ac72364-c930-4fc9-bbd3-31da2d557d50 req-8a6cf5aa-1dfe-4443-bd6f-809fb90c148a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Received event network-vif-unplugged-93cbf6f2-1b0c-4fcf-b194-5f85394193db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:44:26 np0005593234 nova_compute[227762]: 2026-01-23 10:44:26.260 227766 DEBUG oslo_concurrency.lockutils [req-6ac72364-c930-4fc9-bbd3-31da2d557d50 req-8a6cf5aa-1dfe-4443-bd6f-809fb90c148a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:44:26 np0005593234 nova_compute[227762]: 2026-01-23 10:44:26.261 227766 DEBUG oslo_concurrency.lockutils [req-6ac72364-c930-4fc9-bbd3-31da2d557d50 req-8a6cf5aa-1dfe-4443-bd6f-809fb90c148a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:44:26 np0005593234 nova_compute[227762]: 2026-01-23 10:44:26.261 227766 DEBUG oslo_concurrency.lockutils [req-6ac72364-c930-4fc9-bbd3-31da2d557d50 req-8a6cf5aa-1dfe-4443-bd6f-809fb90c148a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:44:26 np0005593234 nova_compute[227762]: 2026-01-23 10:44:26.261 227766 DEBUG nova.compute.manager [req-6ac72364-c930-4fc9-bbd3-31da2d557d50 req-8a6cf5aa-1dfe-4443-bd6f-809fb90c148a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] No waiting events found dispatching network-vif-unplugged-93cbf6f2-1b0c-4fcf-b194-5f85394193db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:44:26 np0005593234 nova_compute[227762]: 2026-01-23 10:44:26.261 227766 DEBUG nova.compute.manager [req-6ac72364-c930-4fc9-bbd3-31da2d557d50 req-8a6cf5aa-1dfe-4443-bd6f-809fb90c148a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Received event network-vif-unplugged-93cbf6f2-1b0c-4fcf-b194-5f85394193db for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 05:44:26 np0005593234 nova_compute[227762]: 2026-01-23 10:44:26.821 227766 DEBUG nova.network.neutron [-] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:44:26 np0005593234 nova_compute[227762]: 2026-01-23 10:44:26.845 227766 INFO nova.compute.manager [-] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Took 0.89 seconds to deallocate network for instance.
Jan 23 05:44:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:44:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:26.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:44:26 np0005593234 nova_compute[227762]: 2026-01-23 10:44:26.896 227766 DEBUG nova.compute.manager [req-fe541bc7-8a83-4c42-a3ae-b202a593de12 req-23be8846-1dba-4dce-b750-ad6a1ac72049 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Received event network-vif-deleted-93cbf6f2-1b0c-4fcf-b194-5f85394193db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:44:27 np0005593234 nova_compute[227762]: 2026-01-23 10:44:27.016 227766 INFO nova.compute.manager [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Took 0.17 seconds to detach 1 volumes for instance.
Jan 23 05:44:27 np0005593234 nova_compute[227762]: 2026-01-23 10:44:27.064 227766 DEBUG oslo_concurrency.lockutils [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:44:27 np0005593234 nova_compute[227762]: 2026-01-23 10:44:27.065 227766 DEBUG oslo_concurrency.lockutils [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:44:27 np0005593234 nova_compute[227762]: 2026-01-23 10:44:27.123 227766 DEBUG oslo_concurrency.processutils [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 05:44:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:27.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:44:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/346932007' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:44:27 np0005593234 nova_compute[227762]: 2026-01-23 10:44:27.618 227766 DEBUG oslo_concurrency.processutils [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.495s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 05:44:27 np0005593234 nova_compute[227762]: 2026-01-23 10:44:27.625 227766 DEBUG nova.compute.provider_tree [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 05:44:27 np0005593234 nova_compute[227762]: 2026-01-23 10:44:27.643 227766 DEBUG nova.scheduler.client.report [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 05:44:27 np0005593234 nova_compute[227762]: 2026-01-23 10:44:27.666 227766 DEBUG oslo_concurrency.lockutils [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:44:27 np0005593234 nova_compute[227762]: 2026-01-23 10:44:27.716 227766 INFO nova.scheduler.client.report [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Deleted allocations for instance 307f203d-cfc0-45a9-a0cd-3acee0ef7133
Jan 23 05:44:27 np0005593234 nova_compute[227762]: 2026-01-23 10:44:27.817 227766 DEBUG oslo_concurrency.lockutils [None req-102f8ab3-9fa2-46be-848d-0d7a3c710b5f 93cd560e84264023877c47122b5919de 6e762fca3b634c7aa1d994314c059c54 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.402s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:44:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:44:28 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3913700069' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:44:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:44:28 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3913700069' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:44:28 np0005593234 nova_compute[227762]: 2026-01-23 10:44:28.403 227766 DEBUG nova.compute.manager [req-3e9ee5d8-f792-48b4-9a44-a2da7d112876 req-3fcb42fc-4c1b-46b2-b90f-4856510f0b89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Received event network-vif-plugged-93cbf6f2-1b0c-4fcf-b194-5f85394193db external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:44:28 np0005593234 nova_compute[227762]: 2026-01-23 10:44:28.403 227766 DEBUG oslo_concurrency.lockutils [req-3e9ee5d8-f792-48b4-9a44-a2da7d112876 req-3fcb42fc-4c1b-46b2-b90f-4856510f0b89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 05:44:28 np0005593234 nova_compute[227762]: 2026-01-23 10:44:28.403 227766 DEBUG oslo_concurrency.lockutils [req-3e9ee5d8-f792-48b4-9a44-a2da7d112876 req-3fcb42fc-4c1b-46b2-b90f-4856510f0b89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 05:44:28 np0005593234 nova_compute[227762]: 2026-01-23 10:44:28.404 227766 DEBUG oslo_concurrency.lockutils [req-3e9ee5d8-f792-48b4-9a44-a2da7d112876 req-3fcb42fc-4c1b-46b2-b90f-4856510f0b89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "307f203d-cfc0-45a9-a0cd-3acee0ef7133-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 05:44:28 np0005593234 nova_compute[227762]: 2026-01-23 10:44:28.404 227766 DEBUG nova.compute.manager [req-3e9ee5d8-f792-48b4-9a44-a2da7d112876 req-3fcb42fc-4c1b-46b2-b90f-4856510f0b89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] No waiting events found dispatching network-vif-plugged-93cbf6f2-1b0c-4fcf-b194-5f85394193db pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 05:44:28 np0005593234 nova_compute[227762]: 2026-01-23 10:44:28.404 227766 WARNING nova.compute.manager [req-3e9ee5d8-f792-48b4-9a44-a2da7d112876 req-3fcb42fc-4c1b-46b2-b90f-4856510f0b89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Received unexpected event network-vif-plugged-93cbf6f2-1b0c-4fcf-b194-5f85394193db for instance with vm_state deleted and task_state None.
Jan 23 05:44:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:44:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:28.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:44:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:29.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:29 np0005593234 nova_compute[227762]: 2026-01-23 10:44:29.525 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:29 np0005593234 nova_compute[227762]: 2026-01-23 10:44:29.679 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e397 e397: 3 total, 3 up, 3 in
Jan 23 05:44:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:30.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:44:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:31.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:44:32 np0005593234 nova_compute[227762]: 2026-01-23 10:44:32.760 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:32.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:33.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:44:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2194694337' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:44:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:44:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2194694337' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:44:34 np0005593234 nova_compute[227762]: 2026-01-23 10:44:34.528 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:34 np0005593234 nova_compute[227762]: 2026-01-23 10:44:34.680 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:34 np0005593234 podman[323917]: 2026-01-23 10:44:34.800372918 +0000 UTC m=+0.087507575 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:44:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:44:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:34.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:44:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:35.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e397 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:36.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:44:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:37.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:44:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:38.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:44:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:39.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:44:39 np0005593234 nova_compute[227762]: 2026-01-23 10:44:39.530 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e398 e398: 3 total, 3 up, 3 in
Jan 23 05:44:39 np0005593234 nova_compute[227762]: 2026-01-23 10:44:39.656 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165064.6547174, 307f203d-cfc0-45a9-a0cd-3acee0ef7133 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:44:39 np0005593234 nova_compute[227762]: 2026-01-23 10:44:39.656 227766 INFO nova.compute.manager [-] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:44:39 np0005593234 nova_compute[227762]: 2026-01-23 10:44:39.682 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:39 np0005593234 nova_compute[227762]: 2026-01-23 10:44:39.687 227766 DEBUG nova.compute.manager [None req-b16d4515-59d5-41b8-9650-88950f6c1f39 - - - - - -] [instance: 307f203d-cfc0-45a9-a0cd-3acee0ef7133] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:44:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:40.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:41.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:42.882 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:44:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:42.882 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:44:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:44:42.882 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:44:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:42.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:43.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:44 np0005593234 nova_compute[227762]: 2026-01-23 10:44:44.533 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #175. Immutable memtables: 0.
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.551344) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 111] Flushing memtable with next log file: 175
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165084551724, "job": 111, "event": "flush_started", "num_memtables": 1, "num_entries": 1431, "num_deletes": 259, "total_data_size": 3054404, "memory_usage": 3091024, "flush_reason": "Manual Compaction"}
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 111] Level-0 flush table #176: started
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165084564313, "cf_name": "default", "job": 111, "event": "table_file_creation", "file_number": 176, "file_size": 1992865, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 83942, "largest_seqno": 85368, "table_properties": {"data_size": 1986614, "index_size": 3453, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1733, "raw_key_size": 13643, "raw_average_key_size": 20, "raw_value_size": 1973875, "raw_average_value_size": 2911, "num_data_blocks": 152, "num_entries": 678, "num_filter_entries": 678, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164982, "oldest_key_time": 1769164982, "file_creation_time": 1769165084, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 176, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 111] Flush lasted 12811 microseconds, and 5154 cpu microseconds.
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.564433) [db/flush_job.cc:967] [default] [JOB 111] Level-0 flush table #176: 1992865 bytes OK
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.564462) [db/memtable_list.cc:519] [default] Level-0 commit table #176 started
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.566368) [db/memtable_list.cc:722] [default] Level-0 commit table #176: memtable #1 done
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.566388) EVENT_LOG_v1 {"time_micros": 1769165084566381, "job": 111, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.566410) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 111] Try to delete WAL files size 3047560, prev total WAL file size 3047560, number of live WAL files 2.
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000172.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.567823) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033323637' seq:72057594037927935, type:22 .. '6C6F676D0033353230' seq:0, type:0; will stop at (end)
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 112] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 111 Base level 0, inputs: [176(1946KB)], [174(10MB)]
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165084568252, "job": 112, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [176], "files_L6": [174], "score": -1, "input_data_size": 13372282, "oldest_snapshot_seqno": -1}
Jan 23 05:44:44 np0005593234 nova_compute[227762]: 2026-01-23 10:44:44.684 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 112] Generated table #177: 10318 keys, 13231060 bytes, temperature: kUnknown
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165084735915, "cf_name": "default", "job": 112, "event": "table_file_creation", "file_number": 177, "file_size": 13231060, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13164676, "index_size": 39455, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25861, "raw_key_size": 272813, "raw_average_key_size": 26, "raw_value_size": 12984493, "raw_average_value_size": 1258, "num_data_blocks": 1504, "num_entries": 10318, "num_filter_entries": 10318, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769165084, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 177, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.736210) [db/compaction/compaction_job.cc:1663] [default] [JOB 112] Compacted 1@0 + 1@6 files to L6 => 13231060 bytes
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.738148) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 79.7 rd, 78.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 10.9 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(13.3) write-amplify(6.6) OK, records in: 10853, records dropped: 535 output_compression: NoCompression
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.738170) EVENT_LOG_v1 {"time_micros": 1769165084738160, "job": 112, "event": "compaction_finished", "compaction_time_micros": 167734, "compaction_time_cpu_micros": 51460, "output_level": 6, "num_output_files": 1, "total_output_size": 13231060, "num_input_records": 10853, "num_output_records": 10318, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000176.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165084738798, "job": 112, "event": "table_file_deletion", "file_number": 176}
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000174.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165084741480, "job": 112, "event": "table_file_deletion", "file_number": 174}
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.567662) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.741602) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.741610) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.741613) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.741616) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:44:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:44:44.741619) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:44:44 np0005593234 podman[323941]: 2026-01-23 10:44:44.809956327 +0000 UTC m=+0.104138195 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 05:44:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:44.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:45.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:46.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:47.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:48 np0005593234 nova_compute[227762]: 2026-01-23 10:44:48.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:48.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:49.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:49 np0005593234 nova_compute[227762]: 2026-01-23 10:44:49.533 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:49 np0005593234 nova_compute[227762]: 2026-01-23 10:44:49.723 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:50 np0005593234 nova_compute[227762]: 2026-01-23 10:44:50.503 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:50 np0005593234 nova_compute[227762]: 2026-01-23 10:44:50.684 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e398 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:50.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:51 np0005593234 nova_compute[227762]: 2026-01-23 10:44:51.066 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:51 np0005593234 nova_compute[227762]: 2026-01-23 10:44:51.066 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:44:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:44:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:51.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:44:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e399 e399: 3 total, 3 up, 3 in
Jan 23 05:44:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:52.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:53.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:54 np0005593234 nova_compute[227762]: 2026-01-23 10:44:54.559 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:54 np0005593234 nova_compute[227762]: 2026-01-23 10:44:54.724 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:44:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:54.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:44:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:55.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e399 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:44:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:44:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:56.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:44:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:44:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:57.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:44:57 np0005593234 nova_compute[227762]: 2026-01-23 10:44:57.763 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:44:57 np0005593234 nova_compute[227762]: 2026-01-23 10:44:57.764 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:44:58 np0005593234 nova_compute[227762]: 2026-01-23 10:44:58.591 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:44:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:44:58.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:44:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:44:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:44:59.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:44:59 np0005593234 nova_compute[227762]: 2026-01-23 10:44:59.561 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:44:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e400 e400: 3 total, 3 up, 3 in
Jan 23 05:44:59 np0005593234 nova_compute[227762]: 2026-01-23 10:44:59.769 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:00.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:45:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:01.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:45:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:45:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:02.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:45:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:03.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:04 np0005593234 nova_compute[227762]: 2026-01-23 10:45:04.563 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:04 np0005593234 nova_compute[227762]: 2026-01-23 10:45:04.771 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:04.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:45:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:05.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:45:05 np0005593234 podman[324056]: 2026-01-23 10:45:05.726524864 +0000 UTC m=+0.083981465 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:45:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:06.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:07.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:08.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:09.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:09 np0005593234 nova_compute[227762]: 2026-01-23 10:45:09.564 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:09 np0005593234 nova_compute[227762]: 2026-01-23 10:45:09.814 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:10 np0005593234 nova_compute[227762]: 2026-01-23 10:45:10.573 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:10 np0005593234 nova_compute[227762]: 2026-01-23 10:45:10.644 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:10 np0005593234 nova_compute[227762]: 2026-01-23 10:45:10.645 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:10 np0005593234 nova_compute[227762]: 2026-01-23 10:45:10.645 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:45:10 np0005593234 nova_compute[227762]: 2026-01-23 10:45:10.645 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:45:10 np0005593234 nova_compute[227762]: 2026-01-23 10:45:10.646 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:45:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:10.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:45:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3140484953' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:45:11 np0005593234 nova_compute[227762]: 2026-01-23 10:45:11.073 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:45:11 np0005593234 nova_compute[227762]: 2026-01-23 10:45:11.231 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:45:11 np0005593234 nova_compute[227762]: 2026-01-23 10:45:11.233 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4172MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:45:11 np0005593234 nova_compute[227762]: 2026-01-23 10:45:11.233 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:11 np0005593234 nova_compute[227762]: 2026-01-23 10:45:11.233 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:45:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:11.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:45:11 np0005593234 nova_compute[227762]: 2026-01-23 10:45:11.345 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:45:11 np0005593234 nova_compute[227762]: 2026-01-23 10:45:11.345 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:45:11 np0005593234 nova_compute[227762]: 2026-01-23 10:45:11.372 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:45:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:45:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3875409407' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:45:12 np0005593234 nova_compute[227762]: 2026-01-23 10:45:12.045 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.672s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:45:12 np0005593234 nova_compute[227762]: 2026-01-23 10:45:12.050 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:45:12 np0005593234 nova_compute[227762]: 2026-01-23 10:45:12.066 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:45:12 np0005593234 nova_compute[227762]: 2026-01-23 10:45:12.097 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:45:12 np0005593234 nova_compute[227762]: 2026-01-23 10:45:12.098 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.864s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:45:12 np0005593234 nova_compute[227762]: 2026-01-23 10:45:12.269 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:12 np0005593234 nova_compute[227762]: 2026-01-23 10:45:12.270 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:45:12 np0005593234 nova_compute[227762]: 2026-01-23 10:45:12.294 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:45:12 np0005593234 nova_compute[227762]: 2026-01-23 10:45:12.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:12.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:13.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:14 np0005593234 nova_compute[227762]: 2026-01-23 10:45:14.567 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:14 np0005593234 nova_compute[227762]: 2026-01-23 10:45:14.815 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:14.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:15.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:15 np0005593234 nova_compute[227762]: 2026-01-23 10:45:15.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:15 np0005593234 podman[324152]: 2026-01-23 10:45:15.778228966 +0000 UTC m=+0.076494731 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 05:45:16 np0005593234 nova_compute[227762]: 2026-01-23 10:45:16.359 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:16 np0005593234 nova_compute[227762]: 2026-01-23 10:45:16.359 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:16 np0005593234 nova_compute[227762]: 2026-01-23 10:45:16.377 227766 DEBUG nova.compute.manager [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:45:16 np0005593234 nova_compute[227762]: 2026-01-23 10:45:16.498 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:16 np0005593234 nova_compute[227762]: 2026-01-23 10:45:16.499 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:16 np0005593234 nova_compute[227762]: 2026-01-23 10:45:16.504 227766 DEBUG nova.virt.hardware [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:45:16 np0005593234 nova_compute[227762]: 2026-01-23 10:45:16.505 227766 INFO nova.compute.claims [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:45:16 np0005593234 nova_compute[227762]: 2026-01-23 10:45:16.616 227766 DEBUG oslo_concurrency.processutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:45:16 np0005593234 nova_compute[227762]: 2026-01-23 10:45:16.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:16.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:45:17 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/264925219' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.056 227766 DEBUG oslo_concurrency.processutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.063 227766 DEBUG nova.compute.provider_tree [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.080 227766 DEBUG nova.scheduler.client.report [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.108 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.609s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.109 227766 DEBUG nova.compute.manager [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.165 227766 DEBUG nova.compute.manager [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.166 227766 DEBUG nova.network.neutron [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.194 227766 INFO nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.213 227766 DEBUG nova.compute.manager [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.318 227766 DEBUG nova.compute.manager [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.319 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.320 227766 INFO nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Creating image(s)#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.348 227766 DEBUG nova.storage.rbd_utils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image fb31c535-476a-4f92-866e-664b8b25e0fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:45:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:17.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.391 227766 DEBUG nova.storage.rbd_utils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image fb31c535-476a-4f92-866e-664b8b25e0fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.417 227766 DEBUG nova.storage.rbd_utils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image fb31c535-476a-4f92-866e-664b8b25e0fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.421 227766 DEBUG oslo_concurrency.processutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.450 227766 DEBUG nova.policy [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '296341ffca2441dc807d285fa14c966d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '36d7e7c7ddbd4cf785fafd0d35b0a2d8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.489 227766 DEBUG oslo_concurrency.processutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.490 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.491 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.491 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.517 227766 DEBUG nova.storage.rbd_utils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image fb31c535-476a-4f92-866e-664b8b25e0fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.521 227766 DEBUG oslo_concurrency.processutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 fb31c535-476a-4f92-866e-664b8b25e0fc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.823 227766 DEBUG oslo_concurrency.processutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 fb31c535-476a-4f92-866e-664b8b25e0fc_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.302s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:45:17 np0005593234 nova_compute[227762]: 2026-01-23 10:45:17.887 227766 DEBUG nova.storage.rbd_utils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] resizing rbd image fb31c535-476a-4f92-866e-664b8b25e0fc_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:45:18 np0005593234 nova_compute[227762]: 2026-01-23 10:45:18.215 227766 DEBUG nova.objects.instance [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'migration_context' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:45:18 np0005593234 nova_compute[227762]: 2026-01-23 10:45:18.240 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:45:18 np0005593234 nova_compute[227762]: 2026-01-23 10:45:18.240 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Ensure instance console log exists: /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:45:18 np0005593234 nova_compute[227762]: 2026-01-23 10:45:18.241 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:18 np0005593234 nova_compute[227762]: 2026-01-23 10:45:18.241 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:18 np0005593234 nova_compute[227762]: 2026-01-23 10:45:18.241 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:45:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:18.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:19 np0005593234 nova_compute[227762]: 2026-01-23 10:45:19.157 227766 DEBUG nova.network.neutron [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Successfully created port: 20adecab-752d-4d6c-95ac-7d46cf968926 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:45:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:45:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:19.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:45:19 np0005593234 nova_compute[227762]: 2026-01-23 10:45:19.569 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:19 np0005593234 nova_compute[227762]: 2026-01-23 10:45:19.816 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:19 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:45:19 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:45:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:45:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:45:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:45:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:20.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:45:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:21.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:45:21 np0005593234 nova_compute[227762]: 2026-01-23 10:45:21.503 227766 DEBUG nova.network.neutron [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Successfully updated port: 20adecab-752d-4d6c-95ac-7d46cf968926 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:45:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:45:21 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/965437104' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:45:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:22.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:23.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:24 np0005593234 nova_compute[227762]: 2026-01-23 10:45:24.213 227766 DEBUG nova.compute.manager [req-297a14cb-869f-44a2-8232-bf67bd9ebf46 req-1978b952-1be8-4bc9-8f2c-cf1b55720d5f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received event network-changed-20adecab-752d-4d6c-95ac-7d46cf968926 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:45:24 np0005593234 nova_compute[227762]: 2026-01-23 10:45:24.214 227766 DEBUG nova.compute.manager [req-297a14cb-869f-44a2-8232-bf67bd9ebf46 req-1978b952-1be8-4bc9-8f2c-cf1b55720d5f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Refreshing instance network info cache due to event network-changed-20adecab-752d-4d6c-95ac-7d46cf968926. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:45:24 np0005593234 nova_compute[227762]: 2026-01-23 10:45:24.214 227766 DEBUG oslo_concurrency.lockutils [req-297a14cb-869f-44a2-8232-bf67bd9ebf46 req-1978b952-1be8-4bc9-8f2c-cf1b55720d5f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:45:24 np0005593234 nova_compute[227762]: 2026-01-23 10:45:24.214 227766 DEBUG oslo_concurrency.lockutils [req-297a14cb-869f-44a2-8232-bf67bd9ebf46 req-1978b952-1be8-4bc9-8f2c-cf1b55720d5f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:45:24 np0005593234 nova_compute[227762]: 2026-01-23 10:45:24.214 227766 DEBUG nova.network.neutron [req-297a14cb-869f-44a2-8232-bf67bd9ebf46 req-1978b952-1be8-4bc9-8f2c-cf1b55720d5f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Refreshing network info cache for port 20adecab-752d-4d6c-95ac-7d46cf968926 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:45:24 np0005593234 nova_compute[227762]: 2026-01-23 10:45:24.227 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:45:24 np0005593234 nova_compute[227762]: 2026-01-23 10:45:24.472 227766 DEBUG nova.network.neutron [req-297a14cb-869f-44a2-8232-bf67bd9ebf46 req-1978b952-1be8-4bc9-8f2c-cf1b55720d5f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:45:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:24.582 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=80, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=79) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:45:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:24.583 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:45:24 np0005593234 nova_compute[227762]: 2026-01-23 10:45:24.599 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:24 np0005593234 nova_compute[227762]: 2026-01-23 10:45:24.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:24 np0005593234 nova_compute[227762]: 2026-01-23 10:45:24.818 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:24 np0005593234 nova_compute[227762]: 2026-01-23 10:45:24.915 227766 DEBUG nova.network.neutron [req-297a14cb-869f-44a2-8232-bf67bd9ebf46 req-1978b952-1be8-4bc9-8f2c-cf1b55720d5f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:45:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:24.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:25 np0005593234 nova_compute[227762]: 2026-01-23 10:45:25.308 227766 DEBUG oslo_concurrency.lockutils [req-297a14cb-869f-44a2-8232-bf67bd9ebf46 req-1978b952-1be8-4bc9-8f2c-cf1b55720d5f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:45:25 np0005593234 nova_compute[227762]: 2026-01-23 10:45:25.309 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquired lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:45:25 np0005593234 nova_compute[227762]: 2026-01-23 10:45:25.309 227766 DEBUG nova.network.neutron [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:45:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:25.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:25.585 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '80'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:45:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:45:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:26.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:45:27 np0005593234 nova_compute[227762]: 2026-01-23 10:45:27.079 227766 DEBUG nova.network.neutron [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:45:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:27.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:27 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:45:27 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.464 227766 DEBUG nova.network.neutron [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updating instance_info_cache with network_info: [{"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.491 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Releasing lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.492 227766 DEBUG nova.compute.manager [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Instance network_info: |[{"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.497 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Start _get_guest_xml network_info=[{"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.503 227766 WARNING nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.513 227766 DEBUG nova.virt.libvirt.host [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.514 227766 DEBUG nova.virt.libvirt.host [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.518 227766 DEBUG nova.virt.libvirt.host [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.518 227766 DEBUG nova.virt.libvirt.host [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.520 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.520 227766 DEBUG nova.virt.hardware [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.521 227766 DEBUG nova.virt.hardware [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.521 227766 DEBUG nova.virt.hardware [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.521 227766 DEBUG nova.virt.hardware [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.521 227766 DEBUG nova.virt.hardware [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.522 227766 DEBUG nova.virt.hardware [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.522 227766 DEBUG nova.virt.hardware [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.522 227766 DEBUG nova.virt.hardware [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.522 227766 DEBUG nova.virt.hardware [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.522 227766 DEBUG nova.virt.hardware [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.523 227766 DEBUG nova.virt.hardware [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:45:28 np0005593234 nova_compute[227762]: 2026-01-23 10:45:28.526 227766 DEBUG oslo_concurrency.processutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:45:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:45:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:28.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:45:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:45:29 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4238485921' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.246 227766 DEBUG oslo_concurrency.processutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.720s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.274 227766 DEBUG nova.storage.rbd_utils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image fb31c535-476a-4f92-866e-664b8b25e0fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.278 227766 DEBUG oslo_concurrency.processutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:45:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:29.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.617 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:45:29 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1523823026' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.706 227766 DEBUG oslo_concurrency.processutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.707 227766 DEBUG nova.virt.libvirt.vif [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:45:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-548177853',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-548177853',id=204,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHl4y18ZbM5/Piuxfm1CZQf0XAJEg8AGMP7q7u+IMXAx6Zt5rL1mJMSOsTTDZFJEhWjwFar/8Dgb+UXMig3/lwqhw1lDvKwVdlJtw/GUAFgmmy551r0TFxkUxIA1d9FOqw==',key_name='tempest-keypair-2130281890',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='36d7e7c7ddbd4cf785fafd0d35b0a2d8',ramdisk_id='',reservation_id='r-m9ehiyel',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-2030135659',owner_user_name='tempest-AttachVolumeShelveTestJSON-2030135659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:45:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='296341ffca2441dc807d285fa14c966d',uuid=fb31c535-476a-4f92-866e-664b8b25e0fc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.708 227766 DEBUG nova.network.os_vif_util [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converting VIF {"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.709 227766 DEBUG nova.network.os_vif_util [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:19:d2,bridge_name='br-int',has_traffic_filtering=True,id=20adecab-752d-4d6c-95ac-7d46cf968926,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20adecab-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.710 227766 DEBUG nova.objects.instance [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.727 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  <uuid>fb31c535-476a-4f92-866e-664b8b25e0fc</uuid>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  <name>instance-000000cc</name>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-548177853</nova:name>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:45:28</nova:creationTime>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <nova:user uuid="296341ffca2441dc807d285fa14c966d">tempest-AttachVolumeShelveTestJSON-2030135659-project-member</nova:user>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <nova:project uuid="36d7e7c7ddbd4cf785fafd0d35b0a2d8">tempest-AttachVolumeShelveTestJSON-2030135659</nova:project>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <nova:port uuid="20adecab-752d-4d6c-95ac-7d46cf968926">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <entry name="serial">fb31c535-476a-4f92-866e-664b8b25e0fc</entry>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <entry name="uuid">fb31c535-476a-4f92-866e-664b8b25e0fc</entry>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/fb31c535-476a-4f92-866e-664b8b25e0fc_disk">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/fb31c535-476a-4f92-866e-664b8b25e0fc_disk.config">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:bd:19:d2"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <target dev="tap20adecab-75"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/console.log" append="off"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:45:29 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:45:29 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:45:29 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:45:29 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.728 227766 DEBUG nova.compute.manager [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Preparing to wait for external event network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.729 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.729 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.729 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.730 227766 DEBUG nova.virt.libvirt.vif [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:45:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-548177853',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-548177853',id=204,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHl4y18ZbM5/Piuxfm1CZQf0XAJEg8AGMP7q7u+IMXAx6Zt5rL1mJMSOsTTDZFJEhWjwFar/8Dgb+UXMig3/lwqhw1lDvKwVdlJtw/GUAFgmmy551r0TFxkUxIA1d9FOqw==',key_name='tempest-keypair-2130281890',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='36d7e7c7ddbd4cf785fafd0d35b0a2d8',ramdisk_id='',reservation_id='r-m9ehiyel',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-2030135659',owner_user_name='tempest-AttachVolumeShelveTestJSON-2030135659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:45:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='296341ffca2441dc807d285fa14c966d',uuid=fb31c535-476a-4f92-866e-664b8b25e0fc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.730 227766 DEBUG nova.network.os_vif_util [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converting VIF {"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.731 227766 DEBUG nova.network.os_vif_util [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:19:d2,bridge_name='br-int',has_traffic_filtering=True,id=20adecab-752d-4d6c-95ac-7d46cf968926,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20adecab-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.731 227766 DEBUG os_vif [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:19:d2,bridge_name='br-int',has_traffic_filtering=True,id=20adecab-752d-4d6c-95ac-7d46cf968926,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20adecab-75') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.731 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.732 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.732 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.736 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.736 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20adecab-75, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.736 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap20adecab-75, col_values=(('external_ids', {'iface-id': '20adecab-752d-4d6c-95ac-7d46cf968926', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bd:19:d2', 'vm-uuid': 'fb31c535-476a-4f92-866e-664b8b25e0fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.738 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:29 np0005593234 NetworkManager[48942]: <info>  [1769165129.7391] manager: (tap20adecab-75): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/402)
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.741 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.746 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.747 227766 INFO os_vif [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:19:d2,bridge_name='br-int',has_traffic_filtering=True,id=20adecab-752d-4d6c-95ac-7d46cf968926,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20adecab-75')#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.909 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.910 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.910 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No VIF found with MAC fa:16:3e:bd:19:d2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.911 227766 INFO nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Using config drive#033[00m
Jan 23 05:45:29 np0005593234 nova_compute[227762]: 2026-01-23 10:45:29.935 227766 DEBUG nova.storage.rbd_utils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image fb31c535-476a-4f92-866e-664b8b25e0fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:45:30 np0005593234 nova_compute[227762]: 2026-01-23 10:45:30.603 227766 INFO nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Creating config drive at /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/disk.config#033[00m
Jan 23 05:45:30 np0005593234 nova_compute[227762]: 2026-01-23 10:45:30.609 227766 DEBUG oslo_concurrency.processutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1od0153w execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:45:30 np0005593234 nova_compute[227762]: 2026-01-23 10:45:30.744 227766 DEBUG oslo_concurrency.processutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1od0153w" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:45:30 np0005593234 nova_compute[227762]: 2026-01-23 10:45:30.771 227766 DEBUG nova.storage.rbd_utils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image fb31c535-476a-4f92-866e-664b8b25e0fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:45:30 np0005593234 nova_compute[227762]: 2026-01-23 10:45:30.774 227766 DEBUG oslo_concurrency.processutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/disk.config fb31c535-476a-4f92-866e-664b8b25e0fc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:45:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:30.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:31.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:31 np0005593234 nova_compute[227762]: 2026-01-23 10:45:31.751 227766 DEBUG oslo_concurrency.processutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/disk.config fb31c535-476a-4f92-866e-664b8b25e0fc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.977s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:45:31 np0005593234 nova_compute[227762]: 2026-01-23 10:45:31.752 227766 INFO nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Deleting local config drive /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/disk.config because it was imported into RBD.#033[00m
Jan 23 05:45:31 np0005593234 kernel: tap20adecab-75: entered promiscuous mode
Jan 23 05:45:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:45:31Z|00861|binding|INFO|Claiming lport 20adecab-752d-4d6c-95ac-7d46cf968926 for this chassis.
Jan 23 05:45:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:45:31Z|00862|binding|INFO|20adecab-752d-4d6c-95ac-7d46cf968926: Claiming fa:16:3e:bd:19:d2 10.100.0.10
Jan 23 05:45:31 np0005593234 NetworkManager[48942]: <info>  [1769165131.8126] manager: (tap20adecab-75): new Tun device (/org/freedesktop/NetworkManager/Devices/403)
Jan 23 05:45:31 np0005593234 nova_compute[227762]: 2026-01-23 10:45:31.812 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:31 np0005593234 nova_compute[227762]: 2026-01-23 10:45:31.818 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.824 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:19:d2 10.100.0.10'], port_security=['fa:16:3e:bd:19:d2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fb31c535-476a-4f92-866e-664b8b25e0fc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-082e2952-c529-49ec-88e6-5e5c5580db01', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '36d7e7c7ddbd4cf785fafd0d35b0a2d8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '556b10c0-b0e4-46d0-9f13-66e00af94e5f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6ea75f5-7173-4e04-a97c-cbcceff41ada, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=20adecab-752d-4d6c-95ac-7d46cf968926) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.826 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 20adecab-752d-4d6c-95ac-7d46cf968926 in datapath 082e2952-c529-49ec-88e6-5e5c5580db01 bound to our chassis#033[00m
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.828 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 082e2952-c529-49ec-88e6-5e5c5580db01#033[00m
Jan 23 05:45:31 np0005593234 systemd-machined[195626]: New machine qemu-97-instance-000000cc.
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.844 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[079eccde-21ae-4e80-a710-8d471c9c3ec7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.845 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap082e2952-c1 in ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.847 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap082e2952-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.847 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8d4442ef-33b3-486d-b2dc-fab7400f1aff]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.848 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[655c6b47-9c1b-45c8-916b-75837386165b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.862 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[de4ef6be-d08d-4a35-ac76-169efbf575c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.875 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6319d884-1fc1-411a-bf97-0f92839ea941]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:31 np0005593234 nova_compute[227762]: 2026-01-23 10:45:31.876 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:45:31Z|00863|binding|INFO|Setting lport 20adecab-752d-4d6c-95ac-7d46cf968926 ovn-installed in OVS
Jan 23 05:45:31 np0005593234 ovn_controller[134547]: 2026-01-23T10:45:31Z|00864|binding|INFO|Setting lport 20adecab-752d-4d6c-95ac-7d46cf968926 up in Southbound
Jan 23 05:45:31 np0005593234 nova_compute[227762]: 2026-01-23 10:45:31.879 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:31 np0005593234 systemd[1]: Started Virtual Machine qemu-97-instance-000000cc.
Jan 23 05:45:31 np0005593234 systemd-udevd[324748]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.906 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[43b87a3b-6d7a-454a-816d-6b0465f587a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.911 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f5d90e49-0975-4458-8fa3-60197635f3cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:31 np0005593234 NetworkManager[48942]: <info>  [1769165131.9124] manager: (tap082e2952-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/404)
Jan 23 05:45:31 np0005593234 systemd-udevd[324752]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:45:31 np0005593234 NetworkManager[48942]: <info>  [1769165131.9139] device (tap20adecab-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:45:31 np0005593234 NetworkManager[48942]: <info>  [1769165131.9148] device (tap20adecab-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:45:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.946 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[1a1085d1-99a5-43ec-be0a-6bfbd544358c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.951 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e0c5da56-9d0e-457a-bbb2-b2de56039ef0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:31 np0005593234 NetworkManager[48942]: <info>  [1769165131.9733] device (tap082e2952-c0): carrier: link connected
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.979 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[62fca863-bda0-46d9-be2a-5e3d22f82fb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:31.997 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[af61d1dc-3e0b-46f7-95ae-bef681c8df33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap082e2952-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:8e:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 264], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 902693, 'reachable_time': 26581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 324776, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:32.013 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d9d163da-4e9c-4ed4-8368-b8e64713d9e6]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:8e23'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 902693, 'tstamp': 902693}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 324777, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:32.039 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9bb7d8d9-0480-4c58-b699-dc4c0b332db8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap082e2952-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:8e:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 264], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 902693, 'reachable_time': 26581, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 324778, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:32.074 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[90d0006a-0aea-4d20-b008-b6b5cdc55a8d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:32.146 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c504b294-b620-4e81-8acd-7c1aaf339016]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:32.148 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap082e2952-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:32.148 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:32.149 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap082e2952-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:45:32 np0005593234 NetworkManager[48942]: <info>  [1769165132.1513] manager: (tap082e2952-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/405)
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.150 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:32 np0005593234 kernel: tap082e2952-c0: entered promiscuous mode
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:32.155 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap082e2952-c0, col_values=(('external_ids', {'iface-id': 'e36b250d-7843-417b-b3f6-5e001769e85d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.155 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:45:32Z|00865|binding|INFO|Releasing lport e36b250d-7843-417b-b3f6-5e001769e85d from this chassis (sb_readonly=0)
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.167 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.176 227766 DEBUG nova.compute.manager [req-4c4e1556-c95a-4d9e-97cb-d810f70794a3 req-ace38fde-f90e-4d9e-aff4-e94ee85d8d7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received event network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.176 227766 DEBUG oslo_concurrency.lockutils [req-4c4e1556-c95a-4d9e-97cb-d810f70794a3 req-ace38fde-f90e-4d9e-aff4-e94ee85d8d7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.177 227766 DEBUG oslo_concurrency.lockutils [req-4c4e1556-c95a-4d9e-97cb-d810f70794a3 req-ace38fde-f90e-4d9e-aff4-e94ee85d8d7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.177 227766 DEBUG oslo_concurrency.lockutils [req-4c4e1556-c95a-4d9e-97cb-d810f70794a3 req-ace38fde-f90e-4d9e-aff4-e94ee85d8d7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.177 227766 DEBUG nova.compute.manager [req-4c4e1556-c95a-4d9e-97cb-d810f70794a3 req-ace38fde-f90e-4d9e-aff4-e94ee85d8d7d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Processing event network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:32.182 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/082e2952-c529-49ec-88e6-5e5c5580db01.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/082e2952-c529-49ec-88e6-5e5c5580db01.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.182 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:32.183 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[085b9128-f5a7-41ef-af0e-cb95770d69e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:32.184 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-082e2952-c529-49ec-88e6-5e5c5580db01
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/082e2952-c529-49ec-88e6-5e5c5580db01.pid.haproxy
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 082e2952-c529-49ec-88e6-5e5c5580db01
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:45:32 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:32.186 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'env', 'PROCESS_TAG=haproxy-082e2952-c529-49ec-88e6-5e5c5580db01', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/082e2952-c529-49ec-88e6-5e5c5580db01.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:45:32 np0005593234 podman[324811]: 2026-01-23 10:45:32.563071753 +0000 UTC m=+0.022967989 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.912 227766 DEBUG nova.compute.manager [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.914 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165132.9132676, fb31c535-476a-4f92-866e-664b8b25e0fc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.914 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] VM Started (Lifecycle Event)#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.916 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.920 227766 INFO nova.virt.libvirt.driver [-] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Instance spawned successfully.#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.920 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.946 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.953 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.953 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.954 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.954 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.955 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.955 227766 DEBUG nova.virt.libvirt.driver [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:45:32 np0005593234 nova_compute[227762]: 2026-01-23 10:45:32.959 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:45:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:45:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:32.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:45:33 np0005593234 nova_compute[227762]: 2026-01-23 10:45:33.008 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:45:33 np0005593234 nova_compute[227762]: 2026-01-23 10:45:33.009 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165132.9134188, fb31c535-476a-4f92-866e-664b8b25e0fc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:45:33 np0005593234 nova_compute[227762]: 2026-01-23 10:45:33.009 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:45:33 np0005593234 podman[324811]: 2026-01-23 10:45:33.026292755 +0000 UTC m=+0.486188971 container create 4c062ce44ddc8c2cbd40761a96fb0051fe6959aa3911e440ac2fa3cc92bcca74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 05:45:33 np0005593234 nova_compute[227762]: 2026-01-23 10:45:33.036 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:45:33 np0005593234 nova_compute[227762]: 2026-01-23 10:45:33.038 227766 INFO nova.compute.manager [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Took 15.72 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:45:33 np0005593234 nova_compute[227762]: 2026-01-23 10:45:33.038 227766 DEBUG nova.compute.manager [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:45:33 np0005593234 nova_compute[227762]: 2026-01-23 10:45:33.042 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165132.9162045, fb31c535-476a-4f92-866e-664b8b25e0fc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:45:33 np0005593234 nova_compute[227762]: 2026-01-23 10:45:33.042 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:45:33 np0005593234 nova_compute[227762]: 2026-01-23 10:45:33.071 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:45:33 np0005593234 nova_compute[227762]: 2026-01-23 10:45:33.074 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:45:33 np0005593234 systemd[1]: Started libpod-conmon-4c062ce44ddc8c2cbd40761a96fb0051fe6959aa3911e440ac2fa3cc92bcca74.scope.
Jan 23 05:45:33 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:45:33 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3ba23310292e3831f462c77a1b707ec853e6a44cee9ebb5d64254426313661ee/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:45:33 np0005593234 podman[324811]: 2026-01-23 10:45:33.330247103 +0000 UTC m=+0.790143329 container init 4c062ce44ddc8c2cbd40761a96fb0051fe6959aa3911e440ac2fa3cc92bcca74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:45:33 np0005593234 podman[324811]: 2026-01-23 10:45:33.336137626 +0000 UTC m=+0.796033842 container start 4c062ce44ddc8c2cbd40761a96fb0051fe6959aa3911e440ac2fa3cc92bcca74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 05:45:33 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[324868]: [NOTICE]   (324872) : New worker (324874) forked
Jan 23 05:45:33 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[324868]: [NOTICE]   (324872) : Loading success.
Jan 23 05:45:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:33.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:34 np0005593234 nova_compute[227762]: 2026-01-23 10:45:34.099 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:45:34 np0005593234 nova_compute[227762]: 2026-01-23 10:45:34.141 227766 INFO nova.compute.manager [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Took 17.68 seconds to build instance.#033[00m
Jan 23 05:45:34 np0005593234 nova_compute[227762]: 2026-01-23 10:45:34.167 227766 DEBUG oslo_concurrency.lockutils [None req-74d91370-3dfc-49f3-ab93-0749b30793c9 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.808s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:45:34 np0005593234 nova_compute[227762]: 2026-01-23 10:45:34.257 227766 DEBUG nova.compute.manager [req-98ce217a-eaa1-4cde-9a84-9230286b0869 req-6812569b-5434-4c61-8d03-cd6d3d56834f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received event network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:45:34 np0005593234 nova_compute[227762]: 2026-01-23 10:45:34.257 227766 DEBUG oslo_concurrency.lockutils [req-98ce217a-eaa1-4cde-9a84-9230286b0869 req-6812569b-5434-4c61-8d03-cd6d3d56834f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:34 np0005593234 nova_compute[227762]: 2026-01-23 10:45:34.258 227766 DEBUG oslo_concurrency.lockutils [req-98ce217a-eaa1-4cde-9a84-9230286b0869 req-6812569b-5434-4c61-8d03-cd6d3d56834f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:34 np0005593234 nova_compute[227762]: 2026-01-23 10:45:34.258 227766 DEBUG oslo_concurrency.lockutils [req-98ce217a-eaa1-4cde-9a84-9230286b0869 req-6812569b-5434-4c61-8d03-cd6d3d56834f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:45:34 np0005593234 nova_compute[227762]: 2026-01-23 10:45:34.259 227766 DEBUG nova.compute.manager [req-98ce217a-eaa1-4cde-9a84-9230286b0869 req-6812569b-5434-4c61-8d03-cd6d3d56834f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] No waiting events found dispatching network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:45:34 np0005593234 nova_compute[227762]: 2026-01-23 10:45:34.259 227766 WARNING nova.compute.manager [req-98ce217a-eaa1-4cde-9a84-9230286b0869 req-6812569b-5434-4c61-8d03-cd6d3d56834f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received unexpected event network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:45:34 np0005593234 nova_compute[227762]: 2026-01-23 10:45:34.619 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:34 np0005593234 nova_compute[227762]: 2026-01-23 10:45:34.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:34 np0005593234 nova_compute[227762]: 2026-01-23 10:45:34.739 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:45:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:34.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:45:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:45:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:35.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:45:36 np0005593234 podman[324885]: 2026-01-23 10:45:36.752639591 +0000 UTC m=+0.048769455 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 23 05:45:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:45:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:36.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:45:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:45:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:37.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:45:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:38.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:39 np0005593234 NetworkManager[48942]: <info>  [1769165139.1776] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/406)
Jan 23 05:45:39 np0005593234 nova_compute[227762]: 2026-01-23 10:45:39.176 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:39 np0005593234 NetworkManager[48942]: <info>  [1769165139.1791] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/407)
Jan 23 05:45:39 np0005593234 nova_compute[227762]: 2026-01-23 10:45:39.298 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:39 np0005593234 ovn_controller[134547]: 2026-01-23T10:45:39Z|00866|binding|INFO|Releasing lport e36b250d-7843-417b-b3f6-5e001769e85d from this chassis (sb_readonly=0)
Jan 23 05:45:39 np0005593234 nova_compute[227762]: 2026-01-23 10:45:39.310 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:39.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:39 np0005593234 nova_compute[227762]: 2026-01-23 10:45:39.533 227766 DEBUG nova.compute.manager [req-bade4fe1-0a58-4bf7-b616-bc45c46700da req-75e7c58d-877f-42a1-98f5-755077aace02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received event network-changed-20adecab-752d-4d6c-95ac-7d46cf968926 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:45:39 np0005593234 nova_compute[227762]: 2026-01-23 10:45:39.534 227766 DEBUG nova.compute.manager [req-bade4fe1-0a58-4bf7-b616-bc45c46700da req-75e7c58d-877f-42a1-98f5-755077aace02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Refreshing instance network info cache due to event network-changed-20adecab-752d-4d6c-95ac-7d46cf968926. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:45:39 np0005593234 nova_compute[227762]: 2026-01-23 10:45:39.534 227766 DEBUG oslo_concurrency.lockutils [req-bade4fe1-0a58-4bf7-b616-bc45c46700da req-75e7c58d-877f-42a1-98f5-755077aace02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:45:39 np0005593234 nova_compute[227762]: 2026-01-23 10:45:39.534 227766 DEBUG oslo_concurrency.lockutils [req-bade4fe1-0a58-4bf7-b616-bc45c46700da req-75e7c58d-877f-42a1-98f5-755077aace02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:45:39 np0005593234 nova_compute[227762]: 2026-01-23 10:45:39.534 227766 DEBUG nova.network.neutron [req-bade4fe1-0a58-4bf7-b616-bc45c46700da req-75e7c58d-877f-42a1-98f5-755077aace02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Refreshing network info cache for port 20adecab-752d-4d6c-95ac-7d46cf968926 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:45:39 np0005593234 nova_compute[227762]: 2026-01-23 10:45:39.620 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:39 np0005593234 nova_compute[227762]: 2026-01-23 10:45:39.740 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:40.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:41.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:42.883 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:42.884 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:45:42.885 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:45:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:42.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:43 np0005593234 nova_compute[227762]: 2026-01-23 10:45:43.137 227766 DEBUG nova.network.neutron [req-bade4fe1-0a58-4bf7-b616-bc45c46700da req-75e7c58d-877f-42a1-98f5-755077aace02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updated VIF entry in instance network info cache for port 20adecab-752d-4d6c-95ac-7d46cf968926. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:45:43 np0005593234 nova_compute[227762]: 2026-01-23 10:45:43.138 227766 DEBUG nova.network.neutron [req-bade4fe1-0a58-4bf7-b616-bc45c46700da req-75e7c58d-877f-42a1-98f5-755077aace02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updating instance_info_cache with network_info: [{"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:45:43 np0005593234 nova_compute[227762]: 2026-01-23 10:45:43.167 227766 DEBUG oslo_concurrency.lockutils [req-bade4fe1-0a58-4bf7-b616-bc45c46700da req-75e7c58d-877f-42a1-98f5-755077aace02 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:45:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:45:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:43.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:45:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:45:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3875500104' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:45:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:45:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3875500104' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:45:44 np0005593234 nova_compute[227762]: 2026-01-23 10:45:44.622 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:44 np0005593234 nova_compute[227762]: 2026-01-23 10:45:44.741 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:44.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:45.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:45 np0005593234 ovn_controller[134547]: 2026-01-23T10:45:45Z|00112|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:bd:19:d2 10.100.0.10
Jan 23 05:45:45 np0005593234 ovn_controller[134547]: 2026-01-23T10:45:45Z|00113|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bd:19:d2 10.100.0.10
Jan 23 05:45:46 np0005593234 podman[324935]: 2026-01-23 10:45:46.113399448 +0000 UTC m=+0.117329607 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:45:46 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #58. Immutable memtables: 14.
Jan 23 05:45:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:46.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:47.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:48 np0005593234 nova_compute[227762]: 2026-01-23 10:45:48.961 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:45:48 np0005593234 nova_compute[227762]: 2026-01-23 10:45:48.988 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Triggering sync for uuid fb31c535-476a-4f92-866e-664b8b25e0fc _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 05:45:48 np0005593234 nova_compute[227762]: 2026-01-23 10:45:48.988 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:48 np0005593234 nova_compute[227762]: 2026-01-23 10:45:48.989 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:48.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:49 np0005593234 nova_compute[227762]: 2026-01-23 10:45:49.017 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.028s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:45:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:49.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:49 np0005593234 nova_compute[227762]: 2026-01-23 10:45:49.627 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:49 np0005593234 nova_compute[227762]: 2026-01-23 10:45:49.743 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:51.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:51.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:51 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:53.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:53.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:54 np0005593234 nova_compute[227762]: 2026-01-23 10:45:54.679 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:54 np0005593234 nova_compute[227762]: 2026-01-23 10:45:54.745 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:55.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:55.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:45:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:45:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:57.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:45:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:57.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:57 np0005593234 nova_compute[227762]: 2026-01-23 10:45:57.634 227766 DEBUG oslo_concurrency.lockutils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:45:57 np0005593234 nova_compute[227762]: 2026-01-23 10:45:57.634 227766 DEBUG oslo_concurrency.lockutils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:45:57 np0005593234 nova_compute[227762]: 2026-01-23 10:45:57.635 227766 INFO nova.compute.manager [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Shelving#033[00m
Jan 23 05:45:57 np0005593234 nova_compute[227762]: 2026-01-23 10:45:57.655 227766 DEBUG nova.virt.libvirt.driver [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:45:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:45:59.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:45:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:45:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:45:59.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:45:59 np0005593234 nova_compute[227762]: 2026-01-23 10:45:59.717 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:45:59 np0005593234 nova_compute[227762]: 2026-01-23 10:45:59.747 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:00 np0005593234 kernel: tap20adecab-75 (unregistering): left promiscuous mode
Jan 23 05:46:00 np0005593234 NetworkManager[48942]: <info>  [1769165160.1423] device (tap20adecab-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:46:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:46:00Z|00867|binding|INFO|Releasing lport 20adecab-752d-4d6c-95ac-7d46cf968926 from this chassis (sb_readonly=0)
Jan 23 05:46:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:46:00Z|00868|binding|INFO|Setting lport 20adecab-752d-4d6c-95ac-7d46cf968926 down in Southbound
Jan 23 05:46:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:46:00Z|00869|binding|INFO|Removing iface tap20adecab-75 ovn-installed in OVS
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.146 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:00.156 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:19:d2 10.100.0.10'], port_security=['fa:16:3e:bd:19:d2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fb31c535-476a-4f92-866e-664b8b25e0fc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-082e2952-c529-49ec-88e6-5e5c5580db01', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '36d7e7c7ddbd4cf785fafd0d35b0a2d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '556b10c0-b0e4-46d0-9f13-66e00af94e5f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.192'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6ea75f5-7173-4e04-a97c-cbcceff41ada, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=20adecab-752d-4d6c-95ac-7d46cf968926) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:46:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:00.159 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 20adecab-752d-4d6c-95ac-7d46cf968926 in datapath 082e2952-c529-49ec-88e6-5e5c5580db01 unbound from our chassis#033[00m
Jan 23 05:46:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:00.161 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 082e2952-c529-49ec-88e6-5e5c5580db01, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:46:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:00.163 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7c102d57-7fc8-4ce4-b66b-caf5a50b7659]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:00.164 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 namespace which is not needed anymore#033[00m
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.174 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:00 np0005593234 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000cc.scope: Deactivated successfully.
Jan 23 05:46:00 np0005593234 systemd[1]: machine-qemu\x2d97\x2dinstance\x2d000000cc.scope: Consumed 13.951s CPU time.
Jan 23 05:46:00 np0005593234 systemd-machined[195626]: Machine qemu-97-instance-000000cc terminated.
Jan 23 05:46:00 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[324868]: [NOTICE]   (324872) : haproxy version is 2.8.14-c23fe91
Jan 23 05:46:00 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[324868]: [NOTICE]   (324872) : path to executable is /usr/sbin/haproxy
Jan 23 05:46:00 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[324868]: [WARNING]  (324872) : Exiting Master process...
Jan 23 05:46:00 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[324868]: [ALERT]    (324872) : Current worker (324874) exited with code 143 (Terminated)
Jan 23 05:46:00 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[324868]: [WARNING]  (324872) : All workers exited. Exiting... (0)
Jan 23 05:46:00 np0005593234 systemd[1]: libpod-4c062ce44ddc8c2cbd40761a96fb0051fe6959aa3911e440ac2fa3cc92bcca74.scope: Deactivated successfully.
Jan 23 05:46:00 np0005593234 podman[325019]: 2026-01-23 10:46:00.316920682 +0000 UTC m=+0.050747187 container died 4c062ce44ddc8c2cbd40761a96fb0051fe6959aa3911e440ac2fa3cc92bcca74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 05:46:00 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4c062ce44ddc8c2cbd40761a96fb0051fe6959aa3911e440ac2fa3cc92bcca74-userdata-shm.mount: Deactivated successfully.
Jan 23 05:46:00 np0005593234 systemd[1]: var-lib-containers-storage-overlay-3ba23310292e3831f462c77a1b707ec853e6a44cee9ebb5d64254426313661ee-merged.mount: Deactivated successfully.
Jan 23 05:46:00 np0005593234 podman[325019]: 2026-01-23 10:46:00.360332598 +0000 UTC m=+0.094159103 container cleanup 4c062ce44ddc8c2cbd40761a96fb0051fe6959aa3911e440ac2fa3cc92bcca74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 05:46:00 np0005593234 systemd[1]: libpod-conmon-4c062ce44ddc8c2cbd40761a96fb0051fe6959aa3911e440ac2fa3cc92bcca74.scope: Deactivated successfully.
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.374 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.379 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.417 227766 DEBUG nova.compute.manager [req-0360cf34-a447-425a-bfa2-57aba090efd4 req-8c3ef1b5-a7de-45b5-8da0-dcbabd179ec8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received event network-vif-unplugged-20adecab-752d-4d6c-95ac-7d46cf968926 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.417 227766 DEBUG oslo_concurrency.lockutils [req-0360cf34-a447-425a-bfa2-57aba090efd4 req-8c3ef1b5-a7de-45b5-8da0-dcbabd179ec8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.418 227766 DEBUG oslo_concurrency.lockutils [req-0360cf34-a447-425a-bfa2-57aba090efd4 req-8c3ef1b5-a7de-45b5-8da0-dcbabd179ec8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.418 227766 DEBUG oslo_concurrency.lockutils [req-0360cf34-a447-425a-bfa2-57aba090efd4 req-8c3ef1b5-a7de-45b5-8da0-dcbabd179ec8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.418 227766 DEBUG nova.compute.manager [req-0360cf34-a447-425a-bfa2-57aba090efd4 req-8c3ef1b5-a7de-45b5-8da0-dcbabd179ec8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] No waiting events found dispatching network-vif-unplugged-20adecab-752d-4d6c-95ac-7d46cf968926 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.418 227766 WARNING nova.compute.manager [req-0360cf34-a447-425a-bfa2-57aba090efd4 req-8c3ef1b5-a7de-45b5-8da0-dcbabd179ec8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received unexpected event network-vif-unplugged-20adecab-752d-4d6c-95ac-7d46cf968926 for instance with vm_state active and task_state shelving.#033[00m
Jan 23 05:46:00 np0005593234 podman[325051]: 2026-01-23 10:46:00.429261202 +0000 UTC m=+0.045635627 container remove 4c062ce44ddc8c2cbd40761a96fb0051fe6959aa3911e440ac2fa3cc92bcca74 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:46:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:00.434 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[792854b6-f486-44f1-9797-c253d4714a0f]: (4, ('Fri Jan 23 10:46:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 (4c062ce44ddc8c2cbd40761a96fb0051fe6959aa3911e440ac2fa3cc92bcca74)\n4c062ce44ddc8c2cbd40761a96fb0051fe6959aa3911e440ac2fa3cc92bcca74\nFri Jan 23 10:46:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 (4c062ce44ddc8c2cbd40761a96fb0051fe6959aa3911e440ac2fa3cc92bcca74)\n4c062ce44ddc8c2cbd40761a96fb0051fe6959aa3911e440ac2fa3cc92bcca74\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:00.436 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d1079e3c-82ee-4e62-aba0-299b2b3ad651]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:00.437 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap082e2952-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.438 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:00 np0005593234 kernel: tap082e2952-c0: left promiscuous mode
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.453 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:00.455 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c7c71699-3784-4577-a83a-c19ad9fa68b9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:00.481 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[945249fe-dc94-4744-96db-3b57a17b24c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:00.482 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1a90bcf9-dd04-4c89-be34-c1bb03c04e2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:00.495 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[75cf9b87-c436-4e0f-97e9-429c481135cf]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 902686, 'reachable_time': 35213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 325077, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:00.498 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:46:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:00.498 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[938d810a-b255-4141-aadf-fc160aa323c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:00 np0005593234 systemd[1]: run-netns-ovnmeta\x2d082e2952\x2dc529\x2d49ec\x2d88e6\x2d5e5c5580db01.mount: Deactivated successfully.
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.674 227766 INFO nova.virt.libvirt.driver [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.680 227766 INFO nova.virt.libvirt.driver [-] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Instance destroyed successfully.#033[00m
Jan 23 05:46:00 np0005593234 nova_compute[227762]: 2026-01-23 10:46:00.680 227766 DEBUG nova.objects.instance [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'numa_topology' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:46:01 np0005593234 nova_compute[227762]: 2026-01-23 10:46:01.013 227766 INFO nova.virt.libvirt.driver [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Beginning cold snapshot process#033[00m
Jan 23 05:46:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:01.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:01 np0005593234 nova_compute[227762]: 2026-01-23 10:46:01.154 227766 DEBUG nova.virt.libvirt.imagebackend [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 05:46:01 np0005593234 nova_compute[227762]: 2026-01-23 10:46:01.379 227766 DEBUG nova.storage.rbd_utils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] creating snapshot(0b71a61374784808b3116fe32528f8c2) on rbd image(fb31c535-476a-4f92-866e-664b8b25e0fc_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:46:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:01.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e400 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e401 e401: 3 total, 3 up, 3 in
Jan 23 05:46:02 np0005593234 nova_compute[227762]: 2026-01-23 10:46:02.092 227766 DEBUG nova.storage.rbd_utils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] cloning vms/fb31c535-476a-4f92-866e-664b8b25e0fc_disk@0b71a61374784808b3116fe32528f8c2 to images/a565646f-e62e-4f36-81a3-989264b3b2c8 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:46:02 np0005593234 nova_compute[227762]: 2026-01-23 10:46:02.523 227766 DEBUG nova.compute.manager [req-eebad512-4f21-4c72-b67a-4c594891f585 req-08032003-525d-4602-a5ff-7ac72818080f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received event network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:46:02 np0005593234 nova_compute[227762]: 2026-01-23 10:46:02.524 227766 DEBUG oslo_concurrency.lockutils [req-eebad512-4f21-4c72-b67a-4c594891f585 req-08032003-525d-4602-a5ff-7ac72818080f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:02 np0005593234 nova_compute[227762]: 2026-01-23 10:46:02.525 227766 DEBUG oslo_concurrency.lockutils [req-eebad512-4f21-4c72-b67a-4c594891f585 req-08032003-525d-4602-a5ff-7ac72818080f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:02 np0005593234 nova_compute[227762]: 2026-01-23 10:46:02.525 227766 DEBUG oslo_concurrency.lockutils [req-eebad512-4f21-4c72-b67a-4c594891f585 req-08032003-525d-4602-a5ff-7ac72818080f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:02 np0005593234 nova_compute[227762]: 2026-01-23 10:46:02.526 227766 DEBUG nova.compute.manager [req-eebad512-4f21-4c72-b67a-4c594891f585 req-08032003-525d-4602-a5ff-7ac72818080f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] No waiting events found dispatching network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:46:02 np0005593234 nova_compute[227762]: 2026-01-23 10:46:02.526 227766 WARNING nova.compute.manager [req-eebad512-4f21-4c72-b67a-4c594891f585 req-08032003-525d-4602-a5ff-7ac72818080f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received unexpected event network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 23 05:46:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:03.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:03.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:03 np0005593234 nova_compute[227762]: 2026-01-23 10:46:03.763 227766 DEBUG nova.storage.rbd_utils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] flattening images/a565646f-e62e-4f36-81a3-989264b3b2c8 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Jan 23 05:46:04 np0005593234 nova_compute[227762]: 2026-01-23 10:46:04.720 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:04 np0005593234 nova_compute[227762]: 2026-01-23 10:46:04.749 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:05.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:05.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:05 np0005593234 nova_compute[227762]: 2026-01-23 10:46:05.917 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:05.917 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=81, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=80) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 05:46:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:05.918 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 05:46:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:46:06 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3648757731' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:46:06 np0005593234 nova_compute[227762]: 2026-01-23 10:46:06.381 227766 DEBUG nova.storage.rbd_utils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] removing snapshot(0b71a61374784808b3116fe32528f8c2) on rbd image(fb31c535-476a-4f92-866e-664b8b25e0fc_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489
Jan 23 05:46:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e402 e402: 3 total, 3 up, 3 in
Jan 23 05:46:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e402 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:06 np0005593234 nova_compute[227762]: 2026-01-23 10:46:06.963 227766 DEBUG nova.storage.rbd_utils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] creating snapshot(snap) on rbd image(a565646f-e62e-4f36-81a3-989264b3b2c8) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462
Jan 23 05:46:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:07.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:07.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e403 e403: 3 total, 3 up, 3 in
Jan 23 05:46:07 np0005593234 podman[325272]: 2026-01-23 10:46:07.814638101 +0000 UTC m=+0.105547199 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:46:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:09.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:09.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:09 np0005593234 nova_compute[227762]: 2026-01-23 10:46:09.562 227766 INFO nova.virt.libvirt.driver [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Snapshot image upload complete
Jan 23 05:46:09 np0005593234 nova_compute[227762]: 2026-01-23 10:46:09.563 227766 DEBUG nova.compute.manager [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:46:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e404 e404: 3 total, 3 up, 3 in
Jan 23 05:46:09 np0005593234 nova_compute[227762]: 2026-01-23 10:46:09.626 227766 INFO nova.compute.manager [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Shelve offloading
Jan 23 05:46:09 np0005593234 nova_compute[227762]: 2026-01-23 10:46:09.633 227766 INFO nova.virt.libvirt.driver [-] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Instance destroyed successfully.
Jan 23 05:46:09 np0005593234 nova_compute[227762]: 2026-01-23 10:46:09.634 227766 DEBUG nova.compute.manager [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 05:46:09 np0005593234 nova_compute[227762]: 2026-01-23 10:46:09.636 227766 DEBUG oslo_concurrency.lockutils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:46:09 np0005593234 nova_compute[227762]: 2026-01-23 10:46:09.636 227766 DEBUG oslo_concurrency.lockutils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquired lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:46:09 np0005593234 nova_compute[227762]: 2026-01-23 10:46:09.637 227766 DEBUG nova.network.neutron [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 05:46:09 np0005593234 nova_compute[227762]: 2026-01-23 10:46:09.722 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:09 np0005593234 nova_compute[227762]: 2026-01-23 10:46:09.751 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:09.920 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '81'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:46:10 np0005593234 nova_compute[227762]: 2026-01-23 10:46:10.773 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 05:46:10 np0005593234 nova_compute[227762]: 2026-01-23 10:46:10.774 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 05:46:10 np0005593234 nova_compute[227762]: 2026-01-23 10:46:10.774 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 05:46:10 np0005593234 nova_compute[227762]: 2026-01-23 10:46:10.791 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:46:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:46:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:11.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:46:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:46:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:11.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:46:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:46:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:13.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:46:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:46:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:13.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:46:13 np0005593234 nova_compute[227762]: 2026-01-23 10:46:13.417 227766 DEBUG nova.network.neutron [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updating instance_info_cache with network_info: [{"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:46:13 np0005593234 nova_compute[227762]: 2026-01-23 10:46:13.463 227766 DEBUG oslo_concurrency.lockutils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Releasing lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 05:46:13 np0005593234 nova_compute[227762]: 2026-01-23 10:46:13.467 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 05:46:13 np0005593234 nova_compute[227762]: 2026-01-23 10:46:13.468 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 05:46:13 np0005593234 nova_compute[227762]: 2026-01-23 10:46:13.468 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:46:14 np0005593234 nova_compute[227762]: 2026-01-23 10:46:14.724 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:14 np0005593234 nova_compute[227762]: 2026-01-23 10:46:14.752 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.029 227766 INFO nova.virt.libvirt.driver [-] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Instance destroyed successfully.
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.030 227766 DEBUG nova.objects.instance [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'resources' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 05:46:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:46:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:15.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.053 227766 DEBUG nova.virt.libvirt.vif [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:45:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-548177853',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-548177853',id=204,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHl4y18ZbM5/Piuxfm1CZQf0XAJEg8AGMP7q7u+IMXAx6Zt5rL1mJMSOsTTDZFJEhWjwFar/8Dgb+UXMig3/lwqhw1lDvKwVdlJtw/GUAFgmmy551r0TFxkUxIA1d9FOqw==',key_name='tempest-keypair-2130281890',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:45:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='36d7e7c7ddbd4cf785fafd0d35b0a2d8',ramdisk_id='',reservation_id='r-m9ehiyel',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-2030135659',owner_user_name='tempest-AttachVolumeShelveTestJSON-2030135659-project-member',shelved_at='2026-01-23T10:46:09.563460',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='a565646f-e62e-4f36-81a3-989264b3b2c8'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:46:01Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='296341ffca2441dc807d285fa14c966d',uuid=fb31c535-476a-4f92-866e-664b8b25e0fc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.054 227766 DEBUG nova.network.os_vif_util [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converting VIF {"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.055 227766 DEBUG nova.network.os_vif_util [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:19:d2,bridge_name='br-int',has_traffic_filtering=True,id=20adecab-752d-4d6c-95ac-7d46cf968926,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20adecab-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.056 227766 DEBUG os_vif [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:19:d2,bridge_name='br-int',has_traffic_filtering=True,id=20adecab-752d-4d6c-95ac-7d46cf968926,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20adecab-75') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.059 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.060 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20adecab-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.062 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.065 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.069 227766 INFO os_vif [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:19:d2,bridge_name='br-int',has_traffic_filtering=True,id=20adecab-752d-4d6c-95ac-7d46cf968926,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20adecab-75')
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.137 227766 DEBUG nova.compute.manager [req-81635621-47ca-4351-8fe5-300b44ecb9b1 req-fb1141e5-6229-4963-b0d2-0092a9eff7d6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received event network-changed-20adecab-752d-4d6c-95ac-7d46cf968926 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.137 227766 DEBUG nova.compute.manager [req-81635621-47ca-4351-8fe5-300b44ecb9b1 req-fb1141e5-6229-4963-b0d2-0092a9eff7d6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Refreshing instance network info cache due to event network-changed-20adecab-752d-4d6c-95ac-7d46cf968926. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.138 227766 DEBUG oslo_concurrency.lockutils [req-81635621-47ca-4351-8fe5-300b44ecb9b1 req-fb1141e5-6229-4963-b0d2-0092a9eff7d6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.388 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165160.386885, fb31c535-476a-4f92-866e-664b8b25e0fc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.389 227766 INFO nova.compute.manager [-] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] VM Stopped (Lifecycle Event)
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.399 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updating instance_info_cache with network_info: [{"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 05:46:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:46:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:15.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.419 227766 DEBUG nova.compute.manager [None req-68d4c652-31d7-4b99-b59f-e0386bed5061 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.421 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.421 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.421 227766 DEBUG oslo_concurrency.lockutils [req-81635621-47ca-4351-8fe5-300b44ecb9b1 req-fb1141e5-6229-4963-b0d2-0092a9eff7d6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.421 227766 DEBUG nova.network.neutron [req-81635621-47ca-4351-8fe5-300b44ecb9b1 req-fb1141e5-6229-4963-b0d2-0092a9eff7d6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Refreshing network info cache for port 20adecab-752d-4d6c-95ac-7d46cf968926 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.423 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.424 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.426 227766 DEBUG nova.compute.manager [None req-68d4c652-31d7-4b99-b59f-e0386bed5061 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: shelved, current task_state: shelving_offloading, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.451 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.452 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.452 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.452 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.452 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.501 227766 INFO nova.compute.manager [None req-68d4c652-31d7-4b99-b59f-e0386bed5061 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] During sync_power_state the instance has a pending task (shelving_offloading). Skip.#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.533 227766 INFO nova.virt.libvirt.driver [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Deleting instance files /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc_del#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.534 227766 INFO nova.virt.libvirt.driver [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Deletion of /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc_del complete#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.611 227766 INFO nova.scheduler.client.report [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Deleted allocations for instance fb31c535-476a-4f92-866e-664b8b25e0fc#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.668 227766 DEBUG oslo_concurrency.lockutils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.669 227766 DEBUG oslo_concurrency.lockutils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.704 227766 DEBUG oslo_concurrency.processutils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:46:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1345498199' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:46:15 np0005593234 nova_compute[227762]: 2026-01-23 10:46:15.918 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.118 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.120 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4120MB free_disk=20.942508697509766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.120 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:46:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1652188875' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.245 227766 DEBUG oslo_concurrency.processutils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.541s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.251 227766 DEBUG nova.compute.provider_tree [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.270 227766 DEBUG nova.scheduler.client.report [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.297 227766 DEBUG oslo_concurrency.lockutils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.628s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.300 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.379 227766 DEBUG oslo_concurrency.lockutils [None req-cd723135-f8e7-4e3e-8147-b8f492006b5f 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 18.745s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.389 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.390 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.407 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:46:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3312775657' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:46:16 np0005593234 podman[325379]: 2026-01-23 10:46:16.830373788 +0000 UTC m=+0.123183290 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.849 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.855 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.873 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.907 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:46:16 np0005593234 nova_compute[227762]: 2026-01-23 10:46:16.907 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:17 np0005593234 nova_compute[227762]: 2026-01-23 10:46:17.039 227766 DEBUG nova.network.neutron [req-81635621-47ca-4351-8fe5-300b44ecb9b1 req-fb1141e5-6229-4963-b0d2-0092a9eff7d6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updated VIF entry in instance network info cache for port 20adecab-752d-4d6c-95ac-7d46cf968926. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:46:17 np0005593234 nova_compute[227762]: 2026-01-23 10:46:17.040 227766 DEBUG nova.network.neutron [req-81635621-47ca-4351-8fe5-300b44ecb9b1 req-fb1141e5-6229-4963-b0d2-0092a9eff7d6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updating instance_info_cache with network_info: [{"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": null, "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap20adecab-75", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:46:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:46:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:17.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:46:17 np0005593234 nova_compute[227762]: 2026-01-23 10:46:17.061 227766 DEBUG oslo_concurrency.lockutils [req-81635621-47ca-4351-8fe5-300b44ecb9b1 req-fb1141e5-6229-4963-b0d2-0092a9eff7d6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:46:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:46:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:17.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #178. Immutable memtables: 0.
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:17.912703) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 113] Flushing memtable with next log file: 178
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165177912839, "job": 113, "event": "flush_started", "num_memtables": 1, "num_entries": 1265, "num_deletes": 254, "total_data_size": 2631659, "memory_usage": 2675600, "flush_reason": "Manual Compaction"}
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 113] Level-0 flush table #179: started
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165177941698, "cf_name": "default", "job": 113, "event": "table_file_creation", "file_number": 179, "file_size": 1736357, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 85373, "largest_seqno": 86633, "table_properties": {"data_size": 1730819, "index_size": 2932, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12456, "raw_average_key_size": 20, "raw_value_size": 1719490, "raw_average_value_size": 2814, "num_data_blocks": 128, "num_entries": 611, "num_filter_entries": 611, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165085, "oldest_key_time": 1769165085, "file_creation_time": 1769165177, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 179, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 113] Flush lasted 29044 microseconds, and 8262 cpu microseconds.
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:17.941773) [db/flush_job.cc:967] [default] [JOB 113] Level-0 flush table #179: 1736357 bytes OK
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:17.941800) [db/memtable_list.cc:519] [default] Level-0 commit table #179 started
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:17.943814) [db/memtable_list.cc:722] [default] Level-0 commit table #179: memtable #1 done
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:17.943837) EVENT_LOG_v1 {"time_micros": 1769165177943830, "job": 113, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:17.943858) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 113] Try to delete WAL files size 2625562, prev total WAL file size 2625562, number of live WAL files 2.
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000175.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:17.944894) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037353330' seq:72057594037927935, type:22 .. '7061786F730037373832' seq:0, type:0; will stop at (end)
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 114] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 113 Base level 0, inputs: [179(1695KB)], [177(12MB)]
Jan 23 05:46:17 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165177945124, "job": 114, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [179], "files_L6": [177], "score": -1, "input_data_size": 14967417, "oldest_snapshot_seqno": -1}
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 114] Generated table #180: 10402 keys, 13035955 bytes, temperature: kUnknown
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165178082921, "cf_name": "default", "job": 114, "event": "table_file_creation", "file_number": 180, "file_size": 13035955, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12969237, "index_size": 39597, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26053, "raw_key_size": 275309, "raw_average_key_size": 26, "raw_value_size": 12787640, "raw_average_value_size": 1229, "num_data_blocks": 1504, "num_entries": 10402, "num_filter_entries": 10402, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769165177, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 180, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:18.083149) [db/compaction/compaction_job.cc:1663] [default] [JOB 114] Compacted 1@0 + 1@6 files to L6 => 13035955 bytes
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:18.084663) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 108.6 rd, 94.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 12.6 +0.0 blob) out(12.4 +0.0 blob), read-write-amplify(16.1) write-amplify(7.5) OK, records in: 10929, records dropped: 527 output_compression: NoCompression
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:18.084680) EVENT_LOG_v1 {"time_micros": 1769165178084672, "job": 114, "event": "compaction_finished", "compaction_time_micros": 137859, "compaction_time_cpu_micros": 55112, "output_level": 6, "num_output_files": 1, "total_output_size": 13035955, "num_input_records": 10929, "num_output_records": 10402, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000179.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165178085072, "job": 114, "event": "table_file_deletion", "file_number": 179}
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000177.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165178087316, "job": 114, "event": "table_file_deletion", "file_number": 177}
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:17.944668) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:18.087370) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:18.087375) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:18.087377) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:18.087379) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:46:18 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:46:18.087381) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:46:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:19.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:19 np0005593234 nova_compute[227762]: 2026-01-23 10:46:19.230 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:46:19 np0005593234 nova_compute[227762]: 2026-01-23 10:46:19.230 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:46:19 np0005593234 nova_compute[227762]: 2026-01-23 10:46:19.230 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:46:19 np0005593234 nova_compute[227762]: 2026-01-23 10:46:19.231 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:46:19 np0005593234 nova_compute[227762]: 2026-01-23 10:46:19.231 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:46:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:19.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:19 np0005593234 nova_compute[227762]: 2026-01-23 10:46:19.726 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:19 np0005593234 nova_compute[227762]: 2026-01-23 10:46:19.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.120 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.273 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.274 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.274 227766 INFO nova.compute.manager [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Unshelving#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.385 227766 INFO nova.virt.block_device [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Booting with volume 183950c6-1381-47bd-9def-115173b33253 at /dev/vdc#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.563 227766 DEBUG os_brick.utils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.565 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.579 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.015s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.580 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[57435c15-e784-495c-a1f6-1cc25432ae6f]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.581 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.589 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.008s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.589 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[29994654-4169-45a2-9ca7-c3eeb3990d03]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.591 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.600 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.009s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.600 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[76f8b54a-4ba6-4b30-8f98-b32a5bbc134c]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.602 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[de74eb85-fa5b-4a62-9dfe-3c3ae014cd51]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.604 227766 DEBUG oslo_concurrency.processutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.637 227766 DEBUG oslo_concurrency.processutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "nvme version" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.640 227766 DEBUG os_brick.initiator.connectors.lightos [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.640 227766 DEBUG os_brick.initiator.connectors.lightos [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.641 227766 DEBUG os_brick.initiator.connectors.lightos [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.641 227766 DEBUG os_brick.utils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] <== get_connector_properties: return (76ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:46:20 np0005593234 nova_compute[227762]: 2026-01-23 10:46:20.641 227766 DEBUG nova.virt.block_device [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updating existing volume attachment record: 3d2ac2a0-6592-42ab-ad7e-3983e01cbe98 _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:46:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:21.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:46:21 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/884371159' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:46:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:21.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:21 np0005593234 nova_compute[227762]: 2026-01-23 10:46:21.748 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:21 np0005593234 nova_compute[227762]: 2026-01-23 10:46:21.748 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:21 np0005593234 nova_compute[227762]: 2026-01-23 10:46:21.754 227766 DEBUG nova.objects.instance [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'pci_requests' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:46:21 np0005593234 nova_compute[227762]: 2026-01-23 10:46:21.771 227766 DEBUG nova.objects.instance [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'numa_topology' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:46:21 np0005593234 nova_compute[227762]: 2026-01-23 10:46:21.789 227766 DEBUG nova.virt.hardware [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:46:21 np0005593234 nova_compute[227762]: 2026-01-23 10:46:21.789 227766 INFO nova.compute.claims [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:46:21 np0005593234 nova_compute[227762]: 2026-01-23 10:46:21.909 227766 DEBUG oslo_concurrency.processutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:46:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1707159134' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:46:22 np0005593234 nova_compute[227762]: 2026-01-23 10:46:22.325 227766 DEBUG oslo_concurrency.processutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:22 np0005593234 nova_compute[227762]: 2026-01-23 10:46:22.331 227766 DEBUG nova.compute.provider_tree [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:46:22 np0005593234 nova_compute[227762]: 2026-01-23 10:46:22.356 227766 DEBUG nova.scheduler.client.report [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:46:22 np0005593234 nova_compute[227762]: 2026-01-23 10:46:22.393 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.644s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:22 np0005593234 nova_compute[227762]: 2026-01-23 10:46:22.609 227766 INFO nova.network.neutron [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updating port 20adecab-752d-4d6c-95ac-7d46cf968926 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 23 05:46:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:46:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:23.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:46:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:23.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:23 np0005593234 nova_compute[227762]: 2026-01-23 10:46:23.748 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:46:23 np0005593234 nova_compute[227762]: 2026-01-23 10:46:23.748 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquired lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:46:23 np0005593234 nova_compute[227762]: 2026-01-23 10:46:23.748 227766 DEBUG nova.network.neutron [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:46:23 np0005593234 nova_compute[227762]: 2026-01-23 10:46:23.878 227766 DEBUG nova.compute.manager [req-273f293c-8d83-4917-9a1e-c1931a9fc6d3 req-ff81462f-d5e1-47f5-85ff-af1060be2fe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received event network-changed-20adecab-752d-4d6c-95ac-7d46cf968926 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:46:23 np0005593234 nova_compute[227762]: 2026-01-23 10:46:23.878 227766 DEBUG nova.compute.manager [req-273f293c-8d83-4917-9a1e-c1931a9fc6d3 req-ff81462f-d5e1-47f5-85ff-af1060be2fe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Refreshing instance network info cache due to event network-changed-20adecab-752d-4d6c-95ac-7d46cf968926. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:46:23 np0005593234 nova_compute[227762]: 2026-01-23 10:46:23.878 227766 DEBUG oslo_concurrency.lockutils [req-273f293c-8d83-4917-9a1e-c1931a9fc6d3 req-ff81462f-d5e1-47f5-85ff-af1060be2fe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:46:24 np0005593234 nova_compute[227762]: 2026-01-23 10:46:24.729 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:24 np0005593234 nova_compute[227762]: 2026-01-23 10:46:24.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:46:24 np0005593234 nova_compute[227762]: 2026-01-23 10:46:24.969 227766 DEBUG nova.network.neutron [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updating instance_info_cache with network_info: [{"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:46:24 np0005593234 nova_compute[227762]: 2026-01-23 10:46:24.996 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Releasing lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:46:24 np0005593234 nova_compute[227762]: 2026-01-23 10:46:24.998 227766 DEBUG nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:46:24 np0005593234 nova_compute[227762]: 2026-01-23 10:46:24.998 227766 INFO nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Creating image(s)#033[00m
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.022 227766 DEBUG nova.storage.rbd_utils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image fb31c535-476a-4f92-866e-664b8b25e0fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.025 227766 DEBUG nova.objects.instance [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.026 227766 DEBUG oslo_concurrency.lockutils [req-273f293c-8d83-4917-9a1e-c1931a9fc6d3 req-ff81462f-d5e1-47f5-85ff-af1060be2fe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.027 227766 DEBUG nova.network.neutron [req-273f293c-8d83-4917-9a1e-c1931a9fc6d3 req-ff81462f-d5e1-47f5-85ff-af1060be2fe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Refreshing network info cache for port 20adecab-752d-4d6c-95ac-7d46cf968926 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:46:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:25.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.070 227766 DEBUG nova.storage.rbd_utils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image fb31c535-476a-4f92-866e-664b8b25e0fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.097 227766 DEBUG nova.storage.rbd_utils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image fb31c535-476a-4f92-866e-664b8b25e0fc_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.101 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "9ce7e694ceaf743d68b8ff4cf4f3450e89c59e17" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.101 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "9ce7e694ceaf743d68b8ff4cf4f3450e89c59e17" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.121 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:25.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.511 227766 DEBUG nova.virt.libvirt.imagebackend [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/a565646f-e62e-4f36-81a3-989264b3b2c8/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/a565646f-e62e-4f36-81a3-989264b3b2c8/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.575 227766 DEBUG nova.virt.libvirt.imagebackend [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Selected location: {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/a565646f-e62e-4f36-81a3-989264b3b2c8/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.577 227766 DEBUG nova.storage.rbd_utils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] cloning images/a565646f-e62e-4f36-81a3-989264b3b2c8@snap to None/fb31c535-476a-4f92-866e-664b8b25e0fc_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.690 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "9ce7e694ceaf743d68b8ff4cf4f3450e89c59e17" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.589s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.809 227766 DEBUG nova.objects.instance [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'migration_context' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:46:25 np0005593234 nova_compute[227762]: 2026-01-23 10:46:25.866 227766 DEBUG nova.storage.rbd_utils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] flattening vms/fb31c535-476a-4f92-866e-664b8b25e0fc_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.246 227766 DEBUG nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Image rbd:vms/fb31c535-476a-4f92-866e-664b8b25e0fc_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.247 227766 DEBUG nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.248 227766 DEBUG nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Ensure instance console log exists: /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.248 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.249 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.249 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.253 227766 DEBUG nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Start _get_guest_xml network_info=[{"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vdc': {'bus': 'virtio', 'dev': 'vdc', 'type': 'disk'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-23T10:45:57Z,direct_url=<?>,disk_format='raw',id=a565646f-e62e-4f36-81a3-989264b3b2c8,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-548177853-shelved',owner='36d7e7c7ddbd4cf785fafd0d35b0a2d8',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-23T10:46:09Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [{'boot_index': None, 'mount_device': '/dev/vdc', 'delete_on_termination': False, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-183950c6-1381-47bd-9def-115173b33253', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '183950c6-1381-47bd-9def-115173b33253', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'attached', 'instance': 'fb31c535-476a-4f92-866e-664b8b25e0fc', 'attached_at': '', 'detached_at': '', 'volume_id': '183950c6-1381-47bd-9def-115173b33253', 'serial': '183950c6-1381-47bd-9def-115173b33253'}, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '3d2ac2a0-6592-42ab-ad7e-3983e01cbe98', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.256 227766 WARNING nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.262 227766 DEBUG nova.virt.libvirt.host [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.263 227766 DEBUG nova.virt.libvirt.host [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.266 227766 DEBUG nova.virt.libvirt.host [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.268 227766 DEBUG nova.virt.libvirt.host [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.269 227766 DEBUG nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.269 227766 DEBUG nova.virt.hardware [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-23T10:45:57Z,direct_url=<?>,disk_format='raw',id=a565646f-e62e-4f36-81a3-989264b3b2c8,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-548177853-shelved',owner='36d7e7c7ddbd4cf785fafd0d35b0a2d8',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-23T10:46:09Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.270 227766 DEBUG nova.virt.hardware [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.270 227766 DEBUG nova.virt.hardware [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.270 227766 DEBUG nova.virt.hardware [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.270 227766 DEBUG nova.virt.hardware [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.271 227766 DEBUG nova.virt.hardware [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.271 227766 DEBUG nova.virt.hardware [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.271 227766 DEBUG nova.virt.hardware [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.271 227766 DEBUG nova.virt.hardware [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.271 227766 DEBUG nova.virt.hardware [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.272 227766 DEBUG nova.virt.hardware [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.272 227766 DEBUG nova.objects.instance [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.296 227766 DEBUG oslo_concurrency.processutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:46:26 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2115956439' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.750 227766 DEBUG oslo_concurrency.processutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.778 227766 DEBUG nova.storage.rbd_utils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image fb31c535-476a-4f92-866e-664b8b25e0fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:46:26 np0005593234 nova_compute[227762]: 2026-01-23 10:46:26.782 227766 DEBUG oslo_concurrency.processutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e404 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:27.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:46:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/187612226' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.244 227766 DEBUG oslo_concurrency.processutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:46:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:27.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.559 227766 DEBUG nova.virt.libvirt.vif [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:45:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-548177853',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-548177853',id=204,image_ref='a565646f-e62e-4f36-81a3-989264b3b2c8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-2130281890',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:45:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='36d7e7c7ddbd4cf785fafd0d35b0a2d8',ramdisk_id='',reservation_id='r-m9ehiyel',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vi
f_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-2030135659',owner_user_name='tempest-AttachVolumeShelveTestJSON-2030135659-project-member',shelved_at='2026-01-23T10:46:09.563460',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='a565646f-e62e-4f36-81a3-989264b3b2c8'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:46:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='296341ffca2441dc807d285fa14c966d',uuid=fb31c535-476a-4f92-866e-664b8b25e0fc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.560 227766 DEBUG nova.network.os_vif_util [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converting VIF {"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.562 227766 DEBUG nova.network.os_vif_util [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:19:d2,bridge_name='br-int',has_traffic_filtering=True,id=20adecab-752d-4d6c-95ac-7d46cf968926,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20adecab-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.564 227766 DEBUG nova.objects.instance [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.596 227766 DEBUG nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  <uuid>fb31c535-476a-4f92-866e-664b8b25e0fc</uuid>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  <name>instance-000000cc</name>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-548177853</nova:name>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:46:26</nova:creationTime>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <nova:user uuid="296341ffca2441dc807d285fa14c966d">tempest-AttachVolumeShelveTestJSON-2030135659-project-member</nova:user>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <nova:project uuid="36d7e7c7ddbd4cf785fafd0d35b0a2d8">tempest-AttachVolumeShelveTestJSON-2030135659</nova:project>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="a565646f-e62e-4f36-81a3-989264b3b2c8"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <nova:port uuid="20adecab-752d-4d6c-95ac-7d46cf968926">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <entry name="serial">fb31c535-476a-4f92-866e-664b8b25e0fc</entry>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <entry name="uuid">fb31c535-476a-4f92-866e-664b8b25e0fc</entry>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/fb31c535-476a-4f92-866e-664b8b25e0fc_disk">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/fb31c535-476a-4f92-866e-664b8b25e0fc_disk.config">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="volumes/volume-183950c6-1381-47bd-9def-115173b33253">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <target dev="vdc" bus="virtio"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <serial>183950c6-1381-47bd-9def-115173b33253</serial>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:bd:19:d2"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <target dev="tap20adecab-75"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/console.log" append="off"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <input type="keyboard" bus="usb"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:46:27 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:46:27 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:46:27 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:46:27 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.598 227766 DEBUG nova.compute.manager [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Preparing to wait for external event network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.598 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.598 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.599 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.600 227766 DEBUG nova.virt.libvirt.vif [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:45:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-548177853',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-548177853',id=204,image_ref='a565646f-e62e-4f36-81a3-989264b3b2c8',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-2130281890',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:45:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='36d7e7c7ddbd4cf785fafd0d35b0a2d8',ramdisk_id='',reservation_id='r-m9ehiyel',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-2030135659',owner_user_name='tempest-AttachVolumeShelveTestJSON-2030135659-project-member',shelved_at='2026-01-23T10:46:09.563460',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='a565646f-e62e-4f36-81a3-989264b3b2c8'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:46:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='296341ffca2441dc807d285fa14c966d',uuid=fb31c535-476a-4f92-866e-664b8b25e0fc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.600 227766 DEBUG nova.network.os_vif_util [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converting VIF {"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.601 227766 DEBUG nova.network.os_vif_util [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:19:d2,bridge_name='br-int',has_traffic_filtering=True,id=20adecab-752d-4d6c-95ac-7d46cf968926,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20adecab-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.601 227766 DEBUG os_vif [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:19:d2,bridge_name='br-int',has_traffic_filtering=True,id=20adecab-752d-4d6c-95ac-7d46cf968926,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20adecab-75') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.602 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.602 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.603 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.606 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.607 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap20adecab-75, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.607 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap20adecab-75, col_values=(('external_ids', {'iface-id': '20adecab-752d-4d6c-95ac-7d46cf968926', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:bd:19:d2', 'vm-uuid': 'fb31c535-476a-4f92-866e-664b8b25e0fc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.609 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:27 np0005593234 NetworkManager[48942]: <info>  [1769165187.6101] manager: (tap20adecab-75): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/408)
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.611 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.616 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.617 227766 INFO os_vif [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:19:d2,bridge_name='br-int',has_traffic_filtering=True,id=20adecab-752d-4d6c-95ac-7d46cf968926,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20adecab-75')#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.694 227766 DEBUG nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.694 227766 DEBUG nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No BDM found with device name vdc, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.695 227766 DEBUG nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.695 227766 DEBUG nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No VIF found with MAC fa:16:3e:bd:19:d2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.695 227766 INFO nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Using config drive#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.725 227766 DEBUG nova.storage.rbd_utils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image fb31c535-476a-4f92-866e-664b8b25e0fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.750 227766 DEBUG nova.objects.instance [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'ec2_ids' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:46:27 np0005593234 nova_compute[227762]: 2026-01-23 10:46:27.792 227766 DEBUG nova.objects.instance [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'keypairs' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:46:28 np0005593234 nova_compute[227762]: 2026-01-23 10:46:28.578 227766 INFO nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Creating config drive at /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/disk.config#033[00m
Jan 23 05:46:28 np0005593234 nova_compute[227762]: 2026-01-23 10:46:28.584 227766 DEBUG oslo_concurrency.processutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwffzl9v6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:28 np0005593234 nova_compute[227762]: 2026-01-23 10:46:28.719 227766 DEBUG oslo_concurrency.processutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwffzl9v6" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:28 np0005593234 nova_compute[227762]: 2026-01-23 10:46:28.749 227766 DEBUG nova.storage.rbd_utils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image fb31c535-476a-4f92-866e-664b8b25e0fc_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:46:28 np0005593234 nova_compute[227762]: 2026-01-23 10:46:28.753 227766 DEBUG oslo_concurrency.processutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/disk.config fb31c535-476a-4f92-866e-664b8b25e0fc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:46:28 np0005593234 nova_compute[227762]: 2026-01-23 10:46:28.931 227766 DEBUG oslo_concurrency.processutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/disk.config fb31c535-476a-4f92-866e-664b8b25e0fc_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.178s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:46:28 np0005593234 nova_compute[227762]: 2026-01-23 10:46:28.932 227766 INFO nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Deleting local config drive /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc/disk.config because it was imported into RBD.#033[00m
Jan 23 05:46:28 np0005593234 kernel: tap20adecab-75: entered promiscuous mode
Jan 23 05:46:29 np0005593234 NetworkManager[48942]: <info>  [1769165188.9873] manager: (tap20adecab-75): new Tun device (/org/freedesktop/NetworkManager/Devices/409)
Jan 23 05:46:29 np0005593234 systemd-udevd[326088]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:46:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:46:29Z|00870|binding|INFO|Claiming lport 20adecab-752d-4d6c-95ac-7d46cf968926 for this chassis.
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.019 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:46:29Z|00871|binding|INFO|20adecab-752d-4d6c-95ac-7d46cf968926: Claiming fa:16:3e:bd:19:d2 10.100.0.10
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.026 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:19:d2 10.100.0.10'], port_security=['fa:16:3e:bd:19:d2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fb31c535-476a-4f92-866e-664b8b25e0fc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-082e2952-c529-49ec-88e6-5e5c5580db01', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '36d7e7c7ddbd4cf785fafd0d35b0a2d8', 'neutron:revision_number': '7', 'neutron:security_group_ids': '556b10c0-b0e4-46d0-9f13-66e00af94e5f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.192'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6ea75f5-7173-4e04-a97c-cbcceff41ada, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=20adecab-752d-4d6c-95ac-7d46cf968926) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.027 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 20adecab-752d-4d6c-95ac-7d46cf968926 in datapath 082e2952-c529-49ec-88e6-5e5c5580db01 bound to our chassis#033[00m
Jan 23 05:46:29 np0005593234 NetworkManager[48942]: <info>  [1769165189.0299] device (tap20adecab-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:46:29 np0005593234 NetworkManager[48942]: <info>  [1769165189.0312] device (tap20adecab-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.028 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 082e2952-c529-49ec-88e6-5e5c5580db01#033[00m
Jan 23 05:46:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:46:29Z|00872|binding|INFO|Setting lport 20adecab-752d-4d6c-95ac-7d46cf968926 ovn-installed in OVS
Jan 23 05:46:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:46:29Z|00873|binding|INFO|Setting lport 20adecab-752d-4d6c-95ac-7d46cf968926 up in Southbound
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.035 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.042 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[763a1d31-6258-4fb6-8e4e-96b317ecc1f0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.043 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap082e2952-c1 in ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.044 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap082e2952-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.045 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[62c0a0ad-4f9b-4890-8897-75d093d2dcc8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.045 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fbaf5a8b-f94d-4b79-99f5-afdc0f3eb7a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 systemd-machined[195626]: New machine qemu-98-instance-000000cc.
Jan 23 05:46:29 np0005593234 systemd[1]: Started Virtual Machine qemu-98-instance-000000cc.
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.059 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[82063e38-5695-4b68-86d6-0e32238c7242]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:29.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.083 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e8fb761a-2d71-4876-aa33-3fedff74387d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.112 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7f169e89-ae3b-4a9c-bc22-64f8f4ee7e3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 NetworkManager[48942]: <info>  [1769165189.1207] manager: (tap082e2952-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/410)
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.120 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c24afbd7-2cab-418f-b453-1ac30cad0e7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.151 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f251d08c-fc0e-43f9-b7f1-88b36939798f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.154 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f031de6a-afaa-4448-bafb-60d714d1925b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:46:29 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:46:29 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:46:29 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:46:29 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:46:29 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:46:29 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:46:29 np0005593234 NetworkManager[48942]: <info>  [1769165189.1781] device (tap082e2952-c0): carrier: link connected
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.183 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[0bba23d5-0ef2-4442-a9ad-bab6a13b2433]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.198 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[68d614e8-f477-4c2f-8222-27489814d921]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap082e2952-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:8e:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 267], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 908413, 'reachable_time': 33194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 326124, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.212 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f3cd68de-ed8d-410c-b542-cea311a3327b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:8e23'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 908413, 'tstamp': 908413}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 326125, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.228 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c3029be5-05b4-42d0-8f62-688229ae107f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap082e2952-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:8e:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 267], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 908413, 'reachable_time': 33194, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 326126, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.260 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[384a0992-5c8d-4946-bca7-bfcb4c02e905]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.321 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ba2a85a7-e5e3-47d3-a0fe-a1e09a0e3c2c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.322 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap082e2952-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.323 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.324 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap082e2952-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.325 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:29 np0005593234 kernel: tap082e2952-c0: entered promiscuous mode
Jan 23 05:46:29 np0005593234 NetworkManager[48942]: <info>  [1769165189.3264] manager: (tap082e2952-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/411)
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.328 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.329 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap082e2952-c0, col_values=(('external_ids', {'iface-id': 'e36b250d-7843-417b-b3f6-5e001769e85d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.330 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:29 np0005593234 ovn_controller[134547]: 2026-01-23T10:46:29Z|00874|binding|INFO|Releasing lport e36b250d-7843-417b-b3f6-5e001769e85d from this chassis (sb_readonly=0)
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.344 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.345 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/082e2952-c529-49ec-88e6-5e5c5580db01.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/082e2952-c529-49ec-88e6-5e5c5580db01.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.346 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1f8766c6-bb39-43fb-9cb2-7735b31dd1b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.347 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-082e2952-c529-49ec-88e6-5e5c5580db01
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/082e2952-c529-49ec-88e6-5e5c5580db01.pid.haproxy
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 082e2952-c529-49ec-88e6-5e5c5580db01
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:46:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:29.348 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'env', 'PROCESS_TAG=haproxy-082e2952-c529-49ec-88e6-5e5c5580db01', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/082e2952-c529-49ec-88e6-5e5c5580db01.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:46:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:29.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.488 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165189.4886987, fb31c535-476a-4f92-866e-664b8b25e0fc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.489 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] VM Started (Lifecycle Event)#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.502 227766 DEBUG nova.network.neutron [req-273f293c-8d83-4917-9a1e-c1931a9fc6d3 req-ff81462f-d5e1-47f5-85ff-af1060be2fe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updated VIF entry in instance network info cache for port 20adecab-752d-4d6c-95ac-7d46cf968926. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.502 227766 DEBUG nova.network.neutron [req-273f293c-8d83-4917-9a1e-c1931a9fc6d3 req-ff81462f-d5e1-47f5-85ff-af1060be2fe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updating instance_info_cache with network_info: [{"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.524 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.528 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165189.4917836, fb31c535-476a-4f92-866e-664b8b25e0fc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.529 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.530 227766 DEBUG oslo_concurrency.lockutils [req-273f293c-8d83-4917-9a1e-c1931a9fc6d3 req-ff81462f-d5e1-47f5-85ff-af1060be2fe3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fb31c535-476a-4f92-866e-664b8b25e0fc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.565 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.568 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.590 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:46:29 np0005593234 podman[326218]: 2026-01-23 10:46:29.70102354 +0000 UTC m=+0.053043028 container create 1f9fea24f69fcfa94a93efa8b12e4cdc9b90d375b92710fa0d195bd468203859 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.731 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:29 np0005593234 systemd[1]: Started libpod-conmon-1f9fea24f69fcfa94a93efa8b12e4cdc9b90d375b92710fa0d195bd468203859.scope.
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.748 227766 DEBUG nova.compute.manager [req-5a64889a-6437-4569-b253-dbded58d72a6 req-bac5243b-61f1-4e3e-a785-aac588ed3cfb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received event network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.748 227766 DEBUG oslo_concurrency.lockutils [req-5a64889a-6437-4569-b253-dbded58d72a6 req-bac5243b-61f1-4e3e-a785-aac588ed3cfb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.749 227766 DEBUG oslo_concurrency.lockutils [req-5a64889a-6437-4569-b253-dbded58d72a6 req-bac5243b-61f1-4e3e-a785-aac588ed3cfb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.749 227766 DEBUG oslo_concurrency.lockutils [req-5a64889a-6437-4569-b253-dbded58d72a6 req-bac5243b-61f1-4e3e-a785-aac588ed3cfb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.749 227766 DEBUG nova.compute.manager [req-5a64889a-6437-4569-b253-dbded58d72a6 req-bac5243b-61f1-4e3e-a785-aac588ed3cfb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Processing event network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.750 227766 DEBUG nova.compute.manager [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.755 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165189.7550728, fb31c535-476a-4f92-866e-664b8b25e0fc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.755 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.757 227766 DEBUG nova.virt.libvirt.driver [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.760 227766 INFO nova.virt.libvirt.driver [-] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Instance spawned successfully.#033[00m
Jan 23 05:46:29 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:46:29 np0005593234 podman[326218]: 2026-01-23 10:46:29.668159504 +0000 UTC m=+0.020179012 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:46:29 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c42725e3098cdd5d9935f1cf36dfd75346c9d5715dc9292a055a1ce4126c9c28/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:46:29 np0005593234 podman[326218]: 2026-01-23 10:46:29.779616456 +0000 UTC m=+0.131635974 container init 1f9fea24f69fcfa94a93efa8b12e4cdc9b90d375b92710fa0d195bd468203859 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.785 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:46:29 np0005593234 podman[326218]: 2026-01-23 10:46:29.78744508 +0000 UTC m=+0.139464568 container start 1f9fea24f69fcfa94a93efa8b12e4cdc9b90d375b92710fa0d195bd468203859 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.791 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:46:29 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[326233]: [NOTICE]   (326237) : New worker (326239) forked
Jan 23 05:46:29 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[326233]: [NOTICE]   (326237) : Loading success.
Jan 23 05:46:29 np0005593234 nova_compute[227762]: 2026-01-23 10:46:29.814 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:46:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e405 e405: 3 total, 3 up, 3 in
Jan 23 05:46:30 np0005593234 nova_compute[227762]: 2026-01-23 10:46:30.626 227766 DEBUG nova.compute.manager [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:46:30 np0005593234 nova_compute[227762]: 2026-01-23 10:46:30.702 227766 DEBUG oslo_concurrency.lockutils [None req-1924b2a3-7391-4ad4-86e9-33a993fa7e3a 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 10.428s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:31.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:31.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:31 np0005593234 nova_compute[227762]: 2026-01-23 10:46:31.895 227766 DEBUG nova.compute.manager [req-74304d74-2c57-4c9c-8e05-6b24995c8eb1 req-2421f0ec-9c78-474f-8037-b4be305ac464 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received event network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:46:31 np0005593234 nova_compute[227762]: 2026-01-23 10:46:31.896 227766 DEBUG oslo_concurrency.lockutils [req-74304d74-2c57-4c9c-8e05-6b24995c8eb1 req-2421f0ec-9c78-474f-8037-b4be305ac464 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:31 np0005593234 nova_compute[227762]: 2026-01-23 10:46:31.896 227766 DEBUG oslo_concurrency.lockutils [req-74304d74-2c57-4c9c-8e05-6b24995c8eb1 req-2421f0ec-9c78-474f-8037-b4be305ac464 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:31 np0005593234 nova_compute[227762]: 2026-01-23 10:46:31.898 227766 DEBUG oslo_concurrency.lockutils [req-74304d74-2c57-4c9c-8e05-6b24995c8eb1 req-2421f0ec-9c78-474f-8037-b4be305ac464 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:31 np0005593234 nova_compute[227762]: 2026-01-23 10:46:31.899 227766 DEBUG nova.compute.manager [req-74304d74-2c57-4c9c-8e05-6b24995c8eb1 req-2421f0ec-9c78-474f-8037-b4be305ac464 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] No waiting events found dispatching network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:46:31 np0005593234 nova_compute[227762]: 2026-01-23 10:46:31.899 227766 WARNING nova.compute.manager [req-74304d74-2c57-4c9c-8e05-6b24995c8eb1 req-2421f0ec-9c78-474f-8037-b4be305ac464 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received unexpected event network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:46:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e405 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:32 np0005593234 nova_compute[227762]: 2026-01-23 10:46:32.612 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:46:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:33.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:46:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:33.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e406 e406: 3 total, 3 up, 3 in
Jan 23 05:46:34 np0005593234 nova_compute[227762]: 2026-01-23 10:46:34.733 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:46:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:35.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:46:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:46:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:35.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:46:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:46:36 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:46:36 np0005593234 nova_compute[227762]: 2026-01-23 10:46:36.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:46:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:37.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:37.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:37 np0005593234 nova_compute[227762]: 2026-01-23 10:46:37.616 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:38 np0005593234 podman[326303]: 2026-01-23 10:46:38.762518544 +0000 UTC m=+0.052633951 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Jan 23 05:46:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:39.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:39.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:39 np0005593234 nova_compute[227762]: 2026-01-23 10:46:39.735 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:41.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:41.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:42 np0005593234 nova_compute[227762]: 2026-01-23 10:46:42.620 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:42 np0005593234 ovn_controller[134547]: 2026-01-23T10:46:42Z|00114|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:bd:19:d2 10.100.0.10
Jan 23 05:46:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:42.886 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:46:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:42.887 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:46:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:46:42.888 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:46:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:43.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:43.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:46:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2419190085' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:46:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:46:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2419190085' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:46:44 np0005593234 nova_compute[227762]: 2026-01-23 10:46:44.737 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:45.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:45.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:47.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:47 np0005593234 ovn_controller[134547]: 2026-01-23T10:46:47Z|00875|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 23 05:46:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e406 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:47.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:47 np0005593234 nova_compute[227762]: 2026-01-23 10:46:47.623 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:47 np0005593234 podman[326377]: 2026-01-23 10:46:47.788894416 +0000 UTC m=+0.087992547 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:46:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:49.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:49.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:49 np0005593234 nova_compute[227762]: 2026-01-23 10:46:49.740 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 e407: 3 total, 3 up, 3 in
Jan 23 05:46:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:51.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:51.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:52 np0005593234 nova_compute[227762]: 2026-01-23 10:46:52.628 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:53.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:53.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:54 np0005593234 nova_compute[227762]: 2026-01-23 10:46:54.743 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:55.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:46:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:55.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:46:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:57.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:46:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:57.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:57 np0005593234 nova_compute[227762]: 2026-01-23 10:46:57.631 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:46:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:46:59.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:46:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:46:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:46:59.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:46:59 np0005593234 nova_compute[227762]: 2026-01-23 10:46:59.745 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:00 np0005593234 nova_compute[227762]: 2026-01-23 10:47:00.915 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:00.917 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=82, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=81) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:47:00 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:00.918 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:47:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:01.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:01.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:02 np0005593234 nova_compute[227762]: 2026-01-23 10:47:02.671 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:03.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:47:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:03.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:47:04 np0005593234 nova_compute[227762]: 2026-01-23 10:47:04.746 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:04.920 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '82'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:05.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:05 np0005593234 nova_compute[227762]: 2026-01-23 10:47:05.251 227766 DEBUG oslo_concurrency.lockutils [None req-a3aec855-e3b6-44de-a370-c8c69a915643 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:05 np0005593234 nova_compute[227762]: 2026-01-23 10:47:05.251 227766 DEBUG oslo_concurrency.lockutils [None req-a3aec855-e3b6-44de-a370-c8c69a915643 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:05 np0005593234 nova_compute[227762]: 2026-01-23 10:47:05.271 227766 INFO nova.compute.manager [None req-a3aec855-e3b6-44de-a370-c8c69a915643 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Detaching volume 183950c6-1381-47bd-9def-115173b33253#033[00m
Jan 23 05:47:05 np0005593234 nova_compute[227762]: 2026-01-23 10:47:05.426 227766 INFO nova.virt.block_device [None req-a3aec855-e3b6-44de-a370-c8c69a915643 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Attempting to driver detach volume 183950c6-1381-47bd-9def-115173b33253 from mountpoint /dev/vdc#033[00m
Jan 23 05:47:05 np0005593234 nova_compute[227762]: 2026-01-23 10:47:05.439 227766 DEBUG nova.virt.libvirt.driver [None req-a3aec855-e3b6-44de-a370-c8c69a915643 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Attempting to detach device vdc from instance fb31c535-476a-4f92-866e-664b8b25e0fc from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:47:05 np0005593234 nova_compute[227762]: 2026-01-23 10:47:05.439 227766 DEBUG nova.virt.libvirt.guest [None req-a3aec855-e3b6-44de-a370-c8c69a915643 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:47:05 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:47:05 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-183950c6-1381-47bd-9def-115173b33253">
Jan 23 05:47:05 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:47:05 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:47:05 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:47:05 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:47:05 np0005593234 nova_compute[227762]:  <target dev="vdc" bus="virtio"/>
Jan 23 05:47:05 np0005593234 nova_compute[227762]:  <serial>183950c6-1381-47bd-9def-115173b33253</serial>
Jan 23 05:47:05 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 23 05:47:05 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:47:05 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:47:05 np0005593234 nova_compute[227762]: 2026-01-23 10:47:05.447 227766 INFO nova.virt.libvirt.driver [None req-a3aec855-e3b6-44de-a370-c8c69a915643 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Successfully detached device vdc from instance fb31c535-476a-4f92-866e-664b8b25e0fc from the persistent domain config.#033[00m
Jan 23 05:47:05 np0005593234 nova_compute[227762]: 2026-01-23 10:47:05.447 227766 DEBUG nova.virt.libvirt.driver [None req-a3aec855-e3b6-44de-a370-c8c69a915643 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] (1/8): Attempting to detach device vdc with device alias virtio-disk2 from instance fb31c535-476a-4f92-866e-664b8b25e0fc from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:47:05 np0005593234 nova_compute[227762]: 2026-01-23 10:47:05.447 227766 DEBUG nova.virt.libvirt.guest [None req-a3aec855-e3b6-44de-a370-c8c69a915643 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:47:05 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:47:05 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-183950c6-1381-47bd-9def-115173b33253">
Jan 23 05:47:05 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:47:05 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:47:05 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:47:05 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:47:05 np0005593234 nova_compute[227762]:  <target dev="vdc" bus="virtio"/>
Jan 23 05:47:05 np0005593234 nova_compute[227762]:  <serial>183950c6-1381-47bd-9def-115173b33253</serial>
Jan 23 05:47:05 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x04" slot="0x00" function="0x0"/>
Jan 23 05:47:05 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:47:05 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:47:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:05.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:05 np0005593234 nova_compute[227762]: 2026-01-23 10:47:05.495 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769165225.49567, fb31c535-476a-4f92-866e-664b8b25e0fc => virtio-disk2> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:47:05 np0005593234 nova_compute[227762]: 2026-01-23 10:47:05.498 227766 DEBUG nova.virt.libvirt.driver [None req-a3aec855-e3b6-44de-a370-c8c69a915643 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Start waiting for the detach event from libvirt for device vdc with device alias virtio-disk2 for instance fb31c535-476a-4f92-866e-664b8b25e0fc _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:47:05 np0005593234 nova_compute[227762]: 2026-01-23 10:47:05.500 227766 INFO nova.virt.libvirt.driver [None req-a3aec855-e3b6-44de-a370-c8c69a915643 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Successfully detached device vdc from instance fb31c535-476a-4f92-866e-664b8b25e0fc from the live domain config.#033[00m
Jan 23 05:47:05 np0005593234 nova_compute[227762]: 2026-01-23 10:47:05.667 227766 DEBUG nova.objects.instance [None req-a3aec855-e3b6-44de-a370-c8c69a915643 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'flavor' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:47:05 np0005593234 nova_compute[227762]: 2026-01-23 10:47:05.716 227766 DEBUG oslo_concurrency.lockutils [None req-a3aec855-e3b6-44de-a370-c8c69a915643 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.465s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:06 np0005593234 nova_compute[227762]: 2026-01-23 10:47:06.932 227766 DEBUG oslo_concurrency.lockutils [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:06 np0005593234 nova_compute[227762]: 2026-01-23 10:47:06.932 227766 DEBUG oslo_concurrency.lockutils [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:06 np0005593234 nova_compute[227762]: 2026-01-23 10:47:06.932 227766 DEBUG oslo_concurrency.lockutils [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:06 np0005593234 nova_compute[227762]: 2026-01-23 10:47:06.933 227766 DEBUG oslo_concurrency.lockutils [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:06 np0005593234 nova_compute[227762]: 2026-01-23 10:47:06.933 227766 DEBUG oslo_concurrency.lockutils [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:06 np0005593234 nova_compute[227762]: 2026-01-23 10:47:06.934 227766 INFO nova.compute.manager [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Terminating instance#033[00m
Jan 23 05:47:06 np0005593234 nova_compute[227762]: 2026-01-23 10:47:06.936 227766 DEBUG nova.compute.manager [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:47:07 np0005593234 kernel: tap20adecab-75 (unregistering): left promiscuous mode
Jan 23 05:47:07 np0005593234 NetworkManager[48942]: <info>  [1769165227.0179] device (tap20adecab-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:47:07 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:47:07 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 05:47:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:47:07Z|00876|binding|INFO|Releasing lport 20adecab-752d-4d6c-95ac-7d46cf968926 from this chassis (sb_readonly=0)
Jan 23 05:47:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:47:07Z|00877|binding|INFO|Setting lport 20adecab-752d-4d6c-95ac-7d46cf968926 down in Southbound
Jan 23 05:47:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:47:07Z|00878|binding|INFO|Removing iface tap20adecab-75 ovn-installed in OVS
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.029 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.031 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:07.036 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:bd:19:d2 10.100.0.10'], port_security=['fa:16:3e:bd:19:d2 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'fb31c535-476a-4f92-866e-664b8b25e0fc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-082e2952-c529-49ec-88e6-5e5c5580db01', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '36d7e7c7ddbd4cf785fafd0d35b0a2d8', 'neutron:revision_number': '9', 'neutron:security_group_ids': '556b10c0-b0e4-46d0-9f13-66e00af94e5f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.192', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6ea75f5-7173-4e04-a97c-cbcceff41ada, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=20adecab-752d-4d6c-95ac-7d46cf968926) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:47:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:07.037 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 20adecab-752d-4d6c-95ac-7d46cf968926 in datapath 082e2952-c529-49ec-88e6-5e5c5580db01 unbound from our chassis#033[00m
Jan 23 05:47:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:07.038 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 082e2952-c529-49ec-88e6-5e5c5580db01, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:47:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:07.040 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2939c88b-17eb-4461-9c9f-51e0b955488d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:07.041 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 namespace which is not needed anymore#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.051 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:07 np0005593234 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000cc.scope: Deactivated successfully.
Jan 23 05:47:07 np0005593234 systemd[1]: machine-qemu\x2d98\x2dinstance\x2d000000cc.scope: Consumed 15.220s CPU time.
Jan 23 05:47:07 np0005593234 systemd-machined[195626]: Machine qemu-98-instance-000000cc terminated.
Jan 23 05:47:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:07.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.169 227766 INFO nova.virt.libvirt.driver [-] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Instance destroyed successfully.#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.170 227766 DEBUG nova.objects.instance [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'resources' on Instance uuid fb31c535-476a-4f92-866e-664b8b25e0fc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.223 227766 DEBUG nova.virt.libvirt.vif [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:45:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-548177853',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-548177853',id=204,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHl4y18ZbM5/Piuxfm1CZQf0XAJEg8AGMP7q7u+IMXAx6Zt5rL1mJMSOsTTDZFJEhWjwFar/8Dgb+UXMig3/lwqhw1lDvKwVdlJtw/GUAFgmmy551r0TFxkUxIA1d9FOqw==',key_name='tempest-keypair-2130281890',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:46:30Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='36d7e7c7ddbd4cf785fafd0d35b0a2d8',ramdisk_id='',reservation_id='r-m9ehiyel',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-2030135659',owner_user_name='tempest-AttachVolumeShelveTestJSON-2030135659-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:46:30Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='296341ffca2441dc807d285fa14c966d',uuid=fb31c535-476a-4f92-866e-664b8b25e0fc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": 
"10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.224 227766 DEBUG nova.network.os_vif_util [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converting VIF {"id": "20adecab-752d-4d6c-95ac-7d46cf968926", "address": "fa:16:3e:bd:19:d2", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.192", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap20adecab-75", "ovs_interfaceid": "20adecab-752d-4d6c-95ac-7d46cf968926", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.225 227766 DEBUG nova.network.os_vif_util [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:bd:19:d2,bridge_name='br-int',has_traffic_filtering=True,id=20adecab-752d-4d6c-95ac-7d46cf968926,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20adecab-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.225 227766 DEBUG os_vif [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:19:d2,bridge_name='br-int',has_traffic_filtering=True,id=20adecab-752d-4d6c-95ac-7d46cf968926,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20adecab-75') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.226 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.227 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap20adecab-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.230 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.233 227766 INFO os_vif [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:bd:19:d2,bridge_name='br-int',has_traffic_filtering=True,id=20adecab-752d-4d6c-95ac-7d46cf968926,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap20adecab-75')#033[00m
Jan 23 05:47:07 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[326233]: [NOTICE]   (326237) : haproxy version is 2.8.14-c23fe91
Jan 23 05:47:07 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[326233]: [NOTICE]   (326237) : path to executable is /usr/sbin/haproxy
Jan 23 05:47:07 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[326233]: [WARNING]  (326237) : Exiting Master process...
Jan 23 05:47:07 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[326233]: [WARNING]  (326237) : Exiting Master process...
Jan 23 05:47:07 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[326233]: [ALERT]    (326237) : Current worker (326239) exited with code 143 (Terminated)
Jan 23 05:47:07 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[326233]: [WARNING]  (326237) : All workers exited. Exiting... (0)
Jan 23 05:47:07 np0005593234 systemd[1]: libpod-1f9fea24f69fcfa94a93efa8b12e4cdc9b90d375b92710fa0d195bd468203859.scope: Deactivated successfully.
Jan 23 05:47:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:07 np0005593234 podman[326497]: 2026-01-23 10:47:07.267010595 +0000 UTC m=+0.144971939 container died 1f9fea24f69fcfa94a93efa8b12e4cdc9b90d375b92710fa0d195bd468203859 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:47:07 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f9fea24f69fcfa94a93efa8b12e4cdc9b90d375b92710fa0d195bd468203859-userdata-shm.mount: Deactivated successfully.
Jan 23 05:47:07 np0005593234 systemd[1]: var-lib-containers-storage-overlay-c42725e3098cdd5d9935f1cf36dfd75346c9d5715dc9292a055a1ce4126c9c28-merged.mount: Deactivated successfully.
Jan 23 05:47:07 np0005593234 podman[326497]: 2026-01-23 10:47:07.404655885 +0000 UTC m=+0.282617229 container cleanup 1f9fea24f69fcfa94a93efa8b12e4cdc9b90d375b92710fa0d195bd468203859 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:47:07 np0005593234 systemd[1]: libpod-conmon-1f9fea24f69fcfa94a93efa8b12e4cdc9b90d375b92710fa0d195bd468203859.scope: Deactivated successfully.
Jan 23 05:47:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:07.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:07 np0005593234 podman[326555]: 2026-01-23 10:47:07.561900706 +0000 UTC m=+0.133557600 container remove 1f9fea24f69fcfa94a93efa8b12e4cdc9b90d375b92710fa0d195bd468203859 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 05:47:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:07.568 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[57b24738-a82c-4584-9f8b-e9e668dceab8]: (4, ('Fri Jan 23 10:47:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 (1f9fea24f69fcfa94a93efa8b12e4cdc9b90d375b92710fa0d195bd468203859)\n1f9fea24f69fcfa94a93efa8b12e4cdc9b90d375b92710fa0d195bd468203859\nFri Jan 23 10:47:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 (1f9fea24f69fcfa94a93efa8b12e4cdc9b90d375b92710fa0d195bd468203859)\n1f9fea24f69fcfa94a93efa8b12e4cdc9b90d375b92710fa0d195bd468203859\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:07.570 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4f14ee33-4392-4bee-b514-35d6a687a18c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:07.571 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap082e2952-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.573 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:07 np0005593234 kernel: tap082e2952-c0: left promiscuous mode
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.587 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:07.589 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0430b8ab-d9b2-4c7c-8b70-42034e5a822e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:07.605 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2b5ec76d-a1dd-4622-bd4b-fbca83db9341]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:07.606 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c4b0fd6b-5e5e-45a3-bc63-39153ad648e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:07.621 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8a551af2-302f-423f-9b3b-66b101a59042]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 908406, 'reachable_time': 15039, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 326570, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:07.625 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:47:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:07.625 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[122fa0f0-202e-41c3-9c5d-d283db05f068]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:07 np0005593234 systemd[1]: run-netns-ovnmeta\x2d082e2952\x2dc529\x2d49ec\x2d88e6\x2d5e5c5580db01.mount: Deactivated successfully.
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.688 227766 DEBUG nova.compute.manager [req-1541ae76-8349-4112-aafb-ae90f9c77b46 req-bc400a6d-0988-49c2-91d6-90e13c4be7cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received event network-vif-unplugged-20adecab-752d-4d6c-95ac-7d46cf968926 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.688 227766 DEBUG oslo_concurrency.lockutils [req-1541ae76-8349-4112-aafb-ae90f9c77b46 req-bc400a6d-0988-49c2-91d6-90e13c4be7cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.688 227766 DEBUG oslo_concurrency.lockutils [req-1541ae76-8349-4112-aafb-ae90f9c77b46 req-bc400a6d-0988-49c2-91d6-90e13c4be7cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.689 227766 DEBUG oslo_concurrency.lockutils [req-1541ae76-8349-4112-aafb-ae90f9c77b46 req-bc400a6d-0988-49c2-91d6-90e13c4be7cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.689 227766 DEBUG nova.compute.manager [req-1541ae76-8349-4112-aafb-ae90f9c77b46 req-bc400a6d-0988-49c2-91d6-90e13c4be7cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] No waiting events found dispatching network-vif-unplugged-20adecab-752d-4d6c-95ac-7d46cf968926 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:47:07 np0005593234 nova_compute[227762]: 2026-01-23 10:47:07.689 227766 DEBUG nova.compute.manager [req-1541ae76-8349-4112-aafb-ae90f9c77b46 req-bc400a6d-0988-49c2-91d6-90e13c4be7cf 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received event network-vif-unplugged-20adecab-752d-4d6c-95ac-7d46cf968926 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:47:08 np0005593234 nova_compute[227762]: 2026-01-23 10:47:08.796 227766 INFO nova.virt.libvirt.driver [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Deleting instance files /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc_del#033[00m
Jan 23 05:47:08 np0005593234 nova_compute[227762]: 2026-01-23 10:47:08.796 227766 INFO nova.virt.libvirt.driver [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Deletion of /var/lib/nova/instances/fb31c535-476a-4f92-866e-664b8b25e0fc_del complete#033[00m
Jan 23 05:47:08 np0005593234 nova_compute[227762]: 2026-01-23 10:47:08.885 227766 INFO nova.compute.manager [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Took 1.95 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:47:08 np0005593234 nova_compute[227762]: 2026-01-23 10:47:08.886 227766 DEBUG oslo.service.loopingcall [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:47:08 np0005593234 nova_compute[227762]: 2026-01-23 10:47:08.886 227766 DEBUG nova.compute.manager [-] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:47:08 np0005593234 nova_compute[227762]: 2026-01-23 10:47:08.887 227766 DEBUG nova.network.neutron [-] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:47:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:09.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:09.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:09 np0005593234 nova_compute[227762]: 2026-01-23 10:47:09.748 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:09 np0005593234 podman[326574]: 2026-01-23 10:47:09.761346846 +0000 UTC m=+0.052573280 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:47:09 np0005593234 nova_compute[227762]: 2026-01-23 10:47:09.822 227766 DEBUG nova.compute.manager [req-b34251fd-6d91-49c4-a3eb-fc1863c03238 req-166732cc-7e50-4c65-841d-f5b021c39269 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received event network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:47:09 np0005593234 nova_compute[227762]: 2026-01-23 10:47:09.822 227766 DEBUG oslo_concurrency.lockutils [req-b34251fd-6d91-49c4-a3eb-fc1863c03238 req-166732cc-7e50-4c65-841d-f5b021c39269 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:09 np0005593234 nova_compute[227762]: 2026-01-23 10:47:09.822 227766 DEBUG oslo_concurrency.lockutils [req-b34251fd-6d91-49c4-a3eb-fc1863c03238 req-166732cc-7e50-4c65-841d-f5b021c39269 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:09 np0005593234 nova_compute[227762]: 2026-01-23 10:47:09.823 227766 DEBUG oslo_concurrency.lockutils [req-b34251fd-6d91-49c4-a3eb-fc1863c03238 req-166732cc-7e50-4c65-841d-f5b021c39269 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:09 np0005593234 nova_compute[227762]: 2026-01-23 10:47:09.823 227766 DEBUG nova.compute.manager [req-b34251fd-6d91-49c4-a3eb-fc1863c03238 req-166732cc-7e50-4c65-841d-f5b021c39269 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] No waiting events found dispatching network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:47:09 np0005593234 nova_compute[227762]: 2026-01-23 10:47:09.823 227766 WARNING nova.compute.manager [req-b34251fd-6d91-49c4-a3eb-fc1863c03238 req-166732cc-7e50-4c65-841d-f5b021c39269 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received unexpected event network-vif-plugged-20adecab-752d-4d6c-95ac-7d46cf968926 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:47:10 np0005593234 nova_compute[227762]: 2026-01-23 10:47:10.184 227766 DEBUG nova.network.neutron [-] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:47:10 np0005593234 nova_compute[227762]: 2026-01-23 10:47:10.212 227766 INFO nova.compute.manager [-] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Took 1.33 seconds to deallocate network for instance.#033[00m
Jan 23 05:47:10 np0005593234 nova_compute[227762]: 2026-01-23 10:47:10.264 227766 DEBUG nova.compute.manager [req-c4da8b77-b204-4dbc-8b9b-8b019655d3da req-939dab6b-46e9-4464-8b90-0313c9813531 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Received event network-vif-deleted-20adecab-752d-4d6c-95ac-7d46cf968926 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:47:10 np0005593234 nova_compute[227762]: 2026-01-23 10:47:10.279 227766 DEBUG oslo_concurrency.lockutils [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:10 np0005593234 nova_compute[227762]: 2026-01-23 10:47:10.280 227766 DEBUG oslo_concurrency.lockutils [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:10 np0005593234 nova_compute[227762]: 2026-01-23 10:47:10.389 227766 DEBUG nova.scheduler.client.report [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:47:10 np0005593234 nova_compute[227762]: 2026-01-23 10:47:10.454 227766 DEBUG nova.scheduler.client.report [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:47:10 np0005593234 nova_compute[227762]: 2026-01-23 10:47:10.455 227766 DEBUG nova.compute.provider_tree [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:47:10 np0005593234 nova_compute[227762]: 2026-01-23 10:47:10.514 227766 DEBUG nova.scheduler.client.report [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:47:10 np0005593234 nova_compute[227762]: 2026-01-23 10:47:10.535 227766 DEBUG nova.scheduler.client.report [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:47:10 np0005593234 nova_compute[227762]: 2026-01-23 10:47:10.583 227766 DEBUG oslo_concurrency.processutils [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:47:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/13829002' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:47:11 np0005593234 nova_compute[227762]: 2026-01-23 10:47:11.022 227766 DEBUG oslo_concurrency.processutils [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:11 np0005593234 nova_compute[227762]: 2026-01-23 10:47:11.033 227766 DEBUG nova.compute.provider_tree [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:47:11 np0005593234 nova_compute[227762]: 2026-01-23 10:47:11.051 227766 DEBUG nova.scheduler.client.report [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:47:11 np0005593234 nova_compute[227762]: 2026-01-23 10:47:11.101 227766 DEBUG oslo_concurrency.lockutils [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.821s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:11.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:11 np0005593234 nova_compute[227762]: 2026-01-23 10:47:11.132 227766 INFO nova.scheduler.client.report [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Deleted allocations for instance fb31c535-476a-4f92-866e-664b8b25e0fc#033[00m
Jan 23 05:47:11 np0005593234 nova_compute[227762]: 2026-01-23 10:47:11.217 227766 DEBUG oslo_concurrency.lockutils [None req-68212d0a-9b97-47aa-aa6a-ae7ffccd2648 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "fb31c535-476a-4f92-866e-664b8b25e0fc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.284s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:11.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:11 np0005593234 nova_compute[227762]: 2026-01-23 10:47:11.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:11 np0005593234 nova_compute[227762]: 2026-01-23 10:47:11.775 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:11 np0005593234 nova_compute[227762]: 2026-01-23 10:47:11.776 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:11 np0005593234 nova_compute[227762]: 2026-01-23 10:47:11.776 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:11 np0005593234 nova_compute[227762]: 2026-01-23 10:47:11.777 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:47:11 np0005593234 nova_compute[227762]: 2026-01-23 10:47:11.778 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:47:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3745251873' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:47:12 np0005593234 nova_compute[227762]: 2026-01-23 10:47:12.212 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.435s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:12 np0005593234 nova_compute[227762]: 2026-01-23 10:47:12.230 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:12 np0005593234 nova_compute[227762]: 2026-01-23 10:47:12.390 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:47:12 np0005593234 nova_compute[227762]: 2026-01-23 10:47:12.392 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4111MB free_disk=20.94232177734375GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:47:12 np0005593234 nova_compute[227762]: 2026-01-23 10:47:12.392 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:12 np0005593234 nova_compute[227762]: 2026-01-23 10:47:12.393 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:12 np0005593234 nova_compute[227762]: 2026-01-23 10:47:12.454 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:47:12 np0005593234 nova_compute[227762]: 2026-01-23 10:47:12.455 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:47:12 np0005593234 nova_compute[227762]: 2026-01-23 10:47:12.483 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:47:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/888012241' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:47:12 np0005593234 nova_compute[227762]: 2026-01-23 10:47:12.910 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:12 np0005593234 nova_compute[227762]: 2026-01-23 10:47:12.916 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:47:12 np0005593234 nova_compute[227762]: 2026-01-23 10:47:12.941 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:47:12 np0005593234 nova_compute[227762]: 2026-01-23 10:47:12.980 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:47:12 np0005593234 nova_compute[227762]: 2026-01-23 10:47:12.981 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.588s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:13.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:13.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:13 np0005593234 nova_compute[227762]: 2026-01-23 10:47:13.980 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:13 np0005593234 nova_compute[227762]: 2026-01-23 10:47:13.981 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:47:13 np0005593234 nova_compute[227762]: 2026-01-23 10:47:13.981 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:47:13 np0005593234 nova_compute[227762]: 2026-01-23 10:47:13.999 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:47:14 np0005593234 nova_compute[227762]: 2026-01-23 10:47:13.999 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:14 np0005593234 nova_compute[227762]: 2026-01-23 10:47:14.749 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:15.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:15.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:17.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:17 np0005593234 nova_compute[227762]: 2026-01-23 10:47:17.282 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:17.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:18 np0005593234 nova_compute[227762]: 2026-01-23 10:47:18.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:18 np0005593234 podman[326665]: 2026-01-23 10:47:18.787436867 +0000 UTC m=+0.083079590 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:47:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:19.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.280 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.281 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.303 227766 DEBUG nova.compute.manager [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.394 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.395 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.402 227766 DEBUG nova.virt.hardware [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.403 227766 INFO nova.compute.claims [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:47:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:19.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.524 227766 DEBUG oslo_concurrency.processutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.751 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:47:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1740507373' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.970 227766 DEBUG oslo_concurrency.processutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.975 227766 DEBUG nova.compute.provider_tree [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:47:19 np0005593234 nova_compute[227762]: 2026-01-23 10:47:19.994 227766 DEBUG nova.scheduler.client.report [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.015 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.621s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.016 227766 DEBUG nova.compute.manager [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.080 227766 DEBUG nova.compute.manager [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.080 227766 DEBUG nova.network.neutron [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.107 227766 INFO nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.131 227766 DEBUG nova.compute.manager [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.226 227766 DEBUG nova.compute.manager [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.227 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.228 227766 INFO nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Creating image(s)#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.256 227766 DEBUG nova.storage.rbd_utils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.284 227766 DEBUG nova.storage.rbd_utils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.310 227766 DEBUG nova.storage.rbd_utils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.315 227766 DEBUG oslo_concurrency.processutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.384 227766 DEBUG oslo_concurrency.processutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.385 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.386 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.386 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.415 227766 DEBUG nova.storage.rbd_utils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.420 227766 DEBUG oslo_concurrency.processutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.521 227766 DEBUG nova.policy [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '296341ffca2441dc807d285fa14c966d', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '36d7e7c7ddbd4cf785fafd0d35b0a2d8', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.780 227766 DEBUG oslo_concurrency.processutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.360s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.845 227766 DEBUG nova.storage.rbd_utils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] resizing rbd image 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.946 227766 DEBUG nova.objects.instance [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'migration_context' on Instance uuid 1f16f1e6-2ac3-4547-84bf-103e4be39e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.963 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.964 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Ensure instance console log exists: /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.965 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.965 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:20 np0005593234 nova_compute[227762]: 2026-01-23 10:47:20.966 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:21.143 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:21.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:21 np0005593234 nova_compute[227762]: 2026-01-23 10:47:21.643 227766 DEBUG nova.network.neutron [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Successfully created port: 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:47:22 np0005593234 nova_compute[227762]: 2026-01-23 10:47:22.169 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165227.167391, fb31c535-476a-4f92-866e-664b8b25e0fc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:47:22 np0005593234 nova_compute[227762]: 2026-01-23 10:47:22.169 227766 INFO nova.compute.manager [-] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:47:22 np0005593234 nova_compute[227762]: 2026-01-23 10:47:22.203 227766 DEBUG nova.compute.manager [None req-628f48ea-022b-48fd-b934-c2bb94ae9c30 - - - - - -] [instance: fb31c535-476a-4f92-866e-664b8b25e0fc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:47:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:22 np0005593234 nova_compute[227762]: 2026-01-23 10:47:22.285 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:22 np0005593234 nova_compute[227762]: 2026-01-23 10:47:22.607 227766 DEBUG nova.network.neutron [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Successfully updated port: 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:47:22 np0005593234 nova_compute[227762]: 2026-01-23 10:47:22.625 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:47:22 np0005593234 nova_compute[227762]: 2026-01-23 10:47:22.625 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquired lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:47:22 np0005593234 nova_compute[227762]: 2026-01-23 10:47:22.625 227766 DEBUG nova.network.neutron [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:47:22 np0005593234 nova_compute[227762]: 2026-01-23 10:47:22.711 227766 DEBUG nova.compute.manager [req-cccb79a2-e786-43c3-89b6-e11ac97f58fc req-0b2fb37c-bec7-4758-8bb5-6971765cd678 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received event network-changed-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:47:22 np0005593234 nova_compute[227762]: 2026-01-23 10:47:22.712 227766 DEBUG nova.compute.manager [req-cccb79a2-e786-43c3-89b6-e11ac97f58fc req-0b2fb37c-bec7-4758-8bb5-6971765cd678 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Refreshing instance network info cache due to event network-changed-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:47:22 np0005593234 nova_compute[227762]: 2026-01-23 10:47:22.712 227766 DEBUG oslo_concurrency.lockutils [req-cccb79a2-e786-43c3-89b6-e11ac97f58fc req-0b2fb37c-bec7-4758-8bb5-6971765cd678 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:47:22 np0005593234 nova_compute[227762]: 2026-01-23 10:47:22.806 227766 DEBUG nova.network.neutron [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:47:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:23.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:23.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.854 227766 DEBUG nova.network.neutron [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updating instance_info_cache with network_info: [{"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.879 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Releasing lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.880 227766 DEBUG nova.compute.manager [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Instance network_info: |[{"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.880 227766 DEBUG oslo_concurrency.lockutils [req-cccb79a2-e786-43c3-89b6-e11ac97f58fc req-0b2fb37c-bec7-4758-8bb5-6971765cd678 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.881 227766 DEBUG nova.network.neutron [req-cccb79a2-e786-43c3-89b6-e11ac97f58fc req-0b2fb37c-bec7-4758-8bb5-6971765cd678 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Refreshing network info cache for port 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.884 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Start _get_guest_xml network_info=[{"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.890 227766 WARNING nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.895 227766 DEBUG nova.virt.libvirt.host [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.896 227766 DEBUG nova.virt.libvirt.host [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.899 227766 DEBUG nova.virt.libvirt.host [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.899 227766 DEBUG nova.virt.libvirt.host [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.901 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.901 227766 DEBUG nova.virt.hardware [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.901 227766 DEBUG nova.virt.hardware [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.902 227766 DEBUG nova.virt.hardware [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.902 227766 DEBUG nova.virt.hardware [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.902 227766 DEBUG nova.virt.hardware [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.902 227766 DEBUG nova.virt.hardware [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.903 227766 DEBUG nova.virt.hardware [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.903 227766 DEBUG nova.virt.hardware [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.903 227766 DEBUG nova.virt.hardware [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.903 227766 DEBUG nova.virt.hardware [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.904 227766 DEBUG nova.virt.hardware [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:47:23 np0005593234 nova_compute[227762]: 2026-01-23 10:47:23.907 227766 DEBUG oslo_concurrency.processutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:47:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2994444758' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.344 227766 DEBUG oslo_concurrency.processutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.374 227766 DEBUG nova.storage.rbd_utils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.381 227766 DEBUG oslo_concurrency.processutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.753 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:47:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1309963887' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.830 227766 DEBUG oslo_concurrency.processutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.832 227766 DEBUG nova.virt.libvirt.vif [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:47:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1083202157',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-1083202157',id=208,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG5ho3CJ6mVqZQzXfJk0fahhh8Yqf11R44i9Fq9DFeNIqqNX5wHacVicDdiNzwbZtz9LlhyeXROqPF8aw2fBlj8o9f2Tzq3dN4qwFPSyYlBwK89/KDZiTx7iCS1VleFFZQ==',key_name='tempest-keypair-1986772238',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='36d7e7c7ddbd4cf785fafd0d35b0a2d8',ramdisk_id='',reservation_id='r-ht01gjtk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-2030135659',owner_user_name='tempest-AttachVolumeShelveTestJSON-2030135659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:47:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='296341ffca2441dc807d285fa14c966d',uuid=1f16f1e6-2ac3-4547-84bf-103e4be39e3a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.833 227766 DEBUG nova.network.os_vif_util [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converting VIF {"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.834 227766 DEBUG nova.network.os_vif_util [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:5f:89,bridge_name='br-int',has_traffic_filtering=True,id=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9acbc2f5-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.836 227766 DEBUG nova.objects.instance [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1f16f1e6-2ac3-4547-84bf-103e4be39e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.855 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  <uuid>1f16f1e6-2ac3-4547-84bf-103e4be39e3a</uuid>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  <name>instance-000000d0</name>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-1083202157</nova:name>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:47:23</nova:creationTime>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <nova:user uuid="296341ffca2441dc807d285fa14c966d">tempest-AttachVolumeShelveTestJSON-2030135659-project-member</nova:user>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <nova:project uuid="36d7e7c7ddbd4cf785fafd0d35b0a2d8">tempest-AttachVolumeShelveTestJSON-2030135659</nova:project>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <nova:port uuid="9acbc2f5-e7f6-4b5e-8799-c611ad3392bf">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <entry name="serial">1f16f1e6-2ac3-4547-84bf-103e4be39e3a</entry>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <entry name="uuid">1f16f1e6-2ac3-4547-84bf-103e4be39e3a</entry>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk.config">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:c1:5f:89"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <target dev="tap9acbc2f5-e7"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/console.log" append="off"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:47:24 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:47:24 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:47:24 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:47:24 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.857 227766 DEBUG nova.compute.manager [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Preparing to wait for external event network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.857 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.858 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.858 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.859 227766 DEBUG nova.virt.libvirt.vif [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:47:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1083202157',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-1083202157',id=208,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG5ho3CJ6mVqZQzXfJk0fahhh8Yqf11R44i9Fq9DFeNIqqNX5wHacVicDdiNzwbZtz9LlhyeXROqPF8aw2fBlj8o9f2Tzq3dN4qwFPSyYlBwK89/KDZiTx7iCS1VleFFZQ==',key_name='tempest-keypair-1986772238',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='36d7e7c7ddbd4cf785fafd0d35b0a2d8',ramdisk_id='',reservation_id='r-ht01gjtk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-AttachVolumeShelveTestJSON-2030135659',owner_user_name='tempest-AttachVolumeShelveTestJSON-2030135659-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:47:20Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='296341ffca2441dc807d285fa14c966d',uuid=1f16f1e6-2ac3-4547-84bf-103e4be39e3a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.859 227766 DEBUG nova.network.os_vif_util [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converting VIF {"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.860 227766 DEBUG nova.network.os_vif_util [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:5f:89,bridge_name='br-int',has_traffic_filtering=True,id=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9acbc2f5-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.860 227766 DEBUG os_vif [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:5f:89,bridge_name='br-int',has_traffic_filtering=True,id=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9acbc2f5-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.861 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.861 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.862 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.864 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.864 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9acbc2f5-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.865 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9acbc2f5-e7, col_values=(('external_ids', {'iface-id': '9acbc2f5-e7f6-4b5e-8799-c611ad3392bf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:5f:89', 'vm-uuid': '1f16f1e6-2ac3-4547-84bf-103e4be39e3a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.867 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:24 np0005593234 NetworkManager[48942]: <info>  [1769165244.8680] manager: (tap9acbc2f5-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/412)
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.869 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.874 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.875 227766 INFO os_vif [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:5f:89,bridge_name='br-int',has_traffic_filtering=True,id=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9acbc2f5-e7')#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.925 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.925 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.925 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No VIF found with MAC fa:16:3e:c1:5f:89, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.926 227766 INFO nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Using config drive#033[00m
Jan 23 05:47:24 np0005593234 nova_compute[227762]: 2026-01-23 10:47:24.948 227766 DEBUG nova.storage.rbd_utils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:47:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:25.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:25 np0005593234 nova_compute[227762]: 2026-01-23 10:47:25.280 227766 INFO nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Creating config drive at /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/disk.config#033[00m
Jan 23 05:47:25 np0005593234 nova_compute[227762]: 2026-01-23 10:47:25.286 227766 DEBUG oslo_concurrency.processutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5boaer2l execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:25 np0005593234 nova_compute[227762]: 2026-01-23 10:47:25.442 227766 DEBUG oslo_concurrency.processutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5boaer2l" returned: 0 in 0.156s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:25 np0005593234 nova_compute[227762]: 2026-01-23 10:47:25.476 227766 DEBUG nova.storage.rbd_utils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:47:25 np0005593234 nova_compute[227762]: 2026-01-23 10:47:25.480 227766 DEBUG oslo_concurrency.processutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/disk.config 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:47:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:25.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:25 np0005593234 nova_compute[227762]: 2026-01-23 10:47:25.534 227766 DEBUG nova.network.neutron [req-cccb79a2-e786-43c3-89b6-e11ac97f58fc req-0b2fb37c-bec7-4758-8bb5-6971765cd678 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updated VIF entry in instance network info cache for port 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:47:25 np0005593234 nova_compute[227762]: 2026-01-23 10:47:25.535 227766 DEBUG nova.network.neutron [req-cccb79a2-e786-43c3-89b6-e11ac97f58fc req-0b2fb37c-bec7-4758-8bb5-6971765cd678 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updating instance_info_cache with network_info: [{"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:47:25 np0005593234 nova_compute[227762]: 2026-01-23 10:47:25.555 227766 DEBUG oslo_concurrency.lockutils [req-cccb79a2-e786-43c3-89b6-e11ac97f58fc req-0b2fb37c-bec7-4758-8bb5-6971765cd678 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:47:25 np0005593234 nova_compute[227762]: 2026-01-23 10:47:25.656 227766 DEBUG oslo_concurrency.processutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/disk.config 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.176s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:47:25 np0005593234 nova_compute[227762]: 2026-01-23 10:47:25.657 227766 INFO nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Deleting local config drive /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/disk.config because it was imported into RBD.#033[00m
Jan 23 05:47:25 np0005593234 kernel: tap9acbc2f5-e7: entered promiscuous mode
Jan 23 05:47:25 np0005593234 nova_compute[227762]: 2026-01-23 10:47:25.744 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:25 np0005593234 NetworkManager[48942]: <info>  [1769165245.7451] manager: (tap9acbc2f5-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/413)
Jan 23 05:47:25 np0005593234 ovn_controller[134547]: 2026-01-23T10:47:25Z|00879|binding|INFO|Claiming lport 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf for this chassis.
Jan 23 05:47:25 np0005593234 ovn_controller[134547]: 2026-01-23T10:47:25Z|00880|binding|INFO|9acbc2f5-e7f6-4b5e-8799-c611ad3392bf: Claiming fa:16:3e:c1:5f:89 10.100.0.13
Jan 23 05:47:25 np0005593234 nova_compute[227762]: 2026-01-23 10:47:25.747 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.757 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:5f:89 10.100.0.13'], port_security=['fa:16:3e:c1:5f:89 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1f16f1e6-2ac3-4547-84bf-103e4be39e3a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-082e2952-c529-49ec-88e6-5e5c5580db01', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '36d7e7c7ddbd4cf785fafd0d35b0a2d8', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0f4c1420-fdc4-4f47-97c8-7ad48c8768c8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6ea75f5-7173-4e04-a97c-cbcceff41ada, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.759 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf in datapath 082e2952-c529-49ec-88e6-5e5c5580db01 bound to our chassis#033[00m
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.761 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 082e2952-c529-49ec-88e6-5e5c5580db01#033[00m
Jan 23 05:47:25 np0005593234 ovn_controller[134547]: 2026-01-23T10:47:25Z|00881|binding|INFO|Setting lport 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf ovn-installed in OVS
Jan 23 05:47:25 np0005593234 ovn_controller[134547]: 2026-01-23T10:47:25Z|00882|binding|INFO|Setting lport 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf up in Southbound
Jan 23 05:47:25 np0005593234 nova_compute[227762]: 2026-01-23 10:47:25.775 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.775 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b56d9665-aa2a-4fac-a2f2-44c8dbb9b171]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.777 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap082e2952-c1 in ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.780 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap082e2952-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.780 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9570fdb4-a69f-4551-b64e-1fa59421322d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.781 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[66818f57-e957-4daa-af4d-62123322d38d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:25 np0005593234 systemd-machined[195626]: New machine qemu-99-instance-000000d0.
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.798 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[bbf5b6e9-f569-4971-8d4a-46ce2c2c3f1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:25 np0005593234 systemd[1]: Started Virtual Machine qemu-99-instance-000000d0.
Jan 23 05:47:25 np0005593234 systemd-udevd[327020]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.824 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9437cbdb-11ee-421d-b8e1-f8b432861baf]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:25 np0005593234 NetworkManager[48942]: <info>  [1769165245.8337] device (tap9acbc2f5-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:47:25 np0005593234 NetworkManager[48942]: <info>  [1769165245.8367] device (tap9acbc2f5-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.862 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ead0cd7f-6c7b-434d-9bfb-b2f5940bfff9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:25 np0005593234 systemd-udevd[327024]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:47:25 np0005593234 NetworkManager[48942]: <info>  [1769165245.8700] manager: (tap082e2952-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/414)
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.870 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[882474d0-40d9-4fdd-9c36-52d6267deb1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.897 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[94b0a84d-7606-4a62-9dd0-ce025ce1dd3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.900 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[47a5e33a-c37e-42d0-8c91-af51c6e4a83e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:25 np0005593234 NetworkManager[48942]: <info>  [1769165245.9195] device (tap082e2952-c0): carrier: link connected
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.924 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[9e1abc6a-e31b-4e72-b39c-93d98f40194d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.940 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[489283f3-903c-40da-8d87-de6f3ebcc2e2]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap082e2952-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:8e:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 270], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 914087, 'reachable_time': 18749, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 327050, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.960 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0d97f300-6904-4e27-8a1d-04bf97eeabee]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:8e23'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 914087, 'tstamp': 914087}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 327051, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:25.978 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f20e5b06-587b-4576-aaca-f0ec82c0a068]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap082e2952-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:8e:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 270], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 914087, 'reachable_time': 18749, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 327052, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:26.009 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[74b8849c-53cf-4d8a-95e6-ffe8af6990f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.042 227766 DEBUG nova.compute.manager [req-e3e99000-22ec-479c-b857-f5ad10d7da1e req-3b275a20-bd3a-4e75-982f-1345f4c0450c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received event network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.042 227766 DEBUG oslo_concurrency.lockutils [req-e3e99000-22ec-479c-b857-f5ad10d7da1e req-3b275a20-bd3a-4e75-982f-1345f4c0450c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.042 227766 DEBUG oslo_concurrency.lockutils [req-e3e99000-22ec-479c-b857-f5ad10d7da1e req-3b275a20-bd3a-4e75-982f-1345f4c0450c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.043 227766 DEBUG oslo_concurrency.lockutils [req-e3e99000-22ec-479c-b857-f5ad10d7da1e req-3b275a20-bd3a-4e75-982f-1345f4c0450c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.043 227766 DEBUG nova.compute.manager [req-e3e99000-22ec-479c-b857-f5ad10d7da1e req-3b275a20-bd3a-4e75-982f-1345f4c0450c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Processing event network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:26.066 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fc1b3253-d7ae-440a-a8b5-7fba2529bcbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:26.068 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap082e2952-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:26.068 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:26.069 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap082e2952-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:26 np0005593234 NetworkManager[48942]: <info>  [1769165246.0714] manager: (tap082e2952-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/415)
Jan 23 05:47:26 np0005593234 kernel: tap082e2952-c0: entered promiscuous mode
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.073 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:26.077 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap082e2952-c0, col_values=(('external_ids', {'iface-id': 'e36b250d-7843-417b-b3f6-5e001769e85d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.078 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:26 np0005593234 ovn_controller[134547]: 2026-01-23T10:47:26Z|00883|binding|INFO|Releasing lport e36b250d-7843-417b-b3f6-5e001769e85d from this chassis (sb_readonly=0)
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.079 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:26.079 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/082e2952-c529-49ec-88e6-5e5c5580db01.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/082e2952-c529-49ec-88e6-5e5c5580db01.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:26.081 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1fd58a29-551d-4ff4-8e88-92383e467cec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:26.081 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-082e2952-c529-49ec-88e6-5e5c5580db01
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/082e2952-c529-49ec-88e6-5e5c5580db01.pid.haproxy
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 082e2952-c529-49ec-88e6-5e5c5580db01
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:47:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:26.082 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'env', 'PROCESS_TAG=haproxy-082e2952-c529-49ec-88e6-5e5c5580db01', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/082e2952-c529-49ec-88e6-5e5c5580db01.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.092 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.274 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165246.2742264, 1f16f1e6-2ac3-4547-84bf-103e4be39e3a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.275 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] VM Started (Lifecycle Event)#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.278 227766 DEBUG nova.compute.manager [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.284 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.288 227766 INFO nova.virt.libvirt.driver [-] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Instance spawned successfully.#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.289 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.295 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.298 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.308 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.308 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.309 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.309 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.310 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.310 227766 DEBUG nova.virt.libvirt.driver [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.317 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.318 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165246.2744439, 1f16f1e6-2ac3-4547-84bf-103e4be39e3a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.318 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.350 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.353 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165246.280797, 1f16f1e6-2ac3-4547-84bf-103e4be39e3a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.353 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.373 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.377 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.380 227766 INFO nova.compute.manager [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Took 6.15 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.380 227766 DEBUG nova.compute.manager [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.412 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.453 227766 INFO nova.compute.manager [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Took 7.08 seconds to build instance.#033[00m
Jan 23 05:47:26 np0005593234 podman[327128]: 2026-01-23 10:47:26.466487521 +0000 UTC m=+0.065131134 container create ed1d2e7324a24126ab3ed69e0555d7fc7add44144e3f9e8b731d67177b0e239d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 05:47:26 np0005593234 nova_compute[227762]: 2026-01-23 10:47:26.488 227766 DEBUG oslo_concurrency.lockutils [None req-da7abc53-db41-4336-bb39-c8930c27c8dc 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.207s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:26 np0005593234 systemd[1]: Started libpod-conmon-ed1d2e7324a24126ab3ed69e0555d7fc7add44144e3f9e8b731d67177b0e239d.scope.
Jan 23 05:47:26 np0005593234 podman[327128]: 2026-01-23 10:47:26.428433208 +0000 UTC m=+0.027076901 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:47:26 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:47:26 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d603b42a7d8dd474346528c205e5334a436c13d760e792613224348b25c54276/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:47:26 np0005593234 podman[327128]: 2026-01-23 10:47:26.558675072 +0000 UTC m=+0.157318695 container init ed1d2e7324a24126ab3ed69e0555d7fc7add44144e3f9e8b731d67177b0e239d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:47:26 np0005593234 podman[327128]: 2026-01-23 10:47:26.564307023 +0000 UTC m=+0.162950636 container start ed1d2e7324a24126ab3ed69e0555d7fc7add44144e3f9e8b731d67177b0e239d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:47:26 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[327143]: [NOTICE]   (327147) : New worker (327149) forked
Jan 23 05:47:26 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[327143]: [NOTICE]   (327147) : Loading success.
Jan 23 05:47:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:27.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:27.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:28 np0005593234 nova_compute[227762]: 2026-01-23 10:47:28.151 227766 DEBUG nova.compute.manager [req-ade6a032-7ce1-4c12-b160-f55783bde8f6 req-cc237c30-d5fd-4363-9e9c-4e3739e9a928 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received event network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:47:28 np0005593234 nova_compute[227762]: 2026-01-23 10:47:28.152 227766 DEBUG oslo_concurrency.lockutils [req-ade6a032-7ce1-4c12-b160-f55783bde8f6 req-cc237c30-d5fd-4363-9e9c-4e3739e9a928 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:28 np0005593234 nova_compute[227762]: 2026-01-23 10:47:28.153 227766 DEBUG oslo_concurrency.lockutils [req-ade6a032-7ce1-4c12-b160-f55783bde8f6 req-cc237c30-d5fd-4363-9e9c-4e3739e9a928 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:28 np0005593234 nova_compute[227762]: 2026-01-23 10:47:28.154 227766 DEBUG oslo_concurrency.lockutils [req-ade6a032-7ce1-4c12-b160-f55783bde8f6 req-cc237c30-d5fd-4363-9e9c-4e3739e9a928 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:28 np0005593234 nova_compute[227762]: 2026-01-23 10:47:28.154 227766 DEBUG nova.compute.manager [req-ade6a032-7ce1-4c12-b160-f55783bde8f6 req-cc237c30-d5fd-4363-9e9c-4e3739e9a928 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] No waiting events found dispatching network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:47:28 np0005593234 nova_compute[227762]: 2026-01-23 10:47:28.155 227766 WARNING nova.compute.manager [req-ade6a032-7ce1-4c12-b160-f55783bde8f6 req-cc237c30-d5fd-4363-9e9c-4e3739e9a928 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received unexpected event network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf for instance with vm_state active and task_state None.#033[00m
Jan 23 05:47:28 np0005593234 ceph-mgr[77448]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 05:47:29 np0005593234 nova_compute[227762]: 2026-01-23 10:47:29.057 227766 DEBUG nova.compute.manager [req-a1be9530-ef64-4ef2-b747-7ee2364b9329 req-0fa49026-7534-424e-8aef-b9bdf5e45328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received event network-changed-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:47:29 np0005593234 nova_compute[227762]: 2026-01-23 10:47:29.058 227766 DEBUG nova.compute.manager [req-a1be9530-ef64-4ef2-b747-7ee2364b9329 req-0fa49026-7534-424e-8aef-b9bdf5e45328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Refreshing instance network info cache due to event network-changed-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:47:29 np0005593234 nova_compute[227762]: 2026-01-23 10:47:29.058 227766 DEBUG oslo_concurrency.lockutils [req-a1be9530-ef64-4ef2-b747-7ee2364b9329 req-0fa49026-7534-424e-8aef-b9bdf5e45328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:47:29 np0005593234 nova_compute[227762]: 2026-01-23 10:47:29.059 227766 DEBUG oslo_concurrency.lockutils [req-a1be9530-ef64-4ef2-b747-7ee2364b9329 req-0fa49026-7534-424e-8aef-b9bdf5e45328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:47:29 np0005593234 nova_compute[227762]: 2026-01-23 10:47:29.059 227766 DEBUG nova.network.neutron [req-a1be9530-ef64-4ef2-b747-7ee2364b9329 req-0fa49026-7534-424e-8aef-b9bdf5e45328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Refreshing network info cache for port 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:47:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:29.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:29.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:29 np0005593234 nova_compute[227762]: 2026-01-23 10:47:29.799 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:29 np0005593234 nova_compute[227762]: 2026-01-23 10:47:29.867 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:31 np0005593234 nova_compute[227762]: 2026-01-23 10:47:31.136 227766 DEBUG nova.network.neutron [req-a1be9530-ef64-4ef2-b747-7ee2364b9329 req-0fa49026-7534-424e-8aef-b9bdf5e45328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updated VIF entry in instance network info cache for port 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:47:31 np0005593234 nova_compute[227762]: 2026-01-23 10:47:31.137 227766 DEBUG nova.network.neutron [req-a1be9530-ef64-4ef2-b747-7ee2364b9329 req-0fa49026-7534-424e-8aef-b9bdf5e45328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updating instance_info_cache with network_info: [{"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:47:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:31.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:31 np0005593234 nova_compute[227762]: 2026-01-23 10:47:31.161 227766 DEBUG oslo_concurrency.lockutils [req-a1be9530-ef64-4ef2-b747-7ee2364b9329 req-0fa49026-7534-424e-8aef-b9bdf5e45328 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:47:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:31.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:33.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:47:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:33.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:47:34 np0005593234 nova_compute[227762]: 2026-01-23 10:47:34.801 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:34 np0005593234 nova_compute[227762]: 2026-01-23 10:47:34.868 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:35.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:35.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:37.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:47:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:37.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:38 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:47:38 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:47:38 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:47:38 np0005593234 nova_compute[227762]: 2026-01-23 10:47:38.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:47:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:39.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:39.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:39 np0005593234 nova_compute[227762]: 2026-01-23 10:47:39.838 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:39 np0005593234 nova_compute[227762]: 2026-01-23 10:47:39.871 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:40 np0005593234 podman[327346]: 2026-01-23 10:47:40.764925018 +0000 UTC m=+0.051546886 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:47:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:47:41Z|00115|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c1:5f:89 10.100.0.13
Jan 23 05:47:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:47:41Z|00116|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c1:5f:89 10.100.0.13
Jan 23 05:47:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:41.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:41.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:42.888 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:47:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:42.889 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:47:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:47:42.890 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:47:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:43.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:47:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:43.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:47:44 np0005593234 nova_compute[227762]: 2026-01-23 10:47:44.840 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:44 np0005593234 nova_compute[227762]: 2026-01-23 10:47:44.872 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:47:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:47:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:45.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:45.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:47.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:47:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:47.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:47:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:49.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:49.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:49 np0005593234 podman[327470]: 2026-01-23 10:47:49.808995928 +0000 UTC m=+0.108459825 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:47:49 np0005593234 nova_compute[227762]: 2026-01-23 10:47:49.842 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:49 np0005593234 nova_compute[227762]: 2026-01-23 10:47:49.874 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:51.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:51.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:53.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:53.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:54 np0005593234 nova_compute[227762]: 2026-01-23 10:47:54.843 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:54 np0005593234 nova_compute[227762]: 2026-01-23 10:47:54.875 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:55.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:47:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:55.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:47:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:57.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:57.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e407 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:47:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:47:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:47:59.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:47:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:47:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:47:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:47:59.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:47:59 np0005593234 nova_compute[227762]: 2026-01-23 10:47:59.846 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:47:59 np0005593234 nova_compute[227762]: 2026-01-23 10:47:59.876 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:48:00 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3303828801' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:48:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:48:00 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3303828801' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:48:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e408 e408: 3 total, 3 up, 3 in
Jan 23 05:48:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:48:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:01.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:48:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:01.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:02 np0005593234 nova_compute[227762]: 2026-01-23 10:48:02.515 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:02.516 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=83, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=82) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:48:02 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:02.517 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:48:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e408 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:03.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:03.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:04 np0005593234 nova_compute[227762]: 2026-01-23 10:48:04.037 227766 DEBUG oslo_concurrency.lockutils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:04 np0005593234 nova_compute[227762]: 2026-01-23 10:48:04.038 227766 DEBUG oslo_concurrency.lockutils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a" acquired by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:04 np0005593234 nova_compute[227762]: 2026-01-23 10:48:04.038 227766 INFO nova.compute.manager [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Shelving#033[00m
Jan 23 05:48:04 np0005593234 nova_compute[227762]: 2026-01-23 10:48:04.064 227766 DEBUG nova.virt.libvirt.driver [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 05:48:04 np0005593234 nova_compute[227762]: 2026-01-23 10:48:04.848 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:04 np0005593234 nova_compute[227762]: 2026-01-23 10:48:04.877 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:05.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:48:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:05.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:48:07 np0005593234 nova_compute[227762]: 2026-01-23 10:48:07.080 227766 INFO nova.virt.libvirt.driver [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 05:48:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:07.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:07.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:07 np0005593234 kernel: tap9acbc2f5-e7 (unregistering): left promiscuous mode
Jan 23 05:48:07 np0005593234 NetworkManager[48942]: <info>  [1769165287.6113] device (tap9acbc2f5-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:48:07 np0005593234 nova_compute[227762]: 2026-01-23 10:48:07.626 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:48:07Z|00884|binding|INFO|Releasing lport 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf from this chassis (sb_readonly=0)
Jan 23 05:48:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:48:07Z|00885|binding|INFO|Setting lport 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf down in Southbound
Jan 23 05:48:07 np0005593234 ovn_controller[134547]: 2026-01-23T10:48:07Z|00886|binding|INFO|Removing iface tap9acbc2f5-e7 ovn-installed in OVS
Jan 23 05:48:07 np0005593234 nova_compute[227762]: 2026-01-23 10:48:07.628 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:07.639 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:5f:89 10.100.0.13'], port_security=['fa:16:3e:c1:5f:89 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1f16f1e6-2ac3-4547-84bf-103e4be39e3a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-082e2952-c529-49ec-88e6-5e5c5580db01', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '36d7e7c7ddbd4cf785fafd0d35b0a2d8', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0f4c1420-fdc4-4f47-97c8-7ad48c8768c8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6ea75f5-7173-4e04-a97c-cbcceff41ada, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:48:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:07.640 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf in datapath 082e2952-c529-49ec-88e6-5e5c5580db01 unbound from our chassis#033[00m
Jan 23 05:48:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:07.642 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 082e2952-c529-49ec-88e6-5e5c5580db01, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:48:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:07.644 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1ead7db2-d18e-4b94-9bf5-b55f28d1b547]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:07.646 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 namespace which is not needed anymore#033[00m
Jan 23 05:48:07 np0005593234 nova_compute[227762]: 2026-01-23 10:48:07.653 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e408 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:07 np0005593234 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000d0.scope: Deactivated successfully.
Jan 23 05:48:07 np0005593234 systemd[1]: machine-qemu\x2d99\x2dinstance\x2d000000d0.scope: Consumed 14.657s CPU time.
Jan 23 05:48:07 np0005593234 systemd-machined[195626]: Machine qemu-99-instance-000000d0 terminated.
Jan 23 05:48:07 np0005593234 nova_compute[227762]: 2026-01-23 10:48:07.899 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:07 np0005593234 nova_compute[227762]: 2026-01-23 10:48:07.904 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:07 np0005593234 nova_compute[227762]: 2026-01-23 10:48:07.916 227766 INFO nova.virt.libvirt.driver [-] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Instance destroyed successfully.#033[00m
Jan 23 05:48:07 np0005593234 nova_compute[227762]: 2026-01-23 10:48:07.917 227766 DEBUG nova.objects.instance [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'numa_topology' on Instance uuid 1f16f1e6-2ac3-4547-84bf-103e4be39e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:48:07 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[327143]: [NOTICE]   (327147) : haproxy version is 2.8.14-c23fe91
Jan 23 05:48:07 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[327143]: [NOTICE]   (327147) : path to executable is /usr/sbin/haproxy
Jan 23 05:48:07 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[327143]: [WARNING]  (327147) : Exiting Master process...
Jan 23 05:48:07 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[327143]: [ALERT]    (327147) : Current worker (327149) exited with code 143 (Terminated)
Jan 23 05:48:07 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[327143]: [WARNING]  (327147) : All workers exited. Exiting... (0)
Jan 23 05:48:07 np0005593234 systemd[1]: libpod-ed1d2e7324a24126ab3ed69e0555d7fc7add44144e3f9e8b731d67177b0e239d.scope: Deactivated successfully.
Jan 23 05:48:07 np0005593234 podman[327581]: 2026-01-23 10:48:07.930360385 +0000 UTC m=+0.192898067 container died ed1d2e7324a24126ab3ed69e0555d7fc7add44144e3f9e8b731d67177b0e239d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 05:48:08 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ed1d2e7324a24126ab3ed69e0555d7fc7add44144e3f9e8b731d67177b0e239d-userdata-shm.mount: Deactivated successfully.
Jan 23 05:48:08 np0005593234 systemd[1]: var-lib-containers-storage-overlay-d603b42a7d8dd474346528c205e5334a436c13d760e792613224348b25c54276-merged.mount: Deactivated successfully.
Jan 23 05:48:08 np0005593234 podman[327581]: 2026-01-23 10:48:08.334836177 +0000 UTC m=+0.597373859 container cleanup ed1d2e7324a24126ab3ed69e0555d7fc7add44144e3f9e8b731d67177b0e239d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 05:48:08 np0005593234 systemd[1]: libpod-conmon-ed1d2e7324a24126ab3ed69e0555d7fc7add44144e3f9e8b731d67177b0e239d.scope: Deactivated successfully.
Jan 23 05:48:08 np0005593234 nova_compute[227762]: 2026-01-23 10:48:08.377 227766 INFO nova.virt.libvirt.driver [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Beginning cold snapshot process#033[00m
Jan 23 05:48:08 np0005593234 nova_compute[227762]: 2026-01-23 10:48:08.563 227766 DEBUG nova.virt.libvirt.imagebackend [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No parent info for 84c0ef19-7f67-4bd3-95d8-507c3e0942ed; asking the Image API where its store is _get_parent_pool /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1163#033[00m
Jan 23 05:48:08 np0005593234 podman[327623]: 2026-01-23 10:48:08.888992007 +0000 UTC m=+0.530487901 container remove ed1d2e7324a24126ab3ed69e0555d7fc7add44144e3f9e8b731d67177b0e239d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:48:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:08.897 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[75583977-b9a2-4a11-a0ff-37363315ff25]: (4, ('Fri Jan 23 10:48:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 (ed1d2e7324a24126ab3ed69e0555d7fc7add44144e3f9e8b731d67177b0e239d)\ned1d2e7324a24126ab3ed69e0555d7fc7add44144e3f9e8b731d67177b0e239d\nFri Jan 23 10:48:08 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 (ed1d2e7324a24126ab3ed69e0555d7fc7add44144e3f9e8b731d67177b0e239d)\ned1d2e7324a24126ab3ed69e0555d7fc7add44144e3f9e8b731d67177b0e239d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:08.899 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b23a1553-54c5-4e02-a327-0fb5b8dd8db5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:08.900 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap082e2952-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:48:08 np0005593234 nova_compute[227762]: 2026-01-23 10:48:08.902 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:08 np0005593234 kernel: tap082e2952-c0: left promiscuous mode
Jan 23 05:48:08 np0005593234 nova_compute[227762]: 2026-01-23 10:48:08.921 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:08.923 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c8fc962e-6e7e-4b3f-9f16-d6c7e795f9e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:08.940 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[87c39df0-dd56-4d14-9c5c-4cc613830336]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:08.941 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5da5eee5-f900-4653-9d0f-12ccb78991f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:08.956 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c9da6277-c642-46a4-be11-3ca815ace9eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 914081, 'reachable_time': 42382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 327676, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:08 np0005593234 systemd[1]: run-netns-ovnmeta\x2d082e2952\x2dc529\x2d49ec\x2d88e6\x2d5e5c5580db01.mount: Deactivated successfully.
Jan 23 05:48:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:08.959 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:48:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:08.960 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[ee63a3ea-8e4f-4699-8ec0-0be0a630f008]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:09 np0005593234 nova_compute[227762]: 2026-01-23 10:48:09.040 227766 DEBUG nova.storage.rbd_utils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] creating snapshot(6b3769c81dd742b09f3ce30664e99a5e) on rbd image(1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:48:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:09.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:48:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:09.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:48:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e409 e409: 3 total, 3 up, 3 in
Jan 23 05:48:09 np0005593234 nova_compute[227762]: 2026-01-23 10:48:09.884 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:10 np0005593234 nova_compute[227762]: 2026-01-23 10:48:10.507 227766 DEBUG nova.compute.manager [req-71651ebd-97fd-4737-85f9-b0f253b4fedf req-29771e79-1c5b-4c55-9c95-528c5505bb16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received event network-vif-unplugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:48:10 np0005593234 nova_compute[227762]: 2026-01-23 10:48:10.509 227766 DEBUG oslo_concurrency.lockutils [req-71651ebd-97fd-4737-85f9-b0f253b4fedf req-29771e79-1c5b-4c55-9c95-528c5505bb16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:10 np0005593234 nova_compute[227762]: 2026-01-23 10:48:10.510 227766 DEBUG oslo_concurrency.lockutils [req-71651ebd-97fd-4737-85f9-b0f253b4fedf req-29771e79-1c5b-4c55-9c95-528c5505bb16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:10 np0005593234 nova_compute[227762]: 2026-01-23 10:48:10.510 227766 DEBUG oslo_concurrency.lockutils [req-71651ebd-97fd-4737-85f9-b0f253b4fedf req-29771e79-1c5b-4c55-9c95-528c5505bb16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:10 np0005593234 nova_compute[227762]: 2026-01-23 10:48:10.511 227766 DEBUG nova.compute.manager [req-71651ebd-97fd-4737-85f9-b0f253b4fedf req-29771e79-1c5b-4c55-9c95-528c5505bb16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] No waiting events found dispatching network-vif-unplugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:48:10 np0005593234 nova_compute[227762]: 2026-01-23 10:48:10.512 227766 WARNING nova.compute.manager [req-71651ebd-97fd-4737-85f9-b0f253b4fedf req-29771e79-1c5b-4c55-9c95-528c5505bb16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received unexpected event network-vif-unplugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 23 05:48:10 np0005593234 nova_compute[227762]: 2026-01-23 10:48:10.512 227766 DEBUG nova.compute.manager [req-71651ebd-97fd-4737-85f9-b0f253b4fedf req-29771e79-1c5b-4c55-9c95-528c5505bb16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received event network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:48:10 np0005593234 nova_compute[227762]: 2026-01-23 10:48:10.513 227766 DEBUG oslo_concurrency.lockutils [req-71651ebd-97fd-4737-85f9-b0f253b4fedf req-29771e79-1c5b-4c55-9c95-528c5505bb16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:10 np0005593234 nova_compute[227762]: 2026-01-23 10:48:10.514 227766 DEBUG oslo_concurrency.lockutils [req-71651ebd-97fd-4737-85f9-b0f253b4fedf req-29771e79-1c5b-4c55-9c95-528c5505bb16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:10 np0005593234 nova_compute[227762]: 2026-01-23 10:48:10.514 227766 DEBUG oslo_concurrency.lockutils [req-71651ebd-97fd-4737-85f9-b0f253b4fedf req-29771e79-1c5b-4c55-9c95-528c5505bb16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:10 np0005593234 nova_compute[227762]: 2026-01-23 10:48:10.515 227766 DEBUG nova.compute.manager [req-71651ebd-97fd-4737-85f9-b0f253b4fedf req-29771e79-1c5b-4c55-9c95-528c5505bb16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] No waiting events found dispatching network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:48:10 np0005593234 nova_compute[227762]: 2026-01-23 10:48:10.515 227766 WARNING nova.compute.manager [req-71651ebd-97fd-4737-85f9-b0f253b4fedf req-29771e79-1c5b-4c55-9c95-528c5505bb16 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received unexpected event network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf for instance with vm_state active and task_state shelving_image_uploading.#033[00m
Jan 23 05:48:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e410 e410: 3 total, 3 up, 3 in
Jan 23 05:48:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:48:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:11.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:48:11 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:11.519 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '83'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:48:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:11.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:11 np0005593234 podman[327696]: 2026-01-23 10:48:11.81326812 +0000 UTC m=+0.091612144 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Jan 23 05:48:12 np0005593234 nova_compute[227762]: 2026-01-23 10:48:12.160 227766 DEBUG nova.storage.rbd_utils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] cloning vms/1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk@6b3769c81dd742b09f3ce30664e99a5e to images/2958e5e7-f8c1-4f81-af25-87197bff4d89 clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:48:12 np0005593234 nova_compute[227762]: 2026-01-23 10:48:12.476 227766 DEBUG nova.storage.rbd_utils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] flattening images/2958e5e7-f8c1-4f81-af25-87197bff4d89 flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 05:48:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e410 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:12 np0005593234 nova_compute[227762]: 2026-01-23 10:48:12.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:12 np0005593234 nova_compute[227762]: 2026-01-23 10:48:12.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:48:12 np0005593234 nova_compute[227762]: 2026-01-23 10:48:12.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:48:12 np0005593234 nova_compute[227762]: 2026-01-23 10:48:12.785 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:48:12 np0005593234 nova_compute[227762]: 2026-01-23 10:48:12.787 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:48:12 np0005593234 nova_compute[227762]: 2026-01-23 10:48:12.788 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:48:12 np0005593234 nova_compute[227762]: 2026-01-23 10:48:12.789 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 1f16f1e6-2ac3-4547-84bf-103e4be39e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:48:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:13.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:13.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:14 np0005593234 nova_compute[227762]: 2026-01-23 10:48:14.886 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:48:14 np0005593234 nova_compute[227762]: 2026-01-23 10:48:14.887 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:48:14 np0005593234 nova_compute[227762]: 2026-01-23 10:48:14.888 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 05:48:14 np0005593234 nova_compute[227762]: 2026-01-23 10:48:14.888 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:48:14 np0005593234 nova_compute[227762]: 2026-01-23 10:48:14.930 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:14 np0005593234 nova_compute[227762]: 2026-01-23 10:48:14.931 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 05:48:15 np0005593234 nova_compute[227762]: 2026-01-23 10:48:15.045 227766 DEBUG nova.storage.rbd_utils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] removing snapshot(6b3769c81dd742b09f3ce30664e99a5e) on rbd image(1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk) remove_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:489#033[00m
Jan 23 05:48:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:15.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:48:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:15.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:48:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e411 e411: 3 total, 3 up, 3 in
Jan 23 05:48:16 np0005593234 nova_compute[227762]: 2026-01-23 10:48:16.422 227766 DEBUG nova.storage.rbd_utils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] creating snapshot(snap) on rbd image(2958e5e7-f8c1-4f81-af25-87197bff4d89) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:48:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e412 e412: 3 total, 3 up, 3 in
Jan 23 05:48:16 np0005593234 nova_compute[227762]: 2026-01-23 10:48:16.899 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updating instance_info_cache with network_info: [{"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:48:16 np0005593234 nova_compute[227762]: 2026-01-23 10:48:16.944 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:48:16 np0005593234 nova_compute[227762]: 2026-01-23 10:48:16.945 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:48:16 np0005593234 nova_compute[227762]: 2026-01-23 10:48:16.947 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:16 np0005593234 nova_compute[227762]: 2026-01-23 10:48:16.947 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:16 np0005593234 nova_compute[227762]: 2026-01-23 10:48:16.989 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:16 np0005593234 nova_compute[227762]: 2026-01-23 10:48:16.989 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:16 np0005593234 nova_compute[227762]: 2026-01-23 10:48:16.990 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:16 np0005593234 nova_compute[227762]: 2026-01-23 10:48:16.990 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:48:16 np0005593234 nova_compute[227762]: 2026-01-23 10:48:16.990 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:48:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:17.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:48:17 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3019329489' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:48:17 np0005593234 nova_compute[227762]: 2026-01-23 10:48:17.440 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:48:17 np0005593234 nova_compute[227762]: 2026-01-23 10:48:17.542 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000d0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:48:17 np0005593234 nova_compute[227762]: 2026-01-23 10:48:17.543 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000d0 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:48:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:48:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:17.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:48:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e412 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:17 np0005593234 nova_compute[227762]: 2026-01-23 10:48:17.694 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:48:17 np0005593234 nova_compute[227762]: 2026-01-23 10:48:17.695 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4111MB free_disk=20.942638397216797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:48:17 np0005593234 nova_compute[227762]: 2026-01-23 10:48:17.695 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:17 np0005593234 nova_compute[227762]: 2026-01-23 10:48:17.696 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:17 np0005593234 nova_compute[227762]: 2026-01-23 10:48:17.825 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 1f16f1e6-2ac3-4547-84bf-103e4be39e3a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:48:17 np0005593234 nova_compute[227762]: 2026-01-23 10:48:17.826 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:48:17 np0005593234 nova_compute[227762]: 2026-01-23 10:48:17.826 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:48:17 np0005593234 nova_compute[227762]: 2026-01-23 10:48:17.928 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:48:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:48:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2725558302' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:48:18 np0005593234 nova_compute[227762]: 2026-01-23 10:48:18.375 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:48:18 np0005593234 nova_compute[227762]: 2026-01-23 10:48:18.381 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:48:18 np0005593234 nova_compute[227762]: 2026-01-23 10:48:18.424 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:48:18 np0005593234 nova_compute[227762]: 2026-01-23 10:48:18.496 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:48:18 np0005593234 nova_compute[227762]: 2026-01-23 10:48:18.496 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.800s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:19 np0005593234 nova_compute[227762]: 2026-01-23 10:48:19.237 227766 INFO nova.virt.libvirt.driver [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Snapshot image upload complete#033[00m
Jan 23 05:48:19 np0005593234 nova_compute[227762]: 2026-01-23 10:48:19.238 227766 DEBUG nova.compute.manager [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:48:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:19.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:19 np0005593234 nova_compute[227762]: 2026-01-23 10:48:19.306 227766 INFO nova.compute.manager [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Shelve offloading#033[00m
Jan 23 05:48:19 np0005593234 nova_compute[227762]: 2026-01-23 10:48:19.317 227766 INFO nova.virt.libvirt.driver [-] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Instance destroyed successfully.#033[00m
Jan 23 05:48:19 np0005593234 nova_compute[227762]: 2026-01-23 10:48:19.317 227766 DEBUG nova.compute.manager [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:48:19 np0005593234 nova_compute[227762]: 2026-01-23 10:48:19.320 227766 DEBUG oslo_concurrency.lockutils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:48:19 np0005593234 nova_compute[227762]: 2026-01-23 10:48:19.321 227766 DEBUG oslo_concurrency.lockutils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquired lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:48:19 np0005593234 nova_compute[227762]: 2026-01-23 10:48:19.321 227766 DEBUG nova.network.neutron [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:48:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:19.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:20 np0005593234 nova_compute[227762]: 2026-01-23 10:48:20.000 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:20 np0005593234 podman[327856]: 2026-01-23 10:48:20.819682389 +0000 UTC m=+0.111500763 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 23 05:48:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:21.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:21 np0005593234 nova_compute[227762]: 2026-01-23 10:48:21.294 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:21 np0005593234 nova_compute[227762]: 2026-01-23 10:48:21.331 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:21 np0005593234 nova_compute[227762]: 2026-01-23 10:48:21.331 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:21 np0005593234 nova_compute[227762]: 2026-01-23 10:48:21.332 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:21 np0005593234 nova_compute[227762]: 2026-01-23 10:48:21.332 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:48:21 np0005593234 nova_compute[227762]: 2026-01-23 10:48:21.561 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:48:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:21.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:48:21 np0005593234 nova_compute[227762]: 2026-01-23 10:48:21.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:21 np0005593234 nova_compute[227762]: 2026-01-23 10:48:21.760 227766 DEBUG nova.network.neutron [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updating instance_info_cache with network_info: [{"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:48:21 np0005593234 nova_compute[227762]: 2026-01-23 10:48:21.773 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:21 np0005593234 nova_compute[227762]: 2026-01-23 10:48:21.797 227766 DEBUG oslo_concurrency.lockutils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Releasing lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:48:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e412 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:22 np0005593234 nova_compute[227762]: 2026-01-23 10:48:22.916 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165287.9146624, 1f16f1e6-2ac3-4547-84bf-103e4be39e3a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:48:22 np0005593234 nova_compute[227762]: 2026-01-23 10:48:22.916 227766 INFO nova.compute.manager [-] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:48:22 np0005593234 nova_compute[227762]: 2026-01-23 10:48:22.944 227766 DEBUG nova.compute.manager [None req-4ebe7b72-d2f2-4b3d-ad99-4bb0e1fed78e - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:48:22 np0005593234 nova_compute[227762]: 2026-01-23 10:48:22.947 227766 DEBUG nova.compute.manager [None req-4ebe7b72-d2f2-4b3d-ad99-4bb0e1fed78e - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: shelved, current task_state: shelving_offloading, current DB power_state: 4, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:48:22 np0005593234 nova_compute[227762]: 2026-01-23 10:48:22.974 227766 INFO nova.compute.manager [None req-4ebe7b72-d2f2-4b3d-ad99-4bb0e1fed78e - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] During sync_power_state the instance has a pending task (shelving_offloading). Skip.#033[00m
Jan 23 05:48:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:23.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.374 227766 INFO nova.virt.libvirt.driver [-] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Instance destroyed successfully.#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.375 227766 DEBUG nova.objects.instance [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'resources' on Instance uuid 1f16f1e6-2ac3-4547-84bf-103e4be39e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.399 227766 DEBUG nova.virt.libvirt.vif [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:47:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1083202157',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-1083202157',id=208,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG5ho3CJ6mVqZQzXfJk0fahhh8Yqf11R44i9Fq9DFeNIqqNX5wHacVicDdiNzwbZtz9LlhyeXROqPF8aw2fBlj8o9f2Tzq3dN4qwFPSyYlBwK89/KDZiTx7iCS1VleFFZQ==',key_name='tempest-keypair-1986772238',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:47:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='36d7e7c7ddbd4cf785fafd0d35b0a2d8',ramdisk_id='',reservation_id='r-ht01gjtk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-2030135659',owner_user_name='tempest-AttachVolumeShelveTestJSON-2030135659-project-member',shelved_at='2026-01-23T10:48:19.238527',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='2958e5e7-f8c1-4f81-af25-87197bff4d89'},tags=<?>,task_state='shelving_offloading',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:48:08Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='296341ffca2441dc807d285fa14c966d',uuid=1f16f1e6-2ac3-4547-84bf-103e4be39e3a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='shelved') vif={"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": 
"082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.400 227766 DEBUG nova.network.os_vif_util [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converting VIF {"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.401 227766 DEBUG nova.network.os_vif_util [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:5f:89,bridge_name='br-int',has_traffic_filtering=True,id=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9acbc2f5-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.401 227766 DEBUG os_vif [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:5f:89,bridge_name='br-int',has_traffic_filtering=True,id=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9acbc2f5-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.403 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.403 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9acbc2f5-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.406 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.408 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.411 227766 INFO os_vif [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:5f:89,bridge_name='br-int',has_traffic_filtering=True,id=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9acbc2f5-e7')#033[00m
Jan 23 05:48:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:23.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.623 227766 DEBUG nova.compute.manager [req-d2ba46b0-c3b2-4370-b2ac-0666dd27cad1 req-12b47fee-3b73-4221-be44-006a58a3ca55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received event network-changed-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.624 227766 DEBUG nova.compute.manager [req-d2ba46b0-c3b2-4370-b2ac-0666dd27cad1 req-12b47fee-3b73-4221-be44-006a58a3ca55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Refreshing instance network info cache due to event network-changed-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.625 227766 DEBUG oslo_concurrency.lockutils [req-d2ba46b0-c3b2-4370-b2ac-0666dd27cad1 req-12b47fee-3b73-4221-be44-006a58a3ca55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.625 227766 DEBUG oslo_concurrency.lockutils [req-d2ba46b0-c3b2-4370-b2ac-0666dd27cad1 req-12b47fee-3b73-4221-be44-006a58a3ca55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.626 227766 DEBUG nova.network.neutron [req-d2ba46b0-c3b2-4370-b2ac-0666dd27cad1 req-12b47fee-3b73-4221-be44-006a58a3ca55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Refreshing network info cache for port 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.847 227766 INFO nova.virt.libvirt.driver [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Deleting instance files /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a_del#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.847 227766 INFO nova.virt.libvirt.driver [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Deletion of /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a_del complete#033[00m
Jan 23 05:48:23 np0005593234 nova_compute[227762]: 2026-01-23 10:48:23.990 227766 INFO nova.scheduler.client.report [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Deleted allocations for instance 1f16f1e6-2ac3-4547-84bf-103e4be39e3a#033[00m
Jan 23 05:48:24 np0005593234 nova_compute[227762]: 2026-01-23 10:48:24.054 227766 DEBUG oslo_concurrency.lockutils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:24 np0005593234 nova_compute[227762]: 2026-01-23 10:48:24.054 227766 DEBUG oslo_concurrency.lockutils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:24 np0005593234 nova_compute[227762]: 2026-01-23 10:48:24.085 227766 DEBUG oslo_concurrency.processutils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:48:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:48:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/817530165' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:48:24 np0005593234 nova_compute[227762]: 2026-01-23 10:48:24.542 227766 DEBUG oslo_concurrency.processutils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:48:24 np0005593234 nova_compute[227762]: 2026-01-23 10:48:24.548 227766 DEBUG nova.compute.provider_tree [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:48:24 np0005593234 nova_compute[227762]: 2026-01-23 10:48:24.572 227766 DEBUG nova.scheduler.client.report [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:48:24 np0005593234 nova_compute[227762]: 2026-01-23 10:48:24.608 227766 DEBUG oslo_concurrency.lockutils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e413 e413: 3 total, 3 up, 3 in
Jan 23 05:48:24 np0005593234 nova_compute[227762]: 2026-01-23 10:48:24.661 227766 DEBUG oslo_concurrency.lockutils [None req-b0005e43-9e9a-49b3-b4bb-0cbb7c8fd9f7 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a" "released" by "nova.compute.manager.ComputeManager.shelve_instance.<locals>.do_shelve_instance" :: held 20.623s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:25 np0005593234 nova_compute[227762]: 2026-01-23 10:48:25.023 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.003000098s ======
Jan 23 05:48:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:25.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000098s
Jan 23 05:48:25 np0005593234 nova_compute[227762]: 2026-01-23 10:48:25.569 227766 DEBUG nova.network.neutron [req-d2ba46b0-c3b2-4370-b2ac-0666dd27cad1 req-12b47fee-3b73-4221-be44-006a58a3ca55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updated VIF entry in instance network info cache for port 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:48:25 np0005593234 nova_compute[227762]: 2026-01-23 10:48:25.569 227766 DEBUG nova.network.neutron [req-d2ba46b0-c3b2-4370-b2ac-0666dd27cad1 req-12b47fee-3b73-4221-be44-006a58a3ca55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updating instance_info_cache with network_info: [{"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": null, "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:48:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:25.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:25 np0005593234 nova_compute[227762]: 2026-01-23 10:48:25.614 227766 DEBUG oslo_concurrency.lockutils [req-d2ba46b0-c3b2-4370-b2ac-0666dd27cad1 req-12b47fee-3b73-4221-be44-006a58a3ca55 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:48:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:48:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:27.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:48:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:27.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:27 np0005593234 nova_compute[227762]: 2026-01-23 10:48:27.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:28 np0005593234 nova_compute[227762]: 2026-01-23 10:48:28.406 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:29.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:29.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:30 np0005593234 nova_compute[227762]: 2026-01-23 10:48:30.024 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:48:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:31.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:48:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:31.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:32 np0005593234 nova_compute[227762]: 2026-01-23 10:48:32.345 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:32 np0005593234 nova_compute[227762]: 2026-01-23 10:48:32.346 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:32 np0005593234 nova_compute[227762]: 2026-01-23 10:48:32.346 227766 INFO nova.compute.manager [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Unshelving#033[00m
Jan 23 05:48:32 np0005593234 nova_compute[227762]: 2026-01-23 10:48:32.473 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:32 np0005593234 nova_compute[227762]: 2026-01-23 10:48:32.473 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:32 np0005593234 nova_compute[227762]: 2026-01-23 10:48:32.479 227766 DEBUG nova.objects.instance [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'pci_requests' on Instance uuid 1f16f1e6-2ac3-4547-84bf-103e4be39e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:48:32 np0005593234 nova_compute[227762]: 2026-01-23 10:48:32.498 227766 DEBUG nova.objects.instance [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'numa_topology' on Instance uuid 1f16f1e6-2ac3-4547-84bf-103e4be39e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:48:32 np0005593234 nova_compute[227762]: 2026-01-23 10:48:32.522 227766 DEBUG nova.virt.hardware [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:48:32 np0005593234 nova_compute[227762]: 2026-01-23 10:48:32.522 227766 INFO nova.compute.claims [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:48:32 np0005593234 nova_compute[227762]: 2026-01-23 10:48:32.658 227766 DEBUG oslo_concurrency.processutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:48:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:48:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/54848019' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:48:33 np0005593234 nova_compute[227762]: 2026-01-23 10:48:33.090 227766 DEBUG oslo_concurrency.processutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:48:33 np0005593234 nova_compute[227762]: 2026-01-23 10:48:33.095 227766 DEBUG nova.compute.provider_tree [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:48:33 np0005593234 nova_compute[227762]: 2026-01-23 10:48:33.125 227766 DEBUG nova.scheduler.client.report [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:48:33 np0005593234 nova_compute[227762]: 2026-01-23 10:48:33.177 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:33.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:33 np0005593234 nova_compute[227762]: 2026-01-23 10:48:33.409 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:33.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:33 np0005593234 nova_compute[227762]: 2026-01-23 10:48:33.684 227766 INFO nova.network.neutron [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updating port 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 23 05:48:35 np0005593234 nova_compute[227762]: 2026-01-23 10:48:35.026 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:48:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:35.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:48:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:48:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:35.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:48:36 np0005593234 nova_compute[227762]: 2026-01-23 10:48:36.003 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:48:36 np0005593234 nova_compute[227762]: 2026-01-23 10:48:36.004 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquired lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:48:36 np0005593234 nova_compute[227762]: 2026-01-23 10:48:36.004 227766 DEBUG nova.network.neutron [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:48:36 np0005593234 nova_compute[227762]: 2026-01-23 10:48:36.263 227766 DEBUG nova.compute.manager [req-0736ab78-59f1-4e7d-880c-014eee284b4d req-b3d48729-4e55-4c8c-8fdc-b5b561d79bb3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received event network-changed-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:48:36 np0005593234 nova_compute[227762]: 2026-01-23 10:48:36.264 227766 DEBUG nova.compute.manager [req-0736ab78-59f1-4e7d-880c-014eee284b4d req-b3d48729-4e55-4c8c-8fdc-b5b561d79bb3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Refreshing instance network info cache due to event network-changed-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:48:36 np0005593234 nova_compute[227762]: 2026-01-23 10:48:36.264 227766 DEBUG oslo_concurrency.lockutils [req-0736ab78-59f1-4e7d-880c-014eee284b4d req-b3d48729-4e55-4c8c-8fdc-b5b561d79bb3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:48:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:37.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:37.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:38 np0005593234 nova_compute[227762]: 2026-01-23 10:48:38.411 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:38 np0005593234 nova_compute[227762]: 2026-01-23 10:48:38.546 227766 DEBUG nova.network.neutron [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updating instance_info_cache with network_info: [{"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:48:38 np0005593234 nova_compute[227762]: 2026-01-23 10:48:38.574 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Releasing lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:48:38 np0005593234 nova_compute[227762]: 2026-01-23 10:48:38.576 227766 DEBUG nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:48:38 np0005593234 nova_compute[227762]: 2026-01-23 10:48:38.576 227766 INFO nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Creating image(s)#033[00m
Jan 23 05:48:38 np0005593234 nova_compute[227762]: 2026-01-23 10:48:38.599 227766 DEBUG nova.storage.rbd_utils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:48:38 np0005593234 nova_compute[227762]: 2026-01-23 10:48:38.602 227766 DEBUG nova.objects.instance [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 1f16f1e6-2ac3-4547-84bf-103e4be39e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:48:38 np0005593234 nova_compute[227762]: 2026-01-23 10:48:38.604 227766 DEBUG oslo_concurrency.lockutils [req-0736ab78-59f1-4e7d-880c-014eee284b4d req-b3d48729-4e55-4c8c-8fdc-b5b561d79bb3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:48:38 np0005593234 nova_compute[227762]: 2026-01-23 10:48:38.604 227766 DEBUG nova.network.neutron [req-0736ab78-59f1-4e7d-880c-014eee284b4d req-b3d48729-4e55-4c8c-8fdc-b5b561d79bb3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Refreshing network info cache for port 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:48:38 np0005593234 nova_compute[227762]: 2026-01-23 10:48:38.653 227766 DEBUG nova.storage.rbd_utils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:48:38 np0005593234 nova_compute[227762]: 2026-01-23 10:48:38.679 227766 DEBUG nova.storage.rbd_utils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:48:38 np0005593234 nova_compute[227762]: 2026-01-23 10:48:38.683 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "d5dd7708417c6230468ce13813d6e30d36b10490" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:38 np0005593234 nova_compute[227762]: 2026-01-23 10:48:38.683 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "d5dd7708417c6230468ce13813d6e30d36b10490" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:39.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:39 np0005593234 nova_compute[227762]: 2026-01-23 10:48:39.403 227766 DEBUG nova.virt.libvirt.imagebackend [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/2958e5e7-f8c1-4f81-af25-87197bff4d89/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/2958e5e7-f8c1-4f81-af25-87197bff4d89/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 05:48:39 np0005593234 nova_compute[227762]: 2026-01-23 10:48:39.479 227766 DEBUG nova.virt.libvirt.imagebackend [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Selected location: {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/2958e5e7-f8c1-4f81-af25-87197bff4d89/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 23 05:48:39 np0005593234 nova_compute[227762]: 2026-01-23 10:48:39.480 227766 DEBUG nova.storage.rbd_utils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] cloning images/2958e5e7-f8c1-4f81-af25-87197bff4d89@snap to None/1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 05:48:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:39.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:39 np0005593234 nova_compute[227762]: 2026-01-23 10:48:39.614 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "d5dd7708417c6230468ce13813d6e30d36b10490" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.930s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:39 np0005593234 nova_compute[227762]: 2026-01-23 10:48:39.783 227766 DEBUG nova.objects.instance [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'migration_context' on Instance uuid 1f16f1e6-2ac3-4547-84bf-103e4be39e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:48:39 np0005593234 nova_compute[227762]: 2026-01-23 10:48:39.875 227766 DEBUG nova.storage.rbd_utils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] flattening vms/1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.028 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.541 227766 DEBUG nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Image rbd:vms/1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.542 227766 DEBUG nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.542 227766 DEBUG nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Ensure instance console log exists: /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.543 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.543 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.543 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.545 227766 DEBUG nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Start _get_guest_xml network_info=[{"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-23T10:48:03Z,direct_url=<?>,disk_format='raw',id=2958e5e7-f8c1-4f81-af25-87197bff4d89,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-1083202157-shelved',owner='36d7e7c7ddbd4cf785fafd0d35b0a2d8',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-23T10:48:18Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.548 227766 WARNING nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.552 227766 DEBUG nova.virt.libvirt.host [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.552 227766 DEBUG nova.virt.libvirt.host [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.559 227766 DEBUG nova.virt.libvirt.host [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.559 227766 DEBUG nova.virt.libvirt.host [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.560 227766 DEBUG nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.560 227766 DEBUG nova.virt.hardware [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-23T10:48:03Z,direct_url=<?>,disk_format='raw',id=2958e5e7-f8c1-4f81-af25-87197bff4d89,min_disk=1,min_ram=0,name='tempest-AttachVolumeShelveTestJSON-server-1083202157-shelved',owner='36d7e7c7ddbd4cf785fafd0d35b0a2d8',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-23T10:48:18Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.561 227766 DEBUG nova.virt.hardware [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.561 227766 DEBUG nova.virt.hardware [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.561 227766 DEBUG nova.virt.hardware [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.561 227766 DEBUG nova.virt.hardware [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.561 227766 DEBUG nova.virt.hardware [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.562 227766 DEBUG nova.virt.hardware [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.562 227766 DEBUG nova.virt.hardware [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.562 227766 DEBUG nova.virt.hardware [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.562 227766 DEBUG nova.virt.hardware [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.562 227766 DEBUG nova.virt.hardware [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.563 227766 DEBUG nova.objects.instance [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 1f16f1e6-2ac3-4547-84bf-103e4be39e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.589 227766 DEBUG oslo_concurrency.processutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:48:40 np0005593234 nova_compute[227762]: 2026-01-23 10:48:40.740 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:48:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:48:41 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/167611609' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.047 227766 DEBUG oslo_concurrency.processutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.070 227766 DEBUG nova.storage.rbd_utils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.075 227766 DEBUG oslo_concurrency.processutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:48:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:41.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:48:41 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1553830384' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.536 227766 DEBUG oslo_concurrency.processutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.538 227766 DEBUG nova.virt.libvirt.vif [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:47:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1083202157',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-1083202157',id=208,image_ref='2958e5e7-f8c1-4f81-af25-87197bff4d89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1986772238',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:47:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='36d7e7c7ddbd4cf785fafd0d35b0a2d8',ramdisk_id='',reservation_id='r-ht01gjtk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_
vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-2030135659',owner_user_name='tempest-AttachVolumeShelveTestJSON-2030135659-project-member',shelved_at='2026-01-23T10:48:19.238527',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='2958e5e7-f8c1-4f81-af25-87197bff4d89'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:48:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='296341ffca2441dc807d285fa14c966d',uuid=1f16f1e6-2ac3-4547-84bf-103e4be39e3a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.539 227766 DEBUG nova.network.os_vif_util [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converting VIF {"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.540 227766 DEBUG nova.network.os_vif_util [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:5f:89,bridge_name='br-int',has_traffic_filtering=True,id=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9acbc2f5-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.541 227766 DEBUG nova.objects.instance [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1f16f1e6-2ac3-4547-84bf-103e4be39e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.566 227766 DEBUG nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  <uuid>1f16f1e6-2ac3-4547-84bf-103e4be39e3a</uuid>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  <name>instance-000000d0</name>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <nova:name>tempest-AttachVolumeShelveTestJSON-server-1083202157</nova:name>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:48:40</nova:creationTime>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <nova:user uuid="296341ffca2441dc807d285fa14c966d">tempest-AttachVolumeShelveTestJSON-2030135659-project-member</nova:user>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <nova:project uuid="36d7e7c7ddbd4cf785fafd0d35b0a2d8">tempest-AttachVolumeShelveTestJSON-2030135659</nova:project>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="2958e5e7-f8c1-4f81-af25-87197bff4d89"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <nova:port uuid="9acbc2f5-e7f6-4b5e-8799-c611ad3392bf">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <entry name="serial">1f16f1e6-2ac3-4547-84bf-103e4be39e3a</entry>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <entry name="uuid">1f16f1e6-2ac3-4547-84bf-103e4be39e3a</entry>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk.config">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:c1:5f:89"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <target dev="tap9acbc2f5-e7"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/console.log" append="off"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <input type="keyboard" bus="usb"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:48:41 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:48:41 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:48:41 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:48:41 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.568 227766 DEBUG nova.compute.manager [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Preparing to wait for external event network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.569 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.569 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.569 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.570 227766 DEBUG nova.virt.libvirt.vif [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:47:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1083202157',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-1083202157',id=208,image_ref='2958e5e7-f8c1-4f81-af25-87197bff4d89',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-keypair-1986772238',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:47:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='36d7e7c7ddbd4cf785fafd0d35b0a2d8',ramdisk_id='',reservation_id='r-ht01gjtk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-2030135659',owner_user_name='tempest-AttachVolumeShelveTestJSON-2030135659-project-member',shelved_at='2026-01-23T10:48:19.238527',shelved_host='compute-2.ctlplane.example.com',shelved_image_id='2958e5e7-f8c1-4f81-af25-87197bff4d89'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:48:32Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='296341ffca2441dc807d285fa14c966d',uuid=1f16f1e6-2ac3-4547-84bf-103e4be39e3a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.571 227766 DEBUG nova.network.os_vif_util [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converting VIF {"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.571 227766 DEBUG nova.network.os_vif_util [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:5f:89,bridge_name='br-int',has_traffic_filtering=True,id=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9acbc2f5-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.572 227766 DEBUG os_vif [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:5f:89,bridge_name='br-int',has_traffic_filtering=True,id=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9acbc2f5-e7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.572 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.573 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.574 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.577 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.577 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9acbc2f5-e7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.578 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9acbc2f5-e7, col_values=(('external_ids', {'iface-id': '9acbc2f5-e7f6-4b5e-8799-c611ad3392bf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c1:5f:89', 'vm-uuid': '1f16f1e6-2ac3-4547-84bf-103e4be39e3a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.580 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:41 np0005593234 NetworkManager[48942]: <info>  [1769165321.5810] manager: (tap9acbc2f5-e7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/416)
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.582 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.585 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.586 227766 INFO os_vif [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:5f:89,bridge_name='br-int',has_traffic_filtering=True,id=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9acbc2f5-e7')#033[00m
Jan 23 05:48:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:41.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.644 227766 DEBUG nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.645 227766 DEBUG nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.645 227766 DEBUG nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] No VIF found with MAC fa:16:3e:c1:5f:89, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.646 227766 INFO nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Using config drive#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.668 227766 DEBUG nova.storage.rbd_utils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.733 227766 DEBUG nova.objects.instance [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 1f16f1e6-2ac3-4547-84bf-103e4be39e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:48:41 np0005593234 nova_compute[227762]: 2026-01-23 10:48:41.801 227766 DEBUG nova.objects.instance [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'keypairs' on Instance uuid 1f16f1e6-2ac3-4547-84bf-103e4be39e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:48:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e413 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:42 np0005593234 podman[328301]: 2026-01-23 10:48:42.751649689 +0000 UTC m=+0.048990465 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Jan 23 05:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:42.889 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:42.890 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:42.890 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:42 np0005593234 nova_compute[227762]: 2026-01-23 10:48:42.956 227766 INFO nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Creating config drive at /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/disk.config#033[00m
Jan 23 05:48:42 np0005593234 nova_compute[227762]: 2026-01-23 10:48:42.964 227766 DEBUG oslo_concurrency.processutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjbyns69e execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.112 227766 DEBUG oslo_concurrency.processutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpjbyns69e" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.143 227766 DEBUG nova.storage.rbd_utils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] rbd image 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.148 227766 DEBUG oslo_concurrency.processutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/disk.config 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.253 227766 DEBUG nova.network.neutron [req-0736ab78-59f1-4e7d-880c-014eee284b4d req-b3d48729-4e55-4c8c-8fdc-b5b561d79bb3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updated VIF entry in instance network info cache for port 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.254 227766 DEBUG nova.network.neutron [req-0736ab78-59f1-4e7d-880c-014eee284b4d req-b3d48729-4e55-4c8c-8fdc-b5b561d79bb3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updating instance_info_cache with network_info: [{"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:48:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:43.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.326 227766 DEBUG oslo_concurrency.lockutils [req-0736ab78-59f1-4e7d-880c-014eee284b4d req-b3d48729-4e55-4c8c-8fdc-b5b561d79bb3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-1f16f1e6-2ac3-4547-84bf-103e4be39e3a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.339 227766 DEBUG oslo_concurrency.processutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/disk.config 1f16f1e6-2ac3-4547-84bf-103e4be39e3a_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.191s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.339 227766 INFO nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Deleting local config drive /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a/disk.config because it was imported into RBD.#033[00m
Jan 23 05:48:43 np0005593234 kernel: tap9acbc2f5-e7: entered promiscuous mode
Jan 23 05:48:43 np0005593234 NetworkManager[48942]: <info>  [1769165323.4024] manager: (tap9acbc2f5-e7): new Tun device (/org/freedesktop/NetworkManager/Devices/417)
Jan 23 05:48:43 np0005593234 ovn_controller[134547]: 2026-01-23T10:48:43Z|00887|binding|INFO|Claiming lport 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf for this chassis.
Jan 23 05:48:43 np0005593234 ovn_controller[134547]: 2026-01-23T10:48:43Z|00888|binding|INFO|9acbc2f5-e7f6-4b5e-8799-c611ad3392bf: Claiming fa:16:3e:c1:5f:89 10.100.0.13
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.405 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.409 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.413 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:43 np0005593234 NetworkManager[48942]: <info>  [1769165323.4158] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/418)
Jan 23 05:48:43 np0005593234 NetworkManager[48942]: <info>  [1769165323.4171] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/419)
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.424 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:5f:89 10.100.0.13'], port_security=['fa:16:3e:c1:5f:89 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1f16f1e6-2ac3-4547-84bf-103e4be39e3a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-082e2952-c529-49ec-88e6-5e5c5580db01', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '36d7e7c7ddbd4cf785fafd0d35b0a2d8', 'neutron:revision_number': '7', 'neutron:security_group_ids': '0f4c1420-fdc4-4f47-97c8-7ad48c8768c8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.173'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6ea75f5-7173-4e04-a97c-cbcceff41ada, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.426 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf in datapath 082e2952-c529-49ec-88e6-5e5c5580db01 bound to our chassis#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.428 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 082e2952-c529-49ec-88e6-5e5c5580db01#033[00m
Jan 23 05:48:43 np0005593234 systemd-udevd[328372]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.442 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aadaeea9-7d45-4d89-b38d-95f830f0a1c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.444 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap082e2952-c1 in ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.446 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap082e2952-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.446 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[17a99e9a-6e8f-4fcb-bb73-21bcc40c35a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.447 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[06c9bc84-2e2e-4b4a-8047-e2cd5ea98fa8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 NetworkManager[48942]: <info>  [1769165323.4571] device (tap9acbc2f5-e7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:48:43 np0005593234 NetworkManager[48942]: <info>  [1769165323.4576] device (tap9acbc2f5-e7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:48:43 np0005593234 systemd-machined[195626]: New machine qemu-100-instance-000000d0.
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.463 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[4e69c3f6-46f8-4a9e-9bc9-996f84d907df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 systemd[1]: Started Virtual Machine qemu-100-instance-000000d0.
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.493 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8089d4c1-0de5-4617-b167-c005c5c111cb]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.502 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.513 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:43 np0005593234 ovn_controller[134547]: 2026-01-23T10:48:43Z|00889|binding|INFO|Setting lport 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf ovn-installed in OVS
Jan 23 05:48:43 np0005593234 ovn_controller[134547]: 2026-01-23T10:48:43Z|00890|binding|INFO|Setting lport 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf up in Southbound
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.531 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.538 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ea756d31-fff0-4654-8418-728139b36713]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.543 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bea3977d-ede5-465a-9321-c7ebdd67401c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 NetworkManager[48942]: <info>  [1769165323.5450] manager: (tap082e2952-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/420)
Jan 23 05:48:43 np0005593234 systemd-udevd[328379]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.587 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[0a449d64-ed3b-4e2f-9030-679b94bba37f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.594 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d0bd7488-f692-4793-8b6e-f30c4d68bfbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:43.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:43 np0005593234 NetworkManager[48942]: <info>  [1769165323.6200] device (tap082e2952-c0): carrier: link connected
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.625 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[60a9351d-91cb-4bec-9a07-22f02ea6797c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.642 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5a9163c5-1892-4109-a6c8-4c242abaae7f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap082e2952-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:8e:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 273], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 921857, 'reachable_time': 21738, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 328408, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.670 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[06be1ff2-8cbc-4d84-9f07-cf4c3247dd21]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe91:8e23'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 921857, 'tstamp': 921857}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 328409, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.696 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[edc3277a-8bc3-4b2d-8897-4415c66ffd79]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap082e2952-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:91:8e:23'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 273], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 921857, 'reachable_time': 21738, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 328410, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.739 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4694eaa4-c1a9-4656-8202-0ba667853fc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.828 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[587d2de3-4e23-4df1-95bb-36e33af01fa9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.830 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap082e2952-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.830 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.830 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap082e2952-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.833 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:43 np0005593234 NetworkManager[48942]: <info>  [1769165323.8336] manager: (tap082e2952-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/421)
Jan 23 05:48:43 np0005593234 kernel: tap082e2952-c0: entered promiscuous mode
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.835 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.836 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap082e2952-c0, col_values=(('external_ids', {'iface-id': 'e36b250d-7843-417b-b3f6-5e001769e85d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.837 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:43 np0005593234 ovn_controller[134547]: 2026-01-23T10:48:43Z|00891|binding|INFO|Releasing lport e36b250d-7843-417b-b3f6-5e001769e85d from this chassis (sb_readonly=0)
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.851 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.853 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/082e2952-c529-49ec-88e6-5e5c5580db01.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/082e2952-c529-49ec-88e6-5e5c5580db01.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.854 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5e2a9653-98d0-46e1-8609-791e89992409]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.855 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-082e2952-c529-49ec-88e6-5e5c5580db01
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/082e2952-c529-49ec-88e6-5e5c5580db01.pid.haproxy
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 082e2952-c529-49ec-88e6-5e5c5580db01
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:48:43 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:48:43.856 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'env', 'PROCESS_TAG=haproxy-082e2952-c529-49ec-88e6-5e5c5580db01', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/082e2952-c529-49ec-88e6-5e5c5580db01.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.917 227766 DEBUG nova.compute.manager [req-52108f90-38e0-4362-aa4b-e38b3f653280 req-8ef22c35-cdaa-4b90-9540-c75b9bffd6c9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received event network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.918 227766 DEBUG oslo_concurrency.lockutils [req-52108f90-38e0-4362-aa4b-e38b3f653280 req-8ef22c35-cdaa-4b90-9540-c75b9bffd6c9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.918 227766 DEBUG oslo_concurrency.lockutils [req-52108f90-38e0-4362-aa4b-e38b3f653280 req-8ef22c35-cdaa-4b90-9540-c75b9bffd6c9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.919 227766 DEBUG oslo_concurrency.lockutils [req-52108f90-38e0-4362-aa4b-e38b3f653280 req-8ef22c35-cdaa-4b90-9540-c75b9bffd6c9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:43 np0005593234 nova_compute[227762]: 2026-01-23 10:48:43.919 227766 DEBUG nova.compute.manager [req-52108f90-38e0-4362-aa4b-e38b3f653280 req-8ef22c35-cdaa-4b90-9540-c75b9bffd6c9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Processing event network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.015 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165324.0141706, 1f16f1e6-2ac3-4547-84bf-103e4be39e3a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.015 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] VM Started (Lifecycle Event)#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.018 227766 DEBUG nova.compute.manager [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.020 227766 DEBUG nova.virt.libvirt.driver [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.026 227766 INFO nova.virt.libvirt.driver [-] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Instance spawned successfully.#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.043 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.046 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.070 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.071 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165324.0143309, 1f16f1e6-2ac3-4547-84bf-103e4be39e3a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.071 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.095 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.100 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165324.0198326, 1f16f1e6-2ac3-4547-84bf-103e4be39e3a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.100 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.121 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.127 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:48:44 np0005593234 nova_compute[227762]: 2026-01-23 10:48:44.155 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:48:44 np0005593234 podman[328486]: 2026-01-23 10:48:44.303107134 +0000 UTC m=+0.046367321 container create 3069d4df60f391a0ec635d25d4c296723ce09ef4d894576ebdd45b2afdaf1a83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:48:44 np0005593234 systemd[1]: Started libpod-conmon-3069d4df60f391a0ec635d25d4c296723ce09ef4d894576ebdd45b2afdaf1a83.scope.
Jan 23 05:48:44 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:48:44 np0005593234 podman[328486]: 2026-01-23 10:48:44.279867198 +0000 UTC m=+0.023127405 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:48:44 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/99d80fabc7e7d81de5feb9173295b9ba6a2e0b8e903dfef60daa03ac6ca59151/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:48:44 np0005593234 podman[328486]: 2026-01-23 10:48:44.389858091 +0000 UTC m=+0.133118288 container init 3069d4df60f391a0ec635d25d4c296723ce09ef4d894576ebdd45b2afdaf1a83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:48:44 np0005593234 podman[328486]: 2026-01-23 10:48:44.395344157 +0000 UTC m=+0.138604344 container start 3069d4df60f391a0ec635d25d4c296723ce09ef4d894576ebdd45b2afdaf1a83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:48:44 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[328501]: [NOTICE]   (328505) : New worker (328507) forked
Jan 23 05:48:44 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[328501]: [NOTICE]   (328505) : Loading success.
Jan 23 05:48:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e414 e414: 3 total, 3 up, 3 in
Jan 23 05:48:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:48:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1629093717' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:48:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:48:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1629093717' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:48:45 np0005593234 nova_compute[227762]: 2026-01-23 10:48:45.029 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:45 np0005593234 nova_compute[227762]: 2026-01-23 10:48:45.122 227766 DEBUG nova.compute.manager [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:48:45 np0005593234 nova_compute[227762]: 2026-01-23 10:48:45.198 227766 DEBUG oslo_concurrency.lockutils [None req-10813662-6f7e-4c8c-af21-83ad63c726a1 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 12.852s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:45.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:48:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:48:45 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:48:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:45.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:46 np0005593234 nova_compute[227762]: 2026-01-23 10:48:46.117 227766 DEBUG nova.compute.manager [req-aec8274e-da04-4b55-a054-b9877d90f528 req-b4af66b0-b2ef-47cf-8148-cb5ad8186273 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received event network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:48:46 np0005593234 nova_compute[227762]: 2026-01-23 10:48:46.118 227766 DEBUG oslo_concurrency.lockutils [req-aec8274e-da04-4b55-a054-b9877d90f528 req-b4af66b0-b2ef-47cf-8148-cb5ad8186273 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:48:46 np0005593234 nova_compute[227762]: 2026-01-23 10:48:46.118 227766 DEBUG oslo_concurrency.lockutils [req-aec8274e-da04-4b55-a054-b9877d90f528 req-b4af66b0-b2ef-47cf-8148-cb5ad8186273 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:48:46 np0005593234 nova_compute[227762]: 2026-01-23 10:48:46.118 227766 DEBUG oslo_concurrency.lockutils [req-aec8274e-da04-4b55-a054-b9877d90f528 req-b4af66b0-b2ef-47cf-8148-cb5ad8186273 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:48:46 np0005593234 nova_compute[227762]: 2026-01-23 10:48:46.118 227766 DEBUG nova.compute.manager [req-aec8274e-da04-4b55-a054-b9877d90f528 req-b4af66b0-b2ef-47cf-8148-cb5ad8186273 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] No waiting events found dispatching network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:48:46 np0005593234 nova_compute[227762]: 2026-01-23 10:48:46.119 227766 WARNING nova.compute.manager [req-aec8274e-da04-4b55-a054-b9877d90f528 req-b4af66b0-b2ef-47cf-8148-cb5ad8186273 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received unexpected event network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf for instance with vm_state active and task_state None.#033[00m
Jan 23 05:48:46 np0005593234 nova_compute[227762]: 2026-01-23 10:48:46.582 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:48:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:47.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:48:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:48:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:47.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:48:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e414 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:48:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:49.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:48:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:49.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e415 e415: 3 total, 3 up, 3 in
Jan 23 05:48:50 np0005593234 nova_compute[227762]: 2026-01-23 10:48:50.032 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:51.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:51 np0005593234 nova_compute[227762]: 2026-01-23 10:48:51.584 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:51.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:51 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:48:51 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:48:51 np0005593234 podman[328705]: 2026-01-23 10:48:51.893768417 +0000 UTC m=+0.174861077 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:48:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:48:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:53.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:48:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.037001200s ======
Jan 23 05:48:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:53.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.037001200s
Jan 23 05:48:55 np0005593234 nova_compute[227762]: 2026-01-23 10:48:55.034 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:55.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:55.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:56 np0005593234 nova_compute[227762]: 2026-01-23 10:48:56.588 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:48:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:48:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:57.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:48:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:57.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:48:57 np0005593234 ovn_controller[134547]: 2026-01-23T10:48:57Z|00117|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c1:5f:89 10.100.0.13
Jan 23 05:48:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:48:59.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:48:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:48:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:48:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:48:59.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:00 np0005593234 nova_compute[227762]: 2026-01-23 10:49:00.037 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:49:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:01.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:49:01 np0005593234 nova_compute[227762]: 2026-01-23 10:49:01.617 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:01.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:03.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:49:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:03.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:49:03 np0005593234 nova_compute[227762]: 2026-01-23 10:49:03.948 227766 DEBUG oslo_concurrency.lockutils [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:49:03 np0005593234 nova_compute[227762]: 2026-01-23 10:49:03.949 227766 DEBUG oslo_concurrency.lockutils [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:49:03 np0005593234 nova_compute[227762]: 2026-01-23 10:49:03.949 227766 DEBUG oslo_concurrency.lockutils [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:49:03 np0005593234 nova_compute[227762]: 2026-01-23 10:49:03.949 227766 DEBUG oslo_concurrency.lockutils [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:49:03 np0005593234 nova_compute[227762]: 2026-01-23 10:49:03.950 227766 DEBUG oslo_concurrency.lockutils [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:49:03 np0005593234 nova_compute[227762]: 2026-01-23 10:49:03.951 227766 INFO nova.compute.manager [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Terminating instance#033[00m
Jan 23 05:49:03 np0005593234 nova_compute[227762]: 2026-01-23 10:49:03.952 227766 DEBUG nova.compute.manager [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:49:04 np0005593234 kernel: tap9acbc2f5-e7 (unregistering): left promiscuous mode
Jan 23 05:49:04 np0005593234 NetworkManager[48942]: <info>  [1769165344.0203] device (tap9acbc2f5-e7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.030 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:04 np0005593234 ovn_controller[134547]: 2026-01-23T10:49:04Z|00892|binding|INFO|Releasing lport 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf from this chassis (sb_readonly=0)
Jan 23 05:49:04 np0005593234 ovn_controller[134547]: 2026-01-23T10:49:04Z|00893|binding|INFO|Setting lport 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf down in Southbound
Jan 23 05:49:04 np0005593234 ovn_controller[134547]: 2026-01-23T10:49:04Z|00894|binding|INFO|Removing iface tap9acbc2f5-e7 ovn-installed in OVS
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.033 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:04.047 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:5f:89 10.100.0.13'], port_security=['fa:16:3e:c1:5f:89 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '1f16f1e6-2ac3-4547-84bf-103e4be39e3a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-082e2952-c529-49ec-88e6-5e5c5580db01', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '36d7e7c7ddbd4cf785fafd0d35b0a2d8', 'neutron:revision_number': '9', 'neutron:security_group_ids': '0f4c1420-fdc4-4f47-97c8-7ad48c8768c8', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.173', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=b6ea75f5-7173-4e04-a97c-cbcceff41ada, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.049 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:04.052 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 9acbc2f5-e7f6-4b5e-8799-c611ad3392bf in datapath 082e2952-c529-49ec-88e6-5e5c5580db01 unbound from our chassis#033[00m
Jan 23 05:49:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:04.054 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 082e2952-c529-49ec-88e6-5e5c5580db01, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:49:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:04.057 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[21fc27fe-6aa5-4740-a7ca-534301a7ae88]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:49:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:04.058 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 namespace which is not needed anymore#033[00m
Jan 23 05:49:04 np0005593234 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000d0.scope: Deactivated successfully.
Jan 23 05:49:04 np0005593234 systemd[1]: machine-qemu\x2d100\x2dinstance\x2d000000d0.scope: Consumed 14.191s CPU time.
Jan 23 05:49:04 np0005593234 systemd-machined[195626]: Machine qemu-100-instance-000000d0 terminated.
Jan 23 05:49:04 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[328501]: [NOTICE]   (328505) : haproxy version is 2.8.14-c23fe91
Jan 23 05:49:04 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[328501]: [NOTICE]   (328505) : path to executable is /usr/sbin/haproxy
Jan 23 05:49:04 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[328501]: [WARNING]  (328505) : Exiting Master process...
Jan 23 05:49:04 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[328501]: [ALERT]    (328505) : Current worker (328507) exited with code 143 (Terminated)
Jan 23 05:49:04 np0005593234 neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01[328501]: [WARNING]  (328505) : All workers exited. Exiting... (0)
Jan 23 05:49:04 np0005593234 systemd[1]: libpod-3069d4df60f391a0ec635d25d4c296723ce09ef4d894576ebdd45b2afdaf1a83.scope: Deactivated successfully.
Jan 23 05:49:04 np0005593234 podman[328807]: 2026-01-23 10:49:04.191840649 +0000 UTC m=+0.043860039 container died 3069d4df60f391a0ec635d25d4c296723ce09ef4d894576ebdd45b2afdaf1a83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.199 227766 INFO nova.virt.libvirt.driver [-] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Instance destroyed successfully.#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.201 227766 DEBUG nova.objects.instance [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lazy-loading 'resources' on Instance uuid 1f16f1e6-2ac3-4547-84bf-103e4be39e3a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.230 227766 DEBUG nova.virt.libvirt.vif [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:47:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-AttachVolumeShelveTestJSON-server-1083202157',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-attachvolumeshelvetestjson-server-1083202157',id=208,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG5ho3CJ6mVqZQzXfJk0fahhh8Yqf11R44i9Fq9DFeNIqqNX5wHacVicDdiNzwbZtz9LlhyeXROqPF8aw2fBlj8o9f2Tzq3dN4qwFPSyYlBwK89/KDZiTx7iCS1VleFFZQ==',key_name='tempest-keypair-1986772238',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:48:45Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='36d7e7c7ddbd4cf785fafd0d35b0a2d8',ramdisk_id='',reservation_id='r-ht01gjtk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-AttachVolumeShelveTestJSON-2030135659',owner_user_name='tempest-AttachVolumeShelveTestJSON-2030135659-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:48:45Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='296341ffca2441dc807d285fa14c966d',uuid=1f16f1e6-2ac3-4547-84bf-103e4be39e3a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.231 227766 DEBUG nova.network.os_vif_util [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converting VIF {"id": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "address": "fa:16:3e:c1:5f:89", "network": {"id": "082e2952-c529-49ec-88e6-5e5c5580db01", "bridge": "br-int", "label": "tempest-AttachVolumeShelveTestJSON-1240917525-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.173", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "36d7e7c7ddbd4cf785fafd0d35b0a2d8", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9acbc2f5-e7", "ovs_interfaceid": "9acbc2f5-e7f6-4b5e-8799-c611ad3392bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.233 227766 DEBUG nova.network.os_vif_util [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c1:5f:89,bridge_name='br-int',has_traffic_filtering=True,id=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9acbc2f5-e7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.234 227766 DEBUG os_vif [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:5f:89,bridge_name='br-int',has_traffic_filtering=True,id=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9acbc2f5-e7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.236 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.237 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9acbc2f5-e7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:49:04 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3069d4df60f391a0ec635d25d4c296723ce09ef4d894576ebdd45b2afdaf1a83-userdata-shm.mount: Deactivated successfully.
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.242 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:04 np0005593234 systemd[1]: var-lib-containers-storage-overlay-99d80fabc7e7d81de5feb9173295b9ba6a2e0b8e903dfef60daa03ac6ca59151-merged.mount: Deactivated successfully.
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.246 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.249 227766 INFO os_vif [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c1:5f:89,bridge_name='br-int',has_traffic_filtering=True,id=9acbc2f5-e7f6-4b5e-8799-c611ad3392bf,network=Network(082e2952-c529-49ec-88e6-5e5c5580db01),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9acbc2f5-e7')#033[00m
Jan 23 05:49:04 np0005593234 podman[328807]: 2026-01-23 10:49:04.256479136 +0000 UTC m=+0.108498526 container cleanup 3069d4df60f391a0ec635d25d4c296723ce09ef4d894576ebdd45b2afdaf1a83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 05:49:04 np0005593234 systemd[1]: libpod-conmon-3069d4df60f391a0ec635d25d4c296723ce09ef4d894576ebdd45b2afdaf1a83.scope: Deactivated successfully.
Jan 23 05:49:04 np0005593234 podman[328856]: 2026-01-23 10:49:04.328072956 +0000 UTC m=+0.050157122 container remove 3069d4df60f391a0ec635d25d4c296723ce09ef4d894576ebdd45b2afdaf1a83 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 23 05:49:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:04.335 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6beb871e-b9b4-4274-a1c3-49b702ed069f]: (4, ('Fri Jan 23 10:49:04 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 (3069d4df60f391a0ec635d25d4c296723ce09ef4d894576ebdd45b2afdaf1a83)\n3069d4df60f391a0ec635d25d4c296723ce09ef4d894576ebdd45b2afdaf1a83\nFri Jan 23 10:49:04 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 (3069d4df60f391a0ec635d25d4c296723ce09ef4d894576ebdd45b2afdaf1a83)\n3069d4df60f391a0ec635d25d4c296723ce09ef4d894576ebdd45b2afdaf1a83\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:49:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:04.336 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[69e4bd62-4bc1-47be-815e-83c7a6698ac7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:49:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:04.337 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap082e2952-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.339 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:04 np0005593234 kernel: tap082e2952-c0: left promiscuous mode
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.353 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.354 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:04.356 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e5dffa32-ea31-4e26-9165-77ad74cfe2b0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:49:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:04.376 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6f2f7484-448b-464e-ad6a-9c6504e734a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:49:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:04.377 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ec3b642b-ffa4-4ea0-a4bc-3e517b9650fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:49:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:04.394 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0e2dc4b0-931c-44e3-9ac7-18c88643e4f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 921849, 'reachable_time': 27190, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 328880, 'error': None, 'target': 'ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:49:04 np0005593234 systemd[1]: run-netns-ovnmeta\x2d082e2952\x2dc529\x2d49ec\x2d88e6\x2d5e5c5580db01.mount: Deactivated successfully.
Jan 23 05:49:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:04.399 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-082e2952-c529-49ec-88e6-5e5c5580db01 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:49:04 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:04.400 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[07ccee74-6d43-4fa3-ac02-80d9a5c83054]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.663 227766 INFO nova.virt.libvirt.driver [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Deleting instance files /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a_del#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.664 227766 INFO nova.virt.libvirt.driver [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Deletion of /var/lib/nova/instances/1f16f1e6-2ac3-4547-84bf-103e4be39e3a_del complete#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.741 227766 INFO nova.compute.manager [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Took 0.79 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.742 227766 DEBUG oslo.service.loopingcall [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.742 227766 DEBUG nova.compute.manager [-] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:49:04 np0005593234 nova_compute[227762]: 2026-01-23 10:49:04.743 227766 DEBUG nova.network.neutron [-] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:49:05 np0005593234 nova_compute[227762]: 2026-01-23 10:49:05.022 227766 DEBUG nova.compute.manager [req-9591fc0d-b780-4242-96d3-6053842b5ea8 req-9390e227-e436-49c6-84d8-7a5de7509829 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received event network-vif-unplugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:49:05 np0005593234 nova_compute[227762]: 2026-01-23 10:49:05.023 227766 DEBUG oslo_concurrency.lockutils [req-9591fc0d-b780-4242-96d3-6053842b5ea8 req-9390e227-e436-49c6-84d8-7a5de7509829 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:49:05 np0005593234 nova_compute[227762]: 2026-01-23 10:49:05.023 227766 DEBUG oslo_concurrency.lockutils [req-9591fc0d-b780-4242-96d3-6053842b5ea8 req-9390e227-e436-49c6-84d8-7a5de7509829 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:49:05 np0005593234 nova_compute[227762]: 2026-01-23 10:49:05.023 227766 DEBUG oslo_concurrency.lockutils [req-9591fc0d-b780-4242-96d3-6053842b5ea8 req-9390e227-e436-49c6-84d8-7a5de7509829 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:49:05 np0005593234 nova_compute[227762]: 2026-01-23 10:49:05.023 227766 DEBUG nova.compute.manager [req-9591fc0d-b780-4242-96d3-6053842b5ea8 req-9390e227-e436-49c6-84d8-7a5de7509829 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] No waiting events found dispatching network-vif-unplugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:49:05 np0005593234 nova_compute[227762]: 2026-01-23 10:49:05.024 227766 DEBUG nova.compute.manager [req-9591fc0d-b780-4242-96d3-6053842b5ea8 req-9390e227-e436-49c6-84d8-7a5de7509829 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received event network-vif-unplugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:49:05 np0005593234 nova_compute[227762]: 2026-01-23 10:49:05.062 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:05.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:05.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:07 np0005593234 nova_compute[227762]: 2026-01-23 10:49:07.168 227766 DEBUG nova.compute.manager [req-11a6ed81-040d-49ca-a091-b3282c060631 req-41d1d6c5-062f-4acb-ba2a-70db71ffbbca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received event network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:49:07 np0005593234 nova_compute[227762]: 2026-01-23 10:49:07.168 227766 DEBUG oslo_concurrency.lockutils [req-11a6ed81-040d-49ca-a091-b3282c060631 req-41d1d6c5-062f-4acb-ba2a-70db71ffbbca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:49:07 np0005593234 nova_compute[227762]: 2026-01-23 10:49:07.168 227766 DEBUG oslo_concurrency.lockutils [req-11a6ed81-040d-49ca-a091-b3282c060631 req-41d1d6c5-062f-4acb-ba2a-70db71ffbbca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:49:07 np0005593234 nova_compute[227762]: 2026-01-23 10:49:07.169 227766 DEBUG oslo_concurrency.lockutils [req-11a6ed81-040d-49ca-a091-b3282c060631 req-41d1d6c5-062f-4acb-ba2a-70db71ffbbca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:49:07 np0005593234 nova_compute[227762]: 2026-01-23 10:49:07.169 227766 DEBUG nova.compute.manager [req-11a6ed81-040d-49ca-a091-b3282c060631 req-41d1d6c5-062f-4acb-ba2a-70db71ffbbca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] No waiting events found dispatching network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:49:07 np0005593234 nova_compute[227762]: 2026-01-23 10:49:07.169 227766 WARNING nova.compute.manager [req-11a6ed81-040d-49ca-a091-b3282c060631 req-41d1d6c5-062f-4acb-ba2a-70db71ffbbca 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received unexpected event network-vif-plugged-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf for instance with vm_state active and task_state deleting.#033[00m
Jan 23 05:49:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:07.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:07 np0005593234 nova_compute[227762]: 2026-01-23 10:49:07.653 227766 DEBUG nova.network.neutron [-] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:49:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:07.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:07 np0005593234 nova_compute[227762]: 2026-01-23 10:49:07.670 227766 INFO nova.compute.manager [-] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Took 2.93 seconds to deallocate network for instance.#033[00m
Jan 23 05:49:07 np0005593234 nova_compute[227762]: 2026-01-23 10:49:07.743 227766 DEBUG oslo_concurrency.lockutils [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:49:07 np0005593234 nova_compute[227762]: 2026-01-23 10:49:07.744 227766 DEBUG oslo_concurrency.lockutils [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:49:07 np0005593234 nova_compute[227762]: 2026-01-23 10:49:07.812 227766 DEBUG oslo_concurrency.processutils [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:49:07 np0005593234 nova_compute[227762]: 2026-01-23 10:49:07.860 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:07.862 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=84, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=83) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:49:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:07.863 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:49:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:07.866 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '84'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:49:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:49:08 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2601296955' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:49:08 np0005593234 nova_compute[227762]: 2026-01-23 10:49:08.233 227766 DEBUG oslo_concurrency.processutils [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.421s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:49:08 np0005593234 nova_compute[227762]: 2026-01-23 10:49:08.240 227766 DEBUG nova.compute.provider_tree [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:49:08 np0005593234 nova_compute[227762]: 2026-01-23 10:49:08.274 227766 DEBUG nova.scheduler.client.report [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:49:08 np0005593234 nova_compute[227762]: 2026-01-23 10:49:08.309 227766 DEBUG oslo_concurrency.lockutils [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.565s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:49:08 np0005593234 nova_compute[227762]: 2026-01-23 10:49:08.375 227766 INFO nova.scheduler.client.report [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Deleted allocations for instance 1f16f1e6-2ac3-4547-84bf-103e4be39e3a#033[00m
Jan 23 05:49:08 np0005593234 nova_compute[227762]: 2026-01-23 10:49:08.459 227766 DEBUG oslo_concurrency.lockutils [None req-1887987d-ed41-4ed0-b735-087a2b196ed2 296341ffca2441dc807d285fa14c966d 36d7e7c7ddbd4cf785fafd0d35b0a2d8 - - default default] Lock "1f16f1e6-2ac3-4547-84bf-103e4be39e3a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.510s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:49:09 np0005593234 nova_compute[227762]: 2026-01-23 10:49:09.239 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:49:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:09.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:49:09 np0005593234 nova_compute[227762]: 2026-01-23 10:49:09.319 227766 DEBUG nova.compute.manager [req-88a285a9-08f0-43f8-ad87-24fd826fce7f req-b8f1f912-c883-4b8b-b845-ac79e0da2041 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Received event network-vif-deleted-9acbc2f5-e7f6-4b5e-8799-c611ad3392bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:49:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:49:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:09.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #181. Immutable memtables: 0.
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.672402) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 115] Flushing memtable with next log file: 181
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165349672598, "job": 115, "event": "flush_started", "num_memtables": 1, "num_entries": 2095, "num_deletes": 254, "total_data_size": 4893896, "memory_usage": 4965472, "flush_reason": "Manual Compaction"}
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 115] Level-0 flush table #182: started
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165349688762, "cf_name": "default", "job": 115, "event": "table_file_creation", "file_number": 182, "file_size": 1972973, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 86638, "largest_seqno": 88728, "table_properties": {"data_size": 1966413, "index_size": 3441, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 17287, "raw_average_key_size": 21, "raw_value_size": 1951870, "raw_average_value_size": 2409, "num_data_blocks": 152, "num_entries": 810, "num_filter_entries": 810, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165178, "oldest_key_time": 1769165178, "file_creation_time": 1769165349, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 182, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 115] Flush lasted 16338 microseconds, and 5366 cpu microseconds.
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.688859) [db/flush_job.cc:967] [default] [JOB 115] Level-0 flush table #182: 1972973 bytes OK
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.688886) [db/memtable_list.cc:519] [default] Level-0 commit table #182 started
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.691353) [db/memtable_list.cc:722] [default] Level-0 commit table #182: memtable #1 done
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.691378) EVENT_LOG_v1 {"time_micros": 1769165349691370, "job": 115, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.691400) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 115] Try to delete WAL files size 4884425, prev total WAL file size 4884425, number of live WAL files 2.
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000178.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.693544) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033303039' seq:72057594037927935, type:22 .. '6D6772737461740033323630' seq:0, type:0; will stop at (end)
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 116] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 115 Base level 0, inputs: [182(1926KB)], [180(12MB)]
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165349693665, "job": 116, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [182], "files_L6": [180], "score": -1, "input_data_size": 15008928, "oldest_snapshot_seqno": -1}
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 116] Generated table #183: 10766 keys, 12381831 bytes, temperature: kUnknown
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165349770412, "cf_name": "default", "job": 116, "event": "table_file_creation", "file_number": 183, "file_size": 12381831, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12314808, "index_size": 39006, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26949, "raw_key_size": 283223, "raw_average_key_size": 26, "raw_value_size": 12129069, "raw_average_value_size": 1126, "num_data_blocks": 1483, "num_entries": 10766, "num_filter_entries": 10766, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769165349, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 183, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.770719) [db/compaction/compaction_job.cc:1663] [default] [JOB 116] Compacted 1@0 + 1@6 files to L6 => 12381831 bytes
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.771940) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 195.4 rd, 161.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.4 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(13.9) write-amplify(6.3) OK, records in: 11212, records dropped: 446 output_compression: NoCompression
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.771955) EVENT_LOG_v1 {"time_micros": 1769165349771948, "job": 116, "event": "compaction_finished", "compaction_time_micros": 76825, "compaction_time_cpu_micros": 29418, "output_level": 6, "num_output_files": 1, "total_output_size": 12381831, "num_input_records": 11212, "num_output_records": 10766, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000182.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165349772378, "job": 116, "event": "table_file_deletion", "file_number": 182}
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000180.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165349774402, "job": 116, "event": "table_file_deletion", "file_number": 180}
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.693457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.774449) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.774453) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.774454) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.774456) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:09 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:09.774457) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:10 np0005593234 nova_compute[227762]: 2026-01-23 10:49:10.064 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:11.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:49:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:11.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:49:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:49:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2541801108' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:49:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:49:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2541801108' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:49:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:12 np0005593234 nova_compute[227762]: 2026-01-23 10:49:12.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:12 np0005593234 nova_compute[227762]: 2026-01-23 10:49:12.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:49:12 np0005593234 nova_compute[227762]: 2026-01-23 10:49:12.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:49:12 np0005593234 nova_compute[227762]: 2026-01-23 10:49:12.767 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:49:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:13.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:13.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:13 np0005593234 podman[328961]: 2026-01-23 10:49:13.789186681 +0000 UTC m=+0.072354025 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:49:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:49:13 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/124892785' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:49:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:49:13 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/124892785' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:49:14 np0005593234 nova_compute[227762]: 2026-01-23 10:49:14.241 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:14 np0005593234 nova_compute[227762]: 2026-01-23 10:49:14.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:14 np0005593234 nova_compute[227762]: 2026-01-23 10:49:14.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:14 np0005593234 nova_compute[227762]: 2026-01-23 10:49:14.775 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:49:14 np0005593234 nova_compute[227762]: 2026-01-23 10:49:14.776 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:49:14 np0005593234 nova_compute[227762]: 2026-01-23 10:49:14.776 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:49:14 np0005593234 nova_compute[227762]: 2026-01-23 10:49:14.776 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:49:14 np0005593234 nova_compute[227762]: 2026-01-23 10:49:14.777 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:49:15 np0005593234 nova_compute[227762]: 2026-01-23 10:49:15.066 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:49:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2207027115' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:49:15 np0005593234 nova_compute[227762]: 2026-01-23 10:49:15.200 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:49:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:15.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:15 np0005593234 nova_compute[227762]: 2026-01-23 10:49:15.358 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:49:15 np0005593234 nova_compute[227762]: 2026-01-23 10:49:15.359 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4123MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:49:15 np0005593234 nova_compute[227762]: 2026-01-23 10:49:15.359 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:49:15 np0005593234 nova_compute[227762]: 2026-01-23 10:49:15.360 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:49:15 np0005593234 nova_compute[227762]: 2026-01-23 10:49:15.442 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:49:15 np0005593234 nova_compute[227762]: 2026-01-23 10:49:15.442 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:49:15 np0005593234 nova_compute[227762]: 2026-01-23 10:49:15.510 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:49:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:15.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:49:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1775302711' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:49:15 np0005593234 nova_compute[227762]: 2026-01-23 10:49:15.948 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:49:15 np0005593234 nova_compute[227762]: 2026-01-23 10:49:15.953 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:49:15 np0005593234 nova_compute[227762]: 2026-01-23 10:49:15.984 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:49:16 np0005593234 nova_compute[227762]: 2026-01-23 10:49:16.018 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:49:16 np0005593234 nova_compute[227762]: 2026-01-23 10:49:16.018 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.659s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:49:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:17.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:17.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:19 np0005593234 nova_compute[227762]: 2026-01-23 10:49:19.197 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165344.196047, 1f16f1e6-2ac3-4547-84bf-103e4be39e3a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:49:19 np0005593234 nova_compute[227762]: 2026-01-23 10:49:19.198 227766 INFO nova.compute.manager [-] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:49:19 np0005593234 nova_compute[227762]: 2026-01-23 10:49:19.244 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:19.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:19 np0005593234 nova_compute[227762]: 2026-01-23 10:49:19.639 227766 DEBUG nova.compute.manager [None req-debff924-755e-419c-ae5a-4d4f36205e9a - - - - - -] [instance: 1f16f1e6-2ac3-4547-84bf-103e4be39e3a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:49:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:49:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:19.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:49:20 np0005593234 nova_compute[227762]: 2026-01-23 10:49:20.068 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:21 np0005593234 nova_compute[227762]: 2026-01-23 10:49:21.018 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:21 np0005593234 nova_compute[227762]: 2026-01-23 10:49:21.019 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:21 np0005593234 nova_compute[227762]: 2026-01-23 10:49:21.020 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:49:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:21.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:21.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:21 np0005593234 nova_compute[227762]: 2026-01-23 10:49:21.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:22 np0005593234 nova_compute[227762]: 2026-01-23 10:49:22.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:22 np0005593234 nova_compute[227762]: 2026-01-23 10:49:22.775 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:22 np0005593234 nova_compute[227762]: 2026-01-23 10:49:22.851 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:22 np0005593234 podman[329033]: 2026-01-23 10:49:22.853376066 +0000 UTC m=+0.148474999 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 05:49:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:23.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:23.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:24 np0005593234 nova_compute[227762]: 2026-01-23 10:49:24.294 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:25 np0005593234 nova_compute[227762]: 2026-01-23 10:49:25.069 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:25.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:25.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:27.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:49:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:27.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:49:29 np0005593234 nova_compute[227762]: 2026-01-23 10:49:29.296 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:29.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:29.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:29 np0005593234 nova_compute[227762]: 2026-01-23 10:49:29.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:30 np0005593234 nova_compute[227762]: 2026-01-23 10:49:30.070 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:31.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:31.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:49:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:33.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:49:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:49:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:33.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:49:34 np0005593234 nova_compute[227762]: 2026-01-23 10:49:34.298 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:35 np0005593234 nova_compute[227762]: 2026-01-23 10:49:35.075 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:49:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:35.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:49:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:35.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:49:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:37.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:49:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e415 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:37.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:39 np0005593234 nova_compute[227762]: 2026-01-23 10:49:39.303 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:49:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:39.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:49:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:39.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:40 np0005593234 nova_compute[227762]: 2026-01-23 10:49:40.075 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:49:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:41.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e416 e416: 3 total, 3 up, 3 in
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #184. Immutable memtables: 0.
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.703247) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 117] Flushing memtable with next log file: 184
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165381703298, "job": 117, "event": "flush_started", "num_memtables": 1, "num_entries": 569, "num_deletes": 251, "total_data_size": 836031, "memory_usage": 847112, "flush_reason": "Manual Compaction"}
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 117] Level-0 flush table #185: started
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165381709607, "cf_name": "default", "job": 117, "event": "table_file_creation", "file_number": 185, "file_size": 551401, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 88733, "largest_seqno": 89297, "table_properties": {"data_size": 548474, "index_size": 898, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 965, "raw_key_size": 7009, "raw_average_key_size": 19, "raw_value_size": 542600, "raw_average_value_size": 1478, "num_data_blocks": 40, "num_entries": 367, "num_filter_entries": 367, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165349, "oldest_key_time": 1769165349, "file_creation_time": 1769165381, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 185, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 117] Flush lasted 6391 microseconds, and 2167 cpu microseconds.
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:49:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:41.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.709648) [db/flush_job.cc:967] [default] [JOB 117] Level-0 flush table #185: 551401 bytes OK
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.709661) [db/memtable_list.cc:519] [default] Level-0 commit table #185 started
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.711843) [db/memtable_list.cc:722] [default] Level-0 commit table #185: memtable #1 done
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.711855) EVENT_LOG_v1 {"time_micros": 1769165381711851, "job": 117, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.711871) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 117] Try to delete WAL files size 832769, prev total WAL file size 832769, number of live WAL files 2.
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000181.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.712414) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730037373831' seq:72057594037927935, type:22 .. '7061786F730038303333' seq:0, type:0; will stop at (end)
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 118] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 117 Base level 0, inputs: [185(538KB)], [183(11MB)]
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165381712772, "job": 118, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [185], "files_L6": [183], "score": -1, "input_data_size": 12933232, "oldest_snapshot_seqno": -1}
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 118] Generated table #186: 10619 keys, 11027119 bytes, temperature: kUnknown
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165381795310, "cf_name": "default", "job": 118, "event": "table_file_creation", "file_number": 186, "file_size": 11027119, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 10962266, "index_size": 37220, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26565, "raw_key_size": 280870, "raw_average_key_size": 26, "raw_value_size": 10780247, "raw_average_value_size": 1015, "num_data_blocks": 1401, "num_entries": 10619, "num_filter_entries": 10619, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769165381, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 186, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.795597) [db/compaction/compaction_job.cc:1663] [default] [JOB 118] Compacted 1@0 + 1@6 files to L6 => 11027119 bytes
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.797076) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 156.7 rd, 133.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.5, 11.8 +0.0 blob) out(10.5 +0.0 blob), read-write-amplify(43.5) write-amplify(20.0) OK, records in: 11133, records dropped: 514 output_compression: NoCompression
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.797092) EVENT_LOG_v1 {"time_micros": 1769165381797084, "job": 118, "event": "compaction_finished", "compaction_time_micros": 82543, "compaction_time_cpu_micros": 42352, "output_level": 6, "num_output_files": 1, "total_output_size": 11027119, "num_input_records": 11133, "num_output_records": 10619, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000185.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165381797288, "job": 118, "event": "table_file_deletion", "file_number": 185}
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000183.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165381799412, "job": 118, "event": "table_file_deletion", "file_number": 183}
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.712263) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.799530) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.799538) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.799542) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.799546) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:41 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:49:41.799550) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:49:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e416 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e417 e417: 3 total, 3 up, 3 in
Jan 23 05:49:42 np0005593234 nova_compute[227762]: 2026-01-23 10:49:42.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:42.891 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:49:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:42.892 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:49:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:49:42.892 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:49:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:49:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:43.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:49:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:43.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:44 np0005593234 nova_compute[227762]: 2026-01-23 10:49:44.306 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:49:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1059601420' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:49:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:49:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1059601420' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:49:44 np0005593234 podman[329122]: 2026-01-23 10:49:44.79915884 +0000 UTC m=+0.085903240 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 05:49:45 np0005593234 nova_compute[227762]: 2026-01-23 10:49:45.077 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:49:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:45.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:49:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:49:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:45.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:49:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:49:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:47.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:49:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e417 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:49:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:47.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:49:49 np0005593234 nova_compute[227762]: 2026-01-23 10:49:49.308 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:49:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:49.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:49:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 e418: 3 total, 3 up, 3 in
Jan 23 05:49:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:49.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:50 np0005593234 nova_compute[227762]: 2026-01-23 10:49:50.080 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:51.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:51.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:53.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:53.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:53 np0005593234 podman[329325]: 2026-01-23 10:49:53.790432993 +0000 UTC m=+0.090308472 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:49:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:49:53 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:49:54 np0005593234 nova_compute[227762]: 2026-01-23 10:49:54.310 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:49:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:49:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:49:55 np0005593234 nova_compute[227762]: 2026-01-23 10:49:55.082 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:55.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:55.730 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:55 np0005593234 nova_compute[227762]: 2026-01-23 10:49:55.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:49:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:57.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:49:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:57.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:59 np0005593234 nova_compute[227762]: 2026-01-23 10:49:59.312 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:49:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:49:59.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:49:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:49:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:49:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:49:59.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:00 np0005593234 nova_compute[227762]: 2026-01-23 10:50:00.085 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:00 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 05:50:00 np0005593234 ovn_controller[134547]: 2026-01-23T10:50:00Z|00895|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Jan 23 05:50:00 np0005593234 nova_compute[227762]: 2026-01-23 10:50:00.763 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:00 np0005593234 nova_compute[227762]: 2026-01-23 10:50:00.763 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:50:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:01.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:50:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:01.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:50:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:50:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:50:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:50:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1812711914' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:50:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:50:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1812711914' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:50:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:03.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:50:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:03.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:50:04 np0005593234 nova_compute[227762]: 2026-01-23 10:50:04.316 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:05 np0005593234 nova_compute[227762]: 2026-01-23 10:50:05.118 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:05.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:05.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:50:06 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1738129310' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:50:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:50:06 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1738129310' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:50:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:50:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:07.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:50:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:07.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:07 np0005593234 nova_compute[227762]: 2026-01-23 10:50:07.777 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:07 np0005593234 nova_compute[227762]: 2026-01-23 10:50:07.778 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:50:07 np0005593234 nova_compute[227762]: 2026-01-23 10:50:07.797 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:50:09 np0005593234 nova_compute[227762]: 2026-01-23 10:50:09.318 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:09.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:50:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:09.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:50:10 np0005593234 nova_compute[227762]: 2026-01-23 10:50:10.117 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:50:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:11.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:50:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:11.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:13.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:50:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:13.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:50:13 np0005593234 nova_compute[227762]: 2026-01-23 10:50:13.766 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:13 np0005593234 nova_compute[227762]: 2026-01-23 10:50:13.766 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:50:13 np0005593234 nova_compute[227762]: 2026-01-23 10:50:13.766 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:50:13 np0005593234 nova_compute[227762]: 2026-01-23 10:50:13.788 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:50:14 np0005593234 nova_compute[227762]: 2026-01-23 10:50:14.320 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:14 np0005593234 nova_compute[227762]: 2026-01-23 10:50:14.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:14 np0005593234 nova_compute[227762]: 2026-01-23 10:50:14.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:15 np0005593234 nova_compute[227762]: 2026-01-23 10:50:15.119 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:50:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:15.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:50:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:15.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:15 np0005593234 podman[329463]: 2026-01-23 10:50:15.800462398 +0000 UTC m=+0.084505646 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 05:50:15 np0005593234 nova_compute[227762]: 2026-01-23 10:50:15.831 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:50:15 np0005593234 nova_compute[227762]: 2026-01-23 10:50:15.831 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:50:15 np0005593234 nova_compute[227762]: 2026-01-23 10:50:15.832 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:50:15 np0005593234 nova_compute[227762]: 2026-01-23 10:50:15.832 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:50:15 np0005593234 nova_compute[227762]: 2026-01-23 10:50:15.833 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:50:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:50:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3392730987' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:50:16 np0005593234 nova_compute[227762]: 2026-01-23 10:50:16.261 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:50:16 np0005593234 nova_compute[227762]: 2026-01-23 10:50:16.420 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:50:16 np0005593234 nova_compute[227762]: 2026-01-23 10:50:16.421 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4140MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:50:16 np0005593234 nova_compute[227762]: 2026-01-23 10:50:16.421 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:50:16 np0005593234 nova_compute[227762]: 2026-01-23 10:50:16.422 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:50:16 np0005593234 nova_compute[227762]: 2026-01-23 10:50:16.514 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:50:16 np0005593234 nova_compute[227762]: 2026-01-23 10:50:16.514 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:50:16 np0005593234 nova_compute[227762]: 2026-01-23 10:50:16.535 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:50:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:50:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/595081027' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:50:16 np0005593234 nova_compute[227762]: 2026-01-23 10:50:16.950 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.414s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:50:16 np0005593234 nova_compute[227762]: 2026-01-23 10:50:16.958 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:50:16 np0005593234 nova_compute[227762]: 2026-01-23 10:50:16.981 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:50:16 np0005593234 nova_compute[227762]: 2026-01-23 10:50:16.982 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:50:16 np0005593234 nova_compute[227762]: 2026-01-23 10:50:16.983 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:50:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:17.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:17.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:19 np0005593234 nova_compute[227762]: 2026-01-23 10:50:19.323 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:19.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:19.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:20 np0005593234 nova_compute[227762]: 2026-01-23 10:50:20.121 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:21.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:21.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:21 np0005593234 nova_compute[227762]: 2026-01-23 10:50:21.983 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:21 np0005593234 nova_compute[227762]: 2026-01-23 10:50:21.983 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:21 np0005593234 nova_compute[227762]: 2026-01-23 10:50:21.983 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:50:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:22 np0005593234 nova_compute[227762]: 2026-01-23 10:50:22.740 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:22 np0005593234 nova_compute[227762]: 2026-01-23 10:50:22.765 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:22 np0005593234 nova_compute[227762]: 2026-01-23 10:50:22.916 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:50:22.916 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=85, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=84) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:50:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:50:22.917 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:50:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:23.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:23 np0005593234 nova_compute[227762]: 2026-01-23 10:50:23.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:50:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:23.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:50:24 np0005593234 nova_compute[227762]: 2026-01-23 10:50:24.325 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:24 np0005593234 podman[329531]: 2026-01-23 10:50:24.807624422 +0000 UTC m=+0.101067678 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:50:25 np0005593234 nova_compute[227762]: 2026-01-23 10:50:25.124 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:25.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:25.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:27.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:27.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:50:28.919 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '85'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:50:29 np0005593234 nova_compute[227762]: 2026-01-23 10:50:29.328 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:50:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:29.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:50:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:29.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:30 np0005593234 nova_compute[227762]: 2026-01-23 10:50:30.126 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:30 np0005593234 nova_compute[227762]: 2026-01-23 10:50:30.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:50:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:31.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:50:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:31.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:33.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:33.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:34 np0005593234 nova_compute[227762]: 2026-01-23 10:50:34.330 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:35 np0005593234 nova_compute[227762]: 2026-01-23 10:50:35.130 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:35.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:35.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:37.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:37.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:39 np0005593234 nova_compute[227762]: 2026-01-23 10:50:39.389 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:39.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:50:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:39.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:50:40 np0005593234 nova_compute[227762]: 2026-01-23 10:50:40.132 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:41.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:41.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:42 np0005593234 nova_compute[227762]: 2026-01-23 10:50:42.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:50:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:50:42.893 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:50:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:50:42.894 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:50:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:50:42.894 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:50:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:43.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:50:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:43.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:50:44 np0005593234 nova_compute[227762]: 2026-01-23 10:50:44.391 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:45 np0005593234 nova_compute[227762]: 2026-01-23 10:50:45.134 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:45.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:45.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:46 np0005593234 podman[329620]: 2026-01-23 10:50:46.754727696 +0000 UTC m=+0.049257313 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 05:50:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:47.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:47.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:49 np0005593234 nova_compute[227762]: 2026-01-23 10:50:49.393 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:49.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:50:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:49.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:50:50 np0005593234 nova_compute[227762]: 2026-01-23 10:50:50.136 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:51.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:50:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:51.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:50:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:53.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:53 np0005593234 ovn_controller[134547]: 2026-01-23T10:50:53Z|00896|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Jan 23 05:50:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:53.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:54 np0005593234 nova_compute[227762]: 2026-01-23 10:50:54.395 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:55 np0005593234 nova_compute[227762]: 2026-01-23 10:50:55.138 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:55 np0005593234 nova_compute[227762]: 2026-01-23 10:50:55.239 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Acquiring lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:50:55 np0005593234 nova_compute[227762]: 2026-01-23 10:50:55.239 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:50:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:55.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:55.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:55 np0005593234 podman[329693]: 2026-01-23 10:50:55.812742922 +0000 UTC m=+0.101395229 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:50:56 np0005593234 nova_compute[227762]: 2026-01-23 10:50:56.232 227766 DEBUG nova.compute.manager [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:50:56 np0005593234 nova_compute[227762]: 2026-01-23 10:50:56.324 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:50:56 np0005593234 nova_compute[227762]: 2026-01-23 10:50:56.325 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:50:56 np0005593234 nova_compute[227762]: 2026-01-23 10:50:56.340 227766 DEBUG nova.virt.hardware [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:50:56 np0005593234 nova_compute[227762]: 2026-01-23 10:50:56.341 227766 INFO nova.compute.claims [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:50:56 np0005593234 nova_compute[227762]: 2026-01-23 10:50:56.556 227766 DEBUG oslo_concurrency.processutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:50:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:50:56 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1553193866' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:50:56 np0005593234 nova_compute[227762]: 2026-01-23 10:50:56.999 227766 DEBUG oslo_concurrency.processutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:50:57 np0005593234 nova_compute[227762]: 2026-01-23 10:50:57.006 227766 DEBUG nova.compute.provider_tree [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:50:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:57.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:50:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:57.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:57 np0005593234 nova_compute[227762]: 2026-01-23 10:50:57.941 227766 DEBUG nova.scheduler.client.report [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.006 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.681s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.007 227766 DEBUG nova.compute.manager [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.128 227766 DEBUG nova.compute.manager [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.128 227766 DEBUG nova.network.neutron [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.230 227766 INFO nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.251 227766 DEBUG nova.compute.manager [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.472 227766 DEBUG nova.compute.manager [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.473 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.474 227766 INFO nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Creating image(s)#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.505 227766 DEBUG nova.storage.rbd_utils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] rbd image cad430d0-9af9-46f1-ad8b-38438fc2030b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.533 227766 DEBUG nova.storage.rbd_utils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] rbd image cad430d0-9af9-46f1-ad8b-38438fc2030b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.561 227766 DEBUG nova.storage.rbd_utils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] rbd image cad430d0-9af9-46f1-ad8b-38438fc2030b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.565 227766 DEBUG oslo_concurrency.processutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.631 227766 DEBUG oslo_concurrency.processutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.633 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.634 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.635 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.669 227766 DEBUG nova.storage.rbd_utils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] rbd image cad430d0-9af9-46f1-ad8b-38438fc2030b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.673 227766 DEBUG oslo_concurrency.processutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 cad430d0-9af9-46f1-ad8b-38438fc2030b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.948 227766 DEBUG nova.policy [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '7d7e6f562c9d4d81bf1f8d5462870e30', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9533be9d361246bdb0a7c1bd3015db66', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:50:58 np0005593234 nova_compute[227762]: 2026-01-23 10:50:58.983 227766 DEBUG oslo_concurrency.processutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 cad430d0-9af9-46f1-ad8b-38438fc2030b_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.310s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:50:59 np0005593234 nova_compute[227762]: 2026-01-23 10:50:59.051 227766 DEBUG nova.storage.rbd_utils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] resizing rbd image cad430d0-9af9-46f1-ad8b-38438fc2030b_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:50:59 np0005593234 nova_compute[227762]: 2026-01-23 10:50:59.157 227766 DEBUG nova.objects.instance [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lazy-loading 'migration_context' on Instance uuid cad430d0-9af9-46f1-ad8b-38438fc2030b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:50:59 np0005593234 nova_compute[227762]: 2026-01-23 10:50:59.397 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:50:59 np0005593234 nova_compute[227762]: 2026-01-23 10:50:59.450 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:50:59 np0005593234 nova_compute[227762]: 2026-01-23 10:50:59.451 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Ensure instance console log exists: /var/lib/nova/instances/cad430d0-9af9-46f1-ad8b-38438fc2030b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:50:59 np0005593234 nova_compute[227762]: 2026-01-23 10:50:59.452 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:50:59 np0005593234 nova_compute[227762]: 2026-01-23 10:50:59.452 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:50:59 np0005593234 nova_compute[227762]: 2026-01-23 10:50:59.453 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:50:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:50:59.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:50:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:50:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:50:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:50:59.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:00 np0005593234 nova_compute[227762]: 2026-01-23 10:51:00.139 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:01.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:01.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:03.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:03.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:04 np0005593234 nova_compute[227762]: 2026-01-23 10:51:04.233 227766 DEBUG nova.network.neutron [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Successfully created port: 5da2fb68-f183-46ec-b307-762bc7c0eae1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:51:04 np0005593234 nova_compute[227762]: 2026-01-23 10:51:04.400 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:05 np0005593234 nova_compute[227762]: 2026-01-23 10:51:05.140 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:51:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:51:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:51:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:51:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:05.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:51:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:05.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:51:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:51:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:07.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:07.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:08 np0005593234 nova_compute[227762]: 2026-01-23 10:51:08.817 227766 DEBUG nova.network.neutron [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Successfully updated port: 5da2fb68-f183-46ec-b307-762bc7c0eae1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:51:09 np0005593234 nova_compute[227762]: 2026-01-23 10:51:09.019 227766 DEBUG nova.compute.manager [req-ea8fa8c6-4d41-4e1e-a304-4c05cd66a3d1 req-b48b1a7d-ef3b-4cd8-b500-5c70af5edfef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Received event network-changed-5da2fb68-f183-46ec-b307-762bc7c0eae1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:51:09 np0005593234 nova_compute[227762]: 2026-01-23 10:51:09.019 227766 DEBUG nova.compute.manager [req-ea8fa8c6-4d41-4e1e-a304-4c05cd66a3d1 req-b48b1a7d-ef3b-4cd8-b500-5c70af5edfef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Refreshing instance network info cache due to event network-changed-5da2fb68-f183-46ec-b307-762bc7c0eae1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:51:09 np0005593234 nova_compute[227762]: 2026-01-23 10:51:09.020 227766 DEBUG oslo_concurrency.lockutils [req-ea8fa8c6-4d41-4e1e-a304-4c05cd66a3d1 req-b48b1a7d-ef3b-4cd8-b500-5c70af5edfef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-cad430d0-9af9-46f1-ad8b-38438fc2030b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:51:09 np0005593234 nova_compute[227762]: 2026-01-23 10:51:09.020 227766 DEBUG oslo_concurrency.lockutils [req-ea8fa8c6-4d41-4e1e-a304-4c05cd66a3d1 req-b48b1a7d-ef3b-4cd8-b500-5c70af5edfef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-cad430d0-9af9-46f1-ad8b-38438fc2030b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:51:09 np0005593234 nova_compute[227762]: 2026-01-23 10:51:09.020 227766 DEBUG nova.network.neutron [req-ea8fa8c6-4d41-4e1e-a304-4c05cd66a3d1 req-b48b1a7d-ef3b-4cd8-b500-5c70af5edfef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Refreshing network info cache for port 5da2fb68-f183-46ec-b307-762bc7c0eae1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:51:09 np0005593234 nova_compute[227762]: 2026-01-23 10:51:09.071 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Acquiring lock "refresh_cache-cad430d0-9af9-46f1-ad8b-38438fc2030b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:51:09 np0005593234 nova_compute[227762]: 2026-01-23 10:51:09.320 227766 DEBUG nova.network.neutron [req-ea8fa8c6-4d41-4e1e-a304-4c05cd66a3d1 req-b48b1a7d-ef3b-4cd8-b500-5c70af5edfef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:51:09 np0005593234 nova_compute[227762]: 2026-01-23 10:51:09.437 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:09.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:09.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:10 np0005593234 nova_compute[227762]: 2026-01-23 10:51:10.015 227766 DEBUG nova.network.neutron [req-ea8fa8c6-4d41-4e1e-a304-4c05cd66a3d1 req-b48b1a7d-ef3b-4cd8-b500-5c70af5edfef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:51:10 np0005593234 nova_compute[227762]: 2026-01-23 10:51:10.142 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:10 np0005593234 nova_compute[227762]: 2026-01-23 10:51:10.532 227766 DEBUG oslo_concurrency.lockutils [req-ea8fa8c6-4d41-4e1e-a304-4c05cd66a3d1 req-b48b1a7d-ef3b-4cd8-b500-5c70af5edfef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-cad430d0-9af9-46f1-ad8b-38438fc2030b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:51:10 np0005593234 nova_compute[227762]: 2026-01-23 10:51:10.532 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Acquired lock "refresh_cache-cad430d0-9af9-46f1-ad8b-38438fc2030b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:51:10 np0005593234 nova_compute[227762]: 2026-01-23 10:51:10.533 227766 DEBUG nova.network.neutron [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:51:11 np0005593234 nova_compute[227762]: 2026-01-23 10:51:11.103 227766 DEBUG nova.network.neutron [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:51:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:11.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:51:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:11.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.042 227766 DEBUG nova.network.neutron [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Updating instance_info_cache with network_info: [{"id": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "address": "fa:16:3e:b5:5a:2a", "network": {"id": "fe1b8e52-7b7b-4d1d-a352-d131d1cac17f", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-1479617766-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9533be9d361246bdb0a7c1bd3015db66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5da2fb68-f1", "ovs_interfaceid": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.247 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Releasing lock "refresh_cache-cad430d0-9af9-46f1-ad8b-38438fc2030b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.247 227766 DEBUG nova.compute.manager [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Instance network_info: |[{"id": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "address": "fa:16:3e:b5:5a:2a", "network": {"id": "fe1b8e52-7b7b-4d1d-a352-d131d1cac17f", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-1479617766-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9533be9d361246bdb0a7c1bd3015db66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5da2fb68-f1", "ovs_interfaceid": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.249 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Start _get_guest_xml network_info=[{"id": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "address": "fa:16:3e:b5:5a:2a", "network": {"id": "fe1b8e52-7b7b-4d1d-a352-d131d1cac17f", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-1479617766-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9533be9d361246bdb0a7c1bd3015db66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5da2fb68-f1", "ovs_interfaceid": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.254 227766 WARNING nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.258 227766 DEBUG nova.virt.libvirt.host [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.258 227766 DEBUG nova.virt.libvirt.host [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.261 227766 DEBUG nova.virt.libvirt.host [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.261 227766 DEBUG nova.virt.libvirt.host [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.262 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.263 227766 DEBUG nova.virt.hardware [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.263 227766 DEBUG nova.virt.hardware [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.263 227766 DEBUG nova.virt.hardware [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.263 227766 DEBUG nova.virt.hardware [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.264 227766 DEBUG nova.virt.hardware [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.264 227766 DEBUG nova.virt.hardware [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.264 227766 DEBUG nova.virt.hardware [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.264 227766 DEBUG nova.virt.hardware [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.265 227766 DEBUG nova.virt.hardware [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.265 227766 DEBUG nova.virt.hardware [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.265 227766 DEBUG nova.virt.hardware [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.268 227766 DEBUG oslo_concurrency.processutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:51:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:51:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3482485221' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.726 227766 DEBUG oslo_concurrency.processutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.757 227766 DEBUG nova.storage.rbd_utils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] rbd image cad430d0-9af9-46f1-ad8b-38438fc2030b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:51:12 np0005593234 nova_compute[227762]: 2026-01-23 10:51:12.761 227766 DEBUG oslo_concurrency.processutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:51:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:51:13 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1334794948' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.210 227766 DEBUG oslo_concurrency.processutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.213 227766 DEBUG nova.virt.libvirt.vif [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:50:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestEncryptedCinderVolumes-server-2136626245',display_name='tempest-TestEncryptedCinderVolumes-server-2136626245',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testencryptedcindervolumes-server-2136626245',id=209,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCuiBqvEtsGgfmA0vuszrHVvw7K+RIn5/GzDuVpnZeUj7Bzvv0dWPbHfPCN50+GDI0uVcs7x3mMFsgg3h5hvujMZpusgdPvznJ8NRbHRE7JTX+XI36WbEmYI2og5OCDx2w==',key_name='tempest-keypair-225997112',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9533be9d361246bdb0a7c1bd3015db66',ramdisk_id='',reservation_id='r-d2hfsqgw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestEncryptedCinderVolumes-2052193645',owner_user_name='tempest-TestEncryptedCinderVolumes-2052193645-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:50:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='7d7e6f562c9d4d81bf1f8d5462870e30',uuid=cad430d0-9af9-46f1-ad8b-38438fc2030b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "address": "fa:16:3e:b5:5a:2a", "network": {"id": "fe1b8e52-7b7b-4d1d-a352-d131d1cac17f", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-1479617766-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9533be9d361246bdb0a7c1bd3015db66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5da2fb68-f1", "ovs_interfaceid": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.213 227766 DEBUG nova.network.os_vif_util [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Converting VIF {"id": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "address": "fa:16:3e:b5:5a:2a", "network": {"id": "fe1b8e52-7b7b-4d1d-a352-d131d1cac17f", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-1479617766-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9533be9d361246bdb0a7c1bd3015db66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5da2fb68-f1", "ovs_interfaceid": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.215 227766 DEBUG nova.network.os_vif_util [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:5a:2a,bridge_name='br-int',has_traffic_filtering=True,id=5da2fb68-f183-46ec-b307-762bc7c0eae1,network=Network(fe1b8e52-7b7b-4d1d-a352-d131d1cac17f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5da2fb68-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.216 227766 DEBUG nova.objects.instance [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lazy-loading 'pci_devices' on Instance uuid cad430d0-9af9-46f1-ad8b-38438fc2030b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.233 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  <uuid>cad430d0-9af9-46f1-ad8b-38438fc2030b</uuid>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  <name>instance-000000d1</name>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestEncryptedCinderVolumes-server-2136626245</nova:name>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:51:12</nova:creationTime>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <nova:user uuid="7d7e6f562c9d4d81bf1f8d5462870e30">tempest-TestEncryptedCinderVolumes-2052193645-project-member</nova:user>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <nova:project uuid="9533be9d361246bdb0a7c1bd3015db66">tempest-TestEncryptedCinderVolumes-2052193645</nova:project>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <nova:port uuid="5da2fb68-f183-46ec-b307-762bc7c0eae1">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <entry name="serial">cad430d0-9af9-46f1-ad8b-38438fc2030b</entry>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <entry name="uuid">cad430d0-9af9-46f1-ad8b-38438fc2030b</entry>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/cad430d0-9af9-46f1-ad8b-38438fc2030b_disk">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/cad430d0-9af9-46f1-ad8b-38438fc2030b_disk.config">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:b5:5a:2a"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <target dev="tap5da2fb68-f1"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/cad430d0-9af9-46f1-ad8b-38438fc2030b/console.log" append="off"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:51:13 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:51:13 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:51:13 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:51:13 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.235 227766 DEBUG nova.compute.manager [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Preparing to wait for external event network-vif-plugged-5da2fb68-f183-46ec-b307-762bc7c0eae1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.235 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Acquiring lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.235 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.236 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.237 227766 DEBUG nova.virt.libvirt.vif [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:50:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestEncryptedCinderVolumes-server-2136626245',display_name='tempest-TestEncryptedCinderVolumes-server-2136626245',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testencryptedcindervolumes-server-2136626245',id=209,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCuiBqvEtsGgfmA0vuszrHVvw7K+RIn5/GzDuVpnZeUj7Bzvv0dWPbHfPCN50+GDI0uVcs7x3mMFsgg3h5hvujMZpusgdPvznJ8NRbHRE7JTX+XI36WbEmYI2og5OCDx2w==',key_name='tempest-keypair-225997112',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9533be9d361246bdb0a7c1bd3015db66',ramdisk_id='',reservation_id='r-d2hfsqgw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestEncryptedCinderVolumes-2052193645',owner_user_name='tempest-TestEncryptedCinderVolumes-2052193645-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:50:58Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='7d7e6f562c9d4d81bf1f8d5462870e30',uuid=cad430d0-9af9-46f1-ad8b-38438fc2030b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "address": "fa:16:3e:b5:5a:2a", "network": {"id": "fe1b8e52-7b7b-4d1d-a352-d131d1cac17f", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-1479617766-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9533be9d361246bdb0a7c1bd3015db66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5da2fb68-f1", "ovs_interfaceid": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.237 227766 DEBUG nova.network.os_vif_util [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Converting VIF {"id": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "address": "fa:16:3e:b5:5a:2a", "network": {"id": "fe1b8e52-7b7b-4d1d-a352-d131d1cac17f", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-1479617766-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9533be9d361246bdb0a7c1bd3015db66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5da2fb68-f1", "ovs_interfaceid": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.238 227766 DEBUG nova.network.os_vif_util [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b5:5a:2a,bridge_name='br-int',has_traffic_filtering=True,id=5da2fb68-f183-46ec-b307-762bc7c0eae1,network=Network(fe1b8e52-7b7b-4d1d-a352-d131d1cac17f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5da2fb68-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.238 227766 DEBUG os_vif [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:5a:2a,bridge_name='br-int',has_traffic_filtering=True,id=5da2fb68-f183-46ec-b307-762bc7c0eae1,network=Network(fe1b8e52-7b7b-4d1d-a352-d131d1cac17f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5da2fb68-f1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.239 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.240 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.240 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.245 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.246 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5da2fb68-f1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.247 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5da2fb68-f1, col_values=(('external_ids', {'iface-id': '5da2fb68-f183-46ec-b307-762bc7c0eae1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b5:5a:2a', 'vm-uuid': 'cad430d0-9af9-46f1-ad8b-38438fc2030b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.249 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:13 np0005593234 NetworkManager[48942]: <info>  [1769165473.2500] manager: (tap5da2fb68-f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/422)
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.252 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.257 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.258 227766 INFO os_vif [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b5:5a:2a,bridge_name='br-int',has_traffic_filtering=True,id=5da2fb68-f183-46ec-b307-762bc7c0eae1,network=Network(fe1b8e52-7b7b-4d1d-a352-d131d1cac17f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5da2fb68-f1')#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.498 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.498 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.499 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] No VIF found with MAC fa:16:3e:b5:5a:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.499 227766 INFO nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Using config drive#033[00m
Jan 23 05:51:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:13.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.524 227766 DEBUG nova.storage.rbd_utils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] rbd image cad430d0-9af9-46f1-ad8b-38438fc2030b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:51:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:13.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.858 227766 INFO nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Creating config drive at /var/lib/nova/instances/cad430d0-9af9-46f1-ad8b-38438fc2030b/disk.config#033[00m
Jan 23 05:51:13 np0005593234 nova_compute[227762]: 2026-01-23 10:51:13.864 227766 DEBUG oslo_concurrency.processutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/cad430d0-9af9-46f1-ad8b-38438fc2030b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1eqj81o2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.003 227766 DEBUG oslo_concurrency.processutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/cad430d0-9af9-46f1-ad8b-38438fc2030b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1eqj81o2" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.037 227766 DEBUG nova.storage.rbd_utils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] rbd image cad430d0-9af9-46f1-ad8b-38438fc2030b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.041 227766 DEBUG oslo_concurrency.processutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/cad430d0-9af9-46f1-ad8b-38438fc2030b/disk.config cad430d0-9af9-46f1-ad8b-38438fc2030b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.212 227766 DEBUG oslo_concurrency.processutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/cad430d0-9af9-46f1-ad8b-38438fc2030b/disk.config cad430d0-9af9-46f1-ad8b-38438fc2030b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.171s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.213 227766 INFO nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Deleting local config drive /var/lib/nova/instances/cad430d0-9af9-46f1-ad8b-38438fc2030b/disk.config because it was imported into RBD.#033[00m
Jan 23 05:51:14 np0005593234 kernel: tap5da2fb68-f1: entered promiscuous mode
Jan 23 05:51:14 np0005593234 NetworkManager[48942]: <info>  [1769165474.2850] manager: (tap5da2fb68-f1): new Tun device (/org/freedesktop/NetworkManager/Devices/423)
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.286 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:51:14Z|00897|binding|INFO|Claiming lport 5da2fb68-f183-46ec-b307-762bc7c0eae1 for this chassis.
Jan 23 05:51:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:51:14Z|00898|binding|INFO|5da2fb68-f183-46ec-b307-762bc7c0eae1: Claiming fa:16:3e:b5:5a:2a 10.100.0.13
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.290 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.292 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.298 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:14 np0005593234 systemd-udevd[330234]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:51:14 np0005593234 systemd-machined[195626]: New machine qemu-101-instance-000000d1.
Jan 23 05:51:14 np0005593234 NetworkManager[48942]: <info>  [1769165474.3301] device (tap5da2fb68-f1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:51:14 np0005593234 NetworkManager[48942]: <info>  [1769165474.3310] device (tap5da2fb68-f1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:51:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:51:14Z|00899|binding|INFO|Setting lport 5da2fb68-f183-46ec-b307-762bc7c0eae1 ovn-installed in OVS
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.360 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:14 np0005593234 systemd[1]: Started Virtual Machine qemu-101-instance-000000d1.
Jan 23 05:51:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:51:14Z|00900|binding|INFO|Setting lport 5da2fb68-f183-46ec-b307-762bc7c0eae1 up in Southbound
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.544 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:5a:2a 10.100.0.13'], port_security=['fa:16:3e:b5:5a:2a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'cad430d0-9af9-46f1-ad8b-38438fc2030b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9533be9d361246bdb0a7c1bd3015db66', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b1bad509-a799-4dcc-92ea-816a14d07688', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80d47a5e-844e-4435-9eba-3ada7eceb5f3, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=5da2fb68-f183-46ec-b307-762bc7c0eae1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.545 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 5da2fb68-f183-46ec-b307-762bc7c0eae1 in datapath fe1b8e52-7b7b-4d1d-a352-d131d1cac17f bound to our chassis#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.546 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fe1b8e52-7b7b-4d1d-a352-d131d1cac17f#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.558 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d0ded41b-f762-48c1-a483-20d580bfaa73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.560 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfe1b8e52-71 in ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.562 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfe1b8e52-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.562 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[532b3b84-8178-4b8a-9e9b-f259c62cd63d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.563 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2d6ec175-a922-42ac-b5d2-2978e1145595]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.576 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[8eca494f-3531-4272-a78f-ae7dc7981482]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.594 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4792372b-eacf-4caf-8739-35361fde6b58]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.630 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[0e554729-9711-4607-8534-48c8ce32dd94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.635 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b218c4ee-9140-4a87-878d-5f93e145aed1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 NetworkManager[48942]: <info>  [1769165474.6374] manager: (tapfe1b8e52-70): new Veth device (/org/freedesktop/NetworkManager/Devices/424)
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.682 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[25c07a68-9c8b-40b7-91b8-c29c05402d36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.686 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[03144652-fa94-4205-8ef2-a573205b49d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 NetworkManager[48942]: <info>  [1769165474.7168] device (tapfe1b8e52-70): carrier: link connected
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.724 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d768102b-a13f-4e71-84bd-367fb27674de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.744 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e757e00a-034f-41f3-8469-a9f8ed44929e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe1b8e52-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:49:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 276], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 936967, 'reachable_time': 30257, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 330268, 'error': None, 'target': 'ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.769 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a6c8f5fc-71f9-4582-ba68-330252ebae30]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed2:49cc'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 936967, 'tstamp': 936967}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 330269, 'error': None, 'target': 'ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.772 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.773 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.774 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.795 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0d30bbe5-a5ba-4298-a24d-d1b5d15a3347]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfe1b8e52-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:49:cc'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 276], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 936967, 'reachable_time': 30257, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 330270, 'error': None, 'target': 'ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.831 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7eb75e8e-6e89-4389-b0cd-6cbf2a1fdc49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.911 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1a6ff6af-11fd-4c82-b34e-2d4a286356f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.914 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe1b8e52-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.915 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.915 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfe1b8e52-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:51:14 np0005593234 kernel: tapfe1b8e52-70: entered promiscuous mode
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.918 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:14 np0005593234 NetworkManager[48942]: <info>  [1769165474.9189] manager: (tapfe1b8e52-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/425)
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.925 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfe1b8e52-70, col_values=(('external_ids', {'iface-id': '61ecd62f-11f4-4883-a35f-e2717449d670'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:51:14 np0005593234 ovn_controller[134547]: 2026-01-23T10:51:14Z|00901|binding|INFO|Releasing lport 61ecd62f-11f4-4883-a35f-e2717449d670 from this chassis (sb_readonly=0)
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.928 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.929 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fe1b8e52-7b7b-4d1d-a352-d131d1cac17f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fe1b8e52-7b7b-4d1d-a352-d131d1cac17f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.930 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7772dd36-8576-4cd6-a05e-715be0461b33]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.931 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/fe1b8e52-7b7b-4d1d-a352-d131d1cac17f.pid.haproxy
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID fe1b8e52-7b7b-4d1d-a352-d131d1cac17f
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:51:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:14.932 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f', 'env', 'PROCESS_TAG=haproxy-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fe1b8e52-7b7b-4d1d-a352-d131d1cac17f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:51:14 np0005593234 nova_compute[227762]: 2026-01-23 10:51:14.948 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.131 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165475.1303375, cad430d0-9af9-46f1-ad8b-38438fc2030b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.132 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] VM Started (Lifecycle Event)#033[00m
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.145 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:51:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3611647336' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.225 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:51:15 np0005593234 podman[330367]: 2026-01-23 10:51:15.352443586 +0000 UTC m=+0.059338957 container create 604664d91fb471be029e89cc023c9c9842d09475d7fd4ffc1370d8efe0e963ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.381 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.385 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165475.1306083, cad430d0-9af9-46f1-ad8b-38438fc2030b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.386 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:51:15 np0005593234 systemd[1]: Started libpod-conmon-604664d91fb471be029e89cc023c9c9842d09475d7fd4ffc1370d8efe0e963ca.scope.
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.410 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.416 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:51:15 np0005593234 podman[330367]: 2026-01-23 10:51:15.322527346 +0000 UTC m=+0.029422707 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.419 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000d1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.419 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000d1 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 05:51:15 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:51:15 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38e543f590f045ac7cf898ae66a4fbe0e32725e16678206b45de7d0e930afc60/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:51:15 np0005593234 podman[330367]: 2026-01-23 10:51:15.446033923 +0000 UTC m=+0.152929284 container init 604664d91fb471be029e89cc023c9c9842d09475d7fd4ffc1370d8efe0e963ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:51:15 np0005593234 podman[330367]: 2026-01-23 10:51:15.451331583 +0000 UTC m=+0.158226914 container start 604664d91fb471be029e89cc023c9c9842d09475d7fd4ffc1370d8efe0e963ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.454 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:51:15 np0005593234 neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f[330382]: [NOTICE]   (330386) : New worker (330388) forked
Jan 23 05:51:15 np0005593234 neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f[330382]: [NOTICE]   (330386) : Loading success.
Jan 23 05:51:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:15.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.576 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.577 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4046MB free_disk=20.967525482177734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.577 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:51:15 np0005593234 nova_compute[227762]: 2026-01-23 10:51:15.577 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:51:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:15.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:51:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.250 227766 DEBUG nova.compute.manager [req-e34d0005-4e33-4721-9a55-9c30d2db22ce req-4682edaf-5127-42f9-8953-d996d00c09f3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Received event network-vif-plugged-5da2fb68-f183-46ec-b307-762bc7c0eae1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.251 227766 DEBUG oslo_concurrency.lockutils [req-e34d0005-4e33-4721-9a55-9c30d2db22ce req-4682edaf-5127-42f9-8953-d996d00c09f3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.251 227766 DEBUG oslo_concurrency.lockutils [req-e34d0005-4e33-4721-9a55-9c30d2db22ce req-4682edaf-5127-42f9-8953-d996d00c09f3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.252 227766 DEBUG oslo_concurrency.lockutils [req-e34d0005-4e33-4721-9a55-9c30d2db22ce req-4682edaf-5127-42f9-8953-d996d00c09f3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.252 227766 DEBUG nova.compute.manager [req-e34d0005-4e33-4721-9a55-9c30d2db22ce req-4682edaf-5127-42f9-8953-d996d00c09f3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Processing event network-vif-plugged-5da2fb68-f183-46ec-b307-762bc7c0eae1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.254 227766 DEBUG nova.compute.manager [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.267 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165476.2630408, cad430d0-9af9-46f1-ad8b-38438fc2030b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.268 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.272 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.282 227766 INFO nova.virt.libvirt.driver [-] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Instance spawned successfully.#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.283 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.289 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.293 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.301 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance cad430d0-9af9-46f1-ad8b-38438fc2030b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.301 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.302 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.308 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.309 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.309 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.310 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.310 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.311 227766 DEBUG nova.virt.libvirt.driver [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.316 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.336 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.368 227766 INFO nova.compute.manager [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Took 17.90 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.369 227766 DEBUG nova.compute.manager [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:51:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:51:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/183850597' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.790 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:51:16 np0005593234 nova_compute[227762]: 2026-01-23 10:51:16.796 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:51:17 np0005593234 nova_compute[227762]: 2026-01-23 10:51:17.138 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:51:17 np0005593234 nova_compute[227762]: 2026-01-23 10:51:17.491 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:51:17 np0005593234 nova_compute[227762]: 2026-01-23 10:51:17.491 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.914s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:51:17 np0005593234 nova_compute[227762]: 2026-01-23 10:51:17.493 227766 INFO nova.compute.manager [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Took 21.20 seconds to build instance.#033[00m
Jan 23 05:51:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:17 np0005593234 nova_compute[227762]: 2026-01-23 10:51:17.511 227766 DEBUG oslo_concurrency.lockutils [None req-3414d476-2409-4b99-8a0c-5b4d9c4b6f50 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:51:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:17.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:17 np0005593234 podman[330470]: 2026-01-23 10:51:17.764497736 +0000 UTC m=+0.052675734 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 05:51:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:17.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:18 np0005593234 nova_compute[227762]: 2026-01-23 10:51:18.290 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:18 np0005593234 nova_compute[227762]: 2026-01-23 10:51:18.362 227766 DEBUG nova.compute.manager [req-4a2c97e5-8f1c-4945-b25b-21898cebb020 req-4997cf81-f01b-436a-8812-c72fb1fcf50f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Received event network-vif-plugged-5da2fb68-f183-46ec-b307-762bc7c0eae1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:51:18 np0005593234 nova_compute[227762]: 2026-01-23 10:51:18.363 227766 DEBUG oslo_concurrency.lockutils [req-4a2c97e5-8f1c-4945-b25b-21898cebb020 req-4997cf81-f01b-436a-8812-c72fb1fcf50f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:51:18 np0005593234 nova_compute[227762]: 2026-01-23 10:51:18.363 227766 DEBUG oslo_concurrency.lockutils [req-4a2c97e5-8f1c-4945-b25b-21898cebb020 req-4997cf81-f01b-436a-8812-c72fb1fcf50f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:51:18 np0005593234 nova_compute[227762]: 2026-01-23 10:51:18.363 227766 DEBUG oslo_concurrency.lockutils [req-4a2c97e5-8f1c-4945-b25b-21898cebb020 req-4997cf81-f01b-436a-8812-c72fb1fcf50f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:51:18 np0005593234 nova_compute[227762]: 2026-01-23 10:51:18.363 227766 DEBUG nova.compute.manager [req-4a2c97e5-8f1c-4945-b25b-21898cebb020 req-4997cf81-f01b-436a-8812-c72fb1fcf50f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] No waiting events found dispatching network-vif-plugged-5da2fb68-f183-46ec-b307-762bc7c0eae1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:51:18 np0005593234 nova_compute[227762]: 2026-01-23 10:51:18.364 227766 WARNING nova.compute.manager [req-4a2c97e5-8f1c-4945-b25b-21898cebb020 req-4997cf81-f01b-436a-8812-c72fb1fcf50f 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Received unexpected event network-vif-plugged-5da2fb68-f183-46ec-b307-762bc7c0eae1 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:51:18 np0005593234 nova_compute[227762]: 2026-01-23 10:51:18.492 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:18 np0005593234 nova_compute[227762]: 2026-01-23 10:51:18.493 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:51:18 np0005593234 nova_compute[227762]: 2026-01-23 10:51:18.493 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:51:18 np0005593234 nova_compute[227762]: 2026-01-23 10:51:18.905 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-cad430d0-9af9-46f1-ad8b-38438fc2030b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:51:18 np0005593234 nova_compute[227762]: 2026-01-23 10:51:18.906 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-cad430d0-9af9-46f1-ad8b-38438fc2030b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:51:18 np0005593234 nova_compute[227762]: 2026-01-23 10:51:18.906 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 05:51:18 np0005593234 nova_compute[227762]: 2026-01-23 10:51:18.906 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid cad430d0-9af9-46f1-ad8b-38438fc2030b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:51:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:19.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:19.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #187. Immutable memtables: 0.
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.876237) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 119] Flushing memtable with next log file: 187
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165479876322, "job": 119, "event": "flush_started", "num_memtables": 1, "num_entries": 1297, "num_deletes": 256, "total_data_size": 2778529, "memory_usage": 2809344, "flush_reason": "Manual Compaction"}
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 119] Level-0 flush table #188: started
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165479891416, "cf_name": "default", "job": 119, "event": "table_file_creation", "file_number": 188, "file_size": 1812004, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 89302, "largest_seqno": 90594, "table_properties": {"data_size": 1806451, "index_size": 2884, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12372, "raw_average_key_size": 19, "raw_value_size": 1794978, "raw_average_value_size": 2876, "num_data_blocks": 127, "num_entries": 624, "num_filter_entries": 624, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165382, "oldest_key_time": 1769165382, "file_creation_time": 1769165479, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 188, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 119] Flush lasted 15235 microseconds, and 4633 cpu microseconds.
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.891480) [db/flush_job.cc:967] [default] [JOB 119] Level-0 flush table #188: 1812004 bytes OK
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.891499) [db/memtable_list.cc:519] [default] Level-0 commit table #188 started
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.893345) [db/memtable_list.cc:722] [default] Level-0 commit table #188: memtable #1 done
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.893359) EVENT_LOG_v1 {"time_micros": 1769165479893354, "job": 119, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.893377) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 119] Try to delete WAL files size 2772278, prev total WAL file size 2772278, number of live WAL files 2.
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000184.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.894246) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033353139' seq:72057594037927935, type:22 .. '6C6F676D0033373731' seq:0, type:0; will stop at (end)
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 120] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 119 Base level 0, inputs: [188(1769KB)], [186(10MB)]
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165479894334, "job": 120, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [188], "files_L6": [186], "score": -1, "input_data_size": 12839123, "oldest_snapshot_seqno": -1}
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 120] Generated table #189: 10712 keys, 12699505 bytes, temperature: kUnknown
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165479967690, "cf_name": "default", "job": 120, "event": "table_file_creation", "file_number": 189, "file_size": 12699505, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12632058, "index_size": 39558, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26821, "raw_key_size": 283797, "raw_average_key_size": 26, "raw_value_size": 12446374, "raw_average_value_size": 1161, "num_data_blocks": 1499, "num_entries": 10712, "num_filter_entries": 10712, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769165479, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 189, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.968168) [db/compaction/compaction_job.cc:1663] [default] [JOB 120] Compacted 1@0 + 1@6 files to L6 => 12699505 bytes
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.969692) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.9 rd, 173.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 10.5 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(14.1) write-amplify(7.0) OK, records in: 11243, records dropped: 531 output_compression: NoCompression
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.969710) EVENT_LOG_v1 {"time_micros": 1769165479969702, "job": 120, "event": "compaction_finished", "compaction_time_micros": 73425, "compaction_time_cpu_micros": 28830, "output_level": 6, "num_output_files": 1, "total_output_size": 12699505, "num_input_records": 11243, "num_output_records": 10712, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000188.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165479970208, "job": 120, "event": "table_file_deletion", "file_number": 188}
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000186.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165479971991, "job": 120, "event": "table_file_deletion", "file_number": 186}
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.894188) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.972048) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.972051) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.972053) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.972054) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:51:19 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:51:19.972056) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:51:20 np0005593234 nova_compute[227762]: 2026-01-23 10:51:20.133 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:20 np0005593234 NetworkManager[48942]: <info>  [1769165480.1337] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/426)
Jan 23 05:51:20 np0005593234 NetworkManager[48942]: <info>  [1769165480.1345] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/427)
Jan 23 05:51:20 np0005593234 nova_compute[227762]: 2026-01-23 10:51:20.206 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:20 np0005593234 nova_compute[227762]: 2026-01-23 10:51:20.208 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:51:20Z|00902|binding|INFO|Releasing lport 61ecd62f-11f4-4883-a35f-e2717449d670 from this chassis (sb_readonly=0)
Jan 23 05:51:20 np0005593234 nova_compute[227762]: 2026-01-23 10:51:20.227 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:21.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:21 np0005593234 nova_compute[227762]: 2026-01-23 10:51:21.527 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Updating instance_info_cache with network_info: [{"id": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "address": "fa:16:3e:b5:5a:2a", "network": {"id": "fe1b8e52-7b7b-4d1d-a352-d131d1cac17f", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-1479617766-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9533be9d361246bdb0a7c1bd3015db66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5da2fb68-f1", "ovs_interfaceid": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:51:21 np0005593234 nova_compute[227762]: 2026-01-23 10:51:21.556 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-cad430d0-9af9-46f1-ad8b-38438fc2030b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:51:21 np0005593234 nova_compute[227762]: 2026-01-23 10:51:21.557 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 05:51:21 np0005593234 nova_compute[227762]: 2026-01-23 10:51:21.557 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:21 np0005593234 nova_compute[227762]: 2026-01-23 10:51:21.558 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:21 np0005593234 nova_compute[227762]: 2026-01-23 10:51:21.559 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:51:21 np0005593234 nova_compute[227762]: 2026-01-23 10:51:21.711 227766 DEBUG nova.compute.manager [req-cfc5d64c-e285-47ab-936c-5ed2beac2e27 req-ea01abd2-b2bc-40fe-938d-e3c73d8499f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Received event network-changed-5da2fb68-f183-46ec-b307-762bc7c0eae1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:51:21 np0005593234 nova_compute[227762]: 2026-01-23 10:51:21.711 227766 DEBUG nova.compute.manager [req-cfc5d64c-e285-47ab-936c-5ed2beac2e27 req-ea01abd2-b2bc-40fe-938d-e3c73d8499f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Refreshing instance network info cache due to event network-changed-5da2fb68-f183-46ec-b307-762bc7c0eae1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:51:21 np0005593234 nova_compute[227762]: 2026-01-23 10:51:21.712 227766 DEBUG oslo_concurrency.lockutils [req-cfc5d64c-e285-47ab-936c-5ed2beac2e27 req-ea01abd2-b2bc-40fe-938d-e3c73d8499f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-cad430d0-9af9-46f1-ad8b-38438fc2030b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:51:21 np0005593234 nova_compute[227762]: 2026-01-23 10:51:21.712 227766 DEBUG oslo_concurrency.lockutils [req-cfc5d64c-e285-47ab-936c-5ed2beac2e27 req-ea01abd2-b2bc-40fe-938d-e3c73d8499f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-cad430d0-9af9-46f1-ad8b-38438fc2030b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:51:21 np0005593234 nova_compute[227762]: 2026-01-23 10:51:21.712 227766 DEBUG nova.network.neutron [req-cfc5d64c-e285-47ab-936c-5ed2beac2e27 req-ea01abd2-b2bc-40fe-938d-e3c73d8499f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Refreshing network info cache for port 5da2fb68-f183-46ec-b307-762bc7c0eae1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:51:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:21.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:22 np0005593234 nova_compute[227762]: 2026-01-23 10:51:22.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:22 np0005593234 nova_compute[227762]: 2026-01-23 10:51:22.822 227766 DEBUG nova.network.neutron [req-cfc5d64c-e285-47ab-936c-5ed2beac2e27 req-ea01abd2-b2bc-40fe-938d-e3c73d8499f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Updated VIF entry in instance network info cache for port 5da2fb68-f183-46ec-b307-762bc7c0eae1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:51:22 np0005593234 nova_compute[227762]: 2026-01-23 10:51:22.823 227766 DEBUG nova.network.neutron [req-cfc5d64c-e285-47ab-936c-5ed2beac2e27 req-ea01abd2-b2bc-40fe-938d-e3c73d8499f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Updating instance_info_cache with network_info: [{"id": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "address": "fa:16:3e:b5:5a:2a", "network": {"id": "fe1b8e52-7b7b-4d1d-a352-d131d1cac17f", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-1479617766-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9533be9d361246bdb0a7c1bd3015db66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5da2fb68-f1", "ovs_interfaceid": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:51:23 np0005593234 nova_compute[227762]: 2026-01-23 10:51:23.094 227766 DEBUG oslo_concurrency.lockutils [req-cfc5d64c-e285-47ab-936c-5ed2beac2e27 req-ea01abd2-b2bc-40fe-938d-e3c73d8499f5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-cad430d0-9af9-46f1-ad8b-38438fc2030b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:51:23 np0005593234 nova_compute[227762]: 2026-01-23 10:51:23.294 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:23.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:23 np0005593234 nova_compute[227762]: 2026-01-23 10:51:23.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:51:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:23.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:51:25 np0005593234 nova_compute[227762]: 2026-01-23 10:51:25.211 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:25.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:25 np0005593234 nova_compute[227762]: 2026-01-23 10:51:25.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:51:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:25.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:51:26 np0005593234 podman[330499]: 2026-01-23 10:51:26.829056004 +0000 UTC m=+0.122048063 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 23 05:51:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:27.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:51:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:27.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:51:28 np0005593234 nova_compute[227762]: 2026-01-23 10:51:28.295 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:29.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:29.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:30 np0005593234 nova_compute[227762]: 2026-01-23 10:51:30.215 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:31.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:31 np0005593234 nova_compute[227762]: 2026-01-23 10:51:31.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:31.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:51:32Z|00118|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b5:5a:2a 10.100.0.13
Jan 23 05:51:32 np0005593234 ovn_controller[134547]: 2026-01-23T10:51:32Z|00119|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b5:5a:2a 10.100.0.13
Jan 23 05:51:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:33 np0005593234 nova_compute[227762]: 2026-01-23 10:51:33.299 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:33.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:33.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:35 np0005593234 nova_compute[227762]: 2026-01-23 10:51:35.217 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:35.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:35.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:37.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:37.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:38 np0005593234 nova_compute[227762]: 2026-01-23 10:51:38.303 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:39.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:51:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:39.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:51:40 np0005593234 nova_compute[227762]: 2026-01-23 10:51:40.253 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:51:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:41.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:51:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:51:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:41.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:51:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:42 np0005593234 nova_compute[227762]: 2026-01-23 10:51:42.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:51:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:42.894 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:51:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:42.895 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:51:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:51:42.896 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:51:43 np0005593234 nova_compute[227762]: 2026-01-23 10:51:43.307 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:51:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:43.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:51:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:43.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:45 np0005593234 nova_compute[227762]: 2026-01-23 10:51:45.256 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:51:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:45.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:51:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:45.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:47.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:47.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:48 np0005593234 nova_compute[227762]: 2026-01-23 10:51:48.311 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:48 np0005593234 podman[330587]: 2026-01-23 10:51:48.758306234 +0000 UTC m=+0.052884330 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 05:51:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:51:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:49.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:51:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:49.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:51:50Z|00903|memory_trim|INFO|Detected inactivity (last active 30025 ms ago): trimming memory
Jan 23 05:51:50 np0005593234 nova_compute[227762]: 2026-01-23 10:51:50.258 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:51:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:51.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:51:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:51.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:53 np0005593234 nova_compute[227762]: 2026-01-23 10:51:53.315 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:53.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:51:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:53.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:51:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:51:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.0 total, 600.0 interval#012Cumulative writes: 17K writes, 90K keys, 17K commit groups, 1.0 writes per commit group, ingest: 0.18 GB, 0.03 MB/s#012Cumulative WAL: 17K writes, 17K syncs, 1.00 writes per sync, written: 0.18 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1577 writes, 7897 keys, 1577 commit groups, 1.0 writes per commit group, ingest: 16.11 MB, 0.03 MB/s#012Interval WAL: 1577 writes, 1577 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     45.2      2.49              0.34        60    0.041       0      0       0.0       0.0#012  L6      1/0   12.11 MB   0.0      0.7     0.1      0.6       0.6      0.0       0.0   5.3    116.5     99.8      6.00              1.89        59    0.102    457K    31K       0.0       0.0#012 Sum      1/0   12.11 MB   0.0      0.7     0.1      0.6       0.7      0.1       0.0   6.3     82.3     83.8      8.49              2.23       119    0.071    457K    31K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.0     98.5    100.6      0.80              0.27        12    0.067     66K   3078       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.7     0.1      0.6       0.6      0.0       0.0   0.0    116.5     99.8      6.00              1.89        59    0.102    457K    31K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     45.3      2.49              0.34        59    0.042       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 6600.0 total, 600.0 interval#012Flush(GB): cumulative 0.110, interval 0.010#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.69 GB write, 0.11 MB/s write, 0.68 GB read, 0.11 MB/s read, 8.5 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.08 GB read, 0.13 MB/s read, 0.8 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f10fc171f0#2 capacity: 304.00 MB usage: 76.22 MB table_size: 0 occupancy: 18446744073709551615 collections: 12 last_copies: 0 last_secs: 0.000644 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(4355,72.99 MB,24.0111%) FilterBlock(119,1.24 MB,0.407786%) IndexBlock(119,1.98 MB,0.652951%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 05:51:55 np0005593234 nova_compute[227762]: 2026-01-23 10:51:55.262 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:55.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:55.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:56 np0005593234 nova_compute[227762]: 2026-01-23 10:51:56.486 227766 DEBUG oslo_concurrency.lockutils [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Acquiring lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:51:56 np0005593234 nova_compute[227762]: 2026-01-23 10:51:56.486 227766 DEBUG oslo_concurrency.lockutils [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" acquired by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:51:56 np0005593234 nova_compute[227762]: 2026-01-23 10:51:56.524 227766 DEBUG nova.objects.instance [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lazy-loading 'flavor' on Instance uuid cad430d0-9af9-46f1-ad8b-38438fc2030b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:51:56 np0005593234 nova_compute[227762]: 2026-01-23 10:51:56.564 227766 DEBUG oslo_concurrency.lockutils [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" "released" by "nova.compute.manager.ComputeManager.reserve_block_device_name.<locals>.do_reserve" :: held 0.079s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.099 227766 DEBUG oslo_concurrency.lockutils [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Acquiring lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.100 227766 DEBUG oslo_concurrency.lockutils [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" acquired by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.100 227766 INFO nova.compute.manager [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Attaching volume fe67c9d0-617c-4383-b125-c4c25cee2e92 to /dev/vdb#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.344 227766 DEBUG os_brick.utils [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.346 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.362 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.016s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.362 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[0e54eaa1-f5a5-4dbc-8b9b-19697c5186ee]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.363 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.374 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.374 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[24b181ee-bce7-4be7-89d2-5d25851b0db2]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.375 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.385 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.010s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.385 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[61f762e7-6ba4-4878-9f6c-638de0d14ea9]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.386 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[428b201d-f2a0-4d85-b240-c51fd8a646f1]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.387 227766 DEBUG oslo_concurrency.processutils [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.436 227766 DEBUG oslo_concurrency.processutils [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] CMD "nvme version" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.438 227766 DEBUG os_brick.initiator.connectors.lightos [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.438 227766 DEBUG os_brick.initiator.connectors.lightos [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.438 227766 DEBUG os_brick.initiator.connectors.lightos [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.439 227766 DEBUG os_brick.utils [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] <== get_connector_properties: return (94ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 05:51:57 np0005593234 nova_compute[227762]: 2026-01-23 10:51:57.439 227766 DEBUG nova.virt.block_device [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Updating existing volume attachment record: 2ee11420-b20b-4205-8376-172c18b34b0d _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 05:51:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:51:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:57.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:57 np0005593234 podman[330668]: 2026-01-23 10:51:57.827282344 +0000 UTC m=+0.114006183 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 23 05:51:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:57.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:51:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2444619642' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:51:58 np0005593234 nova_compute[227762]: 2026-01-23 10:51:58.318 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.673 227766 DEBUG os_brick.encryptors [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Using volume encryption metadata '{'encryption_key_id': '70457a1c-4b64-474d-8489-dd7f91885f60', 'control_location': 'front-end', 'cipher': 'aes-xts-plain64', 'key_size': 256, 'provider': 'luks'}' for connection: {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-fe67c9d0-617c-4383-b125-c4c25cee2e92', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'fe67c9d0-617c-4383-b125-c4c25cee2e92', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': True, 'cacheable': False}, 'status': 'reserved', 'instance': 'cad430d0-9af9-46f1-ad8b-38438fc2030b', 'attached_at': '', 'detached_at': '', 'volume_id': 'fe67c9d0-617c-4383-b125-c4c25cee2e92', 'serial': '} get_encryption_metadata /usr/lib/python3.9/site-packages/os_brick/encryptors/__init__.py:135#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.680 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Creating Client object Client /usr/lib/python3.9/site-packages/barbicanclient/client.py:163#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.695 227766 DEBUG barbicanclient.v1.secrets [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Getting secret - Secret href: https://barbican-internal.openstack.svc:9311/secrets/70457a1c-4b64-474d-8489-dd7f91885f60 get /usr/lib/python3.9/site-packages/barbicanclient/v1/secrets.py:514#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.696 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.719 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.719 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.745 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.746 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.765 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.766 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:51:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000065s ======
Jan 23 05:51:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:51:59.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.786 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.786 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.808 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.809 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.842 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.842 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.867 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.867 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:51:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:51:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:51:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:51:59.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.891 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.892 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.919 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.920 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.965 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.966 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.984 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:51:59 np0005593234 nova_compute[227762]: 2026-01-23 10:51:59.985 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:52:00 np0005593234 nova_compute[227762]: 2026-01-23 10:52:00.004 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:52:00 np0005593234 nova_compute[227762]: 2026-01-23 10:52:00.005 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:52:00 np0005593234 nova_compute[227762]: 2026-01-23 10:52:00.023 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:52:00 np0005593234 nova_compute[227762]: 2026-01-23 10:52:00.023 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:52:00 np0005593234 nova_compute[227762]: 2026-01-23 10:52:00.042 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:52:00 np0005593234 nova_compute[227762]: 2026-01-23 10:52:00.043 227766 INFO barbicanclient.base [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Calculated Secrets uuid ref: secrets/70457a1c-4b64-474d-8489-dd7f91885f60#033[00m
Jan 23 05:52:00 np0005593234 nova_compute[227762]: 2026-01-23 10:52:00.061 227766 DEBUG barbicanclient.client [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Response status 200 _check_status_code /usr/lib/python3.9/site-packages/barbicanclient/client.py:87#033[00m
Jan 23 05:52:00 np0005593234 nova_compute[227762]: 2026-01-23 10:52:00.062 227766 DEBUG nova.virt.libvirt.host [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Secret XML: <secret ephemeral="no" private="no">
Jan 23 05:52:00 np0005593234 nova_compute[227762]:  <usage type="volume">
Jan 23 05:52:00 np0005593234 nova_compute[227762]:    <volume>fe67c9d0-617c-4383-b125-c4c25cee2e92</volume>
Jan 23 05:52:00 np0005593234 nova_compute[227762]:  </usage>
Jan 23 05:52:00 np0005593234 nova_compute[227762]: </secret>
Jan 23 05:52:00 np0005593234 nova_compute[227762]: create_secret /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1131#033[00m
Jan 23 05:52:00 np0005593234 nova_compute[227762]: 2026-01-23 10:52:00.073 227766 DEBUG nova.objects.instance [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lazy-loading 'flavor' on Instance uuid cad430d0-9af9-46f1-ad8b-38438fc2030b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:52:00 np0005593234 nova_compute[227762]: 2026-01-23 10:52:00.101 227766 DEBUG nova.virt.libvirt.driver [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Attempting to attach volume fe67c9d0-617c-4383-b125-c4c25cee2e92 with discard support enabled to an instance using an unsupported configuration. target_bus = virtio. Trim commands will not be issued to the storage device. _check_discard_for_attach_volume /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2168#033[00m
Jan 23 05:52:00 np0005593234 nova_compute[227762]: 2026-01-23 10:52:00.104 227766 DEBUG nova.virt.libvirt.guest [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] attach device xml: <disk type="network" device="disk">
Jan 23 05:52:00 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:52:00 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-fe67c9d0-617c-4383-b125-c4c25cee2e92">
Jan 23 05:52:00 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:52:00 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:52:00 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:52:00 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:52:00 np0005593234 nova_compute[227762]:  <auth username="openstack">
Jan 23 05:52:00 np0005593234 nova_compute[227762]:    <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:52:00 np0005593234 nova_compute[227762]:  </auth>
Jan 23 05:52:00 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:52:00 np0005593234 nova_compute[227762]:  <serial>fe67c9d0-617c-4383-b125-c4c25cee2e92</serial>
Jan 23 05:52:00 np0005593234 nova_compute[227762]:  <encryption format="luks">
Jan 23 05:52:00 np0005593234 nova_compute[227762]:    <secret type="passphrase" uuid="3a7431dd-6906-4eef-a357-a6a13355f72c"/>
Jan 23 05:52:00 np0005593234 nova_compute[227762]:  </encryption>
Jan 23 05:52:00 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:52:00 np0005593234 nova_compute[227762]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Jan 23 05:52:00 np0005593234 nova_compute[227762]: 2026-01-23 10:52:00.262 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:52:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:01.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:52:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:52:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:01.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:52:02 np0005593234 nova_compute[227762]: 2026-01-23 10:52:02.381 227766 DEBUG nova.virt.libvirt.driver [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:52:02 np0005593234 nova_compute[227762]: 2026-01-23 10:52:02.382 227766 DEBUG nova.virt.libvirt.driver [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] No BDM found with device name vdb, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:52:02 np0005593234 nova_compute[227762]: 2026-01-23 10:52:02.383 227766 DEBUG nova.virt.libvirt.driver [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:52:02 np0005593234 nova_compute[227762]: 2026-01-23 10:52:02.383 227766 DEBUG nova.virt.libvirt.driver [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] No VIF found with MAC fa:16:3e:b5:5a:2a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:52:02 np0005593234 nova_compute[227762]: 2026-01-23 10:52:02.629 227766 DEBUG oslo_concurrency.lockutils [None req-3774dc41-f5a2-4124-b647-46ba393a29f7 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" "released" by "nova.compute.manager.ComputeManager.attach_volume.<locals>.do_attach_volume" :: held 5.529s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:03 np0005593234 nova_compute[227762]: 2026-01-23 10:52:03.323 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:03 np0005593234 nova_compute[227762]: 2026-01-23 10:52:03.455 227766 DEBUG oslo_concurrency.lockutils [None req-c6f6eb36-6b3f-4e2f-8dd5-df88e37d37a5 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Acquiring lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:03 np0005593234 nova_compute[227762]: 2026-01-23 10:52:03.456 227766 DEBUG oslo_concurrency.lockutils [None req-c6f6eb36-6b3f-4e2f-8dd5-df88e37d37a5 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" acquired by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:03 np0005593234 nova_compute[227762]: 2026-01-23 10:52:03.497 227766 INFO nova.compute.manager [None req-c6f6eb36-6b3f-4e2f-8dd5-df88e37d37a5 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Detaching volume fe67c9d0-617c-4383-b125-c4c25cee2e92#033[00m
Jan 23 05:52:03 np0005593234 nova_compute[227762]: 2026-01-23 10:52:03.710 227766 INFO nova.virt.block_device [None req-c6f6eb36-6b3f-4e2f-8dd5-df88e37d37a5 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Attempting to driver detach volume fe67c9d0-617c-4383-b125-c4c25cee2e92 from mountpoint /dev/vdb#033[00m
Jan 23 05:52:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:03.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:03 np0005593234 nova_compute[227762]: 2026-01-23 10:52:03.834 227766 DEBUG os_brick.encryptors [None req-c6f6eb36-6b3f-4e2f-8dd5-df88e37d37a5 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Using volume encryption metadata '{'encryption_key_id': '70457a1c-4b64-474d-8489-dd7f91885f60', 'control_location': 'front-end', 'cipher': 'aes-xts-plain64', 'key_size': 256, 'provider': 'luks'}' for connection: {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-fe67c9d0-617c-4383-b125-c4c25cee2e92', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': 'fe67c9d0-617c-4383-b125-c4c25cee2e92', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': True, 'cacheable': False}, 'status': 'reserved', 'instance': 'cad430d0-9af9-46f1-ad8b-38438fc2030b', 'attached_at': '', 'detached_at': '', 'volume_id': 'fe67c9d0-617c-4383-b125-c4c25cee2e92', 'serial': '} get_encryption_metadata /usr/lib/python3.9/site-packages/os_brick/encryptors/__init__.py:135#033[00m
Jan 23 05:52:03 np0005593234 nova_compute[227762]: 2026-01-23 10:52:03.841 227766 DEBUG nova.virt.libvirt.driver [None req-c6f6eb36-6b3f-4e2f-8dd5-df88e37d37a5 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Attempting to detach device vdb from instance cad430d0-9af9-46f1-ad8b-38438fc2030b from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Jan 23 05:52:03 np0005593234 nova_compute[227762]: 2026-01-23 10:52:03.841 227766 DEBUG nova.virt.libvirt.guest [None req-c6f6eb36-6b3f-4e2f-8dd5-df88e37d37a5 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-fe67c9d0-617c-4383-b125-c4c25cee2e92">
Jan 23 05:52:03 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  <serial>fe67c9d0-617c-4383-b125-c4c25cee2e92</serial>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  <encryption format="luks">
Jan 23 05:52:03 np0005593234 nova_compute[227762]:    <secret type="passphrase" uuid="3a7431dd-6906-4eef-a357-a6a13355f72c"/>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  </encryption>
Jan 23 05:52:03 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:52:03 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:52:03 np0005593234 nova_compute[227762]: 2026-01-23 10:52:03.848 227766 INFO nova.virt.libvirt.driver [None req-c6f6eb36-6b3f-4e2f-8dd5-df88e37d37a5 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Successfully detached device vdb from instance cad430d0-9af9-46f1-ad8b-38438fc2030b from the persistent domain config.#033[00m
Jan 23 05:52:03 np0005593234 nova_compute[227762]: 2026-01-23 10:52:03.849 227766 DEBUG nova.virt.libvirt.driver [None req-c6f6eb36-6b3f-4e2f-8dd5-df88e37d37a5 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] (1/8): Attempting to detach device vdb with device alias virtio-disk1 from instance cad430d0-9af9-46f1-ad8b-38438fc2030b from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Jan 23 05:52:03 np0005593234 nova_compute[227762]: 2026-01-23 10:52:03.849 227766 DEBUG nova.virt.libvirt.guest [None req-c6f6eb36-6b3f-4e2f-8dd5-df88e37d37a5 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] detach device xml: <disk type="network" device="disk">
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  <source protocol="rbd" name="volumes/volume-fe67c9d0-617c-4383-b125-c4c25cee2e92">
Jan 23 05:52:03 np0005593234 nova_compute[227762]:    <host name="192.168.122.100" port="6789"/>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:    <host name="192.168.122.102" port="6789"/>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:    <host name="192.168.122.101" port="6789"/>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  </source>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  <target dev="vdb" bus="virtio"/>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  <serial>fe67c9d0-617c-4383-b125-c4c25cee2e92</serial>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  <address type="pci" domain="0x0000" bus="0x06" slot="0x00" function="0x0"/>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  <encryption format="luks">
Jan 23 05:52:03 np0005593234 nova_compute[227762]:    <secret type="passphrase" uuid="3a7431dd-6906-4eef-a357-a6a13355f72c"/>
Jan 23 05:52:03 np0005593234 nova_compute[227762]:  </encryption>
Jan 23 05:52:03 np0005593234 nova_compute[227762]: </disk>
Jan 23 05:52:03 np0005593234 nova_compute[227762]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Jan 23 05:52:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:03.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:03 np0005593234 nova_compute[227762]: 2026-01-23 10:52:03.907 227766 DEBUG nova.virt.libvirt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Received event <DeviceRemovedEvent: 1769165523.9068043, cad430d0-9af9-46f1-ad8b-38438fc2030b => virtio-disk1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Jan 23 05:52:03 np0005593234 nova_compute[227762]: 2026-01-23 10:52:03.908 227766 DEBUG nova.virt.libvirt.driver [None req-c6f6eb36-6b3f-4e2f-8dd5-df88e37d37a5 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Start waiting for the detach event from libvirt for device vdb with device alias virtio-disk1 for instance cad430d0-9af9-46f1-ad8b-38438fc2030b _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Jan 23 05:52:03 np0005593234 nova_compute[227762]: 2026-01-23 10:52:03.910 227766 INFO nova.virt.libvirt.driver [None req-c6f6eb36-6b3f-4e2f-8dd5-df88e37d37a5 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Successfully detached device vdb from instance cad430d0-9af9-46f1-ad8b-38438fc2030b from the live domain config.#033[00m
Jan 23 05:52:04 np0005593234 nova_compute[227762]: 2026-01-23 10:52:04.102 227766 DEBUG nova.objects.instance [None req-c6f6eb36-6b3f-4e2f-8dd5-df88e37d37a5 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lazy-loading 'flavor' on Instance uuid cad430d0-9af9-46f1-ad8b-38438fc2030b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:52:04 np0005593234 nova_compute[227762]: 2026-01-23 10:52:04.148 227766 DEBUG oslo_concurrency.lockutils [None req-c6f6eb36-6b3f-4e2f-8dd5-df88e37d37a5 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" "released" by "nova.compute.manager.ComputeManager.detach_volume.<locals>.do_detach_volume" :: held 0.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.096 227766 DEBUG oslo_concurrency.lockutils [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Acquiring lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.096 227766 DEBUG oslo_concurrency.lockutils [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.097 227766 DEBUG oslo_concurrency.lockutils [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Acquiring lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.097 227766 DEBUG oslo_concurrency.lockutils [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.098 227766 DEBUG oslo_concurrency.lockutils [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.099 227766 INFO nova.compute.manager [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Terminating instance#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.100 227766 DEBUG nova.compute.manager [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:52:05 np0005593234 kernel: tap5da2fb68-f1 (unregistering): left promiscuous mode
Jan 23 05:52:05 np0005593234 NetworkManager[48942]: <info>  [1769165525.1629] device (tap5da2fb68-f1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:52:05 np0005593234 ovn_controller[134547]: 2026-01-23T10:52:05Z|00904|binding|INFO|Releasing lport 5da2fb68-f183-46ec-b307-762bc7c0eae1 from this chassis (sb_readonly=0)
Jan 23 05:52:05 np0005593234 ovn_controller[134547]: 2026-01-23T10:52:05Z|00905|binding|INFO|Setting lport 5da2fb68-f183-46ec-b307-762bc7c0eae1 down in Southbound
Jan 23 05:52:05 np0005593234 ovn_controller[134547]: 2026-01-23T10:52:05Z|00906|binding|INFO|Removing iface tap5da2fb68-f1 ovn-installed in OVS
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.166 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.169 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:05.176 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b5:5a:2a 10.100.0.13'], port_security=['fa:16:3e:b5:5a:2a 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'cad430d0-9af9-46f1-ad8b-38438fc2030b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9533be9d361246bdb0a7c1bd3015db66', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b1bad509-a799-4dcc-92ea-816a14d07688', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.195'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=80d47a5e-844e-4435-9eba-3ada7eceb5f3, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=5da2fb68-f183-46ec-b307-762bc7c0eae1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:52:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:05.178 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 5da2fb68-f183-46ec-b307-762bc7c0eae1 in datapath fe1b8e52-7b7b-4d1d-a352-d131d1cac17f unbound from our chassis#033[00m
Jan 23 05:52:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:05.180 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fe1b8e52-7b7b-4d1d-a352-d131d1cac17f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:52:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:05.182 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cebf5d50-2073-4054-9e7d-2327e583156b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:05.184 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f namespace which is not needed anymore#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.191 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:05 np0005593234 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d000000d1.scope: Deactivated successfully.
Jan 23 05:52:05 np0005593234 systemd[1]: machine-qemu\x2d101\x2dinstance\x2d000000d1.scope: Consumed 18.251s CPU time.
Jan 23 05:52:05 np0005593234 systemd-machined[195626]: Machine qemu-101-instance-000000d1 terminated.
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.264 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:05 np0005593234 neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f[330382]: [NOTICE]   (330386) : haproxy version is 2.8.14-c23fe91
Jan 23 05:52:05 np0005593234 neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f[330382]: [NOTICE]   (330386) : path to executable is /usr/sbin/haproxy
Jan 23 05:52:05 np0005593234 neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f[330382]: [WARNING]  (330386) : Exiting Master process...
Jan 23 05:52:05 np0005593234 neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f[330382]: [ALERT]    (330386) : Current worker (330388) exited with code 143 (Terminated)
Jan 23 05:52:05 np0005593234 neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f[330382]: [WARNING]  (330386) : All workers exited. Exiting... (0)
Jan 23 05:52:05 np0005593234 systemd[1]: libpod-604664d91fb471be029e89cc023c9c9842d09475d7fd4ffc1370d8efe0e963ca.scope: Deactivated successfully.
Jan 23 05:52:05 np0005593234 podman[330745]: 2026-01-23 10:52:05.320395826 +0000 UTC m=+0.044114589 container died 604664d91fb471be029e89cc023c9c9842d09475d7fd4ffc1370d8efe0e963ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.338 227766 INFO nova.virt.libvirt.driver [-] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Instance destroyed successfully.#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.338 227766 DEBUG nova.objects.instance [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lazy-loading 'resources' on Instance uuid cad430d0-9af9-46f1-ad8b-38438fc2030b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:52:05 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-604664d91fb471be029e89cc023c9c9842d09475d7fd4ffc1370d8efe0e963ca-userdata-shm.mount: Deactivated successfully.
Jan 23 05:52:05 np0005593234 systemd[1]: var-lib-containers-storage-overlay-38e543f590f045ac7cf898ae66a4fbe0e32725e16678206b45de7d0e930afc60-merged.mount: Deactivated successfully.
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.355 227766 DEBUG nova.virt.libvirt.vif [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:50:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestEncryptedCinderVolumes-server-2136626245',display_name='tempest-TestEncryptedCinderVolumes-server-2136626245',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testencryptedcindervolumes-server-2136626245',id=209,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCuiBqvEtsGgfmA0vuszrHVvw7K+RIn5/GzDuVpnZeUj7Bzvv0dWPbHfPCN50+GDI0uVcs7x3mMFsgg3h5hvujMZpusgdPvznJ8NRbHRE7JTX+XI36WbEmYI2og5OCDx2w==',key_name='tempest-keypair-225997112',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:51:16Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9533be9d361246bdb0a7c1bd3015db66',ramdisk_id='',reservation_id='r-d2hfsqgw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestEncryptedCinderVolumes-2052193645',owner_user_name='tempest-TestEncryptedCinderVolumes-2052193645-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:51:17Z,user_data='IyEvYmluL3NoCmVjaG8gIlByaW50aW5nIGNpcnJvcyB1c2VyIGF1dGhvcml6ZWQga2V5cyIKY2F0IH5jaXJyb3MvLnNzaC9hdXRob3JpemVkX2tleXMgfHwgdHJ1ZQo=',user_id='7d7e6f562c9d4d81bf1f8d5462870e30',uuid=cad430d0-9af9-46f1-ad8b-38438fc2030b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "address": "fa:16:3e:b5:5a:2a", "network": {"id": "fe1b8e52-7b7b-4d1d-a352-d131d1cac17f", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-1479617766-network", "subnets": [{"cidr": "10.100.0.0/28", 
"dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9533be9d361246bdb0a7c1bd3015db66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5da2fb68-f1", "ovs_interfaceid": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.356 227766 DEBUG nova.network.os_vif_util [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Converting VIF {"id": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "address": "fa:16:3e:b5:5a:2a", "network": {"id": "fe1b8e52-7b7b-4d1d-a352-d131d1cac17f", "bridge": "br-int", "label": "tempest-TestEncryptedCinderVolumes-1479617766-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.195", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9533be9d361246bdb0a7c1bd3015db66", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5da2fb68-f1", "ovs_interfaceid": "5da2fb68-f183-46ec-b307-762bc7c0eae1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.357 227766 DEBUG nova.network.os_vif_util [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b5:5a:2a,bridge_name='br-int',has_traffic_filtering=True,id=5da2fb68-f183-46ec-b307-762bc7c0eae1,network=Network(fe1b8e52-7b7b-4d1d-a352-d131d1cac17f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5da2fb68-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.357 227766 DEBUG os_vif [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b5:5a:2a,bridge_name='br-int',has_traffic_filtering=True,id=5da2fb68-f183-46ec-b307-762bc7c0eae1,network=Network(fe1b8e52-7b7b-4d1d-a352-d131d1cac17f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5da2fb68-f1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.360 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.360 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5da2fb68-f1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.363 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.364 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.367 227766 INFO os_vif [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b5:5a:2a,bridge_name='br-int',has_traffic_filtering=True,id=5da2fb68-f183-46ec-b307-762bc7c0eae1,network=Network(fe1b8e52-7b7b-4d1d-a352-d131d1cac17f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5da2fb68-f1')#033[00m
Jan 23 05:52:05 np0005593234 podman[330745]: 2026-01-23 10:52:05.369843624 +0000 UTC m=+0.093562387 container cleanup 604664d91fb471be029e89cc023c9c9842d09475d7fd4ffc1370d8efe0e963ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:52:05 np0005593234 systemd[1]: libpod-conmon-604664d91fb471be029e89cc023c9c9842d09475d7fd4ffc1370d8efe0e963ca.scope: Deactivated successfully.
Jan 23 05:52:05 np0005593234 podman[330793]: 2026-01-23 10:52:05.431891677 +0000 UTC m=+0.038718234 container remove 604664d91fb471be029e89cc023c9c9842d09475d7fd4ffc1370d8efe0e963ca (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:52:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:05.437 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a0b97c12-e0bc-42a8-b685-51db37a4c61b]: (4, ('Fri Jan 23 10:52:05 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f (604664d91fb471be029e89cc023c9c9842d09475d7fd4ffc1370d8efe0e963ca)\n604664d91fb471be029e89cc023c9c9842d09475d7fd4ffc1370d8efe0e963ca\nFri Jan 23 10:52:05 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f (604664d91fb471be029e89cc023c9c9842d09475d7fd4ffc1370d8efe0e963ca)\n604664d91fb471be029e89cc023c9c9842d09475d7fd4ffc1370d8efe0e963ca\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:05.439 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[92da00d8-7d4a-44e8-b2f6-7077e6251f64]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:05.441 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfe1b8e52-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.443 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:05 np0005593234 kernel: tapfe1b8e52-70: left promiscuous mode
Jan 23 05:52:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:05.447 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[06d504bf-cd3e-46e4-a983-d325a4cc5901]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.459 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:05.466 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bc4d0628-68cf-4c2d-a0a9-6f3c255d9b92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:05.467 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3c4e7a7e-bdc4-4cb8-9892-d98ebd10e63d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:05.485 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2f1961b4-3310-4f68-8b13-a8ca5dd22701]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 936958, 'reachable_time': 31111, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 330820, 'error': None, 'target': 'ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:05 np0005593234 systemd[1]: run-netns-ovnmeta\x2dfe1b8e52\x2d7b7b\x2d4d1d\x2da352\x2dd131d1cac17f.mount: Deactivated successfully.
Jan 23 05:52:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:05.491 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fe1b8e52-7b7b-4d1d-a352-d131d1cac17f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:52:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:05.492 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[d7a83733-ef07-4de3-9b20-88cc6d29bdd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:52:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:52:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:05.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.809 227766 INFO nova.virt.libvirt.driver [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Deleting instance files /var/lib/nova/instances/cad430d0-9af9-46f1-ad8b-38438fc2030b_del#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.810 227766 INFO nova.virt.libvirt.driver [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Deletion of /var/lib/nova/instances/cad430d0-9af9-46f1-ad8b-38438fc2030b_del complete#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.867 227766 INFO nova.compute.manager [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Took 0.77 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.869 227766 DEBUG oslo.service.loopingcall [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.870 227766 DEBUG nova.compute.manager [-] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:52:05 np0005593234 nova_compute[227762]: 2026-01-23 10:52:05.870 227766 DEBUG nova.network.neutron [-] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:52:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:05.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:07 np0005593234 nova_compute[227762]: 2026-01-23 10:52:07.317 227766 DEBUG nova.compute.manager [req-34a6e174-8efe-439d-afc2-585fe542dec4 req-18070ae6-55fb-4812-9d5c-2eb2bd6cb367 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Received event network-vif-unplugged-5da2fb68-f183-46ec-b307-762bc7c0eae1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:52:07 np0005593234 nova_compute[227762]: 2026-01-23 10:52:07.318 227766 DEBUG oslo_concurrency.lockutils [req-34a6e174-8efe-439d-afc2-585fe542dec4 req-18070ae6-55fb-4812-9d5c-2eb2bd6cb367 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:07 np0005593234 nova_compute[227762]: 2026-01-23 10:52:07.318 227766 DEBUG oslo_concurrency.lockutils [req-34a6e174-8efe-439d-afc2-585fe542dec4 req-18070ae6-55fb-4812-9d5c-2eb2bd6cb367 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:07 np0005593234 nova_compute[227762]: 2026-01-23 10:52:07.318 227766 DEBUG oslo_concurrency.lockutils [req-34a6e174-8efe-439d-afc2-585fe542dec4 req-18070ae6-55fb-4812-9d5c-2eb2bd6cb367 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:07 np0005593234 nova_compute[227762]: 2026-01-23 10:52:07.318 227766 DEBUG nova.compute.manager [req-34a6e174-8efe-439d-afc2-585fe542dec4 req-18070ae6-55fb-4812-9d5c-2eb2bd6cb367 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] No waiting events found dispatching network-vif-unplugged-5da2fb68-f183-46ec-b307-762bc7c0eae1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:52:07 np0005593234 nova_compute[227762]: 2026-01-23 10:52:07.318 227766 DEBUG nova.compute.manager [req-34a6e174-8efe-439d-afc2-585fe542dec4 req-18070ae6-55fb-4812-9d5c-2eb2bd6cb367 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Received event network-vif-unplugged-5da2fb68-f183-46ec-b307-762bc7c0eae1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:52:07 np0005593234 nova_compute[227762]: 2026-01-23 10:52:07.574 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:07.575 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=86, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=85) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:52:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:07.576 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:52:07 np0005593234 nova_compute[227762]: 2026-01-23 10:52:07.625 227766 DEBUG nova.network.neutron [-] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:52:07 np0005593234 nova_compute[227762]: 2026-01-23 10:52:07.664 227766 INFO nova.compute.manager [-] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Took 1.79 seconds to deallocate network for instance.#033[00m
Jan 23 05:52:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:07 np0005593234 nova_compute[227762]: 2026-01-23 10:52:07.742 227766 DEBUG oslo_concurrency.lockutils [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:07 np0005593234 nova_compute[227762]: 2026-01-23 10:52:07.742 227766 DEBUG oslo_concurrency.lockutils [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:07.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:07 np0005593234 nova_compute[227762]: 2026-01-23 10:52:07.790 227766 DEBUG oslo_concurrency.processutils [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:52:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.003000098s ======
Jan 23 05:52:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:07.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000098s
Jan 23 05:52:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:52:08 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3120907918' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:52:08 np0005593234 nova_compute[227762]: 2026-01-23 10:52:08.244 227766 DEBUG oslo_concurrency.processutils [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:52:08 np0005593234 nova_compute[227762]: 2026-01-23 10:52:08.251 227766 DEBUG nova.compute.provider_tree [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:52:08 np0005593234 nova_compute[227762]: 2026-01-23 10:52:08.269 227766 DEBUG nova.scheduler.client.report [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:52:08 np0005593234 nova_compute[227762]: 2026-01-23 10:52:08.310 227766 DEBUG oslo_concurrency.lockutils [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.568s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:08 np0005593234 nova_compute[227762]: 2026-01-23 10:52:08.359 227766 INFO nova.scheduler.client.report [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Deleted allocations for instance cad430d0-9af9-46f1-ad8b-38438fc2030b#033[00m
Jan 23 05:52:08 np0005593234 nova_compute[227762]: 2026-01-23 10:52:08.433 227766 DEBUG oslo_concurrency.lockutils [None req-10890db6-617a-4886-85dd-612b647214e3 7d7e6f562c9d4d81bf1f8d5462870e30 9533be9d361246bdb0a7c1bd3015db66 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.337s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:09 np0005593234 nova_compute[227762]: 2026-01-23 10:52:09.447 227766 DEBUG nova.compute.manager [req-6872b168-f8f6-4559-aa75-b83715a09d65 req-4619db75-95f2-4eb8-ba9a-c00c69f28f67 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Received event network-vif-deleted-5da2fb68-f183-46ec-b307-762bc7c0eae1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:52:09 np0005593234 nova_compute[227762]: 2026-01-23 10:52:09.447 227766 DEBUG nova.compute.manager [req-6872b168-f8f6-4559-aa75-b83715a09d65 req-4619db75-95f2-4eb8-ba9a-c00c69f28f67 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Received event network-vif-plugged-5da2fb68-f183-46ec-b307-762bc7c0eae1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:52:09 np0005593234 nova_compute[227762]: 2026-01-23 10:52:09.448 227766 DEBUG oslo_concurrency.lockutils [req-6872b168-f8f6-4559-aa75-b83715a09d65 req-4619db75-95f2-4eb8-ba9a-c00c69f28f67 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:09 np0005593234 nova_compute[227762]: 2026-01-23 10:52:09.448 227766 DEBUG oslo_concurrency.lockutils [req-6872b168-f8f6-4559-aa75-b83715a09d65 req-4619db75-95f2-4eb8-ba9a-c00c69f28f67 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:09 np0005593234 nova_compute[227762]: 2026-01-23 10:52:09.448 227766 DEBUG oslo_concurrency.lockutils [req-6872b168-f8f6-4559-aa75-b83715a09d65 req-4619db75-95f2-4eb8-ba9a-c00c69f28f67 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "cad430d0-9af9-46f1-ad8b-38438fc2030b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:09 np0005593234 nova_compute[227762]: 2026-01-23 10:52:09.448 227766 DEBUG nova.compute.manager [req-6872b168-f8f6-4559-aa75-b83715a09d65 req-4619db75-95f2-4eb8-ba9a-c00c69f28f67 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] No waiting events found dispatching network-vif-plugged-5da2fb68-f183-46ec-b307-762bc7c0eae1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:52:09 np0005593234 nova_compute[227762]: 2026-01-23 10:52:09.448 227766 WARNING nova.compute.manager [req-6872b168-f8f6-4559-aa75-b83715a09d65 req-4619db75-95f2-4eb8-ba9a-c00c69f28f67 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Received unexpected event network-vif-plugged-5da2fb68-f183-46ec-b307-762bc7c0eae1 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:52:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:09.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:09.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:10 np0005593234 nova_compute[227762]: 2026-01-23 10:52:10.266 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:10 np0005593234 nova_compute[227762]: 2026-01-23 10:52:10.362 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:52:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/634261944' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:52:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:52:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/634261944' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:52:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:11.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:11.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:12.579 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '86'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:52:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:52:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:13.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:52:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:13.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:14 np0005593234 nova_compute[227762]: 2026-01-23 10:52:14.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:14 np0005593234 nova_compute[227762]: 2026-01-23 10:52:14.772 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:14 np0005593234 nova_compute[227762]: 2026-01-23 10:52:14.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:14 np0005593234 nova_compute[227762]: 2026-01-23 10:52:14.774 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:14 np0005593234 nova_compute[227762]: 2026-01-23 10:52:14.774 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:52:14 np0005593234 nova_compute[227762]: 2026-01-23 10:52:14.775 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:52:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:52:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1139434701' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:52:15 np0005593234 nova_compute[227762]: 2026-01-23 10:52:15.269 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:15 np0005593234 nova_compute[227762]: 2026-01-23 10:52:15.277 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:52:15 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 23 05:52:15 np0005593234 nova_compute[227762]: 2026-01-23 10:52:15.364 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:15 np0005593234 nova_compute[227762]: 2026-01-23 10:52:15.446 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:52:15 np0005593234 nova_compute[227762]: 2026-01-23 10:52:15.448 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4125MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:52:15 np0005593234 nova_compute[227762]: 2026-01-23 10:52:15.448 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:15 np0005593234 nova_compute[227762]: 2026-01-23 10:52:15.449 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:15.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:52:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:15.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.104 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.108 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.170 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.237 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.238 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.259 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.285 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.305 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.549 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.693 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:52:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:52:16 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:52:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:52:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1246371264' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.796 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.802 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.819 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.843 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:52:16 np0005593234 nova_compute[227762]: 2026-01-23 10:52:16.843 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.395s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:17.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:17 np0005593234 nova_compute[227762]: 2026-01-23 10:52:17.844 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:17 np0005593234 nova_compute[227762]: 2026-01-23 10:52:17.845 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:52:17 np0005593234 nova_compute[227762]: 2026-01-23 10:52:17.845 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:52:17 np0005593234 nova_compute[227762]: 2026-01-23 10:52:17.874 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:52:17 np0005593234 nova_compute[227762]: 2026-01-23 10:52:17.874 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:17.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:19 np0005593234 nova_compute[227762]: 2026-01-23 10:52:19.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:19 np0005593234 nova_compute[227762]: 2026-01-23 10:52:19.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:52:19 np0005593234 podman[331079]: 2026-01-23 10:52:19.783822644 +0000 UTC m=+0.077980496 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 23 05:52:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:19.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:19.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:20 np0005593234 nova_compute[227762]: 2026-01-23 10:52:20.271 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:20 np0005593234 nova_compute[227762]: 2026-01-23 10:52:20.336 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165525.3351972, cad430d0-9af9-46f1-ad8b-38438fc2030b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:52:20 np0005593234 nova_compute[227762]: 2026-01-23 10:52:20.337 227766 INFO nova.compute.manager [-] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:52:20 np0005593234 nova_compute[227762]: 2026-01-23 10:52:20.365 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:20 np0005593234 nova_compute[227762]: 2026-01-23 10:52:20.382 227766 DEBUG nova.compute.manager [None req-c3975368-e7be-42d6-b0c3-b11fc1031f09 - - - - - -] [instance: cad430d0-9af9-46f1-ad8b-38438fc2030b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:52:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:52:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:21.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:52:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:21.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:23 np0005593234 nova_compute[227762]: 2026-01-23 10:52:23.740 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:23.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:23.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:24 np0005593234 nova_compute[227762]: 2026-01-23 10:52:24.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:25 np0005593234 nova_compute[227762]: 2026-01-23 10:52:25.272 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:25 np0005593234 nova_compute[227762]: 2026-01-23 10:52:25.367 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:25 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:52:25 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:52:25 np0005593234 nova_compute[227762]: 2026-01-23 10:52:25.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:52:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:25.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:52:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:25.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:27 np0005593234 nova_compute[227762]: 2026-01-23 10:52:27.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:52:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:27.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:52:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000065s ======
Jan 23 05:52:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:27.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Jan 23 05:52:28 np0005593234 podman[331158]: 2026-01-23 10:52:28.792192324 +0000 UTC m=+0.090514849 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:52:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 05:52:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.5 total, 600.0 interval#012Cumulative writes: 74K writes, 298K keys, 74K commit groups, 1.0 writes per commit group, ingest: 0.30 GB, 0.05 MB/s#012Cumulative WAL: 74K writes, 27K syncs, 2.69 writes per sync, written: 0.30 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4268 writes, 14K keys, 4268 commit groups, 1.0 writes per commit group, ingest: 13.04 MB, 0.02 MB/s#012Interval WAL: 4268 writes, 1717 syncs, 2.49 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 05:52:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:29.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:29.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:30 np0005593234 nova_compute[227762]: 2026-01-23 10:52:30.278 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:30 np0005593234 nova_compute[227762]: 2026-01-23 10:52:30.369 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:52:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:31.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:52:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:52:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:31.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:52:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:33 np0005593234 nova_compute[227762]: 2026-01-23 10:52:33.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:33.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:33.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:35 np0005593234 nova_compute[227762]: 2026-01-23 10:52:35.279 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:35 np0005593234 nova_compute[227762]: 2026-01-23 10:52:35.370 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:35.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:35.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:37.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:52:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:37.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:52:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:39.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:39.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:40 np0005593234 nova_compute[227762]: 2026-01-23 10:52:40.280 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:40 np0005593234 nova_compute[227762]: 2026-01-23 10:52:40.372 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:41.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:41.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:42.896 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:52:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:42.896 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:52:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:52:42.897 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:52:43 np0005593234 nova_compute[227762]: 2026-01-23 10:52:43.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:52:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:43.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:43.944 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:52:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1501272416' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:52:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:52:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1501272416' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:52:45 np0005593234 nova_compute[227762]: 2026-01-23 10:52:45.282 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:45 np0005593234 nova_compute[227762]: 2026-01-23 10:52:45.373 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:45.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:45.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:52:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:47.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:52:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:47.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:49.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:49.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:49 np0005593234 podman[331293]: 2026-01-23 10:52:49.980111853 +0000 UTC m=+0.083963608 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 05:52:50 np0005593234 nova_compute[227762]: 2026-01-23 10:52:50.283 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:50 np0005593234 nova_compute[227762]: 2026-01-23 10:52:50.375 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:52:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:51.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:52:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:52:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:51.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:52:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:53.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:53.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:55 np0005593234 nova_compute[227762]: 2026-01-23 10:52:55.284 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:55 np0005593234 nova_compute[227762]: 2026-01-23 10:52:55.376 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:52:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:52:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:55.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:52:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:55.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:52:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:57.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:52:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:57.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:52:59 np0005593234 podman[331319]: 2026-01-23 10:52:59.790644163 +0000 UTC m=+0.089991582 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 05:52:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:52:59.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:52:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:52:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:52:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:52:59.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:00 np0005593234 nova_compute[227762]: 2026-01-23 10:53:00.287 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:00 np0005593234 nova_compute[227762]: 2026-01-23 10:53:00.378 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:53:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:01.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:53:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:01.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:03.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:03.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:04 np0005593234 ovn_controller[134547]: 2026-01-23T10:53:04Z|00907|memory_trim|INFO|Detected inactivity (last active 30006 ms ago): trimming memory
Jan 23 05:53:05 np0005593234 nova_compute[227762]: 2026-01-23 10:53:05.288 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:05 np0005593234 nova_compute[227762]: 2026-01-23 10:53:05.381 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:05.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:05.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:07.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:07.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:09.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:53:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:09.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:53:10 np0005593234 nova_compute[227762]: 2026-01-23 10:53:10.291 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:10 np0005593234 nova_compute[227762]: 2026-01-23 10:53:10.382 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:11.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:11.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:13.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:13.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:14 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 23 05:53:14 np0005593234 nova_compute[227762]: 2026-01-23 10:53:14.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:14 np0005593234 nova_compute[227762]: 2026-01-23 10:53:14.770 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:53:14 np0005593234 nova_compute[227762]: 2026-01-23 10:53:14.771 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:53:14 np0005593234 nova_compute[227762]: 2026-01-23 10:53:14.771 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:53:14 np0005593234 nova_compute[227762]: 2026-01-23 10:53:14.772 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:53:14 np0005593234 nova_compute[227762]: 2026-01-23 10:53:14.772 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:53:14 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 23 05:53:14 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 23 05:53:14 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 23 05:53:14 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 23 05:53:14 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 23 05:53:14 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 23 05:53:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:53:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/823008715' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.227 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.292 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.375 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.376 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4153MB free_disk=20.942890167236328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.376 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.376 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.383 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.446 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.447 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.471 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:53:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:53:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4173995569' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.902 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.907 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:53:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:15.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.924 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.926 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:53:15 np0005593234 nova_compute[227762]: 2026-01-23 10:53:15.926 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.549s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:53:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:53:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:15.981 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #190. Immutable memtables: 0.
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.006033) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 121] Flushing memtable with next log file: 190
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165596006133, "job": 121, "event": "flush_started", "num_memtables": 1, "num_entries": 1312, "num_deletes": 251, "total_data_size": 2949166, "memory_usage": 2987568, "flush_reason": "Manual Compaction"}
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 121] Level-0 flush table #191: started
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165596019095, "cf_name": "default", "job": 121, "event": "table_file_creation", "file_number": 191, "file_size": 1947382, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 90600, "largest_seqno": 91906, "table_properties": {"data_size": 1941751, "index_size": 3025, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12135, "raw_average_key_size": 19, "raw_value_size": 1930427, "raw_average_value_size": 3164, "num_data_blocks": 135, "num_entries": 610, "num_filter_entries": 610, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165480, "oldest_key_time": 1769165480, "file_creation_time": 1769165596, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 191, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 121] Flush lasted 13107 microseconds, and 5184 cpu microseconds.
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.019160) [db/flush_job.cc:967] [default] [JOB 121] Level-0 flush table #191: 1947382 bytes OK
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.019180) [db/memtable_list.cc:519] [default] Level-0 commit table #191 started
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.020784) [db/memtable_list.cc:722] [default] Level-0 commit table #191: memtable #1 done
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.020799) EVENT_LOG_v1 {"time_micros": 1769165596020794, "job": 121, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.020817) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 121] Try to delete WAL files size 2942947, prev total WAL file size 2942947, number of live WAL files 2.
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000187.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.021524) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038303332' seq:72057594037927935, type:22 .. '7061786F730038323834' seq:0, type:0; will stop at (end)
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 122] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 121 Base level 0, inputs: [191(1901KB)], [189(12MB)]
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165596021653, "job": 122, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [191], "files_L6": [189], "score": -1, "input_data_size": 14646887, "oldest_snapshot_seqno": -1}
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 122] Generated table #192: 10807 keys, 12731654 bytes, temperature: kUnknown
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165596106990, "cf_name": "default", "job": 122, "event": "table_file_creation", "file_number": 192, "file_size": 12731654, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12663652, "index_size": 39857, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27077, "raw_key_size": 286470, "raw_average_key_size": 26, "raw_value_size": 12476443, "raw_average_value_size": 1154, "num_data_blocks": 1506, "num_entries": 10807, "num_filter_entries": 10807, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769165596, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 192, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.107262) [db/compaction/compaction_job.cc:1663] [default] [JOB 122] Compacted 1@0 + 1@6 files to L6 => 12731654 bytes
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.109091) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 171.7 rd, 149.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.1 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(14.1) write-amplify(6.5) OK, records in: 11322, records dropped: 515 output_compression: NoCompression
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.109110) EVENT_LOG_v1 {"time_micros": 1769165596109101, "job": 122, "event": "compaction_finished", "compaction_time_micros": 85293, "compaction_time_cpu_micros": 29138, "output_level": 6, "num_output_files": 1, "total_output_size": 12731654, "num_input_records": 11322, "num_output_records": 10807, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000191.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165596109549, "job": 122, "event": "table_file_deletion", "file_number": 191}
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000189.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165596111609, "job": 122, "event": "table_file_deletion", "file_number": 189}
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.021437) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.111709) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.111714) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.111717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.111719) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:53:16 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:53:16.111721) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:53:16 np0005593234 nova_compute[227762]: 2026-01-23 10:53:16.927 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:16 np0005593234 nova_compute[227762]: 2026-01-23 10:53:16.927 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:53:16 np0005593234 nova_compute[227762]: 2026-01-23 10:53:16.928 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:53:16 np0005593234 nova_compute[227762]: 2026-01-23 10:53:16.953 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:53:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:53:17.107 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=87, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=86) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:53:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:53:17.108 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:53:17 np0005593234 nova_compute[227762]: 2026-01-23 10:53:17.146 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:53:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:17.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:53:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:53:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:17.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:53:18 np0005593234 nova_compute[227762]: 2026-01-23 10:53:18.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:53:19.110 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '87'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:53:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:19.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:19.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:20 np0005593234 nova_compute[227762]: 2026-01-23 10:53:20.294 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:20 np0005593234 nova_compute[227762]: 2026-01-23 10:53:20.385 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:20 np0005593234 nova_compute[227762]: 2026-01-23 10:53:20.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:20 np0005593234 nova_compute[227762]: 2026-01-23 10:53:20.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:53:20 np0005593234 podman[331451]: 2026-01-23 10:53:20.75131981 +0000 UTC m=+0.046287338 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:53:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:21.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:53:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:21.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:53:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:23.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:23.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:24 np0005593234 nova_compute[227762]: 2026-01-23 10:53:24.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:25 np0005593234 nova_compute[227762]: 2026-01-23 10:53:25.296 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:25 np0005593234 nova_compute[227762]: 2026-01-23 10:53:25.386 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:25 np0005593234 podman[331647]: 2026-01-23 10:53:25.587406131 +0000 UTC m=+0.056632830 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, org.label-schema.build-date=20250507, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Jan 23 05:53:25 np0005593234 podman[331647]: 2026-01-23 10:53:25.691125452 +0000 UTC m=+0.160352191 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0)
Jan 23 05:53:25 np0005593234 nova_compute[227762]: 2026-01-23 10:53:25.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:53:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:25.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:53:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:25.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:26 np0005593234 podman[331804]: 2026-01-23 10:53:26.532425046 +0000 UTC m=+0.347593636 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 05:53:26 np0005593234 podman[331827]: 2026-01-23 10:53:26.627990176 +0000 UTC m=+0.070905499 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 05:53:26 np0005593234 podman[331804]: 2026-01-23 10:53:26.636019804 +0000 UTC m=+0.451188314 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 05:53:26 np0005593234 podman[331872]: 2026-01-23 10:53:26.876894551 +0000 UTC m=+0.068942535 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, name=keepalived, release=1793, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, vcs-type=git, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.expose-services=, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, io.buildah.version=1.28.2, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc.)
Jan 23 05:53:26 np0005593234 podman[331872]: 2026-01-23 10:53:26.918367703 +0000 UTC m=+0.110415657 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, com.redhat.component=keepalived-container, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, vcs-type=git, release=1793, architecture=x86_64, io.buildah.version=1.28.2, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph)
Jan 23 05:53:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:27.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:53:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:27.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:53:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:53:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:53:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:53:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:53:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:53:28 np0005593234 nova_compute[227762]: 2026-01-23 10:53:28.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:29.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:29.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:30 np0005593234 nova_compute[227762]: 2026-01-23 10:53:30.298 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:30 np0005593234 podman[332061]: 2026-01-23 10:53:30.330327219 +0000 UTC m=+0.093964199 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:53:30 np0005593234 nova_compute[227762]: 2026-01-23 10:53:30.387 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:31.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:31.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:33 np0005593234 nova_compute[227762]: 2026-01-23 10:53:33.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:33.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:34.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:35 np0005593234 nova_compute[227762]: 2026-01-23 10:53:35.300 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:35 np0005593234 nova_compute[227762]: 2026-01-23 10:53:35.389 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:35.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:36.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:36 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:53:36 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:53:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e418 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:37.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:53:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:38.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:53:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:39.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:53:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:40.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:53:40 np0005593234 nova_compute[227762]: 2026-01-23 10:53:40.304 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:40 np0005593234 nova_compute[227762]: 2026-01-23 10:53:40.391 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e419 e419: 3 total, 3 up, 3 in
Jan 23 05:53:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:53:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:41.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:53:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:53:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:42.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:53:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e419 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:53:42.897 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:53:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:53:42.898 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:53:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:53:42.898 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:53:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:43.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:53:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:44.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:53:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:53:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1518152127' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:53:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:53:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1518152127' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:53:45 np0005593234 nova_compute[227762]: 2026-01-23 10:53:45.305 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:45 np0005593234 nova_compute[227762]: 2026-01-23 10:53:45.392 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:45 np0005593234 nova_compute[227762]: 2026-01-23 10:53:45.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:53:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:45.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:53:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:46.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:53:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e419 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:53:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:47.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:53:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:53:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:48.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:53:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e420 e420: 3 total, 3 up, 3 in
Jan 23 05:53:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:49.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:50.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:50 np0005593234 nova_compute[227762]: 2026-01-23 10:53:50.308 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:50 np0005593234 nova_compute[227762]: 2026-01-23 10:53:50.393 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:51 np0005593234 podman[332227]: 2026-01-23 10:53:51.781558776 +0000 UTC m=+0.077157689 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 05:53:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:51.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:52.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e420 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:53.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:54.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:55 np0005593234 nova_compute[227762]: 2026-01-23 10:53:55.309 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 e421: 3 total, 3 up, 3 in
Jan 23 05:53:55 np0005593234 nova_compute[227762]: 2026-01-23 10:53:55.395 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:53:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:55.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:56.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:53:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:53:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:57.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:53:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:53:58.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:53:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:53:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:53:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:53:59.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:00.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:00 np0005593234 nova_compute[227762]: 2026-01-23 10:54:00.312 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:00 np0005593234 nova_compute[227762]: 2026-01-23 10:54:00.397 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:00 np0005593234 podman[332254]: 2026-01-23 10:54:00.795603959 +0000 UTC m=+0.092417480 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 05:54:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:01.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:02.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:03.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:04.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:05 np0005593234 nova_compute[227762]: 2026-01-23 10:54:05.314 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:05 np0005593234 nova_compute[227762]: 2026-01-23 10:54:05.398 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:05.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:06.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:07.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:08.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:09.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:10.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:10 np0005593234 nova_compute[227762]: 2026-01-23 10:54:10.314 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:10 np0005593234 nova_compute[227762]: 2026-01-23 10:54:10.399 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:12.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:12.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:54:12.232 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=88, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=87) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:54:12 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:54:12.233 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:54:12 np0005593234 nova_compute[227762]: 2026-01-23 10:54:12.276 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:54:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:14.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:54:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:54:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:14.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:54:15 np0005593234 nova_compute[227762]: 2026-01-23 10:54:15.316 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:15 np0005593234 nova_compute[227762]: 2026-01-23 10:54:15.400 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:15 np0005593234 nova_compute[227762]: 2026-01-23 10:54:15.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:15 np0005593234 nova_compute[227762]: 2026-01-23 10:54:15.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:54:15 np0005593234 nova_compute[227762]: 2026-01-23 10:54:15.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:54:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:16.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:54:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:16.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:54:16 np0005593234 nova_compute[227762]: 2026-01-23 10:54:16.127 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:54:16 np0005593234 nova_compute[227762]: 2026-01-23 10:54:16.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:16 np0005593234 nova_compute[227762]: 2026-01-23 10:54:16.788 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:54:16 np0005593234 nova_compute[227762]: 2026-01-23 10:54:16.788 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:54:16 np0005593234 nova_compute[227762]: 2026-01-23 10:54:16.788 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:54:16 np0005593234 nova_compute[227762]: 2026-01-23 10:54:16.789 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:54:16 np0005593234 nova_compute[227762]: 2026-01-23 10:54:16.789 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:54:17 np0005593234 nova_compute[227762]: 2026-01-23 10:54:17.261 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:54:17 np0005593234 nova_compute[227762]: 2026-01-23 10:54:17.478 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:54:17 np0005593234 nova_compute[227762]: 2026-01-23 10:54:17.479 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4162MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:54:17 np0005593234 nova_compute[227762]: 2026-01-23 10:54:17.479 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:54:17 np0005593234 nova_compute[227762]: 2026-01-23 10:54:17.480 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:54:17 np0005593234 nova_compute[227762]: 2026-01-23 10:54:17.586 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:54:17 np0005593234 nova_compute[227762]: 2026-01-23 10:54:17.586 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:54:17 np0005593234 nova_compute[227762]: 2026-01-23 10:54:17.640 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:54:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:18.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:18.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:54:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1992672950' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:54:18 np0005593234 nova_compute[227762]: 2026-01-23 10:54:18.083 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:54:18 np0005593234 nova_compute[227762]: 2026-01-23 10:54:18.088 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:54:18 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:54:18.236 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '88'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:54:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:20.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:54:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:20.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:54:20 np0005593234 nova_compute[227762]: 2026-01-23 10:54:20.317 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:20 np0005593234 nova_compute[227762]: 2026-01-23 10:54:20.401 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:20 np0005593234 nova_compute[227762]: 2026-01-23 10:54:20.875 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:54:20 np0005593234 nova_compute[227762]: 2026-01-23 10:54:20.877 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:54:20 np0005593234 nova_compute[227762]: 2026-01-23 10:54:20.877 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.398s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:54:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:22.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:22.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:22 np0005593234 podman[332386]: 2026-01-23 10:54:22.762454019 +0000 UTC m=+0.062301222 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 05:54:23 np0005593234 nova_compute[227762]: 2026-01-23 10:54:23.879 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:23 np0005593234 nova_compute[227762]: 2026-01-23 10:54:23.880 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:23 np0005593234 nova_compute[227762]: 2026-01-23 10:54:23.880 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:54:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:24.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:54:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:24.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:54:25 np0005593234 nova_compute[227762]: 2026-01-23 10:54:25.319 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:25 np0005593234 nova_compute[227762]: 2026-01-23 10:54:25.403 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:25 np0005593234 nova_compute[227762]: 2026-01-23 10:54:25.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:26.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:54:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:26.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:54:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:27 np0005593234 nova_compute[227762]: 2026-01-23 10:54:27.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:28.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:54:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:28.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:54:28 np0005593234 nova_compute[227762]: 2026-01-23 10:54:28.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:29 np0005593234 nova_compute[227762]: 2026-01-23 10:54:29.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:30.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:54:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:30.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:54:30 np0005593234 nova_compute[227762]: 2026-01-23 10:54:30.321 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:30 np0005593234 nova_compute[227762]: 2026-01-23 10:54:30.404 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:31 np0005593234 podman[332433]: 2026-01-23 10:54:31.17270432 +0000 UTC m=+0.120817672 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:54:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:32.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:32.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:34.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:34.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:34 np0005593234 nova_compute[227762]: 2026-01-23 10:54:34.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:35 np0005593234 nova_compute[227762]: 2026-01-23 10:54:35.325 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:35 np0005593234 nova_compute[227762]: 2026-01-23 10:54:35.406 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:36.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:54:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:36.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:54:36 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:54:36 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:54:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:54:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:54:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:54:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:38.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:54:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:38.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:54:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:40.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:40.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:40 np0005593234 nova_compute[227762]: 2026-01-23 10:54:40.327 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:40 np0005593234 nova_compute[227762]: 2026-01-23 10:54:40.407 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:54:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:42.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:54:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:42.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:54:42.898 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:54:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:54:42.899 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:54:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:54:42.899 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:54:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:44.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:44.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:45 np0005593234 nova_compute[227762]: 2026-01-23 10:54:45.328 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:45 np0005593234 nova_compute[227762]: 2026-01-23 10:54:45.410 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:45 np0005593234 nova_compute[227762]: 2026-01-23 10:54:45.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:54:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:54:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:46.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:54:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:46.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:54:46 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:54:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4a2c6f0 =====
Jan 23 05:54:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4a2c6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4a2c6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:48.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:48.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:50 np0005593234 nova_compute[227762]: 2026-01-23 10:54:50.329 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:50.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4a2c6f0 =====
Jan 23 05:54:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4a2c6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:54:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4a2c6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:50.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:54:50 np0005593234 nova_compute[227762]: 2026-01-23 10:54:50.412 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:52.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:52.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:53 np0005593234 podman[332726]: 2026-01-23 10:54:53.762661083 +0000 UTC m=+0.055111782 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:54:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:54:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:54.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:54:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:54.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:55 np0005593234 nova_compute[227762]: 2026-01-23 10:54:55.330 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:55 np0005593234 nova_compute[227762]: 2026-01-23 10:54:55.413 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:54:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:54:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:56.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:54:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:56.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:54:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:54:58.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:54:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:54:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:54:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:54:58.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:00 np0005593234 nova_compute[227762]: 2026-01-23 10:55:00.332 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:00.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:00.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:00 np0005593234 nova_compute[227762]: 2026-01-23 10:55:00.414 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:00 np0005593234 nova_compute[227762]: 2026-01-23 10:55:00.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:01 np0005593234 podman[332750]: 2026-01-23 10:55:01.800427218 +0000 UTC m=+0.101334237 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 05:55:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:02.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:02.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:02 np0005593234 nova_compute[227762]: 2026-01-23 10:55:02.763 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:02 np0005593234 nova_compute[227762]: 2026-01-23 10:55:02.763 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 05:55:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:04.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:04.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:05 np0005593234 nova_compute[227762]: 2026-01-23 10:55:05.335 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:05 np0005593234 nova_compute[227762]: 2026-01-23 10:55:05.416 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:55:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:06.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:55:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:06.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:08.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:55:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:08.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:55:10 np0005593234 nova_compute[227762]: 2026-01-23 10:55:10.336 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:10.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:10 np0005593234 nova_compute[227762]: 2026-01-23 10:55:10.418 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:10.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:55:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:12.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:55:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:12.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:14.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:14.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:15 np0005593234 nova_compute[227762]: 2026-01-23 10:55:15.338 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:15 np0005593234 nova_compute[227762]: 2026-01-23 10:55:15.420 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:15 np0005593234 nova_compute[227762]: 2026-01-23 10:55:15.945 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:15 np0005593234 nova_compute[227762]: 2026-01-23 10:55:15.946 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:55:15 np0005593234 nova_compute[227762]: 2026-01-23 10:55:15.946 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:55:16 np0005593234 nova_compute[227762]: 2026-01-23 10:55:16.007 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:55:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:16.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:55:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:16.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:55:16 np0005593234 nova_compute[227762]: 2026-01-23 10:55:16.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:17 np0005593234 nova_compute[227762]: 2026-01-23 10:55:17.251 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:55:17 np0005593234 nova_compute[227762]: 2026-01-23 10:55:17.252 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:55:17 np0005593234 nova_compute[227762]: 2026-01-23 10:55:17.252 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:55:17 np0005593234 nova_compute[227762]: 2026-01-23 10:55:17.253 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:55:17 np0005593234 nova_compute[227762]: 2026-01-23 10:55:17.253 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:55:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:55:17 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/680526313' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:55:17 np0005593234 nova_compute[227762]: 2026-01-23 10:55:17.740 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:55:17 np0005593234 nova_compute[227762]: 2026-01-23 10:55:17.975 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:55:17 np0005593234 nova_compute[227762]: 2026-01-23 10:55:17.976 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4181MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:55:17 np0005593234 nova_compute[227762]: 2026-01-23 10:55:17.977 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:55:17 np0005593234 nova_compute[227762]: 2026-01-23 10:55:17.977 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:55:18 np0005593234 nova_compute[227762]: 2026-01-23 10:55:18.282 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:55:18 np0005593234 nova_compute[227762]: 2026-01-23 10:55:18.283 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:55:18 np0005593234 nova_compute[227762]: 2026-01-23 10:55:18.335 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:55:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:18.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:18.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:55:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3455468115' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:55:18 np0005593234 nova_compute[227762]: 2026-01-23 10:55:18.776 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:55:18 np0005593234 nova_compute[227762]: 2026-01-23 10:55:18.782 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:55:19 np0005593234 nova_compute[227762]: 2026-01-23 10:55:19.060 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:55:19 np0005593234 nova_compute[227762]: 2026-01-23 10:55:19.061 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:55:19 np0005593234 nova_compute[227762]: 2026-01-23 10:55:19.062 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:55:19 np0005593234 nova_compute[227762]: 2026-01-23 10:55:19.062 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:19 np0005593234 nova_compute[227762]: 2026-01-23 10:55:19.062 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 05:55:19 np0005593234 nova_compute[227762]: 2026-01-23 10:55:19.089 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 05:55:20 np0005593234 nova_compute[227762]: 2026-01-23 10:55:20.339 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:20.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:20 np0005593234 nova_compute[227762]: 2026-01-23 10:55:20.422 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:20.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:22 np0005593234 nova_compute[227762]: 2026-01-23 10:55:22.091 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:55:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:22.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:55:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:22.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:22 np0005593234 nova_compute[227762]: 2026-01-23 10:55:22.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:22 np0005593234 nova_compute[227762]: 2026-01-23 10:55:22.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:55:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:24.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:55:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:24.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:55:24 np0005593234 podman[332883]: 2026-01-23 10:55:24.766367875 +0000 UTC m=+0.054536832 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 05:55:25 np0005593234 nova_compute[227762]: 2026-01-23 10:55:25.341 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:25 np0005593234 nova_compute[227762]: 2026-01-23 10:55:25.424 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:26.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:26.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:26 np0005593234 nova_compute[227762]: 2026-01-23 10:55:26.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:28.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:28.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:28 np0005593234 nova_compute[227762]: 2026-01-23 10:55:28.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:30 np0005593234 nova_compute[227762]: 2026-01-23 10:55:30.344 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:30 np0005593234 nova_compute[227762]: 2026-01-23 10:55:30.425 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:30.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:30.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:30 np0005593234 nova_compute[227762]: 2026-01-23 10:55:30.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:55:31.387 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=89, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=88) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:55:31 np0005593234 nova_compute[227762]: 2026-01-23 10:55:31.388 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:55:31.389 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:55:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:32.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:32.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:32 np0005593234 podman[332957]: 2026-01-23 10:55:32.795658478 +0000 UTC m=+0.088066319 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:55:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:55:33.391 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '89'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:55:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:34.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:34.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:34 np0005593234 nova_compute[227762]: 2026-01-23 10:55:34.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:35 np0005593234 nova_compute[227762]: 2026-01-23 10:55:35.346 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:35 np0005593234 nova_compute[227762]: 2026-01-23 10:55:35.428 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:36.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:36.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:38.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:38.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:40 np0005593234 nova_compute[227762]: 2026-01-23 10:55:40.349 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:40 np0005593234 nova_compute[227762]: 2026-01-23 10:55:40.430 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:40.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:40.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:55:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:42.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:55:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:42.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:55:42.900 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:55:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:55:42.900 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:55:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:55:42.900 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:55:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:44.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:55:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:44.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:55:45 np0005593234 nova_compute[227762]: 2026-01-23 10:55:45.351 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:45 np0005593234 nova_compute[227762]: 2026-01-23 10:55:45.431 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:46 np0005593234 nova_compute[227762]: 2026-01-23 10:55:46.021 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:55:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:46.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:55:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:46.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:47 np0005593234 nova_compute[227762]: 2026-01-23 10:55:47.741 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:55:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:48.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:48.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:55:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:55:50 np0005593234 nova_compute[227762]: 2026-01-23 10:55:50.353 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:50 np0005593234 nova_compute[227762]: 2026-01-23 10:55:50.434 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:50.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:50.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:55:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:55:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:55:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:55:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:52.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:55:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:52.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:54.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:54.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:55 np0005593234 nova_compute[227762]: 2026-01-23 10:55:55.354 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:55 np0005593234 nova_compute[227762]: 2026-01-23 10:55:55.436 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:55:55 np0005593234 podman[333178]: 2026-01-23 10:55:55.767284961 +0000 UTC m=+0.051549057 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 23 05:55:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:55:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:56.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:55:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:55:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:56.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:55:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:55:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:55:58.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:55:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:55:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:55:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:55:58.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:00 np0005593234 nova_compute[227762]: 2026-01-23 10:56:00.356 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:00 np0005593234 nova_compute[227762]: 2026-01-23 10:56:00.437 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:00.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:00.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:02.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:02.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:03 np0005593234 podman[333202]: 2026-01-23 10:56:03.799433776 +0000 UTC m=+0.088267257 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 23 05:56:03 np0005593234 nova_compute[227762]: 2026-01-23 10:56:03.961 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:04.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:04.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:05 np0005593234 nova_compute[227762]: 2026-01-23 10:56:05.360 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:05 np0005593234 nova_compute[227762]: 2026-01-23 10:56:05.438 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:06.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:06.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:07 np0005593234 nova_compute[227762]: 2026-01-23 10:56:07.259 227766 DEBUG nova.compute.manager [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 23 05:56:07 np0005593234 nova_compute[227762]: 2026-01-23 10:56:07.536 227766 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:07 np0005593234 nova_compute[227762]: 2026-01-23 10:56:07.537 227766 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:07 np0005593234 nova_compute[227762]: 2026-01-23 10:56:07.895 227766 DEBUG nova.objects.instance [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lazy-loading 'pci_requests' on Instance uuid fcb93bcf-9612-4dc7-9996-238d2739d8cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:56:07 np0005593234 nova_compute[227762]: 2026-01-23 10:56:07.917 227766 DEBUG nova.virt.hardware [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:56:07 np0005593234 nova_compute[227762]: 2026-01-23 10:56:07.918 227766 INFO nova.compute.claims [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:56:07 np0005593234 nova_compute[227762]: 2026-01-23 10:56:07.919 227766 DEBUG nova.objects.instance [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lazy-loading 'resources' on Instance uuid fcb93bcf-9612-4dc7-9996-238d2739d8cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:56:07 np0005593234 nova_compute[227762]: 2026-01-23 10:56:07.931 227766 DEBUG nova.objects.instance [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lazy-loading 'numa_topology' on Instance uuid fcb93bcf-9612-4dc7-9996-238d2739d8cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:56:07 np0005593234 nova_compute[227762]: 2026-01-23 10:56:07.942 227766 DEBUG nova.objects.instance [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lazy-loading 'pci_devices' on Instance uuid fcb93bcf-9612-4dc7-9996-238d2739d8cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:56:07 np0005593234 nova_compute[227762]: 2026-01-23 10:56:07.985 227766 INFO nova.compute.resource_tracker [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating resource usage from migration 55b5cf0b-3cad-4f1d-86af-6b08513cd259#033[00m
Jan 23 05:56:07 np0005593234 nova_compute[227762]: 2026-01-23 10:56:07.986 227766 DEBUG nova.compute.resource_tracker [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Starting to track incoming migration 55b5cf0b-3cad-4f1d-86af-6b08513cd259 with flavor 68d42077-c749-4366-ba3e-07758debb02d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 23 05:56:08 np0005593234 nova_compute[227762]: 2026-01-23 10:56:08.048 227766 DEBUG oslo_concurrency.processutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:56:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:56:08 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3737315146' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:56:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:08.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:08 np0005593234 nova_compute[227762]: 2026-01-23 10:56:08.469 227766 DEBUG oslo_concurrency.processutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.422s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:56:08 np0005593234 nova_compute[227762]: 2026-01-23 10:56:08.476 227766 DEBUG nova.compute.provider_tree [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:56:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 05:56:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:08.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 05:56:08 np0005593234 nova_compute[227762]: 2026-01-23 10:56:08.522 227766 DEBUG nova.scheduler.client.report [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:56:08 np0005593234 nova_compute[227762]: 2026-01-23 10:56:08.574 227766 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 1.037s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:08 np0005593234 nova_compute[227762]: 2026-01-23 10:56:08.574 227766 INFO nova.compute.manager [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Migrating#033[00m
Jan 23 05:56:09 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:56:09 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:56:10 np0005593234 nova_compute[227762]: 2026-01-23 10:56:10.361 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:10 np0005593234 nova_compute[227762]: 2026-01-23 10:56:10.441 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:10.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:10.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:12.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:12.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:13 np0005593234 systemd-logind[794]: New session 69 of user nova.
Jan 23 05:56:13 np0005593234 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 05:56:13 np0005593234 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 05:56:13 np0005593234 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 05:56:13 np0005593234 systemd[1]: Starting User Manager for UID 42436...
Jan 23 05:56:13 np0005593234 systemd[333359]: Queued start job for default target Main User Target.
Jan 23 05:56:13 np0005593234 systemd[333359]: Created slice User Application Slice.
Jan 23 05:56:13 np0005593234 systemd[333359]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 05:56:13 np0005593234 systemd[333359]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 05:56:13 np0005593234 systemd[333359]: Reached target Paths.
Jan 23 05:56:13 np0005593234 systemd[333359]: Reached target Timers.
Jan 23 05:56:13 np0005593234 systemd[333359]: Starting D-Bus User Message Bus Socket...
Jan 23 05:56:13 np0005593234 systemd[333359]: Starting Create User's Volatile Files and Directories...
Jan 23 05:56:13 np0005593234 systemd[333359]: Finished Create User's Volatile Files and Directories.
Jan 23 05:56:13 np0005593234 systemd[333359]: Listening on D-Bus User Message Bus Socket.
Jan 23 05:56:13 np0005593234 systemd[333359]: Reached target Sockets.
Jan 23 05:56:13 np0005593234 systemd[333359]: Reached target Basic System.
Jan 23 05:56:13 np0005593234 systemd[333359]: Reached target Main User Target.
Jan 23 05:56:13 np0005593234 systemd[333359]: Startup finished in 137ms.
Jan 23 05:56:13 np0005593234 systemd[1]: Started User Manager for UID 42436.
Jan 23 05:56:13 np0005593234 systemd[1]: Started Session 69 of User nova.
Jan 23 05:56:13 np0005593234 systemd[1]: session-69.scope: Deactivated successfully.
Jan 23 05:56:13 np0005593234 systemd-logind[794]: Session 69 logged out. Waiting for processes to exit.
Jan 23 05:56:13 np0005593234 systemd-logind[794]: Removed session 69.
Jan 23 05:56:13 np0005593234 systemd-logind[794]: New session 71 of user nova.
Jan 23 05:56:13 np0005593234 systemd[1]: Started Session 71 of User nova.
Jan 23 05:56:14 np0005593234 systemd[1]: session-71.scope: Deactivated successfully.
Jan 23 05:56:14 np0005593234 systemd-logind[794]: Session 71 logged out. Waiting for processes to exit.
Jan 23 05:56:14 np0005593234 systemd-logind[794]: Removed session 71.
Jan 23 05:56:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:14.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:14.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:15 np0005593234 nova_compute[227762]: 2026-01-23 10:56:15.362 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:15 np0005593234 nova_compute[227762]: 2026-01-23 10:56:15.442 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:15 np0005593234 nova_compute[227762]: 2026-01-23 10:56:15.769 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:15 np0005593234 nova_compute[227762]: 2026-01-23 10:56:15.769 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:56:15 np0005593234 nova_compute[227762]: 2026-01-23 10:56:15.769 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:56:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:16.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:16.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:16 np0005593234 nova_compute[227762]: 2026-01-23 10:56:16.849 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:56:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:17 np0005593234 nova_compute[227762]: 2026-01-23 10:56:17.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:17 np0005593234 nova_compute[227762]: 2026-01-23 10:56:17.767 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:17 np0005593234 nova_compute[227762]: 2026-01-23 10:56:17.767 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:17 np0005593234 nova_compute[227762]: 2026-01-23 10:56:17.768 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:17 np0005593234 nova_compute[227762]: 2026-01-23 10:56:17.768 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:56:17 np0005593234 nova_compute[227762]: 2026-01-23 10:56:17.768 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:56:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:56:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4071858077' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:56:18 np0005593234 nova_compute[227762]: 2026-01-23 10:56:18.217 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:56:18 np0005593234 nova_compute[227762]: 2026-01-23 10:56:18.416 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:56:18 np0005593234 nova_compute[227762]: 2026-01-23 10:56:18.417 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4129MB free_disk=20.942726135253906GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:56:18 np0005593234 nova_compute[227762]: 2026-01-23 10:56:18.418 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:18 np0005593234 nova_compute[227762]: 2026-01-23 10:56:18.418 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:18.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:18.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.161 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Migration for instance fcb93bcf-9612-4dc7-9996-238d2739d8cb refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.214 227766 INFO nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating resource usage from migration 55b5cf0b-3cad-4f1d-86af-6b08513cd259#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.214 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Starting to track incoming migration 55b5cf0b-3cad-4f1d-86af-6b08513cd259 with flavor 68d42077-c749-4366-ba3e-07758debb02d _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.305 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance with task_state "resize_migrated" is not being actively managed by this compute host but has allocations referencing this compute node (89873210-bee9-46e9-9f9d-0cd7a156c3a8): {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. Skipping heal of allocations during the task state transition. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1708#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.306 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.306 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.364 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.402 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.446 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.452 227766 DEBUG nova.compute.manager [req-b562786b-c274-44b9-839a-fe0130752f49 req-1a1377a8-3b97-4d1b-b9be-6ce30ac9643d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.453 227766 DEBUG oslo_concurrency.lockutils [req-b562786b-c274-44b9-839a-fe0130752f49 req-1a1377a8-3b97-4d1b-b9be-6ce30ac9643d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.453 227766 DEBUG oslo_concurrency.lockutils [req-b562786b-c274-44b9-839a-fe0130752f49 req-1a1377a8-3b97-4d1b-b9be-6ce30ac9643d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.454 227766 DEBUG oslo_concurrency.lockutils [req-b562786b-c274-44b9-839a-fe0130752f49 req-1a1377a8-3b97-4d1b-b9be-6ce30ac9643d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.454 227766 DEBUG nova.compute.manager [req-b562786b-c274-44b9-839a-fe0130752f49 req-1a1377a8-3b97-4d1b-b9be-6ce30ac9643d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.454 227766 WARNING nova.compute.manager [req-b562786b-c274-44b9-839a-fe0130752f49 req-1a1377a8-3b97-4d1b-b9be-6ce30ac9643d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 05:56:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:20.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:20.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:56:20 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2782976524' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.837 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.842 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.930 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.932 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.932 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.514s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:20 np0005593234 nova_compute[227762]: 2026-01-23 10:56:20.959 227766 INFO nova.network.neutron [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating port 62f573cf-0476-448d-b148-040cec7b1042 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 23 05:56:22 np0005593234 nova_compute[227762]: 2026-01-23 10:56:22.085 227766 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquiring lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:56:22 np0005593234 nova_compute[227762]: 2026-01-23 10:56:22.086 227766 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquired lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:56:22 np0005593234 nova_compute[227762]: 2026-01-23 10:56:22.086 227766 DEBUG nova.network.neutron [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:56:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:22.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:22.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e421 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:23 np0005593234 nova_compute[227762]: 2026-01-23 10:56:23.482 227766 DEBUG nova.compute.manager [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:23 np0005593234 nova_compute[227762]: 2026-01-23 10:56:23.482 227766 DEBUG oslo_concurrency.lockutils [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:23 np0005593234 nova_compute[227762]: 2026-01-23 10:56:23.483 227766 DEBUG oslo_concurrency.lockutils [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:23 np0005593234 nova_compute[227762]: 2026-01-23 10:56:23.483 227766 DEBUG oslo_concurrency.lockutils [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:23 np0005593234 nova_compute[227762]: 2026-01-23 10:56:23.483 227766 DEBUG nova.compute.manager [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:56:23 np0005593234 nova_compute[227762]: 2026-01-23 10:56:23.483 227766 WARNING nova.compute.manager [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 05:56:23 np0005593234 nova_compute[227762]: 2026-01-23 10:56:23.484 227766 DEBUG nova.compute.manager [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-changed-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:23 np0005593234 nova_compute[227762]: 2026-01-23 10:56:23.484 227766 DEBUG nova.compute.manager [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Refreshing instance network info cache due to event network-changed-62f573cf-0476-448d-b148-040cec7b1042. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:56:23 np0005593234 nova_compute[227762]: 2026-01-23 10:56:23.484 227766 DEBUG oslo_concurrency.lockutils [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:56:23 np0005593234 nova_compute[227762]: 2026-01-23 10:56:23.932 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:24 np0005593234 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 05:56:24 np0005593234 systemd[333359]: Activating special unit Exit the Session...
Jan 23 05:56:24 np0005593234 systemd[333359]: Stopped target Main User Target.
Jan 23 05:56:24 np0005593234 systemd[333359]: Stopped target Basic System.
Jan 23 05:56:24 np0005593234 systemd[333359]: Stopped target Paths.
Jan 23 05:56:24 np0005593234 systemd[333359]: Stopped target Sockets.
Jan 23 05:56:24 np0005593234 systemd[333359]: Stopped target Timers.
Jan 23 05:56:24 np0005593234 systemd[333359]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 05:56:24 np0005593234 systemd[333359]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 05:56:24 np0005593234 systemd[333359]: Closed D-Bus User Message Bus Socket.
Jan 23 05:56:24 np0005593234 systemd[333359]: Stopped Create User's Volatile Files and Directories.
Jan 23 05:56:24 np0005593234 systemd[333359]: Removed slice User Application Slice.
Jan 23 05:56:24 np0005593234 systemd[333359]: Reached target Shutdown.
Jan 23 05:56:24 np0005593234 systemd[333359]: Finished Exit the Session.
Jan 23 05:56:24 np0005593234 systemd[333359]: Reached target Exit the Session.
Jan 23 05:56:24 np0005593234 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 05:56:24 np0005593234 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 05:56:24 np0005593234 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 05:56:24 np0005593234 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 05:56:24 np0005593234 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 05:56:24 np0005593234 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 05:56:24 np0005593234 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 05:56:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:56:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:24.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:56:24 np0005593234 nova_compute[227762]: 2026-01-23 10:56:24.541 227766 DEBUG nova.network.neutron [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:56:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:24.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:24 np0005593234 nova_compute[227762]: 2026-01-23 10:56:24.606 227766 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Releasing lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:56:24 np0005593234 nova_compute[227762]: 2026-01-23 10:56:24.609 227766 DEBUG oslo_concurrency.lockutils [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:56:24 np0005593234 nova_compute[227762]: 2026-01-23 10:56:24.609 227766 DEBUG nova.network.neutron [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Refreshing network info cache for port 62f573cf-0476-448d-b148-040cec7b1042 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:56:24 np0005593234 nova_compute[227762]: 2026-01-23 10:56:24.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:24 np0005593234 nova_compute[227762]: 2026-01-23 10:56:24.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:56:24 np0005593234 nova_compute[227762]: 2026-01-23 10:56:24.780 227766 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 23 05:56:24 np0005593234 nova_compute[227762]: 2026-01-23 10:56:24.782 227766 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 05:56:24 np0005593234 nova_compute[227762]: 2026-01-23 10:56:24.782 227766 INFO nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Creating image(s)#033[00m
Jan 23 05:56:24 np0005593234 nova_compute[227762]: 2026-01-23 10:56:24.826 227766 DEBUG nova.storage.rbd_utils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] creating snapshot(nova-resize) on rbd image(fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 05:56:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e422 e422: 3 total, 3 up, 3 in
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.367 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.449 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.874 227766 DEBUG nova.objects.instance [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lazy-loading 'trusted_certs' on Instance uuid fcb93bcf-9612-4dc7-9996-238d2739d8cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.975 227766 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.976 227766 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Ensure instance console log exists: /var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.976 227766 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.976 227766 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.977 227766 DEBUG oslo_concurrency.lockutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.979 227766 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Start _get_guest_xml network_info=[{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--878952243", "vif_mac": "fa:16:3e:f9:18:47"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.983 227766 WARNING nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.988 227766 DEBUG nova.virt.libvirt.host [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.989 227766 DEBUG nova.virt.libvirt.host [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.991 227766 DEBUG nova.virt.libvirt.host [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.991 227766 DEBUG nova.virt.libvirt.host [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.993 227766 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.993 227766 DEBUG nova.virt.hardware [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.993 227766 DEBUG nova.virt.hardware [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.994 227766 DEBUG nova.virt.hardware [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.994 227766 DEBUG nova.virt.hardware [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.994 227766 DEBUG nova.virt.hardware [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.994 227766 DEBUG nova.virt.hardware [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.995 227766 DEBUG nova.virt.hardware [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.995 227766 DEBUG nova.virt.hardware [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.995 227766 DEBUG nova.virt.hardware [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.995 227766 DEBUG nova.virt.hardware [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.995 227766 DEBUG nova.virt.hardware [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:56:25 np0005593234 nova_compute[227762]: 2026-01-23 10:56:25.996 227766 DEBUG nova.objects.instance [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lazy-loading 'vcpu_model' on Instance uuid fcb93bcf-9612-4dc7-9996-238d2739d8cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:56:26 np0005593234 nova_compute[227762]: 2026-01-23 10:56:26.078 227766 DEBUG oslo_concurrency.processutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:56:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:26.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:56:26 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/445311975' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:56:26 np0005593234 nova_compute[227762]: 2026-01-23 10:56:26.495 227766 DEBUG nova.network.neutron [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updated VIF entry in instance network info cache for port 62f573cf-0476-448d-b148-040cec7b1042. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:56:26 np0005593234 nova_compute[227762]: 2026-01-23 10:56:26.497 227766 DEBUG nova.network.neutron [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:56:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:26.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:26 np0005593234 nova_compute[227762]: 2026-01-23 10:56:26.566 227766 DEBUG oslo_concurrency.lockutils [req-468a1008-689a-4602-9881-9fa55d3694aa req-3181fbaf-1b45-4779-9fe9-1aea0f114ee7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:56:26 np0005593234 nova_compute[227762]: 2026-01-23 10:56:26.699 227766 DEBUG oslo_concurrency.processutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.621s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:56:26 np0005593234 nova_compute[227762]: 2026-01-23 10:56:26.740 227766 DEBUG oslo_concurrency.processutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:56:26 np0005593234 podman[333527]: 2026-01-23 10:56:26.777651674 +0000 UTC m=+0.062659361 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:56:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:56:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2030747439' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.190 227766 DEBUG oslo_concurrency.processutils [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.192 227766 DEBUG nova.virt.libvirt.vif [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:55:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-660546175',display_name='tempest-TestNetworkAdvancedServerOps-server-660546175',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-660546175',id=211,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHhNBRegjioPAmva8qyqPMyd4dn+3hiBNwe4BWzp1VDgZFgQ+g4FdHcnXo+cwpLDWgKnm4yCRqf2eKNNhFM/EbeI6EnjlmNiu32pnRKGBZGgO4FKlvjQptQtJfMEpsL1DQ==',key_name='tempest-TestNetworkAdvancedServerOps-1958820888',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:55:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-5dvavyin',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:56:20Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=fcb93bcf-9612-4dc7-9996-238d2739d8cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--878952243", "vif_mac": "fa:16:3e:f9:18:47"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.192 227766 DEBUG nova.network.os_vif_util [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Converting VIF {"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--878952243", "vif_mac": "fa:16:3e:f9:18:47"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.193 227766 DEBUG nova.network.os_vif_util [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.197 227766 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  <uuid>fcb93bcf-9612-4dc7-9996-238d2739d8cb</uuid>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  <name>instance-000000d3</name>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-660546175</nova:name>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:56:25</nova:creationTime>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <nova:user uuid="420c366dc5dc45a48da4e0b18c93043f">tempest-TestNetworkAdvancedServerOps-1886747874-project-member</nova:user>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <nova:project uuid="c06f98b51aeb48de91d116fda54a161f">tempest-TestNetworkAdvancedServerOps-1886747874</nova:project>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <nova:port uuid="62f573cf-0476-448d-b148-040cec7b1042">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <entry name="serial">fcb93bcf-9612-4dc7-9996-238d2739d8cb</entry>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <entry name="uuid">fcb93bcf-9612-4dc7-9996-238d2739d8cb</entry>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/fcb93bcf-9612-4dc7-9996-238d2739d8cb_disk.config">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:f9:18:47"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <target dev="tap62f573cf-04"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/fcb93bcf-9612-4dc7-9996-238d2739d8cb/console.log" append="off"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:56:27 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:56:27 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:56:27 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:56:27 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.198 227766 DEBUG nova.virt.libvirt.vif [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:55:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-660546175',display_name='tempest-TestNetworkAdvancedServerOps-server-660546175',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-660546175',id=211,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHhNBRegjioPAmva8qyqPMyd4dn+3hiBNwe4BWzp1VDgZFgQ+g4FdHcnXo+cwpLDWgKnm4yCRqf2eKNNhFM/EbeI6EnjlmNiu32pnRKGBZGgO4FKlvjQptQtJfMEpsL1DQ==',key_name='tempest-TestNetworkAdvancedServerOps-1958820888',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:55:33Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-5dvavyin',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:56:20Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=fcb93bcf-9612-4dc7-9996-238d2739d8cb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--878952243", "vif_mac": "fa:16:3e:f9:18:47"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.199 227766 DEBUG nova.network.os_vif_util [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Converting VIF {"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--878952243", "vif_mac": "fa:16:3e:f9:18:47"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.200 227766 DEBUG nova.network.os_vif_util [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.200 227766 DEBUG os_vif [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.201 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.202 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.202 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.205 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.206 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap62f573cf-04, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.206 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap62f573cf-04, col_values=(('external_ids', {'iface-id': '62f573cf-0476-448d-b148-040cec7b1042', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f9:18:47', 'vm-uuid': 'fcb93bcf-9612-4dc7-9996-238d2739d8cb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.207 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 NetworkManager[48942]: <info>  [1769165787.2094] manager: (tap62f573cf-04): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/428)
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.210 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.214 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.215 227766 INFO os_vif [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04')#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.372 227766 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.372 227766 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.373 227766 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] No VIF found with MAC fa:16:3e:f9:18:47, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.373 227766 INFO nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Using config drive#033[00m
Jan 23 05:56:27 np0005593234 kernel: tap62f573cf-04: entered promiscuous mode
Jan 23 05:56:27 np0005593234 NetworkManager[48942]: <info>  [1769165787.4576] manager: (tap62f573cf-04): new Tun device (/org/freedesktop/NetworkManager/Devices/429)
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.457 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 ovn_controller[134547]: 2026-01-23T10:56:27Z|00908|binding|INFO|Claiming lport 62f573cf-0476-448d-b148-040cec7b1042 for this chassis.
Jan 23 05:56:27 np0005593234 ovn_controller[134547]: 2026-01-23T10:56:27Z|00909|binding|INFO|62f573cf-0476-448d-b148-040cec7b1042: Claiming fa:16:3e:f9:18:47 10.100.0.14
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.462 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.467 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.470 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 NetworkManager[48942]: <info>  [1769165787.4724] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/430)
Jan 23 05:56:27 np0005593234 NetworkManager[48942]: <info>  [1769165787.4730] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/431)
Jan 23 05:56:27 np0005593234 systemd-udevd[333618]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:56:27 np0005593234 systemd-machined[195626]: New machine qemu-102-instance-000000d3.
Jan 23 05:56:27 np0005593234 NetworkManager[48942]: <info>  [1769165787.5008] device (tap62f573cf-04): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:56:27 np0005593234 NetworkManager[48942]: <info>  [1769165787.5014] device (tap62f573cf-04): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.510 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:18:47 10.100.0.14'], port_security=['fa:16:3e:f9:18:47 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'fcb93bcf-9612-4dc7-9996-238d2739d8cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '5afed19d-3ff6-4459-b8a0-c5fc6a279e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.249'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=866a455a-94b4-4bbd-a367-b902a726ce2f, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=62f573cf-0476-448d-b148-040cec7b1042) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.511 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 62f573cf-0476-448d-b148-040cec7b1042 in datapath 6c737d6f-3e00-482b-aed5-4f8eabd246f2 bound to our chassis#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.512 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6c737d6f-3e00-482b-aed5-4f8eabd246f2#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.522 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[30956f6d-8830-4947-b31f-07ede4993719]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.523 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6c737d6f-31 in ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.525 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6c737d6f-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.525 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[adb74151-512e-45cc-be41-d011814f9fae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.526 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[33eb54a9-3118-430b-b5e0-ec45106796cf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 systemd[1]: Started Virtual Machine qemu-102-instance-000000d3.
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.539 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[95098e5b-f104-405d-ac58-091d7eb22b99]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.545 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.553 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.560 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 ovn_controller[134547]: 2026-01-23T10:56:27Z|00910|binding|INFO|Setting lport 62f573cf-0476-448d-b148-040cec7b1042 up in Southbound
Jan 23 05:56:27 np0005593234 ovn_controller[134547]: 2026-01-23T10:56:27Z|00911|binding|INFO|Setting lport 62f573cf-0476-448d-b148-040cec7b1042 ovn-installed in OVS
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.563 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.566 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[164e96a3-2550-418e-a30a-175bc5df79c0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.593 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[4f8579aa-bec7-441a-9856-4faedb100051]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.599 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b8eb7d9c-d6b5-4d0d-9d41-b0af0d4138ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 NetworkManager[48942]: <info>  [1769165787.6005] manager: (tap6c737d6f-30): new Veth device (/org/freedesktop/NetworkManager/Devices/432)
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.625 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a6c53c25-4dc5-4368-84a0-409641fbf5e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.628 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[0227c11c-d79f-412b-b031-43bd54a4dbd8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 NetworkManager[48942]: <info>  [1769165787.6499] device (tap6c737d6f-30): carrier: link connected
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.653 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[64329b3e-a938-435e-90dd-ffdc9da7c790]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.673 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5e974834-b18a-4dc5-acda-08514ea03df1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c737d6f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:05:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 279], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 968261, 'reachable_time': 35139, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 333652, 'error': None, 'target': 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.689 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f703396d-ccc3-4e1f-b33f-d76cfba187d1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe22:508'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 968261, 'tstamp': 968261}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 333653, 'error': None, 'target': 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.706 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca2f197-3da4-437c-a8cd-f39ada65f55b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6c737d6f-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:05:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 279], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 968261, 'reachable_time': 35139, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 333654, 'error': None, 'target': 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e422 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.744 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b1f8f395-685c-4c0d-b58b-15a2765013c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.805 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[56824379-76ca-4871-a50a-29901098f5ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.807 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c737d6f-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.808 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.808 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6c737d6f-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.810 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 kernel: tap6c737d6f-30: entered promiscuous mode
Jan 23 05:56:27 np0005593234 NetworkManager[48942]: <info>  [1769165787.8111] manager: (tap6c737d6f-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/433)
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.813 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.814 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6c737d6f-30, col_values=(('external_ids', {'iface-id': '8bc5480b-7bdc-475b-b309-693291ebc39a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.815 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 ovn_controller[134547]: 2026-01-23T10:56:27Z|00912|binding|INFO|Releasing lport 8bc5480b-7bdc-475b-b309-693291ebc39a from this chassis (sb_readonly=0)
Jan 23 05:56:27 np0005593234 nova_compute[227762]: 2026-01-23 10:56:27.828 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.829 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6c737d6f-3e00-482b-aed5-4f8eabd246f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6c737d6f-3e00-482b-aed5-4f8eabd246f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.830 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[053597b0-16d8-496b-a1df-65ff34d873a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.831 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-6c737d6f-3e00-482b-aed5-4f8eabd246f2
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/6c737d6f-3e00-482b-aed5-4f8eabd246f2.pid.haproxy
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 6c737d6f-3e00-482b-aed5-4f8eabd246f2
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:56:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:27.831 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'env', 'PROCESS_TAG=haproxy-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6c737d6f-3e00-482b-aed5-4f8eabd246f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.193 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165788.1927798, fcb93bcf-9612-4dc7-9996-238d2739d8cb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.193 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.196 227766 DEBUG nova.compute.manager [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.199 227766 INFO nova.virt.libvirt.driver [-] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance running successfully.#033[00m
Jan 23 05:56:28 np0005593234 virtqemud[227483]: argument unsupported: QEMU guest agent is not configured
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.201 227766 DEBUG nova.virt.libvirt.guest [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.202 227766 DEBUG nova.virt.libvirt.driver [None req-356b7fc8-1100-4ddc-bd38-dc267737c0ca 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 23 05:56:28 np0005593234 podman[333727]: 2026-01-23 10:56:28.203325249 +0000 UTC m=+0.052837125 container create 6970ed23882568890900edbade716d27089187617a32fa3966cb5a9d7a92b3a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 05:56:28 np0005593234 systemd[1]: Started libpod-conmon-6970ed23882568890900edbade716d27089187617a32fa3966cb5a9d7a92b3a6.scope.
Jan 23 05:56:28 np0005593234 podman[333727]: 2026-01-23 10:56:28.172488084 +0000 UTC m=+0.021999980 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:56:28 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:56:28 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97b58cb05d4feff3d1ea68c2cbc09bddf019a24a957d04d3e1524411e280275d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:56:28 np0005593234 podman[333727]: 2026-01-23 10:56:28.288852354 +0000 UTC m=+0.138364250 container init 6970ed23882568890900edbade716d27089187617a32fa3966cb5a9d7a92b3a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:56:28 np0005593234 podman[333727]: 2026-01-23 10:56:28.293953833 +0000 UTC m=+0.143465709 container start 6970ed23882568890900edbade716d27089187617a32fa3966cb5a9d7a92b3a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:56:28 np0005593234 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[333744]: [NOTICE]   (333748) : New worker (333750) forked
Jan 23 05:56:28 np0005593234 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[333744]: [NOTICE]   (333748) : Loading success.
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.352 227766 DEBUG nova.compute.manager [req-47a371a0-9083-4a43-b12b-5090fc3c1d1c req-ff013f80-53bb-48b1-99ad-84125c567bcb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.352 227766 DEBUG oslo_concurrency.lockutils [req-47a371a0-9083-4a43-b12b-5090fc3c1d1c req-ff013f80-53bb-48b1-99ad-84125c567bcb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.352 227766 DEBUG oslo_concurrency.lockutils [req-47a371a0-9083-4a43-b12b-5090fc3c1d1c req-ff013f80-53bb-48b1-99ad-84125c567bcb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.353 227766 DEBUG oslo_concurrency.lockutils [req-47a371a0-9083-4a43-b12b-5090fc3c1d1c req-ff013f80-53bb-48b1-99ad-84125c567bcb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.353 227766 DEBUG nova.compute.manager [req-47a371a0-9083-4a43-b12b-5090fc3c1d1c req-ff013f80-53bb-48b1-99ad-84125c567bcb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.353 227766 WARNING nova.compute.manager [req-47a371a0-9083-4a43-b12b-5090fc3c1d1c req-ff013f80-53bb-48b1-99ad-84125c567bcb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state active and task_state resize_finish.#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.365 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.371 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.421 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.422 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165788.1954203, fcb93bcf-9612-4dc7-9996-238d2739d8cb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.422 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] VM Started (Lifecycle Event)#033[00m
Jan 23 05:56:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:28.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:28.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.646 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.648 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:56:28 np0005593234 nova_compute[227762]: 2026-01-23 10:56:28.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:29 np0005593234 nova_compute[227762]: 2026-01-23 10:56:29.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:30 np0005593234 nova_compute[227762]: 2026-01-23 10:56:30.401 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:30.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:30.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:31 np0005593234 nova_compute[227762]: 2026-01-23 10:56:31.486 227766 DEBUG nova.compute.manager [req-b40c9a11-5646-48f3-882e-c95127728d08 req-db66dad4-f751-40d5-a103-4bdfa7d8b005 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:31 np0005593234 nova_compute[227762]: 2026-01-23 10:56:31.487 227766 DEBUG oslo_concurrency.lockutils [req-b40c9a11-5646-48f3-882e-c95127728d08 req-db66dad4-f751-40d5-a103-4bdfa7d8b005 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:31 np0005593234 nova_compute[227762]: 2026-01-23 10:56:31.487 227766 DEBUG oslo_concurrency.lockutils [req-b40c9a11-5646-48f3-882e-c95127728d08 req-db66dad4-f751-40d5-a103-4bdfa7d8b005 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:31 np0005593234 nova_compute[227762]: 2026-01-23 10:56:31.488 227766 DEBUG oslo_concurrency.lockutils [req-b40c9a11-5646-48f3-882e-c95127728d08 req-db66dad4-f751-40d5-a103-4bdfa7d8b005 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:31 np0005593234 nova_compute[227762]: 2026-01-23 10:56:31.488 227766 DEBUG nova.compute.manager [req-b40c9a11-5646-48f3-882e-c95127728d08 req-db66dad4-f751-40d5-a103-4bdfa7d8b005 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:56:31 np0005593234 nova_compute[227762]: 2026-01-23 10:56:31.489 227766 WARNING nova.compute.manager [req-b40c9a11-5646-48f3-882e-c95127728d08 req-db66dad4-f751-40d5-a103-4bdfa7d8b005 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state resized and task_state None.#033[00m
Jan 23 05:56:31 np0005593234 nova_compute[227762]: 2026-01-23 10:56:31.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:31 np0005593234 nova_compute[227762]: 2026-01-23 10:56:31.787 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:32 np0005593234 nova_compute[227762]: 2026-01-23 10:56:32.209 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:32.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:32.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e422 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:33 np0005593234 nova_compute[227762]: 2026-01-23 10:56:33.329 227766 DEBUG nova.network.neutron [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Port 62f573cf-0476-448d-b148-040cec7b1042 binding to destination host compute-2.ctlplane.example.com is already ACTIVE migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3171#033[00m
Jan 23 05:56:33 np0005593234 nova_compute[227762]: 2026-01-23 10:56:33.329 227766 DEBUG oslo_concurrency.lockutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:56:33 np0005593234 nova_compute[227762]: 2026-01-23 10:56:33.330 227766 DEBUG oslo_concurrency.lockutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:56:33 np0005593234 nova_compute[227762]: 2026-01-23 10:56:33.330 227766 DEBUG nova.network.neutron [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:56:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:56:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:34.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:56:34 np0005593234 nova_compute[227762]: 2026-01-23 10:56:34.523 227766 DEBUG nova.network.neutron [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:56:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:56:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:34.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:56:34 np0005593234 nova_compute[227762]: 2026-01-23 10:56:34.774 227766 DEBUG oslo_concurrency.lockutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:56:34 np0005593234 podman[333812]: 2026-01-23 10:56:34.817547509 +0000 UTC m=+0.101415443 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 05:56:35 np0005593234 kernel: tap62f573cf-04 (unregistering): left promiscuous mode
Jan 23 05:56:35 np0005593234 NetworkManager[48942]: <info>  [1769165795.3211] device (tap62f573cf-04): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:56:35 np0005593234 ovn_controller[134547]: 2026-01-23T10:56:35Z|00913|binding|INFO|Releasing lport 62f573cf-0476-448d-b148-040cec7b1042 from this chassis (sb_readonly=0)
Jan 23 05:56:35 np0005593234 ovn_controller[134547]: 2026-01-23T10:56:35Z|00914|binding|INFO|Setting lport 62f573cf-0476-448d-b148-040cec7b1042 down in Southbound
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.333 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:35 np0005593234 ovn_controller[134547]: 2026-01-23T10:56:35Z|00915|binding|INFO|Removing iface tap62f573cf-04 ovn-installed in OVS
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.336 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.350 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:35 np0005593234 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d000000d3.scope: Deactivated successfully.
Jan 23 05:56:35 np0005593234 systemd[1]: machine-qemu\x2d102\x2dinstance\x2d000000d3.scope: Consumed 7.378s CPU time.
Jan 23 05:56:35 np0005593234 systemd-machined[195626]: Machine qemu-102-instance-000000d3 terminated.
Jan 23 05:56:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:35.402 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f9:18:47 10.100.0.14'], port_security=['fa:16:3e:f9:18:47 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'fcb93bcf-9612-4dc7-9996-238d2739d8cb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '5afed19d-3ff6-4459-b8a0-c5fc6a279e66', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.249', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=866a455a-94b4-4bbd-a367-b902a726ce2f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=62f573cf-0476-448d-b148-040cec7b1042) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.404 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:35.405 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 62f573cf-0476-448d-b148-040cec7b1042 in datapath 6c737d6f-3e00-482b-aed5-4f8eabd246f2 unbound from our chassis#033[00m
Jan 23 05:56:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:35.406 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6c737d6f-3e00-482b-aed5-4f8eabd246f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:56:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:35.408 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b7ed5cc0-6e32-4d95-832d-6580e2bdb8a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:35.408 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 namespace which is not needed anymore#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.436 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.441 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.451 227766 INFO nova.virt.libvirt.driver [-] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Instance destroyed successfully.#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.452 227766 DEBUG nova.objects.instance [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'resources' on Instance uuid fcb93bcf-9612-4dc7-9996-238d2739d8cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:56:35 np0005593234 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[333744]: [NOTICE]   (333748) : haproxy version is 2.8.14-c23fe91
Jan 23 05:56:35 np0005593234 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[333744]: [NOTICE]   (333748) : path to executable is /usr/sbin/haproxy
Jan 23 05:56:35 np0005593234 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[333744]: [WARNING]  (333748) : Exiting Master process...
Jan 23 05:56:35 np0005593234 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[333744]: [ALERT]    (333748) : Current worker (333750) exited with code 143 (Terminated)
Jan 23 05:56:35 np0005593234 neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2[333744]: [WARNING]  (333748) : All workers exited. Exiting... (0)
Jan 23 05:56:35 np0005593234 systemd[1]: libpod-6970ed23882568890900edbade716d27089187617a32fa3966cb5a9d7a92b3a6.scope: Deactivated successfully.
Jan 23 05:56:35 np0005593234 podman[333871]: 2026-01-23 10:56:35.547226703 +0000 UTC m=+0.039644401 container died 6970ed23882568890900edbade716d27089187617a32fa3966cb5a9d7a92b3a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 05:56:35 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6970ed23882568890900edbade716d27089187617a32fa3966cb5a9d7a92b3a6-userdata-shm.mount: Deactivated successfully.
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.584 227766 DEBUG nova.virt.libvirt.vif [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:55:07Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-660546175',display_name='tempest-TestNetworkAdvancedServerOps-server-660546175',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-660546175',id=211,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHhNBRegjioPAmva8qyqPMyd4dn+3hiBNwe4BWzp1VDgZFgQ+g4FdHcnXo+cwpLDWgKnm4yCRqf2eKNNhFM/EbeI6EnjlmNiu32pnRKGBZGgO4FKlvjQptQtJfMEpsL1DQ==',key_name='tempest-TestNetworkAdvancedServerOps-1958820888',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:56:28Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-5dvavyin',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:56:28Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=fcb93bcf-9612-4dc7-9996-238d2739d8cb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.584 227766 DEBUG nova.network.os_vif_util [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.585 227766 DEBUG nova.network.os_vif_util [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.585 227766 DEBUG os_vif [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.588 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.589 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap62f573cf-04, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.590 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.592 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:35 np0005593234 systemd[1]: var-lib-containers-storage-overlay-97b58cb05d4feff3d1ea68c2cbc09bddf019a24a957d04d3e1524411e280275d-merged.mount: Deactivated successfully.
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.596 227766 INFO os_vif [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f9:18:47,bridge_name='br-int',has_traffic_filtering=True,id=62f573cf-0476-448d-b148-040cec7b1042,network=Network(6c737d6f-3e00-482b-aed5-4f8eabd246f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap62f573cf-04')#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.601 227766 DEBUG oslo_concurrency.lockutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.601 227766 DEBUG oslo_concurrency.lockutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:35 np0005593234 podman[333871]: 2026-01-23 10:56:35.608622953 +0000 UTC m=+0.101040661 container cleanup 6970ed23882568890900edbade716d27089187617a32fa3966cb5a9d7a92b3a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 05:56:35 np0005593234 systemd[1]: libpod-conmon-6970ed23882568890900edbade716d27089187617a32fa3966cb5a9d7a92b3a6.scope: Deactivated successfully.
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.619 227766 DEBUG nova.objects.instance [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'migration_context' on Instance uuid fcb93bcf-9612-4dc7-9996-238d2739d8cb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:56:35 np0005593234 podman[333901]: 2026-01-23 10:56:35.671005114 +0000 UTC m=+0.041603292 container remove 6970ed23882568890900edbade716d27089187617a32fa3966cb5a9d7a92b3a6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:56:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:35.676 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5be40672-b6ce-4675-855c-24b0a4832af9]: (4, ('Fri Jan 23 10:56:35 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 (6970ed23882568890900edbade716d27089187617a32fa3966cb5a9d7a92b3a6)\n6970ed23882568890900edbade716d27089187617a32fa3966cb5a9d7a92b3a6\nFri Jan 23 10:56:35 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 (6970ed23882568890900edbade716d27089187617a32fa3966cb5a9d7a92b3a6)\n6970ed23882568890900edbade716d27089187617a32fa3966cb5a9d7a92b3a6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:35.678 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[951ee0ce-32bb-4c96-9bde-e96553b5a4b7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:35.679 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6c737d6f-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.679 227766 DEBUG oslo_concurrency.processutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:56:35 np0005593234 kernel: tap6c737d6f-30: left promiscuous mode
Jan 23 05:56:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:35.695 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f4c49496-b185-4761-b277-f06daa8bae4b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.704 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:35.710 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b3e0cc54-271e-4790-93d7-14e6f7a09dbb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:35.711 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[762ffefb-3ea0-40e5-b41c-1ce7c96b26fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:35.724 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[adc7c0d9-efac-48ec-88f5-a72b3c6b5b99]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 968254, 'reachable_time': 33901, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 333917, 'error': None, 'target': 'ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:35 np0005593234 systemd[1]: run-netns-ovnmeta\x2d6c737d6f\x2d3e00\x2d482b\x2daed5\x2d4f8eabd246f2.mount: Deactivated successfully.
Jan 23 05:56:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:35.729 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6c737d6f-3e00-482b-aed5-4f8eabd246f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:56:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:35.730 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[6977dcab-5738-4f5f-87cd-da496212722e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:56:35 np0005593234 nova_compute[227762]: 2026-01-23 10:56:35.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:56:36 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/843716931' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:56:36 np0005593234 nova_compute[227762]: 2026-01-23 10:56:36.105 227766 DEBUG oslo_concurrency.processutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:56:36 np0005593234 nova_compute[227762]: 2026-01-23 10:56:36.111 227766 DEBUG nova.compute.provider_tree [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:56:36 np0005593234 nova_compute[227762]: 2026-01-23 10:56:36.314 227766 DEBUG nova.scheduler.client.report [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:56:36 np0005593234 nova_compute[227762]: 2026-01-23 10:56:36.421 227766 DEBUG oslo_concurrency.lockutils [None req-36fbf002-d478-4664-8cb8-385dc6f4964c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_dest" :: held 0.820s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:36.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:36.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:36 np0005593234 nova_compute[227762]: 2026-01-23 10:56:36.586 227766 DEBUG nova.compute.manager [req-904073c5-9676-4afb-b060-0bc21098fa76 req-c7350599-9461-45a9-be62-53f1424c4344 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:36 np0005593234 nova_compute[227762]: 2026-01-23 10:56:36.587 227766 DEBUG oslo_concurrency.lockutils [req-904073c5-9676-4afb-b060-0bc21098fa76 req-c7350599-9461-45a9-be62-53f1424c4344 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:36 np0005593234 nova_compute[227762]: 2026-01-23 10:56:36.587 227766 DEBUG oslo_concurrency.lockutils [req-904073c5-9676-4afb-b060-0bc21098fa76 req-c7350599-9461-45a9-be62-53f1424c4344 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:36 np0005593234 nova_compute[227762]: 2026-01-23 10:56:36.587 227766 DEBUG oslo_concurrency.lockutils [req-904073c5-9676-4afb-b060-0bc21098fa76 req-c7350599-9461-45a9-be62-53f1424c4344 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:36 np0005593234 nova_compute[227762]: 2026-01-23 10:56:36.587 227766 DEBUG nova.compute.manager [req-904073c5-9676-4afb-b060-0bc21098fa76 req-c7350599-9461-45a9-be62-53f1424c4344 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:56:36 np0005593234 nova_compute[227762]: 2026-01-23 10:56:36.587 227766 WARNING nova.compute.manager [req-904073c5-9676-4afb-b060-0bc21098fa76 req-c7350599-9461-45a9-be62-53f1424c4344 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-unplugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 23 05:56:37 np0005593234 nova_compute[227762]: 2026-01-23 10:56:37.232 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:37.234 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=90, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=89) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:56:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:37.236 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:56:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e422 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:38.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:38.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:38 np0005593234 nova_compute[227762]: 2026-01-23 10:56:38.714 227766 DEBUG nova.compute.manager [req-74e7e2ad-7ec7-413a-80dd-6d97464501f0 req-ed12580b-107d-459a-9508-29cf3df126f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:38 np0005593234 nova_compute[227762]: 2026-01-23 10:56:38.714 227766 DEBUG oslo_concurrency.lockutils [req-74e7e2ad-7ec7-413a-80dd-6d97464501f0 req-ed12580b-107d-459a-9508-29cf3df126f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:38 np0005593234 nova_compute[227762]: 2026-01-23 10:56:38.714 227766 DEBUG oslo_concurrency.lockutils [req-74e7e2ad-7ec7-413a-80dd-6d97464501f0 req-ed12580b-107d-459a-9508-29cf3df126f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:38 np0005593234 nova_compute[227762]: 2026-01-23 10:56:38.714 227766 DEBUG oslo_concurrency.lockutils [req-74e7e2ad-7ec7-413a-80dd-6d97464501f0 req-ed12580b-107d-459a-9508-29cf3df126f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:38 np0005593234 nova_compute[227762]: 2026-01-23 10:56:38.714 227766 DEBUG nova.compute.manager [req-74e7e2ad-7ec7-413a-80dd-6d97464501f0 req-ed12580b-107d-459a-9508-29cf3df126f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:56:38 np0005593234 nova_compute[227762]: 2026-01-23 10:56:38.714 227766 WARNING nova.compute.manager [req-74e7e2ad-7ec7-413a-80dd-6d97464501f0 req-ed12580b-107d-459a-9508-29cf3df126f9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 23 05:56:40 np0005593234 nova_compute[227762]: 2026-01-23 10:56:40.406 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:40.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:40.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:40 np0005593234 nova_compute[227762]: 2026-01-23 10:56:40.591 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:41 np0005593234 nova_compute[227762]: 2026-01-23 10:56:41.842 227766 DEBUG nova.compute.manager [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-changed-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:41 np0005593234 nova_compute[227762]: 2026-01-23 10:56:41.843 227766 DEBUG nova.compute.manager [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Refreshing instance network info cache due to event network-changed-62f573cf-0476-448d-b148-040cec7b1042. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:56:41 np0005593234 nova_compute[227762]: 2026-01-23 10:56:41.843 227766 DEBUG oslo_concurrency.lockutils [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:56:41 np0005593234 nova_compute[227762]: 2026-01-23 10:56:41.843 227766 DEBUG oslo_concurrency.lockutils [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:56:41 np0005593234 nova_compute[227762]: 2026-01-23 10:56:41.843 227766 DEBUG nova.network.neutron [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Refreshing network info cache for port 62f573cf-0476-448d-b148-040cec7b1042 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:56:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:42.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:42.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e422 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:42.901 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:42.901 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:42.901 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:43 np0005593234 nova_compute[227762]: 2026-01-23 10:56:43.219 227766 DEBUG nova.network.neutron [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updated VIF entry in instance network info cache for port 62f573cf-0476-448d-b148-040cec7b1042. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:56:43 np0005593234 nova_compute[227762]: 2026-01-23 10:56:43.219 227766 DEBUG nova.network.neutron [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Updating instance_info_cache with network_info: [{"id": "62f573cf-0476-448d-b148-040cec7b1042", "address": "fa:16:3e:f9:18:47", "network": {"id": "6c737d6f-3e00-482b-aed5-4f8eabd246f2", "bridge": "br-int", "label": "tempest-network-smoke--878952243", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap62f573cf-04", "ovs_interfaceid": "62f573cf-0476-448d-b148-040cec7b1042", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:56:43 np0005593234 nova_compute[227762]: 2026-01-23 10:56:43.238 227766 DEBUG oslo_concurrency.lockutils [req-c9013f0d-770b-4a22-ac2f-d88fb8e321eb req-20431192-24ce-498a-bffc-7837bd85fd6b 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-fcb93bcf-9612-4dc7-9996-238d2739d8cb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:56:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:44.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:56:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2308389627' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:56:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:56:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2308389627' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:56:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:44.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:45 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:56:45.238 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '90'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:56:45 np0005593234 nova_compute[227762]: 2026-01-23 10:56:45.407 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e423 e423: 3 total, 3 up, 3 in
Jan 23 05:56:45 np0005593234 nova_compute[227762]: 2026-01-23 10:56:45.593 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:46.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:46.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e423 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:47 np0005593234 nova_compute[227762]: 2026-01-23 10:56:47.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:56:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:48.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:48.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:50 np0005593234 nova_compute[227762]: 2026-01-23 10:56:50.409 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:50 np0005593234 nova_compute[227762]: 2026-01-23 10:56:50.449 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165795.4487762, fcb93bcf-9612-4dc7-9996-238d2739d8cb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:56:50 np0005593234 nova_compute[227762]: 2026-01-23 10:56:50.450 227766 INFO nova.compute.manager [-] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:56:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:50.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:56:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:50.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:56:50 np0005593234 nova_compute[227762]: 2026-01-23 10:56:50.594 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:50 np0005593234 nova_compute[227762]: 2026-01-23 10:56:50.783 227766 DEBUG nova.compute.manager [None req-7dce6d0d-52ba-44e6-b879-fd55beda948d - - - - - -] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:56:51 np0005593234 nova_compute[227762]: 2026-01-23 10:56:51.384 227766 DEBUG nova.compute.manager [req-34d6ed0a-c5f2-48cf-b0f8-f52a3a6c5e43 req-88b453e7-50b4-4ffa-a20a-fa6d9a7ed171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:56:51 np0005593234 nova_compute[227762]: 2026-01-23 10:56:51.384 227766 DEBUG oslo_concurrency.lockutils [req-34d6ed0a-c5f2-48cf-b0f8-f52a3a6c5e43 req-88b453e7-50b4-4ffa-a20a-fa6d9a7ed171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:56:51 np0005593234 nova_compute[227762]: 2026-01-23 10:56:51.384 227766 DEBUG oslo_concurrency.lockutils [req-34d6ed0a-c5f2-48cf-b0f8-f52a3a6c5e43 req-88b453e7-50b4-4ffa-a20a-fa6d9a7ed171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:56:51 np0005593234 nova_compute[227762]: 2026-01-23 10:56:51.384 227766 DEBUG oslo_concurrency.lockutils [req-34d6ed0a-c5f2-48cf-b0f8-f52a3a6c5e43 req-88b453e7-50b4-4ffa-a20a-fa6d9a7ed171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "fcb93bcf-9612-4dc7-9996-238d2739d8cb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:56:51 np0005593234 nova_compute[227762]: 2026-01-23 10:56:51.385 227766 DEBUG nova.compute.manager [req-34d6ed0a-c5f2-48cf-b0f8-f52a3a6c5e43 req-88b453e7-50b4-4ffa-a20a-fa6d9a7ed171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] No waiting events found dispatching network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:56:51 np0005593234 nova_compute[227762]: 2026-01-23 10:56:51.385 227766 WARNING nova.compute.manager [req-34d6ed0a-c5f2-48cf-b0f8-f52a3a6c5e43 req-88b453e7-50b4-4ffa-a20a-fa6d9a7ed171 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: fcb93bcf-9612-4dc7-9996-238d2739d8cb] Received unexpected event network-vif-plugged-62f573cf-0476-448d-b148-040cec7b1042 for instance with vm_state resized and task_state resize_reverting.#033[00m
Jan 23 05:56:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:52.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:52.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e423 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:54.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:54.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:55 np0005593234 nova_compute[227762]: 2026-01-23 10:56:55.411 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:55 np0005593234 nova_compute[227762]: 2026-01-23 10:56:55.597 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:56:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:56:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:56.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:56:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:56.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #193. Immutable memtables: 0.
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.601155) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 123] Flushing memtable with next log file: 193
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165817601231, "job": 123, "event": "flush_started", "num_memtables": 1, "num_entries": 2372, "num_deletes": 252, "total_data_size": 5883889, "memory_usage": 5943416, "flush_reason": "Manual Compaction"}
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 123] Level-0 flush table #194: started
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165817635679, "cf_name": "default", "job": 123, "event": "table_file_creation", "file_number": 194, "file_size": 3852097, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 91911, "largest_seqno": 94278, "table_properties": {"data_size": 3842288, "index_size": 6238, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19971, "raw_average_key_size": 20, "raw_value_size": 3822848, "raw_average_value_size": 3928, "num_data_blocks": 271, "num_entries": 973, "num_filter_entries": 973, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165597, "oldest_key_time": 1769165597, "file_creation_time": 1769165817, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 194, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 123] Flush lasted 34604 microseconds, and 10502 cpu microseconds.
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.635754) [db/flush_job.cc:967] [default] [JOB 123] Level-0 flush table #194: 3852097 bytes OK
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.635783) [db/memtable_list.cc:519] [default] Level-0 commit table #194 started
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.639324) [db/memtable_list.cc:722] [default] Level-0 commit table #194: memtable #1 done
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.639345) EVENT_LOG_v1 {"time_micros": 1769165817639338, "job": 123, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.639366) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 123] Try to delete WAL files size 5873381, prev total WAL file size 5873381, number of live WAL files 2.
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000190.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.641184) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038323833' seq:72057594037927935, type:22 .. '7061786F730038353335' seq:0, type:0; will stop at (end)
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 124] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 123 Base level 0, inputs: [194(3761KB)], [192(12MB)]
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165817641377, "job": 124, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [194], "files_L6": [192], "score": -1, "input_data_size": 16583751, "oldest_snapshot_seqno": -1}
Jan 23 05:56:57 np0005593234 podman[334002]: 2026-01-23 10:56:57.754441389 +0000 UTC m=+0.053203586 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 e424: 3 total, 3 up, 3 in
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 124] Generated table #195: 11255 keys, 14633508 bytes, temperature: kUnknown
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165817766865, "cf_name": "default", "job": 124, "event": "table_file_creation", "file_number": 195, "file_size": 14633508, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14560906, "index_size": 43369, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28165, "raw_key_size": 296561, "raw_average_key_size": 26, "raw_value_size": 14364208, "raw_average_value_size": 1276, "num_data_blocks": 1651, "num_entries": 11255, "num_filter_entries": 11255, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769165817, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 195, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.767075) [db/compaction/compaction_job.cc:1663] [default] [JOB 124] Compacted 1@0 + 1@6 files to L6 => 14633508 bytes
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.771709) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 132.1 rd, 116.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 12.1 +0.0 blob) out(14.0 +0.0 blob), read-write-amplify(8.1) write-amplify(3.8) OK, records in: 11780, records dropped: 525 output_compression: NoCompression
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.771741) EVENT_LOG_v1 {"time_micros": 1769165817771728, "job": 124, "event": "compaction_finished", "compaction_time_micros": 125542, "compaction_time_cpu_micros": 64804, "output_level": 6, "num_output_files": 1, "total_output_size": 14633508, "num_input_records": 11780, "num_output_records": 11255, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.641055) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.771831) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.771836) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.771838) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.771840) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:56:57.771842) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000194.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165817772693, "job": 0, "event": "table_file_deletion", "file_number": 194}
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000192.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165817774989, "job": 0, "event": "table_file_deletion", "file_number": 192}
Jan 23 05:56:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:56:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:56:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:56:58.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:56:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:56:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:56:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:56:58.591 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:00 np0005593234 nova_compute[227762]: 2026-01-23 10:57:00.413 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:00.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:00.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:00 np0005593234 nova_compute[227762]: 2026-01-23 10:57:00.599 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:57:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:02.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:57:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:57:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:02.597 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:57:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:04.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:57:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:04.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:57:05 np0005593234 ovn_controller[134547]: 2026-01-23T10:57:05Z|00916|memory_trim|INFO|Detected inactivity (last active 30014 ms ago): trimming memory
Jan 23 05:57:05 np0005593234 nova_compute[227762]: 2026-01-23 10:57:05.442 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:05 np0005593234 nova_compute[227762]: 2026-01-23 10:57:05.600 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:05 np0005593234 podman[334025]: 2026-01-23 10:57:05.830647799 +0000 UTC m=+0.112297683 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:57:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:06.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:06.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:08.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:08.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 05:57:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:57:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 05:57:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:57:10 np0005593234 nova_compute[227762]: 2026-01-23 10:57:10.445 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:10.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:10 np0005593234 nova_compute[227762]: 2026-01-23 10:57:10.603 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:10.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:57:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:57:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:57:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:12.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:12.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:14.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:14.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:15 np0005593234 nova_compute[227762]: 2026-01-23 10:57:15.446 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:15 np0005593234 nova_compute[227762]: 2026-01-23 10:57:15.605 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:57:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:16.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:57:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:16.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:16 np0005593234 nova_compute[227762]: 2026-01-23 10:57:16.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:16 np0005593234 nova_compute[227762]: 2026-01-23 10:57:16.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:57:16 np0005593234 nova_compute[227762]: 2026-01-23 10:57:16.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:57:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:18.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:57:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:18.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:57:19 np0005593234 nova_compute[227762]: 2026-01-23 10:57:19.615 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:57:19 np0005593234 nova_compute[227762]: 2026-01-23 10:57:19.615 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:20 np0005593234 nova_compute[227762]: 2026-01-23 10:57:20.449 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:20.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:20 np0005593234 nova_compute[227762]: 2026-01-23 10:57:20.607 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:20.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:57:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:22.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:57:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:22.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:22 np0005593234 nova_compute[227762]: 2026-01-23 10:57:22.821 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:57:22 np0005593234 nova_compute[227762]: 2026-01-23 10:57:22.821 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:57:22 np0005593234 nova_compute[227762]: 2026-01-23 10:57:22.821 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:57:22 np0005593234 nova_compute[227762]: 2026-01-23 10:57:22.822 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:57:22 np0005593234 nova_compute[227762]: 2026-01-23 10:57:22.822 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:57:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:57:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3232292869' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:57:23 np0005593234 nova_compute[227762]: 2026-01-23 10:57:23.271 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:57:23 np0005593234 nova_compute[227762]: 2026-01-23 10:57:23.419 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:57:23 np0005593234 nova_compute[227762]: 2026-01-23 10:57:23.420 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4135MB free_disk=20.942649841308594GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:57:23 np0005593234 nova_compute[227762]: 2026-01-23 10:57:23.420 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:57:23 np0005593234 nova_compute[227762]: 2026-01-23 10:57:23.420 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:57:24 np0005593234 nova_compute[227762]: 2026-01-23 10:57:24.109 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:57:24 np0005593234 nova_compute[227762]: 2026-01-23 10:57:24.110 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:57:24 np0005593234 nova_compute[227762]: 2026-01-23 10:57:24.177 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 05:57:24 np0005593234 nova_compute[227762]: 2026-01-23 10:57:24.237 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 05:57:24 np0005593234 nova_compute[227762]: 2026-01-23 10:57:24.238 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 05:57:24 np0005593234 nova_compute[227762]: 2026-01-23 10:57:24.252 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 05:57:24 np0005593234 nova_compute[227762]: 2026-01-23 10:57:24.284 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 05:57:24 np0005593234 nova_compute[227762]: 2026-01-23 10:57:24.317 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:57:24 np0005593234 nova_compute[227762]: 2026-01-23 10:57:24.391 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:57:24.392 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=91, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=90) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:57:24 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:57:24.393 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:57:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:57:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:24.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:57:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:57:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:24.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:57:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:57:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/799691098' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:57:24 np0005593234 nova_compute[227762]: 2026-01-23 10:57:24.765 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:57:24 np0005593234 nova_compute[227762]: 2026-01-23 10:57:24.773 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:57:24 np0005593234 nova_compute[227762]: 2026-01-23 10:57:24.803 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:57:24 np0005593234 nova_compute[227762]: 2026-01-23 10:57:24.804 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:57:24 np0005593234 nova_compute[227762]: 2026-01-23 10:57:24.805 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.384s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:57:25 np0005593234 nova_compute[227762]: 2026-01-23 10:57:25.451 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:25 np0005593234 nova_compute[227762]: 2026-01-23 10:57:25.609 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:25 np0005593234 nova_compute[227762]: 2026-01-23 10:57:25.933 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:25 np0005593234 nova_compute[227762]: 2026-01-23 10:57:25.934 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:25 np0005593234 nova_compute[227762]: 2026-01-23 10:57:25.934 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:57:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:57:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:26.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:57:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:26.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:57:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:28.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:57:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:28.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:28 np0005593234 podman[334410]: 2026-01-23 10:57:28.765938401 +0000 UTC m=+0.053159034 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 05:57:30 np0005593234 nova_compute[227762]: 2026-01-23 10:57:30.453 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:57:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:30.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:57:30 np0005593234 nova_compute[227762]: 2026-01-23 10:57:30.610 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:30.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:30 np0005593234 nova_compute[227762]: 2026-01-23 10:57:30.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:30 np0005593234 nova_compute[227762]: 2026-01-23 10:57:30.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:57:31.395 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '91'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:57:31 np0005593234 nova_compute[227762]: 2026-01-23 10:57:31.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:57:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:32.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:57:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:57:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:32.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:57:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:33 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:57:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:34.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:34.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:57:35 np0005593234 nova_compute[227762]: 2026-01-23 10:57:35.455 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:35 np0005593234 nova_compute[227762]: 2026-01-23 10:57:35.613 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:35 np0005593234 nova_compute[227762]: 2026-01-23 10:57:35.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:36.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:36.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:36 np0005593234 podman[334531]: 2026-01-23 10:57:36.777329302 +0000 UTC m=+0.077897617 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 05:57:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:38.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:38.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:40 np0005593234 nova_compute[227762]: 2026-01-23 10:57:40.460 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:40.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:40 np0005593234 nova_compute[227762]: 2026-01-23 10:57:40.614 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:40.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:42.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:42.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:57:42.902 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:57:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:57:42.903 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:57:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:57:42.903 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:57:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:44.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:44.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:44 np0005593234 nova_compute[227762]: 2026-01-23 10:57:44.685 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:44 np0005593234 nova_compute[227762]: 2026-01-23 10:57:44.718 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:45 np0005593234 nova_compute[227762]: 2026-01-23 10:57:45.462 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:45 np0005593234 nova_compute[227762]: 2026-01-23 10:57:45.617 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:46.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:46.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:47 np0005593234 nova_compute[227762]: 2026-01-23 10:57:47.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:57:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:48.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:48.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:50 np0005593234 nova_compute[227762]: 2026-01-23 10:57:50.464 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:57:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:50.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:57:50 np0005593234 nova_compute[227762]: 2026-01-23 10:57:50.619 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:50.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:57:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:52.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:57:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:52.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #196. Immutable memtables: 0.
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.886100) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 125] Flushing memtable with next log file: 196
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165872886136, "job": 125, "event": "flush_started", "num_memtables": 1, "num_entries": 793, "num_deletes": 258, "total_data_size": 1482983, "memory_usage": 1500024, "flush_reason": "Manual Compaction"}
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 125] Level-0 flush table #197: started
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165872892547, "cf_name": "default", "job": 125, "event": "table_file_creation", "file_number": 197, "file_size": 673149, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 94283, "largest_seqno": 95071, "table_properties": {"data_size": 669770, "index_size": 1158, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 9221, "raw_average_key_size": 21, "raw_value_size": 662569, "raw_average_value_size": 1519, "num_data_blocks": 50, "num_entries": 436, "num_filter_entries": 436, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165817, "oldest_key_time": 1769165817, "file_creation_time": 1769165872, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 197, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 125] Flush lasted 6500 microseconds, and 2715 cpu microseconds.
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.892599) [db/flush_job.cc:967] [default] [JOB 125] Level-0 flush table #197: 673149 bytes OK
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.892615) [db/memtable_list.cc:519] [default] Level-0 commit table #197 started
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.894605) [db/memtable_list.cc:722] [default] Level-0 commit table #197: memtable #1 done
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.894618) EVENT_LOG_v1 {"time_micros": 1769165872894613, "job": 125, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.894633) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 125] Try to delete WAL files size 1478770, prev total WAL file size 1478770, number of live WAL files 2.
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000193.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.895200) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033323539' seq:72057594037927935, type:22 .. '6D6772737461740033353137' seq:0, type:0; will stop at (end)
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 126] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 125 Base level 0, inputs: [197(657KB)], [195(13MB)]
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165872895260, "job": 126, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [197], "files_L6": [195], "score": -1, "input_data_size": 15306657, "oldest_snapshot_seqno": -1}
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 126] Generated table #198: 11175 keys, 11655371 bytes, temperature: kUnknown
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165872961456, "cf_name": "default", "job": 126, "event": "table_file_creation", "file_number": 198, "file_size": 11655371, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11587585, "index_size": 38745, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27973, "raw_key_size": 295117, "raw_average_key_size": 26, "raw_value_size": 11396622, "raw_average_value_size": 1019, "num_data_blocks": 1460, "num_entries": 11175, "num_filter_entries": 11175, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769165872, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 198, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.961865) [db/compaction/compaction_job.cc:1663] [default] [JOB 126] Compacted 1@0 + 1@6 files to L6 => 11655371 bytes
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.963641) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 230.6 rd, 175.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 14.0 +0.0 blob) out(11.1 +0.0 blob), read-write-amplify(40.1) write-amplify(17.3) OK, records in: 11691, records dropped: 516 output_compression: NoCompression
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.963671) EVENT_LOG_v1 {"time_micros": 1769165872963658, "job": 126, "event": "compaction_finished", "compaction_time_micros": 66381, "compaction_time_cpu_micros": 30492, "output_level": 6, "num_output_files": 1, "total_output_size": 11655371, "num_input_records": 11691, "num_output_records": 11175, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000197.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165872964024, "job": 126, "event": "table_file_deletion", "file_number": 197}
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000195.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165872969032, "job": 126, "event": "table_file_deletion", "file_number": 195}
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.895116) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.969129) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.969134) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.969136) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.969138) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:57:52 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:57:52.969139) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:57:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:54.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:54.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:55 np0005593234 nova_compute[227762]: 2026-01-23 10:57:55.467 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:55 np0005593234 nova_compute[227762]: 2026-01-23 10:57:55.621 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:57:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:57:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:56.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:57:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:56.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:57:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:57:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:57:58.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:57:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:57:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:57:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:57:58.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:57:59 np0005593234 podman[334622]: 2026-01-23 10:57:59.766726825 +0000 UTC m=+0.061622268 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 05:58:00 np0005593234 nova_compute[227762]: 2026-01-23 10:58:00.502 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:00.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:00 np0005593234 nova_compute[227762]: 2026-01-23 10:58:00.622 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:00.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:02.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:58:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:02.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:58:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:58:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:04.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:58:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:04.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:05 np0005593234 nova_compute[227762]: 2026-01-23 10:58:05.505 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:05 np0005593234 nova_compute[227762]: 2026-01-23 10:58:05.624 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:58:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:06.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:58:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:06.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:07 np0005593234 podman[334645]: 2026-01-23 10:58:07.818788642 +0000 UTC m=+0.112625264 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 23 05:58:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:58:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:08.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:58:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:08.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:10 np0005593234 nova_compute[227762]: 2026-01-23 10:58:10.508 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:10.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:10 np0005593234 nova_compute[227762]: 2026-01-23 10:58:10.626 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:58:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:10.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:58:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:12.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:12.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:14.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:14.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:15 np0005593234 nova_compute[227762]: 2026-01-23 10:58:15.510 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:15 np0005593234 nova_compute[227762]: 2026-01-23 10:58:15.627 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:58:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:16.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:58:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:16.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:17 np0005593234 nova_compute[227762]: 2026-01-23 10:58:17.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:17 np0005593234 nova_compute[227762]: 2026-01-23 10:58:17.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:58:17 np0005593234 nova_compute[227762]: 2026-01-23 10:58:17.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:58:17 np0005593234 nova_compute[227762]: 2026-01-23 10:58:17.764 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:58:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:18.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:58:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:18.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:58:18 np0005593234 nova_compute[227762]: 2026-01-23 10:58:18.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:18 np0005593234 nova_compute[227762]: 2026-01-23 10:58:18.768 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:18 np0005593234 nova_compute[227762]: 2026-01-23 10:58:18.769 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:18 np0005593234 nova_compute[227762]: 2026-01-23 10:58:18.769 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:18 np0005593234 nova_compute[227762]: 2026-01-23 10:58:18.770 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:58:18 np0005593234 nova_compute[227762]: 2026-01-23 10:58:18.770 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:58:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:58:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1150867384' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:58:19 np0005593234 nova_compute[227762]: 2026-01-23 10:58:19.210 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:58:19 np0005593234 nova_compute[227762]: 2026-01-23 10:58:19.407 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:58:19 np0005593234 nova_compute[227762]: 2026-01-23 10:58:19.408 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4164MB free_disk=20.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:58:19 np0005593234 nova_compute[227762]: 2026-01-23 10:58:19.409 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:19 np0005593234 nova_compute[227762]: 2026-01-23 10:58:19.409 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:19 np0005593234 nova_compute[227762]: 2026-01-23 10:58:19.476 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:58:19 np0005593234 nova_compute[227762]: 2026-01-23 10:58:19.476 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:58:19 np0005593234 nova_compute[227762]: 2026-01-23 10:58:19.501 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:58:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:58:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4069418747' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:58:19 np0005593234 nova_compute[227762]: 2026-01-23 10:58:19.974 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:58:19 np0005593234 nova_compute[227762]: 2026-01-23 10:58:19.980 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:58:20 np0005593234 nova_compute[227762]: 2026-01-23 10:58:20.003 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:58:20 np0005593234 nova_compute[227762]: 2026-01-23 10:58:20.005 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:58:20 np0005593234 nova_compute[227762]: 2026-01-23 10:58:20.005 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.596s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:20 np0005593234 nova_compute[227762]: 2026-01-23 10:58:20.521 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:20.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:20 np0005593234 nova_compute[227762]: 2026-01-23 10:58:20.628 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:58:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:20.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:58:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:58:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:22.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:58:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:22.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:23 np0005593234 nova_compute[227762]: 2026-01-23 10:58:23.005 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:24 np0005593234 ovn_controller[134547]: 2026-01-23T10:58:24Z|00917|memory_trim|INFO|Detected inactivity (last active 30007 ms ago): trimming memory
Jan 23 05:58:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:24.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:24.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:25 np0005593234 nova_compute[227762]: 2026-01-23 10:58:25.523 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:25 np0005593234 nova_compute[227762]: 2026-01-23 10:58:25.629 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:26.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:26.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:27 np0005593234 nova_compute[227762]: 2026-01-23 10:58:27.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:27 np0005593234 nova_compute[227762]: 2026-01-23 10:58:27.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:58:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:28.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:28.735 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:30 np0005593234 nova_compute[227762]: 2026-01-23 10:58:30.527 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:58:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:30.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:58:30 np0005593234 nova_compute[227762]: 2026-01-23 10:58:30.631 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:30.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:30 np0005593234 podman[334778]: 2026-01-23 10:58:30.753512544 +0000 UTC m=+0.054198067 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Jan 23 05:58:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:32.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:32.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:32 np0005593234 nova_compute[227762]: 2026-01-23 10:58:32.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:32 np0005593234 nova_compute[227762]: 2026-01-23 10:58:32.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:33 np0005593234 nova_compute[227762]: 2026-01-23 10:58:33.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:33 np0005593234 nova_compute[227762]: 2026-01-23 10:58:33.756 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 05:58:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:58:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:34.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:58:34 np0005593234 nova_compute[227762]: 2026-01-23 10:58:34.720 227766 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Creating tmpfile /var/lib/nova/instances/tmpri2lv6mn to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Jan 23 05:58:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:34.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:34 np0005593234 nova_compute[227762]: 2026-01-23 10:58:34.810 227766 DEBUG nova.compute.manager [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=18432,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpri2lv6mn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Jan 23 05:58:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:58:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:58:35 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:58:35 np0005593234 nova_compute[227762]: 2026-01-23 10:58:35.528 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:35 np0005593234 nova_compute[227762]: 2026-01-23 10:58:35.632 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:35 np0005593234 nova_compute[227762]: 2026-01-23 10:58:35.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:35 np0005593234 nova_compute[227762]: 2026-01-23 10:58:35.780 227766 DEBUG nova.compute.manager [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=18432,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpri2lv6mn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='575c76d9-7306-429f-baca-4d450f37c388',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Jan 23 05:58:35 np0005593234 nova_compute[227762]: 2026-01-23 10:58:35.801 227766 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquiring lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:58:35 np0005593234 nova_compute[227762]: 2026-01-23 10:58:35.801 227766 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquired lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:58:35 np0005593234 nova_compute[227762]: 2026-01-23 10:58:35.802 227766 DEBUG nova.network.neutron [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:58:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:36.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:58:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:36.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.259 227766 DEBUG nova.network.neutron [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updating instance_info_cache with network_info: [{"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.274 227766 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Releasing lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.276 227766 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=18432,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpri2lv6mn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='575c76d9-7306-429f-baca-4d450f37c388',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.277 227766 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Creating instance directory: /var/lib/nova/instances/575c76d9-7306-429f-baca-4d450f37c388 pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.277 227766 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Ensure instance console log exists: /var/lib/nova/instances/575c76d9-7306-429f-baca-4d450f37c388/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.277 227766 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.278 227766 DEBUG nova.virt.libvirt.vif [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:57:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1747551892',display_name='tempest-TestNetworkAdvancedServerOps-server-1747551892',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1747551892',id=212,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNLmxmIGURR44ldVtZLj5mDiJy9rKp0lzbESRue6Wnd2DycIawGU9GHFwoSAqxCF+VQclApTf6ivzJfIihq8OTikttBlSKyYddrT5smaN5oWB8DMKx3zgAFW+FO8eDOMGA==',key_name='tempest-TestNetworkAdvancedServerOps-647126014',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:58:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-pdznqph4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:58:08Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=575c76d9-7306-429f-baca-4d450f37c388,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.279 227766 DEBUG nova.network.os_vif_util [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Converting VIF {"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.279 227766 DEBUG nova.network.os_vif_util [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:da:56,bridge_name='br-int',has_traffic_filtering=True,id=eb5d7473-2661-43a3-bf30-f8ccc9735ea6,network=Network(709bd24b-e32e-4388-bc40-5f1d023f1ad4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb5d7473-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.280 227766 DEBUG os_vif [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:da:56,bridge_name='br-int',has_traffic_filtering=True,id=eb5d7473-2661-43a3-bf30-f8ccc9735ea6,network=Network(709bd24b-e32e-4388-bc40-5f1d023f1ad4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb5d7473-26') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.280 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.281 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.281 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.285 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.285 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeb5d7473-26, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.285 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapeb5d7473-26, col_values=(('external_ids', {'iface-id': 'eb5d7473-2661-43a3-bf30-f8ccc9735ea6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:da:56', 'vm-uuid': '575c76d9-7306-429f-baca-4d450f37c388'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.287 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:37 np0005593234 NetworkManager[48942]: <info>  [1769165917.2883] manager: (tapeb5d7473-26): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/434)
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.289 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.293 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.294 227766 INFO os_vif [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:da:56,bridge_name='br-int',has_traffic_filtering=True,id=eb5d7473-2661-43a3-bf30-f8ccc9735ea6,network=Network(709bd24b-e32e-4388-bc40-5f1d023f1ad4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb5d7473-26')#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.295 227766 DEBUG nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Jan 23 05:58:37 np0005593234 nova_compute[227762]: 2026-01-23 10:58:37.295 227766 DEBUG nova.compute.manager [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=18432,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpri2lv6mn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='575c76d9-7306-429f-baca-4d450f37c388',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Jan 23 05:58:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:38 np0005593234 nova_compute[227762]: 2026-01-23 10:58:38.257 227766 DEBUG nova.network.neutron [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Port eb5d7473-2661-43a3-bf30-f8ccc9735ea6 updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Jan 23 05:58:38 np0005593234 nova_compute[227762]: 2026-01-23 10:58:38.258 227766 DEBUG nova.compute.manager [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=18432,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpri2lv6mn',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='575c76d9-7306-429f-baca-4d450f37c388',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Jan 23 05:58:38 np0005593234 systemd[1]: Starting libvirt proxy daemon...
Jan 23 05:58:38 np0005593234 systemd[1]: Started libvirt proxy daemon.
Jan 23 05:58:38 np0005593234 podman[334983]: 2026-01-23 10:58:38.491858889 +0000 UTC m=+0.086407664 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:58:38 np0005593234 kernel: tapeb5d7473-26: entered promiscuous mode
Jan 23 05:58:38 np0005593234 NetworkManager[48942]: <info>  [1769165918.5499] manager: (tapeb5d7473-26): new Tun device (/org/freedesktop/NetworkManager/Devices/435)
Jan 23 05:58:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:58:38Z|00918|binding|INFO|Claiming lport eb5d7473-2661-43a3-bf30-f8ccc9735ea6 for this additional chassis.
Jan 23 05:58:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:58:38Z|00919|binding|INFO|eb5d7473-2661-43a3-bf30-f8ccc9735ea6: Claiming fa:16:3e:a0:da:56 10.100.0.13
Jan 23 05:58:38 np0005593234 nova_compute[227762]: 2026-01-23 10:58:38.550 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:38 np0005593234 nova_compute[227762]: 2026-01-23 10:58:38.555 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:38 np0005593234 nova_compute[227762]: 2026-01-23 10:58:38.559 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:38 np0005593234 nova_compute[227762]: 2026-01-23 10:58:38.560 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:38 np0005593234 NetworkManager[48942]: <info>  [1769165918.5609] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/436)
Jan 23 05:58:38 np0005593234 NetworkManager[48942]: <info>  [1769165918.5614] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/437)
Jan 23 05:58:38 np0005593234 systemd-udevd[335042]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:58:38 np0005593234 systemd-machined[195626]: New machine qemu-103-instance-000000d4.
Jan 23 05:58:38 np0005593234 NetworkManager[48942]: <info>  [1769165918.5883] device (tapeb5d7473-26): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:58:38 np0005593234 NetworkManager[48942]: <info>  [1769165918.5889] device (tapeb5d7473-26): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:58:38 np0005593234 systemd[1]: Started Virtual Machine qemu-103-instance-000000d4.
Jan 23 05:58:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:38.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:38 np0005593234 nova_compute[227762]: 2026-01-23 10:58:38.640 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:38 np0005593234 nova_compute[227762]: 2026-01-23 10:58:38.645 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:38 np0005593234 ovn_controller[134547]: 2026-01-23T10:58:38Z|00920|binding|INFO|Setting lport eb5d7473-2661-43a3-bf30-f8ccc9735ea6 ovn-installed in OVS
Jan 23 05:58:38 np0005593234 nova_compute[227762]: 2026-01-23 10:58:38.659 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:38.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:39 np0005593234 nova_compute[227762]: 2026-01-23 10:58:39.715 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165919.7153773, 575c76d9-7306-429f-baca-4d450f37c388 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:58:39 np0005593234 nova_compute[227762]: 2026-01-23 10:58:39.716 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] VM Started (Lifecycle Event)#033[00m
Jan 23 05:58:39 np0005593234 nova_compute[227762]: 2026-01-23 10:58:39.734 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:58:40 np0005593234 nova_compute[227762]: 2026-01-23 10:58:40.237 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165920.2369082, 575c76d9-7306-429f-baca-4d450f37c388 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:58:40 np0005593234 nova_compute[227762]: 2026-01-23 10:58:40.237 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:58:40 np0005593234 nova_compute[227762]: 2026-01-23 10:58:40.255 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:58:40 np0005593234 nova_compute[227762]: 2026-01-23 10:58:40.258 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:58:40 np0005593234 nova_compute[227762]: 2026-01-23 10:58:40.292 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Jan 23 05:58:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:40.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:40 np0005593234 nova_compute[227762]: 2026-01-23 10:58:40.718 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:40.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:58:41Z|00921|binding|INFO|Claiming lport eb5d7473-2661-43a3-bf30-f8ccc9735ea6 for this chassis.
Jan 23 05:58:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:58:41Z|00922|binding|INFO|eb5d7473-2661-43a3-bf30-f8ccc9735ea6: Claiming fa:16:3e:a0:da:56 10.100.0.13
Jan 23 05:58:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:58:41Z|00923|binding|INFO|Setting lport eb5d7473-2661-43a3-bf30-f8ccc9735ea6 up in Southbound
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.142 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:da:56 10.100.0.13'], port_security=['fa:16:3e:a0:da:56 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '575c76d9-7306-429f-baca-4d450f37c388', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '11', 'neutron:security_group_ids': '6b9b22d8-1c0e-45a7-b51b-d7eb1b043141', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.250'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c950e56e-abf1-437a-896b-f8357baa2295, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=eb5d7473-2661-43a3-bf30-f8ccc9735ea6) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.143 144381 INFO neutron.agent.ovn.metadata.agent [-] Port eb5d7473-2661-43a3-bf30-f8ccc9735ea6 in datapath 709bd24b-e32e-4388-bc40-5f1d023f1ad4 bound to our chassis#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.144 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 709bd24b-e32e-4388-bc40-5f1d023f1ad4#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.155 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a92a9c8f-02d2-40bc-8d99-a3e94188ba25]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.156 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap709bd24b-e1 in ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.158 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap709bd24b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.158 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2f11ff52-bade-431e-894f-7d36186c9500]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.159 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1a01340a-4a77-428e-8761-743711b60c2b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.170 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[5568eb7c-aff1-43b4-9df0-24a5524f18b9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.183 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d9499fb5-8922-4428-8a1c-a75d6c61a519]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.218 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b7beb154-a4a4-483b-981c-859bf2a3f822]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.224 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[122b7f08-8ea1-4477-9b18-7d6c18588e5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 NetworkManager[48942]: <info>  [1769165921.2261] manager: (tap709bd24b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/438)
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.263 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[5f07689d-9a6e-46e5-9ca1-cb2b4453f1ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.266 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[bf108e6b-6c97-4021-bc36-9e663d3aca7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 NetworkManager[48942]: <info>  [1769165921.2896] device (tap709bd24b-e0): carrier: link connected
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.295 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[c73f1659-59cd-4104-94ba-56a19ef3af9c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.311 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[20e2575b-a2bb-4763-a89d-d11267032e19]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap709bd24b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:b0:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 282], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 981624, 'reachable_time': 16554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 335168, 'error': None, 'target': 'ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 nova_compute[227762]: 2026-01-23 10:58:41.322 227766 INFO nova.compute.manager [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Post operation of migration started#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.327 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7d973223-2c7d-4d74-8ef1-4c5dc59db256]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee4:b0f9'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 981624, 'tstamp': 981624}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 335169, 'error': None, 'target': 'ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.346 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[de65ee1a-ccbb-4071-83f8-2cc63980d1f9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap709bd24b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e4:b0:f9'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 282], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 981624, 'reachable_time': 16554, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 335170, 'error': None, 'target': 'ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.375 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[66a94e7f-37eb-4c87-af09-4ab8c8691b9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.436 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[34efb116-dd50-482f-a966-53476d26a0ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.438 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap709bd24b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.438 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.439 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap709bd24b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:58:41 np0005593234 nova_compute[227762]: 2026-01-23 10:58:41.440 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:41 np0005593234 kernel: tap709bd24b-e0: entered promiscuous mode
Jan 23 05:58:41 np0005593234 NetworkManager[48942]: <info>  [1769165921.4422] manager: (tap709bd24b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/439)
Jan 23 05:58:41 np0005593234 nova_compute[227762]: 2026-01-23 10:58:41.444 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.445 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap709bd24b-e0, col_values=(('external_ids', {'iface-id': '5311e73f-96f0-4470-83d4-e798caf6a353'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:58:41 np0005593234 ovn_controller[134547]: 2026-01-23T10:58:41Z|00924|binding|INFO|Releasing lport 5311e73f-96f0-4470-83d4-e798caf6a353 from this chassis (sb_readonly=0)
Jan 23 05:58:41 np0005593234 nova_compute[227762]: 2026-01-23 10:58:41.448 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:41 np0005593234 nova_compute[227762]: 2026-01-23 10:58:41.449 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.449 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/709bd24b-e32e-4388-bc40-5f1d023f1ad4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/709bd24b-e32e-4388-bc40-5f1d023f1ad4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.450 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[499e28b0-9822-451b-adb4-d86c72755a97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.451 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-709bd24b-e32e-4388-bc40-5f1d023f1ad4
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/709bd24b-e32e-4388-bc40-5f1d023f1ad4.pid.haproxy
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 709bd24b-e32e-4388-bc40-5f1d023f1ad4
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:58:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:41.452 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'env', 'PROCESS_TAG=haproxy-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/709bd24b-e32e-4388-bc40-5f1d023f1ad4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:58:41 np0005593234 nova_compute[227762]: 2026-01-23 10:58:41.461 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:41 np0005593234 nova_compute[227762]: 2026-01-23 10:58:41.638 227766 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquiring lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:58:41 np0005593234 nova_compute[227762]: 2026-01-23 10:58:41.638 227766 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquired lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:58:41 np0005593234 nova_compute[227762]: 2026-01-23 10:58:41.639 227766 DEBUG nova.network.neutron [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:58:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:58:41 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:58:41 np0005593234 podman[335203]: 2026-01-23 10:58:41.82661796 +0000 UTC m=+0.051277745 container create 1459658414a4e3dcbe4b5d8b5c6b3f144da40d44186ba55e737ee44f69e071d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 05:58:41 np0005593234 systemd[1]: Started libpod-conmon-1459658414a4e3dcbe4b5d8b5c6b3f144da40d44186ba55e737ee44f69e071d2.scope.
Jan 23 05:58:41 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:58:41 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d6f23ee7b42b7c9bac6ae5f0a316f89b34002b5fa944904d1237df1bb7c58c87/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:58:41 np0005593234 podman[335203]: 2026-01-23 10:58:41.799261694 +0000 UTC m=+0.023921499 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:58:41 np0005593234 podman[335203]: 2026-01-23 10:58:41.896102673 +0000 UTC m=+0.120762478 container init 1459658414a4e3dcbe4b5d8b5c6b3f144da40d44186ba55e737ee44f69e071d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:58:41 np0005593234 podman[335203]: 2026-01-23 10:58:41.901376858 +0000 UTC m=+0.126036643 container start 1459658414a4e3dcbe4b5d8b5c6b3f144da40d44186ba55e737ee44f69e071d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:58:41 np0005593234 neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4[335218]: [NOTICE]   (335222) : New worker (335224) forked
Jan 23 05:58:41 np0005593234 neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4[335218]: [NOTICE]   (335222) : Loading success.
Jan 23 05:58:42 np0005593234 nova_compute[227762]: 2026-01-23 10:58:42.288 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:42 np0005593234 nova_compute[227762]: 2026-01-23 10:58:42.603 227766 DEBUG nova.network.neutron [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updating instance_info_cache with network_info: [{"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:58:42 np0005593234 nova_compute[227762]: 2026-01-23 10:58:42.623 227766 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Releasing lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:58:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:42.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:42 np0005593234 nova_compute[227762]: 2026-01-23 10:58:42.640 227766 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:42 np0005593234 nova_compute[227762]: 2026-01-23 10:58:42.640 227766 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:42 np0005593234 nova_compute[227762]: 2026-01-23 10:58:42.641 227766 DEBUG oslo_concurrency.lockutils [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:42 np0005593234 nova_compute[227762]: 2026-01-23 10:58:42.647 227766 INFO nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Jan 23 05:58:42 np0005593234 virtqemud[227483]: Domain id=103 name='instance-000000d4' uuid=575c76d9-7306-429f-baca-4d450f37c388 is tainted: custom-monitor
Jan 23 05:58:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:42.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:42.905 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:42.906 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:42.907 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:43 np0005593234 nova_compute[227762]: 2026-01-23 10:58:43.654 227766 INFO nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Jan 23 05:58:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:44.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:44 np0005593234 nova_compute[227762]: 2026-01-23 10:58:44.660 227766 INFO nova.virt.libvirt.driver [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Jan 23 05:58:44 np0005593234 nova_compute[227762]: 2026-01-23 10:58:44.666 227766 DEBUG nova.compute.manager [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:58:44 np0005593234 nova_compute[227762]: 2026-01-23 10:58:44.689 227766 DEBUG nova.objects.instance [None req-01ba9cb1-59db-48a5-b8f9-e4972ac60fa8 22656a4d33784250b8f522a77dc0909d eac31b2500aa40729c9ae6441d1a3f2e - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 05:58:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:58:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:44.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:58:45 np0005593234 nova_compute[227762]: 2026-01-23 10:58:45.719 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:46.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:58:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:46.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:58:47 np0005593234 nova_compute[227762]: 2026-01-23 10:58:47.290 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #199. Immutable memtables: 0.
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.809024) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 127] Flushing memtable with next log file: 199
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165927809085, "job": 127, "event": "flush_started", "num_memtables": 1, "num_entries": 780, "num_deletes": 256, "total_data_size": 1474682, "memory_usage": 1495200, "flush_reason": "Manual Compaction"}
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 127] Level-0 flush table #200: started
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165927816948, "cf_name": "default", "job": 127, "event": "table_file_creation", "file_number": 200, "file_size": 974112, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 95076, "largest_seqno": 95851, "table_properties": {"data_size": 970325, "index_size": 1566, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 8391, "raw_average_key_size": 19, "raw_value_size": 962768, "raw_average_value_size": 2183, "num_data_blocks": 69, "num_entries": 441, "num_filter_entries": 441, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165874, "oldest_key_time": 1769165874, "file_creation_time": 1769165927, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 200, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 127] Flush lasted 7998 microseconds, and 3232 cpu microseconds.
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.817026) [db/flush_job.cc:967] [default] [JOB 127] Level-0 flush table #200: 974112 bytes OK
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.817043) [db/memtable_list.cc:519] [default] Level-0 commit table #200 started
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.818497) [db/memtable_list.cc:722] [default] Level-0 commit table #200: memtable #1 done
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.818509) EVENT_LOG_v1 {"time_micros": 1769165927818505, "job": 127, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.818526) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 127] Try to delete WAL files size 1470538, prev total WAL file size 1475939, number of live WAL files 2.
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000196.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.819204) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373730' seq:72057594037927935, type:22 .. '6C6F676D0034303232' seq:0, type:0; will stop at (end)
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 128] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 127 Base level 0, inputs: [200(951KB)], [198(11MB)]
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165927819294, "job": 128, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [200], "files_L6": [198], "score": -1, "input_data_size": 12629483, "oldest_snapshot_seqno": -1}
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 128] Generated table #201: 11092 keys, 12495263 bytes, temperature: kUnknown
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165927897664, "cf_name": "default", "job": 128, "event": "table_file_creation", "file_number": 201, "file_size": 12495263, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12426681, "index_size": 39733, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27781, "raw_key_size": 294353, "raw_average_key_size": 26, "raw_value_size": 12235817, "raw_average_value_size": 1103, "num_data_blocks": 1499, "num_entries": 11092, "num_filter_entries": 11092, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769165927, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 201, "seqno_to_time_mapping": "N/A"}}
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.897966) [db/compaction/compaction_job.cc:1663] [default] [JOB 128] Compacted 1@0 + 1@6 files to L6 => 12495263 bytes
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.934138) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 161.0 rd, 159.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 11.1 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(25.8) write-amplify(12.8) OK, records in: 11616, records dropped: 524 output_compression: NoCompression
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.934174) EVENT_LOG_v1 {"time_micros": 1769165927934161, "job": 128, "event": "compaction_finished", "compaction_time_micros": 78451, "compaction_time_cpu_micros": 29755, "output_level": 6, "num_output_files": 1, "total_output_size": 12495263, "num_input_records": 11616, "num_output_records": 11092, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000200.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165927934527, "job": 128, "event": "table_file_deletion", "file_number": 200}
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000198.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769165927936529, "job": 128, "event": "table_file_deletion", "file_number": 198}
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.819094) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.936614) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.936619) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.936621) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.936623) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:58:47 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-10:58:47.936625) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 05:58:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:48.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:48 np0005593234 nova_compute[227762]: 2026-01-23 10:58:48.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:58:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:48.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:49 np0005593234 nova_compute[227762]: 2026-01-23 10:58:49.065 227766 INFO nova.compute.manager [None req-207a945e-bc87-42ff-9c87-aaefda13dcdb 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Get console output#033[00m
Jan 23 05:58:49 np0005593234 nova_compute[227762]: 2026-01-23 10:58:49.072 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:58:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:49.578 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=92, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=91) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:58:49 np0005593234 nova_compute[227762]: 2026-01-23 10:58:49.578 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:49 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:49.579 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.438 227766 DEBUG nova.compute.manager [req-50e750c0-3432-43f3-b519-c7d11b870dbd req-51db8d32-d54c-46ab-91e6-81119b369c89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-changed-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.439 227766 DEBUG nova.compute.manager [req-50e750c0-3432-43f3-b519-c7d11b870dbd req-51db8d32-d54c-46ab-91e6-81119b369c89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Refreshing instance network info cache due to event network-changed-eb5d7473-2661-43a3-bf30-f8ccc9735ea6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.439 227766 DEBUG oslo_concurrency.lockutils [req-50e750c0-3432-43f3-b519-c7d11b870dbd req-51db8d32-d54c-46ab-91e6-81119b369c89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.439 227766 DEBUG oslo_concurrency.lockutils [req-50e750c0-3432-43f3-b519-c7d11b870dbd req-51db8d32-d54c-46ab-91e6-81119b369c89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.439 227766 DEBUG nova.network.neutron [req-50e750c0-3432-43f3-b519-c7d11b870dbd req-51db8d32-d54c-46ab-91e6-81119b369c89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Refreshing network info cache for port eb5d7473-2661-43a3-bf30-f8ccc9735ea6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.503 227766 DEBUG oslo_concurrency.lockutils [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.504 227766 DEBUG oslo_concurrency.lockutils [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.504 227766 DEBUG oslo_concurrency.lockutils [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.504 227766 DEBUG oslo_concurrency.lockutils [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.505 227766 DEBUG oslo_concurrency.lockutils [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.506 227766 INFO nova.compute.manager [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Terminating instance#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.507 227766 DEBUG nova.compute.manager [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:58:50 np0005593234 kernel: tapeb5d7473-26 (unregistering): left promiscuous mode
Jan 23 05:58:50 np0005593234 NetworkManager[48942]: <info>  [1769165930.5465] device (tapeb5d7473-26): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.551 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:58:50Z|00925|binding|INFO|Releasing lport eb5d7473-2661-43a3-bf30-f8ccc9735ea6 from this chassis (sb_readonly=0)
Jan 23 05:58:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:58:50Z|00926|binding|INFO|Setting lport eb5d7473-2661-43a3-bf30-f8ccc9735ea6 down in Southbound
Jan 23 05:58:50 np0005593234 ovn_controller[134547]: 2026-01-23T10:58:50Z|00927|binding|INFO|Removing iface tapeb5d7473-26 ovn-installed in OVS
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.553 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:50.557 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:da:56 10.100.0.13'], port_security=['fa:16:3e:a0:da:56 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '575c76d9-7306-429f-baca-4d450f37c388', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '13', 'neutron:security_group_ids': '6b9b22d8-1c0e-45a7-b51b-d7eb1b043141', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c950e56e-abf1-437a-896b-f8357baa2295, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=eb5d7473-2661-43a3-bf30-f8ccc9735ea6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:58:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:50.559 144381 INFO neutron.agent.ovn.metadata.agent [-] Port eb5d7473-2661-43a3-bf30-f8ccc9735ea6 in datapath 709bd24b-e32e-4388-bc40-5f1d023f1ad4 unbound from our chassis#033[00m
Jan 23 05:58:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:50.560 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 709bd24b-e32e-4388-bc40-5f1d023f1ad4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:58:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:50.561 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[07599f85-1267-415b-b80b-e1a071f7e456]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:50.562 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4 namespace which is not needed anymore#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.575 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:50 np0005593234 systemd[1]: machine-qemu\x2d103\x2dinstance\x2d000000d4.scope: Deactivated successfully.
Jan 23 05:58:50 np0005593234 systemd[1]: machine-qemu\x2d103\x2dinstance\x2d000000d4.scope: Consumed 1.938s CPU time.
Jan 23 05:58:50 np0005593234 systemd-machined[195626]: Machine qemu-103-instance-000000d4 terminated.
Jan 23 05:58:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:50.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:50 np0005593234 neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4[335218]: [NOTICE]   (335222) : haproxy version is 2.8.14-c23fe91
Jan 23 05:58:50 np0005593234 neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4[335218]: [NOTICE]   (335222) : path to executable is /usr/sbin/haproxy
Jan 23 05:58:50 np0005593234 neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4[335218]: [WARNING]  (335222) : Exiting Master process...
Jan 23 05:58:50 np0005593234 neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4[335218]: [ALERT]    (335222) : Current worker (335224) exited with code 143 (Terminated)
Jan 23 05:58:50 np0005593234 neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4[335218]: [WARNING]  (335222) : All workers exited. Exiting... (0)
Jan 23 05:58:50 np0005593234 systemd[1]: libpod-1459658414a4e3dcbe4b5d8b5c6b3f144da40d44186ba55e737ee44f69e071d2.scope: Deactivated successfully.
Jan 23 05:58:50 np0005593234 podman[335260]: 2026-01-23 10:58:50.701934057 +0000 UTC m=+0.045610697 container died 1459658414a4e3dcbe4b5d8b5c6b3f144da40d44186ba55e737ee44f69e071d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.720 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.727 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:50 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1459658414a4e3dcbe4b5d8b5c6b3f144da40d44186ba55e737ee44f69e071d2-userdata-shm.mount: Deactivated successfully.
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.733 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:50 np0005593234 systemd[1]: var-lib-containers-storage-overlay-d6f23ee7b42b7c9bac6ae5f0a316f89b34002b5fa944904d1237df1bb7c58c87-merged.mount: Deactivated successfully.
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.744 227766 INFO nova.virt.libvirt.driver [-] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Instance destroyed successfully.#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.745 227766 DEBUG nova.objects.instance [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'resources' on Instance uuid 575c76d9-7306-429f-baca-4d450f37c388 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:58:50 np0005593234 podman[335260]: 2026-01-23 10:58:50.747365378 +0000 UTC m=+0.091042018 container cleanup 1459658414a4e3dcbe4b5d8b5c6b3f144da40d44186ba55e737ee44f69e071d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:58:50 np0005593234 systemd[1]: libpod-conmon-1459658414a4e3dcbe4b5d8b5c6b3f144da40d44186ba55e737ee44f69e071d2.scope: Deactivated successfully.
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.765 227766 DEBUG nova.virt.libvirt.vif [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T10:57:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1747551892',display_name='tempest-TestNetworkAdvancedServerOps-server-1747551892',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1747551892',id=212,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNLmxmIGURR44ldVtZLj5mDiJy9rKp0lzbESRue6Wnd2DycIawGU9GHFwoSAqxCF+VQclApTf6ivzJfIihq8OTikttBlSKyYddrT5smaN5oWB8DMKx3zgAFW+FO8eDOMGA==',key_name='tempest-TestNetworkAdvancedServerOps-647126014',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:58:08Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-pdznqph4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:58:44Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=575c76d9-7306-429f-baca-4d450f37c388,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.766 227766 DEBUG nova.network.os_vif_util [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.250", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.767 227766 DEBUG nova.network.os_vif_util [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:da:56,bridge_name='br-int',has_traffic_filtering=True,id=eb5d7473-2661-43a3-bf30-f8ccc9735ea6,network=Network(709bd24b-e32e-4388-bc40-5f1d023f1ad4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb5d7473-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.767 227766 DEBUG os_vif [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:da:56,bridge_name='br-int',has_traffic_filtering=True,id=eb5d7473-2661-43a3-bf30-f8ccc9735ea6,network=Network(709bd24b-e32e-4388-bc40-5f1d023f1ad4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb5d7473-26') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.769 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:58:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:50.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.769 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb5d7473-26, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.771 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.772 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.774 227766 INFO os_vif [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:da:56,bridge_name='br-int',has_traffic_filtering=True,id=eb5d7473-2661-43a3-bf30-f8ccc9735ea6,network=Network(709bd24b-e32e-4388-bc40-5f1d023f1ad4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapeb5d7473-26')#033[00m
Jan 23 05:58:50 np0005593234 podman[335299]: 2026-01-23 10:58:50.809580885 +0000 UTC m=+0.041552341 container remove 1459658414a4e3dcbe4b5d8b5c6b3f144da40d44186ba55e737ee44f69e071d2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 05:58:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:50.815 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ef7f5ef9-a084-458b-8d7d-8e59a90798f6]: (4, ('Fri Jan 23 10:58:50 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4 (1459658414a4e3dcbe4b5d8b5c6b3f144da40d44186ba55e737ee44f69e071d2)\n1459658414a4e3dcbe4b5d8b5c6b3f144da40d44186ba55e737ee44f69e071d2\nFri Jan 23 10:58:50 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4 (1459658414a4e3dcbe4b5d8b5c6b3f144da40d44186ba55e737ee44f69e071d2)\n1459658414a4e3dcbe4b5d8b5c6b3f144da40d44186ba55e737ee44f69e071d2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:50.817 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[578ee9db-e8fb-4639-997d-460904a43b08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:50.819 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap709bd24b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.821 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:50 np0005593234 kernel: tap709bd24b-e0: left promiscuous mode
Jan 23 05:58:50 np0005593234 nova_compute[227762]: 2026-01-23 10:58:50.835 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:50.838 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0b3deb58-b89b-4a4d-9d5f-603649f514d1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:50.852 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dc2a4441-33f0-4b5f-91b9-186c809d06be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:50.853 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f3e9a222-50dd-4f8d-bd7e-c5b384e636a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:50.872 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5a4cb87e-a439-4c74-8fc6-60709c8638e4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 981617, 'reachable_time': 29819, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 335331, 'error': None, 'target': 'ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:50.876 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-709bd24b-e32e-4388-bc40-5f1d023f1ad4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:58:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:50.876 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[3a3debf4-3890-4747-8e12-fbded1fb7a6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:58:50 np0005593234 systemd[1]: run-netns-ovnmeta\x2d709bd24b\x2de32e\x2d4388\x2dbc40\x2d5f1d023f1ad4.mount: Deactivated successfully.
Jan 23 05:58:51 np0005593234 nova_compute[227762]: 2026-01-23 10:58:51.025 227766 DEBUG nova.compute.manager [req-09ee04dc-f226-4929-b577-6c1da804169b req-d9aabdeb-d715-4a22-ae82-19181f1225b7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-unplugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:51 np0005593234 nova_compute[227762]: 2026-01-23 10:58:51.026 227766 DEBUG oslo_concurrency.lockutils [req-09ee04dc-f226-4929-b577-6c1da804169b req-d9aabdeb-d715-4a22-ae82-19181f1225b7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:51 np0005593234 nova_compute[227762]: 2026-01-23 10:58:51.026 227766 DEBUG oslo_concurrency.lockutils [req-09ee04dc-f226-4929-b577-6c1da804169b req-d9aabdeb-d715-4a22-ae82-19181f1225b7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:51 np0005593234 nova_compute[227762]: 2026-01-23 10:58:51.027 227766 DEBUG oslo_concurrency.lockutils [req-09ee04dc-f226-4929-b577-6c1da804169b req-d9aabdeb-d715-4a22-ae82-19181f1225b7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:51 np0005593234 nova_compute[227762]: 2026-01-23 10:58:51.027 227766 DEBUG nova.compute.manager [req-09ee04dc-f226-4929-b577-6c1da804169b req-d9aabdeb-d715-4a22-ae82-19181f1225b7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] No waiting events found dispatching network-vif-unplugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:58:51 np0005593234 nova_compute[227762]: 2026-01-23 10:58:51.028 227766 DEBUG nova.compute.manager [req-09ee04dc-f226-4929-b577-6c1da804169b req-d9aabdeb-d715-4a22-ae82-19181f1225b7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-unplugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:58:51 np0005593234 nova_compute[227762]: 2026-01-23 10:58:51.208 227766 INFO nova.virt.libvirt.driver [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Deleting instance files /var/lib/nova/instances/575c76d9-7306-429f-baca-4d450f37c388_del#033[00m
Jan 23 05:58:51 np0005593234 nova_compute[227762]: 2026-01-23 10:58:51.209 227766 INFO nova.virt.libvirt.driver [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Deletion of /var/lib/nova/instances/575c76d9-7306-429f-baca-4d450f37c388_del complete#033[00m
Jan 23 05:58:51 np0005593234 nova_compute[227762]: 2026-01-23 10:58:51.268 227766 INFO nova.compute.manager [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Took 0.76 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 05:58:51 np0005593234 nova_compute[227762]: 2026-01-23 10:58:51.269 227766 DEBUG oslo.service.loopingcall [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 05:58:51 np0005593234 nova_compute[227762]: 2026-01-23 10:58:51.269 227766 DEBUG nova.compute.manager [-] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 05:58:51 np0005593234 nova_compute[227762]: 2026-01-23 10:58:51.269 227766 DEBUG nova.network.neutron [-] [instance: 575c76d9-7306-429f-baca-4d450f37c388] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 05:58:52 np0005593234 nova_compute[227762]: 2026-01-23 10:58:52.520 227766 DEBUG nova.network.neutron [-] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:58:52 np0005593234 nova_compute[227762]: 2026-01-23 10:58:52.543 227766 INFO nova.compute.manager [-] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Took 1.27 seconds to deallocate network for instance.#033[00m
Jan 23 05:58:52 np0005593234 nova_compute[227762]: 2026-01-23 10:58:52.588 227766 DEBUG nova.compute.manager [req-9df7bb81-19ab-49aa-a0dc-f24621db86ea req-28bd72d9-237a-4e25-8774-63b2a32fb869 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-deleted-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:52 np0005593234 nova_compute[227762]: 2026-01-23 10:58:52.591 227766 DEBUG oslo_concurrency.lockutils [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:52 np0005593234 nova_compute[227762]: 2026-01-23 10:58:52.591 227766 DEBUG oslo_concurrency.lockutils [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:52 np0005593234 nova_compute[227762]: 2026-01-23 10:58:52.598 227766 DEBUG oslo_concurrency.lockutils [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.007s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:52 np0005593234 nova_compute[227762]: 2026-01-23 10:58:52.632 227766 INFO nova.scheduler.client.report [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Deleted allocations for instance 575c76d9-7306-429f-baca-4d450f37c388#033[00m
Jan 23 05:58:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:52.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:52 np0005593234 nova_compute[227762]: 2026-01-23 10:58:52.644 227766 DEBUG nova.network.neutron [req-50e750c0-3432-43f3-b519-c7d11b870dbd req-51db8d32-d54c-46ab-91e6-81119b369c89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updated VIF entry in instance network info cache for port eb5d7473-2661-43a3-bf30-f8ccc9735ea6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:58:52 np0005593234 nova_compute[227762]: 2026-01-23 10:58:52.645 227766 DEBUG nova.network.neutron [req-50e750c0-3432-43f3-b519-c7d11b870dbd req-51db8d32-d54c-46ab-91e6-81119b369c89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Updating instance_info_cache with network_info: [{"id": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "address": "fa:16:3e:a0:da:56", "network": {"id": "709bd24b-e32e-4388-bc40-5f1d023f1ad4", "bridge": "br-int", "label": "tempest-network-smoke--346964421", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapeb5d7473-26", "ovs_interfaceid": "eb5d7473-2661-43a3-bf30-f8ccc9735ea6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:58:52 np0005593234 nova_compute[227762]: 2026-01-23 10:58:52.673 227766 DEBUG oslo_concurrency.lockutils [req-50e750c0-3432-43f3-b519-c7d11b870dbd req-51db8d32-d54c-46ab-91e6-81119b369c89 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-575c76d9-7306-429f-baca-4d450f37c388" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:58:52 np0005593234 nova_compute[227762]: 2026-01-23 10:58:52.759 227766 DEBUG oslo_concurrency.lockutils [None req-10cd2d79-707d-49e1-906e-c2273e3621a1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:58:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:52.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:58:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:53 np0005593234 nova_compute[227762]: 2026-01-23 10:58:53.137 227766 DEBUG nova.compute.manager [req-6d37f29e-885a-4e3b-a634-54091c441319 req-700ebd2d-f96d-4357-8b89-6630d9abb5a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:58:53 np0005593234 nova_compute[227762]: 2026-01-23 10:58:53.138 227766 DEBUG oslo_concurrency.lockutils [req-6d37f29e-885a-4e3b-a634-54091c441319 req-700ebd2d-f96d-4357-8b89-6630d9abb5a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "575c76d9-7306-429f-baca-4d450f37c388-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:58:53 np0005593234 nova_compute[227762]: 2026-01-23 10:58:53.138 227766 DEBUG oslo_concurrency.lockutils [req-6d37f29e-885a-4e3b-a634-54091c441319 req-700ebd2d-f96d-4357-8b89-6630d9abb5a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:58:53 np0005593234 nova_compute[227762]: 2026-01-23 10:58:53.138 227766 DEBUG oslo_concurrency.lockutils [req-6d37f29e-885a-4e3b-a634-54091c441319 req-700ebd2d-f96d-4357-8b89-6630d9abb5a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "575c76d9-7306-429f-baca-4d450f37c388-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:58:53 np0005593234 nova_compute[227762]: 2026-01-23 10:58:53.139 227766 DEBUG nova.compute.manager [req-6d37f29e-885a-4e3b-a634-54091c441319 req-700ebd2d-f96d-4357-8b89-6630d9abb5a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] No waiting events found dispatching network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:58:53 np0005593234 nova_compute[227762]: 2026-01-23 10:58:53.139 227766 WARNING nova.compute.manager [req-6d37f29e-885a-4e3b-a634-54091c441319 req-700ebd2d-f96d-4357-8b89-6630d9abb5a9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Received unexpected event network-vif-plugged-eb5d7473-2661-43a3-bf30-f8ccc9735ea6 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 05:58:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:54.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:54.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:55 np0005593234 nova_compute[227762]: 2026-01-23 10:58:55.724 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:55 np0005593234 nova_compute[227762]: 2026-01-23 10:58:55.772 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:58:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:56.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:58:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:56.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:57 np0005593234 nova_compute[227762]: 2026-01-23 10:58:57.273 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:57 np0005593234 nova_compute[227762]: 2026-01-23 10:58:57.386 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:58:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:58:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:58:58.582 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '92'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:58:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:58:58.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:58:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:58:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:58:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:58:58.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:59:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:00.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:59:00 np0005593234 nova_compute[227762]: 2026-01-23 10:59:00.774 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:00.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:01 np0005593234 podman[335389]: 2026-01-23 10:59:01.761328682 +0000 UTC m=+0.055261299 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 05:59:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:59:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:02.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:59:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:02.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:04.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:04.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:05 np0005593234 nova_compute[227762]: 2026-01-23 10:59:05.741 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165930.7396698, 575c76d9-7306-429f-baca-4d450f37c388 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:59:05 np0005593234 nova_compute[227762]: 2026-01-23 10:59:05.741 227766 INFO nova.compute.manager [-] [instance: 575c76d9-7306-429f-baca-4d450f37c388] VM Stopped (Lifecycle Event)#033[00m
Jan 23 05:59:05 np0005593234 nova_compute[227762]: 2026-01-23 10:59:05.775 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:59:05 np0005593234 nova_compute[227762]: 2026-01-23 10:59:05.776 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:06.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:06.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:06 np0005593234 nova_compute[227762]: 2026-01-23 10:59:06.795 227766 DEBUG nova.compute.manager [None req-6eba85f0-01e5-4fa2-a887-8df81395e6e1 - - - - - -] [instance: 575c76d9-7306-429f-baca-4d450f37c388] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:59:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:59:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:08.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:59:08 np0005593234 podman[335415]: 2026-01-23 10:59:08.783034029 +0000 UTC m=+0.081009395 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 05:59:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:08.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:10.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:10 np0005593234 nova_compute[227762]: 2026-01-23 10:59:10.777 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:10.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:12.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:59:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:12.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:59:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.164 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "7362e95f-78ad-433d-a32f-700454cf3816" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.165 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.191 227766 DEBUG nova.compute.manager [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.363 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.363 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.370 227766 DEBUG nova.virt.hardware [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.370 227766 INFO nova.compute.claims [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.467 227766 DEBUG oslo_concurrency.processutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:59:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:14.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:59:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:14.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:59:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:59:14 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2970528158' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.899 227766 DEBUG oslo_concurrency.processutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.911 227766 DEBUG nova.compute.provider_tree [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.927 227766 DEBUG nova.scheduler.client.report [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.950 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.951 227766 DEBUG nova.compute.manager [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.997 227766 DEBUG nova.compute.manager [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 05:59:14 np0005593234 nova_compute[227762]: 2026-01-23 10:59:14.998 227766 DEBUG nova.network.neutron [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.018 227766 INFO nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.038 227766 DEBUG nova.compute.manager [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.127 227766 DEBUG nova.compute.manager [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.129 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.129 227766 INFO nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Creating image(s)#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.165 227766 DEBUG nova.storage.rbd_utils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 7362e95f-78ad-433d-a32f-700454cf3816_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.203 227766 DEBUG nova.storage.rbd_utils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 7362e95f-78ad-433d-a32f-700454cf3816_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.236 227766 DEBUG nova.storage.rbd_utils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 7362e95f-78ad-433d-a32f-700454cf3816_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.241 227766 DEBUG oslo_concurrency.processutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.302 227766 DEBUG nova.policy [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '420c366dc5dc45a48da4e0b18c93043f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c06f98b51aeb48de91d116fda54a161f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.329 227766 DEBUG oslo_concurrency.processutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.330 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.330 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.330 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.360 227766 DEBUG nova.storage.rbd_utils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 7362e95f-78ad-433d-a32f-700454cf3816_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.364 227766 DEBUG oslo_concurrency.processutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 7362e95f-78ad-433d-a32f-700454cf3816_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.779 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:59:15 np0005593234 nova_compute[227762]: 2026-01-23 10:59:15.933 227766 DEBUG nova.network.neutron [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Successfully created port: 5afebfcd-4030-4d9b-90d1-046a64cb92e1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 05:59:16 np0005593234 nova_compute[227762]: 2026-01-23 10:59:16.034 227766 DEBUG oslo_concurrency.processutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 7362e95f-78ad-433d-a32f-700454cf3816_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.671s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:59:16 np0005593234 nova_compute[227762]: 2026-01-23 10:59:16.119 227766 DEBUG nova.storage.rbd_utils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] resizing rbd image 7362e95f-78ad-433d-a32f-700454cf3816_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 05:59:16 np0005593234 nova_compute[227762]: 2026-01-23 10:59:16.278 227766 DEBUG nova.objects.instance [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'migration_context' on Instance uuid 7362e95f-78ad-433d-a32f-700454cf3816 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:59:16 np0005593234 nova_compute[227762]: 2026-01-23 10:59:16.294 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 05:59:16 np0005593234 nova_compute[227762]: 2026-01-23 10:59:16.294 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Ensure instance console log exists: /var/lib/nova/instances/7362e95f-78ad-433d-a32f-700454cf3816/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 05:59:16 np0005593234 nova_compute[227762]: 2026-01-23 10:59:16.295 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:16 np0005593234 nova_compute[227762]: 2026-01-23 10:59:16.295 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:16 np0005593234 nova_compute[227762]: 2026-01-23 10:59:16.295 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:16.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:16.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:17 np0005593234 nova_compute[227762]: 2026-01-23 10:59:17.328 227766 DEBUG nova.network.neutron [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Successfully updated port: 5afebfcd-4030-4d9b-90d1-046a64cb92e1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 05:59:17 np0005593234 nova_compute[227762]: 2026-01-23 10:59:17.347 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-7362e95f-78ad-433d-a32f-700454cf3816" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:59:17 np0005593234 nova_compute[227762]: 2026-01-23 10:59:17.347 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-7362e95f-78ad-433d-a32f-700454cf3816" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:59:17 np0005593234 nova_compute[227762]: 2026-01-23 10:59:17.347 227766 DEBUG nova.network.neutron [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 05:59:17 np0005593234 nova_compute[227762]: 2026-01-23 10:59:17.407 227766 DEBUG nova.compute.manager [req-88bf7462-f88c-4650-8b1e-e82174dfa03c req-48d4bb75-ae53-42e6-8042-db976522714a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Received event network-changed-5afebfcd-4030-4d9b-90d1-046a64cb92e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:59:17 np0005593234 nova_compute[227762]: 2026-01-23 10:59:17.408 227766 DEBUG nova.compute.manager [req-88bf7462-f88c-4650-8b1e-e82174dfa03c req-48d4bb75-ae53-42e6-8042-db976522714a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Refreshing instance network info cache due to event network-changed-5afebfcd-4030-4d9b-90d1-046a64cb92e1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:59:17 np0005593234 nova_compute[227762]: 2026-01-23 10:59:17.408 227766 DEBUG oslo_concurrency.lockutils [req-88bf7462-f88c-4650-8b1e-e82174dfa03c req-48d4bb75-ae53-42e6-8042-db976522714a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-7362e95f-78ad-433d-a32f-700454cf3816" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:59:17 np0005593234 nova_compute[227762]: 2026-01-23 10:59:17.458 227766 DEBUG nova.network.neutron [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 05:59:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.307 227766 DEBUG nova.network.neutron [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Updating instance_info_cache with network_info: [{"id": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "address": "fa:16:3e:50:8b:df", "network": {"id": "0c091981-560c-4de9-99d3-f975cddf2b53", "bridge": "br-int", "label": "tempest-network-smoke--155774509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afebfcd-40", "ovs_interfaceid": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.330 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-7362e95f-78ad-433d-a32f-700454cf3816" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.330 227766 DEBUG nova.compute.manager [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Instance network_info: |[{"id": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "address": "fa:16:3e:50:8b:df", "network": {"id": "0c091981-560c-4de9-99d3-f975cddf2b53", "bridge": "br-int", "label": "tempest-network-smoke--155774509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afebfcd-40", "ovs_interfaceid": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.331 227766 DEBUG oslo_concurrency.lockutils [req-88bf7462-f88c-4650-8b1e-e82174dfa03c req-48d4bb75-ae53-42e6-8042-db976522714a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-7362e95f-78ad-433d-a32f-700454cf3816" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.331 227766 DEBUG nova.network.neutron [req-88bf7462-f88c-4650-8b1e-e82174dfa03c req-48d4bb75-ae53-42e6-8042-db976522714a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Refreshing network info cache for port 5afebfcd-4030-4d9b-90d1-046a64cb92e1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.333 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Start _get_guest_xml network_info=[{"id": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "address": "fa:16:3e:50:8b:df", "network": {"id": "0c091981-560c-4de9-99d3-f975cddf2b53", "bridge": "br-int", "label": "tempest-network-smoke--155774509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afebfcd-40", "ovs_interfaceid": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.343 227766 WARNING nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.352 227766 DEBUG nova.virt.libvirt.host [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.353 227766 DEBUG nova.virt.libvirt.host [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.361 227766 DEBUG nova.virt.libvirt.host [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.361 227766 DEBUG nova.virt.libvirt.host [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.362 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.363 227766 DEBUG nova.virt.hardware [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.363 227766 DEBUG nova.virt.hardware [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.363 227766 DEBUG nova.virt.hardware [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.363 227766 DEBUG nova.virt.hardware [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.364 227766 DEBUG nova.virt.hardware [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.364 227766 DEBUG nova.virt.hardware [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.364 227766 DEBUG nova.virt.hardware [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.364 227766 DEBUG nova.virt.hardware [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.364 227766 DEBUG nova.virt.hardware [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.364 227766 DEBUG nova.virt.hardware [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.365 227766 DEBUG nova.virt.hardware [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.368 227766 DEBUG oslo_concurrency.processutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:59:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:18.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.764 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.765 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.765 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.765 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.766 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:59:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:59:18 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4059972488' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:59:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 05:59:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:18.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.826 227766 DEBUG oslo_concurrency.processutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.853 227766 DEBUG nova.storage.rbd_utils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 7362e95f-78ad-433d-a32f-700454cf3816_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:59:18 np0005593234 nova_compute[227762]: 2026-01-23 10:59:18.857 227766 DEBUG oslo_concurrency.processutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:59:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:59:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3800244151' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.217 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.312 227766 DEBUG nova.network.neutron [req-88bf7462-f88c-4650-8b1e-e82174dfa03c req-48d4bb75-ae53-42e6-8042-db976522714a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Updated VIF entry in instance network info cache for port 5afebfcd-4030-4d9b-90d1-046a64cb92e1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.313 227766 DEBUG nova.network.neutron [req-88bf7462-f88c-4650-8b1e-e82174dfa03c req-48d4bb75-ae53-42e6-8042-db976522714a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Updating instance_info_cache with network_info: [{"id": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "address": "fa:16:3e:50:8b:df", "network": {"id": "0c091981-560c-4de9-99d3-f975cddf2b53", "bridge": "br-int", "label": "tempest-network-smoke--155774509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afebfcd-40", "ovs_interfaceid": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:59:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 05:59:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1433319980' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.334 227766 DEBUG oslo_concurrency.processutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.336 227766 DEBUG nova.virt.libvirt.vif [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:59:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-509682616',display_name='tempest-TestNetworkAdvancedServerOps-server-509682616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-509682616',id=213,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKyriUxXvNltk/O+zE9daakY9SI2zEc97QqjMK/+blZNB+bRt+v8usoOjGwtuzN0xO+IjRyT6c2+NhDQ52GFSyepRN8aIWYSyw+6u3TL4lbwnVvMU36tF0SZSRh2ptkRgw==',key_name='tempest-TestNetworkAdvancedServerOps-495860055',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-tt0cqd7s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:59:15Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=7362e95f-78ad-433d-a32f-700454cf3816,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "address": "fa:16:3e:50:8b:df", "network": {"id": "0c091981-560c-4de9-99d3-f975cddf2b53", "bridge": "br-int", "label": "tempest-network-smoke--155774509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afebfcd-40", "ovs_interfaceid": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.336 227766 DEBUG nova.network.os_vif_util [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "address": "fa:16:3e:50:8b:df", "network": {"id": "0c091981-560c-4de9-99d3-f975cddf2b53", "bridge": "br-int", "label": "tempest-network-smoke--155774509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afebfcd-40", "ovs_interfaceid": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.337 227766 DEBUG nova.network.os_vif_util [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:8b:df,bridge_name='br-int',has_traffic_filtering=True,id=5afebfcd-4030-4d9b-90d1-046a64cb92e1,network=Network(0c091981-560c-4de9-99d3-f975cddf2b53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afebfcd-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.339 227766 DEBUG nova.objects.instance [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_devices' on Instance uuid 7362e95f-78ad-433d-a32f-700454cf3816 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.342 227766 DEBUG oslo_concurrency.lockutils [req-88bf7462-f88c-4650-8b1e-e82174dfa03c req-48d4bb75-ae53-42e6-8042-db976522714a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-7362e95f-78ad-433d-a32f-700454cf3816" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.357 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] End _get_guest_xml xml=<domain type="kvm">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  <uuid>7362e95f-78ad-433d-a32f-700454cf3816</uuid>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  <name>instance-000000d5</name>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-509682616</nova:name>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 10:59:18</nova:creationTime>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <nova:user uuid="420c366dc5dc45a48da4e0b18c93043f">tempest-TestNetworkAdvancedServerOps-1886747874-project-member</nova:user>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <nova:project uuid="c06f98b51aeb48de91d116fda54a161f">tempest-TestNetworkAdvancedServerOps-1886747874</nova:project>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <nova:port uuid="5afebfcd-4030-4d9b-90d1-046a64cb92e1">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <system>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <entry name="serial">7362e95f-78ad-433d-a32f-700454cf3816</entry>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <entry name="uuid">7362e95f-78ad-433d-a32f-700454cf3816</entry>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    </system>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  <os>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  </os>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  <features>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  </features>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  </clock>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  <devices>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/7362e95f-78ad-433d-a32f-700454cf3816_disk">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/7362e95f-78ad-433d-a32f-700454cf3816_disk.config">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      </source>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      </auth>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    </disk>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:50:8b:df"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <target dev="tap5afebfcd-40"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    </interface>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/7362e95f-78ad-433d-a32f-700454cf3816/console.log" append="off"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    </serial>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <video>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    </video>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    </rng>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 05:59:19 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 05:59:19 np0005593234 nova_compute[227762]:  </devices>
Jan 23 05:59:19 np0005593234 nova_compute[227762]: </domain>
Jan 23 05:59:19 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.359 227766 DEBUG nova.compute.manager [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Preparing to wait for external event network-vif-plugged-5afebfcd-4030-4d9b-90d1-046a64cb92e1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.359 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "7362e95f-78ad-433d-a32f-700454cf3816-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.360 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.360 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.361 227766 DEBUG nova.virt.libvirt.vif [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:59:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-509682616',display_name='tempest-TestNetworkAdvancedServerOps-server-509682616',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-509682616',id=213,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKyriUxXvNltk/O+zE9daakY9SI2zEc97QqjMK/+blZNB+bRt+v8usoOjGwtuzN0xO+IjRyT6c2+NhDQ52GFSyepRN8aIWYSyw+6u3TL4lbwnVvMU36tF0SZSRh2ptkRgw==',key_name='tempest-TestNetworkAdvancedServerOps-495860055',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-tt0cqd7s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:59:15Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=7362e95f-78ad-433d-a32f-700454cf3816,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "address": "fa:16:3e:50:8b:df", "network": {"id": "0c091981-560c-4de9-99d3-f975cddf2b53", "bridge": "br-int", "label": "tempest-network-smoke--155774509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afebfcd-40", "ovs_interfaceid": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.362 227766 DEBUG nova.network.os_vif_util [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "address": "fa:16:3e:50:8b:df", "network": {"id": "0c091981-560c-4de9-99d3-f975cddf2b53", "bridge": "br-int", "label": "tempest-network-smoke--155774509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afebfcd-40", "ovs_interfaceid": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.363 227766 DEBUG nova.network.os_vif_util [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:50:8b:df,bridge_name='br-int',has_traffic_filtering=True,id=5afebfcd-4030-4d9b-90d1-046a64cb92e1,network=Network(0c091981-560c-4de9-99d3-f975cddf2b53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afebfcd-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.363 227766 DEBUG os_vif [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:8b:df,bridge_name='br-int',has_traffic_filtering=True,id=5afebfcd-4030-4d9b-90d1-046a64cb92e1,network=Network(0c091981-560c-4de9-99d3-f975cddf2b53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afebfcd-40') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.364 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.364 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.365 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.370 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.370 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5afebfcd-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.371 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5afebfcd-40, col_values=(('external_ids', {'iface-id': '5afebfcd-4030-4d9b-90d1-046a64cb92e1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:50:8b:df', 'vm-uuid': '7362e95f-78ad-433d-a32f-700454cf3816'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.372 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:19 np0005593234 NetworkManager[48942]: <info>  [1769165959.3736] manager: (tap5afebfcd-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/440)
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.375 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.379 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.380 227766 INFO os_vif [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:50:8b:df,bridge_name='br-int',has_traffic_filtering=True,id=5afebfcd-4030-4d9b-90d1-046a64cb92e1,network=Network(0c091981-560c-4de9-99d3-f975cddf2b53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afebfcd-40')#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.428 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.428 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.429 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No VIF found with MAC fa:16:3e:50:8b:df, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.429 227766 INFO nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Using config drive#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.468 227766 DEBUG nova.storage.rbd_utils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 7362e95f-78ad-433d-a32f-700454cf3816_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.492 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.493 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4106MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.493 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.494 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.564 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 7362e95f-78ad-433d-a32f-700454cf3816 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.564 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.564 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.598 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.788 227766 INFO nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Creating config drive at /var/lib/nova/instances/7362e95f-78ad-433d-a32f-700454cf3816/disk.config#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.795 227766 DEBUG oslo_concurrency.processutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7362e95f-78ad-433d-a32f-700454cf3816/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq0135erd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.934 227766 DEBUG oslo_concurrency.processutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7362e95f-78ad-433d-a32f-700454cf3816/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq0135erd" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.972 227766 DEBUG nova.storage.rbd_utils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 7362e95f-78ad-433d-a32f-700454cf3816_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 05:59:19 np0005593234 nova_compute[227762]: 2026-01-23 10:59:19.976 227766 DEBUG oslo_concurrency.processutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7362e95f-78ad-433d-a32f-700454cf3816/disk.config 7362e95f-78ad-433d-a32f-700454cf3816_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 05:59:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 05:59:20 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3090118431' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 05:59:20 np0005593234 nova_compute[227762]: 2026-01-23 10:59:20.086 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:59:20 np0005593234 nova_compute[227762]: 2026-01-23 10:59:20.093 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 05:59:20 np0005593234 nova_compute[227762]: 2026-01-23 10:59:20.112 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 05:59:20 np0005593234 nova_compute[227762]: 2026-01-23 10:59:20.154 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 05:59:20 np0005593234 nova_compute[227762]: 2026-01-23 10:59:20.154 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.661s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:20 np0005593234 nova_compute[227762]: 2026-01-23 10:59:20.182 227766 DEBUG oslo_concurrency.processutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7362e95f-78ad-433d-a32f-700454cf3816/disk.config 7362e95f-78ad-433d-a32f-700454cf3816_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.206s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 05:59:20 np0005593234 nova_compute[227762]: 2026-01-23 10:59:20.183 227766 INFO nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Deleting local config drive /var/lib/nova/instances/7362e95f-78ad-433d-a32f-700454cf3816/disk.config because it was imported into RBD.#033[00m
Jan 23 05:59:20 np0005593234 kernel: tap5afebfcd-40: entered promiscuous mode
Jan 23 05:59:20 np0005593234 NetworkManager[48942]: <info>  [1769165960.2339] manager: (tap5afebfcd-40): new Tun device (/org/freedesktop/NetworkManager/Devices/441)
Jan 23 05:59:20 np0005593234 nova_compute[227762]: 2026-01-23 10:59:20.233 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:59:20Z|00928|binding|INFO|Claiming lport 5afebfcd-4030-4d9b-90d1-046a64cb92e1 for this chassis.
Jan 23 05:59:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:59:20Z|00929|binding|INFO|5afebfcd-4030-4d9b-90d1-046a64cb92e1: Claiming fa:16:3e:50:8b:df 10.100.0.14
Jan 23 05:59:20 np0005593234 nova_compute[227762]: 2026-01-23 10:59:20.240 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.247 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:8b:df 10.100.0.14'], port_security=['fa:16:3e:50:8b:df 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7362e95f-78ad-433d-a32f-700454cf3816', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c091981-560c-4de9-99d3-f975cddf2b53', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '12e30436-ebe4-4dcb-9cc1-5b6eafaa9ddb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=56aace94-4ec0-4335-af49-e5a323e8f28b, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=5afebfcd-4030-4d9b-90d1-046a64cb92e1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.248 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 5afebfcd-4030-4d9b-90d1-046a64cb92e1 in datapath 0c091981-560c-4de9-99d3-f975cddf2b53 bound to our chassis#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.249 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0c091981-560c-4de9-99d3-f975cddf2b53#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.260 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ba2ad719-354e-42a3-a310-1a5da0de924e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.261 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0c091981-51 in ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.262 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0c091981-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.263 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[05f43b72-b515-4576-a6cb-6d2f37ff7fbe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.263 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d65bf230-7f25-42bd-be50-d3d14ceab42f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 systemd-udevd[335865]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 05:59:20 np0005593234 systemd-machined[195626]: New machine qemu-104-instance-000000d5.
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.275 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[012cbc83-130f-40fa-9f4b-5fbd4ee3d4a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 NetworkManager[48942]: <info>  [1769165960.2823] device (tap5afebfcd-40): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 05:59:20 np0005593234 NetworkManager[48942]: <info>  [1769165960.2833] device (tap5afebfcd-40): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 05:59:20 np0005593234 systemd[1]: Started Virtual Machine qemu-104-instance-000000d5.
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.296 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[58c637f3-5097-47a8-9cce-9b072b2a9997]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:59:20Z|00930|binding|INFO|Setting lport 5afebfcd-4030-4d9b-90d1-046a64cb92e1 ovn-installed in OVS
Jan 23 05:59:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:59:20Z|00931|binding|INFO|Setting lport 5afebfcd-4030-4d9b-90d1-046a64cb92e1 up in Southbound
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.332 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[38f8df5c-1733-4bbf-a07a-2cefcbc7b17a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 nova_compute[227762]: 2026-01-23 10:59:20.356 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:20 np0005593234 NetworkManager[48942]: <info>  [1769165960.3579] manager: (tap0c091981-50): new Veth device (/org/freedesktop/NetworkManager/Devices/442)
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.356 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[65c1ace6-7b8d-4427-aad0-4ff7116b4c50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.392 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[bbf0403a-5c2f-4c40-8b75-08f78a467a7b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.395 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[0b1602c5-ef80-4f2e-b078-a95942a8fea6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 NetworkManager[48942]: <info>  [1769165960.4185] device (tap0c091981-50): carrier: link connected
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.427 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[df69dacb-03fb-46f1-9e5f-b18a038c2743]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.442 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[32d77d8e-1a7c-465e-8bd8-5f279038d2e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c091981-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:98:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 285], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 985537, 'reachable_time': 25285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 335898, 'error': None, 'target': 'ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.458 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[847706da-0b33-423d-99c3-93c751e68f44]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:98be'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 985537, 'tstamp': 985537}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 335899, 'error': None, 'target': 'ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.476 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[60af463b-1964-4714-8e13-505fa3824308]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0c091981-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:98:be'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 285], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 985537, 'reachable_time': 25285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 335900, 'error': None, 'target': 'ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.509 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[24b61e54-70d0-4860-a523-7c6995e8ae21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.566 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[db5d02e3-736c-4f79-8111-87c810636e0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.568 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c091981-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.568 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.569 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c091981-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:59:20 np0005593234 NetworkManager[48942]: <info>  [1769165960.5723] manager: (tap0c091981-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/443)
Jan 23 05:59:20 np0005593234 nova_compute[227762]: 2026-01-23 10:59:20.572 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:20 np0005593234 kernel: tap0c091981-50: entered promiscuous mode
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.575 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0c091981-50, col_values=(('external_ids', {'iface-id': 'a2626475-68ca-4e99-bc68-e36e1438c38c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:59:20 np0005593234 ovn_controller[134547]: 2026-01-23T10:59:20Z|00932|binding|INFO|Releasing lport a2626475-68ca-4e99-bc68-e36e1438c38c from this chassis (sb_readonly=0)
Jan 23 05:59:20 np0005593234 nova_compute[227762]: 2026-01-23 10:59:20.576 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:20 np0005593234 nova_compute[227762]: 2026-01-23 10:59:20.589 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.594 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0c091981-560c-4de9-99d3-f975cddf2b53.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0c091981-560c-4de9-99d3-f975cddf2b53.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.595 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b5a986e4-c4c6-4650-8949-21963dbc37b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.596 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-0c091981-560c-4de9-99d3-f975cddf2b53
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/0c091981-560c-4de9-99d3-f975cddf2b53.pid.haproxy
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 0c091981-560c-4de9-99d3-f975cddf2b53
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 05:59:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:20.597 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53', 'env', 'PROCESS_TAG=haproxy-0c091981-560c-4de9-99d3-f975cddf2b53', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0c091981-560c-4de9-99d3-f975cddf2b53.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 05:59:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:20.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:20 np0005593234 nova_compute[227762]: 2026-01-23 10:59:20.782 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:20.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:20 np0005593234 podman[335946]: 2026-01-23 10:59:20.96635492 +0000 UTC m=+0.058192071 container create e95f46577d76c37c54c0a5416f003019b1d8478aedb7d3fc6df0cba58e89333f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 05:59:20 np0005593234 systemd[1]: Started libpod-conmon-e95f46577d76c37c54c0a5416f003019b1d8478aedb7d3fc6df0cba58e89333f.scope.
Jan 23 05:59:21 np0005593234 systemd[1]: Started libcrun container.
Jan 23 05:59:21 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ca00ed34b8b99ab5651961ad475f096e5b991062dbd6d1786721067b5eec75fa/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 05:59:21 np0005593234 podman[335946]: 2026-01-23 10:59:20.933872244 +0000 UTC m=+0.025709415 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 05:59:21 np0005593234 podman[335946]: 2026-01-23 10:59:21.039368034 +0000 UTC m=+0.131205215 container init e95f46577d76c37c54c0a5416f003019b1d8478aedb7d3fc6df0cba58e89333f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 05:59:21 np0005593234 podman[335946]: 2026-01-23 10:59:21.046642171 +0000 UTC m=+0.138479432 container start e95f46577d76c37c54c0a5416f003019b1d8478aedb7d3fc6df0cba58e89333f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Jan 23 05:59:21 np0005593234 neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53[335988]: [NOTICE]   (335993) : New worker (335995) forked
Jan 23 05:59:21 np0005593234 neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53[335988]: [NOTICE]   (335993) : Loading success.
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.078 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165961.0776627, 7362e95f-78ad-433d-a32f-700454cf3816 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.078 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] VM Started (Lifecycle Event)#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.083 227766 DEBUG nova.compute.manager [req-dbff6a9f-7eab-416e-8120-d034e51ac093 req-270b7e30-5f19-44d4-a3e1-67d097bd2ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Received event network-vif-plugged-5afebfcd-4030-4d9b-90d1-046a64cb92e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.083 227766 DEBUG oslo_concurrency.lockutils [req-dbff6a9f-7eab-416e-8120-d034e51ac093 req-270b7e30-5f19-44d4-a3e1-67d097bd2ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "7362e95f-78ad-433d-a32f-700454cf3816-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.084 227766 DEBUG oslo_concurrency.lockutils [req-dbff6a9f-7eab-416e-8120-d034e51ac093 req-270b7e30-5f19-44d4-a3e1-67d097bd2ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.084 227766 DEBUG oslo_concurrency.lockutils [req-dbff6a9f-7eab-416e-8120-d034e51ac093 req-270b7e30-5f19-44d4-a3e1-67d097bd2ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.084 227766 DEBUG nova.compute.manager [req-dbff6a9f-7eab-416e-8120-d034e51ac093 req-270b7e30-5f19-44d4-a3e1-67d097bd2ac2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Processing event network-vif-plugged-5afebfcd-4030-4d9b-90d1-046a64cb92e1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.085 227766 DEBUG nova.compute.manager [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.088 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.092 227766 INFO nova.virt.libvirt.driver [-] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Instance spawned successfully.#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.093 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.102 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.106 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.115 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.115 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.116 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.116 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.117 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.117 227766 DEBUG nova.virt.libvirt.driver [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.123 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.123 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165961.080199, 7362e95f-78ad-433d-a32f-700454cf3816 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.123 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.151 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.154 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165961.0874681, 7362e95f-78ad-433d-a32f-700454cf3816 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.154 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.155 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.156 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.156 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.184 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.184 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.187 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.189 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.194 227766 INFO nova.compute.manager [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Took 6.07 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.194 227766 DEBUG nova.compute.manager [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.205 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.247 227766 INFO nova.compute.manager [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Took 6.91 seconds to build instance.#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.263 227766 DEBUG oslo_concurrency.lockutils [None req-6283c943-6523-4e8b-99c5-13c7025b4d0e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.098s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:21 np0005593234 nova_compute[227762]: 2026-01-23 10:59:21.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:22.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:59:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:22.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:59:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:24 np0005593234 nova_compute[227762]: 2026-01-23 10:59:24.373 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:24 np0005593234 nova_compute[227762]: 2026-01-23 10:59:24.547 227766 DEBUG nova.compute.manager [req-51d45063-3a59-4081-aac9-d7fe0f6d6ca5 req-69d805f9-cbb2-42f8-b4d4-e4dd02cb2bbd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Received event network-vif-plugged-5afebfcd-4030-4d9b-90d1-046a64cb92e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:59:24 np0005593234 nova_compute[227762]: 2026-01-23 10:59:24.548 227766 DEBUG oslo_concurrency.lockutils [req-51d45063-3a59-4081-aac9-d7fe0f6d6ca5 req-69d805f9-cbb2-42f8-b4d4-e4dd02cb2bbd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "7362e95f-78ad-433d-a32f-700454cf3816-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:24 np0005593234 nova_compute[227762]: 2026-01-23 10:59:24.548 227766 DEBUG oslo_concurrency.lockutils [req-51d45063-3a59-4081-aac9-d7fe0f6d6ca5 req-69d805f9-cbb2-42f8-b4d4-e4dd02cb2bbd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:24 np0005593234 nova_compute[227762]: 2026-01-23 10:59:24.548 227766 DEBUG oslo_concurrency.lockutils [req-51d45063-3a59-4081-aac9-d7fe0f6d6ca5 req-69d805f9-cbb2-42f8-b4d4-e4dd02cb2bbd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:24 np0005593234 nova_compute[227762]: 2026-01-23 10:59:24.549 227766 DEBUG nova.compute.manager [req-51d45063-3a59-4081-aac9-d7fe0f6d6ca5 req-69d805f9-cbb2-42f8-b4d4-e4dd02cb2bbd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] No waiting events found dispatching network-vif-plugged-5afebfcd-4030-4d9b-90d1-046a64cb92e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:59:24 np0005593234 nova_compute[227762]: 2026-01-23 10:59:24.549 227766 WARNING nova.compute.manager [req-51d45063-3a59-4081-aac9-d7fe0f6d6ca5 req-69d805f9-cbb2-42f8-b4d4-e4dd02cb2bbd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Received unexpected event network-vif-plugged-5afebfcd-4030-4d9b-90d1-046a64cb92e1 for instance with vm_state active and task_state None.#033[00m
Jan 23 05:59:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:24.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:24.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:25 np0005593234 nova_compute[227762]: 2026-01-23 10:59:25.785 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:26.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:26.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:27 np0005593234 nova_compute[227762]: 2026-01-23 10:59:27.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:27 np0005593234 nova_compute[227762]: 2026-01-23 10:59:27.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 05:59:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:59:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:28.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:59:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:28.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:29 np0005593234 nova_compute[227762]: 2026-01-23 10:59:29.375 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:30.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:30 np0005593234 nova_compute[227762]: 2026-01-23 10:59:30.788 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:30.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:32.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:32 np0005593234 nova_compute[227762]: 2026-01-23 10:59:32.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:32 np0005593234 podman[336010]: 2026-01-23 10:59:32.778451608 +0000 UTC m=+0.062348572 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent)
Jan 23 05:59:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:32.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:33 np0005593234 nova_compute[227762]: 2026-01-23 10:59:33.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:34 np0005593234 nova_compute[227762]: 2026-01-23 10:59:34.376 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:34.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:34 np0005593234 nova_compute[227762]: 2026-01-23 10:59:34.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:34.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:35 np0005593234 ovn_controller[134547]: 2026-01-23T10:59:35Z|00120|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:50:8b:df 10.100.0.14
Jan 23 05:59:35 np0005593234 ovn_controller[134547]: 2026-01-23T10:59:35Z|00121|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:50:8b:df 10.100.0.14
Jan 23 05:59:35 np0005593234 nova_compute[227762]: 2026-01-23 10:59:35.790 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:59:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:36.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:59:36 np0005593234 nova_compute[227762]: 2026-01-23 10:59:36.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:59:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:36.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:59:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:38.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:38.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:39 np0005593234 ovn_controller[134547]: 2026-01-23T10:59:39Z|00933|binding|INFO|Releasing lport a2626475-68ca-4e99-bc68-e36e1438c38c from this chassis (sb_readonly=0)
Jan 23 05:59:39 np0005593234 nova_compute[227762]: 2026-01-23 10:59:39.224 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:39 np0005593234 NetworkManager[48942]: <info>  [1769165979.2261] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/444)
Jan 23 05:59:39 np0005593234 NetworkManager[48942]: <info>  [1769165979.2272] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/445)
Jan 23 05:59:39 np0005593234 nova_compute[227762]: 2026-01-23 10:59:39.227 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:39 np0005593234 ovn_controller[134547]: 2026-01-23T10:59:39Z|00934|binding|INFO|Releasing lport a2626475-68ca-4e99-bc68-e36e1438c38c from this chassis (sb_readonly=0)
Jan 23 05:59:39 np0005593234 nova_compute[227762]: 2026-01-23 10:59:39.231 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:39 np0005593234 nova_compute[227762]: 2026-01-23 10:59:39.377 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:39 np0005593234 nova_compute[227762]: 2026-01-23 10:59:39.738 227766 DEBUG nova.compute.manager [req-e7183db7-a1aa-4e9d-8f78-99f3f1ba2b0c req-e5433a5b-c625-4462-ac10-d7034bcc5820 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Received event network-changed-5afebfcd-4030-4d9b-90d1-046a64cb92e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:59:39 np0005593234 nova_compute[227762]: 2026-01-23 10:59:39.738 227766 DEBUG nova.compute.manager [req-e7183db7-a1aa-4e9d-8f78-99f3f1ba2b0c req-e5433a5b-c625-4462-ac10-d7034bcc5820 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Refreshing instance network info cache due to event network-changed-5afebfcd-4030-4d9b-90d1-046a64cb92e1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:59:39 np0005593234 nova_compute[227762]: 2026-01-23 10:59:39.738 227766 DEBUG oslo_concurrency.lockutils [req-e7183db7-a1aa-4e9d-8f78-99f3f1ba2b0c req-e5433a5b-c625-4462-ac10-d7034bcc5820 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-7362e95f-78ad-433d-a32f-700454cf3816" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:59:39 np0005593234 nova_compute[227762]: 2026-01-23 10:59:39.738 227766 DEBUG oslo_concurrency.lockutils [req-e7183db7-a1aa-4e9d-8f78-99f3f1ba2b0c req-e5433a5b-c625-4462-ac10-d7034bcc5820 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-7362e95f-78ad-433d-a32f-700454cf3816" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:59:39 np0005593234 nova_compute[227762]: 2026-01-23 10:59:39.739 227766 DEBUG nova.network.neutron [req-e7183db7-a1aa-4e9d-8f78-99f3f1ba2b0c req-e5433a5b-c625-4462-ac10-d7034bcc5820 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Refreshing network info cache for port 5afebfcd-4030-4d9b-90d1-046a64cb92e1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:59:39 np0005593234 podman[336083]: 2026-01-23 10:59:39.795045404 +0000 UTC m=+0.085066483 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 05:59:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:59:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:40.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:59:40 np0005593234 nova_compute[227762]: 2026-01-23 10:59:40.792 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:40.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:41 np0005593234 nova_compute[227762]: 2026-01-23 10:59:41.339 227766 DEBUG nova.network.neutron [req-e7183db7-a1aa-4e9d-8f78-99f3f1ba2b0c req-e5433a5b-c625-4462-ac10-d7034bcc5820 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Updated VIF entry in instance network info cache for port 5afebfcd-4030-4d9b-90d1-046a64cb92e1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 05:59:41 np0005593234 nova_compute[227762]: 2026-01-23 10:59:41.339 227766 DEBUG nova.network.neutron [req-e7183db7-a1aa-4e9d-8f78-99f3f1ba2b0c req-e5433a5b-c625-4462-ac10-d7034bcc5820 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Updating instance_info_cache with network_info: [{"id": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "address": "fa:16:3e:50:8b:df", "network": {"id": "0c091981-560c-4de9-99d3-f975cddf2b53", "bridge": "br-int", "label": "tempest-network-smoke--155774509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afebfcd-40", "ovs_interfaceid": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 05:59:41 np0005593234 nova_compute[227762]: 2026-01-23 10:59:41.358 227766 DEBUG oslo_concurrency.lockutils [req-e7183db7-a1aa-4e9d-8f78-99f3f1ba2b0c req-e5433a5b-c625-4462-ac10-d7034bcc5820 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-7362e95f-78ad-433d-a32f-700454cf3816" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 05:59:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 05:59:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:59:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 05:59:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:42.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:42.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:42.906 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:42.907 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:42.908 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:43 np0005593234 nova_compute[227762]: 2026-01-23 10:59:43.225 227766 INFO nova.compute.manager [None req-204bffa4-1f21-4bd1-b4b2-1a139622169c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Get console output#033[00m
Jan 23 05:59:43 np0005593234 nova_compute[227762]: 2026-01-23 10:59:43.232 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:59:43 np0005593234 nova_compute[227762]: 2026-01-23 10:59:43.529 227766 INFO nova.compute.manager [None req-e8774378-7d3f-4eaa-8ab5-fa6411b7308f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Pausing#033[00m
Jan 23 05:59:43 np0005593234 nova_compute[227762]: 2026-01-23 10:59:43.530 227766 DEBUG nova.objects.instance [None req-e8774378-7d3f-4eaa-8ab5-fa6411b7308f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'flavor' on Instance uuid 7362e95f-78ad-433d-a32f-700454cf3816 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:59:43 np0005593234 nova_compute[227762]: 2026-01-23 10:59:43.575 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165983.5751917, 7362e95f-78ad-433d-a32f-700454cf3816 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:59:43 np0005593234 nova_compute[227762]: 2026-01-23 10:59:43.577 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] VM Paused (Lifecycle Event)#033[00m
Jan 23 05:59:43 np0005593234 nova_compute[227762]: 2026-01-23 10:59:43.579 227766 DEBUG nova.compute.manager [None req-e8774378-7d3f-4eaa-8ab5-fa6411b7308f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:59:43 np0005593234 nova_compute[227762]: 2026-01-23 10:59:43.659 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:59:43 np0005593234 nova_compute[227762]: 2026-01-23 10:59:43.664 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:59:44 np0005593234 nova_compute[227762]: 2026-01-23 10:59:44.379 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 05:59:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/382536321' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 05:59:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 05:59:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/382536321' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 05:59:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:44.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:44.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:45 np0005593234 nova_compute[227762]: 2026-01-23 10:59:45.794 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:46.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:46.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:59:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 05:59:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:48.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:48.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:49 np0005593234 nova_compute[227762]: 2026-01-23 10:59:49.382 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:49 np0005593234 nova_compute[227762]: 2026-01-23 10:59:49.763 227766 INFO nova.compute.manager [None req-7651ba9a-ebaa-4946-9ce6-fbdd43bfcd29 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Get console output#033[00m
Jan 23 05:59:49 np0005593234 nova_compute[227762]: 2026-01-23 10:59:49.767 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:59:49 np0005593234 nova_compute[227762]: 2026-01-23 10:59:49.994 227766 INFO nova.compute.manager [None req-2d6da874-6e6d-4a61-b466-7d7421b3d4d5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Unpausing#033[00m
Jan 23 05:59:49 np0005593234 nova_compute[227762]: 2026-01-23 10:59:49.996 227766 DEBUG nova.objects.instance [None req-2d6da874-6e6d-4a61-b466-7d7421b3d4d5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'flavor' on Instance uuid 7362e95f-78ad-433d-a32f-700454cf3816 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:59:50 np0005593234 nova_compute[227762]: 2026-01-23 10:59:50.032 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769165990.032654, 7362e95f-78ad-433d-a32f-700454cf3816 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 05:59:50 np0005593234 nova_compute[227762]: 2026-01-23 10:59:50.033 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] VM Resumed (Lifecycle Event)#033[00m
Jan 23 05:59:50 np0005593234 virtqemud[227483]: argument unsupported: QEMU guest agent is not configured
Jan 23 05:59:50 np0005593234 nova_compute[227762]: 2026-01-23 10:59:50.037 227766 DEBUG nova.virt.libvirt.guest [None req-2d6da874-6e6d-4a61-b466-7d7421b3d4d5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 23 05:59:50 np0005593234 nova_compute[227762]: 2026-01-23 10:59:50.038 227766 DEBUG nova.compute.manager [None req-2d6da874-6e6d-4a61-b466-7d7421b3d4d5 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:59:50 np0005593234 nova_compute[227762]: 2026-01-23 10:59:50.064 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 05:59:50 np0005593234 nova_compute[227762]: 2026-01-23 10:59:50.067 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 05:59:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:59:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:50.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:59:50 np0005593234 nova_compute[227762]: 2026-01-23 10:59:50.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 05:59:50 np0005593234 nova_compute[227762]: 2026-01-23 10:59:50.795 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:50.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:52.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:52.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:54 np0005593234 nova_compute[227762]: 2026-01-23 10:59:54.385 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:59:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:54.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:59:54 np0005593234 nova_compute[227762]: 2026-01-23 10:59:54.821 227766 INFO nova.compute.manager [None req-b21d77e4-7e62-4342-8ce4-1232352da176 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Get console output#033[00m
Jan 23 05:59:54 np0005593234 nova_compute[227762]: 2026-01-23 10:59:54.826 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 05:59:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:54.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:55 np0005593234 nova_compute[227762]: 2026-01-23 10:59:55.797 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.209 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:56.209 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=93, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=92) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:59:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:56.212 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.541 227766 DEBUG nova.compute.manager [req-06dedc91-ae84-4f28-ae94-a28231b3dad2 req-b16f967d-7ca3-4a04-8592-b1ab0403766c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Received event network-changed-5afebfcd-4030-4d9b-90d1-046a64cb92e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.542 227766 DEBUG nova.compute.manager [req-06dedc91-ae84-4f28-ae94-a28231b3dad2 req-b16f967d-7ca3-4a04-8592-b1ab0403766c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Refreshing instance network info cache due to event network-changed-5afebfcd-4030-4d9b-90d1-046a64cb92e1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.542 227766 DEBUG oslo_concurrency.lockutils [req-06dedc91-ae84-4f28-ae94-a28231b3dad2 req-b16f967d-7ca3-4a04-8592-b1ab0403766c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-7362e95f-78ad-433d-a32f-700454cf3816" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.542 227766 DEBUG oslo_concurrency.lockutils [req-06dedc91-ae84-4f28-ae94-a28231b3dad2 req-b16f967d-7ca3-4a04-8592-b1ab0403766c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-7362e95f-78ad-433d-a32f-700454cf3816" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.543 227766 DEBUG nova.network.neutron [req-06dedc91-ae84-4f28-ae94-a28231b3dad2 req-b16f967d-7ca3-4a04-8592-b1ab0403766c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Refreshing network info cache for port 5afebfcd-4030-4d9b-90d1-046a64cb92e1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.674 227766 DEBUG oslo_concurrency.lockutils [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "7362e95f-78ad-433d-a32f-700454cf3816" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.675 227766 DEBUG oslo_concurrency.lockutils [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.675 227766 DEBUG oslo_concurrency.lockutils [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "7362e95f-78ad-433d-a32f-700454cf3816-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.676 227766 DEBUG oslo_concurrency.lockutils [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.676 227766 DEBUG oslo_concurrency.lockutils [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.677 227766 INFO nova.compute.manager [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Terminating instance#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.679 227766 DEBUG nova.compute.manager [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 05:59:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:56.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:56 np0005593234 kernel: tap5afebfcd-40 (unregistering): left promiscuous mode
Jan 23 05:59:56 np0005593234 NetworkManager[48942]: <info>  [1769165996.7426] device (tap5afebfcd-40): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 05:59:56 np0005593234 ovn_controller[134547]: 2026-01-23T10:59:56Z|00935|binding|INFO|Releasing lport 5afebfcd-4030-4d9b-90d1-046a64cb92e1 from this chassis (sb_readonly=0)
Jan 23 05:59:56 np0005593234 ovn_controller[134547]: 2026-01-23T10:59:56Z|00936|binding|INFO|Setting lport 5afebfcd-4030-4d9b-90d1-046a64cb92e1 down in Southbound
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.754 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:56 np0005593234 ovn_controller[134547]: 2026-01-23T10:59:56Z|00937|binding|INFO|Removing iface tap5afebfcd-40 ovn-installed in OVS
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.757 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.783 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:56 np0005593234 systemd[1]: machine-qemu\x2d104\x2dinstance\x2d000000d5.scope: Deactivated successfully.
Jan 23 05:59:56 np0005593234 systemd[1]: machine-qemu\x2d104\x2dinstance\x2d000000d5.scope: Consumed 14.111s CPU time.
Jan 23 05:59:56 np0005593234 systemd-machined[195626]: Machine qemu-104-instance-000000d5 terminated.
Jan 23 05:59:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:59:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:56.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.929 227766 INFO nova.virt.libvirt.driver [-] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Instance destroyed successfully.#033[00m
Jan 23 05:59:56 np0005593234 nova_compute[227762]: 2026-01-23 10:59:56.930 227766 DEBUG nova.objects.instance [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'resources' on Instance uuid 7362e95f-78ad-433d-a32f-700454cf3816 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 05:59:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 05:59:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:58.251 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:50:8b:df 10.100.0.14'], port_security=['fa:16:3e:50:8b:df 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '7362e95f-78ad-433d-a32f-700454cf3816', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0c091981-560c-4de9-99d3-f975cddf2b53', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '12e30436-ebe4-4dcb-9cc1-5b6eafaa9ddb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=56aace94-4ec0-4335-af49-e5a323e8f28b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=5afebfcd-4030-4d9b-90d1-046a64cb92e1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 05:59:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:58.253 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 5afebfcd-4030-4d9b-90d1-046a64cb92e1 in datapath 0c091981-560c-4de9-99d3-f975cddf2b53 unbound from our chassis#033[00m
Jan 23 05:59:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:58.254 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0c091981-560c-4de9-99d3-f975cddf2b53, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 05:59:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:58.256 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2d708464-acb7-44d4-83b8-3e14ca593b35]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:58.256 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53 namespace which is not needed anymore#033[00m
Jan 23 05:59:58 np0005593234 neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53[335988]: [NOTICE]   (335993) : haproxy version is 2.8.14-c23fe91
Jan 23 05:59:58 np0005593234 neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53[335988]: [NOTICE]   (335993) : path to executable is /usr/sbin/haproxy
Jan 23 05:59:58 np0005593234 neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53[335988]: [WARNING]  (335993) : Exiting Master process...
Jan 23 05:59:58 np0005593234 neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53[335988]: [ALERT]    (335993) : Current worker (335995) exited with code 143 (Terminated)
Jan 23 05:59:58 np0005593234 neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53[335988]: [WARNING]  (335993) : All workers exited. Exiting... (0)
Jan 23 05:59:58 np0005593234 systemd[1]: libpod-e95f46577d76c37c54c0a5416f003019b1d8478aedb7d3fc6df0cba58e89333f.scope: Deactivated successfully.
Jan 23 05:59:58 np0005593234 podman[336387]: 2026-01-23 10:59:58.41687813 +0000 UTC m=+0.057743798 container died e95f46577d76c37c54c0a5416f003019b1d8478aedb7d3fc6df0cba58e89333f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 05:59:58 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e95f46577d76c37c54c0a5416f003019b1d8478aedb7d3fc6df0cba58e89333f-userdata-shm.mount: Deactivated successfully.
Jan 23 05:59:58 np0005593234 systemd[1]: var-lib-containers-storage-overlay-ca00ed34b8b99ab5651961ad475f096e5b991062dbd6d1786721067b5eec75fa-merged.mount: Deactivated successfully.
Jan 23 05:59:58 np0005593234 podman[336387]: 2026-01-23 10:59:58.460665909 +0000 UTC m=+0.101531577 container cleanup e95f46577d76c37c54c0a5416f003019b1d8478aedb7d3fc6df0cba58e89333f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 05:59:58 np0005593234 systemd[1]: libpod-conmon-e95f46577d76c37c54c0a5416f003019b1d8478aedb7d3fc6df0cba58e89333f.scope: Deactivated successfully.
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.515 227766 DEBUG nova.virt.libvirt.vif [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:59:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-509682616',display_name='tempest-TestNetworkAdvancedServerOps-server-509682616',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-509682616',id=213,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKyriUxXvNltk/O+zE9daakY9SI2zEc97QqjMK/+blZNB+bRt+v8usoOjGwtuzN0xO+IjRyT6c2+NhDQ52GFSyepRN8aIWYSyw+6u3TL4lbwnVvMU36tF0SZSRh2ptkRgw==',key_name='tempest-TestNetworkAdvancedServerOps-495860055',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:59:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-tt0cqd7s',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:59:50Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=7362e95f-78ad-433d-a32f-700454cf3816,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "address": "fa:16:3e:50:8b:df", "network": {"id": "0c091981-560c-4de9-99d3-f975cddf2b53", "bridge": "br-int", "label": "tempest-network-smoke--155774509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afebfcd-40", "ovs_interfaceid": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.516 227766 DEBUG nova.network.os_vif_util [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "address": "fa:16:3e:50:8b:df", "network": {"id": "0c091981-560c-4de9-99d3-f975cddf2b53", "bridge": "br-int", "label": "tempest-network-smoke--155774509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.186", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afebfcd-40", "ovs_interfaceid": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.517 227766 DEBUG nova.network.os_vif_util [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:50:8b:df,bridge_name='br-int',has_traffic_filtering=True,id=5afebfcd-4030-4d9b-90d1-046a64cb92e1,network=Network(0c091981-560c-4de9-99d3-f975cddf2b53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afebfcd-40') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.518 227766 DEBUG os_vif [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:50:8b:df,bridge_name='br-int',has_traffic_filtering=True,id=5afebfcd-4030-4d9b-90d1-046a64cb92e1,network=Network(0c091981-560c-4de9-99d3-f975cddf2b53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afebfcd-40') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.520 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.520 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5afebfcd-40, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.522 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.523 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.527 227766 INFO os_vif [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:50:8b:df,bridge_name='br-int',has_traffic_filtering=True,id=5afebfcd-4030-4d9b-90d1-046a64cb92e1,network=Network(0c091981-560c-4de9-99d3-f975cddf2b53),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5afebfcd-40')#033[00m
Jan 23 05:59:58 np0005593234 podman[336415]: 2026-01-23 10:59:58.52875697 +0000 UTC m=+0.047017822 container remove e95f46577d76c37c54c0a5416f003019b1d8478aedb7d3fc6df0cba58e89333f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 05:59:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:58.538 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[899a7e26-4978-4dce-b43f-6a3f6497dc94]: (4, ('Fri Jan 23 10:59:58 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53 (e95f46577d76c37c54c0a5416f003019b1d8478aedb7d3fc6df0cba58e89333f)\ne95f46577d76c37c54c0a5416f003019b1d8478aedb7d3fc6df0cba58e89333f\nFri Jan 23 10:59:58 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53 (e95f46577d76c37c54c0a5416f003019b1d8478aedb7d3fc6df0cba58e89333f)\ne95f46577d76c37c54c0a5416f003019b1d8478aedb7d3fc6df0cba58e89333f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:58.541 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[729347a9-b764-463d-b5ed-915369934247]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:58.542 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c091981-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 05:59:58 np0005593234 kernel: tap0c091981-50: left promiscuous mode
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.549 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:58.550 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3daaa6ec-76bd-48f7-b0ef-3ab66d001562]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.561 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 05:59:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:58.567 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e71ab160-5c8d-4b18-8c64-0e70245b64a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:58.568 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f345d85d-2bb1-4088-a03e-f70bce9970f0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:58.588 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6ca09c69-687e-461f-9645-4ea02139d8cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 985528, 'reachable_time': 21921, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 336449, 'error': None, 'target': 'ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:58.592 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0c091981-560c-4de9-99d3-f975cddf2b53 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 05:59:58 np0005593234 systemd[1]: run-netns-ovnmeta\x2d0c091981\x2d560c\x2d4de9\x2d99d3\x2df975cddf2b53.mount: Deactivated successfully.
Jan 23 05:59:58 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 10:59:58.593 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[7e4728c7-662e-464e-93b9-8946ea5b2146]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 05:59:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 05:59:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:10:59:58.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.812 227766 DEBUG nova.compute.manager [req-10950b31-946f-48cb-924e-eaf56261aeea req-a200e66b-bd80-4dca-9118-8bc9af725f53 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Received event network-vif-unplugged-5afebfcd-4030-4d9b-90d1-046a64cb92e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.812 227766 DEBUG oslo_concurrency.lockutils [req-10950b31-946f-48cb-924e-eaf56261aeea req-a200e66b-bd80-4dca-9118-8bc9af725f53 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "7362e95f-78ad-433d-a32f-700454cf3816-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.813 227766 DEBUG oslo_concurrency.lockutils [req-10950b31-946f-48cb-924e-eaf56261aeea req-a200e66b-bd80-4dca-9118-8bc9af725f53 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.813 227766 DEBUG oslo_concurrency.lockutils [req-10950b31-946f-48cb-924e-eaf56261aeea req-a200e66b-bd80-4dca-9118-8bc9af725f53 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.814 227766 DEBUG nova.compute.manager [req-10950b31-946f-48cb-924e-eaf56261aeea req-a200e66b-bd80-4dca-9118-8bc9af725f53 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] No waiting events found dispatching network-vif-unplugged-5afebfcd-4030-4d9b-90d1-046a64cb92e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 05:59:58 np0005593234 nova_compute[227762]: 2026-01-23 10:59:58.814 227766 DEBUG nova.compute.manager [req-10950b31-946f-48cb-924e-eaf56261aeea req-a200e66b-bd80-4dca-9118-8bc9af725f53 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Received event network-vif-unplugged-5afebfcd-4030-4d9b-90d1-046a64cb92e1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 05:59:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 05:59:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 05:59:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:10:59:58.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 05:59:59 np0005593234 nova_compute[227762]: 2026-01-23 10:59:59.171 227766 INFO nova.virt.libvirt.driver [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Deleting instance files /var/lib/nova/instances/7362e95f-78ad-433d-a32f-700454cf3816_del#033[00m
Jan 23 05:59:59 np0005593234 nova_compute[227762]: 2026-01-23 10:59:59.172 227766 INFO nova.virt.libvirt.driver [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Deletion of /var/lib/nova/instances/7362e95f-78ad-433d-a32f-700454cf3816_del complete#033[00m
Jan 23 06:00:00 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 06:00:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:00:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:00.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:00:00 np0005593234 nova_compute[227762]: 2026-01-23 11:00:00.798 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:00.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:01 np0005593234 nova_compute[227762]: 2026-01-23 11:00:01.218 227766 DEBUG nova.compute.manager [req-b3df5f81-7a11-49a0-8a2f-e0a4ceb041ed req-990dc60d-efed-4df5-b25c-2349389d8fea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Received event network-vif-plugged-5afebfcd-4030-4d9b-90d1-046a64cb92e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:00:01 np0005593234 nova_compute[227762]: 2026-01-23 11:00:01.218 227766 DEBUG oslo_concurrency.lockutils [req-b3df5f81-7a11-49a0-8a2f-e0a4ceb041ed req-990dc60d-efed-4df5-b25c-2349389d8fea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "7362e95f-78ad-433d-a32f-700454cf3816-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:00:01 np0005593234 nova_compute[227762]: 2026-01-23 11:00:01.219 227766 DEBUG oslo_concurrency.lockutils [req-b3df5f81-7a11-49a0-8a2f-e0a4ceb041ed req-990dc60d-efed-4df5-b25c-2349389d8fea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:00:01 np0005593234 nova_compute[227762]: 2026-01-23 11:00:01.219 227766 DEBUG oslo_concurrency.lockutils [req-b3df5f81-7a11-49a0-8a2f-e0a4ceb041ed req-990dc60d-efed-4df5-b25c-2349389d8fea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:00:01 np0005593234 nova_compute[227762]: 2026-01-23 11:00:01.219 227766 DEBUG nova.compute.manager [req-b3df5f81-7a11-49a0-8a2f-e0a4ceb041ed req-990dc60d-efed-4df5-b25c-2349389d8fea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] No waiting events found dispatching network-vif-plugged-5afebfcd-4030-4d9b-90d1-046a64cb92e1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:00:01 np0005593234 nova_compute[227762]: 2026-01-23 11:00:01.219 227766 WARNING nova.compute.manager [req-b3df5f81-7a11-49a0-8a2f-e0a4ceb041ed req-990dc60d-efed-4df5-b25c-2349389d8fea 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Received unexpected event network-vif-plugged-5afebfcd-4030-4d9b-90d1-046a64cb92e1 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 06:00:01 np0005593234 nova_compute[227762]: 2026-01-23 11:00:01.261 227766 INFO nova.compute.manager [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Took 4.58 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 06:00:01 np0005593234 nova_compute[227762]: 2026-01-23 11:00:01.261 227766 DEBUG oslo.service.loopingcall [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 06:00:01 np0005593234 nova_compute[227762]: 2026-01-23 11:00:01.262 227766 DEBUG nova.compute.manager [-] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 06:00:01 np0005593234 nova_compute[227762]: 2026-01-23 11:00:01.262 227766 DEBUG nova.network.neutron [-] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 06:00:01 np0005593234 nova_compute[227762]: 2026-01-23 11:00:01.729 227766 DEBUG nova.network.neutron [req-06dedc91-ae84-4f28-ae94-a28231b3dad2 req-b16f967d-7ca3-4a04-8592-b1ab0403766c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Updated VIF entry in instance network info cache for port 5afebfcd-4030-4d9b-90d1-046a64cb92e1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:00:01 np0005593234 nova_compute[227762]: 2026-01-23 11:00:01.729 227766 DEBUG nova.network.neutron [req-06dedc91-ae84-4f28-ae94-a28231b3dad2 req-b16f967d-7ca3-4a04-8592-b1ab0403766c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Updating instance_info_cache with network_info: [{"id": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "address": "fa:16:3e:50:8b:df", "network": {"id": "0c091981-560c-4de9-99d3-f975cddf2b53", "bridge": "br-int", "label": "tempest-network-smoke--155774509", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5afebfcd-40", "ovs_interfaceid": "5afebfcd-4030-4d9b-90d1-046a64cb92e1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:00:01 np0005593234 nova_compute[227762]: 2026-01-23 11:00:01.759 227766 DEBUG oslo_concurrency.lockutils [req-06dedc91-ae84-4f28-ae94-a28231b3dad2 req-b16f967d-7ca3-4a04-8592-b1ab0403766c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-7362e95f-78ad-433d-a32f-700454cf3816" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:00:02 np0005593234 nova_compute[227762]: 2026-01-23 11:00:02.018 227766 DEBUG nova.network.neutron [-] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:00:02 np0005593234 nova_compute[227762]: 2026-01-23 11:00:02.046 227766 INFO nova.compute.manager [-] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Took 0.78 seconds to deallocate network for instance.#033[00m
Jan 23 06:00:02 np0005593234 nova_compute[227762]: 2026-01-23 11:00:02.099 227766 DEBUG oslo_concurrency.lockutils [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:00:02 np0005593234 nova_compute[227762]: 2026-01-23 11:00:02.100 227766 DEBUG oslo_concurrency.lockutils [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:00:02 np0005593234 nova_compute[227762]: 2026-01-23 11:00:02.114 227766 DEBUG nova.compute.manager [req-52753d8a-90ea-4a59-96c1-f8d8cf2c4017 req-30298256-d13d-449d-9cfb-67ffd0836b8e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Received event network-vif-deleted-5afebfcd-4030-4d9b-90d1-046a64cb92e1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:00:02 np0005593234 nova_compute[227762]: 2026-01-23 11:00:02.225 227766 DEBUG oslo_concurrency.processutils [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:00:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:00:02 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4101013213' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:00:02 np0005593234 nova_compute[227762]: 2026-01-23 11:00:02.674 227766 DEBUG oslo_concurrency.processutils [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:00:02 np0005593234 nova_compute[227762]: 2026-01-23 11:00:02.682 227766 DEBUG nova.compute.provider_tree [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:00:02 np0005593234 nova_compute[227762]: 2026-01-23 11:00:02.707 227766 DEBUG nova.scheduler.client.report [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:00:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:02.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:02 np0005593234 nova_compute[227762]: 2026-01-23 11:00:02.732 227766 DEBUG oslo_concurrency.lockutils [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:00:02 np0005593234 nova_compute[227762]: 2026-01-23 11:00:02.764 227766 INFO nova.scheduler.client.report [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Deleted allocations for instance 7362e95f-78ad-433d-a32f-700454cf3816#033[00m
Jan 23 06:00:02 np0005593234 nova_compute[227762]: 2026-01-23 11:00:02.848 227766 DEBUG oslo_concurrency.lockutils [None req-22b2f2e9-6f40-4c21-b688-e016c22086e2 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "7362e95f-78ad-433d-a32f-700454cf3816" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:00:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:02.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:03 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:03.215 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '93'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:00:03 np0005593234 podman[336477]: 2026-01-23 11:00:03.375272837 +0000 UTC m=+0.049907462 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 06:00:03 np0005593234 nova_compute[227762]: 2026-01-23 11:00:03.524 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:04.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:04.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:05 np0005593234 nova_compute[227762]: 2026-01-23 11:00:05.801 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:06.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:06.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:08 np0005593234 nova_compute[227762]: 2026-01-23 11:00:08.528 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:00:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:08.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:00:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:08.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:09 np0005593234 nova_compute[227762]: 2026-01-23 11:00:09.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:09 np0005593234 nova_compute[227762]: 2026-01-23 11:00:09.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 06:00:09 np0005593234 nova_compute[227762]: 2026-01-23 11:00:09.770 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:09 np0005593234 nova_compute[227762]: 2026-01-23 11:00:09.845 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:10.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:10 np0005593234 podman[336501]: 2026-01-23 11:00:10.798430989 +0000 UTC m=+0.094090124 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
config_id=ovn_controller, org.label-schema.vendor=CentOS)
Jan 23 06:00:10 np0005593234 nova_compute[227762]: 2026-01-23 11:00:10.803 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:10.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:11 np0005593234 nova_compute[227762]: 2026-01-23 11:00:11.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:11 np0005593234 nova_compute[227762]: 2026-01-23 11:00:11.928 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769165996.9272146, 7362e95f-78ad-433d-a32f-700454cf3816 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:00:11 np0005593234 nova_compute[227762]: 2026-01-23 11:00:11.928 227766 INFO nova.compute.manager [-] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] VM Stopped (Lifecycle Event)#033[00m
Jan 23 06:00:11 np0005593234 nova_compute[227762]: 2026-01-23 11:00:11.946 227766 DEBUG nova.compute.manager [None req-17b3c09f-a48b-42a3-bd70-7938c69f2436 - - - - - -] [instance: 7362e95f-78ad-433d-a32f-700454cf3816] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:00:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:12.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:12.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:13 np0005593234 nova_compute[227762]: 2026-01-23 11:00:13.531 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:14.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:14.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:15 np0005593234 nova_compute[227762]: 2026-01-23 11:00:15.805 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:16.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:16.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:18 np0005593234 nova_compute[227762]: 2026-01-23 11:00:18.534 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:18.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:18 np0005593234 nova_compute[227762]: 2026-01-23 11:00:18.757 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:18 np0005593234 nova_compute[227762]: 2026-01-23 11:00:18.789 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:00:18 np0005593234 nova_compute[227762]: 2026-01-23 11:00:18.789 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:00:18 np0005593234 nova_compute[227762]: 2026-01-23 11:00:18.789 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:00:18 np0005593234 nova_compute[227762]: 2026-01-23 11:00:18.790 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:00:18 np0005593234 nova_compute[227762]: 2026-01-23 11:00:18.790 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:00:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:18.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:00:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/353334960' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:00:19 np0005593234 nova_compute[227762]: 2026-01-23 11:00:19.246 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:00:19 np0005593234 nova_compute[227762]: 2026-01-23 11:00:19.379 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:00:19 np0005593234 nova_compute[227762]: 2026-01-23 11:00:19.380 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4133MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:00:19 np0005593234 nova_compute[227762]: 2026-01-23 11:00:19.380 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:00:19 np0005593234 nova_compute[227762]: 2026-01-23 11:00:19.381 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:00:19 np0005593234 nova_compute[227762]: 2026-01-23 11:00:19.438 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:00:19 np0005593234 nova_compute[227762]: 2026-01-23 11:00:19.438 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:00:19 np0005593234 nova_compute[227762]: 2026-01-23 11:00:19.452 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:00:19 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:00:19 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3756827525' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:00:19 np0005593234 nova_compute[227762]: 2026-01-23 11:00:19.859 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.406s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:00:19 np0005593234 nova_compute[227762]: 2026-01-23 11:00:19.868 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:00:20 np0005593234 nova_compute[227762]: 2026-01-23 11:00:20.104 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:00:20 np0005593234 nova_compute[227762]: 2026-01-23 11:00:20.130 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:00:20 np0005593234 nova_compute[227762]: 2026-01-23 11:00:20.130 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.750s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:00:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:00:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:20.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:00:20 np0005593234 nova_compute[227762]: 2026-01-23 11:00:20.807 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:20.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:22 np0005593234 nova_compute[227762]: 2026-01-23 11:00:22.118 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:22 np0005593234 nova_compute[227762]: 2026-01-23 11:00:22.119 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:00:22 np0005593234 nova_compute[227762]: 2026-01-23 11:00:22.119 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:00:22 np0005593234 nova_compute[227762]: 2026-01-23 11:00:22.134 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:00:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:22.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:22 np0005593234 nova_compute[227762]: 2026-01-23 11:00:22.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:22.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:23 np0005593234 nova_compute[227762]: 2026-01-23 11:00:23.538 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:24.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:24.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:25 np0005593234 nova_compute[227762]: 2026-01-23 11:00:25.810 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:26.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:26.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:28 np0005593234 nova_compute[227762]: 2026-01-23 11:00:28.541 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:28 np0005593234 nova_compute[227762]: 2026-01-23 11:00:28.739 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "44a526e3-4f37-4d95-a98c-45a5937384e7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:00:28 np0005593234 nova_compute[227762]: 2026-01-23 11:00:28.739 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:00:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:28.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:28 np0005593234 nova_compute[227762]: 2026-01-23 11:00:28.770 227766 DEBUG nova.compute.manager [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 06:00:28 np0005593234 nova_compute[227762]: 2026-01-23 11:00:28.867 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:00:28 np0005593234 nova_compute[227762]: 2026-01-23 11:00:28.868 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:00:28 np0005593234 nova_compute[227762]: 2026-01-23 11:00:28.874 227766 DEBUG nova.virt.hardware [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 06:00:28 np0005593234 nova_compute[227762]: 2026-01-23 11:00:28.874 227766 INFO nova.compute.claims [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 06:00:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:28.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:28 np0005593234 nova_compute[227762]: 2026-01-23 11:00:28.983 227766 DEBUG oslo_concurrency.processutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:00:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:00:29 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/880356' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.406 227766 DEBUG oslo_concurrency.processutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.414 227766 DEBUG nova.compute.provider_tree [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.439 227766 DEBUG nova.scheduler.client.report [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.473 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.475 227766 DEBUG nova.compute.manager [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.536 227766 DEBUG nova.compute.manager [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.537 227766 DEBUG nova.network.neutron [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.555 227766 INFO nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.581 227766 DEBUG nova.compute.manager [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.701 227766 DEBUG nova.compute.manager [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.702 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.702 227766 INFO nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Creating image(s)#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.732 227766 DEBUG nova.storage.rbd_utils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 44a526e3-4f37-4d95-a98c-45a5937384e7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.767 227766 DEBUG nova.storage.rbd_utils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 44a526e3-4f37-4d95-a98c-45a5937384e7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.800 227766 DEBUG nova.storage.rbd_utils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 44a526e3-4f37-4d95-a98c-45a5937384e7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.805 227766 DEBUG oslo_concurrency.processutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.837 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.838 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.876 227766 DEBUG oslo_concurrency.processutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.877 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.878 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.878 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.913 227766 DEBUG nova.storage.rbd_utils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 44a526e3-4f37-4d95-a98c-45a5937384e7_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:00:29 np0005593234 nova_compute[227762]: 2026-01-23 11:00:29.917 227766 DEBUG oslo_concurrency.processutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 44a526e3-4f37-4d95-a98c-45a5937384e7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:00:30 np0005593234 nova_compute[227762]: 2026-01-23 11:00:30.214 227766 DEBUG oslo_concurrency.processutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 44a526e3-4f37-4d95-a98c-45a5937384e7_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.297s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:00:30 np0005593234 nova_compute[227762]: 2026-01-23 11:00:30.287 227766 DEBUG nova.storage.rbd_utils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] resizing rbd image 44a526e3-4f37-4d95-a98c-45a5937384e7_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 06:00:30 np0005593234 nova_compute[227762]: 2026-01-23 11:00:30.394 227766 DEBUG nova.objects.instance [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'migration_context' on Instance uuid 44a526e3-4f37-4d95-a98c-45a5937384e7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:00:30 np0005593234 nova_compute[227762]: 2026-01-23 11:00:30.608 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 06:00:30 np0005593234 nova_compute[227762]: 2026-01-23 11:00:30.609 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Ensure instance console log exists: /var/lib/nova/instances/44a526e3-4f37-4d95-a98c-45a5937384e7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 06:00:30 np0005593234 nova_compute[227762]: 2026-01-23 11:00:30.609 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:00:30 np0005593234 nova_compute[227762]: 2026-01-23 11:00:30.610 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:00:30 np0005593234 nova_compute[227762]: 2026-01-23 11:00:30.610 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:00:30 np0005593234 nova_compute[227762]: 2026-01-23 11:00:30.620 227766 DEBUG nova.policy [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '420c366dc5dc45a48da4e0b18c93043f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c06f98b51aeb48de91d116fda54a161f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 06:00:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:30.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:30 np0005593234 nova_compute[227762]: 2026-01-23 11:00:30.812 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:30.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:31 np0005593234 nova_compute[227762]: 2026-01-23 11:00:31.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:31 np0005593234 nova_compute[227762]: 2026-01-23 11:00:31.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 06:00:31 np0005593234 nova_compute[227762]: 2026-01-23 11:00:31.762 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 06:00:32 np0005593234 nova_compute[227762]: 2026-01-23 11:00:32.454 227766 DEBUG nova.network.neutron [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Successfully created port: 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 06:00:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:32.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:32 np0005593234 nova_compute[227762]: 2026-01-23 11:00:32.761 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:32.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:33 np0005593234 nova_compute[227762]: 2026-01-23 11:00:33.543 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:33 np0005593234 podman[336821]: 2026-01-23 11:00:33.754734848 +0000 UTC m=+0.049496859 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 06:00:34 np0005593234 nova_compute[227762]: 2026-01-23 11:00:34.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:34.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:34.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:35 np0005593234 nova_compute[227762]: 2026-01-23 11:00:35.816 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:36 np0005593234 nova_compute[227762]: 2026-01-23 11:00:36.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:36.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:36.933 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:37 np0005593234 nova_compute[227762]: 2026-01-23 11:00:37.309 227766 DEBUG nova.network.neutron [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Successfully updated port: 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 06:00:37 np0005593234 nova_compute[227762]: 2026-01-23 11:00:37.333 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:00:37 np0005593234 nova_compute[227762]: 2026-01-23 11:00:37.333 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:00:37 np0005593234 nova_compute[227762]: 2026-01-23 11:00:37.333 227766 DEBUG nova.network.neutron [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:00:37 np0005593234 nova_compute[227762]: 2026-01-23 11:00:37.445 227766 DEBUG nova.compute.manager [req-2e46e589-bf0e-41c1-a331-98bc02c55d1b req-a76ba15c-47ad-400d-906f-7d10c618fe13 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received event network-changed-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:00:37 np0005593234 nova_compute[227762]: 2026-01-23 11:00:37.446 227766 DEBUG nova.compute.manager [req-2e46e589-bf0e-41c1-a331-98bc02c55d1b req-a76ba15c-47ad-400d-906f-7d10c618fe13 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Refreshing instance network info cache due to event network-changed-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:00:37 np0005593234 nova_compute[227762]: 2026-01-23 11:00:37.446 227766 DEBUG oslo_concurrency.lockutils [req-2e46e589-bf0e-41c1-a331-98bc02c55d1b req-a76ba15c-47ad-400d-906f-7d10c618fe13 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:00:37 np0005593234 nova_compute[227762]: 2026-01-23 11:00:37.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:37 np0005593234 nova_compute[227762]: 2026-01-23 11:00:37.760 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:38 np0005593234 nova_compute[227762]: 2026-01-23 11:00:38.388 227766 DEBUG nova.network.neutron [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #202. Immutable memtables: 0.
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.414883) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 129] Flushing memtable with next log file: 202
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166038414979, "job": 129, "event": "flush_started", "num_memtables": 1, "num_entries": 1285, "num_deletes": 251, "total_data_size": 2810372, "memory_usage": 2845616, "flush_reason": "Manual Compaction"}
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 129] Level-0 flush table #203: started
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166038429502, "cf_name": "default", "job": 129, "event": "table_file_creation", "file_number": 203, "file_size": 1855448, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 95856, "largest_seqno": 97136, "table_properties": {"data_size": 1849926, "index_size": 2916, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11948, "raw_average_key_size": 19, "raw_value_size": 1838809, "raw_average_value_size": 3054, "num_data_blocks": 130, "num_entries": 602, "num_filter_entries": 602, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769165927, "oldest_key_time": 1769165927, "file_creation_time": 1769166038, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 203, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 129] Flush lasted 14717 microseconds, and 6385 cpu microseconds.
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.429614) [db/flush_job.cc:967] [default] [JOB 129] Level-0 flush table #203: 1855448 bytes OK
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.429640) [db/memtable_list.cc:519] [default] Level-0 commit table #203 started
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.431450) [db/memtable_list.cc:722] [default] Level-0 commit table #203: memtable #1 done
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.431463) EVENT_LOG_v1 {"time_micros": 1769166038431458, "job": 129, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.431479) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 129] Try to delete WAL files size 2804258, prev total WAL file size 2804258, number of live WAL files 2.
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000199.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.432376) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038353334' seq:72057594037927935, type:22 .. '7061786F730038373836' seq:0, type:0; will stop at (end)
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 130] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 129 Base level 0, inputs: [203(1811KB)], [201(11MB)]
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166038432493, "job": 130, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [203], "files_L6": [201], "score": -1, "input_data_size": 14350711, "oldest_snapshot_seqno": -1}
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 130] Generated table #204: 11175 keys, 12422072 bytes, temperature: kUnknown
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166038499801, "cf_name": "default", "job": 130, "event": "table_file_creation", "file_number": 204, "file_size": 12422072, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12352942, "index_size": 40089, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27973, "raw_key_size": 296742, "raw_average_key_size": 26, "raw_value_size": 12160649, "raw_average_value_size": 1088, "num_data_blocks": 1509, "num_entries": 11175, "num_filter_entries": 11175, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769166038, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 204, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.500233) [db/compaction/compaction_job.cc:1663] [default] [JOB 130] Compacted 1@0 + 1@6 files to L6 => 12422072 bytes
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.501869) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 212.5 rd, 183.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 11.9 +0.0 blob) out(11.8 +0.0 blob), read-write-amplify(14.4) write-amplify(6.7) OK, records in: 11694, records dropped: 519 output_compression: NoCompression
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.501890) EVENT_LOG_v1 {"time_micros": 1769166038501880, "job": 130, "event": "compaction_finished", "compaction_time_micros": 67543, "compaction_time_cpu_micros": 28487, "output_level": 6, "num_output_files": 1, "total_output_size": 12422072, "num_input_records": 11694, "num_output_records": 11175, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000203.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166038502822, "job": 130, "event": "table_file_deletion", "file_number": 203}
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000201.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166038506199, "job": 130, "event": "table_file_deletion", "file_number": 201}
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.432242) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.506471) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.506477) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.506479) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.506480) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:00:38 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:00:38.506482) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:00:38 np0005593234 nova_compute[227762]: 2026-01-23 11:00:38.545 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:38.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:38.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.389 227766 DEBUG nova.network.neutron [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Updating instance_info_cache with network_info: [{"id": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "address": "fa:16:3e:40:55:45", "network": {"id": "f1da7606-9f16-49a8-8326-606f8222c72a", "bridge": "br-int", "label": "tempest-network-smoke--957000541", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36f5a7d1-6a", "ovs_interfaceid": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.412 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.413 227766 DEBUG nova.compute.manager [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Instance network_info: |[{"id": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "address": "fa:16:3e:40:55:45", "network": {"id": "f1da7606-9f16-49a8-8326-606f8222c72a", "bridge": "br-int", "label": "tempest-network-smoke--957000541", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36f5a7d1-6a", "ovs_interfaceid": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.413 227766 DEBUG oslo_concurrency.lockutils [req-2e46e589-bf0e-41c1-a331-98bc02c55d1b req-a76ba15c-47ad-400d-906f-7d10c618fe13 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.414 227766 DEBUG nova.network.neutron [req-2e46e589-bf0e-41c1-a331-98bc02c55d1b req-a76ba15c-47ad-400d-906f-7d10c618fe13 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Refreshing network info cache for port 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.416 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Start _get_guest_xml network_info=[{"id": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "address": "fa:16:3e:40:55:45", "network": {"id": "f1da7606-9f16-49a8-8326-606f8222c72a", "bridge": "br-int", "label": "tempest-network-smoke--957000541", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36f5a7d1-6a", "ovs_interfaceid": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.421 227766 WARNING nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.428 227766 DEBUG nova.virt.libvirt.host [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.429 227766 DEBUG nova.virt.libvirt.host [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.432 227766 DEBUG nova.virt.libvirt.host [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.432 227766 DEBUG nova.virt.libvirt.host [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.433 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.434 227766 DEBUG nova.virt.hardware [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.434 227766 DEBUG nova.virt.hardware [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.434 227766 DEBUG nova.virt.hardware [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.435 227766 DEBUG nova.virt.hardware [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.435 227766 DEBUG nova.virt.hardware [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.435 227766 DEBUG nova.virt.hardware [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.435 227766 DEBUG nova.virt.hardware [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.435 227766 DEBUG nova.virt.hardware [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.436 227766 DEBUG nova.virt.hardware [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.436 227766 DEBUG nova.virt.hardware [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.436 227766 DEBUG nova.virt.hardware [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.439 227766 DEBUG oslo_concurrency.processutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:00:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:00:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3389552114' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.870 227766 DEBUG oslo_concurrency.processutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.892 227766 DEBUG nova.storage.rbd_utils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 44a526e3-4f37-4d95-a98c-45a5937384e7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:00:39 np0005593234 nova_compute[227762]: 2026-01-23 11:00:39.896 227766 DEBUG oslo_concurrency.processutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:00:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:00:40 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2984761573' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.321 227766 DEBUG oslo_concurrency.processutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.323 227766 DEBUG nova.virt.libvirt.vif [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:00:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1432894641',display_name='tempest-TestNetworkAdvancedServerOps-server-1432894641',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1432894641',id=214,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO6Rgoec6R7+ZYgsddopjR9l5FplV8tN2PJmCmlV60Pib4goPLAOzFEeWZQShT7+lCH35hEqU0idsqmj+Oi79fUwESNDsABzmiKYg2NI49VRpRvg2c1Duh/33xLsRI6OFw==',key_name='tempest-TestNetworkAdvancedServerOps-1646664422',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-ymeku15o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:00:29Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=44a526e3-4f37-4d95-a98c-45a5937384e7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "address": "fa:16:3e:40:55:45", "network": {"id": "f1da7606-9f16-49a8-8326-606f8222c72a", "bridge": "br-int", "label": "tempest-network-smoke--957000541", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36f5a7d1-6a", "ovs_interfaceid": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.324 227766 DEBUG nova.network.os_vif_util [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "address": "fa:16:3e:40:55:45", "network": {"id": "f1da7606-9f16-49a8-8326-606f8222c72a", "bridge": "br-int", "label": "tempest-network-smoke--957000541", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36f5a7d1-6a", "ovs_interfaceid": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.324 227766 DEBUG nova.network.os_vif_util [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:55:45,bridge_name='br-int',has_traffic_filtering=True,id=36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1,network=Network(f1da7606-9f16-49a8-8326-606f8222c72a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36f5a7d1-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.325 227766 DEBUG nova.objects.instance [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_devices' on Instance uuid 44a526e3-4f37-4d95-a98c-45a5937384e7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.349 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] End _get_guest_xml xml=<domain type="kvm">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  <uuid>44a526e3-4f37-4d95-a98c-45a5937384e7</uuid>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  <name>instance-000000d6</name>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1432894641</nova:name>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 11:00:39</nova:creationTime>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <nova:user uuid="420c366dc5dc45a48da4e0b18c93043f">tempest-TestNetworkAdvancedServerOps-1886747874-project-member</nova:user>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <nova:project uuid="c06f98b51aeb48de91d116fda54a161f">tempest-TestNetworkAdvancedServerOps-1886747874</nova:project>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <nova:port uuid="36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <system>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <entry name="serial">44a526e3-4f37-4d95-a98c-45a5937384e7</entry>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <entry name="uuid">44a526e3-4f37-4d95-a98c-45a5937384e7</entry>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    </system>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  <os>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  </os>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  <features>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  </features>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  </clock>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  <devices>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/44a526e3-4f37-4d95-a98c-45a5937384e7_disk">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/44a526e3-4f37-4d95-a98c-45a5937384e7_disk.config">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:40:55:45"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <target dev="tap36f5a7d1-6a"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    </interface>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/44a526e3-4f37-4d95-a98c-45a5937384e7/console.log" append="off"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    </serial>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <video>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    </video>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    </rng>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 06:00:40 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 06:00:40 np0005593234 nova_compute[227762]:  </devices>
Jan 23 06:00:40 np0005593234 nova_compute[227762]: </domain>
Jan 23 06:00:40 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.350 227766 DEBUG nova.compute.manager [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Preparing to wait for external event network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.350 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.350 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.351 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.351 227766 DEBUG nova.virt.libvirt.vif [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:00:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1432894641',display_name='tempest-TestNetworkAdvancedServerOps-server-1432894641',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1432894641',id=214,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO6Rgoec6R7+ZYgsddopjR9l5FplV8tN2PJmCmlV60Pib4goPLAOzFEeWZQShT7+lCH35hEqU0idsqmj+Oi79fUwESNDsABzmiKYg2NI49VRpRvg2c1Duh/33xLsRI6OFw==',key_name='tempest-TestNetworkAdvancedServerOps-1646664422',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-ymeku15o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:00:29Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=44a526e3-4f37-4d95-a98c-45a5937384e7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "address": "fa:16:3e:40:55:45", "network": {"id": "f1da7606-9f16-49a8-8326-606f8222c72a", "bridge": "br-int", "label": "tempest-network-smoke--957000541", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36f5a7d1-6a", "ovs_interfaceid": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.352 227766 DEBUG nova.network.os_vif_util [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "address": "fa:16:3e:40:55:45", "network": {"id": "f1da7606-9f16-49a8-8326-606f8222c72a", "bridge": "br-int", "label": "tempest-network-smoke--957000541", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36f5a7d1-6a", "ovs_interfaceid": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.352 227766 DEBUG nova.network.os_vif_util [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:55:45,bridge_name='br-int',has_traffic_filtering=True,id=36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1,network=Network(f1da7606-9f16-49a8-8326-606f8222c72a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36f5a7d1-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.352 227766 DEBUG os_vif [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:55:45,bridge_name='br-int',has_traffic_filtering=True,id=36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1,network=Network(f1da7606-9f16-49a8-8326-606f8222c72a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36f5a7d1-6a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.353 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.353 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.354 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.356 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.356 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap36f5a7d1-6a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.357 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap36f5a7d1-6a, col_values=(('external_ids', {'iface-id': '36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:40:55:45', 'vm-uuid': '44a526e3-4f37-4d95-a98c-45a5937384e7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.358 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:40 np0005593234 NetworkManager[48942]: <info>  [1769166040.3593] manager: (tap36f5a7d1-6a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/446)
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.361 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.367 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.368 227766 INFO os_vif [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:55:45,bridge_name='br-int',has_traffic_filtering=True,id=36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1,network=Network(f1da7606-9f16-49a8-8326-606f8222c72a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36f5a7d1-6a')
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.416 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.417 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.417 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No VIF found with MAC fa:16:3e:40:55:45, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.418 227766 INFO nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Using config drive
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.438 227766 DEBUG nova.storage.rbd_utils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 44a526e3-4f37-4d95-a98c-45a5937384e7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 06:00:40 np0005593234 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 23 06:00:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:40.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.818 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.928 227766 INFO nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Creating config drive at /var/lib/nova/instances/44a526e3-4f37-4d95-a98c-45a5937384e7/disk.config
Jan 23 06:00:40 np0005593234 nova_compute[227762]: 2026-01-23 11:00:40.935 227766 DEBUG oslo_concurrency.processutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/44a526e3-4f37-4d95-a98c-45a5937384e7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_nke0vq6 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:00:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:40.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.072 227766 DEBUG oslo_concurrency.processutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/44a526e3-4f37-4d95-a98c-45a5937384e7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_nke0vq6" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.096 227766 DEBUG nova.storage.rbd_utils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image 44a526e3-4f37-4d95-a98c-45a5937384e7_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.099 227766 DEBUG oslo_concurrency.processutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/44a526e3-4f37-4d95-a98c-45a5937384e7/disk.config 44a526e3-4f37-4d95-a98c-45a5937384e7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.242 227766 DEBUG oslo_concurrency.processutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/44a526e3-4f37-4d95-a98c-45a5937384e7/disk.config 44a526e3-4f37-4d95-a98c-45a5937384e7_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.243 227766 INFO nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Deleting local config drive /var/lib/nova/instances/44a526e3-4f37-4d95-a98c-45a5937384e7/disk.config because it was imported into RBD.#033[00m
Jan 23 06:00:41 np0005593234 kernel: tap36f5a7d1-6a: entered promiscuous mode
Jan 23 06:00:41 np0005593234 ovn_controller[134547]: 2026-01-23T11:00:41Z|00938|binding|INFO|Claiming lport 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 for this chassis.
Jan 23 06:00:41 np0005593234 ovn_controller[134547]: 2026-01-23T11:00:41Z|00939|binding|INFO|36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1: Claiming fa:16:3e:40:55:45 10.100.0.11
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.291 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:41 np0005593234 NetworkManager[48942]: <info>  [1769166041.2934] manager: (tap36f5a7d1-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/447)
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.306 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:55:45 10.100.0.11'], port_security=['fa:16:3e:40:55:45 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '44a526e3-4f37-4d95-a98c-45a5937384e7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1da7606-9f16-49a8-8326-606f8222c72a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3a76f7ca-087a-4d0c-8616-c4c740ce4ffd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04f692a2-333e-430f-ac5b-4d3166e27ee2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.307 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 in datapath f1da7606-9f16-49a8-8326-606f8222c72a bound to our chassis#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.309 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f1da7606-9f16-49a8-8326-606f8222c72a#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.322 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e8b460-3f09-48f7-93cd-734bdd60923d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.325 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf1da7606-91 in ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.327 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf1da7606-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.328 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8ab22752-90ab-4a81-86f5-e2988265e457]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.328 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[be15b154-258b-45e9-bdb0-fa96a5253e37]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 systemd-machined[195626]: New machine qemu-105-instance-000000d6.
Jan 23 06:00:41 np0005593234 systemd[1]: Started Virtual Machine qemu-105-instance-000000d6.
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.347 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[3b371826-cc95-49e0-97a3-7fa42ddf8113]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 systemd-udevd[337041]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:00:41 np0005593234 NetworkManager[48942]: <info>  [1769166041.3677] device (tap36f5a7d1-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:00:41 np0005593234 NetworkManager[48942]: <info>  [1769166041.3691] device (tap36f5a7d1-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.369 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.375 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[69e6cb17-322d-4f4d-acdf-200d86bd0bc1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 ovn_controller[134547]: 2026-01-23T11:00:41Z|00940|binding|INFO|Setting lport 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 ovn-installed in OVS
Jan 23 06:00:41 np0005593234 ovn_controller[134547]: 2026-01-23T11:00:41Z|00941|binding|INFO|Setting lport 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 up in Southbound
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.380 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.406 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[92d8dbd5-5c49-45e1-bc3a-a0b12c0a03e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.411 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1326446e-7f49-4bb8-96b4-e93ed3286538]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 NetworkManager[48942]: <info>  [1769166041.4135] manager: (tapf1da7606-90): new Veth device (/org/freedesktop/NetworkManager/Devices/448)
Jan 23 06:00:41 np0005593234 podman[337027]: 2026-01-23 11:00:41.426116028 +0000 UTC m=+0.105265004 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2)
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.445 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[948e878f-c75c-48c9-afb9-d0054b27ea94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.448 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc4f9fb-9218-47a8-90cf-e57d7de9013a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 NetworkManager[48942]: <info>  [1769166041.4665] device (tapf1da7606-90): carrier: link connected
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.472 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f5894562-db32-4b49-8664-d079462fefdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.491 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[60f64854-0ea7-4294-866c-6258e07dde4f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf1da7606-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:83:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 288], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 993642, 'reachable_time': 40292, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 337087, 'error': None, 'target': 'ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.506 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[97f0ec3e-69dd-46e4-977c-c01e995b2c4b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:8303'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 993642, 'tstamp': 993642}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 337088, 'error': None, 'target': 'ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.523 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c29039cb-8c21-41e5-a726-d2fa77e667fa]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf1da7606-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:83:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 288], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 993642, 'reachable_time': 40292, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 337089, 'error': None, 'target': 'ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.546 227766 DEBUG nova.network.neutron [req-2e46e589-bf0e-41c1-a331-98bc02c55d1b req-a76ba15c-47ad-400d-906f-7d10c618fe13 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Updated VIF entry in instance network info cache for port 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.547 227766 DEBUG nova.network.neutron [req-2e46e589-bf0e-41c1-a331-98bc02c55d1b req-a76ba15c-47ad-400d-906f-7d10c618fe13 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Updating instance_info_cache with network_info: [{"id": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "address": "fa:16:3e:40:55:45", "network": {"id": "f1da7606-9f16-49a8-8326-606f8222c72a", "bridge": "br-int", "label": "tempest-network-smoke--957000541", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36f5a7d1-6a", "ovs_interfaceid": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.552 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0c5a5b88-27e9-4bdc-91bb-9ba94bebceeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.563 227766 DEBUG oslo_concurrency.lockutils [req-2e46e589-bf0e-41c1-a331-98bc02c55d1b req-a76ba15c-47ad-400d-906f-7d10c618fe13 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.606 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cc5f0bfe-13b5-43b5-9f33-c1a6a0cdb41c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.608 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1da7606-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.608 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.609 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf1da7606-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.610 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:41 np0005593234 NetworkManager[48942]: <info>  [1769166041.6110] manager: (tapf1da7606-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/449)
Jan 23 06:00:41 np0005593234 kernel: tapf1da7606-90: entered promiscuous mode
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.613 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf1da7606-90, col_values=(('external_ids', {'iface-id': '8becf582-b7f3-45a1-86ae-65ec0d4a1a4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.614 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:41 np0005593234 ovn_controller[134547]: 2026-01-23T11:00:41Z|00942|binding|INFO|Releasing lport 8becf582-b7f3-45a1-86ae-65ec0d4a1a4c from this chassis (sb_readonly=0)
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.629 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.630 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f1da7606-9f16-49a8-8326-606f8222c72a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f1da7606-9f16-49a8-8326-606f8222c72a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.631 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ccfed291-ed38-4afd-80fb-1601f2a46df7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.631 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-f1da7606-9f16-49a8-8326-606f8222c72a
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/f1da7606-9f16-49a8-8326-606f8222c72a.pid.haproxy
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID f1da7606-9f16-49a8-8326-606f8222c72a
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 06:00:41 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:41.632 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a', 'env', 'PROCESS_TAG=haproxy-f1da7606-9f16-49a8-8326-606f8222c72a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f1da7606-9f16-49a8-8326-606f8222c72a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.651 227766 DEBUG nova.compute.manager [req-a98e16b9-e0a1-40c9-9d5b-add913b089ec req-1d7e64b4-114f-479b-bd88-51527436f688 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received event network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.651 227766 DEBUG oslo_concurrency.lockutils [req-a98e16b9-e0a1-40c9-9d5b-add913b089ec req-1d7e64b4-114f-479b-bd88-51527436f688 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.651 227766 DEBUG oslo_concurrency.lockutils [req-a98e16b9-e0a1-40c9-9d5b-add913b089ec req-1d7e64b4-114f-479b-bd88-51527436f688 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.652 227766 DEBUG oslo_concurrency.lockutils [req-a98e16b9-e0a1-40c9-9d5b-add913b089ec req-1d7e64b4-114f-479b-bd88-51527436f688 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.652 227766 DEBUG nova.compute.manager [req-a98e16b9-e0a1-40c9-9d5b-add913b089ec req-1d7e64b4-114f-479b-bd88-51527436f688 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Processing event network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.882 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166041.8817616, 44a526e3-4f37-4d95-a98c-45a5937384e7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.883 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] VM Started (Lifecycle Event)#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.884 227766 DEBUG nova.compute.manager [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.888 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.890 227766 INFO nova.virt.libvirt.driver [-] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Instance spawned successfully.#033[00m
Jan 23 06:00:41 np0005593234 nova_compute[227762]: 2026-01-23 11:00:41.891 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 06:00:42 np0005593234 podman[337161]: 2026-01-23 11:00:42.005992856 +0000 UTC m=+0.051099110 container create a97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 06:00:42 np0005593234 systemd[1]: Started libpod-conmon-a97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11.scope.
Jan 23 06:00:42 np0005593234 systemd[1]: Started libcrun container.
Jan 23 06:00:42 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/16e92db69f0ff0156c9a6db8eaf34ffebb4bdf641416d50c9676448f901e01fc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.071 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.072 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.072 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.073 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.073 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.074 227766 DEBUG nova.virt.libvirt.driver [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.077 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:00:42 np0005593234 podman[337161]: 2026-01-23 11:00:41.978669322 +0000 UTC m=+0.023775606 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.081 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:00:42 np0005593234 podman[337161]: 2026-01-23 11:00:42.083270633 +0000 UTC m=+0.128376907 container init a97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 06:00:42 np0005593234 podman[337161]: 2026-01-23 11:00:42.088959981 +0000 UTC m=+0.134066245 container start a97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.106 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.107 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166041.882075, 44a526e3-4f37-4d95-a98c-45a5937384e7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.107 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] VM Paused (Lifecycle Event)#033[00m
Jan 23 06:00:42 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337176]: [NOTICE]   (337180) : New worker (337182) forked
Jan 23 06:00:42 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337176]: [NOTICE]   (337180) : Loading success.
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.164 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.169 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166041.8871238, 44a526e3-4f37-4d95-a98c-45a5937384e7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.169 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.192 227766 INFO nova.compute.manager [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Took 12.49 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.192 227766 DEBUG nova.compute.manager [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.194 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.199 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.229 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.251 227766 INFO nova.compute.manager [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Took 13.42 seconds to build instance.#033[00m
Jan 23 06:00:42 np0005593234 nova_compute[227762]: 2026-01-23 11:00:42.265 227766 DEBUG oslo_concurrency.lockutils [None req-622d7dd1-c970-4085-833e-255cda30e6a3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.526s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:00:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:42.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:42.908 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:00:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:42.909 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:00:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:00:42.910 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:00:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:42.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:43 np0005593234 nova_compute[227762]: 2026-01-23 11:00:43.768 227766 DEBUG nova.compute.manager [req-c71ee4dc-6119-498f-97af-145d76c9cdef req-ba6c0f01-a81f-45cb-86e8-37dbcf504fcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received event network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:00:43 np0005593234 nova_compute[227762]: 2026-01-23 11:00:43.769 227766 DEBUG oslo_concurrency.lockutils [req-c71ee4dc-6119-498f-97af-145d76c9cdef req-ba6c0f01-a81f-45cb-86e8-37dbcf504fcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:00:43 np0005593234 nova_compute[227762]: 2026-01-23 11:00:43.770 227766 DEBUG oslo_concurrency.lockutils [req-c71ee4dc-6119-498f-97af-145d76c9cdef req-ba6c0f01-a81f-45cb-86e8-37dbcf504fcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:00:43 np0005593234 nova_compute[227762]: 2026-01-23 11:00:43.771 227766 DEBUG oslo_concurrency.lockutils [req-c71ee4dc-6119-498f-97af-145d76c9cdef req-ba6c0f01-a81f-45cb-86e8-37dbcf504fcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:00:43 np0005593234 nova_compute[227762]: 2026-01-23 11:00:43.771 227766 DEBUG nova.compute.manager [req-c71ee4dc-6119-498f-97af-145d76c9cdef req-ba6c0f01-a81f-45cb-86e8-37dbcf504fcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] No waiting events found dispatching network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:00:43 np0005593234 nova_compute[227762]: 2026-01-23 11:00:43.772 227766 WARNING nova.compute.manager [req-c71ee4dc-6119-498f-97af-145d76c9cdef req-ba6c0f01-a81f-45cb-86e8-37dbcf504fcd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received unexpected event network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 for instance with vm_state active and task_state None.#033[00m
Jan 23 06:00:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:00:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/780170405' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:00:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:00:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/780170405' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:00:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:44.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:44.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:45 np0005593234 nova_compute[227762]: 2026-01-23 11:00:45.359 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:45 np0005593234 nova_compute[227762]: 2026-01-23 11:00:45.870 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:46.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:46.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:48.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:48.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:49 np0005593234 NetworkManager[48942]: <info>  [1769166049.5250] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/450)
Jan 23 06:00:49 np0005593234 NetworkManager[48942]: <info>  [1769166049.5260] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/451)
Jan 23 06:00:49 np0005593234 ovn_controller[134547]: 2026-01-23T11:00:49Z|00943|binding|INFO|Releasing lport 8becf582-b7f3-45a1-86ae-65ec0d4a1a4c from this chassis (sb_readonly=0)
Jan 23 06:00:49 np0005593234 nova_compute[227762]: 2026-01-23 11:00:49.524 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:49 np0005593234 ovn_controller[134547]: 2026-01-23T11:00:49Z|00944|binding|INFO|Releasing lport 8becf582-b7f3-45a1-86ae-65ec0d4a1a4c from this chassis (sb_readonly=0)
Jan 23 06:00:49 np0005593234 nova_compute[227762]: 2026-01-23 11:00:49.592 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:49 np0005593234 nova_compute[227762]: 2026-01-23 11:00:49.596 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:49 np0005593234 nova_compute[227762]: 2026-01-23 11:00:49.851 227766 DEBUG nova.compute.manager [req-65070faa-15e9-4c37-bb76-469b9f47cb75 req-47754f1b-afec-4cc7-beb3-f66da43e0a62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received event network-changed-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:00:49 np0005593234 nova_compute[227762]: 2026-01-23 11:00:49.851 227766 DEBUG nova.compute.manager [req-65070faa-15e9-4c37-bb76-469b9f47cb75 req-47754f1b-afec-4cc7-beb3-f66da43e0a62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Refreshing instance network info cache due to event network-changed-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:00:49 np0005593234 nova_compute[227762]: 2026-01-23 11:00:49.852 227766 DEBUG oslo_concurrency.lockutils [req-65070faa-15e9-4c37-bb76-469b9f47cb75 req-47754f1b-afec-4cc7-beb3-f66da43e0a62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:00:49 np0005593234 nova_compute[227762]: 2026-01-23 11:00:49.852 227766 DEBUG oslo_concurrency.lockutils [req-65070faa-15e9-4c37-bb76-469b9f47cb75 req-47754f1b-afec-4cc7-beb3-f66da43e0a62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:00:49 np0005593234 nova_compute[227762]: 2026-01-23 11:00:49.852 227766 DEBUG nova.network.neutron [req-65070faa-15e9-4c37-bb76-469b9f47cb75 req-47754f1b-afec-4cc7-beb3-f66da43e0a62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Refreshing network info cache for port 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:00:50 np0005593234 nova_compute[227762]: 2026-01-23 11:00:50.396 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:50.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:50 np0005593234 nova_compute[227762]: 2026-01-23 11:00:50.872 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:50.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:51 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:00:51 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:00:51 np0005593234 nova_compute[227762]: 2026-01-23 11:00:51.584 227766 DEBUG nova.network.neutron [req-65070faa-15e9-4c37-bb76-469b9f47cb75 req-47754f1b-afec-4cc7-beb3-f66da43e0a62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Updated VIF entry in instance network info cache for port 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:00:51 np0005593234 nova_compute[227762]: 2026-01-23 11:00:51.585 227766 DEBUG nova.network.neutron [req-65070faa-15e9-4c37-bb76-469b9f47cb75 req-47754f1b-afec-4cc7-beb3-f66da43e0a62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Updating instance_info_cache with network_info: [{"id": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "address": "fa:16:3e:40:55:45", "network": {"id": "f1da7606-9f16-49a8-8326-606f8222c72a", "bridge": "br-int", "label": "tempest-network-smoke--957000541", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36f5a7d1-6a", "ovs_interfaceid": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:00:51 np0005593234 nova_compute[227762]: 2026-01-23 11:00:51.604 227766 DEBUG oslo_concurrency.lockutils [req-65070faa-15e9-4c37-bb76-469b9f47cb75 req-47754f1b-afec-4cc7-beb3-f66da43e0a62 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:00:51 np0005593234 nova_compute[227762]: 2026-01-23 11:00:51.761 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:00:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:00:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:00:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:00:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:52.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:52.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:54.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:00:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:54.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:55 np0005593234 nova_compute[227762]: 2026-01-23 11:00:55.398 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:55 np0005593234 ovn_controller[134547]: 2026-01-23T11:00:55Z|00122|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:40:55:45 10.100.0.11
Jan 23 06:00:55 np0005593234 ovn_controller[134547]: 2026-01-23T11:00:55Z|00123|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:40:55:45 10.100.0.11
Jan 23 06:00:55 np0005593234 nova_compute[227762]: 2026-01-23 11:00:55.905 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:00:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:00:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:56.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:00:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:56.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:00:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:00:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:00:58.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:00:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:00:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:00:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:00:58.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:01:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:01:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:01:00 np0005593234 nova_compute[227762]: 2026-01-23 11:01:00.401 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:00.776 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:00 np0005593234 nova_compute[227762]: 2026-01-23 11:01:00.912 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:00.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:01 np0005593234 nova_compute[227762]: 2026-01-23 11:01:01.524 227766 INFO nova.compute.manager [None req-cda48517-8ef2-4922-a173-29c0caf893fc 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Get console output#033[00m
Jan 23 06:01:01 np0005593234 nova_compute[227762]: 2026-01-23 11:01:01.532 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 06:01:01 np0005593234 nova_compute[227762]: 2026-01-23 11:01:01.840 227766 DEBUG oslo_concurrency.lockutils [None req-f63292f1-c850-4b31-84ff-d80c9f93e768 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "44a526e3-4f37-4d95-a98c-45a5937384e7" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:01 np0005593234 nova_compute[227762]: 2026-01-23 11:01:01.840 227766 DEBUG oslo_concurrency.lockutils [None req-f63292f1-c850-4b31-84ff-d80c9f93e768 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:01 np0005593234 nova_compute[227762]: 2026-01-23 11:01:01.841 227766 INFO nova.compute.manager [None req-f63292f1-c850-4b31-84ff-d80c9f93e768 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Rebooting instance#033[00m
Jan 23 06:01:01 np0005593234 nova_compute[227762]: 2026-01-23 11:01:01.857 227766 DEBUG oslo_concurrency.lockutils [None req-f63292f1-c850-4b31-84ff-d80c9f93e768 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:01:01 np0005593234 nova_compute[227762]: 2026-01-23 11:01:01.857 227766 DEBUG oslo_concurrency.lockutils [None req-f63292f1-c850-4b31-84ff-d80c9f93e768 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:01:01 np0005593234 nova_compute[227762]: 2026-01-23 11:01:01.857 227766 DEBUG nova.network.neutron [None req-f63292f1-c850-4b31-84ff-d80c9f93e768 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:01:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:01:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:02.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:01:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:02.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:03 np0005593234 nova_compute[227762]: 2026-01-23 11:01:03.832 227766 DEBUG nova.network.neutron [None req-f63292f1-c850-4b31-84ff-d80c9f93e768 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Updating instance_info_cache with network_info: [{"id": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "address": "fa:16:3e:40:55:45", "network": {"id": "f1da7606-9f16-49a8-8326-606f8222c72a", "bridge": "br-int", "label": "tempest-network-smoke--957000541", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36f5a7d1-6a", "ovs_interfaceid": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:01:03 np0005593234 nova_compute[227762]: 2026-01-23 11:01:03.849 227766 DEBUG oslo_concurrency.lockutils [None req-f63292f1-c850-4b31-84ff-d80c9f93e768 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:01:03 np0005593234 nova_compute[227762]: 2026-01-23 11:01:03.851 227766 DEBUG nova.compute.manager [None req-f63292f1-c850-4b31-84ff-d80c9f93e768 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:01:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:01:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:04.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:01:04 np0005593234 podman[337444]: 2026-01-23 11:01:04.793771838 +0000 UTC m=+0.085477704 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 06:01:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:04.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:05 np0005593234 nova_compute[227762]: 2026-01-23 11:01:05.403 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:05 np0005593234 nova_compute[227762]: 2026-01-23 11:01:05.914 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:06 np0005593234 kernel: tap36f5a7d1-6a (unregistering): left promiscuous mode
Jan 23 06:01:06 np0005593234 NetworkManager[48942]: <info>  [1769166066.7603] device (tap36f5a7d1-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:01:06 np0005593234 nova_compute[227762]: 2026-01-23 11:01:06.769 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:06 np0005593234 ovn_controller[134547]: 2026-01-23T11:01:06Z|00945|binding|INFO|Releasing lport 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 from this chassis (sb_readonly=0)
Jan 23 06:01:06 np0005593234 ovn_controller[134547]: 2026-01-23T11:01:06Z|00946|binding|INFO|Setting lport 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 down in Southbound
Jan 23 06:01:06 np0005593234 ovn_controller[134547]: 2026-01-23T11:01:06Z|00947|binding|INFO|Removing iface tap36f5a7d1-6a ovn-installed in OVS
Jan 23 06:01:06 np0005593234 nova_compute[227762]: 2026-01-23 11:01:06.772 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:06.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:06 np0005593234 nova_compute[227762]: 2026-01-23 11:01:06.788 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:06 np0005593234 systemd[1]: machine-qemu\x2d105\x2dinstance\x2d000000d6.scope: Deactivated successfully.
Jan 23 06:01:06 np0005593234 systemd[1]: machine-qemu\x2d105\x2dinstance\x2d000000d6.scope: Consumed 13.831s CPU time.
Jan 23 06:01:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:06.813 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:55:45 10.100.0.11'], port_security=['fa:16:3e:40:55:45 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '44a526e3-4f37-4d95-a98c-45a5937384e7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1da7606-9f16-49a8-8326-606f8222c72a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3a76f7ca-087a-4d0c-8616-c4c740ce4ffd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04f692a2-333e-430f-ac5b-4d3166e27ee2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:01:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:06.815 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 in datapath f1da7606-9f16-49a8-8326-606f8222c72a unbound from our chassis#033[00m
Jan 23 06:01:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:06.816 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f1da7606-9f16-49a8-8326-606f8222c72a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:01:06 np0005593234 systemd-machined[195626]: Machine qemu-105-instance-000000d6 terminated.
Jan 23 06:01:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:06.818 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1d3d4ae8-543e-4526-a16e-4298f2a7d889]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:06.818 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a namespace which is not needed anymore#033[00m
Jan 23 06:01:06 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337176]: [NOTICE]   (337180) : haproxy version is 2.8.14-c23fe91
Jan 23 06:01:06 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337176]: [NOTICE]   (337180) : path to executable is /usr/sbin/haproxy
Jan 23 06:01:06 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337176]: [WARNING]  (337180) : Exiting Master process...
Jan 23 06:01:06 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337176]: [ALERT]    (337180) : Current worker (337182) exited with code 143 (Terminated)
Jan 23 06:01:06 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337176]: [WARNING]  (337180) : All workers exited. Exiting... (0)
Jan 23 06:01:06 np0005593234 systemd[1]: libpod-a97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11.scope: Deactivated successfully.
Jan 23 06:01:06 np0005593234 conmon[337176]: conmon a97d1b9b0d012d037def <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11.scope/container/memory.events
Jan 23 06:01:06 np0005593234 podman[337491]: 2026-01-23 11:01:06.945072 +0000 UTC m=+0.043096610 container died a97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:01:06 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11-userdata-shm.mount: Deactivated successfully.
Jan 23 06:01:06 np0005593234 systemd[1]: var-lib-containers-storage-overlay-16e92db69f0ff0156c9a6db8eaf34ffebb4bdf641416d50c9676448f901e01fc-merged.mount: Deactivated successfully.
Jan 23 06:01:06 np0005593234 podman[337491]: 2026-01-23 11:01:06.975576024 +0000 UTC m=+0.073600634 container cleanup a97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 06:01:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:01:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:06.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:01:06 np0005593234 nova_compute[227762]: 2026-01-23 11:01:06.985 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:06 np0005593234 nova_compute[227762]: 2026-01-23 11:01:06.990 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:06 np0005593234 systemd[1]: libpod-conmon-a97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11.scope: Deactivated successfully.
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.000 227766 INFO nova.virt.libvirt.driver [None req-f63292f1-c850-4b31-84ff-d80c9f93e768 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Instance shutdown successfully.#033[00m
Jan 23 06:01:07 np0005593234 virtqemud[227483]: End of file while reading data: Input/output error
Jan 23 06:01:07 np0005593234 podman[337520]: 2026-01-23 11:01:07.070929746 +0000 UTC m=+0.069346050 container remove a97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:01:07 np0005593234 kernel: tap36f5a7d1-6a: entered promiscuous mode
Jan 23 06:01:07 np0005593234 systemd-udevd[337469]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:01:07 np0005593234 NetworkManager[48942]: <info>  [1769166067.0770] manager: (tap36f5a7d1-6a): new Tun device (/org/freedesktop/NetworkManager/Devices/452)
Jan 23 06:01:07 np0005593234 ovn_controller[134547]: 2026-01-23T11:01:07Z|00948|binding|INFO|Claiming lport 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 for this chassis.
Jan 23 06:01:07 np0005593234 ovn_controller[134547]: 2026-01-23T11:01:07Z|00949|binding|INFO|36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1: Claiming fa:16:3e:40:55:45 10.100.0.11
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.078 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.078 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0d2f8ab5-d442-447f-a81b-0e22b63e3efb]: (4, ('Fri Jan 23 11:01:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a (a97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11)\na97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11\nFri Jan 23 11:01:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a (a97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11)\na97d1b9b0d012d037defbf1143c4008caf595ab58a9b29cf67a14159d5ec4d11\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.080 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6874aa03-e816-493e-a021-e0f8f867b381]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.081 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1da7606-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.082 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.086 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:55:45 10.100.0.11'], port_security=['fa:16:3e:40:55:45 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '44a526e3-4f37-4d95-a98c-45a5937384e7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1da7606-9f16-49a8-8326-606f8222c72a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3a76f7ca-087a-4d0c-8616-c4c740ce4ffd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04f692a2-333e-430f-ac5b-4d3166e27ee2, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:01:07 np0005593234 NetworkManager[48942]: <info>  [1769166067.0884] device (tap36f5a7d1-6a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:01:07 np0005593234 NetworkManager[48942]: <info>  [1769166067.0892] device (tap36f5a7d1-6a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.093 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:07 np0005593234 ovn_controller[134547]: 2026-01-23T11:01:07Z|00950|binding|INFO|Setting lport 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 ovn-installed in OVS
Jan 23 06:01:07 np0005593234 ovn_controller[134547]: 2026-01-23T11:01:07Z|00951|binding|INFO|Setting lport 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 up in Southbound
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.104 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:07 np0005593234 kernel: tapf1da7606-90: left promiscuous mode
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.107 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.110 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[48982bd7-fc62-43cf-88bd-8e4013d84e68]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 systemd-machined[195626]: New machine qemu-106-instance-000000d6.
Jan 23 06:01:07 np0005593234 systemd[1]: Started Virtual Machine qemu-106-instance-000000d6.
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.130 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e7dcd606-5dd8-438e-8f02-b44f36e0c3d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.132 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[925eaf23-ea66-4cb2-a857-d51dff45f955]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.149 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3421b600-203c-436e-b3b8-03895f24d70d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 993636, 'reachable_time': 43707, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 337556, 'error': None, 'target': 'ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 systemd[1]: run-netns-ovnmeta\x2df1da7606\x2d9f16\x2d49a8\x2d8326\x2d606f8222c72a.mount: Deactivated successfully.
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.154 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.155 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[a6fc2be6-a21f-47bd-813b-6c93c349ce52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.156 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 in datapath f1da7606-9f16-49a8-8326-606f8222c72a bound to our chassis#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.157 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f1da7606-9f16-49a8-8326-606f8222c72a#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.168 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f809b3dd-7119-449a-bf68-219f8429023b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.169 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf1da7606-91 in ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.170 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf1da7606-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.170 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3e67eabb-3650-493e-8fc8-f2f171d936a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.171 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e534db27-e5f0-4185-bcfd-5df1b527ae8c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.185 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[88eefd30-dd40-4cb1-8aef-b37b8b644bdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.202 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[16d9fb7d-1884-4acd-80cc-94b837f098fa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.230 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[842101f2-0004-44e6-b33f-64fe9e97a0c3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.235 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9d56cf4f-094f-420b-afdc-dfd21951ed9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 NetworkManager[48942]: <info>  [1769166067.2366] manager: (tapf1da7606-90): new Veth device (/org/freedesktop/NetworkManager/Devices/453)
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.268 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[6c82953b-053a-450f-82cc-71537c77c1f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.271 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f24964bb-d414-4c6d-9cb0-d9b65120553b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 NetworkManager[48942]: <info>  [1769166067.2924] device (tapf1da7606-90): carrier: link connected
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.298 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a9e40db6-13dd-4bc6-a2c5-6cb7e8bfe86f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.314 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3ba5461c-3d99-4b5b-ad73-cc1cad56ced9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf1da7606-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:83:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 291], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 996225, 'reachable_time': 28377, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 337587, 'error': None, 'target': 'ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.329 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[902ccad9-46e9-42d9-8350-6f6ca4309255]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe36:8303'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 996225, 'tstamp': 996225}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 337588, 'error': None, 'target': 'ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.346 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[13653c91-c685-4176-bb62-e2c614c14d68]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf1da7606-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:36:83:03'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 291], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 996225, 'reachable_time': 28377, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 337589, 'error': None, 'target': 'ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.377 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3f0c677b-fe21-40b2-a427-03088963f849]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.435 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e5aadc9f-bd94-4659-b06b-1743a768e7b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.436 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1da7606-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.437 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.437 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf1da7606-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.439 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:07 np0005593234 NetworkManager[48942]: <info>  [1769166067.4399] manager: (tapf1da7606-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/454)
Jan 23 06:01:07 np0005593234 kernel: tapf1da7606-90: entered promiscuous mode
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.441 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.442 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf1da7606-90, col_values=(('external_ids', {'iface-id': '8becf582-b7f3-45a1-86ae-65ec0d4a1a4c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.442 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:07 np0005593234 ovn_controller[134547]: 2026-01-23T11:01:07Z|00952|binding|INFO|Releasing lport 8becf582-b7f3-45a1-86ae-65ec0d4a1a4c from this chassis (sb_readonly=0)
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.444 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.444 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f1da7606-9f16-49a8-8326-606f8222c72a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f1da7606-9f16-49a8-8326-606f8222c72a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.445 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[fb5ee552-a7eb-4893-a098-4ed001a6e7ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.446 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-f1da7606-9f16-49a8-8326-606f8222c72a
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/f1da7606-9f16-49a8-8326-606f8222c72a.pid.haproxy
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID f1da7606-9f16-49a8-8326-606f8222c72a
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 06:01:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:07.446 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a', 'env', 'PROCESS_TAG=haproxy-f1da7606-9f16-49a8-8326-606f8222c72a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f1da7606-9f16-49a8-8326-606f8222c72a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.457 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.587 227766 DEBUG nova.compute.manager [req-1a91eae0-de78-4fa0-bf57-2dbd07bee6b8 req-5471df41-57c5-445a-8d6e-ba1cbbb9cd52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received event network-vif-unplugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.587 227766 DEBUG oslo_concurrency.lockutils [req-1a91eae0-de78-4fa0-bf57-2dbd07bee6b8 req-5471df41-57c5-445a-8d6e-ba1cbbb9cd52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.588 227766 DEBUG oslo_concurrency.lockutils [req-1a91eae0-de78-4fa0-bf57-2dbd07bee6b8 req-5471df41-57c5-445a-8d6e-ba1cbbb9cd52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.588 227766 DEBUG oslo_concurrency.lockutils [req-1a91eae0-de78-4fa0-bf57-2dbd07bee6b8 req-5471df41-57c5-445a-8d6e-ba1cbbb9cd52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.588 227766 DEBUG nova.compute.manager [req-1a91eae0-de78-4fa0-bf57-2dbd07bee6b8 req-5471df41-57c5-445a-8d6e-ba1cbbb9cd52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] No waiting events found dispatching network-vif-unplugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.588 227766 WARNING nova.compute.manager [req-1a91eae0-de78-4fa0-bf57-2dbd07bee6b8 req-5471df41-57c5-445a-8d6e-ba1cbbb9cd52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received unexpected event network-vif-unplugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 for instance with vm_state active and task_state reboot_started.#033[00m
Jan 23 06:01:07 np0005593234 podman[337639]: 2026-01-23 11:01:07.850082057 +0000 UTC m=+0.049667824 container create 75a6e95e9907aa69dbb90d34cb24d5805ed8a0b54adaa003d3b0a49cb2de932c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 06:01:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:07 np0005593234 systemd[1]: Started libpod-conmon-75a6e95e9907aa69dbb90d34cb24d5805ed8a0b54adaa003d3b0a49cb2de932c.scope.
Jan 23 06:01:07 np0005593234 systemd[1]: Started libcrun container.
Jan 23 06:01:07 np0005593234 podman[337639]: 2026-01-23 11:01:07.823889479 +0000 UTC m=+0.023475276 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:01:07 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6f35b00f959664f489d7d54065f0afc2cf2286f9e094b35d5a7b84b70852098/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.930 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for 44a526e3-4f37-4d95-a98c-45a5937384e7 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.931 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166067.9297748, 44a526e3-4f37-4d95-a98c-45a5937384e7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.932 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.937 227766 INFO nova.virt.libvirt.driver [-] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Instance running successfully.#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.937 227766 INFO nova.virt.libvirt.driver [None req-f63292f1-c850-4b31-84ff-d80c9f93e768 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Instance soft rebooted successfully.#033[00m
Jan 23 06:01:07 np0005593234 nova_compute[227762]: 2026-01-23 11:01:07.938 227766 DEBUG nova.compute.manager [None req-f63292f1-c850-4b31-84ff-d80c9f93e768 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:01:07 np0005593234 podman[337639]: 2026-01-23 11:01:07.940401393 +0000 UTC m=+0.139987160 container init 75a6e95e9907aa69dbb90d34cb24d5805ed8a0b54adaa003d3b0a49cb2de932c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:01:07 np0005593234 podman[337639]: 2026-01-23 11:01:07.945927895 +0000 UTC m=+0.145513662 container start 75a6e95e9907aa69dbb90d34cb24d5805ed8a0b54adaa003d3b0a49cb2de932c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:01:07 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337679]: [NOTICE]   (337683) : New worker (337685) forked
Jan 23 06:01:07 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337679]: [NOTICE]   (337683) : Loading success.
Jan 23 06:01:08 np0005593234 nova_compute[227762]: 2026-01-23 11:01:08.175 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:01:08 np0005593234 nova_compute[227762]: 2026-01-23 11:01:08.180 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:01:08 np0005593234 nova_compute[227762]: 2026-01-23 11:01:08.518 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] During sync_power_state the instance has a pending task (reboot_started). Skip.#033[00m
Jan 23 06:01:08 np0005593234 nova_compute[227762]: 2026-01-23 11:01:08.518 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166067.9298608, 44a526e3-4f37-4d95-a98c-45a5937384e7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:01:08 np0005593234 nova_compute[227762]: 2026-01-23 11:01:08.518 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] VM Started (Lifecycle Event)#033[00m
Jan 23 06:01:08 np0005593234 nova_compute[227762]: 2026-01-23 11:01:08.532 227766 DEBUG oslo_concurrency.lockutils [None req-f63292f1-c850-4b31-84ff-d80c9f93e768 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 6.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:08 np0005593234 nova_compute[227762]: 2026-01-23 11:01:08.550 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:01:08 np0005593234 nova_compute[227762]: 2026-01-23 11:01:08.553 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:01:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:08.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:01:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:08.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.663 227766 DEBUG nova.compute.manager [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received event network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.664 227766 DEBUG oslo_concurrency.lockutils [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.664 227766 DEBUG oslo_concurrency.lockutils [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.664 227766 DEBUG oslo_concurrency.lockutils [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.665 227766 DEBUG nova.compute.manager [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] No waiting events found dispatching network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.665 227766 WARNING nova.compute.manager [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received unexpected event network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 for instance with vm_state active and task_state None.#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.665 227766 DEBUG nova.compute.manager [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received event network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.665 227766 DEBUG oslo_concurrency.lockutils [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.666 227766 DEBUG oslo_concurrency.lockutils [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.666 227766 DEBUG oslo_concurrency.lockutils [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.666 227766 DEBUG nova.compute.manager [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] No waiting events found dispatching network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.666 227766 WARNING nova.compute.manager [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received unexpected event network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 for instance with vm_state active and task_state None.#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.667 227766 DEBUG nova.compute.manager [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received event network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.667 227766 DEBUG oslo_concurrency.lockutils [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.667 227766 DEBUG oslo_concurrency.lockutils [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.667 227766 DEBUG oslo_concurrency.lockutils [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.668 227766 DEBUG nova.compute.manager [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] No waiting events found dispatching network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:01:09 np0005593234 nova_compute[227762]: 2026-01-23 11:01:09.668 227766 WARNING nova.compute.manager [req-78e50bf9-d381-48ca-b29a-eb9c84e2d669 req-5ee9e98d-4683-4982-acaa-d9ba1a82cb5a 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received unexpected event network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 for instance with vm_state active and task_state None.#033[00m
Jan 23 06:01:10 np0005593234 nova_compute[227762]: 2026-01-23 11:01:10.405 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:10.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:10 np0005593234 nova_compute[227762]: 2026-01-23 11:01:10.916 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:01:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:10.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:01:11 np0005593234 podman[337696]: 2026-01-23 11:01:11.837377829 +0000 UTC m=+0.129617275 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 23 06:01:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:01:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:12.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:01:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:12.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:01:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:14.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:01:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:14.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:15 np0005593234 nova_compute[227762]: 2026-01-23 11:01:15.407 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:15 np0005593234 nova_compute[227762]: 2026-01-23 11:01:15.919 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:01:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:16.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:01:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:16.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:18.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:18.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:19 np0005593234 ovn_controller[134547]: 2026-01-23T11:01:19Z|00124|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:40:55:45 10.100.0.11
Jan 23 06:01:20 np0005593234 nova_compute[227762]: 2026-01-23 11:01:20.411 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:20 np0005593234 nova_compute[227762]: 2026-01-23 11:01:20.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:20 np0005593234 nova_compute[227762]: 2026-01-23 11:01:20.776 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:20 np0005593234 nova_compute[227762]: 2026-01-23 11:01:20.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:20 np0005593234 nova_compute[227762]: 2026-01-23 11:01:20.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:20 np0005593234 nova_compute[227762]: 2026-01-23 11:01:20.777 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:01:20 np0005593234 nova_compute[227762]: 2026-01-23 11:01:20.777 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:01:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:20.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:20 np0005593234 nova_compute[227762]: 2026-01-23 11:01:20.920 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 06:01:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:21.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 06:01:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:01:21 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4142364440' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:01:21 np0005593234 nova_compute[227762]: 2026-01-23 11:01:21.208 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:01:21 np0005593234 nova_compute[227762]: 2026-01-23 11:01:21.278 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000d6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 06:01:21 np0005593234 nova_compute[227762]: 2026-01-23 11:01:21.278 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000d6 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 06:01:21 np0005593234 nova_compute[227762]: 2026-01-23 11:01:21.430 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:01:21 np0005593234 nova_compute[227762]: 2026-01-23 11:01:21.432 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3940MB free_disk=20.942676544189453GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:01:21 np0005593234 nova_compute[227762]: 2026-01-23 11:01:21.432 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:21 np0005593234 nova_compute[227762]: 2026-01-23 11:01:21.432 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:21 np0005593234 nova_compute[227762]: 2026-01-23 11:01:21.495 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 44a526e3-4f37-4d95-a98c-45a5937384e7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 06:01:21 np0005593234 nova_compute[227762]: 2026-01-23 11:01:21.495 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:01:21 np0005593234 nova_compute[227762]: 2026-01-23 11:01:21.496 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:01:21 np0005593234 nova_compute[227762]: 2026-01-23 11:01:21.525 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:01:21 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:01:21 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1382378761' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:01:21 np0005593234 nova_compute[227762]: 2026-01-23 11:01:21.959 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:01:21 np0005593234 nova_compute[227762]: 2026-01-23 11:01:21.965 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:01:21 np0005593234 nova_compute[227762]: 2026-01-23 11:01:21.980 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:01:22 np0005593234 nova_compute[227762]: 2026-01-23 11:01:22.002 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:01:22 np0005593234 nova_compute[227762]: 2026-01-23 11:01:22.002 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.570s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:22.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:23.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:24 np0005593234 nova_compute[227762]: 2026-01-23 11:01:24.003 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:24 np0005593234 nova_compute[227762]: 2026-01-23 11:01:24.004 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:01:24 np0005593234 nova_compute[227762]: 2026-01-23 11:01:24.005 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:01:24 np0005593234 nova_compute[227762]: 2026-01-23 11:01:24.388 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:01:24 np0005593234 nova_compute[227762]: 2026-01-23 11:01:24.389 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:01:24 np0005593234 nova_compute[227762]: 2026-01-23 11:01:24.389 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 06:01:24 np0005593234 nova_compute[227762]: 2026-01-23 11:01:24.389 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 44a526e3-4f37-4d95-a98c-45a5937384e7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:01:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:24.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:25.009 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:25 np0005593234 nova_compute[227762]: 2026-01-23 11:01:25.353 227766 INFO nova.compute.manager [None req-f6837fcb-c89b-47ff-944e-6f6a9fac22bb 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Get console output#033[00m
Jan 23 06:01:25 np0005593234 nova_compute[227762]: 2026-01-23 11:01:25.358 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 06:01:25 np0005593234 nova_compute[227762]: 2026-01-23 11:01:25.414 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:25 np0005593234 nova_compute[227762]: 2026-01-23 11:01:25.820 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Updating instance_info_cache with network_info: [{"id": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "address": "fa:16:3e:40:55:45", "network": {"id": "f1da7606-9f16-49a8-8326-606f8222c72a", "bridge": "br-int", "label": "tempest-network-smoke--957000541", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36f5a7d1-6a", "ovs_interfaceid": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:01:25 np0005593234 nova_compute[227762]: 2026-01-23 11:01:25.841 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:01:25 np0005593234 nova_compute[227762]: 2026-01-23 11:01:25.841 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 06:01:25 np0005593234 nova_compute[227762]: 2026-01-23 11:01:25.841 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:25 np0005593234 nova_compute[227762]: 2026-01-23 11:01:25.923 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:25 np0005593234 nova_compute[227762]: 2026-01-23 11:01:25.941 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:25.943 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=94, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=93) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:01:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:25.944 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.254 227766 DEBUG nova.compute.manager [req-e46f4983-5e01-40c0-bab7-a929bf37b1c0 req-ac4fd47d-c50d-4537-a867-62acae11b633 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received event network-changed-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.255 227766 DEBUG nova.compute.manager [req-e46f4983-5e01-40c0-bab7-a929bf37b1c0 req-ac4fd47d-c50d-4537-a867-62acae11b633 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Refreshing instance network info cache due to event network-changed-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.255 227766 DEBUG oslo_concurrency.lockutils [req-e46f4983-5e01-40c0-bab7-a929bf37b1c0 req-ac4fd47d-c50d-4537-a867-62acae11b633 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.255 227766 DEBUG oslo_concurrency.lockutils [req-e46f4983-5e01-40c0-bab7-a929bf37b1c0 req-ac4fd47d-c50d-4537-a867-62acae11b633 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.256 227766 DEBUG nova.network.neutron [req-e46f4983-5e01-40c0-bab7-a929bf37b1c0 req-ac4fd47d-c50d-4537-a867-62acae11b633 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Refreshing network info cache for port 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.321 227766 DEBUG oslo_concurrency.lockutils [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "44a526e3-4f37-4d95-a98c-45a5937384e7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.322 227766 DEBUG oslo_concurrency.lockutils [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.322 227766 DEBUG oslo_concurrency.lockutils [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.323 227766 DEBUG oslo_concurrency.lockutils [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.323 227766 DEBUG oslo_concurrency.lockutils [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.324 227766 INFO nova.compute.manager [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Terminating instance#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.325 227766 DEBUG nova.compute.manager [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 06:01:26 np0005593234 kernel: tap36f5a7d1-6a (unregistering): left promiscuous mode
Jan 23 06:01:26 np0005593234 NetworkManager[48942]: <info>  [1769166086.3914] device (tap36f5a7d1-6a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:01:26 np0005593234 ovn_controller[134547]: 2026-01-23T11:01:26Z|00953|binding|INFO|Releasing lport 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 from this chassis (sb_readonly=0)
Jan 23 06:01:26 np0005593234 ovn_controller[134547]: 2026-01-23T11:01:26Z|00954|binding|INFO|Setting lport 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 down in Southbound
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.396 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:26 np0005593234 ovn_controller[134547]: 2026-01-23T11:01:26Z|00955|binding|INFO|Removing iface tap36f5a7d1-6a ovn-installed in OVS
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.398 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:26.411 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:55:45 10.100.0.11'], port_security=['fa:16:3e:40:55:45 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '44a526e3-4f37-4d95-a98c-45a5937384e7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f1da7606-9f16-49a8-8326-606f8222c72a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '3a76f7ca-087a-4d0c-8616-c4c740ce4ffd', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=04f692a2-333e-430f-ac5b-4d3166e27ee2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:01:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:26.412 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 in datapath f1da7606-9f16-49a8-8326-606f8222c72a unbound from our chassis#033[00m
Jan 23 06:01:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:26.413 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f1da7606-9f16-49a8-8326-606f8222c72a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:01:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:26.415 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[83582a34-a5ec-4336-bce3-8eff1004d7c2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:26.415 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a namespace which is not needed anymore#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.417 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:26 np0005593234 systemd[1]: machine-qemu\x2d106\x2dinstance\x2d000000d6.scope: Deactivated successfully.
Jan 23 06:01:26 np0005593234 systemd[1]: machine-qemu\x2d106\x2dinstance\x2d000000d6.scope: Consumed 13.520s CPU time.
Jan 23 06:01:26 np0005593234 systemd-machined[195626]: Machine qemu-106-instance-000000d6 terminated.
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.544 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:26 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337679]: [NOTICE]   (337683) : haproxy version is 2.8.14-c23fe91
Jan 23 06:01:26 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337679]: [NOTICE]   (337683) : path to executable is /usr/sbin/haproxy
Jan 23 06:01:26 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337679]: [WARNING]  (337683) : Exiting Master process...
Jan 23 06:01:26 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337679]: [WARNING]  (337683) : Exiting Master process...
Jan 23 06:01:26 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337679]: [ALERT]    (337683) : Current worker (337685) exited with code 143 (Terminated)
Jan 23 06:01:26 np0005593234 neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a[337679]: [WARNING]  (337683) : All workers exited. Exiting... (0)
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.550 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:26 np0005593234 systemd[1]: libpod-75a6e95e9907aa69dbb90d34cb24d5805ed8a0b54adaa003d3b0a49cb2de932c.scope: Deactivated successfully.
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.557 227766 INFO nova.virt.libvirt.driver [-] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Instance destroyed successfully.#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.557 227766 DEBUG nova.objects.instance [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'resources' on Instance uuid 44a526e3-4f37-4d95-a98c-45a5937384e7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:01:26 np0005593234 podman[337852]: 2026-01-23 11:01:26.559894993 +0000 UTC m=+0.051516082 container died 75a6e95e9907aa69dbb90d34cb24d5805ed8a0b54adaa003d3b0a49cb2de932c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.575 227766 DEBUG nova.virt.libvirt.vif [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:00:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1432894641',display_name='tempest-TestNetworkAdvancedServerOps-server-1432894641',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1432894641',id=214,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBO6Rgoec6R7+ZYgsddopjR9l5FplV8tN2PJmCmlV60Pib4goPLAOzFEeWZQShT7+lCH35hEqU0idsqmj+Oi79fUwESNDsABzmiKYg2NI49VRpRvg2c1Duh/33xLsRI6OFw==',key_name='tempest-TestNetworkAdvancedServerOps-1646664422',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:00:42Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-ymeku15o',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:01:08Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=44a526e3-4f37-4d95-a98c-45a5937384e7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "address": "fa:16:3e:40:55:45", "network": {"id": "f1da7606-9f16-49a8-8326-606f8222c72a", "bridge": "br-int", "label": "tempest-network-smoke--957000541", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36f5a7d1-6a", "ovs_interfaceid": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.576 227766 DEBUG nova.network.os_vif_util [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "address": "fa:16:3e:40:55:45", "network": {"id": "f1da7606-9f16-49a8-8326-606f8222c72a", "bridge": "br-int", "label": "tempest-network-smoke--957000541", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36f5a7d1-6a", "ovs_interfaceid": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.577 227766 DEBUG nova.network.os_vif_util [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:40:55:45,bridge_name='br-int',has_traffic_filtering=True,id=36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1,network=Network(f1da7606-9f16-49a8-8326-606f8222c72a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36f5a7d1-6a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.577 227766 DEBUG os_vif [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:40:55:45,bridge_name='br-int',has_traffic_filtering=True,id=36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1,network=Network(f1da7606-9f16-49a8-8326-606f8222c72a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36f5a7d1-6a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.580 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.580 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap36f5a7d1-6a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.583 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.587 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.590 227766 INFO os_vif [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:40:55:45,bridge_name='br-int',has_traffic_filtering=True,id=36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1,network=Network(f1da7606-9f16-49a8-8326-606f8222c72a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap36f5a7d1-6a')#033[00m
Jan 23 06:01:26 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-75a6e95e9907aa69dbb90d34cb24d5805ed8a0b54adaa003d3b0a49cb2de932c-userdata-shm.mount: Deactivated successfully.
Jan 23 06:01:26 np0005593234 systemd[1]: var-lib-containers-storage-overlay-e6f35b00f959664f489d7d54065f0afc2cf2286f9e094b35d5a7b84b70852098-merged.mount: Deactivated successfully.
Jan 23 06:01:26 np0005593234 podman[337852]: 2026-01-23 11:01:26.610337121 +0000 UTC m=+0.101958210 container cleanup 75a6e95e9907aa69dbb90d34cb24d5805ed8a0b54adaa003d3b0a49cb2de932c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 06:01:26 np0005593234 systemd[1]: libpod-conmon-75a6e95e9907aa69dbb90d34cb24d5805ed8a0b54adaa003d3b0a49cb2de932c.scope: Deactivated successfully.
Jan 23 06:01:26 np0005593234 podman[337904]: 2026-01-23 11:01:26.673027472 +0000 UTC m=+0.040342303 container remove 75a6e95e9907aa69dbb90d34cb24d5805ed8a0b54adaa003d3b0a49cb2de932c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 06:01:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:26.678 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a0d098e6-687b-4725-be45-cbf65c685dc2]: (4, ('Fri Jan 23 11:01:26 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a (75a6e95e9907aa69dbb90d34cb24d5805ed8a0b54adaa003d3b0a49cb2de932c)\n75a6e95e9907aa69dbb90d34cb24d5805ed8a0b54adaa003d3b0a49cb2de932c\nFri Jan 23 11:01:26 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a (75a6e95e9907aa69dbb90d34cb24d5805ed8a0b54adaa003d3b0a49cb2de932c)\n75a6e95e9907aa69dbb90d34cb24d5805ed8a0b54adaa003d3b0a49cb2de932c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:26.680 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1201bab9-90b3-4ec9-bdd9-fca2c1693a85]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:26.681 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf1da7606-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.683 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:26 np0005593234 kernel: tapf1da7606-90: left promiscuous mode
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.696 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:26.699 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a08eb0ca-3cff-4c82-ac8a-fe1e34cd4979]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:26.719 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[87e52dc1-c873-4d88-9872-35dd96c06b9a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:26.720 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b19b5990-ec9f-4c89-adc3-a0247caedfa9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:26.733 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ae98c594-77fb-4a90-8617-442d8100def0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 996218, 'reachable_time': 32393, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 337919, 'error': None, 'target': 'ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:26 np0005593234 systemd[1]: run-netns-ovnmeta\x2df1da7606\x2d9f16\x2d49a8\x2d8326\x2d606f8222c72a.mount: Deactivated successfully.
Jan 23 06:01:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:26.737 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f1da7606-9f16-49a8-8326-606f8222c72a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:01:26 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:26.737 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[48165d0f-c1a7-4ab4-bfa9-3b02a716a38e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.787 227766 DEBUG nova.compute.manager [req-18a842de-67d4-4cd2-9cfc-f29d0f8dda61 req-0a4581a0-3335-4c5f-af53-a6ff4cf6b978 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received event network-vif-unplugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.788 227766 DEBUG oslo_concurrency.lockutils [req-18a842de-67d4-4cd2-9cfc-f29d0f8dda61 req-0a4581a0-3335-4c5f-af53-a6ff4cf6b978 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.788 227766 DEBUG oslo_concurrency.lockutils [req-18a842de-67d4-4cd2-9cfc-f29d0f8dda61 req-0a4581a0-3335-4c5f-af53-a6ff4cf6b978 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.788 227766 DEBUG oslo_concurrency.lockutils [req-18a842de-67d4-4cd2-9cfc-f29d0f8dda61 req-0a4581a0-3335-4c5f-af53-a6ff4cf6b978 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.788 227766 DEBUG nova.compute.manager [req-18a842de-67d4-4cd2-9cfc-f29d0f8dda61 req-0a4581a0-3335-4c5f-af53-a6ff4cf6b978 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] No waiting events found dispatching network-vif-unplugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:01:26 np0005593234 nova_compute[227762]: 2026-01-23 11:01:26.789 227766 DEBUG nova.compute.manager [req-18a842de-67d4-4cd2-9cfc-f29d0f8dda61 req-0a4581a0-3335-4c5f-af53-a6ff4cf6b978 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received event network-vif-unplugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 06:01:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:26.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:27.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:27 np0005593234 nova_compute[227762]: 2026-01-23 11:01:27.020 227766 INFO nova.virt.libvirt.driver [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Deleting instance files /var/lib/nova/instances/44a526e3-4f37-4d95-a98c-45a5937384e7_del#033[00m
Jan 23 06:01:27 np0005593234 nova_compute[227762]: 2026-01-23 11:01:27.021 227766 INFO nova.virt.libvirt.driver [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Deletion of /var/lib/nova/instances/44a526e3-4f37-4d95-a98c-45a5937384e7_del complete#033[00m
Jan 23 06:01:27 np0005593234 nova_compute[227762]: 2026-01-23 11:01:27.096 227766 INFO nova.compute.manager [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Took 0.77 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 06:01:27 np0005593234 nova_compute[227762]: 2026-01-23 11:01:27.096 227766 DEBUG oslo.service.loopingcall [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 06:01:27 np0005593234 nova_compute[227762]: 2026-01-23 11:01:27.097 227766 DEBUG nova.compute.manager [-] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 06:01:27 np0005593234 nova_compute[227762]: 2026-01-23 11:01:27.097 227766 DEBUG nova.network.neutron [-] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 06:01:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:28.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:29.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:29 np0005593234 nova_compute[227762]: 2026-01-23 11:01:29.148 227766 DEBUG nova.compute.manager [req-53e8d3bd-86e2-4c4a-a871-1169adaf2213 req-230b1130-8c56-4edc-a13d-28b98518edc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received event network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:01:29 np0005593234 nova_compute[227762]: 2026-01-23 11:01:29.148 227766 DEBUG oslo_concurrency.lockutils [req-53e8d3bd-86e2-4c4a-a871-1169adaf2213 req-230b1130-8c56-4edc-a13d-28b98518edc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:29 np0005593234 nova_compute[227762]: 2026-01-23 11:01:29.148 227766 DEBUG oslo_concurrency.lockutils [req-53e8d3bd-86e2-4c4a-a871-1169adaf2213 req-230b1130-8c56-4edc-a13d-28b98518edc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:29 np0005593234 nova_compute[227762]: 2026-01-23 11:01:29.149 227766 DEBUG oslo_concurrency.lockutils [req-53e8d3bd-86e2-4c4a-a871-1169adaf2213 req-230b1130-8c56-4edc-a13d-28b98518edc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:29 np0005593234 nova_compute[227762]: 2026-01-23 11:01:29.149 227766 DEBUG nova.compute.manager [req-53e8d3bd-86e2-4c4a-a871-1169adaf2213 req-230b1130-8c56-4edc-a13d-28b98518edc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] No waiting events found dispatching network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:01:29 np0005593234 nova_compute[227762]: 2026-01-23 11:01:29.149 227766 WARNING nova.compute.manager [req-53e8d3bd-86e2-4c4a-a871-1169adaf2213 req-230b1130-8c56-4edc-a13d-28b98518edc3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received unexpected event network-vif-plugged-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 06:01:30 np0005593234 nova_compute[227762]: 2026-01-23 11:01:30.230 227766 DEBUG nova.network.neutron [-] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:01:30 np0005593234 nova_compute[227762]: 2026-01-23 11:01:30.337 227766 INFO nova.compute.manager [-] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Took 3.24 seconds to deallocate network for instance.#033[00m
Jan 23 06:01:30 np0005593234 nova_compute[227762]: 2026-01-23 11:01:30.490 227766 DEBUG oslo_concurrency.lockutils [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:30 np0005593234 nova_compute[227762]: 2026-01-23 11:01:30.491 227766 DEBUG oslo_concurrency.lockutils [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:30 np0005593234 nova_compute[227762]: 2026-01-23 11:01:30.502 227766 DEBUG nova.network.neutron [req-e46f4983-5e01-40c0-bab7-a929bf37b1c0 req-ac4fd47d-c50d-4537-a867-62acae11b633 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Updated VIF entry in instance network info cache for port 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:01:30 np0005593234 nova_compute[227762]: 2026-01-23 11:01:30.503 227766 DEBUG nova.network.neutron [req-e46f4983-5e01-40c0-bab7-a929bf37b1c0 req-ac4fd47d-c50d-4537-a867-62acae11b633 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Updating instance_info_cache with network_info: [{"id": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "address": "fa:16:3e:40:55:45", "network": {"id": "f1da7606-9f16-49a8-8326-606f8222c72a", "bridge": "br-int", "label": "tempest-network-smoke--957000541", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap36f5a7d1-6a", "ovs_interfaceid": "36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:01:30 np0005593234 nova_compute[227762]: 2026-01-23 11:01:30.560 227766 DEBUG oslo_concurrency.lockutils [req-e46f4983-5e01-40c0-bab7-a929bf37b1c0 req-ac4fd47d-c50d-4537-a867-62acae11b633 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-44a526e3-4f37-4d95-a98c-45a5937384e7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:01:30 np0005593234 nova_compute[227762]: 2026-01-23 11:01:30.731 227766 DEBUG oslo_concurrency.processutils [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:01:30 np0005593234 nova_compute[227762]: 2026-01-23 11:01:30.777 227766 DEBUG nova.compute.manager [req-2e67d5fa-df17-4921-acbe-499ea9863f74 req-4aa50486-7a1f-4c35-bd66-75e4efbcdcfc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Received event network-vif-deleted-36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:01:30 np0005593234 nova_compute[227762]: 2026-01-23 11:01:30.777 227766 INFO nova.compute.manager [req-2e67d5fa-df17-4921-acbe-499ea9863f74 req-4aa50486-7a1f-4c35-bd66-75e4efbcdcfc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Neutron deleted interface 36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 06:01:30 np0005593234 nova_compute[227762]: 2026-01-23 11:01:30.778 227766 DEBUG nova.network.neutron [req-2e67d5fa-df17-4921-acbe-499ea9863f74 req-4aa50486-7a1f-4c35-bd66-75e4efbcdcfc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:01:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:30.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:30 np0005593234 nova_compute[227762]: 2026-01-23 11:01:30.881 227766 DEBUG nova.compute.manager [req-2e67d5fa-df17-4921-acbe-499ea9863f74 req-4aa50486-7a1f-4c35-bd66-75e4efbcdcfc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Detach interface failed, port_id=36f5a7d1-6aca-44f8-acdb-4a0f64f22cf1, reason: Instance 44a526e3-4f37-4d95-a98c-45a5937384e7 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 06:01:30 np0005593234 nova_compute[227762]: 2026-01-23 11:01:30.925 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:31.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:01:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/450524046' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:01:31 np0005593234 nova_compute[227762]: 2026-01-23 11:01:31.174 227766 DEBUG oslo_concurrency.processutils [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:01:31 np0005593234 nova_compute[227762]: 2026-01-23 11:01:31.180 227766 DEBUG nova.compute.provider_tree [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:01:31 np0005593234 nova_compute[227762]: 2026-01-23 11:01:31.206 227766 DEBUG nova.scheduler.client.report [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:01:31 np0005593234 nova_compute[227762]: 2026-01-23 11:01:31.262 227766 DEBUG oslo_concurrency.lockutils [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.771s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:31 np0005593234 nova_compute[227762]: 2026-01-23 11:01:31.313 227766 INFO nova.scheduler.client.report [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Deleted allocations for instance 44a526e3-4f37-4d95-a98c-45a5937384e7#033[00m
Jan 23 06:01:31 np0005593234 nova_compute[227762]: 2026-01-23 11:01:31.393 227766 DEBUG oslo_concurrency.lockutils [None req-9bea5d09-4de0-42a7-909d-b6f22191532f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "44a526e3-4f37-4d95-a98c-45a5937384e7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.071s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:31 np0005593234 nova_compute[227762]: 2026-01-23 11:01:31.629 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:31 np0005593234 nova_compute[227762]: 2026-01-23 11:01:31.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:31 np0005593234 nova_compute[227762]: 2026-01-23 11:01:31.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:01:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:31.946 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '94'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:01:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:32.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:01:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:33.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:01:33 np0005593234 nova_compute[227762]: 2026-01-23 11:01:33.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:34 np0005593234 nova_compute[227762]: 2026-01-23 11:01:34.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:34.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:34 np0005593234 podman[337971]: 2026-01-23 11:01:34.911728356 +0000 UTC m=+0.074900254 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 06:01:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:35.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:35 np0005593234 nova_compute[227762]: 2026-01-23 11:01:35.926 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:36 np0005593234 nova_compute[227762]: 2026-01-23 11:01:36.631 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:01:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:36.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:01:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:37.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:38 np0005593234 nova_compute[227762]: 2026-01-23 11:01:38.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:38 np0005593234 nova_compute[227762]: 2026-01-23 11:01:38.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:38.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:39.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:01:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:40.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:01:40 np0005593234 nova_compute[227762]: 2026-01-23 11:01:40.928 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:01:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:41.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:01:41 np0005593234 nova_compute[227762]: 2026-01-23 11:01:41.556 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769166086.5554757, 44a526e3-4f37-4d95-a98c-45a5937384e7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:01:41 np0005593234 nova_compute[227762]: 2026-01-23 11:01:41.557 227766 INFO nova.compute.manager [-] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] VM Stopped (Lifecycle Event)#033[00m
Jan 23 06:01:41 np0005593234 nova_compute[227762]: 2026-01-23 11:01:41.633 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:42 np0005593234 nova_compute[227762]: 2026-01-23 11:01:42.097 227766 DEBUG nova.compute.manager [None req-f9c9b1ad-0967-45cb-81e3-a64af8e2108a - - - - - -] [instance: 44a526e3-4f37-4d95-a98c-45a5937384e7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:01:42 np0005593234 podman[338020]: 2026-01-23 11:01:42.813469611 +0000 UTC m=+0.108913648 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:01:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:42.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:42.909 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:01:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:42.910 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:01:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:01:42.910 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:01:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:01:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:43.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:01:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:44.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:01:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/567970132' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:01:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:01:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/567970132' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:01:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:45.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:45 np0005593234 nova_compute[227762]: 2026-01-23 11:01:45.775 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:45 np0005593234 nova_compute[227762]: 2026-01-23 11:01:45.847 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:45 np0005593234 nova_compute[227762]: 2026-01-23 11:01:45.931 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:46 np0005593234 nova_compute[227762]: 2026-01-23 11:01:46.635 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:46.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:47.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:01:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:48.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:01:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:49.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:01:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:50.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:01:50 np0005593234 nova_compute[227762]: 2026-01-23 11:01:50.933 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:51.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:51 np0005593234 nova_compute[227762]: 2026-01-23 11:01:51.638 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:01:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:52.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:01:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:53.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:53 np0005593234 nova_compute[227762]: 2026-01-23 11:01:53.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:01:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:01:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.0 total, 600.0 interval#012Cumulative writes: 19K writes, 97K keys, 19K commit groups, 1.0 writes per commit group, ingest: 0.19 GB, 0.03 MB/s#012Cumulative WAL: 19K writes, 19K syncs, 1.00 writes per sync, written: 0.19 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1426 writes, 6964 keys, 1426 commit groups, 1.0 writes per commit group, ingest: 15.01 MB, 0.03 MB/s#012Interval WAL: 1426 writes, 1426 syncs, 1.00 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     47.3      2.57              0.36        65    0.039       0      0       0.0       0.0#012  L6      1/0   11.85 MB   0.0      0.8     0.1      0.6       0.6      0.0       0.0   5.4    119.7    102.7      6.42              2.07        64    0.100    515K    34K       0.0       0.0#012 Sum      1/0   11.85 MB   0.0      0.8     0.1      0.6       0.8      0.1       0.0   6.4     85.6     86.9      8.99              2.44       129    0.070    515K    34K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.9    140.2    139.7      0.50              0.21        10    0.050     58K   2599       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) 
Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.8     0.1      0.6       0.6      0.0       0.0   0.0    119.7    102.7      6.42              2.07        64    0.100    515K    34K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     47.4      2.56              0.36        64    0.040       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 7200.0 total, 600.0 interval#012Flush(GB): cumulative 0.118, interval 0.009#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.76 GB write, 0.11 MB/s write, 0.75 GB read, 0.11 MB/s read, 9.0 seconds#012Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.5 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f10fc171f0#2 capacity: 304.00 MB usage: 84.12 MB table_size: 0 occupancy: 18446744073709551615 collections: 13 last_copies: 0 last_secs: 0.000442 secs_since: 0#012Block cache entry stats(count,size,portion): 
DataBlock(4804,80.52 MB,26.4856%) FilterBlock(129,1.40 MB,0.462015%) IndexBlock(129,2.20 MB,0.724998%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 06:01:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:01:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:54.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:01:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:01:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:55.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:01:55 np0005593234 nova_compute[227762]: 2026-01-23 11:01:55.934 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:56 np0005593234 nova_compute[227762]: 2026-01-23 11:01:56.640 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:01:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:56.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:57.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:01:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:01:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 06:01:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:01:58.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 06:01:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:01:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:01:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:01:59.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:00.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:00 np0005593234 nova_compute[227762]: 2026-01-23 11:02:00.936 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:01.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:01 np0005593234 nova_compute[227762]: 2026-01-23 11:02:01.642 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:02.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:03.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:04.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:02:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:05.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:02:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:05 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:05 np0005593234 podman[338239]: 2026-01-23 11:02:05.801602332 +0000 UTC m=+0.085204816 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Jan 23 06:02:05 np0005593234 nova_compute[227762]: 2026-01-23 11:02:05.937 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:06 np0005593234 nova_compute[227762]: 2026-01-23 11:02:06.645 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:02:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:02:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:06.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:02:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:07.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:02:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:02:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:08.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:02:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:02:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:09.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:02:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:10.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:10 np0005593234 nova_compute[227762]: 2026-01-23 11:02:10.975 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:11.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:11 np0005593234 nova_compute[227762]: 2026-01-23 11:02:11.647 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:12.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:13.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:13 np0005593234 nova_compute[227762]: 2026-01-23 11:02:13.332 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "c442e253-3331-4edd-8629-6321ddc21de6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:13 np0005593234 nova_compute[227762]: 2026-01-23 11:02:13.333 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:13 np0005593234 nova_compute[227762]: 2026-01-23 11:02:13.363 227766 DEBUG nova.compute.manager [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 06:02:13 np0005593234 nova_compute[227762]: 2026-01-23 11:02:13.489 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:13 np0005593234 nova_compute[227762]: 2026-01-23 11:02:13.490 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:13 np0005593234 nova_compute[227762]: 2026-01-23 11:02:13.498 227766 DEBUG nova.virt.hardware [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 06:02:13 np0005593234 nova_compute[227762]: 2026-01-23 11:02:13.498 227766 INFO nova.compute.claims [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 06:02:13 np0005593234 nova_compute[227762]: 2026-01-23 11:02:13.696 227766 DEBUG oslo_concurrency.processutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:02:13 np0005593234 podman[338265]: 2026-01-23 11:02:13.856159724 +0000 UTC m=+0.135244871 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 23 06:02:14 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:02:14 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1997505149' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.180 227766 DEBUG oslo_concurrency.processutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.188 227766 DEBUG nova.compute.provider_tree [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.212 227766 DEBUG nova.scheduler.client.report [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.246 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.756s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.246 227766 DEBUG nova.compute.manager [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.301 227766 DEBUG nova.compute.manager [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.302 227766 DEBUG nova.network.neutron [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.337 227766 INFO nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.361 227766 DEBUG nova.compute.manager [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.546 227766 DEBUG nova.compute.manager [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.548 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.549 227766 INFO nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Creating image(s)#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.589 227766 DEBUG nova.storage.rbd_utils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.628 227766 DEBUG nova.storage.rbd_utils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.668 227766 DEBUG nova.storage.rbd_utils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.674 227766 DEBUG oslo_concurrency.processutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.776 227766 DEBUG oslo_concurrency.processutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.778 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.779 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.780 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:14 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:14 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.824 227766 DEBUG nova.storage.rbd_utils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:02:14 np0005593234 nova_compute[227762]: 2026-01-23 11:02:14.829 227766 DEBUG oslo_concurrency.processutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 c442e253-3331-4edd-8629-6321ddc21de6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:02:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:14.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:15.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:15 np0005593234 nova_compute[227762]: 2026-01-23 11:02:15.167 227766 DEBUG oslo_concurrency.processutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 c442e253-3331-4edd-8629-6321ddc21de6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.337s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:02:15 np0005593234 nova_compute[227762]: 2026-01-23 11:02:15.241 227766 DEBUG nova.storage.rbd_utils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] resizing rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 06:02:15 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 23 06:02:15 np0005593234 nova_compute[227762]: 2026-01-23 11:02:15.589 227766 DEBUG nova.policy [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '420c366dc5dc45a48da4e0b18c93043f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c06f98b51aeb48de91d116fda54a161f', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 06:02:15 np0005593234 nova_compute[227762]: 2026-01-23 11:02:15.671 227766 DEBUG nova.objects.instance [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'migration_context' on Instance uuid c442e253-3331-4edd-8629-6321ddc21de6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:02:15 np0005593234 nova_compute[227762]: 2026-01-23 11:02:15.689 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 06:02:15 np0005593234 nova_compute[227762]: 2026-01-23 11:02:15.689 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Ensure instance console log exists: /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 06:02:15 np0005593234 nova_compute[227762]: 2026-01-23 11:02:15.690 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:15 np0005593234 nova_compute[227762]: 2026-01-23 11:02:15.690 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:15 np0005593234 nova_compute[227762]: 2026-01-23 11:02:15.691 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:15 np0005593234 nova_compute[227762]: 2026-01-23 11:02:15.977 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:16 np0005593234 nova_compute[227762]: 2026-01-23 11:02:16.650 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:16.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:17.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:17 np0005593234 nova_compute[227762]: 2026-01-23 11:02:17.692 227766 DEBUG nova.network.neutron [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Successfully created port: 7ca05cbe-776c-477b-a267-f19b2dcefdb6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 06:02:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:18.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:02:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:19.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:02:20 np0005593234 nova_compute[227762]: 2026-01-23 11:02:20.625 227766 DEBUG nova.network.neutron [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Successfully updated port: 7ca05cbe-776c-477b-a267-f19b2dcefdb6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 06:02:20 np0005593234 nova_compute[227762]: 2026-01-23 11:02:20.647 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-c442e253-3331-4edd-8629-6321ddc21de6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:02:20 np0005593234 nova_compute[227762]: 2026-01-23 11:02:20.647 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-c442e253-3331-4edd-8629-6321ddc21de6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:02:20 np0005593234 nova_compute[227762]: 2026-01-23 11:02:20.647 227766 DEBUG nova.network.neutron [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:02:20 np0005593234 nova_compute[227762]: 2026-01-23 11:02:20.761 227766 DEBUG nova.compute.manager [req-a87fb160-ac24-4cf3-8ea6-2e1901ef7653 req-3c3936bb-1fba-4f3a-aae3-cc8c8aaa44e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received event network-changed-7ca05cbe-776c-477b-a267-f19b2dcefdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:02:20 np0005593234 nova_compute[227762]: 2026-01-23 11:02:20.762 227766 DEBUG nova.compute.manager [req-a87fb160-ac24-4cf3-8ea6-2e1901ef7653 req-3c3936bb-1fba-4f3a-aae3-cc8c8aaa44e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Refreshing instance network info cache due to event network-changed-7ca05cbe-776c-477b-a267-f19b2dcefdb6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:02:20 np0005593234 nova_compute[227762]: 2026-01-23 11:02:20.762 227766 DEBUG oslo_concurrency.lockutils [req-a87fb160-ac24-4cf3-8ea6-2e1901ef7653 req-3c3936bb-1fba-4f3a-aae3-cc8c8aaa44e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c442e253-3331-4edd-8629-6321ddc21de6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:02:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:02:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:20.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:02:20 np0005593234 nova_compute[227762]: 2026-01-23 11:02:20.890 227766 DEBUG nova.network.neutron [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 06:02:20 np0005593234 nova_compute[227762]: 2026-01-23 11:02:20.979 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:02:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:21.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:02:21 np0005593234 nova_compute[227762]: 2026-01-23 11:02:21.652 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:21 np0005593234 nova_compute[227762]: 2026-01-23 11:02:21.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:21 np0005593234 nova_compute[227762]: 2026-01-23 11:02:21.772 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:21 np0005593234 nova_compute[227762]: 2026-01-23 11:02:21.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:21 np0005593234 nova_compute[227762]: 2026-01-23 11:02:21.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:21 np0005593234 nova_compute[227762]: 2026-01-23 11:02:21.774 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:02:21 np0005593234 nova_compute[227762]: 2026-01-23 11:02:21.774 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:02:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:02:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3072477552' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.225 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.413 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.415 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4082MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.415 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.416 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.519 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance c442e253-3331-4edd-8629-6321ddc21de6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.520 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.520 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.563 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.599 227766 DEBUG nova.network.neutron [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Updating instance_info_cache with network_info: [{"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.632 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-c442e253-3331-4edd-8629-6321ddc21de6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.632 227766 DEBUG nova.compute.manager [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Instance network_info: |[{"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.633 227766 DEBUG oslo_concurrency.lockutils [req-a87fb160-ac24-4cf3-8ea6-2e1901ef7653 req-3c3936bb-1fba-4f3a-aae3-cc8c8aaa44e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c442e253-3331-4edd-8629-6321ddc21de6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.633 227766 DEBUG nova.network.neutron [req-a87fb160-ac24-4cf3-8ea6-2e1901ef7653 req-3c3936bb-1fba-4f3a-aae3-cc8c8aaa44e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Refreshing network info cache for port 7ca05cbe-776c-477b-a267-f19b2dcefdb6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.637 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Start _get_guest_xml network_info=[{"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.642 227766 WARNING nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.647 227766 DEBUG nova.virt.libvirt.host [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.648 227766 DEBUG nova.virt.libvirt.host [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.653 227766 DEBUG nova.virt.libvirt.host [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.654 227766 DEBUG nova.virt.libvirt.host [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.655 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.656 227766 DEBUG nova.virt.hardware [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.656 227766 DEBUG nova.virt.hardware [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.656 227766 DEBUG nova.virt.hardware [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.657 227766 DEBUG nova.virt.hardware [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.657 227766 DEBUG nova.virt.hardware [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.657 227766 DEBUG nova.virt.hardware [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.658 227766 DEBUG nova.virt.hardware [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.658 227766 DEBUG nova.virt.hardware [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.658 227766 DEBUG nova.virt.hardware [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.659 227766 DEBUG nova.virt.hardware [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.659 227766 DEBUG nova.virt.hardware [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 06:02:22 np0005593234 nova_compute[227762]: 2026-01-23 11:02:22.663 227766 DEBUG oslo_concurrency.processutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:02:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:22.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:02:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2697929983' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.015 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.019 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.036 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.068 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.068 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:02:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3451485977' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:02:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:23.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.341 227766 DEBUG oslo_concurrency.processutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.677s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.371 227766 DEBUG nova.storage.rbd_utils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.375 227766 DEBUG oslo_concurrency.processutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:02:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:02:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2882860584' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.913 227766 DEBUG oslo_concurrency.processutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.537s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.914 227766 DEBUG nova.virt.libvirt.vif [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:02:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1829658395',display_name='tempest-TestNetworkAdvancedServerOps-server-1829658395',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1829658395',id=215,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIhJKs1r/JsFpu0BWPjavFqXIYAEvCluC0N9cRbmna2+YHH/vJ8/TdScoYsLAbGtghLOti713rszaER/EFM55mk2aJZ8CNxjq9lExotDDxKoBGUHWGshyO59EAlOfmR5Og==',key_name='tempest-TestNetworkAdvancedServerOps-632629835',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-znrqda5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:02:14Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=c442e253-3331-4edd-8629-6321ddc21de6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.915 227766 DEBUG nova.network.os_vif_util [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.916 227766 DEBUG nova.network.os_vif_util [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:50:12,bridge_name='br-int',has_traffic_filtering=True,id=7ca05cbe-776c-477b-a267-f19b2dcefdb6,network=Network(61d10787-dafd-4592-8a90-5156be0ee76e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca05cbe-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.917 227766 DEBUG nova.objects.instance [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_devices' on Instance uuid c442e253-3331-4edd-8629-6321ddc21de6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.944 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] End _get_guest_xml xml=<domain type="kvm">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  <uuid>c442e253-3331-4edd-8629-6321ddc21de6</uuid>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  <name>instance-000000d7</name>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1829658395</nova:name>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 11:02:22</nova:creationTime>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <nova:user uuid="420c366dc5dc45a48da4e0b18c93043f">tempest-TestNetworkAdvancedServerOps-1886747874-project-member</nova:user>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <nova:project uuid="c06f98b51aeb48de91d116fda54a161f">tempest-TestNetworkAdvancedServerOps-1886747874</nova:project>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <nova:port uuid="7ca05cbe-776c-477b-a267-f19b2dcefdb6">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <system>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <entry name="serial">c442e253-3331-4edd-8629-6321ddc21de6</entry>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <entry name="uuid">c442e253-3331-4edd-8629-6321ddc21de6</entry>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    </system>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  <os>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  </os>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  <features>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  </features>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  </clock>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  <devices>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/c442e253-3331-4edd-8629-6321ddc21de6_disk">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/c442e253-3331-4edd-8629-6321ddc21de6_disk.config">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:7b:50:12"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <target dev="tap7ca05cbe-77"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    </interface>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/console.log" append="off"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    </serial>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <video>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    </video>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    </rng>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 06:02:23 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 06:02:23 np0005593234 nova_compute[227762]:  </devices>
Jan 23 06:02:23 np0005593234 nova_compute[227762]: </domain>
Jan 23 06:02:23 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.946 227766 DEBUG nova.compute.manager [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Preparing to wait for external event network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.946 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "c442e253-3331-4edd-8629-6321ddc21de6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.947 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.947 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.947 227766 DEBUG nova.virt.libvirt.vif [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:02:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1829658395',display_name='tempest-TestNetworkAdvancedServerOps-server-1829658395',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1829658395',id=215,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIhJKs1r/JsFpu0BWPjavFqXIYAEvCluC0N9cRbmna2+YHH/vJ8/TdScoYsLAbGtghLOti713rszaER/EFM55mk2aJZ8CNxjq9lExotDDxKoBGUHWGshyO59EAlOfmR5Og==',key_name='tempest-TestNetworkAdvancedServerOps-632629835',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-znrqda5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:02:14Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=c442e253-3331-4edd-8629-6321ddc21de6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.948 227766 DEBUG nova.network.os_vif_util [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.948 227766 DEBUG nova.network.os_vif_util [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7b:50:12,bridge_name='br-int',has_traffic_filtering=True,id=7ca05cbe-776c-477b-a267-f19b2dcefdb6,network=Network(61d10787-dafd-4592-8a90-5156be0ee76e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca05cbe-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.949 227766 DEBUG os_vif [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:50:12,bridge_name='br-int',has_traffic_filtering=True,id=7ca05cbe-776c-477b-a267-f19b2dcefdb6,network=Network(61d10787-dafd-4592-8a90-5156be0ee76e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca05cbe-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.949 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.950 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.950 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.954 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.954 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ca05cbe-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.954 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7ca05cbe-77, col_values=(('external_ids', {'iface-id': '7ca05cbe-776c-477b-a267-f19b2dcefdb6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:50:12', 'vm-uuid': 'c442e253-3331-4edd-8629-6321ddc21de6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.956 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:23 np0005593234 NetworkManager[48942]: <info>  [1769166143.9569] manager: (tap7ca05cbe-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/455)
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.958 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.966 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:23 np0005593234 nova_compute[227762]: 2026-01-23 11:02:23.967 227766 INFO os_vif [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7b:50:12,bridge_name='br-int',has_traffic_filtering=True,id=7ca05cbe-776c-477b-a267-f19b2dcefdb6,network=Network(61d10787-dafd-4592-8a90-5156be0ee76e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca05cbe-77')#033[00m
Jan 23 06:02:24 np0005593234 nova_compute[227762]: 2026-01-23 11:02:24.023 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:02:24 np0005593234 nova_compute[227762]: 2026-01-23 11:02:24.023 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:02:24 np0005593234 nova_compute[227762]: 2026-01-23 11:02:24.024 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No VIF found with MAC fa:16:3e:7b:50:12, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 06:02:24 np0005593234 nova_compute[227762]: 2026-01-23 11:02:24.025 227766 INFO nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Using config drive#033[00m
Jan 23 06:02:24 np0005593234 nova_compute[227762]: 2026-01-23 11:02:24.059 227766 DEBUG nova.storage.rbd_utils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:02:24 np0005593234 nova_compute[227762]: 2026-01-23 11:02:24.068 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:24 np0005593234 nova_compute[227762]: 2026-01-23 11:02:24.069 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:02:24 np0005593234 nova_compute[227762]: 2026-01-23 11:02:24.069 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:02:24 np0005593234 nova_compute[227762]: 2026-01-23 11:02:24.099 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Jan 23 06:02:24 np0005593234 nova_compute[227762]: 2026-01-23 11:02:24.100 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:02:24 np0005593234 nova_compute[227762]: 2026-01-23 11:02:24.100 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:24 np0005593234 nova_compute[227762]: 2026-01-23 11:02:24.861 227766 INFO nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Creating config drive at /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/disk.config#033[00m
Jan 23 06:02:24 np0005593234 nova_compute[227762]: 2026-01-23 11:02:24.872 227766 DEBUG oslo_concurrency.processutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_dze_plj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:02:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:24.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.025 227766 DEBUG oslo_concurrency.processutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_dze_plj" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.061 227766 DEBUG nova.storage.rbd_utils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.065 227766 DEBUG oslo_concurrency.processutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/disk.config c442e253-3331-4edd-8629-6321ddc21de6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.107 227766 DEBUG nova.network.neutron [req-a87fb160-ac24-4cf3-8ea6-2e1901ef7653 req-3c3936bb-1fba-4f3a-aae3-cc8c8aaa44e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Updated VIF entry in instance network info cache for port 7ca05cbe-776c-477b-a267-f19b2dcefdb6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.108 227766 DEBUG nova.network.neutron [req-a87fb160-ac24-4cf3-8ea6-2e1901ef7653 req-3c3936bb-1fba-4f3a-aae3-cc8c8aaa44e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Updating instance_info_cache with network_info: [{"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:02:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:02:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:25.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.135 227766 DEBUG oslo_concurrency.lockutils [req-a87fb160-ac24-4cf3-8ea6-2e1901ef7653 req-3c3936bb-1fba-4f3a-aae3-cc8c8aaa44e9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c442e253-3331-4edd-8629-6321ddc21de6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.404 227766 DEBUG oslo_concurrency.processutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/disk.config c442e253-3331-4edd-8629-6321ddc21de6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.339s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.405 227766 INFO nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Deleting local config drive /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/disk.config because it was imported into RBD.#033[00m
Jan 23 06:02:25 np0005593234 kernel: tap7ca05cbe-77: entered promiscuous mode
Jan 23 06:02:25 np0005593234 NetworkManager[48942]: <info>  [1769166145.4522] manager: (tap7ca05cbe-77): new Tun device (/org/freedesktop/NetworkManager/Devices/456)
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.452 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:25 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:25Z|00956|binding|INFO|Claiming lport 7ca05cbe-776c-477b-a267-f19b2dcefdb6 for this chassis.
Jan 23 06:02:25 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:25Z|00957|binding|INFO|7ca05cbe-776c-477b-a267-f19b2dcefdb6: Claiming fa:16:3e:7b:50:12 10.100.0.10
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.457 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.467 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:50:12 10.100.0.10'], port_security=['fa:16:3e:7b:50:12 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c442e253-3331-4edd-8629-6321ddc21de6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61d10787-dafd-4592-8a90-5156be0ee76e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b3661b36-5769-4538-ae1c-8c4dac03c6a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0a736d2-23ab-41ff-b228-b06b4b7f67c9, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=7ca05cbe-776c-477b-a267-f19b2dcefdb6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.468 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 7ca05cbe-776c-477b-a267-f19b2dcefdb6 in datapath 61d10787-dafd-4592-8a90-5156be0ee76e bound to our chassis#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.469 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 61d10787-dafd-4592-8a90-5156be0ee76e#033[00m
Jan 23 06:02:25 np0005593234 systemd-udevd[338765]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.480 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8a42c872-3f7a-4c7b-88ca-c99e6d8d7455]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.481 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap61d10787-d1 in ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.483 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap61d10787-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.483 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2fed0aa9-c7e5-4577-a01d-eb22b04e309f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.483 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0bc0c302-3548-4c0b-865e-4425952ae995]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.492 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[92462aaa-7e57-4b6b-9ce6-7522dff3d41c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 systemd-machined[195626]: New machine qemu-107-instance-000000d7.
Jan 23 06:02:25 np0005593234 NetworkManager[48942]: <info>  [1769166145.4971] device (tap7ca05cbe-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:02:25 np0005593234 NetworkManager[48942]: <info>  [1769166145.4982] device (tap7ca05cbe-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:02:25 np0005593234 systemd[1]: Started Virtual Machine qemu-107-instance-000000d7.
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.517 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3d49ad5e-8a55-435e-8112-d5b785594a63]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.520 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:25 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:25Z|00958|binding|INFO|Setting lport 7ca05cbe-776c-477b-a267-f19b2dcefdb6 ovn-installed in OVS
Jan 23 06:02:25 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:25Z|00959|binding|INFO|Setting lport 7ca05cbe-776c-477b-a267-f19b2dcefdb6 up in Southbound
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.528 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.543 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[2b61bdca-5dd2-4c8c-a619-7bd11cf91571]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.548 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[65f75beb-1548-41d3-87e6-18e3c0f81898]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 NetworkManager[48942]: <info>  [1769166145.5494] manager: (tap61d10787-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/457)
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.577 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[dfbf1161-41e1-46d6-94a4-f7c5b7c552e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.579 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a32dbaa9-d9a0-4216-bdc1-9d859c282478]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 NetworkManager[48942]: <info>  [1769166145.6019] device (tap61d10787-d0): carrier: link connected
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.613 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[1ec5941e-541f-4d54-807c-4630999ee5b5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.634 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9bd21fd6-72d8-4af8-afb2-b88c94844610]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61d10787-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:2a:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 294], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1004056, 'reachable_time': 44133, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 338798, 'error': None, 'target': 'ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.649 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[066d6e0d-d45c-4f81-aa29-3c24bc63d8cf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee9:2af2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1004056, 'tstamp': 1004056}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 338799, 'error': None, 'target': 'ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.669 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0cc64f17-ca37-41c3-83bc-be1709888f24]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61d10787-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:2a:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 294], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1004056, 'reachable_time': 44133, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 338800, 'error': None, 'target': 'ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.708 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b05b5d70-5c1a-4770-9df2-32475a980031]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.787 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[93f6b5ec-e56d-4b72-b728-3e8548996ef6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.789 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61d10787-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.790 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.791 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61d10787-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:25 np0005593234 NetworkManager[48942]: <info>  [1769166145.7932] manager: (tap61d10787-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/458)
Jan 23 06:02:25 np0005593234 kernel: tap61d10787-d0: entered promiscuous mode
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.794 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.797 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap61d10787-d0, col_values=(('external_ids', {'iface-id': '011329dd-9cad-4775-b208-cac68cef5e57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:25 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:25Z|00960|binding|INFO|Releasing lport 011329dd-9cad-4775-b208-cac68cef5e57 from this chassis (sb_readonly=0)
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.798 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.818 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.818 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/61d10787-dafd-4592-8a90-5156be0ee76e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/61d10787-dafd-4592-8a90-5156be0ee76e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.820 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9fe95700-3e02-4389-a61b-46c9ef32ab5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.820 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-61d10787-dafd-4592-8a90-5156be0ee76e
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/61d10787-dafd-4592-8a90-5156be0ee76e.pid.haproxy
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 61d10787-dafd-4592-8a90-5156be0ee76e
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 06:02:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:25.821 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e', 'env', 'PROCESS_TAG=haproxy-61d10787-dafd-4592-8a90-5156be0ee76e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/61d10787-dafd-4592-8a90-5156be0ee76e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.938 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166145.937806, c442e253-3331-4edd-8629-6321ddc21de6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.939 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] VM Started (Lifecycle Event)#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.964 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.968 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166145.9379013, c442e253-3331-4edd-8629-6321ddc21de6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.968 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] VM Paused (Lifecycle Event)#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.982 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:25 np0005593234 nova_compute[227762]: 2026-01-23 11:02:25.998 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:02:26 np0005593234 nova_compute[227762]: 2026-01-23 11:02:26.001 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:02:26 np0005593234 nova_compute[227762]: 2026-01-23 11:02:26.039 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:02:26 np0005593234 podman[338874]: 2026-01-23 11:02:26.183099487 +0000 UTC m=+0.057031275 container create 78378bca0990c299596796d2ceed01f6cd1518fe5af9d61b13eb9a7d386d5fd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:02:26 np0005593234 systemd[1]: Started libpod-conmon-78378bca0990c299596796d2ceed01f6cd1518fe5af9d61b13eb9a7d386d5fd7.scope.
Jan 23 06:02:26 np0005593234 systemd[1]: Started libcrun container.
Jan 23 06:02:26 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/634a191669809a9f99663d7737cfefb8a7e4bd870b8321539a1a720ec9f66b7c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:02:26 np0005593234 podman[338874]: 2026-01-23 11:02:26.150597441 +0000 UTC m=+0.024529249 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:02:26 np0005593234 podman[338874]: 2026-01-23 11:02:26.255831292 +0000 UTC m=+0.129763100 container init 78378bca0990c299596796d2ceed01f6cd1518fe5af9d61b13eb9a7d386d5fd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 06:02:26 np0005593234 podman[338874]: 2026-01-23 11:02:26.260860539 +0000 UTC m=+0.134792337 container start 78378bca0990c299596796d2ceed01f6cd1518fe5af9d61b13eb9a7d386d5fd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 06:02:26 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[338889]: [NOTICE]   (338893) : New worker (338896) forked
Jan 23 06:02:26 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[338889]: [NOTICE]   (338893) : Loading success.
Jan 23 06:02:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:26.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:27.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.252 227766 DEBUG nova.compute.manager [req-619a8ba6-4c27-49f6-bef5-54d34c5cf1d2 req-88cc22d8-a78a-4df7-a23a-f33bdef455df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received event network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.253 227766 DEBUG oslo_concurrency.lockutils [req-619a8ba6-4c27-49f6-bef5-54d34c5cf1d2 req-88cc22d8-a78a-4df7-a23a-f33bdef455df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c442e253-3331-4edd-8629-6321ddc21de6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.253 227766 DEBUG oslo_concurrency.lockutils [req-619a8ba6-4c27-49f6-bef5-54d34c5cf1d2 req-88cc22d8-a78a-4df7-a23a-f33bdef455df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.253 227766 DEBUG oslo_concurrency.lockutils [req-619a8ba6-4c27-49f6-bef5-54d34c5cf1d2 req-88cc22d8-a78a-4df7-a23a-f33bdef455df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.254 227766 DEBUG nova.compute.manager [req-619a8ba6-4c27-49f6-bef5-54d34c5cf1d2 req-88cc22d8-a78a-4df7-a23a-f33bdef455df 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Processing event network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.254 227766 DEBUG nova.compute.manager [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.260 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166147.2601864, c442e253-3331-4edd-8629-6321ddc21de6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.261 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.263 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.268 227766 INFO nova.virt.libvirt.driver [-] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Instance spawned successfully.#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.268 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.284 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.290 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.295 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.295 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.296 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.296 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.297 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.297 227766 DEBUG nova.virt.libvirt.driver [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.322 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.360 227766 INFO nova.compute.manager [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Took 12.81 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.361 227766 DEBUG nova.compute.manager [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:27.382 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=95, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=94) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.384 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:27.385 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.444 227766 INFO nova.compute.manager [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Took 14.01 seconds to build instance.#033[00m
Jan 23 06:02:27 np0005593234 nova_compute[227762]: 2026-01-23 11:02:27.461 227766 DEBUG oslo_concurrency.lockutils [None req-64b4142c-815e-4239-b00d-b294286a92c9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:28 np0005593234 ceph-mgr[77448]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 06:02:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:28.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:28 np0005593234 nova_compute[227762]: 2026-01-23 11:02:28.958 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:02:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:29.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:02:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:02:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.5 total, 600.0 interval#012Cumulative writes: 77K writes, 307K keys, 77K commit groups, 1.0 writes per commit group, ingest: 0.31 GB, 0.04 MB/s#012Cumulative WAL: 77K writes, 28K syncs, 2.69 writes per sync, written: 0.31 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2146 writes, 8860 keys, 2146 commit groups, 1.0 writes per commit group, ingest: 7.11 MB, 0.01 MB/s#012Interval WAL: 2146 writes, 874 syncs, 2.46 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:02:29 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:29.387 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '95'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:29 np0005593234 nova_compute[227762]: 2026-01-23 11:02:29.393 227766 DEBUG nova.compute.manager [req-fe979658-7594-4024-8eb2-e0df99b055be req-5cd54533-3a6d-46bc-8967-f6f3b7c0b8d6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received event network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:02:29 np0005593234 nova_compute[227762]: 2026-01-23 11:02:29.393 227766 DEBUG oslo_concurrency.lockutils [req-fe979658-7594-4024-8eb2-e0df99b055be req-5cd54533-3a6d-46bc-8967-f6f3b7c0b8d6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c442e253-3331-4edd-8629-6321ddc21de6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:29 np0005593234 nova_compute[227762]: 2026-01-23 11:02:29.394 227766 DEBUG oslo_concurrency.lockutils [req-fe979658-7594-4024-8eb2-e0df99b055be req-5cd54533-3a6d-46bc-8967-f6f3b7c0b8d6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:29 np0005593234 nova_compute[227762]: 2026-01-23 11:02:29.394 227766 DEBUG oslo_concurrency.lockutils [req-fe979658-7594-4024-8eb2-e0df99b055be req-5cd54533-3a6d-46bc-8967-f6f3b7c0b8d6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:29 np0005593234 nova_compute[227762]: 2026-01-23 11:02:29.394 227766 DEBUG nova.compute.manager [req-fe979658-7594-4024-8eb2-e0df99b055be req-5cd54533-3a6d-46bc-8967-f6f3b7c0b8d6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] No waiting events found dispatching network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:02:29 np0005593234 nova_compute[227762]: 2026-01-23 11:02:29.394 227766 WARNING nova.compute.manager [req-fe979658-7594-4024-8eb2-e0df99b055be req-5cd54533-3a6d-46bc-8967-f6f3b7c0b8d6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received unexpected event network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 for instance with vm_state active and task_state None.#033[00m
Jan 23 06:02:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:30.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:30 np0005593234 nova_compute[227762]: 2026-01-23 11:02:30.983 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:31.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:32 np0005593234 nova_compute[227762]: 2026-01-23 11:02:32.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:32 np0005593234 nova_compute[227762]: 2026-01-23 11:02:32.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:02:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:32.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:02:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:33.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:02:33 np0005593234 nova_compute[227762]: 2026-01-23 11:02:33.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:33 np0005593234 nova_compute[227762]: 2026-01-23 11:02:33.961 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:34 np0005593234 nova_compute[227762]: 2026-01-23 11:02:34.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:34 np0005593234 NetworkManager[48942]: <info>  [1769166154.8308] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/459)
Jan 23 06:02:34 np0005593234 NetworkManager[48942]: <info>  [1769166154.8318] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/460)
Jan 23 06:02:34 np0005593234 nova_compute[227762]: 2026-01-23 11:02:34.830 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:34 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:34Z|00961|binding|INFO|Releasing lport 011329dd-9cad-4775-b208-cac68cef5e57 from this chassis (sb_readonly=0)
Jan 23 06:02:34 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:34Z|00962|binding|INFO|Releasing lport 011329dd-9cad-4775-b208-cac68cef5e57 from this chassis (sb_readonly=0)
Jan 23 06:02:34 np0005593234 nova_compute[227762]: 2026-01-23 11:02:34.878 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:34.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:34 np0005593234 nova_compute[227762]: 2026-01-23 11:02:34.885 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:35.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:35 np0005593234 nova_compute[227762]: 2026-01-23 11:02:35.985 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:36 np0005593234 nova_compute[227762]: 2026-01-23 11:02:36.018 227766 DEBUG nova.compute.manager [req-20923662-3f1c-4fa9-812e-4c73aefc79bf req-2e67deb7-d33e-4524-b5e8-5fc8831b6e63 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received event network-changed-7ca05cbe-776c-477b-a267-f19b2dcefdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:02:36 np0005593234 nova_compute[227762]: 2026-01-23 11:02:36.018 227766 DEBUG nova.compute.manager [req-20923662-3f1c-4fa9-812e-4c73aefc79bf req-2e67deb7-d33e-4524-b5e8-5fc8831b6e63 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Refreshing instance network info cache due to event network-changed-7ca05cbe-776c-477b-a267-f19b2dcefdb6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:02:36 np0005593234 nova_compute[227762]: 2026-01-23 11:02:36.018 227766 DEBUG oslo_concurrency.lockutils [req-20923662-3f1c-4fa9-812e-4c73aefc79bf req-2e67deb7-d33e-4524-b5e8-5fc8831b6e63 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c442e253-3331-4edd-8629-6321ddc21de6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:02:36 np0005593234 nova_compute[227762]: 2026-01-23 11:02:36.019 227766 DEBUG oslo_concurrency.lockutils [req-20923662-3f1c-4fa9-812e-4c73aefc79bf req-2e67deb7-d33e-4524-b5e8-5fc8831b6e63 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c442e253-3331-4edd-8629-6321ddc21de6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:02:36 np0005593234 nova_compute[227762]: 2026-01-23 11:02:36.019 227766 DEBUG nova.network.neutron [req-20923662-3f1c-4fa9-812e-4c73aefc79bf req-2e67deb7-d33e-4524-b5e8-5fc8831b6e63 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Refreshing network info cache for port 7ca05cbe-776c-477b-a267-f19b2dcefdb6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:02:36 np0005593234 podman[338961]: 2026-01-23 11:02:36.774211414 +0000 UTC m=+0.065034126 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Jan 23 06:02:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:36.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:37.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:38 np0005593234 nova_compute[227762]: 2026-01-23 11:02:38.803 227766 DEBUG nova.network.neutron [req-20923662-3f1c-4fa9-812e-4c73aefc79bf req-2e67deb7-d33e-4524-b5e8-5fc8831b6e63 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Updated VIF entry in instance network info cache for port 7ca05cbe-776c-477b-a267-f19b2dcefdb6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:02:38 np0005593234 nova_compute[227762]: 2026-01-23 11:02:38.803 227766 DEBUG nova.network.neutron [req-20923662-3f1c-4fa9-812e-4c73aefc79bf req-2e67deb7-d33e-4524-b5e8-5fc8831b6e63 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Updating instance_info_cache with network_info: [{"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:02:38 np0005593234 nova_compute[227762]: 2026-01-23 11:02:38.831 227766 DEBUG oslo_concurrency.lockutils [req-20923662-3f1c-4fa9-812e-4c73aefc79bf req-2e67deb7-d33e-4524-b5e8-5fc8831b6e63 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c442e253-3331-4edd-8629-6321ddc21de6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:02:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:38.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:38 np0005593234 nova_compute[227762]: 2026-01-23 11:02:38.964 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:39.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:39 np0005593234 nova_compute[227762]: 2026-01-23 11:02:39.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:39 np0005593234 nova_compute[227762]: 2026-01-23 11:02:39.766 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:39 np0005593234 nova_compute[227762]: 2026-01-23 11:02:39.766 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:40 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:40Z|00125|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7b:50:12 10.100.0.10
Jan 23 06:02:40 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:40Z|00126|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:50:12 10.100.0.10
Jan 23 06:02:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:40.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:40 np0005593234 nova_compute[227762]: 2026-01-23 11:02:40.989 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:41.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:42.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:42.911 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:42.912 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:42.913 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:43.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:43 np0005593234 nova_compute[227762]: 2026-01-23 11:02:43.968 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:02:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3551904520' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:02:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:02:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3551904520' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:02:44 np0005593234 podman[338986]: 2026-01-23 11:02:44.846489834 +0000 UTC m=+0.137336457 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 06:02:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:44.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:45.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:45 np0005593234 nova_compute[227762]: 2026-01-23 11:02:45.969 227766 INFO nova.compute.manager [None req-842240e3-e6bb-4b43-a65f-d3d9c407d456 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Get console output#033[00m
Jan 23 06:02:45 np0005593234 nova_compute[227762]: 2026-01-23 11:02:45.978 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 06:02:46 np0005593234 nova_compute[227762]: 2026-01-23 11:02:46.026 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:46.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:47.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:47 np0005593234 nova_compute[227762]: 2026-01-23 11:02:47.494 227766 INFO nova.compute.manager [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Rebuilding instance#033[00m
Jan 23 06:02:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:47 np0005593234 nova_compute[227762]: 2026-01-23 11:02:47.938 227766 DEBUG nova.objects.instance [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'trusted_certs' on Instance uuid c442e253-3331-4edd-8629-6321ddc21de6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:02:47 np0005593234 nova_compute[227762]: 2026-01-23 11:02:47.971 227766 DEBUG nova.compute.manager [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:02:48 np0005593234 nova_compute[227762]: 2026-01-23 11:02:48.038 227766 DEBUG nova.objects.instance [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_requests' on Instance uuid c442e253-3331-4edd-8629-6321ddc21de6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:02:48 np0005593234 nova_compute[227762]: 2026-01-23 11:02:48.053 227766 DEBUG nova.objects.instance [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_devices' on Instance uuid c442e253-3331-4edd-8629-6321ddc21de6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:02:48 np0005593234 nova_compute[227762]: 2026-01-23 11:02:48.068 227766 DEBUG nova.objects.instance [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'resources' on Instance uuid c442e253-3331-4edd-8629-6321ddc21de6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:02:48 np0005593234 nova_compute[227762]: 2026-01-23 11:02:48.083 227766 DEBUG nova.objects.instance [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'migration_context' on Instance uuid c442e253-3331-4edd-8629-6321ddc21de6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:02:48 np0005593234 nova_compute[227762]: 2026-01-23 11:02:48.101 227766 DEBUG nova.objects.instance [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 06:02:48 np0005593234 nova_compute[227762]: 2026-01-23 11:02:48.105 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 06:02:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:48.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:48 np0005593234 nova_compute[227762]: 2026-01-23 11:02:48.970 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:02:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:49.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:02:50 np0005593234 kernel: tap7ca05cbe-77 (unregistering): left promiscuous mode
Jan 23 06:02:50 np0005593234 NetworkManager[48942]: <info>  [1769166170.4320] device (tap7ca05cbe-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:02:50 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:50Z|00963|binding|INFO|Releasing lport 7ca05cbe-776c-477b-a267-f19b2dcefdb6 from this chassis (sb_readonly=0)
Jan 23 06:02:50 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:50Z|00964|binding|INFO|Setting lport 7ca05cbe-776c-477b-a267-f19b2dcefdb6 down in Southbound
Jan 23 06:02:50 np0005593234 nova_compute[227762]: 2026-01-23 11:02:50.441 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:50 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:50Z|00965|binding|INFO|Removing iface tap7ca05cbe-77 ovn-installed in OVS
Jan 23 06:02:50 np0005593234 nova_compute[227762]: 2026-01-23 11:02:50.444 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:50.451 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:50:12 10.100.0.10'], port_security=['fa:16:3e:7b:50:12 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c442e253-3331-4edd-8629-6321ddc21de6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61d10787-dafd-4592-8a90-5156be0ee76e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b3661b36-5769-4538-ae1c-8c4dac03c6a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.200'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0a736d2-23ab-41ff-b228-b06b4b7f67c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=7ca05cbe-776c-477b-a267-f19b2dcefdb6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:02:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:50.453 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 7ca05cbe-776c-477b-a267-f19b2dcefdb6 in datapath 61d10787-dafd-4592-8a90-5156be0ee76e unbound from our chassis#033[00m
Jan 23 06:02:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:50.454 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 61d10787-dafd-4592-8a90-5156be0ee76e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:02:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:50.456 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b79005e7-29a5-4b0c-b93f-493a90286347]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:50.457 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e namespace which is not needed anymore#033[00m
Jan 23 06:02:50 np0005593234 nova_compute[227762]: 2026-01-23 11:02:50.466 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:50 np0005593234 systemd[1]: machine-qemu\x2d107\x2dinstance\x2d000000d7.scope: Deactivated successfully.
Jan 23 06:02:50 np0005593234 systemd[1]: machine-qemu\x2d107\x2dinstance\x2d000000d7.scope: Consumed 13.813s CPU time.
Jan 23 06:02:50 np0005593234 systemd-machined[195626]: Machine qemu-107-instance-000000d7 terminated.
Jan 23 06:02:50 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[338889]: [NOTICE]   (338893) : haproxy version is 2.8.14-c23fe91
Jan 23 06:02:50 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[338889]: [NOTICE]   (338893) : path to executable is /usr/sbin/haproxy
Jan 23 06:02:50 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[338889]: [WARNING]  (338893) : Exiting Master process...
Jan 23 06:02:50 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[338889]: [WARNING]  (338893) : Exiting Master process...
Jan 23 06:02:50 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[338889]: [ALERT]    (338893) : Current worker (338896) exited with code 143 (Terminated)
Jan 23 06:02:50 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[338889]: [WARNING]  (338893) : All workers exited. Exiting... (0)
Jan 23 06:02:50 np0005593234 systemd[1]: libpod-78378bca0990c299596796d2ceed01f6cd1518fe5af9d61b13eb9a7d386d5fd7.scope: Deactivated successfully.
Jan 23 06:02:50 np0005593234 podman[339041]: 2026-01-23 11:02:50.585623844 +0000 UTC m=+0.038699172 container died 78378bca0990c299596796d2ceed01f6cd1518fe5af9d61b13eb9a7d386d5fd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 06:02:50 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-78378bca0990c299596796d2ceed01f6cd1518fe5af9d61b13eb9a7d386d5fd7-userdata-shm.mount: Deactivated successfully.
Jan 23 06:02:50 np0005593234 systemd[1]: var-lib-containers-storage-overlay-634a191669809a9f99663d7737cfefb8a7e4bd870b8321539a1a720ec9f66b7c-merged.mount: Deactivated successfully.
Jan 23 06:02:50 np0005593234 podman[339041]: 2026-01-23 11:02:50.616538281 +0000 UTC m=+0.069613619 container cleanup 78378bca0990c299596796d2ceed01f6cd1518fe5af9d61b13eb9a7d386d5fd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 06:02:50 np0005593234 systemd[1]: libpod-conmon-78378bca0990c299596796d2ceed01f6cd1518fe5af9d61b13eb9a7d386d5fd7.scope: Deactivated successfully.
Jan 23 06:02:50 np0005593234 podman[339070]: 2026-01-23 11:02:50.674842975 +0000 UTC m=+0.040483248 container remove 78378bca0990c299596796d2ceed01f6cd1518fe5af9d61b13eb9a7d386d5fd7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 06:02:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:50.680 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cb2d5e95-bccc-4ec4-9579-e0905ae5da60]: (4, ('Fri Jan 23 11:02:50 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e (78378bca0990c299596796d2ceed01f6cd1518fe5af9d61b13eb9a7d386d5fd7)\n78378bca0990c299596796d2ceed01f6cd1518fe5af9d61b13eb9a7d386d5fd7\nFri Jan 23 11:02:50 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e (78378bca0990c299596796d2ceed01f6cd1518fe5af9d61b13eb9a7d386d5fd7)\n78378bca0990c299596796d2ceed01f6cd1518fe5af9d61b13eb9a7d386d5fd7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:50.682 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[beddf2a1-a9a8-40a7-bec3-76b82734389b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:50.683 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61d10787-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:50 np0005593234 nova_compute[227762]: 2026-01-23 11:02:50.685 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:50 np0005593234 kernel: tap61d10787-d0: left promiscuous mode
Jan 23 06:02:50 np0005593234 nova_compute[227762]: 2026-01-23 11:02:50.700 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:50.705 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e08f3712-33ef-4c5b-85e1-239afed7c830]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:50.723 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[132c755c-18cf-42bb-9488-33a1202cb58f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:50.725 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[75854f8f-61b9-4598-8235-c3829f0bbac8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:50.740 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d7b502e6-0515-4f14-a622-3740e84d48a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1004050, 'reachable_time': 43860, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 339098, 'error': None, 'target': 'ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:50.743 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:02:50 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:50.743 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[2835c9a8-f2ce-47e3-ae12-66d07e392826]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:50 np0005593234 systemd[1]: run-netns-ovnmeta\x2d61d10787\x2ddafd\x2d4592\x2d8a90\x2d5156be0ee76e.mount: Deactivated successfully.
Jan 23 06:02:50 np0005593234 nova_compute[227762]: 2026-01-23 11:02:50.793 227766 DEBUG nova.compute.manager [req-075c0107-b538-41ad-a910-f40956350e4e req-ae644335-3f4b-42fd-a09c-a21f6fb3b000 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received event network-vif-unplugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:02:50 np0005593234 nova_compute[227762]: 2026-01-23 11:02:50.793 227766 DEBUG oslo_concurrency.lockutils [req-075c0107-b538-41ad-a910-f40956350e4e req-ae644335-3f4b-42fd-a09c-a21f6fb3b000 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c442e253-3331-4edd-8629-6321ddc21de6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:50 np0005593234 nova_compute[227762]: 2026-01-23 11:02:50.794 227766 DEBUG oslo_concurrency.lockutils [req-075c0107-b538-41ad-a910-f40956350e4e req-ae644335-3f4b-42fd-a09c-a21f6fb3b000 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:50 np0005593234 nova_compute[227762]: 2026-01-23 11:02:50.794 227766 DEBUG oslo_concurrency.lockutils [req-075c0107-b538-41ad-a910-f40956350e4e req-ae644335-3f4b-42fd-a09c-a21f6fb3b000 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:50 np0005593234 nova_compute[227762]: 2026-01-23 11:02:50.794 227766 DEBUG nova.compute.manager [req-075c0107-b538-41ad-a910-f40956350e4e req-ae644335-3f4b-42fd-a09c-a21f6fb3b000 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] No waiting events found dispatching network-vif-unplugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:02:50 np0005593234 nova_compute[227762]: 2026-01-23 11:02:50.794 227766 WARNING nova.compute.manager [req-075c0107-b538-41ad-a910-f40956350e4e req-ae644335-3f4b-42fd-a09c-a21f6fb3b000 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received unexpected event network-vif-unplugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 for instance with vm_state active and task_state rebuilding.#033[00m
Jan 23 06:02:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:50.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.026 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.125 227766 INFO nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.129 227766 INFO nova.virt.libvirt.driver [-] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Instance destroyed successfully.#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.133 227766 INFO nova.virt.libvirt.driver [-] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Instance destroyed successfully.#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.134 227766 DEBUG nova.virt.libvirt.vif [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:02:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1829658395',display_name='tempest-TestNetworkAdvancedServerOps-server-1829658395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1829658395',id=215,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIhJKs1r/JsFpu0BWPjavFqXIYAEvCluC0N9cRbmna2+YHH/vJ8/TdScoYsLAbGtghLOti713rszaER/EFM55mk2aJZ8CNxjq9lExotDDxKoBGUHWGshyO59EAlOfmR5Og==',key_name='tempest-TestNetworkAdvancedServerOps-632629835',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:02:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-znrqda5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:02:46Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=c442e253-3331-4edd-8629-6321ddc21de6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.134 227766 DEBUG nova.network.os_vif_util [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.136 227766 DEBUG nova.network.os_vif_util [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:50:12,bridge_name='br-int',has_traffic_filtering=True,id=7ca05cbe-776c-477b-a267-f19b2dcefdb6,network=Network(61d10787-dafd-4592-8a90-5156be0ee76e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca05cbe-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.136 227766 DEBUG os_vif [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:50:12,bridge_name='br-int',has_traffic_filtering=True,id=7ca05cbe-776c-477b-a267-f19b2dcefdb6,network=Network(61d10787-dafd-4592-8a90-5156be0ee76e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca05cbe-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.138 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.138 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ca05cbe-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.139 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.140 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.143 227766 INFO os_vif [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:50:12,bridge_name='br-int',has_traffic_filtering=True,id=7ca05cbe-776c-477b-a267-f19b2dcefdb6,network=Network(61d10787-dafd-4592-8a90-5156be0ee76e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca05cbe-77')#033[00m
Jan 23 06:02:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:02:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:51.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.587 227766 INFO nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Deleting instance files /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6_del#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.589 227766 INFO nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Deletion of /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6_del complete#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.816 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.816 227766 INFO nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Creating image(s)#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.845 227766 DEBUG nova.storage.rbd_utils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.877 227766 DEBUG nova.storage.rbd_utils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.905 227766 DEBUG nova.storage.rbd_utils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:02:51 np0005593234 nova_compute[227762]: 2026-01-23 11:02:51.910 227766 DEBUG oslo_concurrency.processutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.004 227766 DEBUG oslo_concurrency.processutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.006 227766 DEBUG oslo_concurrency.lockutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.007 227766 DEBUG oslo_concurrency.lockutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.008 227766 DEBUG oslo_concurrency.lockutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "8edc4c18d7d1964a485fb1b305c460bdc5a45b20" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.044 227766 DEBUG nova.storage.rbd_utils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.050 227766 DEBUG oslo_concurrency.processutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 c442e253-3331-4edd-8629-6321ddc21de6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.377 227766 DEBUG oslo_concurrency.processutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/8edc4c18d7d1964a485fb1b305c460bdc5a45b20 c442e253-3331-4edd-8629-6321ddc21de6_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.326s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.451 227766 DEBUG nova.storage.rbd_utils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] resizing rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.582 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.583 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Ensure instance console log exists: /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.584 227766 DEBUG oslo_concurrency.lockutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.585 227766 DEBUG oslo_concurrency.lockutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.585 227766 DEBUG oslo_concurrency.lockutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.587 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Start _get_guest_xml network_info=[{"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.592 227766 WARNING nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.598 227766 DEBUG nova.virt.libvirt.host [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.599 227766 DEBUG nova.virt.libvirt.host [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.602 227766 DEBUG nova.virt.libvirt.host [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.602 227766 DEBUG nova.virt.libvirt.host [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.603 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.604 227766 DEBUG nova.virt.hardware [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:31Z,direct_url=<?>,disk_format='qcow2',id=ae1f9e37-418c-462f-81d1-3599a6d89de9,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:34Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.604 227766 DEBUG nova.virt.hardware [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.604 227766 DEBUG nova.virt.hardware [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.604 227766 DEBUG nova.virt.hardware [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.604 227766 DEBUG nova.virt.hardware [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.604 227766 DEBUG nova.virt.hardware [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.605 227766 DEBUG nova.virt.hardware [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.605 227766 DEBUG nova.virt.hardware [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.605 227766 DEBUG nova.virt.hardware [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.605 227766 DEBUG nova.virt.hardware [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.605 227766 DEBUG nova.virt.hardware [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.606 227766 DEBUG nova.objects.instance [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'vcpu_model' on Instance uuid c442e253-3331-4edd-8629-6321ddc21de6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.637 227766 DEBUG oslo_concurrency.processutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:02:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:02:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:52.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.922 227766 DEBUG nova.compute.manager [req-475b7116-b5e2-4875-83e7-1172c8af1e29 req-e56cff5f-51a4-43c3-9fc5-708d618614f7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received event network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.923 227766 DEBUG oslo_concurrency.lockutils [req-475b7116-b5e2-4875-83e7-1172c8af1e29 req-e56cff5f-51a4-43c3-9fc5-708d618614f7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c442e253-3331-4edd-8629-6321ddc21de6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.924 227766 DEBUG oslo_concurrency.lockutils [req-475b7116-b5e2-4875-83e7-1172c8af1e29 req-e56cff5f-51a4-43c3-9fc5-708d618614f7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.924 227766 DEBUG oslo_concurrency.lockutils [req-475b7116-b5e2-4875-83e7-1172c8af1e29 req-e56cff5f-51a4-43c3-9fc5-708d618614f7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.924 227766 DEBUG nova.compute.manager [req-475b7116-b5e2-4875-83e7-1172c8af1e29 req-e56cff5f-51a4-43c3-9fc5-708d618614f7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] No waiting events found dispatching network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 06:02:52 np0005593234 nova_compute[227762]: 2026-01-23 11:02:52.924 227766 WARNING nova.compute.manager [req-475b7116-b5e2-4875-83e7-1172c8af1e29 req-e56cff5f-51a4-43c3-9fc5-708d618614f7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received unexpected event network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 for instance with vm_state active and task_state rebuild_spawning.
Jan 23 06:02:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:02:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4031343623' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.140 227766 DEBUG oslo_concurrency.processutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.503s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.169 227766 DEBUG nova.storage.rbd_utils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 06:02:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:53.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.175 227766 DEBUG oslo_concurrency.processutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:02:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:02:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1752577057' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.674 227766 DEBUG oslo_concurrency.processutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.678 227766 DEBUG nova.virt.libvirt.vif [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T11:02:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1829658395',display_name='tempest-TestNetworkAdvancedServerOps-server-1829658395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1829658395',id=215,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIhJKs1r/JsFpu0BWPjavFqXIYAEvCluC0N9cRbmna2+YHH/vJ8/TdScoYsLAbGtghLOti713rszaER/EFM55mk2aJZ8CNxjq9lExotDDxKoBGUHWGshyO59EAlOfmR5Og==',key_name='tempest-TestNetworkAdvancedServerOps-632629835',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:02:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-znrqda5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:02:51Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=c442e253-3331-4edd-8629-6321ddc21de6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.679 227766 DEBUG nova.network.os_vif_util [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.680 227766 DEBUG nova.network.os_vif_util [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:50:12,bridge_name='br-int',has_traffic_filtering=True,id=7ca05cbe-776c-477b-a267-f19b2dcefdb6,network=Network(61d10787-dafd-4592-8a90-5156be0ee76e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca05cbe-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.686 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] End _get_guest_xml xml=<domain type="kvm">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  <uuid>c442e253-3331-4edd-8629-6321ddc21de6</uuid>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  <name>instance-000000d7</name>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1829658395</nova:name>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 11:02:52</nova:creationTime>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <nova:user uuid="420c366dc5dc45a48da4e0b18c93043f">tempest-TestNetworkAdvancedServerOps-1886747874-project-member</nova:user>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <nova:project uuid="c06f98b51aeb48de91d116fda54a161f">tempest-TestNetworkAdvancedServerOps-1886747874</nova:project>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="ae1f9e37-418c-462f-81d1-3599a6d89de9"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <nova:port uuid="7ca05cbe-776c-477b-a267-f19b2dcefdb6">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <system>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <entry name="serial">c442e253-3331-4edd-8629-6321ddc21de6</entry>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <entry name="uuid">c442e253-3331-4edd-8629-6321ddc21de6</entry>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    </system>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  <os>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  </os>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  <features>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  </features>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  </clock>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  <devices>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/c442e253-3331-4edd-8629-6321ddc21de6_disk">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/c442e253-3331-4edd-8629-6321ddc21de6_disk.config">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:7b:50:12"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <target dev="tap7ca05cbe-77"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    </interface>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/console.log" append="off"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    </serial>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <video>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    </video>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    </rng>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 06:02:53 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 06:02:53 np0005593234 nova_compute[227762]:  </devices>
Jan 23 06:02:53 np0005593234 nova_compute[227762]: </domain>
Jan 23 06:02:53 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.688 227766 DEBUG nova.virt.libvirt.vif [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T11:02:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1829658395',display_name='tempest-TestNetworkAdvancedServerOps-server-1829658395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1829658395',id=215,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIhJKs1r/JsFpu0BWPjavFqXIYAEvCluC0N9cRbmna2+YHH/vJ8/TdScoYsLAbGtghLOti713rszaER/EFM55mk2aJZ8CNxjq9lExotDDxKoBGUHWGshyO59EAlOfmR5Og==',key_name='tempest-TestNetworkAdvancedServerOps-632629835',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:02:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-znrqda5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:02:51Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=c442e253-3331-4edd-8629-6321ddc21de6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.689 227766 DEBUG nova.network.os_vif_util [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.690 227766 DEBUG nova.network.os_vif_util [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:50:12,bridge_name='br-int',has_traffic_filtering=True,id=7ca05cbe-776c-477b-a267-f19b2dcefdb6,network=Network(61d10787-dafd-4592-8a90-5156be0ee76e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca05cbe-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.691 227766 DEBUG os_vif [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:50:12,bridge_name='br-int',has_traffic_filtering=True,id=7ca05cbe-776c-477b-a267-f19b2dcefdb6,network=Network(61d10787-dafd-4592-8a90-5156be0ee76e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca05cbe-77') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.692 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.693 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.694 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.697 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.698 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ca05cbe-77, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.699 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7ca05cbe-77, col_values=(('external_ids', {'iface-id': '7ca05cbe-776c-477b-a267-f19b2dcefdb6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7b:50:12', 'vm-uuid': 'c442e253-3331-4edd-8629-6321ddc21de6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.701 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:53 np0005593234 NetworkManager[48942]: <info>  [1769166173.7021] manager: (tap7ca05cbe-77): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/461)
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.704 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.704 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.705 227766 INFO os_vif [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:50:12,bridge_name='br-int',has_traffic_filtering=True,id=7ca05cbe-776c-477b-a267-f19b2dcefdb6,network=Network(61d10787-dafd-4592-8a90-5156be0ee76e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca05cbe-77')#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.770 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.771 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.771 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No VIF found with MAC fa:16:3e:7b:50:12, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.771 227766 INFO nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Using config drive#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.797 227766 DEBUG nova.storage.rbd_utils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.818 227766 DEBUG nova.objects.instance [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'ec2_ids' on Instance uuid c442e253-3331-4edd-8629-6321ddc21de6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:02:53 np0005593234 nova_compute[227762]: 2026-01-23 11:02:53.866 227766 DEBUG nova.objects.instance [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'keypairs' on Instance uuid c442e253-3331-4edd-8629-6321ddc21de6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:02:54 np0005593234 nova_compute[227762]: 2026-01-23 11:02:54.765 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:02:54 np0005593234 nova_compute[227762]: 2026-01-23 11:02:54.792 227766 INFO nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Creating config drive at /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/disk.config#033[00m
Jan 23 06:02:54 np0005593234 nova_compute[227762]: 2026-01-23 11:02:54.803 227766 DEBUG oslo_concurrency.processutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnsvr01dn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:02:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:02:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:54.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:02:54 np0005593234 nova_compute[227762]: 2026-01-23 11:02:54.942 227766 DEBUG oslo_concurrency.processutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpnsvr01dn" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:02:54 np0005593234 nova_compute[227762]: 2026-01-23 11:02:54.986 227766 DEBUG nova.storage.rbd_utils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image c442e253-3331-4edd-8629-6321ddc21de6_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:02:54 np0005593234 nova_compute[227762]: 2026-01-23 11:02:54.990 227766 DEBUG oslo_concurrency.processutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/disk.config c442e253-3331-4edd-8629-6321ddc21de6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:02:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:55.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.028 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.079 227766 DEBUG oslo_concurrency.processutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/disk.config c442e253-3331-4edd-8629-6321ddc21de6_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.079 227766 INFO nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Deleting local config drive /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6/disk.config because it was imported into RBD.#033[00m
Jan 23 06:02:56 np0005593234 kernel: tap7ca05cbe-77: entered promiscuous mode
Jan 23 06:02:56 np0005593234 NetworkManager[48942]: <info>  [1769166176.1497] manager: (tap7ca05cbe-77): new Tun device (/org/freedesktop/NetworkManager/Devices/462)
Jan 23 06:02:56 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:56Z|00966|binding|INFO|Claiming lport 7ca05cbe-776c-477b-a267-f19b2dcefdb6 for this chassis.
Jan 23 06:02:56 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:56Z|00967|binding|INFO|7ca05cbe-776c-477b-a267-f19b2dcefdb6: Claiming fa:16:3e:7b:50:12 10.100.0.10
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.152 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.167 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:50:12 10.100.0.10'], port_security=['fa:16:3e:7b:50:12 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c442e253-3331-4edd-8629-6321ddc21de6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61d10787-dafd-4592-8a90-5156be0ee76e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'b3661b36-5769-4538-ae1c-8c4dac03c6a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.200'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0a736d2-23ab-41ff-b228-b06b4b7f67c9, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=7ca05cbe-776c-477b-a267-f19b2dcefdb6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.169 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 7ca05cbe-776c-477b-a267-f19b2dcefdb6 in datapath 61d10787-dafd-4592-8a90-5156be0ee76e bound to our chassis#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.171 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 61d10787-dafd-4592-8a90-5156be0ee76e#033[00m
Jan 23 06:02:56 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:56Z|00968|binding|INFO|Setting lport 7ca05cbe-776c-477b-a267-f19b2dcefdb6 ovn-installed in OVS
Jan 23 06:02:56 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:56Z|00969|binding|INFO|Setting lport 7ca05cbe-776c-477b-a267-f19b2dcefdb6 up in Southbound
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.188 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.190 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9b123a73-9062-4d4b-b8d7-8145b7dcc21f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.192 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap61d10787-d1 in ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.193 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.194 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap61d10787-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.195 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e8a03990-f6cf-4ea8-9a69-2c94c85d4ac7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.196 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ddc01a12-4641-4182-8aeb-838eeaaa1b06]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 systemd-udevd[339473]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:02:56 np0005593234 systemd-machined[195626]: New machine qemu-108-instance-000000d7.
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.211 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[125d1d78-52f0-482f-a8c7-5f660fcf3efd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 NetworkManager[48942]: <info>  [1769166176.2204] device (tap7ca05cbe-77): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:02:56 np0005593234 NetworkManager[48942]: <info>  [1769166176.2210] device (tap7ca05cbe-77): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:02:56 np0005593234 systemd[1]: Started Virtual Machine qemu-108-instance-000000d7.
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.233 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d94d3d11-759e-4600-b75c-a52fc68f3518]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.273 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[2238c740-dfcc-41d7-b0bf-14777ca977ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 systemd-udevd[339477]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:02:56 np0005593234 NetworkManager[48942]: <info>  [1769166176.2834] manager: (tap61d10787-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/463)
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.284 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[44e4e6e4-8632-46af-86d1-5a77d58b58af]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.327 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[b90f806f-9938-4a61-911e-98e6abbdc07c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.330 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a184ca8e-dc52-4631-a617-8bd500a435c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 NetworkManager[48942]: <info>  [1769166176.3731] device (tap61d10787-d0): carrier: link connected
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.379 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[583fdf13-8ebc-43c3-8c61-a51edf8fdfc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.396 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f865471c-4527-472e-b6f3-0aaaec226835]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61d10787-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:2a:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 297], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1007133, 'reachable_time': 39129, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 339506, 'error': None, 'target': 'ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.413 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5c1ffe31-f98c-492b-b820-ef1a87019abe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee9:2af2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1007133, 'tstamp': 1007133}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 339507, 'error': None, 'target': 'ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.429 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e0a517a2-9c84-4ffa-99a0-bff395e3053a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap61d10787-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:2a:f2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 297], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1007133, 'reachable_time': 39129, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 339508, 'error': None, 'target': 'ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.466 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1d7bec83-775c-462f-b5fe-419f6d8735b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.523 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8ad2ade9-16c3-4d40-821c-f6a2ebc0b3df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.524 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61d10787-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.525 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.525 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap61d10787-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.527 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:56 np0005593234 NetworkManager[48942]: <info>  [1769166176.5279] manager: (tap61d10787-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/464)
Jan 23 06:02:56 np0005593234 kernel: tap61d10787-d0: entered promiscuous mode
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.529 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.530 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap61d10787-d0, col_values=(('external_ids', {'iface-id': '011329dd-9cad-4775-b208-cac68cef5e57'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.532 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:56 np0005593234 ovn_controller[134547]: 2026-01-23T11:02:56Z|00970|binding|INFO|Releasing lport 011329dd-9cad-4775-b208-cac68cef5e57 from this chassis (sb_readonly=0)
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.549 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.551 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/61d10787-dafd-4592-8a90-5156be0ee76e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/61d10787-dafd-4592-8a90-5156be0ee76e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.552 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[de943abf-6e7b-4348-9fd8-9b3c9dbd38f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.553 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-61d10787-dafd-4592-8a90-5156be0ee76e
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/61d10787-dafd-4592-8a90-5156be0ee76e.pid.haproxy
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 61d10787-dafd-4592-8a90-5156be0ee76e
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 06:02:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:02:56.554 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e', 'env', 'PROCESS_TAG=haproxy-61d10787-dafd-4592-8a90-5156be0ee76e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/61d10787-dafd-4592-8a90-5156be0ee76e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.828 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for c442e253-3331-4edd-8629-6321ddc21de6 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.830 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166176.828108, c442e253-3331-4edd-8629-6321ddc21de6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.830 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.834 227766 DEBUG nova.compute.manager [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.835 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.840 227766 INFO nova.virt.libvirt.driver [-] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Instance spawned successfully.#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.841 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.876 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.883 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.884 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.885 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.886 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.887 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.888 227766 DEBUG nova.virt.libvirt.driver [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.896 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:02:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:56.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.941 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.942 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166176.8305135, c442e253-3331-4edd-8629-6321ddc21de6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.942 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] VM Started (Lifecycle Event)#033[00m
Jan 23 06:02:56 np0005593234 podman[339582]: 2026-01-23 11:02:56.956725632 +0000 UTC m=+0.059926097 container create e4eff3af0844ba8321d9d19a00641288d1ad68611ba64464f957e91cdf74c138 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.957 227766 DEBUG nova.compute.manager [req-f31a16ac-13d9-4459-94cd-6e9aca56e3e3 req-ee576673-d933-470a-bb88-1886bf332da8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received event network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.958 227766 DEBUG oslo_concurrency.lockutils [req-f31a16ac-13d9-4459-94cd-6e9aca56e3e3 req-ee576673-d933-470a-bb88-1886bf332da8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c442e253-3331-4edd-8629-6321ddc21de6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.958 227766 DEBUG oslo_concurrency.lockutils [req-f31a16ac-13d9-4459-94cd-6e9aca56e3e3 req-ee576673-d933-470a-bb88-1886bf332da8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.958 227766 DEBUG oslo_concurrency.lockutils [req-f31a16ac-13d9-4459-94cd-6e9aca56e3e3 req-ee576673-d933-470a-bb88-1886bf332da8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.958 227766 DEBUG nova.compute.manager [req-f31a16ac-13d9-4459-94cd-6e9aca56e3e3 req-ee576673-d933-470a-bb88-1886bf332da8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] No waiting events found dispatching network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.959 227766 WARNING nova.compute.manager [req-f31a16ac-13d9-4459-94cd-6e9aca56e3e3 req-ee576673-d933-470a-bb88-1886bf332da8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received unexpected event network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 for instance with vm_state active and task_state rebuild_spawning.#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.989 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:02:56 np0005593234 nova_compute[227762]: 2026-01-23 11:02:56.992 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:02:56 np0005593234 systemd[1]: Started libpod-conmon-e4eff3af0844ba8321d9d19a00641288d1ad68611ba64464f957e91cdf74c138.scope.
Jan 23 06:02:57 np0005593234 systemd[1]: Started libcrun container.
Jan 23 06:02:57 np0005593234 nova_compute[227762]: 2026-01-23 11:02:57.022 227766 DEBUG nova.compute.manager [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:02:57 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6b71e1509ec851e1baf4f4fa8d88e0d7bab2d9df0cf3e8fe71fe065a3e9df03/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:02:57 np0005593234 nova_compute[227762]: 2026-01-23 11:02:57.023 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Jan 23 06:02:57 np0005593234 podman[339582]: 2026-01-23 11:02:56.933380521 +0000 UTC m=+0.036581006 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:02:57 np0005593234 podman[339582]: 2026-01-23 11:02:57.039228522 +0000 UTC m=+0.142429007 container init e4eff3af0844ba8321d9d19a00641288d1ad68611ba64464f957e91cdf74c138 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:02:57 np0005593234 podman[339582]: 2026-01-23 11:02:57.046238272 +0000 UTC m=+0.149438737 container start e4eff3af0844ba8321d9d19a00641288d1ad68611ba64464f957e91cdf74c138 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 06:02:57 np0005593234 nova_compute[227762]: 2026-01-23 11:02:57.085 227766 DEBUG oslo_concurrency.lockutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:57 np0005593234 nova_compute[227762]: 2026-01-23 11:02:57.086 227766 DEBUG oslo_concurrency.lockutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:57 np0005593234 nova_compute[227762]: 2026-01-23 11:02:57.086 227766 DEBUG nova.objects.instance [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Jan 23 06:02:57 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[339597]: [NOTICE]   (339601) : New worker (339603) forked
Jan 23 06:02:57 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[339597]: [NOTICE]   (339601) : Loading success.
Jan 23 06:02:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:57.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:57 np0005593234 nova_compute[227762]: 2026-01-23 11:02:57.195 227766 DEBUG oslo_concurrency.lockutils [None req-d7a3fe64-9d58-4564-b00c-713b0fb8ab8d 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.109s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:02:58 np0005593234 nova_compute[227762]: 2026-01-23 11:02:58.702 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:02:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:02:58.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:02:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:02:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:02:59.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:02:59 np0005593234 nova_compute[227762]: 2026-01-23 11:02:59.634 227766 DEBUG nova.compute.manager [req-a681bdea-b2d3-4b26-9acd-d2e30a56f0c0 req-720a7e55-e2e2-40ba-9358-7ef91c8e7667 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received event network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:02:59 np0005593234 nova_compute[227762]: 2026-01-23 11:02:59.635 227766 DEBUG oslo_concurrency.lockutils [req-a681bdea-b2d3-4b26-9acd-d2e30a56f0c0 req-720a7e55-e2e2-40ba-9358-7ef91c8e7667 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c442e253-3331-4edd-8629-6321ddc21de6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:02:59 np0005593234 nova_compute[227762]: 2026-01-23 11:02:59.635 227766 DEBUG oslo_concurrency.lockutils [req-a681bdea-b2d3-4b26-9acd-d2e30a56f0c0 req-720a7e55-e2e2-40ba-9358-7ef91c8e7667 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:02:59 np0005593234 nova_compute[227762]: 2026-01-23 11:02:59.635 227766 DEBUG oslo_concurrency.lockutils [req-a681bdea-b2d3-4b26-9acd-d2e30a56f0c0 req-720a7e55-e2e2-40ba-9358-7ef91c8e7667 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:02:59 np0005593234 nova_compute[227762]: 2026-01-23 11:02:59.635 227766 DEBUG nova.compute.manager [req-a681bdea-b2d3-4b26-9acd-d2e30a56f0c0 req-720a7e55-e2e2-40ba-9358-7ef91c8e7667 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] No waiting events found dispatching network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:02:59 np0005593234 nova_compute[227762]: 2026-01-23 11:02:59.636 227766 WARNING nova.compute.manager [req-a681bdea-b2d3-4b26-9acd-d2e30a56f0c0 req-720a7e55-e2e2-40ba-9358-7ef91c8e7667 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received unexpected event network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 for instance with vm_state active and task_state None.#033[00m
Jan 23 06:03:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:03:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:00.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:03:01 np0005593234 nova_compute[227762]: 2026-01-23 11:03:01.031 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:01.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:02.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:03.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:03 np0005593234 nova_compute[227762]: 2026-01-23 11:03:03.706 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:04.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:05.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:06 np0005593234 nova_compute[227762]: 2026-01-23 11:03:06.033 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:06.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:07.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:07 np0005593234 podman[339617]: 2026-01-23 11:03:07.758685672 +0000 UTC m=+0.053873545 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 06:03:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:08 np0005593234 nova_compute[227762]: 2026-01-23 11:03:08.708 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:08.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:03:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:09.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:03:10 np0005593234 ovn_controller[134547]: 2026-01-23T11:03:10Z|00127|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7b:50:12 10.100.0.10
Jan 23 06:03:10 np0005593234 ovn_controller[134547]: 2026-01-23T11:03:10Z|00128|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7b:50:12 10.100.0.10
Jan 23 06:03:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:03:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:10.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:03:11 np0005593234 nova_compute[227762]: 2026-01-23 11:03:11.078 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:03:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:11.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:03:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:12.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:13.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:13 np0005593234 nova_compute[227762]: 2026-01-23 11:03:13.711 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:14.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:14 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 23 06:03:14 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 23 06:03:14 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 23 06:03:15 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000009 to be held by another RGW process; skipping for now
Jan 23 06:03:15 np0005593234 radosgw[83946]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 23 06:03:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:15.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:15 np0005593234 podman[339796]: 2026-01-23 11:03:15.519062944 +0000 UTC m=+0.181792347 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 06:03:16 np0005593234 nova_compute[227762]: 2026-01-23 11:03:16.081 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:16.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:17.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:17 np0005593234 nova_compute[227762]: 2026-01-23 11:03:17.717 227766 INFO nova.compute.manager [None req-8f8c23e9-7cc1-46aa-b157-9a0507219fc3 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Get console output#033[00m
Jan 23 06:03:17 np0005593234 nova_compute[227762]: 2026-01-23 11:03:17.723 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 06:03:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:18 np0005593234 nova_compute[227762]: 2026-01-23 11:03:18.713 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:18.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:19.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:19.615 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=96, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=95) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:03:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:19.618 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:03:19 np0005593234 nova_compute[227762]: 2026-01-23 11:03:19.659 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:19 np0005593234 nova_compute[227762]: 2026-01-23 11:03:19.823 227766 DEBUG nova.compute.manager [req-3de63cb7-a0cc-443d-8db8-910590b49bf2 req-664d0d15-9a49-4a3d-9c84-25ad6a23e397 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received event network-changed-7ca05cbe-776c-477b-a267-f19b2dcefdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:03:19 np0005593234 nova_compute[227762]: 2026-01-23 11:03:19.823 227766 DEBUG nova.compute.manager [req-3de63cb7-a0cc-443d-8db8-910590b49bf2 req-664d0d15-9a49-4a3d-9c84-25ad6a23e397 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Refreshing instance network info cache due to event network-changed-7ca05cbe-776c-477b-a267-f19b2dcefdb6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:03:19 np0005593234 nova_compute[227762]: 2026-01-23 11:03:19.824 227766 DEBUG oslo_concurrency.lockutils [req-3de63cb7-a0cc-443d-8db8-910590b49bf2 req-664d0d15-9a49-4a3d-9c84-25ad6a23e397 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c442e253-3331-4edd-8629-6321ddc21de6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:03:19 np0005593234 nova_compute[227762]: 2026-01-23 11:03:19.824 227766 DEBUG oslo_concurrency.lockutils [req-3de63cb7-a0cc-443d-8db8-910590b49bf2 req-664d0d15-9a49-4a3d-9c84-25ad6a23e397 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c442e253-3331-4edd-8629-6321ddc21de6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:03:19 np0005593234 nova_compute[227762]: 2026-01-23 11:03:19.824 227766 DEBUG nova.network.neutron [req-3de63cb7-a0cc-443d-8db8-910590b49bf2 req-664d0d15-9a49-4a3d-9c84-25ad6a23e397 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Refreshing network info cache for port 7ca05cbe-776c-477b-a267-f19b2dcefdb6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:03:19 np0005593234 nova_compute[227762]: 2026-01-23 11:03:19.892 227766 DEBUG oslo_concurrency.lockutils [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "c442e253-3331-4edd-8629-6321ddc21de6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:19 np0005593234 nova_compute[227762]: 2026-01-23 11:03:19.893 227766 DEBUG oslo_concurrency.lockutils [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:19 np0005593234 nova_compute[227762]: 2026-01-23 11:03:19.893 227766 DEBUG oslo_concurrency.lockutils [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "c442e253-3331-4edd-8629-6321ddc21de6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:19 np0005593234 nova_compute[227762]: 2026-01-23 11:03:19.893 227766 DEBUG oslo_concurrency.lockutils [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:19 np0005593234 nova_compute[227762]: 2026-01-23 11:03:19.893 227766 DEBUG oslo_concurrency.lockutils [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:19 np0005593234 nova_compute[227762]: 2026-01-23 11:03:19.895 227766 INFO nova.compute.manager [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Terminating instance#033[00m
Jan 23 06:03:19 np0005593234 nova_compute[227762]: 2026-01-23 11:03:19.896 227766 DEBUG nova.compute.manager [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 06:03:20 np0005593234 kernel: tap7ca05cbe-77 (unregistering): left promiscuous mode
Jan 23 06:03:20 np0005593234 NetworkManager[48942]: <info>  [1769166200.8869] device (tap7ca05cbe-77): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:03:20 np0005593234 nova_compute[227762]: 2026-01-23 11:03:20.892 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:20 np0005593234 ovn_controller[134547]: 2026-01-23T11:03:20Z|00971|binding|INFO|Releasing lport 7ca05cbe-776c-477b-a267-f19b2dcefdb6 from this chassis (sb_readonly=0)
Jan 23 06:03:20 np0005593234 ovn_controller[134547]: 2026-01-23T11:03:20Z|00972|binding|INFO|Setting lport 7ca05cbe-776c-477b-a267-f19b2dcefdb6 down in Southbound
Jan 23 06:03:20 np0005593234 ovn_controller[134547]: 2026-01-23T11:03:20Z|00973|binding|INFO|Removing iface tap7ca05cbe-77 ovn-installed in OVS
Jan 23 06:03:20 np0005593234 nova_compute[227762]: 2026-01-23 11:03:20.896 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:20.903 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7b:50:12 10.100.0.10'], port_security=['fa:16:3e:7b:50:12 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'c442e253-3331-4edd-8629-6321ddc21de6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-61d10787-dafd-4592-8a90-5156be0ee76e', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'b3661b36-5769-4538-ae1c-8c4dac03c6a9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c0a736d2-23ab-41ff-b228-b06b4b7f67c9, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=7ca05cbe-776c-477b-a267-f19b2dcefdb6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:03:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:20.905 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 7ca05cbe-776c-477b-a267-f19b2dcefdb6 in datapath 61d10787-dafd-4592-8a90-5156be0ee76e unbound from our chassis#033[00m
Jan 23 06:03:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:20.907 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 61d10787-dafd-4592-8a90-5156be0ee76e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:03:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:20.909 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7f972053-53c6-4ece-9685-0649a6c853ce]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:20.911 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e namespace which is not needed anymore#033[00m
Jan 23 06:03:20 np0005593234 nova_compute[227762]: 2026-01-23 11:03:20.917 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:20.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:20 np0005593234 systemd[1]: machine-qemu\x2d108\x2dinstance\x2d000000d7.scope: Deactivated successfully.
Jan 23 06:03:20 np0005593234 systemd[1]: machine-qemu\x2d108\x2dinstance\x2d000000d7.scope: Consumed 13.669s CPU time.
Jan 23 06:03:20 np0005593234 systemd-machined[195626]: Machine qemu-108-instance-000000d7 terminated.
Jan 23 06:03:21 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[339597]: [NOTICE]   (339601) : haproxy version is 2.8.14-c23fe91
Jan 23 06:03:21 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[339597]: [NOTICE]   (339601) : path to executable is /usr/sbin/haproxy
Jan 23 06:03:21 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[339597]: [WARNING]  (339601) : Exiting Master process...
Jan 23 06:03:21 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[339597]: [ALERT]    (339601) : Current worker (339603) exited with code 143 (Terminated)
Jan 23 06:03:21 np0005593234 neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e[339597]: [WARNING]  (339601) : All workers exited. Exiting... (0)
Jan 23 06:03:21 np0005593234 systemd[1]: libpod-e4eff3af0844ba8321d9d19a00641288d1ad68611ba64464f957e91cdf74c138.scope: Deactivated successfully.
Jan 23 06:03:21 np0005593234 podman[339996]: 2026-01-23 11:03:21.069498929 +0000 UTC m=+0.050388627 container died e4eff3af0844ba8321d9d19a00641288d1ad68611ba64464f957e91cdf74c138 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.082 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:21 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e4eff3af0844ba8321d9d19a00641288d1ad68611ba64464f957e91cdf74c138-userdata-shm.mount: Deactivated successfully.
Jan 23 06:03:21 np0005593234 systemd[1]: var-lib-containers-storage-overlay-e6b71e1509ec851e1baf4f4fa8d88e0d7bab2d9df0cf3e8fe71fe065a3e9df03-merged.mount: Deactivated successfully.
Jan 23 06:03:21 np0005593234 podman[339996]: 2026-01-23 11:03:21.106797145 +0000 UTC m=+0.087686843 container cleanup e4eff3af0844ba8321d9d19a00641288d1ad68611ba64464f957e91cdf74c138 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 06:03:21 np0005593234 systemd[1]: libpod-conmon-e4eff3af0844ba8321d9d19a00641288d1ad68611ba64464f957e91cdf74c138.scope: Deactivated successfully.
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.133 227766 INFO nova.virt.libvirt.driver [-] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Instance destroyed successfully.#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.133 227766 DEBUG nova.objects.instance [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'resources' on Instance uuid c442e253-3331-4edd-8629-6321ddc21de6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.157 227766 DEBUG nova.virt.libvirt.vif [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T11:02:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1829658395',display_name='tempest-TestNetworkAdvancedServerOps-server-1829658395',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1829658395',id=215,image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIhJKs1r/JsFpu0BWPjavFqXIYAEvCluC0N9cRbmna2+YHH/vJ8/TdScoYsLAbGtghLOti713rszaER/EFM55mk2aJZ8CNxjq9lExotDDxKoBGUHWGshyO59EAlOfmR5Og==',key_name='tempest-TestNetworkAdvancedServerOps-632629835',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:02:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-znrqda5i',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='ae1f9e37-418c-462f-81d1-3599a6d89de9',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:02:57Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=c442e253-3331-4edd-8629-6321ddc21de6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.158 227766 DEBUG nova.network.os_vif_util [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.200", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.159 227766 DEBUG nova.network.os_vif_util [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7b:50:12,bridge_name='br-int',has_traffic_filtering=True,id=7ca05cbe-776c-477b-a267-f19b2dcefdb6,network=Network(61d10787-dafd-4592-8a90-5156be0ee76e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca05cbe-77') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.159 227766 DEBUG os_vif [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:50:12,bridge_name='br-int',has_traffic_filtering=True,id=7ca05cbe-776c-477b-a267-f19b2dcefdb6,network=Network(61d10787-dafd-4592-8a90-5156be0ee76e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca05cbe-77') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.161 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.162 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ca05cbe-77, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.163 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.165 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.168 227766 INFO os_vif [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7b:50:12,bridge_name='br-int',has_traffic_filtering=True,id=7ca05cbe-776c-477b-a267-f19b2dcefdb6,network=Network(61d10787-dafd-4592-8a90-5156be0ee76e),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ca05cbe-77')#033[00m
Jan 23 06:03:21 np0005593234 podman[340029]: 2026-01-23 11:03:21.172590483 +0000 UTC m=+0.041909602 container remove e4eff3af0844ba8321d9d19a00641288d1ad68611ba64464f957e91cdf74c138 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 06:03:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:21.178 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[26a47995-96e3-43fc-b672-898a611564a0]: (4, ('Fri Jan 23 11:03:21 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e (e4eff3af0844ba8321d9d19a00641288d1ad68611ba64464f957e91cdf74c138)\ne4eff3af0844ba8321d9d19a00641288d1ad68611ba64464f957e91cdf74c138\nFri Jan 23 11:03:21 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e (e4eff3af0844ba8321d9d19a00641288d1ad68611ba64464f957e91cdf74c138)\ne4eff3af0844ba8321d9d19a00641288d1ad68611ba64464f957e91cdf74c138\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:21.180 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8124c82e-c28c-4117-9633-d25486564ed7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:21.181 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap61d10787-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:03:21 np0005593234 kernel: tap61d10787-d0: left promiscuous mode
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.191 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.200 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:21.202 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2bd1f0d0-391d-4023-812b-a9a5c2e069f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:21.216 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7aed1368-7500-46b9-a58b-ed74986632bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:21.217 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[29e05426-bee7-4bec-8e92-39189f5a1703]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:21.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.222 227766 DEBUG nova.compute.manager [req-678f1b18-8c09-498b-a35d-ebb461f2dcb6 req-90593b4b-7f8b-4de3-a0f2-36da71d83eb1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received event network-vif-unplugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.222 227766 DEBUG oslo_concurrency.lockutils [req-678f1b18-8c09-498b-a35d-ebb461f2dcb6 req-90593b4b-7f8b-4de3-a0f2-36da71d83eb1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c442e253-3331-4edd-8629-6321ddc21de6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.222 227766 DEBUG oslo_concurrency.lockutils [req-678f1b18-8c09-498b-a35d-ebb461f2dcb6 req-90593b4b-7f8b-4de3-a0f2-36da71d83eb1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.223 227766 DEBUG oslo_concurrency.lockutils [req-678f1b18-8c09-498b-a35d-ebb461f2dcb6 req-90593b4b-7f8b-4de3-a0f2-36da71d83eb1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.223 227766 DEBUG nova.compute.manager [req-678f1b18-8c09-498b-a35d-ebb461f2dcb6 req-90593b4b-7f8b-4de3-a0f2-36da71d83eb1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] No waiting events found dispatching network-vif-unplugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.223 227766 DEBUG nova.compute.manager [req-678f1b18-8c09-498b-a35d-ebb461f2dcb6 req-90593b4b-7f8b-4de3-a0f2-36da71d83eb1 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received event network-vif-unplugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 06:03:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:21.233 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[181b5ee1-3872-4532-aee5-3d3704c92878]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1007122, 'reachable_time': 43732, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 340069, 'error': None, 'target': 'ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:21 np0005593234 systemd[1]: run-netns-ovnmeta\x2d61d10787\x2ddafd\x2d4592\x2d8a90\x2d5156be0ee76e.mount: Deactivated successfully.
Jan 23 06:03:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:21.237 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-61d10787-dafd-4592-8a90-5156be0ee76e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:03:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:21.237 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[854e8373-b965-4979-913e-7dfdc3abfe36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.481 227766 DEBUG nova.network.neutron [req-3de63cb7-a0cc-443d-8db8-910590b49bf2 req-664d0d15-9a49-4a3d-9c84-25ad6a23e397 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Updated VIF entry in instance network info cache for port 7ca05cbe-776c-477b-a267-f19b2dcefdb6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.482 227766 DEBUG nova.network.neutron [req-3de63cb7-a0cc-443d-8db8-910590b49bf2 req-664d0d15-9a49-4a3d-9c84-25ad6a23e397 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Updating instance_info_cache with network_info: [{"id": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "address": "fa:16:3e:7b:50:12", "network": {"id": "61d10787-dafd-4592-8a90-5156be0ee76e", "bridge": "br-int", "label": "tempest-network-smoke--1234554670", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ca05cbe-77", "ovs_interfaceid": "7ca05cbe-776c-477b-a267-f19b2dcefdb6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:03:21 np0005593234 nova_compute[227762]: 2026-01-23 11:03:21.507 227766 DEBUG oslo_concurrency.lockutils [req-3de63cb7-a0cc-443d-8db8-910590b49bf2 req-664d0d15-9a49-4a3d-9c84-25ad6a23e397 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c442e253-3331-4edd-8629-6321ddc21de6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:03:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:03:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:22 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:03:22 np0005593234 nova_compute[227762]: 2026-01-23 11:03:22.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:22 np0005593234 nova_compute[227762]: 2026-01-23 11:03:22.765 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:22 np0005593234 nova_compute[227762]: 2026-01-23 11:03:22.766 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:22 np0005593234 nova_compute[227762]: 2026-01-23 11:03:22.767 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:22 np0005593234 nova_compute[227762]: 2026-01-23 11:03:22.767 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:03:22 np0005593234 nova_compute[227762]: 2026-01-23 11:03:22.768 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:03:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:22.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.175 227766 INFO nova.virt.libvirt.driver [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Deleting instance files /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6_del#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.181 227766 INFO nova.virt.libvirt.driver [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Deletion of /var/lib/nova/instances/c442e253-3331-4edd-8629-6321ddc21de6_del complete#033[00m
Jan 23 06:03:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:23.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.243 227766 INFO nova.compute.manager [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Took 3.35 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.244 227766 DEBUG oslo.service.loopingcall [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.245 227766 DEBUG nova.compute.manager [-] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.245 227766 DEBUG nova.network.neutron [-] [instance: c442e253-3331-4edd-8629-6321ddc21de6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 06:03:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:03:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2632353959' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.277 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.510s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.301 227766 DEBUG nova.compute.manager [req-32ea4496-dfb9-4723-852b-0aa2ce09db8e req-d878bfd1-324c-4843-ba64-19b476f249f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received event network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.302 227766 DEBUG oslo_concurrency.lockutils [req-32ea4496-dfb9-4723-852b-0aa2ce09db8e req-d878bfd1-324c-4843-ba64-19b476f249f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c442e253-3331-4edd-8629-6321ddc21de6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.302 227766 DEBUG oslo_concurrency.lockutils [req-32ea4496-dfb9-4723-852b-0aa2ce09db8e req-d878bfd1-324c-4843-ba64-19b476f249f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.302 227766 DEBUG oslo_concurrency.lockutils [req-32ea4496-dfb9-4723-852b-0aa2ce09db8e req-d878bfd1-324c-4843-ba64-19b476f249f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.303 227766 DEBUG nova.compute.manager [req-32ea4496-dfb9-4723-852b-0aa2ce09db8e req-d878bfd1-324c-4843-ba64-19b476f249f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] No waiting events found dispatching network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.303 227766 WARNING nova.compute.manager [req-32ea4496-dfb9-4723-852b-0aa2ce09db8e req-d878bfd1-324c-4843-ba64-19b476f249f2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received unexpected event network-vif-plugged-7ca05cbe-776c-477b-a267-f19b2dcefdb6 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.478 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.479 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4082MB free_disk=20.942752838134766GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.480 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.480 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.634 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance c442e253-3331-4edd-8629-6321ddc21de6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.635 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.635 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.691 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.711 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.711 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.724 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.744 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 06:03:23 np0005593234 nova_compute[227762]: 2026-01-23 11:03:23.775 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.012 227766 DEBUG nova.network.neutron [-] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.027 227766 INFO nova.compute.manager [-] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Took 0.78 seconds to deallocate network for instance.#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.081 227766 DEBUG nova.compute.manager [req-3acd29c3-e1ff-4b59-9af4-5a8a0e11b204 req-ea132dc2-1f15-4428-a26c-af8da51c538c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Received event network-vif-deleted-7ca05cbe-776c-477b-a267-f19b2dcefdb6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.083 227766 DEBUG oslo_concurrency.lockutils [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:03:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2581897017' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.216 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.223 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.237 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.262 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.263 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.783s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.264 227766 DEBUG oslo_concurrency.lockutils [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.326 227766 DEBUG oslo_concurrency.processutils [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:03:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:03:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4030606684' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.752 227766 DEBUG oslo_concurrency.processutils [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.426s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.758 227766 DEBUG nova.compute.provider_tree [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.772 227766 DEBUG nova.scheduler.client.report [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.790 227766 DEBUG oslo_concurrency.lockutils [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.527s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.820 227766 INFO nova.scheduler.client.report [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Deleted allocations for instance c442e253-3331-4edd-8629-6321ddc21de6#033[00m
Jan 23 06:03:24 np0005593234 nova_compute[227762]: 2026-01-23 11:03:24.903 227766 DEBUG oslo_concurrency.lockutils [None req-870bb657-ffd6-44f1-970b-a820ca494e9f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "c442e253-3331-4edd-8629-6321ddc21de6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.010s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:24.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:03:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:25.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:03:25 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:25.622 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '96'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:03:26 np0005593234 nova_compute[227762]: 2026-01-23 11:03:26.085 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:26 np0005593234 nova_compute[227762]: 2026-01-23 11:03:26.163 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:26 np0005593234 nova_compute[227762]: 2026-01-23 11:03:26.263 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:26 np0005593234 nova_compute[227762]: 2026-01-23 11:03:26.264 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:03:26 np0005593234 nova_compute[227762]: 2026-01-23 11:03:26.264 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:03:26 np0005593234 nova_compute[227762]: 2026-01-23 11:03:26.305 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:03:26 np0005593234 nova_compute[227762]: 2026-01-23 11:03:26.306 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.003000095s ======
Jan 23 06:03:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:26.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000095s
Jan 23 06:03:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:27.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:27 np0005593234 nova_compute[227762]: 2026-01-23 11:03:27.591 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:27 np0005593234 nova_compute[227762]: 2026-01-23 11:03:27.664 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:03:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:28.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:29.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:30.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:31 np0005593234 nova_compute[227762]: 2026-01-23 11:03:31.086 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:31 np0005593234 nova_compute[227762]: 2026-01-23 11:03:31.166 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:31.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:32.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:33.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:33 np0005593234 nova_compute[227762]: 2026-01-23 11:03:33.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:34 np0005593234 nova_compute[227762]: 2026-01-23 11:03:34.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:34 np0005593234 nova_compute[227762]: 2026-01-23 11:03:34.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:34 np0005593234 nova_compute[227762]: 2026-01-23 11:03:34.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:03:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:34.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:35.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:36 np0005593234 nova_compute[227762]: 2026-01-23 11:03:36.089 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:36 np0005593234 nova_compute[227762]: 2026-01-23 11:03:36.131 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769166201.130901, c442e253-3331-4edd-8629-6321ddc21de6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:03:36 np0005593234 nova_compute[227762]: 2026-01-23 11:03:36.132 227766 INFO nova.compute.manager [-] [instance: c442e253-3331-4edd-8629-6321ddc21de6] VM Stopped (Lifecycle Event)#033[00m
Jan 23 06:03:36 np0005593234 nova_compute[227762]: 2026-01-23 11:03:36.167 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:36 np0005593234 nova_compute[227762]: 2026-01-23 11:03:36.804 227766 DEBUG nova.compute.manager [None req-f6267117-2528-41bc-906e-8d2ee9b364e1 - - - - - -] [instance: c442e253-3331-4edd-8629-6321ddc21de6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:03:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:36.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:37.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:38 np0005593234 podman[340250]: 2026-01-23 11:03:38.775763145 +0000 UTC m=+0.070726744 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:03:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:38.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:03:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:39.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:03:39 np0005593234 nova_compute[227762]: 2026-01-23 11:03:39.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:40.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:41 np0005593234 nova_compute[227762]: 2026-01-23 11:03:41.091 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:41 np0005593234 nova_compute[227762]: 2026-01-23 11:03:41.170 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:41.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:41 np0005593234 nova_compute[227762]: 2026-01-23 11:03:41.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:42.912 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:03:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:42.913 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:03:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:03:42.913 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:03:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:42.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:43.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:03:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1358292175' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:03:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:03:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1358292175' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:03:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:44.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000064s ======
Jan 23 06:03:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:45.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 23 06:03:45 np0005593234 podman[340274]: 2026-01-23 11:03:45.79578986 +0000 UTC m=+0.090926475 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:03:46 np0005593234 nova_compute[227762]: 2026-01-23 11:03:46.125 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:46 np0005593234 nova_compute[227762]: 2026-01-23 11:03:46.171 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:46.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:03:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:47.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:03:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:48.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 06:03:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:49.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 06:03:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:50.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:51 np0005593234 nova_compute[227762]: 2026-01-23 11:03:51.127 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:51 np0005593234 nova_compute[227762]: 2026-01-23 11:03:51.172 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:51.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:52.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:53.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:54.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:55.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:03:56 np0005593234 nova_compute[227762]: 2026-01-23 11:03:56.158 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:56 np0005593234 nova_compute[227762]: 2026-01-23 11:03:56.173 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:03:56 np0005593234 nova_compute[227762]: 2026-01-23 11:03:56.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:03:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:56.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:03:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:57.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:03:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:03:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:03:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:03:58.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:03:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:03:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:03:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:03:59.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:04:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:00.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:04:01 np0005593234 nova_compute[227762]: 2026-01-23 11:04:01.174 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:04:01 np0005593234 nova_compute[227762]: 2026-01-23 11:04:01.176 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:04:01 np0005593234 nova_compute[227762]: 2026-01-23 11:04:01.176 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 06:04:01 np0005593234 nova_compute[227762]: 2026-01-23 11:04:01.176 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:04:01 np0005593234 nova_compute[227762]: 2026-01-23 11:04:01.183 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:01 np0005593234 nova_compute[227762]: 2026-01-23 11:04:01.184 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:04:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:01.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:02.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:03.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:04.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:05.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:06 np0005593234 nova_compute[227762]: 2026-01-23 11:04:06.185 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:04:06 np0005593234 nova_compute[227762]: 2026-01-23 11:04:06.186 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:06 np0005593234 nova_compute[227762]: 2026-01-23 11:04:06.186 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 06:04:06 np0005593234 nova_compute[227762]: 2026-01-23 11:04:06.186 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:04:06 np0005593234 nova_compute[227762]: 2026-01-23 11:04:06.187 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:04:06 np0005593234 nova_compute[227762]: 2026-01-23 11:04:06.188 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:06.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:07.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:08.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:09.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:09 np0005593234 podman[340364]: 2026-01-23 11:04:09.755366096 +0000 UTC m=+0.050678187 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Jan 23 06:04:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:10.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:11 np0005593234 nova_compute[227762]: 2026-01-23 11:04:11.189 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:04:11 np0005593234 nova_compute[227762]: 2026-01-23 11:04:11.191 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:04:11 np0005593234 nova_compute[227762]: 2026-01-23 11:04:11.191 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 06:04:11 np0005593234 nova_compute[227762]: 2026-01-23 11:04:11.191 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:04:11 np0005593234 nova_compute[227762]: 2026-01-23 11:04:11.207 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:11 np0005593234 nova_compute[227762]: 2026-01-23 11:04:11.207 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:04:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:11.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:12 np0005593234 ovn_controller[134547]: 2026-01-23T11:04:12Z|00974|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #205. Immutable memtables: 0.
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.684809) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 131] Flushing memtable with next log file: 205
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166252684866, "job": 131, "event": "flush_started", "num_memtables": 1, "num_entries": 2354, "num_deletes": 251, "total_data_size": 6009188, "memory_usage": 6075536, "flush_reason": "Manual Compaction"}
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 131] Level-0 flush table #206: started
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166252759038, "cf_name": "default", "job": 131, "event": "table_file_creation", "file_number": 206, "file_size": 3896466, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 97141, "largest_seqno": 99490, "table_properties": {"data_size": 3886792, "index_size": 6167, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19489, "raw_average_key_size": 20, "raw_value_size": 3867713, "raw_average_value_size": 4033, "num_data_blocks": 269, "num_entries": 959, "num_filter_entries": 959, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166039, "oldest_key_time": 1769166039, "file_creation_time": 1769166252, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 206, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 131] Flush lasted 74301 microseconds, and 8328 cpu microseconds.
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.759112) [db/flush_job.cc:967] [default] [JOB 131] Level-0 flush table #206: 3896466 bytes OK
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.759131) [db/memtable_list.cc:519] [default] Level-0 commit table #206 started
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.762219) [db/memtable_list.cc:722] [default] Level-0 commit table #206: memtable #1 done
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.762235) EVENT_LOG_v1 {"time_micros": 1769166252762230, "job": 131, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.762252) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 131] Try to delete WAL files size 5998747, prev total WAL file size 5998747, number of live WAL files 2.
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000202.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.763514) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730038373835' seq:72057594037927935, type:22 .. '7061786F730039303337' seq:0, type:0; will stop at (end)
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 132] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 131 Base level 0, inputs: [206(3805KB)], [204(11MB)]
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166252763644, "job": 132, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [206], "files_L6": [204], "score": -1, "input_data_size": 16318538, "oldest_snapshot_seqno": -1}
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 132] Generated table #207: 11615 keys, 14318429 bytes, temperature: kUnknown
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166252913413, "cf_name": "default", "job": 132, "event": "table_file_creation", "file_number": 207, "file_size": 14318429, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14244798, "index_size": 43502, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29061, "raw_key_size": 306553, "raw_average_key_size": 26, "raw_value_size": 14043287, "raw_average_value_size": 1209, "num_data_blocks": 1651, "num_entries": 11615, "num_filter_entries": 11615, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769166252, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 207, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.933639) [db/compaction/compaction_job.cc:1663] [default] [JOB 132] Compacted 1@0 + 1@6 files to L6 => 14318429 bytes
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.935353) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 108.9 rd, 95.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 11.8 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(7.9) write-amplify(3.7) OK, records in: 12134, records dropped: 519 output_compression: NoCompression
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.935377) EVENT_LOG_v1 {"time_micros": 1769166252935367, "job": 132, "event": "compaction_finished", "compaction_time_micros": 149837, "compaction_time_cpu_micros": 31484, "output_level": 6, "num_output_files": 1, "total_output_size": 14318429, "num_input_records": 12134, "num_output_records": 11615, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000206.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166252936070, "job": 132, "event": "table_file_deletion", "file_number": 206}
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000204.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166252938197, "job": 132, "event": "table_file_deletion", "file_number": 204}
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.763400) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.938268) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.938274) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.938276) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.938277) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:04:12 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:04:12.938282) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:04:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:04:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:13.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:04:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:13.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:15.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:15.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:16 np0005593234 nova_compute[227762]: 2026-01-23 11:04:16.207 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:16 np0005593234 podman[340409]: 2026-01-23 11:04:16.301253588 +0000 UTC m=+0.070875530 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 06:04:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:04:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:17.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:04:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:04:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:17.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:04:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:04:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:19.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:04:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:19.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:21.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:21 np0005593234 nova_compute[227762]: 2026-01-23 11:04:21.208 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:04:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:21.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:04:22 np0005593234 nova_compute[227762]: 2026-01-23 11:04:22.388 227766 DEBUG nova.compute.manager [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Stashing vm_state: active _prep_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:5560#033[00m
Jan 23 06:04:22 np0005593234 nova_compute[227762]: 2026-01-23 11:04:22.502 227766 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:22 np0005593234 nova_compute[227762]: 2026-01-23 11:04:22.503 227766 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:22 np0005593234 nova_compute[227762]: 2026-01-23 11:04:22.533 227766 DEBUG nova.objects.instance [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_requests' on Instance uuid 15466683-985e-412a-b13a-037d70f393ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:04:22 np0005593234 nova_compute[227762]: 2026-01-23 11:04:22.556 227766 DEBUG nova.virt.hardware [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 06:04:22 np0005593234 nova_compute[227762]: 2026-01-23 11:04:22.557 227766 INFO nova.compute.claims [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 06:04:22 np0005593234 nova_compute[227762]: 2026-01-23 11:04:22.557 227766 DEBUG nova.objects.instance [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'resources' on Instance uuid 15466683-985e-412a-b13a-037d70f393ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:04:22 np0005593234 nova_compute[227762]: 2026-01-23 11:04:22.570 227766 DEBUG nova.objects.instance [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_devices' on Instance uuid 15466683-985e-412a-b13a-037d70f393ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:04:22 np0005593234 nova_compute[227762]: 2026-01-23 11:04:22.609 227766 INFO nova.compute.resource_tracker [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating resource usage from migration 591a7f31-28ee-4ebf-96f8-82c1aa98c447#033[00m
Jan 23 06:04:22 np0005593234 nova_compute[227762]: 2026-01-23 11:04:22.609 227766 DEBUG nova.compute.resource_tracker [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Starting to track incoming migration 591a7f31-28ee-4ebf-96f8-82c1aa98c447 with flavor eebea5f8-9b11-45ad-873d-c4ea90d3de87 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 23 06:04:22 np0005593234 nova_compute[227762]: 2026-01-23 11:04:22.663 227766 DEBUG oslo_concurrency.processutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:04:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:23.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:04:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3216388691' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:04:23 np0005593234 nova_compute[227762]: 2026-01-23 11:04:23.108 227766 DEBUG oslo_concurrency.processutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:04:23 np0005593234 nova_compute[227762]: 2026-01-23 11:04:23.114 227766 DEBUG nova.compute.provider_tree [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:04:23 np0005593234 nova_compute[227762]: 2026-01-23 11:04:23.142 227766 DEBUG nova.scheduler.client.report [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:04:23 np0005593234 nova_compute[227762]: 2026-01-23 11:04:23.182 227766 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.resize_claim" :: held 0.679s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:23 np0005593234 nova_compute[227762]: 2026-01-23 11:04:23.182 227766 INFO nova.compute.manager [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Migrating#033[00m
Jan 23 06:04:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:23.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:23 np0005593234 nova_compute[227762]: 2026-01-23 11:04:23.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:23 np0005593234 nova_compute[227762]: 2026-01-23 11:04:23.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:23 np0005593234 nova_compute[227762]: 2026-01-23 11:04:23.778 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:23 np0005593234 nova_compute[227762]: 2026-01-23 11:04:23.778 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:23 np0005593234 nova_compute[227762]: 2026-01-23 11:04:23.778 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:04:23 np0005593234 nova_compute[227762]: 2026-01-23 11:04:23.778 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:04:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:04:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/764230134' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:04:24 np0005593234 nova_compute[227762]: 2026-01-23 11:04:24.226 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:04:24 np0005593234 nova_compute[227762]: 2026-01-23 11:04:24.370 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:04:24 np0005593234 nova_compute[227762]: 2026-01-23 11:04:24.371 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4140MB free_disk=20.942890167236328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:04:24 np0005593234 nova_compute[227762]: 2026-01-23 11:04:24.372 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:24 np0005593234 nova_compute[227762]: 2026-01-23 11:04:24.372 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:24 np0005593234 nova_compute[227762]: 2026-01-23 11:04:24.453 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Migration for instance 15466683-985e-412a-b13a-037d70f393ef refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903#033[00m
Jan 23 06:04:24 np0005593234 nova_compute[227762]: 2026-01-23 11:04:24.491 227766 INFO nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating resource usage from migration 591a7f31-28ee-4ebf-96f8-82c1aa98c447#033[00m
Jan 23 06:04:24 np0005593234 nova_compute[227762]: 2026-01-23 11:04:24.491 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Starting to track incoming migration 591a7f31-28ee-4ebf-96f8-82c1aa98c447 with flavor eebea5f8-9b11-45ad-873d-c4ea90d3de87 _update_usage_from_migration /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1431#033[00m
Jan 23 06:04:24 np0005593234 nova_compute[227762]: 2026-01-23 11:04:24.546 227766 WARNING nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance 15466683-985e-412a-b13a-037d70f393ef has been moved to another host compute-1.ctlplane.example.com(compute-1.ctlplane.example.com). There are allocations remaining against the source host that might need to be removed: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 192, 'VCPU': 1}}.#033[00m
Jan 23 06:04:24 np0005593234 nova_compute[227762]: 2026-01-23 11:04:24.547 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:04:24 np0005593234 nova_compute[227762]: 2026-01-23 11:04:24.547 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=704MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:04:24 np0005593234 nova_compute[227762]: 2026-01-23 11:04:24.625 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:04:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:25.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:04:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3349493717' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:04:25 np0005593234 nova_compute[227762]: 2026-01-23 11:04:25.053 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:04:25 np0005593234 nova_compute[227762]: 2026-01-23 11:04:25.059 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:04:25 np0005593234 nova_compute[227762]: 2026-01-23 11:04:25.077 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:04:25 np0005593234 nova_compute[227762]: 2026-01-23 11:04:25.101 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:04:25 np0005593234 nova_compute[227762]: 2026-01-23 11:04:25.101 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:25.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:25 np0005593234 systemd[1]: Created slice User Slice of UID 42436.
Jan 23 06:04:25 np0005593234 systemd[1]: Starting User Runtime Directory /run/user/42436...
Jan 23 06:04:25 np0005593234 systemd-logind[794]: New session 72 of user nova.
Jan 23 06:04:25 np0005593234 systemd[1]: Finished User Runtime Directory /run/user/42436.
Jan 23 06:04:25 np0005593234 systemd[1]: Starting User Manager for UID 42436...
Jan 23 06:04:26 np0005593234 systemd[340541]: Queued start job for default target Main User Target.
Jan 23 06:04:26 np0005593234 systemd[340541]: Created slice User Application Slice.
Jan 23 06:04:26 np0005593234 systemd[340541]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 06:04:26 np0005593234 systemd[340541]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 06:04:26 np0005593234 systemd[340541]: Reached target Paths.
Jan 23 06:04:26 np0005593234 systemd[340541]: Reached target Timers.
Jan 23 06:04:26 np0005593234 systemd[340541]: Starting D-Bus User Message Bus Socket...
Jan 23 06:04:26 np0005593234 systemd[340541]: Starting Create User's Volatile Files and Directories...
Jan 23 06:04:26 np0005593234 systemd[340541]: Finished Create User's Volatile Files and Directories.
Jan 23 06:04:26 np0005593234 systemd[340541]: Listening on D-Bus User Message Bus Socket.
Jan 23 06:04:26 np0005593234 systemd[340541]: Reached target Sockets.
Jan 23 06:04:26 np0005593234 systemd[340541]: Reached target Basic System.
Jan 23 06:04:26 np0005593234 systemd[340541]: Reached target Main User Target.
Jan 23 06:04:26 np0005593234 systemd[340541]: Startup finished in 154ms.
Jan 23 06:04:26 np0005593234 systemd[1]: Started User Manager for UID 42436.
Jan 23 06:04:26 np0005593234 systemd[1]: Started Session 72 of User nova.
Jan 23 06:04:26 np0005593234 nova_compute[227762]: 2026-01-23 11:04:26.210 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:26 np0005593234 systemd[1]: session-72.scope: Deactivated successfully.
Jan 23 06:04:26 np0005593234 systemd-logind[794]: Session 72 logged out. Waiting for processes to exit.
Jan 23 06:04:26 np0005593234 systemd-logind[794]: Removed session 72.
Jan 23 06:04:26 np0005593234 auditd[702]: Audit daemon rotating log files
Jan 23 06:04:26 np0005593234 systemd-logind[794]: New session 74 of user nova.
Jan 23 06:04:26 np0005593234 systemd[1]: Started Session 74 of User nova.
Jan 23 06:04:26 np0005593234 systemd[1]: session-74.scope: Deactivated successfully.
Jan 23 06:04:26 np0005593234 systemd-logind[794]: Session 74 logged out. Waiting for processes to exit.
Jan 23 06:04:26 np0005593234 systemd-logind[794]: Removed session 74.
Jan 23 06:04:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:04:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:27.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:04:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:27.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:28 np0005593234 nova_compute[227762]: 2026-01-23 11:04:28.102 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:28 np0005593234 nova_compute[227762]: 2026-01-23 11:04:28.103 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:04:28 np0005593234 nova_compute[227762]: 2026-01-23 11:04:28.103 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:04:28 np0005593234 nova_compute[227762]: 2026-01-23 11:04:28.117 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:04:28 np0005593234 nova_compute[227762]: 2026-01-23 11:04:28.117 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:29.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:29 np0005593234 podman[340737]: 2026-01-23 11:04:29.312912091 +0000 UTC m=+0.054164525 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=reef, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, io.buildah.version=1.39.3, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 06:04:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:29.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:29 np0005593234 podman[340737]: 2026-01-23 11:04:29.430151331 +0000 UTC m=+0.171403775 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=reef, io.buildah.version=1.39.3, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, ceph=True, org.label-schema.build-date=20250507, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 23 06:04:30 np0005593234 podman[340891]: 2026-01-23 11:04:30.073230336 +0000 UTC m=+0.136048589 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 06:04:30 np0005593234 podman[340891]: 2026-01-23 11:04:30.127918488 +0000 UTC m=+0.190736721 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 06:04:30 np0005593234 podman[340958]: 2026-01-23 11:04:30.311402541 +0000 UTC m=+0.048559451 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, distribution-scope=public, name=keepalived, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=keepalived for Ceph, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=Ceph keepalived, vcs-type=git, vendor=Red Hat, Inc., summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, io.openshift.expose-services=, architecture=x86_64, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, io.k8s.display-name=Keepalived on RHEL 9, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container)
Jan 23 06:04:30 np0005593234 podman[340958]: 2026-01-23 11:04:30.324980946 +0000 UTC m=+0.062137836 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., io.buildah.version=1.28.2, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=keepalived-container, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, distribution-scope=public, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Jan 23 06:04:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:04:30 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:04:30 np0005593234 nova_compute[227762]: 2026-01-23 11:04:30.488 227766 DEBUG nova.compute.manager [req-06e9dc1c-3e90-4257-9f93-1d1300d3985a req-895f2e6e-40be-4d39-af84-9bfb4f19baae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-vif-unplugged-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:04:30 np0005593234 nova_compute[227762]: 2026-01-23 11:04:30.489 227766 DEBUG oslo_concurrency.lockutils [req-06e9dc1c-3e90-4257-9f93-1d1300d3985a req-895f2e6e-40be-4d39-af84-9bfb4f19baae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:30 np0005593234 nova_compute[227762]: 2026-01-23 11:04:30.489 227766 DEBUG oslo_concurrency.lockutils [req-06e9dc1c-3e90-4257-9f93-1d1300d3985a req-895f2e6e-40be-4d39-af84-9bfb4f19baae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:30 np0005593234 nova_compute[227762]: 2026-01-23 11:04:30.489 227766 DEBUG oslo_concurrency.lockutils [req-06e9dc1c-3e90-4257-9f93-1d1300d3985a req-895f2e6e-40be-4d39-af84-9bfb4f19baae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:30 np0005593234 nova_compute[227762]: 2026-01-23 11:04:30.489 227766 DEBUG nova.compute.manager [req-06e9dc1c-3e90-4257-9f93-1d1300d3985a req-895f2e6e-40be-4d39-af84-9bfb4f19baae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] No waiting events found dispatching network-vif-unplugged-fa33e1ff-e04f-4862-a822-18bec48babca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:04:30 np0005593234 nova_compute[227762]: 2026-01-23 11:04:30.489 227766 WARNING nova.compute.manager [req-06e9dc1c-3e90-4257-9f93-1d1300d3985a req-895f2e6e-40be-4d39-af84-9bfb4f19baae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received unexpected event network-vif-unplugged-fa33e1ff-e04f-4862-a822-18bec48babca for instance with vm_state active and task_state resize_migrating.#033[00m
Jan 23 06:04:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:04:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:31.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:04:31 np0005593234 nova_compute[227762]: 2026-01-23 11:04:31.211 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:04:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:31.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:04:31 np0005593234 podman[341262]: 2026-01-23 11:04:31.633230899 +0000 UTC m=+0.048713816 container create b334dbd44da3230018f0c3652fde1ba572432f4bcdaa8bc4f107c35c999fc7e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_morse, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.39.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 23 06:04:31 np0005593234 systemd[1]: Started libpod-conmon-b334dbd44da3230018f0c3652fde1ba572432f4bcdaa8bc4f107c35c999fc7e2.scope.
Jan 23 06:04:31 np0005593234 systemd[1]: Started libcrun container.
Jan 23 06:04:31 np0005593234 podman[341262]: 2026-01-23 11:04:31.61220251 +0000 UTC m=+0.027685457 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 06:04:31 np0005593234 podman[341262]: 2026-01-23 11:04:31.717449464 +0000 UTC m=+0.132932411 container init b334dbd44da3230018f0c3652fde1ba572432f4bcdaa8bc4f107c35c999fc7e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_morse, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 06:04:31 np0005593234 podman[341262]: 2026-01-23 11:04:31.724477414 +0000 UTC m=+0.139960331 container start b334dbd44da3230018f0c3652fde1ba572432f4bcdaa8bc4f107c35c999fc7e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_morse, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250507, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.license=GPLv2, OSD_FLAVOR=default, ceph=True)
Jan 23 06:04:31 np0005593234 podman[341262]: 2026-01-23 11:04:31.727769987 +0000 UTC m=+0.143252924 container attach b334dbd44da3230018f0c3652fde1ba572432f4bcdaa8bc4f107c35c999fc7e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_morse, org.label-schema.license=GPLv2, CEPH_REF=reef, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 06:04:31 np0005593234 thirsty_morse[341278]: 167 167
Jan 23 06:04:31 np0005593234 systemd[1]: libpod-b334dbd44da3230018f0c3652fde1ba572432f4bcdaa8bc4f107c35c999fc7e2.scope: Deactivated successfully.
Jan 23 06:04:31 np0005593234 conmon[341278]: conmon b334dbd44da3230018f0 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-b334dbd44da3230018f0c3652fde1ba572432f4bcdaa8bc4f107c35c999fc7e2.scope/container/memory.events
Jan 23 06:04:31 np0005593234 podman[341262]: 2026-01-23 11:04:31.731172944 +0000 UTC m=+0.146655861 container died b334dbd44da3230018f0c3652fde1ba572432f4bcdaa8bc4f107c35c999fc7e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_morse, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.build-date=20250507)
Jan 23 06:04:31 np0005593234 systemd[1]: var-lib-containers-storage-overlay-dfe0044b2447038879c07c9d2d01f1520cfcbf173b3bf166243d3d7c271734e5-merged.mount: Deactivated successfully.
Jan 23 06:04:31 np0005593234 podman[341262]: 2026-01-23 11:04:31.778643139 +0000 UTC m=+0.194126056 container remove b334dbd44da3230018f0c3652fde1ba572432f4bcdaa8bc4f107c35c999fc7e2 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=thirsty_morse, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=reef, ceph=True, io.buildah.version=1.39.3, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 06:04:31 np0005593234 systemd[1]: libpod-conmon-b334dbd44da3230018f0c3652fde1ba572432f4bcdaa8bc4f107c35c999fc7e2.scope: Deactivated successfully.
Jan 23 06:04:31 np0005593234 podman[341302]: 2026-01-23 11:04:31.920004763 +0000 UTC m=+0.037947158 container create d72c647587c429fe379d8ba719c7d36f654791cdc46111d74f30bb29e964b5b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_moore, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.39.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507)
Jan 23 06:04:31 np0005593234 systemd[1]: Started libpod-conmon-d72c647587c429fe379d8ba719c7d36f654791cdc46111d74f30bb29e964b5b1.scope.
Jan 23 06:04:31 np0005593234 systemd[1]: Started libcrun container.
Jan 23 06:04:31 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b34e8177eb1af6b3177a2dfc00c01b253ed0d08e1962162a64015b830dc3addb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 06:04:31 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b34e8177eb1af6b3177a2dfc00c01b253ed0d08e1962162a64015b830dc3addb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 06:04:31 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b34e8177eb1af6b3177a2dfc00c01b253ed0d08e1962162a64015b830dc3addb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 06:04:31 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b34e8177eb1af6b3177a2dfc00c01b253ed0d08e1962162a64015b830dc3addb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 06:04:31 np0005593234 podman[341302]: 2026-01-23 11:04:31.99179874 +0000 UTC m=+0.109741165 container init d72c647587c429fe379d8ba719c7d36f654791cdc46111d74f30bb29e964b5b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_moore, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True)
Jan 23 06:04:31 np0005593234 podman[341302]: 2026-01-23 11:04:31.99818786 +0000 UTC m=+0.116130255 container start d72c647587c429fe379d8ba719c7d36f654791cdc46111d74f30bb29e964b5b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_moore, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.39.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 06:04:31 np0005593234 podman[341302]: 2026-01-23 11:04:31.90299405 +0000 UTC m=+0.020936465 image pull 0f5473a1e726b0feaff0f41f8de8341c0a94f60365d4584f4c10bd6b40d44bc1 quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0
Jan 23 06:04:32 np0005593234 podman[341302]: 2026-01-23 11:04:32.001350769 +0000 UTC m=+0.119293184 container attach d72c647587c429fe379d8ba719c7d36f654791cdc46111d74f30bb29e964b5b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_moore, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250507, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:04:32 np0005593234 nova_compute[227762]: 2026-01-23 11:04:32.579 227766 DEBUG nova.compute.manager [req-fa7a2cb7-6e7d-4df2-b9b8-d09e3131e86a req-7d0a57aa-3774-4562-9cd2-2cb5c0365ebc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:04:32 np0005593234 nova_compute[227762]: 2026-01-23 11:04:32.581 227766 DEBUG oslo_concurrency.lockutils [req-fa7a2cb7-6e7d-4df2-b9b8-d09e3131e86a req-7d0a57aa-3774-4562-9cd2-2cb5c0365ebc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:32 np0005593234 nova_compute[227762]: 2026-01-23 11:04:32.581 227766 DEBUG oslo_concurrency.lockutils [req-fa7a2cb7-6e7d-4df2-b9b8-d09e3131e86a req-7d0a57aa-3774-4562-9cd2-2cb5c0365ebc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:32 np0005593234 nova_compute[227762]: 2026-01-23 11:04:32.582 227766 DEBUG oslo_concurrency.lockutils [req-fa7a2cb7-6e7d-4df2-b9b8-d09e3131e86a req-7d0a57aa-3774-4562-9cd2-2cb5c0365ebc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:32 np0005593234 nova_compute[227762]: 2026-01-23 11:04:32.582 227766 DEBUG nova.compute.manager [req-fa7a2cb7-6e7d-4df2-b9b8-d09e3131e86a req-7d0a57aa-3774-4562-9cd2-2cb5c0365ebc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] No waiting events found dispatching network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:04:32 np0005593234 nova_compute[227762]: 2026-01-23 11:04:32.582 227766 WARNING nova.compute.manager [req-fa7a2cb7-6e7d-4df2-b9b8-d09e3131e86a req-7d0a57aa-3774-4562-9cd2-2cb5c0365ebc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received unexpected event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca for instance with vm_state active and task_state resize_migrated.#033[00m
Jan 23 06:04:32 np0005593234 nova_compute[227762]: 2026-01-23 11:04:32.659 227766 INFO nova.network.neutron [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating port fa33e1ff-e04f-4862-a822-18bec48babca with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 23 06:04:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e424 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:04:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:33.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:04:33 np0005593234 crazy_moore[341319]: [
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:    {
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:        "available": false,
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:        "ceph_device": false,
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:        "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:        "lsm_data": {},
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:        "lvs": [],
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:        "path": "/dev/sr0",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:        "rejected_reasons": [
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "Insufficient space (<5GB)",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "Has a FileSystem"
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:        ],
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:        "sys_api": {
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "actuators": null,
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "device_nodes": "sr0",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "devname": "sr0",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "human_readable_size": "482.00 KB",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "id_bus": "ata",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "model": "QEMU DVD-ROM",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "nr_requests": "2",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "parent": "/dev/sr0",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "partitions": {},
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "path": "/dev/sr0",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "removable": "1",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "rev": "2.5+",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "ro": "0",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "rotational": "1",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "sas_address": "",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "sas_device_handle": "",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "scheduler_mode": "mq-deadline",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "sectors": 0,
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "sectorsize": "2048",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "size": 493568.0,
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "support_discard": "2048",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "type": "disk",
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:            "vendor": "QEMU"
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:        }
Jan 23 06:04:33 np0005593234 crazy_moore[341319]:    }
Jan 23 06:04:33 np0005593234 crazy_moore[341319]: ]
Jan 23 06:04:33 np0005593234 systemd[1]: libpod-d72c647587c429fe379d8ba719c7d36f654791cdc46111d74f30bb29e964b5b1.scope: Deactivated successfully.
Jan 23 06:04:33 np0005593234 systemd[1]: libpod-d72c647587c429fe379d8ba719c7d36f654791cdc46111d74f30bb29e964b5b1.scope: Consumed 1.190s CPU time.
Jan 23 06:04:33 np0005593234 podman[341302]: 2026-01-23 11:04:33.180285545 +0000 UTC m=+1.298227950 container died d72c647587c429fe379d8ba719c7d36f654791cdc46111d74f30bb29e964b5b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_moore, CEPH_REF=reef, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250507, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 06:04:33 np0005593234 systemd[1]: var-lib-containers-storage-overlay-b34e8177eb1af6b3177a2dfc00c01b253ed0d08e1962162a64015b830dc3addb-merged.mount: Deactivated successfully.
Jan 23 06:04:33 np0005593234 podman[341302]: 2026-01-23 11:04:33.251193484 +0000 UTC m=+1.369135879 container remove d72c647587c429fe379d8ba719c7d36f654791cdc46111d74f30bb29e964b5b1 (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=crazy_moore, org.label-schema.build-date=20250507, CEPH_REF=reef, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, io.buildah.version=1.39.3, ceph=True)
Jan 23 06:04:33 np0005593234 systemd[1]: libpod-conmon-d72c647587c429fe379d8ba719c7d36f654791cdc46111d74f30bb29e964b5b1.scope: Deactivated successfully.
Jan 23 06:04:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:33.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:33.378 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=97, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=96) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:04:33 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:33.381 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:04:33 np0005593234 nova_compute[227762]: 2026-01-23 11:04:33.417 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:33 np0005593234 nova_compute[227762]: 2026-01-23 11:04:33.672 227766 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:04:33 np0005593234 nova_compute[227762]: 2026-01-23 11:04:33.673 227766 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:04:33 np0005593234 nova_compute[227762]: 2026-01-23 11:04:33.673 227766 DEBUG nova.network.neutron [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:04:33 np0005593234 nova_compute[227762]: 2026-01-23 11:04:33.759 227766 DEBUG nova.compute.manager [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-changed-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:04:33 np0005593234 nova_compute[227762]: 2026-01-23 11:04:33.759 227766 DEBUG nova.compute.manager [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Refreshing instance network info cache due to event network-changed-fa33e1ff-e04f-4862-a822-18bec48babca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:04:33 np0005593234 nova_compute[227762]: 2026-01-23 11:04:33.759 227766 DEBUG oslo_concurrency.lockutils [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:04:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:04:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:04:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:04:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:04:34 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:04:34 np0005593234 nova_compute[227762]: 2026-01-23 11:04:34.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:34 np0005593234 nova_compute[227762]: 2026-01-23 11:04:34.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:34 np0005593234 nova_compute[227762]: 2026-01-23 11:04:34.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:04:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:04:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:35.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:04:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:35.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:35 np0005593234 nova_compute[227762]: 2026-01-23 11:04:35.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:35 np0005593234 nova_compute[227762]: 2026-01-23 11:04:35.852 227766 DEBUG nova.network.neutron [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating instance_info_cache with network_info: [{"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:04:35 np0005593234 nova_compute[227762]: 2026-01-23 11:04:35.885 227766 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:04:35 np0005593234 nova_compute[227762]: 2026-01-23 11:04:35.888 227766 DEBUG oslo_concurrency.lockutils [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:04:35 np0005593234 nova_compute[227762]: 2026-01-23 11:04:35.888 227766 DEBUG nova.network.neutron [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Refreshing network info cache for port fa33e1ff-e04f-4862-a822-18bec48babca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:04:35 np0005593234 nova_compute[227762]: 2026-01-23 11:04:35.979 227766 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Starting finish_migration finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11698#033[00m
Jan 23 06:04:35 np0005593234 nova_compute[227762]: 2026-01-23 11:04:35.981 227766 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Instance directory exists: not creating _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4719#033[00m
Jan 23 06:04:35 np0005593234 nova_compute[227762]: 2026-01-23 11:04:35.981 227766 INFO nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Creating image(s)#033[00m
Jan 23 06:04:36 np0005593234 nova_compute[227762]: 2026-01-23 11:04:36.013 227766 DEBUG nova.storage.rbd_utils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] creating snapshot(nova-resize) on rbd image(15466683-985e-412a-b13a-037d70f393ef_disk) create_snap /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:462#033[00m
Jan 23 06:04:36 np0005593234 nova_compute[227762]: 2026-01-23 11:04:36.212 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e425 e425: 3 total, 3 up, 3 in
Jan 23 06:04:36 np0005593234 systemd[1]: Stopping User Manager for UID 42436...
Jan 23 06:04:36 np0005593234 systemd[340541]: Activating special unit Exit the Session...
Jan 23 06:04:36 np0005593234 systemd[340541]: Stopped target Main User Target.
Jan 23 06:04:36 np0005593234 systemd[340541]: Stopped target Basic System.
Jan 23 06:04:36 np0005593234 systemd[340541]: Stopped target Paths.
Jan 23 06:04:36 np0005593234 systemd[340541]: Stopped target Sockets.
Jan 23 06:04:36 np0005593234 systemd[340541]: Stopped target Timers.
Jan 23 06:04:36 np0005593234 systemd[340541]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 06:04:36 np0005593234 systemd[340541]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 06:04:36 np0005593234 systemd[340541]: Closed D-Bus User Message Bus Socket.
Jan 23 06:04:36 np0005593234 systemd[340541]: Stopped Create User's Volatile Files and Directories.
Jan 23 06:04:36 np0005593234 systemd[340541]: Removed slice User Application Slice.
Jan 23 06:04:36 np0005593234 systemd[340541]: Reached target Shutdown.
Jan 23 06:04:36 np0005593234 systemd[340541]: Finished Exit the Session.
Jan 23 06:04:36 np0005593234 systemd[340541]: Reached target Exit the Session.
Jan 23 06:04:36 np0005593234 systemd[1]: user@42436.service: Deactivated successfully.
Jan 23 06:04:36 np0005593234 systemd[1]: Stopped User Manager for UID 42436.
Jan 23 06:04:36 np0005593234 systemd[1]: Stopping User Runtime Directory /run/user/42436...
Jan 23 06:04:36 np0005593234 systemd[1]: run-user-42436.mount: Deactivated successfully.
Jan 23 06:04:36 np0005593234 systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Jan 23 06:04:36 np0005593234 systemd[1]: Stopped User Runtime Directory /run/user/42436.
Jan 23 06:04:36 np0005593234 systemd[1]: Removed slice User Slice of UID 42436.
Jan 23 06:04:36 np0005593234 nova_compute[227762]: 2026-01-23 11:04:36.926 227766 DEBUG nova.objects.instance [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'trusted_certs' on Instance uuid 15466683-985e-412a-b13a-037d70f393ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.020 227766 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.021 227766 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Ensure instance console log exists: /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.022 227766 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.022 227766 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.022 227766 DEBUG oslo_concurrency.lockutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.024 227766 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Start _get_guest_xml network_info=[{"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1509915587", "vif_mac": "fa:16:3e:93:d3:f5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 06:04:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:04:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:37.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.031 227766 WARNING nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.037 227766 DEBUG nova.virt.libvirt.host [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.038 227766 DEBUG nova.virt.libvirt.host [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.040 227766 DEBUG nova.virt.libvirt.host [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.040 227766 DEBUG nova.virt.libvirt.host [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.041 227766 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.041 227766 DEBUG nova.virt.hardware [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='eebea5f8-9b11-45ad-873d-c4ea90d3de87',id=2,is_public=True,memory_mb=192,name='m1.micro',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.042 227766 DEBUG nova.virt.hardware [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.042 227766 DEBUG nova.virt.hardware [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.042 227766 DEBUG nova.virt.hardware [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.042 227766 DEBUG nova.virt.hardware [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.043 227766 DEBUG nova.virt.hardware [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.043 227766 DEBUG nova.virt.hardware [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.043 227766 DEBUG nova.virt.hardware [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.043 227766 DEBUG nova.virt.hardware [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.043 227766 DEBUG nova.virt.hardware [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.044 227766 DEBUG nova.virt.hardware [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.044 227766 DEBUG nova.objects.instance [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'vcpu_model' on Instance uuid 15466683-985e-412a-b13a-037d70f393ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.059 227766 DEBUG oslo_concurrency.processutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:04:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:04:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:37.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:04:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:37.384 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '97'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:04:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:04:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2490989787' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.489 227766 DEBUG oslo_concurrency.processutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.528 227766 DEBUG oslo_concurrency.processutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.810 227766 DEBUG nova.network.neutron [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updated VIF entry in instance network info cache for port fa33e1ff-e04f-4862-a822-18bec48babca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.811 227766 DEBUG nova.network.neutron [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating instance_info_cache with network_info: [{"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:04:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:04:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1618089923' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.944 227766 DEBUG oslo_concurrency.processutils [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.946 227766 DEBUG nova.virt.libvirt.vif [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:03:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1089282985',display_name='tempest-TestNetworkAdvancedServerOps-server-1089282985',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1089282985',id=216,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEQHldo3iXIUHcOmtxaL7zVrpPDPH1Yesk4w7Ms5SWolpItN2rDCNRTv1dU1IjdebJkV+f//XdfFq7rpNDTnYMRAq+vDfd2aGH28+aEe0zfJXXxcRZnPFq5MH+XPzShNwQ==',key_name='tempest-TestNetworkAdvancedServerOps-1714633987',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:03:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-i3ccnlqn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:04:31Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=15466683-985e-412a-b13a-037d70f393ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1509915587", "vif_mac": "fa:16:3e:93:d3:f5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.946 227766 DEBUG nova.network.os_vif_util [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1509915587", "vif_mac": "fa:16:3e:93:d3:f5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.947 227766 DEBUG nova.network.os_vif_util [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.950 227766 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] End _get_guest_xml xml=<domain type="kvm">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  <uuid>15466683-985e-412a-b13a-037d70f393ef</uuid>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  <name>instance-000000d8</name>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  <memory>196608</memory>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1089282985</nova:name>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 11:04:37</nova:creationTime>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.micro">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <nova:memory>192</nova:memory>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <nova:user uuid="420c366dc5dc45a48da4e0b18c93043f">tempest-TestNetworkAdvancedServerOps-1886747874-project-member</nova:user>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <nova:project uuid="c06f98b51aeb48de91d116fda54a161f">tempest-TestNetworkAdvancedServerOps-1886747874</nova:project>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <nova:port uuid="fa33e1ff-e04f-4862-a822-18bec48babca">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <system>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <entry name="serial">15466683-985e-412a-b13a-037d70f393ef</entry>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <entry name="uuid">15466683-985e-412a-b13a-037d70f393ef</entry>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    </system>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  <os>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  </os>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  <features>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  </features>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  </clock>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  <devices>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/15466683-985e-412a-b13a-037d70f393ef_disk">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/15466683-985e-412a-b13a-037d70f393ef_disk.config">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:93:d3:f5"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <target dev="tapfa33e1ff-e0"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    </interface>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef/console.log" append="off"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    </serial>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <video>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    </video>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    </rng>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 06:04:37 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 06:04:37 np0005593234 nova_compute[227762]:  </devices>
Jan 23 06:04:37 np0005593234 nova_compute[227762]: </domain>
Jan 23 06:04:37 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.952 227766 DEBUG nova.virt.libvirt.vif [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:03:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1089282985',display_name='tempest-TestNetworkAdvancedServerOps-server-1089282985',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1089282985',id=216,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEQHldo3iXIUHcOmtxaL7zVrpPDPH1Yesk4w7Ms5SWolpItN2rDCNRTv1dU1IjdebJkV+f//XdfFq7rpNDTnYMRAq+vDfd2aGH28+aEe0zfJXXxcRZnPFq5MH+XPzShNwQ==',key_name='tempest-TestNetworkAdvancedServerOps-1714633987',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:03:59Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=MigrationContext,new_flavor=Flavor(2),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-i3ccnlqn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='resize_finish',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:04:31Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=15466683-985e-412a-b13a-037d70f393ef,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": 
{"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1509915587", "vif_mac": "fa:16:3e:93:d3:f5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.953 227766 DEBUG nova.network.os_vif_util [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1509915587", "vif_mac": "fa:16:3e:93:d3:f5"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.953 227766 DEBUG nova.network.os_vif_util [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.954 227766 DEBUG os_vif [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.954 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.955 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.955 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.957 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.957 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa33e1ff-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.958 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfa33e1ff-e0, col_values=(('external_ids', {'iface-id': 'fa33e1ff-e04f-4862-a822-18bec48babca', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:93:d3:f5', 'vm-uuid': '15466683-985e-412a-b13a-037d70f393ef'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.959 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:37 np0005593234 NetworkManager[48942]: <info>  [1769166277.9603] manager: (tapfa33e1ff-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/465)
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.961 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.965 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:37 np0005593234 nova_compute[227762]: 2026-01-23 11:04:37.966 227766 INFO os_vif [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0')#033[00m
Jan 23 06:04:38 np0005593234 nova_compute[227762]: 2026-01-23 11:04:38.324 227766 DEBUG oslo_concurrency.lockutils [req-acc18bc7-b609-4ce1-9cc9-5005f99f4587 req-477d9d65-69c3-4d3b-a420-01608c4865a4 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:04:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:39.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:39.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:39 np0005593234 nova_compute[227762]: 2026-01-23 11:04:39.741 227766 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:04:39 np0005593234 nova_compute[227762]: 2026-01-23 11:04:39.741 227766 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:04:39 np0005593234 nova_compute[227762]: 2026-01-23 11:04:39.742 227766 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No VIF found with MAC fa:16:3e:93:d3:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 06:04:39 np0005593234 nova_compute[227762]: 2026-01-23 11:04:39.742 227766 INFO nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Using config drive#033[00m
Jan 23 06:04:39 np0005593234 nova_compute[227762]: 2026-01-23 11:04:39.776 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:39 np0005593234 kernel: tapfa33e1ff-e0: entered promiscuous mode
Jan 23 06:04:39 np0005593234 NetworkManager[48942]: <info>  [1769166279.8368] manager: (tapfa33e1ff-e0): new Tun device (/org/freedesktop/NetworkManager/Devices/466)
Jan 23 06:04:39 np0005593234 nova_compute[227762]: 2026-01-23 11:04:39.837 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:39 np0005593234 ovn_controller[134547]: 2026-01-23T11:04:39Z|00975|binding|INFO|Claiming lport fa33e1ff-e04f-4862-a822-18bec48babca for this chassis.
Jan 23 06:04:39 np0005593234 ovn_controller[134547]: 2026-01-23T11:04:39Z|00976|binding|INFO|fa33e1ff-e04f-4862-a822-18bec48babca: Claiming fa:16:3e:93:d3:f5 10.100.0.8
Jan 23 06:04:39 np0005593234 nova_compute[227762]: 2026-01-23 11:04:39.844 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:39 np0005593234 nova_compute[227762]: 2026-01-23 11:04:39.849 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:39 np0005593234 NetworkManager[48942]: <info>  [1769166279.8494] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/467)
Jan 23 06:04:39 np0005593234 NetworkManager[48942]: <info>  [1769166279.8499] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/468)
Jan 23 06:04:39 np0005593234 systemd-udevd[342650]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:04:39 np0005593234 systemd-machined[195626]: New machine qemu-109-instance-000000d8.
Jan 23 06:04:39 np0005593234 NetworkManager[48942]: <info>  [1769166279.8884] device (tapfa33e1ff-e0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:04:39 np0005593234 NetworkManager[48942]: <info>  [1769166279.8892] device (tapfa33e1ff-e0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:04:39 np0005593234 systemd[1]: Started Virtual Machine qemu-109-instance-000000d8.
Jan 23 06:04:39 np0005593234 nova_compute[227762]: 2026-01-23 11:04:39.930 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:39 np0005593234 nova_compute[227762]: 2026-01-23 11:04:39.936 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:39 np0005593234 podman[342641]: 2026-01-23 11:04:39.944522008 +0000 UTC m=+0.075580506 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.149 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:d3:f5 10.100.0.8'], port_security=['fa:16:3e:93:d3:f5 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '15466683-985e-412a-b13a-037d70f393ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '05cfd882-0f82-42fb-b766-52bbef7ec922', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.172'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e248dcd5-89a8-4e77-af01-5f9fab92dfca, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=fa33e1ff-e04f-4862-a822-18bec48babca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.151 144381 INFO neutron.agent.ovn.metadata.agent [-] Port fa33e1ff-e04f-4862-a822-18bec48babca in datapath 3ce7eb8b-e719-4e00-bbf7-177b1f60cd38 bound to our chassis#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.151 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3ce7eb8b-e719-4e00-bbf7-177b1f60cd38#033[00m
Jan 23 06:04:40 np0005593234 ovn_controller[134547]: 2026-01-23T11:04:40Z|00977|binding|INFO|Setting lport fa33e1ff-e04f-4862-a822-18bec48babca ovn-installed in OVS
Jan 23 06:04:40 np0005593234 ovn_controller[134547]: 2026-01-23T11:04:40Z|00978|binding|INFO|Setting lport fa33e1ff-e04f-4862-a822-18bec48babca up in Southbound
Jan 23 06:04:40 np0005593234 nova_compute[227762]: 2026-01-23 11:04:40.163 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.164 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9aae931a-6197-45e1-b146-e915e837770c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.165 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3ce7eb8b-e1 in ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 06:04:40 np0005593234 nova_compute[227762]: 2026-01-23 11:04:40.166 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.168 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3ce7eb8b-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.168 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2b1e0da9-059c-47f8-b1fd-202fff3ebd24]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.169 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1bb5b872-87e4-4cf6-952a-5edacd72c8e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.187 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[f0db6ad5-eb7e-45f5-8bb8-bfadb0c18bba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.212 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2e6266f5-6873-4170-8c66-0e9c7d3f7c6d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.242 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[31140412-f75a-430c-a05f-a5a26f2f62ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.247 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1f03446c-641f-4f96-981a-1c56d359921c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 NetworkManager[48942]: <info>  [1769166280.2486] manager: (tap3ce7eb8b-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/469)
Jan 23 06:04:40 np0005593234 systemd-udevd[342655]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.279 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[c965961e-87cf-4138-aeaf-a6c8f308ab3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.282 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[55a7f08a-de78-4e34-b99a-1c512bfcdf0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 NetworkManager[48942]: <info>  [1769166280.3043] device (tap3ce7eb8b-e0): carrier: link connected
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.309 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[bd4e13a7-c6d7-412e-9827-eeeda25dcae0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.325 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1352ebd5-83e6-4608-9e13-04588aa68a70]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3ce7eb8b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:08:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 300], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1017526, 'reachable_time': 40854, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 342734, 'error': None, 'target': 'ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.341 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6a2947ef-0929-4d0d-aeab-8dcaecbe8ffa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5d:862'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1017526, 'tstamp': 1017526}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 342739, 'error': None, 'target': 'ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.357 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6661f9ff-96b6-4f8b-951f-121533b96a33]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3ce7eb8b-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5d:08:62'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 300], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1017526, 'reachable_time': 40854, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 342740, 'error': None, 'target': 'ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.387 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4a9b65a9-d865-4e4e-b0c7-b503f44ea499]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 nova_compute[227762]: 2026-01-23 11:04:40.411 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166280.4110978, 15466683-985e-412a-b13a-037d70f393ef => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:04:40 np0005593234 nova_compute[227762]: 2026-01-23 11:04:40.412 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:04:40 np0005593234 nova_compute[227762]: 2026-01-23 11:04:40.413 227766 DEBUG nova.compute.manager [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 06:04:40 np0005593234 nova_compute[227762]: 2026-01-23 11:04:40.418 227766 INFO nova.virt.libvirt.driver [-] [instance: 15466683-985e-412a-b13a-037d70f393ef] Instance running successfully.#033[00m
Jan 23 06:04:40 np0005593234 virtqemud[227483]: argument unsupported: QEMU guest agent is not configured
Jan 23 06:04:40 np0005593234 nova_compute[227762]: 2026-01-23 11:04:40.420 227766 DEBUG nova.virt.libvirt.guest [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 23 06:04:40 np0005593234 nova_compute[227762]: 2026-01-23 11:04:40.420 227766 DEBUG nova.virt.libvirt.driver [None req-4ed9d96e-3c97-4f22-910a-cb0bc7528dff 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] finish_migration finished successfully. finish_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11793#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.449 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[50971b68-a023-41fe-8309-4ccecd562010]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.451 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ce7eb8b-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.451 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.451 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ce7eb8b-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:04:40 np0005593234 nova_compute[227762]: 2026-01-23 11:04:40.453 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:40 np0005593234 NetworkManager[48942]: <info>  [1769166280.4537] manager: (tap3ce7eb8b-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/470)
Jan 23 06:04:40 np0005593234 kernel: tap3ce7eb8b-e0: entered promiscuous mode
Jan 23 06:04:40 np0005593234 nova_compute[227762]: 2026-01-23 11:04:40.455 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.456 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3ce7eb8b-e0, col_values=(('external_ids', {'iface-id': 'e3d8e09f-6a23-413b-b880-82ad4418b7d7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:04:40 np0005593234 nova_compute[227762]: 2026-01-23 11:04:40.457 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:40 np0005593234 ovn_controller[134547]: 2026-01-23T11:04:40Z|00979|binding|INFO|Releasing lport e3d8e09f-6a23-413b-b880-82ad4418b7d7 from this chassis (sb_readonly=1)
Jan 23 06:04:40 np0005593234 nova_compute[227762]: 2026-01-23 11:04:40.469 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.470 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3ce7eb8b-e719-4e00-bbf7-177b1f60cd38.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3ce7eb8b-e719-4e00-bbf7-177b1f60cd38.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.471 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b92f4e71-785e-4bc2-b0b4-0b639c213c49]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.471 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/3ce7eb8b-e719-4e00-bbf7-177b1f60cd38.pid.haproxy
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 3ce7eb8b-e719-4e00-bbf7-177b1f60cd38
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 06:04:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:40.472 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'env', 'PROCESS_TAG=haproxy-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3ce7eb8b-e719-4e00-bbf7-177b1f60cd38.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 06:04:40 np0005593234 podman[342773]: 2026-01-23 11:04:40.838333921 +0000 UTC m=+0.060234126 container create 12b0aa2f386e21f2bf3068b562416ec54627596bae9646187cea7dd04506072b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:04:40 np0005593234 systemd[1]: Started libpod-conmon-12b0aa2f386e21f2bf3068b562416ec54627596bae9646187cea7dd04506072b.scope.
Jan 23 06:04:40 np0005593234 systemd[1]: Started libcrun container.
Jan 23 06:04:40 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/29a52a3c2565296afa9619b3cdf17803176f949b477ff3449b7bd411dadf8b23/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:04:40 np0005593234 podman[342773]: 2026-01-23 11:04:40.812385179 +0000 UTC m=+0.034285404 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:04:40 np0005593234 podman[342773]: 2026-01-23 11:04:40.91115666 +0000 UTC m=+0.133056935 container init 12b0aa2f386e21f2bf3068b562416ec54627596bae9646187cea7dd04506072b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 06:04:40 np0005593234 podman[342773]: 2026-01-23 11:04:40.917149497 +0000 UTC m=+0.139049702 container start 12b0aa2f386e21f2bf3068b562416ec54627596bae9646187cea7dd04506072b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 06:04:40 np0005593234 neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38[342788]: [NOTICE]   (342792) : New worker (342794) forked
Jan 23 06:04:40 np0005593234 neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38[342788]: [NOTICE]   (342792) : Loading success.
Jan 23 06:04:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:41.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:41 np0005593234 nova_compute[227762]: 2026-01-23 11:04:41.215 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:04:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:41.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:04:41 np0005593234 nova_compute[227762]: 2026-01-23 11:04:41.435 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:04:41 np0005593234 nova_compute[227762]: 2026-01-23 11:04:41.439 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: resize_finish, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:04:41 np0005593234 nova_compute[227762]: 2026-01-23 11:04:41.489 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] During sync_power_state the instance has a pending task (resize_finish). Skip.#033[00m
Jan 23 06:04:41 np0005593234 nova_compute[227762]: 2026-01-23 11:04:41.489 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166280.4136682, 15466683-985e-412a-b13a-037d70f393ef => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:04:41 np0005593234 nova_compute[227762]: 2026-01-23 11:04:41.490 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] VM Started (Lifecycle Event)#033[00m
Jan 23 06:04:41 np0005593234 nova_compute[227762]: 2026-01-23 11:04:41.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:41 np0005593234 nova_compute[227762]: 2026-01-23 11:04:41.793 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:04:41 np0005593234 nova_compute[227762]: 2026-01-23 11:04:41.798 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Synchronizing instance power state after lifecycle event "Started"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:04:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:04:42 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:04:42 np0005593234 nova_compute[227762]: 2026-01-23 11:04:42.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:42.913 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:42.914 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:04:42.915 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:42 np0005593234 nova_compute[227762]: 2026-01-23 11:04:42.960 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:43.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:04:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:43.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:04:43 np0005593234 nova_compute[227762]: 2026-01-23 11:04:43.585 227766 DEBUG nova.compute.manager [req-50bbe144-3658-46f9-970d-e111d69d7d16 req-b813bf5f-f193-4208-989f-288fb5d17228 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:04:43 np0005593234 nova_compute[227762]: 2026-01-23 11:04:43.586 227766 DEBUG oslo_concurrency.lockutils [req-50bbe144-3658-46f9-970d-e111d69d7d16 req-b813bf5f-f193-4208-989f-288fb5d17228 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:43 np0005593234 nova_compute[227762]: 2026-01-23 11:04:43.586 227766 DEBUG oslo_concurrency.lockutils [req-50bbe144-3658-46f9-970d-e111d69d7d16 req-b813bf5f-f193-4208-989f-288fb5d17228 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:43 np0005593234 nova_compute[227762]: 2026-01-23 11:04:43.587 227766 DEBUG oslo_concurrency.lockutils [req-50bbe144-3658-46f9-970d-e111d69d7d16 req-b813bf5f-f193-4208-989f-288fb5d17228 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:43 np0005593234 nova_compute[227762]: 2026-01-23 11:04:43.587 227766 DEBUG nova.compute.manager [req-50bbe144-3658-46f9-970d-e111d69d7d16 req-b813bf5f-f193-4208-989f-288fb5d17228 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] No waiting events found dispatching network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:04:43 np0005593234 nova_compute[227762]: 2026-01-23 11:04:43.587 227766 WARNING nova.compute.manager [req-50bbe144-3658-46f9-970d-e111d69d7d16 req-b813bf5f-f193-4208-989f-288fb5d17228 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received unexpected event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca for instance with vm_state resized and task_state None.#033[00m
Jan 23 06:04:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:45.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:45.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:46 np0005593234 nova_compute[227762]: 2026-01-23 11:04:46.218 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:46 np0005593234 podman[342856]: 2026-01-23 11:04:46.787749572 +0000 UTC m=+0.079155377 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Jan 23 06:04:46 np0005593234 nova_compute[227762]: 2026-01-23 11:04:46.809 227766 DEBUG nova.compute.manager [req-a60ffa61-9e49-416c-98e2-016bb485a8b0 req-54d64b47-a229-44c0-a831-b858bc65546c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:04:46 np0005593234 nova_compute[227762]: 2026-01-23 11:04:46.810 227766 DEBUG oslo_concurrency.lockutils [req-a60ffa61-9e49-416c-98e2-016bb485a8b0 req-54d64b47-a229-44c0-a831-b858bc65546c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:04:46 np0005593234 nova_compute[227762]: 2026-01-23 11:04:46.810 227766 DEBUG oslo_concurrency.lockutils [req-a60ffa61-9e49-416c-98e2-016bb485a8b0 req-54d64b47-a229-44c0-a831-b858bc65546c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:04:46 np0005593234 nova_compute[227762]: 2026-01-23 11:04:46.811 227766 DEBUG oslo_concurrency.lockutils [req-a60ffa61-9e49-416c-98e2-016bb485a8b0 req-54d64b47-a229-44c0-a831-b858bc65546c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:04:46 np0005593234 nova_compute[227762]: 2026-01-23 11:04:46.811 227766 DEBUG nova.compute.manager [req-a60ffa61-9e49-416c-98e2-016bb485a8b0 req-54d64b47-a229-44c0-a831-b858bc65546c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] No waiting events found dispatching network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:04:46 np0005593234 nova_compute[227762]: 2026-01-23 11:04:46.811 227766 WARNING nova.compute.manager [req-a60ffa61-9e49-416c-98e2-016bb485a8b0 req-54d64b47-a229-44c0-a831-b858bc65546c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received unexpected event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca for instance with vm_state resized and task_state None.#033[00m
Jan 23 06:04:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:47.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:47.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e425 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:47 np0005593234 nova_compute[227762]: 2026-01-23 11:04:47.961 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:04:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:49.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:04:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:50.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:04:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:51.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:04:51 np0005593234 nova_compute[227762]: 2026-01-23 11:04:51.219 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:04:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:52.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:04:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e426 e426: 3 total, 3 up, 3 in
Jan 23 06:04:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:52 np0005593234 nova_compute[227762]: 2026-01-23 11:04:52.964 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:04:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:53.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:04:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:54.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:54 np0005593234 ovn_controller[134547]: 2026-01-23T11:04:54Z|00129|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:93:d3:f5 10.100.0.8
Jan 23 06:04:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:55.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:56.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:56 np0005593234 nova_compute[227762]: 2026-01-23 11:04:56.224 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:57.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:57 np0005593234 nova_compute[227762]: 2026-01-23 11:04:57.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:04:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:04:57 np0005593234 nova_compute[227762]: 2026-01-23 11:04:57.965 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:04:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:04:58.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:04:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:04:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:04:59.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:04:59 np0005593234 nova_compute[227762]: 2026-01-23 11:04:59.835 227766 INFO nova.compute.manager [None req-c87e29f3-879a-4bf7-980e-5e08c963e68e 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Get console output#033[00m
Jan 23 06:04:59 np0005593234 nova_compute[227762]: 2026-01-23 11:04:59.841 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 06:05:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:00.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:05:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:01.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:05:01 np0005593234 nova_compute[227762]: 2026-01-23 11:05:01.372 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:02.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e426 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:02 np0005593234 nova_compute[227762]: 2026-01-23 11:05:02.966 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:03.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 e427: 3 total, 3 up, 3 in
Jan 23 06:05:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:04.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:04 np0005593234 nova_compute[227762]: 2026-01-23 11:05:04.351 227766 DEBUG nova.compute.manager [req-9a068a6a-b3bb-45a6-a478-e52cabf47585 req-9a06b7f1-a290-41cb-bfc8-7418f06b5329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-changed-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:05:04 np0005593234 nova_compute[227762]: 2026-01-23 11:05:04.351 227766 DEBUG nova.compute.manager [req-9a068a6a-b3bb-45a6-a478-e52cabf47585 req-9a06b7f1-a290-41cb-bfc8-7418f06b5329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Refreshing instance network info cache due to event network-changed-fa33e1ff-e04f-4862-a822-18bec48babca. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:05:04 np0005593234 nova_compute[227762]: 2026-01-23 11:05:04.352 227766 DEBUG oslo_concurrency.lockutils [req-9a068a6a-b3bb-45a6-a478-e52cabf47585 req-9a06b7f1-a290-41cb-bfc8-7418f06b5329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:05:04 np0005593234 nova_compute[227762]: 2026-01-23 11:05:04.352 227766 DEBUG oslo_concurrency.lockutils [req-9a068a6a-b3bb-45a6-a478-e52cabf47585 req-9a06b7f1-a290-41cb-bfc8-7418f06b5329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:05:04 np0005593234 nova_compute[227762]: 2026-01-23 11:05:04.352 227766 DEBUG nova.network.neutron [req-9a068a6a-b3bb-45a6-a478-e52cabf47585 req-9a06b7f1-a290-41cb-bfc8-7418f06b5329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Refreshing network info cache for port fa33e1ff-e04f-4862-a822-18bec48babca _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:05:04 np0005593234 nova_compute[227762]: 2026-01-23 11:05:04.736 227766 DEBUG oslo_concurrency.lockutils [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:05:04 np0005593234 nova_compute[227762]: 2026-01-23 11:05:04.737 227766 DEBUG oslo_concurrency.lockutils [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:05:04 np0005593234 nova_compute[227762]: 2026-01-23 11:05:04.737 227766 DEBUG oslo_concurrency.lockutils [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:05:04 np0005593234 nova_compute[227762]: 2026-01-23 11:05:04.737 227766 DEBUG oslo_concurrency.lockutils [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:05:04 np0005593234 nova_compute[227762]: 2026-01-23 11:05:04.737 227766 DEBUG oslo_concurrency.lockutils [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:05:04 np0005593234 nova_compute[227762]: 2026-01-23 11:05:04.739 227766 INFO nova.compute.manager [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Terminating instance#033[00m
Jan 23 06:05:04 np0005593234 nova_compute[227762]: 2026-01-23 11:05:04.740 227766 DEBUG nova.compute.manager [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 06:05:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:05.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:05 np0005593234 kernel: tapfa33e1ff-e0 (unregistering): left promiscuous mode
Jan 23 06:05:05 np0005593234 NetworkManager[48942]: <info>  [1769166305.9294] device (tapfa33e1ff-e0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:05:05 np0005593234 nova_compute[227762]: 2026-01-23 11:05:05.941 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:05 np0005593234 ovn_controller[134547]: 2026-01-23T11:05:05Z|00980|binding|INFO|Releasing lport fa33e1ff-e04f-4862-a822-18bec48babca from this chassis (sb_readonly=0)
Jan 23 06:05:05 np0005593234 ovn_controller[134547]: 2026-01-23T11:05:05Z|00981|binding|INFO|Setting lport fa33e1ff-e04f-4862-a822-18bec48babca down in Southbound
Jan 23 06:05:05 np0005593234 ovn_controller[134547]: 2026-01-23T11:05:05Z|00982|binding|INFO|Removing iface tapfa33e1ff-e0 ovn-installed in OVS
Jan 23 06:05:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:05.949 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:93:d3:f5 10.100.0.8'], port_security=['fa:16:3e:93:d3:f5 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '15466683-985e-412a-b13a-037d70f393ef', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '8', 'neutron:security_group_ids': '05cfd882-0f82-42fb-b766-52bbef7ec922', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e248dcd5-89a8-4e77-af01-5f9fab92dfca, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=fa33e1ff-e04f-4862-a822-18bec48babca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:05:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:05.950 144381 INFO neutron.agent.ovn.metadata.agent [-] Port fa33e1ff-e04f-4862-a822-18bec48babca in datapath 3ce7eb8b-e719-4e00-bbf7-177b1f60cd38 unbound from our chassis#033[00m
Jan 23 06:05:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:05.951 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3ce7eb8b-e719-4e00-bbf7-177b1f60cd38, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:05:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:05.952 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e4ee7039-e308-44ad-a33d-1ec8089dc138]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:05:05 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:05.953 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38 namespace which is not needed anymore#033[00m
Jan 23 06:05:05 np0005593234 nova_compute[227762]: 2026-01-23 11:05:05.961 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:06 np0005593234 systemd[1]: machine-qemu\x2d109\x2dinstance\x2d000000d8.scope: Deactivated successfully.
Jan 23 06:05:06 np0005593234 systemd[1]: machine-qemu\x2d109\x2dinstance\x2d000000d8.scope: Consumed 14.046s CPU time.
Jan 23 06:05:06 np0005593234 systemd-machined[195626]: Machine qemu-109-instance-000000d8 terminated.
Jan 23 06:05:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:06.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:06 np0005593234 neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38[342788]: [NOTICE]   (342792) : haproxy version is 2.8.14-c23fe91
Jan 23 06:05:06 np0005593234 neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38[342788]: [NOTICE]   (342792) : path to executable is /usr/sbin/haproxy
Jan 23 06:05:06 np0005593234 neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38[342788]: [WARNING]  (342792) : Exiting Master process...
Jan 23 06:05:06 np0005593234 neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38[342788]: [ALERT]    (342792) : Current worker (342794) exited with code 143 (Terminated)
Jan 23 06:05:06 np0005593234 neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38[342788]: [WARNING]  (342792) : All workers exited. Exiting... (0)
Jan 23 06:05:06 np0005593234 systemd[1]: libpod-12b0aa2f386e21f2bf3068b562416ec54627596bae9646187cea7dd04506072b.scope: Deactivated successfully.
Jan 23 06:05:06 np0005593234 podman[342968]: 2026-01-23 11:05:06.081924723 +0000 UTC m=+0.041864221 container died 12b0aa2f386e21f2bf3068b562416ec54627596bae9646187cea7dd04506072b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:05:06 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-12b0aa2f386e21f2bf3068b562416ec54627596bae9646187cea7dd04506072b-userdata-shm.mount: Deactivated successfully.
Jan 23 06:05:06 np0005593234 systemd[1]: var-lib-containers-storage-overlay-29a52a3c2565296afa9619b3cdf17803176f949b477ff3449b7bd411dadf8b23-merged.mount: Deactivated successfully.
Jan 23 06:05:06 np0005593234 podman[342968]: 2026-01-23 11:05:06.125517547 +0000 UTC m=+0.085457015 container cleanup 12b0aa2f386e21f2bf3068b562416ec54627596bae9646187cea7dd04506072b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 06:05:06 np0005593234 systemd[1]: libpod-conmon-12b0aa2f386e21f2bf3068b562416ec54627596bae9646187cea7dd04506072b.scope: Deactivated successfully.
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.184 227766 INFO nova.virt.libvirt.driver [-] [instance: 15466683-985e-412a-b13a-037d70f393ef] Instance destroyed successfully.#033[00m
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.184 227766 DEBUG nova.objects.instance [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'resources' on Instance uuid 15466683-985e-412a-b13a-037d70f393ef obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:05:06 np0005593234 podman[342999]: 2026-01-23 11:05:06.190906714 +0000 UTC m=+0.046386333 container remove 12b0aa2f386e21f2bf3068b562416ec54627596bae9646187cea7dd04506072b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:05:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:06.205 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[702c035d-ee53-487a-b858-79b4243b1557]: (4, ('Fri Jan 23 11:05:06 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38 (12b0aa2f386e21f2bf3068b562416ec54627596bae9646187cea7dd04506072b)\n12b0aa2f386e21f2bf3068b562416ec54627596bae9646187cea7dd04506072b\nFri Jan 23 11:05:06 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38 (12b0aa2f386e21f2bf3068b562416ec54627596bae9646187cea7dd04506072b)\n12b0aa2f386e21f2bf3068b562416ec54627596bae9646187cea7dd04506072b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.206 227766 DEBUG nova.virt.libvirt.vif [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:03:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1089282985',display_name='tempest-TestNetworkAdvancedServerOps-server-1089282985',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(2),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1089282985',id=216,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=2,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEQHldo3iXIUHcOmtxaL7zVrpPDPH1Yesk4w7Ms5SWolpItN2rDCNRTv1dU1IjdebJkV+f//XdfFq7rpNDTnYMRAq+vDfd2aGH28+aEe0zfJXXxcRZnPFq5MH+XPzShNwQ==',key_name='tempest-TestNetworkAdvancedServerOps-1714633987',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:04:41Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=192,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-i3ccnlqn',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:04:55Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=15466683-985e-412a-b13a-037d70f393ef,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.207 227766 DEBUG nova.network.os_vif_util [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.172", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:05:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:06.208 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c925bdc8-5fed-49d7-ab51-2496fc88b930]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.208 227766 DEBUG nova.network.os_vif_util [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.208 227766 DEBUG os_vif [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 06:05:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:06.209 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ce7eb8b-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.245 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:06 np0005593234 kernel: tap3ce7eb8b-e0: left promiscuous mode
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.246 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa33e1ff-e0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.247 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.247 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.249 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.267 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.268 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:06.270 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0cf79a7e-4b47-4f74-8364-0634b9ea7506]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.270 227766 INFO os_vif [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:93:d3:f5,bridge_name='br-int',has_traffic_filtering=True,id=fa33e1ff-e04f-4862-a822-18bec48babca,network=Network(3ce7eb8b-e719-4e00-bbf7-177b1f60cd38),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfa33e1ff-e0')#033[00m
Jan 23 06:05:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:06.285 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e2c440ac-cdfa-4064-81cd-a796a8507c22]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:05:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:06.288 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[38139c93-5c13-476c-afac-40ce496fa024]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:05:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:06.306 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[573c5c6b-8a52-4c8b-8c84-3d8669e9400f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1017519, 'reachable_time': 25684, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 343043, 'error': None, 'target': 'ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:05:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:06.309 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3ce7eb8b-e719-4e00-bbf7-177b1f60cd38 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:05:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:06.309 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[412fc783-bd7b-4e81-8ce4-b24e36fd74a4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:05:06 np0005593234 systemd[1]: run-netns-ovnmeta\x2d3ce7eb8b\x2de719\x2d4e00\x2dbbf7\x2d177b1f60cd38.mount: Deactivated successfully.
Jan 23 06:05:06 np0005593234 nova_compute[227762]: 2026-01-23 11:05:06.373 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:07.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:07 np0005593234 nova_compute[227762]: 2026-01-23 11:05:07.275 227766 DEBUG nova.compute.manager [req-25a20891-0d7b-43a4-9161-7905e7d67b0b req-2fedd911-e489-49b7-b604-4ce2290e66d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-vif-unplugged-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:05:07 np0005593234 nova_compute[227762]: 2026-01-23 11:05:07.276 227766 DEBUG oslo_concurrency.lockutils [req-25a20891-0d7b-43a4-9161-7905e7d67b0b req-2fedd911-e489-49b7-b604-4ce2290e66d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:05:07 np0005593234 nova_compute[227762]: 2026-01-23 11:05:07.277 227766 DEBUG oslo_concurrency.lockutils [req-25a20891-0d7b-43a4-9161-7905e7d67b0b req-2fedd911-e489-49b7-b604-4ce2290e66d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:05:07 np0005593234 nova_compute[227762]: 2026-01-23 11:05:07.277 227766 DEBUG oslo_concurrency.lockutils [req-25a20891-0d7b-43a4-9161-7905e7d67b0b req-2fedd911-e489-49b7-b604-4ce2290e66d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:05:07 np0005593234 nova_compute[227762]: 2026-01-23 11:05:07.278 227766 DEBUG nova.compute.manager [req-25a20891-0d7b-43a4-9161-7905e7d67b0b req-2fedd911-e489-49b7-b604-4ce2290e66d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] No waiting events found dispatching network-vif-unplugged-fa33e1ff-e04f-4862-a822-18bec48babca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:05:07 np0005593234 nova_compute[227762]: 2026-01-23 11:05:07.278 227766 DEBUG nova.compute.manager [req-25a20891-0d7b-43a4-9161-7905e7d67b0b req-2fedd911-e489-49b7-b604-4ce2290e66d9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-vif-unplugged-fa33e1ff-e04f-4862-a822-18bec48babca for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 06:05:07 np0005593234 nova_compute[227762]: 2026-01-23 11:05:07.399 227766 DEBUG nova.network.neutron [req-9a068a6a-b3bb-45a6-a478-e52cabf47585 req-9a06b7f1-a290-41cb-bfc8-7418f06b5329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updated VIF entry in instance network info cache for port fa33e1ff-e04f-4862-a822-18bec48babca. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:05:07 np0005593234 nova_compute[227762]: 2026-01-23 11:05:07.400 227766 DEBUG nova.network.neutron [req-9a068a6a-b3bb-45a6-a478-e52cabf47585 req-9a06b7f1-a290-41cb-bfc8-7418f06b5329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating instance_info_cache with network_info: [{"id": "fa33e1ff-e04f-4862-a822-18bec48babca", "address": "fa:16:3e:93:d3:f5", "network": {"id": "3ce7eb8b-e719-4e00-bbf7-177b1f60cd38", "bridge": "br-int", "label": "tempest-network-smoke--1509915587", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfa33e1ff-e0", "ovs_interfaceid": "fa33e1ff-e04f-4862-a822-18bec48babca", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:05:07 np0005593234 nova_compute[227762]: 2026-01-23 11:05:07.418 227766 DEBUG oslo_concurrency.lockutils [req-9a068a6a-b3bb-45a6-a478-e52cabf47585 req-9a06b7f1-a290-41cb-bfc8-7418f06b5329 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-15466683-985e-412a-b13a-037d70f393ef" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:05:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:08.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:09.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:09 np0005593234 nova_compute[227762]: 2026-01-23 11:05:09.371 227766 DEBUG nova.compute.manager [req-25ca6cf8-14fc-44ed-a488-bed6ef0e5d2b req-68607642-28d8-4b47-a2c8-57e8bfd13df8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:05:09 np0005593234 nova_compute[227762]: 2026-01-23 11:05:09.372 227766 DEBUG oslo_concurrency.lockutils [req-25ca6cf8-14fc-44ed-a488-bed6ef0e5d2b req-68607642-28d8-4b47-a2c8-57e8bfd13df8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "15466683-985e-412a-b13a-037d70f393ef-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:05:09 np0005593234 nova_compute[227762]: 2026-01-23 11:05:09.372 227766 DEBUG oslo_concurrency.lockutils [req-25ca6cf8-14fc-44ed-a488-bed6ef0e5d2b req-68607642-28d8-4b47-a2c8-57e8bfd13df8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:05:09 np0005593234 nova_compute[227762]: 2026-01-23 11:05:09.372 227766 DEBUG oslo_concurrency.lockutils [req-25ca6cf8-14fc-44ed-a488-bed6ef0e5d2b req-68607642-28d8-4b47-a2c8-57e8bfd13df8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:05:09 np0005593234 nova_compute[227762]: 2026-01-23 11:05:09.372 227766 DEBUG nova.compute.manager [req-25ca6cf8-14fc-44ed-a488-bed6ef0e5d2b req-68607642-28d8-4b47-a2c8-57e8bfd13df8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] No waiting events found dispatching network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:05:09 np0005593234 nova_compute[227762]: 2026-01-23 11:05:09.373 227766 WARNING nova.compute.manager [req-25ca6cf8-14fc-44ed-a488-bed6ef0e5d2b req-68607642-28d8-4b47-a2c8-57e8bfd13df8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received unexpected event network-vif-plugged-fa33e1ff-e04f-4862-a822-18bec48babca for instance with vm_state active and task_state deleting.#033[00m
Jan 23 06:05:09 np0005593234 nova_compute[227762]: 2026-01-23 11:05:09.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:09 np0005593234 nova_compute[227762]: 2026-01-23 11:05:09.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 06:05:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:10.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:10 np0005593234 podman[343050]: 2026-01-23 11:05:10.79207657 +0000 UTC m=+0.074824322 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:05:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:05:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:11.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:05:11 np0005593234 nova_compute[227762]: 2026-01-23 11:05:11.247 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:11 np0005593234 nova_compute[227762]: 2026-01-23 11:05:11.374 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:11 np0005593234 nova_compute[227762]: 2026-01-23 11:05:11.418 227766 INFO nova.virt.libvirt.driver [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Deleting instance files /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef_del#033[00m
Jan 23 06:05:11 np0005593234 nova_compute[227762]: 2026-01-23 11:05:11.418 227766 INFO nova.virt.libvirt.driver [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Deletion of /var/lib/nova/instances/15466683-985e-412a-b13a-037d70f393ef_del complete#033[00m
Jan 23 06:05:11 np0005593234 nova_compute[227762]: 2026-01-23 11:05:11.488 227766 INFO nova.compute.manager [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Took 6.75 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 06:05:11 np0005593234 nova_compute[227762]: 2026-01-23 11:05:11.488 227766 DEBUG oslo.service.loopingcall [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 06:05:11 np0005593234 nova_compute[227762]: 2026-01-23 11:05:11.489 227766 DEBUG nova.compute.manager [-] [instance: 15466683-985e-412a-b13a-037d70f393ef] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 06:05:11 np0005593234 nova_compute[227762]: 2026-01-23 11:05:11.489 227766 DEBUG nova.network.neutron [-] [instance: 15466683-985e-412a-b13a-037d70f393ef] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 06:05:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:12.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:12 np0005593234 nova_compute[227762]: 2026-01-23 11:05:12.324 227766 DEBUG nova.compute.manager [req-de6a3544-ad15-4f99-9bee-b498efc28102 req-97cb990a-8dda-4fe3-910c-85d8da975ac7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Received event network-vif-deleted-fa33e1ff-e04f-4862-a822-18bec48babca external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:05:12 np0005593234 nova_compute[227762]: 2026-01-23 11:05:12.324 227766 INFO nova.compute.manager [req-de6a3544-ad15-4f99-9bee-b498efc28102 req-97cb990a-8dda-4fe3-910c-85d8da975ac7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Neutron deleted interface fa33e1ff-e04f-4862-a822-18bec48babca; detaching it from the instance and deleting it from the info cache#033[00m
Jan 23 06:05:12 np0005593234 nova_compute[227762]: 2026-01-23 11:05:12.324 227766 DEBUG nova.network.neutron [req-de6a3544-ad15-4f99-9bee-b498efc28102 req-97cb990a-8dda-4fe3-910c-85d8da975ac7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:05:12 np0005593234 nova_compute[227762]: 2026-01-23 11:05:12.344 227766 DEBUG nova.network.neutron [-] [instance: 15466683-985e-412a-b13a-037d70f393ef] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:05:12 np0005593234 nova_compute[227762]: 2026-01-23 11:05:12.368 227766 DEBUG nova.compute.manager [req-de6a3544-ad15-4f99-9bee-b498efc28102 req-97cb990a-8dda-4fe3-910c-85d8da975ac7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: 15466683-985e-412a-b13a-037d70f393ef] Detach interface failed, port_id=fa33e1ff-e04f-4862-a822-18bec48babca, reason: Instance 15466683-985e-412a-b13a-037d70f393ef could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Jan 23 06:05:12 np0005593234 nova_compute[227762]: 2026-01-23 11:05:12.369 227766 INFO nova.compute.manager [-] [instance: 15466683-985e-412a-b13a-037d70f393ef] Took 0.88 seconds to deallocate network for instance.#033[00m
Jan 23 06:05:12 np0005593234 nova_compute[227762]: 2026-01-23 11:05:12.411 227766 DEBUG oslo_concurrency.lockutils [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:05:12 np0005593234 nova_compute[227762]: 2026-01-23 11:05:12.412 227766 DEBUG oslo_concurrency.lockutils [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:05:12 np0005593234 nova_compute[227762]: 2026-01-23 11:05:12.424 227766 DEBUG oslo_concurrency.lockutils [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.012s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:05:12 np0005593234 nova_compute[227762]: 2026-01-23 11:05:12.468 227766 INFO nova.scheduler.client.report [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Deleted allocations for instance 15466683-985e-412a-b13a-037d70f393ef#033[00m
Jan 23 06:05:12 np0005593234 nova_compute[227762]: 2026-01-23 11:05:12.530 227766 DEBUG oslo_concurrency.lockutils [None req-b48e1d29-497d-427a-bfd8-7ef210ed5a77 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "15466683-985e-412a-b13a-037d70f393ef" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 7.793s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:05:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:13.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:14.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:14 np0005593234 nova_compute[227762]: 2026-01-23 11:05:14.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:14.810 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=98, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=97) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:05:14 np0005593234 nova_compute[227762]: 2026-01-23 11:05:14.810 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:14 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:14.811 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:05:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:15.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:16.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:16 np0005593234 nova_compute[227762]: 2026-01-23 11:05:16.249 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:16 np0005593234 nova_compute[227762]: 2026-01-23 11:05:16.376 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:17.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:17 np0005593234 podman[343122]: 2026-01-23 11:05:17.801144526 +0000 UTC m=+0.100405994 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 06:05:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:18.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:18 np0005593234 nova_compute[227762]: 2026-01-23 11:05:18.743 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:18 np0005593234 nova_compute[227762]: 2026-01-23 11:05:18.818 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:19.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:20.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:20 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:20.812 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '98'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:05:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:21.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:21 np0005593234 nova_compute[227762]: 2026-01-23 11:05:21.181 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769166306.179724, 15466683-985e-412a-b13a-037d70f393ef => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:05:21 np0005593234 nova_compute[227762]: 2026-01-23 11:05:21.181 227766 INFO nova.compute.manager [-] [instance: 15466683-985e-412a-b13a-037d70f393ef] VM Stopped (Lifecycle Event)#033[00m
Jan 23 06:05:21 np0005593234 nova_compute[227762]: 2026-01-23 11:05:21.203 227766 DEBUG nova.compute.manager [None req-22f4eaff-6413-47c9-8dbe-df6b902ec016 - - - - - -] [instance: 15466683-985e-412a-b13a-037d70f393ef] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:05:21 np0005593234 nova_compute[227762]: 2026-01-23 11:05:21.250 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:21 np0005593234 nova_compute[227762]: 2026-01-23 11:05:21.378 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:22.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:23.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:23 np0005593234 nova_compute[227762]: 2026-01-23 11:05:23.810 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:23 np0005593234 nova_compute[227762]: 2026-01-23 11:05:23.842 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:05:23 np0005593234 nova_compute[227762]: 2026-01-23 11:05:23.843 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:05:23 np0005593234 nova_compute[227762]: 2026-01-23 11:05:23.843 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:05:23 np0005593234 nova_compute[227762]: 2026-01-23 11:05:23.843 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:05:23 np0005593234 nova_compute[227762]: 2026-01-23 11:05:23.843 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:05:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:24.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:05:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3495412757' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:05:24 np0005593234 nova_compute[227762]: 2026-01-23 11:05:24.268 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:05:24 np0005593234 nova_compute[227762]: 2026-01-23 11:05:24.435 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:05:24 np0005593234 nova_compute[227762]: 2026-01-23 11:05:24.436 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4144MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:05:24 np0005593234 nova_compute[227762]: 2026-01-23 11:05:24.436 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:05:24 np0005593234 nova_compute[227762]: 2026-01-23 11:05:24.437 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:05:24 np0005593234 nova_compute[227762]: 2026-01-23 11:05:24.583 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:05:24 np0005593234 nova_compute[227762]: 2026-01-23 11:05:24.584 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:05:24 np0005593234 nova_compute[227762]: 2026-01-23 11:05:24.599 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:05:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:05:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3807290668' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:05:25 np0005593234 nova_compute[227762]: 2026-01-23 11:05:25.027 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.427s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:05:25 np0005593234 nova_compute[227762]: 2026-01-23 11:05:25.032 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:05:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:05:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:25.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:05:25 np0005593234 nova_compute[227762]: 2026-01-23 11:05:25.472 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:05:25 np0005593234 nova_compute[227762]: 2026-01-23 11:05:25.508 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:05:25 np0005593234 nova_compute[227762]: 2026-01-23 11:05:25.508 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.072s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:05:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:26.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:26 np0005593234 nova_compute[227762]: 2026-01-23 11:05:26.252 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:26 np0005593234 nova_compute[227762]: 2026-01-23 11:05:26.380 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:27.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:28.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:29.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:29 np0005593234 nova_compute[227762]: 2026-01-23 11:05:29.443 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:29 np0005593234 nova_compute[227762]: 2026-01-23 11:05:29.444 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:05:29 np0005593234 nova_compute[227762]: 2026-01-23 11:05:29.444 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:05:29 np0005593234 nova_compute[227762]: 2026-01-23 11:05:29.641 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:05:29 np0005593234 nova_compute[227762]: 2026-01-23 11:05:29.642 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:05:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:30.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:05:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:31.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:31 np0005593234 nova_compute[227762]: 2026-01-23 11:05:31.255 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:31 np0005593234 nova_compute[227762]: 2026-01-23 11:05:31.384 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:32.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000064s ======
Jan 23 06:05:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:33.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 23 06:05:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:34.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:35.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:36.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:36 np0005593234 nova_compute[227762]: 2026-01-23 11:05:36.257 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:36 np0005593234 nova_compute[227762]: 2026-01-23 11:05:36.384 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:36 np0005593234 nova_compute[227762]: 2026-01-23 11:05:36.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:36 np0005593234 nova_compute[227762]: 2026-01-23 11:05:36.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:36 np0005593234 nova_compute[227762]: 2026-01-23 11:05:36.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:05:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:37.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:37 np0005593234 nova_compute[227762]: 2026-01-23 11:05:37.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:38.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:39.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:39 np0005593234 nova_compute[227762]: 2026-01-23 11:05:39.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:40.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:41.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:41 np0005593234 nova_compute[227762]: 2026-01-23 11:05:41.299 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:41 np0005593234 nova_compute[227762]: 2026-01-23 11:05:41.386 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:41 np0005593234 podman[343256]: 2026-01-23 11:05:41.778359142 +0000 UTC m=+0.062867129 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:05:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:42.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:42 np0005593234 nova_compute[227762]: 2026-01-23 11:05:42.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:42.915 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:05:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:42.915 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:05:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:05:42.915 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:05:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:05:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:43.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:05:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:44.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:05:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:05:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:05:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:05:44 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:05:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:45.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:45 np0005593234 nova_compute[227762]: 2026-01-23 11:05:45.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:05:45 np0005593234 nova_compute[227762]: 2026-01-23 11:05:45.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 06:05:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:46.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:46 np0005593234 nova_compute[227762]: 2026-01-23 11:05:46.301 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:46 np0005593234 nova_compute[227762]: 2026-01-23 11:05:46.387 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:05:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:47.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:05:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:05:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:48.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:05:48 np0005593234 nova_compute[227762]: 2026-01-23 11:05:48.773 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 06:05:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:49.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:49 np0005593234 podman[343413]: 2026-01-23 11:05:49.270495316 +0000 UTC m=+0.552720049 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:05:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:50.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:05:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:51.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:05:51 np0005593234 nova_compute[227762]: 2026-01-23 11:05:51.304 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:51 np0005593234 nova_compute[227762]: 2026-01-23 11:05:51.390 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:05:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:52.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:05:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:05:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:53.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:05:54 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 06:05:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:05:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:54.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:05:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:05:54 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:05:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:05:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:55.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:05:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:05:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:56.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:05:56 np0005593234 nova_compute[227762]: 2026-01-23 11:05:56.307 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:56 np0005593234 nova_compute[227762]: 2026-01-23 11:05:56.390 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:05:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:57.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:57 np0005593234 nova_compute[227762]: 2026-01-23 11:05:57.370 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:05:57 np0005593234 nova_compute[227762]: 2026-01-23 11:05:57.371 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:05:57 np0005593234 nova_compute[227762]: 2026-01-23 11:05:57.404 227766 DEBUG nova.compute.manager [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 06:05:57 np0005593234 nova_compute[227762]: 2026-01-23 11:05:57.523 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:05:57 np0005593234 nova_compute[227762]: 2026-01-23 11:05:57.524 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:05:57 np0005593234 nova_compute[227762]: 2026-01-23 11:05:57.530 227766 DEBUG nova.virt.hardware [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 06:05:57 np0005593234 nova_compute[227762]: 2026-01-23 11:05:57.531 227766 INFO nova.compute.claims [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 06:05:57 np0005593234 nova_compute[227762]: 2026-01-23 11:05:57.659 227766 DEBUG oslo_concurrency.processutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:05:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:05:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:05:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1351761991' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:05:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:05:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:05:58.089 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:05:58 np0005593234 nova_compute[227762]: 2026-01-23 11:05:58.103 227766 DEBUG oslo_concurrency.processutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:05:58 np0005593234 nova_compute[227762]: 2026-01-23 11:05:58.109 227766 DEBUG nova.compute.provider_tree [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:05:58 np0005593234 nova_compute[227762]: 2026-01-23 11:05:58.781 227766 DEBUG nova.scheduler.client.report [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:05:58 np0005593234 nova_compute[227762]: 2026-01-23 11:05:58.823 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:05:58 np0005593234 nova_compute[227762]: 2026-01-23 11:05:58.824 227766 DEBUG nova.compute.manager [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 06:05:58 np0005593234 nova_compute[227762]: 2026-01-23 11:05:58.922 227766 DEBUG nova.compute.manager [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 06:05:58 np0005593234 nova_compute[227762]: 2026-01-23 11:05:58.923 227766 DEBUG nova.network.neutron [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 06:05:58 np0005593234 nova_compute[227762]: 2026-01-23 11:05:58.949 227766 INFO nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 06:05:58 np0005593234 nova_compute[227762]: 2026-01-23 11:05:58.982 227766 DEBUG nova.compute.manager [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 06:05:59 np0005593234 nova_compute[227762]: 2026-01-23 11:05:59.094 227766 DEBUG nova.compute.manager [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 06:05:59 np0005593234 nova_compute[227762]: 2026-01-23 11:05:59.096 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 06:05:59 np0005593234 nova_compute[227762]: 2026-01-23 11:05:59.096 227766 INFO nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Creating image(s)#033[00m
Jan 23 06:05:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:05:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:05:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:05:59.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:05:59 np0005593234 nova_compute[227762]: 2026-01-23 11:05:59.128 227766 DEBUG nova.storage.rbd_utils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:05:59 np0005593234 nova_compute[227762]: 2026-01-23 11:05:59.159 227766 DEBUG nova.storage.rbd_utils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:05:59 np0005593234 nova_compute[227762]: 2026-01-23 11:05:59.188 227766 DEBUG nova.storage.rbd_utils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:05:59 np0005593234 nova_compute[227762]: 2026-01-23 11:05:59.192 227766 DEBUG oslo_concurrency.processutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:05:59 np0005593234 nova_compute[227762]: 2026-01-23 11:05:59.232 227766 DEBUG nova.policy [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '420c366dc5dc45a48da4e0b18c93043f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c06f98b51aeb48de91d116fda54a161f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 06:05:59 np0005593234 nova_compute[227762]: 2026-01-23 11:05:59.283 227766 DEBUG oslo_concurrency.processutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:05:59 np0005593234 nova_compute[227762]: 2026-01-23 11:05:59.284 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:05:59 np0005593234 nova_compute[227762]: 2026-01-23 11:05:59.285 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:05:59 np0005593234 nova_compute[227762]: 2026-01-23 11:05:59.286 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:05:59 np0005593234 nova_compute[227762]: 2026-01-23 11:05:59.315 227766 DEBUG nova.storage.rbd_utils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:05:59 np0005593234 nova_compute[227762]: 2026-01-23 11:05:59.319 227766 DEBUG oslo_concurrency.processutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:06:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:00.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:00 np0005593234 nova_compute[227762]: 2026-01-23 11:06:00.769 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:01.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:01 np0005593234 nova_compute[227762]: 2026-01-23 11:06:01.309 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:01 np0005593234 nova_compute[227762]: 2026-01-23 11:06:01.392 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:01 np0005593234 nova_compute[227762]: 2026-01-23 11:06:01.395 227766 DEBUG nova.network.neutron [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Successfully created port: 061c373f-1971-4439-80c0-2491edc62fc7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 06:06:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:02.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:02 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:02Z|00983|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 06:06:02 np0005593234 nova_compute[227762]: 2026-01-23 11:06:02.743 227766 DEBUG nova.network.neutron [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Successfully updated port: 061c373f-1971-4439-80c0-2491edc62fc7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 06:06:02 np0005593234 nova_compute[227762]: 2026-01-23 11:06:02.766 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:06:02 np0005593234 nova_compute[227762]: 2026-01-23 11:06:02.767 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:06:02 np0005593234 nova_compute[227762]: 2026-01-23 11:06:02.767 227766 DEBUG nova.network.neutron [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:06:02 np0005593234 nova_compute[227762]: 2026-01-23 11:06:02.897 227766 DEBUG nova.compute.manager [req-72ac6450-8fae-46fa-a6bb-8912f1198d6c req-0248db9a-aa26-47d4-8500-e47d5ddf8e52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received event network-changed-061c373f-1971-4439-80c0-2491edc62fc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:06:02 np0005593234 nova_compute[227762]: 2026-01-23 11:06:02.898 227766 DEBUG nova.compute.manager [req-72ac6450-8fae-46fa-a6bb-8912f1198d6c req-0248db9a-aa26-47d4-8500-e47d5ddf8e52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Refreshing instance network info cache due to event network-changed-061c373f-1971-4439-80c0-2491edc62fc7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:06:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:02 np0005593234 nova_compute[227762]: 2026-01-23 11:06:02.898 227766 DEBUG oslo_concurrency.lockutils [req-72ac6450-8fae-46fa-a6bb-8912f1198d6c req-0248db9a-aa26-47d4-8500-e47d5ddf8e52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:06:02 np0005593234 nova_compute[227762]: 2026-01-23 11:06:02.996 227766 DEBUG nova.network.neutron [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 06:06:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:03.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:03 np0005593234 nova_compute[227762]: 2026-01-23 11:06:03.199 227766 DEBUG oslo_concurrency.processutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 3.880s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:06:03 np0005593234 nova_compute[227762]: 2026-01-23 11:06:03.285 227766 DEBUG nova.storage.rbd_utils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] resizing rbd image dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 06:06:03 np0005593234 nova_compute[227762]: 2026-01-23 11:06:03.837 227766 DEBUG nova.objects.instance [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'migration_context' on Instance uuid dbc61477-df97-4232-9a9e-5b81e6b9b29f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:06:03 np0005593234 nova_compute[227762]: 2026-01-23 11:06:03.885 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 06:06:03 np0005593234 nova_compute[227762]: 2026-01-23 11:06:03.886 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Ensure instance console log exists: /var/lib/nova/instances/dbc61477-df97-4232-9a9e-5b81e6b9b29f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 06:06:03 np0005593234 nova_compute[227762]: 2026-01-23 11:06:03.886 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:03 np0005593234 nova_compute[227762]: 2026-01-23 11:06:03.886 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:03 np0005593234 nova_compute[227762]: 2026-01-23 11:06:03.887 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:04.098 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #208. Immutable memtables: 0.
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.226079) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 133] Flushing memtable with next log file: 208
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166364226308, "job": 133, "event": "flush_started", "num_memtables": 1, "num_entries": 1407, "num_deletes": 258, "total_data_size": 3164725, "memory_usage": 3222544, "flush_reason": "Manual Compaction"}
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 133] Level-0 flush table #209: started
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.240 227766 DEBUG nova.network.neutron [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Updating instance_info_cache with network_info: [{"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166364241886, "cf_name": "default", "job": 133, "event": "table_file_creation", "file_number": 209, "file_size": 2056536, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 99495, "largest_seqno": 100897, "table_properties": {"data_size": 2050540, "index_size": 3262, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1669, "raw_key_size": 12993, "raw_average_key_size": 19, "raw_value_size": 2038280, "raw_average_value_size": 3107, "num_data_blocks": 144, "num_entries": 656, "num_filter_entries": 656, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166253, "oldest_key_time": 1769166253, "file_creation_time": 1769166364, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 209, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 133] Flush lasted 15850 microseconds, and 7287 cpu microseconds.
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.241999) [db/flush_job.cc:967] [default] [JOB 133] Level-0 flush table #209: 2056536 bytes OK
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.242039) [db/memtable_list.cc:519] [default] Level-0 commit table #209 started
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.244099) [db/memtable_list.cc:722] [default] Level-0 commit table #209: memtable #1 done
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.244126) EVENT_LOG_v1 {"time_micros": 1769166364244118, "job": 133, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.244149) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 133] Try to delete WAL files size 3158050, prev total WAL file size 3174187, number of live WAL files 2.
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000205.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.245925) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303231' seq:72057594037927935, type:22 .. '6C6F676D0034323734' seq:0, type:0; will stop at (end)
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 134] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 133 Base level 0, inputs: [209(2008KB)], [207(13MB)]
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166364246219, "job": 134, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [209], "files_L6": [207], "score": -1, "input_data_size": 16374965, "oldest_snapshot_seqno": -1}
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.261 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.261 227766 DEBUG nova.compute.manager [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Instance network_info: |[{"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.262 227766 DEBUG oslo_concurrency.lockutils [req-72ac6450-8fae-46fa-a6bb-8912f1198d6c req-0248db9a-aa26-47d4-8500-e47d5ddf8e52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.263 227766 DEBUG nova.network.neutron [req-72ac6450-8fae-46fa-a6bb-8912f1198d6c req-0248db9a-aa26-47d4-8500-e47d5ddf8e52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Refreshing network info cache for port 061c373f-1971-4439-80c0-2491edc62fc7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.269 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Start _get_guest_xml network_info=[{"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.276 227766 WARNING nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.289 227766 DEBUG nova.virt.libvirt.host [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.290 227766 DEBUG nova.virt.libvirt.host [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.294 227766 DEBUG nova.virt.libvirt.host [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.295 227766 DEBUG nova.virt.libvirt.host [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.297 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.297 227766 DEBUG nova.virt.hardware [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.298 227766 DEBUG nova.virt.hardware [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.298 227766 DEBUG nova.virt.hardware [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.298 227766 DEBUG nova.virt.hardware [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.299 227766 DEBUG nova.virt.hardware [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.299 227766 DEBUG nova.virt.hardware [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.299 227766 DEBUG nova.virt.hardware [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.300 227766 DEBUG nova.virt.hardware [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.300 227766 DEBUG nova.virt.hardware [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.300 227766 DEBUG nova.virt.hardware [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.301 227766 DEBUG nova.virt.hardware [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.304 227766 DEBUG oslo_concurrency.processutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 134] Generated table #210: 11736 keys, 16240273 bytes, temperature: kUnknown
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166364337018, "cf_name": "default", "job": 134, "event": "table_file_creation", "file_number": 210, "file_size": 16240273, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16163577, "index_size": 46247, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29381, "raw_key_size": 310023, "raw_average_key_size": 26, "raw_value_size": 15957706, "raw_average_value_size": 1359, "num_data_blocks": 1766, "num_entries": 11736, "num_filter_entries": 11736, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769166364, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 210, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.337540) [db/compaction/compaction_job.cc:1663] [default] [JOB 134] Compacted 1@0 + 1@6 files to L6 => 16240273 bytes
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.339134) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 180.1 rd, 178.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 13.7 +0.0 blob) out(15.5 +0.0 blob), read-write-amplify(15.9) write-amplify(7.9) OK, records in: 12271, records dropped: 535 output_compression: NoCompression
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.339167) EVENT_LOG_v1 {"time_micros": 1769166364339152, "job": 134, "event": "compaction_finished", "compaction_time_micros": 90937, "compaction_time_cpu_micros": 44350, "output_level": 6, "num_output_files": 1, "total_output_size": 16240273, "num_input_records": 12271, "num_output_records": 11736, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000209.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166364340107, "job": 134, "event": "table_file_deletion", "file_number": 209}
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000207.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166364344708, "job": 134, "event": "table_file_deletion", "file_number": 207}
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.245680) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.344828) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.344836) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.344839) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.344841) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:04.344844) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:06:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1911848793' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.748 227766 DEBUG oslo_concurrency.processutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.777 227766 DEBUG nova.storage.rbd_utils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:06:04 np0005593234 nova_compute[227762]: 2026-01-23 11:06:04.783 227766 DEBUG oslo_concurrency.processutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:06:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:05.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:06:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3270002167' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.217 227766 DEBUG oslo_concurrency.processutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.220 227766 DEBUG nova.virt.libvirt.vif [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:05:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1130318724',display_name='tempest-TestNetworkAdvancedServerOps-server-1130318724',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1130318724',id=217,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDTi5YUYEUSCCRpABwmlOECwIPF3oJc744XKav7Se2iIl6x1TaLG6GzrgU0yBFa3lXoulXBlqxuqmTORiSzCeTVMC0OVafcQCaIT9wfjlg14116foZdvXXpNovDttARlJQ==',key_name='tempest-TestNetworkAdvancedServerOps-191544259',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-367vmfka',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:05:59Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=dbc61477-df97-4232-9a9e-5b81e6b9b29f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.220 227766 DEBUG nova.network.os_vif_util [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.222 227766 DEBUG nova.network.os_vif_util [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:c3:ba,bridge_name='br-int',has_traffic_filtering=True,id=061c373f-1971-4439-80c0-2491edc62fc7,network=Network(fc54b072-b184-4226-a773-66ebba149a9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap061c373f-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.223 227766 DEBUG nova.objects.instance [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_devices' on Instance uuid dbc61477-df97-4232-9a9e-5b81e6b9b29f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.342 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] End _get_guest_xml xml=<domain type="kvm">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  <uuid>dbc61477-df97-4232-9a9e-5b81e6b9b29f</uuid>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  <name>instance-000000d9</name>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1130318724</nova:name>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 11:06:04</nova:creationTime>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <nova:user uuid="420c366dc5dc45a48da4e0b18c93043f">tempest-TestNetworkAdvancedServerOps-1886747874-project-member</nova:user>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <nova:project uuid="c06f98b51aeb48de91d116fda54a161f">tempest-TestNetworkAdvancedServerOps-1886747874</nova:project>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <nova:port uuid="061c373f-1971-4439-80c0-2491edc62fc7">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <system>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <entry name="serial">dbc61477-df97-4232-9a9e-5b81e6b9b29f</entry>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <entry name="uuid">dbc61477-df97-4232-9a9e-5b81e6b9b29f</entry>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    </system>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  <os>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  </os>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  <features>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  </features>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  </clock>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  <devices>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk.config">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:65:c3:ba"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <target dev="tap061c373f-19"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    </interface>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/dbc61477-df97-4232-9a9e-5b81e6b9b29f/console.log" append="off"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    </serial>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <video>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    </video>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    </rng>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 06:06:05 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 06:06:05 np0005593234 nova_compute[227762]:  </devices>
Jan 23 06:06:05 np0005593234 nova_compute[227762]: </domain>
Jan 23 06:06:05 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.344 227766 DEBUG nova.compute.manager [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Preparing to wait for external event network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.345 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.345 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.346 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.346 227766 DEBUG nova.virt.libvirt.vif [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:05:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1130318724',display_name='tempest-TestNetworkAdvancedServerOps-server-1130318724',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1130318724',id=217,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDTi5YUYEUSCCRpABwmlOECwIPF3oJc744XKav7Se2iIl6x1TaLG6GzrgU0yBFa3lXoulXBlqxuqmTORiSzCeTVMC0OVafcQCaIT9wfjlg14116foZdvXXpNovDttARlJQ==',key_name='tempest-TestNetworkAdvancedServerOps-191544259',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-367vmfka',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:05:59Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=dbc61477-df97-4232-9a9e-5b81e6b9b29f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.346 227766 DEBUG nova.network.os_vif_util [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.347 227766 DEBUG nova.network.os_vif_util [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:c3:ba,bridge_name='br-int',has_traffic_filtering=True,id=061c373f-1971-4439-80c0-2491edc62fc7,network=Network(fc54b072-b184-4226-a773-66ebba149a9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap061c373f-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.347 227766 DEBUG os_vif [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:c3:ba,bridge_name='br-int',has_traffic_filtering=True,id=061c373f-1971-4439-80c0-2491edc62fc7,network=Network(fc54b072-b184-4226-a773-66ebba149a9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap061c373f-19') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.348 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.348 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.349 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.352 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.352 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap061c373f-19, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.353 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap061c373f-19, col_values=(('external_ids', {'iface-id': '061c373f-1971-4439-80c0-2491edc62fc7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:c3:ba', 'vm-uuid': 'dbc61477-df97-4232-9a9e-5b81e6b9b29f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:05 np0005593234 NetworkManager[48942]: <info>  [1769166365.3550] manager: (tap061c373f-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/471)
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.354 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.357 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.362 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:05 np0005593234 nova_compute[227762]: 2026-01-23 11:06:05.362 227766 INFO os_vif [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:c3:ba,bridge_name='br-int',has_traffic_filtering=True,id=061c373f-1971-4439-80c0-2491edc62fc7,network=Network(fc54b072-b184-4226-a773-66ebba149a9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap061c373f-19')#033[00m
Jan 23 06:06:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:06:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:06.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:06:06 np0005593234 nova_compute[227762]: 2026-01-23 11:06:06.111 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:06:06 np0005593234 nova_compute[227762]: 2026-01-23 11:06:06.112 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:06:06 np0005593234 nova_compute[227762]: 2026-01-23 11:06:06.112 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No VIF found with MAC fa:16:3e:65:c3:ba, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 06:06:06 np0005593234 nova_compute[227762]: 2026-01-23 11:06:06.113 227766 INFO nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Using config drive#033[00m
Jan 23 06:06:06 np0005593234 nova_compute[227762]: 2026-01-23 11:06:06.151 227766 DEBUG nova.storage.rbd_utils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:06:06 np0005593234 nova_compute[227762]: 2026-01-23 11:06:06.396 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:07.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:07 np0005593234 nova_compute[227762]: 2026-01-23 11:06:07.143 227766 INFO nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Creating config drive at /var/lib/nova/instances/dbc61477-df97-4232-9a9e-5b81e6b9b29f/disk.config#033[00m
Jan 23 06:06:07 np0005593234 nova_compute[227762]: 2026-01-23 11:06:07.148 227766 DEBUG oslo_concurrency.processutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/dbc61477-df97-4232-9a9e-5b81e6b9b29f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpun6lm5j1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:06:07 np0005593234 nova_compute[227762]: 2026-01-23 11:06:07.280 227766 DEBUG oslo_concurrency.processutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/dbc61477-df97-4232-9a9e-5b81e6b9b29f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpun6lm5j1" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:06:07 np0005593234 nova_compute[227762]: 2026-01-23 11:06:07.326 227766 DEBUG nova.storage.rbd_utils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:06:07 np0005593234 nova_compute[227762]: 2026-01-23 11:06:07.331 227766 DEBUG oslo_concurrency.processutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/dbc61477-df97-4232-9a9e-5b81e6b9b29f/disk.config dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:06:07 np0005593234 nova_compute[227762]: 2026-01-23 11:06:07.590 227766 DEBUG nova.network.neutron [req-72ac6450-8fae-46fa-a6bb-8912f1198d6c req-0248db9a-aa26-47d4-8500-e47d5ddf8e52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Updated VIF entry in instance network info cache for port 061c373f-1971-4439-80c0-2491edc62fc7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:06:07 np0005593234 nova_compute[227762]: 2026-01-23 11:06:07.591 227766 DEBUG nova.network.neutron [req-72ac6450-8fae-46fa-a6bb-8912f1198d6c req-0248db9a-aa26-47d4-8500-e47d5ddf8e52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Updating instance_info_cache with network_info: [{"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:06:07 np0005593234 nova_compute[227762]: 2026-01-23 11:06:07.614 227766 DEBUG oslo_concurrency.lockutils [req-72ac6450-8fae-46fa-a6bb-8912f1198d6c req-0248db9a-aa26-47d4-8500-e47d5ddf8e52 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:06:07 np0005593234 nova_compute[227762]: 2026-01-23 11:06:07.799 227766 DEBUG oslo_concurrency.processutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/dbc61477-df97-4232-9a9e-5b81e6b9b29f/disk.config dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:06:07 np0005593234 nova_compute[227762]: 2026-01-23 11:06:07.800 227766 INFO nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Deleting local config drive /var/lib/nova/instances/dbc61477-df97-4232-9a9e-5b81e6b9b29f/disk.config because it was imported into RBD.#033[00m
Jan 23 06:06:07 np0005593234 kernel: tap061c373f-19: entered promiscuous mode
Jan 23 06:06:07 np0005593234 NetworkManager[48942]: <info>  [1769166367.8580] manager: (tap061c373f-19): new Tun device (/org/freedesktop/NetworkManager/Devices/472)
Jan 23 06:06:07 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:07Z|00984|binding|INFO|Claiming lport 061c373f-1971-4439-80c0-2491edc62fc7 for this chassis.
Jan 23 06:06:07 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:07Z|00985|binding|INFO|061c373f-1971-4439-80c0-2491edc62fc7: Claiming fa:16:3e:65:c3:ba 10.100.0.5
Jan 23 06:06:07 np0005593234 nova_compute[227762]: 2026-01-23 11:06:07.860 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:07 np0005593234 nova_compute[227762]: 2026-01-23 11:06:07.868 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:07 np0005593234 nova_compute[227762]: 2026-01-23 11:06:07.873 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:07 np0005593234 systemd-udevd[343873]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:06:07 np0005593234 systemd-machined[195626]: New machine qemu-110-instance-000000d9.
Jan 23 06:06:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:07 np0005593234 NetworkManager[48942]: <info>  [1769166367.9100] device (tap061c373f-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:06:07 np0005593234 NetworkManager[48942]: <info>  [1769166367.9107] device (tap061c373f-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:06:07 np0005593234 systemd[1]: Started Virtual Machine qemu-110-instance-000000d9.
Jan 23 06:06:07 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:07Z|00986|binding|INFO|Setting lport 061c373f-1971-4439-80c0-2491edc62fc7 ovn-installed in OVS
Jan 23 06:06:07 np0005593234 nova_compute[227762]: 2026-01-23 11:06:07.951 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:07 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:07Z|00987|binding|INFO|Setting lport 061c373f-1971-4439-80c0-2491edc62fc7 up in Southbound
Jan 23 06:06:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:07.988 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:c3:ba 10.100.0.5'], port_security=['fa:16:3e:65:c3:ba 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'dbc61477-df97-4232-9a9e-5b81e6b9b29f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc54b072-b184-4226-a773-66ebba149a9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ced6954a-63e2-476d-b999-70afa5e07339', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8aecfd0-07e8-49d5-aa82-be340c0b5653, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=061c373f-1971-4439-80c0-2491edc62fc7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:06:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:07.989 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 061c373f-1971-4439-80c0-2491edc62fc7 in datapath fc54b072-b184-4226-a773-66ebba149a9c bound to our chassis#033[00m
Jan 23 06:06:07 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:07.990 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc54b072-b184-4226-a773-66ebba149a9c#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.001 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ad8bfee7-1292-4faf-8b75-ec83ef6d8aeb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.001 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfc54b072-b1 in ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.003 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfc54b072-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.003 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[100d69e7-dd61-49d9-b857-b499a8673e96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.004 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[903a8754-35a5-4a86-a8da-b04d3e3b42b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.017 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[6e03fed5-f917-479c-a84c-6eb367901be3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.039 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c519c694-90e3-4167-ba05-cc54723dbcd3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.066 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7ca97ab7-356b-4d5c-8af4-7693a09876df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 NetworkManager[48942]: <info>  [1769166368.0742] manager: (tapfc54b072-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/473)
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.072 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e8d3a9c8-81ab-43ac-ba92-965d769d4237]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 systemd-udevd[343876]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:06:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:08.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.112 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[6edd8865-f87b-4b20-a555-da0faeb58aaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.118 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[36b71615-bd5f-4003-a28a-f0511ced2ada]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 NetworkManager[48942]: <info>  [1769166368.1397] device (tapfc54b072-b0): carrier: link connected
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.146 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f1dfa96b-0095-4d1d-ad2e-a1289c2ae787]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.163 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cf0692c4-dc7f-4c80-b602-42f718a3de43]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc54b072-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:50:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 303], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1026309, 'reachable_time': 17376, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 343909, 'error': None, 'target': 'ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.178 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[34b57a78-b496-4dfc-9b50-bf1ce86ada44]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1f:5052'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1026309, 'tstamp': 1026309}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 343910, 'error': None, 'target': 'ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.195 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[218cca59-9fdf-4a7e-99b5-ea1b9920503c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc54b072-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:50:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 303], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1026309, 'reachable_time': 17376, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 343911, 'error': None, 'target': 'ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.229 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[79c9082a-fc86-414a-91ad-d4ed4dbe9a9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.284 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d7946f52-6bdd-4d9e-8b76-7e93e659dddb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.286 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc54b072-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.287 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.288 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc54b072-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:08 np0005593234 kernel: tapfc54b072-b0: entered promiscuous mode
Jan 23 06:06:08 np0005593234 NetworkManager[48942]: <info>  [1769166368.2911] manager: (tapfc54b072-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/474)
Jan 23 06:06:08 np0005593234 nova_compute[227762]: 2026-01-23 11:06:08.292 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.299 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc54b072-b0, col_values=(('external_ids', {'iface-id': 'ff96d808-47bc-43bf-ad0e-b336caa55e2e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:08 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:08Z|00988|binding|INFO|Releasing lport ff96d808-47bc-43bf-ad0e-b336caa55e2e from this chassis (sb_readonly=0)
Jan 23 06:06:08 np0005593234 nova_compute[227762]: 2026-01-23 11:06:08.301 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.305 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fc54b072-b184-4226-a773-66ebba149a9c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fc54b072-b184-4226-a773-66ebba149a9c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.306 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5c6fe067-b05c-47d3-8a67-2e71e9020701]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.307 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-fc54b072-b184-4226-a773-66ebba149a9c
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/fc54b072-b184-4226-a773-66ebba149a9c.pid.haproxy
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID fc54b072-b184-4226-a773-66ebba149a9c
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 06:06:08 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:08.308 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c', 'env', 'PROCESS_TAG=haproxy-fc54b072-b184-4226-a773-66ebba149a9c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fc54b072-b184-4226-a773-66ebba149a9c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 06:06:08 np0005593234 nova_compute[227762]: 2026-01-23 11:06:08.314 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:08 np0005593234 podman[343960]: 2026-01-23 11:06:08.665606855 +0000 UTC m=+0.046265810 container create 06b60935a5cbef177b5247ed8c437e53a6c09f4745e8706f78a013bd83c6edd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:06:08 np0005593234 systemd[1]: Started libpod-conmon-06b60935a5cbef177b5247ed8c437e53a6c09f4745e8706f78a013bd83c6edd0.scope.
Jan 23 06:06:08 np0005593234 systemd[1]: Started libcrun container.
Jan 23 06:06:08 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/976c6f54da9ce063fadd65d56f7bab4e871526ac68c84f44aa7b3183729e376f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:06:08 np0005593234 podman[343960]: 2026-01-23 11:06:08.641792439 +0000 UTC m=+0.022451404 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:06:08 np0005593234 podman[343960]: 2026-01-23 11:06:08.742244664 +0000 UTC m=+0.122903629 container init 06b60935a5cbef177b5247ed8c437e53a6c09f4745e8706f78a013bd83c6edd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:06:08 np0005593234 podman[343960]: 2026-01-23 11:06:08.748425436 +0000 UTC m=+0.129084381 container start 06b60935a5cbef177b5247ed8c437e53a6c09f4745e8706f78a013bd83c6edd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 06:06:08 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[343975]: [NOTICE]   (343979) : New worker (343981) forked
Jan 23 06:06:08 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[343975]: [NOTICE]   (343979) : Loading success.
Jan 23 06:06:08 np0005593234 nova_compute[227762]: 2026-01-23 11:06:08.992 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166368.9911551, dbc61477-df97-4232-9a9e-5b81e6b9b29f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:06:08 np0005593234 nova_compute[227762]: 2026-01-23 11:06:08.992 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] VM Started (Lifecycle Event)#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.026 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.030 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166368.9914782, dbc61477-df97-4232-9a9e-5b81e6b9b29f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.030 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] VM Paused (Lifecycle Event)#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.070 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.074 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.094 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:06:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:09.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.158 227766 DEBUG nova.compute.manager [req-1291d1dd-c971-4873-9cfa-9b508e1acfc5 req-5f29e7c7-2a87-4cf0-b52a-6f5f14e0359d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received event network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.159 227766 DEBUG oslo_concurrency.lockutils [req-1291d1dd-c971-4873-9cfa-9b508e1acfc5 req-5f29e7c7-2a87-4cf0-b52a-6f5f14e0359d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.159 227766 DEBUG oslo_concurrency.lockutils [req-1291d1dd-c971-4873-9cfa-9b508e1acfc5 req-5f29e7c7-2a87-4cf0-b52a-6f5f14e0359d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.159 227766 DEBUG oslo_concurrency.lockutils [req-1291d1dd-c971-4873-9cfa-9b508e1acfc5 req-5f29e7c7-2a87-4cf0-b52a-6f5f14e0359d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.159 227766 DEBUG nova.compute.manager [req-1291d1dd-c971-4873-9cfa-9b508e1acfc5 req-5f29e7c7-2a87-4cf0-b52a-6f5f14e0359d 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Processing event network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.160 227766 DEBUG nova.compute.manager [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.164 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166369.1631515, dbc61477-df97-4232-9a9e-5b81e6b9b29f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.164 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.165 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.168 227766 INFO nova.virt.libvirt.driver [-] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Instance spawned successfully.#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.168 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.204 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.208 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.209 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.209 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.209 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.210 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.210 227766 DEBUG nova.virt.libvirt.driver [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.213 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.248 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.283 227766 INFO nova.compute.manager [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Took 10.19 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.284 227766 DEBUG nova.compute.manager [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.401 227766 INFO nova.compute.manager [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Took 11.92 seconds to build instance.#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.421 227766 DEBUG oslo_concurrency.lockutils [None req-0e46decb-c682-4f0a-af6c-180ef932e7a9 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.960 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.966 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:09.966 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=99, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=98) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:06:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:09.968 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.998 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Triggering sync for uuid dbc61477-df97-4232-9a9e-5b81e6b9b29f _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.999 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:09 np0005593234 nova_compute[227762]: 2026-01-23 11:06:09.999 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:10 np0005593234 nova_compute[227762]: 2026-01-23 11:06:10.044 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.045s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:10.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:10 np0005593234 nova_compute[227762]: 2026-01-23 11:06:10.354 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:11.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:11 np0005593234 nova_compute[227762]: 2026-01-23 11:06:11.396 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:11 np0005593234 nova_compute[227762]: 2026-01-23 11:06:11.443 227766 DEBUG nova.compute.manager [req-23203b14-73e6-40e0-b7a0-c71a00d05f44 req-37819d07-1426-4a5e-923c-54118462f6a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received event network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:06:11 np0005593234 nova_compute[227762]: 2026-01-23 11:06:11.444 227766 DEBUG oslo_concurrency.lockutils [req-23203b14-73e6-40e0-b7a0-c71a00d05f44 req-37819d07-1426-4a5e-923c-54118462f6a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:11 np0005593234 nova_compute[227762]: 2026-01-23 11:06:11.444 227766 DEBUG oslo_concurrency.lockutils [req-23203b14-73e6-40e0-b7a0-c71a00d05f44 req-37819d07-1426-4a5e-923c-54118462f6a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:11 np0005593234 nova_compute[227762]: 2026-01-23 11:06:11.445 227766 DEBUG oslo_concurrency.lockutils [req-23203b14-73e6-40e0-b7a0-c71a00d05f44 req-37819d07-1426-4a5e-923c-54118462f6a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:11 np0005593234 nova_compute[227762]: 2026-01-23 11:06:11.445 227766 DEBUG nova.compute.manager [req-23203b14-73e6-40e0-b7a0-c71a00d05f44 req-37819d07-1426-4a5e-923c-54118462f6a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] No waiting events found dispatching network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:06:11 np0005593234 nova_compute[227762]: 2026-01-23 11:06:11.446 227766 WARNING nova.compute.manager [req-23203b14-73e6-40e0-b7a0-c71a00d05f44 req-37819d07-1426-4a5e-923c-54118462f6a6 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received unexpected event network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 for instance with vm_state active and task_state None.#033[00m
Jan 23 06:06:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:12.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:12 np0005593234 podman[344016]: 2026-01-23 11:06:12.760638672 +0000 UTC m=+0.055298021 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 06:06:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:13.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:06:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:14.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:06:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:15.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:15 np0005593234 nova_compute[227762]: 2026-01-23 11:06:15.356 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:16.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:16 np0005593234 nova_compute[227762]: 2026-01-23 11:06:16.398 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:16 np0005593234 nova_compute[227762]: 2026-01-23 11:06:16.877 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:16 np0005593234 NetworkManager[48942]: <info>  [1769166376.8778] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/475)
Jan 23 06:06:16 np0005593234 NetworkManager[48942]: <info>  [1769166376.8794] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/476)
Jan 23 06:06:16 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:16Z|00989|binding|INFO|Releasing lport ff96d808-47bc-43bf-ad0e-b336caa55e2e from this chassis (sb_readonly=0)
Jan 23 06:06:16 np0005593234 nova_compute[227762]: 2026-01-23 11:06:16.925 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:16 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:16Z|00990|binding|INFO|Releasing lport ff96d808-47bc-43bf-ad0e-b336caa55e2e from this chassis (sb_readonly=0)
Jan 23 06:06:16 np0005593234 nova_compute[227762]: 2026-01-23 11:06:16.931 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:17.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:17 np0005593234 nova_compute[227762]: 2026-01-23 11:06:17.414 227766 DEBUG nova.compute.manager [req-8aaf546a-de28-4a36-b823-f9bb1fb0b2f1 req-e5d06d8c-c522-48ac-b512-f3f2be0160a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received event network-changed-061c373f-1971-4439-80c0-2491edc62fc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:06:17 np0005593234 nova_compute[227762]: 2026-01-23 11:06:17.416 227766 DEBUG nova.compute.manager [req-8aaf546a-de28-4a36-b823-f9bb1fb0b2f1 req-e5d06d8c-c522-48ac-b512-f3f2be0160a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Refreshing instance network info cache due to event network-changed-061c373f-1971-4439-80c0-2491edc62fc7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:06:17 np0005593234 nova_compute[227762]: 2026-01-23 11:06:17.416 227766 DEBUG oslo_concurrency.lockutils [req-8aaf546a-de28-4a36-b823-f9bb1fb0b2f1 req-e5d06d8c-c522-48ac-b512-f3f2be0160a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:06:17 np0005593234 nova_compute[227762]: 2026-01-23 11:06:17.416 227766 DEBUG oslo_concurrency.lockutils [req-8aaf546a-de28-4a36-b823-f9bb1fb0b2f1 req-e5d06d8c-c522-48ac-b512-f3f2be0160a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:06:17 np0005593234 nova_compute[227762]: 2026-01-23 11:06:17.416 227766 DEBUG nova.network.neutron [req-8aaf546a-de28-4a36-b823-f9bb1fb0b2f1 req-e5d06d8c-c522-48ac-b512-f3f2be0160a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Refreshing network info cache for port 061c373f-1971-4439-80c0-2491edc62fc7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:06:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:06:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:18.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:06:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:19.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:19 np0005593234 podman[344089]: 2026-01-23 11:06:19.949411621 +0000 UTC m=+0.133986574 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.license=GPLv2)
Jan 23 06:06:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:19.970 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '99'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:20.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:20 np0005593234 nova_compute[227762]: 2026-01-23 11:06:20.357 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:20 np0005593234 nova_compute[227762]: 2026-01-23 11:06:20.800 227766 DEBUG nova.network.neutron [req-8aaf546a-de28-4a36-b823-f9bb1fb0b2f1 req-e5d06d8c-c522-48ac-b512-f3f2be0160a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Updated VIF entry in instance network info cache for port 061c373f-1971-4439-80c0-2491edc62fc7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:06:20 np0005593234 nova_compute[227762]: 2026-01-23 11:06:20.801 227766 DEBUG nova.network.neutron [req-8aaf546a-de28-4a36-b823-f9bb1fb0b2f1 req-e5d06d8c-c522-48ac-b512-f3f2be0160a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Updating instance_info_cache with network_info: [{"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:06:20 np0005593234 nova_compute[227762]: 2026-01-23 11:06:20.837 227766 DEBUG oslo_concurrency.lockutils [req-8aaf546a-de28-4a36-b823-f9bb1fb0b2f1 req-e5d06d8c-c522-48ac-b512-f3f2be0160a5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:06:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:21.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:21 np0005593234 nova_compute[227762]: 2026-01-23 11:06:21.399 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:22.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:23.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:23 np0005593234 nova_compute[227762]: 2026-01-23 11:06:23.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:23 np0005593234 nova_compute[227762]: 2026-01-23 11:06:23.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:23 np0005593234 nova_compute[227762]: 2026-01-23 11:06:23.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:23 np0005593234 nova_compute[227762]: 2026-01-23 11:06:23.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:23 np0005593234 nova_compute[227762]: 2026-01-23 11:06:23.774 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:06:23 np0005593234 nova_compute[227762]: 2026-01-23 11:06:23.774 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #211. Immutable memtables: 0.
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.053840) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 135] Flushing memtable with next log file: 211
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166384053979, "job": 135, "event": "flush_started", "num_memtables": 1, "num_entries": 436, "num_deletes": 256, "total_data_size": 556064, "memory_usage": 565184, "flush_reason": "Manual Compaction"}
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 135] Level-0 flush table #212: started
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166384058160, "cf_name": "default", "job": 135, "event": "table_file_creation", "file_number": 212, "file_size": 290117, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 100902, "largest_seqno": 101333, "table_properties": {"data_size": 287744, "index_size": 472, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 837, "raw_key_size": 6387, "raw_average_key_size": 20, "raw_value_size": 283030, "raw_average_value_size": 898, "num_data_blocks": 21, "num_entries": 315, "num_filter_entries": 315, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166364, "oldest_key_time": 1769166364, "file_creation_time": 1769166384, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 212, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 135] Flush lasted 4308 microseconds, and 1712 cpu microseconds.
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.058199) [db/flush_job.cc:967] [default] [JOB 135] Level-0 flush table #212: 290117 bytes OK
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.058214) [db/memtable_list.cc:519] [default] Level-0 commit table #212 started
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.060264) [db/memtable_list.cc:722] [default] Level-0 commit table #212: memtable #1 done
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.060275) EVENT_LOG_v1 {"time_micros": 1769166384060271, "job": 135, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.060290) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 135] Try to delete WAL files size 553308, prev total WAL file size 553308, number of live WAL files 2.
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000208.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.060783) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033353136' seq:72057594037927935, type:22 .. '6D6772737461740033373733' seq:0, type:0; will stop at (end)
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 136] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 135 Base level 0, inputs: [212(283KB)], [210(15MB)]
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166384061036, "job": 136, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [212], "files_L6": [210], "score": -1, "input_data_size": 16530390, "oldest_snapshot_seqno": -1}
Jan 23 06:06:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:24.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:24 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:24Z|00130|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:65:c3:ba 10.100.0.5
Jan 23 06:06:24 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:24Z|00131|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:65:c3:ba 10.100.0.5
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 136] Generated table #213: 11536 keys, 12675958 bytes, temperature: kUnknown
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166384155743, "cf_name": "default", "job": 136, "event": "table_file_creation", "file_number": 213, "file_size": 12675958, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12605316, "index_size": 40705, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28869, "raw_key_size": 306050, "raw_average_key_size": 26, "raw_value_size": 12407657, "raw_average_value_size": 1075, "num_data_blocks": 1535, "num_entries": 11536, "num_filter_entries": 11536, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769166384, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 213, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.156111) [db/compaction/compaction_job.cc:1663] [default] [JOB 136] Compacted 1@0 + 1@6 files to L6 => 12675958 bytes
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.157602) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 174.2 rd, 133.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 15.5 +0.0 blob) out(12.1 +0.0 blob), read-write-amplify(100.7) write-amplify(43.7) OK, records in: 12051, records dropped: 515 output_compression: NoCompression
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.157618) EVENT_LOG_v1 {"time_micros": 1769166384157610, "job": 136, "event": "compaction_finished", "compaction_time_micros": 94904, "compaction_time_cpu_micros": 32877, "output_level": 6, "num_output_files": 1, "total_output_size": 12675958, "num_input_records": 12051, "num_output_records": 11536, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000212.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166384158045, "job": 136, "event": "table_file_deletion", "file_number": 212}
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000210.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166384160696, "job": 136, "event": "table_file_deletion", "file_number": 210}
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.060642) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.160894) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.160901) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.160904) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.160906) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:06:24.160907) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:06:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3742826252' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:06:24 np0005593234 nova_compute[227762]: 2026-01-23 11:06:24.309 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.534s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:06:24 np0005593234 nova_compute[227762]: 2026-01-23 11:06:24.401 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000d9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 06:06:24 np0005593234 nova_compute[227762]: 2026-01-23 11:06:24.402 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000d9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 06:06:24 np0005593234 nova_compute[227762]: 2026-01-23 11:06:24.546 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:06:24 np0005593234 nova_compute[227762]: 2026-01-23 11:06:24.547 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3917MB free_disk=20.949779510498047GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:06:24 np0005593234 nova_compute[227762]: 2026-01-23 11:06:24.548 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:24 np0005593234 nova_compute[227762]: 2026-01-23 11:06:24.548 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:24 np0005593234 nova_compute[227762]: 2026-01-23 11:06:24.705 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance dbc61477-df97-4232-9a9e-5b81e6b9b29f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 06:06:24 np0005593234 nova_compute[227762]: 2026-01-23 11:06:24.706 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:06:24 np0005593234 nova_compute[227762]: 2026-01-23 11:06:24.706 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:06:24 np0005593234 nova_compute[227762]: 2026-01-23 11:06:24.746 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:06:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:25.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:06:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1191182869' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:06:25 np0005593234 nova_compute[227762]: 2026-01-23 11:06:25.175 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:06:25 np0005593234 nova_compute[227762]: 2026-01-23 11:06:25.181 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:06:25 np0005593234 nova_compute[227762]: 2026-01-23 11:06:25.201 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:06:25 np0005593234 nova_compute[227762]: 2026-01-23 11:06:25.359 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:25 np0005593234 nova_compute[227762]: 2026-01-23 11:06:25.378 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:06:25 np0005593234 nova_compute[227762]: 2026-01-23 11:06:25.378 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.831s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:06:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:26.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:06:26 np0005593234 nova_compute[227762]: 2026-01-23 11:06:26.401 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:27.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:28.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:29.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:30 np0005593234 nova_compute[227762]: 2026-01-23 11:06:30.111 227766 INFO nova.compute.manager [None req-79b40463-4f55-4427-b4f9-a88a35a1558c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Get console output#033[00m
Jan 23 06:06:30 np0005593234 nova_compute[227762]: 2026-01-23 11:06:30.119 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 06:06:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:30.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:30 np0005593234 nova_compute[227762]: 2026-01-23 11:06:30.360 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:30 np0005593234 nova_compute[227762]: 2026-01-23 11:06:30.378 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:30 np0005593234 nova_compute[227762]: 2026-01-23 11:06:30.379 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:06:30 np0005593234 nova_compute[227762]: 2026-01-23 11:06:30.379 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:06:30 np0005593234 nova_compute[227762]: 2026-01-23 11:06:30.569 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:06:30 np0005593234 nova_compute[227762]: 2026-01-23 11:06:30.569 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:06:30 np0005593234 nova_compute[227762]: 2026-01-23 11:06:30.570 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 06:06:30 np0005593234 nova_compute[227762]: 2026-01-23 11:06:30.570 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid dbc61477-df97-4232-9a9e-5b81e6b9b29f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:06:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:06:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:31.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:06:31 np0005593234 nova_compute[227762]: 2026-01-23 11:06:31.436 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:31 np0005593234 nova_compute[227762]: 2026-01-23 11:06:31.705 227766 DEBUG oslo_concurrency.lockutils [None req-3135ed0b-ba35-4ae1-8957-b0f0e6ca0736 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:31 np0005593234 nova_compute[227762]: 2026-01-23 11:06:31.705 227766 DEBUG oslo_concurrency.lockutils [None req-3135ed0b-ba35-4ae1-8957-b0f0e6ca0736 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f" acquired by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:31 np0005593234 nova_compute[227762]: 2026-01-23 11:06:31.706 227766 DEBUG nova.compute.manager [None req-3135ed0b-ba35-4ae1-8957-b0f0e6ca0736 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:06:31 np0005593234 nova_compute[227762]: 2026-01-23 11:06:31.717 227766 DEBUG nova.compute.manager [None req-3135ed0b-ba35-4ae1-8957-b0f0e6ca0736 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Stopping instance; current vm_state: active, current task_state: powering-off, current DB power_state: 1, current VM power_state: 1 do_stop_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3338#033[00m
Jan 23 06:06:31 np0005593234 nova_compute[227762]: 2026-01-23 11:06:31.718 227766 DEBUG nova.objects.instance [None req-3135ed0b-ba35-4ae1-8957-b0f0e6ca0736 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'flavor' on Instance uuid dbc61477-df97-4232-9a9e-5b81e6b9b29f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:06:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:06:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:32.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:06:32 np0005593234 nova_compute[227762]: 2026-01-23 11:06:32.525 227766 DEBUG nova.virt.libvirt.driver [None req-3135ed0b-ba35-4ae1-8957-b0f0e6ca0736 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 06:06:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:06:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:33.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:06:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:34.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:35 np0005593234 nova_compute[227762]: 2026-01-23 11:06:35.031 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Updating instance_info_cache with network_info: [{"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:06:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:06:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:35.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:06:35 np0005593234 kernel: tap061c373f-19 (unregistering): left promiscuous mode
Jan 23 06:06:35 np0005593234 nova_compute[227762]: 2026-01-23 11:06:35.363 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:35 np0005593234 NetworkManager[48942]: <info>  [1769166395.3668] device (tap061c373f-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:06:35 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:35Z|00991|binding|INFO|Releasing lport 061c373f-1971-4439-80c0-2491edc62fc7 from this chassis (sb_readonly=0)
Jan 23 06:06:35 np0005593234 nova_compute[227762]: 2026-01-23 11:06:35.379 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:35 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:35Z|00992|binding|INFO|Setting lport 061c373f-1971-4439-80c0-2491edc62fc7 down in Southbound
Jan 23 06:06:35 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:35Z|00993|binding|INFO|Removing iface tap061c373f-19 ovn-installed in OVS
Jan 23 06:06:35 np0005593234 nova_compute[227762]: 2026-01-23 11:06:35.383 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:35 np0005593234 nova_compute[227762]: 2026-01-23 11:06:35.401 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:35 np0005593234 systemd[1]: machine-qemu\x2d110\x2dinstance\x2d000000d9.scope: Deactivated successfully.
Jan 23 06:06:35 np0005593234 systemd[1]: machine-qemu\x2d110\x2dinstance\x2d000000d9.scope: Consumed 14.477s CPU time.
Jan 23 06:06:35 np0005593234 systemd-machined[195626]: Machine qemu-110-instance-000000d9 terminated.
Jan 23 06:06:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:35.473 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:c3:ba 10.100.0.5'], port_security=['fa:16:3e:65:c3:ba 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'dbc61477-df97-4232-9a9e-5b81e6b9b29f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc54b072-b184-4226-a773-66ebba149a9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ced6954a-63e2-476d-b999-70afa5e07339', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.247'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8aecfd0-07e8-49d5-aa82-be340c0b5653, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=061c373f-1971-4439-80c0-2491edc62fc7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:06:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:35.474 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 061c373f-1971-4439-80c0-2491edc62fc7 in datapath fc54b072-b184-4226-a773-66ebba149a9c unbound from our chassis#033[00m
Jan 23 06:06:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:35.475 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc54b072-b184-4226-a773-66ebba149a9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:06:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:35.477 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[498f2a15-8831-4b06-a192-1699531e4f1a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:35.478 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c namespace which is not needed anymore#033[00m
Jan 23 06:06:35 np0005593234 nova_compute[227762]: 2026-01-23 11:06:35.488 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:06:35 np0005593234 nova_compute[227762]: 2026-01-23 11:06:35.488 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 06:06:35 np0005593234 nova_compute[227762]: 2026-01-23 11:06:35.488 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:35 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[343975]: [NOTICE]   (343979) : haproxy version is 2.8.14-c23fe91
Jan 23 06:06:35 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[343975]: [NOTICE]   (343979) : path to executable is /usr/sbin/haproxy
Jan 23 06:06:35 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[343975]: [WARNING]  (343979) : Exiting Master process...
Jan 23 06:06:35 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[343975]: [WARNING]  (343979) : Exiting Master process...
Jan 23 06:06:35 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[343975]: [ALERT]    (343979) : Current worker (343981) exited with code 143 (Terminated)
Jan 23 06:06:35 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[343975]: [WARNING]  (343979) : All workers exited. Exiting... (0)
Jan 23 06:06:35 np0005593234 systemd[1]: libpod-06b60935a5cbef177b5247ed8c437e53a6c09f4745e8706f78a013bd83c6edd0.scope: Deactivated successfully.
Jan 23 06:06:35 np0005593234 podman[344196]: 2026-01-23 11:06:35.600839897 +0000 UTC m=+0.041706956 container died 06b60935a5cbef177b5247ed8c437e53a6c09f4745e8706f78a013bd83c6edd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:06:35 np0005593234 nova_compute[227762]: 2026-01-23 11:06:35.613 227766 INFO nova.virt.libvirt.driver [None req-3135ed0b-ba35-4ae1-8957-b0f0e6ca0736 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 06:06:35 np0005593234 nova_compute[227762]: 2026-01-23 11:06:35.620 227766 INFO nova.virt.libvirt.driver [-] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Instance destroyed successfully.#033[00m
Jan 23 06:06:35 np0005593234 nova_compute[227762]: 2026-01-23 11:06:35.620 227766 DEBUG nova.objects.instance [None req-3135ed0b-ba35-4ae1-8957-b0f0e6ca0736 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'numa_topology' on Instance uuid dbc61477-df97-4232-9a9e-5b81e6b9b29f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:06:35 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-06b60935a5cbef177b5247ed8c437e53a6c09f4745e8706f78a013bd83c6edd0-userdata-shm.mount: Deactivated successfully.
Jan 23 06:06:35 np0005593234 systemd[1]: var-lib-containers-storage-overlay-976c6f54da9ce063fadd65d56f7bab4e871526ac68c84f44aa7b3183729e376f-merged.mount: Deactivated successfully.
Jan 23 06:06:35 np0005593234 podman[344196]: 2026-01-23 11:06:35.646756083 +0000 UTC m=+0.087623142 container cleanup 06b60935a5cbef177b5247ed8c437e53a6c09f4745e8706f78a013bd83c6edd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 06:06:35 np0005593234 systemd[1]: libpod-conmon-06b60935a5cbef177b5247ed8c437e53a6c09f4745e8706f78a013bd83c6edd0.scope: Deactivated successfully.
Jan 23 06:06:35 np0005593234 podman[344238]: 2026-01-23 11:06:35.707118493 +0000 UTC m=+0.039717484 container remove 06b60935a5cbef177b5247ed8c437e53a6c09f4745e8706f78a013bd83c6edd0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 06:06:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:35.712 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f3f3273d-1cd2-445e-9ed3-9666a506a8df]: (4, ('Fri Jan 23 11:06:35 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c (06b60935a5cbef177b5247ed8c437e53a6c09f4745e8706f78a013bd83c6edd0)\n06b60935a5cbef177b5247ed8c437e53a6c09f4745e8706f78a013bd83c6edd0\nFri Jan 23 11:06:35 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c (06b60935a5cbef177b5247ed8c437e53a6c09f4745e8706f78a013bd83c6edd0)\n06b60935a5cbef177b5247ed8c437e53a6c09f4745e8706f78a013bd83c6edd0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:35.714 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4f3b65b2-be99-4ab3-a815-91c5edfb3c28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:35.715 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc54b072-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:35 np0005593234 nova_compute[227762]: 2026-01-23 11:06:35.717 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:35 np0005593234 kernel: tapfc54b072-b0: left promiscuous mode
Jan 23 06:06:35 np0005593234 nova_compute[227762]: 2026-01-23 11:06:35.731 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:35 np0005593234 nova_compute[227762]: 2026-01-23 11:06:35.732 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:35.739 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[32ae1c9c-9a4d-4bf8-b8d4-8a9f2e2be140]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:35.752 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[87f29f1e-8d7f-4a2c-b0d0-46e1ea3e1fd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:35.753 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bc2a6186-6fdb-45d2-b669-5cb72c455c4f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:35.768 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3cd4d963-2444-4760-8b43-b44cde4265ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1026302, 'reachable_time': 27832, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 344258, 'error': None, 'target': 'ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:35.772 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:06:35 np0005593234 systemd[1]: run-netns-ovnmeta\x2dfc54b072\x2db184\x2d4226\x2da773\x2d66ebba149a9c.mount: Deactivated successfully.
Jan 23 06:06:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:35.772 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[015fc694-7783-45f0-81b4-563513e575cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:36 np0005593234 nova_compute[227762]: 2026-01-23 11:06:36.061 227766 DEBUG nova.compute.manager [None req-3135ed0b-ba35-4ae1-8957-b0f0e6ca0736 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:06:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:36.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:36 np0005593234 nova_compute[227762]: 2026-01-23 11:06:36.266 227766 DEBUG oslo_concurrency.lockutils [None req-3135ed0b-ba35-4ae1-8957-b0f0e6ca0736 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f" "released" by "nova.compute.manager.ComputeManager.stop_instance.<locals>.do_stop_instance" :: held 4.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:36 np0005593234 nova_compute[227762]: 2026-01-23 11:06:36.439 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:36 np0005593234 nova_compute[227762]: 2026-01-23 11:06:36.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:36 np0005593234 nova_compute[227762]: 2026-01-23 11:06:36.870 227766 DEBUG nova.compute.manager [req-3876df17-686c-44ba-9a14-f046bc955549 req-fb783ff2-e0f2-4a7e-a44b-5633ef852567 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received event network-vif-unplugged-061c373f-1971-4439-80c0-2491edc62fc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:06:36 np0005593234 nova_compute[227762]: 2026-01-23 11:06:36.871 227766 DEBUG oslo_concurrency.lockutils [req-3876df17-686c-44ba-9a14-f046bc955549 req-fb783ff2-e0f2-4a7e-a44b-5633ef852567 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:36 np0005593234 nova_compute[227762]: 2026-01-23 11:06:36.871 227766 DEBUG oslo_concurrency.lockutils [req-3876df17-686c-44ba-9a14-f046bc955549 req-fb783ff2-e0f2-4a7e-a44b-5633ef852567 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:36 np0005593234 nova_compute[227762]: 2026-01-23 11:06:36.871 227766 DEBUG oslo_concurrency.lockutils [req-3876df17-686c-44ba-9a14-f046bc955549 req-fb783ff2-e0f2-4a7e-a44b-5633ef852567 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:36 np0005593234 nova_compute[227762]: 2026-01-23 11:06:36.871 227766 DEBUG nova.compute.manager [req-3876df17-686c-44ba-9a14-f046bc955549 req-fb783ff2-e0f2-4a7e-a44b-5633ef852567 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] No waiting events found dispatching network-vif-unplugged-061c373f-1971-4439-80c0-2491edc62fc7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:06:36 np0005593234 nova_compute[227762]: 2026-01-23 11:06:36.872 227766 WARNING nova.compute.manager [req-3876df17-686c-44ba-9a14-f046bc955549 req-fb783ff2-e0f2-4a7e-a44b-5633ef852567 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received unexpected event network-vif-unplugged-061c373f-1971-4439-80c0-2491edc62fc7 for instance with vm_state stopped and task_state None.#033[00m
Jan 23 06:06:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:37.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:37 np0005593234 nova_compute[227762]: 2026-01-23 11:06:37.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:37 np0005593234 nova_compute[227762]: 2026-01-23 11:06:37.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:37 np0005593234 nova_compute[227762]: 2026-01-23 11:06:37.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:06:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:06:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:38.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:06:38 np0005593234 nova_compute[227762]: 2026-01-23 11:06:38.951 227766 DEBUG nova.compute.manager [req-1c5e0b5b-d22d-4f5a-92f4-977855bdbd4c req-221f6c93-a0d3-4f84-95e2-71b5094331b0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received event network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:06:38 np0005593234 nova_compute[227762]: 2026-01-23 11:06:38.952 227766 DEBUG oslo_concurrency.lockutils [req-1c5e0b5b-d22d-4f5a-92f4-977855bdbd4c req-221f6c93-a0d3-4f84-95e2-71b5094331b0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:38 np0005593234 nova_compute[227762]: 2026-01-23 11:06:38.952 227766 DEBUG oslo_concurrency.lockutils [req-1c5e0b5b-d22d-4f5a-92f4-977855bdbd4c req-221f6c93-a0d3-4f84-95e2-71b5094331b0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:38 np0005593234 nova_compute[227762]: 2026-01-23 11:06:38.952 227766 DEBUG oslo_concurrency.lockutils [req-1c5e0b5b-d22d-4f5a-92f4-977855bdbd4c req-221f6c93-a0d3-4f84-95e2-71b5094331b0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:38 np0005593234 nova_compute[227762]: 2026-01-23 11:06:38.953 227766 DEBUG nova.compute.manager [req-1c5e0b5b-d22d-4f5a-92f4-977855bdbd4c req-221f6c93-a0d3-4f84-95e2-71b5094331b0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] No waiting events found dispatching network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:06:38 np0005593234 nova_compute[227762]: 2026-01-23 11:06:38.953 227766 WARNING nova.compute.manager [req-1c5e0b5b-d22d-4f5a-92f4-977855bdbd4c req-221f6c93-a0d3-4f84-95e2-71b5094331b0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received unexpected event network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 for instance with vm_state stopped and task_state None.#033[00m
Jan 23 06:06:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:39.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:39 np0005593234 nova_compute[227762]: 2026-01-23 11:06:39.275 227766 INFO nova.compute.manager [None req-5f2a9f6f-7735-4bbc-8551-b20c8e87feb1 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Get console output#033[00m
Jan 23 06:06:39 np0005593234 nova_compute[227762]: 2026-01-23 11:06:39.509 227766 DEBUG nova.objects.instance [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'flavor' on Instance uuid dbc61477-df97-4232-9a9e-5b81e6b9b29f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:06:39 np0005593234 nova_compute[227762]: 2026-01-23 11:06:39.549 227766 DEBUG oslo_concurrency.lockutils [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:06:39 np0005593234 nova_compute[227762]: 2026-01-23 11:06:39.550 227766 DEBUG oslo_concurrency.lockutils [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:06:39 np0005593234 nova_compute[227762]: 2026-01-23 11:06:39.550 227766 DEBUG nova.network.neutron [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:06:39 np0005593234 nova_compute[227762]: 2026-01-23 11:06:39.550 227766 DEBUG nova.objects.instance [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'info_cache' on Instance uuid dbc61477-df97-4232-9a9e-5b81e6b9b29f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:06:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:40.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:40 np0005593234 nova_compute[227762]: 2026-01-23 11:06:40.365 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:40 np0005593234 nova_compute[227762]: 2026-01-23 11:06:40.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.032 227766 DEBUG nova.network.neutron [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Updating instance_info_cache with network_info: [{"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.050 227766 DEBUG oslo_concurrency.lockutils [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.073 227766 INFO nova.virt.libvirt.driver [-] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Instance destroyed successfully.#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.073 227766 DEBUG nova.objects.instance [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'numa_topology' on Instance uuid dbc61477-df97-4232-9a9e-5b81e6b9b29f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.091 227766 DEBUG nova.objects.instance [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'resources' on Instance uuid dbc61477-df97-4232-9a9e-5b81e6b9b29f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.106 227766 DEBUG nova.virt.libvirt.vif [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:05:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1130318724',display_name='tempest-TestNetworkAdvancedServerOps-server-1130318724',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1130318724',id=217,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDTi5YUYEUSCCRpABwmlOECwIPF3oJc744XKav7Se2iIl6x1TaLG6GzrgU0yBFa3lXoulXBlqxuqmTORiSzCeTVMC0OVafcQCaIT9wfjlg14116foZdvXXpNovDttARlJQ==',key_name='tempest-TestNetworkAdvancedServerOps-191544259',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:06:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-367vmfka',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:06:36Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=dbc61477-df97-4232-9a9e-5b81e6b9b29f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": 
"fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.107 227766 DEBUG nova.network.os_vif_util [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.108 227766 DEBUG nova.network.os_vif_util [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:c3:ba,bridge_name='br-int',has_traffic_filtering=True,id=061c373f-1971-4439-80c0-2491edc62fc7,network=Network(fc54b072-b184-4226-a773-66ebba149a9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap061c373f-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.109 227766 DEBUG os_vif [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:c3:ba,bridge_name='br-int',has_traffic_filtering=True,id=061c373f-1971-4439-80c0-2491edc62fc7,network=Network(fc54b072-b184-4226-a773-66ebba149a9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap061c373f-19') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.114 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.115 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap061c373f-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.117 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.119 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.124 227766 INFO os_vif [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:c3:ba,bridge_name='br-int',has_traffic_filtering=True,id=061c373f-1971-4439-80c0-2491edc62fc7,network=Network(fc54b072-b184-4226-a773-66ebba149a9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap061c373f-19')#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.134 227766 DEBUG nova.virt.libvirt.driver [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Start _get_guest_xml network_info=[{"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.139 227766 WARNING nova.virt.libvirt.driver [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.144 227766 DEBUG nova.virt.libvirt.host [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.145 227766 DEBUG nova.virt.libvirt.host [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.150 227766 DEBUG nova.virt.libvirt.host [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.150 227766 DEBUG nova.virt.libvirt.host [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.152 227766 DEBUG nova.virt.libvirt.driver [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.152 227766 DEBUG nova.virt.hardware [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.153 227766 DEBUG nova.virt.hardware [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.154 227766 DEBUG nova.virt.hardware [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.154 227766 DEBUG nova.virt.hardware [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.154 227766 DEBUG nova.virt.hardware [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.155 227766 DEBUG nova.virt.hardware [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.155 227766 DEBUG nova.virt.hardware [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.156 227766 DEBUG nova.virt.hardware [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.156 227766 DEBUG nova.virt.hardware [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.157 227766 DEBUG nova.virt.hardware [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.157 227766 DEBUG nova.virt.hardware [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.157 227766 DEBUG nova.objects.instance [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'vcpu_model' on Instance uuid dbc61477-df97-4232-9a9e-5b81e6b9b29f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 06:06:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:41.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.171 227766 DEBUG oslo_concurrency.processutils [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.441 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:06:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:06:41 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/566768126' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.658 227766 DEBUG oslo_concurrency.processutils [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.698 227766 DEBUG oslo_concurrency.processutils [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:06:41 np0005593234 nova_compute[227762]: 2026-01-23 11:06:41.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:06:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:06:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2172874385' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.137 227766 DEBUG oslo_concurrency.processutils [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.439s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.139 227766 DEBUG nova.virt.libvirt.vif [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:05:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1130318724',display_name='tempest-TestNetworkAdvancedServerOps-server-1130318724',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1130318724',id=217,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDTi5YUYEUSCCRpABwmlOECwIPF3oJc744XKav7Se2iIl6x1TaLG6GzrgU0yBFa3lXoulXBlqxuqmTORiSzCeTVMC0OVafcQCaIT9wfjlg14116foZdvXXpNovDttARlJQ==',key_name='tempest-TestNetworkAdvancedServerOps-191544259',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:06:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-367vmfka',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:06:36Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=dbc61477-df97-4232-9a9e-5b81e6b9b29f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.139 227766 DEBUG nova.network.os_vif_util [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.140 227766 DEBUG nova.network.os_vif_util [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:c3:ba,bridge_name='br-int',has_traffic_filtering=True,id=061c373f-1971-4439-80c0-2491edc62fc7,network=Network(fc54b072-b184-4226-a773-66ebba149a9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap061c373f-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.141 227766 DEBUG nova.objects.instance [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_devices' on Instance uuid dbc61477-df97-4232-9a9e-5b81e6b9b29f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 06:06:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:42.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.181 227766 DEBUG nova.virt.libvirt.driver [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] End _get_guest_xml xml=<domain type="kvm">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  <uuid>dbc61477-df97-4232-9a9e-5b81e6b9b29f</uuid>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  <name>instance-000000d9</name>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1130318724</nova:name>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 11:06:41</nova:creationTime>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <nova:user uuid="420c366dc5dc45a48da4e0b18c93043f">tempest-TestNetworkAdvancedServerOps-1886747874-project-member</nova:user>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <nova:project uuid="c06f98b51aeb48de91d116fda54a161f">tempest-TestNetworkAdvancedServerOps-1886747874</nova:project>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <nova:port uuid="061c373f-1971-4439-80c0-2491edc62fc7">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <system>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <entry name="serial">dbc61477-df97-4232-9a9e-5b81e6b9b29f</entry>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <entry name="uuid">dbc61477-df97-4232-9a9e-5b81e6b9b29f</entry>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    </system>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  <os>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  </os>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  <features>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  </features>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  </clock>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  <devices>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/dbc61477-df97-4232-9a9e-5b81e6b9b29f_disk.config">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:65:c3:ba"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <target dev="tap061c373f-19"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    </interface>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/dbc61477-df97-4232-9a9e-5b81e6b9b29f/console.log" append="off"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    </serial>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <video>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    </video>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <input type="keyboard" bus="usb"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    </rng>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 06:06:42 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 06:06:42 np0005593234 nova_compute[227762]:  </devices>
Jan 23 06:06:42 np0005593234 nova_compute[227762]: </domain>
Jan 23 06:06:42 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.183 227766 DEBUG nova.virt.libvirt.driver [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] skipping disk for instance-000000d9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.183 227766 DEBUG nova.virt.libvirt.driver [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] skipping disk for instance-000000d9 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.184 227766 DEBUG nova.virt.libvirt.vif [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:05:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1130318724',display_name='tempest-TestNetworkAdvancedServerOps-server-1130318724',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1130318724',id=217,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDTi5YUYEUSCCRpABwmlOECwIPF3oJc744XKav7Se2iIl6x1TaLG6GzrgU0yBFa3lXoulXBlqxuqmTORiSzCeTVMC0OVafcQCaIT9wfjlg14116foZdvXXpNovDttARlJQ==',key_name='tempest-TestNetworkAdvancedServerOps-191544259',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:06:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=<?>,power_state=4,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-367vmfka',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='powering-on',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:06:36Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=dbc61477-df97-4232-9a9e-5b81e6b9b29f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='stopped') vif={"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.185 227766 DEBUG nova.network.os_vif_util [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.185 227766 DEBUG nova.network.os_vif_util [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:c3:ba,bridge_name='br-int',has_traffic_filtering=True,id=061c373f-1971-4439-80c0-2491edc62fc7,network=Network(fc54b072-b184-4226-a773-66ebba149a9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap061c373f-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.186 227766 DEBUG os_vif [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:c3:ba,bridge_name='br-int',has_traffic_filtering=True,id=061c373f-1971-4439-80c0-2491edc62fc7,network=Network(fc54b072-b184-4226-a773-66ebba149a9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap061c373f-19') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.187 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.188 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.188 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.190 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.190 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap061c373f-19, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.191 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap061c373f-19, col_values=(('external_ids', {'iface-id': '061c373f-1971-4439-80c0-2491edc62fc7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:65:c3:ba', 'vm-uuid': 'dbc61477-df97-4232-9a9e-5b81e6b9b29f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.192 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:42 np0005593234 NetworkManager[48942]: <info>  [1769166402.1939] manager: (tap061c373f-19): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/477)
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.195 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.197 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.199 227766 INFO os_vif [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:c3:ba,bridge_name='br-int',has_traffic_filtering=True,id=061c373f-1971-4439-80c0-2491edc62fc7,network=Network(fc54b072-b184-4226-a773-66ebba149a9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap061c373f-19')#033[00m
Jan 23 06:06:42 np0005593234 kernel: tap061c373f-19: entered promiscuous mode
Jan 23 06:06:42 np0005593234 NetworkManager[48942]: <info>  [1769166402.2717] manager: (tap061c373f-19): new Tun device (/org/freedesktop/NetworkManager/Devices/478)
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.272 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:42 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:42Z|00994|binding|INFO|Claiming lport 061c373f-1971-4439-80c0-2491edc62fc7 for this chassis.
Jan 23 06:06:42 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:42Z|00995|binding|INFO|061c373f-1971-4439-80c0-2491edc62fc7: Claiming fa:16:3e:65:c3:ba 10.100.0.5
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.279 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:c3:ba 10.100.0.5'], port_security=['fa:16:3e:65:c3:ba 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'dbc61477-df97-4232-9a9e-5b81e6b9b29f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc54b072-b184-4226-a773-66ebba149a9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'ced6954a-63e2-476d-b999-70afa5e07339', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.247'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8aecfd0-07e8-49d5-aa82-be340c0b5653, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=061c373f-1971-4439-80c0-2491edc62fc7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.280 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 061c373f-1971-4439-80c0-2491edc62fc7 in datapath fc54b072-b184-4226-a773-66ebba149a9c bound to our chassis#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.281 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fc54b072-b184-4226-a773-66ebba149a9c#033[00m
Jan 23 06:06:42 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:42Z|00996|binding|INFO|Setting lport 061c373f-1971-4439-80c0-2491edc62fc7 ovn-installed in OVS
Jan 23 06:06:42 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:42Z|00997|binding|INFO|Setting lport 061c373f-1971-4439-80c0-2491edc62fc7 up in Southbound
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.288 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.291 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.292 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[9c2aa589-4bee-4787-95c8-e627192143ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.293 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfc54b072-b1 in ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.295 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfc54b072-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.295 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aaecc906-d88c-44cc-a126-2267e6e58d73]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.296 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f040a55a-a523-4484-ae5a-4001ce216ab0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 systemd-machined[195626]: New machine qemu-111-instance-000000d9.
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.310 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[16afaf38-f7c2-4bdd-bbda-8d30e1693d80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 systemd[1]: Started Virtual Machine qemu-111-instance-000000d9.
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.337 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1011d4c1-0ead-4595-b2cf-b540cff88262]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 systemd-udevd[344393]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:06:42 np0005593234 NetworkManager[48942]: <info>  [1769166402.3562] device (tap061c373f-19): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:06:42 np0005593234 NetworkManager[48942]: <info>  [1769166402.3571] device (tap061c373f-19): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.367 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[f1662bdf-129e-4643-9c49-0b6c08243440]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 systemd-udevd[344402]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:06:42 np0005593234 NetworkManager[48942]: <info>  [1769166402.3742] manager: (tapfc54b072-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/479)
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.373 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[349f0632-3ee1-436b-a914-d590cec059ec]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.405 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d74694f6-cbf8-4891-9d43-1164eefd802e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.407 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ee6fc8e7-896b-4467-aa6c-bce1ef6756f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 NetworkManager[48942]: <info>  [1769166402.4290] device (tapfc54b072-b0): carrier: link connected
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.435 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[c27f1384-9d7c-438d-9d68-a64908dd3950]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.452 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4d73e9f4-0aa0-4151-9d99-37180e89fb8d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc54b072-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:50:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 306], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1029738, 'reachable_time': 33946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 344423, 'error': None, 'target': 'ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.467 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f27d1cbb-1bbb-4204-8abd-a59932b79234]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1f:5052'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1029738, 'tstamp': 1029738}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 344424, 'error': None, 'target': 'ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.485 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[25fdd4de-4535-4447-aa94-5d1360930f77]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfc54b072-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:50:52'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 306], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1029738, 'reachable_time': 33946, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 344425, 'error': None, 'target': 'ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.512 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[741f34ea-fdfa-478a-b03e-1756571db3bc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.566 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[84087ed3-8a92-4a91-9918-c1164235f86a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.567 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc54b072-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.567 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.568 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfc54b072-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.569 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:42 np0005593234 kernel: tapfc54b072-b0: entered promiscuous mode
Jan 23 06:06:42 np0005593234 NetworkManager[48942]: <info>  [1769166402.5726] manager: (tapfc54b072-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/480)
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.574 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.575 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfc54b072-b0, col_values=(('external_ids', {'iface-id': 'ff96d808-47bc-43bf-ad0e-b336caa55e2e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:06:42 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:42Z|00998|binding|INFO|Releasing lport ff96d808-47bc-43bf-ad0e-b336caa55e2e from this chassis (sb_readonly=0)
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.576 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.578 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fc54b072-b184-4226-a773-66ebba149a9c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fc54b072-b184-4226-a773-66ebba149a9c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.577 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.578 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6323643a-bf12-4b6f-83ee-c16e5fd0aa52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.579 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-fc54b072-b184-4226-a773-66ebba149a9c
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/fc54b072-b184-4226-a773-66ebba149a9c.pid.haproxy
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID fc54b072-b184-4226-a773-66ebba149a9c
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.580 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c', 'env', 'PROCESS_TAG=haproxy-fc54b072-b184-4226-a773-66ebba149a9c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fc54b072-b184-4226-a773-66ebba149a9c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.589 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.916 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.917 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:06:42.917 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:42 np0005593234 podman[344471]: 2026-01-23 11:06:42.959628336 +0000 UTC m=+0.051904305 container create 41e6d910cdaa0233d5b228cdc5a8c35d9a6b8d9a4e7f4367e3733a97e8dda126 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.967 227766 DEBUG nova.compute.manager [req-4820a543-b2bd-4036-92e9-b1fe09a8ec17 req-4cb90e01-287f-4874-97ad-76f70731d4cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received event network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.967 227766 DEBUG oslo_concurrency.lockutils [req-4820a543-b2bd-4036-92e9-b1fe09a8ec17 req-4cb90e01-287f-4874-97ad-76f70731d4cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.968 227766 DEBUG oslo_concurrency.lockutils [req-4820a543-b2bd-4036-92e9-b1fe09a8ec17 req-4cb90e01-287f-4874-97ad-76f70731d4cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.968 227766 DEBUG oslo_concurrency.lockutils [req-4820a543-b2bd-4036-92e9-b1fe09a8ec17 req-4cb90e01-287f-4874-97ad-76f70731d4cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.968 227766 DEBUG nova.compute.manager [req-4820a543-b2bd-4036-92e9-b1fe09a8ec17 req-4cb90e01-287f-4874-97ad-76f70731d4cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] No waiting events found dispatching network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:06:42 np0005593234 nova_compute[227762]: 2026-01-23 11:06:42.968 227766 WARNING nova.compute.manager [req-4820a543-b2bd-4036-92e9-b1fe09a8ec17 req-4cb90e01-287f-4874-97ad-76f70731d4cb 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received unexpected event network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 for instance with vm_state stopped and task_state powering-on.#033[00m
Jan 23 06:06:42 np0005593234 systemd[1]: Started libpod-conmon-41e6d910cdaa0233d5b228cdc5a8c35d9a6b8d9a4e7f4367e3733a97e8dda126.scope.
Jan 23 06:06:43 np0005593234 systemd[1]: Started libcrun container.
Jan 23 06:06:43 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f204693adddc0ac749cbf60f6277ed1dc0f2d9b444fda6d88f2541cd03fc586/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:06:43 np0005593234 podman[344471]: 2026-01-23 11:06:42.934985695 +0000 UTC m=+0.027261694 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:06:43 np0005593234 podman[344471]: 2026-01-23 11:06:43.037765681 +0000 UTC m=+0.130041670 container init 41e6d910cdaa0233d5b228cdc5a8c35d9a6b8d9a4e7f4367e3733a97e8dda126 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:06:43 np0005593234 podman[344471]: 2026-01-23 11:06:43.043419398 +0000 UTC m=+0.135695367 container start 41e6d910cdaa0233d5b228cdc5a8c35d9a6b8d9a4e7f4367e3733a97e8dda126 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 06:06:43 np0005593234 podman[344489]: 2026-01-23 11:06:43.052908635 +0000 UTC m=+0.055934541 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 06:06:43 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[344492]: [NOTICE]   (344512) : New worker (344514) forked
Jan 23 06:06:43 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[344492]: [NOTICE]   (344512) : Loading success.
Jan 23 06:06:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:43.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:43 np0005593234 nova_compute[227762]: 2026-01-23 11:06:43.439 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for dbc61477-df97-4232-9a9e-5b81e6b9b29f due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 06:06:43 np0005593234 nova_compute[227762]: 2026-01-23 11:06:43.440 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166403.4386663, dbc61477-df97-4232-9a9e-5b81e6b9b29f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:06:43 np0005593234 nova_compute[227762]: 2026-01-23 11:06:43.440 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:06:43 np0005593234 nova_compute[227762]: 2026-01-23 11:06:43.441 227766 DEBUG nova.compute.manager [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 06:06:43 np0005593234 nova_compute[227762]: 2026-01-23 11:06:43.445 227766 INFO nova.virt.libvirt.driver [-] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Instance rebooted successfully.#033[00m
Jan 23 06:06:43 np0005593234 nova_compute[227762]: 2026-01-23 11:06:43.445 227766 DEBUG nova.compute.manager [None req-60419b9e-5f0b-4649-a7db-fe1247d5a036 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:06:43 np0005593234 nova_compute[227762]: 2026-01-23 11:06:43.480 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:06:43 np0005593234 nova_compute[227762]: 2026-01-23 11:06:43.484 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: stopped, current task_state: powering-on, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:06:43 np0005593234 nova_compute[227762]: 2026-01-23 11:06:43.514 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] During sync_power_state the instance has a pending task (powering-on). Skip.#033[00m
Jan 23 06:06:43 np0005593234 nova_compute[227762]: 2026-01-23 11:06:43.514 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166403.4399502, dbc61477-df97-4232-9a9e-5b81e6b9b29f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:06:43 np0005593234 nova_compute[227762]: 2026-01-23 11:06:43.515 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] VM Started (Lifecycle Event)#033[00m
Jan 23 06:06:43 np0005593234 nova_compute[227762]: 2026-01-23 11:06:43.550 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:06:43 np0005593234 nova_compute[227762]: 2026-01-23 11:06:43.554 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:06:43 np0005593234 nova_compute[227762]: 2026-01-23 11:06:43.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:44.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:06:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3343155991' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:06:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:06:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3343155991' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:06:45 np0005593234 nova_compute[227762]: 2026-01-23 11:06:45.065 227766 DEBUG nova.compute.manager [req-c334a8ec-4155-4714-9b88-aca82101dff2 req-7adc4822-9843-4562-979c-762ca12b0268 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received event network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:06:45 np0005593234 nova_compute[227762]: 2026-01-23 11:06:45.065 227766 DEBUG oslo_concurrency.lockutils [req-c334a8ec-4155-4714-9b88-aca82101dff2 req-7adc4822-9843-4562-979c-762ca12b0268 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:06:45 np0005593234 nova_compute[227762]: 2026-01-23 11:06:45.065 227766 DEBUG oslo_concurrency.lockutils [req-c334a8ec-4155-4714-9b88-aca82101dff2 req-7adc4822-9843-4562-979c-762ca12b0268 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:06:45 np0005593234 nova_compute[227762]: 2026-01-23 11:06:45.066 227766 DEBUG oslo_concurrency.lockutils [req-c334a8ec-4155-4714-9b88-aca82101dff2 req-7adc4822-9843-4562-979c-762ca12b0268 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:06:45 np0005593234 nova_compute[227762]: 2026-01-23 11:06:45.066 227766 DEBUG nova.compute.manager [req-c334a8ec-4155-4714-9b88-aca82101dff2 req-7adc4822-9843-4562-979c-762ca12b0268 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] No waiting events found dispatching network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:06:45 np0005593234 nova_compute[227762]: 2026-01-23 11:06:45.067 227766 WARNING nova.compute.manager [req-c334a8ec-4155-4714-9b88-aca82101dff2 req-7adc4822-9843-4562-979c-762ca12b0268 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received unexpected event network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 for instance with vm_state active and task_state None.#033[00m
Jan 23 06:06:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:06:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:45.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:06:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:46.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:46 np0005593234 nova_compute[227762]: 2026-01-23 11:06:46.442 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:47.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:47 np0005593234 nova_compute[227762]: 2026-01-23 11:06:47.193 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:48.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:49.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:06:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:50.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:06:50 np0005593234 podman[344552]: 2026-01-23 11:06:50.805163569 +0000 UTC m=+0.098453692 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:06:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:06:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:51.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:06:51 np0005593234 nova_compute[227762]: 2026-01-23 11:06:51.487 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:52.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:52 np0005593234 nova_compute[227762]: 2026-01-23 11:06:52.196 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:53.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:54.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:55.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:06:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:56.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:06:56 np0005593234 nova_compute[227762]: 2026-01-23 11:06:56.489 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:57.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:57 np0005593234 nova_compute[227762]: 2026-01-23 11:06:57.199 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:06:57 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:06:57 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:06:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:06:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:06:58.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:06:58 np0005593234 nova_compute[227762]: 2026-01-23 11:06:58.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:06:59 np0005593234 ovn_controller[134547]: 2026-01-23T11:06:59Z|00132|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:65:c3:ba 10.100.0.5
Jan 23 06:06:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:06:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:06:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:06:59.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:07:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:00.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:07:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:07:00 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:07:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:07:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:01.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:07:01 np0005593234 nova_compute[227762]: 2026-01-23 11:07:01.492 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:01 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:07:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:07:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:02.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:07:02 np0005593234 nova_compute[227762]: 2026-01-23 11:07:02.201 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:07:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:03.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:07:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:07:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:04.175 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:07:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:05.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:05 np0005593234 ovn_controller[134547]: 2026-01-23T11:07:05Z|00999|memory_trim|INFO|Detected inactivity (last active 30012 ms ago): trimming memory
Jan 23 06:07:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:06.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:06 np0005593234 nova_compute[227762]: 2026-01-23 11:07:06.270 227766 INFO nova.compute.manager [None req-ca23050d-e813-42da-b1ac-4d1e20d44211 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Get console output#033[00m
Jan 23 06:07:06 np0005593234 nova_compute[227762]: 2026-01-23 11:07:06.276 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 06:07:06 np0005593234 nova_compute[227762]: 2026-01-23 11:07:06.493 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:07.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:07 np0005593234 nova_compute[227762]: 2026-01-23 11:07:07.203 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:08.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:09.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:09 np0005593234 nova_compute[227762]: 2026-01-23 11:07:09.387 227766 DEBUG nova.compute.manager [req-1b644150-3bd8-464b-961b-9a8f5eb7dd0a req-80cb6fcf-ca11-438d-956b-5b6cab8acd69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received event network-changed-061c373f-1971-4439-80c0-2491edc62fc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:07:09 np0005593234 nova_compute[227762]: 2026-01-23 11:07:09.387 227766 DEBUG nova.compute.manager [req-1b644150-3bd8-464b-961b-9a8f5eb7dd0a req-80cb6fcf-ca11-438d-956b-5b6cab8acd69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Refreshing instance network info cache due to event network-changed-061c373f-1971-4439-80c0-2491edc62fc7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:07:09 np0005593234 nova_compute[227762]: 2026-01-23 11:07:09.388 227766 DEBUG oslo_concurrency.lockutils [req-1b644150-3bd8-464b-961b-9a8f5eb7dd0a req-80cb6fcf-ca11-438d-956b-5b6cab8acd69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:07:09 np0005593234 nova_compute[227762]: 2026-01-23 11:07:09.388 227766 DEBUG oslo_concurrency.lockutils [req-1b644150-3bd8-464b-961b-9a8f5eb7dd0a req-80cb6fcf-ca11-438d-956b-5b6cab8acd69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:07:09 np0005593234 nova_compute[227762]: 2026-01-23 11:07:09.388 227766 DEBUG nova.network.neutron [req-1b644150-3bd8-464b-961b-9a8f5eb7dd0a req-80cb6fcf-ca11-438d-956b-5b6cab8acd69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Refreshing network info cache for port 061c373f-1971-4439-80c0-2491edc62fc7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:07:09 np0005593234 nova_compute[227762]: 2026-01-23 11:07:09.457 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:09.457 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=100, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=99) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:07:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:09.459 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:07:09 np0005593234 nova_compute[227762]: 2026-01-23 11:07:09.517 227766 DEBUG oslo_concurrency.lockutils [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:09 np0005593234 nova_compute[227762]: 2026-01-23 11:07:09.517 227766 DEBUG oslo_concurrency.lockutils [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:09 np0005593234 nova_compute[227762]: 2026-01-23 11:07:09.517 227766 DEBUG oslo_concurrency.lockutils [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:09 np0005593234 nova_compute[227762]: 2026-01-23 11:07:09.518 227766 DEBUG oslo_concurrency.lockutils [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:09 np0005593234 nova_compute[227762]: 2026-01-23 11:07:09.518 227766 DEBUG oslo_concurrency.lockutils [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:09 np0005593234 nova_compute[227762]: 2026-01-23 11:07:09.519 227766 INFO nova.compute.manager [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Terminating instance#033[00m
Jan 23 06:07:09 np0005593234 nova_compute[227762]: 2026-01-23 11:07:09.520 227766 DEBUG nova.compute.manager [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 06:07:09 np0005593234 kernel: tap061c373f-19 (unregistering): left promiscuous mode
Jan 23 06:07:09 np0005593234 NetworkManager[48942]: <info>  [1769166429.9445] device (tap061c373f-19): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:07:10 np0005593234 ovn_controller[134547]: 2026-01-23T11:07:10Z|01000|binding|INFO|Releasing lport 061c373f-1971-4439-80c0-2491edc62fc7 from this chassis (sb_readonly=0)
Jan 23 06:07:10 np0005593234 ovn_controller[134547]: 2026-01-23T11:07:10Z|01001|binding|INFO|Setting lport 061c373f-1971-4439-80c0-2491edc62fc7 down in Southbound
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.001 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:10 np0005593234 ovn_controller[134547]: 2026-01-23T11:07:10Z|01002|binding|INFO|Removing iface tap061c373f-19 ovn-installed in OVS
Jan 23 06:07:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:10.063 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:65:c3:ba 10.100.0.5'], port_security=['fa:16:3e:65:c3:ba 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'dbc61477-df97-4232-9a9e-5b81e6b9b29f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fc54b072-b184-4226-a773-66ebba149a9c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'ced6954a-63e2-476d-b999-70afa5e07339', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e8aecfd0-07e8-49d5-aa82-be340c0b5653, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=061c373f-1971-4439-80c0-2491edc62fc7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:07:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:10.065 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 061c373f-1971-4439-80c0-2491edc62fc7 in datapath fc54b072-b184-4226-a773-66ebba149a9c unbound from our chassis#033[00m
Jan 23 06:07:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:10.065 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fc54b072-b184-4226-a773-66ebba149a9c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.066 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:10.069 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e68b2dd0-998c-4aed-87f4-d36cfe06f5c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:10.069 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c namespace which is not needed anymore#033[00m
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.077 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:10 np0005593234 systemd[1]: machine-qemu\x2d111\x2dinstance\x2d000000d9.scope: Deactivated successfully.
Jan 23 06:07:10 np0005593234 systemd[1]: machine-qemu\x2d111\x2dinstance\x2d000000d9.scope: Consumed 14.055s CPU time.
Jan 23 06:07:10 np0005593234 systemd-machined[195626]: Machine qemu-111-instance-000000d9 terminated.
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.160 227766 INFO nova.virt.libvirt.driver [-] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Instance destroyed successfully.#033[00m
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.161 227766 DEBUG nova.objects.instance [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'resources' on Instance uuid dbc61477-df97-4232-9a9e-5b81e6b9b29f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.175 227766 DEBUG nova.virt.libvirt.vif [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:05:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1130318724',display_name='tempest-TestNetworkAdvancedServerOps-server-1130318724',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1130318724',id=217,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDTi5YUYEUSCCRpABwmlOECwIPF3oJc744XKav7Se2iIl6x1TaLG6GzrgU0yBFa3lXoulXBlqxuqmTORiSzCeTVMC0OVafcQCaIT9wfjlg14116foZdvXXpNovDttARlJQ==',key_name='tempest-TestNetworkAdvancedServerOps-191544259',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:06:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-367vmfka',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:06:43Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=dbc61477-df97-4232-9a9e-5b81e6b9b29f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.175 227766 DEBUG nova.network.os_vif_util [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.247", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.176 227766 DEBUG nova.network.os_vif_util [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:65:c3:ba,bridge_name='br-int',has_traffic_filtering=True,id=061c373f-1971-4439-80c0-2491edc62fc7,network=Network(fc54b072-b184-4226-a773-66ebba149a9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap061c373f-19') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.177 227766 DEBUG os_vif [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:c3:ba,bridge_name='br-int',has_traffic_filtering=True,id=061c373f-1971-4439-80c0-2491edc62fc7,network=Network(fc54b072-b184-4226-a773-66ebba149a9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap061c373f-19') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.179 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.179 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap061c373f-19, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.181 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.184 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:07:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:10.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.186 227766 INFO os_vif [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:65:c3:ba,bridge_name='br-int',has_traffic_filtering=True,id=061c373f-1971-4439-80c0-2491edc62fc7,network=Network(fc54b072-b184-4226-a773-66ebba149a9c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap061c373f-19')#033[00m
Jan 23 06:07:10 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[344492]: [NOTICE]   (344512) : haproxy version is 2.8.14-c23fe91
Jan 23 06:07:10 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[344492]: [NOTICE]   (344512) : path to executable is /usr/sbin/haproxy
Jan 23 06:07:10 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[344492]: [WARNING]  (344512) : Exiting Master process...
Jan 23 06:07:10 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[344492]: [ALERT]    (344512) : Current worker (344514) exited with code 143 (Terminated)
Jan 23 06:07:10 np0005593234 neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c[344492]: [WARNING]  (344512) : All workers exited. Exiting... (0)
Jan 23 06:07:10 np0005593234 systemd[1]: libpod-41e6d910cdaa0233d5b228cdc5a8c35d9a6b8d9a4e7f4367e3733a97e8dda126.scope: Deactivated successfully.
Jan 23 06:07:10 np0005593234 podman[344801]: 2026-01-23 11:07:10.223048691 +0000 UTC m=+0.055405065 container died 41e6d910cdaa0233d5b228cdc5a8c35d9a6b8d9a4e7f4367e3733a97e8dda126 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:07:10 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-41e6d910cdaa0233d5b228cdc5a8c35d9a6b8d9a4e7f4367e3733a97e8dda126-userdata-shm.mount: Deactivated successfully.
Jan 23 06:07:10 np0005593234 systemd[1]: var-lib-containers-storage-overlay-3f204693adddc0ac749cbf60f6277ed1dc0f2d9b444fda6d88f2541cd03fc586-merged.mount: Deactivated successfully.
Jan 23 06:07:10 np0005593234 podman[344801]: 2026-01-23 11:07:10.257229211 +0000 UTC m=+0.089585595 container cleanup 41e6d910cdaa0233d5b228cdc5a8c35d9a6b8d9a4e7f4367e3733a97e8dda126 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:07:10 np0005593234 systemd[1]: libpod-conmon-41e6d910cdaa0233d5b228cdc5a8c35d9a6b8d9a4e7f4367e3733a97e8dda126.scope: Deactivated successfully.
Jan 23 06:07:10 np0005593234 podman[344851]: 2026-01-23 11:07:10.325951171 +0000 UTC m=+0.045566427 container remove 41e6d910cdaa0233d5b228cdc5a8c35d9a6b8d9a4e7f4367e3733a97e8dda126 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:07:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:10.331 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b450adbb-7757-405c-9896-870172388d43]: (4, ('Fri Jan 23 11:07:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c (41e6d910cdaa0233d5b228cdc5a8c35d9a6b8d9a4e7f4367e3733a97e8dda126)\n41e6d910cdaa0233d5b228cdc5a8c35d9a6b8d9a4e7f4367e3733a97e8dda126\nFri Jan 23 11:07:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c (41e6d910cdaa0233d5b228cdc5a8c35d9a6b8d9a4e7f4367e3733a97e8dda126)\n41e6d910cdaa0233d5b228cdc5a8c35d9a6b8d9a4e7f4367e3733a97e8dda126\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:10.333 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc28464-673e-40c5-9d18-a3e794567cd5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:10.334 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfc54b072-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.335 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:10 np0005593234 kernel: tapfc54b072-b0: left promiscuous mode
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.348 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:10.351 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3c66dd50-2dfa-43a0-bb49-db0dc3bd8545]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:10.367 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ec6b923d-e602-4dd3-84a8-a5b3f6a37ff5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:10.368 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a0023a4d-77a8-42d4-8f1b-7211495e4730]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:10.384 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[04e42ce3-c2d6-47a5-a21f-af72665ed243]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1029732, 'reachable_time': 22219, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 344867, 'error': None, 'target': 'ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:10.387 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fc54b072-b184-4226-a773-66ebba149a9c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:07:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:10.387 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[32281712-c0b0-4c78-bafd-e41c0045a489]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:10 np0005593234 systemd[1]: run-netns-ovnmeta\x2dfc54b072\x2db184\x2d4226\x2da773\x2d66ebba149a9c.mount: Deactivated successfully.
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.608 227766 DEBUG nova.network.neutron [req-1b644150-3bd8-464b-961b-9a8f5eb7dd0a req-80cb6fcf-ca11-438d-956b-5b6cab8acd69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Updated VIF entry in instance network info cache for port 061c373f-1971-4439-80c0-2491edc62fc7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.609 227766 DEBUG nova.network.neutron [req-1b644150-3bd8-464b-961b-9a8f5eb7dd0a req-80cb6fcf-ca11-438d-956b-5b6cab8acd69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Updating instance_info_cache with network_info: [{"id": "061c373f-1971-4439-80c0-2491edc62fc7", "address": "fa:16:3e:65:c3:ba", "network": {"id": "fc54b072-b184-4226-a773-66ebba149a9c", "bridge": "br-int", "label": "tempest-network-smoke--2047221408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap061c373f-19", "ovs_interfaceid": "061c373f-1971-4439-80c0-2491edc62fc7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:07:10 np0005593234 nova_compute[227762]: 2026-01-23 11:07:10.627 227766 DEBUG oslo_concurrency.lockutils [req-1b644150-3bd8-464b-961b-9a8f5eb7dd0a req-80cb6fcf-ca11-438d-956b-5b6cab8acd69 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-dbc61477-df97-4232-9a9e-5b81e6b9b29f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:07:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:11.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:11 np0005593234 nova_compute[227762]: 2026-01-23 11:07:11.496 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:11 np0005593234 nova_compute[227762]: 2026-01-23 11:07:11.511 227766 DEBUG nova.compute.manager [req-00cb0218-bf73-46c0-8438-c3fe9e2e910b req-0ea17c34-c05f-4de3-a1ac-a7d5a051bdb8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received event network-vif-unplugged-061c373f-1971-4439-80c0-2491edc62fc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:07:11 np0005593234 nova_compute[227762]: 2026-01-23 11:07:11.512 227766 DEBUG oslo_concurrency.lockutils [req-00cb0218-bf73-46c0-8438-c3fe9e2e910b req-0ea17c34-c05f-4de3-a1ac-a7d5a051bdb8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:11 np0005593234 nova_compute[227762]: 2026-01-23 11:07:11.513 227766 DEBUG oslo_concurrency.lockutils [req-00cb0218-bf73-46c0-8438-c3fe9e2e910b req-0ea17c34-c05f-4de3-a1ac-a7d5a051bdb8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:11 np0005593234 nova_compute[227762]: 2026-01-23 11:07:11.513 227766 DEBUG oslo_concurrency.lockutils [req-00cb0218-bf73-46c0-8438-c3fe9e2e910b req-0ea17c34-c05f-4de3-a1ac-a7d5a051bdb8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:11 np0005593234 nova_compute[227762]: 2026-01-23 11:07:11.514 227766 DEBUG nova.compute.manager [req-00cb0218-bf73-46c0-8438-c3fe9e2e910b req-0ea17c34-c05f-4de3-a1ac-a7d5a051bdb8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] No waiting events found dispatching network-vif-unplugged-061c373f-1971-4439-80c0-2491edc62fc7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:07:11 np0005593234 nova_compute[227762]: 2026-01-23 11:07:11.514 227766 DEBUG nova.compute.manager [req-00cb0218-bf73-46c0-8438-c3fe9e2e910b req-0ea17c34-c05f-4de3-a1ac-a7d5a051bdb8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received event network-vif-unplugged-061c373f-1971-4439-80c0-2491edc62fc7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 06:07:11 np0005593234 nova_compute[227762]: 2026-01-23 11:07:11.515 227766 DEBUG nova.compute.manager [req-00cb0218-bf73-46c0-8438-c3fe9e2e910b req-0ea17c34-c05f-4de3-a1ac-a7d5a051bdb8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received event network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:07:11 np0005593234 nova_compute[227762]: 2026-01-23 11:07:11.515 227766 DEBUG oslo_concurrency.lockutils [req-00cb0218-bf73-46c0-8438-c3fe9e2e910b req-0ea17c34-c05f-4de3-a1ac-a7d5a051bdb8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:11 np0005593234 nova_compute[227762]: 2026-01-23 11:07:11.516 227766 DEBUG oslo_concurrency.lockutils [req-00cb0218-bf73-46c0-8438-c3fe9e2e910b req-0ea17c34-c05f-4de3-a1ac-a7d5a051bdb8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:11 np0005593234 nova_compute[227762]: 2026-01-23 11:07:11.517 227766 DEBUG oslo_concurrency.lockutils [req-00cb0218-bf73-46c0-8438-c3fe9e2e910b req-0ea17c34-c05f-4de3-a1ac-a7d5a051bdb8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:11 np0005593234 nova_compute[227762]: 2026-01-23 11:07:11.517 227766 DEBUG nova.compute.manager [req-00cb0218-bf73-46c0-8438-c3fe9e2e910b req-0ea17c34-c05f-4de3-a1ac-a7d5a051bdb8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] No waiting events found dispatching network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:07:11 np0005593234 nova_compute[227762]: 2026-01-23 11:07:11.518 227766 WARNING nova.compute.manager [req-00cb0218-bf73-46c0-8438-c3fe9e2e910b req-0ea17c34-c05f-4de3-a1ac-a7d5a051bdb8 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received unexpected event network-vif-plugged-061c373f-1971-4439-80c0-2491edc62fc7 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 06:07:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:07:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:12.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:07:12 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:07:12 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:07:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:13.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:13 np0005593234 podman[344920]: 2026-01-23 11:07:13.767326621 +0000 UTC m=+0.055044994 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 06:07:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:14.189 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:14 np0005593234 nova_compute[227762]: 2026-01-23 11:07:14.829 227766 INFO nova.virt.libvirt.driver [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Deleting instance files /var/lib/nova/instances/dbc61477-df97-4232-9a9e-5b81e6b9b29f_del#033[00m
Jan 23 06:07:14 np0005593234 nova_compute[227762]: 2026-01-23 11:07:14.830 227766 INFO nova.virt.libvirt.driver [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Deletion of /var/lib/nova/instances/dbc61477-df97-4232-9a9e-5b81e6b9b29f_del complete#033[00m
Jan 23 06:07:15 np0005593234 nova_compute[227762]: 2026-01-23 11:07:15.184 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:15.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:07:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:16.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:07:16 np0005593234 nova_compute[227762]: 2026-01-23 11:07:16.337 227766 INFO nova.compute.manager [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Took 6.82 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 06:07:16 np0005593234 nova_compute[227762]: 2026-01-23 11:07:16.338 227766 DEBUG oslo.service.loopingcall [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 06:07:16 np0005593234 nova_compute[227762]: 2026-01-23 11:07:16.339 227766 DEBUG nova.compute.manager [-] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 06:07:16 np0005593234 nova_compute[227762]: 2026-01-23 11:07:16.339 227766 DEBUG nova.network.neutron [-] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 06:07:16 np0005593234 nova_compute[227762]: 2026-01-23 11:07:16.497 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:07:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:17.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:07:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:17.461 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '100'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:07:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:18.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:19.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:20.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:20 np0005593234 nova_compute[227762]: 2026-01-23 11:07:20.220 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:21.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:21 np0005593234 nova_compute[227762]: 2026-01-23 11:07:21.538 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:21 np0005593234 podman[344994]: 2026-01-23 11:07:21.845681181 +0000 UTC m=+0.128035888 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 06:07:22 np0005593234 nova_compute[227762]: 2026-01-23 11:07:22.061 227766 DEBUG nova.network.neutron [-] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:07:22 np0005593234 nova_compute[227762]: 2026-01-23 11:07:22.083 227766 INFO nova.compute.manager [-] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Took 5.74 seconds to deallocate network for instance.#033[00m
Jan 23 06:07:22 np0005593234 nova_compute[227762]: 2026-01-23 11:07:22.145 227766 DEBUG oslo_concurrency.lockutils [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:22 np0005593234 nova_compute[227762]: 2026-01-23 11:07:22.145 227766 DEBUG oslo_concurrency.lockutils [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:22 np0005593234 nova_compute[227762]: 2026-01-23 11:07:22.182 227766 DEBUG nova.compute.manager [req-3cf75988-860b-4584-ab70-dec6c597a640 req-13852754-1a48-4877-8bee-1c978e4710cd 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Received event network-vif-deleted-061c373f-1971-4439-80c0-2491edc62fc7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:07:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:22.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:22 np0005593234 nova_compute[227762]: 2026-01-23 11:07:22.213 227766 DEBUG oslo_concurrency.processutils [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:07:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:07:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3199168074' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:07:22 np0005593234 nova_compute[227762]: 2026-01-23 11:07:22.670 227766 DEBUG oslo_concurrency.processutils [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:07:22 np0005593234 nova_compute[227762]: 2026-01-23 11:07:22.676 227766 DEBUG nova.compute.provider_tree [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:07:22 np0005593234 nova_compute[227762]: 2026-01-23 11:07:22.715 227766 DEBUG nova.scheduler.client.report [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:07:22 np0005593234 nova_compute[227762]: 2026-01-23 11:07:22.753 227766 DEBUG oslo_concurrency.lockutils [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:22 np0005593234 nova_compute[227762]: 2026-01-23 11:07:22.792 227766 INFO nova.scheduler.client.report [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Deleted allocations for instance dbc61477-df97-4232-9a9e-5b81e6b9b29f#033[00m
Jan 23 06:07:22 np0005593234 nova_compute[227762]: 2026-01-23 11:07:22.856 227766 DEBUG oslo_concurrency.lockutils [None req-a586da77-b48b-4ea8-88d7-e9d4dc30f957 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "dbc61477-df97-4232-9a9e-5b81e6b9b29f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 13.338s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:23.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:07:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:24.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:07:25 np0005593234 nova_compute[227762]: 2026-01-23 11:07:25.157 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769166430.1559646, dbc61477-df97-4232-9a9e-5b81e6b9b29f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:07:25 np0005593234 nova_compute[227762]: 2026-01-23 11:07:25.158 227766 INFO nova.compute.manager [-] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] VM Stopped (Lifecycle Event)#033[00m
Jan 23 06:07:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:07:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:25.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:07:25 np0005593234 nova_compute[227762]: 2026-01-23 11:07:25.224 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:25 np0005593234 nova_compute[227762]: 2026-01-23 11:07:25.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:07:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:26.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:07:26 np0005593234 nova_compute[227762]: 2026-01-23 11:07:26.232 227766 DEBUG nova.compute.manager [None req-6bbbd81f-0c1e-4c95-9912-3ce8f9674143 - - - - - -] [instance: dbc61477-df97-4232-9a9e-5b81e6b9b29f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:07:26 np0005593234 nova_compute[227762]: 2026-01-23 11:07:26.234 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:26 np0005593234 nova_compute[227762]: 2026-01-23 11:07:26.234 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:26 np0005593234 nova_compute[227762]: 2026-01-23 11:07:26.234 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:26 np0005593234 nova_compute[227762]: 2026-01-23 11:07:26.234 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:07:26 np0005593234 nova_compute[227762]: 2026-01-23 11:07:26.235 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:07:26 np0005593234 nova_compute[227762]: 2026-01-23 11:07:26.540 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:07:26 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/331511642' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:07:26 np0005593234 nova_compute[227762]: 2026-01-23 11:07:26.672 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:07:26 np0005593234 nova_compute[227762]: 2026-01-23 11:07:26.821 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:07:26 np0005593234 nova_compute[227762]: 2026-01-23 11:07:26.822 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4116MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:07:26 np0005593234 nova_compute[227762]: 2026-01-23 11:07:26.822 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:26 np0005593234 nova_compute[227762]: 2026-01-23 11:07:26.822 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:07:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:27.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:07:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:07:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:28.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:07:28 np0005593234 nova_compute[227762]: 2026-01-23 11:07:28.422 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:07:28 np0005593234 nova_compute[227762]: 2026-01-23 11:07:28.422 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:07:28 np0005593234 nova_compute[227762]: 2026-01-23 11:07:28.440 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:07:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:07:28 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2806186032' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:07:28 np0005593234 nova_compute[227762]: 2026-01-23 11:07:28.881 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:07:28 np0005593234 nova_compute[227762]: 2026-01-23 11:07:28.888 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:07:28 np0005593234 nova_compute[227762]: 2026-01-23 11:07:28.917 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:07:28 np0005593234 nova_compute[227762]: 2026-01-23 11:07:28.944 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:07:28 np0005593234 nova_compute[227762]: 2026-01-23 11:07:28.945 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 2.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:07:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:29.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:07:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:30.212 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:30 np0005593234 nova_compute[227762]: 2026-01-23 11:07:30.228 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:30 np0005593234 nova_compute[227762]: 2026-01-23 11:07:30.537 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:30 np0005593234 nova_compute[227762]: 2026-01-23 11:07:30.615 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:07:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:31.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:07:31 np0005593234 nova_compute[227762]: 2026-01-23 11:07:31.542 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:32.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:32 np0005593234 nova_compute[227762]: 2026-01-23 11:07:32.946 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:32 np0005593234 nova_compute[227762]: 2026-01-23 11:07:32.946 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:07:32 np0005593234 nova_compute[227762]: 2026-01-23 11:07:32.946 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:07:32 np0005593234 nova_compute[227762]: 2026-01-23 11:07:32.963 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:07:32 np0005593234 nova_compute[227762]: 2026-01-23 11:07:32.963 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:33.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:07:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:34.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:07:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:35.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:35 np0005593234 nova_compute[227762]: 2026-01-23 11:07:35.232 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:07:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:36.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:07:36 np0005593234 nova_compute[227762]: 2026-01-23 11:07:36.544 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:07:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:37.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:07:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:38.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:38 np0005593234 nova_compute[227762]: 2026-01-23 11:07:38.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:38 np0005593234 nova_compute[227762]: 2026-01-23 11:07:38.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:39.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:39 np0005593234 nova_compute[227762]: 2026-01-23 11:07:39.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:39 np0005593234 nova_compute[227762]: 2026-01-23 11:07:39.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:07:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:40.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:40 np0005593234 nova_compute[227762]: 2026-01-23 11:07:40.236 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:40 np0005593234 nova_compute[227762]: 2026-01-23 11:07:40.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:41.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:41 np0005593234 nova_compute[227762]: 2026-01-23 11:07:41.548 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:42.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:42.916 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:42.917 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:42.917 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:43.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:43 np0005593234 nova_compute[227762]: 2026-01-23 11:07:43.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:07:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:44.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:44 np0005593234 podman[345150]: 2026-01-23 11:07:44.788160977 +0000 UTC m=+0.075799984 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2)
Jan 23 06:07:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:45.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:45 np0005593234 nova_compute[227762]: 2026-01-23 11:07:45.240 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:46.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:46 np0005593234 nova_compute[227762]: 2026-01-23 11:07:46.549 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:47.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:48.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #214. Immutable memtables: 0.
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.650734) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 137] Flushing memtable with next log file: 214
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166468650793, "job": 137, "event": "flush_started", "num_memtables": 1, "num_entries": 1032, "num_deletes": 251, "total_data_size": 2220907, "memory_usage": 2242016, "flush_reason": "Manual Compaction"}
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 137] Level-0 flush table #215: started
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166468663969, "cf_name": "default", "job": 137, "event": "table_file_creation", "file_number": 215, "file_size": 1466995, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 101338, "largest_seqno": 102365, "table_properties": {"data_size": 1462255, "index_size": 2327, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10433, "raw_average_key_size": 19, "raw_value_size": 1452808, "raw_average_value_size": 2772, "num_data_blocks": 102, "num_entries": 524, "num_filter_entries": 524, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166384, "oldest_key_time": 1769166384, "file_creation_time": 1769166468, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 215, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 137] Flush lasted 13274 microseconds, and 3988 cpu microseconds.
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.664012) [db/flush_job.cc:967] [default] [JOB 137] Level-0 flush table #215: 1466995 bytes OK
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.664027) [db/memtable_list.cc:519] [default] Level-0 commit table #215 started
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.665469) [db/memtable_list.cc:722] [default] Level-0 commit table #215: memtable #1 done
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.665483) EVENT_LOG_v1 {"time_micros": 1769166468665478, "job": 137, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.665499) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 137] Try to delete WAL files size 2215760, prev total WAL file size 2215760, number of live WAL files 2.
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000211.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.666135) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730039303336' seq:72057594037927935, type:22 .. '7061786F730039323838' seq:0, type:0; will stop at (end)
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 138] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 137 Base level 0, inputs: [215(1432KB)], [213(12MB)]
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166468666215, "job": 138, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [215], "files_L6": [213], "score": -1, "input_data_size": 14142953, "oldest_snapshot_seqno": -1}
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 138] Generated table #216: 11543 keys, 12191057 bytes, temperature: kUnknown
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166468725313, "cf_name": "default", "job": 138, "event": "table_file_creation", "file_number": 216, "file_size": 12191057, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12120822, "index_size": 40298, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 28869, "raw_key_size": 306913, "raw_average_key_size": 26, "raw_value_size": 11923423, "raw_average_value_size": 1032, "num_data_blocks": 1512, "num_entries": 11543, "num_filter_entries": 11543, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769166468, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 216, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.725613) [db/compaction/compaction_job.cc:1663] [default] [JOB 138] Compacted 1@0 + 1@6 files to L6 => 12191057 bytes
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.727767) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 238.9 rd, 205.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 12.1 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(18.0) write-amplify(8.3) OK, records in: 12060, records dropped: 517 output_compression: NoCompression
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.727783) EVENT_LOG_v1 {"time_micros": 1769166468727775, "job": 138, "event": "compaction_finished", "compaction_time_micros": 59199, "compaction_time_cpu_micros": 29000, "output_level": 6, "num_output_files": 1, "total_output_size": 12191057, "num_input_records": 12060, "num_output_records": 11543, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000215.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166468728070, "job": 138, "event": "table_file_deletion", "file_number": 215}
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000213.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166468729997, "job": 138, "event": "table_file_deletion", "file_number": 213}
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.666032) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.730058) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.730065) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.730067) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.730069) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:07:48 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:07:48.730070) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:07:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:07:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:49.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:07:49 np0005593234 nova_compute[227762]: 2026-01-23 11:07:49.678 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "abd79b15-c133-4b9f-ae4c-545309f52322" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:49 np0005593234 nova_compute[227762]: 2026-01-23 11:07:49.679 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:49 np0005593234 nova_compute[227762]: 2026-01-23 11:07:49.694 227766 DEBUG nova.compute.manager [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 06:07:49 np0005593234 nova_compute[227762]: 2026-01-23 11:07:49.787 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:49 np0005593234 nova_compute[227762]: 2026-01-23 11:07:49.788 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:49 np0005593234 nova_compute[227762]: 2026-01-23 11:07:49.794 227766 DEBUG nova.virt.hardware [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 06:07:49 np0005593234 nova_compute[227762]: 2026-01-23 11:07:49.794 227766 INFO nova.compute.claims [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.093 227766 DEBUG oslo_concurrency.processutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:07:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:50.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.244 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:07:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3174168256' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.536 227766 DEBUG oslo_concurrency.processutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.543 227766 DEBUG nova.compute.provider_tree [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.563 227766 DEBUG nova.scheduler.client.report [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.591 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.803s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.592 227766 DEBUG nova.compute.manager [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.638 227766 DEBUG nova.compute.manager [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.638 227766 DEBUG nova.network.neutron [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.660 227766 INFO nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.679 227766 DEBUG nova.compute.manager [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.773 227766 DEBUG nova.compute.manager [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.774 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.775 227766 INFO nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Creating image(s)#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.806 227766 DEBUG nova.storage.rbd_utils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image abd79b15-c133-4b9f-ae4c-545309f52322_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.838 227766 DEBUG nova.storage.rbd_utils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image abd79b15-c133-4b9f-ae4c-545309f52322_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.873 227766 DEBUG nova.storage.rbd_utils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image abd79b15-c133-4b9f-ae4c-545309f52322_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.877 227766 DEBUG oslo_concurrency.processutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.912 227766 DEBUG nova.policy [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '420c366dc5dc45a48da4e0b18c93043f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'c06f98b51aeb48de91d116fda54a161f', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.948 227766 DEBUG oslo_concurrency.processutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.949 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.949 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.950 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.976 227766 DEBUG nova.storage.rbd_utils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image abd79b15-c133-4b9f-ae4c-545309f52322_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:07:50 np0005593234 nova_compute[227762]: 2026-01-23 11:07:50.979 227766 DEBUG oslo_concurrency.processutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 abd79b15-c133-4b9f-ae4c-545309f52322_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:07:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:51.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:51 np0005593234 nova_compute[227762]: 2026-01-23 11:07:51.452 227766 DEBUG oslo_concurrency.processutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 abd79b15-c133-4b9f-ae4c-545309f52322_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:07:51 np0005593234 nova_compute[227762]: 2026-01-23 11:07:51.524 227766 DEBUG nova.storage.rbd_utils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] resizing rbd image abd79b15-c133-4b9f-ae4c-545309f52322_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288#033[00m
Jan 23 06:07:51 np0005593234 nova_compute[227762]: 2026-01-23 11:07:51.562 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:51 np0005593234 nova_compute[227762]: 2026-01-23 11:07:51.702 227766 DEBUG nova.network.neutron [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Successfully created port: 577aba9c-4ead-4d00-b570-3cb9d0fe2866 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 06:07:51 np0005593234 nova_compute[227762]: 2026-01-23 11:07:51.877 227766 DEBUG nova.objects.instance [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'migration_context' on Instance uuid abd79b15-c133-4b9f-ae4c-545309f52322 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:07:51 np0005593234 nova_compute[227762]: 2026-01-23 11:07:51.903 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 06:07:51 np0005593234 nova_compute[227762]: 2026-01-23 11:07:51.903 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Ensure instance console log exists: /var/lib/nova/instances/abd79b15-c133-4b9f-ae4c-545309f52322/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 06:07:51 np0005593234 nova_compute[227762]: 2026-01-23 11:07:51.904 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:51 np0005593234 nova_compute[227762]: 2026-01-23 11:07:51.904 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:51 np0005593234 nova_compute[227762]: 2026-01-23 11:07:51.904 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:52.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:52 np0005593234 nova_compute[227762]: 2026-01-23 11:07:52.450 227766 DEBUG nova.network.neutron [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Successfully updated port: 577aba9c-4ead-4d00-b570-3cb9d0fe2866 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 06:07:52 np0005593234 nova_compute[227762]: 2026-01-23 11:07:52.476 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:07:52 np0005593234 nova_compute[227762]: 2026-01-23 11:07:52.476 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:07:52 np0005593234 nova_compute[227762]: 2026-01-23 11:07:52.476 227766 DEBUG nova.network.neutron [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:07:52 np0005593234 nova_compute[227762]: 2026-01-23 11:07:52.559 227766 DEBUG nova.compute.manager [req-911941f1-51f7-4222-82e6-f4e7d0afae02 req-52ac2de8-b122-43fa-87ee-206796e94c26 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received event network-changed-577aba9c-4ead-4d00-b570-3cb9d0fe2866 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:07:52 np0005593234 nova_compute[227762]: 2026-01-23 11:07:52.560 227766 DEBUG nova.compute.manager [req-911941f1-51f7-4222-82e6-f4e7d0afae02 req-52ac2de8-b122-43fa-87ee-206796e94c26 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Refreshing instance network info cache due to event network-changed-577aba9c-4ead-4d00-b570-3cb9d0fe2866. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:07:52 np0005593234 nova_compute[227762]: 2026-01-23 11:07:52.561 227766 DEBUG oslo_concurrency.lockutils [req-911941f1-51f7-4222-82e6-f4e7d0afae02 req-52ac2de8-b122-43fa-87ee-206796e94c26 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:07:52 np0005593234 nova_compute[227762]: 2026-01-23 11:07:52.613 227766 DEBUG nova.network.neutron [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 06:07:52 np0005593234 podman[345362]: 2026-01-23 11:07:52.854640606 +0000 UTC m=+0.139112744 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Jan 23 06:07:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:53.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.381 227766 DEBUG nova.network.neutron [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Updating instance_info_cache with network_info: [{"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.417 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.418 227766 DEBUG nova.compute.manager [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Instance network_info: |[{"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.418 227766 DEBUG oslo_concurrency.lockutils [req-911941f1-51f7-4222-82e6-f4e7d0afae02 req-52ac2de8-b122-43fa-87ee-206796e94c26 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.419 227766 DEBUG nova.network.neutron [req-911941f1-51f7-4222-82e6-f4e7d0afae02 req-52ac2de8-b122-43fa-87ee-206796e94c26 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Refreshing network info cache for port 577aba9c-4ead-4d00-b570-3cb9d0fe2866 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.423 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Start _get_guest_xml network_info=[{"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.427 227766 WARNING nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.432 227766 DEBUG nova.virt.libvirt.host [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.433 227766 DEBUG nova.virt.libvirt.host [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.435 227766 DEBUG nova.virt.libvirt.host [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.436 227766 DEBUG nova.virt.libvirt.host [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.437 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.438 227766 DEBUG nova.virt.hardware [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.438 227766 DEBUG nova.virt.hardware [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.438 227766 DEBUG nova.virt.hardware [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.439 227766 DEBUG nova.virt.hardware [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.439 227766 DEBUG nova.virt.hardware [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.439 227766 DEBUG nova.virt.hardware [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.439 227766 DEBUG nova.virt.hardware [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.440 227766 DEBUG nova.virt.hardware [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.440 227766 DEBUG nova.virt.hardware [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.440 227766 DEBUG nova.virt.hardware [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.440 227766 DEBUG nova.virt.hardware [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.444 227766 DEBUG oslo_concurrency.processutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:07:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:07:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/876996690' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:07:53 np0005593234 nova_compute[227762]: 2026-01-23 11:07:53.996 227766 DEBUG oslo_concurrency.processutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.552s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.035 227766 DEBUG nova.storage.rbd_utils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image abd79b15-c133-4b9f-ae4c-545309f52322_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.039 227766 DEBUG oslo_concurrency.processutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:07:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:54.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:07:54 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2007743820' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.494 227766 DEBUG oslo_concurrency.processutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.497 227766 DEBUG nova.virt.libvirt.vif [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:07:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1045500755',display_name='tempest-TestNetworkAdvancedServerOps-server-1045500755',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1045500755',id=218,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKWebjGpREtiyJMsjND/HOQ0onI8JYEhYgd5aCBoyMWcKs/YexmbUn3MXfhZKI/tI0Y2jhFftd8sJYYag+FX9uiRDMUE0Xzgy1B+1zHl6CM829ibbYed2eKCx5OjnHYcWg==',key_name='tempest-TestNetworkAdvancedServerOps-709731046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-z75uh4kf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:07:50Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=abd79b15-c133-4b9f-ae4c-545309f52322,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.498 227766 DEBUG nova.network.os_vif_util [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.499 227766 DEBUG nova.network.os_vif_util [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:c1:86,bridge_name='br-int',has_traffic_filtering=True,id=577aba9c-4ead-4d00-b570-3cb9d0fe2866,network=Network(da794c9e-a13c-4a02-a110-50b557ab3517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap577aba9c-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.502 227766 DEBUG nova.objects.instance [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_devices' on Instance uuid abd79b15-c133-4b9f-ae4c-545309f52322 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.524 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] End _get_guest_xml xml=<domain type="kvm">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  <uuid>abd79b15-c133-4b9f-ae4c-545309f52322</uuid>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  <name>instance-000000da</name>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1045500755</nova:name>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 11:07:53</nova:creationTime>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <nova:user uuid="420c366dc5dc45a48da4e0b18c93043f">tempest-TestNetworkAdvancedServerOps-1886747874-project-member</nova:user>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <nova:project uuid="c06f98b51aeb48de91d116fda54a161f">tempest-TestNetworkAdvancedServerOps-1886747874</nova:project>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <nova:port uuid="577aba9c-4ead-4d00-b570-3cb9d0fe2866">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <system>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <entry name="serial">abd79b15-c133-4b9f-ae4c-545309f52322</entry>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <entry name="uuid">abd79b15-c133-4b9f-ae4c-545309f52322</entry>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    </system>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  <os>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  </os>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  <features>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  </features>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  </clock>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  <devices>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/abd79b15-c133-4b9f-ae4c-545309f52322_disk">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/abd79b15-c133-4b9f-ae4c-545309f52322_disk.config">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:38:c1:86"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <target dev="tap577aba9c-4e"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    </interface>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/abd79b15-c133-4b9f-ae4c-545309f52322/console.log" append="off"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    </serial>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <video>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    </video>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    </rng>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 06:07:54 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 06:07:54 np0005593234 nova_compute[227762]:  </devices>
Jan 23 06:07:54 np0005593234 nova_compute[227762]: </domain>
Jan 23 06:07:54 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.526 227766 DEBUG nova.compute.manager [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Preparing to wait for external event network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.526 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.526 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.527 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.527 227766 DEBUG nova.virt.libvirt.vif [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:07:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1045500755',display_name='tempest-TestNetworkAdvancedServerOps-server-1045500755',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1045500755',id=218,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKWebjGpREtiyJMsjND/HOQ0onI8JYEhYgd5aCBoyMWcKs/YexmbUn3MXfhZKI/tI0Y2jhFftd8sJYYag+FX9uiRDMUE0Xzgy1B+1zHl6CM829ibbYed2eKCx5OjnHYcWg==',key_name='tempest-TestNetworkAdvancedServerOps-709731046',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-z75uh4kf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:07:50Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=abd79b15-c133-4b9f-ae4c-545309f52322,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.528 227766 DEBUG nova.network.os_vif_util [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.528 227766 DEBUG nova.network.os_vif_util [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:c1:86,bridge_name='br-int',has_traffic_filtering=True,id=577aba9c-4ead-4d00-b570-3cb9d0fe2866,network=Network(da794c9e-a13c-4a02-a110-50b557ab3517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap577aba9c-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.528 227766 DEBUG os_vif [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:c1:86,bridge_name='br-int',has_traffic_filtering=True,id=577aba9c-4ead-4d00-b570-3cb9d0fe2866,network=Network(da794c9e-a13c-4a02-a110-50b557ab3517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap577aba9c-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.529 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.530 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.530 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.533 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.533 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap577aba9c-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.534 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap577aba9c-4e, col_values=(('external_ids', {'iface-id': '577aba9c-4ead-4d00-b570-3cb9d0fe2866', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:c1:86', 'vm-uuid': 'abd79b15-c133-4b9f-ae4c-545309f52322'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:07:54 np0005593234 NetworkManager[48942]: <info>  [1769166474.5362] manager: (tap577aba9c-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/481)
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.537 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.543 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.543 227766 INFO os_vif [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:c1:86,bridge_name='br-int',has_traffic_filtering=True,id=577aba9c-4ead-4d00-b570-3cb9d0fe2866,network=Network(da794c9e-a13c-4a02-a110-50b557ab3517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap577aba9c-4e')#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.588 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.588 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.588 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] No VIF found with MAC fa:16:3e:38:c1:86, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.589 227766 INFO nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Using config drive#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.611 227766 DEBUG nova.storage.rbd_utils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image abd79b15-c133-4b9f-ae4c-545309f52322_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.920 227766 INFO nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Creating config drive at /var/lib/nova/instances/abd79b15-c133-4b9f-ae4c-545309f52322/disk.config#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.927 227766 DEBUG oslo_concurrency.processutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/abd79b15-c133-4b9f-ae4c-545309f52322/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa5aeyro0 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.984 227766 DEBUG nova.network.neutron [req-911941f1-51f7-4222-82e6-f4e7d0afae02 req-52ac2de8-b122-43fa-87ee-206796e94c26 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Updated VIF entry in instance network info cache for port 577aba9c-4ead-4d00-b570-3cb9d0fe2866. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:07:54 np0005593234 nova_compute[227762]: 2026-01-23 11:07:54.986 227766 DEBUG nova.network.neutron [req-911941f1-51f7-4222-82e6-f4e7d0afae02 req-52ac2de8-b122-43fa-87ee-206796e94c26 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Updating instance_info_cache with network_info: [{"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:07:55 np0005593234 nova_compute[227762]: 2026-01-23 11:07:55.006 227766 DEBUG oslo_concurrency.lockutils [req-911941f1-51f7-4222-82e6-f4e7d0afae02 req-52ac2de8-b122-43fa-87ee-206796e94c26 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:07:55 np0005593234 nova_compute[227762]: 2026-01-23 11:07:55.066 227766 DEBUG oslo_concurrency.processutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/abd79b15-c133-4b9f-ae4c-545309f52322/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpa5aeyro0" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:07:55 np0005593234 nova_compute[227762]: 2026-01-23 11:07:55.107 227766 DEBUG nova.storage.rbd_utils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] rbd image abd79b15-c133-4b9f-ae4c-545309f52322_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:07:55 np0005593234 nova_compute[227762]: 2026-01-23 11:07:55.110 227766 DEBUG oslo_concurrency.processutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/abd79b15-c133-4b9f-ae4c-545309f52322/disk.config abd79b15-c133-4b9f-ae4c-545309f52322_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:07:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:55.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:55 np0005593234 nova_compute[227762]: 2026-01-23 11:07:55.602 227766 DEBUG oslo_concurrency.processutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/abd79b15-c133-4b9f-ae4c-545309f52322/disk.config abd79b15-c133-4b9f-ae4c-545309f52322_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:07:55 np0005593234 nova_compute[227762]: 2026-01-23 11:07:55.603 227766 INFO nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Deleting local config drive /var/lib/nova/instances/abd79b15-c133-4b9f-ae4c-545309f52322/disk.config because it was imported into RBD.#033[00m
Jan 23 06:07:55 np0005593234 kernel: tap577aba9c-4e: entered promiscuous mode
Jan 23 06:07:55 np0005593234 NetworkManager[48942]: <info>  [1769166475.6690] manager: (tap577aba9c-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/482)
Jan 23 06:07:55 np0005593234 ovn_controller[134547]: 2026-01-23T11:07:55Z|01003|binding|INFO|Claiming lport 577aba9c-4ead-4d00-b570-3cb9d0fe2866 for this chassis.
Jan 23 06:07:55 np0005593234 ovn_controller[134547]: 2026-01-23T11:07:55Z|01004|binding|INFO|577aba9c-4ead-4d00-b570-3cb9d0fe2866: Claiming fa:16:3e:38:c1:86 10.100.0.5
Jan 23 06:07:55 np0005593234 nova_compute[227762]: 2026-01-23 11:07:55.670 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.682 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:c1:86 10.100.0.5'], port_security=['fa:16:3e:38:c1:86 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'abd79b15-c133-4b9f-ae4c-545309f52322', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da794c9e-a13c-4a02-a110-50b557ab3517', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1d5b2423-0771-4eca-b525-6940c398c745', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71aaf5a5-1e19-4c40-a6d5-6d2650b9b03f, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=577aba9c-4ead-4d00-b570-3cb9d0fe2866) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.683 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 577aba9c-4ead-4d00-b570-3cb9d0fe2866 in datapath da794c9e-a13c-4a02-a110-50b557ab3517 bound to our chassis#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.684 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network da794c9e-a13c-4a02-a110-50b557ab3517#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.701 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8632950c-821f-4131-a8a9-605c594441c0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.703 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapda794c9e-a1 in ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.704 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapda794c9e-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.704 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1094fbc0-4935-4e12-bc11-565de0e20c51]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.706 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[21dc7eb1-704a-4acb-aef9-f13158ce5448]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:55 np0005593234 systemd-machined[195626]: New machine qemu-112-instance-000000da.
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.725 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[51f927c9-08a7-4de3-8989-0abd7c04884a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.750 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8e724b5e-7a04-4491-a2fd-cbf9d0d73952]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:55 np0005593234 systemd[1]: Started Virtual Machine qemu-112-instance-000000da.
Jan 23 06:07:55 np0005593234 nova_compute[227762]: 2026-01-23 11:07:55.768 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:55 np0005593234 ovn_controller[134547]: 2026-01-23T11:07:55Z|01005|binding|INFO|Setting lport 577aba9c-4ead-4d00-b570-3cb9d0fe2866 ovn-installed in OVS
Jan 23 06:07:55 np0005593234 ovn_controller[134547]: 2026-01-23T11:07:55Z|01006|binding|INFO|Setting lport 577aba9c-4ead-4d00-b570-3cb9d0fe2866 up in Southbound
Jan 23 06:07:55 np0005593234 nova_compute[227762]: 2026-01-23 11:07:55.774 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:55 np0005593234 systemd-udevd[345528]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:07:55 np0005593234 NetworkManager[48942]: <info>  [1769166475.7968] device (tap577aba9c-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:07:55 np0005593234 NetworkManager[48942]: <info>  [1769166475.7977] device (tap577aba9c-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.800 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[62637b59-984d-4da0-b5a8-6ddc07cdab5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.806 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[aa7e77b4-061f-4abe-b8a0-c578ffc79a0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:55 np0005593234 systemd-udevd[345531]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:07:55 np0005593234 NetworkManager[48942]: <info>  [1769166475.8082] manager: (tapda794c9e-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/483)
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.840 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[fe48cc85-0abf-4025-979a-dc8818e24ff8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.844 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[cfcfcf89-1332-4fdc-949c-12ce89f28cbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:55 np0005593234 NetworkManager[48942]: <info>  [1769166475.8674] device (tapda794c9e-a0): carrier: link connected
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.875 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a88349eb-e141-45a9-bc33-8e9d5a1e01d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.896 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[64a0d7eb-040d-49e3-b703-c69ede6bba05]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapda794c9e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:fb:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 309], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1037082, 'reachable_time': 17304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 345557, 'error': None, 'target': 'ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.915 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ef263a81-54f8-4f5b-aa5d-efd784a53486]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:fbca'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1037082, 'tstamp': 1037082}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 345558, 'error': None, 'target': 'ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.933 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[417b6b18-dec5-4124-8eda-2ca206bf8720]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapda794c9e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:fb:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 309], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1037082, 'reachable_time': 17304, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 345559, 'error': None, 'target': 'ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:55.974 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a6fbecb1-5860-42ff-875e-20813bf40db7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:56.044 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cc65930a-570b-4ae8-94c8-4c2cb4be9c3b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:56.046 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda794c9e-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:56.047 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:56.048 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda794c9e-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:07:56 np0005593234 NetworkManager[48942]: <info>  [1769166476.0505] manager: (tapda794c9e-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/484)
Jan 23 06:07:56 np0005593234 nova_compute[227762]: 2026-01-23 11:07:56.050 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:56 np0005593234 kernel: tapda794c9e-a0: entered promiscuous mode
Jan 23 06:07:56 np0005593234 nova_compute[227762]: 2026-01-23 11:07:56.052 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:56.058 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapda794c9e-a0, col_values=(('external_ids', {'iface-id': '6c5ce001-b215-4367-b2cd-4f00d251a399'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:07:56 np0005593234 nova_compute[227762]: 2026-01-23 11:07:56.059 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:56 np0005593234 ovn_controller[134547]: 2026-01-23T11:07:56Z|01007|binding|INFO|Releasing lport 6c5ce001-b215-4367-b2cd-4f00d251a399 from this chassis (sb_readonly=0)
Jan 23 06:07:56 np0005593234 nova_compute[227762]: 2026-01-23 11:07:56.060 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:56.063 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/da794c9e-a13c-4a02-a110-50b557ab3517.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/da794c9e-a13c-4a02-a110-50b557ab3517.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:56.064 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[dd166dc6-13c3-4bd1-b731-d5c6347149b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:56.065 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-da794c9e-a13c-4a02-a110-50b557ab3517
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/da794c9e-a13c-4a02-a110-50b557ab3517.pid.haproxy
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID da794c9e-a13c-4a02-a110-50b557ab3517
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 06:07:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:07:56.067 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517', 'env', 'PROCESS_TAG=haproxy-da794c9e-a13c-4a02-a110-50b557ab3517', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/da794c9e-a13c-4a02-a110-50b557ab3517.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 06:07:56 np0005593234 nova_compute[227762]: 2026-01-23 11:07:56.078 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:56.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:56 np0005593234 nova_compute[227762]: 2026-01-23 11:07:56.280 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166476.2802036, abd79b15-c133-4b9f-ae4c-545309f52322 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:07:56 np0005593234 nova_compute[227762]: 2026-01-23 11:07:56.280 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] VM Started (Lifecycle Event)#033[00m
Jan 23 06:07:56 np0005593234 nova_compute[227762]: 2026-01-23 11:07:56.296 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:07:56 np0005593234 nova_compute[227762]: 2026-01-23 11:07:56.299 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166476.2803237, abd79b15-c133-4b9f-ae4c-545309f52322 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:07:56 np0005593234 nova_compute[227762]: 2026-01-23 11:07:56.300 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] VM Paused (Lifecycle Event)#033[00m
Jan 23 06:07:56 np0005593234 nova_compute[227762]: 2026-01-23 11:07:56.317 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:07:56 np0005593234 nova_compute[227762]: 2026-01-23 11:07:56.320 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:07:56 np0005593234 nova_compute[227762]: 2026-01-23 11:07:56.337 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:07:56 np0005593234 podman[345634]: 2026-01-23 11:07:56.490316788 +0000 UTC m=+0.052864316 container create 161dc34d28f942652dac07cbab15dc3ff86d0d6481131bae35a7f82cf4fb731b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Jan 23 06:07:56 np0005593234 systemd[1]: Started libpod-conmon-161dc34d28f942652dac07cbab15dc3ff86d0d6481131bae35a7f82cf4fb731b.scope.
Jan 23 06:07:56 np0005593234 nova_compute[227762]: 2026-01-23 11:07:56.553 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:56 np0005593234 podman[345634]: 2026-01-23 11:07:56.461030702 +0000 UTC m=+0.023578230 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:07:56 np0005593234 systemd[1]: Started libcrun container.
Jan 23 06:07:56 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3f17056045d543b14438755b547a11d9db3c894e12df7ee20fb547c95d161fc0/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:07:56 np0005593234 podman[345634]: 2026-01-23 11:07:56.591235606 +0000 UTC m=+0.153783164 container init 161dc34d28f942652dac07cbab15dc3ff86d0d6481131bae35a7f82cf4fb731b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2)
Jan 23 06:07:56 np0005593234 podman[345634]: 2026-01-23 11:07:56.596966176 +0000 UTC m=+0.159513704 container start 161dc34d28f942652dac07cbab15dc3ff86d0d6481131bae35a7f82cf4fb731b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 06:07:56 np0005593234 neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517[345649]: [NOTICE]   (345653) : New worker (345655) forked
Jan 23 06:07:56 np0005593234 neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517[345649]: [NOTICE]   (345653) : Loading success.
Jan 23 06:07:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:57.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.279 227766 DEBUG nova.compute.manager [req-b61dc5fd-d7b4-4864-8da8-9c82138ba394 req-f9dc293d-1975-4eb4-ad65-ad2ce642a502 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received event network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.280 227766 DEBUG oslo_concurrency.lockutils [req-b61dc5fd-d7b4-4864-8da8-9c82138ba394 req-f9dc293d-1975-4eb4-ad65-ad2ce642a502 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.280 227766 DEBUG oslo_concurrency.lockutils [req-b61dc5fd-d7b4-4864-8da8-9c82138ba394 req-f9dc293d-1975-4eb4-ad65-ad2ce642a502 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.280 227766 DEBUG oslo_concurrency.lockutils [req-b61dc5fd-d7b4-4864-8da8-9c82138ba394 req-f9dc293d-1975-4eb4-ad65-ad2ce642a502 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.281 227766 DEBUG nova.compute.manager [req-b61dc5fd-d7b4-4864-8da8-9c82138ba394 req-f9dc293d-1975-4eb4-ad65-ad2ce642a502 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Processing event network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.281 227766 DEBUG nova.compute.manager [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.285 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166477.2856195, abd79b15-c133-4b9f-ae4c-545309f52322 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.286 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.287 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.291 227766 INFO nova.virt.libvirt.driver [-] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Instance spawned successfully.#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.291 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.308 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.312 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.316 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.317 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.317 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.317 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.318 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.318 227766 DEBUG nova.virt.libvirt.driver [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.330 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.372 227766 INFO nova.compute.manager [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Took 6.60 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.373 227766 DEBUG nova.compute.manager [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.449 227766 INFO nova.compute.manager [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Took 7.69 seconds to build instance.#033[00m
Jan 23 06:07:57 np0005593234 nova_compute[227762]: 2026-01-23 11:07:57.469 227766 DEBUG oslo_concurrency.lockutils [None req-ccffd73b-20f0-44ac-8154-99b5e540e9de 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.791s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:07:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:07:58.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:07:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:07:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:07:59.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:07:59 np0005593234 nova_compute[227762]: 2026-01-23 11:07:59.368 227766 DEBUG nova.compute.manager [req-69813469-ab76-4511-a457-097a3471e8e8 req-f160d641-baf7-40e8-bfe9-2aa21bf50d50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received event network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:07:59 np0005593234 nova_compute[227762]: 2026-01-23 11:07:59.368 227766 DEBUG oslo_concurrency.lockutils [req-69813469-ab76-4511-a457-097a3471e8e8 req-f160d641-baf7-40e8-bfe9-2aa21bf50d50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:07:59 np0005593234 nova_compute[227762]: 2026-01-23 11:07:59.369 227766 DEBUG oslo_concurrency.lockutils [req-69813469-ab76-4511-a457-097a3471e8e8 req-f160d641-baf7-40e8-bfe9-2aa21bf50d50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:07:59 np0005593234 nova_compute[227762]: 2026-01-23 11:07:59.369 227766 DEBUG oslo_concurrency.lockutils [req-69813469-ab76-4511-a457-097a3471e8e8 req-f160d641-baf7-40e8-bfe9-2aa21bf50d50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:07:59 np0005593234 nova_compute[227762]: 2026-01-23 11:07:59.369 227766 DEBUG nova.compute.manager [req-69813469-ab76-4511-a457-097a3471e8e8 req-f160d641-baf7-40e8-bfe9-2aa21bf50d50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] No waiting events found dispatching network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:07:59 np0005593234 nova_compute[227762]: 2026-01-23 11:07:59.369 227766 WARNING nova.compute.manager [req-69813469-ab76-4511-a457-097a3471e8e8 req-f160d641-baf7-40e8-bfe9-2aa21bf50d50 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received unexpected event network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 for instance with vm_state active and task_state None.#033[00m
Jan 23 06:07:59 np0005593234 nova_compute[227762]: 2026-01-23 11:07:59.537 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:07:59 np0005593234 nova_compute[227762]: 2026-01-23 11:07:59.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:00.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:01.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:01 np0005593234 nova_compute[227762]: 2026-01-23 11:08:01.555 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:02 np0005593234 nova_compute[227762]: 2026-01-23 11:08:02.071 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:02 np0005593234 NetworkManager[48942]: <info>  [1769166482.0723] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/485)
Jan 23 06:08:02 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:02Z|01008|binding|INFO|Releasing lport 6c5ce001-b215-4367-b2cd-4f00d251a399 from this chassis (sb_readonly=0)
Jan 23 06:08:02 np0005593234 NetworkManager[48942]: <info>  [1769166482.0741] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/486)
Jan 23 06:08:02 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:02Z|01009|binding|INFO|Releasing lport 6c5ce001-b215-4367-b2cd-4f00d251a399 from this chassis (sb_readonly=0)
Jan 23 06:08:02 np0005593234 nova_compute[227762]: 2026-01-23 11:08:02.109 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:02 np0005593234 nova_compute[227762]: 2026-01-23 11:08:02.114 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:02.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:02 np0005593234 nova_compute[227762]: 2026-01-23 11:08:02.932 227766 DEBUG nova.compute.manager [req-6851fe0c-bee4-4ea9-a2a9-ca5512073019 req-199c810a-500a-40cf-94fb-8b872181c0ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received event network-changed-577aba9c-4ead-4d00-b570-3cb9d0fe2866 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:08:02 np0005593234 nova_compute[227762]: 2026-01-23 11:08:02.933 227766 DEBUG nova.compute.manager [req-6851fe0c-bee4-4ea9-a2a9-ca5512073019 req-199c810a-500a-40cf-94fb-8b872181c0ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Refreshing instance network info cache due to event network-changed-577aba9c-4ead-4d00-b570-3cb9d0fe2866. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:08:02 np0005593234 nova_compute[227762]: 2026-01-23 11:08:02.933 227766 DEBUG oslo_concurrency.lockutils [req-6851fe0c-bee4-4ea9-a2a9-ca5512073019 req-199c810a-500a-40cf-94fb-8b872181c0ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:08:02 np0005593234 nova_compute[227762]: 2026-01-23 11:08:02.933 227766 DEBUG oslo_concurrency.lockutils [req-6851fe0c-bee4-4ea9-a2a9-ca5512073019 req-199c810a-500a-40cf-94fb-8b872181c0ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:08:02 np0005593234 nova_compute[227762]: 2026-01-23 11:08:02.934 227766 DEBUG nova.network.neutron [req-6851fe0c-bee4-4ea9-a2a9-ca5512073019 req-199c810a-500a-40cf-94fb-8b872181c0ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Refreshing network info cache for port 577aba9c-4ead-4d00-b570-3cb9d0fe2866 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:08:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:03.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:04 np0005593234 nova_compute[227762]: 2026-01-23 11:08:04.070 227766 DEBUG nova.network.neutron [req-6851fe0c-bee4-4ea9-a2a9-ca5512073019 req-199c810a-500a-40cf-94fb-8b872181c0ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Updated VIF entry in instance network info cache for port 577aba9c-4ead-4d00-b570-3cb9d0fe2866. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:08:04 np0005593234 nova_compute[227762]: 2026-01-23 11:08:04.072 227766 DEBUG nova.network.neutron [req-6851fe0c-bee4-4ea9-a2a9-ca5512073019 req-199c810a-500a-40cf-94fb-8b872181c0ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Updating instance_info_cache with network_info: [{"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:08:04 np0005593234 nova_compute[227762]: 2026-01-23 11:08:04.092 227766 DEBUG oslo_concurrency.lockutils [req-6851fe0c-bee4-4ea9-a2a9-ca5512073019 req-199c810a-500a-40cf-94fb-8b872181c0ef 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:08:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:04.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:04 np0005593234 nova_compute[227762]: 2026-01-23 11:08:04.539 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:05.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:06.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:06 np0005593234 nova_compute[227762]: 2026-01-23 11:08:06.558 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:07.255 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:08.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:09 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #59. Immutable memtables: 15.
Jan 23 06:08:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:09.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:09 np0005593234 nova_compute[227762]: 2026-01-23 11:08:09.541 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:10.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:08:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:11.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:08:11 np0005593234 nova_compute[227762]: 2026-01-23 11:08:11.560 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:12.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:13.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 06:08:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 06:08:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:08:13 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:08:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:14.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:14 np0005593234 nova_compute[227762]: 2026-01-23 11:08:14.543 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:14 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:14Z|00133|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:38:c1:86 10.100.0.5
Jan 23 06:08:14 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:14Z|00134|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:38:c1:86 10.100.0.5
Jan 23 06:08:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:15.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:15 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:08:15 np0005593234 podman[345977]: 2026-01-23 11:08:15.764276836 +0000 UTC m=+0.054191298 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 06:08:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:08:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:16.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:08:16 np0005593234 nova_compute[227762]: 2026-01-23 11:08:16.563 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:17.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:08:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:08:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:18.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:19.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:19 np0005593234 nova_compute[227762]: 2026-01-23 11:08:19.546 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:20.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:21.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:21 np0005593234 nova_compute[227762]: 2026-01-23 11:08:21.565 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:22.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:23.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:23 np0005593234 podman[346051]: 2026-01-23 11:08:23.837103074 +0000 UTC m=+0.114611808 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, 
io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:08:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:24.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:24 np0005593234 nova_compute[227762]: 2026-01-23 11:08:24.549 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:25.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:25 np0005593234 nova_compute[227762]: 2026-01-23 11:08:25.448 227766 INFO nova.compute.manager [None req-d200b6a2-23a4-4fa7-837d-1cb6ba8e745b 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Get console output#033[00m
Jan 23 06:08:25 np0005593234 nova_compute[227762]: 2026-01-23 11:08:25.455 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 06:08:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:26.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:26 np0005593234 nova_compute[227762]: 2026-01-23 11:08:26.568 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:26 np0005593234 nova_compute[227762]: 2026-01-23 11:08:26.577 227766 DEBUG nova.objects.instance [None req-a9961d0d-f063-48db-aedb-70f87a6d749c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_devices' on Instance uuid abd79b15-c133-4b9f-ae4c-545309f52322 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:08:26 np0005593234 nova_compute[227762]: 2026-01-23 11:08:26.603 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166506.6031568, abd79b15-c133-4b9f-ae4c-545309f52322 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:08:26 np0005593234 nova_compute[227762]: 2026-01-23 11:08:26.603 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] VM Paused (Lifecycle Event)#033[00m
Jan 23 06:08:26 np0005593234 nova_compute[227762]: 2026-01-23 11:08:26.620 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:08:26 np0005593234 nova_compute[227762]: 2026-01-23 11:08:26.624 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:08:26 np0005593234 nova_compute[227762]: 2026-01-23 11:08:26.648 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Jan 23 06:08:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:27.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:27 np0005593234 nova_compute[227762]: 2026-01-23 11:08:27.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:27 np0005593234 nova_compute[227762]: 2026-01-23 11:08:27.766 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:08:27 np0005593234 nova_compute[227762]: 2026-01-23 11:08:27.767 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:08:27 np0005593234 nova_compute[227762]: 2026-01-23 11:08:27.767 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:08:27 np0005593234 nova_compute[227762]: 2026-01-23 11:08:27.767 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:08:27 np0005593234 nova_compute[227762]: 2026-01-23 11:08:27.767 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:08:27 np0005593234 kernel: tap577aba9c-4e (unregistering): left promiscuous mode
Jan 23 06:08:27 np0005593234 NetworkManager[48942]: <info>  [1769166507.7729] device (tap577aba9c-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:08:27 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:27Z|01010|binding|INFO|Releasing lport 577aba9c-4ead-4d00-b570-3cb9d0fe2866 from this chassis (sb_readonly=0)
Jan 23 06:08:27 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:27Z|01011|binding|INFO|Setting lport 577aba9c-4ead-4d00-b570-3cb9d0fe2866 down in Southbound
Jan 23 06:08:27 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:27Z|01012|binding|INFO|Removing iface tap577aba9c-4e ovn-installed in OVS
Jan 23 06:08:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:27.787 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:c1:86 10.100.0.5'], port_security=['fa:16:3e:38:c1:86 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'abd79b15-c133-4b9f-ae4c-545309f52322', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da794c9e-a13c-4a02-a110-50b557ab3517', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1d5b2423-0771-4eca-b525-6940c398c745', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71aaf5a5-1e19-4c40-a6d5-6d2650b9b03f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=577aba9c-4ead-4d00-b570-3cb9d0fe2866) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:08:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:27.790 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 577aba9c-4ead-4d00-b570-3cb9d0fe2866 in datapath da794c9e-a13c-4a02-a110-50b557ab3517 unbound from our chassis#033[00m
Jan 23 06:08:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:27.791 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network da794c9e-a13c-4a02-a110-50b557ab3517, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:08:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:27.793 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[55bb4ca6-9d4e-4f92-a853-3c481e7d0050]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:27 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:27.794 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517 namespace which is not needed anymore#033[00m
Jan 23 06:08:27 np0005593234 nova_compute[227762]: 2026-01-23 11:08:27.802 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:27 np0005593234 systemd[1]: machine-qemu\x2d112\x2dinstance\x2d000000da.scope: Deactivated successfully.
Jan 23 06:08:27 np0005593234 systemd[1]: machine-qemu\x2d112\x2dinstance\x2d000000da.scope: Consumed 14.161s CPU time.
Jan 23 06:08:27 np0005593234 systemd-machined[195626]: Machine qemu-112-instance-000000da terminated.
Jan 23 06:08:27 np0005593234 neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517[345649]: [NOTICE]   (345653) : haproxy version is 2.8.14-c23fe91
Jan 23 06:08:27 np0005593234 neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517[345649]: [NOTICE]   (345653) : path to executable is /usr/sbin/haproxy
Jan 23 06:08:27 np0005593234 neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517[345649]: [WARNING]  (345653) : Exiting Master process...
Jan 23 06:08:27 np0005593234 neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517[345649]: [ALERT]    (345653) : Current worker (345655) exited with code 143 (Terminated)
Jan 23 06:08:27 np0005593234 neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517[345649]: [WARNING]  (345653) : All workers exited. Exiting... (0)
Jan 23 06:08:27 np0005593234 systemd[1]: libpod-161dc34d28f942652dac07cbab15dc3ff86d0d6481131bae35a7f82cf4fb731b.scope: Deactivated successfully.
Jan 23 06:08:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:27 np0005593234 podman[346107]: 2026-01-23 11:08:27.926731301 +0000 UTC m=+0.042486071 container died 161dc34d28f942652dac07cbab15dc3ff86d0d6481131bae35a7f82cf4fb731b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 06:08:27 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-161dc34d28f942652dac07cbab15dc3ff86d0d6481131bae35a7f82cf4fb731b-userdata-shm.mount: Deactivated successfully.
Jan 23 06:08:27 np0005593234 systemd[1]: var-lib-containers-storage-overlay-3f17056045d543b14438755b547a11d9db3c894e12df7ee20fb547c95d161fc0-merged.mount: Deactivated successfully.
Jan 23 06:08:27 np0005593234 podman[346107]: 2026-01-23 11:08:27.969487499 +0000 UTC m=+0.085242269 container cleanup 161dc34d28f942652dac07cbab15dc3ff86d0d6481131bae35a7f82cf4fb731b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:08:27 np0005593234 systemd[1]: libpod-conmon-161dc34d28f942652dac07cbab15dc3ff86d0d6481131bae35a7f82cf4fb731b.scope: Deactivated successfully.
Jan 23 06:08:27 np0005593234 nova_compute[227762]: 2026-01-23 11:08:27.982 227766 DEBUG nova.compute.manager [None req-a9961d0d-f063-48db-aedb-70f87a6d749c 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:08:28 np0005593234 podman[346164]: 2026-01-23 11:08:28.028352961 +0000 UTC m=+0.034579183 container remove 161dc34d28f942652dac07cbab15dc3ff86d0d6481131bae35a7f82cf4fb731b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:08:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:28.036 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f44c6b18-a2b3-41e6-8dd3-4a069511a1f0]: (4, ('Fri Jan 23 11:08:27 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517 (161dc34d28f942652dac07cbab15dc3ff86d0d6481131bae35a7f82cf4fb731b)\n161dc34d28f942652dac07cbab15dc3ff86d0d6481131bae35a7f82cf4fb731b\nFri Jan 23 11:08:27 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517 (161dc34d28f942652dac07cbab15dc3ff86d0d6481131bae35a7f82cf4fb731b)\n161dc34d28f942652dac07cbab15dc3ff86d0d6481131bae35a7f82cf4fb731b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:28.038 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f5544602-70b1-4b17-95e6-ea79eece19fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:28.038 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda794c9e-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.040 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:28 np0005593234 kernel: tapda794c9e-a0: left promiscuous mode
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.057 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:28.058 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3bf31003-8193-48ed-a6b8-ac2fc3c35df2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:28.076 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7c2f7eca-b860-422e-ab76-a60083201951]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:28.078 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6df034c4-6b25-42fa-bd74-470c490a3ddb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:28.091 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[02b18499-b18d-44d5-813e-b7ba9b9fb4f1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1037075, 'reachable_time': 20192, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 346185, 'error': None, 'target': 'ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:28.094 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:08:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:28.094 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[06b4ce77-6999-412a-a213-f4650c2fbe0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:28 np0005593234 systemd[1]: run-netns-ovnmeta\x2dda794c9e\x2da13c\x2d4a02\x2da110\x2d50b557ab3517.mount: Deactivated successfully.
Jan 23 06:08:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:08:28 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3074650615' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.246 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:08:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:28.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.326 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000da as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.326 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000da as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.507 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.508 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4038MB free_disk=20.942890167236328GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.508 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.509 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.592 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance abd79b15-c133-4b9f-ae4c-545309f52322 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.593 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.593 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.638 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.692 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.693 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.708 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.732 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 06:08:28 np0005593234 nova_compute[227762]: 2026-01-23 11:08:28.780 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:08:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:08:29 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1156440673' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:08:29 np0005593234 nova_compute[227762]: 2026-01-23 11:08:29.208 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:08:29 np0005593234 nova_compute[227762]: 2026-01-23 11:08:29.216 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:08:29 np0005593234 nova_compute[227762]: 2026-01-23 11:08:29.246 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:08:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:29.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:29 np0005593234 nova_compute[227762]: 2026-01-23 11:08:29.280 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:08:29 np0005593234 nova_compute[227762]: 2026-01-23 11:08:29.281 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.773s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:08:29 np0005593234 nova_compute[227762]: 2026-01-23 11:08:29.588 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:30.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:31 np0005593234 nova_compute[227762]: 2026-01-23 11:08:31.213 227766 DEBUG nova.compute.manager [req-37488573-e610-44be-bbb0-61ebee445721 req-b74e97f2-9d3b-42a2-b582-3d9f692079f0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received event network-vif-unplugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:08:31 np0005593234 nova_compute[227762]: 2026-01-23 11:08:31.214 227766 DEBUG oslo_concurrency.lockutils [req-37488573-e610-44be-bbb0-61ebee445721 req-b74e97f2-9d3b-42a2-b582-3d9f692079f0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:08:31 np0005593234 nova_compute[227762]: 2026-01-23 11:08:31.215 227766 DEBUG oslo_concurrency.lockutils [req-37488573-e610-44be-bbb0-61ebee445721 req-b74e97f2-9d3b-42a2-b582-3d9f692079f0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:08:31 np0005593234 nova_compute[227762]: 2026-01-23 11:08:31.216 227766 DEBUG oslo_concurrency.lockutils [req-37488573-e610-44be-bbb0-61ebee445721 req-b74e97f2-9d3b-42a2-b582-3d9f692079f0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:08:31 np0005593234 nova_compute[227762]: 2026-01-23 11:08:31.216 227766 DEBUG nova.compute.manager [req-37488573-e610-44be-bbb0-61ebee445721 req-b74e97f2-9d3b-42a2-b582-3d9f692079f0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] No waiting events found dispatching network-vif-unplugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:08:31 np0005593234 nova_compute[227762]: 2026-01-23 11:08:31.216 227766 WARNING nova.compute.manager [req-37488573-e610-44be-bbb0-61ebee445721 req-b74e97f2-9d3b-42a2-b582-3d9f692079f0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received unexpected event network-vif-unplugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 for instance with vm_state suspended and task_state None.#033[00m
Jan 23 06:08:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:31.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:31 np0005593234 nova_compute[227762]: 2026-01-23 11:08:31.572 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:32 np0005593234 nova_compute[227762]: 2026-01-23 11:08:32.283 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:32 np0005593234 nova_compute[227762]: 2026-01-23 11:08:32.283 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:08:32 np0005593234 nova_compute[227762]: 2026-01-23 11:08:32.283 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:08:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:08:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:32.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:08:32 np0005593234 nova_compute[227762]: 2026-01-23 11:08:32.411 227766 INFO nova.compute.manager [None req-1d9edbe9-1f73-4891-8ef8-41d8f6ef819f 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Get console output#033[00m
Jan 23 06:08:32 np0005593234 nova_compute[227762]: 2026-01-23 11:08:32.633 227766 INFO nova.compute.manager [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Resuming#033[00m
Jan 23 06:08:32 np0005593234 nova_compute[227762]: 2026-01-23 11:08:32.634 227766 DEBUG nova.objects.instance [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'flavor' on Instance uuid abd79b15-c133-4b9f-ae4c-545309f52322 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:08:32 np0005593234 nova_compute[227762]: 2026-01-23 11:08:32.676 227766 DEBUG oslo_concurrency.lockutils [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:08:32 np0005593234 nova_compute[227762]: 2026-01-23 11:08:32.676 227766 DEBUG oslo_concurrency.lockutils [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquired lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:08:32 np0005593234 nova_compute[227762]: 2026-01-23 11:08:32.677 227766 DEBUG nova.network.neutron [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:08:32 np0005593234 nova_compute[227762]: 2026-01-23 11:08:32.830 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:08:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:33.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:33 np0005593234 nova_compute[227762]: 2026-01-23 11:08:33.472 227766 DEBUG nova.compute.manager [req-f897042b-c1e6-40ee-a54f-75df7fa6ed09 req-421fc59a-40c5-4187-a46d-1ba34fc3b31c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received event network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:08:33 np0005593234 nova_compute[227762]: 2026-01-23 11:08:33.472 227766 DEBUG oslo_concurrency.lockutils [req-f897042b-c1e6-40ee-a54f-75df7fa6ed09 req-421fc59a-40c5-4187-a46d-1ba34fc3b31c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:08:33 np0005593234 nova_compute[227762]: 2026-01-23 11:08:33.472 227766 DEBUG oslo_concurrency.lockutils [req-f897042b-c1e6-40ee-a54f-75df7fa6ed09 req-421fc59a-40c5-4187-a46d-1ba34fc3b31c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:08:33 np0005593234 nova_compute[227762]: 2026-01-23 11:08:33.473 227766 DEBUG oslo_concurrency.lockutils [req-f897042b-c1e6-40ee-a54f-75df7fa6ed09 req-421fc59a-40c5-4187-a46d-1ba34fc3b31c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:08:33 np0005593234 nova_compute[227762]: 2026-01-23 11:08:33.473 227766 DEBUG nova.compute.manager [req-f897042b-c1e6-40ee-a54f-75df7fa6ed09 req-421fc59a-40c5-4187-a46d-1ba34fc3b31c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] No waiting events found dispatching network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:08:33 np0005593234 nova_compute[227762]: 2026-01-23 11:08:33.473 227766 WARNING nova.compute.manager [req-f897042b-c1e6-40ee-a54f-75df7fa6ed09 req-421fc59a-40c5-4187-a46d-1ba34fc3b31c 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received unexpected event network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 for instance with vm_state suspended and task_state resuming.#033[00m
Jan 23 06:08:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:34.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:34 np0005593234 nova_compute[227762]: 2026-01-23 11:08:34.483 227766 DEBUG nova.network.neutron [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Updating instance_info_cache with network_info: [{"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:08:34 np0005593234 nova_compute[227762]: 2026-01-23 11:08:34.592 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:35.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.343 227766 DEBUG oslo_concurrency.lockutils [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Releasing lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.344 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.345 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.345 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid abd79b15-c133-4b9f-ae4c-545309f52322 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.351 227766 DEBUG nova.virt.libvirt.vif [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:07:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1045500755',display_name='tempest-TestNetworkAdvancedServerOps-server-1045500755',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1045500755',id=218,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKWebjGpREtiyJMsjND/HOQ0onI8JYEhYgd5aCBoyMWcKs/YexmbUn3MXfhZKI/tI0Y2jhFftd8sJYYag+FX9uiRDMUE0Xzgy1B+1zHl6CM829ibbYed2eKCx5OjnHYcWg==',key_name='tempest-TestNetworkAdvancedServerOps-709731046',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:07:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-z75uh4kf',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:08:28Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=abd79b15-c133-4b9f-ae4c-545309f52322,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.352 227766 DEBUG nova.network.os_vif_util [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.352 227766 DEBUG nova.network.os_vif_util [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:c1:86,bridge_name='br-int',has_traffic_filtering=True,id=577aba9c-4ead-4d00-b570-3cb9d0fe2866,network=Network(da794c9e-a13c-4a02-a110-50b557ab3517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap577aba9c-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.353 227766 DEBUG os_vif [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:c1:86,bridge_name='br-int',has_traffic_filtering=True,id=577aba9c-4ead-4d00-b570-3cb9d0fe2866,network=Network(da794c9e-a13c-4a02-a110-50b557ab3517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap577aba9c-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.354 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.355 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.355 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.358 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.359 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap577aba9c-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.359 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap577aba9c-4e, col_values=(('external_ids', {'iface-id': '577aba9c-4ead-4d00-b570-3cb9d0fe2866', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:c1:86', 'vm-uuid': 'abd79b15-c133-4b9f-ae4c-545309f52322'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.360 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.360 227766 INFO os_vif [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:c1:86,bridge_name='br-int',has_traffic_filtering=True,id=577aba9c-4ead-4d00-b570-3cb9d0fe2866,network=Network(da794c9e-a13c-4a02-a110-50b557ab3517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap577aba9c-4e')
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.375 227766 DEBUG nova.objects.instance [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'numa_topology' on Instance uuid abd79b15-c133-4b9f-ae4c-545309f52322 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 06:08:35 np0005593234 kernel: tap577aba9c-4e: entered promiscuous mode
Jan 23 06:08:35 np0005593234 NetworkManager[48942]: <info>  [1769166515.4349] manager: (tap577aba9c-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/487)
Jan 23 06:08:35 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:35Z|01013|binding|INFO|Claiming lport 577aba9c-4ead-4d00-b570-3cb9d0fe2866 for this chassis.
Jan 23 06:08:35 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:35Z|01014|binding|INFO|577aba9c-4ead-4d00-b570-3cb9d0fe2866: Claiming fa:16:3e:38:c1:86 10.100.0.5
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.436 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.442 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:c1:86 10.100.0.5'], port_security=['fa:16:3e:38:c1:86 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'abd79b15-c133-4b9f-ae4c-545309f52322', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da794c9e-a13c-4a02-a110-50b557ab3517', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '5', 'neutron:security_group_ids': '1d5b2423-0771-4eca-b525-6940c398c745', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71aaf5a5-1e19-4c40-a6d5-6d2650b9b03f, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=577aba9c-4ead-4d00-b570-3cb9d0fe2866) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.443 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 577aba9c-4ead-4d00-b570-3cb9d0fe2866 in datapath da794c9e-a13c-4a02-a110-50b557ab3517 bound to our chassis
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.444 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network da794c9e-a13c-4a02-a110-50b557ab3517
Jan 23 06:08:35 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:35Z|01015|binding|INFO|Setting lport 577aba9c-4ead-4d00-b570-3cb9d0fe2866 ovn-installed in OVS
Jan 23 06:08:35 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:35Z|01016|binding|INFO|Setting lport 577aba9c-4ead-4d00-b570-3cb9d0fe2866 up in Southbound
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.450 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.452 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.457 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[d1cd087f-0b9c-4541-ac8b-2a26715ef673]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.458 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapda794c9e-a1 in ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.460 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapda794c9e-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.460 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[3f58d61a-a0bc-423c-98ff-b5f3c8618cb3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.461 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b67d5931-9249-4129-9a35-10d949edc653]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 systemd-udevd[346229]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:08:35 np0005593234 systemd-machined[195626]: New machine qemu-113-instance-000000da.
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.474 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[da3e6c55-6758-4009-b334-a4354f5d1d11]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 systemd[1]: Started Virtual Machine qemu-113-instance-000000da.
Jan 23 06:08:35 np0005593234 NetworkManager[48942]: <info>  [1769166515.4838] device (tap577aba9c-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:08:35 np0005593234 NetworkManager[48942]: <info>  [1769166515.4842] device (tap577aba9c-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.497 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cbef1882-b78c-4964-9878-dc3a26cca31d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.529 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[0155e897-0662-4265-b67f-ac97d00dbb4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 systemd-udevd[346233]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.536 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a4650a7a-637d-4de0-abe5-379c0556d3b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 NetworkManager[48942]: <info>  [1769166515.5384] manager: (tapda794c9e-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/488)
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.570 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[08c7ed94-bfed-4928-ac12-a1933056920d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.573 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ed6db94d-9039-4145-9641-57e8dd441a4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 NetworkManager[48942]: <info>  [1769166515.5952] device (tapda794c9e-a0): carrier: link connected
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.602 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[2cbb1f4e-5e7a-4212-92d7-bdcf1e12fbf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.619 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[400aa5c7-0157-494a-acf3-25a5013c37e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapda794c9e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:fb:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 312], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1041055, 'reachable_time': 34534, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 346261, 'error': None, 'target': 'ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.633 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8f1f7d70-edc2-4114-8ea3-8774ec4917d4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1b:fbca'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1041055, 'tstamp': 1041055}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 346262, 'error': None, 'target': 'ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.649 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c6f1d0d9-257e-44a2-93ee-27e60ba99ea1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapda794c9e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1b:fb:ca'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 312], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1041055, 'reachable_time': 34534, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 346263, 'error': None, 'target': 'ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.675 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a9da4d1a-0076-485b-b1f1-ef1476b228b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.724 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[bd633d9c-73b5-475a-b61c-b6fb247672e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.726 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda794c9e-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.726 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.726 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapda794c9e-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 06:08:35 np0005593234 NetworkManager[48942]: <info>  [1769166515.7287] manager: (tapda794c9e-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/489)
Jan 23 06:08:35 np0005593234 kernel: tapda794c9e-a0: entered promiscuous mode
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.730 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapda794c9e-a0, col_values=(('external_ids', {'iface-id': '6c5ce001-b215-4367-b2cd-4f00d251a399'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.728 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.731 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.733 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.733 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/da794c9e-a13c-4a02-a110-50b557ab3517.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/da794c9e-a13c-4a02-a110-50b557ab3517.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 06:08:35 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:35Z|01017|binding|INFO|Releasing lport 6c5ce001-b215-4367-b2cd-4f00d251a399 from this chassis (sb_readonly=0)
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.734 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[178beec5-8441-4279-a80b-52c09d9b1ee9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.734 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-da794c9e-a13c-4a02-a110-50b557ab3517
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/da794c9e-a13c-4a02-a110-50b557ab3517.pid.haproxy
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID da794c9e-a13c-4a02-a110-50b557ab3517
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 06:08:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:35.735 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517', 'env', 'PROCESS_TAG=haproxy-da794c9e-a13c-4a02-a110-50b557ab3517', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/da794c9e-a13c-4a02-a110-50b557ab3517.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 06:08:35 np0005593234 nova_compute[227762]: 2026-01-23 11:08:35.746 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.005 227766 DEBUG nova.compute.manager [req-4a06020a-1297-4770-8a6c-9c3badc7dc03 req-afed8167-4f05-42da-b449-1ab1bff168ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received event network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.006 227766 DEBUG oslo_concurrency.lockutils [req-4a06020a-1297-4770-8a6c-9c3badc7dc03 req-afed8167-4f05-42da-b449-1ab1bff168ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.006 227766 DEBUG oslo_concurrency.lockutils [req-4a06020a-1297-4770-8a6c-9c3badc7dc03 req-afed8167-4f05-42da-b449-1ab1bff168ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.007 227766 DEBUG oslo_concurrency.lockutils [req-4a06020a-1297-4770-8a6c-9c3badc7dc03 req-afed8167-4f05-42da-b449-1ab1bff168ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.007 227766 DEBUG nova.compute.manager [req-4a06020a-1297-4770-8a6c-9c3badc7dc03 req-afed8167-4f05-42da-b449-1ab1bff168ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] No waiting events found dispatching network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.007 227766 WARNING nova.compute.manager [req-4a06020a-1297-4770-8a6c-9c3badc7dc03 req-afed8167-4f05-42da-b449-1ab1bff168ae 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received unexpected event network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 for instance with vm_state suspended and task_state resuming.
Jan 23 06:08:36 np0005593234 podman[346363]: 2026-01-23 11:08:36.073495321 +0000 UTC m=+0.042720498 container create 73cb7de49104e5d0fce0b3770fbe4b0508cd1522548e6be8fb2bd51459c8024b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 06:08:36 np0005593234 systemd[1]: Started libpod-conmon-73cb7de49104e5d0fce0b3770fbe4b0508cd1522548e6be8fb2bd51459c8024b.scope.
Jan 23 06:08:36 np0005593234 systemd[1]: Started libcrun container.
Jan 23 06:08:36 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/55a8d0c0058ad2e4e216fe76042389b54f54fe0acd1e642a1c01796709daadad/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:08:36 np0005593234 podman[346363]: 2026-01-23 11:08:36.051982968 +0000 UTC m=+0.021208165 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:08:36 np0005593234 podman[346363]: 2026-01-23 11:08:36.153258778 +0000 UTC m=+0.122483975 container init 73cb7de49104e5d0fce0b3770fbe4b0508cd1522548e6be8fb2bd51459c8024b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:08:36 np0005593234 podman[346363]: 2026-01-23 11:08:36.160625308 +0000 UTC m=+0.129850485 container start 73cb7de49104e5d0fce0b3770fbe4b0508cd1522548e6be8fb2bd51459c8024b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:08:36 np0005593234 neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517[346399]: [NOTICE]   (346406) : New worker (346409) forked
Jan 23 06:08:36 np0005593234 neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517[346399]: [NOTICE]   (346406) : Loading success.
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.298 227766 DEBUG nova.virt.libvirt.host [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Removed pending event for abd79b15-c133-4b9f-ae4c-545309f52322 due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.299 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166516.2977748, abd79b15-c133-4b9f-ae4c-545309f52322 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.299 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] VM Started (Lifecycle Event)#033[00m
Jan 23 06:08:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:36.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.332 227766 DEBUG nova.compute.manager [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.332 227766 DEBUG nova.objects.instance [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'pci_devices' on Instance uuid abd79b15-c133-4b9f-ae4c-545309f52322 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.335 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.339 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.361 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.361 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166516.30167, abd79b15-c133-4b9f-ae4c-545309f52322 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.361 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.364 227766 INFO nova.virt.libvirt.driver [-] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Instance running successfully.#033[00m
Jan 23 06:08:36 np0005593234 virtqemud[227483]: argument unsupported: QEMU guest agent is not configured
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.367 227766 DEBUG nova.virt.libvirt.guest [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.367 227766 DEBUG nova.compute.manager [None req-7fb2dfe7-f6f6-4001-b36f-440aa4a3d231 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.376 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.380 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.402 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Jan 23 06:08:36 np0005593234 nova_compute[227762]: 2026-01-23 11:08:36.615 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:08:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:08:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:08:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:37.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:08:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:38 np0005593234 nova_compute[227762]: 2026-01-23 11:08:38.108 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Updating instance_info_cache with network_info: [{"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:08:38 np0005593234 nova_compute[227762]: 2026-01-23 11:08:38.122 227766 DEBUG nova.compute.manager [req-43f98ed3-8724-46a7-9b62-1d2e330d57cc req-146833b3-b994-4a6e-ac55-32206adbd965 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received event network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:08:38 np0005593234 nova_compute[227762]: 2026-01-23 11:08:38.122 227766 DEBUG oslo_concurrency.lockutils [req-43f98ed3-8724-46a7-9b62-1d2e330d57cc req-146833b3-b994-4a6e-ac55-32206adbd965 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:08:38 np0005593234 nova_compute[227762]: 2026-01-23 11:08:38.123 227766 DEBUG oslo_concurrency.lockutils [req-43f98ed3-8724-46a7-9b62-1d2e330d57cc req-146833b3-b994-4a6e-ac55-32206adbd965 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:08:38 np0005593234 nova_compute[227762]: 2026-01-23 11:08:38.123 227766 DEBUG oslo_concurrency.lockutils [req-43f98ed3-8724-46a7-9b62-1d2e330d57cc req-146833b3-b994-4a6e-ac55-32206adbd965 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:08:38 np0005593234 nova_compute[227762]: 2026-01-23 11:08:38.124 227766 DEBUG nova.compute.manager [req-43f98ed3-8724-46a7-9b62-1d2e330d57cc req-146833b3-b994-4a6e-ac55-32206adbd965 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] No waiting events found dispatching network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:08:38 np0005593234 nova_compute[227762]: 2026-01-23 11:08:38.124 227766 WARNING nova.compute.manager [req-43f98ed3-8724-46a7-9b62-1d2e330d57cc req-146833b3-b994-4a6e-ac55-32206adbd965 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received unexpected event network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 for instance with vm_state active and task_state None.#033[00m
Jan 23 06:08:38 np0005593234 nova_compute[227762]: 2026-01-23 11:08:38.133 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:08:38 np0005593234 nova_compute[227762]: 2026-01-23 11:08:38.134 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 06:08:38 np0005593234 nova_compute[227762]: 2026-01-23 11:08:38.135 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:38.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:38 np0005593234 nova_compute[227762]: 2026-01-23 11:08:38.405 227766 INFO nova.compute.manager [None req-b90170fb-03b2-472b-ba15-8e73cfa73274 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Get console output#033[00m
Jan 23 06:08:38 np0005593234 nova_compute[227762]: 2026-01-23 11:08:38.411 273674 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Jan 23 06:08:38 np0005593234 nova_compute[227762]: 2026-01-23 11:08:38.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:39.005 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=101, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=100) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:39.006 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:39.007 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '101'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.057 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.255 227766 DEBUG nova.compute.manager [req-f463767d-5474-4ce5-bc5b-b2ba40daf975 req-3a6f645e-af2a-48a5-a9d1-4abf1f06c374 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received event network-changed-577aba9c-4ead-4d00-b570-3cb9d0fe2866 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.255 227766 DEBUG nova.compute.manager [req-f463767d-5474-4ce5-bc5b-b2ba40daf975 req-3a6f645e-af2a-48a5-a9d1-4abf1f06c374 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Refreshing instance network info cache due to event network-changed-577aba9c-4ead-4d00-b570-3cb9d0fe2866. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.255 227766 DEBUG oslo_concurrency.lockutils [req-f463767d-5474-4ce5-bc5b-b2ba40daf975 req-3a6f645e-af2a-48a5-a9d1-4abf1f06c374 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.256 227766 DEBUG oslo_concurrency.lockutils [req-f463767d-5474-4ce5-bc5b-b2ba40daf975 req-3a6f645e-af2a-48a5-a9d1-4abf1f06c374 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.256 227766 DEBUG nova.network.neutron [req-f463767d-5474-4ce5-bc5b-b2ba40daf975 req-3a6f645e-af2a-48a5-a9d1-4abf1f06c374 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Refreshing network info cache for port 577aba9c-4ead-4d00-b570-3cb9d0fe2866 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:08:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:08:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:39.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.347 227766 DEBUG oslo_concurrency.lockutils [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "abd79b15-c133-4b9f-ae4c-545309f52322" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.348 227766 DEBUG oslo_concurrency.lockutils [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.348 227766 DEBUG oslo_concurrency.lockutils [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.348 227766 DEBUG oslo_concurrency.lockutils [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.349 227766 DEBUG oslo_concurrency.lockutils [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.350 227766 INFO nova.compute.manager [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Terminating instance#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.351 227766 DEBUG nova.compute.manager [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.595 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:39 np0005593234 kernel: tap577aba9c-4e (unregistering): left promiscuous mode
Jan 23 06:08:39 np0005593234 NetworkManager[48942]: <info>  [1769166519.7122] device (tap577aba9c-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.722 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.724 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:39 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:39Z|01018|binding|INFO|Releasing lport 577aba9c-4ead-4d00-b570-3cb9d0fe2866 from this chassis (sb_readonly=0)
Jan 23 06:08:39 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:39Z|01019|binding|INFO|Setting lport 577aba9c-4ead-4d00-b570-3cb9d0fe2866 down in Southbound
Jan 23 06:08:39 np0005593234 ovn_controller[134547]: 2026-01-23T11:08:39Z|01020|binding|INFO|Removing iface tap577aba9c-4e ovn-installed in OVS
Jan 23 06:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:39.730 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:c1:86 10.100.0.5'], port_security=['fa:16:3e:38:c1:86 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'abd79b15-c133-4b9f-ae4c-545309f52322', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-da794c9e-a13c-4a02-a110-50b557ab3517', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c06f98b51aeb48de91d116fda54a161f', 'neutron:revision_number': '6', 'neutron:security_group_ids': '1d5b2423-0771-4eca-b525-6940c398c745', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71aaf5a5-1e19-4c40-a6d5-6d2650b9b03f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=577aba9c-4ead-4d00-b570-3cb9d0fe2866) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:39.732 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 577aba9c-4ead-4d00-b570-3cb9d0fe2866 in datapath da794c9e-a13c-4a02-a110-50b557ab3517 unbound from our chassis#033[00m
Jan 23 06:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:39.733 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network da794c9e-a13c-4a02-a110-50b557ab3517, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:39.735 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4507a3f1-8ea2-4187-9e96-7c1317f5ec9c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:39.735 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517 namespace which is not needed anymore#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.746 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:39 np0005593234 systemd[1]: machine-qemu\x2d113\x2dinstance\x2d000000da.scope: Deactivated successfully.
Jan 23 06:08:39 np0005593234 systemd-machined[195626]: Machine qemu-113-instance-000000da terminated.
Jan 23 06:08:39 np0005593234 neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517[346399]: [NOTICE]   (346406) : haproxy version is 2.8.14-c23fe91
Jan 23 06:08:39 np0005593234 neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517[346399]: [NOTICE]   (346406) : path to executable is /usr/sbin/haproxy
Jan 23 06:08:39 np0005593234 neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517[346399]: [WARNING]  (346406) : Exiting Master process...
Jan 23 06:08:39 np0005593234 neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517[346399]: [ALERT]    (346406) : Current worker (346409) exited with code 143 (Terminated)
Jan 23 06:08:39 np0005593234 neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517[346399]: [WARNING]  (346406) : All workers exited. Exiting... (0)
Jan 23 06:08:39 np0005593234 systemd[1]: libpod-73cb7de49104e5d0fce0b3770fbe4b0508cd1522548e6be8fb2bd51459c8024b.scope: Deactivated successfully.
Jan 23 06:08:39 np0005593234 podman[346495]: 2026-01-23 11:08:39.889842518 +0000 UTC m=+0.047389044 container died 73cb7de49104e5d0fce0b3770fbe4b0508cd1522548e6be8fb2bd51459c8024b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 06:08:39 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-73cb7de49104e5d0fce0b3770fbe4b0508cd1522548e6be8fb2bd51459c8024b-userdata-shm.mount: Deactivated successfully.
Jan 23 06:08:39 np0005593234 systemd[1]: var-lib-containers-storage-overlay-55a8d0c0058ad2e4e216fe76042389b54f54fe0acd1e642a1c01796709daadad-merged.mount: Deactivated successfully.
Jan 23 06:08:39 np0005593234 podman[346495]: 2026-01-23 11:08:39.924868454 +0000 UTC m=+0.082414980 container cleanup 73cb7de49104e5d0fce0b3770fbe4b0508cd1522548e6be8fb2bd51459c8024b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 06:08:39 np0005593234 systemd[1]: libpod-conmon-73cb7de49104e5d0fce0b3770fbe4b0508cd1522548e6be8fb2bd51459c8024b.scope: Deactivated successfully.
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.983 227766 INFO nova.virt.libvirt.driver [-] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Instance destroyed successfully.#033[00m
Jan 23 06:08:39 np0005593234 nova_compute[227762]: 2026-01-23 11:08:39.984 227766 DEBUG nova.objects.instance [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lazy-loading 'resources' on Instance uuid abd79b15-c133-4b9f-ae4c-545309f52322 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:08:39 np0005593234 podman[346526]: 2026-01-23 11:08:39.98961385 +0000 UTC m=+0.046819006 container remove 73cb7de49104e5d0fce0b3770fbe4b0508cd1522548e6be8fb2bd51459c8024b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 06:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:39.994 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[29b84f66-b003-4ed3-94bf-dc06f2a45d66]: (4, ('Fri Jan 23 11:08:39 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517 (73cb7de49104e5d0fce0b3770fbe4b0508cd1522548e6be8fb2bd51459c8024b)\n73cb7de49104e5d0fce0b3770fbe4b0508cd1522548e6be8fb2bd51459c8024b\nFri Jan 23 11:08:39 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517 (73cb7de49104e5d0fce0b3770fbe4b0508cd1522548e6be8fb2bd51459c8024b)\n73cb7de49104e5d0fce0b3770fbe4b0508cd1522548e6be8fb2bd51459c8024b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:39.997 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[4141a607-b11f-49a3-831b-b20e8457f43e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:39 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:39.998 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapda794c9e-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.000 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:40 np0005593234 kernel: tapda794c9e-a0: left promiscuous mode
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.006 227766 DEBUG nova.virt.libvirt.vif [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:07:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1045500755',display_name='tempest-TestNetworkAdvancedServerOps-server-1045500755',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1045500755',id=218,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKWebjGpREtiyJMsjND/HOQ0onI8JYEhYgd5aCBoyMWcKs/YexmbUn3MXfhZKI/tI0Y2jhFftd8sJYYag+FX9uiRDMUE0Xzgy1B+1zHl6CM829ibbYed2eKCx5OjnHYcWg==',key_name='tempest-TestNetworkAdvancedServerOps-709731046',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:07:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='c06f98b51aeb48de91d116fda54a161f',ramdisk_id='',reservation_id='r-z75uh4kf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-1886747874',owner_user_name='tempest-TestNetworkAdvancedServerOps-1886747874-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:08:36Z,user_data=None,user_id='420c366dc5dc45a48da4e0b18c93043f',uuid=abd79b15-c133-4b9f-ae4c-545309f52322,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.006 227766 DEBUG nova.network.os_vif_util [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converting VIF {"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.007 227766 DEBUG nova.network.os_vif_util [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:c1:86,bridge_name='br-int',has_traffic_filtering=True,id=577aba9c-4ead-4d00-b570-3cb9d0fe2866,network=Network(da794c9e-a13c-4a02-a110-50b557ab3517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap577aba9c-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.008 227766 DEBUG os_vif [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:c1:86,bridge_name='br-int',has_traffic_filtering=True,id=577aba9c-4ead-4d00-b570-3cb9d0fe2866,network=Network(da794c9e-a13c-4a02-a110-50b557ab3517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap577aba9c-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.009 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.010 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap577aba9c-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.011 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.013 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.023 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:40.025 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[0d8d218c-3d1b-4804-8850-c837d1adbbd0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.028 227766 INFO os_vif [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:c1:86,bridge_name='br-int',has_traffic_filtering=True,id=577aba9c-4ead-4d00-b570-3cb9d0fe2866,network=Network(da794c9e-a13c-4a02-a110-50b557ab3517),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap577aba9c-4e')#033[00m
Jan 23 06:08:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:40.036 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[efdae639-b35f-4d0e-875d-4b1ffd724fa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:40.037 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a4fdb95e-285d-4691-b2e7-40e2709999f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:40.054 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7f6827ae-18c2-48af-be4d-2eceed11c2a5]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1041048, 'reachable_time': 20074, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 346567, 'error': None, 'target': 'ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:40 np0005593234 systemd[1]: run-netns-ovnmeta\x2dda794c9e\x2da13c\x2d4a02\x2da110\x2d50b557ab3517.mount: Deactivated successfully.
Jan 23 06:08:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:40.057 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-da794c9e-a13c-4a02-a110-50b557ab3517 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:08:40 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:40.057 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[5ecd775f-23d5-4694-b252-30af350dda01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.190 227766 DEBUG nova.compute.manager [req-170d49f0-2d60-4eb8-bc8c-46b2c2d216e9 req-83bc0bb2-d177-411b-b195-32c395fb4612 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received event network-vif-unplugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.190 227766 DEBUG oslo_concurrency.lockutils [req-170d49f0-2d60-4eb8-bc8c-46b2c2d216e9 req-83bc0bb2-d177-411b-b195-32c395fb4612 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.191 227766 DEBUG oslo_concurrency.lockutils [req-170d49f0-2d60-4eb8-bc8c-46b2c2d216e9 req-83bc0bb2-d177-411b-b195-32c395fb4612 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.191 227766 DEBUG oslo_concurrency.lockutils [req-170d49f0-2d60-4eb8-bc8c-46b2c2d216e9 req-83bc0bb2-d177-411b-b195-32c395fb4612 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.191 227766 DEBUG nova.compute.manager [req-170d49f0-2d60-4eb8-bc8c-46b2c2d216e9 req-83bc0bb2-d177-411b-b195-32c395fb4612 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] No waiting events found dispatching network-vif-unplugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.191 227766 DEBUG nova.compute.manager [req-170d49f0-2d60-4eb8-bc8c-46b2c2d216e9 req-83bc0bb2-d177-411b-b195-32c395fb4612 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received event network-vif-unplugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.192 227766 DEBUG nova.compute.manager [req-170d49f0-2d60-4eb8-bc8c-46b2c2d216e9 req-83bc0bb2-d177-411b-b195-32c395fb4612 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received event network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.192 227766 DEBUG oslo_concurrency.lockutils [req-170d49f0-2d60-4eb8-bc8c-46b2c2d216e9 req-83bc0bb2-d177-411b-b195-32c395fb4612 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.192 227766 DEBUG oslo_concurrency.lockutils [req-170d49f0-2d60-4eb8-bc8c-46b2c2d216e9 req-83bc0bb2-d177-411b-b195-32c395fb4612 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.192 227766 DEBUG oslo_concurrency.lockutils [req-170d49f0-2d60-4eb8-bc8c-46b2c2d216e9 req-83bc0bb2-d177-411b-b195-32c395fb4612 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.192 227766 DEBUG nova.compute.manager [req-170d49f0-2d60-4eb8-bc8c-46b2c2d216e9 req-83bc0bb2-d177-411b-b195-32c395fb4612 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] No waiting events found dispatching network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.193 227766 WARNING nova.compute.manager [req-170d49f0-2d60-4eb8-bc8c-46b2c2d216e9 req-83bc0bb2-d177-411b-b195-32c395fb4612 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received unexpected event network-vif-plugged-577aba9c-4ead-4d00-b570-3cb9d0fe2866 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 06:08:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:40.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.358 227766 DEBUG nova.network.neutron [req-f463767d-5474-4ce5-bc5b-b2ba40daf975 req-3a6f645e-af2a-48a5-a9d1-4abf1f06c374 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Updated VIF entry in instance network info cache for port 577aba9c-4ead-4d00-b570-3cb9d0fe2866. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.358 227766 DEBUG nova.network.neutron [req-f463767d-5474-4ce5-bc5b-b2ba40daf975 req-3a6f645e-af2a-48a5-a9d1-4abf1f06c374 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Updating instance_info_cache with network_info: [{"id": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "address": "fa:16:3e:38:c1:86", "network": {"id": "da794c9e-a13c-4a02-a110-50b557ab3517", "bridge": "br-int", "label": "tempest-network-smoke--1154464235", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "c06f98b51aeb48de91d116fda54a161f", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap577aba9c-4e", "ovs_interfaceid": "577aba9c-4ead-4d00-b570-3cb9d0fe2866", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.375 227766 DEBUG oslo_concurrency.lockutils [req-f463767d-5474-4ce5-bc5b-b2ba40daf975 req-3a6f645e-af2a-48a5-a9d1-4abf1f06c374 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-abd79b15-c133-4b9f-ae4c-545309f52322" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.504 227766 INFO nova.virt.libvirt.driver [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Deleting instance files /var/lib/nova/instances/abd79b15-c133-4b9f-ae4c-545309f52322_del#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.505 227766 INFO nova.virt.libvirt.driver [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Deletion of /var/lib/nova/instances/abd79b15-c133-4b9f-ae4c-545309f52322_del complete#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.556 227766 INFO nova.compute.manager [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Took 1.21 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.557 227766 DEBUG oslo.service.loopingcall [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.557 227766 DEBUG nova.compute.manager [-] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.557 227766 DEBUG nova.network.neutron [-] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 06:08:40 np0005593234 nova_compute[227762]: 2026-01-23 11:08:40.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:41 np0005593234 nova_compute[227762]: 2026-01-23 11:08:41.205 227766 DEBUG nova.network.neutron [-] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:08:41 np0005593234 nova_compute[227762]: 2026-01-23 11:08:41.225 227766 INFO nova.compute.manager [-] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Took 0.67 seconds to deallocate network for instance.#033[00m
Jan 23 06:08:41 np0005593234 nova_compute[227762]: 2026-01-23 11:08:41.279 227766 DEBUG nova.compute.manager [req-0205e65e-75db-4e98-a33c-becd33b04c28 req-9776398f-a646-443b-9a96-fd78c274b2ad 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Received event network-vif-deleted-577aba9c-4ead-4d00-b570-3cb9d0fe2866 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:08:41 np0005593234 nova_compute[227762]: 2026-01-23 11:08:41.281 227766 DEBUG oslo_concurrency.lockutils [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:08:41 np0005593234 nova_compute[227762]: 2026-01-23 11:08:41.282 227766 DEBUG oslo_concurrency.lockutils [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:08:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:41.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:41 np0005593234 nova_compute[227762]: 2026-01-23 11:08:41.334 227766 DEBUG oslo_concurrency.processutils [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:08:41 np0005593234 nova_compute[227762]: 2026-01-23 11:08:41.617 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:41 np0005593234 nova_compute[227762]: 2026-01-23 11:08:41.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:41 np0005593234 nova_compute[227762]: 2026-01-23 11:08:41.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:41 np0005593234 nova_compute[227762]: 2026-01-23 11:08:41.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:08:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:08:41 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3895280870' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:08:41 np0005593234 nova_compute[227762]: 2026-01-23 11:08:41.778 227766 DEBUG oslo_concurrency.processutils [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:08:41 np0005593234 nova_compute[227762]: 2026-01-23 11:08:41.784 227766 DEBUG nova.compute.provider_tree [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:08:41 np0005593234 nova_compute[227762]: 2026-01-23 11:08:41.984 227766 DEBUG nova.scheduler.client.report [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:08:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:42.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:42.917 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:08:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:42.918 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:08:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:08:42.918 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:08:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:42 np0005593234 nova_compute[227762]: 2026-01-23 11:08:42.973 227766 DEBUG oslo_concurrency.lockutils [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.691s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:08:43 np0005593234 nova_compute[227762]: 2026-01-23 11:08:43.188 227766 INFO nova.scheduler.client.report [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Deleted allocations for instance abd79b15-c133-4b9f-ae4c-545309f52322#033[00m
Jan 23 06:08:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:43.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:43 np0005593234 nova_compute[227762]: 2026-01-23 11:08:43.978 227766 DEBUG oslo_concurrency.lockutils [None req-4fa8d425-f4bb-4e27-8b7c-0fcc55dd42ba 420c366dc5dc45a48da4e0b18c93043f c06f98b51aeb48de91d116fda54a161f - - default default] Lock "abd79b15-c133-4b9f-ae4c-545309f52322" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.630s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:08:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:44.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:08:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/789291558' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:08:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:08:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/789291558' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:08:45 np0005593234 nova_compute[227762]: 2026-01-23 11:08:45.035 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:45.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:45 np0005593234 nova_compute[227762]: 2026-01-23 11:08:45.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:08:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:46.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:08:46 np0005593234 nova_compute[227762]: 2026-01-23 11:08:46.620 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:46 np0005593234 nova_compute[227762]: 2026-01-23 11:08:46.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:08:46 np0005593234 podman[346601]: 2026-01-23 11:08:46.773186439 +0000 UTC m=+0.057726027 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Jan 23 06:08:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:47.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:48.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:49.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:50 np0005593234 nova_compute[227762]: 2026-01-23 11:08:50.038 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:50.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:51.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:51 np0005593234 nova_compute[227762]: 2026-01-23 11:08:51.622 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:51 np0005593234 nova_compute[227762]: 2026-01-23 11:08:51.953 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:52 np0005593234 nova_compute[227762]: 2026-01-23 11:08:52.035 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:52.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:08:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:53.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:08:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:54.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:54 np0005593234 podman[346626]: 2026-01-23 11:08:54.823144968 +0000 UTC m=+0.116896709 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:08:54 np0005593234 nova_compute[227762]: 2026-01-23 11:08:54.982 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769166519.9808552, abd79b15-c133-4b9f-ae4c-545309f52322 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:08:54 np0005593234 nova_compute[227762]: 2026-01-23 11:08:54.983 227766 INFO nova.compute.manager [-] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] VM Stopped (Lifecycle Event)#033[00m
Jan 23 06:08:55 np0005593234 nova_compute[227762]: 2026-01-23 11:08:55.041 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:55 np0005593234 nova_compute[227762]: 2026-01-23 11:08:55.046 227766 DEBUG nova.compute.manager [None req-d37429f2-f867-4cc9-a362-b0cf23ec6474 - - - - - -] [instance: abd79b15-c133-4b9f-ae4c-545309f52322] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:08:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:55.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:56.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:08:56 np0005593234 nova_compute[227762]: 2026-01-23 11:08:56.624 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:08:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:57.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:08:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:08:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:08:58.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:08:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:08:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:08:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:08:59.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:09:00 np0005593234 nova_compute[227762]: 2026-01-23 11:09:00.043 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:09:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:00.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:09:00 np0005593234 nova_compute[227762]: 2026-01-23 11:09:00.934 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:01.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:01 np0005593234 nova_compute[227762]: 2026-01-23 11:09:01.626 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:09:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:02.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:09:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:03.312 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:04.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:05 np0005593234 nova_compute[227762]: 2026-01-23 11:09:05.045 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:05.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:06.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:06 np0005593234 nova_compute[227762]: 2026-01-23 11:09:06.629 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:09:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:07.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:09:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:08.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:09.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:10 np0005593234 nova_compute[227762]: 2026-01-23 11:09:10.080 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:09:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:10.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:09:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:11.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:11 np0005593234 nova_compute[227762]: 2026-01-23 11:09:11.631 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:09:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:12.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:09:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:09:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:13.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:09:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:14.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:15 np0005593234 nova_compute[227762]: 2026-01-23 11:09:15.083 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:15.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:16.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:16 np0005593234 nova_compute[227762]: 2026-01-23 11:09:16.635 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:17.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:17 np0005593234 podman[346715]: 2026-01-23 11:09:17.776092338 +0000 UTC m=+0.064414107 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Jan 23 06:09:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:09:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:18.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:09:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:09:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:19.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:09:20 np0005593234 nova_compute[227762]: 2026-01-23 11:09:20.086 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:20.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:21.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:21 np0005593234 nova_compute[227762]: 2026-01-23 11:09:21.636 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:22.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:23.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:24.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:25 np0005593234 nova_compute[227762]: 2026-01-23 11:09:25.090 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:25.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:25 np0005593234 podman[346788]: 2026-01-23 11:09:25.849551524 +0000 UTC m=+0.129794502 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 06:09:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:26.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:26 np0005593234 nova_compute[227762]: 2026-01-23 11:09:26.638 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:09:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:27.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:09:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:28.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:28 np0005593234 nova_compute[227762]: 2026-01-23 11:09:28.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:28 np0005593234 nova_compute[227762]: 2026-01-23 11:09:28.776 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:09:28 np0005593234 nova_compute[227762]: 2026-01-23 11:09:28.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:09:28 np0005593234 nova_compute[227762]: 2026-01-23 11:09:28.777 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:09:28 np0005593234 nova_compute[227762]: 2026-01-23 11:09:28.777 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:09:28 np0005593234 nova_compute[227762]: 2026-01-23 11:09:28.777 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:09:29 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:09:29 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3580887910' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:09:29 np0005593234 nova_compute[227762]: 2026-01-23 11:09:29.214 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:09:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:29.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:29 np0005593234 nova_compute[227762]: 2026-01-23 11:09:29.437 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:09:29 np0005593234 nova_compute[227762]: 2026-01-23 11:09:29.439 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4089MB free_disk=20.967525482177734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:09:29 np0005593234 nova_compute[227762]: 2026-01-23 11:09:29.439 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:09:29 np0005593234 nova_compute[227762]: 2026-01-23 11:09:29.440 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:09:29 np0005593234 nova_compute[227762]: 2026-01-23 11:09:29.694 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:09:29 np0005593234 nova_compute[227762]: 2026-01-23 11:09:29.694 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:09:29 np0005593234 nova_compute[227762]: 2026-01-23 11:09:29.717 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:09:29 np0005593234 ovn_controller[134547]: 2026-01-23T11:09:29Z|01021|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 06:09:30 np0005593234 nova_compute[227762]: 2026-01-23 11:09:30.093 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:30 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:09:30 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/169849577' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:09:30 np0005593234 nova_compute[227762]: 2026-01-23 11:09:30.204 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:09:30 np0005593234 nova_compute[227762]: 2026-01-23 11:09:30.213 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:09:30 np0005593234 nova_compute[227762]: 2026-01-23 11:09:30.289 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:09:30 np0005593234 nova_compute[227762]: 2026-01-23 11:09:30.355 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:09:30 np0005593234 nova_compute[227762]: 2026-01-23 11:09:30.355 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.916s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:09:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:30.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:31.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:31 np0005593234 nova_compute[227762]: 2026-01-23 11:09:31.640 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:32 np0005593234 nova_compute[227762]: 2026-01-23 11:09:32.356 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:32 np0005593234 nova_compute[227762]: 2026-01-23 11:09:32.357 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:09:32 np0005593234 nova_compute[227762]: 2026-01-23 11:09:32.357 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:09:32 np0005593234 nova_compute[227762]: 2026-01-23 11:09:32.373 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:09:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:32.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:33.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:33 np0005593234 nova_compute[227762]: 2026-01-23 11:09:33.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:34.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:35 np0005593234 nova_compute[227762]: 2026-01-23 11:09:35.096 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:09:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:35.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:09:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:36.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:36 np0005593234 nova_compute[227762]: 2026-01-23 11:09:36.683 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:37.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 06:09:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:09:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:09:37 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:09:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:38.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:09:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:39.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:09:39 np0005593234 nova_compute[227762]: 2026-01-23 11:09:39.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:40 np0005593234 nova_compute[227762]: 2026-01-23 11:09:40.099 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:09:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:40.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:09:40 np0005593234 nova_compute[227762]: 2026-01-23 11:09:40.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:09:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:41.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:09:41 np0005593234 nova_compute[227762]: 2026-01-23 11:09:41.685 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:42.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:42 np0005593234 nova_compute[227762]: 2026-01-23 11:09:42.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:09:42.918 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:09:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:09:42.919 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:09:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:09:42.919 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:09:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:09:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:43.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:09:43 np0005593234 nova_compute[227762]: 2026-01-23 11:09:43.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:43 np0005593234 nova_compute[227762]: 2026-01-23 11:09:43.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:09:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:44.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:09:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3290225871' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:09:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:09:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3290225871' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:09:45 np0005593234 nova_compute[227762]: 2026-01-23 11:09:45.102 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:45.353 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:45 np0005593234 nova_compute[227762]: 2026-01-23 11:09:45.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:09:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:46.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:46 np0005593234 nova_compute[227762]: 2026-01-23 11:09:46.686 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:09:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:47.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:09:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:48.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:49 np0005593234 podman[347058]: 2026-01-23 11:09:49.102649941 +0000 UTC m=+0.050595085 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 06:09:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:09:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:49.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:09:50 np0005593234 nova_compute[227762]: 2026-01-23 11:09:50.105 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:09:50 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:09:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:50.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:09:51.245 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=102, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=101) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:09:51 np0005593234 nova_compute[227762]: 2026-01-23 11:09:51.246 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:51 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:09:51.246 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:09:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:51.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:51 np0005593234 nova_compute[227762]: 2026-01-23 11:09:51.688 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:09:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:52.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:09:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:53.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:54.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:55 np0005593234 nova_compute[227762]: 2026-01-23 11:09:55.108 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:09:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:55.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:09:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:56.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:56 np0005593234 nova_compute[227762]: 2026-01-23 11:09:56.689 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:09:56 np0005593234 podman[347132]: 2026-01-23 11:09:56.824662691 +0000 UTC m=+0.120713609 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Jan 23 06:09:57 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:09:57.249 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '102'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:09:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:57.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:09:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:09:58.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:09:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:09:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:09:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:09:59.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:00 np0005593234 nova_compute[227762]: 2026-01-23 11:10:00.111 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:00.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:01 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 06:10:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:10:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:01.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:10:01 np0005593234 nova_compute[227762]: 2026-01-23 11:10:01.691 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:02.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:02 np0005593234 nova_compute[227762]: 2026-01-23 11:10:02.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:10:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:10:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:03.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:10:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:10:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:04.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:10:05 np0005593234 nova_compute[227762]: 2026-01-23 11:10:05.115 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:05.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:06.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:06 np0005593234 nova_compute[227762]: 2026-01-23 11:10:06.734 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:07.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:10:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:08.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:10:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:10:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:09.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:10:10 np0005593234 nova_compute[227762]: 2026-01-23 11:10:10.149 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:10.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:11.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:11 np0005593234 nova_compute[227762]: 2026-01-23 11:10:11.736 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:11 np0005593234 nova_compute[227762]: 2026-01-23 11:10:11.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:10:11 np0005593234 nova_compute[227762]: 2026-01-23 11:10:11.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 06:10:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:10:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:12.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:10:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:13.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:14.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:14 np0005593234 nova_compute[227762]: 2026-01-23 11:10:14.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:10:15 np0005593234 nova_compute[227762]: 2026-01-23 11:10:15.152 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:15.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:16.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:16 np0005593234 nova_compute[227762]: 2026-01-23 11:10:16.737 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:17.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:18.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:19.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:19 np0005593234 podman[347270]: 2026-01-23 11:10:19.768880082 +0000 UTC m=+0.062638371 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 06:10:20 np0005593234 nova_compute[227762]: 2026-01-23 11:10:20.205 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:10:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:20.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:10:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:10:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:21.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:10:21 np0005593234 nova_compute[227762]: 2026-01-23 11:10:21.739 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:22.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:10:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:23.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:10:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:10:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:24.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:10:25 np0005593234 nova_compute[227762]: 2026-01-23 11:10:25.207 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:25.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:10:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:26.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:10:26 np0005593234 nova_compute[227762]: 2026-01-23 11:10:26.742 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:10:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:27.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:10:27 np0005593234 podman[347294]: 2026-01-23 11:10:27.823663804 +0000 UTC m=+0.117139097 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 23 06:10:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:28.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:10:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:29.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:10:30 np0005593234 nova_compute[227762]: 2026-01-23 11:10:30.212 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:30.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:30 np0005593234 nova_compute[227762]: 2026-01-23 11:10:30.756 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:10:30 np0005593234 nova_compute[227762]: 2026-01-23 11:10:30.789 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:10:30 np0005593234 nova_compute[227762]: 2026-01-23 11:10:30.790 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:10:30 np0005593234 nova_compute[227762]: 2026-01-23 11:10:30.790 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:10:30 np0005593234 nova_compute[227762]: 2026-01-23 11:10:30.791 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:10:30 np0005593234 nova_compute[227762]: 2026-01-23 11:10:30.791 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:10:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:10:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2597416388' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:10:31 np0005593234 nova_compute[227762]: 2026-01-23 11:10:31.220 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:10:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:31.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:31 np0005593234 nova_compute[227762]: 2026-01-23 11:10:31.421 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:10:31 np0005593234 nova_compute[227762]: 2026-01-23 11:10:31.423 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4114MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:10:31 np0005593234 nova_compute[227762]: 2026-01-23 11:10:31.424 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:10:31 np0005593234 nova_compute[227762]: 2026-01-23 11:10:31.425 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:10:31 np0005593234 nova_compute[227762]: 2026-01-23 11:10:31.634 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:10:31 np0005593234 nova_compute[227762]: 2026-01-23 11:10:31.635 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:10:31 np0005593234 nova_compute[227762]: 2026-01-23 11:10:31.656 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:10:31 np0005593234 nova_compute[227762]: 2026-01-23 11:10:31.794 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:10:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/714219731' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:10:32 np0005593234 nova_compute[227762]: 2026-01-23 11:10:32.119 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:10:32 np0005593234 nova_compute[227762]: 2026-01-23 11:10:32.126 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:10:32 np0005593234 nova_compute[227762]: 2026-01-23 11:10:32.149 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:10:32 np0005593234 nova_compute[227762]: 2026-01-23 11:10:32.151 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:10:32 np0005593234 nova_compute[227762]: 2026-01-23 11:10:32.151 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.726s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:10:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:10:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:32.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:10:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:33 np0005593234 nova_compute[227762]: 2026-01-23 11:10:33.140 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:10:33 np0005593234 nova_compute[227762]: 2026-01-23 11:10:33.140 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:10:33 np0005593234 nova_compute[227762]: 2026-01-23 11:10:33.140 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:10:33 np0005593234 nova_compute[227762]: 2026-01-23 11:10:33.289 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:10:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:33.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:33 np0005593234 nova_compute[227762]: 2026-01-23 11:10:33.557 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "c019f125-db84-4dc5-959e-117b53966ffa" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:10:33 np0005593234 nova_compute[227762]: 2026-01-23 11:10:33.557 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:10:33 np0005593234 nova_compute[227762]: 2026-01-23 11:10:33.604 227766 DEBUG nova.compute.manager [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 06:10:33 np0005593234 nova_compute[227762]: 2026-01-23 11:10:33.697 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:10:33 np0005593234 nova_compute[227762]: 2026-01-23 11:10:33.698 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:10:33 np0005593234 nova_compute[227762]: 2026-01-23 11:10:33.706 227766 DEBUG nova.virt.hardware [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 06:10:33 np0005593234 nova_compute[227762]: 2026-01-23 11:10:33.707 227766 INFO nova.compute.claims [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 06:10:33 np0005593234 nova_compute[227762]: 2026-01-23 11:10:33.848 227766 DEBUG oslo_concurrency.processutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:10:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:10:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4033523539' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.376 227766 DEBUG oslo_concurrency.processutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.385 227766 DEBUG nova.compute.provider_tree [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.410 227766 DEBUG nova.scheduler.client.report [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.436 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.738s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.438 227766 DEBUG nova.compute.manager [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 06:10:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:34.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.509 227766 DEBUG nova.compute.manager [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.510 227766 DEBUG nova.network.neutron [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.545 227766 INFO nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.567 227766 DEBUG nova.compute.manager [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.702 227766 DEBUG nova.compute.manager [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.704 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.704 227766 INFO nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Creating image(s)#033[00m
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.751 227766 DEBUG nova.storage.rbd_utils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] rbd image c019f125-db84-4dc5-959e-117b53966ffa_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.792 227766 DEBUG nova.storage.rbd_utils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] rbd image c019f125-db84-4dc5-959e-117b53966ffa_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.822 227766 DEBUG nova.storage.rbd_utils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] rbd image c019f125-db84-4dc5-959e-117b53966ffa_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.826 227766 DEBUG oslo_concurrency.processutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.853 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.889 227766 DEBUG oslo_concurrency.processutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.890 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.891 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.891 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "a6f655456a04e1d13ef2e44ed4544c38917863a2" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.926 227766 DEBUG nova.storage.rbd_utils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] rbd image c019f125-db84-4dc5-959e-117b53966ffa_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 06:10:34 np0005593234 nova_compute[227762]: 2026-01-23 11:10:34.930 227766 DEBUG oslo_concurrency.processutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 c019f125-db84-4dc5-959e-117b53966ffa_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:10:35 np0005593234 nova_compute[227762]: 2026-01-23 11:10:35.246 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:10:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:35.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:10:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:36.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:36 np0005593234 nova_compute[227762]: 2026-01-23 11:10:36.796 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:37 np0005593234 ovn_controller[134547]: 2026-01-23T11:10:37Z|01022|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 23 06:10:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:37.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:37 np0005593234 nova_compute[227762]: 2026-01-23 11:10:37.863 227766 DEBUG nova.network.neutron [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Successfully created port: b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 06:10:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:38.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:39 np0005593234 nova_compute[227762]: 2026-01-23 11:10:39.098 227766 DEBUG nova.network.neutron [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Successfully updated port: b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 06:10:39 np0005593234 nova_compute[227762]: 2026-01-23 11:10:39.122 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "refresh_cache-c019f125-db84-4dc5-959e-117b53966ffa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 06:10:39 np0005593234 nova_compute[227762]: 2026-01-23 11:10:39.123 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquired lock "refresh_cache-c019f125-db84-4dc5-959e-117b53966ffa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 06:10:39 np0005593234 nova_compute[227762]: 2026-01-23 11:10:39.123 227766 DEBUG nova.network.neutron [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 06:10:39 np0005593234 nova_compute[227762]: 2026-01-23 11:10:39.213 227766 DEBUG nova.compute.manager [req-83e3fc13-0a0a-4065-aa5a-eab09f084a12 req-7627e90f-643d-414b-983e-a9567d50f6ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Received event network-changed-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 06:10:39 np0005593234 nova_compute[227762]: 2026-01-23 11:10:39.214 227766 DEBUG nova.compute.manager [req-83e3fc13-0a0a-4065-aa5a-eab09f084a12 req-7627e90f-643d-414b-983e-a9567d50f6ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Refreshing instance network info cache due to event network-changed-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 06:10:39 np0005593234 nova_compute[227762]: 2026-01-23 11:10:39.214 227766 DEBUG oslo_concurrency.lockutils [req-83e3fc13-0a0a-4065-aa5a-eab09f084a12 req-7627e90f-643d-414b-983e-a9567d50f6ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-c019f125-db84-4dc5-959e-117b53966ffa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 06:10:39 np0005593234 nova_compute[227762]: 2026-01-23 11:10:39.281 227766 DEBUG nova.network.neutron [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 06:10:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:39.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:40 np0005593234 nova_compute[227762]: 2026-01-23 11:10:40.250 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:40 np0005593234 nova_compute[227762]: 2026-01-23 11:10:40.427 227766 DEBUG nova.network.neutron [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Updating instance_info_cache with network_info: [{"id": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "address": "fa:16:3e:ba:3c:02", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d39f5f-2a", "ovs_interfaceid": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 06:10:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:40.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:40 np0005593234 nova_compute[227762]: 2026-01-23 11:10:40.503 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Releasing lock "refresh_cache-c019f125-db84-4dc5-959e-117b53966ffa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 06:10:40 np0005593234 nova_compute[227762]: 2026-01-23 11:10:40.503 227766 DEBUG nova.compute.manager [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Instance network_info: |[{"id": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "address": "fa:16:3e:ba:3c:02", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d39f5f-2a", "ovs_interfaceid": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 06:10:40 np0005593234 nova_compute[227762]: 2026-01-23 11:10:40.503 227766 DEBUG oslo_concurrency.lockutils [req-83e3fc13-0a0a-4065-aa5a-eab09f084a12 req-7627e90f-643d-414b-983e-a9567d50f6ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-c019f125-db84-4dc5-959e-117b53966ffa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 06:10:40 np0005593234 nova_compute[227762]: 2026-01-23 11:10:40.503 227766 DEBUG nova.network.neutron [req-83e3fc13-0a0a-4065-aa5a-eab09f084a12 req-7627e90f-643d-414b-983e-a9567d50f6ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Refreshing network info cache for port b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 06:10:40 np0005593234 nova_compute[227762]: 2026-01-23 11:10:40.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:10:40 np0005593234 nova_compute[227762]: 2026-01-23 11:10:40.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:10:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:41.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:41 np0005593234 nova_compute[227762]: 2026-01-23 11:10:41.797 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:42.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:42.919 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:10:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:42.920 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:10:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:42.920 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:10:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:43 np0005593234 nova_compute[227762]: 2026-01-23 11:10:43.060 227766 DEBUG oslo_concurrency.processutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/a6f655456a04e1d13ef2e44ed4544c38917863a2 c019f125-db84-4dc5-959e-117b53966ffa_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 8.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:10:43 np0005593234 nova_compute[227762]: 2026-01-23 11:10:43.145 227766 DEBUG nova.storage.rbd_utils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] resizing rbd image c019f125-db84-4dc5-959e-117b53966ffa_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 06:10:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:43.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:43 np0005593234 nova_compute[227762]: 2026-01-23 11:10:43.514 227766 DEBUG nova.objects.instance [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lazy-loading 'migration_context' on Instance uuid c019f125-db84-4dc5-959e-117b53966ffa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 06:10:43 np0005593234 nova_compute[227762]: 2026-01-23 11:10:43.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:10:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:44.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:10:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/123823444' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:10:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:10:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/123823444' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:10:44 np0005593234 nova_compute[227762]: 2026-01-23 11:10:44.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:10:44 np0005593234 nova_compute[227762]: 2026-01-23 11:10:44.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 06:10:45 np0005593234 nova_compute[227762]: 2026-01-23 11:10:45.254 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:10:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:45.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:10:45 np0005593234 nova_compute[227762]: 2026-01-23 11:10:45.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:10:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:10:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:46.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.800 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.943 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.944 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Ensure instance console log exists: /var/lib/nova/instances/c019f125-db84-4dc5-959e-117b53966ffa/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.945 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.945 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.946 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.950 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Start _get_guest_xml network_info=[{"id": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "address": "fa:16:3e:ba:3c:02", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d39f5f-2a", "ovs_interfaceid": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.957 227766 WARNING nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.965 227766 DEBUG nova.virt.libvirt.host [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.966 227766 DEBUG nova.virt.libvirt.host [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.981 227766 DEBUG nova.virt.libvirt.host [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.982 227766 DEBUG nova.virt.libvirt.host [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.984 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.985 227766 DEBUG nova.virt.hardware [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T09:27:25Z,direct_url=<?>,disk_format='qcow2',id=84c0ef19-7f67-4bd3-95d8-507c3e0942ed,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='7ace0d3e1d354841bc1ddea0c12699d6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T09:27:30Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.986 227766 DEBUG nova.virt.hardware [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.986 227766 DEBUG nova.virt.hardware [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.987 227766 DEBUG nova.virt.hardware [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.987 227766 DEBUG nova.virt.hardware [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.988 227766 DEBUG nova.virt.hardware [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.988 227766 DEBUG nova.virt.hardware [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.989 227766 DEBUG nova.virt.hardware [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.989 227766 DEBUG nova.virt.hardware [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.990 227766 DEBUG nova.virt.hardware [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.990 227766 DEBUG nova.virt.hardware [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 06:10:46 np0005593234 nova_compute[227762]: 2026-01-23 11:10:46.997 227766 DEBUG oslo_concurrency.processutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:10:47 np0005593234 nova_compute[227762]: 2026-01-23 11:10:47.195 227766 DEBUG nova.network.neutron [req-83e3fc13-0a0a-4065-aa5a-eab09f084a12 req-7627e90f-643d-414b-983e-a9567d50f6ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Updated VIF entry in instance network info cache for port b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:10:47 np0005593234 nova_compute[227762]: 2026-01-23 11:10:47.196 227766 DEBUG nova.network.neutron [req-83e3fc13-0a0a-4065-aa5a-eab09f084a12 req-7627e90f-643d-414b-983e-a9567d50f6ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Updating instance_info_cache with network_info: [{"id": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "address": "fa:16:3e:ba:3c:02", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d39f5f-2a", "ovs_interfaceid": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:10:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:10:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3670660150' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:10:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:47.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:47 np0005593234 nova_compute[227762]: 2026-01-23 11:10:47.430 227766 DEBUG oslo_concurrency.processutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:10:47 np0005593234 nova_compute[227762]: 2026-01-23 11:10:47.459 227766 DEBUG nova.storage.rbd_utils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] rbd image c019f125-db84-4dc5-959e-117b53966ffa_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:10:47 np0005593234 nova_compute[227762]: 2026-01-23 11:10:47.465 227766 DEBUG oslo_concurrency.processutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:10:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:10:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2957678545' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:10:47 np0005593234 nova_compute[227762]: 2026-01-23 11:10:47.906 227766 DEBUG oslo_concurrency.processutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:10:47 np0005593234 nova_compute[227762]: 2026-01-23 11:10:47.909 227766 DEBUG nova.virt.libvirt.vif [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:10:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1989694612',display_name='tempest-TestServerMultinode-server-1989694612',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1989694612',id=222,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4691e06029a4b11bbda2856a451bd88',ramdisk_id='',reservation_id='r-i0szs682',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1152571872',owner_user_name='tempest-TestServerMultinode-1152571872-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:10:34Z,user_data=None,user_id='ac51edf400184ec0b11ee5acc335ff21',uuid=c019f125-db84-4dc5-959e-117b53966ffa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "address": "fa:16:3e:ba:3c:02", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d39f5f-2a", "ovs_interfaceid": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 06:10:47 np0005593234 nova_compute[227762]: 2026-01-23 11:10:47.909 227766 DEBUG nova.network.os_vif_util [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Converting VIF {"id": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "address": "fa:16:3e:ba:3c:02", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d39f5f-2a", "ovs_interfaceid": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:10:47 np0005593234 nova_compute[227762]: 2026-01-23 11:10:47.910 227766 DEBUG nova.network.os_vif_util [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:3c:02,bridge_name='br-int',has_traffic_filtering=True,id=b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7,network=Network(f4706ca2-15b6-4141-8d7b-8d4cab159f24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d39f5f-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:10:47 np0005593234 nova_compute[227762]: 2026-01-23 11:10:47.912 227766 DEBUG nova.objects.instance [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lazy-loading 'pci_devices' on Instance uuid c019f125-db84-4dc5-959e-117b53966ffa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:10:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.090 227766 DEBUG oslo_concurrency.lockutils [req-83e3fc13-0a0a-4065-aa5a-eab09f084a12 req-7627e90f-643d-414b-983e-a9567d50f6ba 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-c019f125-db84-4dc5-959e-117b53966ffa" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.250 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] End _get_guest_xml xml=<domain type="kvm">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  <uuid>c019f125-db84-4dc5-959e-117b53966ffa</uuid>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  <name>instance-000000de</name>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestServerMultinode-server-1989694612</nova:name>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 11:10:46</nova:creationTime>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <nova:user uuid="ac51edf400184ec0b11ee5acc335ff21">tempest-TestServerMultinode-1152571872-project-admin</nova:user>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <nova:project uuid="d4691e06029a4b11bbda2856a451bd88">tempest-TestServerMultinode-1152571872</nova:project>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="84c0ef19-7f67-4bd3-95d8-507c3e0942ed"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <nova:port uuid="b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <system>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <entry name="serial">c019f125-db84-4dc5-959e-117b53966ffa</entry>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <entry name="uuid">c019f125-db84-4dc5-959e-117b53966ffa</entry>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    </system>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  <os>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  </os>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  <features>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  </features>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  </clock>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  <devices>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/c019f125-db84-4dc5-959e-117b53966ffa_disk">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/c019f125-db84-4dc5-959e-117b53966ffa_disk.config">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:ba:3c:02"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <target dev="tapb2d39f5f-2a"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    </interface>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/c019f125-db84-4dc5-959e-117b53966ffa/console.log" append="off"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    </serial>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <video>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    </video>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    </rng>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 06:10:48 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 06:10:48 np0005593234 nova_compute[227762]:  </devices>
Jan 23 06:10:48 np0005593234 nova_compute[227762]: </domain>
Jan 23 06:10:48 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.251 227766 DEBUG nova.compute.manager [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Preparing to wait for external event network-vif-plugged-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.252 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "c019f125-db84-4dc5-959e-117b53966ffa-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.252 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.253 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.254 227766 DEBUG nova.virt.libvirt.vif [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:10:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1989694612',display_name='tempest-TestServerMultinode-server-1989694612',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1989694612',id=222,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d4691e06029a4b11bbda2856a451bd88',ramdisk_id='',reservation_id='r-i0szs682',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1152571872',owner_user_name='tempest-TestServerMultinode-1152571872-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:10:34Z,user_data=None,user_id='ac51edf400184ec0b11ee5acc335ff21',uuid=c019f125-db84-4dc5-959e-117b53966ffa,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "address": "fa:16:3e:ba:3c:02", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d39f5f-2a", "ovs_interfaceid": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.254 227766 DEBUG nova.network.os_vif_util [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Converting VIF {"id": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "address": "fa:16:3e:ba:3c:02", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d39f5f-2a", "ovs_interfaceid": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.255 227766 DEBUG nova.network.os_vif_util [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:3c:02,bridge_name='br-int',has_traffic_filtering=True,id=b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7,network=Network(f4706ca2-15b6-4141-8d7b-8d4cab159f24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d39f5f-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.256 227766 DEBUG os_vif [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:3c:02,bridge_name='br-int',has_traffic_filtering=True,id=b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7,network=Network(f4706ca2-15b6-4141-8d7b-8d4cab159f24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d39f5f-2a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.257 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.258 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.259 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.266 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.266 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2d39f5f-2a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.267 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb2d39f5f-2a, col_values=(('external_ids', {'iface-id': 'b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ba:3c:02', 'vm-uuid': 'c019f125-db84-4dc5-959e-117b53966ffa'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.269 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:48 np0005593234 NetworkManager[48942]: <info>  [1769166648.2709] manager: (tapb2d39f5f-2a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/490)
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.273 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.281 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.282 227766 INFO os_vif [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:3c:02,bridge_name='br-int',has_traffic_filtering=True,id=b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7,network=Network(f4706ca2-15b6-4141-8d7b-8d4cab159f24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d39f5f-2a')
Jan 23 06:10:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:10:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:48.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.924 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.925 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.925 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] No VIF found with MAC fa:16:3e:ba:3c:02, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.926 227766 INFO nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Using config drive
Jan 23 06:10:48 np0005593234 nova_compute[227762]: 2026-01-23 11:10:48.963 227766 DEBUG nova.storage.rbd_utils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] rbd image c019f125-db84-4dc5-959e-117b53966ffa_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 06:10:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:10:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:49.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:10:49 np0005593234 podman[347747]: 2026-01-23 11:10:49.879509984 +0000 UTC m=+0.067841825 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 06:10:50 np0005593234 nova_compute[227762]: 2026-01-23 11:10:50.002 227766 INFO nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Creating config drive at /var/lib/nova/instances/c019f125-db84-4dc5-959e-117b53966ffa/disk.config
Jan 23 06:10:50 np0005593234 nova_compute[227762]: 2026-01-23 11:10:50.009 227766 DEBUG oslo_concurrency.processutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c019f125-db84-4dc5-959e-117b53966ffa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4_6bt8rs execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:10:50 np0005593234 nova_compute[227762]: 2026-01-23 11:10:50.145 227766 DEBUG oslo_concurrency.processutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c019f125-db84-4dc5-959e-117b53966ffa/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp4_6bt8rs" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:10:50 np0005593234 nova_compute[227762]: 2026-01-23 11:10:50.180 227766 DEBUG nova.storage.rbd_utils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] rbd image c019f125-db84-4dc5-959e-117b53966ffa_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 06:10:50 np0005593234 nova_compute[227762]: 2026-01-23 11:10:50.185 227766 DEBUG oslo_concurrency.processutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/c019f125-db84-4dc5-959e-117b53966ffa/disk.config c019f125-db84-4dc5-959e-117b53966ffa_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:10:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:50.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:51.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:51 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:10:51 np0005593234 nova_compute[227762]: 2026-01-23 11:10:51.784 227766 DEBUG oslo_concurrency.processutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/c019f125-db84-4dc5-959e-117b53966ffa/disk.config c019f125-db84-4dc5-959e-117b53966ffa_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.600s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:10:51 np0005593234 nova_compute[227762]: 2026-01-23 11:10:51.786 227766 INFO nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Deleting local config drive /var/lib/nova/instances/c019f125-db84-4dc5-959e-117b53966ffa/disk.config because it was imported into RBD.
Jan 23 06:10:51 np0005593234 nova_compute[227762]: 2026-01-23 11:10:51.804 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:51 np0005593234 kernel: tapb2d39f5f-2a: entered promiscuous mode
Jan 23 06:10:51 np0005593234 ovn_controller[134547]: 2026-01-23T11:10:51Z|01023|binding|INFO|Claiming lport b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 for this chassis.
Jan 23 06:10:51 np0005593234 nova_compute[227762]: 2026-01-23 11:10:51.869 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:51 np0005593234 ovn_controller[134547]: 2026-01-23T11:10:51Z|01024|binding|INFO|b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7: Claiming fa:16:3e:ba:3c:02 10.100.0.14
Jan 23 06:10:51 np0005593234 NetworkManager[48942]: <info>  [1769166651.8703] manager: (tapb2d39f5f-2a): new Tun device (/org/freedesktop/NetworkManager/Devices/491)
Jan 23 06:10:51 np0005593234 nova_compute[227762]: 2026-01-23 11:10:51.879 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:51 np0005593234 nova_compute[227762]: 2026-01-23 11:10:51.881 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:51 np0005593234 systemd-machined[195626]: New machine qemu-114-instance-000000de.
Jan 23 06:10:51 np0005593234 nova_compute[227762]: 2026-01-23 11:10:51.952 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:51 np0005593234 ovn_controller[134547]: 2026-01-23T11:10:51Z|01025|binding|INFO|Setting lport b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 ovn-installed in OVS
Jan 23 06:10:51 np0005593234 nova_compute[227762]: 2026-01-23 11:10:51.956 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:10:51 np0005593234 systemd[1]: Started Virtual Machine qemu-114-instance-000000de.
Jan 23 06:10:51 np0005593234 systemd-udevd[347905]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:10:51 np0005593234 NetworkManager[48942]: <info>  [1769166651.9921] device (tapb2d39f5f-2a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:10:51 np0005593234 NetworkManager[48942]: <info>  [1769166651.9929] device (tapb2d39f5f-2a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:10:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:10:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:52.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:10:52 np0005593234 ovn_controller[134547]: 2026-01-23T11:10:52Z|01026|binding|INFO|Setting lport b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 up in Southbound
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.611 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:3c:02 10.100.0.14'], port_security=['fa:16:3e:ba:3c:02 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c019f125-db84-4dc5-959e-117b53966ffa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4691e06029a4b11bbda2856a451bd88', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0ca02cfe-9b98-40f4-8c92-4cc40f5f9499', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9205c727-159c-48df-8bc6-3771f4de4cfc, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.613 144381 INFO neutron.agent.ovn.metadata.agent [-] Port b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 in datapath f4706ca2-15b6-4141-8d7b-8d4cab159f24 bound to our chassis
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.614 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f4706ca2-15b6-4141-8d7b-8d4cab159f24
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.634 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e533e5c0-4013-48f2-9f08-b9c873a665ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.635 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf4706ca2-11 in ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.636 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf4706ca2-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.636 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c04b8f7c-787f-41a9-b150-b8aa6c639bb5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.637 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[122ff6e9-798a-4d45-996f-0a55a0af8eb7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:10:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:10:52 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.656 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[0f0f573f-2c08-4749-8ec7-544cfd82040a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.675 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[249b5d3f-253e-4965-8bc3-57ca9af05629]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.706 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7eea25bc-e78c-4f53-9c71-befea42439c2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:10:52 np0005593234 NetworkManager[48942]: <info>  [1769166652.7177] manager: (tapf4706ca2-10): new Veth device (/org/freedesktop/NetworkManager/Devices/492)
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.718 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[14f07e18-2716-428f-a7b2-3fe5da400f3f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.751 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[2d665df8-2411-43de-9a91-7763dde9f641]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.755 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[68763fd1-523c-47c7-9e14-5545b76b61f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:10:52 np0005593234 NetworkManager[48942]: <info>  [1769166652.7802] device (tapf4706ca2-10): carrier: link connected
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.785 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[ed73124d-15ce-4d99-ba1e-4b2b1aebd85a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.801 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[eb681601-a92d-43df-a262-4af0d4549fa4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4706ca2-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:aa:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 315], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1054774, 'reachable_time': 36373, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 347975, 'error': None, 'target': 'ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.816 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[5e48c1fe-d730-475f-a713-5e78517527ce]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feba:aa5b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1054774, 'tstamp': 1054774}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 347980, 'error': None, 'target': 'ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.830 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c18aaf4b-6e99-4361-9635-b90e1e7f5344]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf4706ca2-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ba:aa:5b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 315], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1054774, 'reachable_time': 36373, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 347982, 'error': None, 'target': 'ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.858 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[804448e4-a9e0-4698-af7a-0162fda3a2cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:52 np0005593234 nova_compute[227762]: 2026-01-23 11:10:52.884 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166652.883897, c019f125-db84-4dc5-959e-117b53966ffa => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:10:52 np0005593234 nova_compute[227762]: 2026-01-23 11:10:52.885 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c019f125-db84-4dc5-959e-117b53966ffa] VM Started (Lifecycle Event)#033[00m
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.915 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ddcc316e-08dc-4879-8663-3de5dcbb5c59]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.917 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4706ca2-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.917 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.918 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf4706ca2-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:10:52 np0005593234 nova_compute[227762]: 2026-01-23 11:10:52.920 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:52 np0005593234 NetworkManager[48942]: <info>  [1769166652.9207] manager: (tapf4706ca2-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/493)
Jan 23 06:10:52 np0005593234 kernel: tapf4706ca2-10: entered promiscuous mode
Jan 23 06:10:52 np0005593234 nova_compute[227762]: 2026-01-23 11:10:52.922 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.923 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf4706ca2-10, col_values=(('external_ids', {'iface-id': '5655a848-aba1-4fa8-84e5-387dc4198f8a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:10:52 np0005593234 nova_compute[227762]: 2026-01-23 11:10:52.924 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:52 np0005593234 ovn_controller[134547]: 2026-01-23T11:10:52Z|01027|binding|INFO|Releasing lport 5655a848-aba1-4fa8-84e5-387dc4198f8a from this chassis (sb_readonly=1)
Jan 23 06:10:52 np0005593234 nova_compute[227762]: 2026-01-23 11:10:52.936 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:52 np0005593234 nova_compute[227762]: 2026-01-23 11:10:52.936 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.937 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f4706ca2-15b6-4141-8d7b-8d4cab159f24.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f4706ca2-15b6-4141-8d7b-8d4cab159f24.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.938 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[276f06b5-5dc8-4f0a-b2c5-27cf72547363]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:10:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.939 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-f4706ca2-15b6-4141-8d7b-8d4cab159f24
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/f4706ca2-15b6-4141-8d7b-8d4cab159f24.pid.haproxy
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID f4706ca2-15b6-4141-8d7b-8d4cab159f24
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 06:10:52 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:52.940 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'env', 'PROCESS_TAG=haproxy-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f4706ca2-15b6-4141-8d7b-8d4cab159f24.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 06:10:53 np0005593234 nova_compute[227762]: 2026-01-23 11:10:53.269 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:53 np0005593234 podman[348015]: 2026-01-23 11:10:53.370529798 +0000 UTC m=+0.087584462 container create cd65217a7b08e2bde2200f57c0498e869a4595c8841d729c9133ba3bfc69181d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 06:10:53 np0005593234 systemd[1]: Started libpod-conmon-cd65217a7b08e2bde2200f57c0498e869a4595c8841d729c9133ba3bfc69181d.scope.
Jan 23 06:10:53 np0005593234 podman[348015]: 2026-01-23 11:10:53.327831031 +0000 UTC m=+0.044885765 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:10:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:53.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:53 np0005593234 systemd[1]: Started libcrun container.
Jan 23 06:10:53 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/73c5b8ca4127e8578a0ef23e353939d206c8781ce35fb9e527cb20222434fd88/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:10:53 np0005593234 podman[348015]: 2026-01-23 11:10:53.455547649 +0000 UTC m=+0.172602343 container init cd65217a7b08e2bde2200f57c0498e869a4595c8841d729c9133ba3bfc69181d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 06:10:53 np0005593234 podman[348015]: 2026-01-23 11:10:53.461755042 +0000 UTC m=+0.178809716 container start cd65217a7b08e2bde2200f57c0498e869a4595c8841d729c9133ba3bfc69181d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Jan 23 06:10:53 np0005593234 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[348030]: [NOTICE]   (348034) : New worker (348036) forked
Jan 23 06:10:53 np0005593234 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[348030]: [NOTICE]   (348034) : Loading success.
Jan 23 06:10:54 np0005593234 nova_compute[227762]: 2026-01-23 11:10:54.146 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:10:54 np0005593234 nova_compute[227762]: 2026-01-23 11:10:54.151 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166652.8842118, c019f125-db84-4dc5-959e-117b53966ffa => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:10:54 np0005593234 nova_compute[227762]: 2026-01-23 11:10:54.151 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c019f125-db84-4dc5-959e-117b53966ffa] VM Paused (Lifecycle Event)#033[00m
Jan 23 06:10:54 np0005593234 nova_compute[227762]: 2026-01-23 11:10:54.367 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:10:54 np0005593234 nova_compute[227762]: 2026-01-23 11:10:54.371 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:10:54 np0005593234 nova_compute[227762]: 2026-01-23 11:10:54.405 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c019f125-db84-4dc5-959e-117b53966ffa] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:10:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:54.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:54 np0005593234 nova_compute[227762]: 2026-01-23 11:10:54.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:10:54 np0005593234 nova_compute[227762]: 2026-01-23 11:10:54.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.388 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 06:10:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:10:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:55.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.886 227766 DEBUG nova.compute.manager [req-ac94ef25-e7ff-4772-ba97-a24acc33db1c req-e3599d9c-8b4d-4c13-976b-9b5c27054f99 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Received event network-vif-plugged-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.887 227766 DEBUG oslo_concurrency.lockutils [req-ac94ef25-e7ff-4772-ba97-a24acc33db1c req-e3599d9c-8b4d-4c13-976b-9b5c27054f99 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c019f125-db84-4dc5-959e-117b53966ffa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.888 227766 DEBUG oslo_concurrency.lockutils [req-ac94ef25-e7ff-4772-ba97-a24acc33db1c req-e3599d9c-8b4d-4c13-976b-9b5c27054f99 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.888 227766 DEBUG oslo_concurrency.lockutils [req-ac94ef25-e7ff-4772-ba97-a24acc33db1c req-e3599d9c-8b4d-4c13-976b-9b5c27054f99 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.888 227766 DEBUG nova.compute.manager [req-ac94ef25-e7ff-4772-ba97-a24acc33db1c req-e3599d9c-8b4d-4c13-976b-9b5c27054f99 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Processing event network-vif-plugged-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.890 227766 DEBUG nova.compute.manager [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Instance event wait completed in 3 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.895 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166655.8947656, c019f125-db84-4dc5-959e-117b53966ffa => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.895 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c019f125-db84-4dc5-959e-117b53966ffa] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.900 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.904 227766 INFO nova.virt.libvirt.driver [-] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Instance spawned successfully.#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.905 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.940 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.948 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.955 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.955 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.956 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.957 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.957 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:10:55 np0005593234 nova_compute[227762]: 2026-01-23 11:10:55.958 227766 DEBUG nova.virt.libvirt.driver [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:10:56 np0005593234 nova_compute[227762]: 2026-01-23 11:10:56.081 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: c019f125-db84-4dc5-959e-117b53966ffa] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:10:56 np0005593234 nova_compute[227762]: 2026-01-23 11:10:56.238 227766 INFO nova.compute.manager [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Took 21.54 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 06:10:56 np0005593234 nova_compute[227762]: 2026-01-23 11:10:56.238 227766 DEBUG nova.compute.manager [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:10:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:56.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:56 np0005593234 nova_compute[227762]: 2026-01-23 11:10:56.513 227766 INFO nova.compute.manager [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Took 22.84 seconds to build instance.#033[00m
Jan 23 06:10:56 np0005593234 nova_compute[227762]: 2026-01-23 11:10:56.695 227766 DEBUG oslo_concurrency.lockutils [None req-230f570d-d9af-43e5-a00c-de0d0dee1584 ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 23.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:10:56 np0005593234 nova_compute[227762]: 2026-01-23 11:10:56.805 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:56 np0005593234 nova_compute[227762]: 2026-01-23 11:10:56.969 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:56.968 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=103, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=102) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:10:56 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:10:56.971 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:10:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:57.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:10:58 np0005593234 nova_compute[227762]: 2026-01-23 11:10:58.271 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:10:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:10:58.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:10:58 np0005593234 podman[348050]: 2026-01-23 11:10:58.857375553 +0000 UTC m=+0.119868002 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:10:59 np0005593234 nova_compute[227762]: 2026-01-23 11:10:59.162 227766 DEBUG nova.compute.manager [req-0d99dcbd-eb7c-43ba-aa8e-1d54ffc5c4cc req-a3932735-8341-4604-bbf5-b4f721bcecd9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Received event network-vif-plugged-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:10:59 np0005593234 nova_compute[227762]: 2026-01-23 11:10:59.162 227766 DEBUG oslo_concurrency.lockutils [req-0d99dcbd-eb7c-43ba-aa8e-1d54ffc5c4cc req-a3932735-8341-4604-bbf5-b4f721bcecd9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c019f125-db84-4dc5-959e-117b53966ffa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:10:59 np0005593234 nova_compute[227762]: 2026-01-23 11:10:59.163 227766 DEBUG oslo_concurrency.lockutils [req-0d99dcbd-eb7c-43ba-aa8e-1d54ffc5c4cc req-a3932735-8341-4604-bbf5-b4f721bcecd9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:10:59 np0005593234 nova_compute[227762]: 2026-01-23 11:10:59.163 227766 DEBUG oslo_concurrency.lockutils [req-0d99dcbd-eb7c-43ba-aa8e-1d54ffc5c4cc req-a3932735-8341-4604-bbf5-b4f721bcecd9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:10:59 np0005593234 nova_compute[227762]: 2026-01-23 11:10:59.163 227766 DEBUG nova.compute.manager [req-0d99dcbd-eb7c-43ba-aa8e-1d54ffc5c4cc req-a3932735-8341-4604-bbf5-b4f721bcecd9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] No waiting events found dispatching network-vif-plugged-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:10:59 np0005593234 nova_compute[227762]: 2026-01-23 11:10:59.163 227766 WARNING nova.compute.manager [req-0d99dcbd-eb7c-43ba-aa8e-1d54ffc5c4cc req-a3932735-8341-4604-bbf5-b4f721bcecd9 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Received unexpected event network-vif-plugged-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 for instance with vm_state active and task_state None.#033[00m
Jan 23 06:10:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:10:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:10:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:10:59.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:00.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:01.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:01 np0005593234 nova_compute[227762]: 2026-01-23 11:11:01.807 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:02.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:11:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:11:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:03 np0005593234 nova_compute[227762]: 2026-01-23 11:11:03.272 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:11:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:03.433 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:11:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:11:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:04.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:11:05 np0005593234 nova_compute[227762]: 2026-01-23 11:11:05.383 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:11:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:05.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:11:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:06.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:06 np0005593234 nova_compute[227762]: 2026-01-23 11:11:06.844 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:06 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:06.974 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '103'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:11:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:07.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:08 np0005593234 nova_compute[227762]: 2026-01-23 11:11:08.273 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:08.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:11:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:09.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:11:10 np0005593234 nova_compute[227762]: 2026-01-23 11:11:10.345 227766 DEBUG oslo_concurrency.lockutils [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "c019f125-db84-4dc5-959e-117b53966ffa" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:10 np0005593234 nova_compute[227762]: 2026-01-23 11:11:10.346 227766 DEBUG oslo_concurrency.lockutils [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:10 np0005593234 nova_compute[227762]: 2026-01-23 11:11:10.346 227766 DEBUG oslo_concurrency.lockutils [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "c019f125-db84-4dc5-959e-117b53966ffa-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:10 np0005593234 nova_compute[227762]: 2026-01-23 11:11:10.346 227766 DEBUG oslo_concurrency.lockutils [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:10 np0005593234 nova_compute[227762]: 2026-01-23 11:11:10.346 227766 DEBUG oslo_concurrency.lockutils [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:10 np0005593234 nova_compute[227762]: 2026-01-23 11:11:10.348 227766 INFO nova.compute.manager [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Terminating instance#033[00m
Jan 23 06:11:10 np0005593234 nova_compute[227762]: 2026-01-23 11:11:10.349 227766 DEBUG nova.compute.manager [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 06:11:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:10.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:10 np0005593234 kernel: tapb2d39f5f-2a (unregistering): left promiscuous mode
Jan 23 06:11:10 np0005593234 NetworkManager[48942]: <info>  [1769166670.6369] device (tapb2d39f5f-2a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:11:10 np0005593234 nova_compute[227762]: 2026-01-23 11:11:10.647 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:10 np0005593234 ovn_controller[134547]: 2026-01-23T11:11:10Z|01028|binding|INFO|Releasing lport b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 from this chassis (sb_readonly=0)
Jan 23 06:11:10 np0005593234 ovn_controller[134547]: 2026-01-23T11:11:10Z|01029|binding|INFO|Setting lport b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 down in Southbound
Jan 23 06:11:10 np0005593234 ovn_controller[134547]: 2026-01-23T11:11:10Z|01030|binding|INFO|Removing iface tapb2d39f5f-2a ovn-installed in OVS
Jan 23 06:11:10 np0005593234 nova_compute[227762]: 2026-01-23 11:11:10.651 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:10.656 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ba:3c:02 10.100.0.14'], port_security=['fa:16:3e:ba:3c:02 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'c019f125-db84-4dc5-959e-117b53966ffa', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'd4691e06029a4b11bbda2856a451bd88', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0ca02cfe-9b98-40f4-8c92-4cc40f5f9499', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9205c727-159c-48df-8bc6-3771f4de4cfc, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:11:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:10.657 144381 INFO neutron.agent.ovn.metadata.agent [-] Port b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 in datapath f4706ca2-15b6-4141-8d7b-8d4cab159f24 unbound from our chassis#033[00m
Jan 23 06:11:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:10.658 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f4706ca2-15b6-4141-8d7b-8d4cab159f24, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:11:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:10.659 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[94f040c7-c344-4cd2-baaf-8237db10081c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:10.660 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24 namespace which is not needed anymore#033[00m
Jan 23 06:11:10 np0005593234 nova_compute[227762]: 2026-01-23 11:11:10.670 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:10 np0005593234 systemd[1]: machine-qemu\x2d114\x2dinstance\x2d000000de.scope: Deactivated successfully.
Jan 23 06:11:10 np0005593234 systemd[1]: machine-qemu\x2d114\x2dinstance\x2d000000de.scope: Consumed 12.586s CPU time.
Jan 23 06:11:10 np0005593234 systemd-machined[195626]: Machine qemu-114-instance-000000de terminated.
Jan 23 06:11:10 np0005593234 nova_compute[227762]: 2026-01-23 11:11:10.784 227766 INFO nova.virt.libvirt.driver [-] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Instance destroyed successfully.#033[00m
Jan 23 06:11:10 np0005593234 nova_compute[227762]: 2026-01-23 11:11:10.784 227766 DEBUG nova.objects.instance [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lazy-loading 'resources' on Instance uuid c019f125-db84-4dc5-959e-117b53966ffa obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:11:10 np0005593234 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[348030]: [NOTICE]   (348034) : haproxy version is 2.8.14-c23fe91
Jan 23 06:11:10 np0005593234 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[348030]: [NOTICE]   (348034) : path to executable is /usr/sbin/haproxy
Jan 23 06:11:10 np0005593234 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[348030]: [WARNING]  (348034) : Exiting Master process...
Jan 23 06:11:10 np0005593234 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[348030]: [ALERT]    (348034) : Current worker (348036) exited with code 143 (Terminated)
Jan 23 06:11:10 np0005593234 neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24[348030]: [WARNING]  (348034) : All workers exited. Exiting... (0)
Jan 23 06:11:10 np0005593234 systemd[1]: libpod-cd65217a7b08e2bde2200f57c0498e869a4595c8841d729c9133ba3bfc69181d.scope: Deactivated successfully.
Jan 23 06:11:10 np0005593234 podman[348208]: 2026-01-23 11:11:10.800199225 +0000 UTC m=+0.049679805 container died cd65217a7b08e2bde2200f57c0498e869a4595c8841d729c9133ba3bfc69181d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:11:10 np0005593234 systemd[1]: var-lib-containers-storage-overlay-73c5b8ca4127e8578a0ef23e353939d206c8781ce35fb9e527cb20222434fd88-merged.mount: Deactivated successfully.
Jan 23 06:11:10 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cd65217a7b08e2bde2200f57c0498e869a4595c8841d729c9133ba3bfc69181d-userdata-shm.mount: Deactivated successfully.
Jan 23 06:11:10 np0005593234 podman[348208]: 2026-01-23 11:11:10.834672344 +0000 UTC m=+0.084152924 container cleanup cd65217a7b08e2bde2200f57c0498e869a4595c8841d729c9133ba3bfc69181d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:11:10 np0005593234 systemd[1]: libpod-conmon-cd65217a7b08e2bde2200f57c0498e869a4595c8841d729c9133ba3bfc69181d.scope: Deactivated successfully.
Jan 23 06:11:10 np0005593234 podman[348248]: 2026-01-23 11:11:10.899845294 +0000 UTC m=+0.042571493 container remove cd65217a7b08e2bde2200f57c0498e869a4595c8841d729c9133ba3bfc69181d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 06:11:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:10.911 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b5b1f3b9-c62c-42a6-ad06-0e879c27c56f]: (4, ('Fri Jan 23 11:11:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24 (cd65217a7b08e2bde2200f57c0498e869a4595c8841d729c9133ba3bfc69181d)\ncd65217a7b08e2bde2200f57c0498e869a4595c8841d729c9133ba3bfc69181d\nFri Jan 23 11:11:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24 (cd65217a7b08e2bde2200f57c0498e869a4595c8841d729c9133ba3bfc69181d)\ncd65217a7b08e2bde2200f57c0498e869a4595c8841d729c9133ba3bfc69181d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:10.914 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f0cb9018-3e8c-4099-87d6-b8a7a12bcee3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:10.916 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf4706ca2-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:11:10 np0005593234 kernel: tapf4706ca2-10: left promiscuous mode
Jan 23 06:11:10 np0005593234 nova_compute[227762]: 2026-01-23 11:11:10.919 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:10 np0005593234 nova_compute[227762]: 2026-01-23 11:11:10.938 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:10.941 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[8ee23d29-612a-4756-b79b-fa8917f85830]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:10.970 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c5a1ae50-1bb5-40ae-aa44-8f36be598331]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:10.972 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[96771046-9a4b-49ed-a7df-025c78aeecf2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:10.989 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6d65c397-86c5-45b0-be27-b40d273461a1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1054765, 'reachable_time': 35041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 348267, 'error': None, 'target': 'ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:10 np0005593234 systemd[1]: run-netns-ovnmeta\x2df4706ca2\x2d15b6\x2d4141\x2d8d7b\x2d8d4cab159f24.mount: Deactivated successfully.
Jan 23 06:11:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:10.993 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f4706ca2-15b6-4141-8d7b-8d4cab159f24 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:11:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:10.993 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[143a30b8-b4e7-4326-8884-2507799e529e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.026 227766 DEBUG nova.virt.libvirt.vif [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:10:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1989694612',display_name='tempest-TestServerMultinode-server-1989694612',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1989694612',id=222,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:10:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='d4691e06029a4b11bbda2856a451bd88',ramdisk_id='',reservation_id='r-i0szs682',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader,admin',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image
_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1152571872',owner_user_name='tempest-TestServerMultinode-1152571872-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:10:56Z,user_data=None,user_id='ac51edf400184ec0b11ee5acc335ff21',uuid=c019f125-db84-4dc5-959e-117b53966ffa,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "address": "fa:16:3e:ba:3c:02", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d39f5f-2a", "ovs_interfaceid": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.026 227766 DEBUG nova.network.os_vif_util [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Converting VIF {"id": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "address": "fa:16:3e:ba:3c:02", "network": {"id": "f4706ca2-15b6-4141-8d7b-8d4cab159f24", "bridge": "br-int", "label": "tempest-TestServerMultinode-1792921973-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "76608d1b79f84e2385a2dcadacaea9f3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb2d39f5f-2a", "ovs_interfaceid": "b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.027 227766 DEBUG nova.network.os_vif_util [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ba:3c:02,bridge_name='br-int',has_traffic_filtering=True,id=b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7,network=Network(f4706ca2-15b6-4141-8d7b-8d4cab159f24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d39f5f-2a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.028 227766 DEBUG os_vif [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:3c:02,bridge_name='br-int',has_traffic_filtering=True,id=b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7,network=Network(f4706ca2-15b6-4141-8d7b-8d4cab159f24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d39f5f-2a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.031 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.031 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2d39f5f-2a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.033 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.034 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.036 227766 INFO os_vif [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ba:3c:02,bridge_name='br-int',has_traffic_filtering=True,id=b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7,network=Network(f4706ca2-15b6-4141-8d7b-8d4cab159f24),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb2d39f5f-2a')#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.299 227766 DEBUG nova.compute.manager [req-6038f72b-d243-4206-a38a-900070fc7e76 req-24183f88-e9b1-4da5-9445-6977550b6967 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Received event network-vif-unplugged-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.300 227766 DEBUG oslo_concurrency.lockutils [req-6038f72b-d243-4206-a38a-900070fc7e76 req-24183f88-e9b1-4da5-9445-6977550b6967 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c019f125-db84-4dc5-959e-117b53966ffa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.300 227766 DEBUG oslo_concurrency.lockutils [req-6038f72b-d243-4206-a38a-900070fc7e76 req-24183f88-e9b1-4da5-9445-6977550b6967 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.300 227766 DEBUG oslo_concurrency.lockutils [req-6038f72b-d243-4206-a38a-900070fc7e76 req-24183f88-e9b1-4da5-9445-6977550b6967 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.300 227766 DEBUG nova.compute.manager [req-6038f72b-d243-4206-a38a-900070fc7e76 req-24183f88-e9b1-4da5-9445-6977550b6967 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] No waiting events found dispatching network-vif-unplugged-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.301 227766 DEBUG nova.compute.manager [req-6038f72b-d243-4206-a38a-900070fc7e76 req-24183f88-e9b1-4da5-9445-6977550b6967 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Received event network-vif-unplugged-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 06:11:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:11.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:11 np0005593234 nova_compute[227762]: 2026-01-23 11:11:11.846 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:12.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:13 np0005593234 nova_compute[227762]: 2026-01-23 11:11:13.409 227766 DEBUG nova.compute.manager [req-53b3bda7-ca4b-4534-8519-56bd0223c68a req-0bfaac2c-a18a-4d3b-bc00-98f3ac67f039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Received event network-vif-plugged-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:11:13 np0005593234 nova_compute[227762]: 2026-01-23 11:11:13.409 227766 DEBUG oslo_concurrency.lockutils [req-53b3bda7-ca4b-4534-8519-56bd0223c68a req-0bfaac2c-a18a-4d3b-bc00-98f3ac67f039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "c019f125-db84-4dc5-959e-117b53966ffa-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:13 np0005593234 nova_compute[227762]: 2026-01-23 11:11:13.409 227766 DEBUG oslo_concurrency.lockutils [req-53b3bda7-ca4b-4534-8519-56bd0223c68a req-0bfaac2c-a18a-4d3b-bc00-98f3ac67f039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:13 np0005593234 nova_compute[227762]: 2026-01-23 11:11:13.409 227766 DEBUG oslo_concurrency.lockutils [req-53b3bda7-ca4b-4534-8519-56bd0223c68a req-0bfaac2c-a18a-4d3b-bc00-98f3ac67f039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:13 np0005593234 nova_compute[227762]: 2026-01-23 11:11:13.409 227766 DEBUG nova.compute.manager [req-53b3bda7-ca4b-4534-8519-56bd0223c68a req-0bfaac2c-a18a-4d3b-bc00-98f3ac67f039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] No waiting events found dispatching network-vif-plugged-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:11:13 np0005593234 nova_compute[227762]: 2026-01-23 11:11:13.410 227766 WARNING nova.compute.manager [req-53b3bda7-ca4b-4534-8519-56bd0223c68a req-0bfaac2c-a18a-4d3b-bc00-98f3ac67f039 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Received unexpected event network-vif-plugged-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 for instance with vm_state active and task_state deleting.#033[00m
Jan 23 06:11:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:13.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:13 np0005593234 nova_compute[227762]: 2026-01-23 11:11:13.931 227766 INFO nova.virt.libvirt.driver [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Deleting instance files /var/lib/nova/instances/c019f125-db84-4dc5-959e-117b53966ffa_del#033[00m
Jan 23 06:11:13 np0005593234 nova_compute[227762]: 2026-01-23 11:11:13.932 227766 INFO nova.virt.libvirt.driver [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Deletion of /var/lib/nova/instances/c019f125-db84-4dc5-959e-117b53966ffa_del complete#033[00m
Jan 23 06:11:13 np0005593234 nova_compute[227762]: 2026-01-23 11:11:13.995 227766 INFO nova.compute.manager [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Took 3.65 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 06:11:13 np0005593234 nova_compute[227762]: 2026-01-23 11:11:13.996 227766 DEBUG oslo.service.loopingcall [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 06:11:13 np0005593234 nova_compute[227762]: 2026-01-23 11:11:13.997 227766 DEBUG nova.compute.manager [-] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 06:11:13 np0005593234 nova_compute[227762]: 2026-01-23 11:11:13.998 227766 DEBUG nova.network.neutron [-] [instance: c019f125-db84-4dc5-959e-117b53966ffa] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 06:11:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:11:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:14.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:11:15 np0005593234 nova_compute[227762]: 2026-01-23 11:11:15.109 227766 DEBUG nova.network.neutron [-] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:11:15 np0005593234 nova_compute[227762]: 2026-01-23 11:11:15.142 227766 INFO nova.compute.manager [-] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Took 1.14 seconds to deallocate network for instance.#033[00m
Jan 23 06:11:15 np0005593234 nova_compute[227762]: 2026-01-23 11:11:15.189 227766 DEBUG oslo_concurrency.lockutils [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:15 np0005593234 nova_compute[227762]: 2026-01-23 11:11:15.190 227766 DEBUG oslo_concurrency.lockutils [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:15 np0005593234 nova_compute[227762]: 2026-01-23 11:11:15.232 227766 DEBUG oslo_concurrency.processutils [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:11:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:15.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:15 np0005593234 nova_compute[227762]: 2026-01-23 11:11:15.572 227766 DEBUG nova.compute.manager [req-da73fdce-2911-4d7d-ac26-9fe9a569d8c5 req-934ace03-ef49-4089-ab1b-8a7447a35de5 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Received event network-vif-deleted-b2d39f5f-2a20-42b6-b800-9e9ef29c2ea7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:11:15 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:11:15 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4020103989' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:11:15 np0005593234 nova_compute[227762]: 2026-01-23 11:11:15.750 227766 DEBUG oslo_concurrency.processutils [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.517s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:11:15 np0005593234 nova_compute[227762]: 2026-01-23 11:11:15.756 227766 DEBUG nova.compute.provider_tree [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:11:15 np0005593234 nova_compute[227762]: 2026-01-23 11:11:15.773 227766 DEBUG nova.scheduler.client.report [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:11:15 np0005593234 nova_compute[227762]: 2026-01-23 11:11:15.792 227766 DEBUG oslo_concurrency.lockutils [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.602s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:15 np0005593234 nova_compute[227762]: 2026-01-23 11:11:15.821 227766 INFO nova.scheduler.client.report [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Deleted allocations for instance c019f125-db84-4dc5-959e-117b53966ffa#033[00m
Jan 23 06:11:15 np0005593234 nova_compute[227762]: 2026-01-23 11:11:15.885 227766 DEBUG oslo_concurrency.lockutils [None req-2935e187-846f-4877-9adc-054043f1eaae ac51edf400184ec0b11ee5acc335ff21 d4691e06029a4b11bbda2856a451bd88 - - default default] Lock "c019f125-db84-4dc5-959e-117b53966ffa" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.539s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:16 np0005593234 nova_compute[227762]: 2026-01-23 11:11:16.032 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:11:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:16.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:11:16 np0005593234 nova_compute[227762]: 2026-01-23 11:11:16.848 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:17.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:18.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:19.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:11:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:20.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:11:20 np0005593234 podman[348364]: 2026-01-23 11:11:20.778231966 +0000 UTC m=+0.058676068 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:11:20 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:11:20 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2232437038' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:11:21 np0005593234 nova_compute[227762]: 2026-01-23 11:11:21.034 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:21.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:21 np0005593234 nova_compute[227762]: 2026-01-23 11:11:21.851 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:22.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:23.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:24.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:25.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:25 np0005593234 nova_compute[227762]: 2026-01-23 11:11:25.782 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769166670.7808628, c019f125-db84-4dc5-959e-117b53966ffa => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:11:25 np0005593234 nova_compute[227762]: 2026-01-23 11:11:25.782 227766 INFO nova.compute.manager [-] [instance: c019f125-db84-4dc5-959e-117b53966ffa] VM Stopped (Lifecycle Event)#033[00m
Jan 23 06:11:25 np0005593234 nova_compute[227762]: 2026-01-23 11:11:25.802 227766 DEBUG nova.compute.manager [None req-045dd98b-b5b5-4e1b-a34c-cffb061657fd - - - - - -] [instance: c019f125-db84-4dc5-959e-117b53966ffa] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:11:26 np0005593234 nova_compute[227762]: 2026-01-23 11:11:26.036 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:26.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:26 np0005593234 nova_compute[227762]: 2026-01-23 11:11:26.853 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:27.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:11:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:28.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:11:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:29.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:29 np0005593234 podman[348387]: 2026-01-23 11:11:29.794382653 +0000 UTC m=+0.092149385 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 06:11:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:11:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:30.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:11:30 np0005593234 nova_compute[227762]: 2026-01-23 11:11:30.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:30 np0005593234 nova_compute[227762]: 2026-01-23 11:11:30.785 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:30 np0005593234 nova_compute[227762]: 2026-01-23 11:11:30.785 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:30 np0005593234 nova_compute[227762]: 2026-01-23 11:11:30.785 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:30 np0005593234 nova_compute[227762]: 2026-01-23 11:11:30.786 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:11:30 np0005593234 nova_compute[227762]: 2026-01-23 11:11:30.786 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:11:31 np0005593234 nova_compute[227762]: 2026-01-23 11:11:31.038 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:11:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1539788190' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:11:31 np0005593234 nova_compute[227762]: 2026-01-23 11:11:31.204 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.418s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:11:31 np0005593234 nova_compute[227762]: 2026-01-23 11:11:31.357 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:11:31 np0005593234 nova_compute[227762]: 2026-01-23 11:11:31.358 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4110MB free_disk=20.98827362060547GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:11:31 np0005593234 nova_compute[227762]: 2026-01-23 11:11:31.359 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:31 np0005593234 nova_compute[227762]: 2026-01-23 11:11:31.359 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:31.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:31 np0005593234 nova_compute[227762]: 2026-01-23 11:11:31.502 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:11:31 np0005593234 nova_compute[227762]: 2026-01-23 11:11:31.503 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:11:31 np0005593234 nova_compute[227762]: 2026-01-23 11:11:31.521 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:11:31 np0005593234 nova_compute[227762]: 2026-01-23 11:11:31.855 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:11:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3013792844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:11:31 np0005593234 nova_compute[227762]: 2026-01-23 11:11:31.955 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:11:31 np0005593234 nova_compute[227762]: 2026-01-23 11:11:31.961 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:11:32 np0005593234 nova_compute[227762]: 2026-01-23 11:11:32.041 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:11:32 np0005593234 nova_compute[227762]: 2026-01-23 11:11:32.062 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:11:32 np0005593234 nova_compute[227762]: 2026-01-23 11:11:32.062 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.703s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:11:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:32.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:11:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:33 np0005593234 nova_compute[227762]: 2026-01-23 11:11:33.063 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:11:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:33.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:11:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:34.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:35 np0005593234 nova_compute[227762]: 2026-01-23 11:11:35.061 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:35 np0005593234 nova_compute[227762]: 2026-01-23 11:11:35.062 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:11:35 np0005593234 nova_compute[227762]: 2026-01-23 11:11:35.062 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:11:35 np0005593234 nova_compute[227762]: 2026-01-23 11:11:35.076 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:11:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:35.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:36 np0005593234 nova_compute[227762]: 2026-01-23 11:11:36.041 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:36.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:36 np0005593234 nova_compute[227762]: 2026-01-23 11:11:36.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:36 np0005593234 nova_compute[227762]: 2026-01-23 11:11:36.896 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:37.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:38.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:39.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:40.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:40 np0005593234 nova_compute[227762]: 2026-01-23 11:11:40.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:41 np0005593234 nova_compute[227762]: 2026-01-23 11:11:41.042 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:41.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:41 np0005593234 nova_compute[227762]: 2026-01-23 11:11:41.898 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:11:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:42.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:11:42 np0005593234 nova_compute[227762]: 2026-01-23 11:11:42.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:42.921 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:11:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:42.921 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:11:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:11:42.921 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:11:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:43.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #217. Immutable memtables: 0.
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.188041) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 139] Flushing memtable with next log file: 217
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166704188163, "job": 139, "event": "flush_started", "num_memtables": 1, "num_entries": 2359, "num_deletes": 251, "total_data_size": 5854837, "memory_usage": 5924344, "flush_reason": "Manual Compaction"}
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 139] Level-0 flush table #218: started
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166704212017, "cf_name": "default", "job": 139, "event": "table_file_creation", "file_number": 218, "file_size": 3844408, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 102370, "largest_seqno": 104724, "table_properties": {"data_size": 3834799, "index_size": 6102, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2437, "raw_key_size": 19802, "raw_average_key_size": 20, "raw_value_size": 3815687, "raw_average_value_size": 3941, "num_data_blocks": 265, "num_entries": 968, "num_filter_entries": 968, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166469, "oldest_key_time": 1769166469, "file_creation_time": 1769166704, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 218, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 139] Flush lasted 24138 microseconds, and 11409 cpu microseconds.
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.212192) [db/flush_job.cc:967] [default] [JOB 139] Level-0 flush table #218: 3844408 bytes OK
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.212225) [db/memtable_list.cc:519] [default] Level-0 commit table #218 started
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.214853) [db/memtable_list.cc:722] [default] Level-0 commit table #218: memtable #1 done
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.214944) EVENT_LOG_v1 {"time_micros": 1769166704214924, "job": 139, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.214986) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 139] Try to delete WAL files size 5844385, prev total WAL file size 5844385, number of live WAL files 2.
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000214.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.218082) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730039323837' seq:72057594037927935, type:22 .. '7061786F730039353339' seq:0, type:0; will stop at (end)
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 140] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 139 Base level 0, inputs: [218(3754KB)], [216(11MB)]
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166704218702, "job": 140, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [218], "files_L6": [216], "score": -1, "input_data_size": 16035465, "oldest_snapshot_seqno": -1}
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 140] Generated table #219: 11994 keys, 14002919 bytes, temperature: kUnknown
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166704314540, "cf_name": "default", "job": 140, "event": "table_file_creation", "file_number": 219, "file_size": 14002919, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13928144, "index_size": 43686, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30021, "raw_key_size": 317145, "raw_average_key_size": 26, "raw_value_size": 13721441, "raw_average_value_size": 1144, "num_data_blocks": 1651, "num_entries": 11994, "num_filter_entries": 11994, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769166704, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 219, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.314916) [db/compaction/compaction_job.cc:1663] [default] [JOB 140] Compacted 1@0 + 1@6 files to L6 => 14002919 bytes
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.316531) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 167.0 rd, 145.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.7, 11.6 +0.0 blob) out(13.4 +0.0 blob), read-write-amplify(7.8) write-amplify(3.6) OK, records in: 12511, records dropped: 517 output_compression: NoCompression
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.316547) EVENT_LOG_v1 {"time_micros": 1769166704316539, "job": 140, "event": "compaction_finished", "compaction_time_micros": 96007, "compaction_time_cpu_micros": 62892, "output_level": 6, "num_output_files": 1, "total_output_size": 14002919, "num_input_records": 12511, "num_output_records": 11994, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000218.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166704317331, "job": 140, "event": "table_file_deletion", "file_number": 218}
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000216.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166704319523, "job": 140, "event": "table_file_deletion", "file_number": 216}
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.217829) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.319690) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.319695) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.319696) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.319698) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:11:44.319700) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:11:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:44.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3105557347' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:11:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3105557347' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:11:44 np0005593234 nova_compute[227762]: 2026-01-23 11:11:44.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:44 np0005593234 nova_compute[227762]: 2026-01-23 11:11:44.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:44 np0005593234 nova_compute[227762]: 2026-01-23 11:11:44.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:11:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:11:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:45.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:11:46 np0005593234 nova_compute[227762]: 2026-01-23 11:11:46.044 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:46.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:46 np0005593234 nova_compute[227762]: 2026-01-23 11:11:46.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:11:46 np0005593234 nova_compute[227762]: 2026-01-23 11:11:46.901 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:47.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:48.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:49.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:50.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:51 np0005593234 nova_compute[227762]: 2026-01-23 11:11:51.046 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:51.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:51 np0005593234 podman[348521]: 2026-01-23 11:11:51.767312785 +0000 UTC m=+0.057827271 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:11:51 np0005593234 nova_compute[227762]: 2026-01-23 11:11:51.903 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:11:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:52.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:11:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:53.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:11:54 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.0 total, 600.0 interval#012Cumulative writes: 20K writes, 104K keys, 20K commit groups, 1.0 writes per commit group, ingest: 0.21 GB, 0.03 MB/s#012Cumulative WAL: 20K writes, 20K syncs, 1.00 writes per sync, written: 0.21 GB, 0.03 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 1421 writes, 6970 keys, 1421 commit groups, 1.0 writes per commit group, ingest: 15.32 MB, 0.03 MB/s#012Interval WAL: 1421 writes, 1421 syncs, 1.00 writes per sync, written: 0.01 GB, 0.03 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent#012#012** Compaction Stats [default] **#012Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012  L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     49.1      2.70              0.40        70    0.039       0      0       0.0       0.0#012  L6      1/0   13.35 MB   0.0      0.8     0.1      0.7       0.7      0.0       0.0   5.5    122.2    105.0      6.92              2.27        69    0.100    576K    36K       0.0       0.0#012 Sum      1/0   13.35 MB   0.0      0.8     0.1      0.7       0.8      0.1       0.0   6.5     87.9     89.3      9.61              2.67       139    0.069    576K    36K       0.0       0.0#012 Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.0    121.6    124.0      0.62              0.23        10    0.062     61K   2603       0.0       0.0#012#012** Compaction Stats [default] **#012Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)#012---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------#012 Low      0/0    0.00 KB   0.0      0.8     0.1      0.7       0.7      0.0       0.0   0.0    122.2    105.0      6.92              2.27        69    0.100    576K    36K       0.0       0.0#012High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     49.1      2.69              0.40        69    0.039       0      0       0.0       0.0#012User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0#012#012Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0#012#012Uptime(secs): 7800.0 total, 600.0 interval#012Flush(GB): cumulative 0.129, interval 0.011#012AddFile(GB): cumulative 0.000, interval 0.000#012AddFile(Total Files): cumulative 0, interval 0#012AddFile(L0 Files): cumulative 0, interval 0#012AddFile(Keys): cumulative 0, interval 0#012Cumulative compaction: 0.84 GB write, 0.11 MB/s write, 0.83 GB read, 0.11 MB/s read, 9.6 seconds#012Interval compaction: 0.08 GB write, 0.13 MB/s write, 0.07 GB read, 0.13 MB/s read, 0.6 seconds#012Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count#012Block cache BinnedLRUCache@0x55f10fc171f0#2 capacity: 304.00 MB usage: 89.97 MB table_size: 0 occupancy: 18446744073709551615 collections: 14 last_copies: 0 last_secs: 0.000451 secs_since: 0#012Block cache entry stats(count,size,portion): DataBlock(5130,85.96 MB,28.2763%) FilterBlock(139,1.57 MB,0.516405%) IndexBlock(139,2.44 MB,0.80322%) Misc(1,0.00 KB,0%)#012#012** File Read Latency Histogram By Level [default] **
Jan 23 06:11:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:54.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:55.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:56 np0005593234 nova_compute[227762]: 2026-01-23 11:11:56.047 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:56.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:56 np0005593234 nova_compute[227762]: 2026-01-23 11:11:56.949 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:11:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:11:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:57.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:11:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:11:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:11:58.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:11:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:11:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:11:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:11:59.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:12:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:00.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:12:00 np0005593234 podman[348595]: 2026-01-23 11:12:00.818657705 +0000 UTC m=+0.118694706 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:12:01 np0005593234 nova_compute[227762]: 2026-01-23 11:12:01.048 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:01.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:01 np0005593234 nova_compute[227762]: 2026-01-23 11:12:01.951 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:02.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:03.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:12:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:04.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:12:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:12:04 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:12:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:05.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:06 np0005593234 nova_compute[227762]: 2026-01-23 11:12:06.049 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:12:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:12:06 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:12:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:06.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:06 np0005593234 nova_compute[227762]: 2026-01-23 11:12:06.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:06 np0005593234 nova_compute[227762]: 2026-01-23 11:12:06.993 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:07 np0005593234 ovn_controller[134547]: 2026-01-23T11:12:07Z|01031|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Jan 23 06:12:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:07.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:08.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:09.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:10.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:11 np0005593234 nova_compute[227762]: 2026-01-23 11:12:11.051 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:12:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:11.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:12:11 np0005593234 nova_compute[227762]: 2026-01-23 11:12:11.995 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:12:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:12.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:12:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:13.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:14.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:15 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:12:15 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:12:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:15.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:15 np0005593234 ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-crash-compute-2[77801]: ERROR:ceph-crash:Error scraping /var/lib/ceph/crash: [Errno 13] Permission denied: '/var/lib/ceph/crash'
Jan 23 06:12:16 np0005593234 nova_compute[227762]: 2026-01-23 11:12:16.096 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:16.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:16 np0005593234 nova_compute[227762]: 2026-01-23 11:12:16.996 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:17.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:12:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:18.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:12:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:12:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:19.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:12:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:12:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:20.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:12:21 np0005593234 nova_compute[227762]: 2026-01-23 11:12:21.098 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:21.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:12:21.719 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=104, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=103) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:12:21 np0005593234 nova_compute[227762]: 2026-01-23 11:12:21.719 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:21 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:12:21.720 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:12:22 np0005593234 nova_compute[227762]: 2026-01-23 11:12:21.999 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:22.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:22 np0005593234 podman[348865]: 2026-01-23 11:12:22.786342084 +0000 UTC m=+0.069602089 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 06:12:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:23.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:24.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:25.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:26 np0005593234 nova_compute[227762]: 2026-01-23 11:12:26.116 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:26.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:27 np0005593234 nova_compute[227762]: 2026-01-23 11:12:27.000 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:27.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:28.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:12:28.722 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '104'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:12:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:12:29 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.5 total, 600.0 interval#012Cumulative writes: 80K writes, 319K keys, 80K commit groups, 1.0 writes per commit group, ingest: 0.32 GB, 0.04 MB/s#012Cumulative WAL: 80K writes, 29K syncs, 2.68 writes per sync, written: 0.32 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3146 writes, 12K keys, 3146 commit groups, 1.0 writes per commit group, ingest: 12.82 MB, 0.02 MB/s#012Interval WAL: 3146 writes, 1219 syncs, 2.58 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:12:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:29.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:30.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:30 np0005593234 nova_compute[227762]: 2026-01-23 11:12:30.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:30 np0005593234 nova_compute[227762]: 2026-01-23 11:12:30.771 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:12:30 np0005593234 nova_compute[227762]: 2026-01-23 11:12:30.772 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:12:30 np0005593234 nova_compute[227762]: 2026-01-23 11:12:30.772 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:12:30 np0005593234 nova_compute[227762]: 2026-01-23 11:12:30.772 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:12:30 np0005593234 nova_compute[227762]: 2026-01-23 11:12:30.772 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:12:31 np0005593234 nova_compute[227762]: 2026-01-23 11:12:31.120 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:12:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3330837087' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:12:31 np0005593234 nova_compute[227762]: 2026-01-23 11:12:31.202 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:12:31 np0005593234 nova_compute[227762]: 2026-01-23 11:12:31.370 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:12:31 np0005593234 nova_compute[227762]: 2026-01-23 11:12:31.373 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4120MB free_disk=20.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:12:31 np0005593234 nova_compute[227762]: 2026-01-23 11:12:31.373 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:12:31 np0005593234 nova_compute[227762]: 2026-01-23 11:12:31.374 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:12:31 np0005593234 nova_compute[227762]: 2026-01-23 11:12:31.483 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:12:31 np0005593234 nova_compute[227762]: 2026-01-23 11:12:31.484 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:12:31 np0005593234 nova_compute[227762]: 2026-01-23 11:12:31.510 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:12:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:31.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:31 np0005593234 podman[348930]: 2026-01-23 11:12:31.792453529 +0000 UTC m=+0.088805860 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller)
Jan 23 06:12:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:12:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3989685976' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:12:31 np0005593234 nova_compute[227762]: 2026-01-23 11:12:31.978 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:12:31 np0005593234 nova_compute[227762]: 2026-01-23 11:12:31.985 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:12:32 np0005593234 nova_compute[227762]: 2026-01-23 11:12:31.999 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:12:32 np0005593234 nova_compute[227762]: 2026-01-23 11:12:32.001 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:12:32 np0005593234 nova_compute[227762]: 2026-01-23 11:12:32.001 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:12:32 np0005593234 nova_compute[227762]: 2026-01-23 11:12:32.007 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:32.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:33.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:12:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:34.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:12:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:35.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:36 np0005593234 nova_compute[227762]: 2026-01-23 11:12:36.001 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:36 np0005593234 nova_compute[227762]: 2026-01-23 11:12:36.001 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:12:36 np0005593234 nova_compute[227762]: 2026-01-23 11:12:36.001 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:12:36 np0005593234 nova_compute[227762]: 2026-01-23 11:12:36.022 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:12:36 np0005593234 nova_compute[227762]: 2026-01-23 11:12:36.123 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:36.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:37 np0005593234 nova_compute[227762]: 2026-01-23 11:12:37.010 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:37.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:12:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:38.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:12:38 np0005593234 nova_compute[227762]: 2026-01-23 11:12:38.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:39.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:12:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:40.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:12:40 np0005593234 nova_compute[227762]: 2026-01-23 11:12:40.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:41 np0005593234 nova_compute[227762]: 2026-01-23 11:12:41.127 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:41.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:42 np0005593234 nova_compute[227762]: 2026-01-23 11:12:42.070 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:12:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:42.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:12:42 np0005593234 nova_compute[227762]: 2026-01-23 11:12:42.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:12:42.923 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:12:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:12:42.923 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:12:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:12:42.923 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:12:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:12:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:43.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:12:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:44.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:44 np0005593234 nova_compute[227762]: 2026-01-23 11:12:44.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:44 np0005593234 nova_compute[227762]: 2026-01-23 11:12:44.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:12:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:12:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3681091467' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:12:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:12:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3681091467' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:12:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:12:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:45.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:12:46 np0005593234 nova_compute[227762]: 2026-01-23 11:12:46.131 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:46.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:46 np0005593234 nova_compute[227762]: 2026-01-23 11:12:46.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:47 np0005593234 nova_compute[227762]: 2026-01-23 11:12:47.117 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:12:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:47.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:12:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:48.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:48 np0005593234 nova_compute[227762]: 2026-01-23 11:12:48.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:12:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:49.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:12:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:50.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:51 np0005593234 nova_compute[227762]: 2026-01-23 11:12:51.133 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:51.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:51 np0005593234 nova_compute[227762]: 2026-01-23 11:12:51.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:12:52 np0005593234 nova_compute[227762]: 2026-01-23 11:12:52.121 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:12:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:52.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:12:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:12:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:53.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:12:53 np0005593234 podman[349021]: 2026-01-23 11:12:53.787917966 +0000 UTC m=+0.078077544 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 06:12:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:12:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:54.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:12:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:55.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:56 np0005593234 nova_compute[227762]: 2026-01-23 11:12:56.137 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:56.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:57 np0005593234 nova_compute[227762]: 2026-01-23 11:12:57.121 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:12:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:57.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:12:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:12:58.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:12:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:12:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:12:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:12:59.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:00.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:01 np0005593234 nova_compute[227762]: 2026-01-23 11:13:01.141 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:01.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:02 np0005593234 nova_compute[227762]: 2026-01-23 11:13:02.123 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:02.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:02 np0005593234 podman[349095]: 2026-01-23 11:13:02.81480809 +0000 UTC m=+0.106475093 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, 
config_id=ovn_controller, container_name=ovn_controller)
Jan 23 06:13:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e427 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:03.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:04.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:05.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e428 e428: 3 total, 3 up, 3 in
Jan 23 06:13:06 np0005593234 nova_compute[227762]: 2026-01-23 11:13:06.145 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:06.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:07 np0005593234 nova_compute[227762]: 2026-01-23 11:13:07.125 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e429 e429: 3 total, 3 up, 3 in
Jan 23 06:13:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:07.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e429 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e430 e430: 3 total, 3 up, 3 in
Jan 23 06:13:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:08.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:08 np0005593234 nova_compute[227762]: 2026-01-23 11:13:08.777 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:13:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:09.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:13:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:10.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:11 np0005593234 nova_compute[227762]: 2026-01-23 11:13:11.149 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:11.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:12 np0005593234 nova_compute[227762]: 2026-01-23 11:13:12.126 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:13:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:12.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:13:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e430 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:13:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:13.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:13:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:13:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:14.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:13:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:15.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:16 np0005593234 nova_compute[227762]: 2026-01-23 11:13:16.153 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:16.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e431 e431: 3 total, 3 up, 3 in
Jan 23 06:13:17 np0005593234 nova_compute[227762]: 2026-01-23 11:13:17.128 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:17.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:13:17 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:13:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e431 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:13:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:18.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:13:18 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:13:18 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:13:18 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:13:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:19.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:20.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:21 np0005593234 nova_compute[227762]: 2026-01-23 11:13:21.156 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:21.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:22 np0005593234 nova_compute[227762]: 2026-01-23 11:13:22.130 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:22.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e431 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:23.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:13:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:24.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:13:24 np0005593234 podman[349316]: 2026-01-23 11:13:24.799301935 +0000 UTC m=+0.082525404 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:13:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:25.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:26 np0005593234 nova_compute[227762]: 2026-01-23 11:13:26.157 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "d4c75524-52b8-4c2b-b0cb-18d94089013b" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:13:26 np0005593234 nova_compute[227762]: 2026-01-23 11:13:26.158 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:13:26 np0005593234 nova_compute[227762]: 2026-01-23 11:13:26.158 227766 INFO nova.compute.manager [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Unshelving#033[00m
Jan 23 06:13:26 np0005593234 nova_compute[227762]: 2026-01-23 11:13:26.160 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:26 np0005593234 nova_compute[227762]: 2026-01-23 11:13:26.304 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:13:26 np0005593234 nova_compute[227762]: 2026-01-23 11:13:26.305 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:13:26 np0005593234 nova_compute[227762]: 2026-01-23 11:13:26.315 227766 DEBUG nova.objects.instance [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'pci_requests' on Instance uuid d4c75524-52b8-4c2b-b0cb-18d94089013b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:13:26 np0005593234 nova_compute[227762]: 2026-01-23 11:13:26.338 227766 DEBUG nova.objects.instance [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'numa_topology' on Instance uuid d4c75524-52b8-4c2b-b0cb-18d94089013b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:13:26 np0005593234 nova_compute[227762]: 2026-01-23 11:13:26.352 227766 DEBUG nova.virt.hardware [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 06:13:26 np0005593234 nova_compute[227762]: 2026-01-23 11:13:26.353 227766 INFO nova.compute.claims [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 06:13:26 np0005593234 nova_compute[227762]: 2026-01-23 11:13:26.460 227766 DEBUG oslo_concurrency.processutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:13:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:26.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:13:26 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1151010546' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:13:26 np0005593234 nova_compute[227762]: 2026-01-23 11:13:26.947 227766 DEBUG oslo_concurrency.processutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:13:26 np0005593234 nova_compute[227762]: 2026-01-23 11:13:26.959 227766 DEBUG nova.compute.provider_tree [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:13:26 np0005593234 nova_compute[227762]: 2026-01-23 11:13:26.978 227766 DEBUG nova.scheduler.client.report [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:13:27 np0005593234 nova_compute[227762]: 2026-01-23 11:13:27.009 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:13:27 np0005593234 nova_compute[227762]: 2026-01-23 11:13:27.177 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:27 np0005593234 nova_compute[227762]: 2026-01-23 11:13:27.325 227766 INFO nova.network.neutron [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Updating port 56bb5fc8-f112-47c7-84d3-d47e53c4d481 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Jan 23 06:13:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:13:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:27.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:13:27 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:13:27 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:13:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e431 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:28.151 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=105, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=104) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:13:28 np0005593234 nova_compute[227762]: 2026-01-23 11:13:28.153 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:28 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:28.154 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:13:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:13:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:28.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:13:28 np0005593234 nova_compute[227762]: 2026-01-23 11:13:28.884 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "refresh_cache-d4c75524-52b8-4c2b-b0cb-18d94089013b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:13:28 np0005593234 nova_compute[227762]: 2026-01-23 11:13:28.885 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquired lock "refresh_cache-d4c75524-52b8-4c2b-b0cb-18d94089013b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:13:28 np0005593234 nova_compute[227762]: 2026-01-23 11:13:28.885 227766 DEBUG nova.network.neutron [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:13:28 np0005593234 nova_compute[227762]: 2026-01-23 11:13:28.993 227766 DEBUG nova.compute.manager [req-b21e257f-3aeb-4460-ac32-5620e9a60bb5 req-1a169919-21a7-42c6-926a-5f5bbef81afc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Received event network-changed-56bb5fc8-f112-47c7-84d3-d47e53c4d481 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:13:28 np0005593234 nova_compute[227762]: 2026-01-23 11:13:28.993 227766 DEBUG nova.compute.manager [req-b21e257f-3aeb-4460-ac32-5620e9a60bb5 req-1a169919-21a7-42c6-926a-5f5bbef81afc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Refreshing instance network info cache due to event network-changed-56bb5fc8-f112-47c7-84d3-d47e53c4d481. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:13:28 np0005593234 nova_compute[227762]: 2026-01-23 11:13:28.994 227766 DEBUG oslo_concurrency.lockutils [req-b21e257f-3aeb-4460-ac32-5620e9a60bb5 req-1a169919-21a7-42c6-926a-5f5bbef81afc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-d4c75524-52b8-4c2b-b0cb-18d94089013b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:13:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:29.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.388 227766 DEBUG nova.network.neutron [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Updating instance_info_cache with network_info: [{"id": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "address": "fa:16:3e:be:cd:e8", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56bb5fc8-f1", "ovs_interfaceid": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.407 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Releasing lock "refresh_cache-d4c75524-52b8-4c2b-b0cb-18d94089013b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.410 227766 DEBUG nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.411 227766 INFO nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Creating image(s)#033[00m
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.459 227766 DEBUG nova.storage.rbd_utils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] rbd image d4c75524-52b8-4c2b-b0cb-18d94089013b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.465 227766 DEBUG nova.objects.instance [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'trusted_certs' on Instance uuid d4c75524-52b8-4c2b-b0cb-18d94089013b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.467 227766 DEBUG oslo_concurrency.lockutils [req-b21e257f-3aeb-4460-ac32-5620e9a60bb5 req-1a169919-21a7-42c6-926a-5f5bbef81afc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-d4c75524-52b8-4c2b-b0cb-18d94089013b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.468 227766 DEBUG nova.network.neutron [req-b21e257f-3aeb-4460-ac32-5620e9a60bb5 req-1a169919-21a7-42c6-926a-5f5bbef81afc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Refreshing network info cache for port 56bb5fc8-f112-47c7-84d3-d47e53c4d481 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.518 227766 DEBUG nova.storage.rbd_utils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] rbd image d4c75524-52b8-4c2b-b0cb-18d94089013b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.552 227766 DEBUG nova.storage.rbd_utils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] rbd image d4c75524-52b8-4c2b-b0cb-18d94089013b_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.557 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "d0af2d2d201673293151bf18ae267bcbc1a3cc63" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.558 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "d0af2d2d201673293151bf18ae267bcbc1a3cc63" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:13:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:30.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.765 227766 DEBUG nova.virt.libvirt.imagebackend [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Image locations are: [{'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/7da2f187-f7de-4714-b817-454a50a6b19a/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/7da2f187-f7de-4714-b817-454a50a6b19a/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085#033[00m
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.815 227766 DEBUG nova.virt.libvirt.imagebackend [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Selected location: {'url': 'rbd://e1533653-0a5a-584c-b34b-8689f0d32e77/images/7da2f187-f7de-4714-b817-454a50a6b19a/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094#033[00m
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.816 227766 DEBUG nova.storage.rbd_utils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] cloning images/7da2f187-f7de-4714-b817-454a50a6b19a@snap to None/d4c75524-52b8-4c2b-b0cb-18d94089013b_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261#033[00m
Jan 23 06:13:30 np0005593234 nova_compute[227762]: 2026-01-23 11:13:30.950 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "d0af2d2d201673293151bf18ae267bcbc1a3cc63" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.091 227766 DEBUG nova.objects.instance [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'migration_context' on Instance uuid d4c75524-52b8-4c2b-b0cb-18d94089013b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.164 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.171 227766 DEBUG nova.storage.rbd_utils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] flattening vms/d4c75524-52b8-4c2b-b0cb-18d94089013b_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314#033[00m
Jan 23 06:13:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:13:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:31.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.698 227766 DEBUG nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Image rbd:vms/d4c75524-52b8-4c2b-b0cb-18d94089013b_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.699 227766 DEBUG nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.700 227766 DEBUG nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Ensure instance console log exists: /var/lib/nova/instances/d4c75524-52b8-4c2b-b0cb-18d94089013b/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.700 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.701 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.701 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.705 227766 DEBUG nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Start _get_guest_xml network_info=[{"id": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "address": "fa:16:3e:be:cd:e8", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56bb5fc8-f1", "ovs_interfaceid": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-01-23T11:12:59Z,direct_url=<?>,disk_format='raw',id=7da2f187-f7de-4714-b817-454a50a6b19a,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-926357377-shelved',owner='3a245f7970f14fffa60af2ff972b4bfd',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-23T11:13:10Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'encryption_options': None, 'guest_format': None, 'encryption_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'size': 0, 'encrypted': False, 'encryption_secret_uuid': None, 'image_id': '84c0ef19-7f67-4bd3-95d8-507c3e0942ed'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.711 227766 WARNING nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.718 227766 DEBUG nova.virt.libvirt.host [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.719 227766 DEBUG nova.virt.libvirt.host [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.722 227766 DEBUG nova.virt.libvirt.host [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.723 227766 DEBUG nova.virt.libvirt.host [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.725 227766 DEBUG nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.726 227766 DEBUG nova.virt.hardware [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-01-23T11:12:59Z,direct_url=<?>,disk_format='raw',id=7da2f187-f7de-4714-b817-454a50a6b19a,min_disk=1,min_ram=0,name='tempest-TestShelveInstance-server-926357377-shelved',owner='3a245f7970f14fffa60af2ff972b4bfd',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-01-23T11:13:10Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.727 227766 DEBUG nova.virt.hardware [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.727 227766 DEBUG nova.virt.hardware [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.727 227766 DEBUG nova.virt.hardware [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.728 227766 DEBUG nova.virt.hardware [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.728 227766 DEBUG nova.virt.hardware [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.729 227766 DEBUG nova.virt.hardware [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.729 227766 DEBUG nova.virt.hardware [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.730 227766 DEBUG nova.virt.hardware [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.730 227766 DEBUG nova.virt.hardware [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.731 227766 DEBUG nova.virt.hardware [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.731 227766 DEBUG nova.objects.instance [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'vcpu_model' on Instance uuid d4c75524-52b8-4c2b-b0cb-18d94089013b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #220. Immutable memtables: 0.
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.737007) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 141] Flushing memtable with next log file: 220
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166811737088, "job": 141, "event": "flush_started", "num_memtables": 1, "num_entries": 1362, "num_deletes": 256, "total_data_size": 2999251, "memory_usage": 3044760, "flush_reason": "Manual Compaction"}
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 141] Level-0 flush table #221: started
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166811751676, "cf_name": "default", "job": 141, "event": "table_file_creation", "file_number": 221, "file_size": 1958201, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 104729, "largest_seqno": 106086, "table_properties": {"data_size": 1952288, "index_size": 3243, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12748, "raw_average_key_size": 19, "raw_value_size": 1940261, "raw_average_value_size": 3041, "num_data_blocks": 141, "num_entries": 638, "num_filter_entries": 638, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166704, "oldest_key_time": 1769166704, "file_creation_time": 1769166811, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 221, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 141] Flush lasted 14762 microseconds, and 7018 cpu microseconds.
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.751766) [db/flush_job.cc:967] [default] [JOB 141] Level-0 flush table #221: 1958201 bytes OK
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.751812) [db/memtable_list.cc:519] [default] Level-0 commit table #221 started
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.754319) [db/memtable_list.cc:722] [default] Level-0 commit table #221: memtable #1 done
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.754344) EVENT_LOG_v1 {"time_micros": 1769166811754337, "job": 141, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.754373) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 141] Try to delete WAL files size 2992773, prev total WAL file size 3008217, number of live WAL files 2.
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000217.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.755992) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323733' seq:72057594037927935, type:22 .. '6C6F676D0034353234' seq:0, type:0; will stop at (end)
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 142] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 141 Base level 0, inputs: [221(1912KB)], [219(13MB)]
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166811756052, "job": 142, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [221], "files_L6": [219], "score": -1, "input_data_size": 15961120, "oldest_snapshot_seqno": -1}
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.764 227766 DEBUG oslo_concurrency.processutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 142] Generated table #222: 12101 keys, 15831107 bytes, temperature: kUnknown
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166811846308, "cf_name": "default", "job": 142, "event": "table_file_creation", "file_number": 222, "file_size": 15831107, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15753363, "index_size": 46399, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30277, "raw_key_size": 320438, "raw_average_key_size": 26, "raw_value_size": 15542533, "raw_average_value_size": 1284, "num_data_blocks": 1762, "num_entries": 12101, "num_filter_entries": 12101, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769166811, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 222, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.846584) [db/compaction/compaction_job.cc:1663] [default] [JOB 142] Compacted 1@0 + 1@6 files to L6 => 15831107 bytes
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.848354) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 176.9 rd, 175.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 13.4 +0.0 blob) out(15.1 +0.0 blob), read-write-amplify(16.2) write-amplify(8.1) OK, records in: 12632, records dropped: 531 output_compression: NoCompression
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.848375) EVENT_LOG_v1 {"time_micros": 1769166811848364, "job": 142, "event": "compaction_finished", "compaction_time_micros": 90211, "compaction_time_cpu_micros": 39029, "output_level": 6, "num_output_files": 1, "total_output_size": 15831107, "num_input_records": 12632, "num_output_records": 12101, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000221.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166811848841, "job": 142, "event": "table_file_deletion", "file_number": 221}
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000219.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166811852057, "job": 142, "event": "table_file_deletion", "file_number": 219}
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.755899) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.852212) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.852224) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.852228) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.852232) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:13:31 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:13:31.852236) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.972 227766 DEBUG nova.network.neutron [req-b21e257f-3aeb-4460-ac32-5620e9a60bb5 req-1a169919-21a7-42c6-926a-5f5bbef81afc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Updated VIF entry in instance network info cache for port 56bb5fc8-f112-47c7-84d3-d47e53c4d481. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.973 227766 DEBUG nova.network.neutron [req-b21e257f-3aeb-4460-ac32-5620e9a60bb5 req-1a169919-21a7-42c6-926a-5f5bbef81afc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Updating instance_info_cache with network_info: [{"id": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "address": "fa:16:3e:be:cd:e8", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56bb5fc8-f1", "ovs_interfaceid": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:13:31 np0005593234 nova_compute[227762]: 2026-01-23 11:13:31.990 227766 DEBUG oslo_concurrency.lockutils [req-b21e257f-3aeb-4460-ac32-5620e9a60bb5 req-1a169919-21a7-42c6-926a-5f5bbef81afc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-d4c75524-52b8-4c2b-b0cb-18d94089013b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.179 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:13:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2606057014' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.265 227766 DEBUG oslo_concurrency.processutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.297 227766 DEBUG nova.storage.rbd_utils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] rbd image d4c75524-52b8-4c2b-b0cb-18d94089013b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.302 227766 DEBUG oslo_concurrency.processutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:13:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:32.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:32 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:13:32 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3534238657' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.761 227766 DEBUG oslo_concurrency.processutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.763 227766 DEBUG nova.virt.libvirt.vif [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T11:12:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-926357377',display_name='tempest-TestShelveInstance-server-926357377',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-926357377',id=223,image_ref='7da2f187-f7de-4714-b817-454a50a6b19a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-93398597',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:12:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='3a245f7970f14fffa60af2ff972b4bfd',ramdisk_id='',reservation_id='r-9cvenz37',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_h
w_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-869807080',owner_user_name='tempest-TestShelveInstance-869807080-project-member',shelved_at='2026-01-23T11:13:10.515610',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='7da2f187-f7de-4714-b817-454a50a6b19a'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:13:26Z,user_data=None,user_id='5d6a458f5d9345379b05f0cdb69a7b0f',uuid=d4c75524-52b8-4c2b-b0cb-18d94089013b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "address": "fa:16:3e:be:cd:e8", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56bb5fc8-f1", "ovs_interfaceid": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.764 227766 DEBUG nova.network.os_vif_util [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converting VIF {"id": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "address": "fa:16:3e:be:cd:e8", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56bb5fc8-f1", "ovs_interfaceid": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.765 227766 DEBUG nova.network.os_vif_util [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:cd:e8,bridge_name='br-int',has_traffic_filtering=True,id=56bb5fc8-f112-47c7-84d3-d47e53c4d481,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56bb5fc8-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.767 227766 DEBUG nova.objects.instance [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'pci_devices' on Instance uuid d4c75524-52b8-4c2b-b0cb-18d94089013b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.963 227766 DEBUG nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] End _get_guest_xml xml=<domain type="kvm">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  <uuid>d4c75524-52b8-4c2b-b0cb-18d94089013b</uuid>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  <name>instance-000000df</name>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestShelveInstance-server-926357377</nova:name>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 11:13:31</nova:creationTime>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <nova:user uuid="5d6a458f5d9345379b05f0cdb69a7b0f">tempest-TestShelveInstance-869807080-project-member</nova:user>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <nova:project uuid="3a245f7970f14fffa60af2ff972b4bfd">tempest-TestShelveInstance-869807080</nova:project>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <nova:root type="image" uuid="7da2f187-f7de-4714-b817-454a50a6b19a"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <nova:port uuid="56bb5fc8-f112-47c7-84d3-d47e53c4d481">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <system>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <entry name="serial">d4c75524-52b8-4c2b-b0cb-18d94089013b</entry>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <entry name="uuid">d4c75524-52b8-4c2b-b0cb-18d94089013b</entry>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    </system>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  <os>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  </os>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  <features>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  </features>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  </clock>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  <devices>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/d4c75524-52b8-4c2b-b0cb-18d94089013b_disk">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/d4c75524-52b8-4c2b-b0cb-18d94089013b_disk.config">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:be:cd:e8"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <target dev="tap56bb5fc8-f1"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    </interface>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/d4c75524-52b8-4c2b-b0cb-18d94089013b/console.log" append="off"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    </serial>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <video>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    </video>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <input type="keyboard" bus="usb"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    </rng>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 06:13:32 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 06:13:32 np0005593234 nova_compute[227762]:  </devices>
Jan 23 06:13:32 np0005593234 nova_compute[227762]: </domain>
Jan 23 06:13:32 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.965 227766 DEBUG nova.compute.manager [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Preparing to wait for external event network-vif-plugged-56bb5fc8-f112-47c7-84d3-d47e53c4d481 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.965 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.965 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.966 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.966 227766 DEBUG nova.virt.libvirt.vif [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T11:12:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-926357377',display_name='tempest-TestShelveInstance-server-926357377',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-926357377',id=223,image_ref='7da2f187-f7de-4714-b817-454a50a6b19a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name='tempest-TestShelveInstance-93398597',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:12:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=4,progress=0,project_id='3a245f7970f14fffa60af2ff972b4bfd',ramdisk_id='',reservation_id='r-9cvenz37',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-869807080',owner_user_name='tempest-TestShelveInstance-869807080-project-member',shelved_at='2026-01-23T11:13:10.515610',shelved_host='compute-0.ctlplane.example.com',shelved_image_id='7da2f187-f7de-4714-b817-454a50a6b19a'},tags=<?>,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:13:26Z,user_data=None,user_id='5d6a458f5d9345379b05f0cdb69a7b0f',uuid=d4c75524-52b8-4c2b-b0cb-18d94089013b,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='shelved_offloaded') vif={"id": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "address": "fa:16:3e:be:cd:e8", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56bb5fc8-f1", "ovs_interfaceid": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.967 227766 DEBUG nova.network.os_vif_util [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converting VIF {"id": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "address": "fa:16:3e:be:cd:e8", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56bb5fc8-f1", "ovs_interfaceid": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.967 227766 DEBUG nova.network.os_vif_util [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:be:cd:e8,bridge_name='br-int',has_traffic_filtering=True,id=56bb5fc8-f112-47c7-84d3-d47e53c4d481,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56bb5fc8-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.968 227766 DEBUG os_vif [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:cd:e8,bridge_name='br-int',has_traffic_filtering=True,id=56bb5fc8-f112-47c7-84d3-d47e53c4d481,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56bb5fc8-f1') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.968 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.969 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.969 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.971 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.972 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.972 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.972 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.972 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.998 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:32 np0005593234 nova_compute[227762]: 2026-01-23 11:13:32.998 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap56bb5fc8-f1, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.000 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap56bb5fc8-f1, col_values=(('external_ids', {'iface-id': '56bb5fc8-f112-47c7-84d3-d47e53c4d481', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:be:cd:e8', 'vm-uuid': 'd4c75524-52b8-4c2b-b0cb-18d94089013b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:13:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e431 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.052 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:13:33 np0005593234 NetworkManager[48942]: <info>  [1769166813.0533] manager: (tap56bb5fc8-f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/494)
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.057 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.058 227766 INFO os_vif [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:be:cd:e8,bridge_name='br-int',has_traffic_filtering=True,id=56bb5fc8-f112-47c7-84d3-d47e53c4d481,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56bb5fc8-f1')#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.127 227766 DEBUG nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.128 227766 DEBUG nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.128 227766 DEBUG nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] No VIF found with MAC fa:16:3e:be:cd:e8, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.128 227766 INFO nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Using config drive#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.153 227766 DEBUG nova.storage.rbd_utils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] rbd image d4c75524-52b8-4c2b-b0cb-18d94089013b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.183 227766 DEBUG nova.objects.instance [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'ec2_ids' on Instance uuid d4c75524-52b8-4c2b-b0cb-18d94089013b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.222 227766 DEBUG nova.objects.instance [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'keypairs' on Instance uuid d4c75524-52b8-4c2b-b0cb-18d94089013b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:13:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:13:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1645727341' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.439 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.549 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000df as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.549 227766 DEBUG nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] skipping disk for instance-000000df as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231#033[00m
Jan 23 06:13:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:33.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.706 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.706 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4077MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.707 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.707 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.796 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance d4c75524-52b8-4c2b-b0cb-18d94089013b actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.796 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.796 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:13:33 np0005593234 podman[349728]: 2026-01-23 11:13:33.805529144 +0000 UTC m=+0.099764033 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.855 227766 INFO nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Creating config drive at /var/lib/nova/instances/d4c75524-52b8-4c2b-b0cb-18d94089013b/disk.config
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.860 227766 DEBUG oslo_concurrency.processutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d4c75524-52b8-4c2b-b0cb-18d94089013b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr_x8p3o9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.907 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.965 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.965 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 06:13:33 np0005593234 nova_compute[227762]: 2026-01-23 11:13:33.981 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.004 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.013 227766 DEBUG oslo_concurrency.processutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d4c75524-52b8-4c2b-b0cb-18d94089013b/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpr_x8p3o9" returned: 0 in 0.153s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.041 227766 DEBUG nova.storage.rbd_utils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] rbd image d4c75524-52b8-4c2b-b0cb-18d94089013b_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.045 227766 DEBUG oslo_concurrency.processutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/d4c75524-52b8-4c2b-b0cb-18d94089013b/disk.config d4c75524-52b8-4c2b-b0cb-18d94089013b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.089 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.242 227766 DEBUG oslo_concurrency.processutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/d4c75524-52b8-4c2b-b0cb-18d94089013b/disk.config d4c75524-52b8-4c2b-b0cb-18d94089013b_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.197s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.244 227766 INFO nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Deleting local config drive /var/lib/nova/instances/d4c75524-52b8-4c2b-b0cb-18d94089013b/disk.config because it was imported into RBD.
Jan 23 06:13:34 np0005593234 kernel: tap56bb5fc8-f1: entered promiscuous mode
Jan 23 06:13:34 np0005593234 NetworkManager[48942]: <info>  [1769166814.3370] manager: (tap56bb5fc8-f1): new Tun device (/org/freedesktop/NetworkManager/Devices/495)
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.378 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:13:34 np0005593234 ovn_controller[134547]: 2026-01-23T11:13:34Z|01032|binding|INFO|Claiming lport 56bb5fc8-f112-47c7-84d3-d47e53c4d481 for this chassis.
Jan 23 06:13:34 np0005593234 ovn_controller[134547]: 2026-01-23T11:13:34Z|01033|binding|INFO|56bb5fc8-f112-47c7-84d3-d47e53c4d481: Claiming fa:16:3e:be:cd:e8 10.100.0.10
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.388 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:13:34 np0005593234 NetworkManager[48942]: <info>  [1769166814.3927] manager: (patch-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/496)
Jan 23 06:13:34 np0005593234 NetworkManager[48942]: <info>  [1769166814.3933] manager: (patch-br-int-to-provnet-d9c92ce5-db5b-485c-b1fb-94d4128a85f1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/497)
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.392 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.397 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:cd:e8 10.100.0.10'], port_security=['fa:16:3e:be:cd:e8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd4c75524-52b8-4c2b-b0cb-18d94089013b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42899517-91b9-42e3-96a7-29180211a7a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a245f7970f14fffa60af2ff972b4bfd', 'neutron:revision_number': '7', 'neutron:security_group_ids': '3f0cc653-92a4-4b83-958a-564f01bb9144', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.249'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef3519b3-9b5b-4b40-8630-d2487396abc0, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=56bb5fc8-f112-47c7-84d3-d47e53c4d481) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.399 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 56bb5fc8-f112-47c7-84d3-d47e53c4d481 in datapath 42899517-91b9-42e3-96a7-29180211a7a4 bound to our chassis
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.401 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 42899517-91b9-42e3-96a7-29180211a7a4
Jan 23 06:13:34 np0005593234 systemd-machined[195626]: New machine qemu-115-instance-000000df.
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.427 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[70d9acc0-1a77-49e5-a273-2931076114e3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.430 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap42899517-91 in ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.436 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap42899517-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.436 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[30573185-1c8f-4506-b023-ee1d87c1d071]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.437 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[67c52863-31bb-47cb-8900-ec472ffe390f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.459 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[e84e76f7-269c-4ca2-a8d8-2b3e09b6f8e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 systemd[1]: Started Virtual Machine qemu-115-instance-000000df.
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.492 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[6c605008-9fd7-4dcd-a91e-a9ac467742a7]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 systemd-udevd[349833]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:13:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:13:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2626459803' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:13:34 np0005593234 NetworkManager[48942]: <info>  [1769166814.5208] device (tap56bb5fc8-f1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:13:34 np0005593234 NetworkManager[48942]: <info>  [1769166814.5227] device (tap56bb5fc8-f1): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.536 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.538 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.541 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a42b0c8a-ad0f-4d94-b590-77438a1bd81b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.547 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.551 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.556 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[418b268b-3ba5-4f42-b865-e2af4424daac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 NetworkManager[48942]: <info>  [1769166814.5580] manager: (tap42899517-90): new Veth device (/org/freedesktop/NetworkManager/Devices/498)
Jan 23 06:13:34 np0005593234 ovn_controller[134547]: 2026-01-23T11:13:34Z|01034|binding|INFO|Setting lport 56bb5fc8-f112-47c7-84d3-d47e53c4d481 ovn-installed in OVS
Jan 23 06:13:34 np0005593234 ovn_controller[134547]: 2026-01-23T11:13:34Z|01035|binding|INFO|Setting lport 56bb5fc8-f112-47c7-84d3-d47e53c4d481 up in Southbound
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.560 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.571 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.573 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.573 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.866s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.593 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[459cc57f-e319-49da-af28-97d744557926]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.597 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[d07d5d57-0881-4970-b712-ee1a6a923b74]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 NetworkManager[48942]: <info>  [1769166814.6249] device (tap42899517-90): carrier: link connected
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.644 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[6a5dcc7b-275b-41c9-82d0-9c2cc15f0db1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.665 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[938025f6-f7a5-4196-9b30-99be1f4cb82b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42899517-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:74:09:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 318], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1070958, 'reachable_time': 26952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 349866, 'error': None, 'target': 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.683 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1bc9ef02-19db-4668-acbf-d68e6f5f8efc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe74:998'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1070958, 'tstamp': 1070958}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 349867, 'error': None, 'target': 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.701 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[665127fc-aa6b-44ce-a464-b0e1391d55b6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42899517-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:74:09:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 318], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1070958, 'reachable_time': 26952, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 349868, 'error': None, 'target': 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:34.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.744 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[241278a6-7cd9-400f-8522-3f70ffe33988]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.837 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[1d24518b-bd3a-4a9e-b031-557e08b194e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.840 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42899517-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.841 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.842 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap42899517-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.843 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:13:34 np0005593234 kernel: tap42899517-90: entered promiscuous mode
Jan 23 06:13:34 np0005593234 NetworkManager[48942]: <info>  [1769166814.8453] manager: (tap42899517-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/499)
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.845 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.847 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap42899517-90, col_values=(('external_ids', {'iface-id': '82ae71e6-e83a-4506-8f0f-261163163937'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.848 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:34 np0005593234 ovn_controller[134547]: 2026-01-23T11:13:34Z|01036|binding|INFO|Releasing lport 82ae71e6-e83a-4506-8f0f-261163163937 from this chassis (sb_readonly=0)
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.848 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.849 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/42899517-91b9-42e3-96a7-29180211a7a4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/42899517-91b9-42e3-96a7-29180211a7a4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.850 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[07cada3d-a9cb-483f-a32d-8f096e657c8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.851 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-42899517-91b9-42e3-96a7-29180211a7a4
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/42899517-91b9-42e3-96a7-29180211a7a4.pid.haproxy
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 42899517-91b9-42e3-96a7-29180211a7a4
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 06:13:34 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:34.852 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'env', 'PROCESS_TAG=haproxy-42899517-91b9-42e3-96a7-29180211a7a4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/42899517-91b9-42e3-96a7-29180211a7a4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.860 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.964 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166814.964233, d4c75524-52b8-4c2b-b0cb-18d94089013b => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.964 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] VM Started (Lifecycle Event)#033[00m
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.988 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.993 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166814.9661999, d4c75524-52b8-4c2b-b0cb-18d94089013b => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:13:34 np0005593234 nova_compute[227762]: 2026-01-23 11:13:34.993 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] VM Paused (Lifecycle Event)#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.013 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.017 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.047 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.152 227766 DEBUG nova.compute.manager [req-cee0e9a6-c539-4d96-ac82-ea06e05502f9 req-9092fac7-de5b-4143-b29f-d2557c563316 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Received event network-vif-plugged-56bb5fc8-f112-47c7-84d3-d47e53c4d481 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.152 227766 DEBUG oslo_concurrency.lockutils [req-cee0e9a6-c539-4d96-ac82-ea06e05502f9 req-9092fac7-de5b-4143-b29f-d2557c563316 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.152 227766 DEBUG oslo_concurrency.lockutils [req-cee0e9a6-c539-4d96-ac82-ea06e05502f9 req-9092fac7-de5b-4143-b29f-d2557c563316 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.152 227766 DEBUG oslo_concurrency.lockutils [req-cee0e9a6-c539-4d96-ac82-ea06e05502f9 req-9092fac7-de5b-4143-b29f-d2557c563316 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.153 227766 DEBUG nova.compute.manager [req-cee0e9a6-c539-4d96-ac82-ea06e05502f9 req-9092fac7-de5b-4143-b29f-d2557c563316 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Processing event network-vif-plugged-56bb5fc8-f112-47c7-84d3-d47e53c4d481 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.153 227766 DEBUG nova.compute.manager [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.156 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166815.156513, d4c75524-52b8-4c2b-b0cb-18d94089013b => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.157 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:13:35 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:35.157 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '105'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.158 227766 DEBUG nova.virt.libvirt.driver [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.161 227766 INFO nova.virt.libvirt.driver [-] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Instance spawned successfully.#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.201 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.206 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:13:35 np0005593234 nova_compute[227762]: 2026-01-23 11:13:35.224 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:13:35 np0005593234 podman[349942]: 2026-01-23 11:13:35.262261244 +0000 UTC m=+0.052290078 container create 8e956c58c5095424a96112ffc55fb2a27306fd1878a6a2ae320a5b9f21e11749 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 06:13:35 np0005593234 systemd[1]: Started libpod-conmon-8e956c58c5095424a96112ffc55fb2a27306fd1878a6a2ae320a5b9f21e11749.scope.
Jan 23 06:13:35 np0005593234 podman[349942]: 2026-01-23 11:13:35.232897905 +0000 UTC m=+0.022926759 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:13:35 np0005593234 systemd[1]: Started libcrun container.
Jan 23 06:13:35 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6f35cf7d27f487bb6363c6b7a4bce1574a5311fc0c0798b43293edcb23ce7dc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:13:35 np0005593234 podman[349942]: 2026-01-23 11:13:35.380794353 +0000 UTC m=+0.170823237 container init 8e956c58c5095424a96112ffc55fb2a27306fd1878a6a2ae320a5b9f21e11749 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 06:13:35 np0005593234 podman[349942]: 2026-01-23 11:13:35.387352389 +0000 UTC m=+0.177381223 container start 8e956c58c5095424a96112ffc55fb2a27306fd1878a6a2ae320a5b9f21e11749 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 06:13:35 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[349955]: [NOTICE]   (349959) : New worker (349961) forked
Jan 23 06:13:35 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[349955]: [NOTICE]   (349959) : Loading success.
Jan 23 06:13:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:35.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e432 e432: 3 total, 3 up, 3 in
Jan 23 06:13:36 np0005593234 nova_compute[227762]: 2026-01-23 11:13:36.532 227766 DEBUG nova.compute.manager [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:13:36 np0005593234 nova_compute[227762]: 2026-01-23 11:13:36.705 227766 DEBUG oslo_concurrency.lockutils [None req-aac81dc0-d167-49c7-9d65-2e55287d0d17 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 10.547s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:13:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:36.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:37 np0005593234 nova_compute[227762]: 2026-01-23 11:13:37.180 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:37 np0005593234 nova_compute[227762]: 2026-01-23 11:13:37.252 227766 DEBUG nova.compute.manager [req-b2e93f3d-08cb-408a-b73a-e957722d7f15 req-74c2b227-48fe-446d-8129-aff73925de91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Received event network-vif-plugged-56bb5fc8-f112-47c7-84d3-d47e53c4d481 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:13:37 np0005593234 nova_compute[227762]: 2026-01-23 11:13:37.252 227766 DEBUG oslo_concurrency.lockutils [req-b2e93f3d-08cb-408a-b73a-e957722d7f15 req-74c2b227-48fe-446d-8129-aff73925de91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:13:37 np0005593234 nova_compute[227762]: 2026-01-23 11:13:37.253 227766 DEBUG oslo_concurrency.lockutils [req-b2e93f3d-08cb-408a-b73a-e957722d7f15 req-74c2b227-48fe-446d-8129-aff73925de91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:13:37 np0005593234 nova_compute[227762]: 2026-01-23 11:13:37.253 227766 DEBUG oslo_concurrency.lockutils [req-b2e93f3d-08cb-408a-b73a-e957722d7f15 req-74c2b227-48fe-446d-8129-aff73925de91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:13:37 np0005593234 nova_compute[227762]: 2026-01-23 11:13:37.254 227766 DEBUG nova.compute.manager [req-b2e93f3d-08cb-408a-b73a-e957722d7f15 req-74c2b227-48fe-446d-8129-aff73925de91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] No waiting events found dispatching network-vif-plugged-56bb5fc8-f112-47c7-84d3-d47e53c4d481 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:13:37 np0005593234 nova_compute[227762]: 2026-01-23 11:13:37.254 227766 WARNING nova.compute.manager [req-b2e93f3d-08cb-408a-b73a-e957722d7f15 req-74c2b227-48fe-446d-8129-aff73925de91 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Received unexpected event network-vif-plugged-56bb5fc8-f112-47c7-84d3-d47e53c4d481 for instance with vm_state active and task_state None.#033[00m
Jan 23 06:13:37 np0005593234 nova_compute[227762]: 2026-01-23 11:13:37.576 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:37 np0005593234 nova_compute[227762]: 2026-01-23 11:13:37.578 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:13:37 np0005593234 nova_compute[227762]: 2026-01-23 11:13:37.578 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:13:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:13:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:37.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:13:37 np0005593234 nova_compute[227762]: 2026-01-23 11:13:37.720 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-d4c75524-52b8-4c2b-b0cb-18d94089013b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:13:37 np0005593234 nova_compute[227762]: 2026-01-23 11:13:37.721 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-d4c75524-52b8-4c2b-b0cb-18d94089013b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:13:37 np0005593234 nova_compute[227762]: 2026-01-23 11:13:37.722 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 06:13:37 np0005593234 nova_compute[227762]: 2026-01-23 11:13:37.722 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid d4c75524-52b8-4c2b-b0cb-18d94089013b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:13:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e432 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:38 np0005593234 nova_compute[227762]: 2026-01-23 11:13:38.051 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:38.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:39 np0005593234 nova_compute[227762]: 2026-01-23 11:13:39.261 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Updating instance_info_cache with network_info: [{"id": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "address": "fa:16:3e:be:cd:e8", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56bb5fc8-f1", "ovs_interfaceid": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:13:39 np0005593234 nova_compute[227762]: 2026-01-23 11:13:39.279 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-d4c75524-52b8-4c2b-b0cb-18d94089013b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:13:39 np0005593234 nova_compute[227762]: 2026-01-23 11:13:39.280 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 06:13:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:13:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:39.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:13:39 np0005593234 nova_compute[227762]: 2026-01-23 11:13:39.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:40 np0005593234 nova_compute[227762]: 2026-01-23 11:13:40.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:40.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:41.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 e433: 3 total, 3 up, 3 in
Jan 23 06:13:42 np0005593234 nova_compute[227762]: 2026-01-23 11:13:42.182 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:42.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:42.924 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:13:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:42.925 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:13:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:13:42.926 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:13:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:43 np0005593234 nova_compute[227762]: 2026-01-23 11:13:43.053 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:43.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:43 np0005593234 nova_compute[227762]: 2026-01-23 11:13:43.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:13:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1898619751' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:13:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:13:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/1898619751' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:13:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:44.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:13:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:45.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:13:46 np0005593234 nova_compute[227762]: 2026-01-23 11:13:46.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:46 np0005593234 nova_compute[227762]: 2026-01-23 11:13:46.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:13:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:46.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:47 np0005593234 nova_compute[227762]: 2026-01-23 11:13:47.185 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:47.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:48 np0005593234 nova_compute[227762]: 2026-01-23 11:13:48.055 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:48 np0005593234 nova_compute[227762]: 2026-01-23 11:13:48.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:48.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:48 np0005593234 ovn_controller[134547]: 2026-01-23T11:13:48Z|00135|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:be:cd:e8 10.100.0.10
Jan 23 06:13:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:49.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:49 np0005593234 nova_compute[227762]: 2026-01-23 11:13:49.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:13:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:50.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:51.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:52 np0005593234 nova_compute[227762]: 2026-01-23 11:13:52.187 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:52.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:53 np0005593234 nova_compute[227762]: 2026-01-23 11:13:53.057 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:53.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:13:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:54.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:13:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:13:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:55.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:13:55 np0005593234 podman[350031]: 2026-01-23 11:13:55.774339648 +0000 UTC m=+0.066422030 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:13:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:13:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:56.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:13:57 np0005593234 nova_compute[227762]: 2026-01-23 11:13:57.189 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:57.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:13:58 np0005593234 nova_compute[227762]: 2026-01-23 11:13:58.059 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:13:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:13:58.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:13:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:13:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:13:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:13:59.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:14:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:00.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:14:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:01.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:02 np0005593234 nova_compute[227762]: 2026-01-23 11:14:02.191 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:02.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:03 np0005593234 nova_compute[227762]: 2026-01-23 11:14:03.090 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:03.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:04.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:04 np0005593234 podman[350105]: 2026-01-23 11:14:04.813929971 +0000 UTC m=+0.114339259 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 06:14:04 np0005593234 ovn_controller[134547]: 2026-01-23T11:14:04Z|01037|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 06:14:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:05.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:14:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:06.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:14:07 np0005593234 nova_compute[227762]: 2026-01-23 11:14:07.191 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:07.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:08 np0005593234 nova_compute[227762]: 2026-01-23 11:14:08.092 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:08 np0005593234 nova_compute[227762]: 2026-01-23 11:14:08.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:14:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:08.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:14:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:14:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:09.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:14:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:09.939 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=106, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=105) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:14:09 np0005593234 nova_compute[227762]: 2026-01-23 11:14:09.940 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:09 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:09.941 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.250 227766 DEBUG nova.compute.manager [req-4328c874-11b6-49db-809f-bd8ac39e0eae req-c1a583d4-cfe6-4955-a579-6bd91dcfd2a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Received event network-changed-56bb5fc8-f112-47c7-84d3-d47e53c4d481 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.251 227766 DEBUG nova.compute.manager [req-4328c874-11b6-49db-809f-bd8ac39e0eae req-c1a583d4-cfe6-4955-a579-6bd91dcfd2a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Refreshing instance network info cache due to event network-changed-56bb5fc8-f112-47c7-84d3-d47e53c4d481. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.251 227766 DEBUG oslo_concurrency.lockutils [req-4328c874-11b6-49db-809f-bd8ac39e0eae req-c1a583d4-cfe6-4955-a579-6bd91dcfd2a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-d4c75524-52b8-4c2b-b0cb-18d94089013b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.251 227766 DEBUG oslo_concurrency.lockutils [req-4328c874-11b6-49db-809f-bd8ac39e0eae req-c1a583d4-cfe6-4955-a579-6bd91dcfd2a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-d4c75524-52b8-4c2b-b0cb-18d94089013b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.252 227766 DEBUG nova.network.neutron [req-4328c874-11b6-49db-809f-bd8ac39e0eae req-c1a583d4-cfe6-4955-a579-6bd91dcfd2a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Refreshing network info cache for port 56bb5fc8-f112-47c7-84d3-d47e53c4d481 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.371 227766 DEBUG oslo_concurrency.lockutils [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "d4c75524-52b8-4c2b-b0cb-18d94089013b" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.371 227766 DEBUG oslo_concurrency.lockutils [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.371 227766 DEBUG oslo_concurrency.lockutils [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.372 227766 DEBUG oslo_concurrency.lockutils [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.372 227766 DEBUG oslo_concurrency.lockutils [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.373 227766 INFO nova.compute.manager [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Terminating instance#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.375 227766 DEBUG nova.compute.manager [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Jan 23 06:14:10 np0005593234 kernel: tap56bb5fc8-f1 (unregistering): left promiscuous mode
Jan 23 06:14:10 np0005593234 NetworkManager[48942]: <info>  [1769166850.4473] device (tap56bb5fc8-f1): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:14:10 np0005593234 ovn_controller[134547]: 2026-01-23T11:14:10Z|01038|binding|INFO|Releasing lport 56bb5fc8-f112-47c7-84d3-d47e53c4d481 from this chassis (sb_readonly=0)
Jan 23 06:14:10 np0005593234 ovn_controller[134547]: 2026-01-23T11:14:10Z|01039|binding|INFO|Setting lport 56bb5fc8-f112-47c7-84d3-d47e53c4d481 down in Southbound
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.456 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:10 np0005593234 ovn_controller[134547]: 2026-01-23T11:14:10Z|01040|binding|INFO|Removing iface tap56bb5fc8-f1 ovn-installed in OVS
Jan 23 06:14:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:10.466 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:be:cd:e8 10.100.0.10'], port_security=['fa:16:3e:be:cd:e8 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'd4c75524-52b8-4c2b-b0cb-18d94089013b', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42899517-91b9-42e3-96a7-29180211a7a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a245f7970f14fffa60af2ff972b4bfd', 'neutron:revision_number': '9', 'neutron:security_group_ids': '3f0cc653-92a4-4b83-958a-564f01bb9144', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef3519b3-9b5b-4b40-8630-d2487396abc0, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=56bb5fc8-f112-47c7-84d3-d47e53c4d481) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:14:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:10.468 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 56bb5fc8-f112-47c7-84d3-d47e53c4d481 in datapath 42899517-91b9-42e3-96a7-29180211a7a4 unbound from our chassis#033[00m
Jan 23 06:14:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:10.470 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42899517-91b9-42e3-96a7-29180211a7a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:14:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:10.472 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[afd29e0f-9ff9-4462-b58b-8870cc20da01]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:10.473 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 namespace which is not needed anymore#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.480 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:10 np0005593234 systemd[1]: machine-qemu\x2d115\x2dinstance\x2d000000df.scope: Deactivated successfully.
Jan 23 06:14:10 np0005593234 systemd[1]: machine-qemu\x2d115\x2dinstance\x2d000000df.scope: Consumed 14.945s CPU time.
Jan 23 06:14:10 np0005593234 systemd-machined[195626]: Machine qemu-115-instance-000000df terminated.
Jan 23 06:14:10 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[349955]: [NOTICE]   (349959) : haproxy version is 2.8.14-c23fe91
Jan 23 06:14:10 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[349955]: [NOTICE]   (349959) : path to executable is /usr/sbin/haproxy
Jan 23 06:14:10 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[349955]: [WARNING]  (349959) : Exiting Master process...
Jan 23 06:14:10 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[349955]: [WARNING]  (349959) : Exiting Master process...
Jan 23 06:14:10 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[349955]: [ALERT]    (349959) : Current worker (349961) exited with code 143 (Terminated)
Jan 23 06:14:10 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[349955]: [WARNING]  (349959) : All workers exited. Exiting... (0)
Jan 23 06:14:10 np0005593234 systemd[1]: libpod-8e956c58c5095424a96112ffc55fb2a27306fd1878a6a2ae320a5b9f21e11749.scope: Deactivated successfully.
Jan 23 06:14:10 np0005593234 podman[350158]: 2026-01-23 11:14:10.605223222 +0000 UTC m=+0.053137374 container died 8e956c58c5095424a96112ffc55fb2a27306fd1878a6a2ae320a5b9f21e11749 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.617 227766 INFO nova.virt.libvirt.driver [-] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Instance destroyed successfully.#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.618 227766 DEBUG nova.objects.instance [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'resources' on Instance uuid d4c75524-52b8-4c2b-b0cb-18d94089013b obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.634 227766 DEBUG nova.virt.libvirt.vif [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-01-23T11:12:10Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-926357377',display_name='tempest-TestShelveInstance-server-926357377',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-926357377',id=223,image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAEFOGuv6jcY9DwIy/xYvt2X4Vb1HISACx7cX6o9lDAD3l3O1QRG07pd8MQdd17GGOSBZRG+y+TaN6Gc5Y3oNpLF+mD7AORx9ZprSr452pQ3EZnBrolaOtjqq79YfGAlew==',key_name='tempest-TestShelveInstance-93398597',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:13:36Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3a245f7970f14fffa60af2ff972b4bfd',ramdisk_id='',reservation_id='r-9cvenz37',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='84c0ef19-7f67-4bd3-95d8-507c3e0942ed',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestShelveInstance-869807080',owner_user_name='tempest-TestShelveInstance-869807080-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:13:36Z,user_data=None,user_id='5d6a458f5d9345379b05f0cdb69a7b0f',uuid=d4c75524-52b8-4c2b-b0cb-18d94089013b,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "address": "fa:16:3e:be:cd:e8", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56bb5fc8-f1", "ovs_interfaceid": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 06:14:10 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e956c58c5095424a96112ffc55fb2a27306fd1878a6a2ae320a5b9f21e11749-userdata-shm.mount: Deactivated successfully.
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.635 227766 DEBUG nova.network.os_vif_util [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converting VIF {"id": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "address": "fa:16:3e:be:cd:e8", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56bb5fc8-f1", "ovs_interfaceid": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:14:10 np0005593234 systemd[1]: var-lib-containers-storage-overlay-e6f35cf7d27f487bb6363c6b7a4bce1574a5311fc0c0798b43293edcb23ce7dc-merged.mount: Deactivated successfully.
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.642 227766 DEBUG nova.network.os_vif_util [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:be:cd:e8,bridge_name='br-int',has_traffic_filtering=True,id=56bb5fc8-f112-47c7-84d3-d47e53c4d481,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56bb5fc8-f1') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.643 227766 DEBUG os_vif [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:be:cd:e8,bridge_name='br-int',has_traffic_filtering=True,id=56bb5fc8-f112-47c7-84d3-d47e53c4d481,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56bb5fc8-f1') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.648 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:10 np0005593234 podman[350158]: 2026-01-23 11:14:10.649117587 +0000 UTC m=+0.097031729 container cleanup 8e956c58c5095424a96112ffc55fb2a27306fd1878a6a2ae320a5b9f21e11749 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.649 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap56bb5fc8-f1, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.651 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.652 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.657 227766 INFO os_vif [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:be:cd:e8,bridge_name='br-int',has_traffic_filtering=True,id=56bb5fc8-f112-47c7-84d3-d47e53c4d481,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap56bb5fc8-f1')#033[00m
Jan 23 06:14:10 np0005593234 systemd[1]: libpod-conmon-8e956c58c5095424a96112ffc55fb2a27306fd1878a6a2ae320a5b9f21e11749.scope: Deactivated successfully.
Jan 23 06:14:10 np0005593234 podman[350201]: 2026-01-23 11:14:10.769198675 +0000 UTC m=+0.098356480 container remove 8e956c58c5095424a96112ffc55fb2a27306fd1878a6a2ae320a5b9f21e11749 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 06:14:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:10.775 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[613fd5ff-e9a1-4507-a49f-9ad14f2b5069]: (4, ('Fri Jan 23 11:14:10 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 (8e956c58c5095424a96112ffc55fb2a27306fd1878a6a2ae320a5b9f21e11749)\n8e956c58c5095424a96112ffc55fb2a27306fd1878a6a2ae320a5b9f21e11749\nFri Jan 23 11:14:10 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 (8e956c58c5095424a96112ffc55fb2a27306fd1878a6a2ae320a5b9f21e11749)\n8e956c58c5095424a96112ffc55fb2a27306fd1878a6a2ae320a5b9f21e11749\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:10.777 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[84206c06-d25c-45c9-9751-0a44cbe2e9b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:10.778 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42899517-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.782 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:10 np0005593234 kernel: tap42899517-90: left promiscuous mode
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.790 227766 DEBUG nova.compute.manager [req-0f2d0c7e-d663-4e8b-8be6-c58007d75102 req-8c5ca42a-6197-46c1-ae33-5ea35f3c3630 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Received event network-vif-unplugged-56bb5fc8-f112-47c7-84d3-d47e53c4d481 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.790 227766 DEBUG oslo_concurrency.lockutils [req-0f2d0c7e-d663-4e8b-8be6-c58007d75102 req-8c5ca42a-6197-46c1-ae33-5ea35f3c3630 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.791 227766 DEBUG oslo_concurrency.lockutils [req-0f2d0c7e-d663-4e8b-8be6-c58007d75102 req-8c5ca42a-6197-46c1-ae33-5ea35f3c3630 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.791 227766 DEBUG oslo_concurrency.lockutils [req-0f2d0c7e-d663-4e8b-8be6-c58007d75102 req-8c5ca42a-6197-46c1-ae33-5ea35f3c3630 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.791 227766 DEBUG nova.compute.manager [req-0f2d0c7e-d663-4e8b-8be6-c58007d75102 req-8c5ca42a-6197-46c1-ae33-5ea35f3c3630 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] No waiting events found dispatching network-vif-unplugged-56bb5fc8-f112-47c7-84d3-d47e53c4d481 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.792 227766 DEBUG nova.compute.manager [req-0f2d0c7e-d663-4e8b-8be6-c58007d75102 req-8c5ca42a-6197-46c1-ae33-5ea35f3c3630 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Received event network-vif-unplugged-56bb5fc8-f112-47c7-84d3-d47e53c4d481 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Jan 23 06:14:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:10.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:10 np0005593234 nova_compute[227762]: 2026-01-23 11:14:10.803 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:10.808 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a63675d7-ddb0-42ab-9a63-0e993e7e2121]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:10.826 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[721b3279-2fc9-4519-bc0c-4bf8bcc4a02d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:10.828 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[59cd1d16-2ed5-471d-ae06-5d7275b0fbb4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:10.846 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[cc0c48d4-48cb-45a9-ad4c-a6b6a7b70bcc]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1070948, 'reachable_time': 33894, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 350234, 'error': None, 'target': 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:10.854 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:14:10 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:10.854 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[6894dc15-49ae-4990-a942-cdaedd9f30ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:10 np0005593234 systemd[1]: run-netns-ovnmeta\x2d42899517\x2d91b9\x2d42e3\x2d96a7\x2d29180211a7a4.mount: Deactivated successfully.
Jan 23 06:14:11 np0005593234 nova_compute[227762]: 2026-01-23 11:14:11.318 227766 INFO nova.virt.libvirt.driver [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Deleting instance files /var/lib/nova/instances/d4c75524-52b8-4c2b-b0cb-18d94089013b_del#033[00m
Jan 23 06:14:11 np0005593234 nova_compute[227762]: 2026-01-23 11:14:11.319 227766 INFO nova.virt.libvirt.driver [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Deletion of /var/lib/nova/instances/d4c75524-52b8-4c2b-b0cb-18d94089013b_del complete#033[00m
Jan 23 06:14:11 np0005593234 nova_compute[227762]: 2026-01-23 11:14:11.366 227766 INFO nova.compute.manager [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Took 0.99 seconds to destroy the instance on the hypervisor.#033[00m
Jan 23 06:14:11 np0005593234 nova_compute[227762]: 2026-01-23 11:14:11.366 227766 DEBUG oslo.service.loopingcall [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Jan 23 06:14:11 np0005593234 nova_compute[227762]: 2026-01-23 11:14:11.367 227766 DEBUG nova.compute.manager [-] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Jan 23 06:14:11 np0005593234 nova_compute[227762]: 2026-01-23 11:14:11.367 227766 DEBUG nova.network.neutron [-] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Jan 23 06:14:11 np0005593234 nova_compute[227762]: 2026-01-23 11:14:11.544 227766 DEBUG nova.network.neutron [req-4328c874-11b6-49db-809f-bd8ac39e0eae req-c1a583d4-cfe6-4955-a579-6bd91dcfd2a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Updated VIF entry in instance network info cache for port 56bb5fc8-f112-47c7-84d3-d47e53c4d481. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:14:11 np0005593234 nova_compute[227762]: 2026-01-23 11:14:11.544 227766 DEBUG nova.network.neutron [req-4328c874-11b6-49db-809f-bd8ac39e0eae req-c1a583d4-cfe6-4955-a579-6bd91dcfd2a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Updating instance_info_cache with network_info: [{"id": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "address": "fa:16:3e:be:cd:e8", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap56bb5fc8-f1", "ovs_interfaceid": "56bb5fc8-f112-47c7-84d3-d47e53c4d481", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:14:11 np0005593234 nova_compute[227762]: 2026-01-23 11:14:11.561 227766 DEBUG oslo_concurrency.lockutils [req-4328c874-11b6-49db-809f-bd8ac39e0eae req-c1a583d4-cfe6-4955-a579-6bd91dcfd2a7 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-d4c75524-52b8-4c2b-b0cb-18d94089013b" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:14:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:11.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.003 227766 DEBUG nova.network.neutron [-] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.022 227766 INFO nova.compute.manager [-] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Took 0.66 seconds to deallocate network for instance.#033[00m
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.065 227766 DEBUG oslo_concurrency.lockutils [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.066 227766 DEBUG oslo_concurrency.lockutils [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.092 227766 DEBUG nova.compute.manager [req-6b598864-bab2-47cc-84e3-66fd65e63c9e req-5dda78fd-6b91-4d29-943a-9b97e77bf672 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Received event network-vif-deleted-56bb5fc8-f112-47c7-84d3-d47e53c4d481 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.121 227766 DEBUG oslo_concurrency.processutils [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.194 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:14:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2363363290' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.585 227766 DEBUG oslo_concurrency.processutils [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.592 227766 DEBUG nova.compute.provider_tree [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.741 227766 DEBUG nova.scheduler.client.report [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:14:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:14:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:12.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.958 227766 DEBUG nova.compute.manager [req-ccabb204-9f04-4921-86d5-f34253fde553 req-bfbfafc9-703f-4685-8562-c62809b6723e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Received event network-vif-plugged-56bb5fc8-f112-47c7-84d3-d47e53c4d481 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.959 227766 DEBUG oslo_concurrency.lockutils [req-ccabb204-9f04-4921-86d5-f34253fde553 req-bfbfafc9-703f-4685-8562-c62809b6723e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.960 227766 DEBUG oslo_concurrency.lockutils [req-ccabb204-9f04-4921-86d5-f34253fde553 req-bfbfafc9-703f-4685-8562-c62809b6723e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.961 227766 DEBUG oslo_concurrency.lockutils [req-ccabb204-9f04-4921-86d5-f34253fde553 req-bfbfafc9-703f-4685-8562-c62809b6723e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.962 227766 DEBUG nova.compute.manager [req-ccabb204-9f04-4921-86d5-f34253fde553 req-bfbfafc9-703f-4685-8562-c62809b6723e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] No waiting events found dispatching network-vif-plugged-56bb5fc8-f112-47c7-84d3-d47e53c4d481 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.962 227766 WARNING nova.compute.manager [req-ccabb204-9f04-4921-86d5-f34253fde553 req-bfbfafc9-703f-4685-8562-c62809b6723e 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Received unexpected event network-vif-plugged-56bb5fc8-f112-47c7-84d3-d47e53c4d481 for instance with vm_state deleted and task_state None.#033[00m
Jan 23 06:14:12 np0005593234 nova_compute[227762]: 2026-01-23 11:14:12.993 227766 DEBUG oslo_concurrency.lockutils [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.927s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:13 np0005593234 nova_compute[227762]: 2026-01-23 11:14:13.025 227766 INFO nova.scheduler.client.report [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Deleted allocations for instance d4c75524-52b8-4c2b-b0cb-18d94089013b#033[00m
Jan 23 06:14:13 np0005593234 nova_compute[227762]: 2026-01-23 11:14:13.164 227766 DEBUG oslo_concurrency.lockutils [None req-6c5356a6-c521-44ea-9a50-21d46a1f4745 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "d4c75524-52b8-4c2b-b0cb-18d94089013b" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.793s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:14:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:13.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:14:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:14.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:15.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:15 np0005593234 nova_compute[227762]: 2026-01-23 11:14:15.653 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:14:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:16.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:14:17 np0005593234 nova_compute[227762]: 2026-01-23 11:14:17.196 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:17.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:17 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:17.943 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '106'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:14:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:14:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:18.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:14:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:19.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:20 np0005593234 nova_compute[227762]: 2026-01-23 11:14:20.656 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:14:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:20.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:14:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:21.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:22 np0005593234 nova_compute[227762]: 2026-01-23 11:14:22.199 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:22.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:23.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:14:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:24.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:14:25 np0005593234 nova_compute[227762]: 2026-01-23 11:14:25.616 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769166850.6154492, d4c75524-52b8-4c2b-b0cb-18d94089013b => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:14:25 np0005593234 nova_compute[227762]: 2026-01-23 11:14:25.617 227766 INFO nova.compute.manager [-] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] VM Stopped (Lifecycle Event)#033[00m
Jan 23 06:14:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:14:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:25.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:14:25 np0005593234 nova_compute[227762]: 2026-01-23 11:14:25.643 227766 DEBUG nova.compute.manager [None req-4b1cb081-f910-4cb0-bf0c-7203b38cadfa - - - - - -] [instance: d4c75524-52b8-4c2b-b0cb-18d94089013b] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:14:25 np0005593234 nova_compute[227762]: 2026-01-23 11:14:25.659 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:26 np0005593234 podman[350316]: 2026-01-23 11:14:26.808538439 +0000 UTC m=+0.092127505 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:14:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:14:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:26.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:14:27 np0005593234 nova_compute[227762]: 2026-01-23 11:14:27.200 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:27.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:28.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:29 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:14:29 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:14:29 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:14:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:29.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:30 np0005593234 nova_compute[227762]: 2026-01-23 11:14:30.376 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:30 np0005593234 nova_compute[227762]: 2026-01-23 11:14:30.376 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:30 np0005593234 nova_compute[227762]: 2026-01-23 11:14:30.392 227766 DEBUG nova.compute.manager [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Jan 23 06:14:30 np0005593234 nova_compute[227762]: 2026-01-23 11:14:30.469 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:30 np0005593234 nova_compute[227762]: 2026-01-23 11:14:30.470 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:30 np0005593234 nova_compute[227762]: 2026-01-23 11:14:30.478 227766 DEBUG nova.virt.hardware [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Jan 23 06:14:30 np0005593234 nova_compute[227762]: 2026-01-23 11:14:30.479 227766 INFO nova.compute.claims [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Claim successful on node compute-2.ctlplane.example.com#033[00m
Jan 23 06:14:30 np0005593234 nova_compute[227762]: 2026-01-23 11:14:30.595 227766 DEBUG oslo_concurrency.processutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:14:30 np0005593234 nova_compute[227762]: 2026-01-23 11:14:30.663 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:14:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:30.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:14:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:14:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/37151522' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.043 227766 DEBUG oslo_concurrency.processutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.053 227766 DEBUG nova.compute.provider_tree [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.075 227766 DEBUG nova.scheduler.client.report [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.102 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.632s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.103 227766 DEBUG nova.compute.manager [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.221 227766 DEBUG nova.compute.manager [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.222 227766 DEBUG nova.network.neutron [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.246 227766 INFO nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.269 227766 DEBUG nova.compute.manager [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.331 227766 INFO nova.virt.block_device [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Booting with volume 525f185e-d0f4-4a0b-bd48-9219445747c5 at /dev/vda#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.484 227766 DEBUG os_brick.utils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] ==> get_connector_properties: call "{'root_helper': 'sudo nova-rootwrap /etc/nova/rootwrap.conf', 'my_ip': '192.168.122.102', 'multipath': True, 'enforce_multipath': True, 'host': 'compute-2.ctlplane.example.com', 'execute': None}" trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:176#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.486 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): multipathd show status execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.497 233340 DEBUG oslo_concurrency.processutils [-] CMD "multipathd show status" returned: 0 in 0.011s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.498 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[93f77ba3-5550-481d-a01d-e22d0052c173]: (4, ('path checker states:\n\npaths: 0\nbusy: False\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.499 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): cat /etc/iscsi/initiatorname.iscsi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.506 233340 DEBUG oslo_concurrency.processutils [-] CMD "cat /etc/iscsi/initiatorname.iscsi" returned: 0 in 0.007s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.506 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[3c8da775-f608-48c2-8886-0e1d379ee543]: (4, ('InitiatorName=iqn.1994-05.com.redhat:e6da4ae895b4', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.508 233340 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): findmnt -v / -n -o SOURCE execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.521 233340 DEBUG oslo_concurrency.processutils [-] CMD "findmnt -v / -n -o SOURCE" returned: 0 in 0.013s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.521 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[f432b834-e964-4758-b781-aab5c9878fda]: (4, ('overlay\n', '')) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.523 233340 DEBUG oslo.privsep.daemon [-] privsep: reply[78eafb45-67b5-44c1-9453-b9613347a4e1]: (4, '3e200bf7-7634-42a0-8184-2372f58672f7') _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.523 227766 DEBUG oslo_concurrency.processutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): nvme version execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.550 227766 DEBUG oslo_concurrency.processutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "nvme version" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.552 227766 DEBUG os_brick.initiator.connectors.lightos [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] LIGHTOS: [Errno 111] ECONNREFUSED find_dsc /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:98#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.552 227766 DEBUG os_brick.initiator.connectors.lightos [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] LIGHTOS: did not find dsc, continuing anyway. get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:76#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.552 227766 DEBUG os_brick.initiator.connectors.lightos [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] LIGHTOS: finally hostnqn: nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d dsc:  get_connector_properties /usr/lib/python3.9/site-packages/os_brick/initiator/connectors/lightos.py:79#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.553 227766 DEBUG os_brick.utils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] <== get_connector_properties: return (67ms) {'platform': 'x86_64', 'os_type': 'linux', 'ip': '192.168.122.102', 'host': 'compute-2.ctlplane.example.com', 'multipath': True, 'initiator': 'iqn.1994-05.com.redhat:e6da4ae895b4', 'do_local_attach': False, 'nvme_hostid': '5350774e-8b5e-4dba-80a9-92d405981c1d', 'system uuid': '3e200bf7-7634-42a0-8184-2372f58672f7', 'nqn': 'nqn.2014-08.org.nvmexpress:uuid:5350774e-8b5e-4dba-80a9-92d405981c1d', 'nvme_native_multipath': True, 'found_dsc': ''} trace_logging_wrapper /usr/lib/python3.9/site-packages/os_brick/utils.py:203#033[00m
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.553 227766 DEBUG nova.virt.block_device [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updating existing volume attachment record: 1cb3b49f-526a-4fbc-a093-9feffe2c7d1e _volume_attach /usr/lib/python3.9/site-packages/nova/virt/block_device.py:631#033[00m
Jan 23 06:14:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:31.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:31 np0005593234 nova_compute[227762]: 2026-01-23 11:14:31.979 227766 DEBUG nova.policy [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '5d6a458f5d9345379b05f0cdb69a7b0f', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3a245f7970f14fffa60af2ff972b4bfd', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.202 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.633 227766 DEBUG nova.compute.manager [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.635 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.635 227766 INFO nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Creating image(s)#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.636 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Did not create local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4859#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.636 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Ensure instance console log exists: /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.636 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.637 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.637 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.773 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.774 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.774 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.774 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:14:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:32.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:32 np0005593234 nova_compute[227762]: 2026-01-23 11:14:32.873 227766 DEBUG nova.network.neutron [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Successfully created port: 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Jan 23 06:14:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:14:33 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3715297681' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:14:33 np0005593234 nova_compute[227762]: 2026-01-23 11:14:33.302 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.528s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:14:33 np0005593234 nova_compute[227762]: 2026-01-23 11:14:33.500 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:14:33 np0005593234 nova_compute[227762]: 2026-01-23 11:14:33.501 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4097MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:14:33 np0005593234 nova_compute[227762]: 2026-01-23 11:14:33.501 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:33 np0005593234 nova_compute[227762]: 2026-01-23 11:14:33.501 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:33.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:33 np0005593234 nova_compute[227762]: 2026-01-23 11:14:33.684 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance ed71c532-711c-49b9-b0d5-eaf409f0bc76 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Jan 23 06:14:33 np0005593234 nova_compute[227762]: 2026-01-23 11:14:33.684 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:14:33 np0005593234 nova_compute[227762]: 2026-01-23 11:14:33.685 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:14:33 np0005593234 nova_compute[227762]: 2026-01-23 11:14:33.717 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:14:34 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:14:34 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1106326540' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:14:34 np0005593234 nova_compute[227762]: 2026-01-23 11:14:34.163 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:14:34 np0005593234 nova_compute[227762]: 2026-01-23 11:14:34.171 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:14:34 np0005593234 nova_compute[227762]: 2026-01-23 11:14:34.396 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:14:34 np0005593234 nova_compute[227762]: 2026-01-23 11:14:34.420 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:14:34 np0005593234 nova_compute[227762]: 2026-01-23 11:14:34.421 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.920s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:34 np0005593234 nova_compute[227762]: 2026-01-23 11:14:34.580 227766 DEBUG nova.network.neutron [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Successfully updated port: 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Jan 23 06:14:34 np0005593234 nova_compute[227762]: 2026-01-23 11:14:34.598 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:14:34 np0005593234 nova_compute[227762]: 2026-01-23 11:14:34.598 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquired lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:14:34 np0005593234 nova_compute[227762]: 2026-01-23 11:14:34.599 227766 DEBUG nova.network.neutron [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:14:34 np0005593234 nova_compute[227762]: 2026-01-23 11:14:34.710 227766 DEBUG nova.compute.manager [req-749d88be-8f88-421a-b31c-319b897cffd1 req-048728a8-e759-47e3-9486-e15a3bc5d754 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-changed-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:14:34 np0005593234 nova_compute[227762]: 2026-01-23 11:14:34.711 227766 DEBUG nova.compute.manager [req-749d88be-8f88-421a-b31c-319b897cffd1 req-048728a8-e759-47e3-9486-e15a3bc5d754 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Refreshing instance network info cache due to event network-changed-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:14:34 np0005593234 nova_compute[227762]: 2026-01-23 11:14:34.712 227766 DEBUG oslo_concurrency.lockutils [req-749d88be-8f88-421a-b31c-319b897cffd1 req-048728a8-e759-47e3-9486-e15a3bc5d754 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:14:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:34.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:34 np0005593234 nova_compute[227762]: 2026-01-23 11:14:34.955 227766 DEBUG nova.network.neutron [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Jan 23 06:14:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:35.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.667 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.672 227766 DEBUG nova.network.neutron [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updating instance_info_cache with network_info: [{"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.697 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Releasing lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.697 227766 DEBUG nova.compute.manager [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Instance network_info: |[{"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.698 227766 DEBUG oslo_concurrency.lockutils [req-749d88be-8f88-421a-b31c-319b897cffd1 req-048728a8-e759-47e3-9486-e15a3bc5d754 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.698 227766 DEBUG nova.network.neutron [req-749d88be-8f88-421a-b31c-319b897cffd1 req-048728a8-e759-47e3-9486-e15a3bc5d754 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Refreshing network info cache for port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.701 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Start _get_guest_xml network_info=[{"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, '/dev/vda': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': 
'/dev/vda', 'image': [], 'ephemerals': [], 'block_device_mapping': [{'boot_index': 0, 'mount_device': '/dev/vda', 'delete_on_termination': True, 'guest_format': None, 'connection_info': {'driver_volume_type': 'rbd', 'data': {'name': 'volumes/volume-525f185e-d0f4-4a0b-bd48-9219445747c5', 'hosts': ['192.168.122.100', '192.168.122.102', '192.168.122.101'], 'ports': ['6789', '6789', '6789'], 'cluster_name': 'ceph', 'auth_enabled': True, 'auth_username': 'openstack', 'secret_type': 'ceph', 'secret_uuid': '***', 'volume_id': '525f185e-d0f4-4a0b-bd48-9219445747c5', 'discard': True, 'qos_specs': None, 'access_mode': 'rw', 'encrypted': False, 'cacheable': False}, 'status': 'reserved', 'instance': 'ed71c532-711c-49b9-b0d5-eaf409f0bc76', 'attached_at': '', 'detached_at': '', 'volume_id': '525f185e-d0f4-4a0b-bd48-9219445747c5', 'serial': '525f185e-d0f4-4a0b-bd48-9219445747c5'}, 'disk_bus': 'virtio', 'device_type': 'disk', 'attachment_id': '1cb3b49f-526a-4fbc-a093-9feffe2c7d1e', 'volume_type': None}], ': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.704 227766 WARNING nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.708 227766 DEBUG nova.virt.libvirt.host [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.709 227766 DEBUG nova.virt.libvirt.host [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.712 227766 DEBUG nova.virt.libvirt.host [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.713 227766 DEBUG nova.virt.libvirt.host [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.714 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.714 227766 DEBUG nova.virt.hardware [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T09:27:22Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='68d42077-c749-4366-ba3e-07758debb02d',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format=<?>,created_at=<?>,direct_url=<?>,disk_format=<?>,id=<?>,min_disk=0,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.715 227766 DEBUG nova.virt.hardware [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.715 227766 DEBUG nova.virt.hardware [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.715 227766 DEBUG nova.virt.hardware [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.716 227766 DEBUG nova.virt.hardware [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.716 227766 DEBUG nova.virt.hardware [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.716 227766 DEBUG nova.virt.hardware [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.716 227766 DEBUG nova.virt.hardware [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.716 227766 DEBUG nova.virt.hardware [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.717 227766 DEBUG nova.virt.hardware [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.717 227766 DEBUG nova.virt.hardware [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.747 227766 DEBUG nova.storage.rbd_utils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] rbd image ed71c532-711c-49b9-b0d5-eaf409f0bc76_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:14:35 np0005593234 nova_compute[227762]: 2026-01-23 11:14:35.751 227766 DEBUG oslo_concurrency.processutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:14:35 np0005593234 podman[350545]: 2026-01-23 11:14:35.789800325 +0000 UTC m=+0.085305500 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 23 06:14:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:14:36 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1429919858' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.176 227766 DEBUG oslo_concurrency.processutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.208 227766 DEBUG nova.virt.libvirt.vif [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:14:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-2003932207',display_name='tempest-TestShelveInstance-server-2003932207',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-2003932207',id=224,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3bhlUYE6XIKDWLP/uD+8jgtWoi2zAcS0lWBO+SzamqVUAvHDBegRP4BFxXqktx7WnHXLwe9Z4SStWrBiFMiHWsxXNyXjRJKpQMiCgvbWujMjyVx4RONf0TXgED6xft/g==',key_name='tempest-TestShelveInstance-609770395',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a245f7970f14fffa60af2ff972b4bfd',ramdisk_id='',reservation_id='r-p0f4vjg5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestShelveInstance-869807080',owner_user_name='tempest-TestShelveInstance-869807080-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:14:31Z,user_data=None,user_id='5d6a458f5d9345379b05f0cdb69a7b0f',uuid=ed71c532-711c-49b9-b0d5-eaf409f0bc76,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.209 227766 DEBUG nova.network.os_vif_util [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converting VIF {"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.210 227766 DEBUG nova.network.os_vif_util [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:a1:56,bridge_name='br-int',has_traffic_filtering=True,id=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap617c9ef0-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.211 227766 DEBUG nova.objects.instance [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'pci_devices' on Instance uuid ed71c532-711c-49b9-b0d5-eaf409f0bc76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.225 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] End _get_guest_xml xml=<domain type="kvm">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  <uuid>ed71c532-711c-49b9-b0d5-eaf409f0bc76</uuid>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  <name>instance-000000e0</name>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  <memory>131072</memory>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  <vcpu>1</vcpu>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  <metadata>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <nova:name>tempest-TestShelveInstance-server-2003932207</nova:name>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <nova:creationTime>2026-01-23 11:14:35</nova:creationTime>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <nova:flavor name="m1.nano">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <nova:memory>128</nova:memory>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <nova:disk>1</nova:disk>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <nova:swap>0</nova:swap>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <nova:ephemeral>0</nova:ephemeral>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <nova:vcpus>1</nova:vcpus>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      </nova:flavor>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <nova:owner>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <nova:user uuid="5d6a458f5d9345379b05f0cdb69a7b0f">tempest-TestShelveInstance-869807080-project-member</nova:user>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <nova:project uuid="3a245f7970f14fffa60af2ff972b4bfd">tempest-TestShelveInstance-869807080</nova:project>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      </nova:owner>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <nova:ports>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <nova:port uuid="617c9ef0-df2c-4bd2-8d4c-fafc1723eb55">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        </nova:port>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      </nova:ports>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    </nova:instance>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  </metadata>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  <sysinfo type="smbios">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <system>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <entry name="manufacturer">RDO</entry>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <entry name="product">OpenStack Compute</entry>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <entry name="serial">ed71c532-711c-49b9-b0d5-eaf409f0bc76</entry>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <entry name="uuid">ed71c532-711c-49b9-b0d5-eaf409f0bc76</entry>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <entry name="family">Virtual Machine</entry>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    </system>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  </sysinfo>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  <os>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <type arch="x86_64" machine="q35">hvm</type>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <boot dev="hd"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <smbios mode="sysinfo"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  </os>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  <features>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <acpi/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <apic/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <vmcoreinfo/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  </features>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  <clock offset="utc">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <timer name="pit" tickpolicy="delay"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <timer name="rtc" tickpolicy="catchup"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <timer name="hpet" present="no"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  </clock>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  <cpu mode="custom" match="exact">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <model>Nehalem</model>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <topology sockets="1" cores="1" threads="1"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  </cpu>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  <devices>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <disk type="network" device="cdrom">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <driver type="raw" cache="none"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="vms/ed71c532-711c-49b9-b0d5-eaf409f0bc76_disk.config">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <target dev="sda" bus="sata"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <disk type="network" device="disk">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <driver name="qemu" type="raw" cache="none" discard="unmap"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <source protocol="rbd" name="volumes/volume-525f185e-d0f4-4a0b-bd48-9219445747c5">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.100" port="6789"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.102" port="6789"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <host name="192.168.122.101" port="6789"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      </source>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <auth username="openstack">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:        <secret type="ceph" uuid="e1533653-0a5a-584c-b34b-8689f0d32e77"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      </auth>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <target dev="vda" bus="virtio"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <serial>525f185e-d0f4-4a0b-bd48-9219445747c5</serial>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    </disk>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <interface type="ethernet">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <mac address="fa:16:3e:1d:a1:56"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <driver name="vhost" rx_queue_size="512"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <mtu size="1442"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <target dev="tap617c9ef0-df"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    </interface>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <serial type="pty">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <log file="/var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/console.log" append="off"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    </serial>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <video>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <model type="virtio"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    </video>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <input type="tablet" bus="usb"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <rng model="virtio">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <backend model="random">/dev/urandom</backend>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    </rng>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="pci" model="pcie-root-port"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <controller type="usb" index="0"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    <memballoon model="virtio">
Jan 23 06:14:36 np0005593234 nova_compute[227762]:      <stats period="10"/>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:    </memballoon>
Jan 23 06:14:36 np0005593234 nova_compute[227762]:  </devices>
Jan 23 06:14:36 np0005593234 nova_compute[227762]: </domain>
Jan 23 06:14:36 np0005593234 nova_compute[227762]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.226 227766 DEBUG nova.compute.manager [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Preparing to wait for external event network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.226 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.227 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.227 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.227 227766 DEBUG nova.virt.libvirt.vif [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T11:14:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestShelveInstance-server-2003932207',display_name='tempest-TestShelveInstance-server-2003932207',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-2003932207',id=224,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3bhlUYE6XIKDWLP/uD+8jgtWoi2zAcS0lWBO+SzamqVUAvHDBegRP4BFxXqktx7WnHXLwe9Z4SStWrBiFMiHWsxXNyXjRJKpQMiCgvbWujMjyVx4RONf0TXgED6xft/g==',key_name='tempest-TestShelveInstance-609770395',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3a245f7970f14fffa60af2ff972b4bfd',ramdisk_id='',reservation_id='r-p0f4vjg5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',network_allocated='True',owner_project_name='tempest-TestShelveInstance-869807080',owner_user_name='tempest-TestShelveInstance-869807080-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T11:14:31Z,user_data=None,user_id='5d6a458f5d9345379b05f0cdb69a7b0f',uuid=ed71c532-711c-49b9-b0d5-eaf409f0bc76,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.228 227766 DEBUG nova.network.os_vif_util [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converting VIF {"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.228 227766 DEBUG nova.network.os_vif_util [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:a1:56,bridge_name='br-int',has_traffic_filtering=True,id=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap617c9ef0-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.229 227766 DEBUG os_vif [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:a1:56,bridge_name='br-int',has_traffic_filtering=True,id=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap617c9ef0-df') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.229 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.229 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.230 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.232 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.233 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap617c9ef0-df, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.233 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap617c9ef0-df, col_values=(('external_ids', {'iface-id': '617c9ef0-df2c-4bd2-8d4c-fafc1723eb55', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1d:a1:56', 'vm-uuid': 'ed71c532-711c-49b9-b0d5-eaf409f0bc76'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.266 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:36 np0005593234 NetworkManager[48942]: <info>  [1769166876.2669] manager: (tap617c9ef0-df): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/500)
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.269 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.272 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.273 227766 INFO os_vif [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:a1:56,bridge_name='br-int',has_traffic_filtering=True,id=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap617c9ef0-df')#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.441 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.442 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.442 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] No VIF found with MAC fa:16:3e:1d:a1:56, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.442 227766 INFO nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Using config drive#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.466 227766 DEBUG nova.storage.rbd_utils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] rbd image ed71c532-711c-49b9-b0d5-eaf409f0bc76_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:14:36 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:14:36 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:14:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:14:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:36.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.883 227766 DEBUG nova.network.neutron [req-749d88be-8f88-421a-b31c-319b897cffd1 req-048728a8-e759-47e3-9486-e15a3bc5d754 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updated VIF entry in instance network info cache for port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:14:36 np0005593234 nova_compute[227762]: 2026-01-23 11:14:36.884 227766 DEBUG nova.network.neutron [req-749d88be-8f88-421a-b31c-319b897cffd1 req-048728a8-e759-47e3-9486-e15a3bc5d754 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updating instance_info_cache with network_info: [{"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.113 227766 DEBUG oslo_concurrency.lockutils [req-749d88be-8f88-421a-b31c-319b897cffd1 req-048728a8-e759-47e3-9486-e15a3bc5d754 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.206 227766 INFO nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Creating config drive at /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/disk.config#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.211 227766 DEBUG oslo_concurrency.processutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6mg1hs9c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.237 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.347 227766 DEBUG oslo_concurrency.processutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp6mg1hs9c" returned: 0 in 0.136s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.377 227766 DEBUG nova.storage.rbd_utils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] rbd image ed71c532-711c-49b9-b0d5-eaf409f0bc76_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.381 227766 DEBUG oslo_concurrency.processutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/disk.config ed71c532-711c-49b9-b0d5-eaf409f0bc76_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.528 227766 DEBUG oslo_concurrency.processutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/disk.config ed71c532-711c-49b9-b0d5-eaf409f0bc76_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.529 227766 INFO nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Deleting local config drive /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76/disk.config because it was imported into RBD.#033[00m
Jan 23 06:14:37 np0005593234 kernel: tap617c9ef0-df: entered promiscuous mode
Jan 23 06:14:37 np0005593234 NetworkManager[48942]: <info>  [1769166877.5798] manager: (tap617c9ef0-df): new Tun device (/org/freedesktop/NetworkManager/Devices/501)
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.580 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:37 np0005593234 ovn_controller[134547]: 2026-01-23T11:14:37Z|01041|binding|INFO|Claiming lport 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 for this chassis.
Jan 23 06:14:37 np0005593234 ovn_controller[134547]: 2026-01-23T11:14:37Z|01042|binding|INFO|617c9ef0-df2c-4bd2-8d4c-fafc1723eb55: Claiming fa:16:3e:1d:a1:56 10.100.0.9
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.589 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:a1:56 10.100.0.9'], port_security=['fa:16:3e:1d:a1:56 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'ed71c532-711c-49b9-b0d5-eaf409f0bc76', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42899517-91b9-42e3-96a7-29180211a7a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a245f7970f14fffa60af2ff972b4bfd', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2243181a-ba78-49d0-a310-35ec5fa364b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef3519b3-9b5b-4b40-8630-d2487396abc0, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.591 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 in datapath 42899517-91b9-42e3-96a7-29180211a7a4 bound to our chassis#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.592 144381 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 42899517-91b9-42e3-96a7-29180211a7a4#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.604 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f447c064-e465-404a-a4b7-c282c3bcbd85]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.605 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap42899517-91 in ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.607 232070 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap42899517-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.607 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b76dc48e-e8ba-47c8-8b37-ebdfe5de4fa2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.608 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2355639b-95c0-4165-84b4-6f2cf7c42fc6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 ovn_controller[134547]: 2026-01-23T11:14:37Z|01043|binding|INFO|Setting lport 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 ovn-installed in OVS
Jan 23 06:14:37 np0005593234 ovn_controller[134547]: 2026-01-23T11:14:37Z|01044|binding|INFO|Setting lport 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 up in Southbound
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.610 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:37 np0005593234 systemd-udevd[350735]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.615 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.624 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[6b809cab-cb50-451b-a037-0f452ed9eef0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 NetworkManager[48942]: <info>  [1769166877.6282] device (tap617c9ef0-df): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 06:14:37 np0005593234 systemd-machined[195626]: New machine qemu-116-instance-000000e0.
Jan 23 06:14:37 np0005593234 NetworkManager[48942]: <info>  [1769166877.6300] device (tap617c9ef0-df): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 06:14:37 np0005593234 systemd[1]: Started Virtual Machine qemu-116-instance-000000e0.
Jan 23 06:14:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:37.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.650 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[89b79bcb-b22c-45f8-9f02-db6268c6a532]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.681 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[e5059c04-6c82-483e-bb03-f6b01d3d9385]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 NetworkManager[48942]: <info>  [1769166877.6871] manager: (tap42899517-90): new Veth device (/org/freedesktop/NetworkManager/Devices/502)
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.686 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[b512cf33-43de-4d32-a49c-5fd15209b313]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.719 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[7adbec71-0ae3-4323-8013-05c2f4f6a5cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.723 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[a2ab7c96-08f7-46ee-84f1-4230e2d7637d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 NetworkManager[48942]: <info>  [1769166877.7551] device (tap42899517-90): carrier: link connected
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.762 232085 DEBUG oslo.privsep.daemon [-] privsep: reply[fccea708-cdc6-4b88-b4a5-c6d9bdb69b68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.779 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[f5830134-23a8-44b4-b883-b4afb829305a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42899517-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:74:09:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 321], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1077271, 'reachable_time': 27329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 350768, 'error': None, 'target': 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.796 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[ec3853d1-c775-45f5-82ab-c8f9f24703f1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe74:998'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1077271, 'tstamp': 1077271}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 350769, 'error': None, 'target': 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.813 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[81fe8437-12d0-4972-ab20-bc041aed5771]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap42899517-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:74:09:98'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 321], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1077271, 'reachable_time': 27329, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 350770, 'error': None, 'target': 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.821 227766 DEBUG nova.compute.manager [req-35c1c4a6-99b6-455f-8c4f-15dd2382a1e4 req-37a5d416-ff29-4a82-8e30-0a539e92f6b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.821 227766 DEBUG oslo_concurrency.lockutils [req-35c1c4a6-99b6-455f-8c4f-15dd2382a1e4 req-37a5d416-ff29-4a82-8e30-0a539e92f6b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.821 227766 DEBUG oslo_concurrency.lockutils [req-35c1c4a6-99b6-455f-8c4f-15dd2382a1e4 req-37a5d416-ff29-4a82-8e30-0a539e92f6b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.822 227766 DEBUG oslo_concurrency.lockutils [req-35c1c4a6-99b6-455f-8c4f-15dd2382a1e4 req-37a5d416-ff29-4a82-8e30-0a539e92f6b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.822 227766 DEBUG nova.compute.manager [req-35c1c4a6-99b6-455f-8c4f-15dd2382a1e4 req-37a5d416-ff29-4a82-8e30-0a539e92f6b3 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Processing event network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.845 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a090ac8a-e54b-4498-8ac7-f99cc715d102]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.903 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a8279343-d05a-48de-b3c0-76d832f71932]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.905 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42899517-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.906 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.906 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap42899517-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:14:37 np0005593234 NetworkManager[48942]: <info>  [1769166877.9090] manager: (tap42899517-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/503)
Jan 23 06:14:37 np0005593234 kernel: tap42899517-90: entered promiscuous mode
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.910 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap42899517-90, col_values=(('external_ids', {'iface-id': '82ae71e6-e83a-4506-8f0f-261163163937'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:14:37 np0005593234 ovn_controller[134547]: 2026-01-23T11:14:37Z|01045|binding|INFO|Releasing lport 82ae71e6-e83a-4506-8f0f-261163163937 from this chassis (sb_readonly=0)
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.921 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:37 np0005593234 nova_compute[227762]: 2026-01-23 11:14:37.926 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.926 144381 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/42899517-91b9-42e3-96a7-29180211a7a4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/42899517-91b9-42e3-96a7-29180211a7a4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.927 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a5793a79-1cca-4a1b-b831-c193ff0f9ac9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.928 144381 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: global
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    log         /dev/log local0 debug
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    log-tag     haproxy-metadata-proxy-42899517-91b9-42e3-96a7-29180211a7a4
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    user        root
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    group       root
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    maxconn     1024
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    pidfile     /var/lib/neutron/external/pids/42899517-91b9-42e3-96a7-29180211a7a4.pid.haproxy
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    daemon
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: defaults
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    log global
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    mode http
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    option httplog
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    option dontlognull
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    option http-server-close
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    option forwardfor
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    retries                 3
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    timeout http-request    30s
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    timeout connect         30s
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    timeout client          32s
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    timeout server          32s
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    timeout http-keep-alive 30s
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: listen listener
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    bind 169.254.169.254:80
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    server metadata /var/lib/neutron/metadata_proxy
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]:    http-request add-header X-OVN-Network-ID 42899517-91b9-42e3-96a7-29180211a7a4
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Jan 23 06:14:37 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:37.930 144381 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'env', 'PROCESS_TAG=haproxy-42899517-91b9-42e3-96a7-29180211a7a4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/42899517-91b9-42e3-96a7-29180211a7a4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Jan 23 06:14:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.154 227766 DEBUG nova.compute.manager [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.155 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166878.1539328, ed71c532-711c-49b9-b0d5-eaf409f0bc76 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.155 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] VM Started (Lifecycle Event)#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.159 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.162 227766 INFO nova.virt.libvirt.driver [-] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Instance spawned successfully.#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.163 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.181 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.186 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.190 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.190 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.191 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.191 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.192 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.192 227766 DEBUG nova.virt.libvirt.driver [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.224 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.224 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166878.1550915, ed71c532-711c-49b9-b0d5-eaf409f0bc76 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.225 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] VM Paused (Lifecycle Event)#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.256 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.260 227766 DEBUG nova.virt.driver [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] Emitting event <LifecycleEvent: 1769166878.1589108, ed71c532-711c-49b9-b0d5-eaf409f0bc76 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.260 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] VM Resumed (Lifecycle Event)#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.272 227766 INFO nova.compute.manager [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Took 5.64 seconds to spawn the instance on the hypervisor.#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.272 227766 DEBUG nova.compute.manager [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.281 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.285 227766 DEBUG nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:14:38 np0005593234 podman[350844]: 2026-01-23 11:14:38.316842182 +0000 UTC m=+0.048962004 container create d68d9dc2599cddaaf0e6f5ecd67dfa578a5df6bfb3041418f46467759391c0b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.332 227766 INFO nova.compute.manager [None req-0e1867b5-96ac-43dc-ba94-5c86435d7497 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Jan 23 06:14:38 np0005593234 systemd[1]: Started libpod-conmon-d68d9dc2599cddaaf0e6f5ecd67dfa578a5df6bfb3041418f46467759391c0b9.scope.
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.372 227766 INFO nova.compute.manager [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Took 7.93 seconds to build instance.#033[00m
Jan 23 06:14:38 np0005593234 systemd[1]: Started libcrun container.
Jan 23 06:14:38 np0005593234 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3789a0a7a6730f1599c50184c1844bc60a5a27a5ab977c58834c07864b6bcd8c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 06:14:38 np0005593234 podman[350844]: 2026-01-23 11:14:38.292402577 +0000 UTC m=+0.024522419 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.397 227766 DEBUG oslo_concurrency.lockutils [None req-f59a52e5-2bd1-41c3-8907-b0642632febb 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:38 np0005593234 podman[350844]: 2026-01-23 11:14:38.400595033 +0000 UTC m=+0.132714885 container init d68d9dc2599cddaaf0e6f5ecd67dfa578a5df6bfb3041418f46467759391c0b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 06:14:38 np0005593234 podman[350844]: 2026-01-23 11:14:38.406038494 +0000 UTC m=+0.138158316 container start d68d9dc2599cddaaf0e6f5ecd67dfa578a5df6bfb3041418f46467759391c0b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.421 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.421 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.421 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:14:38 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[350860]: [NOTICE]   (350864) : New worker (350866) forked
Jan 23 06:14:38 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[350860]: [NOTICE]   (350864) : Loading success.
Jan 23 06:14:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:38.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.869 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.869 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquired lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.869 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Jan 23 06:14:38 np0005593234 nova_compute[227762]: 2026-01-23 11:14:38.869 227766 DEBUG nova.objects.instance [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lazy-loading 'info_cache' on Instance uuid ed71c532-711c-49b9-b0d5-eaf409f0bc76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:14:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:39.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:39 np0005593234 nova_compute[227762]: 2026-01-23 11:14:39.912 227766 DEBUG nova.compute.manager [req-b973e881-df7b-4d4d-b7ae-bff59778accd req-25735baa-3401-4fac-87aa-39fcc67a2209 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:14:39 np0005593234 nova_compute[227762]: 2026-01-23 11:14:39.912 227766 DEBUG oslo_concurrency.lockutils [req-b973e881-df7b-4d4d-b7ae-bff59778accd req-25735baa-3401-4fac-87aa-39fcc67a2209 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:39 np0005593234 nova_compute[227762]: 2026-01-23 11:14:39.913 227766 DEBUG oslo_concurrency.lockutils [req-b973e881-df7b-4d4d-b7ae-bff59778accd req-25735baa-3401-4fac-87aa-39fcc67a2209 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:39 np0005593234 nova_compute[227762]: 2026-01-23 11:14:39.913 227766 DEBUG oslo_concurrency.lockutils [req-b973e881-df7b-4d4d-b7ae-bff59778accd req-25735baa-3401-4fac-87aa-39fcc67a2209 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:39 np0005593234 nova_compute[227762]: 2026-01-23 11:14:39.913 227766 DEBUG nova.compute.manager [req-b973e881-df7b-4d4d-b7ae-bff59778accd req-25735baa-3401-4fac-87aa-39fcc67a2209 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] No waiting events found dispatching network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:14:39 np0005593234 nova_compute[227762]: 2026-01-23 11:14:39.914 227766 WARNING nova.compute.manager [req-b973e881-df7b-4d4d-b7ae-bff59778accd req-25735baa-3401-4fac-87aa-39fcc67a2209 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received unexpected event network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 for instance with vm_state active and task_state None.#033[00m
Jan 23 06:14:40 np0005593234 nova_compute[227762]: 2026-01-23 11:14:40.255 227766 DEBUG nova.network.neutron [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updating instance_info_cache with network_info: [{"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:14:40 np0005593234 nova_compute[227762]: 2026-01-23 11:14:40.299 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Releasing lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:14:40 np0005593234 nova_compute[227762]: 2026-01-23 11:14:40.299 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Jan 23 06:14:40 np0005593234 nova_compute[227762]: 2026-01-23 11:14:40.300 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:14:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:40.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:14:41 np0005593234 nova_compute[227762]: 2026-01-23 11:14:41.268 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:41.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:41 np0005593234 nova_compute[227762]: 2026-01-23 11:14:41.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:42 np0005593234 nova_compute[227762]: 2026-01-23 11:14:42.182 227766 DEBUG nova.compute.manager [req-57587ac9-1f5e-4b6b-b753-2a9fce6827c1 req-567aa453-535b-48cc-9293-113bb589e7c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-changed-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:14:42 np0005593234 nova_compute[227762]: 2026-01-23 11:14:42.182 227766 DEBUG nova.compute.manager [req-57587ac9-1f5e-4b6b-b753-2a9fce6827c1 req-567aa453-535b-48cc-9293-113bb589e7c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Refreshing instance network info cache due to event network-changed-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:14:42 np0005593234 nova_compute[227762]: 2026-01-23 11:14:42.182 227766 DEBUG oslo_concurrency.lockutils [req-57587ac9-1f5e-4b6b-b753-2a9fce6827c1 req-567aa453-535b-48cc-9293-113bb589e7c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:14:42 np0005593234 nova_compute[227762]: 2026-01-23 11:14:42.182 227766 DEBUG oslo_concurrency.lockutils [req-57587ac9-1f5e-4b6b-b753-2a9fce6827c1 req-567aa453-535b-48cc-9293-113bb589e7c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:14:42 np0005593234 nova_compute[227762]: 2026-01-23 11:14:42.182 227766 DEBUG nova.network.neutron [req-57587ac9-1f5e-4b6b-b753-2a9fce6827c1 req-567aa453-535b-48cc-9293-113bb589e7c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Refreshing network info cache for port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:14:42 np0005593234 nova_compute[227762]: 2026-01-23 11:14:42.205 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:14:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:42.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:42.926 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:42.927 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:14:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:14:42.928 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:14:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:43 np0005593234 nova_compute[227762]: 2026-01-23 11:14:43.536 227766 DEBUG nova.network.neutron [req-57587ac9-1f5e-4b6b-b753-2a9fce6827c1 req-567aa453-535b-48cc-9293-113bb589e7c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updated VIF entry in instance network info cache for port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Jan 23 06:14:43 np0005593234 nova_compute[227762]: 2026-01-23 11:14:43.537 227766 DEBUG nova.network.neutron [req-57587ac9-1f5e-4b6b-b753-2a9fce6827c1 req-567aa453-535b-48cc-9293-113bb589e7c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updating instance_info_cache with network_info: [{"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:14:43 np0005593234 nova_compute[227762]: 2026-01-23 11:14:43.556 227766 DEBUG oslo_concurrency.lockutils [req-57587ac9-1f5e-4b6b-b753-2a9fce6827c1 req-567aa453-535b-48cc-9293-113bb589e7c0 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:14:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:43.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:43 np0005593234 nova_compute[227762]: 2026-01-23 11:14:43.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:14:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3984677424' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:14:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:14:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3984677424' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:14:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:44.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:45.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:46 np0005593234 nova_compute[227762]: 2026-01-23 11:14:46.272 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:46 np0005593234 nova_compute[227762]: 2026-01-23 11:14:46.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:46 np0005593234 nova_compute[227762]: 2026-01-23 11:14:46.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:14:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:14:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:46.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:14:47 np0005593234 nova_compute[227762]: 2026-01-23 11:14:47.206 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:47.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:48.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:49.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:49 np0005593234 nova_compute[227762]: 2026-01-23 11:14:49.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:50 np0005593234 nova_compute[227762]: 2026-01-23 11:14:50.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:50.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:51 np0005593234 nova_compute[227762]: 2026-01-23 11:14:51.276 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:51.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:52 np0005593234 nova_compute[227762]: 2026-01-23 11:14:52.207 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:52 np0005593234 ovn_controller[134547]: 2026-01-23T11:14:52Z|00136|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1d:a1:56 10.100.0.9
Jan 23 06:14:52 np0005593234 ovn_controller[134547]: 2026-01-23T11:14:52Z|00137|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1d:a1:56 10.100.0.9
Jan 23 06:14:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:14:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:52.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:14:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:53.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:53 np0005593234 nova_compute[227762]: 2026-01-23 11:14:53.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:14:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:54.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:55.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:56 np0005593234 nova_compute[227762]: 2026-01-23 11:14:56.279 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #223. Immutable memtables: 0.
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.803878) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 143] Flushing memtable with next log file: 223
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166896803963, "job": 143, "event": "flush_started", "num_memtables": 1, "num_entries": 1098, "num_deletes": 251, "total_data_size": 2268996, "memory_usage": 2305112, "flush_reason": "Manual Compaction"}
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 143] Level-0 flush table #224: started
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166896813993, "cf_name": "default", "job": 143, "event": "table_file_creation", "file_number": 224, "file_size": 931973, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 106091, "largest_seqno": 107184, "table_properties": {"data_size": 928020, "index_size": 1604, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1285, "raw_key_size": 10798, "raw_average_key_size": 21, "raw_value_size": 919386, "raw_average_value_size": 1802, "num_data_blocks": 69, "num_entries": 510, "num_filter_entries": 510, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166811, "oldest_key_time": 1769166811, "file_creation_time": 1769166896, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 224, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 143] Flush lasted 10497 microseconds, and 5793 cpu microseconds.
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.814373) [db/flush_job.cc:967] [default] [JOB 143] Level-0 flush table #224: 931973 bytes OK
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.814405) [db/memtable_list.cc:519] [default] Level-0 commit table #224 started
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.816634) [db/memtable_list.cc:722] [default] Level-0 commit table #224: memtable #1 done
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.816657) EVENT_LOG_v1 {"time_micros": 1769166896816649, "job": 143, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.816678) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 143] Try to delete WAL files size 2263615, prev total WAL file size 2263615, number of live WAL files 2.
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000220.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.818283) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373732' seq:72057594037927935, type:22 .. '6D6772737461740034303233' seq:0, type:0; will stop at (end)
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 144] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 143 Base level 0, inputs: [224(910KB)], [222(15MB)]
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166896818705, "job": 144, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [224], "files_L6": [222], "score": -1, "input_data_size": 16763080, "oldest_snapshot_seqno": -1}
Jan 23 06:14:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:56.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 144] Generated table #225: 12129 keys, 13490922 bytes, temperature: kUnknown
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166896912917, "cf_name": "default", "job": 144, "event": "table_file_creation", "file_number": 225, "file_size": 13490922, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13416619, "index_size": 42894, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30341, "raw_key_size": 321346, "raw_average_key_size": 26, "raw_value_size": 13208951, "raw_average_value_size": 1089, "num_data_blocks": 1616, "num_entries": 12129, "num_filter_entries": 12129, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769166896, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 225, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.913454) [db/compaction/compaction_job.cc:1663] [default] [JOB 144] Compacted 1@0 + 1@6 files to L6 => 13490922 bytes
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.914754) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 177.3 rd, 142.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 15.1 +0.0 blob) out(12.9 +0.0 blob), read-write-amplify(32.5) write-amplify(14.5) OK, records in: 12611, records dropped: 482 output_compression: NoCompression
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.914771) EVENT_LOG_v1 {"time_micros": 1769166896914763, "job": 144, "event": "compaction_finished", "compaction_time_micros": 94552, "compaction_time_cpu_micros": 65487, "output_level": 6, "num_output_files": 1, "total_output_size": 13490922, "num_input_records": 12611, "num_output_records": 12129, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000224.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166896915136, "job": 144, "event": "table_file_deletion", "file_number": 224}
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000222.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166896918064, "job": 144, "event": "table_file_deletion", "file_number": 222}
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.818030) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.918111) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.918115) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.918117) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.918120) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:14:56 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:14:56.918122) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:14:57 np0005593234 nova_compute[227762]: 2026-01-23 11:14:57.211 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:14:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:57.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:57 np0005593234 podman[350935]: 2026-01-23 11:14:57.77127904 +0000 UTC m=+0.058262315 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:14:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:14:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:14:58.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:14:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:14:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:14:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:14:59.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:15:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:00.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:15:01 np0005593234 nova_compute[227762]: 2026-01-23 11:15:01.282 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:01.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:02 np0005593234 nova_compute[227762]: 2026-01-23 11:15:02.212 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:02.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:03.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:04.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:05.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:06 np0005593234 nova_compute[227762]: 2026-01-23 11:15:06.320 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:06 np0005593234 podman[351012]: 2026-01-23 11:15:06.788037829 +0000 UTC m=+0.087228051 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 06:15:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:06.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:07 np0005593234 nova_compute[227762]: 2026-01-23 11:15:07.214 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:07.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:08 np0005593234 nova_compute[227762]: 2026-01-23 11:15:08.759 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:15:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:08.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:15:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:09.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:15:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:10.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:15:11 np0005593234 nova_compute[227762]: 2026-01-23 11:15:11.324 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:11 np0005593234 ovn_controller[134547]: 2026-01-23T11:15:11Z|01046|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 23 06:15:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:11.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:12 np0005593234 nova_compute[227762]: 2026-01-23 11:15:12.216 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:12.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:15:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:13.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #226. Immutable memtables: 0.
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.017330) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 145] Flushing memtable with next log file: 226
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166914017392, "job": 145, "event": "flush_started", "num_memtables": 1, "num_entries": 404, "num_deletes": 251, "total_data_size": 457479, "memory_usage": 464928, "flush_reason": "Manual Compaction"}
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 145] Level-0 flush table #227: started
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166914021957, "cf_name": "default", "job": 145, "event": "table_file_creation", "file_number": 227, "file_size": 301788, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 107189, "largest_seqno": 107588, "table_properties": {"data_size": 299466, "index_size": 485, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5716, "raw_average_key_size": 18, "raw_value_size": 294876, "raw_average_value_size": 963, "num_data_blocks": 21, "num_entries": 306, "num_filter_entries": 306, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166898, "oldest_key_time": 1769166898, "file_creation_time": 1769166914, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 227, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 145] Flush lasted 4651 microseconds, and 1623 cpu microseconds.
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.021992) [db/flush_job.cc:967] [default] [JOB 145] Level-0 flush table #227: 301788 bytes OK
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.022007) [db/memtable_list.cc:519] [default] Level-0 commit table #227 started
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.023293) [db/memtable_list.cc:722] [default] Level-0 commit table #227: memtable #1 done
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.023306) EVENT_LOG_v1 {"time_micros": 1769166914023301, "job": 145, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.023324) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 145] Try to delete WAL files size 454880, prev total WAL file size 454880, number of live WAL files 2.
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000223.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.023864) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730039353338' seq:72057594037927935, type:22 .. '7061786F730039373930' seq:0, type:0; will stop at (end)
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 146] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 145 Base level 0, inputs: [227(294KB)], [225(12MB)]
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166914023902, "job": 146, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [227], "files_L6": [225], "score": -1, "input_data_size": 13792710, "oldest_snapshot_seqno": -1}
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 146] Generated table #228: 11925 keys, 11803120 bytes, temperature: kUnknown
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166914081546, "cf_name": "default", "job": 146, "event": "table_file_creation", "file_number": 228, "file_size": 11803120, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11731606, "index_size": 40617, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29829, "raw_key_size": 317808, "raw_average_key_size": 26, "raw_value_size": 11528659, "raw_average_value_size": 966, "num_data_blocks": 1514, "num_entries": 11925, "num_filter_entries": 11925, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769166914, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 228, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.082016) [db/compaction/compaction_job.cc:1663] [default] [JOB 146] Compacted 1@0 + 1@6 files to L6 => 11803120 bytes
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.083262) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 238.5 rd, 204.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 12.9 +0.0 blob) out(11.3 +0.0 blob), read-write-amplify(84.8) write-amplify(39.1) OK, records in: 12435, records dropped: 510 output_compression: NoCompression
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.083278) EVENT_LOG_v1 {"time_micros": 1769166914083270, "job": 146, "event": "compaction_finished", "compaction_time_micros": 57821, "compaction_time_cpu_micros": 31388, "output_level": 6, "num_output_files": 1, "total_output_size": 11803120, "num_input_records": 12435, "num_output_records": 11925, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000227.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166914083435, "job": 146, "event": "table_file_deletion", "file_number": 227}
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000225.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769166914085763, "job": 146, "event": "table_file_deletion", "file_number": 225}
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.023785) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.085952) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.085961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.085963) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.085966) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:15:14 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:15:14.085969) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:15:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:15:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:14.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:15:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:15.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:16 np0005593234 nova_compute[227762]: 2026-01-23 11:15:16.327 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:16 np0005593234 nova_compute[227762]: 2026-01-23 11:15:16.374 227766 DEBUG oslo_concurrency.lockutils [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76" by "nova.compute.manager.ComputeManager.shelve_offload_instance.<locals>.do_shelve_offload_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:15:16 np0005593234 nova_compute[227762]: 2026-01-23 11:15:16.374 227766 DEBUG oslo_concurrency.lockutils [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76" acquired by "nova.compute.manager.ComputeManager.shelve_offload_instance.<locals>.do_shelve_offload_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:15:16 np0005593234 nova_compute[227762]: 2026-01-23 11:15:16.375 227766 INFO nova.compute.manager [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Shelve offloading#033[00m
Jan 23 06:15:16 np0005593234 nova_compute[227762]: 2026-01-23 11:15:16.406 227766 DEBUG nova.virt.libvirt.driver [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Jan 23 06:15:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:16.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:17 np0005593234 nova_compute[227762]: 2026-01-23 11:15:17.218 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:15:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:17.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:15:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:18.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:18 np0005593234 kernel: tap617c9ef0-df (unregistering): left promiscuous mode
Jan 23 06:15:18 np0005593234 NetworkManager[48942]: <info>  [1769166918.9699] device (tap617c9ef0-df): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 06:15:18 np0005593234 nova_compute[227762]: 2026-01-23 11:15:18.978 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:18 np0005593234 ovn_controller[134547]: 2026-01-23T11:15:18Z|01047|binding|INFO|Releasing lport 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 from this chassis (sb_readonly=0)
Jan 23 06:15:18 np0005593234 ovn_controller[134547]: 2026-01-23T11:15:18Z|01048|binding|INFO|Setting lport 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 down in Southbound
Jan 23 06:15:18 np0005593234 ovn_controller[134547]: 2026-01-23T11:15:18Z|01049|binding|INFO|Removing iface tap617c9ef0-df ovn-installed in OVS
Jan 23 06:15:18 np0005593234 nova_compute[227762]: 2026-01-23 11:15:18.981 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:18 np0005593234 nova_compute[227762]: 2026-01-23 11:15:18.997 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:19 np0005593234 systemd[1]: machine-qemu\x2d116\x2dinstance\x2d000000e0.scope: Deactivated successfully.
Jan 23 06:15:19 np0005593234 systemd[1]: machine-qemu\x2d116\x2dinstance\x2d000000e0.scope: Consumed 14.893s CPU time.
Jan 23 06:15:19 np0005593234 systemd-machined[195626]: Machine qemu-116-instance-000000e0 terminated.
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.086 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:a1:56 10.100.0.9'], port_security=['fa:16:3e:1d:a1:56 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'ed71c532-711c-49b9-b0d5-eaf409f0bc76', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42899517-91b9-42e3-96a7-29180211a7a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a245f7970f14fffa60af2ff972b4bfd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2243181a-ba78-49d0-a310-35ec5fa364b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.235'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef3519b3-9b5b-4b40-8630-d2487396abc0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.087 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 in datapath 42899517-91b9-42e3-96a7-29180211a7a4 unbound from our chassis#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.089 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42899517-91b9-42e3-96a7-29180211a7a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.090 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a419593e-6233-4e9e-a2bb-ad54f5ced7eb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.091 144381 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 namespace which is not needed anymore#033[00m
Jan 23 06:15:19 np0005593234 NetworkManager[48942]: <info>  [1769166919.2022] manager: (tap617c9ef0-df): new Tun device (/org/freedesktop/NetworkManager/Devices/504)
Jan 23 06:15:19 np0005593234 kernel: tap617c9ef0-df: entered promiscuous mode
Jan 23 06:15:19 np0005593234 systemd-udevd[351050]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 06:15:19 np0005593234 kernel: tap617c9ef0-df (unregistering): left promiscuous mode
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.207 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:19 np0005593234 ovn_controller[134547]: 2026-01-23T11:15:19Z|01050|binding|INFO|Claiming lport 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 for this chassis.
Jan 23 06:15:19 np0005593234 ovn_controller[134547]: 2026-01-23T11:15:19Z|01051|binding|INFO|617c9ef0-df2c-4bd2-8d4c-fafc1723eb55: Claiming fa:16:3e:1d:a1:56 10.100.0.9
Jan 23 06:15:19 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[350860]: [NOTICE]   (350864) : haproxy version is 2.8.14-c23fe91
Jan 23 06:15:19 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[350860]: [NOTICE]   (350864) : path to executable is /usr/sbin/haproxy
Jan 23 06:15:19 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[350860]: [WARNING]  (350864) : Exiting Master process...
Jan 23 06:15:19 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[350860]: [ALERT]    (350864) : Current worker (350866) exited with code 143 (Terminated)
Jan 23 06:15:19 np0005593234 neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4[350860]: [WARNING]  (350864) : All workers exited. Exiting... (0)
Jan 23 06:15:19 np0005593234 systemd[1]: libpod-d68d9dc2599cddaaf0e6f5ecd67dfa578a5df6bfb3041418f46467759391c0b9.scope: Deactivated successfully.
Jan 23 06:15:19 np0005593234 ovn_controller[134547]: 2026-01-23T11:15:19Z|01052|binding|INFO|Setting lport 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 ovn-installed in OVS
Jan 23 06:15:19 np0005593234 ovn_controller[134547]: 2026-01-23T11:15:19Z|01053|if_status|INFO|Dropped 2 log messages in last 3407 seconds (most recently, 3407 seconds ago) due to excessive rate
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.226 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:19 np0005593234 ovn_controller[134547]: 2026-01-23T11:15:19Z|01054|if_status|INFO|Not setting lport 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 down as sb is readonly
Jan 23 06:15:19 np0005593234 podman[351071]: 2026-01-23 11:15:19.227556896 +0000 UTC m=+0.046340821 container died d68d9dc2599cddaaf0e6f5ecd67dfa578a5df6bfb3041418f46467759391c0b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 06:15:19 np0005593234 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d68d9dc2599cddaaf0e6f5ecd67dfa578a5df6bfb3041418f46467759391c0b9-userdata-shm.mount: Deactivated successfully.
Jan 23 06:15:19 np0005593234 systemd[1]: var-lib-containers-storage-overlay-3789a0a7a6730f1599c50184c1844bc60a5a27a5ab977c58834c07864b6bcd8c-merged.mount: Deactivated successfully.
Jan 23 06:15:19 np0005593234 podman[351071]: 2026-01-23 11:15:19.256073659 +0000 UTC m=+0.074857584 container cleanup d68d9dc2599cddaaf0e6f5ecd67dfa578a5df6bfb3041418f46467759391c0b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 06:15:19 np0005593234 ovn_controller[134547]: 2026-01-23T11:15:19Z|01055|binding|INFO|Releasing lport 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 from this chassis (sb_readonly=0)
Jan 23 06:15:19 np0005593234 systemd[1]: libpod-conmon-d68d9dc2599cddaaf0e6f5ecd67dfa578a5df6bfb3041418f46467759391c0b9.scope: Deactivated successfully.
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.270 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:a1:56 10.100.0.9'], port_security=['fa:16:3e:1d:a1:56 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'ed71c532-711c-49b9-b0d5-eaf409f0bc76', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42899517-91b9-42e3-96a7-29180211a7a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a245f7970f14fffa60af2ff972b4bfd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2243181a-ba78-49d0-a310-35ec5fa364b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.235'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef3519b3-9b5b-4b40-8630-d2487396abc0, chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.278 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1d:a1:56 10.100.0.9'], port_security=['fa:16:3e:1d:a1:56 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'ed71c532-711c-49b9-b0d5-eaf409f0bc76', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42899517-91b9-42e3-96a7-29180211a7a4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3a245f7970f14fffa60af2ff972b4bfd', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2243181a-ba78-49d0-a310-35ec5fa364b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.235'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ef3519b3-9b5b-4b40-8630-d2487396abc0, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>], logical_port=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f09f40399a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.281 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:19 np0005593234 podman[351101]: 2026-01-23 11:15:19.310633677 +0000 UTC m=+0.036178413 container remove d68d9dc2599cddaaf0e6f5ecd67dfa578a5df6bfb3041418f46467759391c0b9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.315 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[eaaafedf-5b47-4865-bb8d-03dec3ef2de7]: (4, ('Fri Jan 23 11:15:19 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 (d68d9dc2599cddaaf0e6f5ecd67dfa578a5df6bfb3041418f46467759391c0b9)\nd68d9dc2599cddaaf0e6f5ecd67dfa578a5df6bfb3041418f46467759391c0b9\nFri Jan 23 11:15:19 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 (d68d9dc2599cddaaf0e6f5ecd67dfa578a5df6bfb3041418f46467759391c0b9)\nd68d9dc2599cddaaf0e6f5ecd67dfa578a5df6bfb3041418f46467759391c0b9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.317 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[c32ee0d9-ed89-4536-9b70-27bb81d5d141]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.319 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap42899517-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:15:19 np0005593234 kernel: tap42899517-90: left promiscuous mode
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.321 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.335 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.338 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[a3977656-3437-4aa2-bac4-15cab751bbcb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.355 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[e8e1970e-bdde-4832-9b8d-d12406758288]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.356 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[2d368c15-850d-4e41-845f-500631ff5513]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.370 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[eec3be44-1950-47d2-b4d2-dfc439dd407e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1077263, 'reachable_time': 19787, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 351120, 'error': None, 'target': 'ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:15:19 np0005593234 systemd[1]: run-netns-ovnmeta\x2d42899517\x2d91b9\x2d42e3\x2d96a7\x2d29180211a7a4.mount: Deactivated successfully.
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.374 144923 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-42899517-91b9-42e3-96a7-29180211a7a4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.375 144923 DEBUG oslo.privsep.daemon [-] privsep: reply[706eecfa-0abd-4e0b-8806-0872d5ae3a2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.376 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 in datapath 42899517-91b9-42e3-96a7-29180211a7a4 unbound from our chassis#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.377 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42899517-91b9-42e3-96a7-29180211a7a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.377 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[7d938e21-3279-4f88-8f86-4deba87c9a4c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.377 144381 INFO neutron.agent.ovn.metadata.agent [-] Port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 in datapath 42899517-91b9-42e3-96a7-29180211a7a4 unbound from our chassis#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.378 144381 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42899517-91b9-42e3-96a7-29180211a7a4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Jan 23 06:15:19 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:19.378 232070 DEBUG oslo.privsep.daemon [-] privsep: reply[13389cbf-42c9-422f-9926-91f9dcc4c536]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.422 227766 INFO nova.virt.libvirt.driver [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Instance shutdown successfully after 3 seconds.#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.427 227766 INFO nova.virt.libvirt.driver [-] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Instance destroyed successfully.#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.428 227766 DEBUG nova.objects.instance [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'numa_topology' on Instance uuid ed71c532-711c-49b9-b0d5-eaf409f0bc76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.635 227766 DEBUG nova.compute.manager [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.641 227766 DEBUG nova.compute.manager [req-71634da6-456b-4e08-b8ff-bb964074a847 req-e905765f-1bad-4eec-8b49-63216a0197fc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-vif-unplugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.641 227766 DEBUG oslo_concurrency.lockutils [req-71634da6-456b-4e08-b8ff-bb964074a847 req-e905765f-1bad-4eec-8b49-63216a0197fc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.642 227766 DEBUG oslo_concurrency.lockutils [req-71634da6-456b-4e08-b8ff-bb964074a847 req-e905765f-1bad-4eec-8b49-63216a0197fc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.642 227766 DEBUG oslo_concurrency.lockutils [req-71634da6-456b-4e08-b8ff-bb964074a847 req-e905765f-1bad-4eec-8b49-63216a0197fc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.642 227766 DEBUG nova.compute.manager [req-71634da6-456b-4e08-b8ff-bb964074a847 req-e905765f-1bad-4eec-8b49-63216a0197fc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] No waiting events found dispatching network-vif-unplugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.643 227766 WARNING nova.compute.manager [req-71634da6-456b-4e08-b8ff-bb964074a847 req-e905765f-1bad-4eec-8b49-63216a0197fc 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received unexpected event network-vif-unplugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 for instance with vm_state active and task_state shelving.#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.644 227766 DEBUG oslo_concurrency.lockutils [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.645 227766 DEBUG oslo_concurrency.lockutils [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquired lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.645 227766 DEBUG nova.network.neutron [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Jan 23 06:15:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:19.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:19 np0005593234 nova_compute[227762]: 2026-01-23 11:15:19.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:20.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:21 np0005593234 nova_compute[227762]: 2026-01-23 11:15:21.330 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:21.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:22 np0005593234 nova_compute[227762]: 2026-01-23 11:15:22.220 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:22.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:22.979 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=107, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=106) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:15:22 np0005593234 nova_compute[227762]: 2026-01-23 11:15:22.980 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:22 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:22.981 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:15:22 np0005593234 nova_compute[227762]: 2026-01-23 11:15:22.982 227766 DEBUG nova.compute.manager [req-6bda8728-8623-4542-86af-bd8fbb3b035f req-a9cf598e-d602-4976-a589-bade475a79d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:15:22 np0005593234 nova_compute[227762]: 2026-01-23 11:15:22.983 227766 DEBUG oslo_concurrency.lockutils [req-6bda8728-8623-4542-86af-bd8fbb3b035f req-a9cf598e-d602-4976-a589-bade475a79d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:15:22 np0005593234 nova_compute[227762]: 2026-01-23 11:15:22.983 227766 DEBUG oslo_concurrency.lockutils [req-6bda8728-8623-4542-86af-bd8fbb3b035f req-a9cf598e-d602-4976-a589-bade475a79d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:15:22 np0005593234 nova_compute[227762]: 2026-01-23 11:15:22.983 227766 DEBUG oslo_concurrency.lockutils [req-6bda8728-8623-4542-86af-bd8fbb3b035f req-a9cf598e-d602-4976-a589-bade475a79d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:15:22 np0005593234 nova_compute[227762]: 2026-01-23 11:15:22.983 227766 DEBUG nova.compute.manager [req-6bda8728-8623-4542-86af-bd8fbb3b035f req-a9cf598e-d602-4976-a589-bade475a79d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] No waiting events found dispatching network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Jan 23 06:15:22 np0005593234 nova_compute[227762]: 2026-01-23 11:15:22.983 227766 WARNING nova.compute.manager [req-6bda8728-8623-4542-86af-bd8fbb3b035f req-a9cf598e-d602-4976-a589-bade475a79d2 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received unexpected event network-vif-plugged-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 for instance with vm_state active and task_state shelving.#033[00m
Jan 23 06:15:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:23 np0005593234 nova_compute[227762]: 2026-01-23 11:15:23.403 227766 DEBUG nova.network.neutron [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updating instance_info_cache with network_info: [{"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Jan 23 06:15:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:23.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:23 np0005593234 nova_compute[227762]: 2026-01-23 11:15:23.770 227766 DEBUG oslo_concurrency.lockutils [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Releasing lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Jan 23 06:15:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:24.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:25.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:26 np0005593234 nova_compute[227762]: 2026-01-23 11:15:26.334 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:15:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:26.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:15:27 np0005593234 nova_compute[227762]: 2026-01-23 11:15:27.221 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:27 np0005593234 nova_compute[227762]: 2026-01-23 11:15:27.344 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:27 np0005593234 nova_compute[227762]: 2026-01-23 11:15:27.345 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 06:15:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:15:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:27.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:15:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:28 np0005593234 nova_compute[227762]: 2026-01-23 11:15:28.519 227766 INFO nova.virt.libvirt.driver [-] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Instance destroyed successfully.#033[00m
Jan 23 06:15:28 np0005593234 nova_compute[227762]: 2026-01-23 11:15:28.519 227766 DEBUG nova.objects.instance [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lazy-loading 'resources' on Instance uuid ed71c532-711c-49b9-b0d5-eaf409f0bc76 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Jan 23 06:15:28 np0005593234 podman[351176]: 2026-01-23 11:15:28.769031368 +0000 UTC m=+0.066362548 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Jan 23 06:15:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:15:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:28.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:15:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:29.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:30.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:31 np0005593234 nova_compute[227762]: 2026-01-23 11:15:31.337 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:15:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:31.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:15:31 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:31.983 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '107'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:15:32 np0005593234 nova_compute[227762]: 2026-01-23 11:15:32.225 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:32 np0005593234 nova_compute[227762]: 2026-01-23 11:15:32.620 227766 DEBUG nova.virt.libvirt.vif [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T11:14:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestShelveInstance-server-2003932207',display_name='tempest-TestShelveInstance-server-2003932207',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testshelveinstance-server-2003932207',id=224,image_ref='',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBE3bhlUYE6XIKDWLP/uD+8jgtWoi2zAcS0lWBO+SzamqVUAvHDBegRP4BFxXqktx7WnHXLwe9Z4SStWrBiFMiHWsxXNyXjRJKpQMiCgvbWujMjyVx4RONf0TXgED6xft/g==',key_name='tempest-TestShelveInstance-609770395',keypairs=<?>,launch_index=0,launched_at=2026-01-23T11:14:38Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3a245f7970f14fffa60af2ff972b4bfd',ramdisk_id='',reservation_id='r-p0f4vjg5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',image_signature_verified='False',owner_project_name='tempest-TestShelveInstance-869807080',owner_user_name='tempest-TestShelveInstance-869807080-project-member'},tags=<?>,task_state='shelving',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T11:14:38Z,user_data=None,user_id='5d6a458f5d9345379b05f0cdb69a7b0f',uuid=ed71c532-711c-49b9-b0d5-eaf409f0bc76,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": 
"192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Jan 23 06:15:32 np0005593234 nova_compute[227762]: 2026-01-23 11:15:32.621 227766 DEBUG nova.network.os_vif_util [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converting VIF {"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": "br-int", "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap617c9ef0-df", "ovs_interfaceid": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Jan 23 06:15:32 np0005593234 nova_compute[227762]: 2026-01-23 11:15:32.622 227766 DEBUG nova.network.os_vif_util [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1d:a1:56,bridge_name='br-int',has_traffic_filtering=True,id=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap617c9ef0-df') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Jan 23 06:15:32 np0005593234 nova_compute[227762]: 2026-01-23 11:15:32.623 227766 DEBUG os_vif [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:a1:56,bridge_name='br-int',has_traffic_filtering=True,id=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap617c9ef0-df') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Jan 23 06:15:32 np0005593234 nova_compute[227762]: 2026-01-23 11:15:32.626 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:32 np0005593234 nova_compute[227762]: 2026-01-23 11:15:32.626 227766 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap617c9ef0-df, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:15:32 np0005593234 nova_compute[227762]: 2026-01-23 11:15:32.629 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:32 np0005593234 nova_compute[227762]: 2026-01-23 11:15:32.630 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:15:32 np0005593234 nova_compute[227762]: 2026-01-23 11:15:32.632 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:32 np0005593234 nova_compute[227762]: 2026-01-23 11:15:32.634 227766 INFO os_vif [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1d:a1:56,bridge_name='br-int',has_traffic_filtering=True,id=617c9ef0-df2c-4bd2-8d4c-fafc1723eb55,network=Network(42899517-91b9-42e3-96a7-29180211a7a4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap617c9ef0-df')#033[00m
Jan 23 06:15:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:15:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:32.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:15:32 np0005593234 nova_compute[227762]: 2026-01-23 11:15:32.937 227766 INFO nova.virt.libvirt.driver [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Deleting instance files /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76_del#033[00m
Jan 23 06:15:32 np0005593234 nova_compute[227762]: 2026-01-23 11:15:32.938 227766 INFO nova.virt.libvirt.driver [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Deletion of /var/lib/nova/instances/ed71c532-711c-49b9-b0d5-eaf409f0bc76_del complete#033[00m
Jan 23 06:15:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:15:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:33.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:15:33 np0005593234 nova_compute[227762]: 2026-01-23 11:15:33.803 227766 DEBUG nova.compute.manager [req-6886dd6a-c618-4dc0-b065-38272e884947 req-767f3613-1f4a-40fc-8ede-adca96740b47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Received event network-changed-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Jan 23 06:15:33 np0005593234 nova_compute[227762]: 2026-01-23 11:15:33.803 227766 DEBUG nova.compute.manager [req-6886dd6a-c618-4dc0-b065-38272e884947 req-767f3613-1f4a-40fc-8ede-adca96740b47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Refreshing instance network info cache due to event network-changed-617c9ef0-df2c-4bd2-8d4c-fafc1723eb55. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Jan 23 06:15:33 np0005593234 nova_compute[227762]: 2026-01-23 11:15:33.804 227766 DEBUG oslo_concurrency.lockutils [req-6886dd6a-c618-4dc0-b065-38272e884947 req-767f3613-1f4a-40fc-8ede-adca96740b47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquiring lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Jan 23 06:15:33 np0005593234 nova_compute[227762]: 2026-01-23 11:15:33.804 227766 DEBUG oslo_concurrency.lockutils [req-6886dd6a-c618-4dc0-b065-38272e884947 req-767f3613-1f4a-40fc-8ede-adca96740b47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Acquired lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Jan 23 06:15:33 np0005593234 nova_compute[227762]: 2026-01-23 11:15:33.804 227766 DEBUG nova.network.neutron [req-6886dd6a-c618-4dc0-b065-38272e884947 req-767f3613-1f4a-40fc-8ede-adca96740b47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Refreshing network info cache for port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Jan 23 06:15:33 np0005593234 nova_compute[227762]: 2026-01-23 11:15:33.933 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:15:34 np0005593234 nova_compute[227762]: 2026-01-23 11:15:34.221 227766 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769166919.2207842, ed71c532-711c-49b9-b0d5-eaf409f0bc76 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Jan 23 06:15:34 np0005593234 nova_compute[227762]: 2026-01-23 11:15:34.221 227766 INFO nova.compute.manager [-] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] VM Stopped (Lifecycle Event)#033[00m
Jan 23 06:15:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:34.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:35.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:36 np0005593234 nova_compute[227762]: 2026-01-23 11:15:36.620 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:15:36 np0005593234 nova_compute[227762]: 2026-01-23 11:15:36.620 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:15:36 np0005593234 nova_compute[227762]: 2026-01-23 11:15:36.620 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:15:36 np0005593234 nova_compute[227762]: 2026-01-23 11:15:36.621 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:15:36 np0005593234 nova_compute[227762]: 2026-01-23 11:15:36.621 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:15:36 np0005593234 nova_compute[227762]: 2026-01-23 11:15:36.653 227766 DEBUG nova.compute.manager [None req-36e2ba14-32f6-4f40-b1ac-3b0d3ddacebd - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Jan 23 06:15:36 np0005593234 nova_compute[227762]: 2026-01-23 11:15:36.657 227766 DEBUG nova.compute.manager [None req-36e2ba14-32f6-4f40-b1ac-3b0d3ddacebd - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: active, current task_state: shelving, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Jan 23 06:15:36 np0005593234 nova_compute[227762]: 2026-01-23 11:15:36.706 227766 INFO nova.compute.manager [None req-36e2ba14-32f6-4f40-b1ac-3b0d3ddacebd - - - - - -] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] During sync_power_state the instance has a pending task (shelving). Skip.#033[00m
Jan 23 06:15:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:15:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:36.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:15:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:15:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1983409774' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:15:37 np0005593234 nova_compute[227762]: 2026-01-23 11:15:37.066 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:15:37 np0005593234 podman[351384]: 2026-01-23 11:15:37.16193578 +0000 UTC m=+0.080644615 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, container_name=ovn_controller)
Jan 23 06:15:37 np0005593234 nova_compute[227762]: 2026-01-23 11:15:37.228 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:37 np0005593234 nova_compute[227762]: 2026-01-23 11:15:37.244 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:15:37 np0005593234 nova_compute[227762]: 2026-01-23 11:15:37.245 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4082MB free_disk=20.98813247680664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 06:15:37 np0005593234 nova_compute[227762]: 2026-01-23 11:15:37.245 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:15:37 np0005593234 nova_compute[227762]: 2026-01-23 11:15:37.245 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:15:37 np0005593234 podman[351437]: 2026-01-23 11:15:37.280545802 +0000 UTC m=+0.076239627 container exec 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.39.3, org.label-schema.schema-version=1.0, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250507, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=reef, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 06:15:37 np0005593234 podman[351437]: 2026-01-23 11:15:37.370241049 +0000 UTC m=+0.165934874 container exec_died 0a3f17f560c2583915d493d3553226a91dacf73d8b187d13f75acdf1693c595f (image=quay.io/ceph/ceph@sha256:1b9158ce28975f95def6a0ad459fa19f1336506074267a4b47c1bd914a00fec0, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-mon-compute-2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=6b0e988052ec84cf2d4a54ff9bbbc5e720b621ad, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250507, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=reef, io.buildah.version=1.39.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 23 06:15:37 np0005593234 nova_compute[227762]: 2026-01-23 11:15:37.568 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Instance ed71c532-711c-49b9-b0d5-eaf409f0bc76 actively managed on this compute host and has allocations in placement: {'resources': {'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 06:15:37 np0005593234 nova_compute[227762]: 2026-01-23 11:15:37.569 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 06:15:37 np0005593234 nova_compute[227762]: 2026-01-23 11:15:37.569 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 06:15:37 np0005593234 nova_compute[227762]: 2026-01-23 11:15:37.629 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:15:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:15:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:37.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:15:37 np0005593234 podman[351593]: 2026-01-23 11:15:37.975838383 +0000 UTC m=+0.067176004 container exec 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 06:15:37 np0005593234 podman[351593]: 2026-01-23 11:15:37.984958277 +0000 UTC m=+0.076295888 container exec_died 2904f500d634145a9a04e80cefd65012412a4635cb27b4256030a1dffb7ee120 (image=quay.io/ceph/haproxy:2.3, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-haproxy-rgw-default-compute-2-xmknsp)
Jan 23 06:15:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:38 np0005593234 nova_compute[227762]: 2026-01-23 11:15:38.046 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:15:38 np0005593234 podman[351660]: 2026-01-23 11:15:38.207372378 +0000 UTC m=+0.054396474 container exec 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, architecture=x86_64, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, distribution-scope=public, release=1793, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.expose-services=, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, io.buildah.version=1.28.2, description=keepalived for Ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Jan 23 06:15:38 np0005593234 podman[351660]: 2026-01-23 11:15:38.218993042 +0000 UTC m=+0.066017158 container exec_died 1094da710442be90d7f78c5fa0b61d147d7e051ee389a23d0be525721d430b72 (image=quay.io/ceph/keepalived:2.2.4, name=ceph-e1533653-0a5a-584c-b34b-8689f0d32e77-keepalived-rgw-default-compute-2-tkmlem, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=keepalived, distribution-scope=public, io.buildah.version=1.28.2, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, architecture=x86_64, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9)
Jan 23 06:15:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:15:38 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2393423665' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:15:38 np0005593234 nova_compute[227762]: 2026-01-23 11:15:38.499 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:15:38 np0005593234 nova_compute[227762]: 2026-01-23 11:15:38.510 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 06:15:38 np0005593234 nova_compute[227762]: 2026-01-23 11:15:38.563 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 06:15:38 np0005593234 nova_compute[227762]: 2026-01-23 11:15:38.699 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 06:15:38 np0005593234 nova_compute[227762]: 2026-01-23 11:15:38.700 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.455s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:15:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:15:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:38.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:15:38 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:15:38 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:15:39 np0005593234 nova_compute[227762]: 2026-01-23 11:15:39.015 227766 DEBUG nova.network.neutron [req-6886dd6a-c618-4dc0-b065-38272e884947 req-767f3613-1f4a-40fc-8ede-adca96740b47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updated VIF entry in instance network info cache for port 617c9ef0-df2c-4bd2-8d4c-fafc1723eb55. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 06:15:39 np0005593234 nova_compute[227762]: 2026-01-23 11:15:39.017 227766 DEBUG nova.network.neutron [req-6886dd6a-c618-4dc0-b065-38272e884947 req-767f3613-1f4a-40fc-8ede-adca96740b47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] [instance: ed71c532-711c-49b9-b0d5-eaf409f0bc76] Updating instance_info_cache with network_info: [{"id": "617c9ef0-df2c-4bd2-8d4c-fafc1723eb55", "address": "fa:16:3e:1d:a1:56", "network": {"id": "42899517-91b9-42e3-96a7-29180211a7a4", "bridge": null, "label": "tempest-TestShelveInstance-1168103236-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.235", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3a245f7970f14fffa60af2ff972b4bfd", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "unbound", "details": {}, "devname": "tap617c9ef0-df", "ovs_interfaceid": null, "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 06:15:39 np0005593234 nova_compute[227762]: 2026-01-23 11:15:39.198 227766 DEBUG oslo_concurrency.lockutils [req-6886dd6a-c618-4dc0-b065-38272e884947 req-767f3613-1f4a-40fc-8ede-adca96740b47 010960fbe58245b384c2cbebe84d3b1f 87ac1761717c4b48bea28f65374beaf8 - - default default] Releasing lock "refresh_cache-ed71c532-711c-49b9-b0d5-eaf409f0bc76" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 06:15:39 np0005593234 nova_compute[227762]: 2026-01-23 11:15:39.313 227766 INFO nova.scheduler.client.report [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Deleted allocations for instance ed71c532-711c-49b9-b0d5-eaf409f0bc76
Jan 23 06:15:39 np0005593234 nova_compute[227762]: 2026-01-23 11:15:39.462 227766 DEBUG oslo_concurrency.lockutils [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:15:39 np0005593234 nova_compute[227762]: 2026-01-23 11:15:39.463 227766 DEBUG oslo_concurrency.lockutils [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:15:39 np0005593234 nova_compute[227762]: 2026-01-23 11:15:39.546 227766 DEBUG oslo_concurrency.processutils [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 06:15:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:15:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:39.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:15:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:15:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1980539543' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:15:39 np0005593234 nova_compute[227762]: 2026-01-23 11:15:39.984 227766 DEBUG oslo_concurrency.processutils [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 06:15:39 np0005593234 nova_compute[227762]: 2026-01-23 11:15:39.993 227766 DEBUG nova.compute.provider_tree [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 06:15:40 np0005593234 nova_compute[227762]: 2026-01-23 11:15:40.039 227766 DEBUG nova.scheduler.client.report [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 06:15:40 np0005593234 nova_compute[227762]: 2026-01-23 11:15:40.070 227766 DEBUG oslo_concurrency.lockutils [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:15:40 np0005593234 nova_compute[227762]: 2026-01-23 11:15:40.140 227766 DEBUG oslo_concurrency.lockutils [None req-94e452c2-ad66-4b4a-985d-22d575e80ac9 5d6a458f5d9345379b05f0cdb69a7b0f 3a245f7970f14fffa60af2ff972b4bfd - - default default] Lock "ed71c532-711c-49b9-b0d5-eaf409f0bc76" "released" by "nova.compute.manager.ComputeManager.shelve_offload_instance.<locals>.do_shelve_offload_instance" :: held 23.766s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:15:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:15:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:15:40 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:15:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:15:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:40.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:15:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:41.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:42 np0005593234 nova_compute[227762]: 2026-01-23 11:15:42.229 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:15:42 np0005593234 nova_compute[227762]: 2026-01-23 11:15:42.665 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:15:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:42.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:42.929 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 06:15:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:42.932 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.004s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 06:15:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:15:42.932 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 06:15:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:43 np0005593234 nova_compute[227762]: 2026-01-23 11:15:43.511 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:15:43 np0005593234 nova_compute[227762]: 2026-01-23 11:15:43.511 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 06:15:43 np0005593234 nova_compute[227762]: 2026-01-23 11:15:43.512 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 06:15:43 np0005593234 nova_compute[227762]: 2026-01-23 11:15:43.563 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 06:15:43 np0005593234 nova_compute[227762]: 2026-01-23 11:15:43.563 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:15:43 np0005593234 nova_compute[227762]: 2026-01-23 11:15:43.564 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:15:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:15:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:43.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:15:43 np0005593234 nova_compute[227762]: 2026-01-23 11:15:43.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:15:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:15:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/267363425' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:15:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:15:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/267363425' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:15:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:15:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:44.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:15:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0) v1
Jan 23 06:15:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/765387698' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 06:15:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:45.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:46 np0005593234 nova_compute[227762]: 2026-01-23 11:15:46.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:15:46 np0005593234 nova_compute[227762]: 2026-01-23 11:15:46.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 06:15:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:15:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:46.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:15:47 np0005593234 nova_compute[227762]: 2026-01-23 11:15:47.230 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:15:47 np0005593234 nova_compute[227762]: 2026-01-23 11:15:47.667 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:15:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:47.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:15:48 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:15:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:48.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:49.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:49 np0005593234 nova_compute[227762]: 2026-01-23 11:15:49.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:15:50 np0005593234 nova_compute[227762]: 2026-01-23 11:15:50.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 06:15:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:15:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:50.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:15:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:51.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:52 np0005593234 nova_compute[227762]: 2026-01-23 11:15:52.231 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:15:52 np0005593234 nova_compute[227762]: 2026-01-23 11:15:52.669 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 06:15:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:15:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:52.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:15:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:15:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:53.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:15:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:54.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:55.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:15:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:56.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:15:57 np0005593234 nova_compute[227762]: 2026-01-23 11:15:57.233 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:57 np0005593234 nova_compute[227762]: 2026-01-23 11:15:57.671 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:15:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:15:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:57.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:15:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:15:58 np0005593234 ovn_controller[134547]: 2026-01-23T11:15:58Z|01056|memory_trim|INFO|Detected inactivity (last active 30009 ms ago): trimming memory
Jan 23 06:15:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:15:58.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:15:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:15:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:15:59.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:15:59 np0005593234 podman[351978]: 2026-01-23 11:15:59.748602839 +0000 UTC m=+0.047050373 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 06:16:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:16:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:00.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:16:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:01.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:02 np0005593234 nova_compute[227762]: 2026-01-23 11:16:02.236 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:02 np0005593234 nova_compute[227762]: 2026-01-23 11:16:02.673 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:02.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:03.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:03 np0005593234 nova_compute[227762]: 2026-01-23 11:16:03.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:03 np0005593234 nova_compute[227762]: 2026-01-23 11:16:03.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 06:16:04 np0005593234 nova_compute[227762]: 2026-01-23 11:16:04.600 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 06:16:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:04.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:05.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:06.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:07 np0005593234 nova_compute[227762]: 2026-01-23 11:16:07.237 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:07 np0005593234 nova_compute[227762]: 2026-01-23 11:16:07.675 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:07.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:07 np0005593234 podman[352051]: 2026-01-23 11:16:07.786459401 +0000 UTC m=+0.076595078 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Jan 23 06:16:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:08.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:09.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:10 np0005593234 nova_compute[227762]: 2026-01-23 11:16:10.593 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:10.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:11.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:12 np0005593234 nova_compute[227762]: 2026-01-23 11:16:12.240 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:12 np0005593234 nova_compute[227762]: 2026-01-23 11:16:12.677 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:12.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:13.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:14.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:15.741 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:16.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:17 np0005593234 nova_compute[227762]: 2026-01-23 11:16:17.242 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:17 np0005593234 nova_compute[227762]: 2026-01-23 11:16:17.680 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:17.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:18.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:19.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:20.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:21.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:22 np0005593234 nova_compute[227762]: 2026-01-23 11:16:22.288 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:22 np0005593234 nova_compute[227762]: 2026-01-23 11:16:22.682 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:22.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:23.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:24.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:25.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:26.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:27 np0005593234 nova_compute[227762]: 2026-01-23 11:16:27.291 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:27 np0005593234 nova_compute[227762]: 2026-01-23 11:16:27.685 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:27.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:28.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:16:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:29.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:16:30 np0005593234 podman[352140]: 2026-01-23 11:16:30.780283718 +0000 UTC m=+0.066647467 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:16:30 np0005593234 nova_compute[227762]: 2026-01-23 11:16:30.961 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:30.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:31.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:32 np0005593234 nova_compute[227762]: 2026-01-23 11:16:32.292 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:32 np0005593234 nova_compute[227762]: 2026-01-23 11:16:32.687 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:33.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:16:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:33.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:16:34 np0005593234 nova_compute[227762]: 2026-01-23 11:16:34.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:34 np0005593234 nova_compute[227762]: 2026-01-23 11:16:34.792 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:16:34 np0005593234 nova_compute[227762]: 2026-01-23 11:16:34.792 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:16:34 np0005593234 nova_compute[227762]: 2026-01-23 11:16:34.793 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:16:34 np0005593234 nova_compute[227762]: 2026-01-23 11:16:34.793 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:16:34 np0005593234 nova_compute[227762]: 2026-01-23 11:16:34.793 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:16:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:35.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:35 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:16:35 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1840573309' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:16:35 np0005593234 nova_compute[227762]: 2026-01-23 11:16:35.227 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:16:35 np0005593234 nova_compute[227762]: 2026-01-23 11:16:35.392 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:16:35 np0005593234 nova_compute[227762]: 2026-01-23 11:16:35.393 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4098MB free_disk=20.98813247680664GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:16:35 np0005593234 nova_compute[227762]: 2026-01-23 11:16:35.394 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:16:35 np0005593234 nova_compute[227762]: 2026-01-23 11:16:35.394 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:16:35 np0005593234 nova_compute[227762]: 2026-01-23 11:16:35.592 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:16:35 np0005593234 nova_compute[227762]: 2026-01-23 11:16:35.593 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:16:35 np0005593234 nova_compute[227762]: 2026-01-23 11:16:35.656 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:16:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:35.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:16:36 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/430365394' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:16:36 np0005593234 nova_compute[227762]: 2026-01-23 11:16:36.157 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.501s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:16:36 np0005593234 nova_compute[227762]: 2026-01-23 11:16:36.163 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:16:36 np0005593234 nova_compute[227762]: 2026-01-23 11:16:36.188 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:16:36 np0005593234 nova_compute[227762]: 2026-01-23 11:16:36.220 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:16:36 np0005593234 nova_compute[227762]: 2026-01-23 11:16:36.220 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.827s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:16:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:37.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:37 np0005593234 nova_compute[227762]: 2026-01-23 11:16:37.308 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:37 np0005593234 nova_compute[227762]: 2026-01-23 11:16:37.689 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:37.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:38 np0005593234 podman[352209]: 2026-01-23 11:16:38.827881044 +0000 UTC m=+0.115118152 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller)
Jan 23 06:16:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:39.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:39.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:41.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:41.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:42 np0005593234 nova_compute[227762]: 2026-01-23 11:16:42.221 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:42 np0005593234 nova_compute[227762]: 2026-01-23 11:16:42.221 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:16:42 np0005593234 nova_compute[227762]: 2026-01-23 11:16:42.222 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:16:42 np0005593234 nova_compute[227762]: 2026-01-23 11:16:42.243 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:16:42 np0005593234 nova_compute[227762]: 2026-01-23 11:16:42.243 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:42 np0005593234 nova_compute[227762]: 2026-01-23 11:16:42.243 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:42 np0005593234 nova_compute[227762]: 2026-01-23 11:16:42.309 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:42 np0005593234 nova_compute[227762]: 2026-01-23 11:16:42.691 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:16:42.929 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:16:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:16:42.930 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:16:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:16:42.930 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:16:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:43.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:43.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:16:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2406608558' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:16:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:16:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2406608558' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:16:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:45.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:45 np0005593234 nova_compute[227762]: 2026-01-23 11:16:45.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:45.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:16:46.162 144381 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=108, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a2:14:5d', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': 'e2:f0:52:36:44:3c'}, ipsec=False) old=SB_Global(nb_cfg=107) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Jan 23 06:16:46 np0005593234 nova_compute[227762]: 2026-01-23 11:16:46.162 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:46 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:16:46.163 144381 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Jan 23 06:16:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:47.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:47 np0005593234 nova_compute[227762]: 2026-01-23 11:16:47.311 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:47 np0005593234 nova_compute[227762]: 2026-01-23 11:16:47.693 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:47 np0005593234 nova_compute[227762]: 2026-01-23 11:16:47.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:47 np0005593234 nova_compute[227762]: 2026-01-23 11:16:47.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:16:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:47.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:49.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:16:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:16:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:16:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:16:49 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:16:49 np0005593234 nova_compute[227762]: 2026-01-23 11:16:49.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:49.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:51.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:51 np0005593234 nova_compute[227762]: 2026-01-23 11:16:51.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:51.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:52 np0005593234 nova_compute[227762]: 2026-01-23 11:16:52.314 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:16:52 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4146686762' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:16:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:16:52 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/4146686762' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:16:52 np0005593234 nova_compute[227762]: 2026-01-23 11:16:52.695 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:53.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:53.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:54 np0005593234 nova_compute[227762]: 2026-01-23 11:16:54.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:16:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:55.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:55 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:16:55.166 144381 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3ec410d4-99bb-47ec-9f70-86f8400b2621, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '108'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Jan 23 06:16:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:55.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:57.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:57 np0005593234 nova_compute[227762]: 2026-01-23 11:16:57.316 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:57 np0005593234 nova_compute[227762]: 2026-01-23 11:16:57.697 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:16:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:57.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:16:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:16:58 np0005593234 nova_compute[227762]: 2026-01-23 11:16:58.568 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:58 np0005593234 nova_compute[227762]: 2026-01-23 11:16:58.685 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:16:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:16:59.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:16:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:16:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:16:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:16:59.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:17:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:01.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:17:01 np0005593234 podman[352455]: 2026-01-23 11:17:01.517613461 +0000 UTC m=+0.059138272 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 06:17:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:01.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:17:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:17:02 np0005593234 nova_compute[227762]: 2026-01-23 11:17:02.363 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:02 np0005593234 nova_compute[227762]: 2026-01-23 11:17:02.702 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:03.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:03.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:05.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:17:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:05.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:17:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:07.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:07 np0005593234 nova_compute[227762]: 2026-01-23 11:17:07.409 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:07 np0005593234 nova_compute[227762]: 2026-01-23 11:17:07.704 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:07.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:09.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:09 np0005593234 nova_compute[227762]: 2026-01-23 11:17:09.762 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:17:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:09.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:17:09 np0005593234 podman[352555]: 2026-01-23 11:17:09.854309645 +0000 UTC m=+0.151367718 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 23 06:17:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:11.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:11.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:12 np0005593234 nova_compute[227762]: 2026-01-23 11:17:12.411 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:12 np0005593234 nova_compute[227762]: 2026-01-23 11:17:12.707 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:13.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:13.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:15.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:15.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:17.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:17 np0005593234 nova_compute[227762]: 2026-01-23 11:17:17.414 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:17 np0005593234 nova_compute[227762]: 2026-01-23 11:17:17.708 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:17.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:19.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:17:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:19.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:17:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:21.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:21.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:22 np0005593234 nova_compute[227762]: 2026-01-23 11:17:22.462 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:22 np0005593234 nova_compute[227762]: 2026-01-23 11:17:22.710 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:23.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:23.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:17:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:25.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:17:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:25.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:27.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:27 np0005593234 nova_compute[227762]: 2026-01-23 11:17:27.496 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:27 np0005593234 nova_compute[227762]: 2026-01-23 11:17:27.712 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:27.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:28 np0005593234 ceph-mgr[77448]: client.0 ms_handle_reset on v2:192.168.122.100:6800/530399322
Jan 23 06:17:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:29.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:29.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:17:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:31.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:17:31 np0005593234 podman[352644]: 2026-01-23 11:17:31.781515759 +0000 UTC m=+0.065108268 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 06:17:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:17:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:31.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:17:32 np0005593234 nova_compute[227762]: 2026-01-23 11:17:32.498 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:32 np0005593234 nova_compute[227762]: 2026-01-23 11:17:32.713 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:33.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:33.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:17:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:35.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:17:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:35.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:36 np0005593234 nova_compute[227762]: 2026-01-23 11:17:36.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:36 np0005593234 nova_compute[227762]: 2026-01-23 11:17:36.785 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:17:36 np0005593234 nova_compute[227762]: 2026-01-23 11:17:36.786 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:17:36 np0005593234 nova_compute[227762]: 2026-01-23 11:17:36.786 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:17:36 np0005593234 nova_compute[227762]: 2026-01-23 11:17:36.786 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:17:36 np0005593234 nova_compute[227762]: 2026-01-23 11:17:36.786 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:17:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:37.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:17:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/340127773' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.237 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.416 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.417 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4097MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.417 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.417 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.500 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.510 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.510 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.528 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.715 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:37.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:17:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/606007178' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.939 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.411s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.944 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.961 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.963 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:17:37 np0005593234 nova_compute[227762]: 2026-01-23 11:17:37.964 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.546s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:17:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:39.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:39 np0005593234 ovn_controller[134547]: 2026-01-23T11:17:39Z|01057|memory_trim|INFO|Detected inactivity (last active 30023 ms ago): trimming memory
Jan 23 06:17:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:39.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:40 np0005593234 podman[352712]: 2026-01-23 11:17:40.774555656 +0000 UTC m=+0.072137239 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3)
Jan 23 06:17:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:41.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:41.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:42 np0005593234 nova_compute[227762]: 2026-01-23 11:17:42.502 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:42 np0005593234 nova_compute[227762]: 2026-01-23 11:17:42.718 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:17:42.931 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:17:42.932 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:17:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:17:42.932 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:17:42 np0005593234 nova_compute[227762]: 2026-01-23 11:17:42.963 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:42 np0005593234 nova_compute[227762]: 2026-01-23 11:17:42.964 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:17:42 np0005593234 nova_compute[227762]: 2026-01-23 11:17:42.964 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:17:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:43.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:43.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:44 np0005593234 nova_compute[227762]: 2026-01-23 11:17:44.163 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:17:44 np0005593234 nova_compute[227762]: 2026-01-23 11:17:44.164 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:44 np0005593234 nova_compute[227762]: 2026-01-23 11:17:44.164 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:45.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:45.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:17:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:47.117 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:17:47 np0005593234 nova_compute[227762]: 2026-01-23 11:17:47.504 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:47 np0005593234 nova_compute[227762]: 2026-01-23 11:17:47.720 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:47 np0005593234 nova_compute[227762]: 2026-01-23 11:17:47.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:47 np0005593234 nova_compute[227762]: 2026-01-23 11:17:47.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:47 np0005593234 nova_compute[227762]: 2026-01-23 11:17:47.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:17:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:17:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:47.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:17:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:49.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:49.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:51.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:51 np0005593234 nova_compute[227762]: 2026-01-23 11:17:51.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:51.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:52 np0005593234 nova_compute[227762]: 2026-01-23 11:17:52.533 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:52 np0005593234 nova_compute[227762]: 2026-01-23 11:17:52.721 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:52 np0005593234 nova_compute[227762]: 2026-01-23 11:17:52.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:17:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:53.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:53.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:17:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:55.148 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:17:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:55.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:57.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:57 np0005593234 nova_compute[227762]: 2026-01-23 11:17:57.534 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:57 np0005593234 nova_compute[227762]: 2026-01-23 11:17:57.723 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:17:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:57.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:17:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:17:59.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:17:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:17:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:17:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:17:59.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:01.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:01.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:01 np0005593234 podman[352899]: 2026-01-23 11:18:01.912470214 +0000 UTC m=+0.065821171 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Jan 23 06:18:02 np0005593234 nova_compute[227762]: 2026-01-23 11:18:02.547 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:18:02 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:18:02 np0005593234 nova_compute[227762]: 2026-01-23 11:18:02.725 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:18:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:03.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:18:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:18:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:18:03 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:18:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:03.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:05.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:05.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:07.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:07 np0005593234 nova_compute[227762]: 2026-01-23 11:18:07.551 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #229. Immutable memtables: 0.
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.680832) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 147] Flushing memtable with next log file: 229
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167087680953, "job": 147, "event": "flush_started", "num_memtables": 1, "num_entries": 1936, "num_deletes": 250, "total_data_size": 4602583, "memory_usage": 4666576, "flush_reason": "Manual Compaction"}
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 147] Level-0 flush table #230: started
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167087695335, "cf_name": "default", "job": 147, "event": "table_file_creation", "file_number": 230, "file_size": 3018078, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 107593, "largest_seqno": 109524, "table_properties": {"data_size": 3009983, "index_size": 4909, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 15809, "raw_average_key_size": 19, "raw_value_size": 2994027, "raw_average_value_size": 3651, "num_data_blocks": 211, "num_entries": 820, "num_filter_entries": 820, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769166914, "oldest_key_time": 1769166914, "file_creation_time": 1769167087, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 230, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 147] Flush lasted 14548 microseconds, and 6640 cpu microseconds.
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.695417) [db/flush_job.cc:967] [default] [JOB 147] Level-0 flush table #230: 3018078 bytes OK
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.695438) [db/memtable_list.cc:519] [default] Level-0 commit table #230 started
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.697039) [db/memtable_list.cc:722] [default] Level-0 commit table #230: memtable #1 done
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.697054) EVENT_LOG_v1 {"time_micros": 1769167087697049, "job": 147, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.697072) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 147] Try to delete WAL files size 4593857, prev total WAL file size 4593857, number of live WAL files 2.
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000226.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.698415) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600353030' seq:72057594037927935, type:22 .. '6B7600373531' seq:0, type:0; will stop at (end)
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 148] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 147 Base level 0, inputs: [230(2947KB)], [228(11MB)]
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167087698539, "job": 148, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [230], "files_L6": [228], "score": -1, "input_data_size": 14821198, "oldest_snapshot_seqno": -1}
Jan 23 06:18:07 np0005593234 nova_compute[227762]: 2026-01-23 11:18:07.727 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 148] Generated table #231: 12228 keys, 13698951 bytes, temperature: kUnknown
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167087779217, "cf_name": "default", "job": 148, "event": "table_file_creation", "file_number": 231, "file_size": 13698951, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13623741, "index_size": 43545, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30597, "raw_key_size": 325978, "raw_average_key_size": 26, "raw_value_size": 13413673, "raw_average_value_size": 1096, "num_data_blocks": 1621, "num_entries": 12228, "num_filter_entries": 12228, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769167087, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 231, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.779406) [db/compaction/compaction_job.cc:1663] [default] [JOB 148] Compacted 1@0 + 1@6 files to L6 => 13698951 bytes
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.780641) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 183.6 rd, 169.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.9, 11.3 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(9.4) write-amplify(4.5) OK, records in: 12745, records dropped: 517 output_compression: NoCompression
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.780657) EVENT_LOG_v1 {"time_micros": 1769167087780649, "job": 148, "event": "compaction_finished", "compaction_time_micros": 80722, "compaction_time_cpu_micros": 46163, "output_level": 6, "num_output_files": 1, "total_output_size": 13698951, "num_input_records": 12745, "num_output_records": 12228, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.698318) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.780869) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.780876) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.780878) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.780880) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:18:07.780882) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000230.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167087781651, "job": 0, "event": "table_file_deletion", "file_number": 230}
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000228.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:18:07 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167087783795, "job": 0, "event": "table_file_deletion", "file_number": 228}
Jan 23 06:18:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:18:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:07.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:18:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:18:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:09.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:18:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:09.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:18:10 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:18:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:11.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:11 np0005593234 nova_compute[227762]: 2026-01-23 11:18:11.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:11 np0005593234 podman[353055]: 2026-01-23 11:18:11.821683309 +0000 UTC m=+0.107263517 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Jan 23 06:18:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:11.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:12 np0005593234 nova_compute[227762]: 2026-01-23 11:18:12.551 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:12 np0005593234 nova_compute[227762]: 2026-01-23 11:18:12.730 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:13.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:13.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:15.177 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:15.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:17.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:17 np0005593234 nova_compute[227762]: 2026-01-23 11:18:17.552 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:17 np0005593234 nova_compute[227762]: 2026-01-23 11:18:17.732 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:17.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:18:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:19.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:18:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:19.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:21.185 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:21.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:22 np0005593234 nova_compute[227762]: 2026-01-23 11:18:22.553 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:22 np0005593234 nova_compute[227762]: 2026-01-23 11:18:22.733 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:23.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:18:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:23.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:18:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:18:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:25.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:18:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:25.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.003000095s ======
Jan 23 06:18:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:27.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000095s
Jan 23 06:18:27 np0005593234 systemd-logind[794]: New session 75 of user zuul.
Jan 23 06:18:27 np0005593234 systemd[1]: Started Session 75 of User zuul.
Jan 23 06:18:27 np0005593234 nova_compute[227762]: 2026-01-23 11:18:27.556 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:27 np0005593234 nova_compute[227762]: 2026-01-23 11:18:27.735 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:27.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:29.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:29.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:31 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 23 06:18:31 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/760405632' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 06:18:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:18:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:31.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:18:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:31.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:32 np0005593234 nova_compute[227762]: 2026-01-23 11:18:32.608 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:32 np0005593234 nova_compute[227762]: 2026-01-23 11:18:32.737 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:32 np0005593234 podman[353402]: 2026-01-23 11:18:32.810208312 +0000 UTC m=+0.088587623 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 06:18:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:33.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:33.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:34 np0005593234 ovs-vsctl[353451]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 23 06:18:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:35.209 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:35 np0005593234 virtqemud[227483]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 23 06:18:35 np0005593234 virtqemud[227483]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 23 06:18:35 np0005593234 virtqemud[227483]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 06:18:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:18:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:35.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:18:35 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: cache status {prefix=cache status} (starting...)
Jan 23 06:18:36 np0005593234 lvm[353777]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 06:18:36 np0005593234 lvm[353777]: VG ceph_vg0 finished
Jan 23 06:18:36 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: client ls {prefix=client ls} (starting...)
Jan 23 06:18:36 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: damage ls {prefix=damage ls} (starting...)
Jan 23 06:18:36 np0005593234 nova_compute[227762]: 2026-01-23 11:18:36.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:36 np0005593234 nova_compute[227762]: 2026-01-23 11:18:36.780 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:18:36 np0005593234 nova_compute[227762]: 2026-01-23 11:18:36.781 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:18:36 np0005593234 nova_compute[227762]: 2026-01-23 11:18:36.781 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:18:36 np0005593234 nova_compute[227762]: 2026-01-23 11:18:36.782 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:18:36 np0005593234 nova_compute[227762]: 2026-01-23 11:18:36.782 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:18:36 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: dump loads {prefix=dump loads} (starting...)
Jan 23 06:18:36 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Jan 23 06:18:36 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2896092586' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 06:18:36 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 23 06:18:37 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 23 06:18:37 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 23 06:18:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:18:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:37.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:18:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:18:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3248002980' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:18:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Jan 23 06:18:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3559598269' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 06:18:37 np0005593234 nova_compute[227762]: 2026-01-23 11:18:37.258 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:18:37 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 23 06:18:37 np0005593234 nova_compute[227762]: 2026-01-23 11:18:37.495 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:18:37 np0005593234 nova_compute[227762]: 2026-01-23 11:18:37.497 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3879MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:18:37 np0005593234 nova_compute[227762]: 2026-01-23 11:18:37.497 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:18:37 np0005593234 nova_compute[227762]: 2026-01-23 11:18:37.498 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:18:37 np0005593234 nova_compute[227762]: 2026-01-23 11:18:37.609 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:37 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 23 06:18:37 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Jan 23 06:18:37 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/139593511' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 06:18:37 np0005593234 nova_compute[227762]: 2026-01-23 11:18:37.740 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:37 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 23 06:18:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:18:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:37.888 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:18:38 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: ops {prefix=ops} (starting...)
Jan 23 06:18:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Jan 23 06:18:38 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/174190534' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 06:18:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Jan 23 06:18:38 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2960208504' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 06:18:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Jan 23 06:18:38 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2840223442' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 06:18:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 23 06:18:38 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/763015075' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 06:18:38 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: session ls {prefix=session ls} (starting...)
Jan 23 06:18:38 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: status {prefix=status} (starting...)
Jan 23 06:18:38 np0005593234 nova_compute[227762]: 2026-01-23 11:18:38.900 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:18:38 np0005593234 nova_compute[227762]: 2026-01-23 11:18:38.901 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:18:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 23 06:18:38 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/738542118' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 06:18:39 np0005593234 nova_compute[227762]: 2026-01-23 11:18:39.110 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing inventories for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Jan 23 06:18:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 06:18:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:39.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 06:18:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Jan 23 06:18:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4264695661' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 06:18:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 23 06:18:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2574358966' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 06:18:39 np0005593234 nova_compute[227762]: 2026-01-23 11:18:39.384 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating ProviderTree inventory for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Jan 23 06:18:39 np0005593234 nova_compute[227762]: 2026-01-23 11:18:39.385 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Updating inventory in ProviderTree for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Jan 23 06:18:39 np0005593234 nova_compute[227762]: 2026-01-23 11:18:39.409 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing aggregate associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Jan 23 06:18:39 np0005593234 nova_compute[227762]: 2026-01-23 11:18:39.444 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Refreshing trait associations for resource provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8, traits: COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SATA,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_RESCUE_BFV,COMPUTE_SECURITY_TPM_1_2,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_SSE42,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_AMI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_USB,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_TRUSTED_CERTS,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_NODE,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_CIRRUS _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Jan 23 06:18:39 np0005593234 nova_compute[227762]: 2026-01-23 11:18:39.575 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:18:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0) v1
Jan 23 06:18:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3823319975' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 06:18:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Jan 23 06:18:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1192203698' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 06:18:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:18:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:39.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:18:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:18:40 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4063146687' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:18:40 np0005593234 nova_compute[227762]: 2026-01-23 11:18:40.074 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:18:40 np0005593234 nova_compute[227762]: 2026-01-23 11:18:40.079 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:18:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Jan 23 06:18:40 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3803710366' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 06:18:40 np0005593234 nova_compute[227762]: 2026-01-23 11:18:40.557 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:18:40 np0005593234 nova_compute[227762]: 2026-01-23 11:18:40.558 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:18:40 np0005593234 nova_compute[227762]: 2026-01-23 11:18:40.558 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 3.060s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:18:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Jan 23 06:18:40 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1778451415' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 06:18:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 23 06:18:40 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1979966815' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 06:18:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Jan 23 06:18:41 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2167137350' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 06:18:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:18:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:41.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:18:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 23 06:18:41 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/3719489007' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 06:18:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 23 06:18:41 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/80388960' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 06:18:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:18:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:41.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:18:41 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 23 06:18:41 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/578823552' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5809652 data_alloc: 251658240 data_used: 38264832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549232640 unmapped: 99368960 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 407 heartbeat osd_stat(store_statfs(0x19702d000/0x0/0x1bfc00000, data 0x5321050/0x5491000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546119680 unmapped: 102481920 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 101965824 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546635776 unmapped: 101965824 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546693120 unmapped: 101908480 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5825688 data_alloc: 251658240 data_used: 39026688
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546693120 unmapped: 101908480 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 407 heartbeat osd_stat(store_statfs(0x196f8e000/0x0/0x1bfc00000, data 0x53c0050/0x5530000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546693120 unmapped: 101908480 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546693120 unmapped: 101908480 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546234368 unmapped: 102367232 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546234368 unmapped: 102367232 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.643473625s of 10.620517731s, submitted: 103
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 407 ms_handle_reset con 0x55bc3b720400 session 0x55bc3a162960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 407 ms_handle_reset con 0x55bc3c2a3c00 session 0x55bc3b2fa3c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5821284 data_alloc: 251658240 data_used: 39051264
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546234368 unmapped: 102367232 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546234368 unmapped: 102367232 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 407 ms_handle_reset con 0x55bc3b1a3000 session 0x55bc395dda40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 407 heartbeat osd_stat(store_statfs(0x196f8c000/0x0/0x1bfc00000, data 0x53c3040/0x5532000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546234368 unmapped: 102367232 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546234368 unmapped: 102367232 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 407 heartbeat osd_stat(store_statfs(0x196f8c000/0x0/0x1bfc00000, data 0x53c3040/0x5532000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546234368 unmapped: 102367232 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5819967 data_alloc: 251658240 data_used: 39051264
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546234368 unmapped: 102367232 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546234368 unmapped: 102367232 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546234368 unmapped: 102367232 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 407 ms_handle_reset con 0x55bc3b142400 session 0x55bc3b2fb2c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546242560 unmapped: 102359040 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546242560 unmapped: 102359040 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 407 heartbeat osd_stat(store_statfs(0x196f8c000/0x0/0x1bfc00000, data 0x53c3040/0x5532000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 7.219704151s of 10.150856018s, submitted: 35
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5792338 data_alloc: 251658240 data_used: 39026688
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546250752 unmapped: 102350848 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 407 ms_handle_reset con 0x55bc3a5f2000 session 0x55bc3971a780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546250752 unmapped: 102350848 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 407 ms_handle_reset con 0x55bc3a606800 session 0x55bc4337fe00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 407 handle_osd_map epochs [407,408], i have 407, src has [1,408]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 408 ms_handle_reset con 0x55bc3a606800 session 0x55bc41a75c20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546250752 unmapped: 102350848 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 408 ms_handle_reset con 0x55bc3a5f2000 session 0x55bc3be043c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546250752 unmapped: 102350848 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 408 ms_handle_reset con 0x55bc3b587800 session 0x55bc4337ef00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 408 ms_handle_reset con 0x55bc3e399800 session 0x55bc392ab680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546258944 unmapped: 102342656 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5792355 data_alloc: 251658240 data_used: 39034880
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546275328 unmapped: 102326272 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 408 heartbeat osd_stat(store_statfs(0x197305000/0x0/0x1bfc00000, data 0x4f95c9b/0x51b8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [0,0,0,0,0,0,0,1,2,1])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546283520 unmapped: 102318080 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546291712 unmapped: 102309888 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 408 ms_handle_reset con 0x55bc3bbdec00 session 0x55bc4337e000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546291712 unmapped: 102309888 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 408 ms_handle_reset con 0x55bc3b143400 session 0x55bc40a92960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 408 ms_handle_reset con 0x55bc3b4d5000 session 0x55bc39f2c960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546291712 unmapped: 102309888 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 408 ms_handle_reset con 0x55bc3b43a800 session 0x55bc3fd16f00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 408 ms_handle_reset con 0x55bc3ec53c00 session 0x55bc3a05cd20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 4.820722580s of 10.396615028s, submitted: 59
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 408 heartbeat osd_stat(store_statfs(0x19732b000/0x0/0x1bfc00000, data 0x4f72c68/0x5193000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [0,0,0,0,0,0,1])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5783462 data_alloc: 251658240 data_used: 38920192
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546308096 unmapped: 102293504 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 408 handle_osd_map epochs [408,409], i have 408, src has [1,409]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546316288 unmapped: 102285312 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 409 handle_osd_map epochs [409,410], i have 409, src has [1,410]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546316288 unmapped: 102285312 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 409 handle_osd_map epochs [410,410], i have 410, src has [1,410]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 410 ms_handle_reset con 0x55bc4a7a8800 session 0x55bc3e3af4a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546332672 unmapped: 102268928 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 410 ms_handle_reset con 0x55bc4a7a6000 session 0x55bc3be054a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 410 ms_handle_reset con 0x55bc3b15f800 session 0x55bc3b2481e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 410 heartbeat osd_stat(store_statfs(0x197323000/0x0/0x1bfc00000, data 0x4f76400/0x5199000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 410 ms_handle_reset con 0x55bc3b4d5c00 session 0x55bc3dbd34a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 571105280 unmapped: 77496320 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5781094 data_alloc: 251658240 data_used: 47583232
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565436416 unmapped: 83165184 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 410 heartbeat osd_stat(store_statfs(0x1969e7000/0x0/0x1bfc00000, data 0x4d90400/0x4fb3000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [0,0,0,0,0,0,0,0,0,0,8,2])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 410 ms_handle_reset con 0x55bc3b143400 session 0x55bc3971b4a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 410 ms_handle_reset con 0x55bc3a607000 session 0x55bc3e1f4d20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565051392 unmapped: 83550208 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 410 handle_osd_map epochs [410,411], i have 410, src has [1,411]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557301760 unmapped: 91299840 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 411 ms_handle_reset con 0x55bc3b13c000 session 0x55bc3a521c20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 411 handle_osd_map epochs [411,412], i have 411, src has [1,412]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 412 ms_handle_reset con 0x55bc3b13c000 session 0x55bc3d34a1e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557318144 unmapped: 91283456 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 412 heartbeat osd_stat(store_statfs(0x197502000/0x0/0x1bfc00000, data 0x4d93d84/0x4fba000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 412 ms_handle_reset con 0x55bc3ba8f000 session 0x55bc40a932c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557326336 unmapped: 91275264 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 412 ms_handle_reset con 0x55bc3f11c000 session 0x55bc3fd16d20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 412 ms_handle_reset con 0x55bc3ba8f800 session 0x55bc3fd17680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5762090 data_alloc: 251658240 data_used: 46854144
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557326336 unmapped: 91275264 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557326336 unmapped: 91275264 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557326336 unmapped: 91275264 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 412 heartbeat osd_stat(store_statfs(0x197502000/0x0/0x1bfc00000, data 0x4d93d84/0x4fba000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557326336 unmapped: 91275264 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557326336 unmapped: 91275264 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.391502380s of 14.387373924s, submitted: 58
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 412 ms_handle_reset con 0x55bc3f11d000 session 0x55bc39eaf4a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5347579 data_alloc: 218103808 data_used: 11968512
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 412 handle_osd_map epochs [412,413], i have 412, src has [1,413]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 heartbeat osd_stat(store_statfs(0x199646000/0x0/0x1bfc00000, data 0x2c518a3/0x2e77000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5351753 data_alloc: 218103808 data_used: 11976704
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 heartbeat osd_stat(store_statfs(0x199646000/0x0/0x1bfc00000, data 0x2c518a3/0x2e77000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 heartbeat osd_stat(store_statfs(0x199646000/0x0/0x1bfc00000, data 0x2c518a3/0x2e77000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5351753 data_alloc: 218103808 data_used: 11976704
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc3ef3cc00 session 0x55bc3be043c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc3a472800 session 0x55bc41a75c20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc3b587800 session 0x55bc4337fe00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 heartbeat osd_stat(store_statfs(0x199646000/0x0/0x1bfc00000, data 0x2c518a3/0x2e77000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5351753 data_alloc: 218103808 data_used: 11976704
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533438464 unmapped: 115163136 heap: 648601600 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc3a473400 session 0x55bc3971a780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.020261765s of 16.091646194s, submitted: 50
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc41dcdc00 session 0x55bc3a05c960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc3f11dc00 session 0x55bc3b2fb2c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc3a472800 session 0x55bc3b2fa3c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc3a473400 session 0x55bc3be045a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc3b587800 session 0x55bc3dbd3a40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc3ef3cc00 session 0x55bc3fd161e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536059904 unmapped: 116744192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc3a472800 session 0x55bc396e9680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc3a473400 session 0x55bc3be05e00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536059904 unmapped: 116744192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc3b587800 session 0x55bc3dbd21e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536059904 unmapped: 116744192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc3b15f400 session 0x55bc395dde00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 ms_handle_reset con 0x55bc42902400 session 0x55bc3e1f4f00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536059904 unmapped: 116744192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 heartbeat osd_stat(store_statfs(0x19750f000/0x0/0x1bfc00000, data 0x4d85947/0x4faf000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5649553 data_alloc: 234881024 data_used: 21680128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536059904 unmapped: 116744192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 413 handle_osd_map epochs [413,414], i have 413, src has [1,414]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 414 ms_handle_reset con 0x55bc3a473400 session 0x55bc3b2f52c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 414 heartbeat osd_stat(store_statfs(0x197e4a000/0x0/0x1bfc00000, data 0x4449592/0x4673000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528048128 unmapped: 124755968 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528048128 unmapped: 124755968 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528048128 unmapped: 124755968 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528048128 unmapped: 124755968 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5679865 data_alloc: 234881024 data_used: 29818880
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528048128 unmapped: 124755968 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 414 handle_osd_map epochs [414,415], i have 414, src has [1,415]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.669334412s of 10.113974571s, submitted: 81
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528031744 unmapped: 124772352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x197e47000/0x0/0x1bfc00000, data 0x444b0d1/0x4676000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528031744 unmapped: 124772352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528031744 unmapped: 124772352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528031744 unmapped: 124772352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x197e47000/0x0/0x1bfc00000, data 0x444b0d1/0x4676000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5682839 data_alloc: 234881024 data_used: 29818880
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528031744 unmapped: 124772352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528031744 unmapped: 124772352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528031744 unmapped: 124772352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528162816 unmapped: 124641280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x197e48000/0x0/0x1bfc00000, data 0x444b0d1/0x4676000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528162816 unmapped: 124641280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5716571 data_alloc: 234881024 data_used: 32489472
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528162816 unmapped: 124641280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528162816 unmapped: 124641280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528162816 unmapped: 124641280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528162816 unmapped: 124641280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x197e48000/0x0/0x1bfc00000, data 0x444b0d1/0x4676000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528162816 unmapped: 124641280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.270887375s of 14.310273170s, submitted: 15
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 ms_handle_reset con 0x55bc42902400 session 0x55bc4337e960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 ms_handle_reset con 0x55bc3a472800 session 0x55bc3e1f4960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5315386 data_alloc: 218103808 data_used: 11988992
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528179200 unmapped: 124624896 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x197e48000/0x0/0x1bfc00000, data 0x444b0d1/0x4676000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 ms_handle_reset con 0x55bc3eeec800 session 0x55bc3a0501e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 124616704 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 124616704 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 124616704 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 124616704 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5312215 data_alloc: 218103808 data_used: 11984896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 124616704 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x199f7f000/0x0/0x1bfc00000, data 0x231702d/0x253e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528187392 unmapped: 124616704 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 ms_handle_reset con 0x55bc3b144400 session 0x55bc3d34ba40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528195584 unmapped: 124608512 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 ms_handle_reset con 0x55bc3b144400 session 0x55bc41a745a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528195584 unmapped: 124608512 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 ms_handle_reset con 0x55bc3d27d400 session 0x55bc3a051a40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528203776 unmapped: 124600320 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 ms_handle_reset con 0x55bc3d1ecc00 session 0x55bc3a162b40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x199f81000/0x0/0x1bfc00000, data 0x231701d/0x253d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5310181 data_alloc: 218103808 data_used: 11984896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x199f81000/0x0/0x1bfc00000, data 0x231701d/0x253d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x199f81000/0x0/0x1bfc00000, data 0x231701d/0x253d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5310181 data_alloc: 218103808 data_used: 11984896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x199f81000/0x0/0x1bfc00000, data 0x231701d/0x253d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x199f81000/0x0/0x1bfc00000, data 0x231701d/0x253d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x199f81000/0x0/0x1bfc00000, data 0x231701d/0x253d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5310181 data_alloc: 218103808 data_used: 11984896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x199f81000/0x0/0x1bfc00000, data 0x231701d/0x253d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5310181 data_alloc: 218103808 data_used: 11984896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x199f81000/0x0/0x1bfc00000, data 0x231701d/0x253d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5310181 data_alloc: 218103808 data_used: 11984896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 heartbeat osd_stat(store_statfs(0x199f81000/0x0/0x1bfc00000, data 0x231701d/0x253d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5310181 data_alloc: 218103808 data_used: 11984896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 ms_handle_reset con 0x55bc3a607c00 session 0x55bc392a5680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528211968 unmapped: 124592128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 415 handle_osd_map epochs [415,416], i have 415, src has [1,416]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 37.342624664s of 37.717086792s, submitted: 89
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 416 ms_handle_reset con 0x55bc3b4d2000 session 0x55bc3a521c20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 416 ms_handle_reset con 0x55bc3b4d2000 session 0x55bc3b30ab40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528220160 unmapped: 124583936 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 416 handle_osd_map epochs [416,417], i have 416, src has [1,417]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 417 ms_handle_reset con 0x55bc3a607c00 session 0x55bc3f2df860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 417 heartbeat osd_stat(store_statfs(0x199f79000/0x0/0x1bfc00000, data 0x231a923/0x2543000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528228352 unmapped: 124575744 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 417 ms_handle_reset con 0x55bc391e5800 session 0x55bc4337fc20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5318001 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528228352 unmapped: 124575744 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 417 ms_handle_reset con 0x55bc4a7aec00 session 0x55bc3b203860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 417 heartbeat osd_stat(store_statfs(0x199f79000/0x0/0x1bfc00000, data 0x231a923/0x2543000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528236544 unmapped: 124567552 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528236544 unmapped: 124567552 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528236544 unmapped: 124567552 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 417 heartbeat osd_stat(store_statfs(0x199f79000/0x0/0x1bfc00000, data 0x231a923/0x2543000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528236544 unmapped: 124567552 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 417 heartbeat osd_stat(store_statfs(0x199f79000/0x0/0x1bfc00000, data 0x231a923/0x2543000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5318001 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 417 handle_osd_map epochs [417,418], i have 417, src has [1,418]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528244736 unmapped: 124559360 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f79000/0x0/0x1bfc00000, data 0x231a923/0x2543000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528244736 unmapped: 124559360 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528244736 unmapped: 124559360 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f77000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528244736 unmapped: 124559360 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4a7a8800 session 0x55bc3f2df860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528252928 unmapped: 124551168 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5320303 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528252928 unmapped: 124551168 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f77000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528252928 unmapped: 124551168 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.211046219s of 14.265490532s, submitted: 24
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4a7a8800 session 0x55bc3a521c20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528261120 unmapped: 124542976 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f76000/0x0/0x1bfc00000, data 0x231c4d4/0x2548000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc3a162b40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528261120 unmapped: 124542976 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528261120 unmapped: 124542976 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5324507 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528261120 unmapped: 124542976 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528261120 unmapped: 124542976 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528269312 unmapped: 124534784 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f76000/0x0/0x1bfc00000, data 0x231c4d4/0x2548000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528269312 unmapped: 124534784 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4a7b1800 session 0x55bc3a051a40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b4d4c00 session 0x55bc41a745a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528269312 unmapped: 124534784 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3d978c00 session 0x55bc3a0501e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5323633 data_alloc: 218103808 data_used: 11997184
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528285696 unmapped: 124518400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3d978c00 session 0x55bc3b2f52c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc4337fe00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b4d4c00 session 0x55bc41a75c20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528310272 unmapped: 124493824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528310272 unmapped: 124493824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528310272 unmapped: 124493824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528318464 unmapped: 124485632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528318464 unmapped: 124485632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528318464 unmapped: 124485632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528318464 unmapped: 124485632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528326656 unmapped: 124477440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528326656 unmapped: 124477440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528351232 unmapped: 124452864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc42951c00 session 0x55bc39eaf4a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528351232 unmapped: 124452864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528351232 unmapped: 124452864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528351232 unmapped: 124452864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528351232 unmapped: 124452864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528351232 unmapped: 124452864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528351232 unmapped: 124452864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528359424 unmapped: 124444672 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528359424 unmapped: 124444672 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3f11cc00 session 0x55bc3fd16d20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3f11cc00 session 0x55bc40a932c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc3d34a1e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b4d4c00 session 0x55bc3be043c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 62.704410553s of 62.801357269s, submitted: 27
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3d978c00 session 0x55bc3b249e00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc42951c00 session 0x55bc3a520000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc42951c00 session 0x55bc396650e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc4337eb40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b4d4c00 session 0x55bc3e1f4d20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x19995f000/0x0/0x1bfc00000, data 0x293349b/0x2b5f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5381452 data_alloc: 218103808 data_used: 11997184
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x19995f000/0x0/0x1bfc00000, data 0x29334d4/0x2b5f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5381452 data_alloc: 218103808 data_used: 11997184
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x19995f000/0x0/0x1bfc00000, data 0x29334d4/0x2b5f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5381452 data_alloc: 218103808 data_used: 11997184
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529686528 unmapped: 123117568 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529686528 unmapped: 123117568 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x19995f000/0x0/0x1bfc00000, data 0x29334d4/0x2b5f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529686528 unmapped: 123117568 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4385c400 session 0x55bc4337e5a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529686528 unmapped: 123117568 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4a7a7c00 session 0x55bc3fd16000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4a7a7c00 session 0x55bc3e1f4780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529694720 unmapped: 123109376 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.268370628s of 15.381249428s, submitted: 38
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc3fd16b40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5390086 data_alloc: 218103808 data_used: 12001280
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529899520 unmapped: 122904576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199933000/0x0/0x1bfc00000, data 0x295d507/0x2b8b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529899520 unmapped: 122904576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529907712 unmapped: 122896384 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5425446 data_alloc: 218103808 data_used: 16957440
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199933000/0x0/0x1bfc00000, data 0x295d507/0x2b8b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199933000/0x0/0x1bfc00000, data 0x295d507/0x2b8b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199933000/0x0/0x1bfc00000, data 0x295d507/0x2b8b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5425446 data_alloc: 218103808 data_used: 16957440
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199933000/0x0/0x1bfc00000, data 0x295d507/0x2b8b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.630040169s of 13.679882050s, submitted: 13
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 531103744 unmapped: 121700352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534528000 unmapped: 118276096 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5542170 data_alloc: 218103808 data_used: 17022976
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534536192 unmapped: 118267904 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533479424 unmapped: 119324672 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533479424 unmapped: 119324672 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198a34000/0x0/0x1bfc00000, data 0x385c507/0x3a8a000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533454848 unmapped: 119349248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534781952 unmapped: 118022144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x19898d000/0x0/0x1bfc00000, data 0x3902507/0x3b30000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5567446 data_alloc: 218103808 data_used: 18030592
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x19898d000/0x0/0x1bfc00000, data 0x3902507/0x3b30000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534781952 unmapped: 118022144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534781952 unmapped: 118022144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534781952 unmapped: 118022144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534781952 unmapped: 118022144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534781952 unmapped: 118022144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x19898d000/0x0/0x1bfc00000, data 0x3902507/0x3b30000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.453640938s of 11.992197037s, submitted: 131
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5565398 data_alloc: 218103808 data_used: 18034688
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198969000/0x0/0x1bfc00000, data 0x3927507/0x3b55000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198969000/0x0/0x1bfc00000, data 0x3927507/0x3b55000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5565398 data_alloc: 218103808 data_used: 18034688
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5565090 data_alloc: 218103808 data_used: 18034688
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198966000/0x0/0x1bfc00000, data 0x392a507/0x3b58000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198966000/0x0/0x1bfc00000, data 0x392a507/0x3b58000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198966000/0x0/0x1bfc00000, data 0x392a507/0x3b58000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5565410 data_alloc: 218103808 data_used: 18042880
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 23 06:18:42 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4095533204' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198966000/0x0/0x1bfc00000, data 0x392a507/0x3b58000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.936965942s of 19.969482422s, submitted: 7
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5565494 data_alloc: 218103808 data_used: 18042880
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534937600 unmapped: 117866496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198961000/0x0/0x1bfc00000, data 0x392f507/0x3b5d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534937600 unmapped: 117866496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534937600 unmapped: 117866496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b1a3800 session 0x55bc3a4501e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534937600 unmapped: 117866496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b4d4c00 session 0x55bc392a5e00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc42951c00 session 0x55bc3fd172c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5565466 data_alloc: 218103808 data_used: 18042880
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534945792 unmapped: 117858304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc39fd4000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198961000/0x0/0x1bfc00000, data 0x392f507/0x3b5d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5340823 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4a7aa800 session 0x55bc396e8000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4385dc00 session 0x55bc3ec2af00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5340823 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534077440 unmapped: 118726656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534077440 unmapped: 118726656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534077440 unmapped: 118726656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5340823 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534077440 unmapped: 118726656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534077440 unmapped: 118726656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534085632 unmapped: 118718464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534085632 unmapped: 118718464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534085632 unmapped: 118718464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5340823 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534085632 unmapped: 118718464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534093824 unmapped: 118710272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534093824 unmapped: 118710272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534093824 unmapped: 118710272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.5 total, 600.0 interval#012Cumulative writes: 74K writes, 298K keys, 74K commit groups, 1.0 writes per commit group, ingest: 0.30 GB, 0.05 MB/s#012Cumulative WAL: 74K writes, 27K syncs, 2.69 writes per sync, written: 0.30 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4268 writes, 14K keys, 4268 commit groups, 1.0 writes per commit group, ingest: 13.04 MB, 0.02 MB/s#012Interval WAL: 4268 writes, 1717 syncs, 2.49 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534093824 unmapped: 118710272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5340823 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534093824 unmapped: 118710272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534093824 unmapped: 118710272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534093824 unmapped: 118710272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5340823 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5340823 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534110208 unmapped: 118693888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b43a800 session 0x55bc3b2fa780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b143800 session 0x55bc3f2df680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b143800 session 0x55bc3d34bc20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc3b30a000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534118400 unmapped: 118685696 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 42.591941833s of 42.745742798s, submitted: 57
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b43a800 session 0x55bc3fd17e00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3ba8f000 session 0x55bc39864780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b1a1800 session 0x55bc3d357680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b1a1800 session 0x55bc39665860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc395dc780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994f1000/0x0/0x1bfc00000, data 0x2da2472/0x2fcd000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534142976 unmapped: 118661120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534142976 unmapped: 118661120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422568 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534142976 unmapped: 118661120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534142976 unmapped: 118661120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534142976 unmapped: 118661120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994f1000/0x0/0x1bfc00000, data 0x2da2472/0x2fcd000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534142976 unmapped: 118661120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534142976 unmapped: 118661120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994f1000/0x0/0x1bfc00000, data 0x2da2472/0x2fcd000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422568 data_alloc: 218103808 data_used: 11993088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534151168 unmapped: 118652928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534151168 unmapped: 118652928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534151168 unmapped: 118652928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994f1000/0x0/0x1bfc00000, data 0x2da2472/0x2fcd000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.674189568s of 10.862952232s, submitted: 20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534151168 unmapped: 118652928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b1a2400 session 0x55bc3e3af860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534462464 unmapped: 118341632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994cc000/0x0/0x1bfc00000, data 0x2dc6495/0x2ff2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5427541 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534175744 unmapped: 118628352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994cc000/0x0/0x1bfc00000, data 0x2dc6495/0x2ff2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994cc000/0x0/0x1bfc00000, data 0x2dc6495/0x2ff2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5505461 data_alloc: 234881024 data_used: 23035904
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994cc000/0x0/0x1bfc00000, data 0x2dc6495/0x2ff2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994cc000/0x0/0x1bfc00000, data 0x2dc6495/0x2ff2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994cc000/0x0/0x1bfc00000, data 0x2dc6495/0x2ff2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5505461 data_alloc: 234881024 data_used: 23035904
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994cc000/0x0/0x1bfc00000, data 0x2dc6495/0x2ff2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.972386360s of 12.997897148s, submitted: 7
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536199168 unmapped: 116604928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1990d3000/0x0/0x1bfc00000, data 0x31bf495/0x33eb000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536207360 unmapped: 116596736 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536084480 unmapped: 116719616 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5598143 data_alloc: 234881024 data_used: 23625728
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537264128 unmapped: 115539968 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537264128 unmapped: 115539968 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537264128 unmapped: 115539968 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b29000/0x0/0x1bfc00000, data 0x3761495/0x398d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537264128 unmapped: 115539968 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537214976 unmapped: 115589120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5593615 data_alloc: 234881024 data_used: 23953408
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537223168 unmapped: 115580928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537223168 unmapped: 115580928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.472796440s of 10.179206848s, submitted: 121
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538345472 unmapped: 114458624 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2e000/0x0/0x1bfc00000, data 0x3764495/0x3990000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5594031 data_alloc: 234881024 data_used: 23953408
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2e000/0x0/0x1bfc00000, data 0x3764495/0x3990000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2e000/0x0/0x1bfc00000, data 0x3764495/0x3990000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5594031 data_alloc: 234881024 data_used: 23953408
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2e000/0x0/0x1bfc00000, data 0x3764495/0x3990000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.258337975s of 12.762209892s, submitted: 186
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2e000/0x0/0x1bfc00000, data 0x3764495/0x3990000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5599303 data_alloc: 234881024 data_used: 23961600
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2b000/0x0/0x1bfc00000, data 0x3766495/0x3992000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4a7ab000 session 0x55bc3d34b2c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b43a800 session 0x55bc396e8000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2c000/0x0/0x1bfc00000, data 0x3766495/0x3992000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5594471 data_alloc: 234881024 data_used: 23961600
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2c000/0x0/0x1bfc00000, data 0x3766495/0x3992000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2c000/0x0/0x1bfc00000, data 0x3766495/0x3992000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5594471 data_alloc: 234881024 data_used: 23961600
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc42903400 session 0x55bc39865a40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.233558655s of 11.280123711s, submitted: 29
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538402816 unmapped: 114401280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 handle_osd_map epochs [418,419], i have 418, src has [1,419]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 418 handle_osd_map epochs [419,419], i have 419, src has [1,419]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 419 ms_handle_reset con 0x55bc3ba8f400 session 0x55bc395ba1e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 419 ms_handle_reset con 0x55bc3b1a3000 session 0x55bc3d34a1e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 419 ms_handle_reset con 0x55bc3bf4bc00 session 0x55bc3e1f4780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5598697 data_alloc: 234881024 data_used: 24072192
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 419 heartbeat osd_stat(store_statfs(0x198b29000/0x0/0x1bfc00000, data 0x37680ee/0x3995000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 419 ms_handle_reset con 0x55bc3f124800 session 0x55bc39eaf4a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 419 handle_osd_map epochs [419,420], i have 419, src has [1,420]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 420 ms_handle_reset con 0x55bc3b1a3000 session 0x55bc3b249e00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 420 heartbeat osd_stat(store_statfs(0x198b29000/0x0/0x1bfc00000, data 0x37680ee/0x3995000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5603031 data_alloc: 234881024 data_used: 24084480
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 420 heartbeat osd_stat(store_statfs(0x198b25000/0x0/0x1bfc00000, data 0x3769d9b/0x3998000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 420 heartbeat osd_stat(store_statfs(0x198b25000/0x0/0x1bfc00000, data 0x3769d9b/0x3998000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.517007828s of 13.684580803s, submitted: 29
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 420 handle_osd_map epochs [420,421], i have 420, src has [1,421]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 420 handle_osd_map epochs [421,421], i have 421, src has [1,421]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5618769 data_alloc: 234881024 data_used: 25083904
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198b20000/0x0/0x1bfc00000, data 0x376b8da/0x399b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198b20000/0x0/0x1bfc00000, data 0x376b8da/0x399b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198b20000/0x0/0x1bfc00000, data 0x376b8da/0x399b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5618929 data_alloc: 234881024 data_used: 25088000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198b20000/0x0/0x1bfc00000, data 0x376b8da/0x399b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538394624 unmapped: 114409472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.060571671s of 10.098513603s, submitted: 57
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc3dc22400 session 0x55bc41a754a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc42c94c00 session 0x55bc3a0505a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538402816 unmapped: 114401280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366225 data_alloc: 218103808 data_used: 12021760
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc3e196400 session 0x55bc3a051680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366225 data_alloc: 218103808 data_used: 12021760
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366225 data_alloc: 218103808 data_used: 12021760
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 123478016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 123478016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 123478016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 123478016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 123478016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 123478016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 123478016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 67.751548767s of 68.275238037s, submitted: 41
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529334272 unmapped: 123469824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc3b15e800 session 0x55bc3ec2b4a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529342464 unmapped: 123461632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199e27000/0x0/0x1bfc00000, data 0x24698a7/0x2697000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529342464 unmapped: 123461632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5437757 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529342464 unmapped: 123461632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529342464 unmapped: 123461632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529342464 unmapped: 123461632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529342464 unmapped: 123461632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19961d000/0x0/0x1bfc00000, data 0x2c738a7/0x2ea1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529342464 unmapped: 123461632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5437757 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529350656 unmapped: 123453440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529350656 unmapped: 123453440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19961d000/0x0/0x1bfc00000, data 0x2c738a7/0x2ea1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc391e5400 session 0x55bc3a0503c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529350656 unmapped: 123453440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc3b15f400 session 0x55bc3e1f5680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529350656 unmapped: 123453440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529350656 unmapped: 123453440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc391e5800 session 0x55bc40a92780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.719742775s of 12.813817024s, submitted: 11
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19961d000/0x0/0x1bfc00000, data 0x2c738a7/0x2ea1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5440786 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529170432 unmapped: 123633664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529170432 unmapped: 123633664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19961c000/0x0/0x1bfc00000, data 0x2c738b6/0x2ea2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529170432 unmapped: 123633664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc4385d800 session 0x55bc3ec2a780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529170432 unmapped: 123633664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529170432 unmapped: 123633664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5440386 data_alloc: 218103808 data_used: 12025856
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529178624 unmapped: 123625472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529178624 unmapped: 123625472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529178624 unmapped: 123625472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19920c000/0x0/0x1bfc00000, data 0x2c738b6/0x2ea2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529186816 unmapped: 123617280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5508838 data_alloc: 234881024 data_used: 21602304
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19920c000/0x0/0x1bfc00000, data 0x2c738b6/0x2ea2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19920c000/0x0/0x1bfc00000, data 0x2c738b6/0x2ea2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5508838 data_alloc: 234881024 data_used: 21602304
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19920c000/0x0/0x1bfc00000, data 0x2c738b6/0x2ea2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19920c000/0x0/0x1bfc00000, data 0x2c738b6/0x2ea2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5508838 data_alloc: 234881024 data_used: 21602304
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530219008 unmapped: 122585088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.664346695s of 20.694303513s, submitted: 8
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19920c000/0x0/0x1bfc00000, data 0x2c738b6/0x2ea2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535150592 unmapped: 117653504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535158784 unmapped: 117645312 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535240704 unmapped: 117563392 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535248896 unmapped: 117555200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5627006 data_alloc: 234881024 data_used: 23121920
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535248896 unmapped: 117555200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198327000/0x0/0x1bfc00000, data 0x3b588b6/0x3d87000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198327000/0x0/0x1bfc00000, data 0x3b588b6/0x3d87000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5640390 data_alloc: 234881024 data_used: 23592960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198327000/0x0/0x1bfc00000, data 0x3b588b6/0x3d87000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.309547424s of 11.783488274s, submitted: 105
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5640202 data_alloc: 234881024 data_used: 23601152
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198303000/0x0/0x1bfc00000, data 0x3b7c8b6/0x3dab000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5639906 data_alloc: 234881024 data_used: 23601152
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x1982fd000/0x0/0x1bfc00000, data 0x3b828b6/0x3db1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5639906 data_alloc: 234881024 data_used: 23601152
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x1982fd000/0x0/0x1bfc00000, data 0x3b828b6/0x3db1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.061479568s of 16.081272125s, submitted: 7
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x1982f5000/0x0/0x1bfc00000, data 0x3b8a8b6/0x3db9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5641226 data_alloc: 234881024 data_used: 23625728
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535969792 unmapped: 116834304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc42c94c00 session 0x55bc39fcdc20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc4a7a7c00 session 0x55bc3ec2b2c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x1982ef000/0x0/0x1bfc00000, data 0x3b908b6/0x3dbf000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535986176 unmapped: 116817920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535986176 unmapped: 116817920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5641094 data_alloc: 234881024 data_used: 23625728
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535986176 unmapped: 116817920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535986176 unmapped: 116817920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535986176 unmapped: 116817920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535986176 unmapped: 116817920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x1982ef000/0x0/0x1bfc00000, data 0x3b908b6/0x3dbf000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535994368 unmapped: 116809728 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 421 handle_osd_map epochs [421,422], i have 421, src has [1,422]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.262039185s of 11.294044495s, submitted: 10
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5644740 data_alloc: 234881024 data_used: 23633920
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982eb000/0x0/0x1bfc00000, data 0x3b9250f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 116793344 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 422 ms_handle_reset con 0x55bc3b4d4000 session 0x55bc3d34b0e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 116793344 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 422 ms_handle_reset con 0x55bc3b762400 session 0x55bc3e3ae5a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537067520 unmapped: 115736576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537092096 unmapped: 115712000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537092096 unmapped: 115712000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5644632 data_alloc: 234881024 data_used: 23674880
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537092096 unmapped: 115712000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982ec000/0x0/0x1bfc00000, data 0x3b9250f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537092096 unmapped: 115712000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982ec000/0x0/0x1bfc00000, data 0x3b9250f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537092096 unmapped: 115712000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982ec000/0x0/0x1bfc00000, data 0x3b9250f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537092096 unmapped: 115712000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 422 ms_handle_reset con 0x55bc4a7b3000 session 0x55bc40a932c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 422 ms_handle_reset con 0x55bc3eeeb800 session 0x55bc3a521680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536043520 unmapped: 116760576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5644500 data_alloc: 234881024 data_used: 23674880
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536043520 unmapped: 116760576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536043520 unmapped: 116760576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536043520 unmapped: 116760576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982ec000/0x0/0x1bfc00000, data 0x3b9250f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536043520 unmapped: 116760576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536043520 unmapped: 116760576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5644660 data_alloc: 234881024 data_used: 23678976
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536043520 unmapped: 116760576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.907535553s of 16.930467606s, submitted: 16
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536084480 unmapped: 116719616 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982ec000/0x0/0x1bfc00000, data 0x3b9250f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 422 ms_handle_reset con 0x55bc3b15f400 session 0x55bc3d356960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982ec000/0x0/0x1bfc00000, data 0x540150f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536084480 unmapped: 116719616 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982ec000/0x0/0x1bfc00000, data 0x540150f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536084480 unmapped: 116719616 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536084480 unmapped: 116719616 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 422 handle_osd_map epochs [422,423], i have 422, src has [1,423]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5721002 data_alloc: 234881024 data_used: 23687168
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536092672 unmapped: 116711424 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 423 heartbeat osd_stat(store_statfs(0x1982e8000/0x0/0x1bfc00000, data 0x3b941bc/0x3dc5000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536092672 unmapped: 116711424 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536092672 unmapped: 116711424 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536100864 unmapped: 116703232 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 423 ms_handle_reset con 0x55bc3e398800 session 0x55bc39f2cd20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536109056 unmapped: 116695040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5721002 data_alloc: 234881024 data_used: 23687168
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536109056 unmapped: 116695040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 423 heartbeat osd_stat(store_statfs(0x1982e8000/0x0/0x1bfc00000, data 0x3b941bc/0x3dc5000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536109056 unmapped: 116695040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 423 heartbeat osd_stat(store_statfs(0x1982e8000/0x0/0x1bfc00000, data 0x3b941bc/0x3dc5000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536117248 unmapped: 116686848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536125440 unmapped: 116678656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 423 heartbeat osd_stat(store_statfs(0x1982e8000/0x0/0x1bfc00000, data 0x3b941bc/0x3dc5000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5721134 data_alloc: 234881024 data_used: 23687168
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 423 handle_osd_map epochs [423,424], i have 423, src has [1,424]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.398308754s of 15.507222176s, submitted: 36
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e5000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5724108 data_alloc: 234881024 data_used: 23687168
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e5000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536141824 unmapped: 116662272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e4000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5736340 data_alloc: 234881024 data_used: 25006080
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e4000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5736340 data_alloc: 234881024 data_used: 25006080
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e4000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e4000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 116523008 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 116523008 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e4000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 116523008 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5736500 data_alloc: 234881024 data_used: 25010176
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 116523008 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 116523008 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 116523008 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 116523008 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e4000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536289280 unmapped: 116514816 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e4000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5736660 data_alloc: 234881024 data_used: 25014272
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536289280 unmapped: 116514816 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536289280 unmapped: 116514816 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536289280 unmapped: 116514816 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536289280 unmapped: 116514816 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.087652206s of 27.103370667s, submitted: 17
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536297472 unmapped: 116506624 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e3000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5737300 data_alloc: 234881024 data_used: 25014272
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536297472 unmapped: 116506624 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3d1ec800 session 0x55bc3ec2bc20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42902400 session 0x55bc3b248960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536297472 unmapped: 116506624 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528023552 unmapped: 124780544 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528023552 unmapped: 124780544 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528023552 unmapped: 124780544 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3f11d000 session 0x55bc3e3aeb40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5399828 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527925248 unmapped: 124878848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527925248 unmapped: 124878848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527925248 unmapped: 124878848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527925248 unmapped: 124878848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527925248 unmapped: 124878848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5399828 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527933440 unmapped: 124870656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527933440 unmapped: 124870656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527933440 unmapped: 124870656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527941632 unmapped: 124862464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527941632 unmapped: 124862464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5399828 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527941632 unmapped: 124862464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527941632 unmapped: 124862464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527941632 unmapped: 124862464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527941632 unmapped: 124862464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5399828 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5399828 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527958016 unmapped: 124846080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527958016 unmapped: 124846080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527958016 unmapped: 124846080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5399828 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527958016 unmapped: 124846080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527958016 unmapped: 124846080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527966208 unmapped: 124837888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527966208 unmapped: 124837888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3dc22800 session 0x55bc3fd17680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc3f2de1e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3eeebc00 session 0x55bc3be04000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3b762000 session 0x55bc40a92b40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 35.318782806s of 35.413410187s, submitted: 31
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527982592 unmapped: 124821504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5401549 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532176896 unmapped: 120627200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc391e4800 session 0x55bc3b30be00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3b762000 session 0x55bc39eafa40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3dc22800 session 0x55bc3b30a000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3eeebc00 session 0x55bc39864780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc395dc780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527990784 unmapped: 124813312 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527990784 unmapped: 124813312 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527990784 unmapped: 124813312 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527990784 unmapped: 124813312 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x198f89000/0x0/0x1bfc00000, data 0x2ef3cec/0x3125000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5491623 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527941632 unmapped: 124862464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3ef3d000 session 0x55bc3d34b2c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528097280 unmapped: 124706816 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528113664 unmapped: 124690432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x198f65000/0x0/0x1bfc00000, data 0x2f17cec/0x3149000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5582506 data_alloc: 234881024 data_used: 23056384
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x198f65000/0x0/0x1bfc00000, data 0x2f17cec/0x3149000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5582506 data_alloc: 234881024 data_used: 23056384
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x198f65000/0x0/0x1bfc00000, data 0x2f17cec/0x3149000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x198f65000/0x0/0x1bfc00000, data 0x2f17cec/0x3149000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.492254257s of 19.686597824s, submitted: 31
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x198f65000/0x0/0x1bfc00000, data 0x2f17cec/0x3149000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5613674 data_alloc: 234881024 data_used: 23433216
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5620858 data_alloc: 234881024 data_used: 23760896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5620858 data_alloc: 234881024 data_used: 23760896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5620858 data_alloc: 234881024 data_used: 23760896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.330408096s of 19.435075760s, submitted: 26
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529465344 unmapped: 123338752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3eeed000 session 0x55bc3971a780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3b15fc00 session 0x55bc3be045a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529465344 unmapped: 123338752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5619886 data_alloc: 234881024 data_used: 23760896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529473536 unmapped: 123330560 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529473536 unmapped: 123330560 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529473536 unmapped: 123330560 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529473536 unmapped: 123330560 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529473536 unmapped: 123330560 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5619886 data_alloc: 234881024 data_used: 23760896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529481728 unmapped: 123322368 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529481728 unmapped: 123322368 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529481728 unmapped: 123322368 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529489920 unmapped: 123314176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.088842392s of 11.104266167s, submitted: 5
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc40646000 session 0x55bc3e1f4960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc4a7afc00 session 0x55bc40a93a40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529489920 unmapped: 123314176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3e197c00 session 0x55bc395ba5a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5411382 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab96000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5411382 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab96000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5411382 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529465344 unmapped: 123338752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529465344 unmapped: 123338752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab96000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab96000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5411382 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529465344 unmapped: 123338752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529465344 unmapped: 123338752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529465344 unmapped: 123338752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529473536 unmapped: 123330560 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab96000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529473536 unmapped: 123330560 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab96000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab96000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5411382 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529481728 unmapped: 123322368 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529489920 unmapped: 123314176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529489920 unmapped: 123314176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529489920 unmapped: 123314176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 24.814048767s of 24.931154251s, submitted: 42
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a850000/0x0/0x1bfc00000, data 0x266bd15/0x289e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [0,0,0,0,1])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3e347400 session 0x55bc3d357860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529629184 unmapped: 123174912 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a606800 session 0x55bc39ed6f00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3c19e000 session 0x55bc3d34a960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3ba8e800 session 0x55bc3a051a40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc4a7b2800 session 0x55bc4337f860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5448616 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a850000/0x0/0x1bfc00000, data 0x266bd4e/0x289e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530685952 unmapped: 122118144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530685952 unmapped: 122118144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a850000/0x0/0x1bfc00000, data 0x266bd4e/0x289e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530685952 unmapped: 122118144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530685952 unmapped: 122118144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3b764400 session 0x55bc3b2fbe00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530694144 unmapped: 122109952 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5461950 data_alloc: 218103808 data_used: 13312000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530694144 unmapped: 122109952 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a84f000/0x0/0x1bfc00000, data 0x266bd71/0x289f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530849792 unmapped: 121954304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a84f000/0x0/0x1bfc00000, data 0x266bd71/0x289f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530849792 unmapped: 121954304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530849792 unmapped: 121954304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530849792 unmapped: 121954304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5474590 data_alloc: 218103808 data_used: 15101952
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530849792 unmapped: 121954304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530849792 unmapped: 121954304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530849792 unmapped: 121954304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a84f000/0x0/0x1bfc00000, data 0x266bd71/0x289f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530857984 unmapped: 121946112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530857984 unmapped: 121946112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5474590 data_alloc: 218103808 data_used: 15101952
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530857984 unmapped: 121946112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.565097809s of 17.140762329s, submitted: 55
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535273472 unmapped: 117530624 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a0cc000/0x0/0x1bfc00000, data 0x2deed71/0x3022000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535281664 unmapped: 117522432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535568384 unmapped: 117235712 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535568384 unmapped: 117235712 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5554052 data_alloc: 218103808 data_used: 17031168
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535568384 unmapped: 117235712 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535568384 unmapped: 117235712 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a049000/0x0/0x1bfc00000, data 0x2e71d71/0x30a5000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535568384 unmapped: 117235712 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535568384 unmapped: 117235712 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534437888 unmapped: 118366208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a02a000/0x0/0x1bfc00000, data 0x2e90d71/0x30c4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5550496 data_alloc: 218103808 data_used: 17035264
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534437888 unmapped: 118366208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534437888 unmapped: 118366208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534437888 unmapped: 118366208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a02a000/0x0/0x1bfc00000, data 0x2e90d71/0x30c4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5550816 data_alloc: 218103808 data_used: 17043456
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a02a000/0x0/0x1bfc00000, data 0x2e90d71/0x30c4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a02a000/0x0/0x1bfc00000, data 0x2e90d71/0x30c4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a02a000/0x0/0x1bfc00000, data 0x2e90d71/0x30c4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5550816 data_alloc: 218103808 data_used: 17043456
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.096275330s of 20.498874664s, submitted: 111
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534454272 unmapped: 118349824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534454272 unmapped: 118349824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a01d000/0x0/0x1bfc00000, data 0x2e9dd71/0x30d1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534454272 unmapped: 118349824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534454272 unmapped: 118349824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a01d000/0x0/0x1bfc00000, data 0x2e9dd71/0x30d1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3e398c00 session 0x55bc3e1f4780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3b142000 session 0x55bc3d34b680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5550120 data_alloc: 218103808 data_used: 17043456
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534454272 unmapped: 118349824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a01d000/0x0/0x1bfc00000, data 0x2e9dd71/0x30d1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534454272 unmapped: 118349824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3d5fd800 session 0x55bc39fcda40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532070400 unmapped: 120733696 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532070400 unmapped: 120733696 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532070400 unmapped: 120733696 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422218 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422218 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422218 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422218 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422218 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422218 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532086784 unmapped: 120717312 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532086784 unmapped: 120717312 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 36.724323273s of 36.925601959s, submitted: 61
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc40646000 session 0x55bc396e85a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532094976 unmapped: 120709120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532094976 unmapped: 120709120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5529662 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532094976 unmapped: 120709120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199db1000/0x0/0x1bfc00000, data 0x310bcec/0x333d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5529662 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199db1000/0x0/0x1bfc00000, data 0x310bcec/0x333d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3f11d400 session 0x55bc3f2def00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc39eaf4a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3ba8f000 session 0x55bc3b2492c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.088305473s of 11.171178818s, submitted: 14
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3b1a1800 session 0x55bc392ab860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532299776 unmapped: 120504320 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199db1000/0x0/0x1bfc00000, data 0x310bcec/0x333d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199d8b000/0x0/0x1bfc00000, data 0x312fd1f/0x3363000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5536818 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532357120 unmapped: 120446976 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199d8b000/0x0/0x1bfc00000, data 0x312fd1f/0x3363000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5640338 data_alloc: 234881024 data_used: 26611712
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199d8b000/0x0/0x1bfc00000, data 0x312fd1f/0x3363000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5640338 data_alloc: 234881024 data_used: 26611712
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199d8b000/0x0/0x1bfc00000, data 0x312fd1f/0x3363000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.251974106s of 12.289438248s, submitted: 13
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538451968 unmapped: 114352128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540524544 unmapped: 112279552 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995a4000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5712800 data_alloc: 234881024 data_used: 27160576
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995a4000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5712800 data_alloc: 234881024 data_used: 27160576
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995a4000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.383799553s of 10.515339851s, submitted: 71
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541868032 unmapped: 110936064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541868032 unmapped: 110936064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995ac000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995ac000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541868032 unmapped: 110936064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3d5fd000 session 0x55bc3d3565a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc3b2fa1e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5712188 data_alloc: 234881024 data_used: 27148288
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541868032 unmapped: 110936064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541868032 unmapped: 110936064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995ac000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5709728 data_alloc: 234881024 data_used: 27312128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995ac000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5709728 data_alloc: 234881024 data_used: 27312128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995ac000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.376385689s of 15.460625648s, submitted: 23
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995ac000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5716368 data_alloc: 234881024 data_used: 27676672
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995ac000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a607800 session 0x55bc39864780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc3e3ae960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6000 session 0x55bc3f2def00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326d0f/0x2559000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539688960 unmapped: 113115136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539688960 unmapped: 113115136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42c94c00 session 0x55bc3d3561e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42c94c00 session 0x55bc39eae3c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc4337f2c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a607800 session 0x55bc3f2dfe00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539688960 unmapped: 113115136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 55.381656647s of 55.483505249s, submitted: 52
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6000 session 0x55bc392abc20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc3a0501e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc3be04d20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc3e3afa40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a607800 session 0x55bc3fd161e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5524672 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a101000/0x0/0x1bfc00000, data 0x2dbacfc/0x2fed000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a101000/0x0/0x1bfc00000, data 0x2dbacfc/0x2fed000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5524672 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42c95800 session 0x55bc395dd4a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc4a7aac00 session 0x55bc40a92b40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a101000/0x0/0x1bfc00000, data 0x2dbacfc/0x2fed000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc395bab40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.130089760s of 10.246961594s, submitted: 29
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a607800 session 0x55bc3fd16780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540180480 unmapped: 112623616 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540180480 unmapped: 112623616 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5530299 data_alloc: 218103808 data_used: 12058624
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a0db000/0x0/0x1bfc00000, data 0x2dded2f/0x3013000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540262400 unmapped: 112541696 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.5 total, 600.0 interval#012Cumulative writes: 77K writes, 307K keys, 77K commit groups, 1.0 writes per commit group, ingest: 0.31 GB, 0.04 MB/s#012Cumulative WAL: 77K writes, 28K syncs, 2.69 writes per sync, written: 0.31 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2146 writes, 8860 keys, 2146 commit groups, 1.0 writes per commit group, ingest: 7.11 MB, 0.01 MB/s#012Interval WAL: 2146 writes, 874 syncs, 2.46 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5601179 data_alloc: 218103808 data_used: 21934080
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a0db000/0x0/0x1bfc00000, data 0x2dded2f/0x3013000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a0db000/0x0/0x1bfc00000, data 0x2dded2f/0x3013000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a0db000/0x0/0x1bfc00000, data 0x2dded2f/0x3013000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: mgrc ms_handle_reset ms_handle_reset con 0x55bc4a7b0800
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/530399322
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/530399322,v1:192.168.122.100:6801/530399322]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5601179 data_alloc: 218103808 data_used: 21934080
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: mgrc handle_mgr_configure stats_period=5
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.702584267s of 13.730019569s, submitted: 8
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539394048 unmapped: 113410048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c3f000/0x0/0x1bfc00000, data 0x3272d2f/0x34a7000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540614656 unmapped: 112189440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5656561 data_alloc: 234881024 data_used: 22425600
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b92000/0x0/0x1bfc00000, data 0x331fd2f/0x3554000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b92000/0x0/0x1bfc00000, data 0x331fd2f/0x3554000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5656577 data_alloc: 234881024 data_used: 22425600
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b92000/0x0/0x1bfc00000, data 0x331fd2f/0x3554000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b9a000/0x0/0x1bfc00000, data 0x331fd2f/0x3554000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541007872 unmapped: 111796224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541007872 unmapped: 111796224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.218096733s of 11.472540855s, submitted: 101
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc4337e000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42c95800 session 0x55bc3a1621e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541007872 unmapped: 111796224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc40646000 session 0x55bc3d356b40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc40646000 session 0x55bc39fc61e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42c95000 session 0x55bc39eafa40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc3fd16b40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540508160 unmapped: 112295936 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a607800 session 0x55bc3be054a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc398650e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc39665860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc3a05d680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a607800 session 0x55bc3971ba40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42c94800 session 0x55bc391de5a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5541273 data_alloc: 218103808 data_used: 12050432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540753920 unmapped: 112050176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3b162800 session 0x55bc40a930e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a1c5000/0x0/0x1bfc00000, data 0x2cf5d5e/0x2f29000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc396e8000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540753920 unmapped: 112050176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a607800 session 0x55bc40a93680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540753920 unmapped: 112050176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540901376 unmapped: 111902720 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc4337e5a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540901376 unmapped: 111902720 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5557071 data_alloc: 218103808 data_used: 13787136
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 542064640 unmapped: 110739456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a1a0000/0x0/0x1bfc00000, data 0x2d19d6e/0x2f4e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5607791 data_alloc: 218103808 data_used: 20914176
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a1a0000/0x0/0x1bfc00000, data 0x2d19d6e/0x2f4e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5607791 data_alloc: 218103808 data_used: 20914176
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a1a0000/0x0/0x1bfc00000, data 0x2d19d6e/0x2f4e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.004535675s of 18.269424438s, submitted: 95
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3ef3dc00 session 0x55bc3b203680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545677312 unmapped: 107126784 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546119680 unmapped: 106684416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545898496 unmapped: 106905600 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545906688 unmapped: 106897408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1988b9000/0x0/0x1bfc00000, data 0x3457d6e/0x368c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23caf9c7), peers [0,1] op hist [0,1])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1988b9000/0x0/0x1bfc00000, data 0x3457d6e/0x368c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23caf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5684591 data_alloc: 218103808 data_used: 21942272
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545923072 unmapped: 106881024 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545923072 unmapped: 106881024 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545964032 unmapped: 106840064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545964032 unmapped: 106840064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1988c2000/0x0/0x1bfc00000, data 0x3457d6e/0x368c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23caf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545964032 unmapped: 106840064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5678819 data_alloc: 218103808 data_used: 21946368
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546029568 unmapped: 106774528 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.653300285s of 10.073910713s, submitted: 353
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546029568 unmapped: 106774528 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546037760 unmapped: 106766336 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42c94800 session 0x55bc39dc5a40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3e346800 session 0x55bc3971ab40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547094528 unmapped: 105709568 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x198491000/0x0/0x1bfc00000, data 0x3478d6e/0x36ad000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547094528 unmapped: 105709568 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5465246 data_alloc: 218103808 data_used: 12070912
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543596544 unmapped: 109207552 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc3be045a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5465246 data_alloc: 218103808 data_used: 12070912
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5465246 data_alloc: 218103808 data_used: 12070912
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5465246 data_alloc: 218103808 data_used: 12070912
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5465246 data_alloc: 218103808 data_used: 12070912
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544669696 unmapped: 108134400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544669696 unmapped: 108134400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5465246 data_alloc: 218103808 data_used: 12070912
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544669696 unmapped: 108134400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544669696 unmapped: 108134400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.991601944s of 31.428083420s, submitted: 139
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544669696 unmapped: 108134400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3ba8e800 session 0x55bc3ec2a780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc4a7ab000 session 0x55bc3d3572c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc4a7b1800 session 0x55bc4337e1e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544694272 unmapped: 108109824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6800 session 0x55bc3e3aeb40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6800 session 0x55bc3dbd3860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544694272 unmapped: 108109824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19921e000/0x0/0x1bfc00000, data 0x26eecec/0x2920000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498387 data_alloc: 218103808 data_used: 12070912
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544694272 unmapped: 108109824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544694272 unmapped: 108109824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544694272 unmapped: 108109824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19921e000/0x0/0x1bfc00000, data 0x26eecec/0x2920000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544694272 unmapped: 108109824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544694272 unmapped: 108109824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc4a7aa000 session 0x55bc3fd161e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5500775 data_alloc: 218103808 data_used: 12070912
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544849920 unmapped: 107954176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544849920 unmapped: 107954176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544858112 unmapped: 107945984 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1991fa000/0x0/0x1bfc00000, data 0x2712cec/0x2944000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5528107 data_alloc: 218103808 data_used: 15839232
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1991fa000/0x0/0x1bfc00000, data 0x2712cec/0x2944000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5528107 data_alloc: 218103808 data_used: 15839232
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1991fa000/0x0/0x1bfc00000, data 0x2712cec/0x2944000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.569227219s of 21.706684113s, submitted: 22
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1991fa000/0x0/0x1bfc00000, data 0x2712cec/0x2944000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548446208 unmapped: 104357888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1991fa000/0x0/0x1bfc00000, data 0x2712cec/0x2944000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [0,0,0,0,0,0,1,1,11,20])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5629407 data_alloc: 218103808 data_used: 15859712
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546668544 unmapped: 106135552 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546684928 unmapped: 106119168 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19851f000/0x0/0x1bfc00000, data 0x33edcec/0x361f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546684928 unmapped: 106119168 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546684928 unmapped: 106119168 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1984de000/0x0/0x1bfc00000, data 0x342ecec/0x3660000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547110912 unmapped: 105693184 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5639943 data_alloc: 218103808 data_used: 16769024
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19849d000/0x0/0x1bfc00000, data 0x346fcec/0x36a1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5640103 data_alloc: 218103808 data_used: 16773120
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.593935966s of 11.477274895s, submitted: 78
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19847b000/0x0/0x1bfc00000, data 0x3491cec/0x36c3000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5640823 data_alloc: 218103808 data_used: 16773120
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc39921400 session 0x55bc395ba5a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3e398400 session 0x55bc395dcd20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19846d000/0x0/0x1bfc00000, data 0x349fcec/0x36d1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19846d000/0x0/0x1bfc00000, data 0x349fcec/0x36d1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5639191 data_alloc: 218103808 data_used: 16773120
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19846d000/0x0/0x1bfc00000, data 0x349fcec/0x36d1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3eeecc00 session 0x55bc392aa5a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.454691887s of 12.702719688s, submitted: 11
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 424 handle_osd_map epochs [424,425], i have 424, src has [1,425]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 425 ms_handle_reset con 0x55bc3eeecc00 session 0x55bc3b30a780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 425 ms_handle_reset con 0x55bc39921400 session 0x55bc3d3570e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 425 ms_handle_reset con 0x55bc3e398400 session 0x55bc4337f860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5643365 data_alloc: 218103808 data_used: 16781312
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 425 heartbeat osd_stat(store_statfs(0x198469000/0x0/0x1bfc00000, data 0x34a1945/0x36d4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 425 heartbeat osd_stat(store_statfs(0x198469000/0x0/0x1bfc00000, data 0x34a1945/0x36d4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5644137 data_alloc: 218103808 data_used: 16822272
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 425 heartbeat osd_stat(store_statfs(0x198469000/0x0/0x1bfc00000, data 0x34a1945/0x36d4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 425 heartbeat osd_stat(store_statfs(0x198469000/0x0/0x1bfc00000, data 0x34a1945/0x36d4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5644137 data_alloc: 218103808 data_used: 16822272
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 425 heartbeat osd_stat(store_statfs(0x198469000/0x0/0x1bfc00000, data 0x34a1945/0x36d4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.551640511s of 15.687273026s, submitted: 8
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 425 ms_handle_reset con 0x55bc3b762800 session 0x55bc3b30be00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 425 heartbeat osd_stat(store_statfs(0x198469000/0x0/0x1bfc00000, data 0x34a1945/0x36d4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547135488 unmapped: 105668608 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 425 handle_osd_map epochs [425,426], i have 425, src has [1,426]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 426 ms_handle_reset con 0x55bc3a607800 session 0x55bc3e3ae1e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5654547 data_alloc: 218103808 data_used: 17137664
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 105660416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 426 heartbeat osd_stat(store_statfs(0x198464000/0x0/0x1bfc00000, data 0x34a35f2/0x36d7000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5660467 data_alloc: 218103808 data_used: 17682432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 426 heartbeat osd_stat(store_statfs(0x198464000/0x0/0x1bfc00000, data 0x34a35f2/0x36d7000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5660467 data_alloc: 218103808 data_used: 17682432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 426 handle_osd_map epochs [426,427], i have 426, src has [1,427]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.463970184s of 11.542026520s, submitted: 28
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3a42b400 session 0x55bc3e1f5c20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b1a1400 session 0x55bc392a5680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198464000/0x0/0x1bfc00000, data 0x34a5131/0x36da000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5661809 data_alloc: 218103808 data_used: 17682432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198464000/0x0/0x1bfc00000, data 0x34a5131/0x36da000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc42c95800 session 0x55bc41a75e00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547201024 unmapped: 105603072 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547201024 unmapped: 105603072 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547201024 unmapped: 105603072 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547201024 unmapped: 105603072 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547201024 unmapped: 105603072 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547201024 unmapped: 105603072 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547201024 unmapped: 105603072 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7b3000 session 0x55bc40a93860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b145000 session 0x55bc3d34ad20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b145000 session 0x55bc3f2df0e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3a42b400 session 0x55bc3dbd21e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 56.036472321s of 56.152065277s, submitted: 44
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547225600 unmapped: 105578496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547241984 unmapped: 105562112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b1a1400 session 0x55bc3ec2ad20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198d41000/0x0/0x1bfc00000, data 0x2bc8131/0x2dfd000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5553765 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547250176 unmapped: 105553920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547250176 unmapped: 105553920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547250176 unmapped: 105553920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198d41000/0x0/0x1bfc00000, data 0x2bc8131/0x2dfd000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547250176 unmapped: 105553920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7b2000 session 0x55bc3d357e00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547250176 unmapped: 105553920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5558951 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547250176 unmapped: 105553920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547266560 unmapped: 105537536 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198d1c000/0x0/0x1bfc00000, data 0x2bec154/0x2e22000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5615723 data_alloc: 218103808 data_used: 19992576
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198d1c000/0x0/0x1bfc00000, data 0x2bec154/0x2e22000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198d1c000/0x0/0x1bfc00000, data 0x2bec154/0x2e22000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5615723 data_alloc: 218103808 data_used: 19992576
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.351263046s of 21.458789825s, submitted: 24
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547880960 unmapped: 104923136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198d1c000/0x0/0x1bfc00000, data 0x2bec154/0x2e22000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551190528 unmapped: 101613568 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549167104 unmapped: 103636992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5682095 data_alloc: 218103808 data_used: 20029440
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1984a4000/0x0/0x1bfc00000, data 0x345e154/0x3694000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549167104 unmapped: 103636992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549167104 unmapped: 103636992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550215680 unmapped: 102588416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550215680 unmapped: 102588416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550215680 unmapped: 102588416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5695981 data_alloc: 218103808 data_used: 20627456
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198484000/0x0/0x1bfc00000, data 0x3479154/0x36af000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550215680 unmapped: 102588416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550215680 unmapped: 102588416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550215680 unmapped: 102588416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550215680 unmapped: 102588416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.743067741s of 11.654676437s, submitted: 77
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5695233 data_alloc: 218103808 data_used: 20615168
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848d000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f12a000 session 0x55bc3b2f45a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d1ed800 session 0x55bc39f2cd20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848d000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5695101 data_alloc: 218103808 data_used: 20615168
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bf4ac00 session 0x55bc3dbd3a40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848d000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848d000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550232064 unmapped: 102572032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.722215652s of 10.750947952s, submitted: 25
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5691153 data_alloc: 218103808 data_used: 20619264
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550240256 unmapped: 102563840 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5691953 data_alloc: 218103808 data_used: 20717568
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5691953 data_alloc: 218103808 data_used: 20717568
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550264832 unmapped: 102539264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.778625488s of 11.782457352s, submitted: 1
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550281216 unmapped: 102522880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848c000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550281216 unmapped: 102522880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848c000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5696401 data_alloc: 218103808 data_used: 20930560
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5696401 data_alloc: 218103808 data_used: 20930560
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5696401 data_alloc: 218103808 data_used: 20930560
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b15fc00 session 0x55bc3e3afa40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.834457397s of 14.847952843s, submitted: 15
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3ef3c400 session 0x55bc40a93c20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550305792 unmapped: 102498304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547397632 unmapped: 105406464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5505715 data_alloc: 218103808 data_used: 12201984
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3dc23400 session 0x55bc3a4501e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498111 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498111 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 105390080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 105390080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 105390080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498111 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 105390080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498111 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498111 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498111 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547430400 unmapped: 105373696 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547430400 unmapped: 105373696 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547438592 unmapped: 105365504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547438592 unmapped: 105365504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498111 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547438592 unmapped: 105365504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547438592 unmapped: 105365504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 41.052555084s of 41.146015167s, submitted: 37
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556376064 unmapped: 96428032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bbdec00 session 0x55bc3971a780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7afc00 session 0x55bc39ed7680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc42c94800 session 0x55bc3b30b860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7ac400 session 0x55bc3d34af00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bf4b000 session 0x55bc41a75e00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547684352 unmapped: 105119744 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1987e1000/0x0/0x1bfc00000, data 0x3128131/0x335d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547684352 unmapped: 105119744 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5617034 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4385c800 session 0x55bc395dd2c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547692544 unmapped: 105111552 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f124800 session 0x55bc3b2f50e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b4d5000 session 0x55bc3e1f5680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547561472 unmapped: 105242624 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc39689400 session 0x55bc3fd17c20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547569664 unmapped: 105234432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1987e0000/0x0/0x1bfc00000, data 0x3128141/0x335e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547569664 unmapped: 105234432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547692544 unmapped: 105111552 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5720140 data_alloc: 234881024 data_used: 26136576
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1987e0000/0x0/0x1bfc00000, data 0x3128141/0x335e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549036032 unmapped: 103768064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549036032 unmapped: 103768064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1987e0000/0x0/0x1bfc00000, data 0x3128141/0x335e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5720140 data_alloc: 234881024 data_used: 26136576
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1987e0000/0x0/0x1bfc00000, data 0x3128141/0x335e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.640283585s of 17.763603210s, submitted: 39
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5720712 data_alloc: 234881024 data_used: 26148864
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1987e0000/0x0/0x1bfc00000, data 0x3128141/0x335e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #59. Immutable memtables: 15.
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 555016192 unmapped: 97787904 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 555106304 unmapped: 97697792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196b3e000/0x0/0x1bfc00000, data 0x3c2a141/0x3e60000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 555294720 unmapped: 97509376 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556400640 unmapped: 96403456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19672a000/0x0/0x1bfc00000, data 0x403e141/0x4274000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556400640 unmapped: 96403456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5868382 data_alloc: 234881024 data_used: 27492352
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556400640 unmapped: 96403456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196330000/0x0/0x1bfc00000, data 0x4432141/0x4668000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556400640 unmapped: 96403456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196330000/0x0/0x1bfc00000, data 0x4432141/0x4668000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556138496 unmapped: 96665600 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556179456 unmapped: 96624640 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557449216 unmapped: 95354880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5871674 data_alloc: 234881024 data_used: 27303936
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.422229767s of 10.152475357s, submitted: 181
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196279000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196279000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5883714 data_alloc: 234881024 data_used: 27533312
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196279000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7a9000 session 0x55bc396e9680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3c19f800 session 0x55bc3d34a000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5879794 data_alloc: 234881024 data_used: 27533312
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19627f000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19627f000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5879794 data_alloc: 234881024 data_used: 27533312
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19627f000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.022401810s of 17.036853790s, submitted: 9
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 95338496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19627f000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 95338496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19627f000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 95338496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5881946 data_alloc: 234881024 data_used: 27537408
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19625a000/0x0/0x1bfc00000, data 0x450e141/0x4744000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557473792 unmapped: 95330304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc42903000 session 0x55bc395ba5a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3fa62800 session 0x55bc3a05d680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bf4b000 session 0x55bc3b30ba40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551862272 unmapped: 100941824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551862272 unmapped: 100941824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551862272 unmapped: 100941824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551862272 unmapped: 100941824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551862272 unmapped: 100941824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551862272 unmapped: 100941824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551862272 unmapped: 100941824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551870464 unmapped: 100933632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551870464 unmapped: 100933632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551870464 unmapped: 100933632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551870464 unmapped: 100933632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551878656 unmapped: 100925440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551878656 unmapped: 100925440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551878656 unmapped: 100925440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551878656 unmapped: 100925440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3ec53400 session 0x55bc39f2d860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bf4b000 session 0x55bc3e1f4b40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3c19f800 session 0x55bc3e3aeb40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3fa62800 session 0x55bc396e8000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 44.188873291s of 44.280071259s, submitted: 44
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552927232 unmapped: 99876864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc42903000 session 0x55bc3fd161e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d5fc000 session 0x55bc3e3ae780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bf4b000 session 0x55bc39fcc960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552091648 unmapped: 100712448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e7216a/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3c19f800 session 0x55bc3a520000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3fa62800 session 0x55bc3be052c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552091648 unmapped: 100712448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552091648 unmapped: 100712448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5612520 data_alloc: 218103808 data_used: 12091392
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552091648 unmapped: 100712448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552091648 unmapped: 100712448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552091648 unmapped: 100712448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552091648 unmapped: 100712448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d5fd000 session 0x55bc3fd16f00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552099840 unmapped: 100704256 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5612520 data_alloc: 218103808 data_used: 12091392
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552099840 unmapped: 100704256 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d029c00 session 0x55bc3ec2a780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5687400 data_alloc: 234881024 data_used: 22646784
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5687400 data_alloc: 234881024 data_used: 22646784
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b1a1800 session 0x55bc4337fe00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5687400 data_alloc: 234881024 data_used: 22646784
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f124800 session 0x55bc3dbd3680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.174053192s of 27.356210709s, submitted: 41
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553181184 unmapped: 99622912 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc42902400 session 0x55bc3dbd3680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548405248 unmapped: 104398848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548405248 unmapped: 104398848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548405248 unmapped: 104398848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 41.755805969s of 41.814205170s, submitted: 18
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b586000 session 0x55bc3e3ae780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3a42b400 session 0x55bc3b30ba40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b1a1800 session 0x55bc395ba5a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b586000 session 0x55bc3fd17c20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f124800 session 0x55bc41a75e00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548413440 unmapped: 104390656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548413440 unmapped: 104390656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548413440 unmapped: 104390656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548413440 unmapped: 104390656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197883000/0x0/0x1bfc00000, data 0x2ee6131/0x311b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548421632 unmapped: 104382464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7a6c00 session 0x55bc3b30b860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5619321 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7b2000 session 0x55bc3d34ba40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b1a1800 session 0x55bc39fccd20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b586000 session 0x55bc41a74f00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f124800 session 0x55bc3fd165a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548421632 unmapped: 104382464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b13c000 session 0x55bc3a0505a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548421632 unmapped: 104382464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197882000/0x0/0x1bfc00000, data 0x2ee615a/0x311c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548421632 unmapped: 104382464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548732928 unmapped: 104071168 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548438016 unmapped: 104366080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5690755 data_alloc: 218103808 data_used: 12087296
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f12bc00 session 0x55bc39665a40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.917882919s of 10.103110313s, submitted: 59
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548626432 unmapped: 104177664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197112000/0x0/0x1bfc00000, data 0x365615a/0x388c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548626432 unmapped: 104177664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966fa000/0x0/0x1bfc00000, data 0x406d16a/0x42a4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548626432 unmapped: 104177664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bf4ac00 session 0x55bc39ed7680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548626432 unmapped: 104177664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7a6c00 session 0x55bc40a92780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d1edc00 session 0x55bc3dbd34a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b721000 session 0x55bc41a74960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b13c000 session 0x55bc4337eb40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b721000 session 0x55bc3f2de5a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bf4ac00 session 0x55bc3e3afa40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d1edc00 session 0x55bc3b2f45a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548642816 unmapped: 104161280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5770569 data_alloc: 218103808 data_used: 12091392
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548642816 unmapped: 104161280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548642816 unmapped: 104161280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966fa000/0x0/0x1bfc00000, data 0x406d1a3/0x42a4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548651008 unmapped: 104153088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7a6400 session 0x55bc3be05a40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f11c400 session 0x55bc3e1f4f00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548659200 unmapped: 104144896 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc39921400 session 0x55bc3d34b4a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d1ec800 session 0x55bc3e3ae3c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548667392 unmapped: 104136704 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5840969 data_alloc: 218103808 data_used: 19673088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550084608 unmapped: 102719488 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3ef3cc00 session 0x55bc3b2f5680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7af000 session 0x55bc3d357860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.621556282s of 10.712708473s, submitted: 30
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966fa000/0x0/0x1bfc00000, data 0x406d1a3/0x42a4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550084608 unmapped: 102719488 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7aec00 session 0x55bc3f2dfc20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3e346c00 session 0x55bc3a05cd20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550395904 unmapped: 102408192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966b0000/0x0/0x1bfc00000, data 0x40b51c3/0x42ee000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550395904 unmapped: 102408192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966b0000/0x0/0x1bfc00000, data 0x40b51c3/0x42ee000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550395904 unmapped: 102408192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5864350 data_alloc: 218103808 data_used: 21291008
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966b0000/0x0/0x1bfc00000, data 0x40b51c3/0x42ee000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966b0000/0x0/0x1bfc00000, data 0x40b51c3/0x42ee000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550404096 unmapped: 102400000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550404096 unmapped: 102400000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551346176 unmapped: 101457920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551346176 unmapped: 101457920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551346176 unmapped: 101457920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5886590 data_alloc: 234881024 data_used: 24281088
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966b0000/0x0/0x1bfc00000, data 0x40b51c3/0x42ee000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551346176 unmapped: 101457920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.204458237s of 10.254082680s, submitted: 13
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551346176 unmapped: 101457920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966b0000/0x0/0x1bfc00000, data 0x40b51c3/0x42ee000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 555794432 unmapped: 97009664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557006848 unmapped: 95797248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556679168 unmapped: 96124928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6050632 data_alloc: 234881024 data_used: 33976320
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x195a88000/0x0/0x1bfc00000, data 0x4cdd1c3/0x4f16000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556679168 unmapped: 96124928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556679168 unmapped: 96124928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556679168 unmapped: 96124928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556679168 unmapped: 96124928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559374336 unmapped: 93429760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6144370 data_alloc: 234881024 data_used: 34029568
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558989312 unmapped: 93814784 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x194e3d000/0x0/0x1bfc00000, data 0x59281c3/0x5b61000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.682522774s of 10.029087067s, submitted: 159
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7aa000 session 0x55bc3b248960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b586000 session 0x55bc41a75680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 93806592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 93790208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 93790208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x194e96000/0x0/0x1bfc00000, data 0x42d01b3/0x4508000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3e196000 session 0x55bc3d34b680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 93790208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5940665 data_alloc: 234881024 data_used: 28221440
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x194eba000/0x0/0x1bfc00000, data 0x42ac1b3/0x44e4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565108736 unmapped: 87695360 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565108736 unmapped: 87695360 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc39921400 session 0x55bc3d34af00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d029000 session 0x55bc39dc5a40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19568f000/0x0/0x1bfc00000, data 0x50d11b3/0x5309000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565108736 unmapped: 87695360 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x195695000/0x0/0x1bfc00000, data 0x50d11b3/0x5309000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc39921400 session 0x55bc3e3ae780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196c2b000/0x0/0x1bfc00000, data 0x3b18141/0x3d4e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563871744 unmapped: 88932352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563871744 unmapped: 88932352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5804774 data_alloc: 218103808 data_used: 20484096
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196c2b000/0x0/0x1bfc00000, data 0x3b18141/0x3d4e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563871744 unmapped: 88932352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563871744 unmapped: 88932352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563871744 unmapped: 88932352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.577011108s of 12.010025024s, submitted: 196
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7a6c00 session 0x55bc3d34b0e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563871744 unmapped: 88932352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7ab800 session 0x55bc3f2de780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559185920 unmapped: 93618176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196c50000/0x0/0x1bfc00000, data 0x3b18141/0x3d4e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5804166 data_alloc: 218103808 data_used: 20484096
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f12b800 session 0x55bc3a520000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 49.914997101s of 49.995937347s, submitted: 32
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc42903000 session 0x55bc41a750e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3eeeb400 session 0x55bc3e1f50e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3c2a2c00 session 0x55bc3e1f4780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b587000 session 0x55bc39f2cd20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3ba8ec00 session 0x55bc3e3ae960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5659481 data_alloc: 218103808 data_used: 11952128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197882000/0x0/0x1bfc00000, data 0x2ee6193/0x311c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b43b400 session 0x55bc39865a40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5659481 data_alloc: 218103808 data_used: 11952128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551608320 unmapped: 101195776 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197882000/0x0/0x1bfc00000, data 0x2ee6193/0x311c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5736957 data_alloc: 234881024 data_used: 22507520
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197882000/0x0/0x1bfc00000, data 0x2ee6193/0x311c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.5 total, 600.0 interval#012Cumulative writes: 80K writes, 319K keys, 80K commit groups, 1.0 writes per commit group, ingest: 0.32 GB, 0.04 MB/s#012Cumulative WAL: 80K writes, 29K syncs, 2.68 writes per sync, written: 0.32 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3146 writes, 12K keys, 3146 commit groups, 1.0 writes per commit group, ingest: 12.82 MB, 0.02 MB/s#012Interval WAL: 3146 writes, 1219 syncs, 2.58 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5736957 data_alloc: 234881024 data_used: 22507520
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197882000/0x0/0x1bfc00000, data 0x2ee6193/0x311c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.433856964s of 19.576021194s, submitted: 48
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554033152 unmapped: 98770944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554041344 unmapped: 98762752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553369600 unmapped: 99434496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19721a000/0x0/0x1bfc00000, data 0x354d193/0x3783000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5791039 data_alloc: 234881024 data_used: 23113728
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197212000/0x0/0x1bfc00000, data 0x3553193/0x3789000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5806577 data_alloc: 234881024 data_used: 23191552
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5806593 data_alloc: 234881024 data_used: 23191552
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5806593 data_alloc: 234881024 data_used: 23191552
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5806593 data_alloc: 234881024 data_used: 23191552
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.315835953s of 25.517301559s, submitted: 100
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553443328 unmapped: 99360768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553443328 unmapped: 99360768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553451520 unmapped: 99352576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5801905 data_alloc: 234881024 data_used: 23179264
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553451520 unmapped: 99352576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553451520 unmapped: 99352576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971ff000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553451520 unmapped: 99352576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f124400 session 0x55bc3a47c1e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553459712 unmapped: 99344384 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971ff000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553459712 unmapped: 99344384 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5801097 data_alloc: 234881024 data_used: 23179264
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b43bc00 session 0x55bc4337e1e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553467904 unmapped: 99336192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 handle_osd_map epochs [427,428], i have 427, src has [1,428]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 427 handle_osd_map epochs [428,428], i have 428, src has [1,428]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 428 ms_handle_reset con 0x55bc3eeea400 session 0x55bc39fb1c20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 428 ms_handle_reset con 0x55bc3b764400 session 0x55bc39ed7680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 428 ms_handle_reset con 0x55bc3b13d000 session 0x55bc40a92780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 428 ms_handle_reset con 0x55bc3b43b400 session 0x55bc3e1f45a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 566214656 unmapped: 101122048 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 428 handle_osd_map epochs [428,429], i have 428, src has [1,429]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 429 ms_handle_reset con 0x55bc3b43bc00 session 0x55bc3f2de000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 101105664 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 429 handle_osd_map epochs [429,430], i have 429, src has [1,430]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.597118378s of 10.926770210s, submitted: 94
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 101105664 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 430 ms_handle_reset con 0x55bc3eeea400 session 0x55bc395bb860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 101105664 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 430 ms_handle_reset con 0x55bc4a7ab400 session 0x55bc3fd16b40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 430 heartbeat osd_stat(store_statfs(0x195ac3000/0x0/0x1bfc00000, data 0x4c9f70e/0x4ed9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 430 ms_handle_reset con 0x55bc4a7ab400 session 0x55bc3b2485a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6009786 data_alloc: 234881024 data_used: 30371840
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 430 ms_handle_reset con 0x55bc3b13d000 session 0x55bc39fd4f00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559546368 unmapped: 107790336 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 430 heartbeat osd_stat(store_statfs(0x195ac5000/0x0/0x1bfc00000, data 0x4c9f70e/0x4ed9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559546368 unmapped: 107790336 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559554560 unmapped: 107782144 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 430 ms_handle_reset con 0x55bc4a7b2c00 session 0x55bc3ec2be00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553918464 unmapped: 113418240 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 430 heartbeat osd_stat(store_statfs(0x196d02000/0x0/0x1bfc00000, data 0x3a626ac/0x3c9b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 430 heartbeat osd_stat(store_statfs(0x196d02000/0x0/0x1bfc00000, data 0x3a626ac/0x3c9b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553918464 unmapped: 113418240 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5769078 data_alloc: 218103808 data_used: 11976704
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 430 heartbeat osd_stat(store_statfs(0x196d02000/0x0/0x1bfc00000, data 0x3a626ac/0x3c9b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553967616 unmapped: 113369088 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 430 heartbeat osd_stat(store_statfs(0x196d02000/0x0/0x1bfc00000, data 0x3a626ac/0x3c9b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553967616 unmapped: 113369088 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 430 handle_osd_map epochs [430,431], i have 430, src has [1,431]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554008576 unmapped: 113328128 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 heartbeat osd_stat(store_statfs(0x1968ef000/0x0/0x1bfc00000, data 0x3a641eb/0x3c9e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.445232391s of 10.060069084s, submitted: 237
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554057728 unmapped: 113278976 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 heartbeat osd_stat(store_statfs(0x1968f0000/0x0/0x1bfc00000, data 0x3a641eb/0x3c9e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5772196 data_alloc: 218103808 data_used: 11984896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 heartbeat osd_stat(store_statfs(0x1968f0000/0x0/0x1bfc00000, data 0x3a641eb/0x3c9e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 heartbeat osd_stat(store_statfs(0x1968f0000/0x0/0x1bfc00000, data 0x3a641eb/0x3c9e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5772196 data_alloc: 218103808 data_used: 11984896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 heartbeat osd_stat(store_statfs(0x1968f0000/0x0/0x1bfc00000, data 0x3a641eb/0x3c9e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5772196 data_alloc: 218103808 data_used: 11984896
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc42950000 session 0x55bc3ec2a780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.059436798s of 12.233953476s, submitted: 70
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc3fa62800 session 0x55bc3e1f5e00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc3c19f800 session 0x55bc41a743c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc3b13d000 session 0x55bc39fcd860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc42950000 session 0x55bc3d34a000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc4a7ab400 session 0x55bc41a75e00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc4a7b2c00 session 0x55bc392ab2c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 heartbeat osd_stat(store_statfs(0x1968f0000/0x0/0x1bfc00000, data 0x3a641eb/0x3c9e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc3b13d000 session 0x55bc3a05c3c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 577232896 unmapped: 90103808 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 577232896 unmapped: 90103808 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 577232896 unmapped: 90103808 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc3eeea400 session 0x55bc391df2c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 569237504 unmapped: 98099200 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5926373 data_alloc: 234881024 data_used: 36306944
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 431 handle_osd_map epochs [431,432], i have 431, src has [1,432]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 432 heartbeat osd_stat(store_statfs(0x195e62000/0x0/0x1bfc00000, data 0x44f21eb/0x472c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [1])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 569253888 unmapped: 98082816 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 432 ms_handle_reset con 0x55bc3a472800 session 0x55bc4337fc20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 432 heartbeat osd_stat(store_statfs(0x19758f000/0x0/0x1bfc00000, data 0x2dc2e98/0x2ffe000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 432 heartbeat osd_stat(store_statfs(0x19758f000/0x0/0x1bfc00000, data 0x2dc2e98/0x2ffe000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5731863 data_alloc: 218103808 data_used: 16175104
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 432 handle_osd_map epochs [432,433], i have 432, src has [1,433]
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.662106514s of 10.894762039s, submitted: 70
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19758c000/0x0/0x1bfc00000, data 0x2dc49d7/0x3001000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19758c000/0x0/0x1bfc00000, data 0x2dc49d7/0x3001000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5734837 data_alloc: 218103808 data_used: 16175104
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19758c000/0x0/0x1bfc00000, data 0x2dc49d7/0x3001000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 102866944 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 102866944 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5754161 data_alloc: 218103808 data_used: 17903616
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 102866944 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 102866944 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19758d000/0x0/0x1bfc00000, data 0x2dc49d7/0x3001000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564404224 unmapped: 102932480 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.595741272s of 11.623991966s, submitted: 18
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565452800 unmapped: 101883904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565452800 unmapped: 101883904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5753809 data_alloc: 218103808 data_used: 17899520
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19758d000/0x0/0x1bfc00000, data 0x2dc49d7/0x3001000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565452800 unmapped: 101883904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565452800 unmapped: 101883904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565452800 unmapped: 101883904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565452800 unmapped: 101883904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19758d000/0x0/0x1bfc00000, data 0x2dc49d7/0x3001000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565452800 unmapped: 101883904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5753809 data_alloc: 218103808 data_used: 17899520
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19758d000/0x0/0x1bfc00000, data 0x2dc49d7/0x3001000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565452800 unmapped: 101883904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565452800 unmapped: 101883904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565534720 unmapped: 101801984 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565534720 unmapped: 101801984 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565534720 unmapped: 101801984 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5753549 data_alloc: 218103808 data_used: 17895424
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565534720 unmapped: 101801984 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x197585000/0x0/0x1bfc00000, data 0x2dc99d7/0x3006000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.786685944s of 12.819623947s, submitted: 8
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565469184 unmapped: 101867520 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565469184 unmapped: 101867520 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x197588000/0x0/0x1bfc00000, data 0x2dc99d7/0x3006000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565469184 unmapped: 101867520 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565469184 unmapped: 101867520 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5754957 data_alloc: 218103808 data_used: 17940480
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3bf4ac00 session 0x55bc3a47c000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3b4d5c00 session 0x55bc39fb0780
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 105062400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3a472800 session 0x55bc3e1f4f00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 105062400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 105062400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 105062400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 105062400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5622509 data_alloc: 218103808 data_used: 11997184
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 105062400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 105062400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 105062400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 562274304 unmapped: 105062400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3f125400 session 0x55bc4337e1e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560644096 unmapped: 106692608 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3d27cc00 session 0x55bc39fd4f00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5634989 data_alloc: 218103808 data_used: 15536128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560644096 unmapped: 106692608 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3e197800 session 0x55bc39dc4b40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560644096 unmapped: 106692608 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560644096 unmapped: 106692608 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560644096 unmapped: 106692608 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560644096 unmapped: 106692608 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3b142400 session 0x55bc395dd4a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5634989 data_alloc: 218103808 data_used: 15536128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3a472800 session 0x55bc3f2de1e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.363405228s of 19.481782913s, submitted: 30
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3d27cc00 session 0x55bc41a74b40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3e197800 session 0x55bc3b2fa5a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560955392 unmapped: 106381312 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560955392 unmapped: 106381312 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560955392 unmapped: 106381312 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x197a36000/0x0/0x1bfc00000, data 0x291b9d7/0x2b58000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560955392 unmapped: 106381312 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560955392 unmapped: 106381312 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5680753 data_alloc: 218103808 data_used: 15536128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560955392 unmapped: 106381312 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560955392 unmapped: 106381312 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560963584 unmapped: 106373120 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x197a36000/0x0/0x1bfc00000, data 0x291b9d7/0x2b58000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560963584 unmapped: 106373120 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x197a36000/0x0/0x1bfc00000, data 0x291b9d7/0x2b58000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560963584 unmapped: 106373120 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5680753 data_alloc: 218103808 data_used: 15536128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3b763400 session 0x55bc392ab2c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560963584 unmapped: 106373120 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc4a7b0400 session 0x55bc4337fe00
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560963584 unmapped: 106373120 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3a472800 session 0x55bc392a5860
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.712238312s of 11.758738518s, submitted: 9
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3b763400 session 0x55bc3b2f41e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560979968 unmapped: 106356736 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 106348544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x197a34000/0x0/0x1bfc00000, data 0x291ba0a/0x2b5a000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 106348544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5728929 data_alloc: 234881024 data_used: 21544960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 106348544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 106348544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 106348544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x197a34000/0x0/0x1bfc00000, data 0x291ba0a/0x2b5a000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x197a34000/0x0/0x1bfc00000, data 0x291ba0a/0x2b5a000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 106348544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 106348544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5728929 data_alloc: 234881024 data_used: 21544960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x197a34000/0x0/0x1bfc00000, data 0x291ba0a/0x2b5a000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 106348544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 106348544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 106348544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 106348544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 106348544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5728929 data_alloc: 234881024 data_used: 21544960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x197a34000/0x0/0x1bfc00000, data 0x291ba0a/0x2b5a000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 560988160 unmapped: 106348544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.894924164s of 13.925003052s, submitted: 12
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563273728 unmapped: 104062976 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563273728 unmapped: 104062976 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563380224 unmapped: 103956480 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707f000/0x0/0x1bfc00000, data 0x32d0a0a/0x350f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563380224 unmapped: 103956480 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5818995 data_alloc: 234881024 data_used: 23060480
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563380224 unmapped: 103956480 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563380224 unmapped: 103956480 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563388416 unmapped: 103948288 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707f000/0x0/0x1bfc00000, data 0x32d0a0a/0x350f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563388416 unmapped: 103948288 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563388416 unmapped: 103948288 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5816259 data_alloc: 234881024 data_used: 23060480
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563388416 unmapped: 103948288 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707c000/0x0/0x1bfc00000, data 0x32d3a0a/0x3512000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563388416 unmapped: 103948288 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563396608 unmapped: 103940096 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563396608 unmapped: 103940096 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563396608 unmapped: 103940096 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5816259 data_alloc: 234881024 data_used: 23060480
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563404800 unmapped: 103931904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563404800 unmapped: 103931904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707c000/0x0/0x1bfc00000, data 0x32d3a0a/0x3512000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563404800 unmapped: 103931904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707c000/0x0/0x1bfc00000, data 0x32d3a0a/0x3512000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707c000/0x0/0x1bfc00000, data 0x32d3a0a/0x3512000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563404800 unmapped: 103931904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563404800 unmapped: 103931904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5816259 data_alloc: 234881024 data_used: 23060480
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563404800 unmapped: 103931904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707c000/0x0/0x1bfc00000, data 0x32d3a0a/0x3512000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563412992 unmapped: 103923712 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563412992 unmapped: 103923712 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 22.201332092s of 22.393074036s, submitted: 97
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563412992 unmapped: 103923712 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d4a0a/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563412992 unmapped: 103923712 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5816663 data_alloc: 234881024 data_used: 23064576
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563412992 unmapped: 103923712 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d4a0a/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563429376 unmapped: 103907328 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563429376 unmapped: 103907328 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707a000/0x0/0x1bfc00000, data 0x32d5a0a/0x3514000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3a607400 session 0x55bc3fd172c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3b145000 session 0x55bc41a74000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707a000/0x0/0x1bfc00000, data 0x32d5a0a/0x3514000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563429376 unmapped: 103907328 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563429376 unmapped: 103907328 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5816503 data_alloc: 234881024 data_used: 23072768
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563429376 unmapped: 103907328 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563437568 unmapped: 103899136 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707a000/0x0/0x1bfc00000, data 0x32d5a0a/0x3514000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 103890944 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707a000/0x0/0x1bfc00000, data 0x32d5a0a/0x3514000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 103890944 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 103890944 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5816503 data_alloc: 234881024 data_used: 23072768
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 103890944 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 103890944 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563445760 unmapped: 103890944 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707a000/0x0/0x1bfc00000, data 0x32d5a0a/0x3514000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563453952 unmapped: 103882752 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563453952 unmapped: 103882752 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5816503 data_alloc: 234881024 data_used: 23072768
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563453952 unmapped: 103882752 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707a000/0x0/0x1bfc00000, data 0x32d5a0a/0x3514000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563453952 unmapped: 103882752 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707a000/0x0/0x1bfc00000, data 0x32d5a0a/0x3514000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.878314972s of 18.905475616s, submitted: 9
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3dc22400 session 0x55bc3b248000
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563453952 unmapped: 103882752 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563453952 unmapped: 103882752 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563453952 unmapped: 103882752 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5814122 data_alloc: 234881024 data_used: 23068672
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563453952 unmapped: 103882752 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59d7/0x3512000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563453952 unmapped: 103882752 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563462144 unmapped: 103874560 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563462144 unmapped: 103874560 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563462144 unmapped: 103874560 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5814122 data_alloc: 234881024 data_used: 23068672
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563462144 unmapped: 103874560 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59d7/0x3512000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563462144 unmapped: 103874560 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563462144 unmapped: 103874560 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59d7/0x3512000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563462144 unmapped: 103874560 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 103866368 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5814122 data_alloc: 234881024 data_used: 23068672
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 103866368 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 103866368 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 103866368 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 103866368 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59d7/0x3512000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 103866368 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5814122 data_alloc: 234881024 data_used: 23068672
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59d7/0x3512000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 103866368 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59d7/0x3512000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 103866368 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563470336 unmapped: 103866368 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 103858176 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3d27cc00 session 0x55bc392ab680
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 103858176 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5814122 data_alloc: 234881024 data_used: 23068672
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 103858176 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59d7/0x3512000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 103858176 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 103858176 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3dc22400 session 0x55bc3d3572c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 103858176 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59d7/0x3512000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563478528 unmapped: 103858176 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5814122 data_alloc: 234881024 data_used: 23068672
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563486720 unmapped: 103849984 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563486720 unmapped: 103849984 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3dc22000 session 0x55bc39864960
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563486720 unmapped: 103849984 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 30.829244614s of 30.916475296s, submitted: 25
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563503104 unmapped: 103833600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563503104 unmapped: 103833600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5814847 data_alloc: 234881024 data_used: 23068672
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563511296 unmapped: 103825408 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3eeed000 session 0x55bc39fb14a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563511296 unmapped: 103825408 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563511296 unmapped: 103825408 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563519488 unmapped: 103817216 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 103809024 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5815487 data_alloc: 234881024 data_used: 23130112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 103809024 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 103809024 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 103809024 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 103809024 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 103809024 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5815487 data_alloc: 234881024 data_used: 23130112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 103809024 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 103809024 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 103809024 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563527680 unmapped: 103809024 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 103800832 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5816287 data_alloc: 234881024 data_used: 23150592
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 103800832 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.056394577s of 18.062255859s, submitted: 2
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 103800832 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 103800832 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 103800832 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 103800832 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5827503 data_alloc: 234881024 data_used: 24240128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 103800832 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 103800832 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 103800832 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 103800832 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 103800832 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5827503 data_alloc: 234881024 data_used: 24240128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563535872 unmapped: 103800832 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563544064 unmapped: 103792640 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563544064 unmapped: 103792640 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563544064 unmapped: 103792640 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563544064 unmapped: 103792640 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5827503 data_alloc: 234881024 data_used: 24240128
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563544064 unmapped: 103792640 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.042448044s of 15.051975250s, submitted: 2
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563560448 unmapped: 103776256 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563560448 unmapped: 103776256 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563560448 unmapped: 103776256 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563568640 unmapped: 103768064 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5827183 data_alloc: 234881024 data_used: 24231936
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563576832 unmapped: 103759872 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563576832 unmapped: 103759872 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563576832 unmapped: 103759872 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563576832 unmapped: 103759872 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563576832 unmapped: 103759872 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5831183 data_alloc: 234881024 data_used: 24850432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707b000/0x0/0x1bfc00000, data 0x32d59e7/0x3513000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563585024 unmapped: 103751680 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3d27d800 session 0x55bc3be043c0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3f124400 session 0x55bc3fd16d20
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563585024 unmapped: 103751680 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.085731506s of 11.091985703s, submitted: 2
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3d27d800 session 0x55bc3e3af4a0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563593216 unmapped: 103743488 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563593216 unmapped: 103743488 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563593216 unmapped: 103743488 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5830106 data_alloc: 234881024 data_used: 24850432
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc42902c00 session 0x55bc39665a40
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19707c000/0x0/0x1bfc00000, data 0x32d59d7/0x3512000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563593216 unmapped: 103743488 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc42950c00 session 0x55bc39ed61e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558768128 unmapped: 108568576 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558776320 unmapped: 108560384 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558776320 unmapped: 108560384 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558776320 unmapped: 108560384 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558776320 unmapped: 108560384 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558776320 unmapped: 108560384 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558776320 unmapped: 108560384 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558776320 unmapped: 108560384 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558776320 unmapped: 108560384 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558776320 unmapped: 108560384 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558784512 unmapped: 108552192 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558784512 unmapped: 108552192 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558784512 unmapped: 108552192 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558792704 unmapped: 108544000 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558792704 unmapped: 108544000 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558792704 unmapped: 108544000 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558792704 unmapped: 108544000 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558792704 unmapped: 108544000 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 108535808 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558800896 unmapped: 108535808 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 108527616 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 108527616 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 108527616 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 108527616 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 108527616 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558809088 unmapped: 108527616 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558825472 unmapped: 108511232 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558825472 unmapped: 108511232 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558825472 unmapped: 108511232 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558825472 unmapped: 108511232 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558825472 unmapped: 108511232 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558825472 unmapped: 108511232 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558825472 unmapped: 108511232 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558833664 unmapped: 108503040 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558833664 unmapped: 108503040 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558833664 unmapped: 108503040 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558833664 unmapped: 108503040 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558833664 unmapped: 108503040 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558833664 unmapped: 108503040 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558850048 unmapped: 108486656 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558850048 unmapped: 108486656 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558850048 unmapped: 108486656 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558866432 unmapped: 108470272 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558866432 unmapped: 108470272 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558866432 unmapped: 108470272 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558866432 unmapped: 108470272 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558866432 unmapped: 108470272 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558866432 unmapped: 108470272 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558866432 unmapped: 108470272 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558866432 unmapped: 108470272 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558874624 unmapped: 108462080 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558874624 unmapped: 108462080 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558874624 unmapped: 108462080 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558874624 unmapped: 108462080 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558874624 unmapped: 108462080 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558874624 unmapped: 108462080 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558882816 unmapped: 108453888 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558891008 unmapped: 108445696 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 ms_handle_reset con 0x55bc3f11dc00 session 0x55bc39f2c1e0
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558899200 unmapped: 108437504 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558899200 unmapped: 108437504 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558899200 unmapped: 108437504 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558899200 unmapped: 108437504 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558899200 unmapped: 108437504 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558899200 unmapped: 108437504 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558899200 unmapped: 108437504 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558899200 unmapped: 108437504 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558907392 unmapped: 108429312 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558907392 unmapped: 108429312 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558907392 unmapped: 108429312 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558907392 unmapped: 108429312 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558907392 unmapped: 108429312 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558915584 unmapped: 108421120 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558915584 unmapped: 108421120 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558923776 unmapped: 108412928 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558923776 unmapped: 108412928 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558923776 unmapped: 108412928 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558923776 unmapped: 108412928 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558923776 unmapped: 108412928 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558931968 unmapped: 108404736 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558931968 unmapped: 108404736 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558931968 unmapped: 108404736 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558940160 unmapped: 108396544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558940160 unmapped: 108396544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558940160 unmapped: 108396544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558940160 unmapped: 108396544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558940160 unmapped: 108396544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558940160 unmapped: 108396544 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558989312 unmapped: 108347392 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: do_command 'config diff' '{prefix=config diff}'
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: do_command 'config show' '{prefix=config show}'
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558432256 unmapped: 108904448 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558202880 unmapped: 109133824 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19801b000/0x0/0x1bfc00000, data 0x23369d7/0x2573000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5647604 data_alloc: 218103808 data_used: 15450112
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558333952 unmapped: 109002752 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:18:42 np0005593234 ceph-osd[79769]: do_command 'log dump' '{prefix=log dump}'
Jan 23 06:18:42 np0005593234 nova_compute[227762]: 2026-01-23 11:18:42.612 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:42 np0005593234 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 06:18:42 np0005593234 nova_compute[227762]: 2026-01-23 11:18:42.743 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:42 np0005593234 podman[354922]: 2026-01-23 11:18:42.859355399 +0000 UTC m=+0.145557616 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 06:18:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:18:42.932 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:18:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:18:42.933 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:18:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:18:42.933 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:18:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 23 06:18:43 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3795088294' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 06:18:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:18:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:43.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:18:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0) v1
Jan 23 06:18:43 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/384839873' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 06:18:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:43.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0) v1
Jan 23 06:18:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1299730316' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 06:18:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:18:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/866097552' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:18:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:18:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/866097552' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:18:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0) v1
Jan 23 06:18:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/227934310' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 06:18:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:45.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0) v1
Jan 23 06:18:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/525962273' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 06:18:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0) v1
Jan 23 06:18:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1303308864' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 06:18:45 np0005593234 nova_compute[227762]: 2026-01-23 11:18:45.558 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:45 np0005593234 nova_compute[227762]: 2026-01-23 11:18:45.560 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:18:45 np0005593234 nova_compute[227762]: 2026-01-23 11:18:45.560 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:18:45 np0005593234 nova_compute[227762]: 2026-01-23 11:18:45.591 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:18:45 np0005593234 nova_compute[227762]: 2026-01-23 11:18:45.592 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:45 np0005593234 nova_compute[227762]: 2026-01-23 11:18:45.592 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0) v1
Jan 23 06:18:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3881341180' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 06:18:45 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0) v1
Jan 23 06:18:45 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2979097699' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 06:18:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:45.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0) v1
Jan 23 06:18:46 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3930971627' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 06:18:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0) v1
Jan 23 06:18:46 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3459223332' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 06:18:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0) v1
Jan 23 06:18:46 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1712295976' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 06:18:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0) v1
Jan 23 06:18:46 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2758429565' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 06:18:46 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0) v1
Jan 23 06:18:46 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1896042018' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 06:18:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0) v1
Jan 23 06:18:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2681629307' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 06:18:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:18:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:47.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:18:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0) v1
Jan 23 06:18:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1363939653' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 06:18:47 np0005593234 systemd[1]: Starting Hostname Service...
Jan 23 06:18:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0) v1
Jan 23 06:18:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1294076065' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 06:18:47 np0005593234 nova_compute[227762]: 2026-01-23 11:18:47.615 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:47 np0005593234 systemd[1]: Started Hostname Service.
Jan 23 06:18:47 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0) v1
Jan 23 06:18:47 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2958200625' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 06:18:47 np0005593234 nova_compute[227762]: 2026-01-23 11:18:47.746 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 06:18:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:47.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 06:18:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:18:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:49.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:18:49 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0) v1
Jan 23 06:18:49 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3290531591' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 06:18:49 np0005593234 nova_compute[227762]: 2026-01-23 11:18:49.743 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:49 np0005593234 nova_compute[227762]: 2026-01-23 11:18:49.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:49 np0005593234 nova_compute[227762]: 2026-01-23 11:18:49.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:18:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:18:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:49.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:18:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "versions"} v 0) v1
Jan 23 06:18:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2717088684' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 06:18:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 23 06:18:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1300070670' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 06:18:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 06:18:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 06:18:50 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0) v1
Jan 23 06:18:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2291919197' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 06:18:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 06:18:50 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 06:18:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:51.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:51 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 06:18:51 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 06:18:51 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 06:18:51 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 06:18:51 np0005593234 nova_compute[227762]: 2026-01-23 11:18:51.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:18:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:51.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:18:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0) v1
Jan 23 06:18:52 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2874187444' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 06:18:52 np0005593234 nova_compute[227762]: 2026-01-23 11:18:52.617 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:52 np0005593234 nova_compute[227762]: 2026-01-23 11:18:52.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:52 np0005593234 nova_compute[227762]: 2026-01-23 11:18:52.748 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:52 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0) v1
Jan 23 06:18:52 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2631942330' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 06:18:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 06:18:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 06:18:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:53.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df"} v 0) v1
Jan 23 06:18:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/173830146' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 06:18:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 06:18:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 06:18:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0) v1
Jan 23 06:18:53 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2811127924' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 06:18:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:53.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:54 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0) v1
Jan 23 06:18:54 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/481975350' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 23 06:18:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:18:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:55.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:18:55 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0) v1
Jan 23 06:18:55 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/497665025' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 23 06:18:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:18:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:55.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:18:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0) v1
Jan 23 06:18:56 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4070605102' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 23 06:18:56 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0) v1
Jan 23 06:18:56 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2350077910' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 23 06:18:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:57.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:57 np0005593234 nova_compute[227762]: 2026-01-23 11:18:57.618 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:57 np0005593234 nova_compute[227762]: 2026-01-23 11:18:57.749 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:18:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:57.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:57 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0) v1
Jan 23 06:18:57 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3620985171' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 23 06:18:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:18:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0) v1
Jan 23 06:18:58 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1636578553' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 23 06:18:58 np0005593234 nova_compute[227762]: 2026-01-23 11:18:58.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:18:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:18:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:18:59.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:18:59 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0) v1
Jan 23 06:18:59 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4007154880' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 23 06:18:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:18:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:18:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:18:59.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:19:00 np0005593234 ovs-appctl[357862]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 06:19:00 np0005593234 ovs-appctl[357866]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 06:19:00 np0005593234 ovs-appctl[357871]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 06:19:00 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0) v1
Jan 23 06:19:00 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2081566022' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 23 06:19:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:01.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:01 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 23 06:19:01 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4239168788' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 06:19:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:01.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:02 np0005593234 nova_compute[227762]: 2026-01-23 11:19:02.621 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:02 np0005593234 nova_compute[227762]: 2026-01-23 11:19:02.750 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:02 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0) v1
Jan 23 06:19:02 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2560916061' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 23 06:19:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0) v1
Jan 23 06:19:03 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4153928609' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 23 06:19:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:03.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:03 np0005593234 podman[358961]: 2026-01-23 11:19:03.684613246 +0000 UTC m=+0.076795855 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:19:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:03.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 23 06:19:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1000043884' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 06:19:04 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0) v1
Jan 23 06:19:04 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3366235726' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 23 06:19:05 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0) v1
Jan 23 06:19:05 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1978595072' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 23 06:19:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:05.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:05.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:06 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0) v1
Jan 23 06:19:06 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2311996692' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 23 06:19:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:07.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0) v1
Jan 23 06:19:07 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2165298002' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 23 06:19:07 np0005593234 nova_compute[227762]: 2026-01-23 11:19:07.624 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:07 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0) v1
Jan 23 06:19:07 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3748300065' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 23 06:19:07 np0005593234 nova_compute[227762]: 2026-01-23 11:19:07.751 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:07.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0) v1
Jan 23 06:19:08 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3425905790' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 23 06:19:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:09.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:09 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0) v1
Jan 23 06:19:09 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2531547206' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 23 06:19:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:19:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:09.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:19:10 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0) v1
Jan 23 06:19:10 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/824974767' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 23 06:19:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:19:11 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:19:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0) v1
Jan 23 06:19:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2157795093' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 06:19:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:19:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:11.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:19:11 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0) v1
Jan 23 06:19:11 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3564721386' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 23 06:19:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:19:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:11.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:19:12 np0005593234 virtqemud[227483]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 06:19:12 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 06:19:12 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 06:19:12 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:19:12 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:19:12 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:19:12 np0005593234 nova_compute[227762]: 2026-01-23 11:19:12.651 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:12 np0005593234 nova_compute[227762]: 2026-01-23 11:19:12.753 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:12 np0005593234 nova_compute[227762]: 2026-01-23 11:19:12.757 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:12 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1
Jan 23 06:19:12 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3844908657' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 23 06:19:12 np0005593234 systemd[1]: Starting Time & Date Service...
Jan 23 06:19:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:13 np0005593234 podman[360216]: 2026-01-23 11:19:13.102632801 +0000 UTC m=+0.150468579 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 06:19:13 np0005593234 systemd[1]: Started Time & Date Service.
Jan 23 06:19:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0) v1
Jan 23 06:19:13 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2017364127' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 23 06:19:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:19:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:13.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:19:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:13.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:15.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:19:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:15.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:19:16 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0) v1
Jan 23 06:19:16 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3874298664' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 23 06:19:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:17.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:17 np0005593234 nova_compute[227762]: 2026-01-23 11:19:17.657 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:17 np0005593234 nova_compute[227762]: 2026-01-23 11:19:17.754 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:17.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:19.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:19.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:19:20 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:19:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:21.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:19:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:21.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:19:22 np0005593234 nova_compute[227762]: 2026-01-23 11:19:22.661 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:22 np0005593234 nova_compute[227762]: 2026-01-23 11:19:22.757 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:23.277 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:19:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:23.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:19:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:25.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:25.938 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:27.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:27 np0005593234 nova_compute[227762]: 2026-01-23 11:19:27.662 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:27 np0005593234 nova_compute[227762]: 2026-01-23 11:19:27.759 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:27.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:29.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:19:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:29.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:19:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:31.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:31.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:32 np0005593234 nova_compute[227762]: 2026-01-23 11:19:32.662 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:32 np0005593234 nova_compute[227762]: 2026-01-23 11:19:32.761 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:33.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:33 np0005593234 podman[360481]: 2026-01-23 11:19:33.839230342 +0000 UTC m=+0.065192652 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 06:19:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:33.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:35.294 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:35.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:37.297 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:37 np0005593234 nova_compute[227762]: 2026-01-23 11:19:37.665 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:37 np0005593234 nova_compute[227762]: 2026-01-23 11:19:37.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:37 np0005593234 nova_compute[227762]: 2026-01-23 11:19:37.762 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:37.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:38 np0005593234 nova_compute[227762]: 2026-01-23 11:19:38.380 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:19:38 np0005593234 nova_compute[227762]: 2026-01-23 11:19:38.380 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:19:38 np0005593234 nova_compute[227762]: 2026-01-23 11:19:38.381 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:19:38 np0005593234 nova_compute[227762]: 2026-01-23 11:19:38.381 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:19:38 np0005593234 nova_compute[227762]: 2026-01-23 11:19:38.382 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:19:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:19:38 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1195824746' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:19:38 np0005593234 nova_compute[227762]: 2026-01-23 11:19:38.938 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.556s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:19:39 np0005593234 nova_compute[227762]: 2026-01-23 11:19:39.087 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:19:39 np0005593234 nova_compute[227762]: 2026-01-23 11:19:39.089 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=3884MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:19:39 np0005593234 nova_compute[227762]: 2026-01-23 11:19:39.089 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:19:39 np0005593234 nova_compute[227762]: 2026-01-23 11:19:39.089 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:19:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:19:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:39.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:19:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:39.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:39 np0005593234 nova_compute[227762]: 2026-01-23 11:19:39.996 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:19:39 np0005593234 nova_compute[227762]: 2026-01-23 11:19:39.997 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:19:40 np0005593234 nova_compute[227762]: 2026-01-23 11:19:40.176 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:19:40 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:19:40 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2424402686' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:19:40 np0005593234 nova_compute[227762]: 2026-01-23 11:19:40.604 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:19:40 np0005593234 nova_compute[227762]: 2026-01-23 11:19:40.609 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:19:40 np0005593234 nova_compute[227762]: 2026-01-23 11:19:40.716 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:19:40 np0005593234 nova_compute[227762]: 2026-01-23 11:19:40.718 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:19:40 np0005593234 nova_compute[227762]: 2026-01-23 11:19:40.718 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.629s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:19:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:41.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:41.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:42 np0005593234 nova_compute[227762]: 2026-01-23 11:19:42.702 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:42 np0005593234 nova_compute[227762]: 2026-01-23 11:19:42.763 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:19:42.933 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:19:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:19:42.935 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:19:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:19:42.935 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:19:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:43 np0005593234 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 06:19:43 np0005593234 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 06:19:43 np0005593234 podman[360552]: 2026-01-23 11:19:43.25287799 +0000 UTC m=+0.084557778 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 06:19:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:19:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:43.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:19:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:43.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:44 np0005593234 nova_compute[227762]: 2026-01-23 11:19:44.720 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:19:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/263309211' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:19:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:19:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/263309211' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:19:44 np0005593234 nova_compute[227762]: 2026-01-23 11:19:44.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:45.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:45 np0005593234 nova_compute[227762]: 2026-01-23 11:19:45.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:45 np0005593234 nova_compute[227762]: 2026-01-23 11:19:45.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:19:45 np0005593234 nova_compute[227762]: 2026-01-23 11:19:45.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:19:45 np0005593234 nova_compute[227762]: 2026-01-23 11:19:45.836 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:19:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:45.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:47.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:47 np0005593234 nova_compute[227762]: 2026-01-23 11:19:47.747 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:47 np0005593234 nova_compute[227762]: 2026-01-23 11:19:47.765 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:19:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:47.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:19:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:49.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:49 np0005593234 nova_compute[227762]: 2026-01-23 11:19:49.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:49 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:49 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:49 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:49.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:50 np0005593234 nova_compute[227762]: 2026-01-23 11:19:50.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:50 np0005593234 nova_compute[227762]: 2026-01-23 11:19:50.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:19:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:19:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:51.319 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:19:51 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:51 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:51 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:51.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:52 np0005593234 nova_compute[227762]: 2026-01-23 11:19:52.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:52 np0005593234 nova_compute[227762]: 2026-01-23 11:19:52.752 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:52 np0005593234 nova_compute[227762]: 2026-01-23 11:19:52.766 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:53.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:53 np0005593234 nova_compute[227762]: 2026-01-23 11:19:53.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:19:53 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:53 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:53 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:53.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:55.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:55 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:55 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:55 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:55.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:56 np0005593234 systemd[1]: session-75.scope: Deactivated successfully.
Jan 23 06:19:56 np0005593234 systemd[1]: session-75.scope: Consumed 2min 48.186s CPU time, 1.0G memory peak, read 431.9M from disk, written 341.9M to disk.
Jan 23 06:19:56 np0005593234 systemd-logind[794]: Session 75 logged out. Waiting for processes to exit.
Jan 23 06:19:56 np0005593234 systemd-logind[794]: Removed session 75.
Jan 23 06:19:56 np0005593234 systemd-logind[794]: New session 76 of user zuul.
Jan 23 06:19:56 np0005593234 systemd[1]: Started Session 76 of User zuul.
Jan 23 06:19:56 np0005593234 systemd[1]: session-76.scope: Deactivated successfully.
Jan 23 06:19:56 np0005593234 systemd-logind[794]: Session 76 logged out. Waiting for processes to exit.
Jan 23 06:19:56 np0005593234 systemd-logind[794]: Removed session 76.
Jan 23 06:19:56 np0005593234 systemd-logind[794]: New session 77 of user zuul.
Jan 23 06:19:56 np0005593234 systemd[1]: Started Session 77 of User zuul.
Jan 23 06:19:56 np0005593234 systemd[1]: session-77.scope: Deactivated successfully.
Jan 23 06:19:56 np0005593234 systemd-logind[794]: Session 77 logged out. Waiting for processes to exit.
Jan 23 06:19:56 np0005593234 systemd-logind[794]: Removed session 77.
Jan 23 06:19:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:19:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:57.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:19:57 np0005593234 nova_compute[227762]: 2026-01-23 11:19:57.750 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:57 np0005593234 nova_compute[227762]: 2026-01-23 11:19:57.767 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:19:57 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:57 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:57 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:57.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:19:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:19:59.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:19:59 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:19:59 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:19:59 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:19:59.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:00 np0005593234 ceph-mon[77084]: overall HEALTH_OK
Jan 23 06:20:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:01.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:01 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:01 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:01 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:01.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:02 np0005593234 nova_compute[227762]: 2026-01-23 11:20:02.755 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:02 np0005593234 nova_compute[227762]: 2026-01-23 11:20:02.768 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:03.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:03 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:03 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:20:03 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:03.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:20:04 np0005593234 podman[360725]: 2026-01-23 11:20:04.171607962 +0000 UTC m=+0.051433810 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 06:20:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:05.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:05 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:05 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:05 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:05.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:07.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:07 np0005593234 nova_compute[227762]: 2026-01-23 11:20:07.756 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:07 np0005593234 nova_compute[227762]: 2026-01-23 11:20:07.769 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:07 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:07 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:20:07 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:07.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:20:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:09.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:09 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:09 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:20:09 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:09.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:20:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:11.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:11 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:11 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:11 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:11.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:12 np0005593234 nova_compute[227762]: 2026-01-23 11:20:12.739 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:12 np0005593234 nova_compute[227762]: 2026-01-23 11:20:12.770 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:20:12 np0005593234 nova_compute[227762]: 2026-01-23 11:20:12.771 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:20:12 np0005593234 nova_compute[227762]: 2026-01-23 11:20:12.772 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 06:20:12 np0005593234 nova_compute[227762]: 2026-01-23 11:20:12.772 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:20:12 np0005593234 nova_compute[227762]: 2026-01-23 11:20:12.798 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:12 np0005593234 nova_compute[227762]: 2026-01-23 11:20:12.799 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:20:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:20:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:13.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:20:13 np0005593234 podman[360775]: 2026-01-23 11:20:13.818628864 +0000 UTC m=+0.106237706 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 23 06:20:13 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:13 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:13 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:13.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:15.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:15 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:15 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:20:15 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:15.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:20:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:17.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:17 np0005593234 nova_compute[227762]: 2026-01-23 11:20:17.799 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:20:17 np0005593234 nova_compute[227762]: 2026-01-23 11:20:17.801 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:17 np0005593234 nova_compute[227762]: 2026-01-23 11:20:17.801 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 06:20:17 np0005593234 nova_compute[227762]: 2026-01-23 11:20:17.801 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:20:17 np0005593234 nova_compute[227762]: 2026-01-23 11:20:17.802 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:20:17 np0005593234 nova_compute[227762]: 2026-01-23 11:20:17.803 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:17 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:17 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:17 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:17.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:19.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:19 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:19 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:19 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:19.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:20:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:21.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:20:21 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 06:20:21 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 06:20:21 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:20:21 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 06:20:21 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:21 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:21 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:21.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #232. Immutable memtables: 0.
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.775185) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:856] [default] [JOB 149] Flushing memtable with next log file: 232
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167222775332, "job": 149, "event": "flush_started", "num_memtables": 1, "num_entries": 2235, "num_deletes": 506, "total_data_size": 4231076, "memory_usage": 4299632, "flush_reason": "Manual Compaction"}
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:885] [default] [JOB 149] Level-0 flush table #233: started
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167222793750, "cf_name": "default", "job": 149, "event": "table_file_creation", "file_number": 233, "file_size": 2780748, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 109529, "largest_seqno": 111759, "table_properties": {"data_size": 2770783, "index_size": 5498, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3397, "raw_key_size": 28941, "raw_average_key_size": 21, "raw_value_size": 2747336, "raw_average_value_size": 2050, "num_data_blocks": 234, "num_entries": 1340, "num_filter_entries": 1340, "num_deletions": 506, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769167087, "oldest_key_time": 1769167087, "file_creation_time": 1769167222, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 233, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 149] Flush lasted 18694 microseconds, and 10135 cpu microseconds.
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.793900) [db/flush_job.cc:967] [default] [JOB 149] Level-0 flush table #233: 2780748 bytes OK
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.793957) [db/memtable_list.cc:519] [default] Level-0 commit table #233 started
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.795646) [db/memtable_list.cc:722] [default] Level-0 commit table #233: memtable #1 done
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.795661) EVENT_LOG_v1 {"time_micros": 1769167222795656, "job": 149, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.795681) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 149] Try to delete WAL files size 4219017, prev total WAL file size 4219017, number of live WAL files 2.
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000229.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.797324) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034353233' seq:72057594037927935, type:22 .. '6C6F676D0034373734' seq:0, type:0; will stop at (end)
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 150] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 149 Base level 0, inputs: [233(2715KB)], [231(13MB)]
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167222797455, "job": 150, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [233], "files_L6": [231], "score": -1, "input_data_size": 16479699, "oldest_snapshot_seqno": -1}
Jan 23 06:20:22 np0005593234 nova_compute[227762]: 2026-01-23 11:20:22.804 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 150] Generated table #234: 12540 keys, 14393723 bytes, temperature: kUnknown
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167222873181, "cf_name": "default", "job": 150, "event": "table_file_creation", "file_number": 234, "file_size": 14393723, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14315370, "index_size": 45920, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31365, "raw_key_size": 336519, "raw_average_key_size": 26, "raw_value_size": 14098716, "raw_average_value_size": 1124, "num_data_blocks": 1708, "num_entries": 12540, "num_filter_entries": 12540, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769158914, "oldest_key_time": 0, "file_creation_time": 1769167222, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "8e19a509-cda7-49b3-9222-61516e1c69d3", "db_session_id": "HUKC432V5FL221EKKD8A", "orig_file_number": 234, "seqno_to_time_mapping": "N/A"}}
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.873474) [db/compaction/compaction_job.cc:1663] [default] [JOB 150] Compacted 1@0 + 1@6 files to L6 => 14393723 bytes
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.874722) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 217.3 rd, 189.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.7, 13.1 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(11.1) write-amplify(5.2) OK, records in: 13568, records dropped: 1028 output_compression: NoCompression
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.874744) EVENT_LOG_v1 {"time_micros": 1769167222874733, "job": 150, "event": "compaction_finished", "compaction_time_micros": 75824, "compaction_time_cpu_micros": 43027, "output_level": 6, "num_output_files": 1, "total_output_size": 14393723, "num_input_records": 13568, "num_output_records": 12540, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000233.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167222875499, "job": 150, "event": "table_file_deletion", "file_number": 233}
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-2/store.db/000231.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769167222878732, "job": 150, "event": "table_file_deletion", "file_number": 231}
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.797204) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.878847) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.878854) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.878855) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.878857) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:20:22 np0005593234 ceph-mon[77084]: rocksdb: (Original Log Time 2026/01/23-11:20:22.878859) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 06:20:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:23.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:23 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:23 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:23 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:23.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:25 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:25 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:25 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:25.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:25 np0005593234 nova_compute[227762]: 2026-01-23 11:20:25.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:20:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:25.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:20:27 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:27 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:20:27 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:27.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:20:27 np0005593234 nova_compute[227762]: 2026-01-23 11:20:27.807 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:20:28 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:28 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:28 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:28.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:20:28 np0005593234 ceph-mon[77084]: from='mgr.14132 192.168.122.100:0/1215693943' entity='mgr.compute-0.yntofk' 
Jan 23 06:20:28 np0005593234 nova_compute[227762]: 2026-01-23 11:20:28.797 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:28 np0005593234 nova_compute[227762]: 2026-01-23 11:20:28.798 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Jan 23 06:20:29 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:29 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:20:29 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:29.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:20:30 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:30 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:20:30 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:30.001 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:20:31 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:31 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:31 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:31.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:32 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:32 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:20:32 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:32.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:20:32 np0005593234 nova_compute[227762]: 2026-01-23 11:20:32.809 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:33 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:33 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:33 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:33 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:33.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:34 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:34 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:34 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:34.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:34 np0005593234 podman[361046]: 2026-01-23 11:20:34.761954726 +0000 UTC m=+0.051856014 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Jan 23 06:20:35 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:35 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:35 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:35.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:36 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:36 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:36 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:36.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:37 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:37 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:37 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:37.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:37 np0005593234 nova_compute[227762]: 2026-01-23 11:20:37.810 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:38 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:38 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:38 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:38.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:38 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:38 np0005593234 nova_compute[227762]: 2026-01-23 11:20:38.760 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:38 np0005593234 nova_compute[227762]: 2026-01-23 11:20:38.793 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:20:38 np0005593234 nova_compute[227762]: 2026-01-23 11:20:38.794 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:20:38 np0005593234 nova_compute[227762]: 2026-01-23 11:20:38.794 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:20:38 np0005593234 nova_compute[227762]: 2026-01-23 11:20:38.794 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Jan 23 06:20:38 np0005593234 nova_compute[227762]: 2026-01-23 11:20:38.795 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:20:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:20:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3326865106' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:20:39 np0005593234 nova_compute[227762]: 2026-01-23 11:20:39.268 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:20:39 np0005593234 nova_compute[227762]: 2026-01-23 11:20:39.427 227766 WARNING nova.virt.libvirt.driver [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Jan 23 06:20:39 np0005593234 nova_compute[227762]: 2026-01-23 11:20:39.428 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=4018MB free_disk=20.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Jan 23 06:20:39 np0005593234 nova_compute[227762]: 2026-01-23 11:20:39.428 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:20:39 np0005593234 nova_compute[227762]: 2026-01-23 11:20:39.428 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:20:39 np0005593234 nova_compute[227762]: 2026-01-23 11:20:39.506 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Jan 23 06:20:39 np0005593234 nova_compute[227762]: 2026-01-23 11:20:39.507 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=20GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Jan 23 06:20:39 np0005593234 nova_compute[227762]: 2026-01-23 11:20:39.525 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Jan 23 06:20:39 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:39 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:20:39 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:39.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:20:39 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0) v1
Jan 23 06:20:39 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3925823294' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 06:20:39 np0005593234 nova_compute[227762]: 2026-01-23 11:20:39.953 227766 DEBUG oslo_concurrency.processutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.428s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Jan 23 06:20:39 np0005593234 nova_compute[227762]: 2026-01-23 11:20:39.959 227766 DEBUG nova.compute.provider_tree [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed in ProviderTree for provider: 89873210-bee9-46e9-9f9d-0cd7a156c3a8 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Jan 23 06:20:39 np0005593234 nova_compute[227762]: 2026-01-23 11:20:39.988 227766 DEBUG nova.scheduler.client.report [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Inventory has not changed for provider 89873210-bee9-46e9-9f9d-0cd7a156c3a8 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 20, 'reserved': 1, 'min_unit': 1, 'max_unit': 20, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Jan 23 06:20:39 np0005593234 nova_compute[227762]: 2026-01-23 11:20:39.989 227766 DEBUG nova.compute.resource_tracker [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Jan 23 06:20:39 np0005593234 nova_compute[227762]: 2026-01-23 11:20:39.990 227766 DEBUG oslo_concurrency.lockutils [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.561s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:20:40 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:40 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:40 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:40.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:41 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:41 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:41 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:41.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:42 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:42 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:42 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:42.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:42 np0005593234 nova_compute[227762]: 2026-01-23 11:20:42.814 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:20:42 np0005593234 nova_compute[227762]: 2026-01-23 11:20:42.816 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:20:42 np0005593234 nova_compute[227762]: 2026-01-23 11:20:42.816 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 06:20:42 np0005593234 nova_compute[227762]: 2026-01-23 11:20:42.816 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:20:42 np0005593234 nova_compute[227762]: 2026-01-23 11:20:42.837 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:42 np0005593234 nova_compute[227762]: 2026-01-23 11:20:42.838 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:20:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:20:42.934 144381 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Jan 23 06:20:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:20:42.935 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Jan 23 06:20:42 np0005593234 ovn_metadata_agent[144376]: 2026-01-23 11:20:42.935 144381 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Jan 23 06:20:43 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:43 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:43 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:43 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:43.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:43 np0005593234 nova_compute[227762]: 2026-01-23 11:20:43.975 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:44 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:44 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:20:44 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:44.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:20:44 np0005593234 podman[361139]: 2026-01-23 11:20:44.573822779 +0000 UTC m=+0.102351785 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:20:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0) v1
Jan 23 06:20:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/152176732' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 06:20:44 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0) v1
Jan 23 06:20:44 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/152176732' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 06:20:44 np0005593234 nova_compute[227762]: 2026-01-23 11:20:44.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:45 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:45 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 06:20:45 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:45.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 06:20:46 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:46 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:46 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:46.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:46 np0005593234 nova_compute[227762]: 2026-01-23 11:20:46.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:46 np0005593234 nova_compute[227762]: 2026-01-23 11:20:46.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Jan 23 06:20:46 np0005593234 nova_compute[227762]: 2026-01-23 11:20:46.745 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Jan 23 06:20:47 np0005593234 nova_compute[227762]: 2026-01-23 11:20:47.268 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Jan 23 06:20:47 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:47 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:20:47 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:47.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:20:47 np0005593234 nova_compute[227762]: 2026-01-23 11:20:47.839 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:20:47 np0005593234 nova_compute[227762]: 2026-01-23 11:20:47.841 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:20:47 np0005593234 nova_compute[227762]: 2026-01-23 11:20:47.841 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 06:20:47 np0005593234 nova_compute[227762]: 2026-01-23 11:20:47.842 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:20:47 np0005593234 nova_compute[227762]: 2026-01-23 11:20:47.846 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:47 np0005593234 nova_compute[227762]: 2026-01-23 11:20:47.847 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:20:48 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:48 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:48 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:48.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:48 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:50 np0005593234 nova_compute[227762]: 2026-01-23 11:20:50.588 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4a2c6f0 =====
Jan 23 06:20:50 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4a2c6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4a2c6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:50.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:50 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:20:50 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:50.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:20:50 np0005593234 nova_compute[227762]: 2026-01-23 11:20:50.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:50 np0005593234 nova_compute[227762]: 2026-01-23 11:20:50.744 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Jan 23 06:20:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:52.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:52 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:52 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:52 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:52.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:52 np0005593234 nova_compute[227762]: 2026-01-23 11:20:52.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:52 np0005593234 nova_compute[227762]: 2026-01-23 11:20:52.846 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:20:53 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 06:20:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:54.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 06:20:54 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:54 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:54 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:54.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:54 np0005593234 nova_compute[227762]: 2026-01-23 11:20:54.744 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:20:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:56.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:56 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:56 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:56 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:56.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:57 np0005593234 nova_compute[227762]: 2026-01-23 11:20:57.848 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:20:58 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:20:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:20:58.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:20:58 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:20:58 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:20:58 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:20:58.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:00.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:00 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:00 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:00 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:00.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:02.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:02 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:02 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:02 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:02.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:02 np0005593234 nova_compute[227762]: 2026-01-23 11:21:02.851 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Jan 23 06:21:02 np0005593234 nova_compute[227762]: 2026-01-23 11:21:02.851 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:02 np0005593234 nova_compute[227762]: 2026-01-23 11:21:02.852 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Jan 23 06:21:02 np0005593234 nova_compute[227762]: 2026-01-23 11:21:02.852 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:21:02 np0005593234 nova_compute[227762]: 2026-01-23 11:21:02.852 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Jan 23 06:21:02 np0005593234 nova_compute[227762]: 2026-01-23 11:21:02.853 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:03 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:21:03 np0005593234 nova_compute[227762]: 2026-01-23 11:21:03.738 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:21:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:04.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:04 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:04 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:04 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:04.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:05 np0005593234 podman[361251]: 2026-01-23 11:21:05.750540543 +0000 UTC m=+0.048936082 container health_status 2020f1cf1f7e983c7809a536d07dfa96978e4bf0bdeb178e7db7e795e289cdd2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 06:21:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:06.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:06 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:06 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:06 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:06.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:07 np0005593234 nova_compute[227762]: 2026-01-23 11:21:07.852 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:07 np0005593234 nova_compute[227762]: 2026-01-23 11:21:07.853 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:08 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:21:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:08.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:08 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:08 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:08 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:08.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:10.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:10 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:10 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:10 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:10.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:12.609 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:12 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:12 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:12 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:12.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:12 np0005593234 nova_compute[227762]: 2026-01-23 11:21:12.853 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:12 np0005593234 nova_compute[227762]: 2026-01-23 11:21:12.854 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:13 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:21:13 np0005593234 systemd-logind[794]: New session 78 of user zuul.
Jan 23 06:21:13 np0005593234 systemd[1]: Started Session 78 of User zuul.
Jan 23 06:21:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:14.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:14 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:14 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:14 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:14.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:14 np0005593234 nova_compute[227762]: 2026-01-23 11:21:14.745 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:21:14 np0005593234 nova_compute[227762]: 2026-01-23 11:21:14.746 227766 DEBUG oslo_service.periodic_task [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Jan 23 06:21:14 np0005593234 nova_compute[227762]: 2026-01-23 11:21:14.746 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Jan 23 06:21:14 np0005593234 nova_compute[227762]: 2026-01-23 11:21:14.760 227766 DEBUG nova.compute.manager [None req-992dd191-a81e-4cd4-b4a9-75e8971337b2 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Jan 23 06:21:15 np0005593234 podman[361314]: 2026-01-23 11:21:15.194395878 +0000 UTC m=+0.099485214 container health_status dc205c6ac05522d75c89c78c06c108db4f3c8b86d90618552e8e328dcd1988c7 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '5578e70e41b73d3290da8fd2fbea1ee513fdfd20a4cfb3f7eb83917d8bc18a25-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42-a3481541a99bab1254ed5fd0fa22cf2bb9cc0e943d0dc5f1511f0fb172a31b42'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 06:21:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:16.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:16 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:16 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:16 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:16.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:17 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "status"} v 0) v1
Jan 23 06:21:17 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1966313493' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 06:21:17 np0005593234 nova_compute[227762]: 2026-01-23 11:21:17.855 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:18 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:21:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:18.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:18 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:18 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:18 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:18.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:20 np0005593234 ovs-vsctl[361593]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 23 06:21:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:20.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:20 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:20 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:20 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:20.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:21 np0005593234 virtqemud[227483]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 23 06:21:21 np0005593234 virtqemud[227483]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 23 06:21:21 np0005593234 virtqemud[227483]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 06:21:21 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: cache status {prefix=cache status} (starting...)
Jan 23 06:21:21 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: client ls {prefix=client ls} (starting...)
Jan 23 06:21:22 np0005593234 lvm[361941]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 06:21:22 np0005593234 lvm[361941]: VG ceph_vg0 finished
Jan 23 06:21:22 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: damage ls {prefix=damage ls} (starting...)
Jan 23 06:21:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:22.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:22 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:22 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:22 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:22.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:22 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: dump loads {prefix=dump loads} (starting...)
Jan 23 06:21:22 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "report"} v 0) v1
Jan 23 06:21:22 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/4100407542' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 06:21:22 np0005593234 nova_compute[227762]: 2026-01-23 11:21:22.857 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:22 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 23 06:21:22 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 23 06:21:23 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 23 06:21:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:21:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0) v1
Jan 23 06:21:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1301185793' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 06:21:23 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 23 06:21:23 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 23 06:21:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config log"} v 0) v1
Jan 23 06:21:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1282774758' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 06:21:23 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 23 06:21:23 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: ops {prefix=ops} (starting...)
Jan 23 06:21:23 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0) v1
Jan 23 06:21:23 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3153499203' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 06:21:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0) v1
Jan 23 06:21:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1800541611' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 06:21:24 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: session ls {prefix=session ls} (starting...)
Jan 23 06:21:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 23 06:21:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3246853965' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 06:21:24 np0005593234 ceph-mds[84342]: mds.cephfs.compute-2.cfzfln asok_command: status {prefix=status} (starting...)
Jan 23 06:21:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:24.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:24 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:24 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:24 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:24.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:24 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 23 06:21:24 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2195271761' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 06:21:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "features"} v 0) v1
Jan 23 06:21:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2850800035' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 06:21:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0) v1
Jan 23 06:21:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2152909181' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 06:21:25 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0) v1
Jan 23 06:21:25 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2723544314' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 06:21:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0) v1
Jan 23 06:21:26 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3483689940' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 06:21:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0) v1
Jan 23 06:21:26 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2099916231' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 06:21:26 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0) v1
Jan 23 06:21:26 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3411273927' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 06:21:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 06:21:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.102 - anonymous [23/Jan/2026:11:21:26.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 06:21:26 np0005593234 radosgw[83946]: ====== starting new request req=0x7f7cb4aad6f0 =====
Jan 23 06:21:26 np0005593234 radosgw[83946]: ====== req done req=0x7f7cb4aad6f0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 06:21:26 np0005593234 radosgw[83946]: beast: 0x7f7cb4aad6f0: 192.168.122.100 - anonymous [23/Jan/2026:11:21:26.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 06:21:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0) v1
Jan 23 06:21:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3388210623' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 06:21:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0) v1
Jan 23 06:21:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/1912617512' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 06:21:27 np0005593234 nova_compute[227762]: 2026-01-23 11:21:27.859 227766 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 26 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Jan 23 06:21:27 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0) v1
Jan 23 06:21:27 np0005593234 ceph-mon[77084]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2980902945' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528261120 unmapped: 124542976 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528269312 unmapped: 124534784 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f76000/0x0/0x1bfc00000, data 0x231c4d4/0x2548000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528269312 unmapped: 124534784 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4a7b1800 session 0x55bc3a051a40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b4d4c00 session 0x55bc41a745a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528269312 unmapped: 124534784 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3d978c00 session 0x55bc3a0501e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5323633 data_alloc: 218103808 data_used: 11997184
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528285696 unmapped: 124518400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3d978c00 session 0x55bc3b2f52c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc4337fe00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b4d4c00 session 0x55bc41a75c20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528293888 unmapped: 124510208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528302080 unmapped: 124502016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528310272 unmapped: 124493824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528310272 unmapped: 124493824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528310272 unmapped: 124493824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528318464 unmapped: 124485632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528318464 unmapped: 124485632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528318464 unmapped: 124485632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528318464 unmapped: 124485632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528326656 unmapped: 124477440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528326656 unmapped: 124477440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528334848 unmapped: 124469248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528343040 unmapped: 124461056 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528351232 unmapped: 124452864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc42951c00 session 0x55bc39eaf4a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528351232 unmapped: 124452864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528351232 unmapped: 124452864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528351232 unmapped: 124452864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528351232 unmapped: 124452864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5322169 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528351232 unmapped: 124452864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528351232 unmapped: 124452864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528359424 unmapped: 124444672 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528359424 unmapped: 124444672 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3f11cc00 session 0x55bc3fd16d20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3f11cc00 session 0x55bc40a932c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc3d34a1e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b4d4c00 session 0x55bc3be043c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 62.704410553s of 62.801357269s, submitted: 27
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3d978c00 session 0x55bc3b249e00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc42951c00 session 0x55bc3a520000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc42951c00 session 0x55bc396650e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199f78000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc4337eb40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b4d4c00 session 0x55bc3e1f4d20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x19995f000/0x0/0x1bfc00000, data 0x293349b/0x2b5f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5381452 data_alloc: 218103808 data_used: 11997184
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x19995f000/0x0/0x1bfc00000, data 0x29334d4/0x2b5f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5381452 data_alloc: 218103808 data_used: 11997184
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x19995f000/0x0/0x1bfc00000, data 0x29334d4/0x2b5f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529678336 unmapped: 123125760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5381452 data_alloc: 218103808 data_used: 11997184
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529686528 unmapped: 123117568 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529686528 unmapped: 123117568 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x19995f000/0x0/0x1bfc00000, data 0x29334d4/0x2b5f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529686528 unmapped: 123117568 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4385c400 session 0x55bc4337e5a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529686528 unmapped: 123117568 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4a7a7c00 session 0x55bc3fd16000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4a7a7c00 session 0x55bc3e1f4780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529694720 unmapped: 123109376 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.268370628s of 15.381249428s, submitted: 38
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc3fd16b40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5390086 data_alloc: 218103808 data_used: 12001280
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529899520 unmapped: 122904576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199933000/0x0/0x1bfc00000, data 0x295d507/0x2b8b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529899520 unmapped: 122904576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529907712 unmapped: 122896384 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5425446 data_alloc: 218103808 data_used: 16957440
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199933000/0x0/0x1bfc00000, data 0x295d507/0x2b8b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199933000/0x0/0x1bfc00000, data 0x295d507/0x2b8b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199933000/0x0/0x1bfc00000, data 0x295d507/0x2b8b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5425446 data_alloc: 218103808 data_used: 16957440
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199933000/0x0/0x1bfc00000, data 0x295d507/0x2b8b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529915904 unmapped: 122888192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.630040169s of 13.679882050s, submitted: 13
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 531103744 unmapped: 121700352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534528000 unmapped: 118276096 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5542170 data_alloc: 218103808 data_used: 17022976
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534536192 unmapped: 118267904 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533479424 unmapped: 119324672 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533479424 unmapped: 119324672 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198a34000/0x0/0x1bfc00000, data 0x385c507/0x3a8a000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 533454848 unmapped: 119349248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534781952 unmapped: 118022144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x19898d000/0x0/0x1bfc00000, data 0x3902507/0x3b30000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5567446 data_alloc: 218103808 data_used: 18030592
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x19898d000/0x0/0x1bfc00000, data 0x3902507/0x3b30000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534781952 unmapped: 118022144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534781952 unmapped: 118022144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534781952 unmapped: 118022144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534781952 unmapped: 118022144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534781952 unmapped: 118022144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x19898d000/0x0/0x1bfc00000, data 0x3902507/0x3b30000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.453640938s of 11.992197037s, submitted: 131
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5565398 data_alloc: 218103808 data_used: 18034688
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198969000/0x0/0x1bfc00000, data 0x3927507/0x3b55000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198969000/0x0/0x1bfc00000, data 0x3927507/0x3b55000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5565398 data_alloc: 218103808 data_used: 18034688
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5565090 data_alloc: 218103808 data_used: 18034688
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198966000/0x0/0x1bfc00000, data 0x392a507/0x3b58000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534921216 unmapped: 117882880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198966000/0x0/0x1bfc00000, data 0x392a507/0x3b58000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198966000/0x0/0x1bfc00000, data 0x392a507/0x3b58000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5565410 data_alloc: 218103808 data_used: 18042880
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198966000/0x0/0x1bfc00000, data 0x392a507/0x3b58000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.936965942s of 19.969482422s, submitted: 7
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5565494 data_alloc: 218103808 data_used: 18042880
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534929408 unmapped: 117874688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534937600 unmapped: 117866496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198961000/0x0/0x1bfc00000, data 0x392f507/0x3b5d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534937600 unmapped: 117866496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534937600 unmapped: 117866496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b1a3800 session 0x55bc3a4501e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534937600 unmapped: 117866496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b4d4c00 session 0x55bc392a5e00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc42951c00 session 0x55bc3fd172c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5565466 data_alloc: 218103808 data_used: 18042880
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534945792 unmapped: 117858304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc39fd4000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198961000/0x0/0x1bfc00000, data 0x392f507/0x3b5d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5340823 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4a7aa800 session 0x55bc396e8000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4385dc00 session 0x55bc3ec2af00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5340823 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534069248 unmapped: 118734848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534077440 unmapped: 118726656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534077440 unmapped: 118726656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534077440 unmapped: 118726656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5340823 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534077440 unmapped: 118726656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534077440 unmapped: 118726656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534085632 unmapped: 118718464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534085632 unmapped: 118718464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534085632 unmapped: 118718464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5340823 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534085632 unmapped: 118718464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534093824 unmapped: 118710272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534093824 unmapped: 118710272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534093824 unmapped: 118710272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 6600.5 total, 600.0 interval#012Cumulative writes: 74K writes, 298K keys, 74K commit groups, 1.0 writes per commit group, ingest: 0.30 GB, 0.05 MB/s#012Cumulative WAL: 74K writes, 27K syncs, 2.69 writes per sync, written: 0.30 GB, 0.05 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 4268 writes, 14K keys, 4268 commit groups, 1.0 writes per commit group, ingest: 13.04 MB, 0.02 MB/s#012Interval WAL: 4268 writes, 1717 syncs, 2.49 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534093824 unmapped: 118710272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5340823 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534093824 unmapped: 118710272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534093824 unmapped: 118710272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534093824 unmapped: 118710272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5340823 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5340823 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534102016 unmapped: 118702080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x199c0f000/0x0/0x1bfc00000, data 0x231c462/0x2546000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534110208 unmapped: 118693888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b43a800 session 0x55bc3b2fa780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b143800 session 0x55bc3f2df680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b143800 session 0x55bc3d34bc20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc3b30a000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534118400 unmapped: 118685696 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 42.591941833s of 42.745742798s, submitted: 57
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b43a800 session 0x55bc3fd17e00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3ba8f000 session 0x55bc39864780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b1a1800 session 0x55bc3d357680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b1a1800 session 0x55bc39665860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc391e5800 session 0x55bc395dc780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994f1000/0x0/0x1bfc00000, data 0x2da2472/0x2fcd000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534142976 unmapped: 118661120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534142976 unmapped: 118661120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422568 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534142976 unmapped: 118661120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534142976 unmapped: 118661120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534142976 unmapped: 118661120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994f1000/0x0/0x1bfc00000, data 0x2da2472/0x2fcd000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534142976 unmapped: 118661120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534142976 unmapped: 118661120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994f1000/0x0/0x1bfc00000, data 0x2da2472/0x2fcd000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422568 data_alloc: 218103808 data_used: 11993088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534151168 unmapped: 118652928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534151168 unmapped: 118652928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534151168 unmapped: 118652928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994f1000/0x0/0x1bfc00000, data 0x2da2472/0x2fcd000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.674189568s of 10.862952232s, submitted: 20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534151168 unmapped: 118652928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b1a2400 session 0x55bc3e3af860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534462464 unmapped: 118341632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994cc000/0x0/0x1bfc00000, data 0x2dc6495/0x2ff2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5427541 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534175744 unmapped: 118628352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994cc000/0x0/0x1bfc00000, data 0x2dc6495/0x2ff2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994cc000/0x0/0x1bfc00000, data 0x2dc6495/0x2ff2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5505461 data_alloc: 234881024 data_used: 23035904
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994cc000/0x0/0x1bfc00000, data 0x2dc6495/0x2ff2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994cc000/0x0/0x1bfc00000, data 0x2dc6495/0x2ff2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994cc000/0x0/0x1bfc00000, data 0x2dc6495/0x2ff2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5505461 data_alloc: 234881024 data_used: 23035904
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1994cc000/0x0/0x1bfc00000, data 0x2dc6495/0x2ff2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.972386360s of 12.997897148s, submitted: 7
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534183936 unmapped: 118620160 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536199168 unmapped: 116604928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x1990d3000/0x0/0x1bfc00000, data 0x31bf495/0x33eb000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536207360 unmapped: 116596736 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536084480 unmapped: 116719616 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5598143 data_alloc: 234881024 data_used: 23625728
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537264128 unmapped: 115539968 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537264128 unmapped: 115539968 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537264128 unmapped: 115539968 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b29000/0x0/0x1bfc00000, data 0x3761495/0x398d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537264128 unmapped: 115539968 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537214976 unmapped: 115589120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5593615 data_alloc: 234881024 data_used: 23953408
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537223168 unmapped: 115580928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537223168 unmapped: 115580928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.472796440s of 10.179206848s, submitted: 121
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538345472 unmapped: 114458624 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2e000/0x0/0x1bfc00000, data 0x3764495/0x3990000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5594031 data_alloc: 234881024 data_used: 23953408
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2e000/0x0/0x1bfc00000, data 0x3764495/0x3990000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2e000/0x0/0x1bfc00000, data 0x3764495/0x3990000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5594031 data_alloc: 234881024 data_used: 23953408
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2e000/0x0/0x1bfc00000, data 0x3764495/0x3990000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.258337975s of 12.762209892s, submitted: 186
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2e000/0x0/0x1bfc00000, data 0x3764495/0x3990000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5599303 data_alloc: 234881024 data_used: 23961600
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2b000/0x0/0x1bfc00000, data 0x3766495/0x3992000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc4a7ab000 session 0x55bc3d34b2c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc3b43a800 session 0x55bc396e8000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2c000/0x0/0x1bfc00000, data 0x3766495/0x3992000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5594471 data_alloc: 234881024 data_used: 23961600
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2c000/0x0/0x1bfc00000, data 0x3766495/0x3992000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 heartbeat osd_stat(store_statfs(0x198b2c000/0x0/0x1bfc00000, data 0x3766495/0x3992000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5594471 data_alloc: 234881024 data_used: 23961600
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 ms_handle_reset con 0x55bc42903400 session 0x55bc39865a40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.233558655s of 11.280123711s, submitted: 29
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538402816 unmapped: 114401280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 handle_osd_map epochs [418,419], i have 418, src has [1,419]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 418 handle_osd_map epochs [419,419], i have 419, src has [1,419]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 419 ms_handle_reset con 0x55bc3ba8f400 session 0x55bc395ba1e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 419 ms_handle_reset con 0x55bc3b1a3000 session 0x55bc3d34a1e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 419 ms_handle_reset con 0x55bc3bf4bc00 session 0x55bc3e1f4780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5598697 data_alloc: 234881024 data_used: 24072192
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 419 heartbeat osd_stat(store_statfs(0x198b29000/0x0/0x1bfc00000, data 0x37680ee/0x3995000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 419 ms_handle_reset con 0x55bc3f124800 session 0x55bc39eaf4a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 419 handle_osd_map epochs [419,420], i have 419, src has [1,420]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 420 ms_handle_reset con 0x55bc3b1a3000 session 0x55bc3b249e00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 420 heartbeat osd_stat(store_statfs(0x198b29000/0x0/0x1bfc00000, data 0x37680ee/0x3995000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5603031 data_alloc: 234881024 data_used: 24084480
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 420 heartbeat osd_stat(store_statfs(0x198b25000/0x0/0x1bfc00000, data 0x3769d9b/0x3998000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538370048 unmapped: 114434048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 420 heartbeat osd_stat(store_statfs(0x198b25000/0x0/0x1bfc00000, data 0x3769d9b/0x3998000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.517007828s of 13.684580803s, submitted: 29
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 420 handle_osd_map epochs [420,421], i have 420, src has [1,421]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 420 handle_osd_map epochs [421,421], i have 421, src has [1,421]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5618769 data_alloc: 234881024 data_used: 25083904
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538378240 unmapped: 114425856 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198b20000/0x0/0x1bfc00000, data 0x376b8da/0x399b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198b20000/0x0/0x1bfc00000, data 0x376b8da/0x399b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198b20000/0x0/0x1bfc00000, data 0x376b8da/0x399b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5618929 data_alloc: 234881024 data_used: 25088000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198b20000/0x0/0x1bfc00000, data 0x376b8da/0x399b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538386432 unmapped: 114417664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538394624 unmapped: 114409472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.060571671s of 10.098513603s, submitted: 57
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc3dc22400 session 0x55bc41a754a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc42c94c00 session 0x55bc3a0505a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538402816 unmapped: 114401280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366225 data_alloc: 218103808 data_used: 12021760
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc3e196400 session 0x55bc3a051680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366225 data_alloc: 218103808 data_used: 12021760
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366225 data_alloc: 218103808 data_used: 12021760
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529301504 unmapped: 123502592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529309696 unmapped: 123494400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529317888 unmapped: 123486208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 123478016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 123478016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 123478016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 123478016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 123478016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199f6e000/0x0/0x1bfc00000, data 0x23218a7/0x254f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5366385 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 123478016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529326080 unmapped: 123478016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 67.751548767s of 68.275238037s, submitted: 41
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529334272 unmapped: 123469824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc3b15e800 session 0x55bc3ec2b4a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529342464 unmapped: 123461632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x199e27000/0x0/0x1bfc00000, data 0x24698a7/0x2697000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529342464 unmapped: 123461632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5437757 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529342464 unmapped: 123461632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529342464 unmapped: 123461632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529342464 unmapped: 123461632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529342464 unmapped: 123461632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19961d000/0x0/0x1bfc00000, data 0x2c738a7/0x2ea1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529342464 unmapped: 123461632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5437757 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529350656 unmapped: 123453440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529350656 unmapped: 123453440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19961d000/0x0/0x1bfc00000, data 0x2c738a7/0x2ea1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc391e5400 session 0x55bc3a0503c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529350656 unmapped: 123453440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc3b15f400 session 0x55bc3e1f5680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529350656 unmapped: 123453440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529350656 unmapped: 123453440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc391e5800 session 0x55bc40a92780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.719742775s of 12.813817024s, submitted: 11
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19961d000/0x0/0x1bfc00000, data 0x2c738a7/0x2ea1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5440786 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529170432 unmapped: 123633664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529170432 unmapped: 123633664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19961c000/0x0/0x1bfc00000, data 0x2c738b6/0x2ea2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2373f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529170432 unmapped: 123633664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc4385d800 session 0x55bc3ec2a780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529170432 unmapped: 123633664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529170432 unmapped: 123633664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5440386 data_alloc: 218103808 data_used: 12025856
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529178624 unmapped: 123625472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529178624 unmapped: 123625472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529178624 unmapped: 123625472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19920c000/0x0/0x1bfc00000, data 0x2c738b6/0x2ea2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529186816 unmapped: 123617280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5508838 data_alloc: 234881024 data_used: 21602304
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19920c000/0x0/0x1bfc00000, data 0x2c738b6/0x2ea2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19920c000/0x0/0x1bfc00000, data 0x2c738b6/0x2ea2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5508838 data_alloc: 234881024 data_used: 21602304
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19920c000/0x0/0x1bfc00000, data 0x2c738b6/0x2ea2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19920c000/0x0/0x1bfc00000, data 0x2c738b6/0x2ea2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529522688 unmapped: 123281408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5508838 data_alloc: 234881024 data_used: 21602304
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530219008 unmapped: 122585088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.664346695s of 20.694303513s, submitted: 8
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x19920c000/0x0/0x1bfc00000, data 0x2c738b6/0x2ea2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535150592 unmapped: 117653504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535158784 unmapped: 117645312 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535240704 unmapped: 117563392 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535248896 unmapped: 117555200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5627006 data_alloc: 234881024 data_used: 23121920
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535248896 unmapped: 117555200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198327000/0x0/0x1bfc00000, data 0x3b588b6/0x3d87000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198327000/0x0/0x1bfc00000, data 0x3b588b6/0x3d87000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5640390 data_alloc: 234881024 data_used: 23592960
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198327000/0x0/0x1bfc00000, data 0x3b588b6/0x3d87000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.309547424s of 11.783488274s, submitted: 105
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5640202 data_alloc: 234881024 data_used: 23601152
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535945216 unmapped: 116858880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x198303000/0x0/0x1bfc00000, data 0x3b7c8b6/0x3dab000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5639906 data_alloc: 234881024 data_used: 23601152
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535953408 unmapped: 116850688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x1982fd000/0x0/0x1bfc00000, data 0x3b828b6/0x3db1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5639906 data_alloc: 234881024 data_used: 23601152
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x1982fd000/0x0/0x1bfc00000, data 0x3b828b6/0x3db1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.061479568s of 16.081272125s, submitted: 7
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x1982f5000/0x0/0x1bfc00000, data 0x3b8a8b6/0x3db9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5641226 data_alloc: 234881024 data_used: 23625728
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535961600 unmapped: 116842496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535969792 unmapped: 116834304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc42c94c00 session 0x55bc39fcdc20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 ms_handle_reset con 0x55bc4a7a7c00 session 0x55bc3ec2b2c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x1982ef000/0x0/0x1bfc00000, data 0x3b908b6/0x3dbf000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535986176 unmapped: 116817920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535986176 unmapped: 116817920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5641094 data_alloc: 234881024 data_used: 23625728
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535986176 unmapped: 116817920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535986176 unmapped: 116817920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535986176 unmapped: 116817920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535986176 unmapped: 116817920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 heartbeat osd_stat(store_statfs(0x1982ef000/0x0/0x1bfc00000, data 0x3b908b6/0x3dbf000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535994368 unmapped: 116809728 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 421 handle_osd_map epochs [421,422], i have 421, src has [1,422]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.262039185s of 11.294044495s, submitted: 10
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5644740 data_alloc: 234881024 data_used: 23633920
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982eb000/0x0/0x1bfc00000, data 0x3b9250f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 116793344 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 422 ms_handle_reset con 0x55bc3b4d4000 session 0x55bc3d34b0e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536010752 unmapped: 116793344 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 422 ms_handle_reset con 0x55bc3b762400 session 0x55bc3e3ae5a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537067520 unmapped: 115736576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537092096 unmapped: 115712000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537092096 unmapped: 115712000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5644632 data_alloc: 234881024 data_used: 23674880
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537092096 unmapped: 115712000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982ec000/0x0/0x1bfc00000, data 0x3b9250f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537092096 unmapped: 115712000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982ec000/0x0/0x1bfc00000, data 0x3b9250f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537092096 unmapped: 115712000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982ec000/0x0/0x1bfc00000, data 0x3b9250f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 537092096 unmapped: 115712000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 422 ms_handle_reset con 0x55bc4a7b3000 session 0x55bc40a932c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 422 ms_handle_reset con 0x55bc3eeeb800 session 0x55bc3a521680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536043520 unmapped: 116760576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5644500 data_alloc: 234881024 data_used: 23674880
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536043520 unmapped: 116760576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536043520 unmapped: 116760576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536043520 unmapped: 116760576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982ec000/0x0/0x1bfc00000, data 0x3b9250f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536043520 unmapped: 116760576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536043520 unmapped: 116760576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5644660 data_alloc: 234881024 data_used: 23678976
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536043520 unmapped: 116760576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.907535553s of 16.930467606s, submitted: 16
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536084480 unmapped: 116719616 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982ec000/0x0/0x1bfc00000, data 0x3b9250f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 422 ms_handle_reset con 0x55bc3b15f400 session 0x55bc3d356960
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982ec000/0x0/0x1bfc00000, data 0x540150f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536084480 unmapped: 116719616 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 422 heartbeat osd_stat(store_statfs(0x1982ec000/0x0/0x1bfc00000, data 0x540150f/0x3dc2000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536084480 unmapped: 116719616 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536084480 unmapped: 116719616 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 422 handle_osd_map epochs [422,423], i have 422, src has [1,423]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5721002 data_alloc: 234881024 data_used: 23687168
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536092672 unmapped: 116711424 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 423 heartbeat osd_stat(store_statfs(0x1982e8000/0x0/0x1bfc00000, data 0x3b941bc/0x3dc5000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536092672 unmapped: 116711424 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536092672 unmapped: 116711424 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536100864 unmapped: 116703232 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 423 ms_handle_reset con 0x55bc3e398800 session 0x55bc39f2cd20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536109056 unmapped: 116695040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5721002 data_alloc: 234881024 data_used: 23687168
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536109056 unmapped: 116695040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 423 heartbeat osd_stat(store_statfs(0x1982e8000/0x0/0x1bfc00000, data 0x3b941bc/0x3dc5000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536109056 unmapped: 116695040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 423 heartbeat osd_stat(store_statfs(0x1982e8000/0x0/0x1bfc00000, data 0x3b941bc/0x3dc5000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536117248 unmapped: 116686848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536125440 unmapped: 116678656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 423 heartbeat osd_stat(store_statfs(0x1982e8000/0x0/0x1bfc00000, data 0x3b941bc/0x3dc5000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5721134 data_alloc: 234881024 data_used: 23687168
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 423 handle_osd_map epochs [423,424], i have 423, src has [1,424]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.398308754s of 15.507222176s, submitted: 36
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e5000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5724108 data_alloc: 234881024 data_used: 23687168
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536133632 unmapped: 116670464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e5000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536141824 unmapped: 116662272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e4000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5736340 data_alloc: 234881024 data_used: 25006080
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e4000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5736340 data_alloc: 234881024 data_used: 25006080
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e4000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536272896 unmapped: 116531200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e4000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 116523008 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 116523008 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e4000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 116523008 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5736500 data_alloc: 234881024 data_used: 25010176
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 116523008 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 116523008 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 116523008 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536281088 unmapped: 116523008 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e4000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536289280 unmapped: 116514816 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e4000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5736660 data_alloc: 234881024 data_used: 25014272
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536289280 unmapped: 116514816 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536289280 unmapped: 116514816 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536289280 unmapped: 116514816 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536289280 unmapped: 116514816 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.087652206s of 27.103370667s, submitted: 17
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536297472 unmapped: 116506624 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1982e3000/0x0/0x1bfc00000, data 0x3b95cfb/0x3dc8000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5737300 data_alloc: 234881024 data_used: 25014272
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536297472 unmapped: 116506624 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3d1ec800 session 0x55bc3ec2bc20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42902400 session 0x55bc3b248960
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 536297472 unmapped: 116506624 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528023552 unmapped: 124780544 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528023552 unmapped: 124780544 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528023552 unmapped: 124780544 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3f11d000 session 0x55bc3e3aeb40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5399828 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527925248 unmapped: 124878848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527925248 unmapped: 124878848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527925248 unmapped: 124878848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527925248 unmapped: 124878848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527925248 unmapped: 124878848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5399828 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527933440 unmapped: 124870656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527933440 unmapped: 124870656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527933440 unmapped: 124870656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527941632 unmapped: 124862464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527941632 unmapped: 124862464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5399828 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527941632 unmapped: 124862464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527941632 unmapped: 124862464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527941632 unmapped: 124862464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527941632 unmapped: 124862464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5399828 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5399828 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527949824 unmapped: 124854272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527958016 unmapped: 124846080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527958016 unmapped: 124846080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527958016 unmapped: 124846080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5399828 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527958016 unmapped: 124846080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527958016 unmapped: 124846080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527966208 unmapped: 124837888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527966208 unmapped: 124837888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3dc22800 session 0x55bc3fd17680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc3f2de1e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3eeebc00 session 0x55bc3be04000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3b762000 session 0x55bc40a92b40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 35.318782806s of 35.413410187s, submitted: 31
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527982592 unmapped: 124821504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5401549 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532176896 unmapped: 120627200 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b56000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc391e4800 session 0x55bc3b30be00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3b762000 session 0x55bc39eafa40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3dc22800 session 0x55bc3b30a000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3eeebc00 session 0x55bc39864780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc395dc780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527990784 unmapped: 124813312 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527990784 unmapped: 124813312 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527990784 unmapped: 124813312 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527990784 unmapped: 124813312 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x198f89000/0x0/0x1bfc00000, data 0x2ef3cec/0x3125000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5491623 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 527941632 unmapped: 124862464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3ef3d000 session 0x55bc3d34b2c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528097280 unmapped: 124706816 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 528113664 unmapped: 124690432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x198f65000/0x0/0x1bfc00000, data 0x2f17cec/0x3149000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5582506 data_alloc: 234881024 data_used: 23056384
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x198f65000/0x0/0x1bfc00000, data 0x2f17cec/0x3149000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5582506 data_alloc: 234881024 data_used: 23056384
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x198f65000/0x0/0x1bfc00000, data 0x2f17cec/0x3149000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x198f65000/0x0/0x1bfc00000, data 0x2f17cec/0x3149000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529137664 unmapped: 123666432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.492254257s of 19.686597824s, submitted: 31
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x198f65000/0x0/0x1bfc00000, data 0x2f17cec/0x3149000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23b4f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5613674 data_alloc: 234881024 data_used: 23433216
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5620858 data_alloc: 234881024 data_used: 23760896
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5620858 data_alloc: 234881024 data_used: 23760896
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5620858 data_alloc: 234881024 data_used: 23760896
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.330408096s of 19.435075760s, submitted: 26
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529465344 unmapped: 123338752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3eeed000 session 0x55bc3971a780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3b15fc00 session 0x55bc3be045a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529465344 unmapped: 123338752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-mon[77084]: mon.compute-2@1(peon).osd e433 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5619886 data_alloc: 234881024 data_used: 23760896
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529473536 unmapped: 123330560 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529473536 unmapped: 123330560 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529473536 unmapped: 123330560 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529473536 unmapped: 123330560 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529473536 unmapped: 123330560 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5619886 data_alloc: 234881024 data_used: 23760896
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529481728 unmapped: 123322368 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529481728 unmapped: 123322368 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529481728 unmapped: 123322368 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529489920 unmapped: 123314176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.088842392s of 11.104266167s, submitted: 5
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc40646000 session 0x55bc3e1f4960
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc4a7afc00 session 0x55bc40a93a40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c86000/0x0/0x1bfc00000, data 0x3236cec/0x3468000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529489920 unmapped: 123314176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3e197c00 session 0x55bc395ba5a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5411382 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab96000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5411382 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529448960 unmapped: 123355136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab96000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5411382 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529457152 unmapped: 123346944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529465344 unmapped: 123338752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529465344 unmapped: 123338752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab96000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab96000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5411382 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529465344 unmapped: 123338752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529465344 unmapped: 123338752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529465344 unmapped: 123338752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529473536 unmapped: 123330560 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab96000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529473536 unmapped: 123330560 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab96000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab96000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5411382 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529481728 unmapped: 123322368 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529489920 unmapped: 123314176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529489920 unmapped: 123314176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529489920 unmapped: 123314176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 24.814048767s of 24.931154251s, submitted: 42
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a850000/0x0/0x1bfc00000, data 0x266bd15/0x289e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [0,0,0,0,1])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3e347400 session 0x55bc3d357860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 529629184 unmapped: 123174912 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a606800 session 0x55bc39ed6f00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3c19e000 session 0x55bc3d34a960
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3ba8e800 session 0x55bc3a051a40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc4a7b2800 session 0x55bc4337f860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5448616 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a850000/0x0/0x1bfc00000, data 0x266bd4e/0x289e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530685952 unmapped: 122118144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530685952 unmapped: 122118144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a850000/0x0/0x1bfc00000, data 0x266bd4e/0x289e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530685952 unmapped: 122118144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530685952 unmapped: 122118144 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3b764400 session 0x55bc3b2fbe00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530694144 unmapped: 122109952 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5461950 data_alloc: 218103808 data_used: 13312000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530694144 unmapped: 122109952 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a84f000/0x0/0x1bfc00000, data 0x266bd71/0x289f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530849792 unmapped: 121954304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a84f000/0x0/0x1bfc00000, data 0x266bd71/0x289f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530849792 unmapped: 121954304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530849792 unmapped: 121954304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530849792 unmapped: 121954304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5474590 data_alloc: 218103808 data_used: 15101952
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530849792 unmapped: 121954304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530849792 unmapped: 121954304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530849792 unmapped: 121954304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a84f000/0x0/0x1bfc00000, data 0x266bd71/0x289f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530857984 unmapped: 121946112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530857984 unmapped: 121946112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5474590 data_alloc: 218103808 data_used: 15101952
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 530857984 unmapped: 121946112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 16.565097809s of 17.140762329s, submitted: 55
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535273472 unmapped: 117530624 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a0cc000/0x0/0x1bfc00000, data 0x2deed71/0x3022000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535281664 unmapped: 117522432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535568384 unmapped: 117235712 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535568384 unmapped: 117235712 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5554052 data_alloc: 218103808 data_used: 17031168
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535568384 unmapped: 117235712 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535568384 unmapped: 117235712 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a049000/0x0/0x1bfc00000, data 0x2e71d71/0x30a5000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535568384 unmapped: 117235712 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 535568384 unmapped: 117235712 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534437888 unmapped: 118366208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a02a000/0x0/0x1bfc00000, data 0x2e90d71/0x30c4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5550496 data_alloc: 218103808 data_used: 17035264
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534437888 unmapped: 118366208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534437888 unmapped: 118366208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534437888 unmapped: 118366208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a02a000/0x0/0x1bfc00000, data 0x2e90d71/0x30c4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5550816 data_alloc: 218103808 data_used: 17043456
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a02a000/0x0/0x1bfc00000, data 0x2e90d71/0x30c4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a02a000/0x0/0x1bfc00000, data 0x2e90d71/0x30c4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a02a000/0x0/0x1bfc00000, data 0x2e90d71/0x30c4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5550816 data_alloc: 218103808 data_used: 17043456
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534446080 unmapped: 118358016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 20.096275330s of 20.498874664s, submitted: 111
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534454272 unmapped: 118349824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534454272 unmapped: 118349824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a01d000/0x0/0x1bfc00000, data 0x2e9dd71/0x30d1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534454272 unmapped: 118349824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534454272 unmapped: 118349824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a01d000/0x0/0x1bfc00000, data 0x2e9dd71/0x30d1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3e398c00 session 0x55bc3e1f4780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3b142000 session 0x55bc3d34b680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5550120 data_alloc: 218103808 data_used: 17043456
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534454272 unmapped: 118349824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a01d000/0x0/0x1bfc00000, data 0x2e9dd71/0x30d1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 534454272 unmapped: 118349824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3d5fd800 session 0x55bc39fcda40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532070400 unmapped: 120733696 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532070400 unmapped: 120733696 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532070400 unmapped: 120733696 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422218 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422218 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422218 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422218 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422218 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5422218 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532078592 unmapped: 120725504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532086784 unmapped: 120717312 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab94000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532086784 unmapped: 120717312 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 36.724323273s of 36.925601959s, submitted: 61
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc40646000 session 0x55bc396e85a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532094976 unmapped: 120709120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532094976 unmapped: 120709120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5529662 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532094976 unmapped: 120709120 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199db1000/0x0/0x1bfc00000, data 0x310bcec/0x333d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5529662 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199db1000/0x0/0x1bfc00000, data 0x310bcec/0x333d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3f11d400 session 0x55bc3f2def00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc39eaf4a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532103168 unmapped: 120700928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3ba8f000 session 0x55bc3b2492c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.088305473s of 11.171178818s, submitted: 14
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3b1a1800 session 0x55bc392ab860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532299776 unmapped: 120504320 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199db1000/0x0/0x1bfc00000, data 0x310bcec/0x333d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199d8b000/0x0/0x1bfc00000, data 0x312fd1f/0x3363000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5536818 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 532357120 unmapped: 120446976 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199d8b000/0x0/0x1bfc00000, data 0x312fd1f/0x3363000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5640338 data_alloc: 234881024 data_used: 26611712
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199d8b000/0x0/0x1bfc00000, data 0x312fd1f/0x3363000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5640338 data_alloc: 234881024 data_used: 26611712
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199d8b000/0x0/0x1bfc00000, data 0x312fd1f/0x3363000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538411008 unmapped: 114393088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.251974106s of 12.289438248s, submitted: 13
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 538451968 unmapped: 114352128 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540524544 unmapped: 112279552 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995a4000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5712800 data_alloc: 234881024 data_used: 27160576
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995a4000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5712800 data_alloc: 234881024 data_used: 27160576
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995a4000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541851648 unmapped: 110952448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.383799553s of 10.515339851s, submitted: 71
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541868032 unmapped: 110936064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541868032 unmapped: 110936064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995ac000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995ac000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541868032 unmapped: 110936064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3d5fd000 session 0x55bc3d3565a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc3b2fa1e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5712188 data_alloc: 234881024 data_used: 27148288
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541868032 unmapped: 110936064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541868032 unmapped: 110936064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995ac000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5709728 data_alloc: 234881024 data_used: 27312128
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995ac000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5709728 data_alloc: 234881024 data_used: 27312128
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995ac000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.376385689s of 15.460625648s, submitted: 23
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995ac000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5716368 data_alloc: 234881024 data_used: 27676672
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995ac000/0x0/0x1bfc00000, data 0x390ed1f/0x3b42000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a607800 session 0x55bc39864780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc3e3ae960
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541884416 unmapped: 110919680 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6000 session 0x55bc3f2def00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326d0f/0x2559000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539672576 unmapped: 113131520 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539680768 unmapped: 113123328 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5435847 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539688960 unmapped: 113115136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19ab95000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539688960 unmapped: 113115136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42c94c00 session 0x55bc3d3561e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42c94c00 session 0x55bc39eae3c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc4337f2c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a607800 session 0x55bc3f2dfe00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539688960 unmapped: 113115136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 55.381656647s of 55.483505249s, submitted: 52
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6000 session 0x55bc392abc20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc3a0501e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc3be04d20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc3e3afa40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a607800 session 0x55bc3fd161e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5524672 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a101000/0x0/0x1bfc00000, data 0x2dbacfc/0x2fed000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a101000/0x0/0x1bfc00000, data 0x2dbacfc/0x2fed000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5524672 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42c95800 session 0x55bc395dd4a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc4a7aac00 session 0x55bc40a92b40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a101000/0x0/0x1bfc00000, data 0x2dbacfc/0x2fed000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539860992 unmapped: 112943104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc395bab40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.130089760s of 10.246961594s, submitted: 29
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a607800 session 0x55bc3fd16780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540180480 unmapped: 112623616 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540180480 unmapped: 112623616 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5530299 data_alloc: 218103808 data_used: 12058624
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a0db000/0x0/0x1bfc00000, data 0x2dded2f/0x3013000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540262400 unmapped: 112541696 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7200.5 total, 600.0 interval#012Cumulative writes: 77K writes, 307K keys, 77K commit groups, 1.0 writes per commit group, ingest: 0.31 GB, 0.04 MB/s#012Cumulative WAL: 77K writes, 28K syncs, 2.69 writes per sync, written: 0.31 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 2146 writes, 8860 keys, 2146 commit groups, 1.0 writes per commit group, ingest: 7.11 MB, 0.01 MB/s#012Interval WAL: 2146 writes, 874 syncs, 2.46 writes per sync, written: 0.01 GB, 0.01 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5601179 data_alloc: 218103808 data_used: 21934080
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a0db000/0x0/0x1bfc00000, data 0x2dded2f/0x3013000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a0db000/0x0/0x1bfc00000, data 0x2dded2f/0x3013000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a0db000/0x0/0x1bfc00000, data 0x2dded2f/0x3013000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: mgrc ms_handle_reset ms_handle_reset con 0x55bc4a7b0800
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/530399322
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/530399322,v1:192.168.122.100:6801/530399322]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5601179 data_alloc: 218103808 data_used: 21934080
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: mgrc handle_mgr_configure stats_period=5
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540270592 unmapped: 112533504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.702584267s of 13.730019569s, submitted: 8
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 539394048 unmapped: 113410048 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199c3f000/0x0/0x1bfc00000, data 0x3272d2f/0x34a7000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540614656 unmapped: 112189440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5656561 data_alloc: 234881024 data_used: 22425600
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b92000/0x0/0x1bfc00000, data 0x331fd2f/0x3554000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b92000/0x0/0x1bfc00000, data 0x331fd2f/0x3554000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5656577 data_alloc: 234881024 data_used: 22425600
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540999680 unmapped: 111804416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b92000/0x0/0x1bfc00000, data 0x331fd2f/0x3554000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x199b9a000/0x0/0x1bfc00000, data 0x331fd2f/0x3554000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541007872 unmapped: 111796224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541007872 unmapped: 111796224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.218096733s of 11.472540855s, submitted: 101
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc4337e000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42c95800 session 0x55bc3a1621e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 541007872 unmapped: 111796224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc40646000 session 0x55bc3d356b40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc40646000 session 0x55bc39fc61e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42c95000 session 0x55bc39eafa40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc3fd16b40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540508160 unmapped: 112295936 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a607800 session 0x55bc3be054a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc398650e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc39665860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc3a05d680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a607800 session 0x55bc3971ba40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42c94800 session 0x55bc391de5a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5541273 data_alloc: 218103808 data_used: 12050432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540753920 unmapped: 112050176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3b162800 session 0x55bc40a930e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a1c5000/0x0/0x1bfc00000, data 0x2cf5d5e/0x2f29000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc396e8000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540753920 unmapped: 112050176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a607800 session 0x55bc40a93680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540753920 unmapped: 112050176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540901376 unmapped: 111902720 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6400 session 0x55bc4337e5a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 540901376 unmapped: 111902720 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5557071 data_alloc: 218103808 data_used: 13787136
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 542064640 unmapped: 110739456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a1a0000/0x0/0x1bfc00000, data 0x2d19d6e/0x2f4e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5607791 data_alloc: 218103808 data_used: 20914176
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a1a0000/0x0/0x1bfc00000, data 0x2d19d6e/0x2f4e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5607791 data_alloc: 218103808 data_used: 20914176
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19a1a0000/0x0/0x1bfc00000, data 0x2d19d6e/0x2f4e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x22b0f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543031296 unmapped: 109772800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.004535675s of 18.269424438s, submitted: 95
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3ef3dc00 session 0x55bc3b203680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545677312 unmapped: 107126784 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546119680 unmapped: 106684416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545898496 unmapped: 106905600 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545906688 unmapped: 106897408 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1988b9000/0x0/0x1bfc00000, data 0x3457d6e/0x368c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23caf9c7), peers [0,1] op hist [0,1])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1988b9000/0x0/0x1bfc00000, data 0x3457d6e/0x368c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23caf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5684591 data_alloc: 218103808 data_used: 21942272
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545923072 unmapped: 106881024 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545923072 unmapped: 106881024 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545964032 unmapped: 106840064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545964032 unmapped: 106840064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1988c2000/0x0/0x1bfc00000, data 0x3457d6e/0x368c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x23caf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 545964032 unmapped: 106840064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5678819 data_alloc: 218103808 data_used: 21946368
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546029568 unmapped: 106774528 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.653300285s of 10.073910713s, submitted: 353
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546029568 unmapped: 106774528 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546037760 unmapped: 106766336 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc42c94800 session 0x55bc39dc5a40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3e346800 session 0x55bc3971ab40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547094528 unmapped: 105709568 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x198491000/0x0/0x1bfc00000, data 0x3478d6e/0x36ad000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547094528 unmapped: 105709568 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5465246 data_alloc: 218103808 data_used: 12070912
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 543596544 unmapped: 109207552 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3a42a000 session 0x55bc3be045a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5465246 data_alloc: 218103808 data_used: 12070912
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5465246 data_alloc: 218103808 data_used: 12070912
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5465246 data_alloc: 218103808 data_used: 12070912
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5465246 data_alloc: 218103808 data_used: 12070912
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544661504 unmapped: 108142592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544669696 unmapped: 108134400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544669696 unmapped: 108134400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5465246 data_alloc: 218103808 data_used: 12070912
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544669696 unmapped: 108134400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1995e5000/0x0/0x1bfc00000, data 0x2326cec/0x2558000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544669696 unmapped: 108134400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 29.991601944s of 31.428083420s, submitted: 139
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544669696 unmapped: 108134400 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3ba8e800 session 0x55bc3ec2a780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc4a7ab000 session 0x55bc3d3572c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc4a7b1800 session 0x55bc4337e1e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544694272 unmapped: 108109824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6800 session 0x55bc3e3aeb40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc401b6800 session 0x55bc3dbd3860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544694272 unmapped: 108109824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19921e000/0x0/0x1bfc00000, data 0x26eecec/0x2920000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498387 data_alloc: 218103808 data_used: 12070912
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544694272 unmapped: 108109824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544694272 unmapped: 108109824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544694272 unmapped: 108109824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19921e000/0x0/0x1bfc00000, data 0x26eecec/0x2920000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544694272 unmapped: 108109824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544694272 unmapped: 108109824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc4a7aa000 session 0x55bc3fd161e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5500775 data_alloc: 218103808 data_used: 12070912
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544849920 unmapped: 107954176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544849920 unmapped: 107954176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544858112 unmapped: 107945984 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1991fa000/0x0/0x1bfc00000, data 0x2712cec/0x2944000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5528107 data_alloc: 218103808 data_used: 15839232
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1991fa000/0x0/0x1bfc00000, data 0x2712cec/0x2944000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5528107 data_alloc: 218103808 data_used: 15839232
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1991fa000/0x0/0x1bfc00000, data 0x2712cec/0x2944000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 544866304 unmapped: 107937792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.569227219s of 21.706684113s, submitted: 22
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1991fa000/0x0/0x1bfc00000, data 0x2712cec/0x2944000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548446208 unmapped: 104357888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1991fa000/0x0/0x1bfc00000, data 0x2712cec/0x2944000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [0,0,0,0,0,0,1,1,11,20])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5629407 data_alloc: 218103808 data_used: 15859712
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546668544 unmapped: 106135552 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546684928 unmapped: 106119168 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19851f000/0x0/0x1bfc00000, data 0x33edcec/0x361f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546684928 unmapped: 106119168 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 546684928 unmapped: 106119168 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x1984de000/0x0/0x1bfc00000, data 0x342ecec/0x3660000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547110912 unmapped: 105693184 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5639943 data_alloc: 218103808 data_used: 16769024
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19849d000/0x0/0x1bfc00000, data 0x346fcec/0x36a1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5640103 data_alloc: 218103808 data_used: 16773120
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.593935966s of 11.477274895s, submitted: 78
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19847b000/0x0/0x1bfc00000, data 0x3491cec/0x36c3000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5640823 data_alloc: 218103808 data_used: 16773120
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc39921400 session 0x55bc395ba5a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3e398400 session 0x55bc395dcd20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19846d000/0x0/0x1bfc00000, data 0x349fcec/0x36d1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19846d000/0x0/0x1bfc00000, data 0x349fcec/0x36d1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5639191 data_alloc: 218103808 data_used: 16773120
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 heartbeat osd_stat(store_statfs(0x19846d000/0x0/0x1bfc00000, data 0x349fcec/0x36d1000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547119104 unmapped: 105684992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 ms_handle_reset con 0x55bc3eeecc00 session 0x55bc392aa5a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.454691887s of 12.702719688s, submitted: 11
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 424 handle_osd_map epochs [424,425], i have 424, src has [1,425]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 425 ms_handle_reset con 0x55bc3eeecc00 session 0x55bc3b30a780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 425 ms_handle_reset con 0x55bc39921400 session 0x55bc3d3570e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 425 ms_handle_reset con 0x55bc3e398400 session 0x55bc4337f860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5643365 data_alloc: 218103808 data_used: 16781312
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 425 heartbeat osd_stat(store_statfs(0x198469000/0x0/0x1bfc00000, data 0x34a1945/0x36d4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 425 heartbeat osd_stat(store_statfs(0x198469000/0x0/0x1bfc00000, data 0x34a1945/0x36d4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5644137 data_alloc: 218103808 data_used: 16822272
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 425 heartbeat osd_stat(store_statfs(0x198469000/0x0/0x1bfc00000, data 0x34a1945/0x36d4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 425 heartbeat osd_stat(store_statfs(0x198469000/0x0/0x1bfc00000, data 0x34a1945/0x36d4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5644137 data_alloc: 218103808 data_used: 16822272
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 425 heartbeat osd_stat(store_statfs(0x198469000/0x0/0x1bfc00000, data 0x34a1945/0x36d4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547127296 unmapped: 105676800 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 15.551640511s of 15.687273026s, submitted: 8
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 425 ms_handle_reset con 0x55bc3b762800 session 0x55bc3b30be00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 425 heartbeat osd_stat(store_statfs(0x198469000/0x0/0x1bfc00000, data 0x34a1945/0x36d4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547135488 unmapped: 105668608 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 425 handle_osd_map epochs [425,426], i have 425, src has [1,426]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 426 ms_handle_reset con 0x55bc3a607800 session 0x55bc3e3ae1e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5654547 data_alloc: 218103808 data_used: 17137664
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547143680 unmapped: 105660416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 426 heartbeat osd_stat(store_statfs(0x198464000/0x0/0x1bfc00000, data 0x34a35f2/0x36d7000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5660467 data_alloc: 218103808 data_used: 17682432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 426 heartbeat osd_stat(store_statfs(0x198464000/0x0/0x1bfc00000, data 0x34a35f2/0x36d7000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547160064 unmapped: 105644032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5660467 data_alloc: 218103808 data_used: 17682432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 426 handle_osd_map epochs [426,427], i have 426, src has [1,427]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.463970184s of 11.542026520s, submitted: 28
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3a42b400 session 0x55bc3e1f5c20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b1a1400 session 0x55bc392a5680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198464000/0x0/0x1bfc00000, data 0x34a5131/0x36da000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5661809 data_alloc: 218103808 data_used: 17682432
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198464000/0x0/0x1bfc00000, data 0x34a5131/0x36da000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc42c95800 session 0x55bc41a75e00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547192832 unmapped: 105611264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547201024 unmapped: 105603072 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547201024 unmapped: 105603072 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547201024 unmapped: 105603072 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547201024 unmapped: 105603072 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547201024 unmapped: 105603072 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547201024 unmapped: 105603072 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547201024 unmapped: 105603072 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547209216 unmapped: 105594880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dd000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5485265 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7b3000 session 0x55bc40a93860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b145000 session 0x55bc3d34ad20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b145000 session 0x55bc3f2df0e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3a42b400 session 0x55bc3dbd21e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 56.036472321s of 56.152065277s, submitted: 44
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547217408 unmapped: 105586688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547225600 unmapped: 105578496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547241984 unmapped: 105562112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b1a1400 session 0x55bc3ec2ad20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198d41000/0x0/0x1bfc00000, data 0x2bc8131/0x2dfd000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5553765 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547250176 unmapped: 105553920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547250176 unmapped: 105553920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547250176 unmapped: 105553920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198d41000/0x0/0x1bfc00000, data 0x2bc8131/0x2dfd000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547250176 unmapped: 105553920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7b2000 session 0x55bc3d357e00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547250176 unmapped: 105553920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5558951 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547250176 unmapped: 105553920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547266560 unmapped: 105537536 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198d1c000/0x0/0x1bfc00000, data 0x2bec154/0x2e22000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5615723 data_alloc: 218103808 data_used: 19992576
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198d1c000/0x0/0x1bfc00000, data 0x2bec154/0x2e22000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198d1c000/0x0/0x1bfc00000, data 0x2bec154/0x2e22000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5615723 data_alloc: 218103808 data_used: 19992576
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547463168 unmapped: 105340928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 21.351263046s of 21.458789825s, submitted: 24
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547880960 unmapped: 104923136 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198d1c000/0x0/0x1bfc00000, data 0x2bec154/0x2e22000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551190528 unmapped: 101613568 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549167104 unmapped: 103636992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5682095 data_alloc: 218103808 data_used: 20029440
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1984a4000/0x0/0x1bfc00000, data 0x345e154/0x3694000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [0,0,0,0,0,0,0,1])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549167104 unmapped: 103636992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549167104 unmapped: 103636992 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550215680 unmapped: 102588416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550215680 unmapped: 102588416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550215680 unmapped: 102588416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5695981 data_alloc: 218103808 data_used: 20627456
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x198484000/0x0/0x1bfc00000, data 0x3479154/0x36af000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550215680 unmapped: 102588416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550215680 unmapped: 102588416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550215680 unmapped: 102588416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550215680 unmapped: 102588416 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.743067741s of 11.654676437s, submitted: 77
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5695233 data_alloc: 218103808 data_used: 20615168
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848d000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f12a000 session 0x55bc3b2f45a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d1ed800 session 0x55bc39f2cd20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848d000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5695101 data_alloc: 218103808 data_used: 20615168
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bf4ac00 session 0x55bc3dbd3a40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848d000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550223872 unmapped: 102580224 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848d000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550232064 unmapped: 102572032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.722215652s of 10.750947952s, submitted: 25
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5691153 data_alloc: 218103808 data_used: 20619264
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550240256 unmapped: 102563840 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5691953 data_alloc: 218103808 data_used: 20717568
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5691953 data_alloc: 218103808 data_used: 20717568
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550256640 unmapped: 102547456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550264832 unmapped: 102539264 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.778625488s of 11.782457352s, submitted: 1
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550281216 unmapped: 102522880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848c000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550281216 unmapped: 102522880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848c000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5696401 data_alloc: 218103808 data_used: 20930560
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5696401 data_alloc: 218103808 data_used: 20930560
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5696401 data_alloc: 218103808 data_used: 20930560
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b15fc00 session 0x55bc3e3afa40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 14.834457397s of 14.847952843s, submitted: 15
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3ef3c400 session 0x55bc40a93c20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550297600 unmapped: 102506496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19848e000/0x0/0x1bfc00000, data 0x347a154/0x36b0000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550305792 unmapped: 102498304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547397632 unmapped: 105406464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5505715 data_alloc: 218103808 data_used: 12201984
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3dc23400 session 0x55bc3a4501e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498111 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498111 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547405824 unmapped: 105398272 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 105390080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 105390080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 105390080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498111 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547414016 unmapped: 105390080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498111 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498111 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498111 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547422208 unmapped: 105381888 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547430400 unmapped: 105373696 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547430400 unmapped: 105373696 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547438592 unmapped: 105365504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1995dc000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547438592 unmapped: 105365504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5498111 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547438592 unmapped: 105365504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547438592 unmapped: 105365504 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 41.052555084s of 41.146015167s, submitted: 37
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556376064 unmapped: 96428032 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bbdec00 session 0x55bc3971a780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7afc00 session 0x55bc39ed7680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc42c94800 session 0x55bc3b30b860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7ac400 session 0x55bc3d34af00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bf4b000 session 0x55bc41a75e00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547684352 unmapped: 105119744 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1987e1000/0x0/0x1bfc00000, data 0x3128131/0x335d000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547684352 unmapped: 105119744 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5617034 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4385c800 session 0x55bc395dd2c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547692544 unmapped: 105111552 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f124800 session 0x55bc3b2f50e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b4d5000 session 0x55bc3e1f5680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547561472 unmapped: 105242624 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc39689400 session 0x55bc3fd17c20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547569664 unmapped: 105234432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1987e0000/0x0/0x1bfc00000, data 0x3128141/0x335e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547569664 unmapped: 105234432 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 547692544 unmapped: 105111552 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5720140 data_alloc: 234881024 data_used: 26136576
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1987e0000/0x0/0x1bfc00000, data 0x3128141/0x335e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549036032 unmapped: 103768064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549036032 unmapped: 103768064 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1987e0000/0x0/0x1bfc00000, data 0x3128141/0x335e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5720140 data_alloc: 234881024 data_used: 26136576
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1987e0000/0x0/0x1bfc00000, data 0x3128141/0x335e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 549044224 unmapped: 103759872 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.640283585s of 17.763603210s, submitted: 39
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5720712 data_alloc: 234881024 data_used: 26148864
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1987e0000/0x0/0x1bfc00000, data 0x3128141/0x335e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x240bf9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #59. Immutable memtables: 15.
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 555016192 unmapped: 97787904 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 555106304 unmapped: 97697792 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196b3e000/0x0/0x1bfc00000, data 0x3c2a141/0x3e60000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 555294720 unmapped: 97509376 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556400640 unmapped: 96403456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19672a000/0x0/0x1bfc00000, data 0x403e141/0x4274000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556400640 unmapped: 96403456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5868382 data_alloc: 234881024 data_used: 27492352
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556400640 unmapped: 96403456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196330000/0x0/0x1bfc00000, data 0x4432141/0x4668000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556400640 unmapped: 96403456 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196330000/0x0/0x1bfc00000, data 0x4432141/0x4668000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556138496 unmapped: 96665600 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556179456 unmapped: 96624640 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557449216 unmapped: 95354880 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5871674 data_alloc: 234881024 data_used: 27303936
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.422229767s of 10.152475357s, submitted: 181
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196279000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196279000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5883714 data_alloc: 234881024 data_used: 27533312
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196279000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7a9000 session 0x55bc396e9680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3c19f800 session 0x55bc3d34a000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5879794 data_alloc: 234881024 data_used: 27533312
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19627f000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19627f000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5879794 data_alloc: 234881024 data_used: 27533312
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19627f000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557457408 unmapped: 95346688 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 17.022401810s of 17.036853790s, submitted: 9
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 95338496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19627f000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 95338496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19627f000/0x0/0x1bfc00000, data 0x44e9141/0x471f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557465600 unmapped: 95338496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5881946 data_alloc: 234881024 data_used: 27537408
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19625a000/0x0/0x1bfc00000, data 0x450e141/0x4744000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557473792 unmapped: 95330304 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc42903000 session 0x55bc395ba5a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3fa62800 session 0x55bc3a05d680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bf4b000 session 0x55bc3b30ba40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551845888 unmapped: 100958208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551854080 unmapped: 100950016 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551862272 unmapped: 100941824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551862272 unmapped: 100941824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551862272 unmapped: 100941824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551862272 unmapped: 100941824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551862272 unmapped: 100941824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551862272 unmapped: 100941824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551862272 unmapped: 100941824 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551870464 unmapped: 100933632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551870464 unmapped: 100933632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551870464 unmapped: 100933632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551870464 unmapped: 100933632 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551878656 unmapped: 100925440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551878656 unmapped: 100925440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551878656 unmapped: 100925440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5518585 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551878656 unmapped: 100925440 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3ec53400 session 0x55bc39f2d860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bf4b000 session 0x55bc3e1f4b40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3c19f800 session 0x55bc3e3aeb40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3fa62800 session 0x55bc396e8000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 44.188873291s of 44.280071259s, submitted: 44
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552927232 unmapped: 99876864 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc42903000 session 0x55bc3fd161e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d5fc000 session 0x55bc3e3ae780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bf4b000 session 0x55bc39fcc960
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552091648 unmapped: 100712448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e7216a/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3c19f800 session 0x55bc3a520000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3fa62800 session 0x55bc3be052c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552091648 unmapped: 100712448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552091648 unmapped: 100712448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5612520 data_alloc: 218103808 data_used: 12091392
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552091648 unmapped: 100712448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552091648 unmapped: 100712448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552091648 unmapped: 100712448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552091648 unmapped: 100712448 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d5fd000 session 0x55bc3fd16f00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552099840 unmapped: 100704256 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5612520 data_alloc: 218103808 data_used: 12091392
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 552099840 unmapped: 100704256 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d029c00 session 0x55bc3ec2a780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5687400 data_alloc: 234881024 data_used: 22646784
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5687400 data_alloc: 234881024 data_used: 22646784
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b1a1800 session 0x55bc4337fe00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5687400 data_alloc: 234881024 data_used: 22646784
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553172992 unmapped: 99631104 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f124800 session 0x55bc3dbd3680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 27.174053192s of 27.356210709s, submitted: 41
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553181184 unmapped: 99622912 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1978f5000/0x0/0x1bfc00000, data 0x2e721a3/0x30a9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc42902400 session 0x55bc3dbd3680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548397056 unmapped: 104407040 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548405248 unmapped: 104398848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548405248 unmapped: 104398848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843c000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548405248 unmapped: 104398848 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5526552 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 41.755805969s of 41.814205170s, submitted: 18
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b586000 session 0x55bc3e3ae780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3a42b400 session 0x55bc3b30ba40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b1a1800 session 0x55bc395ba5a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b586000 session 0x55bc3fd17c20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f124800 session 0x55bc41a75e00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548413440 unmapped: 104390656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548413440 unmapped: 104390656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548413440 unmapped: 104390656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548413440 unmapped: 104390656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197883000/0x0/0x1bfc00000, data 0x2ee6131/0x311b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548421632 unmapped: 104382464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7a6c00 session 0x55bc3b30b860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5619321 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7b2000 session 0x55bc3d34ba40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b1a1800 session 0x55bc39fccd20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b586000 session 0x55bc41a74f00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f124800 session 0x55bc3fd165a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548421632 unmapped: 104382464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b13c000 session 0x55bc3a0505a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548421632 unmapped: 104382464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197882000/0x0/0x1bfc00000, data 0x2ee615a/0x311c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548421632 unmapped: 104382464 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548732928 unmapped: 104071168 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548438016 unmapped: 104366080 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5690755 data_alloc: 218103808 data_used: 12087296
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f12bc00 session 0x55bc39665a40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.917882919s of 10.103110313s, submitted: 59
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548626432 unmapped: 104177664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197112000/0x0/0x1bfc00000, data 0x365615a/0x388c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548626432 unmapped: 104177664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966fa000/0x0/0x1bfc00000, data 0x406d16a/0x42a4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548626432 unmapped: 104177664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bf4ac00 session 0x55bc39ed7680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548626432 unmapped: 104177664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7a6c00 session 0x55bc40a92780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d1edc00 session 0x55bc3dbd34a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b721000 session 0x55bc41a74960
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b13c000 session 0x55bc4337eb40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b721000 session 0x55bc3f2de5a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3bf4ac00 session 0x55bc3e3afa40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d1edc00 session 0x55bc3b2f45a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548642816 unmapped: 104161280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5770569 data_alloc: 218103808 data_used: 12091392
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548642816 unmapped: 104161280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548642816 unmapped: 104161280 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966fa000/0x0/0x1bfc00000, data 0x406d1a3/0x42a4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548651008 unmapped: 104153088 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7a6400 session 0x55bc3be05a40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f11c400 session 0x55bc3e1f4f00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548659200 unmapped: 104144896 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc39921400 session 0x55bc3d34b4a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d1ec800 session 0x55bc3e3ae3c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 548667392 unmapped: 104136704 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5840969 data_alloc: 218103808 data_used: 19673088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550084608 unmapped: 102719488 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3ef3cc00 session 0x55bc3b2f5680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7af000 session 0x55bc3d357860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.621556282s of 10.712708473s, submitted: 30
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966fa000/0x0/0x1bfc00000, data 0x406d1a3/0x42a4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550084608 unmapped: 102719488 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7aec00 session 0x55bc3f2dfc20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3e346c00 session 0x55bc3a05cd20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550395904 unmapped: 102408192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966b0000/0x0/0x1bfc00000, data 0x40b51c3/0x42ee000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550395904 unmapped: 102408192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966b0000/0x0/0x1bfc00000, data 0x40b51c3/0x42ee000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550395904 unmapped: 102408192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5864350 data_alloc: 218103808 data_used: 21291008
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966b0000/0x0/0x1bfc00000, data 0x40b51c3/0x42ee000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966b0000/0x0/0x1bfc00000, data 0x40b51c3/0x42ee000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550404096 unmapped: 102400000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 550404096 unmapped: 102400000 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551346176 unmapped: 101457920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551346176 unmapped: 101457920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551346176 unmapped: 101457920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5886590 data_alloc: 234881024 data_used: 24281088
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966b0000/0x0/0x1bfc00000, data 0x40b51c3/0x42ee000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551346176 unmapped: 101457920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.204458237s of 10.254082680s, submitted: 13
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551346176 unmapped: 101457920 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1966b0000/0x0/0x1bfc00000, data 0x40b51c3/0x42ee000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 555794432 unmapped: 97009664 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557006848 unmapped: 95797248 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556679168 unmapped: 96124928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6050632 data_alloc: 234881024 data_used: 33976320
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x195a88000/0x0/0x1bfc00000, data 0x4cdd1c3/0x4f16000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556679168 unmapped: 96124928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556679168 unmapped: 96124928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556679168 unmapped: 96124928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 556679168 unmapped: 96124928 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559374336 unmapped: 93429760 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6144370 data_alloc: 234881024 data_used: 34029568
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558989312 unmapped: 93814784 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x194e3d000/0x0/0x1bfc00000, data 0x59281c3/0x5b61000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.682522774s of 10.029087067s, submitted: 159
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7aa000 session 0x55bc3b248960
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b586000 session 0x55bc41a75680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 558997504 unmapped: 93806592 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 93790208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 93790208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x194e96000/0x0/0x1bfc00000, data 0x42d01b3/0x4508000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3e196000 session 0x55bc3d34b680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559013888 unmapped: 93790208 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5940665 data_alloc: 234881024 data_used: 28221440
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x194eba000/0x0/0x1bfc00000, data 0x42ac1b3/0x44e4000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565108736 unmapped: 87695360 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565108736 unmapped: 87695360 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc39921400 session 0x55bc3d34af00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3d029000 session 0x55bc39dc5a40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19568f000/0x0/0x1bfc00000, data 0x50d11b3/0x5309000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565108736 unmapped: 87695360 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x195695000/0x0/0x1bfc00000, data 0x50d11b3/0x5309000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc39921400 session 0x55bc3e3ae780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196c2b000/0x0/0x1bfc00000, data 0x3b18141/0x3d4e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563871744 unmapped: 88932352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563871744 unmapped: 88932352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5804774 data_alloc: 218103808 data_used: 20484096
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196c2b000/0x0/0x1bfc00000, data 0x3b18141/0x3d4e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563871744 unmapped: 88932352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563871744 unmapped: 88932352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563871744 unmapped: 88932352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.577011108s of 12.010025024s, submitted: 196
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7a6c00 session 0x55bc3d34b0e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 563871744 unmapped: 88932352 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc4a7ab800 session 0x55bc3f2de780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559185920 unmapped: 93618176 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x196c50000/0x0/0x1bfc00000, data 0x3b18141/0x3d4e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5804166 data_alloc: 218103808 data_used: 20484096
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f12b800 session 0x55bc3a520000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557539328 unmapped: 95264768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5563027 data_alloc: 218103808 data_used: 11952128
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 557547520 unmapped: 95256576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 49.914997101s of 49.995937347s, submitted: 32
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc42903000 session 0x55bc41a750e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19843d000/0x0/0x1bfc00000, data 0x232c131/0x2561000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3eeeb400 session 0x55bc3e1f50e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3c2a2c00 session 0x55bc3e1f4780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b587000 session 0x55bc39f2cd20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3ba8ec00 session 0x55bc3e3ae960
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5659481 data_alloc: 218103808 data_used: 11952128
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197882000/0x0/0x1bfc00000, data 0x2ee6193/0x311c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b43b400 session 0x55bc39865a40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5659481 data_alloc: 218103808 data_used: 11952128
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551485440 unmapped: 101318656 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551608320 unmapped: 101195776 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197882000/0x0/0x1bfc00000, data 0x2ee6193/0x311c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5736957 data_alloc: 234881024 data_used: 22507520
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197882000/0x0/0x1bfc00000, data 0x2ee6193/0x311c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: [db/db_impl/db_impl.cc:1111] #012** DB Stats **#012Uptime(secs): 7800.5 total, 600.0 interval#012Cumulative writes: 80K writes, 319K keys, 80K commit groups, 1.0 writes per commit group, ingest: 0.32 GB, 0.04 MB/s#012Cumulative WAL: 80K writes, 29K syncs, 2.68 writes per sync, written: 0.32 GB, 0.04 MB/s#012Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent#012Interval writes: 3146 writes, 12K keys, 3146 commit groups, 1.0 writes per commit group, ingest: 12.82 MB, 0.02 MB/s#012Interval WAL: 3146 writes, 1219 syncs, 2.58 writes per sync, written: 0.01 GB, 0.02 MB/s#012Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5736957 data_alloc: 234881024 data_used: 22507520
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197882000/0x0/0x1bfc00000, data 0x2ee6193/0x311c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 551706624 unmapped: 101097472 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.433856964s of 19.576021194s, submitted: 48
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554033152 unmapped: 98770944 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554041344 unmapped: 98762752 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553369600 unmapped: 99434496 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x19721a000/0x0/0x1bfc00000, data 0x354d193/0x3783000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5791039 data_alloc: 234881024 data_used: 23113728
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x197212000/0x0/0x1bfc00000, data 0x3553193/0x3789000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5806577 data_alloc: 234881024 data_used: 23191552
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5806593 data_alloc: 234881024 data_used: 23191552
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5806593 data_alloc: 234881024 data_used: 23191552
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5806593 data_alloc: 234881024 data_used: 23191552
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553385984 unmapped: 99418112 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 25.315835953s of 25.517301559s, submitted: 100
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553443328 unmapped: 99360768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971f6000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553443328 unmapped: 99360768 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553451520 unmapped: 99352576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5801905 data_alloc: 234881024 data_used: 23179264
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553451520 unmapped: 99352576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553451520 unmapped: 99352576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971ff000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553451520 unmapped: 99352576 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3f124400 session 0x55bc3a47c1e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553459712 unmapped: 99344384 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 heartbeat osd_stat(store_statfs(0x1971ff000/0x0/0x1bfc00000, data 0x3569193/0x379f000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553459712 unmapped: 99344384 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5801097 data_alloc: 234881024 data_used: 23179264
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 ms_handle_reset con 0x55bc3b43bc00 session 0x55bc4337e1e0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553467904 unmapped: 99336192 heap: 652804096 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 handle_osd_map epochs [427,428], i have 427, src has [1,428]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 427 handle_osd_map epochs [428,428], i have 428, src has [1,428]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 428 ms_handle_reset con 0x55bc3eeea400 session 0x55bc39fb1c20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 428 ms_handle_reset con 0x55bc3b764400 session 0x55bc39ed7680
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 428 ms_handle_reset con 0x55bc3b13d000 session 0x55bc40a92780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 428 ms_handle_reset con 0x55bc3b43b400 session 0x55bc3e1f45a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 566214656 unmapped: 101122048 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 428 handle_osd_map epochs [428,429], i have 428, src has [1,429]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 429 ms_handle_reset con 0x55bc3b43bc00 session 0x55bc3f2de000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 101105664 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 429 handle_osd_map epochs [429,430], i have 429, src has [1,430]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.597118378s of 10.926770210s, submitted: 94
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 101105664 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 430 ms_handle_reset con 0x55bc3eeea400 session 0x55bc395bb860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 566231040 unmapped: 101105664 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 430 ms_handle_reset con 0x55bc4a7ab400 session 0x55bc3fd16b40
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 430 heartbeat osd_stat(store_statfs(0x195ac3000/0x0/0x1bfc00000, data 0x4c9f70e/0x4ed9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 430 ms_handle_reset con 0x55bc4a7ab400 session 0x55bc3b2485a0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 6009786 data_alloc: 234881024 data_used: 30371840
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 430 ms_handle_reset con 0x55bc3b13d000 session 0x55bc39fd4f00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559546368 unmapped: 107790336 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 430 heartbeat osd_stat(store_statfs(0x195ac5000/0x0/0x1bfc00000, data 0x4c9f70e/0x4ed9000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559546368 unmapped: 107790336 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 559554560 unmapped: 107782144 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 430 ms_handle_reset con 0x55bc4a7b2c00 session 0x55bc3ec2be00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553918464 unmapped: 113418240 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 430 heartbeat osd_stat(store_statfs(0x196d02000/0x0/0x1bfc00000, data 0x3a626ac/0x3c9b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 430 heartbeat osd_stat(store_statfs(0x196d02000/0x0/0x1bfc00000, data 0x3a626ac/0x3c9b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553918464 unmapped: 113418240 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5769078 data_alloc: 218103808 data_used: 11976704
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 430 heartbeat osd_stat(store_statfs(0x196d02000/0x0/0x1bfc00000, data 0x3a626ac/0x3c9b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553967616 unmapped: 113369088 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 430 heartbeat osd_stat(store_statfs(0x196d02000/0x0/0x1bfc00000, data 0x3a626ac/0x3c9b000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2525f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 553967616 unmapped: 113369088 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 430 handle_osd_map epochs [430,431], i have 430, src has [1,431]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554008576 unmapped: 113328128 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 heartbeat osd_stat(store_statfs(0x1968ef000/0x0/0x1bfc00000, data 0x3a641eb/0x3c9e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.445232391s of 10.060069084s, submitted: 237
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554057728 unmapped: 113278976 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 heartbeat osd_stat(store_statfs(0x1968f0000/0x0/0x1bfc00000, data 0x3a641eb/0x3c9e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5772196 data_alloc: 218103808 data_used: 11984896
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 heartbeat osd_stat(store_statfs(0x1968f0000/0x0/0x1bfc00000, data 0x3a641eb/0x3c9e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 heartbeat osd_stat(store_statfs(0x1968f0000/0x0/0x1bfc00000, data 0x3a641eb/0x3c9e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5772196 data_alloc: 218103808 data_used: 11984896
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 heartbeat osd_stat(store_statfs(0x1968f0000/0x0/0x1bfc00000, data 0x3a641eb/0x3c9e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5772196 data_alloc: 218103808 data_used: 11984896
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc42950000 session 0x55bc3ec2a780
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 554082304 unmapped: 113254400 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.059436798s of 12.233953476s, submitted: 70
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc3fa62800 session 0x55bc3e1f5e00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc3c19f800 session 0x55bc41a743c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc3b13d000 session 0x55bc39fcd860
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc42950000 session 0x55bc3d34a000
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc4a7ab400 session 0x55bc41a75e00
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc4a7b2c00 session 0x55bc392ab2c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 heartbeat osd_stat(store_statfs(0x1968f0000/0x0/0x1bfc00000, data 0x3a641eb/0x3c9e000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc3b13d000 session 0x55bc3a05c3c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 577232896 unmapped: 90103808 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 577232896 unmapped: 90103808 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 577232896 unmapped: 90103808 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 ms_handle_reset con 0x55bc3eeea400 session 0x55bc391df2c0
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 569237504 unmapped: 98099200 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.056338
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1191182336 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1124073472 meta_used: 5926373 data_alloc: 234881024 data_used: 36306944
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 431 handle_osd_map epochs [431,432], i have 431, src has [1,432]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 432 heartbeat osd_stat(store_statfs(0x195e62000/0x0/0x1bfc00000, data 0x44f21eb/0x472c000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [1])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 569253888 unmapped: 98082816 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 432 ms_handle_reset con 0x55bc3a472800 session 0x55bc4337fc20
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 432 heartbeat osd_stat(store_statfs(0x19758f000/0x0/0x1bfc00000, data 0x2dc2e98/0x2ffe000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 432 heartbeat osd_stat(store_statfs(0x19758f000/0x0/0x1bfc00000, data 0x2dc2e98/0x2ffe000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5731863 data_alloc: 218103808 data_used: 16175104
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 432 handle_osd_map epochs [432,433], i have 432, src has [1,433]
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.662106514s of 10.894762039s, submitted: 70
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19758c000/0x0/0x1bfc00000, data 0x2dc49d7/0x3001000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19758c000/0x0/0x1bfc00000, data 0x2dc49d7/0x3001000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5734837 data_alloc: 218103808 data_used: 16175104
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19758c000/0x0/0x1bfc00000, data 0x2dc49d7/0x3001000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564527104 unmapped: 102809600 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 102866944 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 102866944 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5754161 data_alloc: 218103808 data_used: 17903616
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 102866944 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564469760 unmapped: 102866944 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19758d000/0x0/0x1bfc00000, data 0x2dc49d7/0x3001000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 564404224 unmapped: 102932480 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.595741272s of 11.623991966s, submitted: 18
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565452800 unmapped: 101883904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565452800 unmapped: 101883904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: bluestore.MempoolThread(0x55bc37d29b60) _resize_shards cache_size: 2845415832 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 5753809 data_alloc: 218103808 data_used: 17899520
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: osd.2 433 heartbeat osd_stat(store_statfs(0x19758d000/0x0/0x1bfc00000, data 0x2dc49d7/0x3001000, compress 0x0/0x0/0x0, omap 0x639, meta 0x2566f9c7), peers [0,1] op hist [])
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565452800 unmapped: 101883904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565452800 unmapped: 101883904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
Jan 23 06:21:28 np0005593234 ceph-osd[79769]: prioritycache tune_memory target: 4294967296 mapped: 565452800 unmapped: 101883904 heap: 667336704 old mem: 2845415832 new mem: 2845415832
